CN117014543A - Image display method and related device

Publication number: CN117014543A
Application number: CN202210450191.6A
Authority: CN (China)
Original language: Chinese (zh)
Inventors: 韩笑 (Han Xiao), 张洁 (Zhang Jie)
Applicant and current assignee: Honor Device Co Ltd
Priority: CN202210450191.6A
Legal status: Pending
Prior art keywords: image, interface, terminal device, filter, screen

Classifications

    • H04M1/72427: User interfaces specially adapted for cordless or mobile telephones, with means for local support of applications that increase the functionality, for supporting games or graphical animations
    • H04M1/0216: Portable foldable telephones, foldable in one direction, i.e. using a one-degree-of-freedom hinge
    • H04M1/0268: Details of the structure or mounting of a display module assembly including a flexible display panel
    • H04M1/72469: User interfaces specially adapted for cordless or mobile telephones, for operating the device by selecting functions from two or more displayed items, e.g. menus or icons

Landscapes

  • Engineering & Computer Science (AREA)
  • Signal Processing (AREA)
  • Human Computer Interaction (AREA)
  • Computer Networks & Wireless Communication (AREA)
  • User Interface Of Digital Computer (AREA)

Abstract

Embodiments of the present application provide an image display method and a related apparatus, applied to the field of terminal technologies. The method includes: a terminal device displays a first interface on a first display screen, where the first interface includes a first image and a first control; the terminal device receives a first operation on the first control; in response to the first operation, the terminal device displays a second interface on the first display screen, where the second interface includes the first image and a second image, and the second image is obtained by processing the first image with parameters corresponding to the first control. Based on this method, the terminal device can display the first image and the second image at the same time, so that the user can compare the images in dimensions such as color and can conveniently determine the editing effect. In addition, when the terminal device is a foldable-screen mobile phone, a tablet computer, or another large-screen device, the screen can be fully utilized, the interface is beautified, and user experience is improved.

Description

Image display method and related device
Technical Field
The application relates to the technical field of terminals, in particular to an image display method and a related device.
Background
With the development of flexible screen technology, flexible foldable screens (also called folding screens) are now used in terminal devices such as mobile phones, allowing a user to fold or unfold the phone and meeting the user's needs for different screen sizes.
In a possible implementation, a mobile phone with a folding screen may include an inner screen, an outer screen, and a rear panel, where the inner screen is foldable. The user can edit a saved image through the inner screen or the outer screen. Editing includes, but is not limited to: adding a filter, adjusting image brightness, adding a blurring effect, and so on. During editing, the original image can be temporarily displayed at the position of the edited image for comparison, so that the user can confirm the editing effect.
However, because the mobile phone performs the comparison by switching to the original image at the position of the edited image, the comparison effect is poor and the user experience is poor.
Disclosure of Invention
The embodiment of the application provides an image display method and a related device, which are applied to terminal equipment. The terminal equipment can display the edited image and the original image at the same time, so that the user can conveniently compare the editing effect, and the user experience is improved. In addition, when the terminal equipment is a folding screen mobile phone, a tablet computer or other large-screen equipment, the screen can be fully utilized, the interface is beautified, and the user experience is improved.
In a first aspect, an embodiment of the present application provides an image display method, applied to a terminal device, where the method includes: the terminal equipment displays a first interface on a first display screen, wherein the first interface comprises a first image and a first control; the terminal equipment receives a first operation aiming at a first control; the terminal equipment responds to the first operation, and a second interface is displayed on the first display screen, wherein the second interface comprises a first image and a second image, and the second image is an image obtained by processing the first image by adopting parameters corresponding to the first control.
The first image may be an original image. It can be understood that when the terminal device switches the editing type, the original image for the next editing operation is the edited image obtained by the previous editing operation. For example, when the terminal device receives an operation of editing the image (for example, cropping), the size of the image is changed; when the terminal device then receives an operation of adding a filter, the original image displayed on the filter interface is the image whose size has been changed.
The second interface may be referred to as a contrast interface or may also be referred to as a double-map interface. For example, the second interface may be an interface shown as d in fig. 7 and 9, or an interface shown as c in fig. 11 and 12. In the interface shown in c in fig. 11 and 12, the second image may be an image after adding the morning light filter.
In this way, the terminal device can display the first image and the second image at the same time, so that the user can compare the images in dimensions such as color and can conveniently determine the editing effect. In addition, when the terminal device is a foldable-screen mobile phone, a tablet computer, or another large-screen device, the screen can be fully utilized, the interface is beautified, and user experience is improved.
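As a rough illustration of the flow described in the first aspect, the following Kotlin sketch models the interface switch in application code. The class and function names (FilterEditor, applyFilter, showCompareInterface) are hypothetical and only mirror the "first interface / first control / second interface" terminology above; they are not taken from the patent.

```kotlin
import android.graphics.Bitmap
import android.widget.ImageView

// Hypothetical sketch of the first-aspect flow: a first interface shows the
// first image and a first control (e.g. a filter option); acting on that
// control produces a second image, and both images are then shown together.
class FilterEditor(
    private val originalView: ImageView,   // displays the first image
    private val processedView: ImageView,  // displays the second image
    private val firstImage: Bitmap
) {
    // Called when the first operation (tap on the first control) is received.
    fun onFirstControlSelected(filterParams: FloatArray) {
        val secondImage = applyFilter(firstImage, filterParams)
        showCompareInterface(firstImage, secondImage)
    }

    // Placeholder for the image processing associated with the first control.
    private fun applyFilter(src: Bitmap, params: FloatArray): Bitmap {
        // A real editor would run a color-matrix / LUT pipeline here.
        return src.copy(Bitmap.Config.ARGB_8888, true)
    }

    // The second interface: first image and second image shown at the same time.
    private fun showCompareInterface(first: Bitmap, second: Bitmap) {
        originalView.setImageBitmap(first)
        processedView.setImageBitmap(second)
    }
}
```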
Optionally, the terminal device further includes a foldable body and a second display screen, where the first display screen and the second display screen are located on two opposite surfaces of the foldable body, and the first display screen includes a first sub-screen and a second sub-screen. When the foldable body is in a folded state, the second display screen is exposed on the surface of the foldable body, and the included angle between the two outward-facing surfaces of the first sub-screen and the second sub-screen is smaller than a first angle threshold. When the foldable body is in an unfolded state, the included angle between the two outward-facing surfaces of the first sub-screen and the second sub-screen is larger than a second angle threshold, where the first angle threshold is greater than or equal to 0 degrees and less than 180 degrees, the second angle threshold is greater than 0 degrees and less than or equal to 180 degrees, and the first angle threshold is less than or equal to the second angle threshold. The terminal device displays a user interface on the first display screen when detecting that the foldable body is in the unfolded state, and displays the user interface on the second display screen when detecting that the foldable body is in the folded state.
In this way, the terminal device may have multiple display screens. The terminal device may be a folding screen mobile phone or the like.
Optionally, the method further comprises: when the terminal equipment determines that the foldable main body is in a folded state, a third interface is displayed on the second display screen, wherein the third interface comprises a first image and a first control; the terminal equipment receives a first operation aiming at a first control; and the terminal equipment responds to the first operation, and displays a fourth interface on the second display screen, wherein the fourth interface comprises a second image and does not comprise the first image.
The fourth interface may be an interface processed by using the parameters corresponding to the first control, such as the interface shown by e in fig. 7 and 9, and the interface shown by e in fig. 11 and 12.
In this way, a single image is displayed in the folded state, and two images are displayed in the unfolded state. The folding screen can be fully utilized, the interface is beautified, and user experience is improved.
Optionally, the method further comprises: after the first display screen displays the second interface, when the terminal equipment detects that the foldable main body is switched from the unfolded state to the folded state, the terminal equipment displays a fourth interface on the second display screen, wherein the fourth interface comprises a second image and does not comprise the first image.
Optionally, the method further comprises: the terminal equipment receives a second operation aiming at a second image at a fourth interface; and the terminal equipment responds to the second operation, a fifth interface is displayed on the second display screen, and the fifth interface displays the first image.
The second operation may be a long press operation, a click operation or a touch operation, which is not limited in the embodiment of the present application.
In this way, in the folded state, the user can control the terminal device, by operating on the image, to display the original image, which makes it convenient to compare and determine the editing effect.
Optionally, the fourth interface includes a second control; the method further comprises the steps of: the terminal equipment receives a third operation aiming at the second control; and the terminal equipment responds to the third operation, a fifth interface is displayed on the second display screen, and the fifth interface displays the first image.
The second control may be a contrast control in the interface corresponding to the folded state, for example, the contrast control 713 in the interface shown as e in fig. 7, the contrast control 914 in the interface shown as e in fig. 9, the contrast control 1114 in the interface shown as e in fig. 11, or the contrast control 1214 in the interface shown as e in fig. 12.
The fifth interface displays an original image, for example, an interface shown as a in fig. 10 and an interface shown as b in fig. 13.
The third operation may be a long press operation, a click operation, or a touch operation, which is not limited in the embodiment of the present application.
In this way, when in the folded state, the user can control the terminal device to display the original image through the second control, so that the user can conveniently compare and determine the editing effect.
Optionally, the second interface includes a third control; the method further comprises the steps of: the terminal equipment receives a fourth operation aiming at the third control; and the terminal equipment responds to the fourth operation, and a sixth interface is displayed on the first display screen, wherein the sixth interface comprises a second image, and the sixth interface does not comprise the first image.
The third control may be a merge control, or a single-graph control, such as the contrast control shown as d in FIG. 7, the merge control 910 shown as d in FIG. 9, the merge control 1107 shown as c in FIG. 11, the double-graph control 1207 shown as c in FIG. 12.
The sixth interface displays a second image, for example, an interface shown as d in fig. 7, an interface shown as d in fig. 9, an interface shown as c in fig. 11, and an interface shown as c in fig. 12.
The fourth operation may be the lifting (release) action of a long-press operation, or may be a click operation or a touch operation, which is not limited in the embodiments of the present application.
In this way, the terminal device can also switch to display a single image when in the expanded state.
Optionally, the sixth interface includes a fourth control; the terminal equipment receives a fifth operation aiming at the fourth control; and the terminal equipment responds to the fifth operation and displays a second interface on the first display screen.
The fourth control may be a split control, a double-graph control, or the like. For example, split control 707 shown in c in fig. 7, split control 907 shown in c in fig. 9, split control 1111 shown in d in fig. 11, and double-graph control 1211 shown in d in fig. 12.
The fifth operation may be a long press operation, a click operation, a touch operation, or the like.
Therefore, when the terminal equipment is in an unfolding state, the switching display of a single image and two images can be realized, the display mode is flexible, and the user experience is improved.
Optionally, the method further comprises: the terminal equipment receives a sixth operation aiming at the second image at the second interface; and the terminal equipment responds to the sixth operation, a seventh interface is displayed on the first display screen, and the seventh interface displays the first image and the first image in parallel.
The sixth operation may be a long press operation, a click operation, a touch operation, or the like. The seventh interface displays two original images.
In this way, by operating on the second image, the user can control the terminal device to display the first image at the position of the second image, which makes it convenient to compare the images in dimensions such as texture and lines, confirm the editing effect, and improve user experience.
Optionally, the second interface includes a fifth control; the method further comprises the steps of: the terminal equipment receives a seventh operation aiming at the fifth control; and the terminal equipment responds to the seventh operation, an eighth interface is displayed on the first display screen, and the eighth interface displays the first image and the first image in parallel.
The seventh operation may be a long press operation, a click operation, a touch operation, or the like. The eighth interface displays two original images. For example, the interface shown as b in fig. 10, and the interface shown as a in fig. 13.
In this way, the user can control the terminal device through the fifth control to display the first image at the position of the second image, which is convenient for comparing the images in dimensions such as texture and lines, confirming the editing effect, and improving user experience.
Optionally, the method further includes: when the terminal device displays the second interface, the sixth interface, or the eighth interface, the terminal device receives an eighth operation; in response to the eighth operation, the terminal device displays a ninth interface on the first display screen, where the ninth interface displays a first split-screen window corresponding to a first application and a second split-screen window corresponding to a second application, the first split-screen window includes the second image, and the second split-screen window includes an application interface of the second application.
The eighth operation is an operation that indicates split-screen display. The first application may be a gallery application, and the second application may be an application that supports split-screen display (which may also be referred to as a split-screen displayable application), for example, WeChat or a calculator. The second split-screen window includes an application interface corresponding to the split-screen application, for example, a chat interface of the WeChat application.
The ninth interface displays the contents of the second image and the split screen application, for example, the interface shown by f in fig. 6 to 9, the interface shown by f in fig. 11 and 12; for example, the second image may be displayed on the first sub-screen; the application interface of the second application may be displayed on the second sub-screen.
In this way, the terminal device can display images and other applications in a split screen mode, and a user can operate and control the split screen application simultaneously when editing the images.
Optionally, the method further comprises: the terminal equipment receives a ninth operation aiming at the second image at a ninth interface; and the terminal equipment responds to the ninth operation, and displays the first image at the position where the second image is displayed in the first split-screen window.
The ninth operation may be a long press operation, a click operation, a touch operation, or the like.
In this way, in the split screen state, the user can control the terminal device to display the first image at the position of the second image through the second image.
Optionally, the eighth interface includes a sixth control; the method further comprises the steps of: the terminal equipment receives tenth operation aiming at a sixth control; and the terminal equipment responds to the tenth operation, and displays the first image at the position where the second image is displayed in the first split-screen window.
The tenth operation may be a long press operation, a click operation, a touch operation, or the like.
In this way, in the split screen state, the user can control the terminal device to display the first image at the position of the second image through the sixth control.
Optionally, the second interface further includes a seventh control; the method further comprises the steps of: the terminal equipment receives eleventh operation aiming at a seventh control; the terminal equipment responds to eleventh operation, a tenth interface is displayed on the first display screen, the tenth interface comprises a first image and a third image, and the third image is an image obtained by processing the first image by adopting parameters corresponding to a seventh control.
The seventh control may correspond to any one of the filter options other than the one corresponding to the first control, for example, the classical filter among the filter options. The seventh control may also correspond to any one of the other editing options. Other editing options include, but are not limited to: contrast, exposure, brightness, saturation, and so on. The seventh control may also correspond to any one of the parameter options under contrast other than the one corresponding to the first control.
Therefore, the terminal equipment can switch the images corresponding to different parameters to be compared with the first image, for example, switch the images corresponding to different filters, so that the user can conveniently edit the images.
Optionally, the second interface further includes a seventh control; the method further comprises the steps of: the terminal equipment receives eleventh operation aiming at a seventh control at a second interface; the terminal equipment responds to eleventh operation, an eleventh interface is displayed on the first display screen, the eleventh interface comprises a second image and a third image, and the third image is an image obtained by processing the first image by parameters corresponding to the seventh control.
Therefore, the terminal equipment can compare images corresponding to different parameters, for example, images corresponding to different filters are compared, so that a user can edit the images conveniently, and the editing effect is confirmed.
Optionally, the method further comprises: the terminal device determines an arrangement of the first image and the second image in the second interface based on a first value, the first value being related to a size of the first image.
Therefore, different arrangement modes are selected based on the size of the image, the display screen can be utilized to the maximum extent, and the display effect is improved.
Optionally, the first value is a ratio of the length of the first image to the width of the first image. When the first value is greater than a first threshold, the first image and the second image in the second interface are arranged one above the other (up-down), where the first threshold is related to the size of the preview area in the second interface and a preset spacing, and the preset spacing is the preset interval between the images. When the first value is less than or equal to the first threshold, the first image and the second image in the second interface are arranged side by side (left-right).
Optionally, the first threshold satisfies:
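The threshold formula referenced above is not reproduced in this text. Purely as a sketch, the following Kotlin snippet assumes one plausible reading consistent with the surrounding description: the threshold is derived from the preview-area size and the preset spacing, a wide image is stacked top-bottom, and a tall or square image is placed side by side. The exact formula, the interpretation of "length" as the horizontal dimension, and all names here are assumptions, not the patent's definition.

```kotlin
// Sketch of the arrangement decision described above. The assumed threshold
// is the aspect ratio of one of the two slots obtained by splitting the
// preview area vertically (preview height minus the preset spacing, halved).
enum class Arrangement { TOP_BOTTOM, LEFT_RIGHT }

fun chooseArrangement(
    imageWidth: Int, imageHeight: Int,        // "length" and "width" of the first image (assumed)
    previewWidth: Int, previewHeight: Int,    // size of the preview area in the second interface
    presetSpacing: Int                        // preset interval between the two images
): Arrangement {
    val firstValue = imageWidth.toFloat() / imageHeight
    // Assumed form of the first threshold; the patent's own formula is not given here.
    val firstThreshold = previewWidth.toFloat() / ((previewHeight - presetSpacing) / 2f)
    return if (firstValue > firstThreshold) Arrangement.TOP_BOTTOM else Arrangement.LEFT_RIGHT
}
```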
Optionally, the second interface further includes a guide animation; the guide animation is used to prompt the user how to use the third control.
In this way, the user can be guided to use the image contrast function.
Optionally, the terminal device responds to the first operation to display a second interface on the first display screen, including: the terminal equipment displays a twelfth interface, and the twelfth interface displays a second image; the terminal equipment receives a twelfth operation aiming at the second image at a twelfth interface; and the terminal equipment responds to the twelfth operation and displays a second interface on the first display screen.
The twelfth operation may be a long press operation, or may be a click or touch operation, which is not limited herein.
The twelfth interface may be an interface shown as d in fig. 6 to 9, an interface shown as c in fig. 11, or an interface shown as c in fig. 12. Thus, the terminal equipment can enter the interface of a single image, enter the interface of two images through the interface of the single image, and perform image comparison.
Optionally, the terminal device responds to the first operation to display a second interface on the first display screen, including: the terminal equipment displays a thirteenth interface, and the thirteenth interface displays a second image and an eighth control; the terminal equipment receives thirteenth operation aiming at the eighth control; and the terminal equipment responds to the thirteenth operation and displays a second interface on the first display screen.
The thirteenth operation may be a long press operation, or may be a click or touch operation, and is not limited herein.
The thirteenth interface may be an interface shown as d in fig. 7 and 9, an interface shown as c in fig. 11, or an interface shown as c in fig. 12. In this way, the terminal equipment can enter the interface of the single image, and the image comparison is performed by triggering the control in the interface of the single image to enter the interface of the two images.
In a third aspect, an embodiment of the present application provides a terminal device, which may also be referred to as a terminal (terminal), a User Equipment (UE), a Mobile Station (MS), a Mobile Terminal (MT), or the like. The terminal device may be a mobile phone, a smart television, a wearable device, a tablet (Pad), a computer with wireless transceiving function, a Virtual Reality (VR) terminal device, an augmented reality (augmented reality, AR) terminal device, a wireless terminal in industrial control (industrial control), a wireless terminal in unmanned driving (self-driving), a wireless terminal in teleoperation (remote medical surgery), a wireless terminal in smart grid (smart grid), a wireless terminal in transportation safety (transportation safety), a wireless terminal in smart city (smart city), a wireless terminal in smart home (smart home), or the like.
The terminal device includes a processor and a memory. The memory stores computer-executable instructions. The processor executes the computer-executable instructions stored in the memory, so that the terminal device performs the method of the first aspect.
In a fourth aspect, embodiments of the present application provide a computer-readable storage medium storing a computer program. The computer program, when executed by a processor, implements a method as in the first aspect.
In a fifth aspect, embodiments of the present application provide a computer program product comprising a computer program which, when run, causes a computer to perform the method as in the first aspect.
It should be understood that the second to fifth aspects of the present application correspond to the technical solutions of the first aspect of the present application, and the advantages obtained by each aspect and the corresponding possible embodiments are similar, and are not repeated.
Drawings
Fig. 1A is a schematic structural diagram of a terminal device according to an embodiment of the present application;
fig. 1B is a schematic structural diagram of a terminal device according to an embodiment of the present application;
fig. 2 is a schematic diagram of an interface corresponding to an edited image when a terminal device is in an expanded state in a possible implementation;
Fig. 3 is a software architecture block diagram of a terminal device according to an embodiment of the present application;
FIG. 4 is an interface schematic diagram of an image editing entry process according to an embodiment of the present application;
FIG. 5 is an interface schematic diagram of an image editing entry process according to an embodiment of the present application;
fig. 6 is a schematic diagram of an interface corresponding to an edited image when a terminal device is in an expanded state according to an embodiment of the present application;
fig. 7 is a schematic diagram of an interface corresponding to an edited image when a terminal device is in an expanded state according to an embodiment of the present application;
fig. 8 is a schematic diagram of an interface corresponding to an edited image when a terminal device is in an expanded state according to an embodiment of the present application;
fig. 9 is a schematic diagram of an interface corresponding to an edited image when a terminal device is in an expanded state according to an embodiment of the present application;
FIG. 10 is a schematic diagram of an interface corresponding to a superimposed contrast provided in an embodiment of the present application;
fig. 11 is a schematic diagram of an interface corresponding to an edited image when a terminal device is in an expanded state according to an embodiment of the present application;
fig. 12 is a schematic diagram of an interface corresponding to an edited image when a terminal device is in an expanded state according to an embodiment of the present application;
FIG. 13 is a schematic diagram of an interface corresponding to a superimposed contrast provided by an embodiment of the present application;
FIG. 14 is a schematic diagram of an interface corresponding to a comparison of different filters according to an embodiment of the present application;
FIG. 15 is a schematic diagram of an image layout according to an embodiment of the present application;
fig. 16 is an interface schematic diagram of a guiding animation in a terminal device according to an embodiment of the present application;
FIG. 17 is a schematic diagram of an interface for guiding an animation in a terminal device according to an embodiment of the present application;
fig. 18 is a schematic diagram of an interface corresponding to an edited video when a terminal device is in an expanded state according to an embodiment of the present application;
FIG. 19 is a schematic diagram of an interface corresponding to a superimposed contrast provided by an embodiment of the present application;
FIG. 20 is a flowchart of an image display method according to an embodiment of the present application;
fig. 21 is a schematic structural diagram of an image display device according to an embodiment of the present application;
fig. 22 is a schematic hardware structure of a terminal device according to an embodiment of the present application.
Detailed Description
For purposes of clarity in describing the embodiments of the present application, the words "exemplary" or "such as" are used herein to mean serving as an example, instance, or illustration. Any embodiment or design described herein as "exemplary" or "for example" should not be construed as preferred or advantageous over other embodiments or designs. Rather, the use of words such as "exemplary" or "such as" is intended to present related concepts in a concrete fashion.
In the embodiments of the present application, "at least one" means one or more, and "a plurality of" means two or more. The term "and/or" describes an association relationship between associated objects and indicates that three relationships may exist. For example, "A and/or B" may indicate: A alone, both A and B, and B alone, where A and B may be singular or plural. The character "/" generally indicates an "or" relationship between the associated objects. "At least one of the following items" or a similar expression means any combination of these items, including a single item or any combination of a plurality of items. For example, at least one of a, b, or c may represent: a; b; c; a and b; a and c; b and c; or a, b, and c, where a, b, and c may be singular or plural.
The "at … …" in the embodiment of the present application may be an instant when a certain situation occurs, or may be a period of time after a certain situation occurs, which is not particularly limited.
It should be noted that, the display interface provided by the embodiment of the present application is merely an example, and the display interface may further include more or less content.
The embodiment of the application can be applied to the terminal equipment with the folding screen. The terminal device may also be referred to as a terminal (terminal), a User Equipment (UE), a Mobile Station (MS), a Mobile Terminal (MT), etc. The terminal device may be a mobile phone, a smart television, a wearable device, a tablet (Pad), a computer with wireless transceiving function, a Virtual Reality (VR) terminal device, an augmented reality (augmented reality, AR) terminal device, a wireless terminal in industrial control (industrial control), a wireless terminal in unmanned driving (self-driving), a wireless terminal in teleoperation (remote medical surgery), a wireless terminal in smart grid (smart grid), a wireless terminal in transportation safety (transportation safety), a wireless terminal in smart city (smart city), a wireless terminal in smart home (smart home), or the like.
Illustratively, as shown in fig. 1A, the terminal device includes a first display 001, a foldable body (not shown in fig. 1A), and a second display 002, the first display 001 and the second display 002 being located on opposite surfaces of the foldable body, respectively, the first display 001 including a first sub-screen 003 and a second sub-screen 004.
When the foldable body is in the folded state, the second display screen is exposed on the surface of the foldable body, and the included angle between the two outward-facing surfaces of the first sub-screen and the second sub-screen is smaller than a first angle threshold. When the foldable body is in the unfolded state, the included angle between the two outward-facing surfaces of the first sub-screen and the second sub-screen is larger than a second angle threshold, where the first angle threshold is greater than or equal to 0 degrees and less than 180 degrees, the second angle threshold is greater than 0 degrees and less than or equal to 180 degrees, and the first angle threshold is less than or equal to the second angle threshold.
The terminal device displays a user interface on the first display screen 001 when detecting that the foldable body is in the unfolded state, and displays a user interface on the second display screen 002 when detecting that the foldable body is in the folded state.
Alternatively, it may be understood that the terminal device has a folded state and an unfolded state: the folded state may be understood as the state in which the included angle between the two side screens of the folding screen is less than or equal to a first value, and the unfolded state as the state in which the included angle is greater than or equal to a second value.
In a possible implementation, the first value is equal to the second value; in this implementation, the terminal device is always in either the folded state or the unfolded state.
In another possible implementation, the second value is greater than the first value; in this implementation, the terminal device may further have a transition state, which may be understood as the state in which the included angle between the two side screens of the folding screen is greater than the first value and less than the second value.
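As an illustration of this state classification, the sketch below reads the included angle from Android's hinge-angle sensor (Sensor.TYPE_HINGE_ANGLE, available on foldable devices since API level 30) and maps it to the folded, transition, and unfolded states using the first and second values. The 90-degree and 150-degree thresholds are placeholder choices, not values taken from the patent.

```kotlin
import android.hardware.Sensor
import android.hardware.SensorEvent
import android.hardware.SensorEventListener
import android.hardware.SensorManager

enum class FoldState { FOLDED, TRANSITION, UNFOLDED }

// Sketch: classify the fold state from the included angle between the two
// side screens. firstValue/secondValue correspond to the first and second
// values in the text above; the defaults are placeholders.
class FoldStateMonitor(
    private val sensorManager: SensorManager,
    private val firstValue: Float = 90f,
    private val secondValue: Float = 150f,
    private val onStateChanged: (FoldState) -> Unit
) : SensorEventListener {

    fun start() {
        sensorManager.getDefaultSensor(Sensor.TYPE_HINGE_ANGLE)?.let {
            sensorManager.registerListener(this, it, SensorManager.SENSOR_DELAY_NORMAL)
        }
    }

    override fun onSensorChanged(event: SensorEvent?) {
        val angle = event?.values?.get(0) ?: return  // hinge angle in degrees, 0..180
        val state = when {
            angle <= firstValue -> FoldState.FOLDED        // angle <= first value
            angle >= secondValue -> FoldState.UNFOLDED     // angle >= second value
            else -> FoldState.TRANSITION                   // between the two values
        }
        onStateChanged(state)
    }

    override fun onAccuracyChanged(sensor: Sensor?, accuracy: Int) = Unit
}
```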
Fig. 1B is a front view and a rear view of a terminal device with a folding screen according to an embodiment of the present application.
As shown in fig. 1B, when the terminal device is in the folded state, the display screen facing the user in the front view may be referred to as the outer screen, and the side facing the user in the rear view may be referred to as the rear panel. When the terminal device is in the unfolded state, the display screen facing the user in the front view may be referred to as the inner screen (i.e., the folding screen), and in the rear view the outer screen and the rear panel lie in one plane.
That is, when the terminal device is in a folded state, the inner screen may be folded and hidden, and the outer screen may be used to display the interface. When the terminal equipment is in an unfolding state, the inner screen can be unfolded, and the inner screen can be used for displaying an interface.
Further, as shown in fig. 1B, cameras may be disposed in the inner screen, the outer screen, and the rear panel of the terminal device.
In a possible implementation, the cameras of the terminal device may be classified into front-facing cameras and rear-facing cameras. The front camera may include: an outer screen camera arranged on the outer screen and an inner screen camera arranged on the inner screen. The rear camera may include a camera disposed on the rear backplate.
It will be appreciated that the terminal device may include one or more external screen cameras and/or one or more internal screen cameras. The outer screen camera can be arranged at any position of the outer screen; the inner screen camera may be disposed at an arbitrary position of the inner screen, for example, the inner screen camera may be disposed on the display screens on both sides when the inner screen is unfolded, or on the display screens on either side when the inner screen is unfolded. The number of the inner screen cameras, the positions of the inner screen cameras, the number of the outer screen cameras and the positions of the outer screen cameras are not limited.
For example, in the terminal device shown in fig. 1B, when the terminal device is in a folded state, the external screen 101 and the external screen camera 102 on the external screen can be seen in a front view; in the corresponding rear view, the rear camera 103 can be seen.
When the terminal equipment is in an unfolding state, the inner screen is unfolded, and the inner screen 104 and an inner screen camera 105 on the inner screen can be seen in a front view; in the corresponding rear view, a rear camera 107 can be seen, as well as an external screen camera 106 on the external screen. Wherein the inner screen 104 can be divided into left and right parts.
Fig. 2 is a schematic diagram of an interface corresponding to an edited image when a terminal device is in an expanded state in a possible implementation.
Upon receiving an operation of editing an image from the user, the terminal device may enter the image editing interface shown as a in fig. 2. As shown in a in fig. 2, the image editing interface may include an image display area 201 and editing options. The image display area includes the original image 202. The original image 202 fills the image display area 201 in either the horizontal or the vertical direction. Editing options include, but are not limited to: crop, filter 203, adjust, or other types of editing options.
When the user triggers the filter 203 by clicking, touching, or the like in the image editing interface shown in a in fig. 2, the terminal device receives an operation of adjusting the image filter by the user, and the terminal device enters the adjustment filter interface shown in b in fig. 2. The adjusting filter interface may include: preview area 204, edit options, and filter options. Filter options include, but are not limited to: artwork, classical, morning 205, black and white or other types of filter options.
In the embodiment of the application, the filter selection item may be a thumbnail of the image with the corresponding filter added, or may be a filter name, or the filter selection item includes both the filter name and the thumbnail of the image with the corresponding filter added.
When the user triggers the morning light 205 by clicking, touching, or the like in the adjustment filter interface shown in b in fig. 2, the terminal device receives the operation of the user to add the morning light filter, and the terminal device enters the morning light filter interface shown in c in fig. 2. The morning filter interface may include: preview area 206 and filter selections. The image 207 after adding the morning filter fills the preview area 206 either horizontally or vertically. Filter options include, but are not limited to: artwork, classical, morning light, black and white or other types of filter options. In a possible implementation, the morning light filter interface also includes contrast control 208.
When the user triggers the image 207 after adding the morning light filter or the contrast control 208 in the morning light filter interface shown in c in fig. 2 through clicking, touching or the like, the terminal device receives the operation of comparing the image by the user, and the terminal device enters the contrast interface shown in d in fig. 2. The comparison interface includes a preview area 209 therein. An original image is displayed in the preview area 209. The position of the original image displayed in the contrast interface shown by d in fig. 2 coincides with the position of the image 207 displayed with the addition of the morning light filter in the morning light filter interface shown by c in fig. 2.
It can be understood that when the terminal device is in the folded state, the image display manner is similar to the image display when the terminal device is in the unfolded state in fig. 2, and will not be described herein.
As can be seen from fig. 2, the terminal device performs image comparison by switching between the filtered image and the original image in the preview area. Moreover, the image is displayed in the center, and when the image fills the preview area vertically, parts of the left and right sides of the preview area remain unused, resulting in low utilization of the inner screen and making it inconvenient for the user to compare the images before and after processing.
In addition, as can be seen from fig. 2, the contrast control is located at the upper right corner of the inner screen, which is inconvenient for one-handed operation and results in poor user experience.
In view of this, embodiments of the present application provide an image display method and a related apparatus: when the terminal device edits an image in the unfolded state, a plurality of images can be displayed on the inner screen at the same time, so as to facilitate comparison of the editing effect by the user, reduce the tedious operations of repeated modification, improve the utilization of the inner screen, and improve user experience.
For ease of understanding, the software system of the terminal device will be described below. The software system of the terminal device may adopt a layered architecture, an event driven architecture, a microkernel architecture, a microservice architecture, a cloud architecture, or the like, which will not be described herein.
Fig. 3 is a software architecture block diagram of a terminal device according to an embodiment of the present application.
The layered architecture divides the software into several layers, each with a clear role and division of labor. The layers communicate with each other through software interfaces. In some embodiments, the Android system is divided into four layers: from top to bottom, the application layer, the application framework layer, the Android runtime (Android runtime) and system libraries, and the kernel layer.
The application layer may include a series of application packages.
As shown in fig. 3, the application packages may include applications such as Camera, Calendar, Phone, Maps, Music, Settings, Mailbox, Video, and Social.
The application framework layer provides an application programming interface (application programming interface, API) and programming framework for application programs of the application layer. The application framework layer includes a number of predefined functions.
As shown in FIG. 3, the application framework layer may include a window manager, a content provider, a resource manager, a view system, a notification manager, and the like.
The window manager is used to manage window programs. The window manager can obtain the size of the display screen, determine whether there is a status bar, lock the screen, respond to touching and dragging of the screen, capture the screen, and so on.
The content provider is used to store and retrieve data and make such data accessible to applications. The data may include video, images, audio, calls made and received, browsing history and bookmarks, phonebooks, etc.
The view system includes visual controls, such as controls to display text, controls to display pictures, and the like. The view system may be used to build applications. The display interface may be composed of one or more views. For example, a display interface including a text message notification icon may include a view displaying text and a view displaying a picture.
The resource manager provides various resources for the application program, such as localization strings, icons, pictures, layout files, video files, and the like.
The notification manager enables an application to display notification information in the status bar. It can be used to convey notification-type messages, which can disappear automatically after a short stay without user interaction. For example, the notification manager is used to notify that a download is complete, to provide message reminders, and so on. The notification manager may also present notifications in the form of a chart or scroll-bar text in the system status bar at the top of the screen, for example, notifications of applications running in the background, or notifications that appear on the screen in the form of a dialog window. For example, text information is prompted in the status bar, a prompt tone is played, the terminal device vibrates, or an indicator light blinks.
The Android runtime includes a core library and a virtual machine. The Android runtime is responsible for scheduling and management of the Android system.
The core library consists of two parts: one part is the functional interfaces that the Java language needs to call, and the other part is the core library of Android.
The application layer and the application framework layer run in the virtual machine. The virtual machine executes the Java files of the application layer and the application framework layer as binary files. The virtual machine is used to perform functions such as object lifecycle management, stack management, thread management, security and exception management, and garbage collection.
The system library may include a plurality of functional modules. For example: surface manager (surface manager), media Libraries (Media Libraries), three-dimensional graphics processing Libraries (e.g., openGL ES), 2D graphics engines (e.g., SGL), etc.
The surface manager is used to manage the display subsystem and provides a fusion of 2D and 3D layers for multiple applications.
The media library supports playback and recording of a variety of commonly used audio and video formats, as well as still image files, and so on. The media library can support multiple audio and video encoding formats, such as MPEG-4, H.264, MP3, AAC, AMR, JPG, and PNG.
The three-dimensional graphic processing library is used for realizing three-dimensional graphic drawing, image rendering, synthesis, layer processing and the like.
The 2D graphics engine is a drawing engine for 2D drawing.
The kernel layer is the layer between hardware and software. The kernel layer includes at least a display driver, a camera driver, an audio driver, and a sensor driver.
The workflow of terminal equipment software and hardware is illustrated below in connection with the scenario of terminal equipment interface switching.
When the touch sensor in the terminal device receives a touch operation, a corresponding hardware interrupt is sent to the kernel layer. The kernel layer processes the touch operation into a raw input event (including information such as the touch coordinates, the touch force, and the time stamp of the touch operation). The raw input event is stored at the kernel layer. The application framework layer obtains the raw input event from the kernel layer and identifies the control corresponding to the input event. Taking an example in which the touch operation is a single-tap operation and the control corresponding to the single-tap operation is the icon of the gallery application: the gallery application calls an interface of the application framework layer to start the gallery application, and then starts the display driver by calling the kernel layer, so that the functional interface of the gallery application is displayed.
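To illustrate the application-layer end of this flow, the sketch below registers a click listener on a gallery icon view and starts the gallery application through its launch intent. The icon view and the package name are placeholders chosen for illustration, not names from the patent.

```kotlin
import android.app.Activity
import android.view.View

// Sketch: the application-layer side of the tap-to-launch flow described above.
// The framework has already turned the raw touch event into a click dispatched
// to this control; here the system is asked to start the gallery application.
fun Activity.bindGalleryIcon(iconView: View, galleryPackage: String = "com.example.gallery") {
    iconView.setOnClickListener {
        packageManager.getLaunchIntentForPackage(galleryPackage)?.let { startActivity(it) }
    }
}
```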
The following first describes an entry mode of the image editing interface. The terminal device may enter the image editing interface through a gallery application (as shown in fig. 4) or through a camera application (as shown in fig. 5).
Fig. 4 is a schematic flow chart of entering an image editing interface according to an embodiment of the present application. When the terminal device receives an operation of opening the gallery application 401 of the terminal device by the user in the main interface shown in a in fig. 4, the terminal device may enter the gallery interface shown in b in fig. 4. A thumbnail of an image and/or a thumbnail of a video, such as thumbnail 402 of an image shown in fig. 4 b, may be included in the gallery interface.
When the user selects the thumbnail 402 of the image by clicking, touching, or the like in the gallery interface shown in b in fig. 4, the terminal device receives an operation of viewing the image by the user, and the terminal device enters the image interface shown in c in fig. 4. The image interface may include therein an image display area 403 and setting items. The image display area includes the original image 404. The original image 404 occupies the image display area 403 horizontally or vertically. The settings include, but are not limited to: sharing, favorites, edits 405, deletions, more or other types of edit options.
When the user triggers editing 405 by clicking, touching, or the like in the image interface shown as c in fig. 4, the terminal device receives the operation of editing the image from the user, and the terminal device enters the image editing interface shown as d in fig. 4. The image editing interface may include an image display area 406, editing options, and a parameter setting area corresponding to the selected editing option. The image display area includes the original image 407. The original image 407 fills the image display area 406 in either the horizontal or the vertical direction. Editing options include, but are not limited to: crop, filter, adjust, or other types of editing options. The parameter setting area corresponding to each editing option is different; for example, as shown in d in fig. 4, the parameter setting area corresponding to crop mainly includes parameter setting items such as parameters for adjusting the size of the image and the direction for rotating the image. As another example, the parameter setting area corresponding to the filters includes different filter options.
It will be appreciated that the gallery interface may include video and images, may include only images, and may include only video. The number of images may be one or more; the number of videos may also be one or more. The embodiment of the application does not limit the number of images, the number of videos, the arrangement mode of the images, the arrangement mode of the videos and the like in the gallery interface.
Fig. 5 is a schematic flow chart of another method for entering an image editing interface according to an embodiment of the present application.
When the terminal device receives an operation of opening the camera application 501 of the terminal device by the user in the main interface shown in a of fig. 5, the terminal device may enter the camera interface shown in b of fig. 5. As shown in b of fig. 5, a camera preview area, a camera mode selection item, a photographing control 502, and a thumbnail 503 may be included in the camera interface. Camera mode selections include, but are not limited to: video, photo, portrait, night scene or other mode selections.
When the user triggers the photographing control 502 by clicking, touching, or the like in the camera interface shown in b in fig. 5, the terminal device receives an operation of photographing an image by the user, and the terminal device enters the interface shown in c in fig. 5. A camera preview area, a camera mode selection item, a photographing control, and a thumbnail 504 may be included in the interface.
When the user triggers the thumbnail 504 by clicking, touching, or the like in the interface shown in c in fig. 5, the terminal device receives an operation of viewing an image by the user, and the terminal device enters the image interface shown in d in fig. 5. The image interface may include an image display area 505 and settings. The image display area includes the original image 506. The original image 506 occupies the image display area 505 either horizontally or vertically. The settings include, but are not limited to: sharing, favorites, edits 507, deletions, more or other types of edit options.
When the user triggers editing 507 by clicking, touching, or the like in the image interface shown in d in fig. 5, the terminal device receives an operation of editing an image by the user, and the terminal device enters the image editing interface shown in e in fig. 5. The image editing interface may include an image display area 508 and editing options. The image display area includes the original image 509. The original image 509 occupies the image display area 508 either horizontally or vertically. Editing options include, but are not limited to: editing, filtering, adjusting, or other types of editing options.
It will be appreciated that the thumbnail 503 displayed in the interface shown in b in fig. 5 described above corresponds to the thumbnail of the image or video that was last captured using the camera. When the thumbnail 503 corresponds to an image, the user may trigger the thumbnail 503 by clicking, touching, or other operations in the camera interface shown in b in fig. 5, and the terminal device receives the operation of viewing the image from the user, and enters the image interface corresponding to the thumbnail 503, and displays the image corresponding to the thumbnail 503. The user can edit the image on the image interface corresponding to the thumbnail 503. The image interface corresponding to the thumbnail 503 may refer to the interface shown as d in fig. 5, and will not be described herein.
The following describes in detail the operation procedure of image editing contrast and the display procedure of the image editing interface in the application program provided by the embodiment of the application with reference to the accompanying drawings.
The terminal equipment receives the comparison operation in the unfolded state, and simultaneously displays a plurality of images for comparison. By way of example, the comparison operation may be a continuous pressing operation, a clicking or touching operation, or the like. The image comparison flow provided by the embodiment of the application is described below with reference to fig. 6 to 14. Fig. 6 and 7 are flowcharts illustrating image contrast by taking a contrast operation as a continuous pressing operation as an example. Fig. 8 to 14 illustrate a flow of image contrast by taking a contrast operation as a click or touch operation as an example.
Fig. 6 is an interface schematic diagram corresponding to an edited image when a terminal device is in an expanded state according to an embodiment of the present application.
When the terminal device receives an operation of editing an image from the user (the user can trigger editing in the manner shown in fig. 4 or fig. 5), the terminal device can enter the image editing interface shown as a in fig. 6. As shown in a in fig. 6, the image editing interface may include an image display area 601 and editing options. The image display area includes the original image 602. The original image 602 fills the image display area 601 in either the horizontal or the vertical direction. Editing options include, but are not limited to: crop, filter 603, adjust, or other types of editing options.
When the user triggers the filter 603 by clicking, touching, or the like in the image editing interface shown in a in fig. 6, the terminal device receives an operation of adjusting the image filter by the user, and the terminal device enters the adjustment filter interface shown in b in fig. 6. The adjusting filter interface may include: preview area 604, edit options, and filter options. Filter options include, but are not limited to: artwork, classical, morning 605, black and white, or other types of filter options.
In the embodiment of the application, the filter selection item may be a thumbnail of the image with the corresponding filter added, or may be a filter name, or may include both the filter name and the thumbnail of the image with the corresponding filter added, and the filter name may be set on the thumbnail or around the thumbnail. The embodiment of the application does not limit the concrete expression form of the filter selection item.
When the user triggers the morning light 605 by clicking, touching, or the like in the adjustment filter interface shown in b in fig. 6, the terminal device receives the operation of the user to add the morning light filter, and the terminal device enters the morning light filter interface shown in c in fig. 6. The morning filter interface may include: preview area 606 and filter selections. The image 607 after adding the morning filter fills the preview area 606 either laterally or vertically. Filter options include, but are not limited to: artwork, classical, morning light, black and white or other types of filter options.
When the user triggers the image 607 after adding the morning light filter by the continuous pressing operation in the morning light filter interface shown in c in fig. 6, the terminal device receives the operation of the user to compare the image, and the terminal device enters the comparison interface shown in d in fig. 6. The comparison interface includes a preview area 608 therein. An original image 609 and an image 610 with a morning filter added are displayed in the preview area 608.
When the terminal device detects a user's lifting operation in the comparison interface shown as d in fig. 6, the terminal device enters the morning light filter interface shown as c in fig. 6.
Thus, the edited image and the original image are compared separately, and the comparison by a user is facilitated. In addition, the split comparison can make full use of the screen and beautify the display interface.
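By way of illustration, the continuous-pressing comparison described above can be realized with an ordinary touch listener on the preview: pressing down enters the comparison display, and lifting the finger restores the single filtered image. The following Kotlin sketch is only one possible realization; the helpers showComparison and showFilteredOnly are hypothetical stand-ins for the interface changes shown as c and d in fig. 6.

```kotlin
import android.view.MotionEvent
import android.view.View

// Illustrative sketch: pressing (ACTION_DOWN) enters the comparison view,
// lifting the finger (ACTION_UP / ACTION_CANCEL) returns to the filtered image.
fun bindPressToCompare(
    previewView: View,
    showComparison: () -> Unit,     // hypothetical: show original and filtered image side by side
    showFilteredOnly: () -> Unit    // hypothetical: show only the image with the filter added
) {
    previewView.setOnTouchListener { _, event ->
        when (event.actionMasked) {
            MotionEvent.ACTION_DOWN -> { showComparison(); true }
            MotionEvent.ACTION_UP,
            MotionEvent.ACTION_CANCEL -> { showFilteredOnly(); true }
            else -> false
        }
    }
}
```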
In a possible implementation manner, when the terminal device enters the interface after a filter is added, the name of the added filter is displayed on that interface, and the filter name disappears after being displayed for a preset duration. For example, when entering the morning light filter interface shown in c in fig. 6, the terminal device displays "morning light" on the interface, and the name disappears after being displayed for the preset duration. The preset duration may be 1 second (s) or any other duration, which is not limited in the embodiment of the present application.
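One simple way to make the filter name disappear after the preset duration is to post a delayed hide on the label view. The Kotlin sketch below is illustrative only; the label view and the 1-second default are assumptions for illustration.

```kotlin
import android.view.View
import android.widget.TextView

// Illustrative sketch: show the name of the filter that was just added,
// then hide it after the preset duration (1 s here, as in the example above).
private var hideFilterName: Runnable? = null

fun showFilterName(label: TextView, name: String, displayMillis: Long = 1_000L) {
    hideFilterName?.let(label::removeCallbacks)   // cancel a pending hide, if any
    label.text = name                             // e.g. "morning light"
    label.visibility = View.VISIBLE
    val hide = Runnable { label.visibility = View.GONE }
    hideFilterName = hide
    label.postDelayed(hide, displayMillis)
}
```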
On the basis of the above embodiment, when the terminal device displays the morning light filter interface shown by c in fig. 6 on the inner screen, and switches from the unfolded state to the folded state, the terminal device displays the morning light filter interface shown by e in fig. 6 on the outer screen. The morning filter interface may include: preview area 611 and filter selections. The image 612 after adding the morning filter fills the preview area 611 either laterally or vertically. Filter options include, but are not limited to: artwork, classical, morning light, black and white or other types of filter options.
When the user triggers the image 612 after adding the morning light filter through clicking, touching, continuous pressing, etc. on the morning light filter interface shown as e in fig. 6, the terminal device receives the operation of comparing the images by the user, and displays the original image in the preview area.
On the basis of the above embodiment, when the terminal device receives an operation for instructing split-screen when displaying the morning light filter interface shown by c in fig. 6 on the inner screen, the terminal device displays the interface shown by f in fig. 6 on the inner screen.
The interface may include: a gallery application display area and a split screen application display area. The gallery application display area includes: preview area 613 and filter options. The image 614 after adding the morning filter fills the preview area 613 either horizontally or vertically. Filter options include, but are not limited to: artwork, classical, morning light, black and white or other types of filter options. The split-screen application display area may display content of other applications (e.g., interfaces of a calculator application, etc.), including but not limited to: calculator applications, weChat applications, setup applications, memo applications, etc. The embodiment of the application does not limit the specific content displayed in the split-screen application display area.
When the user triggers the image 614 with the morning light filter added by clicking, touching, continuous pressing, or other operations in the interface shown as f in fig. 6, the terminal device receives the operation of comparing the images by the user, and displays the original image in the preview area.
In the embodiment shown in fig. 6, when the terminal device is in a folded state or a split-screen state, the image comparison is implemented by triggering the image with the filter added; the terminal device can also implement the image comparison through a control. In a possible implementation manner, the interface shown as e in fig. 6 and the interface shown as f in fig. 6 further include a contrast control. When the user triggers the contrast control by clicking, touching, continuous pressing, or other operations in the interface shown as e in fig. 6 or the interface shown as f in fig. 6, the terminal device receives the operation of comparing the images by the user, and the original image is displayed in the preview area of the corresponding interface.
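The choice between the expanded-state interface and the compact interface used on the outer screen or in a split-screen pane depends on how much width is available to the application window. As one possible heuristic, not specified in the present application, the current window width can be compared against a breakpoint, as sketched below; the 600 dp value is an assumption.

```kotlin
import android.content.res.Configuration

enum class EditUiMode { SIDE_BY_SIDE, COMPACT }

// Illustrative sketch: choose the editing-interface variant from the window width.
// A wide window (inner screen, expanded state) uses the side-by-side comparison;
// a narrow window (outer screen, or one pane of a split screen) uses the compact
// interface with a contrast control or the press-on-image gesture.
fun editUiMode(config: Configuration, breakpointDp: Int = 600): EditUiMode =
    if (config.screenWidthDp >= breakpointDp) EditUiMode.SIDE_BY_SIDE else EditUiMode.COMPACT
```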
Fig. 7 is an interface schematic diagram corresponding to an edited image when a terminal device is in an expanded state according to an embodiment of the present application.
Similarly, when the terminal device receives an operation of editing an image from the user, the user can edit the image in the manner shown in fig. 4 or fig. 5, and the terminal device may enter the image editing interface shown as a in fig. 7. As shown in a of fig. 7, an image display area 701 and editing options may be included in the image editing interface. The image display area includes the original image 702. The original image 702 occupies the image display area 701 horizontally or vertically. Editing options include, but are not limited to: clips, filters 703, adjustments, or other types of editing options.
When the user triggers the filter 703 by clicking, touching, or the like in the image editing interface shown in a in fig. 7, the terminal device receives an operation of adjusting the image filter by the user, and the terminal device enters the adjustment filter interface shown in b in fig. 7. The adjusting filter interface may include: preview area 704 and filter options. Filter options include, but are not limited to: artwork, classical, morning light 705, black and white, or other types of filter options.
When the user triggers the morning light 705 by clicking, touching, or the like in the adjustment filter interface shown in b in fig. 7, the terminal device receives the operation of the user to add the morning light filter, and the terminal device enters the morning light filter interface shown in c in fig. 7. The morning filter interface may include: preview area 706, split control 707, and filter selection. The image 708 after adding the morning filter fills the preview area 706 either laterally or vertically. Filter options include, but are not limited to: artwork, classical, morning light, black and white or other types of filter options.
When the user triggers the split control 707 by a continuous pressing operation in the morning light filter interface shown in c in fig. 7, the terminal device receives an operation of comparing images by the user, and the terminal device enters a comparison interface shown in d in fig. 7. The comparison interface includes a preview area 709 therein. An original image 710 and an image 711 with a morning filter added are displayed in the preview area 709.
When the terminal device receives the lift-up operation in the contrast interface shown in d in fig. 7, the terminal device receives the operation of ending the contrast image by the user, and the terminal device enters the morning light filter interface shown in c in fig. 7.
Thus, the edited image and the original image are compared separately, and the comparison by a user is facilitated. In addition, the split comparison can make full use of the screen and beautify the display interface.
In a possible implementation manner, the split control 707 is located at the lower right corner of the inner screen of the terminal device, so that one-hand operation of a user is facilitated, and the operation efficiency of the user is improved.
In a possible implementation manner, when the terminal device enters the interface after a filter is added, the name of the added filter is displayed on that interface, and the filter name disappears after being displayed for a preset duration. For example, when entering the morning light filter interface shown in c in fig. 7, the terminal device displays "morning light" on the interface, and the name disappears after being displayed for the preset duration. The preset duration may be 1 second (s) or any other duration, which is not limited in the embodiment of the present application.
On the basis of the above embodiment, when the terminal device displays the morning light filter interface shown by c in fig. 7 on the inner screen, and switches from the unfolded state to the folded state, the terminal device displays the morning light filter interface shown by e in fig. 7 on the outer screen. The morning filter interface may include: preview area 712, contrast control 713, and filter selections. The image 714 after the morning filter is added fills the preview area 712 either laterally or vertically. Filter options include, but are not limited to: artwork, classical, morning light, black and white or other types of filter options.
When the user triggers the contrast control 713 by clicking, touching, continuous pressing, or other operations in the morning light filter interface shown as e in fig. 7, the terminal device receives the operation of comparing the images by the user, and displays the original image in the preview area.
On the basis of the above-described embodiment, when the terminal device receives an operation for instructing split-screen when displaying the morning light filter interface shown by c in fig. 7 on the inner screen, the terminal device displays the interface shown by f in fig. 7 on the inner screen. The interface may include: a gallery application display area and a split screen application display area. The gallery application display area includes: preview area 715, contrast control 716, and filter selection. The image 717 after adding the morning filter fills the preview area 715 either laterally or vertically. Filter options include, but are not limited to: artwork, classical, morning light, black and white or other types of filter options. The split-screen application display area may refer to the above related description, and will not be described herein.
When the user triggers the contrast control 716 by clicking, touching, continuous pressing, or other operations in the interface shown as f in fig. 7, the terminal device receives the operation of comparing the images by the user, and displays the original image in the preview area.
In the embodiment shown in fig. 7, when the terminal device is in a folded state or a split-screen state, the image comparison is realized through a control; the terminal device can also realize the image comparison by triggering the image with the filter added. In a possible implementation, the interface shown as e in fig. 7 and the interface shown as f in fig. 7 do not include a contrast control. When the user triggers the image with the morning light filter added by clicking, touching, continuous pressing, or other operations in the interface shown as e in fig. 7 or the interface shown as f in fig. 7, the terminal device receives the operation of comparing the images by the user, and displays the original image in the preview area of the corresponding interface.
The flow of image contrast for a click or touch operation is described below with reference to fig. 8 and 9.
Fig. 8 is an interface schematic diagram corresponding to an edited image when a terminal device is in an expanded state according to an embodiment of the present application.
When the terminal device is receiving an operation of editing an image by a user, the terminal device may enter an image editing interface as shown by a in fig. 8. As shown in a of fig. 8, an image display area 801 and an editing selection item may be included in the image editing interface. The image display area includes the original image 802. The original image 802 occupies the image display area 801 either horizontally or vertically. Editing options include, but are not limited to: clips, filters 803, adjustments, or other types of editing options.
When the user triggers the filter 803 by clicking, touching, or the like in the image editing interface shown in a in fig. 8, the terminal device receives an operation of adjusting the image filter by the user, and the terminal device enters the adjustment filter interface shown in b in fig. 8. The adjusting filter interface may include: preview area 804 and filter selections. Filter options include, but are not limited to: artwork, classical, morning 805, black and white or other types of filter options.
When the user triggers the morning light 805 by clicking, touching, or the like in the adjustment filter interface shown in b in fig. 8, the terminal device receives the operation of the user to add the morning light filter, and the terminal device enters the morning light filter interface shown in c in fig. 8. The morning filter interface may include: preview area 806 and filter selections. The image 807 after adding the morning filter fills the preview area 806 either laterally or vertically. Filter options include, but are not limited to: artwork, classical, morning light, black and white or other types of filter options.
When the user triggers the image 807 after adding the morning filter by clicking, touching, or the like in the morning filter interface shown in c in fig. 8, the terminal device receives the operation of comparing the images by the user, and the terminal device enters the comparison interface shown in d in fig. 8. The comparison interface includes a preview area 808 therein. An original image 809 and an image 810 with an added morning filter are displayed in the preview area 808.
When the user triggers the image 810 after adding the morning filter by clicking, touching, etc. in the contrast interface shown in d in fig. 8, the terminal device receives the operation that the user ends the contrast image, and the terminal device enters the morning filter interface shown in c in fig. 8.
Thus, the edited image and the original image are compared separately, and the comparison by a user is facilitated. In addition, the split comparison can make full use of the screen and beautify the display interface.
In a possible implementation manner, when the terminal device enters the interface after a filter is added, the name of the added filter is displayed on that interface, and the filter name disappears after being displayed for a preset duration. For example, when entering the morning light filter interface shown in c in fig. 8, the terminal device displays "morning light" on the interface, and the name disappears after being displayed for the preset duration. The preset duration may be 1 second (s) or any other duration, which is not limited in the embodiment of the present application.
On the basis of the above embodiment, when the terminal device switches from the unfolded state to the folded state while the inner screen displays the morning light filter interface shown by c in fig. 8 or the contrast interface shown by d in fig. 8, the terminal device displays the morning light filter interface shown by e in fig. 8 on the outer screen. The morning filter interface may include: preview area 811 and filter options. The image 812 after adding the morning filter fills the preview area 811 either laterally or vertically. Filter options include, but are not limited to: artwork, classical, morning light, black and white or other types of filter options.
When the user triggers the image 812 after adding the morning light filter through clicking, touching, continuous pressing, etc. on the morning light filter interface shown as e in fig. 8, the terminal device receives the operation of comparing the images by the user, and displays the original image in the preview area.
On the basis of the above embodiment, when the terminal device receives an operation for indicating split-screen, the terminal device displays the interface shown by f in fig. 8 on the inner screen when the terminal device displays the morning light filter interface shown by c in fig. 8 or the contrast interface shown by d in fig. 8 on the inner screen.
The interface may include: a gallery application display area and a split screen application display area. The gallery application display area includes: preview area 813 and filter options. The image 814 after adding the morning filter fills the preview area 813 either horizontally or vertically. Filter options include, but are not limited to: artwork, classical, morning light, black and white or other types of filter options. The split-screen application display area may refer to the above related description, and will not be described herein.
When the user triggers the image 814 with the morning light filter added by clicking, touching, continuous pressing, or other operations in the interface shown as f in fig. 8, the terminal device receives the operation of comparing the images by the user, and displays the original image in the preview area.
In the embodiment shown in fig. 8, when the terminal device is in a folded state or a split-screen state, the image comparison is implemented by triggering the image with the filter added; the terminal device may also implement the image comparison through a control. In a possible implementation manner, the interface shown as e in fig. 8 and the interface shown as f in fig. 8 further include a contrast control. When the user triggers the contrast control by clicking, touching, continuous pressing, or other operations in the interface shown as e in fig. 8 or the interface shown as f in fig. 8, the terminal device receives the operation of comparing the images by the user, and the original image is displayed in the preview area of the corresponding interface.
Fig. 9 is an interface schematic diagram corresponding to an edited image when a terminal device is in an expanded state according to an embodiment of the present application.
When the terminal device is receiving an operation of editing an image by a user, the terminal device may enter an image editing interface as shown by a in fig. 9. As shown in a of fig. 9, an image display area 901 and an editing selection item may be included in the image editing interface. The image display area includes the original image 902. The original image 902 occupies the image display area 901 horizontally or vertically. Editing options include, but are not limited to: clips, filters 903, adjustments, or other types of editing options.
When the user triggers the filter 903 by clicking, touching, or the like in the image editing interface shown in a in fig. 9, the terminal device receives an operation of adjusting the image filter by the user, and the terminal device enters the adjustment filter interface shown in b in fig. 9. The adjusting filter interface may include: preview area 904 and filter options. Filter options include, but are not limited to: artwork, classical, morning light 905, black and white or other types of filter options.
When the user triggers the morning light 905 by a click, touch, or the like operation in the adjustment filter interface shown in b in fig. 9, the terminal device receives an operation by the user to add the morning light filter, and the terminal device enters the morning light filter interface shown in c in fig. 9. The morning filter interface may include: preview area 906, split control 907, and filter selection. The image 908 after adding the morning filter fills the preview area 906 either laterally or vertically. Filter options include, but are not limited to: artwork, classical, morning light, black and white or other types of filter options.
When the user triggers the split control 907 by clicking, touching, or the like in the morning light filter interface shown in c in fig. 9, the terminal device receives the operation of comparing the images by the user, and the terminal device enters the comparison interface shown in d in fig. 9. The comparison interface includes a preview area 909 and a merge control 910. An original image 911 and an image 912 with a morning filter added are displayed in the preview area 909.
When the user triggers the merge control 910 by clicking, touching, or the like in the contrast interface shown in d in fig. 9, the terminal device receives the operation that the user ends the contrast image, and the terminal device enters the morning filter interface shown in c in fig. 9.
Thus, the edited image and the original image are compared separately, and the comparison by a user is facilitated. In addition, the split comparison can make full use of the screen and beautify the display interface.
In a possible implementation manner, the splitting control 907 and/or the combining control 910 are located at the lower right corner of the inner screen of the terminal device, so that one-hand operation of a user is facilitated, and the operation efficiency of the user is improved.
In a possible implementation manner, when the terminal device enters the interface after a filter is added, the name of the added filter is displayed on that interface, and the filter name disappears after being displayed for a preset duration. For example, when entering the morning light filter interface shown in c in fig. 9, the terminal device displays "morning light" on the interface, and the name disappears after being displayed for the preset duration. The preset duration may be 1 second (s) or any other duration, which is not limited in the embodiment of the present application.
When the terminal device switches from the unfolded state to the folded state while the inner screen displays the morning filter interface shown by c in fig. 9 or the contrast interface shown by d in fig. 9, the terminal device displays the morning filter interface shown by e in fig. 9 on the outer screen. The morning filter interface may include: a preview area 913, a contrast control 914, and filter selections. The image 915 after adding the morning filter fills the preview area 913 either horizontally or vertically. Filter options include, but are not limited to: artwork, classical, morning light, black and white or other types of filter options.
When the user triggers the contrast control 914 by clicking, touching, or other operations in the morning light filter interface shown as e in fig. 9, the terminal device receives the operation of comparing the images by the user, and displays the original image in the preview area.
On the basis of the above embodiment, when the terminal device receives an operation for indicating split screen while displaying the morning light filter interface shown by c in fig. 9 or the contrast interface shown by d in fig. 9 on the inner screen, the terminal device displays the interface shown by f in fig. 9 on the inner screen. The interface may include: a gallery application display area and a split screen application display area. The gallery application display area includes: preview area 916, contrast control 917, and filter selection. The image 918 after adding the morning filter fills the preview area 916 either laterally or vertically. Filter options include, but are not limited to: artwork, classical, morning light, black and white or other types of filter options. The split-screen application display area may refer to the above related description, and will not be described herein.
When the user triggers the contrast control 917 by clicking, touching, or other operations in the interface shown as f in fig. 9, the terminal device receives the operation of comparing the images by the user, and displays the original image in the preview area.
In the embodiment shown in fig. 9, when the terminal device is in a folded state or a split-screen state, the image comparison is realized through a control; the terminal device can also realize the image comparison by triggering the image with the filter added. In a possible implementation, the interface shown as e in fig. 9 and the interface shown as f in fig. 9 do not include a contrast control. When the user triggers the image with the morning light filter added by clicking, touching, continuous pressing, or other operations in the interface shown as e in fig. 9 or the interface shown as f in fig. 9, the terminal device receives the operation of comparing the images by the user, and displays the original image in the preview area of the corresponding interface.
In the embodiment shown in fig. 9, the switching between the double-image display and the single-image display is realized by switching between different controls (for example, the split control and the merge control); the switching can also be realized through different states (for example, on or off) of the same control. The specific implementation and interface display are similar, and are not described in detail here.
In the embodiments shown in fig. 8 and 9, the terminal device confirms the editing effect by simultaneously displaying the image with the added filter and the original image for comparison in the unfolded state. The terminal device can also confirm the editing effect by overlapping and contrasting at the same position.
On the basis of the embodiment shown in fig. 8 or fig. 9, the terminal device may receive the operation of user overlapping contrast on the filter adding interface, so as to display the original image on the image with the filter added in the preview area. Therefore, the two images are switched and displayed at the original position, so that a user can conveniently confirm the editing and modifying area in the images, confirm the editing effect, and conveniently compare texture lines and the like.
In a possible implementation, the interface shown by c in fig. 8 and the interface shown by c in fig. 9 may each include a contrast control. Illustratively, taking the morning light filter interface shown in fig. 9 c as an example, the morning light filter interface shown in fig. 9 c further includes: contrast control 919. When the user triggers the contrast control 919 by clicking, touching or continuously pressing in the morning light filter interface shown in fig. 9 c, the terminal device receives the operation of the user contrast image, and displays the original image at the position of the image after the morning light filter is added in the preview area 906 (as shown in a in fig. 10).
In a possible implementation manner, on the basis of the embodiment shown in fig. 9, when the user triggers the image with the filter added by clicking, touching, continuous pressing, or other operations in the single-image interface in the unfolded state (for example, the interface shown as c in fig. 9), the terminal device receives the operation of comparing the images by the user, and the terminal device displays the original image at the position of the corresponding image.
On the basis of the embodiments shown in fig. 8 and fig. 9, the terminal device may receive an operation of superimposed comparison by the user in the comparison interface, and further display the original image at the position of the image with the filter added in the preview area. In this way, the image with the filter added is replaced by the original image for display, so that the user can conveniently confirm the edited and modified area in the image and confirm the editing effect from the perspective of lines, textures, and the like.
In a possible implementation, the interface shown as d in fig. 8 and the interface shown as d in fig. 9 may each include a contrast control. Illustratively, taking the interface shown as d in fig. 9 as an example, the interface shown as d in fig. 9 further includes: contrast control 920. When the user triggers the contrast control 920 by clicking, touching, continuous pressing, or other operations in the interface shown as d in fig. 9, the terminal device receives the operation of comparing the images by the user, and displays the original image at the position of the image with the morning light filter added in the preview area 909 (as shown in b in fig. 10).
In a possible implementation manner, on the basis of the embodiment shown in fig. 9, when the user triggers the image with the filter added by clicking, touching, continuous pressing, or other operations in the double-image interface in the unfolded state (for example, the interface shown as d in fig. 9), the terminal device receives the operation of comparing the images by the user, and the terminal device displays the original image at the position of the corresponding image.
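The superimposed comparison described above can be sketched as an in-place swap of the bitmap shown in the preview: while the comparison lasts, the original image is drawn at the exact position of the image with the filter added, and ending the comparison restores the filtered image. The view and bitmaps in the following Kotlin sketch are illustrative assumptions.

```kotlin
import android.graphics.Bitmap
import android.widget.ImageView

// Illustrative sketch: superimposed comparison at the same position.
// While comparing, the original replaces the filtered image in place; ending
// the comparison restores the filtered image, as in a/b in fig. 10.
class OverlayCompare(
    private val preview: ImageView,
    private val original: Bitmap,
    private val filtered: Bitmap
) {
    fun startCompare() = preview.setImageBitmap(original)  // original shown at the same position
    fun endCompare() = preview.setImageBitmap(filtered)    // back to the image with the filter added
}
```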
In the embodiments corresponding to fig. 6 to fig. 10, the terminal device defaults to display the single image after selecting the filter until receiving the comparison operation; the following describes a procedure for comparing the default display double images of the terminal device after selecting the filter with reference to fig. 11 to 13.
Fig. 11 is an interface schematic diagram corresponding to an edited image when a terminal device is in an expanded state according to an embodiment of the present application.
Upon receiving an operation of editing an image by a user, the terminal device may enter an image editing interface as shown by a in fig. 11. As shown in a of fig. 11, an image display region 1101 and an editing option may be included in the image editing interface. The image display area includes an original image 1102. The original image 1102 occupies the image display area 1101 horizontally or vertically. Editing options include, but are not limited to: clips, filters 1103, adjustments, or other types of editing options.
When the user triggers the filter 1103 by clicking, touching, or the like in the image editing interface shown in a in fig. 11, the terminal device receives an operation of adjusting the image filter by the user, and the terminal device enters the adjustment filter interface shown in b in fig. 11. The adjusting filter interface may include: preview area 1104 and filter selections. Filter options include, but are not limited to: artwork, classic, morning 1105, black and white or other types of filter options.
When the user triggers morning light 1105 by clicking, touching, or the like in the adjustment filter interface shown in b in fig. 11, the terminal device receives the operation of the user to add the morning light filter, and the terminal device enters the contrast interface shown in c in fig. 11. Included in the comparison interface are a preview area 1106, merge control 1107, and filter selection. An original image 1108 and an image 1109 with a morning filter added are displayed in the preview area 1106. Filter options include, but are not limited to: artwork, classical, morning light, black and white or other types of filter options.
Thus, the edited image and the original image are compared separately, and the comparison by a user is facilitated. In addition, the split comparison can make full use of the screen and beautify the display interface.
In a possible implementation manner, when the terminal device enters the interface after a filter is added, the name of the added filter is displayed on that interface, and the filter name disappears after being displayed for a preset duration. For example, when entering the interface shown in c in fig. 11, the terminal device displays "morning light" on the interface, and the name disappears after being displayed for the preset duration. The preset duration may be 1 second (s) or any other duration, which is not limited in the embodiment of the present application.
When the user triggers the merge control 1107 by clicking, touching, or the like in the contrast interface shown in c in fig. 11, the terminal device receives the operation of canceling the contrast by the user, and the terminal device enters the morning filter interface shown in d in fig. 11. The morning filter interface may include: preview area 1110, split control 1111, and filter options. The image 1112 after adding the morning filter fills the preview area 1110 either laterally or vertically. Filter options include, but are not limited to: artwork, classical, morning light, black and white or other types of filter options.
When the user triggers the split control 1111 by clicking, touching, or the like in the morning light filter interface shown in d in fig. 11, the terminal device receives the operation of comparing images by the user, and enters the comparison interface shown in c in fig. 11.
In a possible implementation manner, the splitting control 1111 and/or the combining control 1107 are located at the lower right corner of the inner screen of the terminal device, so that the user can operate with one hand conveniently, and the operation efficiency of the user is improved.
It should be noted that, the merge control 1107 and the split control 1111 may be in two different states of one control, that is, the merge control 1107 and the split control 1111 may be implemented by one two-state selection switch.
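As an illustration of the two-state selection switch mentioned above, a single checkable control can drive both behaviours. In the Kotlin sketch below, the callbacks showComparison and showSingleImage are hypothetical stand-ins for entering the interfaces shown as c and d in fig. 11.

```kotlin
import android.widget.CompoundButton

// Illustrative sketch: one two-state control replaces the separate split and merge controls.
// Checked   -> split (double-image comparison, as in c in fig. 11)
// Unchecked -> merge (single filtered image, as in d in fig. 11)
fun bindSplitMergeSwitch(
    switch: CompoundButton,
    showComparison: () -> Unit,   // hypothetical helper
    showSingleImage: () -> Unit   // hypothetical helper
) {
    switch.setOnCheckedChangeListener { _, isChecked ->
        if (isChecked) showComparison() else showSingleImage()
    }
}
```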
It can be understood that, when the user triggers other filters by clicking, touching, or other operations in the morning light filter interface shown as d in fig. 11, images corresponding to the other filters are displayed in the preview area. Other filters include, but are not limited to, classical and black and white filters. If the user triggers the split control 1111 by clicking, touching, or other operations after triggering another filter, the terminal device displays the original image and the image corresponding to that filter in the preview area of the contrast interface shown in c in fig. 11.
On the basis of the embodiment corresponding to fig. 11, when the terminal device displays the contrast interface shown by c in fig. 11 or the morning light filter interface shown by d in fig. 11 on the inner screen and switches from the unfolded state to the folded state, the terminal device displays the morning light filter interface shown by e in fig. 11 on the outer screen. The morning light filter interface may include: preview area 1113, contrast control 1114, and filter selections. The image 1115 after adding the morning filter fills the preview area 1113 either horizontally or vertically. Filter options include, but are not limited to: artwork, classical, morning light, black and white or other types of filter options.
When the user triggers the contrast control 1114 by clicking, touching, or other operations in the morning light filter interface shown as e in fig. 11, the terminal device receives the operation of comparing the images by the user, and displays the original image in the preview area.
On the basis of the embodiment corresponding to fig. 11, when the terminal device receives an operation for indicating split screen while the contrast interface shown as c in fig. 11 or the morning light filter interface shown as d in fig. 11 is displayed on the inner screen, the terminal device displays the interface shown as f in fig. 11 on the inner screen. The interface may include: a gallery application display area and a split screen application display area. The gallery application display area includes: preview area 1116, contrast control 1117, and filter selection. The image 1118 after the morning filter is added fills the preview area 1116 either laterally or vertically. Filter options include, but are not limited to: artwork, classical, morning light, black and white or other types of filter options. The split-screen application display area may refer to the above related description, and will not be described herein.
When the user triggers the contrast control 1117 by clicking, touching, or other operations in the interface shown as f in fig. 11, the terminal device receives the operation of comparing the images by the user, and displays the original image in the preview area.
In the embodiment shown in fig. 11, the display switching of the double graph and the single graph is realized through the switching of different controls (for example, the splitting control or the merging control), and the display switching of the double graph and the single graph can also be realized through different states (for example, on or off) of the same control.
Fig. 12 is an interface schematic diagram corresponding to an edited image when a terminal device is in an expanded state according to an embodiment of the present application.
Upon receiving an operation of editing an image by a user, the terminal device may enter an image editing interface as shown by a in fig. 12. As shown in a of fig. 12, an image display area 1201 and an editing selection item may be included in the image editing interface. The image display area includes an original image 1202. The original image 1202 occupies the image display area 1201 either horizontally or vertically. Editing options include, but are not limited to: clips, filters 1203, adjustments, or other types of editing options.
When the user triggers the filter 1203 by clicking, touching, or the like in the image editing interface shown in a in fig. 12, the terminal device receives an operation of adjusting the image filter by the user, and the terminal device enters the adjustment filter interface shown in b in fig. 12. The adjusting filter interface may include: preview area 1204 and filter selections. Filter options include, but are not limited to: artwork, classical, morning 1205, black and white, or other types of filter options.
When the user triggers the morning light 1205 by clicking, touching, or the like in the adjustment filter interface shown in b in fig. 12, the terminal device receives the operation of the user to add the morning light filter, and the terminal device enters the comparison interface shown in c in fig. 12. The comparison interface includes a preview area 1206, a double-map control 1207, and filter selections. An original image 1208 and an image 1209 with a morning filter added are displayed in the preview area 1206. Filter options include, but are not limited to: artwork, classical, morning light, black and white or other types of filter options.
Thus, the edited image and the original image are compared separately, and the comparison by a user is facilitated. In addition, the split comparison can make full use of the screen and beautify the display interface.
In a possible implementation manner, when the terminal device enters the interface after a filter is added, the name of the added filter is displayed on that interface, and the filter name disappears after being displayed for a preset duration. For example, when entering the interface shown in c in fig. 12, the terminal device displays "morning light" on the interface, and the name disappears after being displayed for the preset duration. The preset duration may be 1 second (s) or any other duration, which is not limited in the embodiment of the present application.
When the user triggers the state change of the double-image control 1207 by clicking, touching or the like in the contrast interface shown in c in fig. 12, the terminal device receives the operation of canceling the contrast by the user, and the terminal device enters the single-image interface shown in d in fig. 12, that is, the morning light filter interface. The morning filter interface may include: preview area 1210, double-map control 1211, and filter selection. The image 1212 after adding the morning filter fills the preview area 1210 either laterally or vertically. Filter options include, but are not limited to: artwork, classical, morning light, black and white or other types of filter options.
It will be appreciated that clicking on the double-map control 1207 triggers a state change, i.e., triggers the double-map control 1207 to switch to the off state, if the double-map control 1207 is in the on state. If the double-map control 1207 is in the off state, clicking on the double-map control 1207 triggers a state change, i.e. triggers the double-map control 1207 to switch to the on state. The embodiment of the application does not limit the state specifically corresponding to the control.
When the user triggers the state change of the double-map control 1211 by clicking, touching, or the like in the morning light filter interface shown in d in fig. 12, the terminal device receives the operation of the user to compare the images, and the terminal device enters the comparison interface shown in c in fig. 12.
In a possible implementation manner, the double-diagram control in the embodiment is located at the lower right corner of the inner screen of the terminal device, so that the single-hand operation of a user is facilitated, and the operation efficiency of the user is improved.
It will be appreciated that when the user triggers other filters by clicking, touching, or other operations in the morning light filter interface shown as d in fig. 12, images corresponding to the other filters are displayed in the preview area. Other filters include, but are not limited to, classical and black and white filters. If the user triggers another filter and then triggers the state change of the double-map control 1211 by clicking, touching, or other operations, the terminal device displays the original image and the image corresponding to that filter in the preview area of the contrast interface shown in c in fig. 12.
On the basis of the embodiment corresponding to fig. 12, when the terminal device displays the contrast interface shown by c in fig. 12 or the morning light filter interface shown by d in fig. 12 on the inner screen and switches from the unfolded state to the folded state, the terminal device displays the morning light filter interface shown by e in fig. 12 on the outer screen. The morning light filter interface may include: preview area 1213, contrast control 1214, and filter selections. The image 1215 after adding the morning filter fills the preview area 1213 either horizontally or vertically. Filter options include, but are not limited to: artwork, classical, morning light, black and white or other types of filter options.
When the user triggers the contrast control 1214 by clicking, touching, or other operations in the morning light filter interface shown as e in fig. 12, the terminal device receives the operation of comparing the images by the user, and displays the original image in the preview area.
On the basis of the embodiment corresponding to fig. 12, when the terminal device receives an operation for indicating split screen while the contrast interface shown as c in fig. 12 or the morning light filter interface shown as d in fig. 12 is displayed on the inner screen, the terminal device displays the interface shown as f in fig. 12 on the inner screen. The interface may include: a gallery application display area and a split screen application display area. The gallery application display area includes: preview area 1216, contrast control 1217, and filter selections. The image 1218 after adding the morning filter fills the preview area 1216 laterally or longitudinally. Filter options include, but are not limited to: artwork, classical, morning light, black and white or other types of filter options. The split-screen application display area may refer to the above related description, and will not be described herein.
When the user triggers the contrast control 1217 by clicking, touching, or other operations in the interface shown as f in fig. 12, the terminal device receives the operation of comparing the images by the user, and displays the original image in the preview area.
In the embodiments shown in fig. 11 and fig. 12, when the terminal device is in a folded state or a split-screen state, the image comparison is realized through a control; the terminal device can also realize the image comparison by triggering the image with the filter added. In a possible implementation, the interface shown as e in fig. 12 and the interface shown as f in fig. 12 do not include a contrast control. When the user triggers the image with the morning light filter added by clicking, touching, continuous pressing, or other operations in the interface shown as e in fig. 12 or the interface shown as f in fig. 12, the terminal device receives the operation of comparing the images by the user, and displays the original image in the preview area of the corresponding interface.
In the embodiments shown in fig. 11 and 12, the terminal device confirms the editing effect by simultaneously displaying the image with the added filter and the original image for comparison in the unfolded state. The terminal device can also confirm the editing effect by overlapping and contrasting at the same position.
On the basis of the embodiment shown in fig. 11 or fig. 12, the terminal device may receive an operation of superimposed comparison by the user in the comparison interface, and further display the original image at the position of the image with the filter added in the preview area. In this way, the two images are displayed alternately at the same position for comparison, so that the user can conveniently confirm the edited and modified area in the image and confirm the editing effect.
In a possible implementation, the interface shown by c in fig. 11 and the interface shown by c in fig. 12 may each include a contrast control. Illustratively, taking the contrast interface shown as c in fig. 12 as an example, the contrast interface shown as c in fig. 12 further includes: contrast control 1219. When the user triggers the contrast control 1219 by clicking, touching, continuous pressing, or other operations in the contrast interface shown as c in fig. 12, the terminal device receives the operation of comparing the images by the user, and displays the original image at the position of the image with the morning light filter added in the preview area 1206 (as shown in a in fig. 13).
In a possible implementation manner, when the user triggers the image with the filter added by clicking, touching, continuous pressing, or other operations in the double-image interface in the unfolded state (for example, the interface shown by c in fig. 11 or the interface shown by c in fig. 12), the terminal device receives the operation of comparing the images by the user, and the terminal device displays the original image at the position of the corresponding image.
On the basis of the embodiments shown in fig. 11 and fig. 12, the terminal device may receive an operation of superimposed comparison by the user in the interface after the filter is added, and further display the original image at the position of the image with the filter added in the preview area. In this way, the original image is displayed at the position of the image with the filter added, so that the user can conveniently confirm the edited and modified area in the image and confirm the editing effect from dimensions such as texture and lines.
In a possible implementation, the interface shown as d in fig. 11 and the interface shown as d in fig. 12 may each include a contrast control. Illustratively, taking the interface shown as d in fig. 12 as an example, the interface shown as d in fig. 12 further includes: contrast control 1220. When the user triggers the contrast control 1220 by clicking, touching, or continuously pressing the like in the interface shown in d in fig. 12, the terminal device receives the operation of the user contrast, and displays the original image at the position of the image after the morning light filter is added in the preview area 1210 (as shown in b in fig. 13).
In a possible implementation manner, when the user triggers the image with the filter added by clicking, touching, continuous pressing, or other operations in the single-image interface in the unfolded state (for example, the interface shown by d in fig. 11 or the interface shown by d in fig. 12), the terminal device receives the operation of comparing the images by the user, and displays the original image superimposed at the position of the corresponding image.
On the basis of the embodiments shown in fig. 6 to 12, the terminal device may compare the images of the different filters with the original image, or the terminal device may compare the images of the different filters.
In a first possible implementation manner, when the terminal device receives an operation for indicating another filter, the terminal device compares the image to which the other filter is added with the original image.
By way of example, taking the embodiment shown in fig. 12 as an example, when the terminal device receives an operation for instructing a classical filter at the contrast interface shown by c in fig. 12, the terminal device compares the image to which the classical filter is added with the original image, for example, an image obtained by processing the original image with the classical filter is displayed at the position of the image 1209 on c in fig. 12.
In a second possible implementation manner, when the terminal device receives an operation that a user drags any one filter in the filter selection item to a preset area, the terminal device adds an image corresponding to the corresponding filter for comparison.
The preset area may be all preview areas or part of preview areas in the interface, or may be other areas in the interface, which is not limited in the embodiment of the present application.
It can be understood that the terminal device can add the images corresponding to the corresponding filters for comparison by increasing the number of the displayed images, and can also add the images corresponding to the corresponding filters for comparison by replacing the images at the corresponding positions.
For example, taking the preset area being a blank area in the preview area as an example, when the user drags the classical filter in the filter selection items to the preset area in the comparison interface shown in c in fig. 12, the terminal device compares the image with the classical filter added, the image with the morning light filter added, and the original image.
For example, taking the preset area being an area where an image is displayed as an example, if the user drags the classical filter in the filter selection items onto the original image in the comparison interface shown in c in fig. 12, the terminal device replaces the original image with the image to which the classical filter is added, and compares the image with the classical filter added with the image with the morning light filter added. If the user drags the classical filter in the filter selection items onto the image with the morning light filter added in the comparison interface shown in c in fig. 12, the terminal device replaces the image with the morning light filter added with the image to which the classical filter is added, and compares the image with the classical filter added with the original image.
The embodiment of the application does not limit the number of the images displayed simultaneously by the terminal equipment. The terminal device may also adjust the layout of the display interface according to the number of images displayed, so as to adjust the image size.
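One possible realization of the drag operation in the second implementation manner uses platform drag-and-drop events: dropping a filter option onto a blank part of the preview area adds a comparison image, while dropping it onto an existing image replaces that image. The helper callbacks in the Kotlin sketch below are assumptions standing in for the gallery application's own logic.

```kotlin
import android.view.DragEvent
import android.view.View

// Illustrative sketch: handle a filter option dropped onto the preview area.
// `filterNameOf` extracts the dragged filter's name (assumed helper);
// `imageAt` returns the index of the comparison image under the drop point, or null (assumed helper).
fun bindFilterDrop(
    previewArea: View,
    filterNameOf: (DragEvent) -> String,
    imageAt: (x: Float, y: Float) -> Int?,
    addComparisonImage: (filter: String) -> Unit,
    replaceComparisonImage: (index: Int, filter: String) -> Unit
) {
    previewArea.setOnDragListener { _, event ->
        when (event.action) {
            DragEvent.ACTION_DRAG_STARTED -> true            // accept the drag
            DragEvent.ACTION_DROP -> {
                val filter = filterNameOf(event)
                val target = imageAt(event.x, event.y)
                if (target == null) addComparisonImage(filter)      // blank area: add an image
                else replaceComparisonImage(target, filter)         // existing image: replace it
                true
            }
            else -> true
        }
    }
}
```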
In a third possible implementation manner, the terminal device may compare the image of the filter added for the nth time with the image of the filter added for the N-1 th time based on the order in which the filters are selected.
For example, taking as an example when the terminal device selects the morning filter for the first time and selects the classical filter for the second time, when the terminal device is displaying an image added with the classical filter, the terminal device receives an operation for indicating a comparison image, and displays the image added with the morning filter and the image added with the classical filter for comparison.
For example, taking the example that when the terminal device selects the morning filter for the first time and selects the classical filter for the second time, when the terminal device compares the original image with the image added with the morning filter, the terminal device receives an operation for indicating to select the classical filter, and displays the image added with the morning filter and the image added with the classical filter for comparison.
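The third implementation manner amounts to remembering the order in which filters were selected and comparing the last two. A minimal Kotlin sketch, with illustrative types that are not part of the present application, is given below.

```kotlin
// Illustrative sketch: remember the order in which filters were selected and,
// when a comparison is requested, show the (N-1)-th and N-th filtered images.
class FilterHistory {
    private val applied = mutableListOf<String>()   // filter names, in selection order

    fun onFilterSelected(name: String) {
        applied += name                             // e.g. "morning light", then "classical"
    }

    // Pair of filters to compare, or null if fewer than two filters were selected.
    fun comparisonPair(): Pair<String, String>? =
        if (applied.size < 2) null
        else applied[applied.size - 2] to applied.last()
}

fun main() {
    val history = FilterHistory()
    history.onFilterSelected("morning light")
    history.onFilterSelected("classical")
    println(history.comparisonPair())               // (morning light, classical)
}
```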
In a possible implementation manner, when the terminal device receives an operation for instructing other filters in the comparison interface of the above embodiment, the terminal device displays the image of the N-1 th added filter and the image of the N-th added filter in the preview area, and displays the latest filter on the right side.
Illustratively, taking the embodiment shown in fig. 12 as an example, when the terminal device receives an operation for indicating a classical filter in the contrast interface shown in c in fig. 12, the terminal device enters the contrast interface shown in a in fig. 14. The interface includes: preview area 1401, double-map control 1402, and filter selections. An image 1403 with a morning light filter added and an image 1404 with a classical filter added are displayed in the preview area 1401. Filter options include, but are not limited to: artwork, classical, morning light, black and white or other types of filter options.
When the user triggers a state change of the double-map control 1402 by clicking, touching, or the like in the contrast interface shown in a in fig. 14, the terminal device receives an operation of canceling the contrast by the user, and the terminal device enters the classical filter interface shown in b in fig. 14. The classical filter interface may include: preview area 1405, double-map control 1406, and filter selection. The image 1407 after adding the classical filter fills the preview area 1405 either horizontally or vertically. Filter options include, but are not limited to: artwork, classical, morning light, black and white or other types of filter options.
In a possible implementation manner, when the terminal device switches from the unfolded state to the folded state in the inner screen display of the contrast interface shown as a in fig. 14 or the classical filter interface shown as b in fig. 14, the terminal device displays the classical filter interface shown as c in fig. 14 on the outer screen. The classical filter interface may include: preview area 1408, contrast control 1409, and filter selection. The image 1410 after adding the classical filter fills the preview area 1408 either horizontally or vertically. Filter options include, but are not limited to: artwork, classical, morning light, black and white or other types of filter options.
When the user triggers the contrast control 1409 by clicking, touching, or other operations in the classical filter interface shown in c in fig. 14, the terminal device receives the operation of comparing the images by the user, and displays the image with the morning light filter added in the preview area.
In a possible implementation manner, when the terminal device receives an operation for indicating split screen while displaying the contrast interface shown as a in fig. 14 or the classical filter interface shown as b in fig. 14 on the inner screen, the terminal device displays the interface shown as d in fig. 14 on the inner screen. The interface may include: a gallery application display area and a split screen application display area. The gallery application display area includes: preview area 1411, contrast control 1414, and filter selection. The image 1413 after adding the classical filter fills the preview area 1411 either horizontally or vertically. Filter options include, but are not limited to: artwork, classical, morning light, black and white or other types of filter options.
When the user triggers the contrast control 1414 by clicking, touching, or another operation in the interface shown as d in fig. 14, the terminal device receives the operation of the user for comparing images, and displays the image with the morning light filter added in the preview area.
It can be understood that in the embodiments shown in fig. 6 to 14, the terminal device performs image comparison by means of a left-right layout; the terminal device can also perform image comparison in a top-bottom layout mode.
In some embodiments, when comparing images, the terminal device may select different layout modes based on the size of the image and the size of the preview area, so as to display the images as large as possible.
Illustratively, a top-bottom layout is employed when the ratio of the length to the width of the image is greater than a first threshold. When the ratio of the length and the width of the image is smaller than or equal to a first threshold value, a left-right layout is adopted.
The first threshold may be determined based on the size of the preview area and the preset spacing. When the terminal device displays N images simultaneously, the first threshold satisfies: first threshold = (a - (N-1) × c) / (b - (N-1) × c), where a is the length of the preview area, b is the width of the preview area, and c is the preset spacing between images.
Illustratively, when the terminal device displays 2 images simultaneously, the first threshold satisfies: first threshold = (a - c) / (b - c).
The terminal device selection layout will be described below with reference to fig. 15.
Fig. 15 is a schematic diagram of an image layout according to an embodiment of the present application. Taking a preview area with a length of 1984 pixels (px) and a width of 1268 px, and a preset spacing of 78 px as an example, the first threshold is (1984 - 78) / (1268 - 78) = 1906/1190 ≈ 1.6.
When the ratio of the length to the width of the image is equal to the first threshold, the image displayed in the left-right layout is 953 px × 595 px (shown as a in fig. 15), and the image displayed in the up-down layout is also 953 px × 595 px (shown as b in fig. 15).
When the ratio of the length to the width of the image is smaller than the first threshold, the image displayed in the left-right layout is 953 px × 1134 px (shown as c in fig. 15), and the image displayed in the up-down layout is 500 px × 595 px (shown as d in fig. 15). The image displayed in the left-right layout is larger.
When the ratio of the length to the width of the image is greater than the first threshold, the image displayed in the up-down layout is at most 1500 px × 595 px (shown as e in fig. 15), and the image displayed in the left-right layout is at most 953 px × 567 px (shown as f in fig. 15). The image displayed in the up-down layout is larger.
As can be seen from fig. 15, when the ratio of the length and the width of the image is equal to the first threshold value, the image size is identical in the up-down layout manner and the left-right layout manner. When the ratio of the length to the width of the image is smaller than the first threshold value, the image displayed in a left-right layout mode is larger. When the ratio of the length to the width of the image is larger than the first threshold value, the image displayed in an up-down layout mode is larger.
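The layout selection described above amounts to a small decision rule. The following Kotlin sketch is only an illustration of that calculation; the names (PreviewArea, chooseLayout, displayedSize) and the program structure are assumptions of this sketch and are not part of the embodiment, and the patent does not prescribe any programming language.

```kotlin
import kotlin.math.min

// Size of the preview area and the preset spacing between images, in pixels.
data class PreviewArea(val length: Int, val width: Int, val spacing: Int)

enum class Layout { LEFT_RIGHT, TOP_BOTTOM }

// First threshold for N simultaneously displayed images:
// (length - (N - 1) * spacing) / (width - (N - 1) * spacing).
fun firstThreshold(area: PreviewArea, n: Int): Double {
    val gaps = (n - 1) * area.spacing
    return (area.length - gaps).toDouble() / (area.width - gaps)
}

// Top-bottom when the image's length-to-width ratio exceeds the threshold, left-right otherwise.
fun chooseLayout(imageLength: Int, imageWidth: Int, area: PreviewArea, n: Int = 2): Layout {
    val ratio = imageLength.toDouble() / imageWidth
    return if (ratio > firstThreshold(area, n)) Layout.TOP_BOTTOM else Layout.LEFT_RIGHT
}

// Largest displayed size (length x width) of one image for a given layout, keeping its ratio.
fun displayedSize(
    imageLength: Int, imageWidth: Int, area: PreviewArea, n: Int, layout: Layout
): Pair<Int, Int> {
    val ratio = imageLength.toDouble() / imageWidth
    val gaps = (n - 1) * area.spacing
    return when (layout) {
        Layout.LEFT_RIGHT -> {
            // Each image gets (length - gaps) / n of the horizontal space, full height available.
            val length = min(((area.length - gaps) / n).toDouble(), area.width * ratio)
            Pair(length.toInt(), (length / ratio).toInt())
        }
        Layout.TOP_BOTTOM -> {
            // Each image gets (width - gaps) / n of the vertical space, full length available.
            val width = min(((area.width - gaps) / n).toDouble(), area.length / ratio)
            Pair((width * ratio).toInt(), width.toInt())
        }
    }
}

fun main() {
    val area = PreviewArea(length = 1984, width = 1268, spacing = 78)
    println(firstThreshold(area, 2))                                 // ≈ 1.602
    println(displayedSize(1600, 1000, area, 2, Layout.LEFT_RIGHT))   // ≈ (953, 595)
    println(displayedSize(1600, 1000, area, 2, Layout.TOP_BOTTOM))   // ≈ (952, 595)
}
```

With the example preview area, an image whose ratio equals the threshold is displayed at roughly 953 px × 595 px in both layouts, matching a and b in fig. 15.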
On the basis of the above embodiments, when the terminal device adds a filter to an image for the first time in the unfolded state, a guide animation may be displayed on the interface of the terminal device. The guide animation is used for prompting the user about the operation mode of image comparison.
For example, the guide animation may prompt the user for a trigger mode of image contrast (as shown by a in fig. 16), and prompt the user for a trigger mode of ending image contrast (as shown by b in fig. 16).
In this way, the guide animation prompts the user that images can be compared while editing, which improves the user experience.
In a possible implementation, the terminal device may also match different guidance prompts based on the size of the image.
Illustratively, when the ratio of the length to the width of the image is greater than the first threshold, the user is prompted to use a top-bottom layout for image comparison (such as the interfaces shown as a and b in fig. 16). When the ratio of the length to the width of the image is smaller than or equal to the first threshold, the user is prompted to use a left-right layout for image comparison (such as the interfaces shown as a and b in fig. 17).
In some embodiments, when the terminal device receives a click, touch, or similar operation at any position outside the guide animation area, the terminal device ends playback of the guide animation. Alternatively, the guide animation ends after a preset duration.
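As a rough illustration of this dismissal behaviour, the following Kotlin sketch hides a guide view either when the user touches outside it or after a timeout. The view names, the 5-second timeout, and the use of standard Android View APIs are assumptions for illustration only; the embodiment does not prescribe a particular implementation.

```kotlin
import android.view.MotionEvent
import android.view.View

// Dismiss the guide animation when the user touches outside it, or after a preset duration.
fun installGuideDismissal(rootView: View, guideView: View, timeoutMillis: Long = 5_000L) {
    // End the guide automatically after the preset duration (assumed 5 seconds here).
    guideView.postDelayed({ guideView.visibility = View.GONE }, timeoutMillis)

    // End the guide when a touch lands anywhere outside the guide animation area.
    rootView.setOnTouchListener { _, event ->
        if (event.action == MotionEvent.ACTION_DOWN && !isInsideView(guideView, event)) {
            guideView.visibility = View.GONE
        }
        false // do not consume the event; normal editing gestures still work
    }
}

// Returns true when the touch point falls inside the given view on screen.
private fun isInsideView(view: View, event: MotionEvent): Boolean {
    val location = IntArray(2)
    view.getLocationOnScreen(location)
    val x = event.rawX
    val y = event.rawY
    return x >= location[0] && x <= location[0] + view.width &&
           y >= location[1] && y <= location[1] + view.height
}
```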
It can be understood that the above image editing contrast is exemplified by adding a filter, and can also be applied to editing processes such as contrast adjustment, exposure adjustment, brightness adjustment, saturation adjustment, color temperature adjustment, highlight adjustment, shadow adjustment, sharpening adjustment, adding a mosaic, adding a text, and the like. The specific implementation is similar to the implementation when a filter is added, and will not be described here again.
In a possible implementation manner, the terminal device performs different types of editing on the image, and when the editing type of the image is switched, the original image for the next editing operation is the edited image obtained by the previous editing operation. Illustratively, when the terminal device receives an operation of adjusting the size of an image by the user, the size of the image is changed. When the terminal device then receives an operation of adding a filter by the user, the original image displayed on the filter interface is the image whose size has been changed.
The method provided by the embodiment of the application can also be applied to the video editing scene, and the video editing interface is similar to the image editing in the embodiment, and detailed description is omitted here.
Fig. 18 is an interface schematic diagram corresponding to an edited video when a terminal device is in an expanded state according to an embodiment of the present application.
When the terminal device receives an operation of editing a video by the user, the terminal device may enter the video editing interface shown as a in fig. 18. As shown in a in fig. 18, the video editing interface may include a video display area 1801 and editing options. The video display area includes the original video 1802. The original video 1802 fills the video display area 1801 either horizontally or vertically. Editing options include, but are not limited to: clips, filters 1803, music, text, adjustments, or other types of editing options.
When the user triggers the filter 1803 by clicking, touching, or another operation in the video editing interface shown in a in fig. 18, the terminal device receives the operation of the user for adjusting the video filter, and the terminal device enters the adjustment filter interface shown in b in fig. 18. The adjustment filter interface may include: a preview area 1804 and filter options. Filter options include, but are not limited to: artwork, blur, morning light 1805, black and white, or other types of filter options.
When the user triggers the morning light 1805 by a click, touch, or the like operation in the adjustment filter interface shown in b in fig. 18, the terminal device receives an operation by the user to add the morning light filter, and the terminal device enters the morning light filter interface shown in c in fig. 18. The morning filter interface may include: preview area 1806, split control 1807, and filter selection. The video 1808 after adding the morning filter fills the preview area 1806 either horizontally or vertically. Filter options include, but are not limited to: artwork, blur, morning light, black and white, or other types of filter options.
When the user triggers the split control 1807 by clicking, touching, or the like in the morning light filter interface shown in c in fig. 18, the terminal device receives the operation of comparing videos by the user, and enters the comparison interface shown in d in fig. 18. Included in the comparison interface are a preview area 1809 and a merge control 1810. An original video 1811 and a video 1812 with a morning light filter added are displayed in the preview area 1809.
When the user triggers the merge control 1810 by clicking, touching, or the like in the contrast interface shown in d in fig. 18, the terminal device receives the operation of ending the contrast video by the user, and the terminal device enters the morning filter interface shown in c in fig. 18.
Thus, the edited video and the original video are compared separately, and the comparison is convenient for users. In addition, the split comparison can make full use of the screen and beautify the display interface.
In a possible implementation manner, the split control 1807 and/or the merge control 1810 is located at the lower right corner of the inner screen of the terminal device, which facilitates one-handed operation by the user and improves the operation efficiency of the user.
In a possible implementation manner, when the terminal device enters the interface after a filter is added, the name of the added filter is displayed on the interface, and the filter name disappears after being displayed for a preset duration. For example, when entering the morning light filter interface shown in c in fig. 18, the terminal device displays "morning light" on the interface, and the filter name disappears after being displayed for the preset duration. The preset duration may be 1 second (s) or any other duration, which is not limited in the embodiment of the present application.
When the terminal device switches from the unfolded state to the folded state while the inner screen displays the morning filter interface shown by c in fig. 18 or the contrast interface shown by d in fig. 18, the terminal device displays the morning filter interface shown by e in fig. 18 on the outer screen. The morning filter interface may include: preview area 1813, contrast control 1814, and filter selection. The video 1815 after adding the morning filter fills the preview area 1813 either horizontally or vertically. Filter options include, but are not limited to: artwork, blur, morning light, black and white, or other types of filter options.
When the user triggers the contrast control 1814 by clicking, touching, or the like in the morning light filter interface shown as e in fig. 18, the terminal device receives the operation of the user for comparing videos, and displays the original video in the preview area.
On the basis of the foregoing possible implementation manner, when the terminal device displays the morning light filter interface shown by c in fig. 18 or the contrast interface shown by d in fig. 18 on the inner screen and receives an operation for indicating split screen, the terminal device displays the interface shown by f in fig. 18 on the inner screen. The interface may include: a gallery application display area and a split-screen application display area. The gallery application display area includes: a preview area 1816, a contrast control 1817, and filter options. The video 1818 with the morning light filter added fills the preview area 1816 either horizontally or vertically. Filter options include, but are not limited to: artwork, blur, morning light, black and white, or other types of filter options. For the split-screen application display area, reference may be made to the above related description, and details are not repeated here.
When the user triggers the contrast control 1817 by clicking, touching, or the like in the interface shown in f in fig. 18, the terminal device receives the operation of the user for comparing videos, and displays the original video in the preview area.
In the embodiment shown in fig. 18, when the terminal device is in a folded state or a split-screen state, video comparison is implemented through a control; the terminal device may also implement video comparison through the video to which the filter has been added. In a possible implementation, the interfaces shown as e and f in fig. 18 do not include a contrast control. When the user triggers the video with the morning light filter added by clicking, touching, long pressing, or another operation in the interface shown as e in fig. 18 or the interface shown as f in fig. 18, the terminal device receives the operation of the user for comparing videos, and displays the original video in the preview area of the corresponding interface.
In the embodiment shown in fig. 18, the display switching of the double-graph and the single-graph is implemented by switching different controls (for example, the split control or the merge control); the display switching of the double-graph and the single-graph may also be implemented by different states (for example, on or off) of the same control. The specific implementation and interface display are similar and are not repeated here.
In the embodiment shown in fig. 18, the terminal device determines the editing effect by simultaneously displaying the video after adding the filter and the original video for comparison in the unfolded state. The terminal equipment can also compare and confirm the editing effect by switching the video after adding the filter and the original video at the same position.
On the basis of the embodiment shown in fig. 18, the terminal device may receive an operation of superimposition comparison by the user on the filter interface, and then display the original video at the position of the filtered video in the preview area. In this way, the two videos are displayed by switching at the same position, which makes it convenient for the user to confirm the edited area in the video, confirm the editing effect, and compare texture details.
Illustratively, taking the morning light filter interface shown as c in fig. 18 as an example, the morning light filter interface further includes a contrast control 1819. When the user triggers the contrast control 1819 by clicking, touching, or long pressing in the morning light filter interface shown as c in fig. 18, the terminal device receives the operation of the user for comparing videos, and displays the original video at the position of the video with the morning light filter added in the preview area 1806 (as shown in a in fig. 19).
In a possible implementation manner, on the basis of the embodiment shown in fig. 18, when the user triggers the filtered video by clicking, touching, or long pressing in a single-view interface in the unfolded state (for example, the interface shown as c in fig. 18), the terminal device receives the operation of the user for comparing videos, and displays the original video at the position of the corresponding video.
On the basis of the embodiment shown in fig. 18, the terminal device may receive an operation of superimposition comparison by the user on the comparison interface, and then display the original video at the position of the filtered video in the preview area. In this way, the original video is displayed by switching at the position of the filtered video, which makes it convenient for the user to confirm the edited area in the video and confirm the editing effect from dimensions such as line texture.
Illustratively, taking the interface shown as d in fig. 18 as an example, the interface further includes a contrast control 1820. When the user triggers the contrast control 1820 by clicking, touching, or long pressing in the interface shown as d in fig. 18, the terminal device receives the operation of superimposition comparison by the user, and displays the original video at the position of the video with the morning light filter added in the preview area 1809 (as shown in b in fig. 19).
In a possible implementation manner, on the basis of the embodiment shown in fig. 18, when the user triggers the filtered video by clicking, touching, or long pressing in a double-graph interface in the unfolded state (for example, the interface shown as d in fig. 18), the terminal device receives the operation of the user for comparing videos, and the terminal device displays the original video at the position of the corresponding video.
It will be appreciated that the display interfaces of the terminal devices described above are by way of example only, and that the display interfaces of the terminal devices may also include more or less content. Further, the shape and form of each control in the above embodiments are merely examples. The embodiment of the application does not limit the content of the display interface and the shape and form of each control.
On the basis of the embodiment, the embodiment of the application provides a method for displaying images. Fig. 20 is a schematic flow chart of a method for displaying an image according to an embodiment of the present application.
As shown in fig. 20, the method for displaying an image may include the steps of:
S2001, the terminal device displays a first interface on a first display screen, where the first interface includes a first image and a first control.
The interface described in the embodiments of the present application may be understood as a user interface (UI), where the UI refers to the overall design of human-machine interaction, operation logic, and the visual appearance of software. It can be understood that a change in the layout design of the interface produces a new interface.
The first display screen may be a folding screen, or may be an internal display screen of a folding screen device, or may be a display screen of a terminal device such as a tablet computer (Pad).
The first interface may be the adjustment filter interface described in the embodiment of the present application, for example, the first interface may be the interface shown by b in fig. 7 and 9, or the interface shown by b in fig. 11 and 12. The first interface may also be an interface corresponding to other editing options, which is not limited herein.
S2002, the terminal equipment receives a first operation aiming at the first control.
The first control may correspond to any one of the filter options, for example, the morning light filter in the filter options. The first control may also correspond to any one of the other editing options described above. Other editing options include, but are not limited to: contrast, exposure, brightness, saturation, and the like. The first control may also correspond to any one of the parameter options corresponding to contrast.
The first operation may be a click operation or a touch operation, which is not limited in the embodiment of the present application.
S2003, in response to the first operation, the terminal device displays a second interface on the first display screen, where the second interface includes the first image and a second image, and the second image is an image obtained by processing the first image by using parameters corresponding to the first control.
The first image may be an original image. It can be understood that when the terminal device switches the editing type of the image, the original image for the next editing operation is the edited image obtained by the previous editing operation. Illustratively, when the terminal device receives an operation of adjusting the size of an image by the user, the size of the image is changed. When the terminal device then receives an operation of adding a filter by the user, the original image displayed on the filter interface is the image whose size has been changed.
The second interface may be referred to as a contrast interface or may also be referred to as a double-map interface. For example, the second interface may be an interface shown as d in fig. 7 and 9, or an interface shown as c in fig. 11 and 12. In the interface shown in c in fig. 11 and 12, the second image may be an image after adding the morning light filter.
Based on the method, the terminal equipment can display the first image and the second image simultaneously, so that the user can compare the images from the dimensions such as colors, and the user can conveniently determine the editing effect. In addition, when the terminal equipment is a folding screen mobile phone, a tablet computer or other large-screen equipment, the screen can be fully utilized, the interface is beautified, and the user experience is improved.
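To make steps S2001 to S2003 concrete, the following Kotlin sketch derives the second image from the first image and then shows both images in the preview area of the second interface. Android's ColorMatrix API is used here purely as an assumed stand-in for "the parameters corresponding to the first control"; the colour-matrix coefficients, view names, and function names are illustrative assumptions and are not specified by this embodiment.

```kotlin
import android.graphics.Bitmap
import android.graphics.Canvas
import android.graphics.ColorMatrix
import android.graphics.ColorMatrixColorFilter
import android.graphics.Paint
import android.widget.ImageView

// S2003: derive the second image from the first image using the parameters of the first control.
// Here a warm "morning light"-style colour matrix stands in for those parameters (assumed values).
fun applyFilter(first: Bitmap, matrix: ColorMatrix): Bitmap {
    val second = Bitmap.createBitmap(first.width, first.height, Bitmap.Config.ARGB_8888)
    val paint = Paint().apply { colorFilter = ColorMatrixColorFilter(matrix) }
    Canvas(second).drawBitmap(first, 0f, 0f, paint)
    return second
}

// S2001 + S2002 + S2003: when the first control is triggered, show the first and second
// images side by side in the preview area of the second interface.
fun onFirstControlClicked(
    firstImage: Bitmap,
    leftPreview: ImageView,   // shows the first (original) image
    rightPreview: ImageView   // shows the second (processed) image
) {
    val morningLight = ColorMatrix(floatArrayOf(
        1.1f, 0f,    0f,    0f, 10f,
        0f,   1.05f, 0f,    0f, 5f,
        0f,   0f,    0.95f, 0f, 0f,
        0f,   0f,    0f,    1f, 0f
    ))
    val secondImage = applyFilter(firstImage, morningLight)
    leftPreview.setImageBitmap(firstImage)
    rightPreview.setImageBitmap(secondImage)
}
```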
Optionally, the terminal device further includes a foldable main body and a second display screen, where the first display screen and the second display screen are respectively located on two opposite surfaces of the foldable main body, and the first display screen includes a first sub-screen and a second sub-screen. When the foldable main body is in a folded state, the second display screen is exposed on the surface of the foldable main body, and an included angle between the two faces of the first sub-screen and the second sub-screen that are exposed to the outside is smaller than a first angle threshold. When the foldable main body is in an unfolded state, the included angle between the two faces of the first sub-screen and the second sub-screen that are exposed to the outside is larger than a second angle threshold, where the first angle threshold is larger than or equal to 0 degrees and smaller than 180 degrees, the second angle threshold is larger than or equal to 0 degrees and smaller than or equal to 180 degrees, and the first angle threshold is smaller than or equal to the second angle threshold. The terminal device displays a user interface on the first display screen when detecting that the foldable main body is in the unfolded state, and displays the user interface on the second display screen when detecting that the foldable main body is in the folded state.
In this way, the terminal device may have multiple display screens. The terminal device may be a folding screen mobile phone or the like.
Optionally, the method further comprises: when the terminal equipment determines that the foldable main body is in a folded state, a third interface is displayed on the second display screen, wherein the third interface comprises a first image and a first control; the terminal equipment receives a first operation aiming at a first control; and the terminal equipment responds to the first operation, and displays a fourth interface on the second display screen, wherein the fourth interface comprises a second image and does not comprise the first image.
The fourth interface may be an interface processed by using the parameters corresponding to the first control, such as the interface shown by e in fig. 7 and 9, and the interface shown by e in fig. 11 and 12.
Thus, in the folded state, a single image is displayed; two images are displayed in the expanded state. The folding screen can be fully utilized, the interface is beautified, and the user experience is increased.
Optionally, the method further comprises: after the first display screen displays the second interface, when the terminal equipment detects that the foldable main body is switched from the unfolded state to the folded state, the terminal equipment displays a fourth interface on the second display screen, and the fourth interface displays a second image without the first image.
Optionally, the method further comprises: the terminal equipment receives a second operation aiming at a second image at a fourth interface; and the terminal equipment responds to the second operation, a fifth interface is displayed on the second display screen, and the fifth interface displays the first image.
The second operation may be a long press operation, a click operation or a touch operation, which is not limited in the embodiment of the present application.
In this way, when in the folded state, the user can display the original image through the image control terminal equipment, so that the user can conveniently compare and determine the editing effect.
Optionally, the fourth interface includes a second control; the method further comprises the steps of: the terminal equipment receives a third operation aiming at the second control; and the terminal equipment responds to the third operation, a fifth interface is displayed on the second display screen, and the fifth interface displays the first image.
The second control may be a contrast control in the interface corresponding to the folded state, for example, the contrast control 713 in the interface shown as e in fig. 7, the contrast control 914 in the interface shown as e in fig. 9, the contrast control 1114 in the interface shown as e in fig. 11, or the contrast control 1214 in the interface shown as e in fig. 12.
The fifth interface displays an original image, for example, an interface shown as a in fig. 10 and an interface shown as b in fig. 13.
The third operation may be a long press operation, a click operation, or a touch operation, which is not limited in the embodiment of the present application.
In this way, when in the folded state, the user can control the terminal device to display the original image through the second control, so that the user can conveniently compare and determine the editing effect.
Optionally, the second interface includes a third control; the method further comprises the steps of: the terminal equipment receives a fourth operation aiming at the third control; and the terminal equipment responds to the fourth operation, and a sixth interface is displayed on the first display screen, wherein the sixth interface comprises a second image, and the sixth interface does not comprise the first image.
The third control may be a merge control, or a single-graph control, such as the contrast control shown as d in FIG. 7, the merge control 910 shown as d in FIG. 9, the merge control 1107 shown as c in FIG. 11, the double-graph control 1207 shown as c in FIG. 12.
The sixth interface displays a second image, for example, an interface shown as d in fig. 7, an interface shown as d in fig. 9, an interface shown as c in fig. 11, and an interface shown as c in fig. 12.
The fourth operation may be a lift operation at the end of a long press operation, or may be a click operation or a touch operation, which is not limited in the embodiment of the present application.
In this way, the terminal device can also switch to display a single image when in the expanded state.
Optionally, the sixth interface includes a fourth control; the terminal equipment receives a fifth operation aiming at the fourth control; and the terminal equipment responds to the fifth operation and displays a second interface on the first display screen.
The fourth control may be a split control, a double-graph control, or the like. For example, split control 707 shown in c in fig. 7, split control 907 shown in c in fig. 9, split control 1111 shown in d in fig. 11, and double-graph control 1211 shown in d in fig. 12.
The fifth operation may be a long press operation, a click operation, a touch operation, or the like.
Therefore, when the terminal equipment is in an unfolding state, the switching display of a single image and two images can be realized, the display mode is flexible, and the user experience is improved.
Optionally, the method further comprises: the terminal equipment receives a sixth operation aiming at the second image at the second interface; and the terminal equipment responds to the sixth operation, a seventh interface is displayed on the first display screen, and the seventh interface displays the first image and the first image in parallel.
The sixth operation may be a long press operation, a click operation, a touch operation, or the like. The seventh interface displays two original images.
In this way, the user can control, through the second image, the terminal device to display the first image at the position of the second image, which makes it convenient to compare the images in dimensions such as texture lines and to confirm the editing effect, and improves the user experience.
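A minimal sketch of this press-and-compare behaviour, assuming an Android ImageView that shows the second image: pressing it swaps in the first image, and lifting the finger restores the second image. The function and variable names are assumptions of this sketch; the embodiment does not tie the sixth operation to any particular API.

```kotlin
import android.annotation.SuppressLint
import android.graphics.Bitmap
import android.view.MotionEvent
import android.widget.ImageView

// Show the first (original) image at the position of the second (edited) image while
// the user keeps pressing it, and restore the second image when the finger is lifted.
@SuppressLint("ClickableViewAccessibility")
fun enablePressToCompare(editedView: ImageView, firstImage: Bitmap, secondImage: Bitmap) {
    editedView.setOnTouchListener { _, event ->
        when (event.actionMasked) {
            MotionEvent.ACTION_DOWN ->
                editedView.setImageBitmap(firstImage)      // press: show the original image
            MotionEvent.ACTION_UP, MotionEvent.ACTION_CANCEL ->
                editedView.setImageBitmap(secondImage)     // lift: back to the edited image
        }
        true // consume the gesture so it is not interpreted as a tap elsewhere
    }
}
```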
Optionally, the second interface includes a fifth control; the method further comprises the steps of: the terminal equipment receives a seventh operation aiming at the fifth control; and the terminal equipment responds to the seventh operation, an eighth interface is displayed on the first display screen, and the eighth interface displays the first image and the first image in parallel.
The seventh operation may be a long press operation, a click operation, a touch operation, or the like. The eighth interface displays two original images. For example, the interface shown as b in fig. 10, and the interface shown as a in fig. 13.
Therefore, the user can control the terminal device to display the first image at the position of the second image through the fifth control, the method is convenient to use for comparing images in dimensions such as texture lines, the editing effect is convenient to confirm, and the user experience is improved.
Optionally, the method further includes: the terminal device receives an eighth operation when displaying the second interface, the sixth interface, or the eighth interface; and in response to the eighth operation, the terminal device displays a ninth interface on the first display screen, where the ninth interface includes a first split-screen window corresponding to a first application and a second split-screen window corresponding to a second application, the first split-screen window includes the second image, and the second split-screen window includes an application interface of the second application.
The eighth operation is an operation for indicating split screen. The first application may be a gallery application, the second application may be a split screen application, or may be referred to as a split screen displayable application, e.g., a WeChat, calculator, etc. The second split screen window comprises an application interface corresponding to the split screen application, such as a chat interface of a WeChat application.
The ninth interface displays the contents of the second image and the split screen application, for example, the interface shown by f in fig. 6 to 9, the interface shown by f in fig. 11 and 12; for example, the second image may be displayed on the first sub-screen; the application interface of the second application may be displayed on the second sub-screen.
In this way, the terminal device can display images and other applications in a split screen mode, and a user can operate and control the split screen application simultaneously when editing the images.
Optionally, the method further comprises: the terminal equipment receives a ninth operation aiming at the second image at a ninth interface; and the terminal equipment responds to the ninth operation, and displays the first image at the position where the second image is displayed in the first split-screen window.
The ninth operation may be a long press operation, a click operation, a touch operation, or the like.
In this way, in the split screen state, the user can control the terminal device to display the first image at the position of the second image through the second image.
Optionally, the eighth interface includes a sixth control; the method further comprises the steps of: the terminal equipment receives tenth operation aiming at a sixth control; and the terminal equipment responds to the tenth operation, and displays the first image at the position where the second image is displayed in the first split-screen window.
The tenth operation may be a long press operation, a click operation, a touch operation, or the like.
In this way, in the split screen state, the user can control the terminal device to display the first image at the position of the second image through the sixth control.
Optionally, the second interface further includes a seventh control; the method further comprises the steps of: the terminal equipment receives eleventh operation aiming at a seventh control; the terminal equipment responds to eleventh operation, a tenth interface is displayed on the first display screen, the tenth interface comprises a first image and a third image, and the third image is an image obtained by processing the first image by adopting parameters corresponding to a seventh control.
The seventh control may correspond to any one of the filter options other than the one corresponding to the first control, for example, the classical filter in the filter options. The seventh control may also correspond to any one of the other editing options. Other editing options include, but are not limited to: contrast, exposure, brightness, saturation, and the like. The seventh control may also correspond to any one of the parameter options corresponding to contrast, other than the one corresponding to the first control.
Therefore, the terminal equipment can switch the images corresponding to different parameters to be compared with the first image, for example, switch the images corresponding to different filters, so that the user can conveniently edit the images.
Optionally, the second interface further includes a seventh control; the method further comprises the steps of: the terminal equipment receives eleventh operation aiming at a seventh control at a second interface; the terminal equipment responds to eleventh operation, an eleventh interface is displayed on the first display screen, the eleventh interface comprises a second image and a third image, and the third image is an image obtained by processing the first image by parameters corresponding to the seventh control.
Therefore, the terminal equipment can compare images corresponding to different parameters, for example, images corresponding to different filters are compared, so that a user can edit the images conveniently, and the editing effect is confirmed.
Optionally, the method further comprises: the terminal device determines an arrangement of the first image and the second image in the second interface based on a first value, the first value being related to a size of the first image.
Therefore, different arrangement modes are selected based on the size of the image, the display screen can be utilized to the maximum extent, and the display effect is improved.
Optionally, the first value is a ratio of the length of the first image to the width of the first image. When the first value is larger than a first threshold, the first image and the second image in the second interface are arranged in an up-down manner, where the first threshold is related to the size of the preview area in the second interface and a preset spacing, and the preset spacing is a preset spacing between images. When the first value is smaller than or equal to the first threshold, the first image and the second image in the second interface are arranged in a left-right manner.
Optionally, the first threshold satisfies: first threshold = (a - c) / (b - c), where a is the length of the preview area in the second interface, b is the width of the preview area, and c is the preset spacing.
optionally, the second interface further comprises a guide animation; the guide animation is used for prompting the user of the using mode of the third control.
In this way, the user can be guided to use the image contrast function.
Optionally, the terminal device responds to the first operation to display a second interface on the first display screen, including: the terminal equipment displays a twelfth interface, and the twelfth interface displays a second image; the terminal equipment receives a twelfth operation aiming at the second image at a twelfth interface; and the terminal equipment responds to the twelfth operation and displays a second interface on the first display screen.
The twelfth operation may be a long press operation, or may be a click or touch operation, which is not limited herein.
The twelfth interface may be an interface shown as d in fig. 6 to 9, an interface shown as c in fig. 11, or an interface shown as c in fig. 12. Thus, the terminal equipment can enter the interface of a single image, enter the interface of two images through the interface of the single image, and perform image comparison.
Optionally, the terminal device responds to the first operation to display a second interface on the first display screen, including: the terminal equipment displays a thirteenth interface, and the thirteenth interface displays a second image and an eighth control; the terminal equipment receives thirteenth operation aiming at the eighth control; and the terminal equipment responds to the thirteenth operation and displays a second interface on the first display screen.
The thirteenth operation may be a long press operation, or may be a click or touch operation, and is not limited herein.
The thirteenth interface may be an interface shown as d in fig. 7 and 9, an interface shown as c in fig. 11, or an interface shown as c in fig. 12. In this way, the terminal equipment can enter the interface of the single image, and the image comparison is performed by triggering the control in the interface of the single image to enter the interface of the two images.
In a possible implementation manner, the one or more controls may be located in a lower right area of the display interface of the terminal device, so that one-hand operation of a user may be facilitated, and user experience is improved.
It should be noted that the above embodiments are described by taking, as an example, a folding screen of the terminal device with two states, a folded state and an unfolded state; the folding screen may also be a folding screen capable of implementing 3 folds or 4 folds (not shown in the figures). The states of the terminal device may also include three states: a folded state, a transitional state, and an unfolded state.
Illustratively, the folding screen may be a folding screen that implements N folds, N being an integer greater than or equal to 2.
For example, taking a terminal device whose folding screen is a 3-fold folding screen as an example, a back plate without a screen may be disposed behind the middle screen of the folding screen. When the folding screen is folded, the back plate without a screen may be hidden, and one screen of the folding screen may serve as the back plate.
In a possible implementation, when the states of the terminal device include three states, namely a folded state, a transitional state, and an unfolded state, the terminal device may control both the inner screen and the outer screen to be lit in the transitional state, and both the inner screen and the outer screen display images. For example, the folding screen may be defined as being in the folded state when the angle between two adjacent screens is 0-30 degrees, in the transitional state when the angle is 30-45 degrees, and in the unfolded state when the angle is 45-180 degrees.
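Using the example angle ranges above, the mapping from the angle between adjacent screens to the device state can be written as a small function, sketched below in Kotlin. The thresholds are simply the example values from this paragraph (30 and 45 degrees), and the function name is an assumption; on Android, the angle itself could, for instance, be obtained from a hinge-angle sensor.

```kotlin
enum class FoldState { FOLDED, TRANSITIONAL, UNFOLDED }

// Map the angle between two adjacent screens (in degrees) to the device state,
// using the example thresholds from this embodiment: 30 degrees and 45 degrees.
fun foldStateForAngle(
    angleDegrees: Float,
    firstAngleThreshold: Float = 30f,
    secondAngleThreshold: Float = 45f
): FoldState = when {
    angleDegrees < firstAngleThreshold -> FoldState.FOLDED
    angleDegrees < secondAngleThreshold -> FoldState.TRANSITIONAL
    else -> FoldState.UNFOLDED
}

fun main() {
    println(foldStateForAngle(20f))   // FOLDED
    println(foldStateForAngle(40f))   // TRANSITIONAL
    println(foldStateForAngle(120f))  // UNFOLDED
}
```

In the folded state the interface would then be displayed on the outer screen, in the unfolded state on the inner screen, and in the transitional state both screens may be lit, as described above.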
The image display method according to the embodiment of the present application has been described above, and the apparatus for performing the method according to the embodiment of the present application is described below. As shown in fig. 21, fig. 21 is a schematic structural diagram of an image display device according to an embodiment of the present application, where the image display device may be a terminal device in the embodiment of the present application, or may be a chip or a chip system in the terminal device.
As shown in fig. 21, the image display apparatus 2100 may be used in a communication device, a circuit, a hardware component, or a chip, and includes: a display unit 2101, and a processing unit 2102. Wherein the display unit 2101 is used for supporting the step of displaying performed by the image display apparatus 2100; the processing unit 2102 is for supporting the image display apparatus 2100 to execute steps of information processing.
In a possible implementation manner, the image display device 2100 may also include a communication unit 2103. Specifically, the communication unit is for supporting the image display apparatus 2100 to perform the steps of transmission of data and reception of data. The communication unit 2103 may be an input or output interface, a pin or circuit, or the like.
In a possible embodiment, the image display apparatus may further include: a storage unit 2104. The processing unit 2102 and the storage unit 2104 are connected by a line. The memory unit 2104 may include one or more memories, which may be one or more devices, circuits, or means for storing programs or data. The storage unit 2104 may exist independently and be connected to the processing unit 2102 provided in the image display apparatus via a communication line. The memory unit 2104 may also be integrated with the processing unit 2102.
The storage unit 2104 may store computer-executed instructions of the method in the terminal apparatus to cause the processing unit 2102 to execute the method in the above-described embodiment. The storage unit 2104 may be a register, a cache, a RAM, or the like, and the storage unit 2104 may be integrated with the processing unit 2102. The storage unit 2104 may be a read-only memory (ROM) or other type of static storage device that may store static information and instructions, and the storage unit 2104 may be independent of the processing unit 2102.
Fig. 22 is a schematic structural diagram of a terminal device according to an embodiment of the present application. The terminal device may include a processor 110, an external memory interface 120, an internal memory 121, a universal serial bus (universal serial bus, USB) interface 130, a charge management module 140, a power management module 141, an antenna 1, an antenna 2, a mobile communication module 150, a wireless communication module 160, a sensor module 180, a display screen 194, and the like. Wherein at least the sensor module 180 may include: an angle sensor 180A, a gyro sensor 180B, a touch sensor 180C, an acceleration sensor 180D, and the like.
It will be appreciated that the structure illustrated in the embodiments of the present application does not constitute a specific limitation on the terminal device. In other embodiments of the application, the terminal device may include more or less components than illustrated, or certain components may be combined, or certain components may be split, or different arrangements of components. The illustrated components may be implemented in hardware, software, or a combination of software and hardware.
Processor 110 may include one or more processing units. Wherein the different processing units may be separate devices or may be integrated in one or more processors. A memory may also be provided in the processor 110 for storing instructions and data.
The USB interface 130 is an interface conforming to the USB standard specification, and may specifically be a Mini USB interface, a Micro USB interface, a USB Type C interface, or the like. The USB interface 130 may be used to connect a charger to charge a terminal device, or may be used to transfer data between the terminal device and a peripheral device. And can also be used for connecting with a headset, and playing audio through the headset. The interface may also be used to connect other electronic devices, such as AR devices, etc.
The charge management module 140 is configured to receive a charge input from a charger. The charger can be a wireless charger or a wired charger. The power management module 141 is used for connecting the charge management module 140 and the processor 110.
The wireless communication function of the terminal device may be implemented by the antenna 1, the antenna 2, the mobile communication module 150, the wireless communication module 160, a modem processor, a baseband processor, and the like.
The antennas 1 and 2 are used for transmitting and receiving electromagnetic wave signals. Antennas in the terminal device may be used to cover single or multiple communication bands. Different antennas may also be multiplexed to improve the utilization of the antennas.
The mobile communication module 150 may provide a solution for wireless communication including 2G/3G/4G/5G or the like applied on a terminal device. The mobile communication module 150 may include at least one filter, switch, power amplifier, low noise amplifier (low noise amplifier, LNA), etc. The mobile communication module 150 may receive electromagnetic waves from the antenna 1, perform processes such as filtering, amplifying, and the like on the received electromagnetic waves, and transmit the processed electromagnetic waves to the modem processor for demodulation.
The wireless communication module 160 may provide solutions for wireless communication including wireless local area networks (wireless local area networks, WLAN) (e.g., wireless fidelity (wireless fidelity, Wi-Fi) network), bluetooth (BT), global navigation satellite system (global navigation satellite system, GNSS), frequency modulation (frequency modulation, FM), etc. as applied on a terminal device.
The terminal device implements display functions through a GPU, a display screen 194, an application processor, and the like. The GPU is a microprocessor for image processing, and is connected to the display 194 and the application processor. The GPU is used to perform mathematical and geometric calculations for graphics rendering.
The display screen 194 is used to display images, videos, and the like. The display 194 includes a display panel. In some embodiments, the terminal device may include 1 or N display screens 194, N being a positive integer greater than 1. In the embodiment of the present application, the display screen 194 may be a folding screen, a large screen, or the like.
In some embodiments, the folding screen may be a folding screen capable of achieving 2 folds, 3 folds, 4 folds, or the like (a folding screen with 3 folds or 4 folds is not shown in the drawings). Illustratively, the folding screen is a folding screen that can achieve left-right 2 folds.
The folded screen state may include a folded state and an unfolded state; the folded screen state may also include three states, a folded state, a transitional state, and an unfolded state.
In some embodiments, the display 194 may also be an external screen of the terminal device, and the understanding of the external screen may refer to the description of the external screen in the above embodiments, which is not described herein. In some embodiments, the display 194 may also be an internal screen of the terminal device, and the understanding of the internal screen may refer to the description of the internal screen in the above embodiments, which is not repeated herein.
The external memory interface 120 may be used to connect an external memory card, such as a Micro SD card, to realize expansion of the memory capability of the terminal device. The external memory card communicates with the processor 110 through an external memory interface 120 to implement data storage functions. For example, files such as music, video, etc. are stored in an external memory card.
The internal memory 121 may be used to store computer-executable program code that includes instructions. The internal memory 121 may include a storage program area and a storage data area.
The angle sensor 180A is used to measure an angle. In the embodiment of the present application, when the terminal device is a folding-screen mobile phone, the angle sensor 180A is used to determine the folding angle of the folding-screen mobile phone.
The gyro sensor 180B may be used to determine a motion gesture of the terminal device. In the embodiment of the present application, the gyro sensor 180B may be used to determine a landscape screen state of the terminal device or a portrait screen state of the terminal device. When the terminal device is a folding screen phone, the gyro sensor 180B may also be used to determine the upper half or lower half of the folding screen phone.
The touch sensor 180C may be disposed on the display 194, and the touch sensor 180C and the display 194 form a touch screen, or "touch screen". In the embodiment of the present application, the touch sensor 180C is configured to receive a touch operation of a user on the touch screen, for example, a trigger operation of the user to open a video application, to open a short video application, or to a function control in the application.
The acceleration sensor 180D is used to monitor the acceleration of the folding motion. In the embodiment of the present application, when the terminal device is a folding screen mobile phone, the acceleration sensor 180D is used to determine the acceleration of the folding screen when folding or the acceleration of the folding screen when unfolding.
The embodiment of the application provides a terminal device, which may also be called a terminal (terminal), a User Equipment (UE), a Mobile Station (MS), a Mobile Terminal (MT), and the like. The terminal device may be a mobile phone, a smart television, a wearable device, a tablet (Pad), a computer with wireless transceiving function, a Virtual Reality (VR) terminal device, an augmented reality (augmented reality, AR) terminal device, a wireless terminal in industrial control (industrial control), a wireless terminal in unmanned driving (self-driving), a wireless terminal in teleoperation (remote medical surgery), a wireless terminal in smart grid (smart grid), a wireless terminal in transportation safety (transportation safety), a wireless terminal in smart city (smart city), a wireless terminal in smart home (smart home), or the like.
The terminal device includes: a processor and a memory; the memory stores computer-executable instructions; and the processor executes the computer-executable instructions stored in the memory, to cause the terminal device to perform the above method.
The embodiment of the application also provides a computer readable storage medium. The computer-readable storage medium stores a computer program. The computer program realizes the above method when being executed by a processor. The methods described in the above embodiments may be implemented in whole or in part by software, hardware, firmware, or any combination thereof. If implemented in software, the functions may be stored on or transmitted over as one or more instructions or code on a computer-readable medium. Computer readable media can include computer storage media and communication media and can include any medium that can transfer a computer program from one place to another. The storage media may be any target media that is accessible by a computer.
In one possible implementation, the computer readable medium may include RAM, ROM, compact disk-read only memory (CD-ROM) or other optical disk storage, magnetic disk storage or other magnetic storage devices, or any other medium targeted for carrying or storing the desired program code in the form of instructions or data structures and accessible by a computer. Also, any connection is properly termed a computer-readable medium. For example, if the software is transmitted from a website, server, or other remote source using a coaxial cable, fiber optic cable, twisted pair, digital subscriber line (Digital Subscriber Line, DSL), or wireless technologies such as infrared, radio, and microwave, then the coaxial cable, fiber optic cable, twisted pair, DSL, or wireless technologies such as infrared, radio, and microwave are included in the definition of medium. Disk and disc, as used herein, include compact disc (CD), laser disc, optical disc, digital versatile disc (Digital Versatile Disc, DVD), floppy disk and Blu-ray disc, where disks usually reproduce data magnetically, while discs reproduce data optically with lasers. Combinations of the above should also be included within the scope of computer-readable media.
Embodiments of the present application provide a computer program product comprising a computer program which, when executed, causes a computer to perform the above-described method.
Embodiments of the present application are described with reference to flowchart illustrations and/or block diagrams of methods, apparatus (systems) and computer program products according to embodiments of the application. It will be understood that each flow and/or block of the flowchart illustrations and/or block diagrams, and combinations of flows and/or blocks in the flowchart illustrations and/or block diagrams, can be implemented by computer program instructions. These computer program instructions may be provided to a processing unit of a general purpose computer, special purpose computer, embedded processor, or other programmable data processing apparatus to produce a machine, such that the instructions, which execute via the processing unit of the computer or other programmable data processing apparatus, create means for implementing the functions specified in the flowchart flow or flows and/or block diagram block or blocks.
The foregoing detailed description of the application has been presented for purposes of illustration and description, and it should be understood that the foregoing is by way of illustration and description only, and is not intended to limit the scope of the application.

Claims (22)

1. An image display method, characterized by being applied to a terminal device, the method comprising:
the terminal equipment displays a first interface on a first display screen, wherein the first interface comprises a first image and a first control;
the terminal equipment receives a first operation aiming at a first control;
the terminal equipment responds to the first operation, and a second interface is displayed on the first display screen, wherein the second interface comprises the first image and a second image, and the second image is an image obtained by processing the first image by adopting parameters corresponding to the first control.
2. The method of claim 1, wherein the terminal device further comprises a foldable body and a second display screen, the first display screen and the second display screen being located on opposite surfaces of the foldable body, respectively, the first display screen comprising a first sub-screen and a second sub-screen;
when the foldable main body is in a folded state, the second display screen is exposed on the surface of the foldable main body, and an included angle between two surfaces of the first sub-screen and the second sub-screen exposed outside is smaller than a first angle threshold;
When the foldable main body is in an unfolding state, an included angle between two faces of the first sub-screen and the second sub-screen exposed outside is larger than a second angle threshold, wherein the first angle threshold is larger than or equal to 0 degrees and smaller than 180 degrees, the second angle threshold is larger than or equal to 0 degrees and smaller than 180 degrees, and the first angle threshold is smaller than or equal to the second angle threshold;
and the terminal equipment displays a user interface on the first display screen when detecting that the foldable main body is in an unfolding state, and displays the user interface on the second display screen when detecting that the foldable main body is in a folding state.
3. The method according to claim 2, wherein the method further comprises:
when the terminal equipment determines that the foldable main body is in a folded state, displaying a third interface on the second display screen, wherein the third interface comprises the first image and the first control;
the terminal equipment receives a first operation aiming at a first control;
and the terminal equipment responds to the first operation, and a fourth interface is displayed on the second display screen, wherein the fourth interface comprises the second image and does not comprise the first image.
4. A method according to claim 2 or 3, characterized in that the method further comprises:
after the first display screen displays the second interface, when the terminal equipment detects that the foldable main body is switched from the unfolded state to the folded state, the terminal equipment displays a fourth interface on the second display screen, wherein the fourth interface comprises the second image and does not comprise the first image.
5. The method according to claim 4, wherein the method further comprises:
the terminal equipment receives a second operation aiming at the second image at the fourth interface;
and the terminal equipment responds to the second operation, and a fifth interface is displayed on the second display screen, wherein the fifth interface displays the first image.
6. The method of claim 4, wherein the fourth interface comprises a second control;
the method further comprises: the terminal device receives a third operation on the second control;
and in response to the third operation, the terminal device displays a fifth interface on the second display screen, wherein the fifth interface displays the first image.
7. The method of any of claims 1-5, wherein the second interface comprises a third control;
the method further comprises: the terminal device receives a fourth operation on the third control;
and in response to the fourth operation, the terminal device displays a sixth interface on the first display screen, wherein the sixth interface comprises the second image and does not comprise the first image.
8. The method of claim 7, wherein the sixth interface further comprises a fourth control;
the terminal device receives a fifth operation on the fourth control;
and in response to the fifth operation, the terminal device displays the second interface on the first display screen.
9. The method according to any one of claims 1-8, further comprising:
the terminal device receives a sixth operation on the second image in the second interface;
and in response to the sixth operation, the terminal device displays a seventh interface on the first display screen, wherein the first image and the second image are displayed side by side in the seventh interface.
10. The method of any of claims 1-9, wherein the second interface comprises a fifth control;
the method further comprises: the terminal device receives a seventh operation on the fifth control;
and in response to the seventh operation, the terminal device displays an eighth interface on the first display screen, wherein the first image and the second image are displayed side by side in the eighth interface.
11. The method according to any one of claims 1-10, further comprising:
the terminal device receives an eighth operation while displaying the second interface, the sixth interface, or the eighth interface;
and in response to the eighth operation, the terminal device displays a ninth interface on the first display screen, wherein the ninth interface comprises a first split-screen window corresponding to a first application and a second split-screen window corresponding to a second application, the first split-screen window comprises the second image, and the second split-screen window comprises an application interface of the second application.
12. The method of claim 11, wherein the method further comprises:
the terminal device receives a ninth operation on the second image in the ninth interface;
and in response to the ninth operation, the terminal device displays the first image at the position where the second image is displayed in the first split-screen window.
13. The method of claim 11 or 12, wherein the eighth interface comprises a sixth control;
the method further comprises: the terminal device receives a tenth operation on the sixth control;
and in response to the tenth operation, the terminal device displays the first image at the position where the second image is displayed in the first split-screen window.
14. The method of any of claims 1-13, wherein the second interface further comprises a seventh control;
the method further comprises: the terminal device receives an eleventh operation on the seventh control;
and in response to the eleventh operation, the terminal device displays a tenth interface on the first display screen, wherein the tenth interface comprises the first image and a third image, and the third image is an image obtained by processing the first image with parameters corresponding to the seventh control.
15. The method of any of claims 1-13, wherein the second interface further comprises the seventh control;
the method further comprises: the terminal device receives an eleventh operation on the seventh control in the second interface;
and in response to the eleventh operation, the terminal device displays an eleventh interface on the first display screen, wherein the eleventh interface comprises the second image and a third image, and the third image is an image obtained by processing the first image with parameters corresponding to the seventh control.
16. The method according to any one of claims 1-15, wherein
the terminal device determines an arrangement of the first image and the second image in the second interface based on a first value, and the first value is related to a size of the first image.
17. The method of claim 16, wherein the first value is a ratio of a length of the first image to a width of the first image;
when the first value is larger than a first threshold, the first image and the second image in the second interface are arranged one above the other, wherein the first threshold is related to the size of a preview area in the second interface and a preset spacing, and the preset spacing is a preset interval between the images;
and when the first value is smaller than or equal to the first threshold, the first image and the second image in the second interface are arranged side by side.
18. The method of claim 17, wherein the first threshold satisfies:
19. The method of any of claims 7-15, wherein the second interface further comprises a guide animation;
the guide animation is used to prompt a user on how to use the third control.
20. A terminal device, comprising: a processor and a memory;
the memory stores computer-executable instructions;
the processor executes the computer-executable instructions stored in the memory to cause the terminal device to perform the method of any one of claims 1-19.
21. A computer-readable storage medium storing a computer program which, when executed by a processor, performs the method of any one of claims 1-19.
22. A computer program product comprising a computer program which, when run, causes a computer to perform the method of any of claims 1-19.
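
For illustration only, the following Kotlin sketch models the behavior recited in claim 1: a first operation on the first control produces a second image from the first image, and the second interface presents both images together. The names (Image, FilterControl, applyFilter, onFilterControlTapped) are hypothetical and are not taken from the application; this is a minimal model of the claimed flow under those assumptions, not an implementation of it.

    // Minimal model of claim 1: a first operation on the first control produces a second
    // image from the first image, and the second interface presents both images together.
    // All names (Image, FilterControl, applyFilter, onFilterControlTapped) are hypothetical.

    data class Image(val id: String, val widthPx: Int, val heightPx: Int)

    data class FilterControl(val name: String, val params: Map<String, Float>)

    data class ComparisonInterface(val original: Image, val processed: Image)

    // Stand-in for the real image pipeline: returns a new image derived from the original
    // using the parameters carried by the tapped control.
    fun applyFilter(source: Image, control: FilterControl): Image =
        source.copy(id = "${source.id}-${control.name}")

    // The "first operation" on the "first control" builds the second interface.
    fun onFilterControlTapped(firstImage: Image, firstControl: FilterControl): ComparisonInterface {
        val secondImage = applyFilter(firstImage, firstControl)
        return ComparisonInterface(original = firstImage, processed = secondImage)
    }

    fun main() {
        val first = Image("IMG_0001", widthPx = 3000, heightPx = 4000)
        val warm = FilterControl("warm", mapOf("temperature" to 0.3f))
        val second = onFilterControlTapped(first, warm)
        println("Second interface shows ${second.original.id} and ${second.processed.id}")
    }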
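
Similarly, the display selection of claim 2 can be modeled as a comparison of the hinge angle against the two angle thresholds. The concrete threshold values, the names, and the handling of the band between the two thresholds in the sketch below are illustrative assumptions; the claim only constrains the threshold ranges and their ordering.

    // Minimal model of claim 2: the hinge angle between the two sub-screens determines the
    // fold state, and the fold state determines which display shows the user interface.

    enum class FoldState { FOLDED, UNFOLDED }
    enum class TargetDisplay { FIRST_INNER_SCREEN, SECOND_OUTER_SCREEN }

    // Returns the new fold state, or null when the angle lies between the two thresholds
    // (the claim leaves that band undefined, so a caller would keep the previous state).
    fun resolveFoldState(
        hingeAngleDegrees: Float,
        firstAngleThreshold: Float = 30f,   // illustrative value only
        secondAngleThreshold: Float = 150f  // illustrative value only
    ): FoldState? = when {
        hingeAngleDegrees < firstAngleThreshold -> FoldState.FOLDED
        hingeAngleDegrees > secondAngleThreshold -> FoldState.UNFOLDED
        else -> null
    }

    fun displayFor(state: FoldState): TargetDisplay = when (state) {
        FoldState.UNFOLDED -> TargetDisplay.FIRST_INNER_SCREEN
        FoldState.FOLDED -> TargetDisplay.SECOND_OUTER_SCREEN
    }

    fun main() {
        listOf(10f, 90f, 170f).forEach { angle ->
            val state = resolveFoldState(angle)
            val display = state?.let(::displayFor) ?: "unchanged"
            println("hinge angle $angle -> state=${state ?: "in between"}, display=$display")
        }
    }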
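
The arrangement decision of claims 16-18 can likewise be sketched as a comparison of the first value against the first threshold. The formula recited in claim 18 is not reproduced in this text, so thresholdFor() below is only a plausible stand-in that ties the threshold to the preview area size and the preset spacing; it should not be read as the claimed formula.

    // Sketch of the layout decision in claims 16-18: a first value above the threshold
    // stacks the original and the processed image one above the other, otherwise they are
    // placed side by side.

    enum class Arrangement { UP_DOWN, LEFT_RIGHT }

    data class PreviewArea(val widthPx: Int, val heightPx: Int)

    // Hypothetical threshold (NOT the claimed formula): the aspect ratio at which two images
    // stacked vertically (each widthPx wide, (heightPx - spacing) / 2 tall) still fill the
    // full preview width.
    fun thresholdFor(area: PreviewArea, spacingPx: Int): Float =
        2f * area.widthPx / (area.heightPx - spacingPx)

    fun arrangementFor(firstValue: Float, threshold: Float): Arrangement =
        if (firstValue > threshold) Arrangement.UP_DOWN else Arrangement.LEFT_RIGHT

    fun main() {
        val area = PreviewArea(widthPx = 2000, heightPx = 1600)
        val threshold = thresholdFor(area, spacingPx = 40)
        listOf(0.75f, 1.33f, 2.8f).forEach { ratio ->
            println("first value $ratio -> ${arrangementFor(ratio, threshold)} (threshold $threshold)")
        }
    }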

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN202210450191.6A CN117014543A (en) 2022-04-27 2022-04-27 Image display method and related device

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
CN202210450191.6A CN117014543A (en) 2022-04-27 2022-04-27 Image display method and related device

Publications (1)

Publication Number Publication Date
CN117014543A true CN117014543A (en) 2023-11-07

Family

ID=88567692

Family Applications (1)

Application Number Title Priority Date Filing Date
CN202210450191.6A Pending CN117014543A (en) 2022-04-27 2022-04-27 Image display method and related device

Country Status (1)

Country Link
CN (1) CN117014543A (en)


Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination