WO2020199995A1 - Picture editing method and terminal - Google Patents

Picture editing method and terminal

Info

Publication number
WO2020199995A1
WO2020199995A1 (PCT/CN2020/081030, CN2020081030W)
Authority
WO
WIPO (PCT)
Prior art keywords
image
screen
input
processing
display
Prior art date
Application number
PCT/CN2020/081030
Other languages
English (en)
French (fr)
Inventor
黎浩正
Original Assignee
维沃移动通信有限公司
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by 维沃移动通信有限公司
Publication of WO2020199995A1
Priority to US17/491,027 (US11630561B2)

Classifications

    • G06F 3/04842 - Selection of displayed objects or displayed text elements
    • G06F 1/1647 - Constructional details of portable computers; display arrangements including at least an additional display
    • G06F 3/04845 - GUI interaction techniques for image manipulation, e.g. dragging, rotation, expansion or change of colour
    • G06F 3/0486 - Drag-and-drop
    • G06F 3/04883 - GUI interaction using a touch-screen or digitiser, for inputting data by handwriting, e.g. gestures or text
    • G06F 3/1423 - Digital output to display device; controlling a plurality of local displays
    • H04M 1/72439 - Mobile telephone user interfaces with interactive means for internal management of image or video messages
    • G09G 2354/00 - Aspects of interface with display user

Definitions

  • the present disclosure relates to the field of communication technology, and in particular to a picture editing method and terminal.
  • the embodiments of the present disclosure provide a picture editing method and terminal to solve the problem of complicated operations and inconvenient operations during picture processing.
  • embodiments of the present disclosure provide a picture editing method, which is applied to a terminal.
  • the terminal includes a first screen and a second screen.
  • the picture editing method includes:
  • the third image is an image obtained by performing a second process on the first image, and the second process is a part of the process included in the first process.
  • the embodiments of the present disclosure also provide a terminal; the terminal includes a first screen and a second screen, and further includes:
  • a first receiving module, configured to receive a first input when a first image is displayed on the first screen;
  • a first display module, configured to perform first processing on the first image in response to the first input to obtain a second image, display the second image on the first screen, and display the first image and/or the third image on the second screen;
  • the third image is an image obtained by performing a second process on the first image, and the second process is a part of the process included in the first process.
  • the embodiments of the present disclosure also provide a mobile terminal, including a processor, a memory, and a computer program stored in the memory and executable on the processor; when the computer program is executed by the processor, the steps of the picture editing method described above are implemented.
  • the embodiments of the present disclosure also provide a computer-readable storage medium on which a computer program is stored; when the computer program is executed by a processor, the steps of the picture editing method described above are implemented.
  • in the embodiments of the present disclosure, when a first input is received while the first image is displayed on the first screen, the first image undergoes first processing to obtain the second image, and the second image is displayed on the first screen;
  • the second screen displays the first image and/or the third image, the third image being obtained through part of the processing that produced the second image; a single or continuous processing operation input by the user on the image in the first screen therefore causes images with different processing effects to be displayed on the second screen;
  • users can thus preview and compare different image editing effects and quickly identify the image with the best processing effect; the image preview on the second screen reduces the user's operation steps, simplifies the image editing process, and provides a better experience.
  • FIG. 1 shows a flowchart of a picture editing method in an embodiment of the present disclosure;
  • FIG. 2 shows another flowchart of the picture editing method in an embodiment of the present disclosure;
  • FIG. 3 shows yet another flowchart of the picture editing method in an embodiment of the present disclosure;
  • FIG. 4 shows still another flowchart of the picture editing method in an embodiment of the present disclosure;
  • FIG. 5 shows a schematic diagram of a folding-screen terminal in an embodiment of the present disclosure;
  • FIG. 6 shows a schematic diagram of the second screen with the image preview interface not started in an embodiment of the present disclosure;
  • FIG. 7 shows a schematic diagram of the second screen with the image preview interface started in an embodiment of the present disclosure;
  • FIG. 8 shows an operation schematic diagram of the first input in an embodiment of the present disclosure;
  • FIG. 9 shows an operation schematic diagram of the fifth input in an embodiment of the present disclosure;
  • FIG. 10 shows an operation schematic diagram of the third input in an embodiment of the present disclosure;
  • FIG. 11 shows a first structural block diagram of a terminal in an embodiment of the present disclosure;
  • FIG. 12 shows a second structural block diagram of the terminal in an embodiment of the present disclosure;
  • FIG. 13 is a schematic diagram of the hardware structure of a mobile terminal that implements various embodiments of the present disclosure.
  • the embodiment of the present disclosure discloses a picture editing method, which is applied to a terminal, and the terminal includes a first screen and a second screen.
  • the terminal in this embodiment is a folding screen terminal or a double-sided screen terminal.
  • the first screen and the second screen can be screens on the same side of the terminal, for example, the first screen and the second screen are both on the front of the terminal; they can also be screens on different sides of the terminal, for example, the first screen is the screen on the front of the terminal and the second screen is the screen on the back of the terminal.
  • the specific situation can be set according to actual needs.
  • when the terminal is a folding-screen terminal, it specifically refers to a terminal with a book-like opening and closing structure as shown in FIG. 5;
  • the terminal can be folded or opened in the direction of the arrow indicated by reference numeral 3;
  • when the terminal is opened like a book, the first screen 1 and the second screen 2 are on the same side of the terminal, and the user can observe the first screen 1 and the second screen 2 at the same time;
  • when the terminal is closed like a book, the first screen 1 and the second screen 2 are on different sides of the terminal.
  • the picture editing method includes:
  • Step 101 Receive a first input when a first image is displayed on the first screen.
  • the first image is in an image editing state in the first screen.
  • the first screen currently displays an image editing interface including the first image; in the image editing interface, the first image is in an image editing state.
  • the image editing interface may be as shown in Figures 6 and 7.
  • the image editing interface on the first screen may include: an image display area 4 and a display area 5 of image editing options.
  • the image editing options may specifically include filters with various effects, stickers with different effects, and adjustment options for parameters such as light, brightness, and contrast; the editing operations in the image editing interface are performed through these image editing options.
  • the first input can be an edit input to the first image, a drag input to the first image, a preset input on the first screen, a preset input on the second screen, or a preset input involving both the first screen and the second screen;
  • alternatively, the first input is an operation input for a pop-up box during the image editing process; no specific restriction is imposed here.
  • as an optional implementation, the first input includes at least one of the following:
  • a sliding operation of dragging the first image to one side of the first screen;
  • a first sliding operation of dragging the first image to one side of the first screen together with a second sliding operation starting from one side of the second screen and performed within the second screen, where, optionally, the one side of the first screen is the side close to the second screen and the sliding directions of the first sliding operation and the second sliding operation are the same;
  • a double-click operation on the first image;
  • a single-click operation on the first image;
  • a long-press operation on the first image, the long-press operation referring to an operation in which the first image is pressed for longer than a set duration.
  • as shown in FIG. 8, when the first screen and the second screen are on the same side of the terminal and the first input includes a sliding operation 8 (in the direction of the arrow) of dragging the first image to one side of the first screen, the one side of the first screen is the side close to the second screen; this facilitates execution of the first input and makes the operation more convenient for the user.
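  • For illustration only (not part of the patent disclosure), the following Kotlin sketch shows one way the first-input variants above could be modeled and detected; the gesture types, the shared-edge constant, and all names are assumptions introduced here.

```kotlin
// Hypothetical gesture model for the first-input variants described above.
// Every type, field, and constant here is illustrative, not taken from the patent.

enum class Screen { FIRST, SECOND }
enum class Edge { LEFT, RIGHT, TOP, BOTTOM }

data class Swipe(
    val screen: Screen,
    val startsOnImage: Boolean,   // the drag began on the displayed image
    val startEdge: Edge?,         // non-null if the swipe starts at a screen edge
    val endEdge: Edge?,           // non-null if the swipe ends at a screen edge
    val direction: Edge           // dominant direction of motion
)

// Assumed: the edge of the first screen that faces the second screen (e.g. the fold side).
val EDGE_TOWARD_SECOND_SCREEN = Edge.RIGHT

// Variant 1: drag the first image to the side of the first screen closest to the second screen.
fun isDragToSharedEdge(s: Swipe): Boolean =
    s.screen == Screen.FIRST && s.startsOnImage && s.endEdge == EDGE_TOWARD_SECOND_SCREEN

// Variant 2: the same drag on the first screen, followed by a swipe that starts at an
// edge of the second screen and moves in the same direction as the first swipe.
fun isCrossScreenDrag(first: Swipe, second: Swipe): Boolean =
    isDragToSharedEdge(first) &&
        second.screen == Screen.SECOND &&
        second.startEdge != null &&
        second.direction == first.direction
```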
  • Step 102 In response to the first input, perform first processing on the first image to obtain a second image, and display the second image on the first screen, and display the first image and/or the third image on the second screen.
  • the third image is an image obtained by performing a second process on the first image, and the second process is a part of the process included in the first process.
  • Multiple images can be displayed on the second screen at the same time.
  • the multiple images are tiled and displayed adjacently.
  • further, before the first image and/or the third image are displayed on the second screen, an image preview interface is started on the second screen;
  • before the image preview interface is started, the second screen is in the screen-off state and the image previewer of the second screen 2 is in the closed state 6, as shown in FIG. 6;
  • as shown in FIG. 7, once the previewer is started, the image previewer is in the open state 7 and an image preview interface is displayed;
  • specifically, when the system detects that the first screen displays an image editing interface containing the first image, it determines that the first image is in the image editing state; at this point, the system can pop up an inquiry prompt box asking whether the image preview interface of the second screen should be started; if the user selects the open option, the second screen automatically starts the previewer and displays an image preview interface; if the user selects the do-not-open option, the previewer of the second screen remains closed and no image preview interface is displayed.
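  • As an illustration only, the decision just described could be sketched as follows in Kotlin; the previewer class, the prompt callback, and all names are hypothetical and not drawn from the patent.

```kotlin
// Minimal sketch of the second-screen previewer start-up decision described above.
// The previewer type and the prompt callback are assumptions, not APIs from the patent.

class SecondScreenPreviewer {
    var isOpen = false
        private set
    fun open()  { isOpen = true }    // open state 7 in FIG. 7
    fun close() { isOpen = false }   // closed state 6 in FIG. 6
}

fun onEditingInterfaceDetected(
    showsFirstImage: Boolean,            // the editing interface contains the first image
    askUserToOpenPreview: () -> Boolean, // pops the inquiry prompt box, returns the choice
    previewer: SecondScreenPreviewer
) {
    if (!showsFirstImage) return         // the first image must be in the editing state
    if (askUserToOpenPreview()) previewer.open() else previewer.close()
}
```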
  • the first image is edited to obtain a processed second image.
  • the first input may be a single editing operation, or may include a group of editing operations.
  • the second image is an image of the first image that has been processed by the set of editing operations
  • the third image is the first image that has been processed by a part of the set of editing operations.
  • the second image corresponds to the first processing performed on the first image, and the first processing corresponds to the set of editing operations included in the first input;
  • the third image corresponds to the second processing performed on the first image, and the second processing corresponds to a part of the set of editing operations included in the first input;
  • there may be multiple third images, and the number of third images is proportional to the number of editing operations in the set of editing operations included in the first input.
  • for example, suppose the first input includes a filter adjustment operation, a brightness adjustment operation, and a contrast adjustment operation, and the first image is the original image;
  • the second image is then the image obtained by applying all three adjustment operations included in the first input to the first image in superposition;
  • the third image may be the image obtained by applying only the filter adjustment operation to the first image, or only the brightness adjustment operation, or only the contrast adjustment operation, or the image obtained by applying the filter adjustment operation and the brightness adjustment operation in superposition, or the brightness adjustment operation and the contrast adjustment operation in superposition, or the filter adjustment operation and the contrast adjustment operation in superposition;
  • in other words, the second processing corresponds to at least one processing procedure formed by combining different processing operations from the partial processing procedures included in the first processing; through the user's continuous processing operation input on the image in the first screen, multiple images with different effects produced during processing can be obtained and displayed on the second screen, helping the user find the image with the best processing effect.
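  • As a minimal sketch (not part of the patent disclosure), the enumeration of partial processing results described above can be expressed as generating every non-empty proper sub-combination of the editing operations in the first input; the `Image` and `EditOp` placeholders and all function names are assumptions.

```kotlin
// Illustrative sketch: producing the second image and the candidate third images
// from the editing operations contained in the first input.
// `Image` and `EditOp` are placeholder types, not the patent's data structures.

typealias Image = List<Float>            // stand-in for a pixel buffer
typealias EditOp = (Image) -> Image      // one editing operation, e.g. a filter

fun applyAll(ops: List<EditOp>, src: Image): Image =
    ops.fold(src) { img, op -> op(img) }

// Every non-empty proper sub-combination of the operations, keeping their original order.
fun partialCombinations(ops: List<EditOp>): List<List<EditOp>> =
    (1 until (1 shl ops.size) - 1).map { mask ->
        ops.filterIndexed { i, _ -> mask and (1 shl i) != 0 }
    }

fun buildPreviews(firstImage: Image, firstInputOps: List<EditOp>): Pair<Image, List<Image>> {
    val secondImage = applyAll(firstInputOps, firstImage)      // the full first processing
    val thirdImages = partialCombinations(firstInputOps)       // each partial second processing
        .map { applyAll(it, firstImage) }
    return secondImage to thirdImages
}
```

  • For the three operations in the example above, `partialCombinations` yields six partial results, matching the six third-image variants listed.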
  • in the picture editing method of this embodiment, a first input is received while the first image is displayed on the first screen, the first image undergoes first processing to obtain the second image, and the second image is displayed on the first screen;
  • the second screen displays the first image and/or the third image obtained through part of the processing that produced the second image, so a single or continuous processing operation input by the user on the image in the first screen causes images with different processing effects to be displayed on the second screen;
  • users can thus preview and compare different image editing effects and quickly identify the image with the best processing effect; the image preview on the second screen reduces the user's operation steps, simplifies the image editing process, and provides a better experience.
  • further, as an optional implementation, after the first processing is performed on the first image in response to the first input to obtain the second image, the second image is displayed on the first screen, and the first image and/or the third image are displayed on the second screen, the method further includes: receiving a fifth input, and, in response to the fifth input, determining a third target image from the second screen and deleting it;
  • after the first image and/or the third image are displayed on the second screen, at least one image is shown there, and the fifth input is used to delete a target image among the images displayed on the second screen;
  • as shown in FIG. 10, the fifth input is preferably a drag operation 10 that starts at the third target image and ends at a display edge of the second screen 2;
  • preferably, during the drag of the fifth input, the third target image moves with the sliding operation until it is deleted from the image preview interface of the second screen;
  • at the same time, when the third target image is deleted, the records of the processing steps corresponding to the third target image are also deleted; this improves convenience of operation and enhances the user experience.
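  • Purely as an illustration (names and structures assumed, not from the patent), the delete-with-record behaviour can be sketched as a registry that drops a preview entry together with its recorded steps:

```kotlin
// Hypothetical registry of second-screen preview entries; deleting the dragged
// target image also discards the processing-step record attached to it.

data class PreviewEntry(val imageId: String, val steps: List<String>)

class PreviewRegistry {
    private val entries = mutableListOf<PreviewEntry>()

    fun add(entry: PreviewEntry) { entries.add(entry) }
    fun all(): List<PreviewEntry> = entries.toList()

    // Fifth input: the target image is dragged to a display edge of the second screen.
    fun deleteTarget(target: PreviewEntry) {
        entries.remove(target)   // the image disappears from the preview interface
        // the `steps` record is discarded together with the entry
    }
}
```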
  • the embodiment of the present disclosure also discloses a picture editing method, which is applied to a terminal, and the terminal includes a first screen and a second screen.
  • the arrangement structure between the first screen and the second screen is the same as the arrangement structure in the foregoing embodiment, and will not be repeated here.
  • the picture editing method includes:
  • Step 201 Receive a first input when the first image is displayed on the first screen.
  • Step 202 In response to the first input, perform first processing on the first image to obtain a second image, and display the second image on the first screen, and display the first image and/or the third image on the second screen.
  • the third image is an image obtained by performing a second process on the first image, and the second process is a part of the process included in the first process.
  • the method further includes:
  • Step 203 Receive a second input.
  • the second input can be an edit input to the second image, a drag input to the second image, a preset input on the first screen, a preset input on the second screen, or a preset input involving both the first screen and the second screen;
  • alternatively, the second input is an operation input for a pop-up box during the image editing process; no specific restriction is imposed here.
  • as an optional implementation, the second input includes at least one of the following:
  • a sliding operation of dragging the second image to one side of the first screen;
  • a first sliding operation of dragging the second image to one side of the first screen together with a second sliding operation starting from one side of the second screen and performed within the second screen, where, optionally, the one side of the first screen is the side close to the second screen and the sliding directions of the first sliding operation and the second sliding operation are the same;
  • a double-click operation on the second image;
  • a single-click operation on the second image;
  • a long-press operation on the second image, the long-press operation referring to an operation in which the second image is pressed for longer than a set duration.
  • Step 204 In response to the second input, perform third processing on the second image to obtain a fourth image, and display the fourth image on the first screen, and display the second image and/or the fifth image on the second screen.
  • the fifth image is an image obtained by performing a fourth process on the first image, and the fourth process is a part of the process included in the first process and/or the third process.
  • preferably, while the second image and/or the fifth image are displayed on the second screen, the second screen may also display the first image and/or the third image from the foregoing process;
  • in this step, a further image processing operation is performed on the second image obtained previously to obtain a fourth image;
  • the second image displayed on the first screen is updated to the newly obtained fourth image, and the second image as it was before this processing is displayed on the second screen;
  • the fifth image can also be displayed on the second screen, or the second image and the fifth image can be displayed simultaneously;
  • the fifth image is generated on the basis of the first image; the second image, the third image, and the fourth image can all, in essence, also be regarded as generated on the basis of the first image;
  • the fourth processing corresponds to at least one processing procedure formed by combining different processing operations from the partial processing procedures included in the first processing, or at least one processing procedure formed by combining different processing operations from the partial processing procedures included in the third processing, or at least one processing procedure formed by combining different processing operations from the partial processing procedures included in the first processing with different processing operations from the partial processing procedures included in the second processing; through the user's continuous processing operation input on the image in the first screen, multiple images with different effects produced during processing can be obtained and displayed on the second screen, helping the user find the image with the best processing effect.
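  • A brief illustrative sketch (all names assumed) of how such candidate operation sequences could be enumerated from two groups of editing operations; which processing each group is taken from is left to the caller.

```kotlin
// Illustrative only: enumerate candidate operation sequences drawn from one group,
// from the other group, or from an ordered combination of parts of both groups.

fun <T> nonEmptySubsets(ops: List<T>): List<List<T>> =
    (1 until (1 shl ops.size)).map { mask ->
        ops.filterIndexed { i, _ -> mask and (1 shl i) != 0 }   // keep original order
    }

fun <T> combinedCandidates(groupA: List<T>, groupB: List<T>): List<List<T>> {
    val fromA = nonEmptySubsets(groupA)
    val fromB = nonEmptySubsets(groupB)
    val acrossBoth = fromA.flatMap { a -> fromB.map { b -> a + b } }
    return (fromA + fromB + acrossBoth).distinct()
}
```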
  • the fifth image is an image obtained by performing on the first image part of the processing included in the first processing, or part of the processing included in the third processing, or part of the processing included in the first processing and the third processing;
  • the second input may be a single editing operation or may include a group of editing operations;
  • for example, suppose the first input includes a filter adjustment operation and a brightness adjustment operation, and the second input is a group of editing operations such as a contrast adjustment operation and a sticker adding operation;
  • the fifth image may then be the image obtained by applying to the first image only the filter adjustment operation included in the first input, or only the brightness adjustment operation included in the first input, or only the contrast adjustment operation included in the second input, or only the sticker adding operation included in the second input, or the filter adjustment operation and the contrast adjustment operation in superposition, or the filter adjustment operation and the sticker adding operation in superposition, or the brightness adjustment operation and the contrast adjustment operation in superposition, or the brightness adjustment operation and the sticker adding operation in superposition;
  • the number of fifth images is proportional to the number of editing operations in the group included in the first input and to the number of editing operations in the group included in the second input.
  • in the picture editing method of this embodiment, a first input is received while the first image is displayed on the first screen, the first image undergoes first processing to obtain the second image, and the second image is displayed on the first screen;
  • the second screen displays the first image and/or the third image obtained through part of the processing that produced the second image; through subsequent user input, further editing is applied to the second image, and the second screen then displays the second image and/or the fifth image obtained through part of the corresponding image processing;
  • in this way, a single or continuous processing operation input by the user on the image in the first screen causes images with different processing effects to be displayed on the second screen; users can preview and compare different image editing effects, quickly identify the image with the best processing effect, and, through the image preview on the second screen, complete the editing with fewer operation steps and a better experience.
  • the embodiment of the present disclosure also discloses a picture editing method, which is applied to a terminal, and the terminal includes a first screen and a second screen.
  • the arrangement structure between the first screen and the second screen is the same as the arrangement structure in the foregoing embodiment, and will not be repeated here.
  • the picture editing method includes:
  • Step 301 Receive a first input when the first image is displayed on the first screen.
  • Step 302 In response to the first input, perform first processing on the first image to obtain a second image, and display the second image on the first screen, and display the first image and/or the third image on the second screen.
  • the third image is an image obtained by performing a second process on the first image, and the second process is a part of the process included in the first process.
  • further, after the second image is displayed on the first screen and the first image and/or the third image are displayed on the second screen, the method further includes:
  • Step 303 Receive a third input.
  • the fourth input may include at least one of the following:
  • a sliding operation of dragging a target image in the second screen to one side of the second screen;
  • a first sliding operation of dragging a target image in the second screen to one side of the second screen together with a second sliding operation starting from one side of the first screen and performed within the first screen; as shown in FIG. 9, the fifth input is the operation in the direction of the arrow identified by reference numeral 9;
  • a double-click operation on a target image in the second screen;
  • a single-click operation on a target image in the second screen;
  • a long-press operation on a target image in the second screen;
  • when the fourth input includes the first sliding operation of dragging the target image in the second screen to one side of the second screen and the second sliding operation starting from one side of the first screen within the first screen, the sliding direction of the first sliding operation is the same as that of the second sliding operation;
  • through the user's operation input, the target image in the second screen is selected and restored into the current image editing interface of the first screen, which makes it convenient for the user to continue editing the selected image.
  • Step 304 In response to the third input, determine the first target image from the second screen.
  • at least one image is displayed on the second screen;
  • through the user's third input, a target image is selected from the at least one image;
  • specifically, each image displayed on the second screen corresponds to the image processing steps that produced it; the image processing steps corresponding to the different images displayed on the second screen need to be recorded to form an image editing operation record.
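  • As a sketch of what such an editing operation record might look like (all types and names are assumptions introduced here, not the patent's), each preview image can carry the ordered list of steps that produced it:

```kotlin
// Hypothetical edit-operation record: every image tile shown on the second screen
// keeps the ordered processing steps that produced it, so a selected target image
// can later be restored to the corresponding editing state on the first screen.

data class ProcessingStep(val name: String, val params: Map<String, Float> = emptyMap())

data class PreviewRecord(
    val imageId: String,               // identifies the image tile on the second screen
    val steps: List<ProcessingStep>    // e.g. filter -> brightness -> contrast
)

class EditOperationLog {
    private val records = mutableMapOf<String, PreviewRecord>()
    fun record(entry: PreviewRecord) { records[entry.imageId] = entry }
    fun stepsFor(imageId: String): List<ProcessingStep> =
        records[imageId]?.steps ?: emptyList()
}
```

  • Selecting a first target image then amounts to looking up its recorded steps and re-applying them to the original first image on the first screen, which is the restoration described below.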
  • Step 305 Display the first target image and the first target processing step corresponding to the first target image on the first screen.
  • optionally, displaying the first target image on the first screen specifically includes restoring the first target image to the editing state corresponding to the first target processing step;
  • the first target processing step is also displayed on the first screen to help the user further adjust and modify the display effect of the first target image;
  • after the editing state of the first target image is restored according to the first target processing step, the user can undo editing operations on the first target image or continue with further editing operations such as overlaying effects, meeting diverse image editing needs;
  • this implementation displays images with different processing effects on the second screen, so that users can preview and compare different image editing effects and quickly identify the image with the best processing effect;
  • the image preview on the second screen reduces the user's operation steps and simplifies the image editing process, and restoring the editing state of a preview image according to the recorded editing operation steps further facilitates the user's image editing process and provides a better experience.
  • the embodiment of the present disclosure also discloses a picture editing method, which is applied to a terminal, and the terminal includes a first screen and a second screen.
  • the arrangement structure between the first screen and the second screen is the same as the arrangement structure in the foregoing embodiment, and will not be repeated here.
  • the picture editing method includes:
  • Step 401 When the first image is displayed on the first screen, a first input is received.
  • Step 402 In response to the first input, perform first processing on the first image to obtain a second image, and display the second image on the first screen, and display the first image and/or the third image on the second screen.
  • the third image is an image obtained by performing a second process on the first image, and the second process is a part of the process included in the first process.
  • further, after the first processing is performed on the first image in response to the first input to obtain the second image, the second image is displayed on the first screen, and the first image and/or the third image are displayed on the second screen, the method further includes:
  • Step 403 Receive the fourth input.
  • the third input may be a click selection operation on the image displayed on the second screen, or a touch input operation on the image selection button.
  • Step 404 In response to the fourth input, determine a second target image from the second screen.
  • At least one image is displayed on the second screen.
  • through the user's fourth input, a target image is selected from the at least one image;
  • specifically, each image displayed on the second screen corresponds to the image processing steps that produced it; these steps need to be recorded to form an image editing operation record.
  • Step 405 Display the second target processing step corresponding to the second target image in a set area of the second screen.
  • optionally, the display interface of the second screen is divided into different display areas, where the second target image is displayed in a first area and the second target processing step corresponding to the second target image is displayed in a second area;
  • displaying the processing steps alongside the image lets the user, when comparing editing effects, see the editing operation steps that produced the best effect and apply them to the editing of other images, which facilitates the user's image editing process and provides a better experience;
  • this implementation displays images with different processing effects on the second screen, so that users can preview and compare different image editing effects and quickly identify the image with the best processing effect;
  • the image preview on the second screen reduces the user's operation steps and simplifies the image editing process, and recording and displaying the historical editing operations of a preview image on the second screen further facilitates the user's image editing process and provides a better experience.
  • the embodiment of the present disclosure also discloses a terminal.
  • the terminal includes a first screen and a second screen, and the terminal further includes: a first receiving module 501 and a first display module 502.
  • the first receiving module 501 is configured to receive a first input when the first image is displayed on the first screen.
  • the first display module 502 is configured to perform first processing on the first image in response to the first input to obtain a second image, display the second image on the first screen, and display the first image and/or the third image on the second screen; the third image is an image obtained by performing a second process on the first image, and the second process is a part of the process included in the first process.
  • the terminal further includes:
  • the second receiving module 503 is configured to receive a second input
  • the second display module 504 is configured to perform third processing on the second image in response to the second input to obtain a fourth image, display the fourth image on the first screen, and display the second image and/or the fifth image on the second screen;
  • the fifth image is an image obtained by performing a fourth process on the first image, and the fourth process is a part of the process included in the first process and/or the third process.
  • optionally, the first input includes at least one of the following: a sliding operation of dragging the first image to one side of the first screen; a first sliding operation of dragging the first image to one side of the first screen together with a second sliding operation starting from one side of the second screen within the second screen; a double-click operation on the first image; a single-click operation on the first image; or a long-press operation on the first image.
  • the terminal further includes:
  • the third receiving module 505 is configured to receive a third input
  • the first determining module 506 is configured to determine a first target image from the second screen in response to the third input;
  • the third display module 507 is configured to display the first target image and the first target processing step corresponding to the first target image on the first screen.
  • the terminal further includes:
  • the fourth receiving module 508 is configured to receive the fourth input
  • the second determining module 509 is configured to determine a second target image from the second screen in response to the fourth input;
  • the fourth display module 510 is configured to display the second target processing step corresponding to the second target image in a setting area in the second screen.
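  • Purely as an illustrative sketch (the interface names, signatures, and placeholder types are assumptions, not the patent's API), the receiving, display, and determining modules listed above could be expressed as follows:

```kotlin
// Hypothetical module interfaces mirroring the functional modules described above.

class Image        // placeholder for an image object
class UserInput    // placeholder for a received input event

interface ReceivingModule   { fun receive(input: UserInput) }                      // modules 501/503/505/508
interface DisplayModule {
    fun showOnFirstScreen(image: Image)                    // e.g. the second or fourth image
    fun showOnSecondScreen(images: List<Image>)            // e.g. the first/third or second/fifth images
}
interface DeterminingModule { fun determineTarget(previews: List<Image>): Image }  // modules 506/509

// A terminal wires the modules together; the processing itself is omitted here.
class Terminal(
    val receiver: ReceivingModule,
    val display: DisplayModule,
    val targetSelector: DeterminingModule
)
```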
  • the terminal receives the first input while the first image is displayed on the first screen, performs first processing on the first image to obtain the second image, displays the second image on the first screen, and displays on the second screen the first image and/or the third image obtained through part of the processing that produced the second image;
  • a single or continuous processing operation input by the user on the image in the first screen therefore causes images with different processing effects to be displayed on the second screen, so that users can preview and compare different image editing effects and quickly identify the image with the best processing effect;
  • the image preview on the second screen reduces the user's operation steps, simplifies the image editing process, and provides a better experience.
  • the mobile terminal provided by the embodiment of the present disclosure can implement the various processes of the embodiment of the above-mentioned picture editing method, and can achieve the same technical effect. In order to avoid repetition, details are not repeated here.
  • FIG. 13 is a schematic diagram of the hardware structure of a mobile terminal that implements various embodiments of the present disclosure.
  • the mobile terminal 900 includes, but is not limited to: a radio frequency unit 901, a network module 902, an audio output unit 903, an input unit 904, a sensor 905, a display unit 906, a user input unit 907, an interface unit 908, a memory 909, a processor 910, a power supply 911, and other components;
  • the mobile terminal may include more or fewer components than those shown in the figure, combine certain components, or arrange the components differently.
  • mobile terminals include, but are not limited to, mobile phones, tablet computers, notebook computers, palmtop computers, vehicle-mounted terminals, wearable devices, and pedometers.
  • the mobile terminal 900 includes a first screen and a second screen.
  • the user input unit 907 is configured to receive a first input when a first image is displayed on the first screen; and the processor 910 is configured to respond to the First input, perform first processing on the first image to obtain a second image, and display the second image on the first screen, and display the first image and/or the first image on the second screen Three images; wherein the third image is an image obtained by performing a second process on the first image, and the second process is a part of the process included in the first process.
  • in the embodiment of the present disclosure, the mobile terminal receives the first input while the first image is displayed on the first screen, performs first processing on the first image to obtain the second image, displays the second image on the first screen, and displays images with different processing effects on the second screen;
  • users can thus preview and compare different image editing effects and quickly identify the image with the best processing effect; the image preview on the second screen reduces the user's operation and processing steps, simplifies the image editing process, and provides a better experience.
  • the radio frequency unit 901 can be used for receiving and sending signals during information transmission and reception or during a call; specifically, downlink data from the base station is received and then processed by the processor 910, and uplink data is sent to the base station.
  • the radio frequency unit 901 includes but is not limited to an antenna, at least one amplifier, a transceiver, a coupler, a low noise amplifier, a duplexer, and the like.
  • the radio frequency unit 901 can also communicate with the network and other devices through a wireless communication system.
  • the mobile terminal provides users with wireless broadband Internet access through the network module 902, such as helping users to send and receive emails, browse web pages, and access streaming media.
  • the audio output unit 903 can convert the audio data received by the radio frequency unit 901 or the network module 902 or stored in the memory 909 into an audio signal and output it as sound. Moreover, the audio output unit 903 may also provide audio output related to a specific function performed by the mobile terminal 900 (for example, call signal reception sound, message reception sound, etc.).
  • the audio output unit 903 includes a speaker, a buzzer, a receiver, and the like.
  • the input unit 904 is used to receive audio or video signals.
  • the input unit 904 may include a graphics processing unit (GPU) 9041 and a microphone 9042.
  • the graphics processor 9041 processes image data of still pictures or video obtained by an image capture device (such as a camera) in video capture mode or image capture mode.
  • the processed image frame may be displayed on the display unit 906.
  • the image frames processed by the graphics processor 9041 may be stored in the memory 909 (or other storage medium) or sent via the radio frequency unit 901 or the network module 902.
  • the microphone 9042 can receive sound, and can process such sound into audio data.
  • the processed audio data can be converted into a format that can be sent to the mobile communication base station via the radio frequency unit 901 for output in the case of a telephone call mode.
  • the mobile terminal 900 also includes at least one sensor 905, such as a light sensor, a motion sensor, and other sensors.
  • the light sensor includes an ambient light sensor and a proximity sensor.
  • the ambient light sensor can adjust the brightness of the display panel 9061 according to the brightness of the ambient light.
  • the proximity sensor can turn off the display panel 9061 and/or the backlight when the mobile terminal 900 is moved close to the ear;
  • as a type of motion sensor, the accelerometer sensor can detect the magnitude of acceleration in various directions (usually three axes), and can detect the magnitude and direction of gravity when stationary; it can be used to identify the posture of the mobile terminal (such as horizontal/vertical screen switching, related games, and magnetometer attitude calibration) and for vibration-recognition functions (such as a pedometer or tapping); the sensor 905 may also include a fingerprint sensor, a pressure sensor, an iris sensor, a molecular sensor, a gyroscope, a barometer, a hygrometer, a thermometer, an infrared sensor, and the like, which are not repeated here.
  • the display unit 906 is used to display information input by the user or information provided to the user.
  • the display unit 906 may include a display panel 9061, and the display panel 9061 may be configured in the form of a liquid crystal display (LCD), an organic light-emitting diode (OLED), etc.
  • the user input unit 907 may be used to receive inputted numeric or character information, and generate key signal input related to user settings and function control of the mobile terminal.
  • the user input unit 907 includes a touch panel 9071 and other input devices 9072.
  • the touch panel 9071, also called a touch screen, can collect the user's touch operations on or near it (for example, operations performed by the user on or near the touch panel 9071 with a finger, a stylus, or any other suitable object or accessory).
  • the touch panel 9071 may include two parts: a touch detection device and a touch controller.
  • the touch detection device detects the user's touch position, detects the signal generated by the touch operation, and transmits the signal to the touch controller; the touch controller receives the touch information from the touch detection device, converts it into contact coordinates, sends them to the processor 910, and receives and executes the commands sent by the processor 910.
  • the touch panel 9071 can be implemented in multiple types such as resistive, capacitive, infrared, and surface acoustic wave.
  • the user input unit 907 may also include other input devices 9072.
  • other input devices 9072 may include, but are not limited to, a physical keyboard, function keys (such as volume control buttons, switch buttons, etc.), trackball, mouse, and joystick, which will not be repeated here.
  • further, the touch panel 9071 can cover the display panel 9061;
  • when the touch panel 9071 detects a touch operation on or near it, it transmits the operation to the processor 910 to determine the type of the touch event, and the processor 910 then provides the corresponding visual output on the display panel 9061 according to the type of the touch event;
  • although the touch panel 9071 and the display panel 9061 are used here as two independent components to implement the input and output functions of the mobile terminal, in some embodiments the touch panel 9071 and the display panel 9061 can be integrated to implement the input and output functions of the mobile terminal, which is not specifically limited here.
  • the interface unit 908 is an interface for connecting an external device with the mobile terminal 900.
  • the external device may include a wired or wireless headset port, an external power source (or battery charger) port, a wired or wireless data port, a memory card port, a port for connecting a device with an identification module, audio input/output (I/O) port, video I/O port, headphone port, etc.
  • the interface unit 908 can be used to receive input (for example, data information or power) from an external device and transmit the received input to one or more elements within the mobile terminal 900, or can be used to transfer data between the mobile terminal 900 and an external device.
  • the memory 909 can be used to store software programs and various data.
  • the memory 909 may mainly include a storage program area and a storage data area.
  • the storage program area may store an operating system, an application program required by at least one function (such as a sound playback function or an image playback function), and the like; the storage data area may store data created according to the use of the mobile phone (such as audio data or a phone book), and the like.
  • the memory 909 may include a high-speed random access memory, and may also include a non-volatile memory, such as at least one magnetic disk storage device, a flash memory device, or other volatile solid-state storage devices.
  • the processor 910 is the control center of the mobile terminal; it connects the various parts of the entire mobile terminal through various interfaces and lines, and performs the various functions of the mobile terminal and processes data by running or executing the software programs and/or modules stored in the memory 909 and calling the data stored in the memory 909, so as to monitor the mobile terminal as a whole.
  • the processor 910 may include one or more processing units; preferably, the processor 910 may integrate an application processor and a modem processor, where the application processor mainly handles the operating system, the user interface, application programs, and the like, and the modem processor mainly handles wireless communication; it can be understood that the modem processor may also not be integrated into the processor 910.
  • the mobile terminal 900 may also include a power supply 911 (such as a battery) for supplying power to the various components;
  • preferably, the power supply 911 may be logically connected to the processor 910 through a power management system, so that functions such as charging, discharging, and power consumption management are handled through the power management system.
  • the mobile terminal 900 includes some functional modules not shown, which will not be repeated here.
  • preferably, the embodiment of the present disclosure further provides a mobile terminal, including a processor 910, a memory 909, and a computer program stored in the memory 909 and executable on the processor 910; when the computer program is executed by the processor 910, each process of the foregoing picture editing method embodiments is implemented and the same technical effect can be achieved, which is not repeated here to avoid repetition.
  • the embodiments of the present disclosure also provide a computer-readable storage medium on which a computer program is stored; when the computer program is executed by a processor, each process of the foregoing picture editing method embodiments is implemented and the same technical effect can be achieved, which is not repeated here to avoid repetition.
  • the computer readable storage medium such as read-only memory (Read-Only Memory, ROM for short), random access memory (Random Access Memory, RAM for short), magnetic disk or optical disk, etc.
  • the methods of the above embodiments can be implemented by means of software plus the necessary general hardware platform; they can of course also be implemented by hardware, but in many cases the former is the better implementation;
  • the technical solution of the present disclosure, in essence or in the part that contributes to the related technology, can be embodied in the form of a software product; the computer software product is stored in a storage medium (such as a ROM/RAM, a magnetic disk, or an optical disk) and includes several instructions that cause a terminal (which may be a mobile phone, a computer, a server, an air conditioner, a network device, or the like) to execute the methods described in the embodiments of the present disclosure.

Abstract

The present disclosure provides a picture editing method and a terminal. The terminal includes a first screen and a second screen. The picture editing method includes: receiving a first input when a first image is displayed on the first screen; in response to the first input, performing first processing on the first image to obtain a second image, displaying the second image on the first screen, and displaying the first image and/or a third image on the second screen; where the third image is an image obtained by performing second processing on the first image, and the second processing is a part of the processing included in the first processing.

Description

Picture editing method and terminal
Cross-Reference to Related Application
This application claims priority to Chinese Patent Application No. 201910257242.1, filed in China on April 1, 2019, the entire contents of which are incorporated herein by reference.
Technical Field
The present disclosure relates to the field of communication technology, and in particular to a picture editing method and a terminal.
Background
With the widespread use of mobile terminals such as mobile phones, mobile phones provide more and more functions, among which the picture editing function is widely used for photo processing.
In related-art picture processing, several picture editing operations are usually applied to a photo before it is saved; the picture editing process then ends and the saved photo is stored in the photo album. When a picture with a desired effect needs to be obtained, the above process often has to be repeated several times, after which the album is opened and the pictures are swiped left and right to view the different processing effects. The operation is tedious and the picture editing process is very inconvenient.
Summary
The embodiments of the present disclosure provide a picture editing method and a terminal to solve the problem that the operation during picture processing is tedious and inconvenient.
To solve the above technical problem, the present disclosure is implemented as follows:
In a first aspect, an embodiment of the present disclosure provides a picture editing method applied to a terminal, where the terminal includes a first screen and a second screen; the picture editing method includes:
receiving a first input when a first image is displayed on the first screen;
in response to the first input, performing first processing on the first image to obtain a second image, displaying the second image on the first screen, and displaying the first image and/or a third image on the second screen;
where the third image is an image obtained by performing second processing on the first image, and the second processing is a part of the processing included in the first processing.
In a second aspect, an embodiment of the present disclosure further provides a terminal, where the terminal includes a first screen and a second screen, and the terminal further includes:
a first receiving module, configured to receive a first input when a first image is displayed on the first screen;
a first display module, configured to perform first processing on the first image in response to the first input to obtain a second image, display the second image on the first screen, and display the first image and/or a third image on the second screen;
where the third image is an image obtained by performing second processing on the first image, and the second processing is a part of the processing included in the first processing.
In a third aspect, an embodiment of the present disclosure further provides a mobile terminal, including a processor, a memory, and a computer program stored in the memory and executable on the processor; when the computer program is executed by the processor, the steps of the picture editing method described above are implemented.
In a fourth aspect, an embodiment of the present disclosure further provides a computer-readable storage medium on which a computer program is stored; when the computer program is executed by a processor, the steps of the picture editing method described above are implemented.
In the embodiments of the present disclosure, a first input is received while the first image is displayed on the first screen, first processing is performed on the first image to obtain the second image, the second image is displayed on the first screen, and the first image and/or the third image obtained through part of the processing corresponding to the second image is displayed on the second screen. A single or continuous processing operation input by the user on the image in the first screen thus causes images with different processing effects to be displayed on the second screen, so that the user can preview and compare different image editing effects and quickly identify the image with the best processing effect. The image preview on the second screen reduces the user's operation steps, simplifies the image editing process, and provides the user with a better experience.
Brief Description of the Drawings
FIG. 1 shows a flowchart of a picture editing method in an embodiment of the present disclosure;
FIG. 2 shows another flowchart of the picture editing method in an embodiment of the present disclosure;
FIG. 3 shows yet another flowchart of the picture editing method in an embodiment of the present disclosure;
FIG. 4 shows still another flowchart of the picture editing method in an embodiment of the present disclosure;
FIG. 5 shows a schematic diagram of a folding-screen terminal in an embodiment of the present disclosure;
FIG. 6 shows a schematic diagram of the second screen with the image preview interface not started in an embodiment of the present disclosure;
FIG. 7 shows a schematic diagram of the second screen with the image preview interface started in an embodiment of the present disclosure;
FIG. 8 shows an operation schematic diagram of the first input in an embodiment of the present disclosure;
FIG. 9 shows an operation schematic diagram of the fifth input in an embodiment of the present disclosure;
FIG. 10 shows an operation schematic diagram of the third input in an embodiment of the present disclosure;
FIG. 11 shows a first structural block diagram of a terminal in an embodiment of the present disclosure;
FIG. 12 shows a second structural block diagram of the terminal in an embodiment of the present disclosure;
FIG. 13 is a schematic diagram of the hardware structure of a mobile terminal that implements various embodiments of the present disclosure.
Detailed Description
The technical solutions in the embodiments of the present disclosure will be described clearly and completely below with reference to the drawings in the embodiments of the present disclosure. Obviously, the described embodiments are only some rather than all of the embodiments of the present disclosure. All other embodiments obtained by those of ordinary skill in the art based on the embodiments of the present disclosure without creative effort fall within the protection scope of the present disclosure.
An embodiment of the present disclosure discloses a picture editing method applied to a terminal, where the terminal includes a first screen and a second screen.
Optionally, the terminal in this embodiment is a folding-screen terminal or a double-sided-screen terminal. The first screen and the second screen can be screens on the same side of the terminal, for example, the first screen and the second screen are both on the front of the terminal; they can also be screens on different sides of the terminal, for example, the first screen is on the front of the terminal and the second screen is on the back of the terminal. The specific arrangement can be set according to actual needs.
Preferably, when the terminal is a folding-screen terminal, it specifically refers to a terminal with a book-like opening and closing structure as shown in FIG. 5. The terminal can be folded or opened in the direction of the arrow indicated by reference numeral 3. When the terminal is opened like a book, the first screen 1 and the second screen 2 are on the same side of the terminal, and the user can observe the first screen 1 and the second screen 2 at the same time. When the terminal is closed like a book, the first screen 1 and the second screen 2 are on different sides of the terminal.
Specifically, as shown in FIG. 1, the picture editing method includes:
Step 101: when a first image is displayed on the first screen, receive a first input.
The first image is in an image editing state on the first screen; the first screen currently displays an image editing interface that includes the first image, and in this image editing interface the first image is in the image editing state.
As shown in FIG. 6 and FIG. 7, the image editing interface on the first screen may include an image display area 4 and a display area 5 for image editing options. The image editing options may specifically include filters with various effects, stickers with different effects, and adjustment options for parameters such as light, brightness, and contrast. The editing operations on the image in the image editing interface are performed through these image editing options.
The first input can be an edit input to the first image, a drag input to the first image, a preset input on the first screen, a preset input on the second screen, or a preset input involving both the first screen and the second screen. Alternatively, the first input is an operation input for a pop-up box during the image editing process. No specific restriction is imposed here.
As an optional implementation, the first input includes at least one of the following:
a sliding operation of dragging the first image to one side of the first screen;
a first sliding operation of dragging the first image to one side of the first screen together with a second sliding operation starting from one side of the second screen and performed within the second screen, where, optionally, the one side of the first screen is the side close to the second screen and the sliding directions of the first sliding operation and the second sliding operation are the same;
a double-click operation on the first image;
a single-click operation on the first image;
a long-press operation on the first image, where the long-press operation refers to an operation in which the first image is pressed for longer than a set duration.
As shown in FIG. 8, when the first screen and the second screen are on the same side of the terminal and the first input includes a sliding operation 8 (in the direction of the arrow) of dragging the first image to one side of the first screen, the one side of the first screen is the side close to the second screen. This facilitates execution of the first input and makes the operation more convenient for the user.
The different operations above can be configured in specific applications to suit different scenarios or different user editing inputs, and are not limited to the examples given here.
Step 102: in response to the first input, perform first processing on the first image to obtain a second image, display the second image on the first screen, and display the first image and/or a third image on the second screen.
The third image is an image obtained by performing second processing on the first image, and the second processing is a part of the processing included in the first processing.
Multiple images can be displayed on the second screen at the same time; preferably, the multiple images are tiled and displayed adjacent to one another.
Further, before the first image and/or the third image are displayed on the second screen, an image preview interface is started on the second screen.
Preferably, before the image preview interface is started on the second screen, the second screen is in the screen-off state and the image previewer of the second screen 2 is in the closed state 6, as shown in FIG. 6. As shown in FIG. 7, when the system detects that the first screen displays the image editing interface and the first image is shown in the editing interface, it determines that the first screen is in the picture editing state, and at this point it controls the second screen 2 to automatically start the previewer; the image previewer is then in the open state 7 and an image preview interface is displayed.
Specifically, when the system detects that the first screen displays the image editing interface and the image editing interface includes the first image, it determines that the first image is in the image editing state. At this point, the system can pop up an inquiry prompt box asking whether the image preview interface of the second screen should be started. If the user selects the open option, the second screen automatically starts the previewer and displays an image preview interface; if the user selects the do-not-open option, the previewer of the second screen remains closed and no image preview interface is displayed.
After the first input is received, the first image is edited to obtain the processed second image.
The first input may be a single editing operation or may include a group of editing operations.
When the first input includes a group of editing operations, the second image is the first image after being processed by the whole group of editing operations, and the third image is the first image after being processed by a part of that group of editing operations. The second image corresponds to the first processing performed on the first image, and the first processing corresponds to the group of editing operations included in the first input; the third image corresponds to the second processing performed on the first image, and the second processing corresponds to a part of the group of editing operations included in the first input. There may be multiple third images, and the number of third images is proportional to the number of editing operations in the group of editing operations included in the first input.
For example, suppose the first input includes a filter adjustment operation, a brightness adjustment operation, and a contrast adjustment operation, and the first image is the original image. The second image is the image obtained by applying all three image adjustment operations included in the first input to the first image in superposition. The third image may be the image obtained by applying only the filter adjustment operation included in the first input to the first image, or only the brightness adjustment operation, or only the contrast adjustment operation, or the image obtained by applying the filter adjustment operation and the brightness adjustment operation in superposition, or the brightness adjustment operation and the contrast adjustment operation in superposition, or the filter adjustment operation and the contrast adjustment operation in superposition.
The second processing corresponds to at least one processing procedure formed by combining different processing operations from the partial processing procedures included in the first processing. Through the user's continuous processing operation input on the image in the first screen, multiple images with different effects produced during processing can be obtained and displayed on the second screen, so that the image with the best processing effect can be found.
本实施例中的该图片编辑方法,通过接收在第一屏显示第一图像的情况下的第一输入,对第一图像进行第一处理,得到第二图像,并在第一屏显示第二图像,在第二屏显示第一图像和/或经与第二图像对应的图像处理中的部分处理过程处理得到的第三图像,实现可以通过用户对第一屏中图像的单个或连续处理操作输入,实现在第二屏中将产生的不同处理效果的图像进行效果展示,以便于用户预览并比较不同的图像编辑效果,辅助用户快速识别最优处理效果的图像,通过第二屏的图像预览,减少用户的操作处理步骤,方便用户的图像编辑处理过程,为用户提供更优体验。
进一步地,作为一优选的实施方式,其中,响应于所述第一输入,对所述第一图像进行第一处理,得到第二图像,并在所述第一屏显示所述第二图像,在所述第二屏显示所述第一图像和/或第三图像之后,还包括:
接收第五输入;响应于所述第五输入,从第二屏中确定第三目标图像并删除。
其中，在将第一图像和/或第三图像显示于第二屏中之后，第二屏中显示的图像为至少一个。该第五输入用于删除第二屏中显示的至少一个图像中的目标图像。结合图10所示，该第五输入优选为以第三目标图像处为起点位置，以第二屏2的一个显示边缘为终点位置的拖动操作10。优选地，在该第五输入的拖动操作过程中，第三目标图像随拖动操作的滑动而移动，直至从第二屏的图像预览界面中删除。同时，在删除第三目标图像时，删除第三目标图像对应的处理步骤的相关记录，从而提升操作的便捷度，提升用户体验。
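下面给出一个示意性的Python代码片段，用于说明“将第二屏预览的图像与其对应处理步骤记录成对维护、删除图像的同时删除相关记录”的一种可能思路；其中的类名与数据结构均为示例性假设，并非本公开方案的实际实现。

```python
# 示意性代码：第五输入将第三目标图像从第二屏预览界面删除，同时删除其对应的处理步骤记录。
# PreviewPane 为假设的数据结构，仅用于说明"图像与处理步骤记录成对维护"的思路。

class PreviewPane:
    def __init__(self):
        self.images = {}        # image_id -> 图像数据
        self.step_records = {}  # image_id -> 形成该图像的处理步骤列表

    def add(self, image_id, image, steps):
        self.images[image_id] = image
        self.step_records[image_id] = list(steps)

    def delete(self, image_id):
        """删除目标图像的同时，删除其对应处理步骤的相关记录。"""
        self.images.pop(image_id, None)
        self.step_records.pop(image_id, None)


if __name__ == "__main__":
    pane = PreviewPane()
    pane.add("img_3", image="...", steps=["滤镜调整", "明暗调整"])
    pane.delete("img_3")                      # 响应第五输入的拖动操作
    print(pane.images, pane.step_records)     # 两个记录均已清空
```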
本公开实施例中还公开一种图片编辑方法,应用于终端,所述终端包括第一屏及第二屏。该第一屏与第二屏之间的设置结构与前述实施例中的设置结构相同,此处不再赘述。
结合图2所示,该图片编辑方法包括:
步骤201,在第一屏显示第一图像的情况下,接收第一输入。
该过程的实施方式与前述实施例中的相同，此处不再赘述。
步骤202,响应于第一输入,对第一图像进行第一处理,得到第二图像,并在第一屏显示第二图像,在第二屏显示第一图像和/或第三图像。
其中,所述第三图像为对所述第一图像进行第二处理得到的图像,所述第二处理为所述第一处理所包含的部分处理过程。
该过程的实施方式与前述实施例中的相同，此处不再赘述。
进一步地,在所述第一屏显示所述第二图像,在所述第二屏显示所述第一图像和/或第三图像之后,还包括:
步骤203,接收第二输入。
该第二输入可以是对第二图像的编辑输入,或者是对第二图像的拖拽输入,或者是发生在第一屏中的预设输入,或者是发生在第二屏中的预设输入,或者是同时涉及第一屏及第二屏的预设输入。或者,该第二输入为针对图像编辑过程中的弹出框的操作输入。此处不作具体限制。
作为一优选的实施方式,其中,第二输入包括以下至少一项:
将所述第二图像拖动至所述第一屏的一个侧边的滑动操作;
将所述第二图像拖动至所述第一屏的一个侧边的第一滑动操作及由所述第二屏的一个侧边开始在所述第二屏内的第二滑动操作。可选地,第一屏中的该一个侧边为靠近第二屏的侧边,该第一滑动操作与第二滑动操作的滑动方向相同。
对所述第二图像的双击操作;
对所述第二图像的单击操作;
对所述第二图像的长按操作。该长按操作具体指对第二图像的按压时间超出设定时长的操作。
上述不同的操作可以在具体应用过程中进行具体设置,适应不同场景的需要或不同的用户编辑输入进行变换,此处不以此为限。
步骤204,响应于第二输入,对第二图像进行第三处理,得到第四图像,并在第一屏显示第四图像,在第二屏显示第二图像和/或第五图像。
其中,所述第五图像为对所述第一图像进行第四处理得到的图像,所述第四处理为所述第一处理和/或所述第三处理所包含的部分处理过程。
优选地,在第二屏显示第二图像和/或第五图像的同时,第二屏中可显示有前述过程中的第一图像和/或第三图像。
该步骤中,基于得到的第二图像进一步地实施图像处理操作,得到第四图像,将第一屏中显示的第二图像更新为显示最新得到的第四图像,将处理之前的第二图像显示于第二屏,也可以在第二屏中显示第五图像,或同时显示第二图像和第五图像。
该第五图像是基于第一图像生成的；第二图像、第三图像、第四图像实质上也均可视为基于第一图像生成。
第四处理对应于所述第一处理所包含的部分处理过程中不同处理操作排列组合后形成的至少一个处理过程，或者对应于所述第三处理所包含的部分处理过程中不同处理操作排列组合后形成的至少一个处理过程，或者对应于所述第一处理所包含的部分处理过程中不同处理操作及所述第三处理所包含的部分处理过程中不同处理操作之间排列组合后形成的至少一个处理过程。由此，用户通过对第一屏中图像的连续处理操作输入，即可得到处理过程中产生的不同效果的多张图像，并在第二屏中进行效果展示，便于找到处理效果最优的图像。
第五图像为对第一图像执行第一处理所包含的部分处理过程后得到的图像，或者第五图像为对第一图像执行第三处理所包含的部分处理过程后得到的图像，或者第五图像为对第一图像执行第一处理及第三处理中所包含的部分处理过程后得到的图像。
其中,第二输入可以是单独的一个编辑操作,也可以是包括一组编辑操作。
举例说明：假设第一输入中包括滤镜调整操作、明暗调整操作，第二输入是一组编辑操作，例如对比度调整操作、贴纸添加操作。第五图像则可以是对第一图像进行该第一输入中所包含的滤镜调整操作处理后得到的图像，或者是对第一图像进行该第一输入中所包含的明暗调整操作处理后得到的图像，或者是对第一图像进行该第二输入中所包含的对比度调整操作处理后得到的图像，或者是对第一图像进行该第二输入中所包含的贴纸添加操作处理后得到的图像，或者是对第一图像进行滤镜调整操作及对比度调整操作叠加处理后得到的图像，或者是对第一图像进行滤镜调整操作及贴纸添加操作叠加处理后得到的图像，或者是对第一图像进行明暗调整操作及对比度调整操作叠加处理后得到的图像，或者是对第一图像进行明暗调整操作及贴纸添加操作叠加处理后得到的图像。
该第五图像的数量与第一输入中所包括的一组编辑操作中编辑操作的具体数量及第二输入中所包括的一组编辑操作中编辑操作的具体数量均成正比。
由此，用户通过对第一屏中图像的连续处理操作输入，即可得到处理过程中产生的不同效果的多张图像，并在第二屏中进行效果展示，便于找到处理效果最优的图像。
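下面给出一个示意性的Python代码片段，用于说明第四处理的候选集合可如何由第一输入与第二输入所含操作的排列组合得到；具体在第二屏中展示哪些组合属于实现上的设计选择，此处仅作假设性枚举，并非本公开方案的实际实现。

```python
# 示意性代码：第四处理的候选集合可视为第一输入与第二输入所含操作构成的序列的非空真子集（保持原顺序）。
# 终端可从中选取若干组合对第一图像叠加处理得到第五图像。
from itertools import combinations

def candidate_partial_processes(first_input_ops, second_input_ops):
    """枚举由两组输入操作组成的序列的非空真子集，作为第四处理的候选组合。"""
    ops = list(first_input_ops) + list(second_input_ops)
    candidates = []
    for r in range(1, len(ops)):                 # 非空真子集
        candidates.extend(combinations(ops, r))
    return candidates


if __name__ == "__main__":
    cands = candidate_partial_processes(["滤镜调整", "明暗调整"], ["对比度调整", "贴纸添加"])
    print(len(cands))   # 14 个候选组合；正文示例中展示的是其中按设计选取的若干种
```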
本实施例中的该图片编辑方法，通过接收在第一屏显示第一图像的情况下的第一输入，对第一图像进行第一处理，得到第二图像，并在第一屏显示第二图像，在第二屏显示第一图像和/或经与第二图像对应的图像处理中的部分处理过程处理得到的第三图像，及通过后续的用户输入，对第二图像进行更多的编辑处理，并在第二屏显示第二图像和/或经与第二图像对应的图像处理中的部分处理过程处理得到的第五图像，使得用户通过对第一屏中图像的单个或连续处理操作输入，即可在第二屏中对产生的不同处理效果的图像进行效果展示，便于用户预览并比较不同的图像编辑效果，辅助用户快速识别处理效果最优的图像；通过第二屏的图像预览，减少用户的操作处理步骤，简化用户的图像编辑处理过程，为用户提供更优体验。
本公开实施例中还公开一种图片编辑方法,应用于终端,所述终端包括第一屏及第二屏。该第一屏与第二屏之间的设置结构与前述实施例中的设置结构相同,此处不再赘述。
结合图3所示,该图片编辑方法包括:
步骤301,在第一屏显示第一图像的情况下,接收第一输入。
该过程的实施方式与前述实施例中的相同，此处不再赘述。
步骤302,响应于第一输入,对第一图像进行第一处理,得到第二图像,并在第一屏显示第二图像,在第二屏显示第一图像和/或第三图像。
其中,所述第三图像为对所述第一图像进行第二处理得到的图像,所述第二处理为所述第一处理所包含的部分处理过程。
该过程的实施方式与前述实施例中的相同，此处不再赘述。
进一步地,在响应于所述第一输入,对所述第一图像进行第一处理,得到第二图像,并在所述第一屏显示所述第二图像,在所述第二屏显示所述第一图像和/或第三图像之后,所述方法还包括:
步骤303,接收第三输入。
该第三输入可以包括以下至少一项：
将第二屏中目标图像拖动至所述第二屏的一个侧边的滑动操作;
将第二屏中目标图像拖动至所述第二屏的一个侧边的第一滑动操作及由所述第一屏的一个侧边开始在所述第一屏内的第二滑动操作；如图9所示，该第三输入为标号9标识的箭头方向的操作。
对第二屏中目标图像的双击操作;
对第二屏中目标图像的单击操作;
对第二屏中目标图像的长按操作。
当第三输入包括将第二屏中目标图像拖动至所述第二屏的一个侧边的第一滑动操作及由所述第一屏的一个侧边开始在所述第一屏内的第二滑动操作时，该第一滑动操作的滑动方向与所述第二滑动操作的滑动方向相同。
通过用户的操作输入，实现对第二屏中目标图像的选定，并触发将其恢复到第一屏的当前图像编辑界面中，便于用户对选中的图像继续进行编辑操作。
步骤304,响应于所述第三输入,从第二屏中确定第一目标图像。
第二屏中显示有至少一张图像。通过用户的第三输入,从该至少一张图像中选取一张目标图像。具体地,第二屏中不同显示图像对应有形成各自图像本身的图像处理步骤。需要记录第二屏中所显示的不同图像所对应的图像处理步骤,形成图像的编辑操作记录。
步骤305,将第一目标图像及与第一目标图像相对应的第一目标处理步骤显示于第一屏中。
优选地，将第一目标图像显示于第一屏中，具体为：将第一目标图像恢复至与第一目标处理步骤相对应的编辑状态。
将第一目标处理步骤也显示于第一屏中,协助用户对第一目标图像进行显示效果的进一步调整与修改。
同时,第一目标图像显示于第一屏中时,优选为将第一屏中的当前显示图像替换为所述第一目标图像进行显示,以进行进一步的编辑。
在将第一屏中当前显示的图像替换为第一目标图像的同时,依据第一目标处理步骤,将该第一目标图像的编辑状态进行恢复。以使用户可以对该第一目标图像执行编辑操作撤回、继续进行更多的效果叠加等编辑操作,满足用户的多样化图像编辑需求。
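下面给出一个示意性的Python代码片段，用于说明“依据记录的处理步骤将图像恢复到对应编辑状态，并在此基础上继续叠加或撤回编辑操作”的一种可能思路；其中的 EditSession 等名称均为示例性假设，并非本公开方案的实际实现。

```python
# 示意性代码：依据记录的第一目标处理步骤，将第一目标图像恢复到对应的编辑状态，
# 以便用户在第一屏中继续叠加编辑或撤回操作。EditSession 为假设的示例结构。

class EditSession:
    def __init__(self, original_image):
        self.original = original_image
        self.steps = []                       # 已施加的处理步骤（操作函数）列表

    def restore(self, recorded_steps):
        """将编辑状态恢复为"原图 + 记录的处理步骤"。"""
        self.steps = list(recorded_steps)
        return self.current_image()

    def undo(self):
        """撤回最近一步编辑操作。"""
        if self.steps:
            self.steps.pop()
        return self.current_image()

    def apply(self, op):
        """继续叠加新的编辑操作。"""
        self.steps.append(op)
        return self.current_image()

    def current_image(self):
        image = self.original
        for op in self.steps:
            image = op(image)
        return image


if __name__ == "__main__":
    session = EditSession(original_image=0)
    session.restore([lambda x: x + 1, lambda x: x * 2])   # 恢复第一目标图像的编辑状态
    print(session.undo())                                 # 撤回一步后得到 1
```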
该实施过程在第二屏中对产生的不同处理效果的图像进行效果展示，便于用户预览并比较不同的图像编辑效果，辅助用户快速识别处理效果最优的图像；通过第二屏的图像预览，减少用户的操作处理步骤；同时，依据记录的编辑操作步骤，实现对第二屏中预览图像的编辑状态恢复操作，简化用户的图像编辑处理过程，为用户提供更优体验。
本公开实施例中还公开一种图片编辑方法,应用于终端,所述终端包括第一屏及第二屏。该第一屏与第二屏之间的设置结构与前述实施例中的设置结构相同,此处不再赘述。
结合图4所示,该图片编辑方法包括:
步骤401,在第一屏显示第一图像的情况下,接收第一输入。
该过程的实施方式与前述实施例中的相同，此处不再赘述。
步骤402,响应于第一输入,对第一图像进行第一处理,得到第二图像,并在第一屏显示第二图像,在第二屏显示第一图像和/或第三图像。
其中,所述第三图像为对所述第一图像进行第二处理得到的图像,所述第二处理为所述第一处理所包含的部分处理过程。
该过程的实施方式与前述实施例中的相同，此处不再赘述。
进一步地,在响应于所述第一输入,对所述第一图像进行第一处理,得到第二图像,并在所述第一屏显示所述第二图像,在所述第二屏显示所述第一图像和/或第三图像之后,所述方法还包括:
步骤403,接收第四输入。
该第四输入可以是对第二屏中所显示图像的点击选取操作，或者是对图像选取按键的触摸输入操作。
步骤404,响应于第四输入,从第二屏中确定第二目标图像。
第二屏中显示有至少一张图像。通过用户的第四输入,从该至少一张图像中选取一张目标图像。具体地,第二屏中不同显示图像对应有形成各自图像本身的图像处理步骤。需要记录第二屏中所显示的不同图像所对应的图像处理步骤,形成图像的编辑操作记录。
步骤405,将第二目标图像对应的第二目标处理步骤显示于第二屏中的设定区域。
将第二屏的显示界面划分为不同的显示区域,其中在第一区域中显示第二目标图像,在第二区域中显示与第二目标图像相对应的第二目标处理步骤。
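作为一种示意，下面给出一个假设性的Python代码片段，说明将第二屏显示界面划分为第一区域与第二区域、分别承载第二目标图像及其对应的第二目标处理步骤的布局思路；区域划分比例、坐标约定与数据结构均为示例性假设，并非本公开方案的实际实现。

```python
# 示意性代码：将第二屏显示界面划分为两个显示区域，第一区域显示第二目标图像，
# 第二区域显示其对应的第二目标处理步骤。区域划分比例为示例性假设。

def layout_second_screen(width, height, target_image, target_steps, step_area_ratio=0.3):
    """返回一个描述第二屏两个显示区域内容的简单布局字典。"""
    step_h = int(height * step_area_ratio)
    return {
        "image_region": {"rect": (0, 0, width, height - step_h), "content": target_image},
        "step_region": {"rect": (0, height - step_h, width, step_h),
                        "content": "\n".join(f"{i + 1}. {s}" for i, s in enumerate(target_steps))},
    }


if __name__ == "__main__":
    layout = layout_second_screen(1080, 2340, "第二目标图像", ["滤镜调整", "明暗调整", "对比度调整"])
    print(layout["step_region"]["content"])   # 按顺序列出第二目标处理步骤
```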
该图像对应的图像处理步骤的显示，可以使用户在进行图像编辑效果比对时获知最佳编辑效果所对应的编辑操作步骤，以便将其应用至其他图像的编辑过程中。该实施过程在第二屏中对产生的不同处理效果的图像进行效果展示，便于用户预览并比较不同的图像编辑效果，辅助用户快速识别处理效果最优的图像；通过第二屏的图像预览，减少用户的操作处理步骤；同时在第二屏中对预览图像的历史编辑操作进行记录与显示，简化用户的图像编辑处理过程，为用户提供更优体验。
本公开实施例还公开了一种终端,结合图11、图12所示,所述终端包括第一屏及第二屏,所述终端还包括:第一接收模块501及第一显示模块502。
第一接收模块501,用于在所述第一屏显示第一图像的情况下,接收第一输入。
第一显示模块502,用于响应于所述第一输入,对所述第一图像进行第一处理,得到第二图像,并在所述第一屏显示所述第二图像,在所述第二屏显示所述第一图像和/或第三图像;其中,所述第三图像为对所述第一图像进行第二处理得到的图像,所述第二处理为所述第一处理所包含的部分处理过程。
进一步地,可选地,终端还包括:
第二接收模块503,用于接收第二输入;
第二显示模块504,用于响应于所述第二输入,对第二图像进行第三处理,得到第四图像,并在所述第一屏显示所述第四图像,在所述第二屏显示所述第二图像和/或第五图像;
其中,所述第五图像为对所述第一图像进行第四处理得到的图像,所述第四处理为所述第一处理和/或所述第三处理所包含的部分处理过程。
可选地,所述第一输入包括以下至少一项:
将所述第一图像拖动至所述第一屏的一个侧边的滑动操作;
将所述第一图像拖动至所述第一屏的一个侧边的第一滑动操作及由所述第二屏的一个侧边开始在所述第二屏内的第二滑动操作;
对所述第一图像的双击操作;
对所述第一图像的单击操作;
对所述第一图像的长按操作。
进一步地,可选地,终端还包括:
第三接收模块505,用于接收第三输入;
第一确定模块506,用于响应于所述第三输入,从所述第二屏中确定第一目标图像;
第三显示模块507,用于将所述第一目标图像及与所述第一目标图像相对应的第一目标处理步骤显示于所述第一屏中。
进一步地,可选地,终端还包括:
第四接收模块508,用于接收第四输入;
第二确定模块509,用于响应于所述第四输入,从所述第二屏中确定第二目标图像;
第四显示模块510,用于将所述第二目标图像对应的第二目标处理步骤显示于所述第二屏中的设定区域。
该终端，通过接收在第一屏显示第一图像的情况下的第一输入，对第一图像进行第一处理，得到第二图像，并在第一屏显示第二图像，在第二屏显示第一图像和/或经与第二图像对应的图像处理中的部分处理过程处理得到的第三图像，使得用户通过对第一屏中图像的单个或连续处理操作输入，即可在第二屏中对产生的不同处理效果的图像进行效果展示，便于用户预览并比较不同的图像编辑效果，辅助用户快速识别处理效果最优的图像；通过第二屏的图像预览，减少用户的操作处理步骤，简化用户的图像编辑处理过程，为用户提供更优体验。
本公开实施例提供的移动终端能够实现上述图片编辑方法的实施例的各个过程,且能达到相同的技术效果,为避免重复,这里不再赘述。
图13为实现本公开各个实施例的一种移动终端的硬件结构示意图。
该移动终端900包括但不限于:射频单元901、网络模块902、音频输出单元903、输入单元904、传感器905、显示单元906、用户输入单元907、接口单元908、存储器909、处理器910、以及电源911等部件。本领域技术人员可以理解,图13中示出的移动终端结构并不构成对移动终端的限定,移动终端可以包括比图示更多或更少的部件,或者组合某些部件,或者不同的部件布置。在本公开实施例中,移动终端包括但不限于手机、平板电脑、笔记本电脑、掌上电脑、车载终端、可穿戴设备、以及计步器等。
其中，移动终端900包括第一屏及第二屏，用户输入单元907，用于在所述第一屏显示第一图像的情况下，接收第一输入；处理器910，用于响应于所述第一输入，对所述第一图像进行第一处理，得到第二图像，并在所述第一屏显示所述第二图像，在所述第二屏显示所述第一图像和/或第三图像；其中，所述第三图像为对所述第一图像进行第二处理得到的图像，所述第二处理为所述第一处理所包含的部分处理过程。
该移动终端，通过接收在第一屏显示第一图像的情况下的第一输入，对第一图像进行第一处理，得到第二图像，并在第一屏显示第二图像，在第二屏显示第一图像和/或经与第二图像对应的图像处理中的部分处理过程处理得到的第三图像，使得用户通过对第一屏中图像的单个或连续处理操作输入，即可在第二屏中对产生的不同处理效果的图像进行效果展示，便于用户预览并比较不同的图像编辑效果，辅助用户快速识别处理效果最优的图像；通过第二屏的图像预览，减少用户的操作处理步骤，简化用户的图像编辑处理过程，为用户提供更优体验。
应理解的是,本公开实施例中,射频单元901可用于收发信息或通话过程中,信号的接收和发送,具体的,将来自基站的下行数据接收后,给处理器910处理;另外,将上行的数据发送给基站。通常,射频单元901包括但不限于天线、至少一个放大器、收发信机、耦合器、低噪声放大器、双工器等。此外,射频单元901还可以通过无线通信系统与网络和其他设备通信。
移动终端通过网络模块902为用户提供了无线的宽带互联网访问,如帮助用户收发电子邮件、浏览网页和访问流式媒体等。
音频输出单元903可以将射频单元901或网络模块902接收的或者在存储器909中存储的音频数据转换成音频信号并且输出为声音。而且,音频输出单元903还可以提供与移动终端900执行的特定功能相关的音频输出(例如,呼叫信号接收声音、消息接收声音等等)。音频输出单元903包括扬声器、蜂鸣器以及受话器等。
输入单元904用于接收音频或视频信号。输入单元904可以包括图形处理器(Graphics Processing Unit,GPU)9041和麦克风9042，图形处理器9041对在视频捕获模式或图像捕获模式中由图像捕获装置(如摄像头)获得的静态图片或视频的图像数据进行处理。处理后的图像帧可以显示在显示单元906上。经图形处理器9041处理后的图像帧可以存储在存储器909(或其它存储介质)中或者经由射频单元901或网络模块902进行发送。麦克风9042可以接收声音，并且能够将这样的声音处理为音频数据。处理后的音频数据可以在电话通话模式的情况下转换为可经由射频单元901发送到移动通信基站的格式输出。
移动终端900还包括至少一种传感器905,比如光传感器、运动传感器以及其他传感器。具体地,光传感器包括环境光传感器及接近传感器,其中,环境光传感器可根据环境光线的明暗来调节显示面板9061的亮度,接近传感器可在移动终端900移动到耳边时,关闭显示面板9061和/或背光。作为运动传感器的一种,加速计传感器可检测各个方向上(一般为三轴)加速度的大小,静止时可检测出重力的大小及方向,可用于识别移动终端姿态(比如横竖屏切换、相关游戏、磁力计姿态校准)、振动识别相关功能(比如计步器、敲击)等;传感器905还可以包括指纹传感器、压力传感器、虹膜传感器、分子传感器、陀螺仪、气压计、湿度计、温度计、红外线传感器等,在此不再赘述。
显示单元906用于显示由用户输入的信息或提供给用户的信息。显示单元906可包括显示面板9061,可以采用液晶显示器(Liquid Crystal Display,LCD)、有机发光二极管(Organic Light-Emitting Diode,OLED)等形式来配置显示面板9061。
用户输入单元907可用于接收输入的数字或字符信息,以及产生与移动终端的用户设置以及功能控制有关的键信号输入。具体地,用户输入单元907包括触控面板9071以及其他输入设备9072。触控面板9071,也称为触摸屏,可收集用户在其上或附近的触摸操作(比如用户使用手指、触笔等任何适合的物体或附件在触控面板9071上或在触控面板9071附近的操作)。触控面板9071可包括触摸检测装置和触摸控制器两个部分。其中,触摸检测装置检测用户的触摸方位,并检测触摸操作带来的信号,将信号传送给触摸控制器;触摸控制器从触摸检测装置上接收触摸信息,并将它转换成触点坐标,再送给处理器910,接收处理器910发来的命令并加以执行。此外,可以采用电阻式、电容式、红外线以及表面声波等多种类型实现触控面板9071。除了触控面板9071,用户输入单元907还可以包括其他输入设备9072。具体地,其他输入设备9072可以包括但不限于物理键盘、功能键(比如音量控制按键、开关按键等)、轨迹球、鼠标、操作杆,在此不再赘述。
进一步的，触控面板9071可覆盖在显示面板9061上，当触控面板9071检测到在其上或附近的触摸操作后，传送给处理器910以确定触摸事件的类型，随后处理器910根据触摸事件的类型在显示面板9061上提供相应的视觉输出。虽然在图13中，触控面板9071与显示面板9061是作为两个独立的部件来实现移动终端的输入和输出功能，但是在某些实施例中，可以将触控面板9071与显示面板9061集成而实现移动终端的输入和输出功能，具体此处不做限定。
接口单元908为外部装置与移动终端900连接的接口。例如,外部装置可以包括有线或无线头戴式耳机端口、外部电源(或电池充电器)端口、有线或无线数据端口、存储卡端口、用于连接具有识别模块的装置的端口、音频输入/输出(I/O)端口、视频I/O端口、耳机端口等等。接口单元908可以用于接收来自外部装置的输入(例如,数据信息、电力等等)并且将接收到的输入传输到移动终端900内的一个或多个元件或者可以用于在移动终端900和外部装置之间传输数据。
存储器909可用于存储软件程序以及各种数据。存储器909可主要包括存储程序区和存储数据区，其中，存储程序区可存储操作系统、至少一个功能所需的应用程序(比如声音播放功能、图像播放功能等)等；存储数据区可存储根据手机的使用所创建的数据(比如音频数据、电话本等)等。此外，存储器909可以包括高速随机存取存储器，还可以包括非易失性存储器，例如至少一个磁盘存储器件、闪存器件、或其他非易失性固态存储器件。
处理器910是移动终端的控制中心,利用各种接口和线路连接整个移动终端的各个部分,通过运行或执行存储在存储器909内的软件程序和/或模块,以及调用存储在存储器909内的数据,执行移动终端的各种功能和处理数据,从而对移动终端进行整体监控。处理器910可包括一个或多个处理单元;优选的,处理器910可集成应用处理器和调制解调处理器,其中,应用处理器主要处理操作系统、用户界面和应用程序等,调制解调处理器主要处理无线通信。可以理解的是,上述调制解调处理器也可以不集成到处理器910中。
移动终端900还可以包括给各个部件供电的电源911(比如电池),优选的,电源911可以通过电源管理系统与处理器910逻辑相连,从而通过电源管理系统实现管理充电、放电、以及功耗管理等功能。
另外,移动终端900包括一些未示出的功能模块,在此不再赘述。
优选的,本公开实施例还提供一种移动终端,包括处理器910,存储器909,存储在存储器909上并可在所述处理器910上运行的计算机程序,该计算机程序被处理器910执行时实现上述图片编辑方法实施例的各个过程,且能达到相同的技术效果,为避免重复,这里不再赘述。
本公开实施例还提供一种计算机可读存储介质,计算机可读存储介质上存储有计算机程序,该计算机程序被处理器执行时实现上述图片编辑方法实施例的各个过程,且能达到相同的技术效果,为避免重复,这里不再赘述。其中,所述的计算机可读存储介质,如只读存储器(Read-Only Memory,简称ROM)、随机存取存储器(Random Access Memory,简称RAM)、磁碟或者光盘等。
需要说明的是,在本文中,术语“包括”、“包含”或者其任何其他变体意在涵盖非排他性的包含,从而使得包括一系列要素的过程、方法、物品或者装置不仅包括那些要素,而且还包括没有明确列出的其他要素,或者是还包括为这种过程、方法、物品或者装置所固有的要素。在没有更多限制的情况下,由语句“包括一个……”限定的要素,并不排除在包括该要素的过程、方法、物品或者装置中还存在另外的相同要素。
通过以上的实施方式的描述,本领域的技术人员可以清楚地了解到上述实施例方法可借助软件加必需的通用硬件平台的方式来实现,当然也可以通过硬件,但很多情况下前者是更佳的实施方式。基于这样的理解,本公开的技术方案本质上或者说对相关技术做出贡献的部分可以以软件产品的形式体现出来,该计算机软件产品存储在一个存储介质(如ROM/RAM、磁碟、光盘)中,包括若干指令用以使得一台终端(可以是手机,计算机,服务器,空调器,或者网络设备等)执行本公开各个实施例所述的方法。
上面结合附图对本公开的实施例进行了描述,但是本公开并不局限于上述的具体实施方式,上述的具体实施方式仅仅是示意性的,而不是限制性的,本领域的普通技术人员在本公开的启示下,在不脱离本公开宗旨和权利要求所保护的范围情况下,还可做出很多形式,均属于本公开的保护之内。
以上所述的是本公开的优选实施方式，应当指出，对于本技术领域的普通技术人员来说，在不脱离本公开所述的原理前提下还可以作出若干改进和润饰，这些改进和润饰也在本公开的保护范围内。

Claims (12)

  1. 一种图片编辑方法,应用于终端,所述终端包括第一屏及第二屏,其中,所述图片编辑方法,包括:
    在所述第一屏显示第一图像的情况下,接收第一输入;
    响应于所述第一输入,对所述第一图像进行第一处理,得到第二图像,并在所述第一屏显示所述第二图像,在所述第二屏显示所述第一图像和/或第三图像;
    其中,所述第三图像为对所述第一图像进行第二处理得到的图像,所述第二处理为所述第一处理所包含的部分处理过程。
  2. 根据权利要求1所述的图片编辑方法,其中,
    所述在所述第一屏显示所述第二图像,在所述第二屏显示所述第一图像和/或第三图像之后,还包括:
    接收第二输入;
    响应于所述第二输入,对第二图像进行第三处理,得到第四图像,并在所述第一屏显示所述第四图像,在所述第二屏显示所述第二图像和/或第五图像;
    其中,所述第五图像为对所述第一图像进行第四处理得到的图像,所述第四处理为所述第一处理和/或所述第三处理所包含的部分处理过程。
  3. 根据权利要求1所述的图片编辑方法,其中,所述第一输入包括以下至少一项:
    将所述第一图像拖动至所述第一屏的一个侧边的滑动操作;
    将所述第一图像拖动至所述第一屏的一个侧边的第一滑动操作及由所述第二屏的一个侧边开始在所述第二屏内的第二滑动操作;
    对所述第一图像的双击操作;
    对所述第一图像的单击操作;
    对所述第一图像的长按操作。
  4. 根据权利要求1所述的图片编辑方法，其中，所述响应于所述第一输入，对所述第一图像进行第一处理，得到第二图像，并在所述第一屏显示所述第二图像，在所述第二屏显示所述第一图像和/或第三图像之后，所述方法还包括：
    接收第三输入;
    响应于所述第三输入,从所述第二屏中确定第一目标图像;
    将所述第一目标图像及与所述第一目标图像相对应的第一目标处理步骤显示于所述第一屏中。
  5. 根据权利要求1所述的图片编辑方法,其中,所述响应于所述第一输入,对所述第一图像进行第一处理,得到第二图像,并在所述第一屏显示所述第二图像,在所述第二屏显示所述第一图像和/或第三图像之后,所述方法还包括:
    接收第四输入;
    响应于所述第四输入,从所述第二屏中确定第二目标图像;
    将所述第二目标图像对应的第二目标处理步骤显示于所述第二屏中的设定区域。
  6. 一种终端,包括:
    第一屏及第二屏;
    第一接收模块,用于在所述第一屏显示第一图像的情况下,接收第一输入;
    第一显示模块,用于响应于所述第一输入,对所述第一图像进行第一处理,得到第二图像,并在所述第一屏显示所述第二图像,在所述第二屏显示所述第一图像和/或第三图像;
    其中,所述第三图像为对所述第一图像进行第二处理得到的图像,所述第二处理为所述第一处理所包含的部分处理过程。
  7. 根据权利要求6所述的终端,包括:
    第二接收模块,用于接收第二输入;
    第二显示模块,用于响应于所述第二输入,对第二图像进行第三处理,得到第四图像,并在所述第一屏显示所述第四图像,在所述第二屏显示所述第二图像和/或第五图像;
    其中，所述第五图像为对所述第一图像进行第四处理得到的图像，所述第四处理为所述第一处理和/或所述第三处理所包含的部分处理过程。
  8. 根据权利要求6所述的终端,其中,所述第一输入包括以下至少一项:
    将所述第一图像拖动至所述第一屏的一个侧边的滑动操作;
    将所述第一图像拖动至所述第一屏的一个侧边的第一滑动操作及由所述第二屏的一个侧边开始在所述第二屏内的第二滑动操作;
    对所述第一图像的双击操作;
    对所述第一图像的单击操作;
    对所述第一图像的长按操作。
  9. 根据权利要求6所述的终端,还包括:
    第三接收模块,用于接收第三输入;
    第一确定模块,用于响应于所述第三输入,从所述第二屏中确定第一目标图像;
    第三显示模块,用于将所述第一目标图像及与所述第一目标图像相对应的第一目标处理步骤显示于所述第一屏中。
  10. 根据权利要求6所述的终端,还包括:
    第四接收模块,用于接收第四输入;
    第二确定模块,用于响应于所述第四输入,从所述第二屏中确定第二目标图像;
    第四显示模块,用于将所述第二目标图像对应的第二目标处理步骤显示于所述第二屏中的设定区域。
  11. 一种终端，包括处理器、存储器及存储在所述存储器上并可在所述处理器上运行的计算机程序，其中，所述计算机程序被所述处理器执行时实现如权利要求1至5中任一项所述的图片编辑方法的步骤。
  12. 一种计算机可读存储介质，所述计算机可读存储介质上存储计算机程序，其中，所述计算机程序被处理器执行时实现如权利要求1至5中任一项所述的图片编辑方法的步骤。
PCT/CN2020/081030 2019-04-01 2020-03-25 图片编辑方法及终端 WO2020199995A1 (zh)

Priority Applications (1)

Application Number Priority Date Filing Date Title
US17/491,027 US11630561B2 (en) 2019-04-01 2021-09-30 Image editing method and terminal

Applications Claiming Priority (2)

Application Number Priority Date Filing Date Title
CN201910257242.1 2019-04-01
CN201910257242.1A CN110007837B (zh) 2019-04-01 2019-04-01 一种图片编辑方法及终端

Related Child Applications (1)

Application Number Title Priority Date Filing Date
US17/491,027 Continuation US11630561B2 (en) 2019-04-01 2021-09-30 Image editing method and terminal

Publications (1)

Publication Number Publication Date
WO2020199995A1 true WO2020199995A1 (zh) 2020-10-08

Family

ID=67169235

Family Applications (1)

Application Number Title Priority Date Filing Date
PCT/CN2020/081030 WO2020199995A1 (zh) 2019-04-01 2020-03-25 图片编辑方法及终端

Country Status (3)

Country Link
US (1) US11630561B2 (zh)
CN (1) CN110007837B (zh)
WO (1) WO2020199995A1 (zh)

Families Citing this family (10)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN110007837B (zh) * 2019-04-01 2021-03-26 维沃移动通信有限公司 一种图片编辑方法及终端
CN110543276B (zh) * 2019-08-30 2021-04-02 维沃移动通信有限公司 图片的筛选方法及其终端设备
CN110609651A (zh) * 2019-09-16 2019-12-24 珠海格力电器股份有限公司 一种使用相册的方法、装置及设置有折叠屏的电子设备
CN111553854A (zh) * 2020-04-21 2020-08-18 维沃移动通信有限公司 一种图像处理方法及电子设备
KR20220017284A (ko) * 2020-08-04 2022-02-11 삼성전자주식회사 전자 장치 및 그의 화면을 제어하는 방법
CN116017176A (zh) * 2021-10-20 2023-04-25 北京字跳网络技术有限公司 视频生成方法、装置、电子设备及可读存储介质
CN114004915A (zh) * 2021-10-22 2022-02-01 珠海格力电器股份有限公司 图片编辑方法、装置、设备及存储介质
US20230215465A1 (en) * 2021-12-30 2023-07-06 Lemon Inc. Visual effect design using multiple preview windows
CN114816626A (zh) * 2022-04-08 2022-07-29 北京达佳互联信息技术有限公司 一种操作控制方法、装置、电子设备及存储介质
CN117616747A (zh) * 2022-05-26 2024-02-27 北京小米移动软件有限公司 一种拍摄预览方法、装置、终端设备及存储介质

Citations (4)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN106651761A (zh) * 2016-12-27 2017-05-10 维沃移动通信有限公司 一种为图片添加滤镜的方法及移动终端
CN107909634A (zh) * 2017-11-30 2018-04-13 努比亚技术有限公司 图片显示方法、移动终端及计算机可读存储介质
US20180316637A1 (en) * 2017-05-01 2018-11-01 Microsoft Technology Licensing, Llc Conversation lens for context
CN110007837A (zh) * 2019-04-01 2019-07-12 维沃移动通信有限公司 一种图片编辑方法及终端

Family Cites Families (12)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US6999068B2 (en) * 2001-08-21 2006-02-14 Hewlett-Packard Development Company, L.P. System and method for enabling users to edit graphical images
KR100640808B1 (ko) * 2005-08-12 2006-11-02 엘지전자 주식회사 촬상 이미지의 듀얼 디스플레이 기능을 갖는 이동통신단말기 및 그 방법
US8077182B2 (en) * 2008-03-17 2011-12-13 Apple Inc. User interface controls for managing content attributes
KR101569776B1 (ko) * 2009-01-09 2015-11-19 삼성전자주식회사 접히는 표시부를 가지는 휴대 단말기 및 이의 운용 방법
DE202012012645U1 (de) * 2012-03-01 2013-07-11 Research In Motion Ltd. Ziehpunkt zum Anwenden von Bildfiltern in einem Bildeditor
US9704231B1 (en) * 2014-05-19 2017-07-11 Google Inc. Measuring and visualizing impact of image modifications
CN104142794B (zh) * 2014-07-30 2017-12-26 联想(北京)有限公司 一种信息处理方法及电子设备
US10691880B2 (en) * 2016-03-29 2020-06-23 Microsoft Technology Licensing, Llc Ink in an electronic document
US9762971B1 (en) * 2016-04-26 2017-09-12 Amazon Technologies, Inc. Techniques for providing media content browsing
CN106681606A (zh) * 2016-12-06 2017-05-17 宇龙计算机通信科技(深圳)有限公司 一种图片处理方法及终端
CN108469898B (zh) * 2018-03-15 2020-05-12 维沃移动通信有限公司 一种图像处理方法及柔性屏终端
CN110078370A (zh) 2019-04-19 2019-08-02 嘉兴市光泰照明有限公司 一种用于飞机起落架航空灯的高强度玻璃

Patent Citations (4)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN106651761A (zh) * 2016-12-27 2017-05-10 维沃移动通信有限公司 一种为图片添加滤镜的方法及移动终端
US20180316637A1 (en) * 2017-05-01 2018-11-01 Microsoft Technology Licensing, Llc Conversation lens for context
CN107909634A (zh) * 2017-11-30 2018-04-13 努比亚技术有限公司 图片显示方法、移动终端及计算机可读存储介质
CN110007837A (zh) * 2019-04-01 2019-07-12 维沃移动通信有限公司 一种图片编辑方法及终端

Non-Patent Citations (1)

* Cited by examiner, † Cited by third party
Title
EAST FILM INDUSTRY: "How to use Photoshop on mobile and improve your pics in one minute!", SOHU, 17 April 2016 (2016-04-17), pages 1 - 11, XP009523766 *

Also Published As

Publication number Publication date
CN110007837B (zh) 2021-03-26
CN110007837A (zh) 2019-07-12
US20220019345A1 (en) 2022-01-20
US11630561B2 (en) 2023-04-18

Similar Documents

Publication Publication Date Title
WO2020199995A1 (zh) 图片编辑方法及终端
WO2021098678A1 (zh) 投屏控制方法及电子设备
WO2019137429A1 (zh) 图片处理方法及移动终端
WO2019228294A1 (zh) 对象分享方法及移动终端
WO2021036536A1 (zh) 视频拍摄方法及电子设备
WO2021036542A1 (zh) 录屏方法及移动终端
WO2019223494A1 (zh) 截屏方法及移动终端
US11675442B2 (en) Image processing method and flexible-screen terminal
WO2020192315A1 (zh) 控制方法及移动终端
WO2020042890A1 (zh) 视频处理方法、终端及计算机可读存储介质
WO2021104236A1 (zh) 一种共享拍摄参数的方法及电子设备
WO2021109907A1 (zh) 应用分享方法、第一电子设备及计算机可读存储介质
WO2021147779A1 (zh) 配置信息分享方法、终端设备及计算机可读存储介质
WO2019196929A1 (zh) 一种视频数据处理方法及移动终端
WO2020238449A1 (zh) 通知消息的处理方法及终端
WO2020151513A1 (zh) 信息处理方法及终端设备
CN109213416B (zh) 一种显示信息处理方法及移动终端
WO2020182035A1 (zh) 图像处理方法及终端设备
WO2019120192A1 (zh) 文本编辑方法及移动终端
WO2020238497A1 (zh) 图标移动方法及终端设备
WO2021004426A1 (zh) 内容选择方法及终端
WO2021143687A1 (zh) 图像显示方法及电子设备
WO2020156118A1 (zh) 管理方法及终端设备
WO2020238536A1 (zh) 信息处理方法及终端设备
WO2021063292A1 (zh) 显示控制方法及电子设备

Legal Events

Date Code Title Description
121 Ep: the epo has been informed by wipo that ep was designated in this application

Ref document number: 20785061

Country of ref document: EP

Kind code of ref document: A1

NENP Non-entry into the national phase

Ref country code: DE

122 Ep: pct application non-entry in european phase

Ref document number: 20785061

Country of ref document: EP

Kind code of ref document: A1