WO2021121253A1 - Image processing method and electronic device - Google Patents

Image processing method and electronic device

Info

Publication number
WO2021121253A1
Authority
WO
WIPO (PCT)
Prior art keywords
image
images
target
electronic device
input
Prior art date
Application number
PCT/CN2020/136731
Other languages
English (en)
Chinese (zh)
Inventor
杨蕾
Original Assignee
维沃移动通信有限公司 (Vivo Mobile Communication Co., Ltd.)
Priority date
Filing date
Publication date
Application filed by 维沃移动通信有限公司 (Vivo Mobile Communication Co., Ltd.)
Publication of WO2021121253A1

Classifications

    • G - PHYSICS
    • G06 - COMPUTING; CALCULATING OR COUNTING
    • G06T - IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T11/00 - 2D [Two Dimensional] image generation
    • G06T11/60 - Editing figures and text; Combining figures or text
    • G - PHYSICS
    • G06 - COMPUTING; CALCULATING OR COUNTING
    • G06F - ELECTRIC DIGITAL DATA PROCESSING
    • G06F16/00 - Information retrieval; Database structures therefor; File system structures therefor
    • G06F16/10 - File systems; File servers
    • G06F16/16 - File or folder operations, e.g. details of user interfaces specifically adapted to file systems
    • G06F16/168 - Details of user interfaces specifically adapted to file systems, e.g. browsing and visualisation, 2d or 3d GUIs
    • G - PHYSICS
    • G06 - COMPUTING; CALCULATING OR COUNTING
    • G06F - ELECTRIC DIGITAL DATA PROCESSING
    • G06F3/00 - Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
    • G06F3/01 - Input arrangements or combined input and output arrangements for interaction between user and computer
    • G06F3/048 - Interaction techniques based on graphical user interfaces [GUI]
    • G06F3/0481 - Interaction techniques based on graphical user interfaces [GUI] based on specific properties of the displayed interaction object or a metaphor-based environment, e.g. interaction with desktop elements like windows or icons, or assisted by a cursor's changing behaviour or appearance
    • G - PHYSICS
    • G06 - COMPUTING; CALCULATING OR COUNTING
    • G06F - ELECTRIC DIGITAL DATA PROCESSING
    • G06F3/00 - Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
    • G06F3/01 - Input arrangements or combined input and output arrangements for interaction between user and computer
    • G06F3/048 - Interaction techniques based on graphical user interfaces [GUI]
    • G06F3/0487 - Interaction techniques based on graphical user interfaces [GUI] using specific features provided by the input device, e.g. functions controlled by the rotation of a mouse with dual sensing arrangements, or of the nature of the input device, e.g. tap gestures based on pressure sensed by a digitiser
    • G06F3/0488 - Interaction techniques based on graphical user interfaces [GUI] using specific features provided by the input device, e.g. functions controlled by the rotation of a mouse with dual sensing arrangements, or of the nature of the input device, e.g. tap gestures based on pressure sensed by a digitiser, using a touch-screen or digitiser, e.g. input of commands through traced gestures

Definitions

  • the embodiments of the present invention relate to the field of communication technologies, and in particular, to an image processing method and electronic equipment.
  • With the popularity of image sharing (i.e., the sharing of images), attractive image layout and image style have become something that more and more users pursue.
  • However, before sharing one or more images in a communication application, the user may need to adjust the arrangement order of the one or more images in the layout in the communication application, which makes the process of sharing images with the electronic device cumbersome.
  • the embodiments of the present invention provide an image processing method and an electronic device, so as to solve the problem that the process of sharing images by the electronic device is relatively complicated.
  • an embodiment of the present invention provides an image processing method applied to an electronic device.
  • The method includes: displaying a first editing interface, where the first editing interface includes M image areas indicated by a target arrangement template, and N first images are displayed on N of the M image areas; receiving a first input; in response to the first input, editing at least one of the N first images to obtain a target image array; receiving a second input; and, in response to the second input, sending the target image array; where N is a positive integer less than or equal to M.
  • An embodiment of the present invention also provides an electronic device, and the electronic device includes: a display module, a receiving module, an editing module, and a sending module. The display module is used to display a first editing interface, where the first editing interface includes M image areas indicated by a target arrangement template, and N first images are displayed on the N image areas in the first editing interface; the receiving module is used to receive the first input; the editing module is used to, in response to the first input received by the receiving module, edit at least one of the N first images displayed by the display module to obtain the target image array; the receiving module is also used to receive the second input; the sending module is used to, in response to the second input received by the receiving module, send the target image array obtained by the editing module; where N is a positive integer less than or equal to M.
  • an embodiment of the present invention provides an electronic device, including a processor, a memory, and a computer program stored in the memory and capable of running on the processor.
  • When the computer program is executed by the processor, the steps of the image processing method according to the first aspect are implemented.
  • an embodiment of the present invention provides a computer-readable storage medium on which a computer program is stored.
  • When the computer program is executed by a processor, the steps of the image processing method as described in the first aspect are implemented.
  • With the embodiments of the present invention, the first editing interface including the M image areas indicated by the target arrangement template may be displayed, with N first images displayed on N of the M image areas. Subsequently, through the first input, at least one of the N first images can be edited to obtain the target image array, and through the second input, the target image array can be sent. In this way, when the user needs to share the images in the target image array, the user does not need to select the images one by one, edit them in real time, and then send them; instead, the images in the target image array, obtained by editing according to the target arrangement template in the first editing interface, are treated as a whole and can be selected and sent as a whole quickly and conveniently.
  • In addition, the electronic device can edit the N first images according to the target arrangement template in the first editing interface of the gallery application, instead of editing them through a dedicated third-party retouching application, so the user does not need to open multiple applications and can quickly and conveniently control the electronic device to edit the N first images within the gallery application.
  • Moreover, since the electronic device can send the target image array from the gallery application through the communication application, there is no need to control the communication application to call up images from the gallery application one by one in order to edit and send them in real time. That is, the user can trigger the electronic device to edit and share images quickly and conveniently through the combined operation of the gallery application and the communication application.
  • FIG. 1 is a schematic structural diagram of a possible Android operating system provided by an embodiment of the present invention
  • FIG. 2 is a schematic flowchart of an image processing method provided by an embodiment of the present invention.
  • FIG. 3 is the first schematic diagram of displaying content of an electronic device according to an embodiment of the present invention.
  • FIG. 4 is the second schematic diagram of displaying content of an electronic device according to an embodiment of the present invention.
  • FIG. 5 is the third schematic diagram of displaying content of an electronic device according to an embodiment of the present invention.
  • FIG. 6 is a fourth schematic diagram of displaying content of an electronic device according to an embodiment of the present invention.
  • FIG. 7 is a fifth schematic diagram of displaying content of an electronic device according to an embodiment of the present invention.
  • FIG. 8 is a sixth schematic diagram of displaying content of an electronic device according to an embodiment of the present invention.
  • FIG. 9 is a seventh schematic diagram of displaying content of an electronic device according to an embodiment of the present invention.
  • FIG. 10 is the eighth schematic diagram of displaying content of an electronic device according to an embodiment of the present invention.
  • FIG. 11 is a schematic structural diagram of a possible electronic device provided by an embodiment of the present invention.
  • FIG. 12 is a schematic diagram of the hardware structure of an electronic device according to an embodiment of the present invention.
  • A/B can mean A or B
  • The "and/or" in this document merely describes an association relationship between associated objects and indicates that three relationships may exist; for example, A and/or B can mean that A exists alone, A and B exist at the same time, or B exists alone.
  • "Multiple" means two or more.
  • first and second in the specification and claims of the present invention are used to distinguish different objects, rather than to describe a specific order of objects.
  • first input and the second input are used to distinguish different inputs, rather than to describe a specific order of input.
  • the image processing method provided by the embodiment of the present invention can display a first editing interface including M image regions indicated by the target arrangement template, and N first images are displayed on the N image regions of the M image regions. Subsequently, through the first input, at least one of the N first images can be edited to obtain the target image array. Furthermore, through the second input, the target image array can be sent. In this way, when the user needs to share the images in the target image array, there is no need for the user to select these images one by one, edit these images in real time, and then send these images; instead, they can be arranged according to the target in the first image editing interface.
  • the images in the target image array obtained by template editing are taken as a whole, and the images in the target image array are selected and sent as a whole quickly and conveniently.
  • the electronic device can edit the N first images according to the target arrangement template in the first editing interface of the gallery application, instead of editing the N first images through a special third-party retouching application, so that the user does not need to open the multi In the gallery application, you can quickly and conveniently control the electronic device to edit the N first images.
  • the electronic device can send the target image array in the gallery application through the communication application, there is no need to control the communication application to call up images from the gallery application one by one to edit and send these images in real time. That is, the user can trigger the electronic device to edit and share images quickly and conveniently through the integrated operation of the gallery application and the communication application.
  • the electronic device in the embodiment of the present invention may be a mobile electronic device or a non-mobile electronic device.
  • Mobile electronic devices can be mobile phones, tablet computers, notebook computers, handheld computers, vehicle terminals, wearable devices, ultra-mobile personal computers (UMPC), netbooks, or personal digital assistants (personal digital assistants, PDAs), etc.
  • the non-mobile electronic device may be a personal computer (PC), a television (television, TV), a teller machine, or a self-service machine, etc.; the embodiment of the present invention does not specifically limit it.
  • The execution subject may be an electronic device, the central processing unit (CPU) of the electronic device, or a control module in the electronic device for performing the image processing method.
  • an image processing method executed by an electronic device is taken as an example to illustrate the image processing method provided in the embodiment of the present invention.
  • the electronic device in the embodiment of the present invention may be an electronic device with an operating system.
  • The operating system may be an Android operating system, an iOS operating system, or another possible operating system, which is not specifically limited in the embodiment of the present invention.
  • the following takes the Android operating system as an example to introduce the software environment to which the image processing method provided by the embodiment of the present invention is applied.
  • As shown in FIG. 1, it is a schematic structural diagram of a possible Android operating system provided by an embodiment of the present invention.
  • the architecture of the Android operating system includes 4 layers, which are: application layer, application framework layer, system runtime library layer, and kernel layer (specifically, it may be the Linux kernel layer).
  • the application layer includes various applications (including system applications and third-party applications) in the Android operating system.
  • The application framework layer is the framework of applications. Developers can develop applications based on the application framework layer while complying with the development principles of the application framework, for example, system applications such as the system settings application, system chat application, and system camera application, as well as third-party applications such as third-party settings, camera, and chat applications.
  • the system runtime layer includes a library (also called a system library) and an Android operating system runtime environment.
  • the library mainly provides various resources needed by the Android operating system.
  • the Android operating system operating environment is used to provide a software environment for the Android operating system.
  • the kernel layer is the operating system layer of the Android operating system and belongs to the lowest level of the Android operating system software level.
  • the kernel layer is based on the Linux kernel to provide core system services and hardware-related drivers for the Android operating system.
  • Developers can develop a software program that implements the image processing method provided by the embodiment of the present invention based on the system architecture of the Android operating system shown in FIG. 1, so that the image processing method can run based on the Android operating system shown in FIG. 1. That is, the processor or the electronic device can implement the image processing method provided by the embodiment of the present invention by running the software program in the Android operating system.
  • the image processing method provided by the embodiment of the present invention will be described in detail below in conjunction with the flowchart of the image processing method shown in FIG. 2.
  • Although the logical sequence of the image processing method provided by the embodiment of the present invention is shown in the method flowchart, in some cases the steps shown or described may be performed in a different order.
  • the image processing method shown in FIG. 2 may include S201-S205:
  • S201: The electronic device displays a first editing interface.
  • the first editing interface includes M image areas indicated by the target arrangement template, and N first images are displayed on the N image areas in the first editing interface.
  • each of the N image areas displays a first image, and the first images displayed on different image areas are different.
  • N is a positive integer less than or equal to M.
  • the first editing interface may be an interface in a gallery application in an electronic device.
  • the N image areas in the first editing interface are N image areas among the M image areas.
  • When the electronic device automatically fills images into the M image areas, the electronic device can first fill images into the image areas whose display positions come earlier among the M image areas, and then fill images into the image areas whose display positions come later.
  • The electronic device filling an image into an image area means displaying the image on that image area, as sketched below.
  • Filling an image into an image area and displaying an image on an image area refer to the same operation; different descriptions are simply used in different places.
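  • As a minimal illustration (not taken from the patent; the names ImageArea and fillAreasInOrder are hypothetical), the following Kotlin sketch shows the fill order described above: image areas with earlier display positions are filled before those with later display positions.

      // Hypothetical model of one image area in the arrangement template.
      data class ImageArea(val displayIndex: Int, var imageUri: String? = null)

      // Fill images into the areas with earlier display positions first.
      fun fillAreasInOrder(areas: List<ImageArea>, imageUris: List<String>) {
          val ordered = areas.sortedBy { it.displayIndex }
          ordered.zip(imageUris).forEach { (area, uri) -> area.imageUri = uri }
      }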
  • the target arrangement template may be a three-square grid, a six-square grid, or a nine-square grid.
  • it can also be other templates, which are not specifically limited in the embodiment of the present invention.
  • The target arrangement template is specifically used to indicate the display positions and display sizes of the M image areas.
  • In the case where the target arrangement template is a three-square grid, the target arrangement template includes 3 image areas, and it is used to indicate that the three image areas are arranged in one row and three columns, from left to right, on the screen of the electronic device, and that the display sizes of the three image areas are the same.
  • In the case where the target arrangement template is a six-square grid, the target arrangement template includes 6 image areas, and it is used to indicate that these 6 image areas are arranged in two rows and three columns, from left to right and top to bottom, on the screen of the electronic device, and that the display sizes of these 6 image areas are the same.
  • In the case where the target arrangement template is a nine-square grid, the target arrangement template includes 9 image areas, and it is used to indicate that these 9 image areas are arranged in three rows and three columns, from left to right and top to bottom, on the screen of the electronic device, and that the display sizes of these 9 image areas are the same.
  • In the following, the case where the target arrangement template is a nine-square grid is taken as an example to illustrate the image processing method provided by the embodiment of the present invention; a sketch of such a template model follows.
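  • As a minimal, hypothetical Kotlin sketch (the names ArrangementTemplate and cellSizePx and the factory values are assumptions, not the patent's data structures), an arrangement template that indicates the display positions and display sizes of its M image areas could be modeled as follows:

      // Hypothetical model: rows x columns grid of equally sized image areas.
      data class ArrangementTemplate(val rows: Int, val columns: Int, val cellSizePx: Int) {
          val areaCount: Int get() = rows * columns  // M image areas

          // Display position (row, column) of the i-th area, left to right, top to bottom.
          fun positionOf(index: Int) = Pair(index / columns, index % columns)
      }

      // The three-, six-, and nine-square grids described above (cell size is an assumed value).
      val threeSquareGrid = ArrangementTemplate(rows = 1, columns = 3, cellSizePx = 300)
      val sixSquareGrid = ArrangementTemplate(rows = 2, columns = 3, cellSizePx = 300)
      val nineSquareGrid = ArrangementTemplate(rows = 3, columns = 3, cellSizePx = 300)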
  • S202: The electronic device receives a first input.
  • the first input is used to trigger editing of images on the M image regions indicated by the target arrangement template.
  • the first input may be the user's input on M image areas.
  • the screen of the electronic device may be a touch screen, and the touch screen may be used to receive input from a user and display content corresponding to the input to the user in response to the input.
  • the above-mentioned first input may be touch screen input, fingerprint input, gravity input, key input, etc.
  • Touch screen input is the user's input on the touch screen of the electronic device, such as press input, long-press input, slide input, tap input, or hover input (input by the user near the touch screen).
  • Fingerprint input is the user's input on the fingerprint reader of the electronic device, such as sliding a fingerprint, long-pressing a fingerprint, single-tapping a fingerprint, or double-tapping a fingerprint.
  • Gravity input is input such as the user shaking the electronic device in a specific direction or shaking it a specific number of times.
  • Key input is the user's input on keys of the electronic device such as the power key, volume key, or home key, including single-press input, double-press input, long-press input, and combination-key input.
  • the embodiment of the present invention does not specifically limit the first input manner, and may be any achievable manner.
  • the first input is a user's drag input of images on M image areas.
  • S203: In response to the first input, the electronic device edits at least one image among the N first images to obtain a target image array.
  • the electronic device can edit part or all of the N first images.
  • the images in the target image array are images arranged according to the target arrangement template.
  • the target image array includes the above N first images.
  • S204: The electronic device receives a second input.
  • the second input may be an input for the user to trigger the communication application to send the target image array.
  • the above-mentioned first editing interface may include a sharing control for triggering the electronic device to send the target image array through the communication application.
  • the second input may be an input to the sharing control in the first editing interface.
  • the second input is a click input to the share control in the target editing interface.
  • S205: In response to the second input, the electronic device sends the target image array.
  • the electronic device may send the target image array through the communication application therein.
  • the image processing method provided in the embodiment of the present invention may further include S206 after the foregoing S205:
  • S206: The electronic device displays the target image array on the sending interface according to the target arrangement template.
  • the foregoing sending interface is used to display images to be sent or sent by the electronic device.
  • the electronic device can intuitively display the images in the sent target image array to the user, for example, intuitively display the order of the images in the target image array arranged according to the target arrangement template.
  • the electronic device may display multiple sharing identifiers on the screen, and each sharing identifier is used to indicate an application or plug-in of an image to be shared.
  • For example, the electronic device may display the sending interface of the communication application and automatically fill the target image array into the sending interface; the user then triggers the communication application to send the target image array.
  • the second input may include an input that the user triggers the electronic device to select a communication application (for example, an input that triggers the electronic device to display a sending interface of the communication application), and an input that triggers the electronic device to send the target image array through the communication application.
  • The electronic device sending the target image array through the communication application can mean the electronic device sending the target image array to a social platform through the communication application, or sending the target image array to a corresponding communication object (that is, one or more contacts in the communication application).
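  • As one hedged example of how a gallery application on Android could hand the target image array to a communication application, the standard ACTION_SEND_MULTIPLE share intent can carry several image URIs at once; this is only an illustration of the sending step, not the patent's specific implementation, and it assumes the composed images are already available as content URIs.

      import android.content.Context
      import android.content.Intent
      import android.net.Uri

      // Hand the image URIs of the target image array to a communication application chosen by the user.
      fun shareImageArray(context: Context, imageUris: ArrayList<Uri>) {
          val intent = Intent(Intent.ACTION_SEND_MULTIPLE).apply {
              type = "image/*"
              putParcelableArrayListExtra(Intent.EXTRA_STREAM, imageUris)
              addFlags(Intent.FLAG_GRANT_READ_URI_PERMISSION)
          }
          // The chooser lets the user pick the communication application (social platform or contact).
          context.startActivity(Intent.createChooser(intent, "Share image array"))
      }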
  • the editing interface of the gallery application displayed by the electronic device includes the image area P1 to the image area P9, that is, the target arrangement template is a nine-square grid.
  • Images 1 to 9 are displayed on image areas P1 to P9, respectively. Image areas P1 to P9 are arranged in three rows and three columns, images 1 to 9 are accordingly arranged in three rows and three columns in image areas P1 to P9, and the sizes (that is, the display sizes) of image areas P1 to P9 are the same.
  • the target image array is image 1 to image 9 arranged in three rows and three columns.
  • Images 1 to 9 may all be custom images selected by the user controlling the electronic device, or images 1 to 9 may include some custom images selected by the user controlling the electronic device together with some preset filling images. That is, the N first images are all or part of images 1 to 9.
  • image 2, image 4, image 5, image 6, and image 8 in image 1 to image 9 may be the aforementioned N first images, that is, N is equal to 5.
  • the first input is the user's input to image 2, image 4, image 5, image 6, and image 8.
  • the editing interface shown in Figure 3 (a) also includes a sharing control S1.
  • After the user's input on the sharing control S1 (denoted as input 1), the electronic device can display the sending interface of the communication application, and the sending interface includes images 1 to 9 arranged in three rows and three columns.
  • the sending interface also includes a sending control Y.
  • After the user's input on the sending control Y (denoted as input 2), the electronic device can send images 1 to 9, arranged in three rows and three columns, to the social platform through the communication application.
  • the second input includes input 1 and input 2.
  • The image processing method provided by the embodiment of the present invention can display the first editing interface including the M image areas indicated by the target arrangement template, with N first images displayed on N of the M image areas. Subsequently, through the first input, at least one of the N first images can be edited to obtain the target image array, and through the second input, the target image array can be sent. In this way, when the user needs to share the images in the target image array, the user does not need to select the images one by one, edit them in real time, and then send them; instead, the images in the target image array, obtained by editing according to the target arrangement template in the first editing interface, are treated as a whole and can be selected and sent as a whole quickly and conveniently.
  • In addition, the electronic device can edit the N first images according to the target arrangement template in the first editing interface of the gallery application, instead of editing them through a dedicated third-party retouching application, so the user does not need to open multiple applications and can quickly and conveniently control the electronic device to edit the N first images within the gallery application.
  • Moreover, since the electronic device can send the target image array from the gallery application through the communication application, there is no need to control the communication application to call up images from the gallery application one by one in order to edit and send them in real time. That is, the user can trigger the electronic device to edit and share images quickly and conveniently through the combined operation of the gallery application and the communication application.
  • the image processing method provided in the embodiment of the present invention may further include S207 before the foregoing S203, and the corresponding foregoing S203 may be implemented through S203a:
  • S207: The electronic device displays K preset filling images on the K image areas in the first editing interface.
  • each of the K image areas displays a preset filling image
  • the preset filling images displayed on different image areas are different.
  • the electronic device may display K preset filling images on the K image areas provided in the gallery application.
  • S203a: In response to the first input, the electronic device edits at least one image among the N first images and edits at least one image among the K preset filling images, to obtain the target image array.
  • K is a positive integer
  • the sum of K and N is M. That is, N is less than M.
  • For example, the above K preset filling images may include image 1, image 3, image 7, and image 9; that is, K is equal to 4.
  • the aforementioned preset filling image may be user-defined or preset.
  • the preset filled image may be a blank image or an image including a preset pattern (such as a star pattern).
  • the above-mentioned first editing interface may include a "filling" control for triggering the electronic device to fill the preset filling image in the target arrangement template.
  • The "fill" control is used to trigger the electronic device to display one or more different preset filling images in the target editing interface, so that the user can select, from the one or more different preset filling images, the preset filling image to be filled into the target arrangement template.
  • In one example, the user may want the arranged images to have a certain visual interest, for example, the four corners of the nine-square grid all being blank images so that the images in the nine-square grid take on the shape of a cross.
  • In another example, when the electronic device arranges images in the target arrangement template, the number of images selected by the user may be too small to fill all of the image areas in the target arrangement template. The user therefore needs the electronic device to fill the image areas in the target arrangement template that are not yet filled with an image (the free image areas) with a preset filling image, for example, filling a preset filling image on the last free image area in the target arrangement template.
  • Since the electronic device can provide a preset filling image for the target arrangement template, the user does not need to find a specific image (such as a blank image) among a large number of images in the gallery application and then fill that specific image into an image area in the target arrangement template; the user can thus quickly and conveniently trigger the electronic device to fill the target arrangement template with the preset filling image.
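  • A minimal sketch of this preset filling step, assuming a simple slot model (TemplateSlot, PRESET_FILL_URI, and fillFreeAreas are hypothetical names, not the patent's API): every free image area is filled with the preset filling image, so the K filled areas plus the N user-selected areas make up the M areas.

      // Hypothetical slot in the arrangement template; null means the area is still free.
      data class TemplateSlot(val index: Int, var imageUri: String? = null)

      const val PRESET_FILL_URI = "preset://star_pattern"  // assumed placeholder for a preset filling image

      // Fill each free image area with the preset filling image and return K, the number filled.
      fun fillFreeAreas(slots: List<TemplateSlot>): Int {
          var filled = 0
          for (slot in slots) {
              if (slot.imageUri == null) {
                  slot.imageUri = PRESET_FILL_URI
                  filled++
              }
          }
          return filled
      }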
  • the user can trigger the electronic device to edit the image through one or more images in the gallery application.
  • the image processing method provided in the embodiment of the present invention further includes S208-S211 before the foregoing S201:
  • S208: The electronic device receives a third input while displaying P second images.
  • each of the P image areas displays a second image
  • the second images displayed on different image areas are different
  • P is a positive integer less than or equal to N.
  • the electronic device can display P second images in the gallery application.
  • P is equal to 1.
  • When the electronic device displays an image in the gallery application, that is, when the image preview interface is displayed, the user can perform a long-press input (that is, the third input) on an image in the gallery application.
  • the electronic device may display a “delete” control and a “add multiple images to the template” control on the screen.
  • the "delete” control is used to trigger the electronic device to delete the image selected by the electronic device.
  • the “add multiple images to the template” control is used to trigger the electronic device to display the above-mentioned target editing interface, so as to trigger the electronic device to edit the image selected by the electronic device through a certain arrangement template.
  • S209: In response to the third input, the electronic device displays at least one arrangement template identifier, where one arrangement template identifier is used to indicate one image arrangement template, and one image arrangement template includes a plurality of image areas.
  • the electronic device may provide a variety of arrangement templates, such as the above-mentioned three-square grid, six-square grid, or nine-square grid.
  • at least one arrangement template identifier can be a logo of three-square grid, six-square grid, or nine-square grid, respectively.
  • S210: The electronic device receives a fourth input on a target arrangement template identifier among the at least one arrangement template identifier.
  • the target arrangement template identifier is used to indicate the target arrangement template.
  • For example, the fourth input is the user's input on the identifier of the nine-square grid.
  • S211: In response to the fourth input, the electronic device displays a second editing interface.
  • The second editing interface includes the M image areas indicated by the target arrangement template, and P second images are displayed on the P image areas in the second editing interface.
  • the second editing interface and the first editing interface are the same editing interface displayed by the electronic device at different times, that is, the second editing interface is an editing interface in a gallery application.
  • P second images are images in N first images, and P is a positive integer less than or equal to N.
  • A plus sign may be displayed on an idle image area of the target arrangement template in the second editing interface; it is used to indicate that no image is displayed on that image area and to trigger the electronic device to fill an image into that image area.
  • As shown in (a) of FIG. 4, the electronic device displays the image preview interface of the gallery application.
  • the electronic device can display the "delete” control S3 and the "add multiple images to the template” control S4 on the image preview interface.
  • The electronic device can display a template selection interface, which includes a "Three-square grid" control S5, a "Six-square grid" control S6, a "Nine-square grid" control S7, and an extended control S8.
  • The "Three-square grid" control S5, the "Six-square grid" control S6, and the "Nine-square grid" control S7 are used to indicate the three-square grid arrangement template, the six-square grid arrangement template, and the nine-square grid arrangement template, respectively.
  • the extended control S8 is used to trigger the electronic device to select other arrangement templates provided by the electronic device, and to save some arrangement templates customized by the user.
  • the fourth input may include input 3 and input 4.
  • The editing interface displayed by the electronic device includes image area P1 to image area P9, where image area P1 displays image 1 and image areas P2 to P9 do not display images.
  • The image area P1 is the aforementioned P image areas, and image 1 is the aforementioned P second images; that is, the number of the P image areas is one.
  • The electronic device may provide at least one arrangement template identifier, so that the user can, by an input on the target arrangement template identifier among the at least one arrangement template identifier, trigger the electronic device to start editing the first images according to the target arrangement template in the second editing interface. In addition, since the electronic device can provide multiple arrangement template identifiers, the electronic device can arrange and edit the images selected by the user with different arrangement templates, which improves the diversity of the electronic device's arrangement and editing of images.
  • the image processing method provided in the embodiment of the present invention may further include S212 after S211 and before S201, and the corresponding S201 may be implemented through S201a:
  • S212: The electronic device receives Q fifth inputs.
  • each of the foregoing Q fifth inputs is used to trigger the electronic device to fill an idle image area in the second arrangement template with a user-defined image.
  • For example, after the user's input on an idle image area in the second arrangement template (for example, an input on the plus sign in the idle image area), denoted as input 5, the electronic device can display an image selection list.
  • The user then performs an input selecting an image from the image selection list (denoted as input 6), so that the electronic device determines that image as the image to be filled into the idle image area.
  • the image selection list may be a list that normally displays images in a gallery application, such as an image selection list that displays a group of images recently taken in a gallery application.
  • a fifth input may include the two inputs of input 5 and input 6 described above.
  • S201a: In response to the Q fifth inputs, the electronic device displays the first editing interface, where Q third images are displayed on the Q image areas in the first editing interface.
  • Q image areas are image areas other than P image areas among N image areas
  • Q third images are images other than P second images among N first images.
  • each of the Q image areas displays a third image
  • the third images displayed on different image areas are different.
  • For example, the electronic device can display an image selection list, and the image selection list includes image 2 to image 9. Subsequently, after the user's input on the image 2 shown in (b) of FIG. 5 (that is, the aforementioned input 6), as shown in (c) of FIG. 5, the electronic device can display image 2 on the image area P2 in the editing interface, while no image is displayed in image areas P3 to P9.
  • The electronic device may then display the editing interface shown in (a) of FIG. 3.
  • After the user triggers the electronic device to start displaying images on the image areas indicated by the target arrangement template, the user can also trigger the electronic device to add images to the free image areas indicated by the target arrangement template, thereby improving the flexibility with which the electronic device arranges images.
  • When the electronic device displays images in the target arrangement template, it can also display a list of images in the gallery application to support the user viewing the images in the gallery application in real time, for example, to facilitate the user adding images from the gallery application to the image areas in the target arrangement template.
  • the target editing interface also includes a list of target images.
  • the target image list includes Q third images
  • a fifth input is an input for dragging a third image from the target image list to an image area.
  • the target image list may be a list of thumbnails of images in a gallery application.
  • the user's left and right sliding input on the target image list can trigger the electronic device to update the image thumbnails displayed in the target image list, such as updating the thumbnails displayed as the most recently taken images.
  • the target image list may be located below the image area in the target arrangement template, such as located below the Jiugongge template.
  • The electronic device displays a target image list 61 below image area P1 to image area P9 in the editing interface, and the target image list 61 includes the thumbnail of image 2. Subsequently, the user's input of dragging the thumbnail of image 2 from the target image list 61 to the image area P2 (that is, a fifth input) triggers the electronic device to display the editing interface as shown in (c) of FIG. 5, that is, image 2 is displayed on the image area P2.
  • Since the electronic device can display the target image list in the editing interface (such as the first editing interface or the second editing interface), the user can conveniently and intuitively view images in the gallery application through the target image list while the electronic device arranges images in the target arrangement template.
  • In addition, the user can quickly and conveniently trigger the electronic device to fill images into the target arrangement template by dragging image thumbnails in the target image list onto image positions in the target arrangement template, thereby improving the speed and convenience with which the electronic device selects and arranges images.
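  • A rough Kotlin sketch of this drag interaction using the standard Android drag-and-drop framework (the callback onAreaFilled and the way the image URI is carried are assumptions; the patent does not prescribe this implementation):

      import android.content.ClipData
      import android.view.DragEvent
      import android.view.View
      import android.widget.ImageView

      // Called when the user starts dragging a thumbnail in the target image list.
      fun startThumbnailDrag(thumbnailView: View, imageUri: String) {
          val clip = ClipData.newPlainText("imageUri", imageUri)  // carry the image URI with the drag
          thumbnailView.startDragAndDrop(clip, View.DragShadowBuilder(thumbnailView), null, 0)
      }

      // Attached to each image area view in the editing interface; onAreaFilled is a hypothetical
      // callback that updates the template model and shows the image on the area.
      fun attachDropListener(areaView: ImageView, onAreaFilled: (String) -> Unit) {
          areaView.setOnDragListener { _, event ->
              if (event.action == DragEvent.ACTION_DROP) {
                  onAreaFilled(event.clipData.getItemAt(0).text.toString())
              }
              true  // accept all drag events over this area
          }
      }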
  • the user may need to set the image filled in the target arrangement template, such as setting the size and color tone of the image.
  • Optionally, the image processing method may further include S213 before the foregoing S203b, and correspondingly, S203b may be implemented through S203c:
  • S213: The electronic device receives a sixth input.
  • S203c: In response to the sixth input, the electronic device edits at least one of the N first images and edits at least one of the K preset filling images by performing a target operation, to obtain the target image array.
  • the target operation includes any one of the following operations 1 to 4:
  • Operation 1: Merge the images of different image areas in the target arrangement template.
  • In this case, the sixth input may be the user's input on the images of different image areas, such as a continuous sliding input across the images on different image areas.
  • For example, the electronic device may merge image 3 and image 6 into an image 10.
  • the diversity of the images arranged by the electronic device in the target arrangement template can be improved, and the user experience can be improved.
  • Operation 2: Split the image of the target image area in the target arrangement template.
  • In this case, the sixth input may be the user's input on the target image area, such as a long-press input on the image in that image area.
  • The pressing duration of the sixth input may correspond to the number of images into which the image in the target image area is divided. For example, if the pressing duration of the sixth input falls within one duration range, the sixth input triggers the electronic device to divide the image on one image area into two images; if the pressing duration of the sixth input falls within another duration range (for example, between 1.5 seconds and 2.5 seconds), the sixth input triggers the electronic device to divide the image on one image area into three images.
  • For example, the electronic device may divide image 3 into two images arranged one above the other.
  • Operation 3: Adjust the image area where an image in the target arrangement template is located.
  • the sixth input may be an input of the user dragging an image on a different image area.
  • For example, the user can drag image 3 shown in (a) of FIG. 3 from image area P3 onto image area P6 to trigger the electronic device to display image 6 on image area P3 and display image 3 on image area P6; a sketch of operations 1 to 3 follows.
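  • As an illustration only (GridModel and its methods are hypothetical; the sketch tracks which image occupies which image area and does no pixel processing), operations 1 to 3 can be thought of as simple transformations of the grid model behind the editing interface:

      // Hypothetical in-memory model: images[i] holds the image identifier(s) shown on area i.
      class GridModel(private val images: MutableList<MutableList<String>>) {
          // Operation 1: merge the images of two image areas into one combined image.
          fun merge(areaA: Int, areaB: Int, mergedId: String) {
              images[areaA] = mutableListOf(mergedId)
              images[areaB] = mutableListOf(mergedId)  // both areas now show the merged image
          }

          // Operation 2: split the image of one image area into `parts` sub-images.
          fun split(area: Int, parts: Int) {
              val original = images[area].first()
              images[area] = MutableList(parts) { i -> "$original-part${i + 1}" }
          }

          // Operation 3: swap the image areas of two images (drag one image onto the other area).
          fun swap(areaA: Int, areaB: Int) {
              val tmp = images[areaA]
              images[areaA] = images[areaB]
              images[areaB] = tmp
          }
      }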
  • Operation 4: The operation indicated by a target function.
  • the target function is used to set part or all of the images in the target arrangement template.
  • The electronic device may display one or more function controls on the editing interface (such as the first editing interface and the second editing interface) to support the user triggering the electronic device to execute the corresponding target function.
  • a function control corresponds to a target function.
  • the electronic device may display the above-mentioned one or more functional controls in a one-level menu, or display the above-mentioned one or more functional controls in a multi-level menu.
  • the electronic device may display an extended control in the upper right corner of the editing interface in a gallery application, and the next level of the extended control is one or more functional controls, that is, the extended control is used to expand one or more functional controls.
  • The target function is any one of the following: image cropping, image rotation, image hue adjustment, image sharpening, image color temperature adjustment, image saturation adjustment, image brightness adjustment, image contrast adjustment, image filter addition, and filling an image into a free image area among the M image areas indicated by the target arrangement template.
  • the image filter may be a style filter, such as a character filter, a landscape filter, and a gourmet filter.
  • The image filter can also include image special effects, such as displaying a specific pattern, for example a rainbow, hearts, or an apple, in the image display area.
  • The electronic device may perform the operation indicated by the target function on part of the images, such as adding a filter to some of the images; alternatively, the electronic device can perform the operation indicated by the target function on all of the images, such as adding a filter to the entire image array, for example, adding a filter including a rainbow pattern across the entire image array. A sketch of such a filter adjustment follows.
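  • A hedged Kotlin sketch of applying one such adjustment (saturation plus a brightness offset) to every image in the array using the standard Android ColorMatrix API; the patent does not prescribe this implementation, and the parameter values are examples only.

      import android.graphics.Bitmap
      import android.graphics.Canvas
      import android.graphics.ColorMatrix
      import android.graphics.ColorMatrixColorFilter
      import android.graphics.Paint

      // Return a copy of `source` with the given saturation and brightness offset applied.
      fun applyAdjustment(source: Bitmap, saturation: Float, brightness: Float): Bitmap {
          val matrix = ColorMatrix().apply {
              setSaturation(saturation)             // 0f = grayscale, 1f = unchanged
              postConcat(ColorMatrix(floatArrayOf(  // add a uniform brightness offset per channel
                  1f, 0f, 0f, 0f, brightness,
                  0f, 1f, 0f, 0f, brightness,
                  0f, 0f, 1f, 0f, brightness,
                  0f, 0f, 0f, 1f, 0f
              )))
          }
          val result = Bitmap.createBitmap(source.width, source.height, Bitmap.Config.ARGB_8888)
          val paint = Paint().apply { colorFilter = ColorMatrixColorFilter(matrix) }
          Canvas(result).drawBitmap(source, 0f, 0f, paint)
          return result
      }

      // Apply the same filter to all images in the target image array.
      fun applyToAll(images: List<Bitmap>, saturation: Float, brightness: Float): List<Bitmap> =
          images.map { applyAdjustment(it, saturation, brightness) }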
  • the editing interface displayed by the electronic device includes an editing control K1, a filter control K2, a filling control K3, a sharing control K4, and an extended control K5.
  • The editing control K1 is used to provide functions such as image cropping, image rotation, image hue adjustment, image sharpening, image color temperature adjustment, image saturation adjustment, image brightness adjustment, and image contrast adjustment.
  • the filter control K2 is used to provide filter addition function.
  • the filling control K3 is used to trigger the electronic device to fill the preset filling image on the free image area in the target arrangement template.
  • the sharing control K4 is used to trigger the electronic device to send the target image array through the communication application.
  • the extended control K5 is used to provide other functions for setting the image, such as the function of switching the current arrangement template.
  • The electronic device may display one or more filter template identifiers under the target arrangement template, for example, a rainbow filter template identifier 91, together with a filter extension control 92 (used to display more filters). Subsequently, after the user's input on the rainbow filter template identifier 91, as shown in (b) of FIG. 9, the electronic device may display that a rainbow filter has been added across the entirety of images 1 to 9. This also improves the convenience with which the electronic device sets the same filter for multiple images.
  • As another example, the electronic device may display one or more filling effect identifiers under the target arrangement template, such as a filling effect identifier 10a (used to indicate the filling effect in which the images at the four corners of the nine-square grid are preset filling images with a star pattern) and a filling effect extension control 10b (used to display more filling effects). After the user's input on the filling effect identifier 10a, the electronic device can display preset filling images with star patterns in image area P1, image area P3, image area P7, and image area P9, respectively, and display image 2, image 4, image 5, image 6, and image 8 in image area P2, image areas P4 to P6, and image area P8, respectively.
  • Since the electronic device can provide multiple ways to edit the images in the target arrangement template, such as triggering different operations on the images through function controls with different functions, this helps provide the user with diverse ways of triggering the electronic device to edit images according to the target arrangement template.
  • the image processing method provided in the embodiment of the present invention may further include S214 after the foregoing S203:
  • S214: The electronic device saves at least one of the following: the target image array, and a target filter of the target image array.
  • the target filter includes at least one of the hue value, sharpening value, color temperature value, saturation value, brightness value, and contrast value of the target image array.
  • Since the electronic device can save the target image array and the target filter of the target image array, the user can subsequently obtain the saved target image array and target filter easily, which is beneficial for subsequently viewing or reusing the target image array and target filter. A sketch of one possible storage format follows.
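  • A minimal sketch of one possible storage format for the target filter, assuming a small JSON record saved next to the target image array; the field names mirror the values listed above, but the format itself is an assumption rather than the patent's method.

      import org.json.JSONObject
      import java.io.File

      // The per-array adjustment values named in the text above.
      data class TargetFilter(
          val hue: Float,
          val sharpen: Float,
          val colorTemperature: Float,
          val saturation: Float,
          val brightness: Float,
          val contrast: Float
      )

      // Serialize the target filter so it can be reloaded and reused later.
      fun saveTargetFilter(filter: TargetFilter, file: File) {
          val json = JSONObject()
              .put("hue", filter.hue.toDouble())
              .put("sharpen", filter.sharpen.toDouble())
              .put("colorTemperature", filter.colorTemperature.toDouble())
              .put("saturation", filter.saturation.toDouble())
              .put("brightness", filter.brightness.toDouble())
              .put("contrast", filter.contrast.toDouble())
          file.writeText(json.toString())
      }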
  • the electronic device 11 includes: a display module 11a, a receiving module 11b, an editing module 11c, and a sending module 11d;
  • The display module 11a is used to display a first editing interface, where the first editing interface includes M image areas indicated by the target arrangement template, and N first images are displayed on the N image areas in the first editing interface;
  • the receiving module 11b is used to receive the first input;
  • the editing module 11c is used to, in response to the first input received by the receiving module 11b, edit at least one of the N first images displayed by the display module 11a to obtain the target image array;
  • the receiving module 11b is also used to receive the second input;
  • the sending module 11d is used to, in response to the second input received by the receiving module 11b, send the target image array obtained by the editing module 11c; where N is a positive integer less than or equal to M.
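  • A minimal Kotlin sketch of this module split as plain interfaces; the method names and signatures are illustrative assumptions, not the patent's API.

      // Hypothetical interfaces mirroring the display, receiving, editing, and sending modules.
      interface DisplayModule {
          fun showFirstEditingInterface(templateAreaCount: Int, firstImages: List<String>)
      }

      interface ReceivingModule {
          fun onInput(handler: (inputId: Int) -> Unit)
      }

      interface EditingModule {
          fun edit(firstImages: List<String>): List<String>  // returns the target image array
      }

      interface SendingModule {
          fun send(targetImageArray: List<String>)
      }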
  • the display module 11a is further configured to display the target image array on the sending interface according to the target arrangement template after the sending module 11d sends the target image array; wherein the sending interface is used to display the image to be sent or already sent by the electronic device.
  • Optionally, the display module 11a is further configured to display K preset filling images on the K image areas in the first editing interface before the editing module 11c edits at least one of the N first images to obtain the target image array; the editing module 11c is specifically configured to edit at least one of the N first images and edit at least one of the K preset filling images, to obtain the target image array; where K is a positive integer, and the sum of K and N is M.
  • Optionally, the receiving module 11b is further configured to receive a third input when P second images are displayed, before the display module 11a displays the first editing interface; the display module 11a is further configured to display at least one arrangement template identifier in response to the third input received by the receiving module 11b, where one arrangement template identifier is used to indicate one image arrangement template, and one image arrangement template is used to indicate a plurality of image areas; the receiving module 11b is further configured to receive a fourth input on a target arrangement template identifier among the at least one arrangement template identifier; the display module 11a is further configured to display a second editing interface in response to the fourth input received by the receiving module 11b, the second editing interface including the M image areas indicated by the target arrangement template, with P second images displayed on the P image areas in the second editing interface; where the target arrangement template identifier is used to indicate the target arrangement template, the P second images are images among the N first images, and P is a positive integer less than or equal to N.
  • Optionally, the receiving module 11b is further configured to receive a sixth input before the editing module 11c edits at least one of the N first images and edits at least one of the K preset filling images to obtain the target image array.
  • the editing module 11c is specifically configured to edit at least one of the N first images and edit at least one of the K preset filled images by performing a target operation in response to the sixth input received by the receiving module 11b, Obtain the target image array; where the target operation includes any of the following: merge the images of different image areas in the M image areas, split the image of the target image area in the M image areas, and adjust the location of the image in the M image areas The image area of, and the operation indicated by the target function; the target function is used to edit the image in part or all of the M image areas, and the target image area is any one of the M image areas.
  • Optionally, the target function is any one of the following: image cropping, image rotation, image hue adjustment, image sharpening, image color temperature adjustment, image saturation adjustment, image brightness adjustment, image contrast adjustment, image filter addition, and filling an image into a free image area among the M image areas.
  • Optionally, the electronic device 11 further includes a saving module; the saving module is configured to, after the editing module 11c edits at least one of the N first images and edits at least one of the K preset filling images to obtain the target image array, save at least one of the following: the target image array, and a target filter of the target image array; where the target filter includes at least one of the hue value, sharpening value, color temperature value, saturation value, brightness value, and contrast value of the target image array.
  • the electronic device 11 provided in the embodiment of the present invention can implement each process implemented by the electronic device in the foregoing method embodiment, and to avoid repetition, details are not described herein again.
  • The electronic device provided by the embodiment of the present invention can display a first editing interface including the M image areas indicated by the target arrangement template, with N first images displayed on N of the M image areas. Subsequently, through the first input, at least one of the N first images can be edited to obtain the target image array, and through the second input, the target image array can be sent. In this way, when the user needs to share the images in the target image array, the user does not need to select the images one by one, edit them in real time, and then send them; instead, the images in the target image array, obtained by editing according to the target arrangement template in the first editing interface, are treated as a whole and can be selected and sent as a whole quickly and conveniently.
  • In addition, the electronic device can edit the N first images according to the target arrangement template in the first editing interface of the gallery application, instead of editing them through a dedicated third-party retouching application, so the user does not need to open multiple applications and can quickly and conveniently control the electronic device to edit the N first images within the gallery application.
  • Moreover, since the electronic device can send the target image array from the gallery application through the communication application, there is no need to control the communication application to call up images from the gallery application one by one in order to edit and send them in real time. That is, the user can trigger the electronic device to edit and share images quickly and conveniently through the combined operation of the gallery application and the communication application.
  • The electronic device 100 includes, but is not limited to, a radio frequency unit 101, a network module 102, an audio output unit 103, an input unit 104, a sensor 105, a display unit 106, a user input unit 107, an interface unit 108, a memory 109, a processor 110, a power supply 111, and other components.
  • The electronic device may include more or fewer components than those shown in the figure, or a combination of certain components, or a different component layout.
  • electronic devices include, but are not limited to, mobile phones, tablet computers, notebook computers, palmtop computers, in-vehicle electronic devices, wearable devices, and pedometers.
  • the processor 110 is configured to control the display unit 106 to display a first editing interface.
  • The first editing interface includes M image areas indicated by the target arrangement template, and N first images are displayed on the N image areas in the first editing interface.
  • the processor 110 is also used to control the user input unit 107 to receive the first input.
  • the processor 110 is further configured to edit at least one of the N first images displayed by the display unit 106 in response to the first input received by the user input unit 107 to obtain a target image array.
  • the processor 110 is also configured to control the user input unit 107 to receive the second input.
  • the processor 110 is further configured to control the radio frequency unit 101 to send the target image array in response to the second input received by the user input unit 107; where N is a positive integer less than or equal to M.
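A minimal sketch of the editing flow that the processor 110 is described as controlling: N first images are laid out on the image areas indicated by a target arrangement template and exported as one shareable target image array. The template format, canvas size, and file names below are hypothetical; this is only an illustration using Pillow, not the claimed method.

```python
# Minimal sketch: lay N source images out on the M image areas described by a
# hypothetical "arrangement template" (a list of pixel rectangles), leaving unused
# areas blank, and export the result as one shareable "target image array".
from PIL import Image

# Hypothetical 2x2 template: four (left, top, right, bottom) areas on an 800x800 canvas.
TEMPLATE = [(0, 0, 400, 400), (400, 0, 800, 400), (0, 400, 400, 800), (400, 400, 800, 800)]

def compose_image_array(image_paths, template, canvas_size=(800, 800)):
    """Paste each image into its template area; zip() stops after N areas (N <= M)."""
    canvas = Image.new("RGB", canvas_size, "white")
    for path, (l, t, r, b) in zip(image_paths, template):
        img = Image.open(path).convert("RGB").resize((r - l, b - t))
        canvas.paste(img, (l, t))
    return canvas

if __name__ == "__main__":
    first_images = ["img1.jpg", "img2.jpg", "img3.jpg"]  # N = 3, M = 4
    compose_image_array(first_images, TEMPLATE).save("target_image_array.jpg")
```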
  • the radio frequency unit 101 can be used for receiving and sending signals in the process of sending and receiving information or talking. Specifically, downlink data from the base station is received and then processed by the processor 110; in addition, uplink data is sent to the base station.
  • the radio frequency unit 101 includes, but is not limited to, an antenna, at least one amplifier, a transceiver, a coupler, a low noise amplifier, a duplexer, and the like.
  • the radio frequency unit 101 can also communicate with the network and other devices through a wireless communication system.
  • the electronic device provides users with wireless broadband Internet access through the network module 102, such as helping users to send and receive emails, browse web pages, and access streaming media.
  • the audio output unit 103 can convert the audio data received by the radio frequency unit 101 or the network module 102 or stored in the memory 109 into an audio signal and output it as sound. Moreover, the audio output unit 103 may also provide audio output related to a specific function performed by the electronic device 100 (for example, call signal reception sound, message reception sound, etc.).
  • the audio output unit 103 includes a speaker, a buzzer, a receiver, and the like.
  • the input unit 104 is used to receive audio or video signals.
  • the input unit 104 may include a graphics processing unit (GPU) 1041 and a microphone 1042.
  • the graphics processing unit 1041 processes image data of still pictures or videos obtained by an image capture device (such as a camera) in a video capture mode or an image capture mode.
  • the processed image frame can be displayed on the display unit 106.
  • the image frame processed by the graphics processor 1041 may be stored in the memory 109 (or other storage medium) or sent via the radio frequency unit 101 or the network module 102.
  • the microphone 1042 can receive sound, and can process such sound into audio data.
  • in the case of a telephone call mode, the processed audio data can be converted into a format that can be sent to a mobile communication base station via the radio frequency unit 101 for output.
  • the electronic device 100 further includes at least one sensor 105, such as a light sensor, a motion sensor, and other sensors.
  • the light sensor includes an ambient light sensor and a proximity sensor.
  • the ambient light sensor can adjust the brightness of the display panel 1061 according to the brightness of the ambient light.
  • the proximity sensor can turn off the display panel 1061 and/or the backlight when the electronic device 100 is moved to the ear.
  • the accelerometer sensor can detect the magnitude of acceleration in various directions (usually three axes), and can detect the magnitude and direction of gravity when stationary. It can be used to identify the posture of the electronic device (such as horizontal/vertical screen switching, related games, and magnetometer attitude calibration) and for vibration-recognition-related functions (such as a pedometer or percussion detection). The sensor 105 may also include a fingerprint sensor, a pressure sensor, an iris sensor, a molecular sensor, a gyroscope, a barometer, a hygrometer, a thermometer, an infrared sensor, etc., which will not be repeated here.
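As a rough illustration of the gravity-based posture recognition mentioned above, the sketch below derives a coarse portrait/landscape/flat decision from a single three-axis accelerometer sample. The axis convention and the threshold are assumptions for illustration, not the device's actual sensor API.

```python
# Illustrative only: coarse orientation from a static 3-axis accelerometer sample.
import math

def detect_orientation(ax: float, ay: float, az: float) -> str:
    """Classify posture from the direction of gravity (values in m/s^2)."""
    if math.sqrt(ax * ax + ay * ay) < 0.3 * 9.81:
        return "flat"  # gravity mostly along z: device lying face up or down
    return "portrait" if abs(ay) >= abs(ax) else "landscape"

print(detect_orientation(0.2, 9.6, 0.5))   # -> "portrait"
print(detect_orientation(9.7, 0.1, 0.4))   # -> "landscape"
```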
  • the display unit 106 is used to display information input by the user or information provided to the user.
  • the display unit 106 may include a display panel 1061, and the display panel 1061 may be configured in the form of a liquid crystal display (LCD), an organic light-emitting diode (OLED), or the like.
  • the user input unit 107 may be used to receive inputted numeric or character information, and generate key signal input related to user settings and function control of the electronic device.
  • the user input unit 107 includes a touch panel 1071 and other input devices 1072.
  • the touch panel 1071, also called a touch screen, can collect the user's touch operations on or near it (for example, operations performed by the user on or near the touch panel 1071 with a finger, a stylus, or any other suitable object or accessory).
  • the touch panel 1071 may include two parts: a touch detection device and a touch controller.
  • the touch detection device detects the user's touch position, detects the signal brought by the touch operation, and transmits the signal to the touch controller; the touch controller receives the touch information from the touch detection device, converts it into contact coordinates, sends the coordinates to the processor 110, and receives and executes commands sent by the processor 110.
  • the touch panel 1071 can be implemented in multiple types such as resistive, capacitive, infrared, and surface acoustic wave.
  • the user input unit 107 may also include other input devices 1072.
  • other input devices 1072 may include, but are not limited to, a physical keyboard, function keys (such as volume control buttons, switch buttons, etc.), trackball, mouse, and joystick, which will not be repeated here.
  • the touch panel 1071 can be overlaid on the display panel 1061.
  • when the touch panel 1071 detects a touch operation on or near it, it transmits the operation to the processor 110 to determine the type of the touch event, and the processor 110 then provides a corresponding visual output on the display panel 1061 according to the type of the touch event.
  • although the touch panel 1071 and the display panel 1061 are shown as two independent components to implement the input and output functions of the electronic device, in some embodiments the touch panel 1071 and the display panel 1061 may be integrated to implement the input and output functions of the electronic device; this is not specifically limited here.
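The following schematic sketch mirrors the detection, controller, and processor flow described above for the touch panel: a raw detection signal is converted into contact coordinates and an event type, and the processor then dispatches a corresponding visual response. All class and field names are hypothetical, not actual firmware or driver interfaces.

```python
# Schematic sketch of the touch pipeline; names and formats are hypothetical.
from dataclasses import dataclass
from typing import Callable, Dict

@dataclass
class TouchEvent:
    x: int          # contact coordinates produced by the "touch controller" stage
    y: int
    kind: str       # e.g. "tap", "long_press"

def touch_controller(raw_signal: dict) -> TouchEvent:
    """Convert a raw detection signal into contact coordinates and an event type."""
    return TouchEvent(x=raw_signal["col"], y=raw_signal["row"], kind=raw_signal["gesture"])

def processor_dispatch(event: TouchEvent, handlers: Dict[str, Callable[[TouchEvent], None]]) -> None:
    """Determine the event type and trigger the corresponding visual output."""
    handlers.get(event.kind, lambda e: None)(event)

handlers = {"tap": lambda e: print(f"highlight image area at ({e.x}, {e.y})")}
processor_dispatch(touch_controller({"col": 120, "row": 340, "gesture": "tap"}), handlers)
```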
  • the interface unit 108 is an interface for connecting an external device with the electronic device 100.
  • the external device may include a wired or wireless headset port, an external power source (or battery charger) port, a wired or wireless data port, a memory card port, a port for connecting a device with an identification module, audio input/output (I/O) port, video I/O port, headphone port, etc.
  • the interface unit 108 can be used to receive input (for example, data information, power, etc.) from an external device and transmit the received input to one or more elements in the electronic device 100, or can be used to transfer data between the electronic device 100 and the external device.
  • the memory 109 can be used to store software programs and various data.
  • the memory 109 may mainly include a program storage area and a data storage area.
  • the program storage area may store an operating system, an application program required by at least one function (such as a sound playback function, an image playback function, etc.), and the like; the data storage area may store data created according to the use of the mobile phone (such as audio data, a phone book, etc.), and the like.
  • the memory 109 may include a high-speed random access memory, and may also include a non-volatile memory, such as at least one magnetic disk storage device, a flash memory device, or another non-volatile solid-state storage device.
  • the processor 110 is the control center of the electronic device. It uses various interfaces and lines to connect the various parts of the entire electronic device, runs or executes software programs and/or modules stored in the memory 109, and calls data stored in the memory 109 to perform the various functions of the electronic device and process data, thereby monitoring the electronic device as a whole.
  • the processor 110 may include one or more processing units; preferably, the processor 110 may integrate an application processor and a modem processor, where the application processor mainly handles the operating system, the user interface, application programs, and the like, and the modem processor mainly handles wireless communication. It can be understood that the foregoing modem processor may also not be integrated into the processor 110.
  • the electronic device 100 may also include a power source 111 (such as a battery) for supplying power to various components.
  • the power source 111 may be logically connected to the processor 110 through a power management system, so as to implement functions such as charging management, discharging management, and power consumption management through the power management system.
  • the electronic device 100 includes some functional modules not shown, which will not be repeated here.
  • the embodiment of the present invention also provides an electronic device, including a processor 110, a memory 109, and a computer program stored in the memory 109 and executable on the processor 110. When the computer program is executed by the processor 110, each process of the foregoing method embodiment is implemented, and the same technical effect can be achieved; to avoid repetition, details are not described herein again.
  • the embodiment of the present invention also provides a computer-readable storage medium, and a computer program is stored on the computer-readable storage medium.
  • when the computer program is executed by a processor, each process of the foregoing method embodiment is implemented, and the same technical effect can be achieved; to avoid repetition, details are not described herein again.
  • the computer-readable storage medium may be, for example, a read-only memory (ROM), a random access memory (RAM), a magnetic disk, or an optical disk.
  • the method of the above embodiments can be implemented by means of software plus a necessary general hardware platform; it can, of course, also be implemented by hardware, but in many cases the former is the better implementation.
  • the technical solution of this application, in essence or in the part that contributes to the existing technology, can be embodied in the form of a software product. The computer software product is stored in a storage medium (such as a ROM/RAM, a magnetic disk, or an optical disc) and includes several instructions to make an electronic device (which may be a mobile phone, a computer, a server, an air conditioner, or a network device, etc.) execute the method described in each embodiment of the present application.

Abstract

The invention relates to an image processing method and an electronic device. The method comprises: displaying a first editing interface, the first editing interface including M image areas indicated by a target arrangement template, with N first images displayed on the N image areas in the first editing interface (S201); receiving a first input (S202); editing at least one of the N first images in response to the first input to obtain a target image array (S203); receiving a second input (S204); and sending the target image array in response to the second input (S205); N being a positive integer less than or equal to M.
PCT/CN2020/136731 2019-12-19 2020-12-16 Procédé de traitement d'image et dispositif électronique WO2021121253A1 (fr)

Applications Claiming Priority (2)

Application Number Priority Date Filing Date Title
CN201911319652.0A CN111127595B (zh) 2019-12-19 2019-12-19 图像处理方法及电子设备
CN201911319652.0 2019-12-19

Publications (1)

Publication Number Publication Date
WO2021121253A1 true WO2021121253A1 (fr) 2021-06-24

Family

ID=70500252

Family Applications (1)

Application Number Title Priority Date Filing Date
PCT/CN2020/136731 WO2021121253A1 (fr) 2019-12-19 2020-12-16 Procédé de traitement d'image et dispositif électronique

Country Status (2)

Country Link
CN (1) CN111127595B (fr)
WO (1) WO2021121253A1 (fr)

Cited By (2)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN114430460A (zh) * 2022-01-28 2022-05-03 维沃移动通信有限公司 拍摄方法、装置和电子设备
CN114500844A (zh) * 2022-01-28 2022-05-13 维沃移动通信有限公司 拍摄方法、装置和电子设备

Families Citing this family (3)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN111127595B (zh) * 2019-12-19 2023-11-03 维沃移动通信有限公司 图像处理方法及电子设备
CN111866379A (zh) * 2020-07-03 2020-10-30 Oppo广东移动通信有限公司 一种图像处理方法、图像处理装置、电子设备和存储介质
CN112312022B (zh) * 2020-10-30 2022-04-15 维沃移动通信有限公司 图像处理方法、图像处理装置、电子设备及存储介质

Citations (5)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN104168417A (zh) * 2014-05-20 2014-11-26 腾讯科技(深圳)有限公司 图片处理方法及装置
CN104881844A (zh) * 2015-06-29 2015-09-02 北京金山安全软件有限公司 一种图片组合的方法、装置以及终端设备
CN105320695A (zh) * 2014-07-31 2016-02-10 腾讯科技(深圳)有限公司 图片处理方法及装置
US20180095653A1 (en) * 2015-08-14 2018-04-05 Martin Hasek Device, method and graphical user interface for handwritten interaction
CN111127595A (zh) * 2019-12-19 2020-05-08 维沃移动通信有限公司 图像处理方法及电子设备

Family Cites Families (7)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US20120082401A1 (en) * 2010-05-13 2012-04-05 Kelly Berger System and method for automatic discovering and creating photo stories
CN103337086B (zh) * 2013-06-17 2015-11-25 北京金山安全软件有限公司 用于移动终端的图片编辑方法和装置
CN105955607B (zh) * 2016-04-22 2020-06-19 北京小米移动软件有限公司 内容分享方法和装置
CN106407365A (zh) * 2016-09-08 2017-02-15 北京小米移动软件有限公司 图片共享方法及装置
CN110147190B (zh) * 2018-06-29 2024-03-08 腾讯科技(深圳)有限公司 图像处理方法及电子终端
CN110084871B (zh) * 2019-05-06 2020-11-27 珠海格力电器股份有限公司 图像排版方法及装置、电子终端
CN110490808B (zh) * 2019-08-27 2023-07-07 腾讯科技(深圳)有限公司 图片拼接方法、装置、终端及存储介质

Patent Citations (5)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN104168417A (zh) * 2014-05-20 2014-11-26 腾讯科技(深圳)有限公司 图片处理方法及装置
CN105320695A (zh) * 2014-07-31 2016-02-10 腾讯科技(深圳)有限公司 图片处理方法及装置
CN104881844A (zh) * 2015-06-29 2015-09-02 北京金山安全软件有限公司 一种图片组合的方法、装置以及终端设备
US20180095653A1 (en) * 2015-08-14 2018-04-05 Martin Hasek Device, method and graphical user interface for handwritten interaction
CN111127595A (zh) * 2019-12-19 2020-05-08 维沃移动通信有限公司 图像处理方法及电子设备

Cited By (2)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN114430460A (zh) * 2022-01-28 2022-05-03 维沃移动通信有限公司 拍摄方法、装置和电子设备
CN114500844A (zh) * 2022-01-28 2022-05-13 维沃移动通信有限公司 拍摄方法、装置和电子设备

Also Published As

Publication number Publication date
CN111127595A (zh) 2020-05-08
CN111127595B (zh) 2023-11-03

Similar Documents

Publication Publication Date Title
WO2021121253A1 (fr) Procédé de traitement d'image et dispositif électronique
WO2021104195A1 (fr) Procédé d'affichage d'images et dispositif électronique
WO2019137429A1 (fr) Procédé de traitement d'image et terminal mobile
WO2021036531A1 (fr) Procédé de capture d'écran et équipement terminal
WO2020151519A1 (fr) Procédé d'entrée d'informations, dispositif terminal et support d'enregistrement lisible par ordinateur
WO2020215949A1 (fr) Procédé de traitement d'objet et dispositif terminal
WO2021104321A1 (fr) Procédé d'affichage d'image et dispositif électronique
WO2020151525A1 (fr) Procédé d'envoi de message et dispositif terminal
WO2021057585A1 (fr) Procédé d'affichage de message de notification et terminal mobile
KR102554191B1 (ko) 정보 처리 방법 및 단말
JP2021516374A (ja) 画像処理方法及びフレキシブルスクリーン端末
WO2020151460A1 (fr) Procédé de traitement d'objet et dispositif terminal
WO2020182035A1 (fr) Procédé de traitement d'image et dispositif terminal
WO2020238497A1 (fr) Procédé de déplacement d'icône et dispositif terminal
WO2020220873A1 (fr) Procédé d'affichage d'image et dispositif terminal
WO2021121398A1 (fr) Procédé d'enregistrement vidéo et dispositif électronique
WO2021057290A1 (fr) Procédé de commande d'informations et dispositif électronique
WO2020238911A1 (fr) Procédé d'envoi de message et terminal
WO2021073579A1 (fr) Procédé d'acquisition de capture d'écran à défilement et équipement terminal
WO2021057301A1 (fr) Procédé de commande de fichier et dispositif électronique
CN111064848B (zh) 图片显示方法及电子设备
CN111159449B (zh) 一种图像显示方法及电子设备
WO2020173316A1 (fr) Procédé d'affichage d'images, terminal et terminal mobile
WO2021017730A1 (fr) Procédé de capture d'écran et dispositif terminal
CN108696642B (zh) 整理图标的方法和移动终端

Legal Events

Date Code Title Description
121 Ep: the epo has been informed by wipo that ep was designated in this application

Ref document number: 20903001

Country of ref document: EP

Kind code of ref document: A1

NENP Non-entry into the national phase

Ref country code: DE

122 Ep: pct application non-entry in european phase

Ref document number: 20903001

Country of ref document: EP

Kind code of ref document: A1

32PN Ep: public notification in the ep bulletin as address of the adressee cannot be established

Free format text: NOTING OF LOSS OF RIGHTS PURSUANT TO RULE 112(1) EPC (EPO FORM 1205A DATED 20.02.2023)
