CN114968040A - Image processing method and device, electronic device and storage medium


Info

Publication number
CN114968040A
Authority
CN
China
Prior art keywords
image
button
editing
processing
adjusted
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Withdrawn
Application number
CN202210557949.6A
Other languages
Chinese (zh)
Inventor
陈思
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Beijing Sensetime Technology Development Co Ltd
Original Assignee
Beijing Sensetime Technology Development Co Ltd
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Beijing Sensetime Technology Development Co Ltd filed Critical Beijing Sensetime Technology Development Co Ltd
Priority to CN202210557949.6A priority Critical patent/CN114968040A/en
Publication of CN114968040A publication Critical patent/CN114968040A/en
Withdrawn legal-status Critical Current

Classifications

    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06F ELECTRIC DIGITAL DATA PROCESSING
    • G06F3/00 Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
    • G06F3/01 Input arrangements or combined input and output arrangements for interaction between user and computer
    • G06F3/048 Interaction techniques based on graphical user interfaces [GUI]
    • G06F3/0484 Interaction techniques based on graphical user interfaces [GUI] for the control of specific functions or operations, e.g. selecting or manipulating an object, an image or a displayed text element, setting a parameter value or selecting a range
    • G06F3/04845 Interaction techniques based on graphical user interfaces [GUI] for the control of specific functions or operations, e.g. selecting or manipulating an object, an image or a displayed text element, setting a parameter value or selecting a range for image manipulation, e.g. dragging, rotation, expansion or change of colour
    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06F ELECTRIC DIGITAL DATA PROCESSING
    • G06F3/00 Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
    • G06F3/01 Input arrangements or combined input and output arrangements for interaction between user and computer
    • G06F3/048 Interaction techniques based on graphical user interfaces [GUI]
    • G06F3/0484 Interaction techniques based on graphical user interfaces [GUI] for the control of specific functions or operations, e.g. selecting or manipulating an object, an image or a displayed text element, setting a parameter value or selecting a range
    • G06F3/04847 Interaction techniques to control parameter settings, e.g. interaction with sliders or dials

Landscapes

  • Engineering & Computer Science (AREA)
  • General Engineering & Computer Science (AREA)
  • Theoretical Computer Science (AREA)
  • Human Computer Interaction (AREA)
  • Physics & Mathematics (AREA)
  • General Physics & Mathematics (AREA)
  • Processing Or Creating Images (AREA)

Abstract

Embodiments of the present application disclose an image processing method and apparatus, an electronic device, and a storage medium. The method includes: displaying an image comparison page, where the image comparison page includes a toolbar and at least one image to be compared, and the toolbar includes at least one editing button for editing the at least one image to be compared; in response to a first selection operation on at least one of the images to be compared, determining the selected image as a first image; in response to a second selection operation on any one of the at least one editing button, displaying an interactive dialog box of the selected editing button, where the interactive dialog box includes an image editing parameter to be adjusted; in response to a first adjustment operation on the image editing parameter to be adjusted, obtaining an adjusted image editing parameter value; processing the first image based on the adjusted image editing parameter value; and displaying the processed first image in a new layer on the canvas area of the first image.

Description

Image processing method and device, electronic device and storage medium
Technical Field
The present application relates to the field of electronic device technology, and in particular relates to, but is not limited to, an image processing method and apparatus, an electronic device, and a storage medium.
Background
Conventional image comparison requires a person to determine, by visual inspection, the differences between an image to be compared and a sample image. Images captured in the field often suffer from poor sharpness, low contrast and various distortions caused by lighting conditions and shooting angles, and therefore need to be adjusted with an image processing tool so that the portraits in the images are clearly distinguishable.
In conventional scenarios, image processing is performed with software such as Adobe Photoshop, which must be installed and whose functions are complex, making quick and effective image processing difficult. Moreover, most image processing software is client software that has to be downloaded and is cumbersome to use. An image processing tool that makes it convenient to edit the image to be compared and the sample image in an image comparison scenario is therefore lacking.
Disclosure of Invention
The embodiment of the application provides an image processing method and device, electronic equipment and a storage medium.
The technical scheme of the embodiment of the application is realized as follows:
In one aspect, an embodiment of the present application provides an image processing method, the method including: displaying an image comparison page, where the image comparison page includes a toolbar and at least one image to be compared, and the toolbar includes at least one editing button for editing the at least one image to be compared; in response to a first selection operation on at least one of the at least one image to be compared, determining the selected image as a first image; in response to a second selection operation on any one of the at least one editing button, displaying an interactive dialog box of the selected editing button, where the interactive dialog box includes an image editing parameter to be adjusted; in response to a first adjustment operation on the image editing parameter to be adjusted, obtaining an adjusted image editing parameter value; processing the first image based on the adjusted image editing parameter value; and displaying the processed first image in a new layer on the canvas area of the first image.
In another aspect, an embodiment of the present application provides an image processing apparatus, including: the first display module is used for displaying an image comparison page, and the image comparison page comprises a toolbar and at least one image to be compared; the toolbar comprises at least one editing button for editing the at least one image to be compared; the determining module is used for responding to a first selection operation aiming at least one image in the at least one image to be compared, and determining the selected image as a first image; the second display module is used for responding to a second selected operation aiming at any one of the at least one editing button and displaying an interactive dialog box of the selected editing button; the interactive dialog box comprises image editing parameters to be adjusted; the first obtaining module is used for responding to a first adjusting operation aiming at the image editing parameter to be adjusted and obtaining an adjusted image editing parameter value; a processing module for processing the first image based on the adjusted image editing parameter value; and the third display module is used for displaying the processed first image on the canvas area of the first image in a new layer.
In another aspect, an embodiment of the present application provides an electronic device, which includes a memory and a processor, where the memory stores a computer program that is executable on the processor, and the processor implements the steps in the method when executing the program.
In yet another aspect, the present application provides a computer-readable storage medium, on which a computer program is stored, and the computer program, when executed by a processor, implements the steps of the method.
The technical solutions provided in the embodiments of the present application have at least the following beneficial effects:
In the embodiments of the present application, firstly, an image comparison page is displayed, the image comparison page including a toolbar and at least one image to be compared, and the toolbar including at least one editing button for editing the at least one image to be compared; in this way, visualized image processing operations can be realized in the image comparison page displayed at the web page side, which reduces the interaction complexity for the user of the image processing tool. Secondly, in response to a second selection operation on any one of the at least one editing button, an interactive dialog box of the selected editing button is displayed, the interactive dialog box including an image editing parameter to be adjusted; thus the image editing parameter to be adjusted can be obtained through visual operation of the interactive dialog box. Thirdly, in response to a first adjustment operation on the image editing parameter to be adjusted, an adjusted image editing parameter value is obtained, and the first image is processed based on the adjusted image editing parameter value; thus the image can be edited, based on the obtained image editing parameter, through the image comparison page at the web page side. Finally, the processed first image is displayed in a new layer on the canvas area of the first image; in this way, the layout of toolbar plus canvas area further improves the visualization of image processing operations at the web page side.
Drawings
To illustrate the technical solutions in the embodiments of the present application more clearly, the drawings used in the description of the embodiments are briefly introduced below. The drawings described below show only some embodiments of the present application; other drawings can be obtained from them by those skilled in the art without inventive effort, wherein:
fig. 1A is an alternative architecture diagram of an execution system of an image processing method according to an embodiment of the present application;
fig. 1B is a schematic flowchart of an image processing method according to an embodiment of the present disclosure;
fig. 2A is a first schematic diagram of layers provided in an embodiment of the present application;
fig. 2B is a second schematic diagram of layers provided in an embodiment of the present application;
fig. 2C is a third schematic diagram of layers provided in an embodiment of the present application;
fig. 3A is a schematic flowchart of an image processing method according to an embodiment of the present disclosure;
fig. 3B is a schematic diagram of an implementation provided by an embodiment of the present application;
fig. 3C is a schematic diagram of an implementation provided by an embodiment of the present application;
fig. 4 is a schematic structural diagram of an image processing apparatus according to an embodiment of the present disclosure;
fig. 5 is a schematic diagram of an implementation provided by an embodiment of the present application;
fig. 6 is a schematic diagram of an implementation provided by an embodiment of the present application;
fig. 7 is a schematic structural diagram of an image processing apparatus according to an embodiment of the present disclosure;
fig. 8 is a hardware entity diagram of an electronic device according to an embodiment of the present disclosure.
Detailed Description
In order to make the objects, technical solutions and advantages of the embodiments of the present application clearer, the technical solutions in the embodiments of the present application will be clearly and completely described below with reference to the drawings in the embodiments of the present application, and it is obvious that the described embodiments are some embodiments of the present application, but not all embodiments. The following examples are intended to illustrate the present application, but are not intended to limit the scope of the present application. All other embodiments, which can be derived by a person skilled in the art from the embodiments given herein without making any creative effort, shall fall within the protection scope of the present application.
In the following description, reference is made to "some embodiments" which describe a subset of all possible embodiments, but it is understood that "some embodiments" may be the same subset or different subsets of all possible embodiments, and may be combined with each other without conflict.
It should be noted that the terms "first/second/third" in the embodiments of the present application are only used to distinguish similar objects and do not imply a particular ordering of the objects. It should be understood that "first/second/third" may be interchanged, where permitted, in a specific order or sequence, so that the embodiments of the present application described herein can be implemented in an order other than that shown or described herein.
It will be understood by those within the art that, unless otherwise defined, all terms (including technical and scientific terms) used herein have the same meaning as commonly understood by one of ordinary skill in the art to which embodiments of the present application belong. It will be further understood that terms, such as those defined in commonly used dictionaries, should be interpreted as having a meaning that is consistent with their meaning in the context of the prior art and will not be interpreted in an idealized or overly formal sense unless expressly so defined herein.
To help understand the technical solution of the embodiments of the present application, the following describes concepts related to the embodiments of the present application:
the RGB curve is formed by superimposing curves of three channels of Red (R, Red) Green (G, Green) Blue (B, Blue), and is a luminance curve of the image for adjusting the luminance of the image. The horizontal axis of the RGB curve is the luminance of the original (before adjustment) image, 0 represents black, 1 to 254 represent gray, and 255 represents white; the histogram displayed on the horizontal axis indicates how many pixels exist in each brightness of the original image; the vertical axis is the brightness of the target (adjusted) image. When the image luminance is not adjusted, the RGB curve is a diagonal line and indicates that the luminance values on the horizontal axis (original image) and the vertical axis (target image) are equal.
A plurality of anchor points (control points) may be set on the RGB curve, and the anchor points are represented as (x, y), where x represents the luminance in the original image and y represents the luminance of the target image. The anchor points on the RGB curve correspond to pixel points in the original image.
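For illustration, the anchor-point mapping can be sketched in a few lines of TypeScript. This is only a minimal sketch, and the type and function names are assumptions for illustration rather than part of the embodiments: a 256-entry lookup table is built by linear interpolation between the anchor points and applied to the three channels of a canvas ImageData object; with only the anchors (0, 0) and (255, 255) the table reproduces the diagonal, i.e. the unadjusted curve.

    type Anchor = { x: number; y: number }; // x: original luminance, y: target luminance

    function buildCurveLut(anchors: Anchor[]): Uint8ClampedArray {
      const pts = [...anchors].sort((a, b) => a.x - b.x);
      const lut = new Uint8ClampedArray(256);
      for (let v = 0; v < 256; v++) {
        if (v <= pts[0].x) { lut[v] = pts[0].y; continue; }               // below first anchor
        if (v >= pts[pts.length - 1].x) { lut[v] = pts[pts.length - 1].y; continue; } // above last anchor
        let i = 0;
        while (v > pts[i + 1].x) i++;                                     // segment containing v
        const t = (v - pts[i].x) / (pts[i + 1].x - pts[i].x);
        lut[v] = pts[i].y + t * (pts[i + 1].y - pts[i].y);                // linear interpolation
      }
      return lut;
    }

    function applyCurve(image: ImageData, lut: Uint8ClampedArray): void {
      const d = image.data;
      for (let p = 0; p < d.length; p += 4) {
        d[p] = lut[d[p]];         // R
        d[p + 1] = lut[d[p + 1]]; // G
        d[p + 2] = lut[d[p + 2]]; // B (alpha channel is left unchanged)
      }
    }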
Affine transformation: an affine transformation may be used to adjust the shape and size of an image. With an affine transformation the image can be tilted by an arbitrary angle, and arbitrary stretching and shrinking of the image in the horizontal and vertical directions is allowed. The transformation is carried out by applying matrix multiplication and matrix addition to the matrix of the image.
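In matrix form, an affine transformation of a pixel coordinate (x, y) can be written as follows (a standard formulation, given here only for reference):

$$
\begin{pmatrix} x' \\ y' \end{pmatrix}
= \begin{pmatrix} a_{11} & a_{12} \\ a_{21} & a_{22} \end{pmatrix}
  \begin{pmatrix} x \\ y \end{pmatrix}
+ \begin{pmatrix} b_1 \\ b_2 \end{pmatrix},
\qquad
\text{e.g. scaling: } A = \begin{pmatrix} s_x & 0 \\ 0 & s_y \end{pmatrix},
\quad
\text{rotation by } \theta: A = \begin{pmatrix} \cos\theta & -\sin\theta \\ \sin\theta & \cos\theta \end{pmatrix}.
$$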
canvas: translated as a "canvas", a new canvas < canvas > element in the hypertext markup Language (HTML) 5, can dynamically draw graphics in the canvas in conjunction with a scripting Language. A Canvas may also be understood as a representation of a Canvas through which everything drawn is implemented, which may be understood as a tool for drawing. For example, for a Layer (Layer), each time the canvas draw xxx series function is called, a transparent Layer is generated to draw the graph.
Embodiments of the present disclosure provide an image processing method, which may be performed by a processor of an electronic device. The electronic device refers to an electronic device with image processing capability, such as a server, a notebook computer, a tablet computer, a desktop computer, a smart television, a set-top box, a mobile device (e.g., a mobile phone, a portable video player, a personal digital assistant, a dedicated messaging device, and a portable game device). An application may be run on the electronic device to interact with the user, which may be understood as a front end, which when implemented may be a Browser in Browser and Server architecture (B/S) mode, or a Client in Client and Server architecture (Client/Server, C/S) architecture. Opposite to the electronic device is a server device, on which an application program can be run, which can be implemented as a server in B/S or C/S mode.
Fig. 1A is an alternative architecture diagram of an execution system of an image processing method according to an embodiment of the present disclosure, and as shown in fig. 1A, an electronic device 300 is connected to an image capturing device 301 or a server 400 through a network 200. The network 200 may be a wide area network or a local area network, or a combination of both. The electronic device 300 and the image capturing device 301 may be physically separate or integrated. The image capturing device 301 may send or store the captured image to be compared and/or the sample image of the image to be compared to the electronic device 300 through the network 200. The electronic device 300 implements a technical solution of image comparison in the embodiment of the present disclosure.
In some embodiments, the system shown in fig. 1A may also have no network 200 and image capturing device 301, but only the electronic device 300, so that the electronic device locally obtains the image to be compared and performs image comparison. In other embodiments, the image capturing device 301 and the electronic device 300 may be integrated, for example, an electronic device with a camera, such as a mobile phone, so that the image capturing device 301 and the electronic device 300 may be connected by a bus instead of a network.
An embodiment of the present application provides an image processing method, which is applied to an electronic device, and as shown in fig. 1B, the method at least includes the following steps:
step S101, displaying an image comparison page, wherein the image comparison page comprises a toolbar and at least one image to be compared; the toolbar comprises at least one editing button for editing the at least one image to be compared;
the image alignment page may be used to: and displaying effect graphs obtained after at least one image processing operation is carried out on at least one image to be processed, wherein each effect graph can be an image to be compared. The image comparison page may display one image or more than two images (which may be images before or after the image editing process), for example, one image may be loaded first and then the image may be edited; the editing processing may be performed on at least two of the two or more images simultaneously, or may be performed on any one of the two or more images.
The toolbar includes various function buttons, for example a preview button, a save button, image color processing buttons and image form processing buttons. An image color processing button is a function button for adjusting image color; for example, the image color processing buttons may include buttons for adjusting the color saturation, the brightness and the exposure of the image, respectively. An image form processing button is a function button for processing the form of the image; for example, the image form processing buttons may include buttons for annotating, screenshotting, cropping and rotating the image, respectively.
Accordingly, an editing button may be a function button for adjusting the color of the image or a function button for adjusting the form of the image. When such a function button is in the activated state, the image to be processed can be subjected to image color processing through the function button for adjusting image color, or to image form processing through the function button for adjusting image form.
Step S102, in response to a first selection operation aiming at least one image in the at least one image to be compared, determining the selected image as a first image;
In the case where the images to be compared include only one image, the first selection operation is performed on that image, i.e. the first image is the image to be compared. In the case where the images to be compared include two or more images, the first selection operation is performed on one or more of them, i.e. the first image includes the selected one or more images. Where the first image includes two or more selected images, at least one identical editing operation may subsequently be performed on the selected images.
Step S103, responding to a second selection operation aiming at any one of the at least one editing button, and displaying an interactive dialog box of the selected editing button; the interactive dialog box comprises image editing parameters to be adjusted;
here, the second selection operation may be for selecting one edit button from among at least one edit button. The interactive dialog box is used for acquiring image editing parameters for editing the first image. The image editing parameters can be image color parameters and can also be image form parameters. For example, the image color parameter may include at least one of color saturation, brightness, exposure, pixel value, and the like, and the image shape parameter may include a deflection degree (clockwise or counterclockwise) and an adjustment step size (unit degree) of the deflection degree, where the deflection degree refers to an angle offset with respect to the input original image, for example, the image in fig. 3C is offset by 45 degrees clockwise with respect to the image in fig. 3B.
In one implementation, each of the edit buttons includes the following two states: an inactive state and an active state. The edit button is in an activated state in a case where the edit button is selected by the second selection operation, and the edit button is in an inactivated state in a case where the edit button is not selected by the second selection operation.
Taking the image color processing button as an example of an editing button: when the user selects the image color processing button, the button enters the activated state, the electronic device displays the interactive dialog box of the image color processing button, and the user adjusts the image color by changing the image color parameter to be adjusted in the interactive dialog box. When the user does not select the image color processing button, the button remains in the inactivated state and the electronic device does not display its interactive dialog box.
And step S104, responding to the first adjustment operation aiming at the image editing parameter to be adjusted, and acquiring the adjusted image editing parameter value.
Illustratively, in the case that the image editing parameter is an image color parameter, in response to an adjustment operation for the image color parameter to be adjusted, obtaining an adjusted image color parameter value; and under the condition that the image editing parameters are image morphological parameters, responding to the adjustment operation aiming at the image morphological parameters to be adjusted, and acquiring the adjusted image morphological parameter values.
Step S105, processing the first image based on the adjusted image editing parameter value;
exemplarily, in a case that the image editing parameter is an image color parameter, color adjustment is performed on the first image, for example, the image may be changed into a gray scale image, or the brightness of the image may be changed, etc.; if the image editing parameter is an image form parameter, form adjustment is performed on the first image, for example, the size and direction of the image are changed.
And step S106, displaying the processed first image on the canvas area of the first image in a new layer.
Illustratively, the gray value of the first image is adjusted from 64 to 32, and the first image with the gray value of 32 is stored in the new layer. The new layer may be a layer adjacent to the image with the gray value of 64.
In a possible implementation, the displayed image comparison page includes at least two canvas areas, in which at least two images to be compared are loaded one by one. In response to a first selection operation on at least one of the images to be compared, the unselected image among the images to be compared is determined as a second image, and the processed first image is displayed in a new layer on the canvas area of the first image so that it can be compared with the second image.
In the above embodiment, firstly, the image comparison page includes a toolbar and at least one image to be compared, the toolbar including at least one editing button for editing the at least one image to be compared; in this way, visualized image processing operations can be realized in the image comparison page displayed at the web page side, which reduces the interaction complexity for the user of the image processing tool.
Secondly, responding to a second selected operation aiming at any one of the at least one editing button, and displaying an interactive dialog box of the selected editing button; the interactive dialog box comprises image editing parameters to be adjusted; therefore, the image editing parameters to be adjusted can be acquired through the visual operation of the interactive dialog box.
Thirdly, responding to the first adjustment operation aiming at the image editing parameters to be adjusted, and acquiring adjusted image editing parameter values; processing the first image based on the adjusted image editing parameter value; therefore, the image can be edited based on the acquired image editing parameters to be adjusted through the image comparison page of the webpage end.
And finally, displaying the processed first image on the canvas area of the first image in a new layer. Therefore, the visualization of the webpage-side image processing operation can be further improved by using the layout mode of the toolbar and the canvas area.
The embodiment of the present application provides an image processing method, which further includes, before step S101, the following steps:
step S201, displaying an original page, wherein the original page comprises at least one canvas area, and each canvas area is used for loading at least one image to be compared;
step S202, in response to an image loading operation for at least one of the at least two canvas areas, correspondingly loading at least one image in the at least one canvas area with a first layer to obtain the image comparison page.
In an implementation manner, the images are compared with at least two canvas areas of the page, and the images to be loaded are loaded in the corresponding canvas areas in a one-to-one correspondence manner by using the first image layers; the new layer may be a second layer, and the processed first image is displayed on the canvas area of the first image in the second layer.
In an implementation manner, an original page is displayed, where the original page includes at least two canvas areas, and each canvas area is used for loading one image to be compared; and responding to the image loading operation aiming at least one canvas area in the at least two canvas areas, correspondingly loading an image in the first layer in the at least one canvas area, and obtaining the image comparison page.
As shown in fig. 2A, the original page 210 includes a canvas area 211 and a canvas area 212. The image to be compared 2111 is loaded in the canvas area 211 and the image to be compared 2121 is loaded in the canvas area 212, and the two canvas areas are in the same layer, i.e. the first layer. The image to be compared 2111 is compared with the image to be compared 2121 to observe the effect of the image editing processing. As shown in fig. 2B, one image 2211 is loaded on the layer 221 and the layer 222 of one canvas area, so that an image comparison page in which different layers can be operated is obtained; in this image comparison page, the images of different layers within one canvas area can be compared by superposition or by stitching.
In another implementation, an original page is displayed that includes one canvas area, each canvas area being used for loading at least two images to be compared; in response to an image loading operation on the canvas area, an image is correspondingly loaded in each of at least two layers of the canvas area to obtain the image comparison page. As also shown in fig. 2C, one canvas area 231 loads one image 2311 and the other canvas area 232 loads two images 2321 and 2322. When the canvas area 232 displays the image 2322 on the top layer, a side-by-side full-image comparison can be performed; and since the two images 2321 and 2322 are both loaded in the canvas area 232, a superposition comparison can also be performed within that canvas area alone.
In the above embodiment, by displaying an original page, the original page includes at least one canvas area, and each canvas area is used for loading at least one image to be compared; and responding to the image loading operation aiming at least one canvas area in the at least two canvas areas, correspondingly loading at least one image on the first layer in the at least one canvas area to obtain the image comparison page, so that the processed image can be displayed in a multi-angle manner in a manner of loading the image on different canvas areas or different layers of the same canvas area, and the image is processed in a visual manner.
In one implementation manner, step S103, in response to the second selection operation for any one of the at least one editing button, includes: in response to a second selected operation on any one of the at least one edit button, displaying, by the toolbar component, an interactive dialog of the selected edit button;
here, the toolbar component is configured to obtain the adjusted image color parameter value of the first image via a toolbar displayed on a page. The toolbar component may implement a toolbar plus interactive dialog interface layout. Illustratively, upon detecting a second selection operation in the toolbar, the toolbar component displays, in response to the second selection operation, an interactive dialog for the second selection operation, displaying an interface layout of the toolbar plus the interactive dialog. As shown in fig. 3B, the second selected operation may be an operation in which the user clicks an image color processing button 311, and the interactive dialog of the edit button may be an interactive dialog 312 of the image color processing button. As shown in fig. 3C, the second selected operation may be an operation in which the user clicks the image modality processing button 313, and the interactive dialog of the edit button may be the interactive dialog 314 of the image modality processing button.
In one implementation manner, the interactive dialog box includes: a title line, a pull-down menu, an operation detection area, an application button, an input box, and the like; wherein the title line is used for displaying the function of the second selected operation; taking image color processing as an example, the pull-down menu is used for selecting a color space curve displayed by the operation detection area, such as an RGB color space or a YUV color space. The operation detection area is used for displaying a color space curve selected by a user and an anchor point on the curve, and the user can change the image color parameters by dragging the anchor point. As shown in fig. 3B, the interactive dialog box of the image color processing button 312 includes a title line 3121, a pull-down menu 3122 for selecting a color space (RGB or YUV), an operation detection area 3123 for displaying a color space curve and dragging an anchor point on the curve, and an application button 3124. Upon activation of the apply button 3124, the modified image color parameters are applied.
Taking image form processing as an example, the pull-down menu is used for selecting the step length of angle change of the rotating image; the operation detection area is used for displaying the rotation angle and the rotation direction of the user operation image; the input frame is used for acquiring the rotation angle of the adjusting image input by the user. As shown in fig. 3C, the interactive dialog boxes of the image modality processing button 313 include a title line 3131, a drop-down menu 3132 for adjusting a step size of a rotation angle, an input box 3133 for inputting a deflection angle and a step size, an operation detection area 3134 for displaying a rotation angle and a rotation direction, and an application button 3135. After the application button 3135 is activated, the modified image morphological parameters are applied.
In some embodiments, the step S104, obtaining an adjusted image editing parameter value in response to a first adjustment operation for the image editing parameter to be adjusted, includes: responding to a first adjustment operation aiming at the image editing parameter to be adjusted, acquiring an adjusted image editing parameter value through the toolbar component, and sending the adjusted image editing parameter value to a corresponding image processing component;
here, the adjusted image editing parameter values are obtained through an interactive dialog implemented by the toolbar component, which may send the image editing parameters to an image processing component through an event notification message. Wherein the image processing component may include: a morphological processing component and a color processing component. Wherein the morphology processing component is to: and performing morphological processing such as turning and cutting on the image. The color processing component is to: and converting the image into a matrix in a color space such as RGB (red, green and blue), and performing affine transformation and other operations on the matrix to realize image color processing.
In some embodiments, step S105, processing the first image based on the adjusted image editing parameter value includes: acquiring the first image from a canvas component through the image processing component, and adjusting the first image based on the adjusted image editing parameter value;
illustratively, a binary image data stream is loaded by a browser on the web page side, and the binary image data stream is rendered on a canvas component in the browser using canvas technology.
Here, adjusting the image color of the first image may be achieved by re-determining the values of the parameters in the color space. When the adjusted image editing parameter values are the parameter values of the three RGB channels, any point on the RGB curve is taken as a control point (anchor point) and its RGB channel values are changed; the changed parameter values are notified to the color processing component by means of an event notification message, and the color processing component recalculates the RGB channel parameters according to the changed values and adjusts the RGB values of the image, thereby changing the color of the whole image.
In some embodiments, step S106, displaying the processed first image on the canvas area of the first image in a new layer, includes: sending the adjusted first image to the image layer component; and sending the adjusted first image to the canvas assembly through the image layer assembly in a new image layer, and displaying the new image layer on the canvas area of the first image through the canvas assembly.
Here, the canvas area includes a plurality of layers. The new layer is the top layer of the canvas area, and the image drawn on it is the adjusted first image; the unadjusted first image, in the form of the binary image data stream loaded by the browser, is drawn on the bottom layer of the canvas area.
In one implementation, where the at least one edit button comprises an image color manipulation button, displaying, by the toolbar component, an interactive dialog for the image color manipulation button in response to a second selected operation for the image color manipulation button; the interactive dialog box of the image color processing button comprises image color parameters to be adjusted; responding to a first adjustment operation aiming at the image color parameters to be adjusted, acquiring adjusted image color parameter values through the toolbar component, and sending the adjusted image color parameter values to a color processing component; acquiring the first image from a canvas assembly through the color processing assembly, adjusting the image color of the first image based on the adjusted image color parameter value, and sending the adjusted first image to an image layer assembly; and sending the adjusted first image to the canvas assembly through the layer assembly in a new layer, and displaying the new layer on the canvas area of the first image through the canvas assembly.
In another implementation of the above embodiment, in a case that the at least one editing button includes an image modality processing button, in response to a second selected operation for the image modality processing button, displaying an interactive dialog of the image modality processing button through the toolbar component; the interactive dialog box of the image form processing button comprises image form parameters to be adjusted; responding to a first adjusting operation aiming at the image form parameters to be adjusted, acquiring adjusted image form parameter values through the toolbar component, and sending the adjusted image form parameter values to a form processing component; acquiring the first image from a canvas assembly through the form processing assembly, adjusting the image form of the first image based on the adjusted image form parameter value, and sending the adjusted first image to a drawing layer assembly; and sending the adjusted first image to the canvas assembly through the layer assembly in a new layer, and displaying the new layer on the canvas area of the first image through the canvas assembly.
In the embodiment, the user can conveniently adjust the image on the webpage end in a simple manner through the interactive dialog box of the editing button displayed in the page, and the image editing parameter to be adjusted is generated.
In one implementation, in a case where the at least one edit button includes an image color processing button, the image processing component includes a color processing component for image color adjustment of the first image; the image editing parameters are image color parameters, and the image color parameters comprise at least one of color saturation, brightness and exposure;
under the condition that the at least one editing button comprises an image form processing button, the image processing component comprises a form processing component used for processing at least one of marking, screenshot, cutting and rotating the first image, the image editing parameters are form processing parameters, and the form processing parameters comprise position information of a marking frame, position information of the screenshot and the cutting and angle information of the rotation.
Here, the image color parameter may be obtained by adjusting an RGB curve. As shown in fig. 3B, an anchor point 3125 is determined in the RGB curve, coordinates of the anchor point are changed, and a parameter value corresponding to the changed anchor point coordinates is determined as an adjusted image color parameter value. Wherein, the horizontal axis of the RGB curve is the brightness of the original (before adjustment) image, 0 represents black, 1-254 represents gray, and 255 represents white; the histogram displayed on the horizontal axis indicates how many pixels exist in each brightness of the original image; the vertical axis is the brightness of the target (adjusted) image. When the image luminance is not adjusted, the RGB curve is a diagonal line and indicates that the luminance values on the horizontal axis (original image) and the vertical axis (target image) are equal.
Here, the image form processing parameter may be obtained by button selection, input of an input frame, selection of an operation detection region, or the like. As shown in fig. 3C, it may be obtained by means of a button 3135, an input box 3133, and an operation detection area 3134, etc.
In the above embodiment, by clicking an image color processing button or an image form processing button, the color processing component can be used to adjust image colors in multiple dimensions such as color saturation, brightness and exposure; or the form processing component can be used to annotate, screenshot, rotate and crop the image.
In an implementation manner, the canvas component and the toolbar component are located in a view layer, and the image processing component and the layer component are located in a view model layer; the toolbar component and the image processing component communicate with each other through an event notification message.
Here, MVVM (Model-View-ViewModel) is used to implement image processing at the web page side. In the MVVM pattern the Model carries the data, the View renders the presented interface, and the ViewModel is the bridge between the Model and the View: when data in the Model changes, the ViewModel detects the change and notifies the View layer to make the corresponding modification.
Here, the canvas component and the toolbar component are located in the View layer, and the canvas area, the toolbar, and the interactive dialog are exposed by rendering. The image processing assembly and the layer assembly are located on a ViewModel layer, the adjusted first image is calculated through the image processing assembly based on the received change of the image editing parameter, the adjusted first image is sent to a canvas assembly of the View layer through the layer assembly, and the adjusted first image is displayed on a canvas area after being rendered.
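A hedged MVVM-style sketch of this flow is given below; the class and method names are illustrative assumptions. The view model receives the parameter change from the View layer (the toolbar), lets a processing function (standing in for the image processing component) compute the adjusted image, and notifies the View layer (the canvas) to re-render.

    class ImageEditViewModel {
      private listeners: Array<(img: ImageData) => void> = [];

      constructor(private source: ImageData,
                  private process: (img: ImageData) => ImageData) {}

      // The canvas component of the View layer registers here to be re-rendered.
      onProcessed(listener: (img: ImageData) => void): void {
        this.listeners.push(listener);
      }

      // Called when the model data (the editing parameters) changes.
      parameterChanged(): void {
        const result = this.process(this.source);   // delegate to the image processing component
        this.listeners.forEach(l => l(result));     // notify the View layer
      }
    }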
In another implementation, MVC (Model View Controller) is used to implement image processing on the web page side. The Controller functions the same as the ViewModel. Under the MVC model, the canvas component and the toolbar component are located at the View layer. The image processing component and the layer component are positioned in a Controller layer.
In another implementation, MVP (Model-View-Presenter) is used to implement image processing at the web page side, with the Presenter taking the place of the Controller. Under the MVP pattern, the canvas component and the toolbar component are located in the View layer, and the image processing component and the layer component are located in the Presenter layer.
In the above embodiment, the canvas component and the toolbar component are located in a view layer, and the image processing component and the layer component are located in a view model layer; the toolbar component and the image processing component communicate with each other through an event notification message. Therefore, complex processing and operation of the image can be completed at the webpage end in a mode of combining the view layer with the view model layer, convenience of image processing is improved, and user experience is improved.
In one implementation manner, before step S101, as shown in fig. 3A, the method further includes:
step S301, acquiring configuration information of a toolbar, wherein the configuration information of the toolbar is used for selecting configured function buttons from a candidate button set of the toolbar; the candidate button set comprises a preview button, a storage button, an image color processing button and an image form processing button, wherein the image color processing button comprises an editing button used for adjusting the color saturation, the brightness and the exposure of the image respectively, and the image form processing button comprises an editing button used for marking, screenshot, cutting and rotating the image respectively;
in this embodiment, a configuration component may be provided during implementation, and an interactive dialog box of the configuration component is displayed in response to a user clicking a configuration button of a page; within the dialog box, the user obtains configuration information for the get toolbar by selecting each function button for configuration from the candidate button set for the toolbar.
For example, the candidate button set may include an image color manipulation button, and the user may deselect the identifier of the image color manipulation button, such that the configuration information of the toolbar is used to characterize the identifier of the image color manipulation button, and thus, no color manipulation component is loaded in the subsequent image manipulation component set. If the candidate button set comprises an image morphological processing button, the user can select the identifier of the image morphological processing button, so that the configuration information of the toolbar is used for representing the identifier of the selected image morphological processing button, and morphological processing components need to be loaded in the subsequent image processing component set. In other words, each function button in the toolbar is associated with some component in the image processing component set, and in the case that the identification of the button is selected, the component corresponding to the button needs to be loaded.
Step S302, loading target image processing components related to each function button selected and configured from an image processing component set based on the configuration information of the toolbar, wherein the image processing component set comprises a canvas component, a toolbar component, an image layer component, a persistence component, a form processing component and a color processing component;
step S303, displaying corresponding target function buttons on the toolbar based on the configuration information of the toolbar.
Here, in the case where the identifications of the function buttons are selected, the corresponding function buttons are displayed on the toolbar. In the case where the identification of some of the function buttons is not selected, the corresponding function buttons are not displayed on the toolbar.
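A sketch of this plug-in style, on-demand loading might look as follows; the button identifiers and loader functions are assumptions, since the embodiment only requires that the components associated with the configured buttons are loaded and that the corresponding buttons are displayed.

    type ButtonId = 'preview' | 'save' | 'color' | 'form';

    interface ToolComponent { init(): void; }

    const componentRegistry: Record<ButtonId, () => ToolComponent> = {
      preview: () => ({ init: () => console.log('preview component loaded') }),
      save:    () => ({ init: () => console.log('persistence component loaded') }),
      color:   () => ({ init: () => console.log('color processing component loaded') }),
      form:    () => ({ init: () => console.log('form processing component loaded') }),
    };

    function loadToolbar(config: ButtonId[]): ToolComponent[] {
      return config.map(id => {
        const component = componentRegistry[id](); // load only configured components
        component.init();
        return component;                          // the matching button is then rendered in the toolbar
      });
    }

    // Example: a toolbar configured without the color tools loads no color processing component.
    loadToolbar(['preview', 'save', 'form']);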
In the embodiment, based on the plug-in organization mode, the components in the image processing tool can be loaded as required, and the number of the loaded image processing components and the function buttons in the display interface can be reduced by loading the target function buttons and the target image processing components as required, so that the interaction complexity of the user using the image processing tool is reduced.
The embodiment of the present application provides an image processing method, after step S106, further including steps S304 and S305, where:
step S304, under the condition that the processed first image is in an activated state, responding to a first trigger operation aiming at the preview button, and displaying the processed first image by taking a preview layer as a top layer of a canvas area of the first image;
as shown in fig. 2C, after the first trigger operation is detected, the preview layer 221 is displayed as a top layer of the canvas area.
Step S305, in response to the completion of the preview, delete the preview layer.
Illustratively, in response to a preview operation, the preview is layered on top of the canvas area of the first image, and the preview layer is deleted after the preview is completed.
In this embodiment, with a toolbar including a preview button, in response to a first trigger operation for the preview button when the processed first image is in an activated state, displaying the processed first image with a preview layer as a top layer of a canvas area of the first image; and deleting the preview layer after the preview is finished. Therefore, the user can select whether to back the processing effect or not in the preview process through the preview button, or the processing effect is stored, and the use experience of the user is improved.
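An illustrative sketch of the preview-layer lifecycle is given below (the DOM handling and styling are assumptions): the preview is an extra canvas stacked on top of the canvas area and is removed when the preview ends, so the underlying layers are left untouched.

    function showPreview(canvasArea: HTMLElement, processed: CanvasImageSource,
                         width: number, height: number): () => void {
      const preview = document.createElement('canvas');
      preview.width = width;
      preview.height = height;
      preview.style.position = 'absolute'; // sits above the existing layers
      preview.style.top = '0';
      preview.style.left = '0';
      preview.getContext('2d')?.drawImage(processed, 0, 0, width, height);
      canvasArea.appendChild(preview);     // preview layer becomes the top layer
      // The returned callback deletes the preview layer once previewing is finished.
      return () => { canvasArea.removeChild(preview); };
    }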
In one implementation, the toolbar further includes a save button; the method further comprises the following steps:
step S306, responding to a second trigger operation aiming at the saving button, associating and saving the adjusted image editing parameter value and the processed first image at a server, and/or replacing the preview image layer with the image layer of the first image;
here, saving the first image on the server side may persist the first image. When the B/S architecture is adopted for implementation, if a save button is clicked, the effect of the editing processing can be applied and/or saved at the server, and the preview layer is used for replacing the layer of the first image.
In an implementation manner, when implemented by using a C/S architecture, the method further includes: and responding to a second trigger operation aiming at the saving button, and associating and saving the adjusted image editing parameter value and the processed first image in a local disk or a cache. Thus, the first image is stored in the client, so that the user can conveniently use the edited image after editing the image.
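Under a B/S deployment, persisting the result could be sketched as below; the endpoint URL and payload fields are hypothetical, since the embodiment only states that the adjusted parameter value and the processed first image are saved in association at the server.

    async function saveEditedImage(imageId: string, params: unknown,
                                   canvas: HTMLCanvasElement): Promise<void> {
      // Serialize the processed first image from the canvas.
      const blob: Blob = await new Promise((resolve, reject) =>
        canvas.toBlob(b => (b ? resolve(b) : reject(new Error('toBlob failed'))), 'image/png'));
      const form = new FormData();
      form.append('imageId', imageId);
      form.append('params', JSON.stringify(params));       // adjusted image editing parameter values
      form.append('image', blob, `${imageId}.png`);         // processed first image
      await fetch('/api/images/save', { method: 'POST', body: form }); // hypothetical endpoint
    }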
In this embodiment, by a toolbar comprising a save button; the adjusted image editing parameter value and the processed first image may be stored in a server in a correlated manner, and/or the preview image layer is replaced with the image layer of the first image. Therefore, the persistence of the image processing effect can be realized by storing the processed image, and the problem of storing the processing effect in the image processing process is solved.
In one implementation manner, the step S104, in response to the first adjustment operation on the image editing parameter to be adjusted, acquiring an adjusted image editing parameter value, includes: responding to the adjustment operation aiming at the image color parameter to be adjusted, and acquiring the adjusted image color parameter value; responding to the adjustment operation aiming at the image morphological parameters to be adjusted, and acquiring adjusted image morphological parameter values;
referring to fig. 3B, the user clicks a color-related tool button 311 (i.e., an image color processing button) in the image processing tool, opens a corresponding tool interactive dialog 312, and the user changes the value of the image color parameter in the dialog, so that the electronic device can obtain the adjusted image color parameter value through the dialog. Referring to fig. 3C again, the user clicks a modality-related tool button 313 (i.e., an image modality processing button) in the image processing tool, activates a corresponding tool interactive dialog 314, and operates a numerical value of an image modality parameter in the dialog through a mouse to realize adjustment of the image modality parameter; therefore, the electronic equipment can acquire the adjusted image morphological parameter value through the dialog box. In the embodiment, the color and the form of the image can be adjusted at the webpage end by responding to different adjusting operations.
In an implementation manner, the obtaining an adjusted image color parameter value in response to an adjustment operation on the image color parameter to be adjusted includes: in response to the adjustment operation of the interactive dialog box aiming at the image color processing button, acquiring an anchor point on an RGB curve in the interactive dialog box of the image color processing button, and determining the coordinate of the anchor point after movement; and determining the final coordinate after the movement as the adjusted image color parameter value. Therefore, the image color parameters for realizing the color adjustment algorithm can be obtained, and the problem of color adjustment of the image at the webpage end through the color adjustment algorithm is solved.
Here, in the interactive dialog of the image color processing button, the horizontal axis of the RGB curve is the brightness of the first image before adjustment, 0 represents black, 1 to 254 represent gray, and 255 represents white; the histogram displayed on the horizontal axis indicates how many pixels exist in each brightness of the original image; the vertical axis is the brightness of the adjusted first image. When the image luminance is not adjusted, the RGB curve is a diagonal line and indicates that the luminance values on the horizontal axis (original image) and the vertical axis (target image) are equal.
Referring to fig. 3B, the user adjusts the anchor point 3125 on the RGB curve in the interactive dialog box of the image color processing button, for example, the user clicks the selected anchor point 3125 and drags the anchor point 3125, so that the anchor point 3125 generates a displacement to trigger an update event, and the content of the update event is new coordinates of all anchor points on the curve. Referring to fig. 4, the algorithm processing module 403 performs calculation according to the new anchor coordinates (final coordinates after anchor movement) notified by the event and the image binary data stream to obtain a new image binary data stream; the algorithm processing module 403 sends the computed binary data stream to the canvas component 402 (which may be understood as a rendering module) via an update canvas event, and the canvas component 402 re-renders the canvas according to the resulting binary data stream.
In one implementation, in response to an adjustment operation for the image morphological parameter to be adjusted, acquiring an adjusted image morphological parameter value includes: in response to an adjustment operation in the interactive dialog for the image modality processing button, a rotation angle and a rotation direction in the interactive dialog for the image modality processing button are acquired. Therefore, the user can adjust the image form parameters in a dialog box interactive mode in the webpage, and the form adjustment of the image, such as the rotation angle and the rotation direction, can be realized.
The rotation angle and the rotation direction are then applied to the image through an affine transformation algorithm, that is, the image matrix is changed by matrix multiplication and matrix addition (a linear transformation plus a translation).
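Illustratively, the affine rotation may be sketched with the 2D canvas context as follows; the function name rotateOnCanvas and the convention that direction is +1 for clockwise rotation are assumptions of this illustration:

// Rotate the image about the canvas centre with an affine transform p' = A·p + b.
function rotateOnCanvas(
  ctx: CanvasRenderingContext2D,
  image: CanvasImageSource,
  angleDeg: number,
  direction: 1 | -1
): void {
  const { width, height } = ctx.canvas;
  const rad = direction * angleDeg * Math.PI / 180;
  ctx.clearRect(0, 0, width, height);
  ctx.save();
  ctx.setTransform(
    Math.cos(rad), Math.sin(rad),  // first column of the 2x2 rotation matrix A
    -Math.sin(rad), Math.cos(rad), // second column of A
    width / 2, height / 2          // translation b (moves the rotation centre)
  );
  ctx.drawImage(image, -width / 2, -height / 2, width, height);
  ctx.restore();
}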
The following describes the above image processing method with reference to a specific embodiment, but it should be noted that the specific embodiment is only for better describing the present application and is not to be construed as limiting the present application.
The embodiments of the present application are described by taking portrait comparison as an example. In the process of comparing portraits, the person image (which may come from a video, a photograph, a picture, or the like) may be distorted, so image processing is required to make the portrait in the image clearly recognizable before the comparison is performed. For example, an image with low sharpness or contrast, unsuitable lighting, or an unsuitable shooting angle needs to be adjusted with an image processing tool. However, the image processing tools in the related art are desktop software that must be installed and have complex functions, which is not convenient for a user who needs to process images quickly and effectively. Therefore, a web-side image processing tool that requires no installation, has simple functions, and is easy to operate needs to be explored.
Based on the needs of image processing products and actual scene requirements, the present application provides an image processing method that includes the following functions:
Function one: adjusting image color parameters based on a color adjustment algorithm;
the image color parameters include at least: color saturation, color balance, gray level, and red-green-blue (RGB) color mode parameters;
Here, the color adjustment algorithm used includes at least one of: a color saturation adjustment algorithm, an image exposure adjustment algorithm, and an RGB curve adjustment algorithm. The color adjustment algorithm can be used for adjusting the color saturation, the color balance, the gray-scale map, and the RGB curve of the portrait previewed at the web page end. Illustratively, the portrait previewed at the web page end can be converted from an RGB image to a gray-scale image based on the color adjustment algorithm. In this process, the color adjustment of the image at the web page end is realized through the color adjustment algorithm.
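Illustratively, the conversion from an RGB image to a gray-scale image may be sketched as follows, using the ITU-R BT.601 luma weights; the function name is an assumption of this illustration:

// Replace each pixel's R, G and B values with its luma; the alpha channel is kept.
function toGrayscale(imageData: ImageData): ImageData {
  const d = imageData.data; // RGBA bytes
  for (let i = 0; i < d.length; i += 4) {
    const y = 0.299 * d[i] + 0.587 * d[i + 1] + 0.114 * d[i + 2];
    d[i] = d[i + 1] = d[i + 2] = y;
  }
  return imageData;
}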
Function two: adjusting the form of the image based on an image transformation algorithm and the web-page-end canvas element;
Here, the image transformation algorithm includes at least one of: an affine transformation algorithm, a rigid transformation algorithm, a perspective transformation algorithm, and a nonlinear transformation algorithm. With an image transformation algorithm and the web-page-end canvas element, the following can be realized: automatically aligning the portrait previewed at the web page end with the canvas implemented by the canvas element, marking the portrait in the person image, measuring manually marked distances in the image, taking screenshots of the image, cropping the image, and the like. Illustratively, based on the affine transformation algorithm and the web-page-end canvas element, the whole canvas implemented by the canvas element can be filled with the person image, as sketched below. In this process, form-related operations on the image at the web page end, such as marking, measuring, screenshot, cropping, and rotation, are realized through the image transformation algorithm and the web-page-end canvas element.
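Illustratively, filling the whole canvas with the person image may be sketched as follows; the function name is an assumption of this illustration:

// Scale the source image onto the full canvas area; the destination rectangle
// implies an affine (scaling) transform of the source pixels.
function fillCanvasWithImage(canvas: HTMLCanvasElement, image: HTMLImageElement): void {
  const ctx = canvas.getContext('2d');
  if (!ctx) return;
  ctx.clearRect(0, 0, canvas.width, canvas.height);
  ctx.drawImage(
    image,
    0, 0, image.naturalWidth, image.naturalHeight, // source rectangle
    0, 0, canvas.width, canvas.height              // destination rectangle
  );
}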
Function three: realizing preview and persistence of the image processing effect based on the web-page-end canvas element;
Here, persistence of the image processing effect can be achieved by storing the processed image. During the preview, the user can choose whether to revert the processing effect or to store it. In this process, previewing and saving the processing effect during image processing are realized through the web-page-end canvas element.
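Illustratively, persisting or reverting the previewed effect may be sketched as follows; the function names and the choice of PNG output are assumptions of this illustration:

// Export the current canvas content so that it can be stored (persisted).
function persistCanvas(canvas: HTMLCanvasElement): Promise<Blob> {
  return new Promise((resolve, reject) =>
    canvas.toBlob(b => (b ? resolve(b) : reject(new Error('export failed'))), 'image/png')
  );
}

// Discard the previewed effect by redrawing the original pixel data.
function revertPreview(canvas: HTMLCanvasElement, original: ImageData): void {
  const ctx = canvas.getContext('2d');
  if (ctx) ctx.putImageData(original, 0, 0);
}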
Function four: loading the components of the image processing tool on demand based on a plug-in organization mode;
Here, the interface of the image processing tool may use a toolbar-plus-canvas layout, so as to improve the visualization of image processing operations at the web page end. In this process, loading components on demand and adjusting the layout reduce the interaction complexity of using the image processing tool.
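Illustratively, the configuration-driven, on-demand loading may be sketched as follows; the button identifiers, module paths and function names are assumptions of this illustration:

type ButtonId = 'preview' | 'save' | 'color' | 'form';

interface ToolbarConfig { buttons: ButtonId[] } // configured function buttons

// Hypothetical module paths; each button is mapped to the component it needs.
const componentLoaders: Record<ButtonId, () => Promise<unknown>> = {
  preview: () => import('./persistence-component'),
  save: () => import('./persistence-component'),
  color: () => import('./color-processing-component'),
  form: () => import('./form-processing-component'),
};

// Load only the components required by the buttons listed in the configuration.
async function loadConfiguredComponents(config: ToolbarConfig): Promise<void> {
  const unique = new Set(config.buttons);
  await Promise.all([...unique].map(id => componentLoaders[id]()));
}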
The following describes the editing processing of an image and the persistence (saving) and preview of the edited image:
First, editing processing of the image
1) Image loading: the image data stream is loaded through the browser at the web page end and drawn onto a canvas by using the canvas technology of the browser; the image data is parsed into RGB data for the image processing tool to use, and the editing buttons in the toolbar are activated according to the plug-in configuration.
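Illustratively, this loading step may be sketched as follows; the function name is an assumption of this illustration, and a cross-origin image source is assumed to allow its pixel data to be read back:

// Load the image, draw it onto the canvas and read back the RGBA pixel data.
async function loadImageToCanvas(url: string, canvas: HTMLCanvasElement): Promise<ImageData> {
  const image = new Image();
  image.crossOrigin = 'anonymous'; // must be set before src for readable pixels
  image.src = url;
  await image.decode();
  canvas.width = image.naturalWidth;
  canvas.height = image.naturalHeight;
  const ctx = canvas.getContext('2d')!;
  ctx.drawImage(image, 0, 0);
  return ctx.getImageData(0, 0, canvas.width, canvas.height);
}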
2) Image color processing:
An embodiment of the present application provides an image processing apparatus. As shown in fig. 4, the apparatus includes: a canvas component 402, an algorithm processing module 403 (color processing component), a form processing module 404 (form processing component), a layer management module 405 (layer component), an image loading and persistence module 406 (persistence component), a configuration component 407, and a toolbar component 408, wherein:
In implementation, if the MVVM (Model-View-ViewModel) pattern is adopted, the canvas component 402, the configuration component 407, and the toolbar component 408 are located in the view layer, while the image processing components (such as the algorithm processing module 403 and the form processing module 404), the layer management module 405, and the image loading and persistence module 406 enclosed in the large box of the figure are located in the view model layer; the toolbar component 408 communicates with the modules in the large box through event notification messages. Since the model layer interacts with the server, it is not shown in this embodiment.
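Illustratively, the event notification channel may be sketched as follows; the class name, event name and payload shape are assumptions of this illustration:

type Listener = (payload: unknown) => void;

// A minimal publish/subscribe channel between the view layer and the view model layer.
class EventBus {
  private listeners = new Map<string, Set<Listener>>();
  on(event: string, fn: Listener): void {
    if (!this.listeners.has(event)) this.listeners.set(event, new Set());
    this.listeners.get(event)!.add(fn);
  }
  emit(event: string, payload: unknown): void {
    this.listeners.get(event)?.forEach(fn => fn(payload));
  }
}

// The toolbar component publishes the adjusted parameter; a processing module subscribes.
const bus = new EventBus();
bus.on('color-params-changed', params => { /* recompute the pixel data here */ });
bus.emit('color-params-changed', { saturation: 1.2 });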
The image loading and persistence module 406 converts the image 401 into a binary image data stream and draws the stream onto the canvas component 402. The algorithm processing module 403 can convert the binary image data stream of the image 401 into an RGB matrix and operate on the matrix to implement color processing of the image. The form processing module 404 rotates and crops the image and is used to implement form processing of the image 401. The layer management module 405 is used to manage the layers in the canvas so as to display the selected layer. The image loading and persistence module 406 is further used to store or pre-process the image. The configuration component 407 is used to add configuration information and to load modules based on the configuration information.
When the user clicks a button for adjusting an image color parameter in the toolbar 408 (i.e., an image color processing button), the interactive dialog box of the image color processing button is opened, and the user adjusts the image color parameter by increasing or decreasing its numerical value in the interactive dialog box. The toolbar component 408 transmits the adjusted image color parameter value to the algorithm processing module 403, and the canvas component 402 transmits the original binary image data stream of the image 401 to the algorithm processing module 403. The algorithm processing module 403 performs color processing on the original binary image data stream of the image 401 based on the adjusted image color parameter value to obtain a color-processed binary image data stream, and notifies the canvas component 402 of the color-processed binary image data stream through event communication so that the canvas component 402 updates the image and completes the presentation.
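Illustratively, one color operation that the algorithm processing module 403 may perform on the pixel data, scaling the color saturation around the per-pixel luma, can be sketched as follows; the function name and the BT.601 luma weights are assumptions of this illustration:

// factor = 1 leaves the image unchanged, < 1 desaturates, > 1 increases saturation.
function adjustSaturation(imageData: ImageData, factor: number): ImageData {
  const d = imageData.data; // RGBA bytes
  for (let i = 0; i < d.length; i += 4) {
    const y = 0.299 * d[i] + 0.587 * d[i + 1] + 0.114 * d[i + 2]; // luma
    d[i] = y + (d[i] - y) * factor;
    d[i + 1] = y + (d[i + 1] - y) * factor;
    d[i + 2] = y + (d[i + 2] - y) * factor;
  }
  return imageData;
}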
3) Image form processing:
With continued reference to fig. 4, when the user clicks the button in the toolbar 408 for adjusting the image form, the interactive dialog box of the image form processing button is opened, and the user adjusts the image form parameter by increasing or decreasing its numerical value in the dialog box. The toolbar component 408 transmits the adjusted image form parameter value to the form processing module 404, and the canvas component 402 transmits the original binary image data stream of the image 401 to the form processing module 404. The form processing module 404 performs form processing on the original binary image data stream of the image 401 based on the adjusted image form parameter value to obtain a form-processed binary image data stream, and notifies the canvas component 402 of the form-processed binary image data stream through event communication; the canvas component 402 then redraws the canvas to show the effect of the form processing.
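Illustratively, one form operation, cropping a selected region of the canvas, may be sketched as follows; the function name is an assumption of this illustration:

// Copy the selected region of the source canvas into a new canvas of the cropped size.
function cropCanvas(
  source: HTMLCanvasElement,
  x: number, y: number, w: number, h: number
): HTMLCanvasElement {
  const target = document.createElement('canvas');
  target.width = w;
  target.height = h;
  const ctx = target.getContext('2d')!;
  ctx.drawImage(source, x, y, w, h, 0, 0, w, h);
  return target;
}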
As shown in fig. 5, when an operation 501 of selecting an image to be compared is detected in the displayed image comparison page, the canvas component 502 draws the binary image data stream onto the canvas using the canvas technology, parses the binary image data stream into RGB data, and activates the toolbar 503.
Second, saving and previewing the edited image
By using the canvas layer technology, the preview operation of the first image can be realized: the preview image is stacked as the top layer of the canvas area of the first image, and the preview layer is deleted after the preview is finished; if the save button is clicked, the effect of the editing processing is applied and the preview layer replaces the layer of the first image.
Clicking the OK button in the dialog box or the apply button on the toolbar persists the image color parameter changes to the image itself. As shown in fig. 6, when a save operation 601 is detected in the displayed image comparison page, the image data stream of the preview layer replaces the layer of the first image.
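Illustratively, the layer handling described above, stacking a preview layer on top, removing it when the preview ends, and replacing the original layer when the save button is clicked, may be sketched as follows; the type and class names are assumptions of this illustration:

interface Layer { id: string; pixels: ImageData } // illustrative layer record

class LayerStack {
  private layers: Layer[] = [];
  push(layer: Layer): void { this.layers.push(layer); }   // stack the preview layer on top
  removeTop(): void { this.layers.pop(); }                 // preview finished: delete it
  replace(id: string, layer: Layer): void {                // save: keep the edited result
    const i = this.layers.findIndex(l => l.id === id);
    if (i >= 0) this.layers[i] = layer;
  }
  top(): Layer | undefined { return this.layers[this.layers.length - 1]; }
}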
Based on the foregoing embodiments, an embodiment of the present application further provides an image processing apparatus, where the apparatus includes modules that can be implemented by a processor in an electronic device; of course, they can also be implemented by specific logic circuits. In implementation, the processor may be a Central Processing Unit (CPU), a Microprocessor Unit (MPU), a Digital Signal Processor (DSP), a Field Programmable Gate Array (FPGA), or the like.
Fig. 7 is a schematic structural diagram of an image processing apparatus provided in an embodiment of the present application. As shown in fig. 7, the apparatus 700 includes a first display module 701, a determining module 702, a second display module 703, a first obtaining module 704, a processing module 705, and a third display module 706, where:
the first display module 701 is configured to display an image comparison page, where the image comparison page includes a toolbar and at least one image to be compared; the toolbar comprises at least one editing button for editing the at least one image to be compared;
a determining module 702, configured to determine, in response to a first selection operation on at least one image of the at least one image to be compared, a selected image as a first image;
a second display module 703, configured to display, in response to a second selected operation for any one of the at least one editing button, an interactive dialog box of the selected editing button; the interactive dialog box comprises image editing parameters to be adjusted;
a first obtaining module 704, configured to, in response to a first adjustment operation for the image editing parameter to be adjusted, obtain an adjusted image editing parameter value;
a processing module 705, configured to process the first image based on the adjusted image editing parameter value;
a third display module 706, configured to display the processed first image on a canvas area of the first image in a new layer.
In one implementation manner, the apparatus 700 further includes a fourth display module and a first loading module, wherein: the fourth display module is configured to display an original page, the original page including at least one canvas area, and each canvas area is used for loading at least one image to be compared; and the first loading module is configured to, in response to an image loading operation for at least one of the at least one canvas area, load at least one image in the at least one canvas area with a first layer, so as to obtain the image comparison page.
In one implementation, the toolbar further includes a preview button; the third display module 706 is further configured to: and under the condition that the processed first image is in an activated state, responding to a first trigger operation aiming at the preview button, and displaying the processed first image by taking a preview layer as a top layer of a canvas area of the first image. The apparatus 700 further includes a deleting module, configured to delete the preview layer in response to the preview being completed.
In one implementation, the toolbar further includes a save button; the apparatus 700 further comprises a storage module and/or a replacement module, wherein: the storage module is used for responding to a second trigger operation aiming at the storage button, and associating and storing the adjusted image editing parameter value and the processed first image at a server end; and the replacing module is used for replacing the preview layer with the layer of the first image in response to a second trigger operation aiming at the saving button.
In an implementation manner, the second display module 703 is further configured to display, through the toolbar component, an interactive dialog box of the selected editing button in response to a second selected operation for any one of the at least one editing button; the first obtaining module 704 is further configured to, in response to a first adjustment operation for the image editing parameter to be adjusted, obtain an adjusted image editing parameter value through the toolbar component and send the adjusted image editing parameter value to the corresponding image processing component; the processing module 705 is further configured to obtain the first image from the canvas component through the image processing component, adjust the first image based on the adjusted image editing parameter value, and send the adjusted first image to the layer component; the third display module 706 is further configured to send the adjusted first image to the canvas component through the layer component in a new layer, and display the new layer on the canvas area of the first image through the canvas component.
In one implementation, in a case where the at least one edit button includes an image color processing button, the image processing component includes a color processing component for image color adjustment of the first image; the image editing parameters are image color parameters, and the image color parameters comprise at least one of color saturation, brightness and exposure;
under the condition that the at least one editing button comprises an image form processing button, the image processing component comprises a form processing component used for processing at least one of marking, screenshot, cutting and rotating the first image, the image editing parameters are form processing parameters, and the form processing parameters comprise position information of a marking frame, position information of the screenshot and the cutting and angle information of the rotation.
In an implementation manner, the canvas component and the toolbar component are located in a view layer, and the image processing component and the layer component are located in a view model layer;
the toolbar component and the image processing component communicate with each other through an event notification message.
In an implementation manner, the apparatus 700 further includes a second obtaining module, configured to: acquire configuration information of the toolbar, where the configuration information of the toolbar is used for selecting each configured function button from a candidate button set of the toolbar; the candidate button set includes a preview button, a save button, an image color processing button, and an image form processing button, where the image color processing button includes editing buttons respectively used for adjusting the color saturation, the brightness, and the exposure of the image, and the image form processing button includes editing buttons respectively used for marking, screenshot, cutting, and rotating the image;
and a second loading module, configured to: load, based on the configuration information of the toolbar, the target image processing components related to each configured function button from an image processing component set, where the image processing component set includes a canvas component, a toolbar component, an image layer component, a persistence component, a form processing component, and a color processing component;
the apparatus 700 further includes a fifth display module, configured to display corresponding target function buttons on the toolbar based on the configuration information of the toolbar.
Here, it should be noted that: the above description of the apparatus embodiments, similar to the above description of the method embodiments, has similar beneficial effects as the method embodiments. For technical details not disclosed in the embodiments of the apparatus of the present application, reference is made to the description of the embodiments of the method of the present application for understanding.
In the embodiment of the present application, if the image processing method is implemented in the form of a software functional module and sold or used as a standalone product, the image processing method may also be stored in a computer-readable storage medium. Based on such understanding, the technical solutions of the embodiments of the present application may be embodied in the form of a software product, which is stored in a storage medium and includes several instructions for causing an electronic device to execute all or part of the methods described in the embodiments of the present application. And the aforementioned storage medium includes: various media capable of storing program codes, such as a usb disk, a removable hard disk, a Read Only Memory (ROM), a magnetic disk, or an optical disk. Thus, embodiments of the present application are not limited to any specific combination of hardware and software.
Correspondingly, the present application provides a computer-readable storage medium, on which a computer program is stored, and the computer program, when executed by a processor, implements the steps in the image processing method according to any of the above embodiments.
Correspondingly, in an embodiment of the present application, a chip is further provided, where the chip includes a programmable logic circuit and/or program instructions, and when the chip runs, the chip is configured to implement the steps in any of the image processing methods in the foregoing embodiments.
Correspondingly, in an embodiment of the present application, there is also provided a computer program product, which is used to implement the steps in the image processing method in any of the foregoing embodiments when the computer program product is executed by a processor of an electronic device.
Based on the same technical concept, the embodiment of the present application provides an electronic device for implementing the image processing method described in the above method embodiment. Fig. 8 is a hardware entity diagram of an electronic device according to an embodiment of the present application, as shown in fig. 8, the electronic device 800 includes a memory 810 and a processor 820, the memory 810 stores a computer program that can be executed on the processor 820, and the processor 820 implements steps in any image processing method according to the embodiment of the present application when executing the computer program.
The Memory 810 is configured to store instructions and applications executable by the processor 820, and may also buffer data (e.g., image data, audio data, voice communication data, and video communication data) to be processed or already processed by the processor 820 and modules in the electronic device, and may be implemented by a FLASH Memory (FLASH) or a Random Access Memory (RAM).
The processor 820 implements the steps of the image processing method of any one of the above when executing the program. The processor 820 generally controls the overall operation of the electronic device 800.
The Processor may be at least one of an Application Specific Integrated Circuit (ASIC), a Digital Signal Processor (DSP), a Digital Signal Processing Device (DSPD), a Programmable Logic Device (PLD), a Field Programmable Gate Array (FPGA), a Central Processing Unit (CPU), a controller, a microcontroller, and a microprocessor. It is understood that the electronic device implementing the above-mentioned processor function may be other electronic devices, and the embodiments of the present application are not particularly limited.
The computer storage medium/Memory may be a Read Only Memory (ROM), a Programmable Read Only Memory (PROM), an Erasable Programmable Read Only Memory (EPROM), an Electrically Erasable Programmable Read Only Memory (EEPROM), a Ferromagnetic Random Access Memory (FRAM), a Flash Memory (Flash Memory), a magnetic surface Memory, an optical disc, a Compact Disc Read-Only Memory (CD-ROM), or the like; or may be various electronic devices including one or any combination of the above memories.
Here, it should be noted that: the above description of the storage medium and device embodiments is similar to the description of the method embodiments above, with similar advantageous effects as the method embodiments. For technical details not disclosed in the embodiments of the storage medium and the apparatus of the present application, reference is made to the description of the embodiments of the method of the present application for understanding.
It should be appreciated that reference throughout this specification to "one embodiment" or "an embodiment" means that a particular feature, structure or characteristic described in connection with the embodiment is included in at least one embodiment of the present application. Thus, the appearances of the phrases "in one embodiment" or "in an embodiment" in various places throughout this specification are not necessarily all referring to the same embodiment. Furthermore, the particular features, structures, or characteristics may be combined in any suitable manner in one or more embodiments. It should be understood that, in the various embodiments of the present application, the sequence numbers of the above-mentioned processes do not mean the execution sequence, and the execution sequence of each process should be determined by its function and inherent logic, and should not constitute any limitation to the implementation process of the embodiments of the present application. The above-mentioned serial numbers of the embodiments of the present application are merely for description and do not represent the merits of the embodiments.
It should be noted that, in this document, the terms "comprises," "comprising," or any other variation thereof, are intended to cover a non-exclusive inclusion, such that a process, method, article, or apparatus that comprises a list of elements does not include only those elements but may include other elements not expressly listed or inherent to such process, method, article, or apparatus. Without further limitation, an element defined by the phrase "comprising an … …" does not exclude the presence of other like elements in a process, method, article, or apparatus that comprises the element.
In the several embodiments provided in the present application, it should be understood that the disclosed apparatus and method may be implemented in other ways. The above-described device embodiments are merely illustrative, for example, the division of the unit is only a logical functional division, and there may be other division ways in actual implementation, such as: multiple units or components may be combined, or may be integrated into another system, or some features may be omitted, or not implemented. In addition, the coupling, direct coupling or communication connection between the components shown or discussed may be through some interfaces, and the indirect coupling or communication connection between the devices or units may be electrical, mechanical or other forms.
The units described as separate parts may or may not be physically separate, and parts displayed as units may or may not be physical units; can be located in one place or distributed on a plurality of network units; some or all of the units can be selected according to actual needs to achieve the purpose of the solution of the embodiments of the present application.
In addition, all functional units in the embodiments of the present application may be integrated into one processing unit, or each unit may be separately regarded as one unit, or two or more units may be integrated into one unit; the integrated unit can be realized in a form of hardware, or in a form of hardware plus a software functional unit.
Alternatively, the integrated units described above in the present application may be stored in a computer-readable storage medium if they are implemented in the form of software functional modules and sold or used as independent products. Based on such understanding, the technical solutions of the embodiments of the present application may be embodied in the form of a software product, which is stored in a storage medium and includes several instructions for causing an electronic device to perform all or part of the methods described in the embodiments of the present application. And the aforementioned storage medium includes: a removable storage device, a ROM, a magnetic or optical disk, or other various media that can store program code.
The methods disclosed in the several method embodiments provided in the present application may be combined arbitrarily without conflict to obtain new method embodiments.
The features disclosed in the several method or apparatus embodiments provided in the present application may be combined arbitrarily, without conflict, to arrive at new method embodiments or apparatus embodiments.
The above description is only for the embodiments of the present application, but the scope of the present application is not limited thereto, and any person skilled in the art can easily conceive of changes or substitutions within the technical scope of the present application, and shall be covered by the scope of the present application. Therefore, the protection scope of the present application shall be subject to the protection scope of the claims.

Claims (11)

1. An image processing method, characterized in that the method comprises:
displaying an image comparison page, wherein the image comparison page comprises a toolbar and at least one image to be compared; the toolbar comprises at least one editing button for editing the at least one image to be compared;
in response to a first selection operation on at least one image in the at least one image to be compared, determining the selected image as a first image;
in response to a second selected operation for any one of the at least one editing button, displaying an interactive dialog box of the selected editing button; the interactive dialog box comprises image editing parameters to be adjusted;
responding to a first adjusting operation aiming at the image editing parameters to be adjusted, and acquiring adjusted image editing parameter values;
processing the first image based on the adjusted image editing parameter value;
and displaying the processed first image on the canvas area of the first image in a new layer.
2. The method of claim 1, wherein the displaying an interactive dialog box of the selected editing button in response to a second selected operation for any one of the at least one editing button, the obtaining an adjusted image editing parameter value in response to a first adjustment operation for the image editing parameter to be adjusted, the processing the first image based on the adjusted image editing parameter value, and the displaying the processed first image on the canvas area of the first image in a new layer comprise:
in response to a second selected operation for any of the at least one edit button, displaying, by the toolbar component, an interactive dialog of the selected edit button;
responding to a first adjustment operation aiming at the image editing parameter to be adjusted, acquiring an adjusted image editing parameter value through the toolbar component, and sending the adjusted image editing parameter value to a corresponding image processing component;
acquiring the first image from a canvas component through the image processing component, adjusting the first image based on the adjusted image editing parameter value, and sending the adjusted first image to a layer component; and sending the adjusted first image to the canvas component through the layer component in a new layer, and displaying the new layer on the canvas area of the first image through the canvas component.
3. The method of claim 2, wherein in the case where the at least one edit button comprises an image color processing button, the image processing component comprises a color processing component for image color adjustment of the first image; the image editing parameters are image color parameters, and the image color parameters comprise at least one of color saturation, brightness and exposure;
under the condition that the at least one editing button comprises an image form processing button, the image processing component comprises a form processing component used for processing at least one of marking, screenshot, cutting and rotating the first image, the image editing parameters are form processing parameters, and the form processing parameters comprise position information of a marking frame, position information of the screenshot and the cutting and angle information of the rotation.
4. The method of claim 2 or 3, wherein the canvas component and the toolbar component are located in a view layer, and the image processing component and layer component are located in a view model layer;
the toolbar component and the image processing component communicate with each other through an event notification message.
5. The method of any of claims 2 to 4, further comprising:
acquiring configuration information of a toolbar, wherein the configuration information of the toolbar is used for selecting configured function buttons from a candidate button set of the toolbar; the candidate button set comprises a preview button, a storage button, an image color processing button and an image form processing button, wherein the image color processing button comprises an editing button used for adjusting the color saturation, the brightness and the exposure of the image respectively, and the image form processing button comprises an editing button used for marking, screenshot, cutting and rotating the image respectively;
loading target image processing components related to each function button of the selected configuration from an image processing component set based on the configuration information of the toolbar, wherein the image processing component set comprises a canvas component, a toolbar component, an image layer component, a persistence component, a form processing component and a color processing component;
and displaying corresponding target function buttons on the toolbar based on the configuration information of the toolbar.
6. The method of any of claims 1 to 4, further comprising:
displaying an original page, wherein the original page comprises at least one canvas area, and each canvas area is used for loading at least one image to be compared;
and in response to an image loading operation for at least one of the at least one canvas area, loading at least one image in the at least one canvas area with a first layer, so as to obtain the image comparison page.
7. The method of any of claims 1 to 4, wherein the toolbar further includes a preview button; the displaying the processed first image on the canvas area of the first image with a new layer comprises: under the condition that the processed first image is in an activated state, responding to a first trigger operation aiming at the preview button, and displaying the processed first image by taking a preview layer as a top layer of a canvas area of the first image;
and deleting the preview layer after the preview is finished.
8. The method of claim 7, wherein the toolbar further includes a save button; after the processed first image is displayed with the preview layer as the top layer of the canvas area of the first image, the method further includes:
and responding to a second trigger operation aiming at the saving button, associating the adjusted image editing parameter value with the processed first image, saving the image editing parameter value at a server side, and/or replacing the preview image layer with the image layer of the first image.
9. An image processing apparatus, characterized in that the apparatus comprises:
the first display module is used for displaying an image comparison page, and the image comparison page comprises a toolbar and at least one image to be compared; the toolbar comprises at least one editing button for editing the at least one image to be compared;
the determining module is used for responding to a first selection operation aiming at least one image in the at least one image to be compared and determining the selected image as a first image;
the second display module is used for responding to a second selected operation aiming at any one of the at least one editing button and displaying an interactive dialog box of the selected editing button; the interactive dialog box comprises image editing parameters to be adjusted;
the first obtaining module is used for responding to a first adjusting operation aiming at the image editing parameter to be adjusted and obtaining an adjusted image editing parameter value;
a processing module for processing the first image based on the adjusted image editing parameter value;
and the third display module is used for displaying the processed first image on the canvas area of the first image in a new layer.
10. An electronic device comprising a memory and a processor, the memory storing a computer program operable on the processor, wherein the processor implements the steps of the method of any one of claims 1 to 8 when executing the program.
11. A computer-readable storage medium, on which a computer program is stored, which, when being executed by a processor, carries out the steps of the method of any one of claims 1 to 8.
CN202210557949.6A 2022-05-19 2022-05-19 Image processing method and device, electronic device and storage medium Withdrawn CN114968040A (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN202210557949.6A CN114968040A (en) 2022-05-19 2022-05-19 Image processing method and device, electronic device and storage medium

Publications (1)

Publication Number Publication Date
CN114968040A (en) 2022-08-30





Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination
WW01 Invention patent application withdrawn after publication

Application publication date: 20220830