CN114257755A - Image processing method, device, equipment and storage medium - Google Patents

Image processing method, device, equipment and storage medium

Info

Publication number
CN114257755A
CN114257755A
Authority
CN
China
Prior art keywords
image
image data
data
size
resolution
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Pending
Application number
CN202010995170.3A
Other languages
Chinese (zh)
Inventor
郭计伟
肖凤言
蓝志勇
林明勇
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Tencent Technology Shenzhen Co Ltd
Original Assignee
Tencent Technology Shenzhen Co Ltd
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Tencent Technology Shenzhen Co Ltd filed Critical Tencent Technology Shenzhen Co Ltd
Priority to CN202010995170.3A priority Critical patent/CN114257755A/en
Publication of CN114257755A publication Critical patent/CN114257755A/en
Pending legal-status Critical Current


Classifications

    • H: ELECTRICITY
    • H04: ELECTRIC COMMUNICATION TECHNIQUE
    • H04N: PICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N 5/00: Details of television systems
    • H04N 5/222: Studio circuitry; Studio devices; Studio equipment
    • H04N 5/262: Studio circuits, e.g. for mixing, switching-over, change of character of image, other special effects; Cameras specially adapted for the electronic generation of special effects

Abstract

The application discloses an image processing method, apparatus, device, and storage medium, belonging to the technical field of multimedia. The embodiments of the application provide a method for editing an image online: the selected template image is displayed, and the editing result is displayed as the user performs editing operations, achieving what-you-see-is-what-you-get. The generated image can therefore satisfy the user's requirements without repeated modification, and multiple rounds of communication between the terminal and the server are unnecessary, which effectively reduces the cost of the image processing method and improves processing efficiency. In addition, when the second image is generated, the resolution of the edited image can be updated so that an image with the target resolution is produced; the device thus automatically generates an image whose resolution meets the requirement, and the user does not need to download a server-generated image and edit it manually. This reduces user operations, further improves image processing efficiency, and gives the method good practicability and applicability.

Description

Image processing method, device, equipment and storage medium
Technical Field
The present application relates to the field of multimedia technologies, and in particular, to an image processing method, apparatus, device, and storage medium.
Background
With the development of multimedia technology, its applications have become increasingly widespread and bring convenience to daily life. Image processing is one such multimedia technology: the content of an image is edited through editing operations and the image is then generated; used as a publicity medium, the image can achieve a better promotional effect.
At present, in a typical image processing method, a user uploads a template image and some produced material elements to a server and sets editing requirements for the image content, and the server feeds the generated image back to the user. If the user is not satisfied after seeing the image generated by the server, the user can modify the editing requirements again and the server generates a new image, repeating until the user is satisfied.
In this image processing method, the server generates the image according to editing requirements set by the user, and the user cannot know the processing effect while setting those requirements. Repeated modification is therefore needed, the terminal and the server must communicate many times, and generating one image requires executing the process many times, so the method has high cost and low processing efficiency, cannot satisfy user requirements well, and yields a poor processing effect. Moreover, the method only involves editing the image content: if the user wants to print the image generated by the server for promotion, the image cannot be used for printing directly, the printed result is very blurry, and the user must download the image from the server and further process it with image processing software. The practicability and applicability of such an image processing method are therefore poor.
Disclosure of Invention
The embodiments of the present application provide an image processing method, apparatus, device, and storage medium, which can reduce cost, improve image processing efficiency, and improve practicability and applicability. The technical solutions are as follows:
in one aspect, an image processing method is provided, and the method includes:
responding to a template selection instruction, and displaying a selected template image;
responding to the editing operation of the template image, and displaying a first image based on first image data, wherein the first image data is obtained by processing the image data of the template image based on the editing operation;
and responding to an image generation instruction, updating the resolution field data in the first image data based on a target resolution to obtain second image data of a second image, wherein the resolution of the second image is the target resolution.
In one possible implementation manner, the template image includes at least one element of a title, a background, text, an icon, and a banner, and the editing operation on the template image is any one of a drag operation, a zoom operation, a color setting operation, and a change operation on any one element of the at least one element.
In one aspect, an image processing apparatus is provided, the apparatus including:
the display module is used for responding to the template selection instruction and displaying the selected template image;
the display module is further configured to display, in response to an editing operation on the template image, a first image based on first image data, wherein the first image data is obtained by processing the image data of the template image based on the editing operation;
and the updating module is used for responding to an image generation instruction, updating the data of the resolution field in the first image data based on the target resolution, and obtaining second image data of a second image, wherein the resolution of the second image is the target resolution.
In a possible implementation manner, the updating module is configured to replace the data of the resolution field in the first image data with data corresponding to the target resolution, to obtain second image data of a second image.
In one possible implementation manner, the updating module includes a drawing unit, an obtaining unit and an updating unit;
the drawing unit is used for drawing the displayed first image onto a canvas based on the target resolution;
the acquisition unit is used for acquiring the image data of the canvas;
the updating unit is used for executing the step of updating the resolution field data based on the image data of the canvas.
In one possible implementation manner, the obtaining unit is configured to obtain the image data of the canvas based on a target application program interface of a browser application.
In one possible implementation, the first image data is encoded data based on base 64;
the updating module is used for replacing the data of the resolution field in the encoded data of the first image with the hexadecimal data corresponding to the target resolution to obtain second image data of a second image.
In one possible implementation, the display module is configured to:
responding to a template selection instruction, and acquiring image data and a first size of a selected template image;
acquiring image data of a preview image based on the target resolution, the first size and the image data of the template image;
displaying the preview image based on image data of the preview image.
In one possible implementation, the display module is configured to:
acquiring a second size corresponding to the first size based on a target conversion relation corresponding to a target resolution, wherein the unit of the first size is a first unit, the unit of the second size is a second unit, and the target conversion relation is a conversion relation between the first unit and the second unit;
according to the second size and a third size of a preview display area, carrying out zooming processing on the image data of the template image to obtain the image data of the preview image, wherein the size of the preview image is the third size;
in one possible implementation, the display module is configured to display the preview image in the preview display area based on image data of the preview image.
In one possible implementation, the update module is configured to:
acquiring a fourth size corresponding to the third size based on the target conversion relation, wherein the unit of the third size is the second unit, and the unit of the fourth size is the first unit;
according to the fourth size and the first size, carrying out scaling processing on the first image data to obtain third image data of a third image, wherein the size of the third image is the first size;
and updating the resolution field data in the third image data of the third image based on the target resolution to obtain second image data of a second image, wherein the size of the second image is the first size.
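The size bookkeeping in the implementation above (the preview's third size in pixels, the fourth size back in millimeters, and the scaling to the first size) can be sketched as follows. The helper names and the 300 DPI default are illustrative assumptions, not taken from the patent.

```python
# Convert the preview's third size (px) to the fourth size (mm), then derive
# the per-axis scale factors that map the first image back to the first size.

MM_PER_INCH = 25.4  # 1 inch = 25.4 mm

def px_to_mm(size_px: float, dpi: float = 300.0) -> float:
    """Second unit (px) -> first unit (mm) via the target conversion relation."""
    return size_px * MM_PER_INCH / dpi

def scale_to_first_size(third_size_px, first_size_mm, dpi: float = 300.0):
    """Return the fourth size (mm) for a preview-sized image, plus the
    per-axis scale factors that bring it to the required first size."""
    fourth_size_mm = tuple(px_to_mm(s, dpi) for s in third_size_px)
    scale = tuple(f / v for f, v in zip(first_size_mm, fourth_size_mm))
    return fourth_size_mm, scale

# Preview displayed at 800 x 600 px, required output 210 x 297 mm:
fourth, scale = scale_to_first_size((800, 600), (210, 297))
```

Scaling the first image data by these factors yields a third image whose size equals the first size, after which only the resolution field still needs updating.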
In one possible implementation, the apparatus further includes:
and the first storage module is used for sending the second image data to a target server for storage, and the target server is used for carrying out storage management on the image data.
In one possible implementation manner, the display module is configured to display a first image based on first image data in response to an editing operation on the preview image, wherein the size of the first image is the third size;
the updating module is configured to update the resolution field data in the first image data based on the target resolution to obtain second image data of a second image, wherein the size of the second image in the second image data is the third size;
the device further comprises:
the second storage module is used for sending the second image data and the first size to a server for storage;
an obtaining module, configured to obtain the second image data and the first size from the server in response to an obtaining instruction for the second image;
and the drawing module is used for drawing the second image based on the second image data and the first size to obtain a fourth image with the first size.
In one possible implementation, the apparatus further includes:
a modification module, configured to modify, in response to a configuration modification instruction for a target interface, a data transmission limitation threshold of the target interface from a first threshold to a second threshold, where the second threshold is greater than the first threshold, and the target interface is configured to transmit the second image data.
In one possible implementation, the apparatus further includes:
the compression module is used for compressing the second image data to obtain compressed data of the second image;
and the third storage module is used for sending the compressed data to a server for storage.
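A minimal sketch of this compression step, assuming a zlib-style codec (the embodiments do not name a specific one):

```python
import zlib

def compress_second_image_data(data: bytes, level: int = 6) -> bytes:
    """Compress the second image data before sending it to the server."""
    return zlib.compress(data, level)

def restore_second_image_data(compressed: bytes) -> bytes:
    """Server side: recover the original second image data for storage or reuse."""
    return zlib.decompress(compressed)
```

Note that base64-encoded image data (as the embodiments describe for the first image data) is text and compresses noticeably, whereas already-compressed binary payloads such as PNG or JPEG bodies gain much less.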
In one possible implementation manner, the template image includes at least one element of a title, a background, text, an icon, and a banner, and the editing operation on the template image is any one of a drag operation, a zoom operation, a color setting operation, and a change operation on any one element of the at least one element.
In one aspect, an electronic device is provided that includes one or more processors and one or more memories having at least one program code stored therein, the at least one program code being loaded into and executed by the one or more processors to implement various alternative implementations of the above-described image processing method.
In one aspect, a computer-readable storage medium is provided, in which at least one program code is stored, which is loaded and executed by a processor to implement various alternative implementations of the image processing method described above.
In one aspect, a computer program product or computer program is provided that includes one or more program codes stored in a computer-readable storage medium. One or more processors of the electronic device can read the one or more program codes from the computer-readable storage medium, and the one or more processors execute the one or more program codes, so that the electronic device can execute the image processing method of any one of the above possible embodiments.
The embodiments of the present disclosure provide a method for editing an image online: the selected template image is displayed, and the editing result is displayed as the user performs editing operations, achieving what-you-see-is-what-you-get. The generated image can therefore satisfy the user's requirements without repeated modification, and multiple rounds of communication between the terminal and the server are unnecessary, which effectively reduces the cost of the image processing method and improves processing efficiency. In addition, when the second image is generated, the resolution of the edited image can be updated so that an image with the target resolution is produced; the device thus automatically generates an image whose resolution meets the requirement, and the user does not need to download a server-generated image and edit it manually. This reduces user operations, further improves image processing efficiency, and gives the method good practicability and applicability.
Drawings
In order to more clearly illustrate the technical solutions in the embodiments of the present application, the drawings needed in the description of the embodiments are briefly introduced below. Obviously, the drawings in the following description show only some embodiments of the present application, and other drawings can be obtained from them by those skilled in the art without creative effort.
Fig. 1 is a schematic diagram of an implementation environment of an image processing method according to an embodiment of the present application;
fig. 2 is a flowchart of an image processing method provided in an embodiment of the present application;
fig. 3 is a flowchart of an image processing method provided in an embodiment of the present application;
FIG. 4 is a schematic diagram of an image generation page provided by an embodiment of the present application;
FIG. 5 is a schematic diagram of a target transformation relationship provided in an embodiment of the present application;
FIG. 6 is a schematic diagram of a metric calculator provided in an embodiment of the present application;
FIG. 7 is a schematic diagram of a template image provided by an embodiment of the present application;
FIG. 8 is a schematic diagram of a template image provided by an embodiment of the present application;
fig. 9 is a flowchart of an image processing method provided in an embodiment of the present application;
FIG. 10 is a schematic diagram of encoded data provided by an embodiment of the present application;
FIG. 11 is a schematic diagram of encoded data provided by an embodiment of the present application;
fig. 12 is a flowchart of an image processing method provided in an embodiment of the present application;
fig. 13 is a flowchart of an image processing method provided in an embodiment of the present application;
FIG. 14 is a comparison of an example of the present application with a conventional scheme;
fig. 15 is a schematic structural diagram of an image processing apparatus according to an embodiment of the present application;
fig. 16 is a block diagram of a terminal according to an embodiment of the present disclosure;
fig. 17 is a schematic structural diagram of a server according to an embodiment of the present application.
Detailed Description
To make the objects, technical solutions and advantages of the present application more clear, embodiments of the present application will be described in further detail below with reference to the accompanying drawings.
The terms "first," "second," and the like in this application are used for distinguishing between similar items and items that have substantially the same function or similar functionality, and it should be understood that "first," "second," and "nth" do not have any logical or temporal dependency or limitation on the number or order of execution. It will be further understood that, although the following description uses the terms first, second, etc. to describe various elements, these elements should not be limited by these terms. These terms are only used to distinguish one element from another. For example, the first image can be referred to as a second image, and similarly, the second image can be referred to as a first image without departing from the scope of the various examples. The first image and the second image can both be images, and in some cases, can be separate and distinct images.
The term "at least one" is used herein to mean one or more, and the term "plurality" is used herein to mean two or more, e.g., a plurality of packets means two or more packets.
It is to be understood that the terminology used in the description of the various examples herein is for the purpose of describing particular examples only and is not intended to be limiting. As used in the description of the various examples and the appended claims, the singular forms "a," "an," and "the" are intended to include the plural forms as well, unless the context clearly indicates otherwise.
It should also be understood that the term "and/or" as used herein refers to and encompasses any and all possible combinations of one or more of the associated listed items. The term "and/or" describes an association relationship between associated objects and indicates that three relationships can exist; for example, "A and/or B" can mean: A exists alone, A and B exist simultaneously, or B exists alone. In addition, the character "/" in the present application generally indicates that the former and latter associated objects are in an "or" relationship.
It should also be understood that, in the embodiments of the present application, the size of the serial number of each process does not mean the execution sequence, and the execution sequence of each process should be determined by its function and inherent logic, and should not constitute any limitation to the implementation process of the embodiments of the present application.
It should also be understood that determining B from a does not mean determining B from a alone, but can also determine B from a and/or other information.
It will be further understood that the terms "comprises," "comprising," "includes," and/or "including," when used in this specification, specify the presence of stated features, integers, steps, operations, elements, and/or components, but do not preclude the presence or addition of one or more other features, integers, steps, operations, elements, components, and/or groups thereof.
It is also understood that the term "if" may be interpreted to mean "when" or "upon" or "in response to a determination" or "in response to a detection." Similarly, the phrase "if it is determined" or "if [a stated condition or event] is detected" may be interpreted to mean "upon determining" or "in response to determining" or "upon detecting [the stated condition or event]" or "in response to detecting [the stated condition or event]," depending on the context.
The following describes an embodiment of the present application.
Fig. 1 is a schematic diagram of an implementation environment of an image processing method according to an embodiment of the present application. The implementation environment includes a terminal 101, or the implementation environment includes a terminal 101 and an image processing platform 102. The terminal 101 is connected to the image processing platform 102 through a wireless network or a wired network.
The terminal 101 can be at least one of a smart phone, a game console, a desktop computer, a tablet computer, an e-book reader, an MP3 (Moving Picture Experts Group Audio Layer III) player, an MP4 (Moving Picture Experts Group Audio Layer IV) player, and a laptop computer. The terminal 101 has installed and runs an application program supporting image processing, which can be, for example, a system application, an instant messaging application, a news push application, a shopping application, an online video application, or a social application.
Illustratively, the terminal 101 can have an image processing function, can process an image, and can execute a corresponding function according to the processing result. The terminal 101 can complete this work independently, or the image processing platform 102 can provide the image processing service for the terminal. The embodiments of the present application do not limit this.
The terminal 101 implements an image processing function through an installed client, or the terminal 101 accesses a portal of the image processing platform 102 in which the image processing function is implemented.
Illustratively, the terminal 101 has a browser application installed and accesses a website of the image processing platform 102 in the browser application; that is, the terminal 101 displays a page provided by the image processing platform 102, and the image processing platform 102 provides the image processing service according to the user's editing operations in the page.
The image processing platform 102 includes at least one of a server, a plurality of servers, a cloud computing platform, and a virtualization center. The image processing platform 102 is used to provide background services for image processing applications. Optionally, the image processing platform 102 undertakes the primary processing work and the terminal 101 undertakes the secondary processing work; or the image processing platform 102 undertakes the secondary processing work and the terminal 101 undertakes the primary processing work; or either the image processing platform 102 or the terminal 101 can undertake the processing work alone. Alternatively, the image processing platform 102 and the terminal 101 perform cooperative computing using a distributed computing architecture.
Optionally, the image processing platform 102 includes at least one server 1021 and a database 1022, where the database 1022 is used for storing data, and in this embodiment, the database 1022 can store template images to provide data services for the at least one server 1021.
The server can be an independent physical server, a server cluster or distributed system composed of multiple physical servers, or a cloud server providing basic cloud computing services such as cloud services, cloud databases, cloud computing, cloud functions, cloud storage, network services, cloud communication, middleware services, domain name services, security services, CDN, and big data and artificial intelligence platforms. The terminal can be, but is not limited to, a smart phone, a tablet computer, a laptop computer, a desktop computer, a smart speaker, a smart watch, and the like.
Those skilled in the art will appreciate that the number of terminals 101 and servers 1021 can be greater or fewer. For example, there may be only one terminal 101 and one server 1021, or several tens or hundreds of them, or more; the number of terminals or servers and the device types are not limited in the embodiments of the present application.
Fig. 2 is a flowchart of an image processing method provided in an embodiment of the present application, where the method is applied to an electronic device, where the electronic device is a terminal or a server, and referring to fig. 2, the method includes the following steps.
201. The electronic device displays the selected template image in response to a template selection instruction.
The template refers to a fixed format of a mapping or design. The template is a result of fixing and standardizing the structural rule of an object, and the template embodies the standardization of the structural form. The template image refers to a template for creating an image. The template image includes some elements of the image, such as a title, background, text, icons, banner, etc.
According to the different element types of the image or the different positions or styles of the elements in the image, some template images can be made, and when the image is made subsequently, one template image is selected and edited to change the image content, so that the required image can be generated quickly.
202. The electronic equipment responds to the editing operation on the template image, and displays a first image based on first image data, wherein the first image data is obtained by processing the image data of the template image based on the editing operation.
After the user selects the template image, the template image can be edited according to the requirement of the user so as to generate a personalized image. In one possible implementation, the editing operation is an editing operation on any element in the template image, and may be, for example, a background replacing operation, or a title editing operation, or a drag operation on an icon position, or the like.
It is understood that the template image exists in the form of image data in the electronic device, and the electronic device can process the image data when performing the editing operation. After the editing operation is finished, the image data of the template image is changed into the first image data.
203. And the electronic equipment responds to the image generation instruction, and updates the resolution field data in the first image data based on the target resolution to obtain second image data of a second image, wherein the resolution of the second image is the target resolution.
The target resolution is a resolution required for the second image, for example, the target resolution is a resolution required for printing the second image, that is, the target resolution is a resolution satisfying a printing requirement of the second image.
After the editing processing, the image content in the first image data meets the user's requirements. When the user decides to generate the image, the user can perform an image generation operation to trigger the image generation instruction. On receiving the image generation instruction, the electronic device can automatically execute the step of updating the image data based on the target resolution, instead of generating the image directly from the first image data and leaving the user to download it and manually post-process it with image processing software. Thus, through steps 201 to 203, the electronic device can automatically generate a second image that meets the usage requirement without excessive user operation, which effectively reduces user operations, improves image processing efficiency, and improves the practicability and applicability of the image processing method.
Specifically, the electronic device can update the resolution field data in the first image data so that the resolution of the second image meets the use requirement.
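The "resolution field" can be made concrete for one common case. The embodiments elsewhere describe the first image data as base64-encoded data whose resolution field is replaced with data corresponding to the target resolution; for a PNG payload, that field corresponds to the pHYs chunk. The Python sketch below is an illustrative assumption about one possible encoding, not the patent's actual implementation.

```python
import base64
import struct
import zlib

def phys_chunk(dpi: float) -> bytes:
    """Build a PNG pHYs chunk for the target resolution. pHYs stores pixels
    per metre, so DPI is converted via 1 inch = 0.0254 m."""
    ppm = round(dpi / 0.0254)
    data = struct.pack(">IIB", ppm, ppm, 1)  # x ppu, y ppu, unit = metre
    body = b"pHYs" + data
    return struct.pack(">I", len(data)) + body + struct.pack(">I", zlib.crc32(body))

def set_png_dpi(png: bytes, dpi: float) -> bytes:
    """Update the resolution field of an in-memory PNG. IHDR is always the
    first chunk, so the new pHYs chunk is spliced in right after it."""
    ihdr_len = struct.unpack(">I", png[8:12])[0]
    ihdr_end = 8 + 12 + ihdr_len  # signature + (length + type + CRC) + data
    rest, out, pos = png[ihdr_end:], b"", 0
    while pos < len(rest):  # copy remaining chunks, dropping any old pHYs
        length = struct.unpack(">I", rest[pos:pos + 4])[0]
        if rest[pos + 4:pos + 8] != b"pHYs":
            out += rest[pos:pos + 12 + length]
        pos += 12 + length
    return png[:ihdr_end] + phys_chunk(dpi) + out

def update_data_url_dpi(data_url: str, dpi: float) -> str:
    """Apply the update to base64-encoded image data, as in the embodiments."""
    header, b64 = data_url.split(",", 1)
    patched = set_png_dpi(base64.b64decode(b64), dpi)
    return header + "," + base64.b64encode(patched).decode("ascii")
```

For 300 DPI the stored value is round(300 / 0.0254) = 11811 pixels per metre; the rest of the image data is untouched, which matches the claim that only the resolution field is replaced.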
The embodiments of the present disclosure provide a method for editing an image online: the selected template image is displayed, and the editing result is displayed as the user performs editing operations, achieving what-you-see-is-what-you-get. The generated image can therefore satisfy the user's requirements without repeated modification, and multiple rounds of communication between the terminal and the server are unnecessary, which effectively reduces the cost of the image processing method and improves processing efficiency. In addition, when the second image is generated, the resolution of the edited image can be updated so that an image with the target resolution is produced; the device thus automatically generates an image whose resolution meets the requirement, and the user does not need to download a server-generated image and edit it manually. This reduces user operations, further improves image processing efficiency, and gives the method good practicability and applicability.
Fig. 3 is a flowchart of an image processing method provided in an embodiment of the present application, and referring to fig. 3, the method includes the following steps.
301. The electronic device acquires image data and a first size of a selected template image in response to a template selection instruction.
When making an image, a user can select a template image that roughly matches the production requirement and edit it to design the required image. The user can also input or select the size of the image required this time. In this embodiment, the image required by the user is referred to as the second image, and the required size is referred to as the first size.
At least one template image can be stored in the electronic equipment, and at least one candidate template image can be displayed in the image generation page and selected by a user.
In another possible implementation manner, the at least one template image may be stored in an image database, and when the electronic device needs to generate an image, the at least one template image can be extracted from the image database and displayed for selection by a user.
For example, as shown in fig. 4, in the image generation page 400, a plurality of template images 401 may be provided, and different template images 401 may include different kinds of elements, and positions or patterns of the elements may be different. The user may select a template image 401 as the basis for this image generation.
An operation of the user on a template image triggers a template selection instruction. On receiving the template selection instruction, the electronic device can respond to it by acquiring the image data and the first size of the selected template image.
302. The electronic equipment obtains a second size corresponding to the first size based on a target conversion relation corresponding to a target resolution, wherein the unit of the first size is a first unit, the unit of the second size is a second unit, and the target conversion relation is a conversion relation between the first unit and the second unit.
In a possible implementation manner, the first unit is the millimeter (mm) and the second unit is the pixel (px). The size of a template image created by a designer generally uses the first unit, while an electronic device generally measures an image in pixels (that is, the second unit) when displaying it. Therefore, after acquiring the template image and the first size, the electronic device can first convert the size unit of the template image so as to scale the template image to a suitable scale for display. Of course, the first unit may also be another unit, such as the decimeter or the meter, which is not limited in this embodiment of the application.
The target resolution is the number of pixels per inch of length. When the resolution differs, the second size corresponding to the first size may differ: when the number of pixels in the same one-inch length differs, the number of pixels converted from the first size also differs. Since the resolution of the second image is the target resolution, the target conversion relationship can be determined based on the target resolution and the size conversion performed using it. The display effect of the preview image obtained in this way conforms to that resolution, so that the user can truly see the generation effect of the image.
The target conversion relationship may be determined based on the target resolution. In a specific example in which the first unit is the millimeter and the second unit is the pixel, the target conversion relationship may be as shown in (a) of fig. 5:

rate_mm2px = (1 × DPI)/(2.54 × 10)

where rate_mm2px is the conversion ratio of millimeters to pixels, DPI (Dots Per Inch) is the target resolution, and one inch is 25.4 mm (2.54 cm).
The target resolution can reflect the precision of the image; different precision means different image sharpness. In the present embodiment, as shown in (b) of fig. 5, DPI(0) can be understood as the start resolution, that is, the display default resolution (72 DPI). The value of the DPI determines the trend of the curve, which can be understood as the trend of the improvement in image sharpness. Here n represents a value taken from the curve, and its specific value can be determined by a relevant technician or user according to the precision of the image to be generated. From printing tests on the material maps, the image printing precision requirement is 300 pixels/inch, that is, 300 pixel points per inch of length, so n may be set to 300. That is, the target resolution is 300, and the target conversion relationship corresponding to the target resolution is: the conversion ratio of millimeters (mm) to pixels (px) is rate_mm2px = (1 × 300)/(2.54 × 10) ≈ 11.81. Of course, the target resolution may also be set to another value by a relevant technician or user according to requirements, which is not particularly limited in this embodiment of the application.
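The unit conversion described above can be sketched as a pair of helper functions (a minimal sketch; the function names are illustrative assumptions, not from the original):

```javascript
// One inch is 25.4 mm, so at a resolution of DPI pixels per inch the
// conversion ratio is rate_mm2px = DPI / 25.4 pixels per millimetre.
function mmToPx(mm, dpi) {
  return mm * dpi / 25.4;
}

// The inverse conversion, used when scaling an edited image back to a
// physical size (see step 306).
function pxToMm(px, dpi) {
  return px * 25.4 / dpi;
}
```

At the target resolution of 300 DPI, mmToPx(1, 300) ≈ 11.81 px; at the browser display default of 96 DPI, mmToPx(17.7, 96) ≈ 66.90 px.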
303. And the electronic equipment performs scaling processing on the image data of the template image according to the second size and the third size of the preview display area to obtain the image data of the preview image, wherein the size of the preview image is the third size.
When the electronic device displays the template image, it needs to be considered that the third size of the preview display area may be different from the second size, and the size of the template image may be smaller or larger than the size of the preview display area. After the electronic device determines the second size of the template image, the template image may be scaled according to the second size and the third size. In this way, the template image is displayed after being scaled to a size suitable for display in the preview display area.
In one possible implementation, when the electronic device zooms, the electronic device can determine the zoom ratio first and then zoom according to the zoom ratio. Specifically, in step 303, the electronic device may determine a scaling ratio according to the second size and the third size, and then perform scaling processing on the image data of the template image according to the scaling ratio to obtain the image data of the preview image.
The scaling ratio may be a first ratio between the height in the third size and the height in the second size, or a second ratio between the width in the third size and the width in the second size. Alternatively, the scaling ratio may be the minimum or the maximum of the first ratio and the second ratio. The embodiments of the present application do not limit this.
In one possible implementation manner, the electronic device may scale the template image in equal proportion according to the second size and the third size to obtain the preview image; that is, the image data of the template image is scaled in equal proportion to obtain the image data of the preview image. With equal-proportion scaling, the shapes of the elements in the scaled preview image, and the proportions among them, are the same as in the template image and in the finally generated second image. The patterns of the elements are thus presented to the user visually, so that the user can directly see the shapes of the elements in the generated image and the proportions among them, and an image that meets the user's requirements can be generated. The image processing is therefore more accurate.
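The scaling in step 303 can be sketched as follows (assumed names; the minimum of the two ratios is chosen here so that the template fits inside the preview area while keeping its aspect ratio):

```javascript
// Choose the equal-proportion scale factor between the converted template
// size (second size) and the preview display area (third size).
function previewScale(secondSize, thirdSize) {
  return Math.min(
    thirdSize.width / secondSize.width,
    thirdSize.height / secondSize.height
  );
}

// Apply the scale factor to a size, rounding to whole pixels.
function scaleSize(size, ratio) {
  return {
    width: Math.round(size.width * ratio),
    height: Math.round(size.height * ratio)
  };
}
```

For an A4 template converted to 794 × 1123 px and a preview area 800 px wide, this yields a preview of 800 × 1131 px.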
For example, for on-screen display at the browser default resolution of 96 DPI, the conversion relationship is: 1 mm ≈ 3.779527559055 px, 1 pt ≈ 1.333333333333 px. Here pt is an abbreviation of point, a unit of size. As shown in fig. 6, if the first size includes 17.7 mm, then 66.8976377953 pixels (px) are obtained after converting the unit. In a specific possible embodiment, the target conversion relationship may be packaged as a unit-of-measure calculator 600, and the first size can be converted into the corresponding second size by the unit-of-measure calculator 600.
As shown in fig. 7, a specific example is provided: the size conversion of a material diagram displayed in a browser is described below, taking an A4-size material diagram (width 210 mm, height 297 mm) as an example. In fig. 7, the material diagram may include four elements: element 1, element 2, element 3, and element 4. The left side illustrates the positions and sizes of the four elements, which the user can drag or zoom, and the right side illustrates information about the four elements, such as width, height, internal spacing, and font size.
Width: the preview width of the material in the browser is set to 800 px (this value can be modified as the case may be).
Height: the width of 210 mm converts to 793.70 px (approximately 794 px), and the height of 297 mm converts to 1122.52 px (approximately 1123 px). At a preview width of 800 px, the preview height is then 1123 × 800/794 ≈ 1131 px.
Internal spacing: taking the top spacing of the topmost title text as an example, 17.7 mm converts to about 66.90 px at the 794 px width; at the 800 px preview width this becomes 66.90 × 800/794 ≈ 67 px.
Font size: taking the topmost title text of the material diagram as an example, 20 pt converts to about 26.67 px; by the same reasoning, 26.67 px at the 794 px width becomes 26.67 × 800/794 ≈ 27 px at the 800 px preview width.
Note: rounding is used in the calculation process. Keeping as many decimal places as possible in the intermediate values gives more accurate results, for example 66.897637 × 800/794 ≈ 67.403160 ≈ 67 px.
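The worked example above can be reproduced numerically (a sketch; the 96 DPI browser default and the variable names are assumptions):

```javascript
// A4 material diagram previewed in a browser at the CSS default of 96 dpi.
const MM_PER_INCH = 25.4;
const mm2px = (mm, dpi = 96) => mm * dpi / MM_PER_INCH;

const widthPx  = Math.round(mm2px(210));  // 210 mm -> approximately 794 px
const heightPx = Math.round(mm2px(297));  // 297 mm -> approximately 1123 px

const previewWidth  = 800; // chosen preview width
const previewHeight = Math.round(heightPx * previewWidth / widthPx); // 1131 px

// Top spacing of the title: 17.7 mm, rescaled to the 800 px preview width.
const topSpacing = Math.round(mm2px(17.7) * previewWidth / widthPx); // 67 px
```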
Steps 302 to 303 are the process of obtaining the image data of a preview image based on the target resolution, the first size, and the image data of the template image, in which the electronic device converts the unit of the size of the template image, scales the converted size, and generates the preview image. The electronic device may also obtain the preview image in other manners; for example, it may obtain the size, in the first unit, of the preview display area and scale the template image based on that size and the first size to obtain the preview image. The embodiments of the present application do not limit this.
304. The electronic device displays the preview image in the preview display area based on the image data of the preview image.
After the electronic device acquires the image data of the preview image, it renders and displays the preview image. This step 304 is the process of displaying the preview image based on the image data of the preview image.
The steps 301 to 304 are a process of displaying a selected template image in response to a template selection instruction, which scales the template image to the preview image in the preview display area by a scaling operation. For example, as shown in FIG. 8, a user has selected a template image 801 and set a first size 802, and the electronic device can display the selected template image 801 and the first size 802, the first size 802 being 115 millimeters (mm) by 160 mm.
The process may also be implemented in other ways, for example, the electronic device may also store a preview image of the template image, and the preview image of the template image is displayed in response to a template selection instruction. The embodiments of the present application do not limit this.
305. The electronic equipment responds to the editing operation on the preview image, and displays a first image based on first image data, wherein the first image data is obtained by processing the image data of the preview image based on the editing operation.
After the electronic equipment displays the preview image, the user can edit elements in the preview image according to requirements so as to meet the use requirements. When the electronic device detects an editing operation by a user, the electronic device can process image data of the preview image based on the editing operation and display the processed image. And obtaining a first image after the editing operation is finished.
The electronic device can provide an online editing function: a user can view a page provided by the electronic device through a browser application on the terminal, and the electronic device can update the content displayed on the page according to the user's editing operations so as to display the editing result. The online editing process is thus what-you-see-is-what-you-get, and the user can see how the image content changes after each edit, so the image can be edited accurately.
In one possible implementation, the template image includes at least one element of a title, a background, text, an icon, and a banner, and the editing operation on the template image is any one of a drag operation, a zoom operation, a color setting operation, and a replace operation on any one element of the at least one element.
Upon determining that the requirements are met, the user can perform an image generation operation that can trigger an image generation instruction that is received by the electronic device and in response to which subsequent steps can be performed to generate a second image.
For example, as shown in fig. 8, the template image is displayed as a preview image, the template image or the preview image includes a title 801, a background 802, an icon 803, and the like, and the user can edit the title 801, change the background 802, or change the color of the background 802, and add or change the icon 803. The electronic device can generate a second image based on the currently displayed image.
This step 305 is a process of displaying a first image based on first image data processed based on an editing operation on the template image in response to the editing operation on the template image. The template image is displayed as a preview image when displayed, and the preview image can be edited to obtain first image data.
306. And the electronic equipment responds to an image generation instruction, and acquires a fourth size corresponding to the third size based on the target conversion relation, wherein the unit of the third size is the second unit, and the unit of the fourth size is the first unit.
After the editing operation, when the electronic device needs to generate the second image based on the edited first image, the size of the first image may be scaled back to the original size, that is, the first size. During zooming, the second unit can be converted into the first unit according to the target conversion relationship, and then zooming is performed.
The target conversion relationship may be specifically referred to in step 302, and in step 306, the electronic device may convert the third size of the second unit into the fourth size of the first unit according to the target conversion relationship.
Step 306 is similar to step 302, and will not be described herein. Except that in step 306 the second unit is converted to the first unit and in step 302 the first unit is converted to the second unit, with the inverse conversion.
307. And the electronic equipment performs scaling processing on the first image data according to the fourth size and the first size to obtain third image data of a third image, wherein the size of the third image is the first size.
Step 307 is similar to step 303, and will not be described herein.
Similarly, the electronic device may also determine a scaling ratio based on the fourth size and the first size, and then perform scaling processing on the first image data according to the scaling ratio to obtain third image data of the third image.
Similarly, the scaling ratio may be a first ratio between the height in the first size and the height in the fourth size, or a second ratio between the width in the first size and the width in the fourth size. Alternatively, the scaling ratio may be the minimum or the maximum of the first ratio and the second ratio. The embodiments of the present application do not limit this.
Similarly, the scaling process may also be an equal scaling process.
308. And the electronic equipment updates the resolution field data in the third image data of the third image based on the target resolution to obtain second image data of a second image, wherein the size of the second image is the first size, and the resolution of the second image is the target resolution.
After the scaling processing, the electronic device can modify the resolution of the image according to the target resolution to obtain a second image with the definition meeting the requirement. The resolution modification mode can be realized by updating the resolution field data in the image data.
In a possible implementation manner, the updating step may be implemented by replacing the resolution field data, and specifically, the electronic device replaces the resolution field data in the third image data of the third image with the data corresponding to the target resolution to obtain the second image data of the second image. And after the resolution field data is replaced by the data corresponding to the target resolution, the resolution field data in the second image data is the data corresponding to the target resolution, and the resolution of the second image is the target resolution.
The third image data is the first image data after scaling, so step 308 amounts to the electronic device replacing the resolution field data in the first image data with the data corresponding to the target resolution to obtain the second image data of the second image. In the above manner, the electronic device scales the first image and generates the second image based on the scaled first image. In other implementations, the electronic device may also operate directly on the encoded data: when the first image data is base64-encoded data, the step of replacing the resolution field data may be that the electronic device replaces the resolution field data in the encoded data of the first image with the hexadecimal data corresponding to the target resolution to obtain the second image data of the second image.
The resolution field data can be determined by analyzing the image data of images with different resolutions. For example, as shown in fig. 9, the dpi information of an image is stored at a certain position in the header of the image file. Since base64 encoding operates on the byte stream, the base64 characters at a corresponding position of the encoded string must store the dpi information of the image; one only needs to find that position in the string and replace those characters with the base64 characters that encode 300 dpi.
Specifically, the process of determining the string position of the dpi information may be implemented in either of the following two ways.
In the first way, two pictures processed by a picture editing tool can be selected that differ only in dpi; each is base64 encoded, and the differing positions are found by comparing the two encoded strings.
In the second way, the binary file of the picture can be obtained, the byte positions storing the dpi information found in it, and the positions of the base64 characters to be updated then calculated according to the base64 encoding rules.
For the second way, a binary file in the JPEG format can be obtained and the byte positions storing the dpi information located in it. In a specific example, byte positions 15-18 in the binary file hold the dpi information of the image, in the following format: bytes 15-16 hold the horizontal dpi, and bytes 17-18 hold the vertical dpi. The horizontal and vertical dpi each occupy two bytes in big-endian byte order, i.e. the high-order byte is stored at the lower memory address.
As shown in fig. 10 and 11, opening the picture file with a binary viewing tool shows that the data for 96 × 96 dpi is saved in exactly these four bytes, where one byte is represented by two hexadecimal digits. The first 20 bytes of the source file, i.e. the first 40 hexadecimal digits, are (the digits holding the dpi are shown in red in the figure): FF D8 FF E0 00 10 4A 46 49 46 00 01 01 01 00 60 00 60 00 00. The dpi bytes can then be changed to the hexadecimal characters corresponding to 300 × 300: FF D8 FF E0 00 10 4A 46 49 46 00 01 01 01 01 2C 01 2C 00 00. Grouping every three bytes, the first 18 bytes divide into exactly 6 groups, with the dpi data stored in the 5th and 6th groups; the data in the first four groups are unchanged, so only the fifth and sixth groups need to be base64 encoded. By the base64 encoding rules, 01 01 01 2C 01 2C corresponds to the base64 string 'AQEBLAEs' (8 characters in total). The first four groups of 12 bytes encode to 16 characters, and the leading string 'data:image/jpeg;base64,' is 23 characters, 39 characters in total; the 8 characters from the 40th character onward therefore store the dpi information. Replacing characters 40-47 of the base64-encoded string with 'AQEBLAEs' thus modifies the dpi recorded in the picture's base64-encoded string to 300.
Specifically, the above modification can be implemented by the following function:
// Replace characters 40-47 (zero-based indices 39-46) of a base64-encoded
// JPEG data URL with the characters that encode a density of 300 x 300 dpi.
function formatBase64StrTo300Dpi(base64Str) {
    return base64Str.substr(0, 39) + 'AQEBLAEs' + base64Str.substr(47)
}
The data 'AQEBLAEs' corresponding to the target resolution may be obtained by the electronic device by querying the Base64 encoding table. The Base64 encoding table may be as shown in Table 1.
TABLE 1
(The standard Base64 alphabet, mapping the values 0-63 to the characters A-Z, a-z, 0-9, '+' and '/'.)
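The byte-to-character correspondence described above can also be checked with Node.js's Buffer rather than by manual table lookup (a sketch assuming a Node.js environment):

```javascript
// The six bytes holding version-minor, density units, and the two 300 dpi
// density values (0x012C === 300) encode to the base64 string 'AQEBLAEs'.
const dpiBytes = Buffer.from([0x01, 0x01, 0x01, 0x2c, 0x01, 0x2c]);
const encoded = dpiBytes.toString('base64'); // 'AQEBLAEs'

// Decoding back: the horizontal density sits at offsets 2-3 of this group
// and the vertical density at offsets 4-5, both big-endian.
const decoded = Buffer.from('AQEBLAEs', 'base64');
const horizontalDpi = decoded.readUInt16BE(2); // 300
const verticalDpi = decoded.readUInt16BE(4);   // 300
```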
It should be noted that steps 306 to 308 are, in response to the image generation instruction, the process of updating the resolution field data in the first image data based on the target resolution to obtain the second image data of the second image, where the resolution of the second image is the target resolution. In a possible implementation manner, in this process, the electronic device may convert an HTML (HyperText Markup Language) page into a canvas, and then convert the canvas into an image.
Specifically, it can be realized by the following steps one to three.
Step one, the electronic equipment draws the displayed first image on a canvas based on the target resolution.
And step two, the electronic equipment acquires the image data of the canvas.
In step two, the acquisition of the image data of the canvas may be implemented based on the canvas API. In particular, the electronic device can obtain the image data of the canvas based on a target application program interface of the browser application.
And step three, the electronic equipment executes the step of updating the resolution field data based on the image data of the canvas.
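Steps one to three can be sketched as follows. The browser-only calls (drawing the page into a canvas, for instance with a library such as html2canvas, and serializing it with toDataURL) are shown only as comments, while the dpi-patching part is a pure string operation and is shown runnable; all names here are illustrative assumptions:

```javascript
// Patch the JFIF density field of a 'data:image/jpeg;base64,...' string to
// 300 x 300 dpi: characters 40-47 of the data URL encode JFIF header bytes
// 13-18 (version minor, density units, X density, Y density).
function patchDataUrlTo300Dpi(dataUrl) {
  return dataUrl.substr(0, 39) + 'AQEBLAEs' + dataUrl.substr(47);
}

// Browser-only sketch (not runnable outside a page):
//   const canvas = await html2canvas(document.querySelector('#preview'));
//   const secondImageData = patchDataUrlTo300Dpi(canvas.toDataURL('image/jpeg'));
```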
In a possible implementation manner, after the electronic device acquires the second image data of the second image, the printing step may be executed directly, or the second image data may be stored on a server for subsequent use. Specifically, the electronic device may send the second image data to a target server for storage, where the target server is used for storage management of image data. The target server may also be referred to as an image server; it has larger storage space and better image-data processing capability than a general server, and the second image data is therefore sent to the target server for storage. For example, the target server may be a cloud object storage (COS) server.
The above steps 301 to 308 are one possible implementation manner. After step 305, the electronic device may also generate the second image directly without scaling the first image back to the first size, so that the generated second image has a smaller size; it then stores the small-sized second image together with the first size, and redraws the image at the first size when the image is subsequently downloaded. Specifically, after step 305, the electronic device may display the first image based on the first image data in response to the editing operation on the preview image, where the size of the first image is the third size. Accordingly, the electronic device may update the resolution field data in the first image data based on the target resolution directly, without performing the scaling step, to obtain the second image data of the second image, where the size of the second image in the second image data is the third size. The electronic device then sends the second image data and the first size to a server for storage.
If the second image is downloaded subsequently, the electronic device may obtain the second image data and the first size from the server in response to an acquisition instruction for the second image, and draw the second image based on the second image data and the first size to obtain a fourth image of the first size. The fourth image is thus the second image redrawn at the large size. In this way, the second image with a small data volume is transmitted to the server, and the large-data-volume image is redrawn in subsequent use, which reduces the amount of data transmitted without affecting subsequent normal use of the second image (such as printing). It also ensures normal transmission of the image data in the case where the data transmission of the target interface is limited.
In another possible implementation manner, after the step 305, the electronic device may further perform a scaling process, which is to simply scale the image to a target size, where the target size is smaller than the first size and may also be smaller than the third size, so that the image is processed into a smaller size image and stored, and the amount of data transmitted can be reduced while ensuring subsequent normal use of the image.
In one possible implementation, the electronic device may modify, in response to a configuration modification instruction for a target interface, the data transfer limitation threshold of the target interface from a first threshold to a second threshold, the second threshold being greater than the first threshold, the target interface being configured to transfer the second image data. The target interface is an interface for transmitting image data to a server; for example, the target interface may be a PHP (PHP: Hypertext Preprocessor) interface. The PHP interface itself has a data transfer limitation threshold; for example, the first threshold may be 10 MB. Raising the data transmission upper limit of the PHP interface allows the target interface to transmit second image data with a large data volume, so that the storage step for the second image data can be carried out.
In a possible implementation manner, the electronic device may further perform compression processing on the second image data to obtain compressed data of the second image, and send the compressed data to the server for storage. Thus, when the second image is downloaded subsequently, the compressed data can be downloaded and decompressed to obtain the second image data. By the compression processing, the data amount of the second image data is reduced, so that the data amount transmitted to the server can be reduced, the data transmission load can be reduced, the storage load of the server can also be reduced, and the normal transmission of the image data can also be ensured under the condition that the data transmission of the target interface is limited.
In a specific example, as shown in fig. 12 and 13, the image processing method provided by the embodiment of the present application can provide a convenient image processing service for a service provider. The method can be applied to a material cloud platform, which provides a material making service for the service provider; the material may be in the form of an image. Specifically, the image processing method may comprise a plurality of stages: a login stage, a template selection stage, a design stage, a synthesis stage, a download stage, a storage stage, and a printing stage. In the login stage, the service provider can log in to the material cloud platform. In the template selection stage, the service provider selects one template image from the template images provided by the material cloud platform. In the design stage, the service provider can perform personalized editing on the elements in the template image, where the elements may include a title (Title), a Background (BG), a Text (Text), an Icon (Icon), a Banner (Banner), and the like, and the specific editing operations may include setting, replacing, and drag operations. In the synthesis stage, the material cloud platform can convert the page elements into a canvas based on the API of html2canvas, and finally save the canvas as an image (such as an image in jpg format). In the download and storage stages, the service provider can store the synthesized image locally after obtaining it through the browser, and may also upload the synthesized image to a target server for storage; the service provider may later download the image from the target server or upload a locally stored image to it. In the printing stage, the service provider can print the image through a printer.
By the method, the image which accords with the propaganda display and printing size can be generated through the image editing and processing process, repeated modification is not needed, multiple communication is not needed between the terminal and the server, the cost of image processing can be effectively reduced, and the processing efficiency is improved. And when the second image is generated, the resolution of the image after the editing operation can be changed to generate a new image of the target resolution, so that the terminal automatically generates the image meeting the requirements without downloading from the server by the user. Therefore, user operations are reduced, the image processing efficiency is further improved, and the practicability and the applicability of the image processing method are more excellent.
As shown in fig. 14, (a) of fig. 14 shows that more than 90% of the work in the method provided by the present application is completed at the front end. Compared with the roughly 55% of the conventional scheme shown in (b) of fig. 14, completing the work at the front end greatly reduces the number of data interactions between the front end and the server, achieves what-you-see-is-what-you-get, and better meets the personalized requirements of users.
The embodiment of the disclosure provides a method for editing an image online. The method can display a selected template image and display the editing result according to the user's editing operations, achieving what-you-see-is-what-you-get, so that the generated image more easily meets the user's requirements without repeated modification or multiple communications between the terminal and the server; this effectively reduces the cost of the image processing method and improves the processing efficiency. Moreover, when generating the second image, the resolution of the edited image can be updated to generate an image at the target resolution, so that the device automatically generates an image whose resolution meets the requirements, and the user need not download an image generated by the server and edit it manually. This reduces user operations, further improves image processing efficiency, and gives the method better practicability and applicability.
All the above optional technical solutions can be combined arbitrarily to form optional embodiments of the present application, and are not described herein again.
Fig. 15 is a schematic structural diagram of an image processing apparatus provided in an embodiment of the present application, and referring to fig. 15, the apparatus includes:
a display module 1501, configured to display a selected template image in response to a template selection instruction;
the display module 1501 is further configured to display a first image based on first image data in response to an editing operation on the template image, where the first image data is obtained by processing image data of the template image based on the editing operation;
an updating module 1502 is configured to update, in response to an image generation instruction, resolution field data in the first image data based on a target resolution to obtain second image data of a second image, where the resolution of the second image is the target resolution.
In a possible implementation manner, the updating module 1502 is configured to replace the data of the field of resolution in the first image data with the data corresponding to the target resolution, so as to obtain second image data of the second image.
In one possible implementation, the update module 1502 includes a drawing unit, an acquisition unit, and an updating unit;
the drawing unit is configured to draw the displayed first image onto a canvas based on the target resolution;
the acquisition unit is configured to acquire the image data of the canvas;
the updating unit is configured to perform the step of updating the resolution field data based on the image data of the canvas.
In one possible implementation, the acquisition unit is configured to acquire the image data of the canvas through a target application program interface of a browser application.
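As a sketch of how the drawing unit and acquisition unit might cooperate in a browser, the edited image can be drawn onto an off-screen canvas scaled for the target resolution and read back with the canvas `toDataURL` API. The helper names, the 96-DPI screen assumption, and the scaling policy below are illustrative assumptions, not taken from the patent.

```javascript
// Pure helper: pixel dimensions needed so that an image of the given CSS size
// carries enough pixels for the target DPI. The 96-DPI screen density is the
// CSS reference value; this assumption is ours, not the patent's.
function targetPixelSize(cssWidth, cssHeight, targetDpi, screenDpi = 96) {
  const scale = targetDpi / screenDpi;
  return {
    width: Math.round(cssWidth * scale),
    height: Math.round(cssHeight * scale),
  };
}

// Browser-only part: draw the displayed first image onto a canvas at the
// scaled size, then acquire its image data as a Base64 data URL via the
// browser's canvas API (only callable in an environment with a DOM).
function canvasImageData(img, targetDpi) {
  const { width, height } = targetPixelSize(img.width, img.height, targetDpi);
  const canvas = document.createElement('canvas');
  canvas.width = width;
  canvas.height = height;
  canvas.getContext('2d').drawImage(img, 0, 0, width, height);
  return canvas.toDataURL('image/png'); // Base64 image data of the canvas
}
```

Drawing at the scaled pixel size is what makes the later resolution-field update meaningful: the canvas already holds enough pixels for the target DPI, and only the metadata still needs to change.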
In one possible implementation, the first image data is Base64-encoded data;
the updating module 1502 is configured to replace the resolution field data in the encoded data of the first image with the hexadecimal data corresponding to the target resolution, so as to obtain the second image data of the second image.
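One plausible reading of "replacing the resolution field with hexadecimal data" for Base64-encoded JPEG data is patching the density bytes of the JFIF APP0 segment inside a data URL. The sketch below assumes a baseline JFIF file whose APP0 segment immediately follows the SOI marker; the fixed byte offsets come from the JFIF layout, and the function name is illustrative, not from the patent.

```javascript
// Hypothetical sketch: rewrite the DPI fields of a Base64 JPEG data URL.
// JFIF APP0 layout (byte offsets): 0-1 SOI (FF D8), 2-3 APP0 marker (FF E0),
// 4-5 segment length, 6-10 "JFIF\0", 11-12 version, 13 density unit,
// 14-15 X density, 16-17 Y density.
function setDataUrlDpi(dataUrl, targetDpi) {
  const [prefix, payload] = dataUrl.split(',');
  const bytes = Buffer.from(payload, 'base64');
  // Only proceed if the APP0/JFIF segment sits where this sketch assumes.
  if (bytes[2] !== 0xff || bytes[3] !== 0xe0 ||
      bytes.toString('ascii', 6, 10) !== 'JFIF') {
    throw new Error('not a baseline JFIF JPEG');
  }
  bytes[13] = 0x01;                    // density unit: dots per inch
  bytes.writeUInt16BE(targetDpi, 14);  // X density
  bytes.writeUInt16BE(targetDpi, 16);  // Y density
  return prefix + ',' + bytes.toString('base64');
}
```

Because only a few metadata bytes change, the pixel data is untouched; this matches the patent's point that the edited image itself need not be regenerated to reach the target resolution.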
In one possible implementation, the display module 1501 is configured to:
in response to a template selection instruction, acquire the image data and a first size of the selected template image;
acquire image data of a preview image based on the target resolution, the first size, and the image data of the template image; and
display the preview image based on the image data of the preview image.
In one possible implementation, the display module 1501 is configured to:
acquire a second size corresponding to the first size based on a target conversion relationship corresponding to the target resolution, where the first size is expressed in a first unit, the second size is expressed in a second unit, and the target conversion relationship converts between the first unit and the second unit; and
scale the image data of the template image according to the second size and a third size of a preview display area to obtain the image data of the preview image, where the size of the preview image is the third size.
in one possible implementation, the display module 1501 is configured to display the preview image in the preview display area based on the image data of the preview image.
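The "target conversion relationship" reads most naturally as a physical-unit-to-pixel conversion at the target resolution. A minimal sketch follows, assuming the first unit is millimetres and the second unit is pixels (1 inch = 25.4 mm); the helper names and the uniform-fit scaling policy are illustrative assumptions:

```javascript
// mm -> px at a given DPI: the assumed "target conversion relationship".
const mmToPx = (mm, dpi) => Math.round((mm / 25.4) * dpi);

// px -> mm: the inverse relation, as used when a pixel size (third size)
// is converted back to a physical size (fourth size).
const pxToMm = (px, dpi) => (px / dpi) * 25.4;

// Uniform scale factor that fits the second size (in pixels) into the
// preview display area (the third size), preserving aspect ratio.
function previewScale(widthPx, heightPx, areaWidth, areaHeight) {
  return Math.min(areaWidth / widthPx, areaHeight / heightPx);
}
```

For example, a 210 mm-wide template at 300 DPI needs roughly 2480 px of width; the preview then only has to scale that pixel size down to the on-screen display area.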
In one possible implementation, the update module 1502 is configured to:
acquire a fourth size corresponding to the third size based on the target conversion relationship, where the third size is expressed in the second unit and the fourth size in the first unit;
scale the first image data according to the fourth size and the first size to obtain third image data of a third image, where the size of the third image is the first size; and
update the resolution field data in the third image data of the third image based on the target resolution to obtain the second image data of the second image, where the size of the second image is the first size.
In one possible implementation, the apparatus further includes:
a first storage module, configured to send the second image data to a target server for storage, where the target server performs storage management on image data.
In one possible implementation, the display module 1501 is configured to display a first image based on first image data in response to an editing operation on the preview image, where the size of the first image is the third size;
the updating module 1502 is configured to update resolution field data in the first image data based on the target resolution to obtain second image data of a second image, where a size of the second image in the second image data is a third size;
the device also includes:
the second storage module is used for sending the second image data and the first size to a server for storage;
an acquisition module, configured to acquire the second image data and the first size from the server in response to an acquisition instruction for the second image;
and the drawing module is used for drawing the second image based on the second image data and the first size to obtain a fourth image with the first size.
In one possible implementation, the apparatus further includes:
a modification module, configured to modify, in response to a configuration modification instruction for a target interface, the data transmission limit threshold of the target interface from a first threshold to a second threshold, where the second threshold is greater than the first threshold and the target interface is used to transmit the second image data.
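The patent does not say which interface carries the limit, so the sketch below models it abstractly: an upload interface keeps a configurable byte threshold that is raised from a first value to a larger second value so that high-resolution image payloads are not rejected. All names and the 1 MiB / 10 MiB values are illustrative assumptions.

```javascript
// Hypothetical model of the "target interface" with a modifiable
// data transmission limit threshold.
const uploadInterface = {
  limitBytes: 1 * 1024 * 1024, // first threshold: 1 MiB (assumed value)
  setLimit(bytes) {            // the "configuration modification instruction"
    this.limitBytes = bytes;
  },
  accepts(payloadBytes) {      // whether a payload may be transmitted
    return payloadBytes <= this.limitBytes;
  },
};

// Raise the limit to a second, larger threshold: 10 MiB (assumed value).
uploadInterface.setLimit(10 * 1024 * 1024);
```

In a real web stack this would correspond to raising a server-side request body size limit; the patent leaves that mechanism unspecified.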
In one possible implementation, the apparatus further includes:
the compression module is used for compressing the second image data to obtain compressed data of the second image;
and the third storage module is used for sending the compressed data to the server for storage.
In one possible implementation, the template image includes at least one element of a title, a background, text, an icon, and a banner, and the editing operation on the template image is any one of a drag operation, a zoom operation, a color setting operation, and a replace operation on any one element of the at least one element.
The apparatus provided in this embodiment of the application displays the selected template image and renders the editing result as the user edits, achieving what-you-see-is-what-you-get: the generated image is more likely to meet the user's requirements without repeated modification, and repeated communication between the terminal and the server is avoided, which effectively reduces the cost of the image processing method and improves processing efficiency. Moreover, when the second image is generated, the resolution recorded in the edited image data is updated so that an image with the target resolution is produced. The device therefore automatically generates an image whose resolution meets the requirement, and the user need not download a server-generated image and edit it manually, which reduces user operations, further improves image processing efficiency, and gives the apparatus good practicability and applicability.
It should be noted that the division into the above functional modules is only an example of how the image processing apparatus of the above embodiment processes an image; in practical applications, the above functions can be allocated to different functional modules as needed, that is, the internal structure of the apparatus can be divided into different functional modules to complete all or part of the functions described above. In addition, the image processing apparatus and the image processing method provided by the above embodiments belong to the same concept; their specific implementation processes are described in detail in the method embodiments and are not repeated here.
The electronic device in the above method embodiments can be implemented as a terminal. For example, fig. 16 is a block diagram of a terminal according to an embodiment of the present disclosure. The terminal 1600 may be a portable mobile terminal such as a smart phone, a tablet computer, an MP3 (Moving Picture Experts Group Audio Layer III) player, an MP4 (Moving Picture Experts Group Audio Layer IV) player, a notebook computer, or a desktop computer. Terminal 1600 may also be called user equipment, a portable terminal, a laptop terminal, a desktop terminal, or other names.
Generally, terminal 1600 includes: a processor 1601, and a memory 1602.
Processor 1601 may include one or more processing cores, such as a 4-core processor, an 8-core processor, and so on. The processor 1601 may be implemented in at least one hardware form of a DSP (Digital Signal Processing), an FPGA (Field-Programmable Gate Array), and a PLA (Programmable Logic Array). Processor 1601 may also include a main processor and a coprocessor, where the main processor is a processor for Processing data in an awake state, and is also referred to as a Central Processing Unit (CPU); a coprocessor is a low power processor for processing data in a standby state. In some embodiments, the processor 1601 may be integrated with a GPU (Graphics Processing Unit), which is responsible for rendering and drawing content that the display screen needs to display. In some embodiments, the processor 1601 may further include an AI (Artificial Intelligence) processor for processing computing operations related to machine learning.
Memory 1602 may include one or more computer-readable storage media, which may be non-transitory. The memory 1602 may also include high-speed random access memory, as well as non-volatile memory, such as one or more magnetic disk storage devices, flash memory storage devices. In some embodiments, a non-transitory computer readable storage medium in memory 1602 is used to store at least one instruction for execution by processor 1601 to implement an image processing method provided by method embodiments of the present application.
In some embodiments, the terminal 1600 may also optionally include: peripheral interface 1603 and at least one peripheral. Processor 1601, memory 1602 and peripheral interface 1603 may be connected by buses or signal lines. Various peripheral devices may be connected to peripheral interface 1603 via buses, signal lines, or circuit boards. Specifically, the peripheral device includes: at least one of a radio frequency circuit 1604, a display 1605, a camera assembly 1606, audio circuitry 1607, a positioning assembly 1608, and a power supply 1609.
Peripheral interface 1603 can be used to connect at least one I/O (Input/Output) related peripheral to processor 1601 and memory 1602. In some embodiments, processor 1601, memory 1602, and peripheral interface 1603 are integrated on the same chip or circuit board; in some other embodiments, any one or two of the processor 1601, the memory 1602 and the peripheral device interface 1603 may be implemented on a separate chip or circuit board, which is not limited by this embodiment.
The Radio Frequency circuit 1604 is used for receiving and transmitting RF (Radio Frequency) signals, also called electromagnetic signals. The radio frequency circuitry 1604 communicates with communication networks and other communication devices via electromagnetic signals. The rf circuit 1604 converts the electrical signal into an electromagnetic signal to be transmitted, or converts a received electromagnetic signal into an electrical signal. Optionally, the radio frequency circuit 1604 includes: an antenna system, an RF transceiver, one or more amplifiers, a tuner, an oscillator, a digital signal processor, a codec chipset, a subscriber identity module card, and so forth. The radio frequency circuit 1604 may communicate with other terminals via at least one wireless communication protocol. The wireless communication protocols include, but are not limited to: the world wide web, metropolitan area networks, intranets, generations of mobile communication networks (2G, 3G, 4G, and 5G), Wireless local area networks, and/or WiFi (Wireless Fidelity) networks. In some embodiments, the rf circuit 1604 may also include NFC (Near Field Communication) related circuits, which are not limited in this application.
The display 1605 is used to display a UI (User Interface). The UI may include graphics, text, icons, video, and any combination thereof. When the display screen 1605 is a touch display screen, the display screen 1605 also has the ability to capture touch signals on or over the surface of the display screen 1605. The touch signal may be input to the processor 1601 as a control signal for processing. At this point, the display 1605 may also be used to provide virtual buttons and/or a virtual keyboard, also referred to as soft buttons and/or a soft keyboard. In some embodiments, the display 1605 can be one, disposed on the front panel of the terminal 1600; in other embodiments, the display screens 1605 can be at least two, respectively disposed on different surfaces of the terminal 1600 or in a folded design; in other embodiments, display 1605 can be a flexible display disposed on a curved surface or a folded surface of terminal 1600. Even further, the display 1605 may be arranged in a non-rectangular irregular pattern, i.e., a shaped screen. The Display 1605 may be made of LCD (Liquid Crystal Display), OLED (Organic Light-Emitting Diode), or other materials.
The camera assembly 1606 is used to capture images or video. Optionally, camera assembly 1606 includes a front camera and a rear camera. Generally, a front camera is disposed at a front panel of the terminal, and a rear camera is disposed at a rear surface of the terminal. In some embodiments, the number of the rear cameras is at least two, and each rear camera is any one of a main camera, a depth-of-field camera, a wide-angle camera and a telephoto camera, so that the main camera and the depth-of-field camera are fused to realize a background blurring function, and the main camera and the wide-angle camera are fused to realize panoramic shooting and VR (Virtual Reality) shooting functions or other fusion shooting functions. In some embodiments, camera assembly 1606 can also include a flash. The flash lamp can be a monochrome temperature flash lamp or a bicolor temperature flash lamp. The double-color-temperature flash lamp is a combination of a warm-light flash lamp and a cold-light flash lamp, and can be used for light compensation at different color temperatures.
The audio circuitry 1607 may include a microphone and a speaker. The microphone is used for collecting sound waves of a user and the environment, converting the sound waves into electric signals, and inputting the electric signals to the processor 1601 for processing or inputting the electric signals to the radio frequency circuit 1604 to achieve voice communication. For stereo sound acquisition or noise reduction purposes, the microphones may be multiple and disposed at different locations of terminal 1600. The microphone may also be an array microphone or an omni-directional pick-up microphone. The speaker is used to convert electrical signals from the processor 1601 or the radio frequency circuit 1604 into sound waves. The loudspeaker can be a traditional film loudspeaker or a piezoelectric ceramic loudspeaker. When the speaker is a piezoelectric ceramic speaker, the speaker can be used for purposes such as converting an electric signal into a sound wave audible to a human being, or converting an electric signal into a sound wave inaudible to a human being to measure a distance. In some embodiments, the audio circuit 1607 may also include a headphone jack.
The positioning component 1608 is configured to locate the current geographic location of the terminal 1600 for navigation or LBS (Location Based Service). The positioning component 1608 may be a positioning component based on the GPS (Global Positioning System) of the United States, the BeiDou system of China, or the Galileo system of the European Union.
Power supply 1609 is used to provide power to the various components of terminal 1600. Power supply 1609 may be alternating current, direct current, disposable or rechargeable. When power supply 1609 includes a rechargeable battery, the rechargeable battery may be a wired rechargeable battery or a wireless rechargeable battery. The wired rechargeable battery is a battery charged through a wired line, and the wireless rechargeable battery is a battery charged through a wireless coil. The rechargeable battery may also be used to support fast charge technology.
In some embodiments, terminal 1600 also includes one or more sensors 1610. The one or more sensors 1610 include, but are not limited to: acceleration sensor 1611, gyro sensor 1612, pressure sensor 1613, fingerprint sensor 1614, optical sensor 1615, and proximity sensor 1616.
Acceleration sensor 1611 may detect acceleration in three coordinate axes of a coordinate system established with terminal 1600. For example, the acceleration sensor 1611 may be used to detect components of the gravitational acceleration in three coordinate axes. The processor 1601 may control the display screen 1605 to display the user interface in a landscape view or a portrait view according to the gravitational acceleration signal collected by the acceleration sensor 1611. The acceleration sensor 1611 may also be used for acquisition of motion data of a game or a user.
The gyro sensor 1612 can detect the body orientation and rotation angle of the terminal 1600, and can cooperate with the acceleration sensor 1611 to collect the user's 3D motion on the terminal 1600. Based on the data collected by the gyro sensor 1612, the processor 1601 can implement the following functions: motion sensing (such as changing the UI according to a tilting operation by the user), image stabilization during shooting, game control, and inertial navigation.
Pressure sensors 1613 may be disposed on the side frames of terminal 1600 and/or underlying display 1605. When the pressure sensor 1613 is disposed on the side frame of the terminal 1600, a user's holding signal of the terminal 1600 can be detected, and the processor 1601 performs left-right hand recognition or shortcut operation according to the holding signal collected by the pressure sensor 1613. When the pressure sensor 1613 is disposed at the lower layer of the display 1605, the processor 1601 controls the operability control on the UI interface according to the pressure operation of the user on the display 1605. The operability control comprises at least one of a button control, a scroll bar control, an icon control and a menu control.
The fingerprint sensor 1614 is configured to collect a fingerprint of the user, and the processor 1601 is configured to identify the user based on the fingerprint collected by the fingerprint sensor 1614, or the fingerprint sensor 1614 is configured to identify the user based on the collected fingerprint. Upon recognizing that the user's identity is a trusted identity, the processor 1601 authorizes the user to perform relevant sensitive operations including unlocking a screen, viewing encrypted information, downloading software, paying for and changing settings, etc. The fingerprint sensor 1614 may be disposed on the front, back, or side of the terminal 1600. When a physical key or a vendor Logo is provided on the terminal 1600, the fingerprint sensor 1614 may be integrated with the physical key or the vendor Logo.
The optical sensor 1615 is used to collect ambient light intensity. In one embodiment, the processor 1601 may control the display brightness of the display screen 1605 based on the ambient light intensity collected by the optical sensor 1615. Specifically, when the ambient light intensity is high, the display luminance of the display screen 1605 is increased; when the ambient light intensity is low, the display brightness of the display screen 1605 is adjusted down. In another embodiment, the processor 1601 may also dynamically adjust the shooting parameters of the camera assembly 1606 based on the ambient light intensity collected by the optical sensor 1615.
The proximity sensor 1616, also referred to as a distance sensor, is typically disposed on the front panel of the terminal 1600 and is used to collect the distance between the user and the front surface of the terminal 1600. In one embodiment, when the proximity sensor 1616 detects that the distance between the user and the front surface of the terminal 1600 gradually decreases, the processor 1601 controls the display 1605 to switch from the bright-screen state to the rest-screen state; when the proximity sensor 1616 detects that the distance between the user and the front surface of the terminal 1600 gradually increases, the processor 1601 controls the display 1605 to switch from the rest-screen state to the bright-screen state.
Those skilled in the art will appreciate that the configuration shown in fig. 16 is not intended to be limiting of terminal 1600, and may include more or fewer components than those shown, or some components may be combined, or a different arrangement of components may be employed.
The electronic device in the above method embodiments can also be implemented as a server. For example, fig. 17 is a schematic structural diagram of a server according to an embodiment of the present application. The server 1700 may vary greatly in configuration or performance and may include one or more processors (CPUs) 1701 and one or more memories 1702, where the memory 1702 stores at least one program code that is loaded and executed by the processors 1701 to implement the image processing method of the above method embodiments. Certainly, the server can also have components such as a wired or wireless network interface and an input/output interface to facilitate input and output, and can include other components for implementing the functions of the device, which are not described here again.
In an exemplary embodiment, there is also provided a computer-readable storage medium, such as a memory, including at least one program code, the at least one program code being executable by a processor to perform the image processing method in the above-described embodiments. For example, the computer-readable storage medium can be a Read-Only Memory (ROM), a Random Access Memory (RAM), a Compact Disc Read-Only Memory (CD-ROM), a magnetic tape, a floppy disk, an optical data storage device, and the like.
In an exemplary embodiment, a computer program product or a computer program is also provided that includes one or more program codes stored in a computer-readable storage medium. One or more processors of the electronic device can read the one or more program codes from the computer-readable storage medium and execute them, so that the electronic device performs the image processing method described above.
It should be understood that, in the various embodiments of the present application, the sequence numbers of the above-mentioned processes do not mean the execution sequence, and the execution sequence of each process should be determined by its function and inherent logic, and should not constitute any limitation to the implementation process of the embodiments of the present application.
It should be understood that determining B from A does not mean determining B from A alone; B can also be determined from A and/or other information.
Those skilled in the art will appreciate that all or part of the steps for implementing the above embodiments can be implemented by hardware, or by a program instructing relevant hardware; the program can be stored in a computer-readable storage medium, and the storage medium mentioned above can be a read-only memory, a magnetic disk, an optical disc, or the like.
The above description is intended only to be an alternative embodiment of the present application, and not to limit the present application, and any modification, equivalent replacement, improvement, etc. made within the spirit and principle of the present application should be included in the protection scope of the present application.

Claims (15)

1. An image processing method, characterized in that the method comprises:
responding to a template selection instruction, and displaying a selected template image;
responding to the editing operation of the template image, and displaying a first image based on first image data, wherein the first image data is obtained by processing the image data of the template image based on the editing operation;
and responding to an image generation instruction, updating the resolution field data in the first image data based on a target resolution to obtain second image data of a second image, wherein the resolution of the second image is the target resolution.
2. The method of claim 1, wherein the updating the resolution field data in the first image data based on the target resolution to obtain second image data of a second image comprises:
replacing the resolution field data in the first image data with the data corresponding to the target resolution to obtain the second image data of the second image.
3. The method according to claim 2, wherein the first image data is Base64-encoded data;
replacing the resolution field data in the first image data with the data corresponding to the target resolution to obtain second image data of a second image, wherein the method comprises the following steps:
replacing the resolution field data in the encoded data of the first image with the hexadecimal data corresponding to the target resolution to obtain the second image data of the second image.
4. The method of claim 1, wherein the updating the resolution field data in the first image data based on the target resolution to obtain second image data of a second image comprises:
drawing the displayed first image onto a canvas based on the target resolution;
acquiring image data of the canvas;
and updating the resolution field data based on the image data of the canvas.
5. The method of claim 4, wherein obtaining the image data of the canvas comprises:
and acquiring the image data of the canvas based on a target application program interface of the browser application.
6. The method of claim 1, wherein displaying the selected template image in response to a template selection instruction comprises:
responding to a template selection instruction, and acquiring image data and a first size of a selected template image;
acquiring image data of a preview image based on the target resolution, the first size and the image data of the template image;
displaying the preview image based on image data of the preview image.
7. The method of claim 6, wherein obtaining image data for a preview image based on the target resolution, the first size, and the image data for the template image comprises:
acquiring a second size corresponding to the first size based on a target conversion relation corresponding to a target resolution, wherein the unit of the first size is a first unit, the unit of the second size is a second unit, and the target conversion relation is a conversion relation between the first unit and the second unit;
according to the second size and a third size of a preview display area, carrying out zooming processing on the image data of the template image to obtain the image data of the preview image, wherein the size of the preview image is the third size;
the displaying the preview image includes:
displaying the preview image in the preview display area based on image data of the preview image.
8. The method of claim 7, wherein the updating the resolution field data in the first image data based on the target resolution to obtain second image data of a second image comprises:
acquiring a fourth size corresponding to the third size based on the target conversion relation, wherein the unit of the third size is the second unit, and the unit of the fourth size is the first unit;
according to the fourth size and the first size, carrying out scaling processing on the first image data to obtain third image data of a third image, wherein the size of the third image is the first size;
and updating the resolution field data in the third image data of the third image based on the target resolution to obtain second image data of a second image, wherein the size of the second image is the first size.
9. The method of claim 8, further comprising:
and sending the second image data to a target server for storage, wherein the target server is used for storing and managing the image data.
10. The method of claim 7, wherein displaying the first image based on the first image data in response to the editing operation on the template image comprises:
displaying a first image based on first image data in response to an editing operation on the preview image, the first image having the size of the third size;
the updating the resolution field data in the first image data based on the target resolution to obtain second image data of a second image, includes:
updating resolution field data in the first image data based on the target resolution to obtain second image data of a second image, wherein the size of the second image in the second image data is a third size;
the method further comprises the following steps:
sending the second image data and the first size to a server for storage;
acquiring the second image data and the first size from the server in response to an acquisition instruction of the second image;
and drawing the second image based on the second image data and the first size to obtain a fourth image with the first size.
11. The method of claim 1, further comprising:
in response to a configuration modification instruction for a target interface, modifying a data transmission limit threshold of the target interface from a first threshold to a second threshold, the second threshold being greater than the first threshold, the target interface being configured to transmit the second image data.
12. The method of claim 1, further comprising:
compressing the second image data to obtain compressed data of the second image;
and sending the compressed data to a server for storage.
13. An image processing apparatus, characterized in that the apparatus comprises:
the display module is used for responding to the template selection instruction and displaying the selected template image;
the display module is further configured to display, in response to an editing operation on the template image, a first image based on first image data, wherein the first image data is obtained by processing the image data of the template image based on the editing operation;
and the updating module is used for responding to an image generation instruction, updating the data of the resolution field in the first image data based on the target resolution, and obtaining second image data of a second image, wherein the resolution of the second image is the target resolution.
14. An electronic device, comprising one or more processors and one or more memories having at least one program code stored therein, the at least one program code being loaded and executed by the one or more processors to implement the image processing method of any one of claims 1 to 12.
15. A computer-readable storage medium, characterized in that at least one program code is stored in the storage medium, which is loaded and executed by a processor to implement the image processing method according to any one of claims 1 to 12.
CN202010995170.3A 2020-09-21 2020-09-21 Image processing method, device, equipment and storage medium Pending CN114257755A (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN202010995170.3A CN114257755A (en) 2020-09-21 2020-09-21 Image processing method, device, equipment and storage medium

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
CN202010995170.3A CN114257755A (en) 2020-09-21 2020-09-21 Image processing method, device, equipment and storage medium

Publications (1)

Publication Number Publication Date
CN114257755A true CN114257755A (en) 2022-03-29

Family

ID=80788981

Family Applications (1)

Application Number Title Priority Date Filing Date
CN202010995170.3A Pending CN114257755A (en) 2020-09-21 2020-09-21 Image processing method, device, equipment and storage medium

Country Status (1)

Country Link
CN (1) CN114257755A (en)

Cited By (2)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN116048379A (en) * 2022-06-30 2023-05-02 荣耀终端有限公司 Data recharging method and device
CN116048379B (en) * 2022-06-30 2023-10-24 荣耀终端有限公司 Data recharging method and device

Similar Documents

Publication Publication Date Title
CN108415705B (en) Webpage generation method and device, storage medium and equipment
CN110059685B (en) Character area detection method, device and storage medium
CN107885533B (en) Method and device for managing component codes
CN109948581B (en) Image-text rendering method, device, equipment and readable storage medium
CN108196755B (en) Background picture display method and device
WO2022042425A1 (en) Video data processing method and apparatus, and computer device and storage medium
CN110321126B (en) Method and device for generating page code
CN110839174A (en) Image processing method and device, computer equipment and storage medium
CN108734662B (en) Method and device for displaying icons
CN112257006A (en) Page information configuration method, device, equipment and computer readable storage medium
CN111325220B (en) Image generation method, device, equipment and storage medium
CN111105474B (en) Font drawing method, font drawing device, computer device and computer readable storage medium
CN113747199A (en) Video editing method, video editing apparatus, electronic device, storage medium, and program product
CN111625315A (en) Page display method and device, electronic equipment and storage medium
CN109726379B (en) Content item editing method and device, electronic equipment and storage medium
CN109635202B (en) Content item processing method and device, electronic equipment and storage medium
CN111083554A (en) Method and device for displaying live gift
CN112116681A (en) Image generation method and device, computer equipment and storage medium
CN111275607B (en) Interface display method and device, computer equipment and storage medium
CN112053360A (en) Image segmentation method and device, computer equipment and storage medium
CN114257755A (en) Image processing method, device, equipment and storage medium
CN113010258B (en) Picture issuing method, device, equipment and storage medium
CN113467663B (en) Interface configuration method, device, computer equipment and medium
CN113301422A (en) Method, terminal and storage medium for acquiring video cover
CN110390065B (en) Webpage acquisition method, device and system

Legal Events

Date Code Title Description
PB01 Publication
REG Reference to a national code
Ref country code: HK
Ref legal event code: DE
Ref document number: 40070017
Country of ref document: HK

SE01 Entry into force of request for substantive examination