CN114782611A - Image processing method, image processing device, storage medium and electronic equipment - Google Patents


Info

Publication number
CN114782611A
CN114782611A (application CN202210723158.6A); granted as CN114782611B
Authority
CN
China
Prior art keywords
pixel
preset
target
original texture
image
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Granted
Application number
CN202210723158.6A
Other languages
Chinese (zh)
Other versions
CN114782611B (en)
Inventor
张强
朱旭平
宋彬
何文武
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Beijing Feidu Technology Co ltd
Original Assignee
Beijing Feidu Technology Co ltd
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Beijing Feidu Technology Co ltd filed Critical Beijing Feidu Technology Co ltd
Priority to CN202210723158.6A
Publication of CN114782611A
Application granted
Publication of CN114782611B
Legal status: Active

Classifications

    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06T IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T15/00 3D [Three Dimensional] image rendering
    • G06T15/04 Texture mapping
    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06T IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T19/00 Manipulating 3D models or images for computer graphics
    • G06T19/20 Editing of 3D images, e.g. changing shapes or colours, aligning objects or positioning parts

Landscapes

  • Engineering & Computer Science (AREA)
  • Computer Graphics (AREA)
  • Physics & Mathematics (AREA)
  • General Physics & Mathematics (AREA)
  • Theoretical Computer Science (AREA)
  • Architecture (AREA)
  • Computer Hardware Design (AREA)
  • General Engineering & Computer Science (AREA)
  • Software Systems (AREA)
  • Image Generation (AREA)

Abstract

The present disclosure relates to an image processing method, an image processing apparatus, a storage medium, and an electronic device in the field of computer technologies. The method includes: in response to a received pixel adjustment instruction, acquiring a target image according to each preset graph on a preset three-dimensional model and a region to be processed on the model, where each preset graph corresponds to an original texture image and the pixel adjustment instruction instructs that the pixel value of the original texture image corresponding to the region to be processed be adjusted to a target pixel value; determining a target pixel in each original texture image according to the plurality of preset graphs, the target image, and each original texture image; and adjusting the pixel value of the target pixel to the target pixel value. By adjusting only the pixel values of the target pixels that require adjustment in the original texture images, the present disclosure adjusts the pixels of the original texture image corresponding to the region to be processed while improving the efficiency of the adjustment.

Description

Image processing method, image processing device, storage medium and electronic equipment
Technical Field
The present disclosure relates to the field of computer technologies, and in particular, to an image processing method and apparatus, a storage medium, and an electronic device.
Background
In recent years, with the rapid development of computer technology, three-dimensional models have been widely used. To make a three-dimensional model look more realistic and attractive, the model is usually optimized. For a three-dimensional model that has already been created, returning to the original modeling software to adjust the pixels of its texture images involves an enormous workload. Moreover, some models are reconstructed from photographs and cannot be adjusted in the original modeling software at all (e.g., three-dimensional live-action models). A technique is therefore needed that can adjust the pixels of the texture image corresponding to a specified region on the model.
Disclosure of Invention
In order to solve the problems in the related art, the present disclosure provides an image processing method, apparatus, storage medium, and electronic device.
In order to achieve the above object, according to a first aspect of embodiments of the present disclosure, there is provided an image processing method including:
responding to a received pixel adjustment instruction, and acquiring a target image according to each preset graph on a preset three-dimensional model formed by a plurality of preset graphs and a region to be processed on the preset three-dimensional model; each preset graph corresponds to an original texture image, the pixel adjusting instruction is used for indicating that the pixel value of the original texture image corresponding to the area to be processed is adjusted to be a target pixel value, and the target image is composed of pixels with two different pixel values;
determining a target pixel in each original texture image according to the preset graphs, the target image and each original texture image;
and adjusting the pixel value of the target pixel to the target pixel value.
Optionally, the obtaining a target image according to each preset graph on a preset three-dimensional model composed of a plurality of preset graphs and a to-be-processed region on the preset three-dimensional model includes:
determining at least one intersected preset graph which has intersection with the area to be processed from the preset graphs according to the preset graphs and the area to be processed;
and shooting each intersected preset graph and the area to be processed according to a preset angle to obtain a target image corresponding to each intersected preset graph.
Optionally, the determining, according to the plurality of preset graphs and the to-be-processed region, at least one intersected preset graph having an intersection with the to-be-processed region from the plurality of preset graphs includes:
stretching the region to be processed along a first direction and a second direction respectively to obtain a target space region corresponding to the region to be processed; the first direction is opposite to the second direction, and the first direction is a normal direction of a plane where the area to be processed is located;
and taking the preset graph with intersection with the target space region in the plurality of preset graphs as the intersection preset graph.
Optionally, the determining, according to the preset graphics, the target image, and each original texture image, a target pixel in each original texture image includes:
acquiring vertex position information corresponding to each intersected preset graph; the vertex position information comprises a first vertex position of each vertex of the intersected preset graph on the target image and a second vertex position of each vertex of the intersected preset graph on the original texture image;
and aiming at each original texture image, determining a target pixel in the original texture image according to the pixel position of each pixel in the original texture image and the vertex position information.
Optionally, the determining a target pixel in the original texture image according to the pixel position of each pixel in the original texture image and the vertex position information includes:
for each pixel, determining, according to the pixel position of the pixel, whether one or more target preset graphs corresponding to the pixel exist in the at least one intersected preset graph, and in a case that one or more target preset graphs corresponding to the pixel exist in the at least one intersected preset graph, determining the pixel value corresponding to the pixel in the target image according to the pixel position of the pixel and the vertex position information corresponding to each target preset graph;
and aiming at each pixel, determining the target pixel according to the corresponding pixel value of the pixel in the target image.
Optionally, the target image includes a first pixel with a first pixel value and a second pixel with a second pixel value, where the first pixel is a pixel corresponding to the to-be-processed region in the target image; the determining the target pixel according to the corresponding pixel value of the pixel in the target image includes:
counting a first number of times that the pixel value corresponding to the pixel in the target image is the first pixel value and a second number of times that the pixel value corresponding to the pixel in each target image is the second pixel value;
and if the first number of times is greater than or equal to the second number of times, taking the pixel as the target pixel.
Optionally, the determining a target pixel in the original texture image according to the pixel position of each pixel in the original texture image and the vertex position information further includes:
and in a case that no target preset graph corresponding to the pixel exists in the at least one intersected preset graph, not taking the pixel as the target pixel.
According to a second aspect of the embodiments of the present disclosure, there is provided an image processing apparatus, the apparatus including:
the acquisition module is used for responding to the received pixel adjustment instruction, and acquiring a target image according to each preset graph on a preset three-dimensional model formed by a plurality of preset graphs and a to-be-processed area on the preset three-dimensional model; each preset graph corresponds to an original texture image, the pixel adjusting instruction is used for indicating that the pixel value of the original texture image corresponding to the area to be processed is adjusted to be a target pixel value, and the target image is composed of pixels with two different pixel values;
a determining module, configured to determine a target pixel in each original texture image according to the plurality of preset graphics, the target image, and each original texture image;
and the adjusting module is used for adjusting the pixel value of the target pixel to the target pixel value.
Optionally, the obtaining module includes:
the first determining submodule is used for determining at least one intersected preset graph which has an intersection with the area to be processed from the preset graphs according to the preset graphs and the area to be processed;
and the shooting submodule is used for shooting each intersected preset graph and the area to be processed according to a preset angle to obtain a target image corresponding to each intersected preset graph.
Optionally, the first determining sub-module is configured to:
stretching the area to be processed along a first direction and a second direction respectively to obtain a target space area corresponding to the area to be processed; the first direction is opposite to the second direction, and the first direction is a normal direction of a plane where the area to be processed is located;
and taking the preset graph with intersection with the target space region in the plurality of preset graphs as the intersection preset graph.
Optionally, the determining module includes:
the acquisition submodule is used for acquiring vertex position information corresponding to each intersected preset graph; the vertex position information comprises a first vertex position of each vertex of the intersected preset graph on the target image and a second vertex position of each vertex of the intersected preset graph on the original texture image;
and the second determining submodule is used for determining a target pixel in the original texture image according to the pixel position of each pixel in the original texture image and the vertex position information aiming at each original texture image.
Optionally, the second determining submodule is configured to:
for each pixel, determining, according to the pixel position of the pixel, whether one or more target preset graphs corresponding to the pixel exist in the at least one intersected preset graph, and in a case that one or more target preset graphs corresponding to the pixel exist in the at least one intersected preset graph, determining the pixel value corresponding to the pixel in the target image according to the pixel position of the pixel and the vertex position information corresponding to each target preset graph;
and aiming at each pixel, determining the target pixel according to the corresponding pixel value of the pixel in the target image.
Optionally, the target image includes a first pixel having a first pixel value and a second pixel having a second pixel value, where the first pixel is a pixel corresponding to the to-be-processed region in the target image; the second determination submodule is configured to:
counting a first number of times that the pixel value corresponding to the pixel in the target image is the first pixel value and a second number of times that the pixel value corresponding to the pixel in each target image is the second pixel value;
and if the first number of times is greater than or equal to the second number of times, taking the pixel as the target pixel.
Optionally, the second determining sub-module is further configured to:
and in a case that no target preset graph corresponding to the pixel exists in the at least one intersected preset graph, not taking the pixel as the target pixel.
According to a third aspect of embodiments of the present disclosure, there is provided a computer-readable storage medium having stored thereon a computer program which, when executed by a processor, implements the steps of the method of any one of the implementations of the first aspect above.
According to a fourth aspect of embodiments of the present disclosure, there is provided an electronic apparatus including:
a memory having a computer program stored thereon;
a processor for executing the computer program in the memory to implement the steps of the method of any of the above first aspects.
According to the above technical solution, a target image is first acquired, in response to a received pixel adjustment instruction, according to each preset graph on a preset three-dimensional model composed of a plurality of preset graphs and a region to be processed on the preset three-dimensional model. Each preset graph corresponds to one original texture image; the pixel adjustment instruction instructs that the pixel value of the original texture image corresponding to the region to be processed be adjusted to a target pixel value; and the target image is composed of pixels with two different pixel values. A target pixel is then determined in each original texture image according to the plurality of preset graphs, the target image, and each original texture image, and the pixel value of the target pixel is adjusted to the target pixel value. By combining the preset graphs and the original texture images with the target image obtained from the preset graphs and the region to be processed, the present disclosure can determine the target pixels requiring adjustment in each original texture image and adjust only their pixel values, thereby adjusting the pixels of the original texture image corresponding to the region to be processed while improving the efficiency of the adjustment.
Additional features and advantages of the present disclosure will be set forth in the detailed description which follows.
Drawings
The accompanying drawings, which are included to provide a further understanding of the disclosure and are incorporated in and constitute a part of this specification, illustrate embodiments of the disclosure and together with the description serve to explain the disclosure, but do not constitute a limitation of the disclosure. In the drawings:
FIG. 1 is a flow chart illustrating an image processing method according to an exemplary embodiment;
FIG. 2 is a flow chart of step 101 in the embodiment shown in FIG. 1;
FIG. 3 is a schematic illustration of a target image according to an exemplary embodiment;
FIG. 4 is a flow chart of step 102 in the embodiment shown in FIG. 1;
FIG. 5 is a block diagram illustrating an image processing apparatus according to an exemplary embodiment;
FIG. 6 is a block diagram of the acquisition module in the embodiment shown in FIG. 5;
FIG. 7 is a block diagram of the determining module in the embodiment shown in FIG. 5;
FIG. 8 is a block diagram of an electronic device according to an exemplary embodiment;
FIG. 9 is a block diagram of another electronic device according to an exemplary embodiment.
Detailed Description
The following detailed description of specific embodiments of the present disclosure is provided in connection with the accompanying drawings. It should be understood that the detailed description and specific examples, while indicating the present disclosure, are given by way of illustration and explanation only, not limitation.
It should be noted that all acquisition of signals, information, or data in the present disclosure complies with the applicable data protection regulations of the relevant jurisdiction and is authorized by the owner of the corresponding device.
FIG. 1 is a flow diagram illustrating a method of image processing according to an exemplary embodiment. As shown in fig. 1, the method may include the steps of:
step 101, in response to a received pixel adjustment instruction, acquiring a target image according to each preset graph on a preset three-dimensional model formed by a plurality of preset graphs and a to-be-processed area on the preset three-dimensional model.
Each preset graph corresponds to an original texture image, the pixel adjusting instruction is used for indicating that the pixel value of the original texture image corresponding to the area to be processed is adjusted to be a target pixel value, and the target image is composed of pixels with two different pixel values.
For example, the pixels of the partial region of the original texture images on the three-dimensional model that requires adjustment may be identified and adjusted, so that the pixels of the texture image corresponding to that region can be adjusted without re-rendering the preset three-dimensional model, improving the efficiency of adjusting the pixels of the model's texture images. Specifically, when a user wants to adjust the pixels of the original texture image corresponding to some area on the preset three-dimensional model, the user calibrates on the model a region to be processed (which can be understood as a calibrated planar range on the preset three-dimensional model) and inputs the target pixel value to which the original texture image corresponding to that region is to be adjusted, thereby triggering a pixel adjustment instruction. The preset three-dimensional model is formed by splicing a plurality of preset graphs (the preset graphs may be triangles or other polygons). The original texture images are mapped onto the surface of the preset three-dimensional model in a specific manner; an original texture image can be understood as a two-dimensional image on the surface of the model and may also be called a texture map. Each preset graph corresponds to one original texture image, and the original texture images corresponding to all the preset graphs together form the surface texture of the preset three-dimensional model. In addition, the region to be processed may be calibrated manually by the user, or automatically according to an input condition set by the user (for example, the position of the region to be processed on the preset three-dimensional model).
Because the number of preset graphs forming the preset three-dimensional model is huge and many of them have no intersection with the region to be processed, the preset graphs that do not intersect the region to be processed can be filtered out, and all the preset graphs that do intersect it can be taken as the intersecting preset graphs; this further improves the efficiency of the pixel adjustment. Each intersecting preset graph and the region to be processed can then be photographed separately to obtain a target image corresponding to each intersecting preset graph. The target image may include a first pixel having a first pixel value and a second pixel having a second pixel value (the first pixel is the pixel corresponding to the region to be processed in the target image), and the first pixel value differs from the second pixel value; that is, the target image is composed of pixels with two different pixel values.
Step 102, determining a target pixel in each original texture image according to a plurality of preset graphics, the target image and each original texture image.
Step 103, adjusting the pixel value of the target pixel to the target pixel value.
For example, after the target images are obtained, the first vertex positions of all vertices of each intersecting preset graph on the target image and their second vertex positions on the original texture image may be obtained. Then, for each original texture image, each pixel is traversed in turn: the preset graphs corresponding to the pixel (one pixel may correspond to one or more different preset graphs) are determined from its pixel position, and it is determined whether any of them is an intersecting preset graph. If so (i.e., one or more intersecting preset graphs correspond to the pixel), the target images corresponding to all those intersecting preset graphs are determined (each intersecting preset graph corresponds to one target image). The target pixel position of the pixel on each corresponding target image can then be determined by combining the first vertex positions of each intersecting preset graph with the positional relationship between the pixel position and the second vertex positions of that graph. Next, for each target image corresponding to the pixel, the pixel value at the pixel's target pixel position on that image is obtained; if it is the first pixel value, the pixel is labeled "1" for that image, otherwise "0". Finally, a first count of the "1" labels and a second count of the "0" labels are tallied, and if the first count is greater than or equal to the second count, the pixel is taken as a target pixel. In this manner, the target pixels in each original texture image can be determined.
Finally, the pixel value of the target pixel in each original texture image can be adjusted to be the target pixel value, so as to realize the pixel adjustment of the texture image corresponding to the region to be processed.
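This final write-back is a simple per-pixel assignment. A minimal Python sketch (the texture image is represented as a 2D list of pixel values; the function and parameter names are illustrative, not from the patent):

```python
def adjust_pixels(texture, target_pixels, target_value):
    """Write the target pixel value into every identified target pixel
    of one original texture image.

    texture: 2D list indexed as texture[y][x].
    target_pixels: iterable of (x, y) positions determined previously.
    target_value: the target pixel value from the adjustment instruction.
    """
    for x, y in target_pixels:
        texture[y][x] = target_value
    return texture
```

In practice the same loop would run once per original texture image whose preset graphs intersect the region to be processed.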
In summary, the present disclosure first acquires, in response to a received pixel adjustment instruction, a target image according to each preset graph on a preset three-dimensional model composed of a plurality of preset graphs and a region to be processed on the model. Each preset graph corresponds to one original texture image; the pixel adjustment instruction instructs that the pixel value of the original texture image corresponding to the region to be processed be adjusted to a target pixel value; and the target image is composed of pixels with two different pixel values. The disclosure then determines a target pixel in each original texture image according to the plurality of preset graphs, the target image, and each original texture image, and adjusts the pixel value of the target pixel to the target pixel value. By determining, from the target image, the target pixels requiring adjustment in each original texture image and adjusting only their pixel values, the pixels of the original texture image corresponding to the region to be processed are adjusted while the efficiency of the adjustment is improved.
Fig. 2 is a flow chart of step 101 in the embodiment shown in fig. 1. As shown in fig. 2, step 101 may include the following steps:
step 1011, determining at least one intersected preset graph which has an intersection with the area to be processed from the plurality of preset graphs according to the plurality of preset graphs and the area to be processed.
For example, to further improve the efficiency of the pixel adjustment, the region to be processed may be stretched along a first direction and a second direction, respectively, to obtain a target space region corresponding to the region to be processed. The first direction is opposite to the second direction, and the first direction is the normal direction of the plane where the region to be processed is located. The preset graphs that intersect the target space region are then taken as the intersecting preset graphs. For example, when the preset pattern is a triangle, the region to be processed may be stretched by 1 meter along the normal direction of its plane and by 1 meter in the opposite direction, yielding a cuboid target space region. All triangles may then be traversed to determine whether each lies completely outside the cuboid; triangles completely outside the cuboid are removed, and the remaining triangles are taken as the intersecting triangles.
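The stretching and filtering described above can be sketched as follows. This is a simplified illustration, assuming the region to be processed is an axis-aligned rectangle in the plane z = 0 so that the stretched target space region is an axis-aligned cuboid; the function names are hypothetical, and the rejection test mirrors the "completely outside the cuboid" check:

```python
def stretch_region_to_box(region_min, region_max, normal_axis=2, amount=1.0):
    """Stretch a planar region by `amount` along +/- its normal
    (here taken to be the z axis), yielding the cuboid target space
    region described above."""
    box_min = list(region_min)
    box_max = list(region_max)
    box_min[normal_axis] -= amount
    box_max[normal_axis] += amount
    return tuple(box_min), tuple(box_max)


def filter_intersecting_triangles(triangles, box_min, box_max):
    """Keep triangles that are not completely outside the axis-aligned
    cuboid [box_min, box_max]; a conservative per-axis rejection test.

    triangles: list of triangles, each a list of three (x, y, z) tuples.
    """
    kept = []
    for tri in triangles:
        outside = False
        for axis in range(3):
            # A triangle is certainly outside if all three vertices lie
            # beyond the same face of the box along this axis.
            if all(v[axis] < box_min[axis] for v in tri) or \
               all(v[axis] > box_max[axis] for v in tri):
                outside = True
                break
        if not outside:
            kept.append(tri)
    return kept
```

A per-axis test of this kind may conservatively keep a few non-intersecting triangles near the box corners, which is harmless here because the later per-pixel vote discards pixels that do not map into the region.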
Step 1012, shooting each intersecting preset graph and the area to be processed according to the preset angle to obtain a target image corresponding to each intersecting preset graph.
Specifically, after the intersecting preset patterns are determined, each intersecting preset pattern and the region to be processed may be photographed (or captured via a screenshot) at a preset angle to generate a target image corresponding to each intersecting preset pattern. For example, when the preset pattern is a triangle, the target image may be a black-and-white image (containing only white and black pixel values); as shown in fig. 3, the white area in the black-and-white image is the area corresponding to the region to be processed.
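A minimal sketch of generating such a two-valued target image, assuming (for illustration only) that the projected region to be processed is a rectangle in image coordinates:

```python
def make_target_image(width, height, region):
    """Build a two-valued target image: white (255) inside the region
    to be processed, black (0) elsewhere.

    region: (x0, y0, x1, y1) in image pixel coordinates, a simplified
    rectangular stand-in for the projected region to be processed.
    """
    WHITE, BLACK = 255, 0
    image = [[BLACK] * width for _ in range(height)]
    x0, y0, x1, y1 = region
    for y in range(max(0, y0), min(height, y1)):
        for x in range(max(0, x0), min(width, x1)):
            image[y][x] = WHITE
    return image
```

In a real pipeline this image would come from rendering or screenshotting the intersecting preset graph together with the region, as the description says; the rectangle here only stands in for that projection.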
Fig. 4 is a flow chart of step 102 in the embodiment shown in fig. 1. As shown in fig. 4, step 102 may include the following steps:
step 1021, obtaining vertex position information corresponding to each intersected preset graph. The vertex position information comprises a first vertex position of each vertex of the intersected preset graph on the target image and a second vertex position of each vertex of the intersected preset graph on the original texture image.
For example, after the target images are obtained, vertex position information may be acquired for each intersecting preset graph, including the first vertex position of each of its vertices on the target image and the second vertex position of each of its vertices on the original texture image. That is, the following needs to be saved after the target images are generated: (1) the correspondence between each target image and its intersecting preset graph; (2) the pixel position of each vertex of the intersecting preset graph on the corresponding target image; and (3) the original texture image corresponding to the intersecting preset graph, together with the pixel position of each vertex of the intersecting preset graph on that original texture image. The first vertex position and the second vertex position may be expressed in UV coordinates.
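The three items to be saved can be grouped into one record per intersecting preset graph. The following Python dataclass is an illustrative layout; the field names are assumptions, not from the patent:

```python
from dataclasses import dataclass
from typing import List, Tuple

UV = Tuple[float, float]  # positions expressed as UV coordinates


@dataclass
class IntersectingGraphRecord:
    """Per intersecting preset graph, the saved correspondence:
    (1) which target image it maps to,
    (2) its vertex positions on that target image (first vertex positions),
    (3) which original texture image it belongs to and its vertex
        positions there (second vertex positions)."""
    target_image_id: int
    first_vertex_positions: List[UV]   # on the target image
    texture_image_id: int
    second_vertex_positions: List[UV]  # on the original texture image
```

Keeping both UV lists in one record lets the later per-pixel step map positions between the two images without re-querying the model.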
Step 1022, for each original texture image, determining a target pixel in the original texture image according to the pixel position and vertex position information of each pixel in the original texture image.
For example, after the vertex position information is obtained, for each original texture image it may first be determined, according to the pixel position of each pixel in the image, whether one or more target preset graphs corresponding to the pixel exist in the at least one intersecting preset graph. Specifically, the preset patterns to which the pixel belongs can be determined from its pixel position and taken as the preset patterns corresponding to the pixel; it is then determined whether any of them is an intersecting preset pattern. If at least one of the preset patterns corresponding to the pixel is an intersecting preset pattern, each such pattern is taken as a target preset pattern, and one or more target preset patterns corresponding to the pixel can be determined to exist. Otherwise, it may be determined that no target preset pattern corresponding to the pixel exists.
In a case that one or more target preset graphs corresponding to the pixel exist in the at least one intersecting preset graph, the pixel value corresponding to the pixel in the target image may be determined according to the pixel position of the pixel and the vertex position information corresponding to each target preset graph. Specifically, the target pixel position of the pixel on each corresponding target image may be determined by combining the positional relationship between the pixel position and the second vertex positions of each target preset graph with the first vertex positions of that graph, and the pixel value at that target pixel position may then be obtained. Determining the target pixel position of the pixel on a target image amounts to deducing, from the positional relationship between the pixel and the target preset pattern on the original texture image, where the pixel falls relative to the same pattern's position on the target image.
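The patent does not name a specific method for this deduction, but for triangular preset graphs it corresponds naturally to barycentric interpolation: express the pixel in barycentric coordinates over the triangle's second vertex positions (on the texture image), then evaluate the same coordinates over its first vertex positions (on the target image). An illustrative sketch:

```python
def barycentric(p, a, b, c):
    """Barycentric coordinates of 2D point p in triangle (a, b, c)."""
    (px, py), (ax, ay), (bx, by), (cx, cy) = p, a, b, c
    denom = (by - cy) * (ax - cx) + (cx - bx) * (ay - cy)
    w0 = ((by - cy) * (px - cx) + (cx - bx) * (py - cy)) / denom
    w1 = ((cy - ay) * (px - cx) + (ax - cx) * (py - cy)) / denom
    return w0, w1, 1.0 - w0 - w1


def map_texture_pixel_to_target(pixel_pos, second_vps, first_vps):
    """Carry a pixel position from the original texture image into the
    target image using the triangle's two sets of vertex positions:
    second_vps on the texture image, first_vps on the target image."""
    w0, w1, w2 = barycentric(pixel_pos, *second_vps)
    x = w0 * first_vps[0][0] + w1 * first_vps[1][0] + w2 * first_vps[2][0]
    y = w0 * first_vps[0][1] + w1 * first_vps[1][1] + w2 * first_vps[2][1]
    return (x, y)
```

Because barycentric coordinates are invariant under the affine map between the two triangles, the same weights locate the pixel in both images.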
Then, for each pixel, a target pixel may be determined based on the corresponding pixel value of the pixel in the target image. For example, in a case where the target image includes first pixels having a first pixel value and second pixels having a second pixel value, with the first pixels being the pixels corresponding to the region to be processed in the target image, a first number of times that the pixel value corresponding to the pixel in the target images is the first pixel value and a second number of times that it is the second pixel value may be counted. If the first number is greater than or equal to the second number, the pixel corresponds to the region to be processed and can be taken as a target pixel. Further, in a case where it is determined that no target preset graphic corresponding to the pixel exists among the at least one intersecting preset graphic, the pixel is not taken as a target pixel.
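The voting rule above can be sketched directly: a pixel that maps into several target images becomes a target pixel when it lands on the region to be processed (first pixel value) at least as often as on the background (second pixel value). The concrete values below are illustrative assumptions; the patent only requires two distinct pixel values.

```python
FIRST_VALUE = 255   # assumed value marking the region to be processed
SECOND_VALUE = 0    # assumed background value

def is_target_pixel(sampled_values):
    """Decide whether a texture pixel is a target pixel.

    sampled_values: the pixel values read at this pixel's mapped
    position in each target image it falls into.
    """
    first = sum(1 for v in sampled_values if v == FIRST_VALUE)
    second = sum(1 for v in sampled_values if v == SECOND_VALUE)
    # "Greater than or equal to" per the description, so ties count
    # toward the region to be processed.
    return first >= second
```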
In the above manner, the target pixel in each original texture image can be determined. Finally, the pixel value of the target pixel in each original texture image can be adjusted to be the target pixel value, so as to realize the pixel adjustment of the texture image corresponding to the region to be processed.
In summary, in response to a received pixel adjustment instruction, the present disclosure first obtains a target image according to each preset graphic on a preset three-dimensional model formed by a plurality of preset graphics and a to-be-processed region on the preset three-dimensional model, where each preset graphic corresponds to one original texture image, the pixel adjustment instruction instructs that the pixel value of the original texture image corresponding to the to-be-processed region be adjusted to a target pixel value, and the target image is formed by pixels of two different pixel values. A target pixel in each original texture image is then determined according to the plurality of preset graphics, the target image, and each original texture image, and the pixel value of the target pixel is adjusted to the target pixel value. By combining the preset graphics and the target image obtained from the to-be-processed region with the original texture images, the present disclosure can determine the target pixels requiring pixel adjustment in each original texture image and adjust their pixel values, thereby adjusting the pixels of the original texture images corresponding to the to-be-processed region while improving the efficiency of that adjustment.
Fig. 5 is a block diagram illustrating an image processing apparatus according to an exemplary embodiment. As shown in fig. 5, the image processing apparatus 200 includes:
the obtaining module 201 is configured to, in response to the received pixel adjustment instruction, obtain a target image according to each preset graph on a preset three-dimensional model formed by a plurality of preset graphs and a to-be-processed area on the preset three-dimensional model. Each preset graph corresponds to an original texture image, the pixel adjusting instruction is used for indicating that the pixel value of the original texture image corresponding to the area to be processed is adjusted to be a target pixel value, and the target image is composed of pixels with two different pixel values.
A determining module 202, configured to determine a target pixel in each original texture image according to a plurality of preset graphics, a target image, and each original texture image.
And an adjusting module 203, configured to adjust the pixel value of the target pixel to a target pixel value.
Fig. 6 is a block diagram of an acquisition module shown in the embodiment of fig. 5. As shown in fig. 6, the obtaining module 201 includes:
the first determining sub-module 2011 is configured to determine, according to the multiple preset graphs and the to-be-processed region, at least one intersected preset graph having an intersection with the to-be-processed region from the multiple preset graphs.
The shooting sub-module 2012 is configured to photograph each intersecting preset graphic and the to-be-processed region from a preset angle, so as to obtain a target image corresponding to each intersecting preset graphic.
Optionally, the first determination sub-module 2011 is configured to:
and stretching the area to be processed along the first direction and the second direction respectively to obtain a target space area corresponding to the area to be processed. The first direction is opposite to the second direction, and the first direction is the normal direction of the plane where the area to be processed is located.
And taking the preset graph with intersection with the target space region in the plurality of preset graphs as an intersection preset graph.
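The stretching and intersection steps above can be sketched as follows, under the simplifying assumption that the region to be processed is axis-aligned, so the stretched target space region can be represented as an axis-aligned box, and that a preset graphic (a triangle) counts as intersecting when its bounding box overlaps that box. A production implementation would intersect the exact geometry; all names here are illustrative.

```python
def stretch_region(bounds_min, bounds_max, normal_axis, half_depth):
    """Extrude a planar region along +/- its normal axis by half_depth,
    yielding the target space region as an axis-aligned box."""
    lo, hi = list(bounds_min), list(bounds_max)
    lo[normal_axis] -= half_depth
    hi[normal_axis] += half_depth
    return lo, hi

def intersects(box, triangle):
    """Conservative overlap test between the box and the triangle's AABB."""
    lo, hi = box
    tlo = [min(v[i] for v in triangle) for i in range(3)]
    thi = [max(v[i] for v in triangle) for i in range(3)]
    return all(tlo[i] <= hi[i] and lo[i] <= thi[i] for i in range(3))
```

Preset graphics passing this test would be collected as the intersecting preset graphics handed to the shooting step.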
FIG. 7 is a block diagram of one type of determination module shown in the embodiment of FIG. 5. As shown in fig. 7, the determining module 202 includes:
the obtaining sub-module 2021 is configured to obtain vertex position information corresponding to each intersecting preset graph. The vertex position information comprises a first vertex position of each vertex of the intersected preset graph on the target image and a second vertex position of each vertex of the intersected preset graph on the original texture image.
The second determining sub-module 2022 is configured to determine, for each original texture image, a target pixel in the original texture image according to the pixel position and the vertex position information of each pixel in the original texture image.
Optionally, the second determining sub-module 2022 is configured to:
and determining whether one or more target preset graphs corresponding to the pixel exist in at least one intersected preset graph or not according to the pixel position of each pixel, and determining the pixel value of the pixel in the target image according to the pixel position of the pixel and the vertex position information corresponding to each target preset graph under the condition that one or more target preset graphs corresponding to the pixel exist in at least one intersected preset graph.
And for each pixel, determining a target pixel according to the corresponding pixel value of the pixel in the target image.
Optionally, the target image includes a first pixel whose pixel value is a first pixel value and a second pixel whose pixel value is a second pixel value, where the first pixel is a pixel corresponding to the to-be-processed region in the target image. The second determination submodule 2022 is configured to:
and counting the first times that the pixel value corresponding to the pixel in the target image is the first pixel value and the second times that the pixel value corresponding to the pixel in each target image is the second pixel value.
And if the first times is larger than or equal to the second times, taking the pixel as a target pixel.
Optionally, the second determining sub-module 2022 is further configured to:
and under the condition that one or more target preset graphs corresponding to the pixel do not exist in at least one crossed preset graph, the pixel is not taken as a target pixel.
With regard to the apparatus in the above-described embodiment, the specific manner in which each module performs the operation has been described in detail in the embodiment related to the method, and will not be elaborated here.
In summary, in response to a received pixel adjustment instruction, the present disclosure first obtains a target image according to each preset graphic on a preset three-dimensional model formed by a plurality of preset graphics and a to-be-processed region on the preset three-dimensional model, where each preset graphic corresponds to one original texture image, the pixel adjustment instruction instructs that the pixel value of the original texture image corresponding to the to-be-processed region be adjusted to a target pixel value, and the target image is formed by pixels of two different pixel values. A target pixel in each original texture image is then determined according to the plurality of preset graphics, the target image, and each original texture image, and the pixel value of the target pixel is adjusted to the target pixel value. By combining the preset graphics and the target image obtained from the to-be-processed region with the original texture images, the present disclosure can determine the target pixels requiring pixel adjustment in each original texture image and adjust their pixel values, thereby adjusting the pixels of the original texture images corresponding to the to-be-processed region while improving the efficiency of that adjustment.
Fig. 8 is a block diagram illustrating an electronic device 700 in accordance with an example embodiment. As shown in fig. 8, the electronic device 700 may be provided as a terminal, and the electronic device 700 may include: a first processor 701 and a first memory 702. The electronic device 700 may further include one or more of a multimedia component 703, a first input/output interface 704, and a first communication component 705.
The first processor 701 is configured to control the overall operation of the electronic device 700 so as to complete all or part of the steps of the image processing method described above. The first memory 702 is used to store various types of data to support operation on the electronic device 700, such as instructions for any application or method operating on the electronic device 700 and application-related data, such as contact data, messages, pictures, audio, video, and the like. The first memory 702 may be implemented by any type of volatile or non-volatile memory device or a combination thereof, such as a Static Random Access Memory (SRAM), an Electrically Erasable Programmable Read-Only Memory (EEPROM), an Erasable Programmable Read-Only Memory (EPROM), a Programmable Read-Only Memory (PROM), a Read-Only Memory (ROM), a magnetic memory, a flash memory, a magnetic disk, or an optical disk. The multimedia component 703 may include a screen and an audio component. The screen may be, for example, a touch screen, and the audio component is used for outputting and/or inputting audio signals. For example, the audio component may include a microphone for receiving external audio signals. The received audio signal may further be stored in the first memory 702 or transmitted through the first communication component 705. The audio component also includes at least one speaker for outputting audio signals. The first input/output interface 704 provides an interface between the first processor 701 and other interface modules, such as a keyboard, a mouse, or buttons. These buttons may be virtual buttons or physical buttons. The first communication component 705 is used for wired or wireless communication between the electronic device 700 and other devices. The wireless communication may be, for example, Wi-Fi, Bluetooth, Near Field Communication (NFC), 2G, 3G, 4G, 5G, NB-IoT, eMTC, or the like, or a combination of one or more of them, which is not limited herein. The corresponding first communication component 705 may thus include a Wi-Fi module, a Bluetooth module, an NFC module, and so on.
In an exemplary embodiment, the electronic device 700 may be implemented by one or more Application Specific Integrated Circuits (ASICs), Digital Signal Processors (DSPs), Digital Signal Processing Devices (DSPDs), Programmable Logic Devices (PLDs), Field Programmable Gate Arrays (FPGAs), controllers, microcontrollers, microprocessors, or other electronic components, for performing the image processing method described above.
In another exemplary embodiment, there is also provided a computer readable storage medium including program instructions which, when executed by a processor, implement the steps of the image processing method described above. For example, the computer readable storage medium may be the first memory 702 comprising program instructions executable by the first processor 701 of the electronic device 700 to perform the image processing method described above.
In addition, the electronic device 700 may also be provided as a server. As shown in fig. 9, the electronic device 700 includes a second processor 706, the number of which may be one or more, and a second memory 707 for storing computer programs executable by the second processor 706. The computer program stored in the second memory 707 may include one or more modules each corresponding to a set of instructions. Further, the second processor 706 may be configured to execute the computer program to perform the image processing method described above.
Additionally, the electronic device 700 may also include a power component 708 and a second communication component 709. The power component 708 may be configured to perform power management of the electronic device 700, and the second communication component 709 may be configured to enable communication of the electronic device 700, e.g., wired or wireless communication. In addition, the electronic device 700 may also include a second input/output interface 710. The electronic device 700 may operate based on an operating system stored in the second memory 707, such as Windows Server™, Mac OS X™, Unix™, Linux™, and so on.
In another exemplary embodiment, there is also provided a computer readable storage medium comprising program instructions which, when executed by a processor, implement the steps of the image processing method described above. For example, the non-transitory computer readable storage medium may be the second memory 707 described above that includes program instructions that are executable by the second processor 706 of the electronic device 700 to perform the image processing method described above.
In another exemplary embodiment, a computer program product is also provided, which comprises a computer program executable by a programmable apparatus, the computer program having code portions for performing the image processing method described above when executed by the programmable apparatus.
The preferred embodiments of the present disclosure are described in detail with reference to the accompanying drawings, however, the present disclosure is not limited to the specific details of the above embodiments, and various simple modifications may be made to the technical solution of the present disclosure within the technical idea of the present disclosure, and these simple modifications all belong to the protection scope of the present disclosure.
It should be noted that, in the foregoing embodiments, various features described in the above embodiments may be combined in any suitable manner, and in order to avoid unnecessary repetition, various combinations that are possible in the present disclosure are not described again.
In addition, any combination of various embodiments of the present disclosure may be made, and the same should be considered as the disclosure of the present disclosure as long as it does not depart from the gist of the present disclosure.

Claims (10)

1. An image processing method, characterized in that the method comprises:
responding to a received pixel adjusting instruction, and acquiring a target image according to each preset graph on a preset three-dimensional model formed by a plurality of preset graphs and a to-be-processed area on the preset three-dimensional model; each preset graph corresponds to an original texture image, the pixel adjusting instruction is used for indicating that the pixel value of the original texture image corresponding to the area to be processed is adjusted to be a target pixel value, and the target image is composed of pixels with two different pixel values;
determining a target pixel in each original texture image according to the preset graphs, the target image and each original texture image;
and adjusting the pixel value of the target pixel to the target pixel value.
2. The method according to claim 1, wherein the obtaining a target image according to each preset figure on a preset three-dimensional model composed of a plurality of preset figures and a region to be processed on the preset three-dimensional model comprises:
determining at least one intersected preset graph which has intersection with the area to be processed from the preset graphs according to the preset graphs and the area to be processed;
and shooting each intersected preset graph and the area to be processed according to a preset angle to obtain a target image corresponding to each intersected preset graph.
3. The method according to claim 2, wherein the determining, according to the plurality of preset graphs and the region to be processed, at least one intersecting preset graph having an intersection with the region to be processed from the plurality of preset graphs comprises:
stretching the region to be processed along a first direction and a second direction respectively to obtain a target space region corresponding to the region to be processed; the first direction is opposite to the second direction, and the first direction is a normal direction of a plane where the area to be processed is located;
and taking a preset graph which has an intersection with the target space region in the plurality of preset graphs as the intersected preset graph.
4. The method according to claim 2, wherein determining the target pixel in each of the original texture images according to the plurality of preset graphics, the target image and each of the original texture images comprises:
acquiring vertex position information corresponding to each intersected preset graph; the vertex position information comprises a first vertex position of each vertex of the intersected preset graph on the target image and a second vertex position of each vertex of the intersected preset graph on the original texture image;
and aiming at each original texture image, determining a target pixel in the original texture image according to the pixel position of each pixel in the original texture image and the vertex position information.
5. The method of claim 4, wherein determining the target pixel in the original texture image according to the pixel position of each pixel in the original texture image and the vertex position information comprises:
determining whether one or more target preset graphs corresponding to the pixel exist in the at least one intersected preset graph or not according to the pixel position of each pixel, and determining the pixel value of the pixel in the target image according to the pixel position of the pixel and the vertex position information corresponding to each target preset graph under the condition that one or more target preset graphs corresponding to the pixel exist in the at least one intersected preset graph;
and aiming at each pixel, determining the target pixel according to the corresponding pixel value of the pixel in the target image.
6. The method according to claim 5, wherein the target image comprises a first pixel having a first pixel value and a second pixel having a second pixel value, the first pixel being a corresponding pixel of the region to be processed in the target image; the determining the target pixel according to the corresponding pixel value of the pixel in the target image includes:
counting a first number of times that a pixel value corresponding to the pixel in the target image is the first pixel value and a second number of times that a pixel value corresponding to the pixel in each target image is the second pixel value;
and if the first number of times is greater than or equal to the second number of times, taking the pixel as the target pixel.
7. The method of claim 5, wherein determining the target pixel in the original texture image according to the pixel position of each pixel in the original texture image and the vertex position information further comprises:
and under the condition that one or more target preset graphs corresponding to the pixel do not exist in the at least one intersected preset graph, not taking the pixel as the target pixel.
8. An image processing apparatus, characterized in that the apparatus comprises:
the acquisition module is used for responding to a received pixel adjustment instruction, and acquiring a target image according to each preset graph on a preset three-dimensional model formed by a plurality of preset graphs and a to-be-processed area on the preset three-dimensional model; each preset graph corresponds to an original texture image, the pixel adjusting instruction is used for indicating that the pixel value of the original texture image corresponding to the area to be processed is adjusted to be a target pixel value, and the target image is composed of pixels with two different pixel values;
a determining module, configured to determine a target pixel in each original texture image according to the preset graphics, the target image, and each original texture image;
and the adjusting module is used for adjusting the pixel value of the target pixel to the target pixel value.
9. A non-transitory computer-readable storage medium, on which a computer program is stored, which program, when executed by a processor, performs the steps of the method of any one of claims 1 to 7.
10. An electronic device, comprising:
a memory having a computer program stored thereon;
a processor for executing the computer program in the memory to carry out the steps of the method of any one of claims 1 to 7.
CN202210723158.6A 2022-06-24 2022-06-24 Image processing method, image processing apparatus, storage medium, and electronic device Active CN114782611B (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN202210723158.6A CN114782611B (en) 2022-06-24 2022-06-24 Image processing method, image processing apparatus, storage medium, and electronic device

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
CN202210723158.6A CN114782611B (en) 2022-06-24 2022-06-24 Image processing method, image processing apparatus, storage medium, and electronic device

Publications (2)

Publication Number Publication Date
CN114782611A true CN114782611A (en) 2022-07-22
CN114782611B CN114782611B (en) 2022-09-20

Family

ID=82422309

Family Applications (1)

Application Number Title Priority Date Filing Date
CN202210723158.6A Active CN114782611B (en) 2022-06-24 2022-06-24 Image processing method, image processing apparatus, storage medium, and electronic device

Country Status (1)

Country Link
CN (1) CN114782611B (en)

Cited By (1)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN115830091A (en) * 2023-02-20 2023-03-21 腾讯科技(深圳)有限公司 Texture image generation method, device, equipment, storage medium and product

Citations (7)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN101261681A (en) * 2008-03-31 2008-09-10 北京中星微电子有限公司 Road image extraction method and device in intelligent video monitoring
CN105608239A (en) * 2014-11-24 2016-05-25 富泰华工业(深圳)有限公司 Coordinate measuring machine programming system and method
CN107452046A (en) * 2017-06-30 2017-12-08 百度在线网络技术(北京)有限公司 The Texture Processing Methods and device of D Urban model, equipment and computer-readable recording medium
JP2018032301A (en) * 2016-08-26 2018-03-01 株式会社アクセル Image data processing method in image processing processor and program therefor
CN112307553A (en) * 2020-12-03 2021-02-02 之江实验室 Method for extracting and simplifying three-dimensional road model
WO2021254110A1 (en) * 2020-06-19 2021-12-23 京东方科技集团股份有限公司 Image processing method, apparatus and device, and storage medium
CN114140568A (en) * 2021-10-28 2022-03-04 北京达佳互联信息技术有限公司 Image processing method, image processing device, electronic equipment and storage medium

Patent Citations (7)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN101261681A (en) * 2008-03-31 2008-09-10 北京中星微电子有限公司 Road image extraction method and device in intelligent video monitoring
CN105608239A (en) * 2014-11-24 2016-05-25 富泰华工业(深圳)有限公司 Coordinate measuring machine programming system and method
JP2018032301A (en) * 2016-08-26 2018-03-01 株式会社アクセル Image data processing method in image processing processor and program therefor
CN107452046A (en) * 2017-06-30 2017-12-08 百度在线网络技术(北京)有限公司 The Texture Processing Methods and device of D Urban model, equipment and computer-readable recording medium
WO2021254110A1 (en) * 2020-06-19 2021-12-23 京东方科技集团股份有限公司 Image processing method, apparatus and device, and storage medium
CN112307553A (en) * 2020-12-03 2021-02-02 之江实验室 Method for extracting and simplifying three-dimensional road model
CN114140568A (en) * 2021-10-28 2022-03-04 北京达佳互联信息技术有限公司 Image processing method, image processing device, electronic equipment and storage medium

Non-Patent Citations (1)

* Cited by examiner, † Cited by third party
Title
DAI Luping et al.: "Moving object shadow suppression method combining local binary pattern features", Journal of Huazhong University of Science and Technology (Natural Science Edition) *

Cited By (1)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN115830091A (en) * 2023-02-20 2023-03-21 腾讯科技(深圳)有限公司 Texture image generation method, device, equipment, storage medium and product

Also Published As

Publication number Publication date
CN114782611B (en) 2022-09-20

Similar Documents

Publication Publication Date Title
CN109242961B (en) Face modeling method and device, electronic equipment and computer readable medium
US11330172B2 (en) Panoramic image generating method and apparatus
US20170186219A1 (en) Method for 360-degree panoramic display, display module and mobile terminal
CN110163831B (en) Method and device for dynamically displaying object of three-dimensional virtual sand table and terminal equipment
CN111243049A (en) Face image processing method and device, readable medium and electronic equipment
CN112017133B (en) Image display method and device and electronic equipment
CN114782611B (en) Image processing method, image processing apparatus, storage medium, and electronic device
CN113436338A (en) Three-dimensional reconstruction method and device for fire scene, server and readable storage medium
CN113643414A (en) Three-dimensional image generation method and device, electronic equipment and storage medium
JP7262530B2 (en) Location information generation method, related device and computer program product
CN116485969A (en) Voxel object generation method, voxel object generation device and computer-readable storage medium
CN112308766B (en) Image data display method and device, electronic equipment and storage medium
CN111292414B (en) Method and device for generating three-dimensional image of object, storage medium and electronic equipment
CN114119831A (en) Snow accumulation model rendering method and device, electronic equipment and readable medium
JP6967150B2 (en) Learning device, image generator, learning method, image generation method and program
CN112184543B (en) Data display method and device for fisheye camera
CN112837375B (en) Method and system for camera positioning inside real space
CN114782614B (en) Model rendering method and device, storage medium and electronic equipment
CN114782616B (en) Model processing method and device, storage medium and electronic equipment
CN114520903B (en) Rendering display method, rendering display device, electronic equipment and storage medium
CN111311491B (en) Image processing method and device, storage medium and electronic equipment
CN113763530B (en) Image processing method, device, computing equipment and storage medium
US20230140932A1 (en) Method and device for outputting an image
CN116596994A (en) Target pose determining method, device, equipment and storage medium based on trinocular vision
CN115049729A (en) Method, device and equipment for determining welding pose of part and storage medium

Legal Events

Date Code Title Description
PB01 Publication
PB01 Publication
SE01 Entry into force of request for substantive examination
SE01 Entry into force of request for substantive examination
GR01 Patent grant
GR01 Patent grant
CP03 Change of name, title or address

Address after: Room Y579, 3rd Floor, Building 3, No. 9 Keyuan Road, Daxing District Economic Development Zone, Beijing 102600

Patentee after: Beijing Feidu Technology Co.,Ltd.

Address before: 100162 608, floor 6, building 1, courtyard 15, Xinya street, Daxing District, Beijing

Patentee before: Beijing Feidu Technology Co.,Ltd.

CP03 Change of name, title or address