CN113055579B - Image processing method and device, electronic equipment and readable storage medium - Google Patents

Image processing method and device, electronic equipment and readable storage medium

Info

Publication number
CN113055579B
CN113055579B (application number CN201911363993.8A)
Authority
CN
China
Prior art keywords
image
area
processed
scene
specular reflection
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Active
Application number
CN201911363993.8A
Other languages
Chinese (zh)
Other versions
CN113055579A (en)
Inventor
常文涛
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Oneplus Technology Shenzhen Co Ltd
Original Assignee
Oneplus Technology Shenzhen Co Ltd
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Oneplus Technology Shenzhen Co Ltd filed Critical Oneplus Technology Shenzhen Co Ltd
Priority to CN201911363993.8A priority Critical patent/CN113055579B/en
Priority to PCT/CN2020/139403 priority patent/WO2021129806A1/en
Publication of CN113055579A publication Critical patent/CN113055579A/en
Application granted granted Critical
Publication of CN113055579B publication Critical patent/CN113055579B/en
Active legal-status Critical Current
Anticipated expiration legal-status Critical

Classifications

    • G06T 5/94
    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06V IMAGE OR VIDEO RECOGNITION OR UNDERSTANDING
    • G06V 10/00 Arrangements for image or video recognition or understanding
    • G06V 10/10 Image acquisition
    • H ELECTRICITY
    • H04 ELECTRIC COMMUNICATION TECHNIQUE
    • H04N PICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N 23/00 Cameras or camera modules comprising electronic image sensors; Control thereof
    • H04N 23/50 Constructional details
    • H04N 23/55 Optical parts specially adapted for electronic image sensors; Mounting thereof
    • H04N 23/80 Camera processing pipelines; Components thereof

Abstract

The application provides an image processing method, an image processing device, an electronic device and a readable storage medium, and relates to the technical field of image processing. The obtained image to be processed is detected to determine a specular reflection area in it, and the pattern in the specular reflection area is then replaced or corrected according to a reference image so as to change that pattern. Patterns that the user does not want to appear in the image can thus be removed, improving image quality and user experience.

Description

Image processing method and device, electronic equipment and readable storage medium
Technical Field
The present application relates to the field of image processing technologies, and in particular, to an image processing method and apparatus, an electronic device, and a readable storage medium.
Background
With the continuous progress of technology, more and more devices have a camera function. When a user shoots, if an object with mirror reflection exists in the shooting range of the camera, the object can reflect a pattern, so that the pattern reflected by the object is included in the image obtained by the camera.
The pattern reflected by the object will typically include patterns that the user does not wish to be present in the image. For example, a user wearing sunglasses may use a smart phone to take a self-portrait, and the phone itself may then be visible in the acquired image, reflected in the sunglasses.
Disclosure of Invention
In view of the above, an object of the present application is to provide an image processing method, an apparatus, an electronic device and a readable storage medium.
In order to achieve the above purpose, the embodiments of the present application employ the following technical solutions:
in a first aspect, an embodiment of the present application provides an image processing method, which is applied to an electronic device, and the method includes:
obtaining an image to be processed;
detecting the image to be processed, and determining a specular reflection area in the image to be processed;
and carrying out replacement or correction processing on the pattern in the specular reflection area according to the reference image so as to change the pattern in the specular reflection area.
In an optional embodiment, the image to be processed is obtained based on a first shooting direction, and the detecting the image to be processed and determining a specular reflection area in the image to be processed includes:
obtaining a scene image, wherein the scene image is obtained by shooting in a second shooting direction in a target environment in which the image to be processed is obtained, and the relationship between the second shooting direction and the first shooting direction is the same as the relationship between incident light and reflected light;
carrying out scene matching on the scene image and the image to be processed;
and if the area similar to the scene of the scene image exists in the image to be processed, taking the area as the specular reflection area.
In an optional embodiment, an object recognition model is stored in the electronic device, and the detecting the image to be processed and determining the specular reflection area in the image to be processed includes:
performing object recognition on the image to be processed through the object recognition model to determine whether the image to be processed includes an object with specular reflection;
and if so, taking the area where the object with the specular reflection exists as the specular reflection area.
In an optional embodiment, the detecting the image to be processed and determining a specular reflection area in the image to be processed includes:
and receiving selection operation input by a user, and taking an area corresponding to the selection operation as the specular reflection area.
In an optional embodiment, the replacing or correcting the pattern in the specular reflection area according to the reference image to change the pattern in the specular reflection area includes:
replacing the pattern in the specular reflection area with a pattern in the reference image; or alternatively,
according to the matching result of the pattern in the specular reflection area and the pattern in the scene area, obtaining a first area in the specular reflection area, the pattern of which is different from the pattern of the scene area, and a second area in the scene area, the pattern of which is different from the pattern in the specular reflection area, and replacing the pattern in the first area with the pattern in the second area, wherein the scene area is an area in which a scene image is similar to the scene of the specular reflection area, the image to be processed is obtained based on a first shooting direction, the scene image is shot in a second shooting direction in a target environment in which the image to be processed is obtained, and the relationship between the second shooting direction and the first shooting direction is the same as the relationship between incident light and reflected light.
In a second aspect, an embodiment of the present application provides an image processing apparatus, which is applied to an electronic device, and the apparatus includes:
the acquisition module is used for acquiring an image to be processed;
the detection module is used for detecting the image to be processed and determining a specular reflection area in the image to be processed;
and the processing module is used for replacing or correcting the pattern in the specular reflection area according to the reference image so as to change the pattern in the specular reflection area.
In an optional embodiment, the image to be processed is obtained based on a first shooting direction, and the detection module is specifically configured to:
obtaining a scene image, wherein the scene image is obtained by shooting in a second shooting direction in a target environment in which the image to be processed is obtained, and the relationship between the second shooting direction and the first shooting direction is the same as the relationship between incident light and reflected light;
carrying out scene matching on the scene image and the image to be processed;
and if the area similar to the scene of the scene image exists in the image to be processed, taking the area as the specular reflection area.
In an optional embodiment, an object recognition model is stored in the electronic device, and the detection module is specifically configured to:
performing object recognition on the image to be processed through the object recognition model to determine whether the image to be processed includes an object with specular reflection;
and if so, taking the area where the object with the specular reflection exists as the specular reflection area.
In an optional embodiment, the detection module is specifically configured to:
and receiving selection operation input by a user, and taking an area corresponding to the selection operation as the specular reflection area.
In an optional embodiment, the processing module is specifically configured to:
replacing the pattern in the specular reflection area with a pattern in the reference image; or alternatively,
according to the matching result of the pattern in the specular reflection area and the pattern in the scene area, obtaining a first area in the specular reflection area, the pattern of which is different from the pattern of the scene area, and a second area in the scene area, the pattern of which is different from the pattern in the specular reflection area, and replacing the pattern in the first area with the pattern in the second area, wherein the scene area is an area in which a scene image is similar to the scene of the specular reflection area, the image to be processed is obtained based on a first shooting direction, the scene image is shot in a second shooting direction in a target environment in which the image to be processed is obtained, and the relationship between the second shooting direction and the first shooting direction is the same as the relationship between incident light and reflected light.
In a third aspect, an embodiment of the present application provides an electronic device, including a processor and a memory, where the memory stores machine executable instructions that can be executed by the processor, and the processor can execute the machine executable instructions to implement the image processing method described in any one of the foregoing embodiments.
In a fourth aspect, the present application provides a readable storage medium, on which a computer program is stored, and the computer program, when executed by a processor, implements the image processing method according to any one of the foregoing embodiments.
The image processing method, the image processing device, the electronic device and the readable storage medium provided by the embodiment of the application detect the obtained image to be processed, determine the specular reflection area in the image to be processed, and then replace or correct the pattern in the specular reflection area according to the reference image so as to change the pattern in the specular reflection area. Therefore, patterns which are not expected to appear in the image by the user can be removed, so that the image quality is improved, and the user experience is improved.
In order to make the aforementioned objects, features and advantages of the present application more comprehensible, preferred embodiments accompanied with figures are described in detail below.
Drawings
In order to more clearly illustrate the technical solutions of the embodiments of the present application, the drawings required in the embodiments are briefly described below. It should be understood that the following drawings illustrate only some embodiments of the present application and therefore should not be considered as limiting the scope; for those skilled in the art, other related drawings can be obtained from these drawings without inventive effort.
Fig. 1 is a block schematic diagram of an electronic device provided in an embodiment of the present application;
FIG. 2 is a schematic flowchart of an image processing method provided in an embodiment of the present application;
FIG. 3 is a schematic flow chart of the substeps involved in step S120 of FIG. 2;
fig. 4 is a schematic diagram illustrating a relationship between a first shooting direction and a second shooting direction provided in an embodiment of the present application;
fig. 5 is a block diagram of an image processing apparatus according to an embodiment of the present application.
Reference numerals: 100-an electronic device; 110-a memory; 120-a processor; 130-a communication unit; 200-an image processing apparatus; 210-an obtaining module; 220-a detection module; 230-a processing module.
Detailed Description
The technical solutions in the embodiments of the present application will be clearly and completely described below with reference to the drawings in the embodiments of the present application, and it is obvious that the described embodiments are only a part of the embodiments of the present application, and not all of the embodiments. The components of the embodiments of the present application, generally described and illustrated in the figures herein, can be arranged and designed in a wide variety of different configurations.
Thus, the following detailed description of the embodiments of the present application, presented in the accompanying drawings, is not intended to limit the scope of the claimed application, but is merely representative of selected embodiments of the application. All other embodiments, which can be derived by a person skilled in the art from the embodiments of the present application without making any creative effort, shall fall within the protection scope of the present application.
It is noted that relational terms such as "first" and "second," and the like, may be used solely to distinguish one entity or action from another entity or action without necessarily requiring or implying any actual such relationship or order between such entities or actions. Also, the terms "comprises," "comprising," or any other variation thereof, are intended to cover a non-exclusive inclusion, such that a process, method, article, or apparatus that comprises a list of elements does not include only those elements but may include other elements not expressly listed or inherent to such process, method, article, or apparatus. Without further limitation, an element defined by the phrase "comprising a ..." does not exclude the presence of other identical elements in a process, method, article, or apparatus that comprises the element.
Referring to fig. 1, fig. 1 is a block diagram of an electronic device 100 according to an embodiment of the present disclosure. The electronic device 100 may be, but is not limited to, a smart phone, a camera, a computer, etc. The electronic device 100 may include a memory 110, a processor 120, and a communication unit 130. The elements of the memory 110, the processor 120 and the communication unit 130 are electrically connected to each other directly or indirectly to realize data transmission or interaction. For example, the components may be electrically connected to each other via one or more communication buses or signal lines.
The memory 110 is used to store programs or data. The Memory 110 may be, but is not limited to, a Random Access Memory (RAM), a Read-Only Memory (ROM), a Programmable Read-Only Memory (PROM), an Erasable Programmable Read-Only Memory (EPROM), an Electrically Erasable Programmable Read-Only Memory (EEPROM), and the like.
The processor 120 is used to read/write data or programs stored in the memory 110 and perform corresponding functions. For example, the memory 110 stores the image processing apparatus 200, which includes at least one software functional module that can be stored in the memory 110 in the form of software or firmware. The processor 120 implements various functional applications and data processing, i.e., the image processing method in the embodiment of the present application, by running the software programs and modules stored in the memory 110, such as the image processing apparatus 200 in the embodiment of the present application.
The communication unit 130 is used for establishing a communication connection between the electronic apparatus 100 and another communication terminal via a network, and for transceiving data via the network.
It should be understood that the structure shown in fig. 1 is only a schematic structural diagram of the electronic device 100, and the electronic device 100 may also include more or fewer components than shown in fig. 1, or have a different configuration than shown in fig. 1. The components shown in fig. 1 may be implemented in hardware, software, or a combination thereof.
Referring to fig. 2, fig. 2 is a flowchart illustrating an image processing method according to an embodiment of the present disclosure. The method is applied to the electronic device 100. The following describes a specific flow of the image processing method in detail.
Step S110, an image to be processed is obtained.
Step S120, detecting the image to be processed, and determining a specular reflection area in the image to be processed.
Step S130, performing replacement or correction processing on the pattern in the specular reflection area according to the reference image to change the pattern in the specular reflection area.
In the present embodiment, after the specular reflection area in the image to be processed is determined by detection, the pattern in the specular reflection area is subjected to replacement or correction processing according to the reference image to complete the modification of the pattern in the specular reflection area. Therefore, patterns which are not expected to appear in the image by the user can be removed, so that the image quality is improved, and the user experience is improved.
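Purely for illustration, the three steps can be organized as in the following Python sketch, in which the detect and correct callables are hypothetical placeholders for whichever detection and correction schemes described below are chosen; none of the names here are prescribed by this application:

    from typing import Callable, Optional, Tuple

    import numpy as np

    # A specular reflection area is described here by its bounding box (x, y, w, h).
    Region = Tuple[int, int, int, int]

    def process_image(image: np.ndarray,
                      reference: np.ndarray,
                      detect: Callable[[np.ndarray], Optional[Region]],
                      correct: Callable[[np.ndarray, Region, np.ndarray], np.ndarray],
                      ) -> np.ndarray:
        """Steps S110 to S130: detect the specular reflection area, then let the
        supplied corrector replace or correct the pattern inside it."""
        region = detect(image)                     # step S120
        if region is None:
            return image                           # no specular reflection area found
        return correct(image, region, reference)   # step S130

Concrete candidates for detect and correct are sketched alongside the corresponding paragraphs below.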
Optionally, in this embodiment, the electronic device 100 may include a first camera, and the image to be processed is obtained by shooting with the first camera. Alternatively, the electronic device 100 may obtain the image to be processed by receiving an image sent by another device. Of course, it is understood that the electronic device 100 may obtain the image to be processed in other manners.
In one implementation of this embodiment, the specular reflection area may be obtained in the following manner. Referring to fig. 3, fig. 3 is a flowchart illustrating sub-steps included in step S120 in fig. 2. Step S120 may include substeps S121 through substep S123.
In substep S121, a scene image is obtained.
And a substep S122, performing scene matching on the scene image and the image to be processed.
And a substep S123, if there is a region similar to the scene of the scene image in the image to be processed, taking the region as the specular reflection region.
In this embodiment, the image to be processed is obtained based on a first shooting direction, the scene image is obtained by shooting in a second shooting direction in the target environment in which the image to be processed is obtained, and the relationship between the second shooting direction and the first shooting direction is the same as the relationship between incident light and reflected light. When light strikes a reflecting surface, it is reflected back into the original medium; the incident light and the reflected light lie on the two sides of the normal of the reflecting surface, and the angle of incidence equals the angle of reflection. That is, when the image to be processed shot in the first shooting direction includes the specular reflection area, the scene image is shot in the second shooting direction. As shown in fig. 4, the first shooting direction and the second shooting direction are located on the two sides of the normal of the reflection surface, and the included angle 1 between the first shooting direction and the normal is the same as or close to the included angle 2 between the second shooting direction and the normal. The reflection surface in fig. 4 represents the actual reflection surface corresponding to the specular reflection area in the image to be processed. For example, if the lens of a pair of sunglasses is included in the image to be processed, the reflection surface in fig. 4 represents the reflecting surface of that lens. It is understood that the second shooting direction is opposite to the first shooting direction if the first shooting direction is perpendicular to the reflection surface.
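As a minimal illustration of this geometric relationship (a sketch only; the application does not prescribe how the second shooting direction is computed), the second shooting direction can be obtained by mirroring the first shooting direction about the normal of the reflecting surface:

    import numpy as np

    def second_shooting_direction(first_dir: np.ndarray,
                                  surface_normal: np.ndarray) -> np.ndarray:
        """Mirror the first shooting direction about the normal of the reflecting
        surface (law of reflection: angle of incidence equals angle of reflection).
        Inputs are 3-vectors; the result is the direction in which the scene image
        would be captured."""
        d = first_dir / np.linalg.norm(first_dir)
        n = surface_normal / np.linalg.norm(surface_normal)
        return d - 2.0 * np.dot(d, n) * n

    # Shooting straight at the reflecting surface yields the exact opposite
    # direction, matching the perpendicular case described above.
    normal = np.array([0.0, 0.0, 1.0])
    first = np.array([0.0, 0.0, -1.0])
    print(second_shooting_direction(first, normal))   # [0. 0. 1.]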
Optionally, the electronic device 100 or the device that obtains the to-be-processed image through shooting may include a first camera and a second camera that have opposite shooting directions, and after the to-be-processed image is obtained by the first camera, the scene image may be obtained by the second camera by adjusting the shooting direction of the second camera.
For example, the electronic device 100 includes a front camera and a rear camera. When the user takes a self-portrait through the front camera, the image obtained through the front camera can be used as the image to be processed, and the shooting direction of the rear camera is adjusted so that an image that can be used as the scene image is obtained in the second shooting direction.
Alternatively, after the image to be processed is obtained, the camera that obtained it may be adjusted from the first shooting direction to the second shooting direction and shooting performed again, and the image obtained by shooting again is taken as the scene image.
After the scene image is obtained, scene matching can be performed between the scene image and the image to be processed to obtain a scene similarity, and whether the image to be processed includes a region similar to the scene image is judged according to that similarity. If the scene similarity is not greater than the preset similarity, it is determined that the image to be processed does not include an object with specular reflection. If the scene similarity is greater than the preset similarity, it is determined that the image to be processed includes a region similar to the scene image, and the region of the image to be processed that is similar to the scene in the scene image is taken as the specular reflection area.
Scene matching can be achieved through feature matching between the images. For example, if a cuboid building exists in a certain region of the image to be processed and a cuboid building also exists in the scene image, the region of the image to be processed in which the cuboid building appears may be determined to be a specular reflection area.
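One possible realization of such feature-based scene matching is sketched below using OpenCV's ORB features; the feature type, the distance cutoff and the preset similarity value are assumptions for illustration, not values given by this application:

    import cv2
    import numpy as np

    PRESET_SIMILARITY = 0.3   # assumed threshold; the application leaves it unspecified

    def scene_similarity(scene_image: np.ndarray, candidate_region: np.ndarray) -> float:
        """Score how similar a candidate region of the image to be processed is to
        the scene image, using ORB feature matching: the fraction of scene keypoints
        that find a close match in the candidate region."""
        orb = cv2.ORB_create(nfeatures=500)
        gray_scene = cv2.cvtColor(scene_image, cv2.COLOR_BGR2GRAY)
        gray_cand = cv2.cvtColor(candidate_region, cv2.COLOR_BGR2GRAY)
        kp1, des1 = orb.detectAndCompute(gray_scene, None)
        kp2, des2 = orb.detectAndCompute(gray_cand, None)
        if des1 is None or des2 is None or len(kp1) == 0:
            return 0.0
        matcher = cv2.BFMatcher(cv2.NORM_HAMMING, crossCheck=True)
        matches = matcher.match(des1, des2)
        good = [m for m in matches if m.distance < 50]   # assumed distance cutoff
        return len(good) / len(kp1)

    # A region would be treated as the specular reflection area only when
    # scene_similarity(...) exceeds PRESET_SIMILARITY.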
Optionally, in this embodiment, if the second shooting direction can be directly determined, an angle movement prompt message is displayed according to the second shooting direction and the current shooting direction, so that the user directly adjusts the shooting direction to the second shooting direction according to the information.
Optionally, in this embodiment, if the second shooting direction cannot be directly determined, then after the image to be processed is obtained in the first shooting direction, the user may adjust the direction according to the actual scene corresponding to the pattern in the specular reflection area of the image to be processed, and an image is obtained in the adjusted direction. The angle movement prompt information is then displayed according to the scene similarity between the newly obtained image and the image to be processed. This adjustment is repeated so that the user gradually brings the shooting direction to the second shooting direction and the scene image is obtained. When the scene similarity is greater than the preset similarity, it can be determined that the image corresponding to that similarity is the scene image obtained in the second shooting direction, and that the corresponding area in the image to be processed is the specular reflection area.
For example, if the scene similarity between the newly obtained image and the image to be processed is lower than the preset similarity, but part of the scene of the newly obtained image matches part of the scene of the image to be processed, the missing part may be determined and the user prompted to move accordingly. For example, if some patterns in the image to be processed are the same as the patterns on the left side of the newly obtained image but the current scene similarity is still below the preset similarity, it may be determined that the scene on the right is missing, and the user is prompted to move the shooting direction to the right; if the scene is missing on all sides, the user is prompted to move back a little. Of course, the user may also determine which part is missing (for example, the right-side scene or the left-side scene), and the electronic device 100 then prompts the user with a specific movement (for example, to the right or to the left) according to the missing part determined by the user, so that the shooting direction is adjusted toward the second shooting direction and the scene image is obtained.
Optionally, if no scene image whose scene similarity is greater than the preset similarity is obtained after a preset number of adjustments or after adjusting through a preset angle, one of the obtained images may be used directly as the scene image, and it is determined that the image to be processed does not include a specular reflection area.
Optionally, in this embodiment, if the second shooting direction cannot be directly determined, the user may adjust the direction according to the actual scene corresponding to the pattern in the specular reflection area of the image to be processed and obtain multiple images at different angles, which are then combined into one image covering the entire scene, that is, an image that includes the scene image. Alternatively, when the missing part is determined (for example, when the scene is determined to be missing on all sides), prompt information may be displayed to prompt the user to shoot a plurality of images at different angles according to the actual scene, so that an image including the scene image can be synthesized and the specular reflection area can then be determined through scene matching between that image and the image to be processed.
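One way to synthesize several partial views into an image that covers the whole scene is panorama stitching. The sketch below uses OpenCV's stitching module purely as an example, since this application does not specify the synthesis method:

    import cv2

    def compose_scene_image(partial_views):
        """Stitch several images taken at different angles into one image that
        covers the whole reflected scene. Returns None if stitching fails, in
        which case the user could be prompted to reshoot."""
        stitcher = cv2.Stitcher_create()
        status, panorama = stitcher.stitch(partial_views)
        return panorama if status == cv2.Stitcher_OK else None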
In another implementation manner of this embodiment, model training is performed through machine learning on common objects with specular reflection (e.g., sunglasses, pupils) to obtain a trained object recognition model, which is stored in the electronic device 100. The electronic device 100 may perform object recognition on the image to be processed through the object recognition model. If no object with specular reflection is identified, it can be determined that the image to be processed does not include such an object. If an object with specular reflection is identified, the object in the image to be processed can be determined, and the area where it is located is used as the specular reflection area.
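The sketch below illustrates how the output of such a trained object recognition model might be consumed; the model itself is passed in by the caller, and the label set and score threshold are assumptions rather than details given by this application:

    from typing import Callable, List, Optional, Tuple

    import numpy as np

    # label, confidence score, bounding box (x, y, w, h)
    Detection = Tuple[str, float, Tuple[int, int, int, int]]

    SPECULAR_LABELS = {"sunglasses", "pupil", "mirror"}   # assumed label set

    def find_specular_object(image: np.ndarray,
                             recognize: Callable[[np.ndarray], List[Detection]],
                             min_score: float = 0.5,
                             ) -> Optional[Tuple[int, int, int, int]]:
        """Run the trained object recognition model (passed in as recognize) and
        return the area of the first recognized object with specular reflection,
        or None when the image to be processed contains no such object."""
        for label, score, box in recognize(image):
            if label in SPECULAR_LABELS and score >= min_score:
                return box
        return None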
In another implementation manner of this embodiment, the electronic device 100 may further receive a selection operation input by a user for delineating an area, and use an area corresponding to the selection operation as the specular reflection area.
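On a desktop prototype, such a selection operation could be captured with OpenCV's built-in ROI selector, as in the sketch below; on a phone, a touch gesture would play the same role. This is illustrative only and not a mechanism described by this application:

    import cv2

    def select_specular_region(image):
        """Let the user delineate the specular reflection area by dragging a box;
        returns (x, y, w, h), or None if the selection is cancelled."""
        window = "select specular reflection area"
        # positional arguments: showCrosshair=True, fromCenter=False
        x, y, w, h = cv2.selectROI(window, image, True, False)
        cv2.destroyWindow(window)
        return None if w == 0 or h == 0 else (int(x), int(y), int(w), int(h))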
In another implementation manner of this embodiment, the electronic device 100 may further determine the specular reflection area through at least two of the above three detection schemes. For example, if both a specular reflection area obtained through scene matching or through the object recognition model and an area corresponding to a user's selection operation are obtained, the area corresponding to the user's selection operation is used as the specular reflection area of the image to be processed.
If specular reflection areas are determined separately through scene matching and through the object recognition model, and the areas obtained in the two ways differ, the user can be prompted to input a selection operation, and the area designated by the user's selection operation is used as the specular reflection area in the image to be processed.
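A compact way to express this arbitration between the detection schemes (an illustrative sketch only; the function and parameter names are not taken from this application):

    from typing import Callable, Optional, Tuple

    Region = Tuple[int, int, int, int]

    def resolve_specular_region(user_region: Optional[Region],
                                scene_match_region: Optional[Region],
                                model_region: Optional[Region],
                                prompt_user: Callable[[], Optional[Region]],
                                ) -> Optional[Region]:
        """Arbitrate between detection schemes: an area selected by the user always
        takes precedence; if scene matching and the object recognition model
        disagree, prompt the user to select an area."""
        if user_region is not None:
            return user_region
        if scene_match_region == model_region:
            return scene_match_region
        return prompt_user()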
After the specular reflection area is determined, the pattern in the specular reflection area may be replaced with the pattern in the reference image, thereby removing the pattern in the specular reflection area that the user does not want displayed. Optionally, the reference image may be the scene image, or an image completely unrelated to the target environment in which the image to be processed is obtained.
For example, suppose a user wearing sunglasses uses a smart phone to take a self-portrait in front of four logos A, B, C and D. Under normal conditions (with nothing blocking the view between the sunglasses and the logos), all four logos would be reflected by the sunglasses; because the phone blocks part of the view, the patterns reflected on the sunglasses are the phone and the logos C and D, that is, the pattern in the specular reflection area of the obtained image to be processed consists of the phone and the logos C and D. If the reference image is a scene image containing the four logos A, B, C and D, the scene image can be scaled down as required and the pattern in the specular reflection area completely replaced according to the scaled-down scene image, so that after replacement the pattern in the specular reflection area is the four logos A, B, C and D. The reference image may also be an image of, say, a mountain or the sea, i.e. unrelated to the four logos, and the pattern in the specular reflection area is then entirely replaced in the same way. In either case, the phone pattern in the image to be processed is removed.
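A minimal sketch of this full-replacement scheme, assuming the specular reflection area is given as an axis-aligned box (x, y, w, h); it could serve as the correct callable in the pipeline sketch shown earlier:

    import cv2
    import numpy as np

    def replace_with_reference(image: np.ndarray,
                               region: tuple,
                               reference: np.ndarray) -> np.ndarray:
        """Full replacement: scale the reference image (the scene image, or an
        unrelated picture such as a mountain or the sea) to the size of the
        specular reflection area and paste it over that area."""
        x, y, w, h = region
        out = image.copy()
        out[y:y + h, x:x + w] = cv2.resize(reference, (w, h),
                                           interpolation=cv2.INTER_AREA)
        return out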
After the specular reflection area is determined, an area of the scene image similar to the scene of the specular reflection area may be used as a scene area. The scene area may be the whole scene image or a part of it, depending on the actual situation. The pattern in the specular reflection area is then matched with the pattern in the scene area to obtain a first area of the specular reflection area whose pattern differs from that of the scene area, and a second area of the scene area whose pattern differs from that of the specular reflection area. The pattern in the first area is then replaced with the pattern in the second area. Thus, the original pattern in the first area of the specular reflection area can be removed.
Continuing the previous example (a self-portrait taken in front of the logos A, B, C and D while wearing sunglasses, with the phone and the logos C and D reflected on the sunglasses): by matching the specular reflection area against the pattern of the scene area, the area of the specular reflection area where the phone appears can be determined to be the first area, and the area of the scene area where the logos A and B appear to be the second area. After the pattern in the first area is replaced with the pattern in the second area, the first area of the image to be processed no longer shows the phone but the logos A and B.
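The partial-correction scheme could be sketched as follows; a per-pixel absolute-difference mask stands in for the pattern matching between the specular reflection area and the scene area, and the threshold value is an assumption:

    import cv2
    import numpy as np

    def correct_differing_pattern(image: np.ndarray,
                                  region: tuple,
                                  scene_area: np.ndarray,
                                  diff_threshold: int = 40) -> np.ndarray:
        """Partial correction: keep the pixels of the specular reflection area that
        already agree with the scene area (the logos C and D in the example) and
        overwrite only the differing pixels (the reflected phone, i.e. the first
        area) with the scene area's content (the logos A and B, i.e. the second
        area)."""
        x, y, w, h = region
        out = image.copy()
        specular = out[y:y + h, x:x + w].copy()
        scene = cv2.resize(scene_area, (w, h))

        # Pixels where the two patterns disagree form the first/second areas.
        diff = cv2.absdiff(cv2.cvtColor(specular, cv2.COLOR_BGR2GRAY),
                           cv2.cvtColor(scene, cv2.COLOR_BGR2GRAY))
        mask = diff > diff_threshold

        specular[mask] = scene[mask]          # replace the first area with the second
        out[y:y + h, x:x + w] = specular
        return out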
In order to execute the corresponding steps in the above embodiments and their various possible manners, an implementation of the image processing apparatus 200 is given below; optionally, the image processing apparatus 200 may adopt the device structure of the electronic device 100 shown in fig. 1. Further, referring to fig. 5, fig. 5 is a block diagram illustrating an image processing apparatus 200 according to an embodiment of the present disclosure. It should be noted that the image processing apparatus 200 provided in this embodiment has the same basic principle and technical effect as the above embodiments; for the sake of brevity, for points not described in this embodiment, reference may be made to the corresponding contents in the above embodiments. Divided by function, the image processing apparatus 200 includes: an obtaining module 210, a detection module 220 and a processing module 230.
The obtaining module 210 is configured to obtain an image to be processed.
The detecting module 220 is configured to detect the image to be processed, and determine a specular reflection area in the image to be processed.
The processing module 230 is configured to perform replacement or modification processing on the pattern in the specular reflection area according to a reference image to change the pattern in the specular reflection area.
Optionally, in an implementation manner of this embodiment, the image to be processed is obtained based on a first shooting direction, and the detecting module 220 is specifically configured to:
obtaining a scene image, wherein the scene image is obtained by shooting in a second shooting direction in a target environment in which the image to be processed is obtained, and the relationship between the second shooting direction and the first shooting direction is the same as the relationship between incident light and reflected light;
carrying out scene matching on the scene image and the image to be processed;
and if the area similar to the scene of the scene image exists in the image to be processed, taking the area as the specular reflection area.
Optionally, in an implementation manner of this embodiment, an object recognition model is stored in the electronic device 100, and the detection module 220 is specifically configured to:
performing object recognition on the image to be processed through the object recognition model to determine whether the image to be processed includes an object with specular reflection;
and if so, taking the area where the object with the specular reflection exists as the specular reflection area.
Optionally, in an implementation manner of this embodiment, the detecting module 220 is specifically configured to:
and receiving selection operation input by a user, and taking an area corresponding to the selection operation as the specular reflection area.
Optionally, in an implementation manner of this embodiment, the processing module 230 is specifically configured to:
replacing the pattern in the specular reflection area with a pattern in the reference image; or alternatively,
according to the matching result of the pattern in the specular reflection area and the pattern in the scene area, obtaining a first area in the specular reflection area, the pattern of which is different from the pattern of the scene area, and a second area in the scene area, the pattern of which is different from the pattern in the specular reflection area, and replacing the pattern in the first area with the pattern in the second area, wherein the scene area is an area with a scene image similar to the scene of the specular reflection area.
Alternatively, the modules may be stored in the memory 110 shown in fig. 1 in the form of software or firmware, or be built into the operating system (OS) of the electronic device 100, and may be executed by the processor 120 in fig. 1. Data, program code and the like required to execute the above modules may likewise be stored in the memory 110.
In summary, the embodiments of the present application provide an image processing method, an image processing apparatus, an electronic device, and a readable storage medium. After obtaining the image to be processed, detecting the image to be processed, determining a specular reflection area in the image to be processed, and then performing replacement or correction processing on the pattern in the specular reflection area according to the reference image to change the pattern in the specular reflection area. Therefore, patterns which are not expected to appear in the image by the user can be removed, so that the image quality is improved, and the user experience is improved.
In the embodiments provided in the present application, it should be understood that the disclosed apparatus and method can be implemented in other ways. The apparatus embodiments described above are merely illustrative, and for example, the flowchart and block diagrams in the figures illustrate the architecture, functionality, and operation of possible implementations of apparatus, methods and computer program products according to various embodiments of the present application. In this regard, each block in the flowchart or block diagrams may represent a module, segment, or portion of code, which comprises one or more executable instructions for implementing the specified logical function(s). It should also be noted that, in some alternative implementations, the functions noted in the block may occur out of the order noted in the figures. For example, two blocks shown in succession may, in fact, be executed substantially concurrently, or the blocks may sometimes be executed in the reverse order, depending upon the functionality involved. It will also be noted that each block of the block diagrams and/or flowchart illustration, and combinations of blocks in the block diagrams and/or flowchart illustration, can be implemented by special purpose hardware-based systems which perform the specified functions or acts, or combinations of special purpose hardware and computer instructions.
In addition, functional modules in the embodiments of the present application may be integrated together to form an independent part, or each module may exist separately, or two or more modules may be integrated to form an independent part.
The functions, if implemented in the form of software functional modules and sold or used as a stand-alone product, may be stored in a computer readable storage medium. Based on such understanding, the technical solution of the present application or portions thereof that substantially contribute to the prior art may be embodied in the form of a software product stored in a storage medium and including instructions for causing a computer device (which may be a personal computer, a server, or a network device) to execute all or part of the steps of the method according to the embodiments of the present application. And the aforementioned storage medium includes: a U-disk, a removable hard disk, a Read-Only Memory (ROM), a Random Access Memory (RAM), a magnetic disk or an optical disk, and other various media capable of storing program codes.
The above description is only a preferred embodiment of the present application and is not intended to limit the present application, and various modifications and changes may be made by those skilled in the art. Any modification, equivalent replacement, improvement and the like made within the spirit and principle of the present application shall be included in the protection scope of the present application.

Claims (10)

1. An image processing method applied to an electronic device, the method comprising:
obtaining an image to be processed;
detecting the image to be processed, and determining a specular reflection area in the image to be processed;
according to the matching result of the pattern in the specular reflection area and the pattern in the scene area, obtaining a first area in the specular reflection area, the pattern of which is different from the pattern of the scene area, and a second area in the scene area, the pattern of which is different from the pattern in the specular reflection area, and replacing the pattern in the first area with the pattern in the second area, wherein the scene area is an area in which a scene image is similar to the scene of the specular reflection area, the image to be processed is obtained based on a first shooting direction, the scene image is shot in a second shooting direction in a target environment in which the image to be processed is obtained, and the relationship between the second shooting direction and the first shooting direction is the same as the relationship between incident light and reflected light.
2. The method according to claim 1, wherein the image to be processed is obtained based on a first shooting direction, the detecting the image to be processed, and the determining the specular reflection area in the image to be processed comprises:
obtaining a scene image, wherein the scene image is obtained by shooting in a second shooting direction in a target environment in which the image to be processed is obtained, and the relationship between the second shooting direction and the first shooting direction is the same as the relationship between incident light and reflected light;
carrying out scene matching on the scene image and the image to be processed;
and if the area similar to the scene of the scene image exists in the image to be processed, taking the area as the specular reflection area.
3. The method according to claim 1, wherein an object recognition model is stored in the electronic device, and the detecting the image to be processed and the determining the specular reflection area in the image to be processed comprise:
carrying out object recognition on the image to be processed through the object recognition model so as to judge whether the image to be processed comprises an object with mirror reflection;
and if so, taking the area where the object with the specular reflection exists as the specular reflection area.
4. The method according to claim 1, wherein the detecting the image to be processed and determining the specular reflection area in the image to be processed comprises:
and receiving selection operation input by a user, and taking an area corresponding to the selection operation as the specular reflection area.
5. An image processing apparatus applied to an electronic device, the apparatus comprising:
the acquisition module is used for acquiring an image to be processed;
the detection module is used for detecting the image to be processed and determining a specular reflection area in the image to be processed;
the processing module is used for obtaining a first area in the specular reflection area, the pattern of which is different from the pattern of the scene area, and a second area in the scene area, the pattern of which is different from the pattern of the specular reflection area, according to a matching result of the pattern in the specular reflection area and the pattern in the scene area, and replacing the pattern in the first area with the pattern in the second area, wherein the scene area is an area in which a scene image is similar to the scene of the specular reflection area, the image to be processed is obtained based on a first shooting direction, the scene image is shot in a second shooting direction in a target environment in which the image to be processed is obtained, and a relationship between the second shooting direction and the first shooting direction is the same as a relationship between incident light and reflected light.
6. The apparatus according to claim 5, wherein the image to be processed is obtained based on a first shooting direction, and the detection module is specifically configured to:
obtaining a scene image, wherein the scene image is obtained by shooting in a second shooting direction in a target environment in which the image to be processed is obtained, and the relationship between the second shooting direction and the first shooting direction is the same as the relationship between incident light and reflected light;
carrying out scene matching on the scene image and the image to be processed;
and if the area similar to the scene of the scene image exists in the image to be processed, taking the area as the specular reflection area.
7. The apparatus according to claim 5, wherein the electronic device stores an object recognition model, and the detection module is specifically configured to:
carrying out object recognition on the image to be processed through the object recognition model so as to judge whether the image to be processed comprises an object with mirror reflection;
and if so, taking the area where the object with the specular reflection exists as the specular reflection area.
8. The apparatus of claim 5, wherein the detection module is specifically configured to:
and receiving selection operation input by a user, and taking an area corresponding to the selection operation as the specular reflection area.
9. An electronic device comprising a processor and a memory, the memory storing machine executable instructions executable by the processor to implement the image processing method of any one of claims 1-4.
10. A readable storage medium, on which a computer program is stored which, when being executed by a processor, carries out the image processing method according to any one of claims 1 to 4.
CN201911363993.8A 2019-12-26 2019-12-26 Image processing method and device, electronic equipment and readable storage medium Active CN113055579B (en)

Priority Applications (2)

Application Number Priority Date Filing Date Title
CN201911363993.8A CN113055579B (en) 2019-12-26 2019-12-26 Image processing method and device, electronic equipment and readable storage medium
PCT/CN2020/139403 WO2021129806A1 (en) 2019-12-26 2020-12-25 Image processing method, apparatus, electronic device, and readable storage medium

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
CN201911363993.8A CN113055579B (en) 2019-12-26 2019-12-26 Image processing method and device, electronic equipment and readable storage medium

Publications (2)

Publication Number Publication Date
CN113055579A CN113055579A (en) 2021-06-29
CN113055579B (en) 2022-02-01

Family

ID=76505881

Family Applications (1)

Application Number Title Priority Date Filing Date
CN201911363993.8A Active CN113055579B (en) 2019-12-26 2019-12-26 Image processing method and device, electronic equipment and readable storage medium

Country Status (2)

Country Link
CN (1) CN113055579B (en)
WO (1) WO2021129806A1 (en)

Families Citing this family (2)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN114125276A (en) * 2021-11-11 2022-03-01 广东维沃软件技术有限公司 Image processing method and device
CN114565531A (en) * 2022-02-28 2022-05-31 上海商汤临港智能科技有限公司 Image restoration method, device, equipment and medium

Patent Citations (4)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN101147157A (en) * 2005-01-26 2008-03-19 数字逻辑扫描公司 Data reader and methods for imaging targets subject to specular reflection
KR20090093223A (en) * 2008-02-29 2009-09-02 홍익대학교 산학협력단 Removal Eye Glasses using Variable Mask and Inpainting for Improved Performance of Face Recognition System
CN105046250A (en) * 2015-09-06 2015-11-11 广州广电运通金融电子股份有限公司 Glasses elimination method for face recognition
CN108564540A (en) * 2018-03-05 2018-09-21 广东欧珀移动通信有限公司 Remove image processing method, device and the terminal device that eyeglass is reflective in image

Also Published As

Publication number Publication date
CN113055579A (en) 2021-06-29
WO2021129806A1 (en) 2021-07-01

Similar Documents

Publication Publication Date Title
US8754963B2 (en) Processing images having different focus
US7929804B2 (en) System and method for tracking objects with a synthetic aperture
CN113055579B (en) Image processing method and device, electronic equipment and readable storage medium
US7995866B2 (en) Rotation angle detection apparatus, and control method and control program of rotation angle detection apparatus
CN109451240B (en) Focusing method, focusing device, computer equipment and readable storage medium
CN111445526A (en) Estimation method and estimation device for pose between image frames and storage medium
US20140198229A1 (en) Image pickup apparatus, remote control apparatus, and methods of controlling image pickup apparatus and remote control apparatus
CN109241345B (en) Video positioning method and device based on face recognition
CN105007413B (en) A kind of filming control method and user terminal
CN104219428B (en) A kind of camera mounting device
US8810665B2 (en) Imaging device and method to detect distance information for blocks in secondary images by changing block size
CN109443305B (en) Distance measuring method and device
CN111010554A (en) Projection processing method, projection processing device, projector and readable storage medium
CN110473227A (en) Method for tracking target, device, equipment and storage medium
CN110458857B (en) Central symmetry primitive detection method and device, electronic equipment and readable storage medium
EP3819815A1 (en) Human body recognition method and device, as well as storage medium
CN108289176B (en) Photographing question searching method, question searching device and terminal equipment
CN110737326A (en) Virtual object display method and device, terminal equipment and storage medium
CN109901716B (en) Sight point prediction model establishing method and device and sight point prediction method
JP2005182745A (en) Image processing method, device and program
CN113031582A (en) Robot, positioning method, and computer-readable storage medium
CN112802112B (en) Visual positioning method, device, server and storage medium
CN108898572B (en) Light spot extraction method
WO2020255766A1 (en) Information processing device, information processing method, program, projection device, and information processing system
CN113920196A (en) Visual positioning method and device and computer equipment

Legal Events

Code Description
PB01 Publication
SE01 Entry into force of request for substantive examination
GR01 Patent grant