CN115272718A - Image processing method, image processing device, storage medium and electronic equipment - Google Patents


Info

Publication number
CN115272718A
Authority
CN
China
Prior art keywords: region, image, difference, processed, determining
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Pending
Application number
CN202210885516.3A
Other languages
Chinese (zh)
Inventor
朱文波 (Zhu Wenbo)
Current Assignee
Guangdong Oppo Mobile Telecommunications Corp Ltd
Original Assignee
Guangdong Oppo Mobile Telecommunications Corp Ltd
Application filed by Guangdong Oppo Mobile Telecommunications Corp Ltd filed Critical Guangdong Oppo Mobile Telecommunications Corp Ltd
Priority to CN202210885516.3A
Publication of CN115272718A

Classifications

    • G: PHYSICS
    • G06: COMPUTING; CALCULATING OR COUNTING
    • G06V: IMAGE OR VIDEO RECOGNITION OR UNDERSTANDING
    • G06V10/00: Arrangements for image or video recognition or understanding
    • G06V10/70: Arrangements for image or video recognition or understanding using pattern recognition or machine learning
    • G06V10/74: Image or video pattern matching; Proximity measures in feature spaces
    • G06V10/761: Proximity, similarity or dissimilarity measures

Landscapes

  • Engineering & Computer Science (AREA)
  • Computer Vision & Pattern Recognition (AREA)
  • Theoretical Computer Science (AREA)
  • Evolutionary Computation (AREA)
  • Computing Systems (AREA)
  • Databases & Information Systems (AREA)
  • Artificial Intelligence (AREA)
  • General Health & Medical Sciences (AREA)
  • Medical Informatics (AREA)
  • Software Systems (AREA)
  • Health & Medical Sciences (AREA)
  • Physics & Mathematics (AREA)
  • General Physics & Mathematics (AREA)
  • Multimedia (AREA)
  • Image Analysis (AREA)

Abstract

The application discloses an image processing method, an image processing device, a storage medium and an electronic device. The method comprises the following steps: acquiring an image to be processed and acquiring a historical reference image; carrying out difference comparison on the image to be processed and the historical reference image to obtain a comparison result; according to the comparison result, determining a difference area and a non-difference area from the image to be processed, and calculating the depth information of the difference area through a depth information calculation strategy; and determining the depth information of the region corresponding to the non-difference region in the historical reference image as the depth information of the non-difference region. The depth information of the image can be determined.

Description

Image processing method, image processing device, storage medium and electronic equipment
Technical Field
The present application belongs to the field of image processing technologies, and in particular, to an image processing method, an image processing apparatus, a storage medium, and an electronic device.
Background
With the rapid development of image processing technology, the application demand of image three-dimensional information is increasing. In many fields such as industrial manufacturing, augmented reality, object reconstruction, and game entertainment, depth information of an image is required, and thus, a scheme for determining the depth information of the image is required.
Disclosure of Invention
The embodiment of the application provides an image processing method, an image processing device, a storage medium and electronic equipment, which can determine depth information of an image.
In a first aspect, an embodiment of the present application provides an image processing method, including:
acquiring an image to be processed and acquiring a historical reference image;
carrying out difference comparison on the image to be processed and the historical reference image to obtain a comparison result;
according to the comparison result, determining a difference region and a non-difference region from the image to be processed, and calculating the depth information of the difference region through a depth information calculation strategy;
determining depth information of a region corresponding to the non-difference region in the historical reference image as depth information of the non-difference region.
In a second aspect, an embodiment of the present application provides an image processing apparatus, including:
the image acquisition module is used for acquiring an image to be processed and acquiring a historical reference image;
the difference comparison module is used for carrying out difference comparison on the image to be processed and the historical reference image to obtain a comparison result;
the region determining module is used for determining a difference region and a non-difference region from the image to be processed according to the comparison result, and calculating the depth information of the difference region through a depth information calculating strategy;
an information determining module, configured to determine depth information of a region corresponding to the non-difference region in the historical reference image as depth information of the non-difference region.
In a third aspect, an embodiment of the present application provides an electronic device, including:
the front-end image processor is used for acquiring an image to be processed and acquiring a historical reference image; performing difference comparison on the image to be processed and the historical reference image to obtain a comparison result; determining a difference region and a non-difference region from the image to be processed according to the comparison result; and calculating depth information of the difference region;
an application processor for determining depth information of a region corresponding to the non-difference region in the history reference image as depth information of the non-difference region.
In a fourth aspect, an embodiment of the present application provides an electronic device, which includes a memory and a processor, where the processor is configured to execute the image processing method provided in the embodiment of the present application by calling a computer program stored in the memory.
In a fifth aspect, an embodiment of the present application provides a storage medium, on which a computer program is stored, which, when executed on a computer, causes the computer to execute an image processing method provided by an embodiment of the present application.
In the embodiment of the application, the image to be processed is obtained, and the historical reference image is obtained; carrying out difference comparison on the image to be processed and the historical reference image to obtain a comparison result; determining a difference area and a non-difference area from the image to be processed according to the comparison result, and calculating the depth information of the difference area through a depth information calculation strategy; the depth information of the area corresponding to the non-difference area in the historical reference image is determined as the depth information of the non-difference area, so that the depth information of the difference area of the image to be processed is calculated through a depth information calculation strategy, the depth information of the non-difference area of the image to be processed is determined through the historical reference image, and the depth information of the image can be determined.
Drawings
The technical solutions and advantages of the present application will become apparent from the following detailed description of specific embodiments of the present application when taken in conjunction with the accompanying drawings.
Fig. 1 is a schematic flowchart of an image processing method according to an embodiment of the present application.
Fig. 2 is a schematic view of a first scene of an image processing method according to an embodiment of the present application.
Fig. 3 is a schematic view of a second scene of an image processing method according to an embodiment of the present application.
Fig. 4 is a schematic diagram of a third scenario of an image processing method according to an embodiment of the present application.
Fig. 5 is a schematic diagram of a fourth scene of an image processing method according to an embodiment of the present application.
Fig. 6 is a schematic diagram of a fifth scene of an image processing method according to an embodiment of the present application.
Fig. 7 is a sixth scene schematic diagram of an image processing method according to an embodiment of the present application.
Fig. 8 is a schematic structural diagram of an image processing apparatus according to an embodiment of the present application.
Fig. 9 is a schematic structural diagram of a first electronic device according to an embodiment of the present application.
Fig. 10 is a schematic structural diagram of a second electronic device according to an embodiment of the present application.
Fig. 11 is a third structural schematic diagram of an electronic device according to an embodiment of the present application.
Detailed Description
It should be noted that the terms "first", "second", and "third", etc. in this application are used for distinguishing different objects, and are not used for describing a particular order. Furthermore, the terms "include" and "have," as well as any variations thereof, are intended to cover non-exclusive inclusions. For example, a process, method, system, article, or apparatus that comprises a list of steps or modules is not limited to only those steps or modules recited, but rather, some embodiments include additional steps or modules not recited, or inherent to such process, method, article, or apparatus.
Reference herein to "an embodiment" means that a particular feature, structure, or characteristic described in connection with the embodiment can be included in at least one embodiment of the application. The appearances of the phrase in various places in the specification are not necessarily all referring to the same embodiment, nor are separate or alternative embodiments mutually exclusive of other embodiments. It is explicitly and implicitly understood by one skilled in the art that the embodiments described herein may be combined with other embodiments.
The embodiment of the application provides an image processing method, an image processing device, a storage medium and an electronic device. The execution subject of the image processing method may be the image processing device provided by the embodiment of the application, or an electronic device integrated with the image processing device, and the image processing device may be implemented in hardware or software. The electronic device may be a smartphone, a tablet computer, a palmtop computer, a notebook computer, or the like, which has image processing capability and is configured with a processor.
Referring to fig. 1, fig. 1 is a schematic flow chart of an image processing method according to an embodiment of the present application, where the flow chart may include:
in 101, an image to be processed is acquired, and a historical reference image is acquired.
In this embodiment, the electronic device acquires an image to be processed and acquires a historical reference image.
The image to be processed may be an image whose depth information needs to be determined. The image to be processed may be a RAW image, a YUV image, an RGB image, or the like.
The historical reference image may be an image captured by the electronic device prior to capturing the image to be processed. For example, the electronic device may perform continuous image acquisition on a shooting scene to sequentially obtain images M11, M12, M13, and M14, where M14 may be an image to be processed, and at least one image of the images M11, M12, and M13 may be a history reference image.
It should be noted that in the embodiment of the present application, each image, such as the image to be processed, the historical reference image, the historical candidate image, and the like, may be a RAW image, a YUV image, an RGB image, or the like.
In 102, the difference comparison is performed between the image to be processed and the historical reference image to obtain a comparison result.
In 103, according to the comparison result, a difference region and a non-difference region are determined from the image to be processed, and the depth information of the difference region is calculated through a depth information calculation strategy.
In this embodiment, after the to-be-processed image and the historical reference image are obtained, the electronic device performs difference comparison on the to-be-processed image and the historical reference image to obtain a comparison result. And after the comparison result is obtained, the electronic equipment determines a difference area and a non-difference area from the image to be processed according to the comparison result.
That is, by comparing the image to be processed with the historical reference image, the electronic device determines which regions of the two images differ (different image content) and which do not (same image content). The regions of the image to be processed that differ are determined as difference regions, and the regions that do not differ are determined as non-difference regions.
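The difference comparison described above can be sketched as a block-wise comparison of the two images. The following is a minimal illustrative sketch, not taken from the patent; the function name, block size, and threshold are arbitrary assumptions:

```python
import numpy as np

def compare_difference(to_process: np.ndarray, reference: np.ndarray,
                       block: int = 16, threshold: float = 10.0) -> np.ndarray:
    """Block-wise difference comparison of two same-sized grayscale images.

    Returns a boolean map with one entry per block: True where the mean
    absolute difference exceeds the threshold (a "difference" block).
    """
    h, w = to_process.shape
    diff = np.abs(to_process.astype(np.float32) - reference.astype(np.float32))
    rows, cols = h // block, w // block
    result = np.zeros((rows, cols), dtype=bool)
    for i in range(rows):
        for j in range(cols):
            patch = diff[i * block:(i + 1) * block, j * block:(j + 1) * block]
            result[i, j] = patch.mean() > threshold
    return result

# A moving object changes only the top-left block of a 32x32 image.
ref = np.zeros((32, 32), dtype=np.uint8)
cur = ref.copy()
cur[:16, :16] = 200
mask = compare_difference(cur, ref)  # only mask[0, 0] is True
```

Block-level (rather than per-pixel) comparison is one plausible way to keep the comparison cheap enough for the front-end processor, though the patent does not prescribe a granularity.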
The depth information of an image can be understood as the distance from the image acquisition device to each point in the shooting scene corresponding to the image. After the electronic device starts a shooting application (for example, the system application "camera" of the electronic device) according to a user operation, the scene at which the camera of the electronic device is aimed is the shooting scene. For example, after the user taps the icon of the "camera" application to start it and aims the camera of the electronic device at a certain scene, that scene is the shooting scene. As will be understood by those skilled in the art from the above description, the shooting scene is therefore not a specific fixed scene, but whatever scene the camera is aimed at in real time.
In this embodiment, after determining the difference region from the image to be processed, the electronic device calculates the depth information of the difference region through a depth information calculation policy. The depth information of the difference region comprises the depth information of each pixel point in the difference region.
For example, the depth information calculation strategy includes a depth estimation model, and after a difference region is determined from the image to be processed, the electronic device may segment the difference region from the image to be processed, and input the difference region into a depth estimation model trained in advance to obtain the depth information of the difference region.
For another example, the depth information calculation strategy includes a binocular stereo vision strategy: the electronic device acquires the image to be processed of a certain shooting scene through one camera, and also acquires an image to be segmented of the same scene through another camera spaced a certain distance from the first. After determining the difference region from the image to be processed, the electronic device may segment the difference region from the image to be processed, and segment a corresponding region from the image to be segmented. The electronic device then finds corresponding pixel points in the two regions through a stereo matching algorithm and calculates disparity information according to the triangulation principle; the disparity can be converted into the depth of objects in the scene, so that the depth information of the difference region is obtained.
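The triangulation step of the binocular stereo vision strategy follows the standard disparity-to-depth relation depth = focal_length * baseline / disparity. The sketch below is illustrative only; the focal length and baseline values are assumptions:

```python
import numpy as np

def disparity_to_depth(disparity: np.ndarray, focal_px: float,
                       baseline_m: float) -> np.ndarray:
    """Convert a disparity map (in pixels) to depth (in metres) by
    triangulation: depth = focal_length * baseline / disparity.

    Pixels with zero disparity (no stereo match) are reported as infinite
    depth rather than dividing by zero.
    """
    disparity = disparity.astype(np.float32)
    depth = np.full_like(disparity, np.inf)
    valid = disparity > 0
    depth[valid] = focal_px * baseline_m / disparity[valid]
    return depth

# With a 700 px focal length and a 10 cm baseline, a 35 px disparity
# corresponds to 700 * 0.1 / 35 = 2.0 m.
d = np.array([[35.0, 70.0], [0.0, 14.0]])
z = disparity_to_depth(d, focal_px=700.0, baseline_m=0.1)
```

The stereo matching that produces the disparity map (e.g. block matching along epipolar lines) is the expensive part, which is why the patent restricts it to the difference region.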
It should be noted that the depth information calculation policy may also include other policies that can calculate the depth information of the image, and those skilled in the art may adopt a corresponding depth information calculation policy to calculate the depth information of the difference region according to the actual situation, which is not limited herein.
In 104, depth information of a region corresponding to the non-difference region in the history reference image is determined as depth information of the non-difference region.
In this embodiment, after determining the non-difference region from the image to be processed, the electronic device determines the depth information of the region corresponding to the non-difference region in the history reference image as the depth information of the non-difference region. The depth information of the non-difference region comprises the depth information of each pixel point in the non-difference region.
It can be understood that, in the present embodiment, the depth information of the historical reference image is already determined, and the depth information of the image to be processed needs to be determined in real time. In the above steps, the depth information of the difference region is already calculated by the depth information calculation strategy, and then the depth information of the non-difference region needs to be determined. Since the non-difference region is not different from the corresponding region in the history reference image, that is, the image contents of the two regions are the same, when the depth information of the non-difference region is determined, the depth information of the corresponding region in the history reference image can be directly determined as the depth information of the non-difference region, so that the depth information of the non-difference region does not need to be calculated through a depth information calculation strategy.
It can be understood that, by directly determining the depth information of the corresponding region in the historical reference image as the depth information of the non-difference region, the pressure of data processing can be reduced compared with the scheme of calculating the depth information of the non-difference region through the depth information calculation strategy.
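The reuse of historical depth information described above can be sketched as a per-pixel selection between the freshly computed depth of the difference region and the stored depth of the historical reference image. The function and array names below are illustrative assumptions:

```python
import numpy as np

def merge_depth(diff_mask: np.ndarray, diff_depth: np.ndarray,
                history_depth: np.ndarray) -> np.ndarray:
    """Assemble the depth map of the image to be processed: freshly computed
    depth inside the difference region, historical depth everywhere else."""
    return np.where(diff_mask, diff_depth, history_depth)

history = np.full((4, 4), 3.0)   # depth already determined for the reference
fresh = np.full((4, 4), 1.5)     # newly computed depth of the difference region
mask = np.zeros((4, 4), dtype=bool)
mask[:2, :2] = True              # only the top-left 2x2 block changed
depth = merge_depth(mask, fresh, history)
```

Only the masked values need to be recomputed each frame; the rest are copied from the historical reference image, which is the source of the data-processing savings the embodiment describes.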
In addition, in this embodiment, for an electronic device including a front-end image processor and an application processor, the front-end image processor may determine the depth information of the difference region and then send the image to be processed and the depth information of the difference region to the application processor, and the application processor determines the depth information of the non-difference region. Because only the depth information of the difference region needs to be sent, compared with a scheme of sending the depth information of the whole image to be processed, the data transmission amount can be reduced, which increases the transmission speed of the image and, in turn, the processing speed of the image.
In the embodiment, the image to be processed is obtained, and the historical reference image is obtained; carrying out difference comparison on the image to be processed and the historical reference image to obtain a comparison result; according to the comparison result, determining a difference area and a non-difference area from the image to be processed, and calculating the depth information of the difference area through a depth information calculation strategy; and determining the depth information of the area corresponding to the non-difference area in the historical reference image as the depth information of the non-difference area, thereby calculating the depth information of the difference area of the image to be processed through a depth information calculation strategy, determining the depth information of the non-difference area of the image to be processed through the historical reference image, and determining the depth information of the image.
In an optional embodiment, performing difference comparison on the image to be processed and the historical reference image to obtain a comparison result, includes:
determining the same object in the image to be processed and the historical reference image;
determining a first area of an object in an image to be processed and a second area in a historical reference image;
carrying out difference comparison on the first area and the second area to obtain a comparison result;
according to the comparison result, determining a difference area and a non-difference area from the image to be processed, including:
and if the comparison result shows that the area of the first region is different from the area of the second region and/or the position of the first region is different from the position of the second region, determining a difference region and a non-difference region from the image to be processed according to the first region.
It can be understood that, when a moving object exists in a shooting scene, the position and/or area of the moving object may change across the multiple frames continuously captured by the electronic device; for example, as the moving object gets closer to the electronic device, its area in the captured images becomes larger. A region including the region where the moving object is located is therefore a difference region of these frames. For a non-moving object in the shooting scene, the change of its position and area across the frames (possibly caused by shaking of the electronic device) is usually negligible, and the same holds for the background of the shooting scene.
That is, in this embodiment, the electronic device first determines the same object in the image to be processed and the historical reference image; then determining a first area of the object in the image to be processed and a second area in the historical reference image; then, carrying out difference comparison on the first area and the second area to obtain a comparison result; if the comparison result shows that the area of the first region is different from the area of the second region and/or the position of the first region is different from the position of the second region, the object is a moving object, and then the electronic device determines a difference region and a non-difference region from the image to be processed according to the first region.
For example, as shown in fig. 2, the area of the first region a21 of the object B2 in the image to be processed M21 is different from the area of the second region a22 of the object B2 in the historical reference image M22, and then the electronic device may determine the difference region and the non-difference region from the image to be processed M21 according to the first region a21.
For another example, as shown in fig. 3, the position of the first region a23 of the object B2 in the image to be processed M23 is different from the position of the second region a24 of the object B2 in the historical reference image M24, and then the electronic device may determine the difference region and the non-difference region from the image to be processed M23 according to the first region a23.
For another example, as shown in fig. 4, the area of the first region a25 of the object B2 in the image to be processed M25 is different from the area of the second region a26 of the object B2 in the historical reference image M26, and the position of the first region a25 is also different from the position of the second region a26, so that the electronic device can determine the difference region and the non-difference region from the image to be processed M25 according to the first region a25.
In an optional embodiment, determining the difference region and the non-difference region from the image to be processed according to the first region includes:
determining an area in a preset range as a difference area by taking the center of the first area as an origin in the image to be processed, and determining an area except the difference area in the image to be processed as a non-difference area, wherein the difference area comprises the first area.
In view of the fact that the algorithm for accurately segmenting the first region consumes the computing power of the electronic device, in this embodiment, the electronic device may determine the region within the preset range with the center of the first region as the origin as the difference region, and determine the region except the difference region in the image to be processed as the non-difference region. Wherein the difference region comprises a first region. The preset range may be preset by a user or determined by the electronic device based on a certain rule.
For example, as shown in fig. 5, assuming that the first area is a21, the electronic device may determine the area a31 as a difference area and determine an area other than the area a31 in the image M21 to be processed as a non-difference area.
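Determining the difference region as a preset range around the centre of the first region can be sketched as expanding the bounding box of the first region by a margin and clipping it to the image bounds. This is an illustrative sketch; the rectangle convention and the margin value are assumptions, since the patent leaves the preset range to the user or the device:

```python
def expand_region(first_region, margin, img_w, img_h):
    """Expand a region about its centre by a preset margin, clipped to the
    image bounds. Regions are (x0, y0, x1, y1) with exclusive right/bottom,
    so the expanded result always contains the first region."""
    x0, y0, x1, y1 = first_region
    cx, cy = (x0 + x1) / 2, (y0 + y1) / 2
    half_w = (x1 - x0) / 2 + margin
    half_h = (y1 - y0) / 2 + margin
    return (max(0, int(cx - half_w)), max(0, int(cy - half_h)),
            min(img_w, int(cx + half_w)), min(img_h, int(cy + half_h)))

# First region (40, 40, 60, 60) with a 10 px margin becomes (30, 30, 70, 70).
diff_region = expand_region((40, 40, 60, 60), margin=10, img_w=100, img_h=100)
```

A coarse rectangle of this kind avoids the precise segmentation that, as noted above, would consume the computing power of the electronic device.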
In an optional embodiment, determining a difference region and a non-difference region from the image to be processed according to the first region, and calculating depth information of the difference region includes:
and determining a region formed by the region corresponding to the second region and the first region in the image to be processed as a difference region, and determining a region except the difference region in the image to be processed as a non-difference region.
For example, as shown in fig. 6, the electronic device may determine, as the difference region, a region a32 composed of the first region a23 and a region corresponding to the second region a24 in the image to be processed M23, and determine, as the non-difference region, a region other than the difference region in the image to be processed.
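For axis-aligned rectangular regions, determining the difference region as the region composed of the first region and the region corresponding to the second region can be sketched as a bounding-box union. The rectangle convention below is an assumption:

```python
def union_region(first_region, second_region):
    """Bounding box covering the first region and the region in the image to
    be processed that corresponds to the second region (the two images are
    assumed to share coordinates). Regions are (x0, y0, x1, y1)."""
    ax0, ay0, ax1, ay1 = first_region
    bx0, by0, bx1, by1 = second_region
    return (min(ax0, bx0), min(ay0, by0), max(ax1, bx1), max(ay1, by1))

# Object moved from (10, 10, 30, 30) to (20, 15, 40, 35): the difference
# region must cover both the old and the new position.
diff_region = union_region((20, 15, 40, 35), (10, 10, 30, 30))
```

Covering both positions matters because the depth changes not only where the object now is, but also where it used to be (the background is newly exposed there).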
In an optional embodiment, the electronic device may also directly determine the first region as the difference region, and determine a region other than the difference region in the image to be processed as the non-difference region.
In an optional embodiment, before determining a region other than the difference region in the image to be processed as the non-difference region, the method further includes:
determining a region to be determined from regions except for the difference region in the image to be processed, and calculating the depth information of the region to be determined through a depth information calculation strategy;
determining a region to be compared corresponding to the region to be determined from the historical reference image, and acquiring depth information of the region to be compared;
determining other regions except the difference region in the image to be processed as non-difference regions, including:
and if the depth information of the region to be determined is the same as that of the region to be compared, determining the region except the difference region in the image to be processed as a non-difference region.
It can be understood that, in order to further determine that the image content of the non-difference region is the same as that of the region corresponding to the non-difference region in the historical reference image, the electronic device may determine at least one region to be determined from the regions of the image to be processed, except for the difference region, and calculate the depth information of the region to be determined by using the depth information calculation strategy; determining a region to be compared corresponding to the region to be determined from the historical reference image, and acquiring depth information of the region to be compared; and if the depth information of the region to be determined is the same as that of the region to be compared, determining the region except the difference region in the image to be processed as a non-difference region.
For example, as shown in fig. 7, the electronic device determines regions a41 and a42 to be determined from the regions of the image to be processed M21 other than the difference region a31, determines a region a43 to be compared corresponding to the region a41 to be determined and a region a44 to be compared corresponding to the region a42 to be determined from the history reference image M22, and if the depth information of the region a41 to be determined and the depth information of the region a43 to be compared are the same and the depth information of the region a42 to be determined and the depth information of the region a44 to be compared are the same, the electronic device may determine the regions of the image to be processed other than the difference region as non-difference regions.
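The confirmation step above can be sketched as comparing the depth values of the sampled regions; the tolerance is an assumption on our part, since the patent only requires the depth information to be "the same" and real depth calculations carry numeric noise:

```python
import numpy as np

def confirm_non_difference(depth_to_determine: np.ndarray,
                           depth_to_compare: np.ndarray,
                           tolerance: float = 1e-3) -> bool:
    """Check whether a sampled to-be-determined region has the same depth as
    its counterpart region in the historical reference image."""
    return bool(np.allclose(depth_to_determine, depth_to_compare,
                            atol=tolerance))

a = np.full((8, 8), 2.5)                     # depth of region to be determined
b = np.full((8, 8), 2.5)                     # depth of region to be compared
same = confirm_non_difference(a, b)          # True: rest is non-difference
moved = confirm_non_difference(a, b + 0.5)   # False: depths disagree
```

Only the small sampled regions are recomputed; if they agree, the remainder outside the difference region is accepted as a non-difference region without further calculation.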
It should be noted that the specific implementation of "calculating the depth information of the to-be-determined region by using the depth information calculation strategy" may refer to the specific implementation of "calculating the depth information of the difference region by using the depth information calculation strategy", and is not described herein again.
In an optional embodiment, performing difference comparison on the image to be processed and the historical reference image to obtain a comparison result includes:
determining a first focusing area of an image to be processed and determining a second focusing area of a historical reference image;
carrying out difference comparison on the first focusing area and the second focusing area to obtain a comparison result;
according to the comparison result, determining a difference region and a non-difference region from the image to be processed includes:
and if the comparison result shows that the first focusing area is different from the second focusing area, determining the first focusing area as a difference area, and determining an area except the first focusing area in the image to be processed as a non-difference area.
Considering that the focused area is the focus of the user and the non-focused area is not much focused, in this embodiment, the electronic device may determine a first focused area of the image to be processed and determine a second focused area of the historical reference image; if the first focus area is different from the second focus area, the electronic device may determine the first focus area as a difference area and determine an area other than the first focus area in the image to be processed as a non-difference area.
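The focus-area comparison can be sketched as a direct comparison of the two focus rectangles; treating any change in position or size as a difference is our reading of the text, and the rectangle convention is an assumption:

```python
def focus_regions_differ(first_focus, second_focus) -> bool:
    """Compare the focus region of the image to be processed with that of the
    historical reference image; any change in position or size counts as a
    difference. Regions are (x0, y0, x1, y1) tuples."""
    return first_focus != second_focus

# Focus moved from the right of the frame toward the centre.
changed = focus_regions_differ((40, 40, 60, 60), (30, 40, 50, 60))
```

When the focus regions differ, the first focus region becomes the difference region and everything else in the image to be processed becomes the non-difference region, as described above.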
In an optional embodiment, before determining a region other than the first focus region in the image to be processed as the non-difference region, the method further includes:
determining a region to be determined from a region except the first focus region in the image to be processed, and calculating depth information of the region to be determined through a depth information calculation strategy;
determining a region to be compared corresponding to the region to be determined from the historical reference image, and acquiring depth information of the region to be compared;
determining other regions except the first focus region in the image to be processed as non-difference regions, including:
and if the depth information of the region to be determined is the same as that of the region to be compared, determining the region except the first focus region in the image to be processed as a non-difference region.
It can be understood that the specific implementation of "determining a region to be determined from the regions of the image to be processed other than the first focus region, calculating the depth information of the region to be determined through a depth information calculation strategy, determining a region to be compared corresponding to the region to be determined from the historical reference image, acquiring the depth information of the region to be compared, and, if the two pieces of depth information are the same, determining the regions other than the first focus region as non-difference regions" may refer to the specific implementation of the corresponding steps described above for the regions other than the difference region, and is not described herein again.
In an optional embodiment, after determining the depth information of the region corresponding to the non-difference region in the historical reference image as the depth information of the non-difference region, the method further includes:
and performing blurring processing on the image to be processed by using the depth information of the difference region and the depth information of the non-difference region to obtain a blurred image.
In this embodiment, after the electronic device obtains the depth information of the difference region and the depth information of the non-difference region, that is, the depth information of the image to be processed, the electronic device may perform blurring processing on the image to be processed by using the depth information to obtain a blurred image.
For example, the electronic device may use the plane where the focus area of the image to be processed is located as the focal plane. That is, in the blurred image, the focus area and any area whose depth information is the same as that of the focus area are the clearest, while for the other areas, the larger the difference from the depth information of the focus area, the lower the definition.
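A minimal sketch of the depth-to-sharpness mapping just described, where blur strength grows with the absolute difference from the focal-plane depth; the linear mapping and the `max_radius` cap are illustrative assumptions, not the application's actual blurring algorithm:

```python
def blur_radius(pixel_depth, focal_depth, max_radius=8.0):
    """Blur strength for one pixel: zero on the focal plane (sharpest),
    growing linearly with the absolute difference from the focus area's
    depth, capped at max_radius."""
    return min(abs(pixel_depth - focal_depth), max_radius)

# Focus area at depth 2.0: in-focus pixels get radius 0 and stay
# sharpest; a pixel at depth 5.0 is blurred more than one at depth 1.0,
# and very distant pixels saturate at the cap.
radii = [blur_radius(d, 2.0) for d in (1.0, 2.0, 5.0, 20.0)]
```

In a real pipeline each radius would parameterize a spatially varying filter (for example a Gaussian kernel) applied to the corresponding pixel neighborhood.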
In an optional embodiment, after performing blurring processing on the image to be processed by using the depth information of the difference region and the depth information of the non-difference region to obtain a blurred image, the electronic device may further perform subsequent image processing on the blurred image, such as beautifying or anti-shake processing.
In an optional embodiment, after obtaining the depth information of the difference region and the depth information of the non-difference region, the electronic device generates a depth image of the image to be processed according to the depth information of the difference region and the depth information of the non-difference region. And each pixel point in the depth image represents the depth information of the corresponding pixel point in the image to be processed.
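The depth-image generation described above can be sketched as follows, assuming regions are represented as boolean masks with a single depth value each (the function name and the per-region constant depth are simplifying assumptions of this sketch):

```python
import numpy as np

def build_depth_image(shape, region_depths):
    """Assemble the depth image: each pixel holds the depth information
    of the corresponding pixel in the image to be processed. Regions are
    given as (boolean mask, depth value) pairs, e.g. one pair for the
    difference region and one for the non-difference region."""
    depth_image = np.zeros(shape, dtype=np.float32)
    for mask, depth in region_depths:
        depth_image[mask] = depth
    return depth_image

# 2x2 difference region at depth 1.5; the rest reuses historical depth 3.0.
diff_mask = np.zeros((4, 4), dtype=bool)
diff_mask[:2, :2] = True
depth_img = build_depth_image((4, 4), [(diff_mask, 1.5), (~diff_mask, 3.0)])
```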
In an optional embodiment, before acquiring the historical reference image, the method further includes:
determining the number of history candidate images;
acquiring a historical reference image, comprising:
and if the number meets the preset condition, determining a historical reference image from the historical candidate images.
In order to ensure the real-time performance of the depth information, the electronic device may recalculate the depth information of an image through the depth information calculation strategy every preset number of frames so as to correct accumulated errors. The preset condition may therefore be that the total number of the history candidate images and the image to be processed is not an integer multiple of a preset number, such as 5. If the number meets the preset condition, a historical reference image is determined from the history candidate images; if the number does not meet the preset condition, the depth information of the image to be processed is calculated through the depth information calculation strategy. For example, assume that the electronic device continuously captures a shooting scene to obtain images M11, M12, M13, M14, M15, and M16. For the image M11, since it is the first frame obtained by the electronic device, the electronic device may calculate its depth information through the depth information calculation strategy. For the images M12, M13, M14, and M16, the electronic device may obtain the depth information of each image through the image processing method provided in the embodiment of the present application. For the image M15, the electronic device may calculate its depth information through the depth information calculation strategy. The preset frame interval and the preset number may be set by a user or determined by the electronic device according to a certain rule.
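The periodic fallback to the full depth calculation can be sketched as a simple frame-counter check; the 1-based indexing and the value 5 follow the M11–M16 example, while the function name is an assumption:

```python
def needs_full_depth_calculation(frame_index, preset_number=5):
    """Decide whether a frame's depth must be computed with the full
    depth information calculation strategy: the first frame always is,
    and thereafter every preset_number-th frame is, to correct
    accumulated errors; all other frames reuse a historical reference
    image via the faster comparison-based method."""
    return frame_index == 1 or frame_index % preset_number == 0

# In the example M11..M16, frames 1 (M11) and 5 (M15) use the full
# strategy; frames 2, 3, 4, and 6 reuse the historical reference image.
full = [i for i in range(1, 7) if needs_full_depth_calculation(i)]
```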
It should be noted that "calculating the depth information of the image to be processed by using the depth information calculation strategy" may refer to "calculating the depth information of the difference region by using the depth information calculation strategy", which is not described herein again.
In an alternative embodiment, determining the historical reference image from the history candidate images may include: taking, as the historical reference image, the history candidate image whose shooting time is closest to that of the image to be processed; alternatively, any one of the history candidate images may be determined as the historical reference image.
In an optional embodiment, in addition to comparing the depth information of the to-be-compared region of the historical reference image with the depth information of the to-be-determined region, the depth information of the region corresponding to the to-be-determined region in the at least one historical candidate image may be compared with the depth information of the to-be-determined region. And if the depth information of the region to be determined is the same as that of the region to be compared, and the depth information of the region to be determined is the same as that of a region corresponding to the region to be determined in the at least one historical candidate image, determining the region except the difference region in the image to be processed as a non-difference region.
In an optional embodiment, for the image M15, the electronic device may calculate the depth information of the image M15 through a depth information calculation strategy while obtaining the depth information of the image M15 through the image processing method provided in the embodiment of the present application, so as to correct an error and facilitate determination of the depth information of a subsequent image. Since the depth information of the image M15 is obtained by the image processing method provided in the embodiment of the present application, compared with a scheme of calculating the depth information of the image M15 by using the depth information calculation strategy, time consumption is less, and therefore, after the depth information of the image M15 is obtained by the image processing method provided in the embodiment of the present application, subsequent image processing, such as blurring processing, etc., can be performed by directly using the depth information of the image M15.
Referring to fig. 8, fig. 8 is a schematic structural diagram of an image processing apparatus according to an embodiment of the present disclosure. The image processing apparatus 200 includes: an image acquisition module 201, a difference comparison module 202, a region determination module 203, and an information determination module 204.
The image obtaining module 201 is configured to obtain an image to be processed and obtain a historical reference image.
And the difference comparison module 202 is configured to perform difference comparison on the image to be processed and the historical reference image to obtain a comparison result.
And the region determining module 203 is configured to determine a difference region and a non-difference region from the image to be processed according to the comparison result, and calculate depth information of the difference region through a depth information calculation strategy.
An information determining module 204, configured to determine depth information of a region corresponding to the non-difference region in the historical reference image as depth information of the non-difference region.
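As an illustrative sketch of how the four modules cooperate end to end (the callables and the region labels are placeholders for the application's difference comparison and depth information calculation strategy, not its actual interfaces):

```python
def process_image(to_process, history_ref, compare, calc_depth, lookup_depth):
    """Run the four steps: compare the two images, split the image to be
    processed into a difference region and a non-difference region,
    compute depth only for the difference region, and reuse the
    historical depth for the non-difference region."""
    diff_region, non_diff_region = compare(to_process, history_ref)
    return {
        diff_region: calc_depth(to_process, diff_region),            # computed
        non_diff_region: lookup_depth(history_ref, non_diff_region), # reused
    }

# Dummy stand-ins: only the difference region pays the cost of the
# depth information calculation strategy.
depth = process_image(
    "M21", "M22",
    compare=lambda img, ref: ("diff", "non_diff"),
    calc_depth=lambda img, region: 1.0,
    lookup_depth=lambda ref, region: 2.0,
)
```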
In an alternative embodiment, the difference comparing module 202 may be configured to: determining the same object in the image to be processed and the historical reference image; determining a first area of an object in an image to be processed and a second area in a historical reference image; carrying out difference comparison on the first region and the second region to obtain a comparison result;
a region determination module 203, which may be configured to: and if the comparison result shows that the area of the first region is different from the area of the second region and/or the position of the first region is different from the position of the second region, determining a difference region and a non-difference region from the image to be processed according to the first region.
In an alternative embodiment, the region determining module 203 may be configured to: determining an area in a preset range as a difference area by taking the center of the first area as an origin in the image to be processed, and determining an area except the difference area in the image to be processed as a non-difference area, wherein the difference area comprises the first area.
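The "preset range around the center of the first region" can be sketched as growing the first region by a symmetric pixel margin and clamping to the image bounds; the margin parameterization is an assumed concrete reading of the application's "preset range":

```python
def expand_to_difference_region(first_region, margin, image_size):
    """Grow a preset range around the center of the first region
    (x, y, w, h) and clamp it to the image bounds, so the resulting
    difference region always contains the first region."""
    x, y, w, h = first_region
    cx, cy = x + w / 2, y + h / 2
    half_w, half_h = w / 2 + margin, h / 2 + margin
    img_w, img_h = image_size
    left, top = max(0, int(cx - half_w)), max(0, int(cy - half_h))
    right = min(img_w, int(cx + half_w))
    bottom = min(img_h, int(cy + half_h))
    return (left, top, right - left, bottom - top)

# A 4x4 first region at (10, 10) grown by a 2-pixel margin in a
# 100x100 image yields an 8x8 difference region containing it.
region = expand_to_difference_region((10, 10, 4, 4), 2, (100, 100))
```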
In an optional embodiment, the area determining module 203 may be configured to: determining a region to be determined from regions except for the difference region in the image to be processed, and calculating the depth information of the region to be determined through a depth information calculation strategy; determining a region to be compared corresponding to the region to be determined from the historical reference image, and acquiring depth information of the region to be compared; and if the depth information of the region to be determined is the same as that of the region to be compared, determining the region except the difference region in the image to be processed as a non-difference region.
In an alternative embodiment, the difference comparing module 202 may be configured to: determining a first focusing area of an image to be processed and determining a second focusing area of a historical reference image; carrying out difference comparison on the first focusing area and the second focusing area to obtain a comparison result;
a region determination module 203, which may be configured to: and if the comparison result shows that the first focus area is different from the second focus area, determining the first focus area as a difference area, and determining an area except the first focus area in the image to be processed as a non-difference area.
In an alternative embodiment, the image acquiring module 201 may be configured to: determining the number of history candidate images; and if the number meets the preset condition, determining a historical reference image from the historical candidate images.
The embodiment of the present application provides a computer-readable storage medium, on which a computer program is stored, which, when executed on a computer, causes the computer to execute the image processing method provided by the embodiment.
The embodiment of the present application further provides an electronic device, which includes a memory and a processor, where the processor is configured to execute the image processing method provided in this embodiment by calling a computer program stored in the memory.
For example, the electronic device may be a mobile terminal such as a tablet computer or a smart phone. Referring to fig. 9, fig. 9 is a schematic structural diagram of an electronic device according to an embodiment of the present disclosure.
The electronic device 300 may include a processor 301, a memory 302, and the like. Those skilled in the art will appreciate that the electronic device configuration shown in fig. 9 does not constitute a limitation of the electronic device and may include more or fewer components than those shown, or some components may be combined, or a different arrangement of components.
The processor 301, i.e., a central processing unit, is a control center of the electronic device, and is connected to various parts of the whole electronic device through various interfaces and lines, and executes various functions of the electronic device and processes data by running or executing application programs stored in the memory 302 and calling data stored in the memory 302, thereby performing overall monitoring of the electronic device. In an alternative embodiment, the processor 301 may also be a processor with computing power, such as an application processor, a neural network processor, or the like.
The memory 302 may be used to store applications and data. The memory 302 stores applications containing executable code. The application programs may constitute various functional modules. The processor 301 executes various functional applications and data processing by running an application program stored in the memory 302.
In this embodiment, the processor 301 in the electronic device loads the executable code corresponding to the processes of one or more application programs into the memory 302 according to the following instructions, and the processor 301 runs the application programs stored in the memory 302, thereby implementing the following processes:
acquiring an image to be processed and acquiring a historical reference image;
carrying out difference comparison on the image to be processed and the historical reference image to obtain a comparison result;
according to the comparison result, determining a difference area and a non-difference area from the image to be processed, and calculating the depth information of the difference area through a depth information calculation strategy;
and determining the depth information of the region corresponding to the non-difference region in the historical reference image as the depth information of the non-difference region.
Referring to fig. 10, an electronic device 300 may include a processor 301, a memory 302, an input unit 303, an output unit 304, and the like.
The processor 301, i.e., a central processing unit, is a control center of the electronic device, and is connected to various parts of the whole electronic device by using various interfaces and lines, and executes various functions of the electronic device and processes data by running or executing application programs stored in the memory 302 and calling data stored in the memory 302, thereby integrally monitoring the electronic device. In an alternative embodiment, the processor 301 may also be a processor with computing power, such as an application processor, a neural network processor, or the like.
The memory 302 may be used to store applications and data. The memory 302 stores applications containing executable code. The application programs may constitute various functional modules. The processor 301 executes various functional applications and data processing by running an application program stored in the memory 302.
The input unit 303 may be used to receive input numbers, character information, or user characteristic information (such as a fingerprint), and to generate keyboard, mouse, joystick, optical, or trackball signal inputs related to user settings and function control.
The output unit 304 may be used to display information input by or provided to a user and various graphical user interfaces of the electronic device, which may be made up of graphics, text, icons, video, and any combination thereof. The output unit may include a display screen, and the display screen may include a display area.
In this embodiment, the processor 301 in the electronic device loads the executable code corresponding to the processes of one or more application programs into the memory 302 according to the following instructions, and the processor 301 runs the application programs stored in the memory 302, thereby implementing the following processes:
acquiring an image to be processed and acquiring a historical reference image;
carrying out difference comparison on the image to be processed and the historical reference image to obtain a comparison result;
according to the comparison result, determining a difference area and a non-difference area from the image to be processed, and calculating the depth information of the difference area through a depth information calculation strategy;
and determining the depth information of the region corresponding to the non-difference region in the historical reference image as the depth information of the non-difference region.
In an optional embodiment, when the processor 301 performs difference comparison between the image to be processed and the historical reference image to obtain a comparison result, it may perform: determining the same object in the image to be processed and the historical reference image; determining a first area of an object in an image to be processed and a second area in a historical reference image; carrying out difference comparison on the first area and the second area to obtain a comparison result; when the processor 301 determines the difference region and the non-difference region from the image to be processed according to the comparison result, it may perform: and if the comparison result shows that the area of the first region is different from the area of the second region and/or the position of the first region is different from the position of the second region, determining a difference region and a non-difference region from the image to be processed according to the first region.
In an alternative embodiment, when the processor 301 determines the difference region and the non-difference region from the image to be processed according to the first region, it may perform: determining an area in a preset range as a difference area by taking the center of the first area as an origin in the image to be processed, and determining an area except the difference area in the image to be processed as a non-difference area, wherein the difference area comprises the first area.
In an optional embodiment, before the processor 301 determines the region other than the difference region in the image to be processed as the non-difference region, it may further perform: determining a region to be determined from regions except for the difference region in the image to be processed, and calculating the depth information of the region to be determined through a depth information calculation strategy; determining a region to be compared corresponding to the region to be determined from the historical reference image, and acquiring depth information of the region to be compared; when the processor 301 determines the other region except the difference region in the image to be processed as the non-difference region, it may perform: and if the depth information of the region to be determined is the same as that of the region to be compared, determining the region except the difference region in the image to be processed as a non-difference region.
In an optional embodiment, when the processor 301 performs difference comparison between the image to be processed and the historical reference image to obtain a comparison result, it may perform: determining a first focusing area of an image to be processed and determining a second focusing area of a historical reference image; carrying out difference comparison on the first focusing area and the second focusing area to obtain a comparison result; when the processor 301 determines a difference region from the image to be processed according to the comparison result and determines a non-difference region from the historical reference image, the following steps may be performed: and if the comparison result shows that the first focusing area is different from the second focusing area, determining the first focusing area as a difference area, and determining an area except the first focusing area in the image to be processed as a non-difference area.
In an optional embodiment, before the processor 301 performs acquiring the historical reference image, it may further perform: determining the number of history candidate images; when the processor 301 performs acquiring the history reference image, it may perform: and if the number meets the preset condition, determining a historical reference image from the historical candidate images.
Referring to fig. 11, the electronic device 300 includes a pre-image processor 305 and an application processor 306.
a pre-image processor 305, configured to obtain an image to be processed and obtain a historical reference image; carrying out difference comparison on the image to be processed and the historical reference image to obtain a comparison result; determining a difference area and a non-difference area from the image to be processed according to the comparison result; calculating depth information of the difference region;
and the application processor 306 is used for determining the depth information of the region corresponding to the non-difference region in the historical reference image as the depth information of the non-difference region.
In an alternative embodiment, the pre-image processor 305 may be configured to: determining the same object in the image to be processed and the historical reference image; determining a first area of an object in an image to be processed and a second area in a historical reference image; and carrying out difference comparison on the first region and the second region to obtain a comparison result. And if the comparison result shows that the area of the first region is different from the area of the second region and/or the position of the first region is different from the position of the second region, determining a difference region and a non-difference region from the image to be processed according to the first region.
In an alternative embodiment, the pre-image processor 305 may be configured to: and determining a region in a preset range as a difference region by taking the center of the first region as an origin in the image to be processed, and determining a region except the difference region in the image to be processed as a non-difference region, wherein the difference region comprises the first region.
In an alternative embodiment, the pre-image processor 305 may be configured to: determining a region to be determined from regions except for the difference region in the image to be processed, and calculating the depth information of the region to be determined through a depth information calculation strategy; determining a region to be compared corresponding to the region to be determined from the historical reference image, and acquiring depth information of the region to be compared; and if the depth information of the region to be determined is the same as that of the region to be compared, determining the region except the difference region in the image to be processed as a non-difference region.
In an alternative embodiment, the pre-image processor 305 may be configured to: determining a first focusing area of an image to be processed and determining a second focusing area of a historical reference image; carrying out difference comparison on the first focusing area and the second focusing area to obtain a comparison result; and if the comparison result shows that the first focus area is different from the second focus area, determining the first focus area as a difference area, and determining an area except the first focus area in the image to be processed as a non-difference area.
In an alternative embodiment, the pre-image processor 305 may be configured to: determining a region to be determined from a region except the first focus region in the image to be processed, and calculating depth information of the region to be determined through a depth information calculation strategy; determining a region to be compared corresponding to the region to be determined from the historical reference image, and acquiring depth information of the region to be compared; and if the depth information of the region to be determined is the same as that of the region to be compared, determining the region except the first focus region in the image to be processed as a non-difference region.
In an alternative embodiment, the pre-image processor 305 may be configured to: sending the depth information of the difference region and the image to be processed to the application processor 306;
the application processor 306 may be configured to: perform blurring processing on the received image to be processed by using the received depth information of the difference region and the depth information of the non-difference region to obtain a blurred image.
In an alternative embodiment, the application processor 306 may be configured to: perform subsequent image processing on the blurred image, such as beautifying or anti-shake processing.
In an alternative embodiment, the pre-image processor 305 may be configured to: sending the depth information of the difference region and the image to be processed to the application processor 306;
the application processor 306 may be configured to: and generating a depth image of the image to be processed according to the depth information of the difference area and the depth information of the non-difference area.
In an alternative embodiment, the pre-image processor 305 may be configured to: determining the number of history candidate images; and if the number meets the preset condition, determining a historical reference image from the historical candidate images.
In an alternative embodiment, the pre-image processor 305 may be configured to: and taking the history candidate image with the shooting time closest to the shooting time of the image to be processed in the history candidate images as a history reference image, or determining any history candidate image in the history candidate images as the history reference image.
In the foregoing embodiments, the descriptions of the respective embodiments have respective emphasis, and parts that are not described in detail in a certain embodiment may refer to the above detailed description of the image processing method, and are not described again here.
The image processing apparatus provided in the embodiment of the present application and the image processing method in the above embodiment belong to the same concept, and any method provided in the embodiment of the image processing method may be run on the image processing apparatus, and a specific implementation process thereof is described in detail in the embodiment of the image processing method, and is not described herein again.
It should be noted that, for the image processing method in the embodiments of the present application, it can be understood by those skilled in the art that all or part of the processes for implementing the image processing method in the embodiments of the present application can be implemented by controlling the related hardware through a computer program, the computer program can be stored in a computer readable storage medium, such as a memory, and executed by at least one processor, and the processes such as the processes of the embodiments of the image processing method can be included in the execution process. The storage medium may be a magnetic disk, an optical disk, a Read Only Memory (ROM), a Random Access Memory (RAM), or the like.
In the image processing apparatus according to the embodiment of the present application, each functional module may be integrated into one processing chip, each module may exist alone physically, or two or more modules may be integrated into one module. The integrated module can be realized in a hardware mode, and can also be realized in a software functional module mode. The integrated module, if implemented in the form of a software functional module and sold or used as a separate product, may also be stored in a computer-readable storage medium, such as a read-only memory, a magnetic disk or an optical disk.
The foregoing has described in detail the image processing method, image processing apparatus, storage medium, and electronic device provided by the embodiments of the present application. Specific examples are used herein to explain the principles and implementations of the present application, and the description of the above embodiments is only intended to help understand the method and core idea of the present application. Meanwhile, for those skilled in the art, there may be changes in the specific implementation and application scope according to the idea of the present application. In summary, the content of this specification should not be construed as limiting the present application.

Claims (10)

1. An image processing method, characterized by comprising:
acquiring an image to be processed and acquiring a historical reference image;
carrying out difference comparison on the image to be processed and the historical reference image to obtain a comparison result;
according to the comparison result, determining a difference region and a non-difference region from the image to be processed, and calculating the depth information of the difference region through a depth information calculation strategy;
determining depth information of a region corresponding to the non-difference region in the historical reference image as depth information of the non-difference region.
2. The image processing method according to claim 1, wherein the comparing the difference between the image to be processed and the historical reference image to obtain a comparison result comprises:
determining the same object in the image to be processed and the historical reference image;
determining a first area of the object in the image to be processed and a second area in the historical reference image;
carrying out difference comparison on the first region and the second region to obtain a comparison result;
determining a difference region and a non-difference region from the image to be processed according to the comparison result, wherein the determining comprises the following steps:
if the comparison result indicates that the first region and the second region differ in area and/or in position, determining the difference region and the non-difference region from the image to be processed according to the first region.
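The area-and/or-position check of claim 2 can be sketched like this; the axis-aligned `(x0, y0, x1, y1)` box representation of the object's regions is an illustrative assumption:

```python
def regions_differ(first_region, second_region):
    # Claim-2-style check (illustrative): the same object's regions in the
    # two frames are compared by area and by position; a difference in
    # either triggers the difference-region split.
    ax0, ay0, ax1, ay1 = first_region
    bx0, by0, bx1, by1 = second_region
    area_differs = (ax1 - ax0) * (ay1 - ay0) != (bx1 - bx0) * (by1 - by0)
    position_differs = (ax0, ay0) != (bx0, by0)
    return area_differs or position_differs
```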
3. The image processing method according to claim 2, wherein the determining, according to the first region, a difference region and a non-difference region from the image to be processed comprises:
determining, as the difference region, a region of the image to be processed within a preset range centered on the first region, the difference region containing the first region; and determining a region of the image to be processed other than the difference region as the non-difference region.
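Claim 3's window construction can be sketched as follows; the `margin` parameter standing in for the "preset range", and the clamping to the frame bounds, are illustrative choices not fixed by the claim:

```python
def difference_region(first_region, image_w, image_h, margin=16):
    # Claim-3-style sketch (illustrative): take a window of a preset extent
    # around the centre of the object's region in the new frame as the
    # difference region; it must contain the first region itself.
    # first_region = (x0, y0, x1, y1).
    x0, y0, x1, y1 = first_region
    cx, cy = (x0 + x1) // 2, (y0 + y1) // 2
    half_w = (x1 - x0) // 2 + margin   # preset range contains the first region
    half_h = (y1 - y0) // 2 + margin
    return (max(0, cx - half_w), max(0, cy - half_h),
            min(image_w, cx + half_w), min(image_h, cy + half_h))
```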
4. The image processing method according to claim 3, wherein before determining a region other than the difference region in the image to be processed as a non-difference region, the method further comprises:
selecting a region to be determined from the region of the image to be processed other than the difference region, and calculating depth information of the region to be determined through the depth information calculation strategy;
determining a region to be compared corresponding to the region to be determined from the historical reference image, and acquiring depth information of the region to be compared;
the determining a region of the image to be processed other than the difference region as the non-difference region comprises:
if the depth information of the region to be determined is the same as the depth information of the region to be compared, determining the region of the image to be processed other than the difference region as the non-difference region.
5. The image processing method according to claim 1, wherein the performing a difference comparison between the image to be processed and the historical reference image to obtain a comparison result comprises:
determining a first focusing area of the image to be processed and determining a second focusing area of the historical reference image;
performing difference comparison on the first focusing area and the second focusing area to obtain a comparison result;
and the determining a difference region and a non-difference region from the image to be processed according to the comparison result comprises:
if the comparison result indicates that the first focusing area differs from the second focusing area, determining the first focusing area as the difference region, and determining a region of the image to be processed other than the first focusing area as the non-difference region.
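Under claim 5, the new frame's focus area becomes the difference region outright. For rectangular focus areas, the "region other than the first focusing area" can be represented as up to four complementary bands; the rectangle representation and band decomposition here are illustrative assumptions:

```python
def split_by_focus(first_focus, second_focus, width, height):
    # Claim-5-style split (illustrative): if the focus areas of the two
    # frames differ, the new frame's focus area is the difference region;
    # the rest of the frame, represented as up to four complementary
    # rectangles, is the non-difference region. Boxes are (x0, y0, x1, y1).
    if first_focus == second_focus:
        return None, []                      # no difference under this criterion
    x0, y0, x1, y1 = first_focus
    rest = [(0, 0, width, y0),               # band above the focus area
            (0, y1, width, height),          # band below
            (0, y0, x0, y1),                 # band to the left
            (x1, y0, width, y1)]             # band to the right
    non_diff = [r for r in rest if r[0] < r[2] and r[1] < r[3]]
    return first_focus, non_diff
```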
6. The image processing method according to any one of claims 1 to 5, wherein before the acquiring of the historical reference image, the method further comprises:
determining the number of historical candidate images;
the acquiring of the historical reference image comprises:
if the number meets a preset condition, determining the historical reference image from the historical candidate images.
7. An image processing apparatus, comprising:
an image acquisition module, configured to acquire an image to be processed and acquire a historical reference image;
a difference comparison module, configured to perform a difference comparison between the image to be processed and the historical reference image to obtain a comparison result;
a region determination module, configured to determine a difference region and a non-difference region from the image to be processed according to the comparison result, and to calculate depth information of the difference region through a depth information calculation strategy; and
an information determination module, configured to determine depth information of a region corresponding to the non-difference region in the historical reference image as depth information of the non-difference region.
8. An electronic device, comprising:
a front-end image processor, configured to acquire an image to be processed and acquire a historical reference image; perform a difference comparison between the image to be processed and the historical reference image to obtain a comparison result; determine a difference region and a non-difference region from the image to be processed according to the comparison result; and calculate depth information of the difference region; and
an application processor, configured to determine depth information of a region corresponding to the non-difference region in the historical reference image as depth information of the non-difference region.
9. An electronic device, comprising a processor and a memory, wherein a computer program is stored in the memory, and the processor is configured to perform the image processing method of any one of claims 1 to 6 by calling the computer program stored in the memory.
10. A storage medium having stored therein a computer program which, when run on a computer, causes the computer to execute the image processing method according to any one of claims 1 to 6.
CN202210885516.3A 2022-07-26 2022-07-26 Image processing method, image processing device, storage medium and electronic equipment Pending CN115272718A (en)


Publications (1)

Publication Number Publication Date
CN115272718A true CN115272718A (en) 2022-11-01

Family
ID=83769584



Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination