CN115134532A - Image processing method, image processing device, storage medium and electronic equipment - Google Patents

Image processing method, image processing device, storage medium and electronic equipment

Info

Publication number
CN115134532A
Authority
CN
China
Prior art keywords
image
depth information
virtual
plane
shooting
Prior art date
Legal status
Pending
Application number
CN202210887708.8A
Other languages
Chinese (zh)
Inventor
朱文波
Current Assignee
Guangdong Oppo Mobile Telecommunications Corp Ltd
Original Assignee
Guangdong Oppo Mobile Telecommunications Corp Ltd
Priority date
Filing date
Publication date
Application filed by Guangdong Oppo Mobile Telecommunications Corp Ltd filed Critical Guangdong Oppo Mobile Telecommunications Corp Ltd
Priority to CN202210887708.8A
Publication of CN115134532A

Classifications

    • G06T5/94
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06TIMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T2207/00Indexing scheme for image analysis or image enhancement
    • G06T2207/10Image acquisition modality
    • G06T2207/10028Range image; Depth image; 3D point clouds

Abstract

The application discloses an image processing method, an image processing device, a storage medium and an electronic device. The method comprises the following steps: in response to the focusing instruction, taking a plane where a focusing area indicated by the focusing instruction is located as a virtual focal plane, and taking a plane where an area which is not located in the virtual focal plane in the shooting area is located as a virtual other plane; obtaining first depth information of a virtual focal plane and second depth information of other virtual planes according to historical depth information of a historical shooting image; shooting a shooting scene to obtain a first shooting image; and performing blurring processing on the first shot image according to the difference between the first depth information and the second depth information to obtain a first blurred image. The image blurring effect can be achieved.

Description

Image processing method, image processing device, storage medium and electronic equipment
Technical Field
The present application relates to the field of image processing technologies, and in particular, to an image processing method and apparatus, a storage medium, and an electronic device.
Background
At present, with the rapid development of electronic device technology, electronic devices such as smart phones and tablet computers have become increasingly integrated into people's lives, their photographing functions have become more and more powerful, and they have become the first choice for users to take photos. When taking a picture with an electronic device such as a smart phone or a tablet computer, in some scenes the imaging subject needs to be highlighted while the other objects in the frame are weakened, that is, a blurring effect of the image needs to be realized. Therefore, it is necessary to provide a scheme capable of realizing the blurring effect of an image.
Disclosure of Invention
The embodiment of the application provides an image processing method, an image processing device, a storage medium and an electronic device, which can realize the blurring effect of an image.
In a first aspect, an embodiment of the present application provides an image processing method, including:
in response to a focusing instruction, taking a plane where a focusing area indicated by the focusing instruction is located as a virtual focal plane, and taking a plane where an area which is not located on the virtual focal plane in a shooting area is located as a virtual other plane;
obtaining first depth information of the virtual focal plane and second depth information of the virtual other planes according to historical depth information of a historical shooting image;
shooting a shooting scene to obtain a first shooting image;
and performing blurring processing on the first shot image according to the difference between the first depth information and the second depth information to obtain a first blurred image.
In a second aspect, an embodiment of the present application provides an image processing apparatus, including:
the plane determining module is used for responding to a focusing instruction, taking a plane where a focusing area indicated by the focusing instruction is located as a virtual focal plane, and taking a plane where an area which is not located in the virtual focal plane in a shooting area is located as a virtual other plane;
the information determining module is used for obtaining first depth information of the virtual focal plane and second depth information of the virtual other planes according to historical depth information of historical shooting images;
the image shooting module is used for shooting a shooting scene to obtain a first shooting image;
and the blurring processing module is used for blurring the first shot image according to the difference between the first depth information and the second depth information to obtain a first blurring image.
In a third aspect, an embodiment of the present application provides an electronic device, including:
the front-end image processor is used for responding to a focusing instruction, taking a plane where a focusing area indicated by the focusing instruction is located as a virtual focal plane, and taking a plane where an area which is not located in the virtual focal plane in a shooting area is located as a virtual other plane; obtaining first depth information of the virtual focal plane and second depth information of the virtual other planes according to historical depth information of a historical shooting image;
the camera is used for shooting a shooting scene to obtain a first shooting image;
and the application processor is used for carrying out blurring processing on the first shot image according to the difference between the first depth information and the second depth information to obtain a first blurring image.
In a fourth aspect, an embodiment of the present application provides an electronic device, which includes a memory and a processor, where the processor is configured to execute an image processing method provided by an embodiment of the present application by calling a computer program stored in the memory.
In a fifth aspect, an embodiment of the present application provides a storage medium, on which a computer program is stored, which, when executed on a computer, causes the computer to execute an image processing method provided by an embodiment of the present application.
In the embodiment of the application, a plane where a focusing area indicated by a focusing instruction is located is taken as a virtual focal plane, and a plane where an area which is not located in the virtual focal plane is located in a shooting area is taken as a virtual other plane; then according to the depth information of the historical shooting image, obtaining first depth information of a virtual focal plane and second depth information of other virtual planes; and blurring the first shot image obtained by shooting the shot scene according to the difference between the first depth information and the second depth information to obtain a first blurred image, so that the blurring effect of the image can be realized.
Drawings
The technical solutions and advantages of the present application will be apparent from the following detailed description of specific embodiments of the present application with reference to the accompanying drawings.
Fig. 1 is a schematic flowchart of an image processing method according to an embodiment of the present application.
Fig. 2 is a schematic view of a first scene of an image processing method according to an embodiment of the present application.
Fig. 3 is a schematic view of a second scene of an image processing method according to an embodiment of the present application.
Fig. 4 is a schematic diagram of a third scenario of an image processing method according to an embodiment of the present application.
Fig. 5 is a schematic structural diagram of an image processing apparatus according to an embodiment of the present application.
Fig. 6 is a schematic structural diagram of a first electronic device according to an embodiment of the present application.
Fig. 7 is a schematic structural diagram of a second electronic device according to an embodiment of the present application.
Fig. 8 is a third structural schematic diagram of an electronic device according to an embodiment of the present application.
Detailed Description
It should be noted that the terms "first", "second", "third" and "fourth", etc. in this application are used to distinguish different objects, and are not used to describe a particular order. Furthermore, the terms "include" and "have," as well as any variations thereof, are intended to cover non-exclusive inclusions. For example, a process, method, system, article, or apparatus that comprises a list of steps or modules is not limited to only those steps or modules listed, but rather, some embodiments may include other steps or modules not listed or inherent to such process, method, article, or apparatus.
Reference herein to "an embodiment" means that a particular feature, structure, or characteristic described in connection with the embodiment can be included in at least one embodiment of the application. The appearances of the phrase in various places in the specification are not necessarily all referring to the same embodiment, nor are separate or alternative embodiments mutually exclusive of other embodiments. It is explicitly and implicitly understood by one skilled in the art that the embodiments described herein can be combined with other embodiments.
The embodiment of the application provides an image processing method, an image processing device, a storage medium and an electronic device, wherein an execution subject of the image processing method can be the image processing device provided by the embodiment of the application or the electronic device integrated with the image processing device, and the image processing device can be realized in a hardware or software mode. The electronic device may be a smart phone, a tablet computer, a palm computer, a notebook computer, or the like having image processing capability and configured with a processor.
Referring to fig. 1, fig. 1 is a schematic flow chart of an image processing method according to an embodiment of the present application, where the flow chart may include:
in 101, in response to a focusing instruction, a plane where a focusing area indicated by the focusing instruction is located is taken as a virtual focal plane, and a plane where an area not located in the virtual focal plane in the shooting area is located is taken as a virtual other plane.
For example, when the user wants the electronic device to focus on a certain area in the shooting area, the user may click on that area, so that the electronic device receives a focusing instruction indicating that the area is a focusing area. In response to the focusing instruction, the electronic device drives a camera of the electronic device to move so as to perform focusing processing. In this embodiment, in response to the focusing instruction, the electronic device further uses a plane where the focusing area indicated by the focusing instruction is located as a virtual focal plane, and uses a plane where an area in the shooting area that is not located in the virtual focal plane is located as a virtual other plane.
For another example, the electronic device may also automatically determine that a certain area of the shooting area is a focusing area through a corresponding policy, so that the electronic device receives a focusing instruction indicating that the certain area is the focusing area. In response to the focusing instruction, the electronic device drives a camera of the electronic device to move so as to perform focusing processing. In this embodiment, in response to the focusing instruction, the electronic device further uses a plane where the focusing area indicated by the focusing instruction is located as a virtual focal plane, and uses a plane where an area in the shooting area that is not located in the virtual focal plane is located as a virtual other plane.
The shooting area refers to an area presented by a viewfinder frame of the electronic equipment, and the focusing area refers to an area in the shooting area, which needs to be focused.
The real focal plane is the plane that passes through the in-focus point and is parallel to the imaging plane of the electronic device, while the other planes parallel to the imaging plane of the electronic device may be regarded as real other planes. Since the real focal plane, i.e. the actual focal plane, can only be determined after focusing is completed, and focusing usually takes a certain time, e.g. 3 frames to 5 frames, from start to completion, waiting for it would delay the blurring effect. In this embodiment, in order to achieve the image blurring effect earlier, after receiving the focusing instruction and before focusing is completed, the electronic device directly takes the plane where the focusing area indicated by the focusing instruction is located as the virtual focal plane, and takes the plane where the area in the shooting area that is not located in the virtual focal plane is located as the virtual other plane.
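As an illustration of this step, the following minimal Python sketch partitions the shooting area into a virtual focal plane region and virtual other plane regions as soon as the focusing instruction arrives. The per-pixel region label map `region_map` and the tap position `tap_xy` are hypothetical inputs assumed for the example; the patent does not prescribe any particular data layout.

```python
import numpy as np

def partition_virtual_planes(region_map, tap_xy):
    """Take the region containing the indicated focusing area as the virtual
    focal plane, and every other region as a virtual other plane, before
    focusing has completed. `region_map` is an HxW array of integer region
    labels (e.g. from the previous frame's segmentation); `tap_xy` is the
    (x, y) pixel indicated by the focusing instruction."""
    x, y = tap_xy
    focal_label = int(region_map[y, x])                 # region of the focusing area
    focal_mask = region_map == focal_label              # virtual focal plane
    other_masks = {int(label): region_map == label      # virtual other planes
                   for label in np.unique(region_map) if label != focal_label}
    return focal_mask, other_masks
```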
In 102, first depth information of a virtual focal plane and second depth information of a virtual other plane are obtained from historical depth information of a historical captured image.
It is understood that, in the present embodiment, the depth information of the historical captured image is already determined, i.e., the depth information of the historical captured image is known. Also, the image content of the historical captured image is the same as the image content presented in the shooting area. In some embodiments, the degree of similarity between the image content of the historical captured image and the image content presented by the shooting area is greater than or equal to a preset similarity. The preset similarity can be set by a user or determined by the electronic device based on a certain rule.
In this embodiment, the electronic device obtains first depth information of the virtual focal plane and second depth information of the virtual other planes according to the historical depth information of the historical captured image.
For example, before step 101, the electronic device further takes the plane where a history focusing region of the history captured image is located as a history real focal plane, divides the region of the history captured image not located in the history real focal plane into a plurality of history other regions according to the depth information of the history captured image, and takes the plane where each history other region is located as a history real other plane. Then, the depth information of the history focusing region is taken as the depth information of the history real focal plane, and the depth information of each history other region is taken as the depth information of the history real other plane where that region is located. Finally, the history captured image is blurred according to the depth information of the history real focal plane and the depth information of each history real other plane, so as to obtain a history blurred image.
After receiving the focusing instruction, the electronic device determines, from the history focusing region and the plurality of history other regions of the history captured image, the history region corresponding to the focusing area indicated by the focusing instruction, and takes the depth information of the history real plane where that history region is located as the depth information of the virtual focal plane where the focusing area is located. The electronic device then determines, from the area of the shooting area not located in the virtual focal plane, the area corresponding to each of the remaining history regions (the history focusing region and the history other regions except the matched history region), thereby obtaining a plurality of areas to be determined. The electronic device takes the depth information of the history real plane where the region corresponding to each area to be determined is located as the depth information of the virtual other plane where that area to be determined is located.
Wherein, the depth information of the image can be understood as: the distances, i.e., depth values, from the image capture device to the various points in the scene to which the image corresponds. After the electronic device starts a shooting application program (for example, a system application "camera" of the electronic device) according to a user operation, a scene aimed at by a camera of the electronic device is a shooting scene. For example, after a user clicks an icon of a "camera" application on the electronic device through a finger to start the "camera application", if the user uses a camera of the electronic device to aim at a certain scene, the scene is a shooting scene. From the above description, it will be understood by those skilled in the art that the shooting scene is not specific to a particular scene, but is a scene aligned in real time following the orientation of the camera.
For example, as shown in fig. 2, assuming that the history focused area of the history captured image M21 is the area a21, the plane in which the history focused area is located is the history focal plane (the other areas of the history captured image M21 having the same depth value as the history focused area are all located at the history focal plane); for the area not located in the history focal plane in the history captured image M21, the area may also be divided into a plurality of history other areas according to the depth values of the history other areas, and the depth value of each history other area is used as the depth value of the history real other plane where the area is located, where the depth values of the pixel points in each history other area are the same or the difference is negligible.
After receiving the focusing instruction, the electronic device determines, among the history focusing area a21 and the plurality of history other areas of the history captured image, the history area corresponding to the focusing area a24 indicated by the focusing instruction. Suppose the electronic device determines that this history area is the history other area a22; the electronic device then takes the depth value of the history real other plane where the history other area a22 is located as the depth value of the virtual focal plane where the focusing area a24 is located. Next, from the area of the shooting area that is not located at the virtual focal plane, the electronic device determines the area corresponding to each remaining history area (the history focusing area a21 and the history other areas other than a22), resulting in a plurality of areas to be determined, and takes the depth value of the history real plane where the corresponding history area is located as the depth value of the virtual other plane where each area to be determined is located. For example, if the electronic device determines that the to-be-determined area a23 corresponds to the history focusing area a21, it takes the depth value of the history real focal plane where the history focusing area a21 is located as the depth value of the virtual other plane where the to-be-determined area a23 is located.
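The region-to-region correspondence described above can be sketched as follows. The dictionary of historical plane depths and the region identifiers are illustrative assumptions; the patent only requires that known historical depths be reused for the virtual planes.

```python
def assign_virtual_plane_depths(history_region_depths, matched_history_region):
    """Reuse the known depths of the historical real planes as the depths of
    the virtual planes: the depth of the historical region matching the new
    focusing area becomes the first depth information (virtual focal plane),
    and the depths of the remaining historical regions become the second depth
    information (virtual other planes)."""
    first_depth = history_region_depths[matched_history_region]
    second_depths = {region: depth
                     for region, depth in history_region_depths.items()
                     if region != matched_history_region}
    return first_depth, second_depths

# Hypothetical values echoing fig. 2: the new focusing area a24 matches the
# history other area a22, so a22's depth becomes the virtual focal plane depth.
history_depths = {"a21": 50, "a22": 100, "a25": 30}   # depths in cm, made up for illustration
first, second = assign_virtual_plane_depths(history_depths, "a22")
```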
In 103, a shooting scene is shot, resulting in a first shot image.
In this embodiment, the electronic device photographs a shooting scene through the camera to obtain a first shot image. The image content of the first captured image is also the same as the image content of the history captured image. In some embodiments, the similarity of the image content of the first captured image and the history captured image is greater than or equal to a preset similarity. The preset similarity can be set by a user or determined by the electronic equipment based on a certain rule.
It should be noted that step 101 may be executed before step 103, step 103 may be executed before step 101, or step 101 and step 103 may be executed simultaneously.
At 104, a first blurred image is obtained by blurring the first captured image according to a difference between the first depth information and the second depth information.
In this embodiment, after obtaining the first depth information of the virtual focal plane and the second depth information of the virtual other plane, the electronic device performs blurring processing on the first captured image according to a difference between the first depth information and the second depth information, so as to obtain a first blurred image.
It can be understood that, in the blurring processing of an image, the imaging at the focal plane is the sharpest; theoretically, the farther a point is from the focal plane, the larger its circle of confusion and the more blurred its imaging, that is, the closer to the focal plane, the sharper the image. The ideal blurring effect is that, although the range outside the depth of field is blurred, the degree of blur varies: regions closer to the focal plane are rendered sharper, and regions farther from the focal plane are rendered more blurred. In this embodiment, the electronic device may set the sharpness of the region in the virtual focal plane to be the highest, determine, based on the difference between the first depth information and the second depth information, which virtual other planes fall within the depth of field and set the sharpness of their regions to be relatively high, determine the sharpness of the regions in the remaining virtual other planes according to the principle that the farther a plane is from the virtual focal plane, the more blurred it is, and then perform blurring processing on the first captured image based on the determined sharpness to obtain a blurred image.
In an optional embodiment, the electronic device may also simply set the sharpness of the region in the virtual focal plane to be the highest, set the sharpness of the region in each virtual other plane according to the principle that the farther the plane is from the virtual focal plane, the more blurred it is, and perform blurring processing on the first captured image based on the determined sharpness to obtain a blurred image.
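One possible realisation of "closer to the virtual focal plane means sharper" is sketched below: the absolute depth difference to the virtual focal plane is turned into a per-pixel blend between the sharp image and a strongly blurred copy. This is only an illustrative scheme assuming OpenCV and a per-pixel depth map; the patent does not mandate a specific blur kernel.

```python
import cv2
import numpy as np

def blur_by_depth_difference(image, depth_map, first_depth, max_kernel=31):
    """Blur the first captured image so that regions on the virtual focal plane
    stay sharp and regions whose depth differs more from `first_depth` become
    progressively more blurred."""
    diff = np.abs(depth_map.astype(np.float32) - float(first_depth))
    weight = np.clip(diff / (diff.max() + 1e-6), 0.0, 1.0)   # 0 = keep sharp, 1 = maximum blur
    blurred = cv2.GaussianBlur(image, (max_kernel, max_kernel), 0)
    if image.ndim == 3:                                      # broadcast the weight over colour channels
        weight = weight[..., None]
    out = (1.0 - weight) * image.astype(np.float32) + weight * blurred.astype(np.float32)
    return out.astype(image.dtype)
```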
In this embodiment, a plane where a focusing area indicated by the focusing instruction is located is taken as a virtual focal plane, and a plane where an area not located in the virtual focal plane in the shooting area is located is taken as a virtual other plane; then according to the depth information of the historical shooting image, obtaining first depth information of a virtual focal plane and second depth information of other virtual planes; and blurring a first shot image obtained by shooting the shooting scene according to the difference between the first depth information and the second depth information to obtain a first blurred image, so that the blurring effect of the image can be realized.
In the related art, when blurring an image, the position of the in-focus point is determined only after focusing is completed, and the acquired image can only then be blurred by using the plane that passes through the in-focus point and is parallel to the imaging plane of the electronic device as the focal plane. Several frames, such as 3 to 5 frames, are usually required from the start of focusing until focusing is completed, so the user can see the blurring effect only after waiting for several frames, and the user experience is poor. With the image processing method provided by this embodiment, the acquired image can be blurred in a short time, such as 1 frame, before focusing is completed, so that the user can see the blurring effect in advance and can further decide whether to choose the focusing point again based on that blurring effect, which improves the user experience.
In an optional embodiment, before obtaining the first depth information of the virtual focal plane and the second depth information of the virtual other plane according to the historical depth information of the historical captured images, the method further includes:
taking a plane between a history focal plane corresponding to the history shot image and the virtual focal plane as an intermediate virtual focal plane;
obtaining third depth information of the middle virtual focal plane and fourth depth information of other middle virtual planes according to the historical depth information;
shooting a shooting scene to obtain a second shooting image;
blurring the second shot image according to the difference between the third depth information and the fourth depth information to obtain a second blurred image;
the second blurred image is displayed.
It will be appreciated that the intermediate virtual focal plane is a plane between the history focal plane and the virtual focal plane. For example, assuming that the depth value of the history focal plane is 50 cm and the depth value of the virtual focal plane is 100 cm, any plane with a depth value between 50 cm and 100 cm can be used as the intermediate virtual focal plane.
It may also be understood that the specific implementation of "performing blurring processing on the second captured image according to the difference between the third depth information and the fourth depth information to obtain the second blurred image" may refer to the specific implementation of "performing blurring processing on the first captured image according to the difference between the first depth information and the second depth information to obtain the first blurred image", and details are not repeated here.
For example, assume that the depth value of the real focal plane of the history captured image is 50 centimeters, the depth values of the respective real other planes of the history captured image are 20 centimeters, 30 centimeters, 70 centimeters, and 100 centimeters, and the depth value of the virtual focal plane is 100 centimeters. The electronic device may take the real other plane having the depth value of 70 centimeters as the intermediate virtual focal plane, and take the real focal plane and the real other planes other than the intermediate virtual focal plane as the intermediate other planes; since the depth values of the real focal plane and the respective real other planes of the history captured image are known, the depth values of the intermediate virtual focal plane and the intermediate other planes can be obtained accordingly. The image content of the second captured image is also the same as that of the history captured image. In some embodiments, the similarity between the image content of the second captured image and that of the history captured image is greater than or equal to a preset similarity, which can be set by a user or determined by the electronic device based on a certain rule. The second captured image then has the same depth values as the history captured image. Based on this, the electronic device may determine the areas at the intermediate virtual focal plane and the intermediate other planes from the second captured image according to the depth values of the second captured image and the depth values of the intermediate virtual focal plane and the intermediate other planes, and blur the second captured image according to the difference between the depth value of the intermediate virtual focal plane and the depth values of the intermediate other planes, so as to obtain a second blurred image.
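A small sketch of choosing the intermediate virtual focal plane, following the worked example above (history focal plane at 50 cm, virtual focal plane at 100 cm, a known plane at 70 cm chosen in between). Preferring an already-known historical plane depth is an assumption of this sketch; the patent allows any plane whose depth lies between the two.

```python
def pick_intermediate_focal_depth(history_focal_depth, virtual_focal_depth, known_plane_depths):
    """Return the depth of an intermediate virtual focal plane lying between
    the history focal plane and the virtual focal plane, preferring a plane
    whose depth is already known from the history captured image."""
    low, high = sorted((history_focal_depth, virtual_focal_depth))
    candidates = [d for d in known_plane_depths if low < d < high]
    if candidates:
        # pick the known plane closest to the midpoint of the two focal depths
        return min(candidates, key=lambda d: abs(d - (low + high) / 2.0))
    return (low + high) / 2.0  # no known plane in between: fall back to the midpoint

print(pick_intermediate_focal_depth(50, 100, [20, 30, 70, 100]))  # -> 70
```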
In an optional embodiment, after performing a blurring process on the first captured image according to a difference between the first depth information and the second depth information to obtain a first blurred image, the method further includes:
the second blurred image is dismissed from being displayed, and the first blurred image is displayed.
For example, as shown in fig. 3, after obtaining the second blurred image, the electronic device may display the second blurred image, and after obtaining the first blurred image, the electronic device may cancel displaying the second blurred image and display the first blurred image, so as to achieve the transitional blurring effect.
In some embodiments, after obtaining the history blurred image of the history photographed image, the electronic device may display the history blurred image, after obtaining the second blurred image, the electronic device may display the second blurred image, and after obtaining the first blurred image, the electronic device may cancel displaying the second blurred image and display the first blurred image, thereby achieving a blurring effect of a transition of the history blurred image to the first blurred image.
In an optional embodiment, the image processing method further comprises:
responding to a focusing instruction, and carrying out focusing processing according to a focusing area indicated by the focusing instruction;
after the display of the second blurred image is cancelled and the first blurred image is displayed, the method further includes:
shooting a shooting scene after focusing is finished to obtain a third shooting image;
calculating the depth information of the third shot image through a depth information calculation strategy;
according to the depth information of the third shot image, fifth depth information of a real focal plane corresponding to the focusing area and sixth depth information of other real planes are obtained;
and blurring the third shot image according to the difference between the fifth depth information and the sixth depth information to obtain a third blurred image.
In response to the focusing instruction, the electronic equipment drives a camera of the electronic equipment to move according to the focusing area indicated by the focusing instruction so as to perform focusing processing on the focusing area. And after focusing is finished, the electronic equipment can shoot a shooting scene through the camera to obtain a third shot image. And calculating the depth information of the third shot image through a depth information calculation strategy.
For example, the depth information calculation strategy includes a depth estimation model, and after the third captured image is obtained, the electronic device may input the third captured image into a depth estimation model trained in advance to obtain the depth information of the third captured image.
For another example, the depth information calculation policy includes a binocular stereo vision policy: the electronic device acquires the third captured image of the shooting scene through one camera and acquires another captured image of the same scene through a second camera spaced a certain distance from the first. The electronic device then finds corresponding pixel points in the third captured image and the other captured image through a stereo matching algorithm to obtain parallax (disparity) information, and converts the parallax information into depth information of the objects in the scene according to the triangulation principle, so that the depth information of the third captured image can be obtained.
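The binocular stereo vision strategy can be sketched with OpenCV as follows: a block-matching stereo matcher finds corresponding pixels, and the resulting disparity is converted to depth by triangulation (depth = focal length × baseline / disparity). The matcher settings, focal length and baseline are illustrative assumptions, and the two input images are assumed to be rectified 8-bit grayscale frames.

```python
import cv2
import numpy as np

def stereo_depth(left_gray, right_gray, focal_px, baseline_m):
    """Estimate per-pixel depth from a rectified stereo pair: stereo matching
    yields disparity, triangulation converts disparity to depth."""
    matcher = cv2.StereoBM_create(numDisparities=64, blockSize=15)
    disparity = matcher.compute(left_gray, right_gray).astype(np.float32) / 16.0
    disparity[disparity <= 0] = np.nan           # unmatched or invalid pixels
    return focal_px * baseline_m / disparity     # depth in metres

# Usage with hypothetical parameters:
# depth = stereo_depth(left, right, focal_px=700.0, baseline_m=0.02)
```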
It should be noted that the depth information calculation policy may also include other policies that can calculate the depth information of the image, and those skilled in the art may adopt a corresponding depth information calculation policy to calculate the depth information of the third captured image according to the actual situation, which is not limited in this respect.
In an optional embodiment, if the third captured image has the same image content as the history captured image, or the similarity is greater than or equal to the preset similarity, the electronic device may also use the depth information of the history captured image as the depth information of the third captured image. The preset similarity may be set by a user, or may be determined by the electronic device based on a certain rule.
After the depth information of the third shot image is obtained, the electronic equipment determines an area corresponding to the focusing area from the third shot image, and takes the depth information of the area as fifth depth information of a real focal plane. For other depth information of the third captured image, the electronic device may divide the region not located in the real focal plane in the third captured image into a plurality of regions according to the other depth information of the third captured image, use the plane where each region is located as a real other plane, and determine sixth depth information of each real other plane. And then, the electronic equipment performs blurring processing on the third shot image according to the difference between the fifth depth information and the sixth depth information to obtain a third blurred image.
For a specific implementation, "performing the blurring process on the third captured image according to the difference between the fifth depth information and the sixth depth information to obtain the third blurred image" may refer to a specific implementation "performing the blurring process on the first captured image according to the difference between the first depth information and the second depth information to obtain the first blurred image", and details thereof are not repeated here.
In an optional embodiment, after performing a blurring process on the third captured image according to a difference between the fifth depth information and the sixth depth information to obtain a third blurred image, the method further includes:
the first blurred image is undisplayed and the third blurred image is displayed.
For example, as shown in fig. 4, after obtaining the second blurred image, the electronic device may display the second blurred image, after obtaining the first blurred image, the electronic device may cancel displaying the second blurred image and display the first blurred image, and after obtaining the third blurred image, the electronic device may cancel displaying the first blurred image and display the third blurred image, thereby implementing a blurring effect of a transition from the second blurred image to the third blurred image, and enabling a user to preview a blurring effect of a transition from the second blurred image to the third blurred image.
In some embodiments, after obtaining the history blurred image of the history captured image, the electronic device may display the history blurred image; after obtaining the second blurred image, it may cancel displaying the history blurred image and display the second blurred image; after obtaining the first blurred image, it may cancel displaying the second blurred image and display the first blurred image; and after obtaining the third blurred image, it may cancel displaying the first blurred image and display the third blurred image. In this way, the blurring effect of a transition from the history blurred image to the third blurred image is achieved and can be previewed by the user. When a photographing instruction is received, at least one of the history blurred image, the second blurred image, the first blurred image and the third blurred image may be used as the photographing result in response to the photographing instruction, for example as the photograph obtained in response to the photographing instruction. When a recording instruction is received, the history blurred image, the second blurred image, the first blurred image and the third blurred image may be used as the recording result in response to the recording instruction, for example as consecutive frames in the video obtained in response to the recording instruction.
In an optional embodiment, the step of blurring the first captured image according to a difference between the first depth information and the second depth information to obtain a first blurred image includes:
determining the definition of each virtual other plane according to the difference between the first depth information and the second depth information of each virtual other plane;
and performing blurring processing on the first shot image according to the definition of each virtual other plane to obtain a first blurred image.
Wherein, the closer a virtual other plane is to the virtual focal plane, the higher its sharpness, and the farther a virtual other plane is from the virtual focal plane, the lower its sharpness.
For example, assume that the depth value of the virtual focal plane is 100 cm, and the depth values of the virtual other planes P1, P2, P3, P4, and P5 are 30 cm, 50 cm, 80 cm, 120 cm, and 150 cm, respectively. The differences between the depth values of the virtual other planes P1, P2, P3, P4, and P5 and that of the virtual focal plane are then 70 cm, 50 cm, 20 cm, 20 cm, and 50 cm in sequence. The sharpness of the virtual focal plane may be set to a first sharpness, the sharpness of the virtual other planes P3 and P4 to a second sharpness, the sharpness of the virtual other planes P2 and P5 to a third sharpness, and the sharpness of the virtual other plane P1 to a fourth sharpness, where the first sharpness is greater than the second sharpness, the second sharpness is greater than the third sharpness, and the third sharpness is greater than the fourth sharpness. The electronic device may then perform blurring processing on the first captured image according to the sharpness corresponding to the virtual focal plane and to each of the virtual other planes P1, P2, P3, P4, and P5, so as to obtain a first blurred image.
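The discrete sharpness levels in this example can be expressed as a simple mapping from the depth difference to a sharpness tier; the thresholds below are chosen only to reproduce the P1..P5 example and are otherwise arbitrary.

```python
def sharpness_tier(depth_diff_cm):
    """Map the depth difference between a virtual other plane and the virtual
    focal plane to a sharpness level (1 = sharpest, 4 = most blurred)."""
    if depth_diff_cm == 0:
        return 1          # first sharpness: the virtual focal plane itself
    if depth_diff_cm <= 20:
        return 2          # second sharpness: P3 and P4 (20 cm)
    if depth_diff_cm <= 50:
        return 3          # third sharpness: P2 and P5 (50 cm)
    return 4              # fourth sharpness: P1 (70 cm)

print([sharpness_tier(d) for d in (70, 50, 20, 20, 50)])  # -> [4, 3, 2, 2, 3]
```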
In an optional embodiment, after obtaining the third blurred image, the electronic device may further perform subsequent image processing on the third blurred image, such as anti-shake or beautification processing.
In an optional embodiment, the image processing method further comprises:
acquiring a current exposure parameter and a historical exposure parameter;
determining an intermediate exposure parameter according to the current exposure parameter and the historical exposure parameter;
shooting a shooting scene according to the intermediate exposure parameters to obtain a fourth shooting image;
and shooting the shooting scene according to the current exposure parameters to obtain a fifth shooting image.
The exposure parameters include exposure duration, aperture parameters, shutter speed and other parameters.
It can be understood that, in this embodiment, when the exposure parameter changes, the image may be gradually adjusted, so as to prevent sudden change of the aperture parameter, the exposure duration, or the shutter speed from causing sudden change of the image, which affects the user experience.
For example, assuming that the current exposure time is 20 milliseconds and the historical exposure time is 10 milliseconds, the electronic device may sequentially photograph the shooting scene according to the intermediate exposure time of 11 milliseconds, 12 milliseconds, 13 milliseconds, 14 milliseconds, 15 milliseconds, 16 milliseconds, 17 milliseconds, 18 milliseconds and 19 milliseconds, sequentially obtain a plurality of fourth shooting images, then photograph the shooting scene according to the current exposure time of 20 milliseconds, obtain a fifth shooting image, and sequentially display the obtained plurality of fourth shooting images and the fifth shooting image according to the shooting time from early to late, so as to prevent sudden changes in the exposure time from causing sudden changes in the images and affecting the user experience.
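The gradual exposure transition in this example can be sketched as below; the fixed 1 ms step is an assumption taken from the 10 ms to 20 ms example and is not prescribed by the patent.

```python
def exposure_ramp(history_ms, current_ms, step_ms=1):
    """Return the intermediate exposure durations used to move gradually from
    the historical exposure to the current one, followed by the current value
    (e.g. 10 ms -> [11, 12, ..., 19] -> 20 ms)."""
    step = step_ms if current_ms >= history_ms else -step_ms
    intermediates = list(range(history_ms + step, current_ms, step))
    return intermediates, current_ms

intermediate_exposures, final_exposure = exposure_ramp(10, 20)
print(intermediate_exposures, final_exposure)  # -> [11, 12, ..., 19] 20
```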
Referring to fig. 5, fig. 5 is a schematic structural diagram of an image processing apparatus according to an embodiment of the present disclosure. The image processing apparatus 200 includes: a plane determination module 201, an information determination module 202, an image capturing module 203, and a blurring processing module 204.
And the plane determining module 201 is configured to, in response to the focusing instruction, take a plane where the focusing area indicated by the focusing instruction is located as a virtual focal plane, and take a plane where an area not located in the virtual focal plane in the shooting area is located as a virtual other plane.
The information determining module 202 is configured to obtain first depth information of a virtual focal plane and second depth information of virtual other planes according to historical depth information of historical captured images.
And the image shooting module 203 is used for shooting a shooting scene to obtain a first shooting image.
The blurring processing module 204 is configured to perform blurring processing on the first captured image according to a difference between the first depth information and the second depth information, so as to obtain a first blurred image.
In an optional embodiment, the image processing apparatus 200 may further include an image display module, and the plane determination module 201 may be configured to: taking a plane between a history focal plane corresponding to the history shot image and the virtual focal plane as an intermediate virtual focal plane;
an information determination module 202, which may be configured to: obtaining third depth information of the middle virtual focal plane and fourth depth information of other middle virtual planes according to the historical depth information;
an image capture module 203 operable to: shooting a shooting scene to obtain a second shooting image;
a blurring processing module 204, configured to: blurring the second shot image according to the difference between the third depth information and the fourth depth information to obtain a second blurred image;
and the image display module can be used for displaying the second blurring image.
In an alternative embodiment, the image display module may be configured to: the second blurred image is dismissed from being displayed, and the first blurred image is displayed.
In an optional embodiment, the image processing apparatus 200 may further include a focusing processing module, and the focusing processing module may be configured to: responding to a focusing instruction, and carrying out focusing processing according to a focusing area indicated by the focusing instruction;
an image capture module 203 operable to: shooting a shooting scene after focusing is finished to obtain a third shooting image;
an information determination module 202, which may be configured to: calculating the depth information of the third shot image through a depth information calculation strategy; according to the depth information of the third shot image, fifth depth information of a real focal plane corresponding to the focusing area and sixth depth information of other real planes are obtained;
a blurring processing module 204, configured to: and blurring the third shot image according to the difference between the fifth depth information and the sixth depth information to obtain a third blurred image.
In an alternative embodiment, the image display module may be configured to: the first blurred image is undisplayed and the third blurred image is displayed.
In an alternative embodiment, the blurring processing module 204 may be configured to: determining the definition of each virtual other plane according to the difference between the first depth information and the second depth information of each virtual other plane; and performing blurring processing on the first shot image according to the definition of each virtual other plane to obtain a first blurred image.
The embodiment of the present application provides a computer-readable storage medium, on which a computer program is stored, which, when executed on a computer, causes the computer to execute the image processing method provided by the embodiment.
The embodiment of the present application further provides an electronic device, which includes a memory and a processor, where the processor is used to execute the image processing method provided in this embodiment by calling a computer program stored in the memory.
For example, the electronic device may be a mobile terminal such as a tablet computer or a smart phone. Referring to fig. 6, fig. 6 is a schematic structural diagram of an electronic device according to an embodiment of the present disclosure.
The electronic device 300 may include a processor 301, a memory 302, and the like. Those skilled in the art will appreciate that the electronic device configuration shown in fig. 6 does not constitute a limitation of the electronic device and may include more or fewer components than those shown, or some components may be combined, or a different arrangement of components.
The processor 301, i.e., a central processing unit, is a control center of the electronic device, and is connected to various parts of the whole electronic device by using various interfaces and lines, and executes various functions of the electronic device and processes data by running or executing application programs stored in the memory 302 and calling data stored in the memory 302, thereby performing overall monitoring of the electronic device. In an alternative embodiment, the processor 301 may also be a processor with computing power, such as an application processor, a neural network processor, or the like.
The memory 302 may be used to store applications and data. The memory 302 stores applications containing executable code. The application programs may constitute various functional modules. The processor 301 executes various functional applications and data processing by running an application program stored in the memory 302.
In this embodiment, the processor 301 in the electronic device loads the executable code corresponding to the processes of one or more application programs into the memory 302 according to the following instructions, and the processor 301 runs the application programs stored in the memory 302, thereby implementing the following processes:
in response to the focusing instruction, taking a plane where a focusing area indicated by the focusing instruction is located as a virtual focal plane, and taking a plane where an area which is not located in the virtual focal plane in the shooting area is located as a virtual other plane;
obtaining first depth information of a virtual focal plane and second depth information of other virtual planes according to historical depth information of a historical shooting image;
shooting a shooting scene to obtain a first shooting image;
and performing blurring processing on the first shot image according to the difference between the first depth information and the second depth information to obtain a first blurred image.
Referring to fig. 7, the electronic device 300 may include a processor 301, a memory 302, an input unit 303, an output unit 304, and the like.
The processor 301, i.e., a central processing unit, is a control center of the electronic device, and is connected to various parts of the whole electronic device through various interfaces and lines, and executes various functions of the electronic device and processes data by running or executing application programs stored in the memory 302 and calling data stored in the memory 302, thereby performing overall monitoring of the electronic device. In an alternative embodiment, the processor 301 may also be a processor with computing power, such as an application processor, a neural network processor, or the like.
The memory 302 may be used to store applications and data. The memory 302 stores applications containing executable code. The application programs may constitute various functional modules. The processor 301 executes various functional applications and data processing by running an application program stored in the memory 302.
The input unit 303 may be used to receive input numbers, character information, or user characteristic information, such as a fingerprint, and generate a keyboard, mouse, joystick, optical, or trackball signal input related to user setting and function control.
The output unit 304 may be used to display information input by or provided to a user and various graphical user interfaces of the electronic device, which may be made up of graphics, text, icons, video, and any combination thereof. The output unit may include a display screen, and the display screen may include a display area.
In this embodiment, the processor 301 in the electronic device loads the executable code corresponding to the processes of one or more application programs into the memory 302 according to the following instructions, and the processor 301 runs the application programs stored in the memory 302, thereby implementing the following processes:
in response to the focusing instruction, taking a plane where a focusing area indicated by the focusing instruction is located as a virtual focal plane, and taking a plane where an area which is not located in the virtual focal plane in the shooting area is located as a virtual other plane;
obtaining first depth information of a virtual focal plane and second depth information of other virtual planes according to historical depth information of a historical shooting image;
shooting a shooting scene to obtain a first shooting image;
and performing blurring processing on the first shot image according to the difference between the first depth information and the second depth information to obtain a first blurred image.
In an alternative embodiment, before the processor 301 performs obtaining the first depth information of the virtual focal plane and the second depth information of the virtual other plane according to the historical depth information of the historical captured images, it may further perform: taking a plane between a history focal plane corresponding to the history shot image and the virtual focal plane as an intermediate virtual focal plane; according to the historical depth information, obtaining third depth information of the middle virtual focal plane and fourth depth information of other middle virtual planes; shooting a shooting scene to obtain a second shooting image; blurring the second shot image according to the difference between the third depth information and the fourth depth information to obtain a second blurred image; the second blurred image is displayed.
In an optional embodiment, after the processor 301 performs blurring processing on the first captured image according to a difference between the first depth information and the second depth information to obtain a first blurred image, the processor may further perform: cancelling the display of the second blurred image and displaying the first blurred image.
In an alternative embodiment, the processor 301 may further perform: responding to a focusing instruction, and carrying out focusing processing according to a focusing area indicated by the focusing instruction; after the processor 301 cancels the displaying of the second blurred image and displays the first blurred image, it may further perform: shooting a shooting scene after focusing is finished to obtain a third shooting image; calculating the depth information of the third shot image through a depth information calculation strategy; according to the depth information of the third shot image, fifth depth information of a real focal plane corresponding to the focusing area and sixth depth information of other real planes are obtained; and blurring the third shot image according to the difference between the fifth depth information and the sixth depth information to obtain a third blurred image.
In an optional embodiment, after the processor 301 performs blurring processing on the third captured image according to a difference between the fifth depth information and the sixth depth information to obtain a third blurred image, the following may be further performed: cancelling the display of the first blurred image and displaying the third blurred image.
In an optional embodiment, when the processor 301 performs blurring processing on the first captured image according to a difference between the first depth information and the second depth information to obtain a first blurred image, the processor may perform: determining the definition of each virtual other plane according to the difference between the first depth information and the second depth information of each virtual other plane; and performing blurring processing on the first shot image according to the definition of each virtual other plane to obtain a first blurred image.
Referring to fig. 8, the electronic device 300 includes a front-end image processor 305, a camera 306, an application processor 307, and a display screen 308.
the front-end image processor 305 is configured to, in response to the focusing instruction, take a plane where a focusing area indicated by the focusing instruction is located as a virtual focal plane, and take a plane where an area in the shooting area that is not located in the virtual focal plane is located as a virtual other plane; according to the historical depth information of the historical shooting image, first depth information of a virtual focal plane and second depth information of other virtual planes are obtained.
And the camera 306 is configured to shoot a shooting scene to obtain a first shot image.
The application processor 307 is configured to perform blurring processing on the first captured image according to a difference between the first depth information and the second depth information, so as to obtain a first blurred image.
In an alternative embodiment, the front-end image processor 305 may be configured to: taking a plane between a history focal plane corresponding to the history shot image and the virtual focal plane as an intermediate virtual focal plane; obtaining third depth information of the middle virtual focal plane and fourth depth information of other middle virtual planes according to the historical depth information;
a camera 306, which may be used to: shooting a shooting scene to obtain a second shooting image;
the front-end image processor 305, operable to: blurring the second shot image according to the difference between the third depth information and the fourth depth information to obtain a second blurred image;
a display screen 308, operable to: the second blurred image is displayed.
In an alternative embodiment, the display screen 308 may be used to: the second blurred image is dismissed from being displayed, and the first blurred image is displayed.
In an alternative embodiment, the application processor 307 may be configured to: responding to a focusing instruction, and carrying out focusing processing according to a focusing area indicated by the focusing instruction;
a camera 306, which may be used to: shooting a shooting scene after focusing is finished to obtain a third shooting image;
the front-end image processor 305, operable to: calculating the depth information of the third shot image through a depth information calculation strategy; and obtaining, according to the depth information of the third shot image, fifth depth information of a real focal plane corresponding to the focusing area and sixth depth information of other real planes;
the application processor 307, which may be configured to: blur the third shot image according to the difference between the fifth depth information and the sixth depth information to obtain a third blurred image.
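A compact sketch of this post-focus path is given below. Both compute_depth (standing in for the depth information calculation strategy) and blur_fn (a blurring helper such as blur_by_plane() above) are placeholders, and reusing plan_planes_from_history() on the measured depth map is purely illustrative; here the depth is real rather than historical.

```python
def blur_after_focus(third_image, focus_mask, compute_depth, blur_fn):
    """After focusing completes, measured depth replaces the historical
    estimate: the fifth depth information comes from the focus region and the
    sixth from the rest of the frame, then the third image is blurred."""
    depth = compute_depth(third_image)  # depth information of the third shot image
    # Same grouping as before, but applied to real (not historical) depth.
    fifth_depth, plane_masks, plane_depths = plan_planes_from_history(depth, focus_mask)
    return blur_fn(third_image, plane_masks, plane_depths, fifth_depth)
```

The resulting third blurred image can then replace the first blurred image on the display screen 308, as described in the following embodiment.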
In an alternative embodiment, the display screen 308 may be used to: the first blurred image is undisplayed and the third blurred image is displayed.
In an alternative embodiment, the application processor 307 may be configured to: determine the definition of each virtual other plane according to the difference between the first depth information and the second depth information of each virtual other plane; and perform blurring processing on the first shot image according to the definition of each virtual other plane to obtain the first blurred image.
In an alternative embodiment, the application processor 307 may be configured to: perform subsequent image processing, such as beautification or anti-shake processing, by using the third blurred image.
In the above embodiments, the descriptions of the respective embodiments have their own emphasis; for parts that are not described in detail in a certain embodiment, reference may be made to the detailed description of the image processing method above, and details are not repeated here.
The image processing apparatus provided in the embodiments of the present application belongs to the same concept as the image processing method in the above embodiments, and any method provided in the embodiments of the image processing method may be run on the image processing apparatus. The specific implementation process is described in detail in the embodiments of the image processing method and is not repeated here.
It should be noted that, as can be understood by those skilled in the art, all or part of the processes for implementing the image processing method in the embodiments of the present application may be completed by a computer program controlling the related hardware. The computer program may be stored in a computer-readable storage medium, such as a memory, and executed by at least one processor, and the execution process may include the processes of the embodiments of the image processing method. The storage medium may be a magnetic disk, an optical disk, a read-only memory (ROM), a random access memory (RAM), or the like.
In the image processing apparatus according to the embodiments of the present application, the functional modules may be integrated into one processing chip, each module may exist alone physically, or two or more modules may be integrated into one module. The integrated module may be implemented in the form of hardware or in the form of a software functional module. If the integrated module is implemented in the form of a software functional module and sold or used as an independent product, it may also be stored in a computer-readable storage medium, such as a read-only memory, a magnetic disk, or an optical disk.
The image processing method, the image processing apparatus, the storage medium, and the electronic device provided in the embodiments of the present application have been described in detail above. Specific examples are used herein to explain the principles and implementations of the present application, and the description of the above embodiments is only intended to help understand the method and its core idea. Meanwhile, those skilled in the art may make changes to the specific implementations and the application scope according to the idea of the present application. In summary, the content of this specification should not be construed as limiting the present application.

Claims (10)

1. An image processing method, comprising:
in response to a focusing instruction, taking a plane where a focusing area indicated by the focusing instruction is located as a virtual focal plane, and taking a plane where an area which is not located in the virtual focal plane in a shooting area is located as a virtual other plane;
obtaining first depth information of the virtual focal plane and second depth information of the virtual other planes according to historical depth information of a historical shooting image;
shooting a shooting scene to obtain a first shot image;
and performing blurring processing on the first shot image according to the difference between the first depth information and the second depth information to obtain a first blurred image.
2. The image processing method according to claim 1, wherein before obtaining the first depth information of the virtual focal plane and the second depth information of the virtual other plane from the historical depth information of the historical captured images, the method further comprises:
taking a plane between a history focal plane corresponding to the history shot image and the virtual focal plane as an intermediate virtual focal plane;
obtaining third depth information of the intermediate virtual focal plane and fourth depth information of intermediate virtual other planes according to the historical depth information;
shooting a shooting scene to obtain a second shot image;
blurring the second shot image according to the difference between the third depth information and the fourth depth information to obtain a second blurred image;
displaying the second blurred image.
3. The image processing method according to claim 2, wherein after blurring the first shot image according to the difference between the first depth information and the second depth information to obtain a first blurred image, the method further comprises:
canceling the display of the second blurred image, and displaying the first blurred image.
4. The image processing method according to claim 3, characterized in that the method further comprises:
responding to a focusing instruction, and carrying out focusing processing according to a focusing area indicated by the focusing instruction;
after the canceling the display of the second blurred image and displaying the first blurred image, the method further comprises:
shooting a shooting scene after focusing is completed to obtain a third shot image;
calculating the depth information of the third shot image through a depth information calculation strategy;
obtaining, according to the depth information of the third shot image, fifth depth information of a real focal plane corresponding to the focusing area and sixth depth information of other real planes;
and performing blurring processing on the third shot image according to the difference between the fifth depth information and the sixth depth information to obtain a third blurred image.
5. The image processing method according to claim 4, wherein after blurring the third shot image according to the difference between the fifth depth information and the sixth depth information to obtain a third blurred image, the method further comprises:
canceling the display of the first blurred image, and displaying the third blurred image.
6. The image processing method according to any one of claims 1 to 5, wherein a plurality of the virtual other planes are provided, and the performing blurring processing on the first shot image according to the difference between the first depth information and the second depth information to obtain a first blurred image comprises:
determining the definition of each virtual other plane according to the difference between the first depth information and the second depth information of each virtual other plane;
and performing blurring processing on the first shot image according to the definition of each virtual other plane to obtain the first blurred image.
7. An image processing apparatus characterized by comprising:
the plane determining module is used for responding to a focusing instruction, taking a plane where a focusing area indicated by the focusing instruction is located as a virtual focal plane, and taking a plane where an area which is not located in the virtual focal plane in a shooting area is located as a virtual other plane;
the information determining module is used for obtaining first depth information of the virtual focal plane and second depth information of the virtual other planes according to historical depth information of historical shooting images;
the image shooting module is used for shooting a shooting scene to obtain a first shooting image;
and the blurring processing module is used for performing blurring processing on the first shot image according to the difference between the first depth information and the second depth information to obtain a first blurred image.
8. An electronic device, comprising:
the front-end image processor is used for responding to a focusing instruction, taking a plane where a focusing area indicated by the focusing instruction is located as a virtual focal plane, and taking a plane where an area which is not located in the virtual focal plane in a shooting area is located as a virtual other plane; obtaining first depth information of the virtual focal plane and second depth information of the virtual other planes according to historical depth information of a historical shooting image;
the camera is used for shooting a shooting scene to obtain a first shooting image;
and the application processor is used for carrying out blurring processing on the first shot image according to the difference between the first depth information and the second depth information to obtain a first blurred image.
9. An electronic device, characterized in that the electronic device comprises a processor and a memory, wherein the memory stores a computer program, and the processor is used for executing the image processing method according to any one of claims 1 to 6 by calling the computer program stored in the memory.
10. A storage medium having stored therein a computer program which, when run on a computer, causes the computer to execute the image processing method according to any one of claims 1 to 6.
CN202210887708.8A 2022-07-26 2022-07-26 Image processing method, image processing device, storage medium and electronic equipment Pending CN115134532A (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN202210887708.8A CN115134532A (en) 2022-07-26 2022-07-26 Image processing method, image processing device, storage medium and electronic equipment

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
CN202210887708.8A CN115134532A (en) 2022-07-26 2022-07-26 Image processing method, image processing device, storage medium and electronic equipment

Publications (1)

Publication Number Publication Date
CN115134532A true CN115134532A (en) 2022-09-30

Family

ID=83385876

Family Applications (1)

Application Number Title Priority Date Filing Date
CN202210887708.8A Pending CN115134532A (en) 2022-07-26 2022-07-26 Image processing method, image processing device, storage medium and electronic equipment

Country Status (1)

Country Link
CN (1) CN115134532A (en)

Patent Citations (10)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
WO2015070737A1 (en) * 2013-11-15 2015-05-21 深圳市中兴移动通信有限公司 Shooting method and shooting device
CN104333700A (en) * 2014-11-28 2015-02-04 广东欧珀移动通信有限公司 Image blurring method and image blurring device
CN105100615A (en) * 2015-07-24 2015-11-25 青岛海信移动通信技术股份有限公司 Image preview method, apparatus and terminal
WO2017206589A1 (en) * 2016-06-02 2017-12-07 广东欧珀移动通信有限公司 Blurred photo generation method and apparatus, and mobile terminal
CN107426493A (en) * 2017-05-23 2017-12-01 深圳市金立通信设备有限公司 A kind of image pickup method and terminal for blurring background
CN107945105A (en) * 2017-11-30 2018-04-20 广东欧珀移动通信有限公司 Background blurring processing method, device and equipment
JP2020102687A (en) * 2018-12-20 2020-07-02 キヤノン株式会社 Information processing apparatus, image processing apparatus, image processing method, and program
CN110324532A (en) * 2019-07-05 2019-10-11 Oppo广东移动通信有限公司 A kind of image weakening method, device, storage medium and electronic equipment
CN111726526A (en) * 2020-06-22 2020-09-29 Oppo广东移动通信有限公司 Image processing method and device, electronic equipment and storage medium
CN112311965A (en) * 2020-10-22 2021-02-02 北京虚拟动点科技有限公司 Virtual shooting method, device, system and storage medium

Cited By (2)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN115866399A (en) * 2023-02-28 2023-03-28 广东欧谱曼迪科技有限公司 Automatic focusing method and device for 3D endoscope, electronic equipment and storage medium
CN115866399B (en) * 2023-02-28 2023-05-16 广东欧谱曼迪科技有限公司 3D endoscope automatic focusing method and device, electronic equipment and storage medium

Similar Documents

Publication Publication Date Title
TWI706379B (en) Method, apparatus and electronic device for image processing and storage medium thereof
CN110677621B (en) Camera calling method and device, storage medium and electronic equipment
CN112822412B (en) Exposure method, exposure device, electronic equipment and storage medium
CN109923850B (en) Image capturing device and method
CN110290299B (en) Imaging method, imaging device, storage medium and electronic equipment
CN112637500B (en) Image processing method and device
CN112887617B (en) Shooting method and device and electronic equipment
CN106331497A (en) Image processing method and terminal
CN110636276A (en) Video shooting method and device, storage medium and electronic equipment
CN115209057B (en) Shooting focusing method and related electronic equipment
CN111770277A (en) Auxiliary shooting method, terminal and storage medium
CN115134532A (en) Image processing method, image processing device, storage medium and electronic equipment
CN113747067A (en) Photographing method and device, electronic equipment and storage medium
WO2018219274A1 (en) Method and apparatus for denoising processing, storage medium and terminal
CN114071010A (en) Shooting method and equipment
CN112653841B (en) Shooting method and device and electronic equipment
CN112672058B (en) Shooting method and device
CN112383708B (en) Shooting method and device, electronic equipment and readable storage medium
CN112887624B (en) Shooting method and device and electronic equipment
CN111654623B (en) Photographing method and device and electronic equipment
CN114241127A (en) Panoramic image generation method and device, electronic equipment and medium
CN114245018A (en) Image shooting method and device
CN113873147A (en) Video recording method and device and electronic equipment
CN113873160B (en) Image processing method, device, electronic equipment and computer storage medium
CN112399092A (en) Shooting method and device and electronic equipment

Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination