CN113138387B - Image acquisition method and device, mobile terminal and storage medium - Google Patents


Info

Publication number
CN113138387B
CN113138387B (application CN202010054546.0A; publication CN113138387A)
Authority
CN
China
Prior art keywords
image
acquisition target
acquisition
information
contour
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Active
Application number
CN202010054546.0A
Other languages
Chinese (zh)
Other versions
CN113138387A (en)
Inventor
高文俊
彭聪
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Beijing Xiaomi Mobile Software Co Ltd
Original Assignee
Beijing Xiaomi Mobile Software Co Ltd
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Beijing Xiaomi Mobile Software Co Ltd filed Critical Beijing Xiaomi Mobile Software Co Ltd
Priority to CN202010054546.0A priority Critical patent/CN113138387B/en
Publication of CN113138387A publication Critical patent/CN113138387A/en
Application granted granted Critical
Publication of CN113138387B publication Critical patent/CN113138387B/en


Classifications

    • GPHYSICS
    • G01MEASURING; TESTING
    • G01SRADIO DIRECTION-FINDING; RADIO NAVIGATION; DETERMINING DISTANCE OR VELOCITY BY USE OF RADIO WAVES; LOCATING OR PRESENCE-DETECTING BY USE OF THE REFLECTION OR RERADIATION OF RADIO WAVES; ANALOGOUS ARRANGEMENTS USING OTHER WAVES
    • G01S13/00Systems using the reflection or reradiation of radio waves, e.g. radar systems; Analogous systems using reflection or reradiation of waves whose nature or wavelength is irrelevant or unspecified
    • G01S13/88Radar or analogous systems specially adapted for specific applications
    • G01S13/89Radar or analogous systems specially adapted for specific applications for mapping or imaging
    • GPHYSICS
    • G01MEASURING; TESTING
    • G01BMEASURING LENGTH, THICKNESS OR SIMILAR LINEAR DIMENSIONS; MEASURING ANGLES; MEASURING AREAS; MEASURING IRREGULARITIES OF SURFACES OR CONTOURS
    • G01B7/00Measuring arrangements characterised by the use of electric or magnetic techniques
    • G01B7/28Measuring arrangements characterised by the use of electric or magnetic techniques for measuring contours or curvatures
    • GPHYSICS
    • G01MEASURING; TESTING
    • G01SRADIO DIRECTION-FINDING; RADIO NAVIGATION; DETERMINING DISTANCE OR VELOCITY BY USE OF RADIO WAVES; LOCATING OR PRESENCE-DETECTING BY USE OF THE REFLECTION OR RERADIATION OF RADIO WAVES; ANALOGOUS ARRANGEMENTS USING OTHER WAVES
    • G01S7/00Details of systems according to groups G01S13/00, G01S15/00, G01S17/00
    • G01S7/02Details of systems according to groups G01S13/00, G01S15/00, G01S17/00 of systems according to group G01S13/00
    • G01S7/41Details of systems according to groups G01S13/00, G01S15/00, G01S17/00 of systems according to group G01S13/00 using analysis of echo signal for target characterisation; Target signature; Target cross-section
    • G01S7/414Discriminating targets with respect to background clutter
    • G06T5/73
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06TIMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T7/00Image analysis
    • G06T7/10Segmentation; Edge detection
    • G06T7/11Region-based segmentation
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06TIMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T7/00Image analysis
    • G06T7/10Segmentation; Edge detection
    • G06T7/13Edge detection
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06TIMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T7/00Image analysis
    • G06T7/10Segmentation; Edge detection
    • G06T7/194Segmentation; Edge detection involving foreground-background segmentation

Landscapes

  • Engineering & Computer Science (AREA)
  • Physics & Mathematics (AREA)
  • General Physics & Mathematics (AREA)
  • Remote Sensing (AREA)
  • Radar, Positioning & Navigation (AREA)
  • Computer Vision & Pattern Recognition (AREA)
  • Theoretical Computer Science (AREA)
  • Computer Networks & Wireless Communication (AREA)
  • Electromagnetism (AREA)
  • Image Analysis (AREA)

Abstract

The disclosure relates to an image acquisition method and device, a mobile terminal and a storage medium. The method comprises the following steps: when image acquisition is performed, transmitting radar waves and detecting echoes of the radar waves; determining contour information of an acquisition target according to the transmission parameters of the radar waves and the echo parameters of the echoes; and adjusting, according to the contour information, the image information of a first image formed by the acquisition target, and acquiring, based on the adjusted image information, a second image comprising the imaging of the acquisition target. Because the contour information of the acquisition target obtained by radar-wave detection is more accurate than that obtained by a deep learning model, using it to adjust the image information of the first image makes the image segmentation of the acquisition target more accurate, reduces phenomena such as inaccurate segmentation of the acquisition-target imaging, and improves the imaging effect.

Description

Image acquisition method and device, mobile terminal and storage medium
Technical Field
The disclosure relates to the technical field of mobile terminals, and in particular relates to an image acquisition method, an image acquisition device, a mobile terminal and a storage medium.
Background
When a mobile terminal acquires an image, the image needs to be segmented according to different objects, and different regions of the image can then be processed differently. In the related art, the mobile terminal generally performs image segmentation using a deep learning model or the like, so as to apply different processing to different acquisition objects in the image, for example blurring the background. However, image segmentation in this manner tends to produce inaccurate segmentation edges.
Disclosure of Invention
According to a first aspect of an embodiment of the present disclosure, there is provided an image acquisition method, including:
transmitting radar waves and detecting echoes of the radar waves when image acquisition is carried out;
determining contour information of an acquisition target according to the transmission parameters of the radar waves and the echo parameters of the echoes;
adjusting image information of a first image formed by the acquisition target according to the contour information;
and acquiring, based on the adjusted image information, a second image comprising the imaging of the acquisition target.
Optionally, the adjusting the image information of the first image formed by the acquisition target according to the contour information includes:
adjusting, according to the contour information, the edge contour of the preview image of the acquisition target presented in a preview page of the mobile terminal.
Optionally, the acquiring, based on the adjusted image information, of a second image including the imaging of the acquisition target includes:
based on the adjusted edge profile, adjusting a focusing area aiming at the acquisition target on the preview page;
and acquiring the second image comprising the acquisition target imaging by using the adjusted focusing area.
Optionally, the method further comprises:
and displaying a reference image object on the preview page, wherein the shape of the reference image object is consistent with the shape of the adjusted edge contour.
Optionally, the first image includes: a third image of the acquisition target acquired after the preview;
the adjusting the image information of the first image formed by the acquisition target according to the contour information comprises the following steps:
and adjusting the image information of the third image according to the contour information.
Optionally, the third image includes: a main body area in which the imaging of the acquisition target is located;
the adjusting the image information of the third image according to the contour information includes:
sharpening the main body area according to the contour information.
Optionally, the third image further includes: a background area outside the imaging of the acquisition target;
the method further comprises the steps of:
blurring the background region.
According to a second aspect of embodiments of the present disclosure, there is provided an image acquisition apparatus, applied to a mobile terminal, including:
the transmitting and detecting module is configured to transmit radar waves and detect echoes of the radar waves when image acquisition is carried out;
the first determining module is configured to determine contour information of an acquisition target according to the transmission parameters of the radar waves and the echo parameters of the echoes;
the adjusting module is configured to adjust the image information of the first image formed by the acquisition target according to the contour information;
and the acquisition module is configured to acquire a second image containing the acquisition target imaging based on the adjusted image information.
Optionally, the adjustment module is further configured to:
and adjusting the edge contour of the preview image of the acquisition target presented in the preview page of the mobile terminal according to the contour information.
Optionally, the acquiring module further includes:
an adjustment sub-module configured to adjust, based on the adjusted edge contour, a focusing area for the acquisition target on the preview page;
an acquisition sub-module configured to acquire, using the adjusted focusing area, the second image including the imaging of the acquisition target.
Optionally, the apparatus further comprises:
and the display module is configured to display a reference graphic object on the preview page, wherein the shape of the reference graphic object is consistent with the shape of the adjusted edge contour.
Optionally, the first image includes: a third image of the acquisition target acquired after the preview;
the adjustment module is further configured to:
and adjusting the image information of the third image according to the contour information.
Optionally, the third image includes: a main body area in which the imaging of the acquisition target is located;
the adjustment module is further configured to:
and sharpening the main body area according to the contour information.
Optionally, the third image further includes: a background area outside the imaging of the acquisition target;
the apparatus further comprises:
a blurring module configured to blur the background region.
According to a third aspect of embodiments of the present disclosure, there is provided a mobile terminal comprising:
a processor;
a memory for storing processor-executable instructions;
wherein the processor is configured, when executing the executable instructions, to implement the method steps of any one of the above.
According to a fourth aspect of embodiments of the present disclosure, there is provided a non-transitory computer readable storage medium having stored thereon a computer program for execution by a processor to perform the method steps of any one of the above.
The technical scheme provided by the embodiment of the disclosure can comprise the following beneficial effects:
embodiments of the present disclosure transmit radar waves and detect their echoes while image acquisition is performed; determine contour information of an acquisition target according to the transmission parameters of the radar waves and the echo parameters of the echoes; adjust the image information of a first image formed by the acquisition target according to the contour information; and acquire, based on the adjusted image information, a second image comprising the imaging of the acquisition target. Compared with contour information obtained by a deep learning model or the like in the related art, the contour information of the acquisition target detected by radar waves in these embodiments is more accurate. On this basis, the imaging of the acquisition target can be accurately segmented from the rest of the image using the more accurate contour information, which reduces phenomena such as blurred or misplaced imaging edges caused by segmentation errors, and improves the imaging effect.
It is to be understood that both the foregoing general description and the following detailed description are exemplary and explanatory only and are not restrictive of the disclosure.
Drawings
The accompanying drawings, which are incorporated in and constitute a part of this specification, illustrate embodiments consistent with the invention and together with the description, serve to explain the principles of the invention.
FIG. 1 is a flowchart illustrating an image acquisition method according to an exemplary embodiment;
FIG. 2 is a schematic diagram of a mobile terminal with a radar sensor, according to an example embodiment;
FIG. 3 is a schematic view of a scenario illustrating an image acquisition method according to an exemplary embodiment;
FIG. 4 is another scene diagram illustrating an image acquisition method according to an exemplary embodiment;
FIG. 5 is a schematic view of yet another scenario of an image acquisition method according to an exemplary embodiment;
FIG. 6 is a specific flow chart illustrating a method of image acquisition according to an exemplary embodiment;
FIG. 7 is a block diagram of an image acquisition apparatus according to an exemplary embodiment;
fig. 8 is a block diagram of a mobile terminal according to an exemplary embodiment.
Detailed Description
Reference will now be made in detail to exemplary embodiments, examples of which are illustrated in the accompanying drawings. When the following description refers to the accompanying drawings, the same numbers in different drawings refer to the same or similar elements, unless otherwise indicated. The implementations set forth in the following description of exemplary embodiments do not represent all implementations consistent with the invention. Rather, they are merely examples of apparatus and methods consistent with aspects of the invention as detailed in the appended claims.
Fig. 1 is a flowchart illustrating an image acquisition method according to an exemplary embodiment, which is applied to a mobile terminal, as shown in fig. 1, and includes the steps of:
step 101: during image acquisition, a radar wave is emitted and echoes of the radar wave are detected.
Here, the mobile terminal may be a mobile phone, a tablet computer, a notebook computer, or the like; it may also be a wearable device, for example a smart watch. In short, any mobile terminal that has an image acquisition module and can perform image acquisition may be used.
Here, the radar waves are emitted by a radar sensor mounted on the mobile terminal. Referring to fig. 2, which is a schematic view of a mobile terminal having radar sensors, at least two radar sensors may be mounted on the mobile terminal 2. For example, the two radar sensors may include: a front radar sensor 21 mounted on the same side of the mobile terminal 2 as the front camera, and a rear radar sensor 22 mounted on the same side of the mobile terminal 2 as the rear camera. Of course, in other embodiments, the radar sensors may be installed at other positions of the mobile terminal, which is not limited here.
The radar sensors mounted on the different sides of the mobile terminal may emit radar waves at any angle in front of the different sides of the mobile terminal. For example, a front radar sensor 21 installed in the mobile terminal on the same side as the front camera may be used to capture objects in the image information that the front camera can capture; and a rear radar sensor 22 mounted in the mobile terminal on the same side as the rear camera may be used to capture objects in the image information that the rear camera can capture.
In other embodiments, there may be only one radar sensor, and the radar sensor may be disposed on a rotating component of the mobile terminal and capable of changing direction based on the driving of the rotating component in the mobile terminal. For example, the radar sensor is directed toward one side of the display screen based on actuation of the rotating assembly; alternatively, the radar sensor is caused to emit radar waves in a set direction based on the driving of the rotating assembly.
In practical applications, the radar sensor may be one or more. The radar sensor may be provided on either side of the mobile terminal.
To be able to reduce the scanning range of radar waves, in some embodiments the method further comprises:
Identifying the direction of the acquisition target in the focusing area;
the method for transmitting radar waves during image acquisition comprises the following steps:
and transmitting, when image acquisition is performed, the radar waves toward the direction in which the acquisition target is located.
In this embodiment, by identifying the direction in which the acquisition target is located within the focusing area, the radar waves can be emitted toward that direction. Taking the front radar sensor as an example: if the acquisition target is directly in front of the mobile terminal, the radar waves only need to be controlled to scan within a certain range directly in front of the mobile terminal; if the acquisition target lies in a direction offset 30 degrees to the right of straight ahead, the radar waves only need to be controlled to scan within a certain range in that direction. This reduces the emission range of the radar waves, increases the speed at which the contour information of the acquisition target is detected, and saves power consumption of the mobile terminal.
The method further comprises: determining the focusing area based on an input operation by the user on the preview page.
Referring to fig. 3, fig. 3 is a schematic view of a scenario illustrating an image acquisition method according to an exemplary embodiment, where, as shown in fig. 3, an image to be acquired by a mobile terminal includes: a collection target 31, and a background region 32 outside the collection target.
It should be noted that the image of the acquisition target is presented in the focusing area. The mobile terminal can focus the acquisition target by utilizing the focusing area.
The identifying of the direction in which the acquisition target is located within the focusing area includes: determining the direction of the acquisition target 31 relative to the mobile terminal according to the position information of the focusing area within the whole image area. Here, the position of the focusing area within the image is positively correlated with the direction of the acquisition target relative to the mobile terminal; that is, the closer the focusing area is to the centre of the image, the closer the acquisition target is to being directly in front of the mobile terminal.
Referring to fig. 3, the position information of the focusing area within the whole image area may be determined based on a click operation by the user on the preview page 30. In some embodiments, a focusing frame 34 is displayed based on the user's click operation on the preview page 30. In this way, the position information of the focusing frame can be used to determine the direction in which the acquisition target is located relative to the mobile terminal, and the radar waves are then emitted in this direction. Here, the radar waves are emitted along the direction 33 in which the acquisition target 31, determined by the focusing frame 34, is located relative to the mobile terminal.
In this embodiment, the radar waves are emitted toward the direction in which the acquisition target is located, so the radar does not need to scan a large area; this saves detection time and improves detection efficiency. Moreover, the direction of the acquisition target can be determined quickly and conveniently through the focusing area, further improving detection efficiency.
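As a purely illustrative sketch (not part of the disclosure), the mapping from the focusing frame's position in the preview to an emission direction might look like the following. The function name, the field-of-view values, and the linear mapping are all assumptions; the patent only states that the focusing area's position is positively correlated with the target's direction.

```python
def focus_box_to_direction(box_cx, box_cy, frame_w, frame_h,
                           hfov_deg=70.0, vfov_deg=55.0):
    """Map the centre of the focusing frame (in preview pixels) to an
    emission direction relative to the camera's optical axis.

    hfov_deg / vfov_deg are assumed camera fields of view; the patent
    does not specify them.
    """
    # 0.0 degrees at the image centre corresponds to "directly in front".
    azimuth = (box_cx / frame_w - 0.5) * hfov_deg     # + means right of centre
    elevation = (0.5 - box_cy / frame_h) * vfov_deg   # + means above centre
    return azimuth, elevation

# A tap at the exact centre of a 1080x1920 preview -> scan straight ahead.
print(focus_box_to_direction(540, 960, 1080, 1920))  # (0.0, 0.0)
```

A tap offset toward the right edge yields a positive azimuth, steering the scan range toward that side instead of sweeping the full field.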
Step 102: determining contour information of the acquisition target according to the transmission parameters of the radar waves and the echo parameters of the echoes.
Here, the contour information includes the shape of the outer contour of the acquisition target. For example, if the acquisition target is a human face, the shape of the outer contour indicated by the contour information is consistent with the shape of the face; if the acquisition target is a square table, the shape of the outer contour indicated by the contour information is square.
In some embodiments, the contour information further includes the size of the outer contour of the acquisition target, and the like. In summary, the contour information can approximately describe the contour of the acquisition target.
In some embodiments, the determining of the contour information of the acquisition target according to the transmission parameters of the radar waves and the echo parameters of the echoes comprises:
determining, according to the transmission parameters of the radar waves and the echo parameters of the echoes, the distance information of the acquisition points within the acquisition range relative to the mobile terminal;
and determining the contour information of the acquisition target according to the difference between the distance information of each acquisition point relative to the mobile terminal and that of its adjacent acquisition points.
Specifically, this determining of the contour information according to the difference comprises:
determining, according to the differences, the acquisition points whose difference is greater than a difference threshold;
and determining the contour information of the acquisition target according to the acquisition points whose difference is greater than the difference threshold.
Here, the transmission parameters include: a transmission time; the echo parameters include: a reception time;
the determining distance information of the acquisition point in the acquisition range relative to the mobile terminal according to the transmission parameter of the radar wave and the echo parameter of the echo comprises the following steps:
determining, according to the transmission time and the reception time, the distance information of the acquisition points within the acquisition range relative to the mobile terminal.
It will be appreciated that radar waves reflect echoes if they encounter an obstacle after transmission. The closer the obstacle, the sooner the echo is received; the farther the obstacle, the later. Thus the times at which different obstacles return echoes differ. An obstacle here is in fact an object to be captured during image acquisition, including the acquisition target or other objects, such as objects in the background.
It will be appreciated that if the difference is greater than the difference threshold, the object to which an acquisition point belongs and the object to which its adjacent acquisition point belongs are not in the same position area; if the difference is less than or equal to the difference threshold, they are in the same position area. That is, the acquisition points whose difference exceeds the difference threshold may include boundary points of the objects to which they belong. In this way, the contour information of the acquisition target can be obtained simply through radar-wave ranging: the processing is simple and fast, the system does not need to analyse the image, and the objects are measured directly by radar-wave scanning, so obtaining the contour information in this way is both more efficient and more accurate.
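The time-of-flight distance measurement and the difference-threshold boundary test described above can be sketched as follows. This is an illustrative reading of the claims, not the disclosed implementation; the function names, the 1-D scan-line model, and the 0.3 m threshold are assumptions.

```python
# Speed of light in m/s.
C = 299_792_458.0

def echo_distance(t_transmit, t_receive):
    """Round-trip time of flight -> one-way distance to the obstacle."""
    return C * (t_receive - t_transmit) / 2.0

def boundary_points(distances, threshold=0.3):
    """Mark acquisition points whose distance jumps by more than
    `threshold` metres relative to the previous point: candidate
    contour (boundary) points. `distances` is one scan line of
    per-point distances; the 0.3 m threshold is an assumption.
    """
    boundaries = []
    for i in range(1, len(distances)):
        if abs(distances[i] - distances[i - 1]) > threshold:
            boundaries.append(i)
    return boundaries

# A subject ~1 m away in front of a background ~3 m away:
scan = [3.0, 3.0, 1.0, 1.0, 1.0, 3.0, 3.0]
print(boundary_points(scan))  # [2, 5]
```

The two indices where the measured distance jumps are exactly where the subject's silhouette begins and ends on that scan line, which is what lets radar ranging recover a contour without analysing the camera image.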
Step 103: adjusting the image information of the first image formed by the acquisition target according to the contour information.
It should be understood that the first image here may refer to an image in the preview page before image acquisition; it may also be an image obtained, based on the original recognition function of the image acquisition module, after the acquisition target has been segmented during preview, for example by a deep learning model.
In some scenes, when the image acquisition module of the mobile terminal acquires images, the imaging of the acquisition target needs to be segmented from the imaging of the area outside it. For example, if the acquisition target is a person, portrait processing such as face recognition or beautification can be applied only to the person image once it has been separated from the rest of the image, and the rest of the image need not be processed, thereby simplifying image processing. In this case, adjusting the image information of the first image formed by the acquisition target based on the contour information allows the person image to be well distinguished from the rest of the image, reducing phenomena such as incomplete processing of the person image caused by part of it being segmented into the background, or an excessive processing range caused by part of the background being segmented into the person image.
In other scenes, it is desirable to highlight the imaging of the acquisition target, so the imaging of the area outside the acquisition target is blurred to generate an image that highlights the acquisition target. In this scene, adjusting the image information of the first image formed by the acquisition target according to the contour information allows the imaging of the acquisition target to be well distinguished from the imaging of the area outside it, reducing phenomena such as part of the acquisition-target imaging being blurred because it was segmented into the background, or part of the background being highlighted because it was segmented into the acquisition-target imaging.
Of course, in other scenes, it may be desirable to highlight the imaging of the area outside the acquisition target, in which case the imaging of the acquisition target is blurred to generate an image that highlights the area outside it.
Therefore, whether the first image is an image in the preview page before image acquisition, or an image obtained after the acquisition target is segmented during preview based on the original recognition function of the image acquisition module, adjusting its image information based on the accurate contour information allows the imaging of the acquisition target to be accurately separated from the imaging of the area outside it.
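The segmentation-then-blur processing described in these scenes can be sketched as follows, assuming the radar-derived contour has already been converted into a boolean subject mask. The helper name, the single-channel grayscale model, and the box blur standing in for a proper Gaussian blur are all illustrative assumptions, not the patent's implementation.

```python
import numpy as np

def blur_background(image, mask, k=5):
    """Keep subject pixels (mask == True) untouched and box-blur the
    rest, compositing the two: the highlight-subject scene above.

    image: HxW grayscale array; mask: HxW bool; k: odd kernel size.
    A real pipeline would blur each colour channel with a Gaussian.
    """
    pad = k // 2
    padded = np.pad(image.astype(float), pad, mode="edge")
    blurred = np.zeros(image.shape, dtype=float)
    h, w = image.shape
    for y in range(h):
        for x in range(w):
            # Mean over the k x k neighbourhood = box blur.
            blurred[y, x] = padded[y:y + k, x:x + k].mean()
    # Subject pixels come from the sharp original, the rest from the blur.
    return np.where(mask, image.astype(float), blurred)
```

Inverting the mask before the call gives the opposite scene, in which the acquisition target itself is blurred and the background is kept sharp.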
Step 104: acquiring, based on the adjusted image information, a second image comprising the imaging of the acquisition target.
The second image is an image that meets the image quality requirement after the image information of the first image has been adjusted. The image quality requirement is that the acquisition target can be accurately distinguished from the area outside the acquisition target.
In the above embodiment, radar-wave detection is used to obtain more accurate contour information of the acquisition target, and the image information of the first image formed by the acquisition target is adjusted according to it; that is, the contour of the acquisition target is used to accurately segment its imaging from the rest of the image, yielding a second image in which the acquisition target and the area outside it are accurately divided. The imaging effect of the second image is therefore better, and phenomena such as blurred or erroneous imaging edges of the acquisition target are reduced.
In some embodiments, the adjusting the image information of the first image of the acquisition target according to the contour information includes:
adjusting, according to the contour information, the edge contour of the preview image of the acquisition target presented in the preview page of the mobile terminal.
Here, the edge contour includes the positions on the display screen of the edge points of the preview image of the acquisition target presented in the preview page; it may also include the pixel points at those edge points.
In this way, by adjusting the edge contour of the preview image of the acquisition target presented in the preview page according to the contour information, the acquisition target can be distinguished from the area outside it before image acquisition, which improves the imaging quality of the original image and reduces the burden of image processing.
In some embodiments, the acquiring of a second image including the imaging of the acquisition target based on the adjusted image information includes:
based on the adjusted edge profile, adjusting a focusing area aiming at the acquisition target on the preview page;
and acquiring the second image comprising the acquisition target imaging by using the adjusted focusing area.
Referring to fig. 4, fig. 4 is another schematic view of a scenario illustrating an image acquisition method according to an exemplary embodiment, as shown in fig. 4, an image to be acquired by a mobile terminal includes: a collection target 31, and a background region 32 outside the collection target.
Here, the acquisition target 31 is a person image, and the contour information of the acquisition target can approximately characterize the contour of the person image. In this way, the focusing frame can be adjusted, according to the adjusted edge contour, to an edge contour conforming to the outline of the person image, and the area surrounded by this conforming edge contour is the adjusted focusing area. Therefore, before image acquisition, the acquisition target can be accurately focused using the adjusted focusing area, and then a second image with accurate focusing can be acquired.
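As a rough illustration of this step, the adjusted focusing area could be derived from the contour as its bounding rectangle. This is only a sketch and not part of the disclosed embodiments; the function name `focus_area_from_contour` and the (x, y, w, h) rectangle convention are assumptions.

```python
def focus_area_from_contour(contour_points):
    """Return the bounding rectangle (x, y, w, h) enclosing the adjusted
    edge contour.

    A camera's focusing area could then be set to this rectangle instead
    of a default fixed-size focusing frame, so that the focus region
    follows the outline of the acquisition target."""
    xs = [p[0] for p in contour_points]
    ys = [p[1] for p in contour_points]
    return (min(xs), min(ys), max(xs) - min(xs), max(ys) - min(ys))
```

For example, a contour of three points (2, 3), (10, 7), (5, 12) yields the rectangle (2, 3, 8, 9).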
Additionally, in some embodiments, the method further comprises:
and displaying a reference graphic object on the preview page, wherein the shape of the reference graphic object is consistent with the shape of the adjusted edge contour.
Referring back to fig. 4, after the preview page 30 is entered, a reference graphic object 35 is displayed on the preview page 30, and the shape of the reference graphic object 35 matches the shape of the adjusted edge contour. Here, the adjusted edge contour refers to the contour that the acquisition target 31 presents on the display screen.
Here, the reference graphic object may include an auxiliary line having the same contour as the collection target so that a user can focus-shoot the collection target based on the auxiliary line.
Taking a square table as an example, the auxiliary lines of the reference graphic may include a figure formed by four 90-degree fold lines outlining the four top corners of the square table. Taking the acquisition target being a person's head portrait as an example, the auxiliary lines of the reference graphic may include a figure formed by arc-shaped line segments outlining the head portrait.
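For the square-table example, such guides could be generated as four L-shaped polylines placed at the corners of the target's bounding rectangle. The sketch below is illustrative only; the function name `corner_guides` and the `arm` length parameter are assumptions, not part of the patent.

```python
def corner_guides(rect, arm=20):
    """Return four L-shaped polylines marking the corners of a
    rectangular target.

    rect is (x, y, w, h); each guide is three points forming a
    90-degree fold line, as in the square-table example."""
    x, y, w, h = rect
    return [
        [(x + arm, y), (x, y), (x, y + arm)],                          # top-left
        [(x + w - arm, y), (x + w, y), (x + w, y + arm)],              # top-right
        [(x, y + h - arm), (x, y + h), (x + arm, y + h)],              # bottom-left
        [(x + w - arm, y + h), (x + w, y + h), (x + w, y + h - arm)],  # bottom-right
    ]
```

Each polyline can then be drawn on the preview page as an auxiliary line for focus shooting.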
In this embodiment, the image acquisition module can be assisted in focusing accurately by displaying the reference graphic object on the preview page.
In other embodiments, the first image comprises: a third image acquired by the acquisition target after previewing;
the adjusting the image information of the first image formed by the acquisition target according to the contour information comprises the following steps:
and adjusting the image information of the third image according to the contour information.
Here, the third image refers to an image acquired after the acquisition target is segmented during previewing based on an original recognition function of the image acquisition module.
The adjusting the image information of the third image according to the contour information includes: re-segmenting the imaging of the acquisition target and the imaging of the region outside it according to the contour information; and carrying out the corresponding processing respectively on the re-segmented imaging of the acquisition target and on the re-segmented imaging of the region outside it.
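The re-segmentation step above can be sketched with a boolean mask derived from the contour: pixels inside the contour form the imaging of the acquisition target, the remaining pixels form the region outside it, and the two parts can then be processed independently. This is a minimal sketch under assumptions; the function name `resegment` and the mask representation are not from the patent.

```python
import numpy as np

def resegment(image, mask):
    """Split an H x W x C image into subject and background using a
    contour-derived boolean mask (True inside the acquisition target).

    Returns (subject, background), each with the other region zeroed,
    so each part can be processed separately and re-composited."""
    subject = np.where(mask[..., None], image, 0)
    background = np.where(mask[..., None], 0, image)
    return subject, background
```

Summing the two outputs reconstructs the original image, which makes re-compositing after per-region processing straightforward.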
Therefore, the embodiment can correct the image acquired by the image acquisition module, generate the image accurately divided for the acquisition target and the area beyond the acquisition target, and improve the final imaging effect.
Take as an example the case where the imaging of the acquisition target needs to be processed while the imaging of the area outside the acquisition target is left unprocessed.
Specifically, in some embodiments, the third image comprises: a main body area where the imaging of the acquisition target is located;
the adjusting the image information of the third image according to the contour information includes:
and carrying out image processing on the main body area according to the contour information.
Here, the image processing includes: beautification processing of the image, recognition processing of the image, blurring processing of the image, or the like.
Take as an example the case where the imaging of the acquisition target needs to be emphasized and the imaging of the area outside the acquisition target needs to be weakened.
In some embodiments, the third image comprises: a main body area where the imaging of the acquisition target is located;
the adjusting the image information of the third image according to the contour information includes: and sharpening the main body area according to the contour information.
The third image further includes: a background area outside the imaging of the acquisition target;
the method further comprises the steps of:
blurring the background region.
According to this embodiment, based on the obtained accurate contour information, the main body area where the imaging of the acquisition target is located can be sharpened, and the background area outside the imaging of the acquisition target can be blurred. Because the segmentation is based on accurate contour information, the main body area obtained by sharpening and the background area obtained by blurring are more accurate, which reduces the problems of blurred or misplaced edges of the acquisition target in the original image caused by inaccurate contour information, and finally improves the imaging effect.
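As an illustrative sketch of this sharpen-and-blur step (not the patented implementation), a grayscale image could be blurred with a simple 3x3 box filter and sharpened inside the contour mask by unsharp masking; the names `box_blur` and `sharpen_subject_blur_background` are assumptions.

```python
import numpy as np

def box_blur(img):
    """3x3 box blur of a grayscale image, using edge padding."""
    h, w = img.shape
    p = np.pad(img.astype(float), 1, mode="edge")
    # Sum the nine shifted 3x3 neighbourhood windows, then average.
    return sum(p[i:i + h, j:j + w] for i in range(3) for j in range(3)) / 9.0

def sharpen_subject_blur_background(img, mask, amount=1.0):
    """Unsharp-mask the pixels inside the contour mask; blur the rest.

    mask is a boolean array, True inside the acquisition target."""
    blurred = box_blur(img)
    sharpened = img + amount * (img - blurred)  # unsharp masking
    return np.where(mask, sharpened, blurred)
```

A production camera pipeline would use a better blur kernel (e.g. Gaussian) and handle color channels, but the subject/background split driven by the contour mask is the point being illustrated.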
In other embodiments, the method further comprises: determining an image area between the edge contour of the image formed by the acquisition target before adjustment and the edge contour of the image formed by the acquisition target after adjustment;
and if the image area belongs to the area within the edge contour of the image formed by the adjusted acquisition target, performing image processing on the image area according to the processing mode of the image formed by the acquisition target before adjustment. For example, if the processing mode of the image formed by the acquisition target before adjustment is a beautifying processing, the image area is subjected to the beautifying processing; and if the image formed by the acquisition target is processed in a sharpening manner, sharpening the image area.
In other embodiments, the method further comprises: and if the image area belongs to the area outside the edge contour of the image formed by the adjusted acquisition target, performing image processing on the image area according to the processing mode of the image formed by the area outside the edge contour of the image formed by the acquisition target before adjustment. For example, if the image formed by the area outside the edge contour of the image formed by the acquisition target is processed in a blurring manner before adjustment, the image area is blurring.
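The handling of the image area between the pre-adjustment and post-adjustment contours can be sketched as mask arithmetic: pixels newly inside the adjusted contour get the subject treatment, and pixels newly outside it get the background treatment. The function below is a hypothetical sketch; `subject_fn` and `background_fn` stand in for whatever processing (beautifying, sharpening, blurring) the embodiment applies to each region.

```python
import numpy as np

def process_transition(img, old_mask, new_mask, subject_fn, background_fn):
    """Reprocess only the pixels whose classification changed between
    the contour before adjustment (old_mask) and after (new_mask).

    Pixels that moved inside the target contour receive the subject
    processing; pixels that moved outside it receive the background
    processing; all other pixels are left as-is."""
    out = img.copy()
    gained = new_mask & ~old_mask   # moved into the acquisition target
    lost = old_mask & ~new_mask     # moved out of the acquisition target
    out[gained] = subject_fn(img)[gained]
    out[lost] = background_fn(img)[lost]
    return out
```

Touching only the transition band keeps the correction cheap while fixing exactly the edge pixels that the original segmentation misclassified.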
Therefore, only the image area between the edge contour of the image formed by the acquisition target before adjustment and the edge contour formed by the acquisition target after adjustment is correspondingly processed, so that the problems of blurring or misplacement of the edge of the image formed by the acquisition target in the original image are solved, and the imaging effect is finally improved.
Further, the present disclosure provides a specific embodiment to further understand the image acquisition method provided by the embodiments of the present disclosure.
When image acquisition is performed, the background area outside the acquisition target is often blurred so as to highlight the acquisition target. In the related art, the image is segmented by using a deep learning model to complete the blurring of the background area. However, because the segmented images are inaccurate, the background blurring achieved in this way often causes problems such as inaccurate edges between the acquisition target and the background.
This embodiment solves the problem that the blurring edge frequently deviates when acquiring the acquisition target, as well as the problem that image details of a plurality of acquisition targets are wrongly included or missing.
Referring specifically to fig. 5, fig. 5 is a schematic view of still another scenario of an image acquisition method according to an exemplary embodiment, as shown in fig. 5,
when the mobile terminal 50 starts the image acquisition module to acquire images, firstly, determining the direction of the acquisition target 31 according to the focusing area of the image acquisition; starting a radar sensor, transmitting radar waves to the direction of the acquisition target 31, and acquiring contour information of the acquisition target 31; and according to the contour information, optimizing a first image obtained by carrying out image acquisition on the acquisition target at present, and finally obtaining a second image comprising the imaging of the acquisition target.
Specifically, referring to fig. 6, fig. 6 is a specific flowchart illustrating an image acquisition method according to an exemplary embodiment, as shown in fig. 6, the method includes:
step 61: the mobile phone enters a focusing shooting mode.
Here, a mobile phone is taken as an example of the mobile terminal. When the mobile phone enters the focusing shooting mode, it indicates that the mobile phone is carrying out image acquisition. That is, this step 61 may correspond to "when image acquisition is performed" in step 101 described in the above embodiments.
Step 62: transmitting radar waves;
step 63: the radar wave monitors profile information of the acquisition target.
Here, steps 62 and 63 may correspond to "transmitting radar waves and detecting echoes of the radar waves" in step 101 described in the above embodiments, and to step 102, "determining the profile information of the acquisition target according to the transmission parameters of the radar waves and the echo parameters of the echoes".
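As a toy illustration of how one echo could yield one contour point (assuming a simple time-of-flight ranging model with a steerable beam, which the patent does not specify), the round-trip delay gives the range and the beam azimuth gives the direction:

```python
import math

C = 299_792_458.0  # speed of light in m/s

def echo_to_point(delay_s, azimuth_rad):
    """Convert one echo's round-trip delay and beam azimuth into a 2-D
    contour point in the sensor plane, in metres.

    The round-trip time of flight covers twice the range, hence the /2."""
    r = C * delay_s / 2.0
    return (r * math.cos(azimuth_rad), r * math.sin(azimuth_rad))
```

Sweeping the beam across the acquisition target and collecting such points would trace out an approximate contour; real radar processing involves far more (beamforming, echo parameter estimation), so this is only a conceptual sketch.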
Step 64: optimizing image blurring information according to the contour information;
step 65: and outputting the shot photo.
Here, the photographed picture is the second image described in the above embodiment. Here, the image blurring information is optimized according to the contour information, i.e. the image information of the first image formed by the acquisition target is adjusted according to the contour information. Therefore, the image which is not originally the acquisition target can be subjected to blurring, and the image which is originally the acquisition target can be subjected to sharpening, so that the image blurring information is optimized.
In the embodiment, the acquisition of the contour information of the acquisition target in the image acquisition is completed through the radar wave, so that the acquired contour information is more accurate, the imaging effect of the image is improved, and the problem of edge blurring in the original image is solved. Meanwhile, the imaging effect of the acquisition target can be complemented, and the problem of fuzzy dislocation of edge information in image segmentation is solved.
Fig. 7 is a block diagram of an image acquisition apparatus according to an exemplary embodiment. Referring to fig. 7, the apparatus is applied to a mobile terminal and includes: a transmitting and detecting module 71, a first determining module 72, an adjusting module 73 and an acquisition module 74;
the transmitting and detecting module 71 is configured to transmit radar waves and detect echoes of the radar waves when image acquisition is performed;
the first determining module 72 is configured to determine profile information of an acquisition target according to the transmission parameter of the radar wave and the echo parameter of the echo;
the adjustment module 73 is configured to adjust image information of a first image formed by the acquisition target according to the profile information;
the acquisition module 74 is configured to acquire a second image including the imaging of the acquisition target based on the adjusted image information.
In some embodiments, the adjustment module 73 is further configured to:
and adjusting the edge contour of the preview image of the acquisition target presented in the preview page of the mobile terminal according to the contour information.
In some embodiments, the acquisition module 74 further comprises:
the adjustment sub-module is configured to adjust a focusing area aiming at the acquisition target on the preview page based on the adjusted edge profile;
An acquisition sub-module configured to acquire the second image including the acquisition target imaging using the adjusted focus area.
In some embodiments, the apparatus further comprises:
and the display module is configured to display a reference image object on the preview page, wherein the shape of the reference image object is consistent with the shape of the adjusted edge contour.
In some embodiments, the first image comprises: a third image acquired by the acquisition target after previewing;
the adjustment module 73 is further configured to:
and adjusting the image information of the third image according to the contour information.
In some embodiments, the third image comprises: a main body area formed by imaging the acquisition target;
the adjustment module 73 is further configured to:
and sharpening the main body area according to the contour information.
In some embodiments, the third image further comprises: a background area outside the imaging of the acquisition target;
the apparatus further comprises:
and a blurring module configured to blur the background region.
The specific manner in which the various modules perform operations in the apparatus of the above embodiments has been described in detail in the embodiments of the method, and will not be described in detail here.
Fig. 8 is a block diagram illustrating a mobile terminal 800 according to an exemplary embodiment. For example, the mobile terminal 800 may be a mobile phone, a computer, a digital broadcast terminal, a messaging device, a game console, a tablet device, a medical device, an exercise device, a personal digital assistant, or the like.
Referring to fig. 8, the mobile terminal 800 may include one or more of the following components: a processing component 802, a memory 804, a power component 806, a multimedia component 808, an audio component 810, an input/output (I/O) interface 812, a sensor component 814, and a communication component 816.
The processing component 802 generally controls the overall operation of the mobile terminal 800, such as operations associated with display, telephone calls, data communications, camera operations, and recording operations. The processing component 802 may include one or more processors 820 to execute instructions to perform all or part of the steps of the methods described above. Further, the processing component 802 may include one or more modules that facilitate interactions between the processing component 802 and other components. For example, the processing component 802 may include a multimedia module to facilitate interaction between the multimedia component 808 and the processing component 802.
The memory 804 is configured to store various types of data to support the operation of the mobile terminal 800. Examples of such data include instructions for any application or method operating on the mobile terminal 800, contact data, phonebook data, messages, pictures, videos, and the like. The memory 804 may be implemented by any type or combination of volatile or nonvolatile memory devices, such as Static Random Access Memory (SRAM), electrically erasable programmable read-only memory (EEPROM), erasable programmable read-only memory (EPROM), programmable read-only memory (PROM), read-only memory (ROM), magnetic memory, flash memory, magnetic disk, or optical disk.
The power component 806 provides power for the various components of the mobile terminal 800. The power component 806 may include a power management system, one or more power sources, and other components associated with generating, managing, and distributing power for the mobile terminal 800.
The multimedia component 808 includes a screen providing an output interface between the mobile terminal 800 and the user. In some embodiments, the screen may include a Liquid Crystal Display (LCD) and a Touch Panel (TP). If the screen includes a touch panel, the screen may be implemented as a touch screen to receive input signals from the user. The touch panel includes one or more touch sensors to sense touches, swipes, and gestures on the touch panel. The touch sensor may sense not only the boundary of a touch or slide action, but also the duration and pressure associated with the touch or slide operation. In some embodiments, the multimedia component 808 includes a front camera and/or a rear camera. When the mobile terminal 800 is in an operation mode, such as a photographing mode or a video mode, the front camera and/or the rear camera may receive external multimedia data. Each front camera and rear camera may be a fixed optical lens system or have focal length and optical zoom capabilities.
The audio component 810 is configured to output and/or input audio signals. For example, the audio component 810 includes a Microphone (MIC) configured to receive external audio signals when the mobile terminal 800 is in an operational mode, such as a call mode, a recording mode, or a voice recognition mode. The received audio signals may be further stored in the memory 804 or transmitted via the communication component 816. In some embodiments, the audio component 810 further includes a speaker for outputting audio signals.
The I/O interface 812 provides an interface between the processing component 802 and peripheral interface modules, which may be a keyboard, click wheel, buttons, etc. These buttons may include, but are not limited to: homepage button, volume button, start button, and lock button.
The sensor assembly 814 includes one or more sensors for providing status assessment of various aspects of the mobile terminal 800. For example, the sensor assembly 814 may detect an on/off state of the mobile terminal 800, a relative positioning of the components, such as a display and keypad of the mobile terminal 800, a change in position of the mobile terminal 800 or a component of the mobile terminal 800, the presence or absence of a user's contact with the mobile terminal 800, an orientation or acceleration/deceleration of the mobile terminal 800, and a change in temperature of the mobile terminal 800. The sensor assembly 814 may include a proximity sensor configured to detect the presence of nearby objects without any physical contact. The sensor assembly 814 may also include a light sensor, such as a CMOS or CCD image sensor, for use in imaging applications. In some embodiments, the sensor assembly 814 may also include an acceleration sensor, a gyroscopic sensor, a magnetic sensor, a pressure sensor, or a temperature sensor.
The communication component 816 is configured to facilitate communication between the mobile terminal 800 and other devices, either wired or wireless. The mobile terminal 800 may access a wireless network based on a communication standard, such as WiFi, 2G, or 3G, or a combination thereof. In one exemplary embodiment, the communication component 816 receives broadcast signals or broadcast related information from an external broadcast management system via a broadcast channel. In one exemplary embodiment, the communication component 816 further includes a Near Field Communication (NFC) module to facilitate short range communications. For example, the NFC module may be implemented based on Radio Frequency Identification (RFID) technology, infrared data association (IrDA) technology, Ultra Wideband (UWB) technology, Bluetooth (BT) technology, and other technologies.
In an exemplary embodiment, the mobile terminal 800 may be implemented by one or more Application Specific Integrated Circuits (ASICs), digital Signal Processors (DSPs), digital Signal Processing Devices (DSPDs), programmable Logic Devices (PLDs), field Programmable Gate Arrays (FPGAs), controllers, microcontrollers, microprocessors, or other electronic elements for executing the methods described above.
In an exemplary embodiment, a non-transitory computer readable storage medium is also provided, such as memory 804 including instructions executable by processor 820 of mobile terminal 800 to perform the above-described method. For example, the non-transitory computer readable storage medium may be ROM, random Access Memory (RAM), CD-ROM, magnetic tape, floppy disk, optical data storage device, etc.
A non-transitory computer-readable storage medium having stored thereon instructions which, when executed by a processor of a mobile terminal, enable the mobile terminal to perform the image acquisition method described in the above embodiments.
Other embodiments of the invention will be apparent to those skilled in the art from consideration of the specification and practice of the invention disclosed herein. This application is intended to cover any variations, uses, or adaptations of the invention following, in general, the principles of the invention and including such departures from the present disclosure as come within known or customary practice within the art to which the invention pertains. It is intended that the specification and examples be considered as exemplary only, with a true scope and spirit of the invention being indicated by the following claims.
It is to be understood that the invention is not limited to the precise arrangements and instrumentalities shown in the drawings, which have been described above, and that various modifications and changes may be effected without departing from the scope thereof. The scope of the invention is limited only by the appended claims.

Claims (16)

1. An image acquisition method, characterized by being applied to a mobile terminal and comprising the following steps:
transmitting radar waves and detecting echoes of the radar waves when image acquisition is carried out;
Determining profile information of an acquisition target according to the emission parameters of the radar waves and the echo parameters of the echo;
adjusting the image information of a first image formed by the acquisition target according to the contour information;
acquiring a second image comprising the acquisition target imaging based on the adjusted image information;
the acquiring, based on the adjusted image information, a second image including the acquisition target imaging includes:
determining an image area between an edge contour in image information of a first image formed by the acquisition target before adjustment and an edge contour of the acquisition target in the adjusted image information;
and if the image area belongs to an area within the edge contour of the acquisition target in the adjusted image information, performing image processing on the image area according to a processing mode of a first image formed by the acquisition target before adjustment to obtain the second image.
2. The method of claim 1, wherein adjusting image information of the first image of the acquisition target based on the profile information comprises:
and adjusting the edge contour of the preview image of the acquisition target presented in the preview page of the mobile terminal according to the contour information.
3. The method of claim 2, wherein the acquiring a second image including the acquisition target based on the adjusted image information comprises:
based on the adjusted edge profile, adjusting a focusing area aiming at the acquisition target on the preview page;
and acquiring the second image comprising the acquisition target imaging by using the adjusted focusing area.
4. A method according to claim 2 or 3, characterized in that the method further comprises:
and displaying a reference graphic object on the preview page, wherein the shape of the reference graphic object is consistent with the shape of the adjusted edge contour.
5. The method of claim 1, wherein the first image comprises: a third image acquired by the acquisition target after previewing;
the adjusting the image information of the first image formed by the acquisition target according to the contour information comprises the following steps:
and adjusting the image information of the third image according to the contour information.
6. The method of claim 5, wherein the third image comprises: a main body area where the imaging of the acquisition target is located;
The adjusting the image information of the third image according to the contour information includes:
and sharpening the main body area according to the contour information.
7. The method of claim 5, wherein the third image further comprises: a background area outside the imaging of the acquisition target;
the method further comprises the steps of:
blurring the background region.
8. An image acquisition device, characterized in that it is applied to a mobile terminal and comprises:
the transmitting and detecting module is configured to transmit radar waves and detect echoes of the radar waves when image acquisition is carried out;
the first determining module is configured to determine profile information of an acquisition target according to the emission parameters of the radar waves and the echo parameters of the echo;
the adjusting module is configured to adjust the image information of the first image formed by the acquisition target according to the contour information;
an acquisition module configured to acquire a second image including the acquisition target imaging based on the adjusted image information; determining an image area between an edge contour in image information of a first image formed by the acquisition target before adjustment and an edge contour of the acquisition target in the adjusted image information; and if the image area belongs to an area within the edge contour of the acquisition target in the adjusted image information, performing image processing on the image area according to a processing mode of a first image formed by the acquisition target before adjustment to obtain the second image.
9. The apparatus of claim 8, wherein the adjustment module is further configured to:
and adjusting the edge contour of the preview image of the acquisition target presented in the preview page of the mobile terminal according to the contour information.
10. The apparatus of claim 9, wherein the acquisition module further comprises:
the adjustment sub-module is configured to adjust a focusing area aiming at the acquisition target on the preview page based on the adjusted edge profile;
an acquisition sub-module configured to acquire the second image including the acquisition target imaging using the adjusted focus area.
11. The apparatus according to claim 9 or 10, characterized in that the apparatus further comprises:
and the display module is configured to display a reference graphic object on the preview page, wherein the shape of the reference graphic object is consistent with the shape of the adjusted edge contour.
12. The apparatus of claim 8, wherein the first image comprises: a third image acquired by the acquisition target after previewing;
the adjustment module is further configured to:
and adjusting the image information of the third image according to the contour information.
13. The apparatus of claim 12, wherein the third image comprises: a main body area where the imaging of the acquisition target is located;
the adjustment module is further configured to:
and sharpening the main body area according to the contour information.
14. The apparatus of claim 12, wherein the third image further comprises: a background area outside the imaging of the acquisition target;
the apparatus further comprises:
and a blurring module configured to blur the background region.
15. A mobile terminal, comprising:
a processor;
a memory for storing processor-executable instructions;
wherein the processor is configured to implement the method steps of any of claims 1 to 7 when executing executable instructions.
16. A non-transitory computer readable storage medium having stored thereon a computer program, characterized in that the program is executed by a processor to implement the method steps of any of claims 1 to 7.
CN202010054546.0A 2020-01-17 2020-01-17 Image acquisition method and device, mobile terminal and storage medium Active CN113138387B (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN202010054546.0A CN113138387B (en) 2020-01-17 2020-01-17 Image acquisition method and device, mobile terminal and storage medium


Publications (2)

Publication Number Publication Date
CN113138387A CN113138387A (en) 2021-07-20
CN113138387B true CN113138387B (en) 2024-03-08

Family

ID=76808596

Family Applications (1)

Application Number Title Priority Date Filing Date
CN202010054546.0A Active CN113138387B (en) 2020-01-17 2020-01-17 Image acquisition method and device, mobile terminal and storage medium

Country Status (1)

Country Link
CN (1) CN113138387B (en)

Citations (11)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN105450946A (en) * 2014-06-09 2016-03-30 华为技术有限公司 Image processing method and device and calculation device
CN105933607A (en) * 2016-05-26 2016-09-07 维沃移动通信有限公司 Photographing effect adjusting method of mobile terminal and mobile terminal
CN105933589A (en) * 2016-06-28 2016-09-07 广东欧珀移动通信有限公司 Image processing method and terminal
CN107197170A (en) * 2017-07-14 2017-09-22 维沃移动通信有限公司 A kind of exposal control method and mobile terminal
CN107370958A (en) * 2017-08-29 2017-11-21 广东欧珀移动通信有限公司 Image virtualization processing method, device and camera terminal
CN107395965A (en) * 2017-07-14 2017-11-24 维沃移动通信有限公司 A kind of image processing method and mobile terminal
CN107507239A (en) * 2017-08-23 2017-12-22 维沃移动通信有限公司 A kind of image partition method and mobile terminal
CN107948517A (en) * 2017-11-30 2018-04-20 广东欧珀移动通信有限公司 Preview screen virtualization processing method, device and equipment
CN107977940A (en) * 2017-11-30 2018-05-01 广东欧珀移动通信有限公司 background blurring processing method, device and equipment
CN109444839A (en) * 2018-11-29 2019-03-08 北京行易道科技有限公司 The acquisition methods and device of objective contour
CN109496423A (en) * 2018-10-15 2019-03-19 华为技术有限公司 Image display method and electronic equipment under a kind of photographed scene

Family Cites Families (2)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
JP4197858B2 (en) * 2001-08-27 2008-12-17 富士通株式会社 Image processing program
US7817837B2 (en) * 2004-05-19 2010-10-19 Hitachi Medical Corporation Image processing device and method

Patent Citations (12)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN105450946A (en) * 2014-06-09 2016-03-30 Huawei Technologies Co., Ltd. Image processing method and apparatus, and computing device
CN105933607A (en) * 2016-05-26 2016-09-07 Vivo Mobile Communication Co., Ltd. Photographing effect adjustment method for a mobile terminal, and mobile terminal
CN105933589A (en) * 2016-06-28 2016-09-07 Guangdong Oppo Mobile Telecommunications Corp., Ltd. Image processing method and terminal
CN107197170A (en) * 2017-07-14 2017-09-22 Vivo Mobile Communication Co., Ltd. Exposure control method and mobile terminal
CN107395965A (en) * 2017-07-14 2017-11-24 Vivo Mobile Communication Co., Ltd. Image processing method and mobile terminal
CN107507239A (en) * 2017-08-23 2017-12-22 Vivo Mobile Communication Co., Ltd. Image segmentation method and mobile terminal
CN107370958A (en) * 2017-08-29 2017-11-21 Guangdong Oppo Mobile Telecommunications Corp., Ltd. Image blurring processing method and device, and photographing terminal
WO2019042216A1 (en) * 2017-08-29 2019-03-07 Guangdong Oppo Mobile Telecommunications Corp., Ltd. Image blurring processing method and device, and photographing terminal
CN107948517A (en) * 2017-11-30 2018-04-20 Guangdong Oppo Mobile Telecommunications Corp., Ltd. Preview image blurring processing method, device and equipment
CN107977940A (en) * 2017-11-30 2018-05-01 Guangdong Oppo Mobile Telecommunications Corp., Ltd. Background blurring processing method, device and equipment
CN109496423A (en) * 2018-10-15 2019-03-19 Huawei Technologies Co., Ltd. Image display method in a photographing scene, and electronic device
CN109444839A (en) * 2018-11-29 2019-03-08 Beijing Autoroad Tech Co., Ltd. Method and device for acquiring target contour

Non-Patent Citations (2)

* Cited by examiner, † Cited by third party
Title
Virtualization of video quality enhancement process for mobile device; Hansu Kim; 2014 International Conference on Information and Communication Technology Convergence (ICTC); 20141215; Full text *
Research and Application of Interactive Image Measurement on Mobile Devices; Chen Can; China Master's Theses Full-text Database, Information Science and Technology; 20180215; Full text *

Also Published As

Publication number Publication date
CN113138387A (en) 2021-07-20

Similar Documents

Publication Publication Date Title
US8817160B2 (en) Mobile terminal and method of controlling the same
US10452890B2 (en) Fingerprint template input method, device and medium
US20170287188A1 (en) Method and apparatus for intelligently capturing image
CN110557547B (en) Lens position adjusting method and device
JP2017526316A (en) Method, apparatus, program and recording medium for processing moving picture file identifier
CN109784164B (en) Foreground identification method and device, electronic equipment and storage medium
CN112200040A (en) Occlusion image detection method, device and medium
CN106469446B (en) Depth image segmentation method and segmentation device
CN112188096A (en) Photographing method and device, terminal and storage medium
CN113138387B (en) Image acquisition method and device, mobile terminal and storage medium
US11252341B2 (en) Method and device for shooting image, and storage medium
CN112702514B (en) Image acquisition method, device, equipment and storage medium
CN114244999B (en) Automatic focusing method, device, image pickup apparatus and storage medium
CN113315903B (en) Image acquisition method and device, electronic equipment and storage medium
CN114418865A (en) Image processing method, device, equipment and storage medium
CN109035136B (en) Image processing method and device and storage medium
CN114666490A (en) Focusing method and device, electronic equipment and storage medium
CN107707819B (en) Image shooting method, device and storage medium
CN112804462B (en) Multi-point focusing imaging method and device, mobile terminal and storage medium
EP4304190A1 (en) Focus chasing method, electronic device, and storage medium
CN109376588B (en) Face metering point selection method and device, and photographing apparatus
CN109862252B (en) Image shooting method and device
CN116797447A (en) Image processing method and device, electronic equipment and storage medium
CN116012661A (en) Action recognition method, device, storage medium and terminal
CN112804462A (en) Multi-point focusing imaging method and device, mobile terminal and storage medium

Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination
GR01 Patent grant