CN113138387A - Image acquisition method and device, mobile terminal and storage medium - Google Patents

Image acquisition method and device, mobile terminal and storage medium

Info

Publication number
CN113138387A
CN113138387A (application CN202010054546.0A)
Authority
CN
China
Prior art keywords
image
acquisition
acquisition target
information
target
Prior art date
Legal status
Granted
Application number
CN202010054546.0A
Other languages
Chinese (zh)
Other versions
CN113138387B (en)
Inventor
高文俊
彭聪
Current Assignee
Beijing Xiaomi Mobile Software Co Ltd
Original Assignee
Beijing Xiaomi Mobile Software Co Ltd
Priority date
Filing date
Publication date
Application filed by Beijing Xiaomi Mobile Software Co Ltd filed Critical Beijing Xiaomi Mobile Software Co Ltd
Priority to CN202010054546.0A priority Critical patent/CN113138387B/en
Publication of CN113138387A publication Critical patent/CN113138387A/en
Application granted granted Critical
Publication of CN113138387B publication Critical patent/CN113138387B/en
Active legal-status Critical Current
Anticipated expiration legal-status Critical

Classifications

    • GPHYSICS
    • G01MEASURING; TESTING
    • G01SRADIO DIRECTION-FINDING; RADIO NAVIGATION; DETERMINING DISTANCE OR VELOCITY BY USE OF RADIO WAVES; LOCATING OR PRESENCE-DETECTING BY USE OF THE REFLECTION OR RERADIATION OF RADIO WAVES; ANALOGOUS ARRANGEMENTS USING OTHER WAVES
    • G01S13/00Systems using the reflection or reradiation of radio waves, e.g. radar systems; Analogous systems using reflection or reradiation of waves whose nature or wavelength is irrelevant or unspecified
    • G01S13/88Radar or analogous systems specially adapted for specific applications
    • G01S13/89Radar or analogous systems specially adapted for specific applications for mapping or imaging
    • GPHYSICS
    • G01MEASURING; TESTING
    • G01BMEASURING LENGTH, THICKNESS OR SIMILAR LINEAR DIMENSIONS; MEASURING ANGLES; MEASURING AREAS; MEASURING IRREGULARITIES OF SURFACES OR CONTOURS
    • G01B7/00Measuring arrangements characterised by the use of electric or magnetic techniques
    • G01B7/28Measuring arrangements characterised by the use of electric or magnetic techniques for measuring contours or curvatures
    • GPHYSICS
    • G01MEASURING; TESTING
    • G01SRADIO DIRECTION-FINDING; RADIO NAVIGATION; DETERMINING DISTANCE OR VELOCITY BY USE OF RADIO WAVES; LOCATING OR PRESENCE-DETECTING BY USE OF THE REFLECTION OR RERADIATION OF RADIO WAVES; ANALOGOUS ARRANGEMENTS USING OTHER WAVES
    • G01S7/00Details of systems according to groups G01S13/00, G01S15/00, G01S17/00
    • G01S7/02Details of systems according to groups G01S13/00, G01S15/00, G01S17/00 of systems according to group G01S13/00
    • G01S7/41Details of systems according to groups G01S13/00, G01S15/00, G01S17/00 of systems according to group G01S13/00 using analysis of echo signal for target characterisation; Target signature; Target cross-section
    • G01S7/414Discriminating targets with respect to background clutter
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06TIMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T5/00Image enhancement or restoration
    • G06T5/73Deblurring; Sharpening
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06TIMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T7/00Image analysis
    • G06T7/10Segmentation; Edge detection
    • G06T7/11Region-based segmentation
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06TIMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T7/00Image analysis
    • G06T7/10Segmentation; Edge detection
    • G06T7/13Edge detection
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06TIMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T7/00Image analysis
    • G06T7/10Segmentation; Edge detection
    • G06T7/194Segmentation; Edge detection involving foreground-background segmentation

Landscapes

  • Engineering & Computer Science (AREA)
  • Physics & Mathematics (AREA)
  • General Physics & Mathematics (AREA)
  • Remote Sensing (AREA)
  • Theoretical Computer Science (AREA)
  • Radar, Positioning & Navigation (AREA)
  • Computer Vision & Pattern Recognition (AREA)
  • Computer Networks & Wireless Communication (AREA)
  • Electromagnetism (AREA)
  • Image Analysis (AREA)

Abstract

The disclosure relates to an image acquisition method and device, a mobile terminal, and a storage medium. The method includes: during image acquisition, transmitting radar waves and detecting their echoes; determining contour information of the acquisition target from the transmission parameters of the radar waves and the echo parameters of the echoes; adjusting image information of a first image formed by the acquisition target according to the contour information; and acquiring, based on the adjusted image information, a second image that includes the imaging of the acquisition target. Because contour information obtained by radar-wave detection is more accurate than contour information obtained from a deep learning model, using it to adjust the image information of the first image makes the segmentation of the acquisition target more precise, reduces phenomena such as inaccurate segmentation of the target's imaging, and improves the imaging effect.

Description

Image acquisition method and device, mobile terminal and storage medium
Technical Field
The present disclosure relates to the field of mobile terminal technologies, and in particular, to an image capturing method and apparatus, a mobile terminal, and a storage medium.
Background
When a mobile terminal captures an image, the image needs to be segmented according to the different objects it contains, so that different regions of the image can be processed differently. In the related art, a mobile terminal generally performs image segmentation using a deep learning model or the like to apply different processing to different acquisition objects in the image, for example blurring the background. However, image segmentation performed this way often yields inaccurate segmentation edges.
Disclosure of Invention
According to a first aspect of the embodiments of the present disclosure, there is provided an image capturing method, including:
when image acquisition is carried out, transmitting radar waves and detecting echoes of the radar waves;
determining the contour information of the acquired target according to the transmitting parameters of the radar waves and the echo parameters of the echo;
adjusting image information of a first image formed by the acquisition target according to the contour information;
and acquiring a second image including the acquired target image based on the adjusted image information.
Optionally, the adjusting, according to the contour information, image information of the first image of the acquisition target includes:
and adjusting the edge contour of the preview image of the acquisition target presented in the preview page of the mobile terminal according to the contour information.
Optionally, the obtaining a second image including the acquisition target based on the adjusted image information includes:
adjusting a focusing area on the preview page for the acquisition target based on the adjusted edge profile;
acquiring the second image including the acquisition target image by using the adjusted focusing area.
Optionally, the method further comprises:
and displaying a reference graphic object on the preview page, wherein the shape of the reference graphic object is consistent with the shape of the adjusted edge contour.
Optionally, the first image comprises: a third image of the acquisition target acquired after preview;
the adjusting the image information of the first image formed by the acquisition target according to the contour information comprises:
and adjusting the image information of the third image according to the contour information.
Optionally, the third image comprises: the main body area where the acquisition target is imaged;
the adjusting the image information of the third image according to the contour information includes:
and sharpening the main body area according to the contour information.
Optionally, the third image further comprises: a background region outside the acquisition target image;
the method further comprises the following steps:
blurring the background area.
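The optional sharpening and blurring steps above can be sketched in a few lines. This is an illustrative sketch only, not the patent's implementation: a 1-D row of grayscale pixels stands in for an image, a box blur stands in for the blurring of the background region, and an unsharp mask stands in for the sharpening of the main body area; all function names are hypothetical.

```python
def box_blur(row):
    """3-tap mean blur of a 1-D row of pixel values (edges clamped)."""
    n = len(row)
    out = []
    for i in range(n):
        window = row[max(0, i - 1):min(n, i + 2)]
        out.append(sum(window) / len(window))
    return out

def sharpen(row):
    """Unsharp mask: add back the detail removed by blurring."""
    return [p + (p - b) for p, b in zip(row, box_blur(row))]

def process(row, mask):
    """Sharpen pixels inside the target mask, blur those outside it."""
    sharp, blur = sharpen(row), box_blur(row)
    return [s if m else b for s, b, m in zip(sharp, blur, mask)]

row = [10, 10, 100, 100]           # an edge between background and subject
mask = [False, False, True, True]  # True marks the main body (subject) region
print(box_blur(row))       # [10.0, 40.0, 70.0, 100.0]
print(process(row, mask))  # [10.0, 40.0, 130.0, 100.0]
```

The masked combination preserves the subject's edge contrast while smoothing the background, which is the qualitative effect the optional clauses describe.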
According to a second aspect of the embodiments of the present disclosure, there is provided an image capturing apparatus applied in a mobile terminal, including:
the transmitting and detecting module is configured to transmit radar waves and detect echoes of the radar waves when image acquisition is carried out;
the first determining module is configured to determine the contour information of an acquisition target according to the emission parameters of the radar waves and the echo parameters of the echo;
the adjusting module is configured to adjust image information of a first image formed by the acquisition target according to the contour information;
an acquisition module configured to acquire a second image including the acquisition target image based on the adjusted image information.
Optionally, the adjusting module is further configured to:
and adjusting the edge contour of the preview image of the acquisition target presented in the preview page of the mobile terminal according to the contour information.
Optionally, the acquisition module further includes:
an adjusting submodule configured to adjust a focusing area on the preview page for the acquisition target based on the adjusted edge profile;
an acquisition sub-module configured to acquire the second image including the acquisition target image using the adjusted focus region.
Optionally, the apparatus further comprises:
a display module configured to display a reference graphic object on the preview page, wherein a shape of the reference graphic object is consistent with a shape of the adjusted edge contour.
Optionally, the first image comprises: a third image of the acquisition target acquired after preview;
the adjustment module further configured to:
and adjusting the image information of the third image according to the contour information.
Optionally, the third image comprises: the main body area where the acquisition target is imaged;
the adjustment module further configured to:
and sharpening the main body area according to the contour information.
Optionally, the third image further comprises: a background region outside the acquisition target image;
the device further comprises;
a blurring module configured to blur the background region.
According to a third aspect of the embodiments of the present disclosure, there is provided a mobile terminal including:
a processor;
a memory for storing processor-executable instructions;
wherein the processor is configured to execute the instructions to perform the steps of any of the methods described above.
According to a fourth aspect of the embodiments of the present disclosure, there is provided a non-transitory computer-readable storage medium having stored thereon a computer program which, when executed by a processor, performs the steps of any of the methods described above.
The technical scheme provided by the embodiment of the disclosure can have the following beneficial effects:
according to the embodiment of the disclosure, when image acquisition is carried out, radar waves are emitted and echoes of the radar waves are detected; determining the contour information of the acquired target according to the transmitting parameters of the radar waves and the echo parameters of the echo; adjusting image information of a first image formed by the acquisition target according to the contour information; and acquiring a second image including the acquired target image based on the adjusted image information. Compared with the profile information obtained by adopting a deep learning model and the like in the related technology, the profile information of the acquisition target detected by the radar wave is more accurate; based on the above, the more accurate contour information can be used for more accurately dividing the image of the acquired target, and the image of the acquired target and the background of other images are divided by accurately acquiring the contour of the target, so that the phenomena of image edge blurring or edge dislocation and the like of the acquired target caused by the wrong division of the image of the acquired target are reduced, and the imaging effect is improved.
It is to be understood that both the foregoing general description and the following detailed description are exemplary and explanatory only and are not restrictive of the disclosure.
Drawings
The accompanying drawings, which are incorporated in and constitute a part of this specification, illustrate embodiments consistent with the invention and together with the description, serve to explain the principles of the invention.
FIG. 1 is a flow chart illustrating a method of image acquisition according to an exemplary embodiment;
FIG. 2 is a schematic diagram illustrating a mobile terminal having a radar sensor in accordance with an exemplary embodiment;
FIG. 3 is a scene schematic diagram illustrating an image acquisition method according to an exemplary embodiment;
FIG. 4 is another scene schematic diagram illustrating a method of image acquisition according to an exemplary embodiment;
FIG. 5 is a schematic diagram of yet another scene illustrating a method of image acquisition according to an exemplary embodiment;
FIG. 6 is a detailed flow diagram illustrating a method of image acquisition according to an exemplary embodiment;
FIG. 7 is a block diagram illustrating an image capture device according to an exemplary embodiment;
fig. 8 is a block diagram illustrating a mobile terminal according to an example embodiment.
Detailed Description
Reference will now be made in detail to exemplary embodiments, examples of which are illustrated in the accompanying drawings. When the following description refers to the accompanying drawings, the same numbers in different drawings represent the same or similar elements unless otherwise indicated. The implementations described in the following exemplary embodiments do not represent all implementations consistent with the present invention; rather, they are merely examples of apparatus and methods consistent with certain aspects of the invention, as detailed in the appended claims.
Fig. 1 is a flowchart illustrating an image acquisition method according to an exemplary embodiment. The method is applied to a mobile terminal and, as shown in fig. 1, includes the following steps:
step 101: when image acquisition is carried out, radar waves are transmitted and echoes of the radar waves are detected.
Here, the mobile terminal may be a mobile phone, a tablet computer, a notebook computer, or the like; it may also be a wearable device such as a smart watch. In short, any mobile terminal that has an image acquisition module and can capture images is applicable.
Here, the radar waves are emitted by a radar sensor installed on the mobile terminal. Referring to fig. 2, which is a schematic diagram of a mobile terminal having radar sensors according to an exemplary embodiment, at least two radar sensors may be installed on the mobile terminal 2. For example, the two radar sensors may include a front radar sensor 21 mounted on the same side of the mobile terminal 2 as the front camera, and a rear radar sensor 22 mounted on the same side as the rear camera. Of course, in other embodiments the radar sensor may be installed in other positions of the mobile terminal, which is not limited here.
It should be noted that a radar sensor mounted on a given side of the mobile terminal can emit radar waves at any angle in front of that side. For example, the front radar sensor 21, installed on the same side as the front camera, may be used to detect objects within the image information that the front camera can collect; and the rear radar sensor 22, installed on the same side as the rear camera, may be used to detect objects within the image information that the rear camera can collect.
In other embodiments, there may be only one radar sensor, disposed on a rotating assembly of the mobile terminal, and its orientation may be changed by driving the rotating assembly. For example, driving the rotating assembly may make the radar sensor face the display-screen side; or it may make the radar sensor emit radar waves in a set direction.
In practical applications, the radar sensor may be one or more. The radar sensor may be disposed at any side of the mobile terminal.
To reduce the scanning range of the radar waves, in some embodiments the method further comprises:
identifying a direction in which the acquisition target is located within a focus area;
when image acquisition is carried out, radar waves are transmitted, and the method comprises the following steps:
and when image acquisition is carried out, the radar waves are transmitted to the direction of the acquisition target.
In this embodiment, by identifying the direction of the acquisition target within the focusing area, the radar waves can be transmitted toward that direction. For example, taking the aforementioned front radar sensor: if the acquisition target is directly in front of the mobile terminal, the radar waves only need to be scanned and emitted within a certain range directly ahead; if the acquisition target lies 30 degrees to the right of straight ahead, the radar waves only need to be scanned and emitted within a certain range around that direction. This narrows the transmission range of the radar waves, speeds up detection of the acquisition target's contour information, and saves power on the mobile terminal.
It should be noted that the method further includes: determining the focusing area based on a user input operation on the preview page.
Referring to fig. 3, fig. 3 is a scene schematic diagram illustrating an image capturing method according to an exemplary embodiment, and as shown in fig. 3, an image to be captured by a mobile terminal includes: an acquisition target 31, and a background region 32 outside the acquisition target.
It should be noted that the image of the acquisition target is presented in the focus area. The mobile terminal can focus the acquisition target by using the focusing area.
Identifying the direction of the acquisition target within the focusing area includes: determining the direction of the acquisition target 31 relative to the mobile terminal from the position of the focusing area within the whole image area. Here, the position of the focusing area within the image corresponds directly to the direction of the acquisition target relative to the mobile terminal; that is, the closer the focusing area is to the centre of the image, the closer the acquisition target is to being directly in front of the mobile terminal.
Referring to fig. 3, the position of the focusing area within the whole image area may be determined based on the user's tap on the preview page 30. In some embodiments, a focusing frame 34 is displayed where the user taps on the preview page 30. The position of the focusing frame can therefore be used to determine the direction of the acquisition target relative to the mobile terminal, so that the radar waves are emitted along that direction. Here, the radar waves are emitted along the direction 33 of the acquisition target 31 relative to the mobile terminal, as determined by the focusing frame 34.
In this embodiment, by transmitting the radar waves toward the direction of the acquisition target, no large-area radar scan is required, which saves detection time and improves detection efficiency. Moreover, the direction of the acquisition target can be determined conveniently and quickly from the focusing area, further improving detection efficiency.
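The mapping from the focusing area's position to a radar transmission direction can be illustrated with a small sketch. The patent gives no formula, so this is a hypothetical linear mapping; the camera field of view `fov_deg` and the function name are assumptions introduced for illustration only.

```python
def focus_to_azimuth(focus_x, image_width, fov_deg=80.0):
    """Map the horizontal centre of the focusing area within the preview
    image to an azimuth angle for the radar scan: 0 degrees is straight
    ahead of the terminal, positive angles are to the right."""
    offset = focus_x / image_width - 0.5  # normalise to [-0.5, 0.5]
    return offset * fov_deg

# A focusing frame centred in a 1080-pixel-wide preview points straight ahead:
print(focus_to_azimuth(540, 1080))  # 0.0
# A frame at three quarters of the width points 20 degrees to the right:
print(focus_to_azimuth(810, 1080))  # 20.0
```

The radar scan can then be restricted to a narrow sector around the returned azimuth, which is the scanning-range reduction described above.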
Step 102: and determining the contour information of the acquired target according to the transmitting parameters of the radar waves and the echo parameters of the echo.
Here, the contour information includes a shape of an outer contour of the acquisition target. For example, if the acquisition target is a human face, the shape of the outer contour indicated by the contour information is consistent with the shape of the human face; if the acquisition target is a square table, the shape of the outer contour indicated by the contour information is a square.
In some embodiments, the contour information further includes a size of an outer contour of the acquisition target, and the like. In summary, the contour information can approximately describe the outline of the acquisition target.
In some embodiments, the determining, according to the emission parameter of the radar wave and the echo parameter of the echo, the contour information of the acquisition target includes:
determining distance information of an acquisition point in an acquisition range relative to the mobile terminal according to the transmitting parameters of the radar waves and the echo parameters of the echo;
and determining the contour information of the acquisition target according to the difference between the distance information of the acquisition point relative to the mobile terminal and the distance information of the adjacent acquisition point of the acquisition point relative to the mobile terminal.
Specifically, the determining the profile information of the acquisition target according to the difference between the distance information of the acquisition point relative to the mobile terminal and the distance information of the adjacent acquisition point of the acquisition point relative to the mobile terminal includes:
according to the difference value between the distance information of the acquisition point relative to the mobile terminal and the distance information of the adjacent acquisition point relative to the mobile terminal, determining the acquisition point corresponding to the difference value larger than the difference value threshold;
and determining the contour information of the acquisition target according to the acquisition point corresponding to the difference value greater than the difference value threshold.
Here, the transmission parameters include: the time of transmission; the echo parameters include: a reception time;
the determining distance information of the acquisition point in the acquisition range relative to the mobile terminal according to the emission parameters of the radar waves and the echo parameters of the echo comprises the following steps:
and determining the distance information of the acquisition point in the acquisition range relative to the mobile terminal according to the transmitting time and the receiving time.
It can be understood that, after transmission, radar waves are reflected back as echoes when they encounter an obstacle. The closer the obstacle, the shorter the time until the echo is received; the farther the obstacle, the longer that time. Thus, the echo return time differs from obstacle to obstacle. Here, an obstacle is any object present during image capture, including the acquisition target or objects other than it, for example objects in the background image.
It can be understood that if the difference is greater than the difference threshold, the object to which the acquisition point belongs is not in the same location area as the object to which the adjacent acquisition point belongs; if the difference is smaller than or equal to the threshold, the two points belong to the same location area. That is, acquisition points whose difference exceeds the threshold may include boundary points of the object to which they belong. In this way, the contour information of the acquisition target can be obtained simply and conveniently by radar-wave ranging. The processing is simple and fast: instead of having the system analyse the image, the radar waves directly scan the real object being captured to obtain its contour information, so the efficiency is higher and the resulting contour information is more accurate.
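The ranging and thresholding steps above can be sketched as follows. This is a hypothetical illustration, not the patent's implementation: the round-trip time of each echo gives the range of an acquisition point, and a jump in range between adjacent points larger than the difference threshold marks a candidate boundary point of the acquisition target.

```python
SPEED_OF_LIGHT = 3.0e8  # metres per second

def echo_distance(t_transmit, t_receive):
    """Range of a reflecting point from the round-trip time of the radar
    wave: the wave travels to the obstacle and back, so halve the path."""
    return (t_receive - t_transmit) * SPEED_OF_LIGHT / 2.0

def contour_points(ranges, threshold):
    """Indices where the range jumps between adjacent acquisition points;
    such jumps mark candidate boundary points of the acquisition target."""
    return [i for i in range(1, len(ranges))
            if abs(ranges[i] - ranges[i - 1]) > threshold]

# A subject about 1 m away standing in front of a wall about 3 m away:
ranges = [3.0, 3.0, 1.0, 1.0, 1.0, 3.0, 3.0]
print(contour_points(ranges, threshold=0.5))  # [2, 5]
```

The two reported indices are where the scan crosses from the wall onto the subject and back, i.e. the subject's left and right boundary points along that scan line.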
Step 103: and adjusting the image information of the first image formed by the acquisition target according to the contour information.
It should be understood that the first image here may be an image within the preview page before image acquisition; or it may be an image obtained after the acquisition target has been segmented during preview by the image acquisition module's built-in recognition function, for example a deep learning model.
In some scenes, when the image acquisition module of the mobile terminal captures an image, the imaging of the acquisition target must be segmented from the imaging of the area outside it. For example, if the acquisition target is a person, segmenting the person image from the rest of the image makes it possible to apply person-specific processing, such as face recognition or beautification, to the person image only, while the rest of the image is left untouched, thereby simplifying image processing. In this scene, adjusting the image information of the first image based on the contour information distinguishes the person image well from the rest of the image. This reduces two failure modes: incomplete processing of the person image because part of it was segmented into the background, and an overly large processing range because part of the background was segmented into the person image.
In other scenes, the imaging of the acquisition target needs to be highlighted, so the imaging of the region outside it is blurred. In this scene, adjusting the image information of the first image according to the contour information distinguishes the imaging of the acquisition target well from the imaging of the region outside it. This reduces the phenomena in which part of the target's imaging is blurred because it was segmented into the outside region, or part of the outside region is highlighted because it was segmented into the target.
Of course, in still other scenarios, the imaging of the region outside the acquisition target may need to be highlighted instead, in which case the imaging of the acquisition target is blurred.
Therefore, whether the first image is an image in the preview page before acquisition, or an image obtained after the acquisition target was segmented during preview by the module's built-in recognition function, adjusting its image information based on the accurate contour information allows the imaging of the acquisition target and the imaging of the area outside it to be divided accurately.
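Once the contour information yields a mask separating the acquisition target from everything else, dividing the first image reduces to a per-pixel selection. A minimal sketch, assuming a grayscale image stored as nested lists and a boolean mask derived from the radar contour; this representation is illustrative, not the patent's.

```python
def split_by_mask(image, mask):
    """Split a grayscale image into subject and background layers using a
    boolean mask (True = pixel inside the acquisition target's contour).
    Pixels that do not belong to a layer are set to None."""
    subject = [[px if m else None for px, m in zip(row, mrow)]
               for row, mrow in zip(image, mask)]
    background = [[None if m else px for px, m in zip(row, mrow)]
                  for row, mrow in zip(image, mask)]
    return subject, background

image = [[1, 2],
         [3, 4]]
mask = [[True, False],
        [False, True]]
subject, background = split_by_mask(image, mask)
print(subject)     # [[1, None], [None, 4]]
print(background)  # [[None, 2], [3, None]]
```

With the layers separated, each can be processed independently (e.g. person-specific processing on the subject layer only) and recombined.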
Step 104: and acquiring a second image including the acquired target image based on the adjusted image information.
The second image is an image that meets the image-quality requirement after the image information of the first image has been adjusted; this requirement is that the acquisition target can be accurately distinguished from the region outside it.
In this embodiment, more accurate contour information of the acquisition target is obtained by radar-wave detection, and this information is used to adjust the image information of the first image: the contour of the acquisition target precisely separates its imaging from the rest of the image, yielding a second image in which the acquisition target and the region outside it are accurately divided. The resulting second image therefore has a better imaging effect, with fewer blurred or wrong edges around the acquisition target.
In some embodiments, said adjusting image information of the first image of the acquisition target according to the contour information comprises:
and adjusting the edge contour of the preview image of the acquisition target presented in the preview page of the mobile terminal according to the contour information.
Here, the edge contour includes the positions on the display screen of the edge points of the acquisition target as presented in the preview page; it may also include the pixel positions of the edge points of the preview image of the acquisition target in the preview page.
In this way, in this embodiment, the edge contour of the preview image of the acquisition target presented in the preview page of the mobile terminal is adjusted through the contour information, so that the acquisition target can be distinguished from the region other than the acquisition target before image acquisition, the imaging quality of the original image is improved, and the burden of image processing is reduced.
In some embodiments, the obtaining a second image including the acquisition target based on the adjusted image information includes:
adjusting a focusing area on the preview page for the acquisition target based on the adjusted edge profile;
acquiring the second image including the acquisition target image by using the adjusted focusing area.
Referring to fig. 4, fig. 4 is a schematic view of another scene of an image capturing method according to an exemplary embodiment, and as shown in fig. 4, an image to be captured by a mobile terminal includes: an acquisition target 31, and a background region 32 outside the acquisition target.
Here, the acquisition target 31 is a human figure, and the contour information of the acquisition target can roughly represent the outline of the figure. In this way, the focusing frame can be adjusted, according to the adjusted edge contour, to an edge contour that matches the outline of the figure, and the region enclosed by this matching edge contour is the adjusted focusing area. Therefore, before image acquisition, the adjusted focusing area can accurately focus on the acquisition target, so that an accurately focused second image is acquired.
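As a minimal illustration only (not the patented implementation), one simple way to derive a focusing area from the detected contour is to take a padded bounding rectangle of the contour points; the `focus_region_from_contour` helper and the sample portrait outline below are hypothetical:

```python
# Hypothetical sketch: snap the focusing area to the radar-derived contour.
# The contour is assumed to arrive as a list of (x, y) edge points already
# projected into preview-image pixel coordinates.

def focus_region_from_contour(contour, margin=4):
    """Return an (x0, y0, x1, y1) focusing rectangle that tightly
    encloses the contour, padded by `margin` pixels."""
    xs = [p[0] for p in contour]
    ys = [p[1] for p in contour]
    return (min(xs) - margin, min(ys) - margin,
            max(xs) + margin, max(ys) + margin)

# Rough head-and-shoulders outline of a portrait subject (made-up points).
portrait = [(40, 10), (60, 12), (70, 40), (80, 90), (20, 92), (30, 42)]
print(focus_region_from_contour(portrait))  # (16, 6, 84, 96)
```

A real implementation would pass the resulting rectangle to the camera driver's focus-metering API rather than print it.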
Additionally, in some embodiments, the method further comprises:
and displaying a reference graphic object on the preview page, wherein the shape of the reference graphic object is consistent with the shape of the adjusted edge contour.
Referring to fig. 4, as shown in fig. 4, after the preview page 30 is entered, a reference graphic object 35 is displayed on the preview page 30, where the shape of the reference graphic object 35 is consistent with the shape of the adjusted edge contour. Here, the adjusted edge contour refers to the contour that the acquisition target 31 presents on the display screen.
Here, the reference graphic object may include an auxiliary line having the same contour as the acquisition target, so that a user can photograph the acquisition target in focus based on the auxiliary line.
Taking a square table as the acquisition target as an example, the auxiliary line of the reference graphic may include a figure formed by four 90-degree fold lines that outline the four corners of the square table. Taking a person's head as the acquisition target as an example, the auxiliary line of the reference graphic may include arc segments that outline the head.
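The square-table example can be sketched as follows (an illustrative assumption, not the patent's rendering code): given the bounding rectangle of the adjusted contour, four 90-degree corner brackets are generated as polylines for the preview overlay.

```python
def corner_brackets(x0, y0, x1, y1, arm=10):
    """Return four 90-degree fold lines (each a 3-point polyline) that
    outline the four corners of the rectangle (x0, y0)-(x1, y1)."""
    return [
        [(x0 + arm, y0), (x0, y0), (x0, y0 + arm)],  # top-left corner
        [(x1 - arm, y0), (x1, y0), (x1, y0 + arm)],  # top-right corner
        [(x1 - arm, y1), (x1, y1), (x1, y1 - arm)],  # bottom-right corner
        [(x0 + arm, y1), (x0, y1), (x0, y1 - arm)],  # bottom-left corner
    ]

for polyline in corner_brackets(0, 0, 100, 100):
    print(polyline)
```

Each polyline would then be drawn on the preview page by the UI layer; the arc-segment case for a head would substitute short circular arcs for the fold lines.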
In this embodiment, the reference graphic object is displayed on the preview page, so that the image acquisition module can be assisted to perform accurate focusing.
In other embodiments, the first image comprises: a third image of the acquisition target acquired after previewing;
the adjusting the image information of the first image formed by the acquisition target according to the contour information comprises:
and adjusting the image information of the third image according to the contour information.
Here, the third image is an image acquired after the acquisition target is segmented during the preview based on the original recognition function of the image acquisition module.
The adjusting the image information of the third image according to the contour information includes: re-segmenting, according to the contour information, the image formed by the acquisition target and the image of the region other than the acquisition target; and then performing the corresponding processing on the re-segmented image of the acquisition target and on the re-segmented image of the region other than the acquisition target, respectively.
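The re-segmentation step above can be illustrated with a minimal sketch (the helper names and the toy contour are assumptions, and a real implementation would rasterize the contour far more efficiently): a point-in-polygon test against the radar-derived contour splits pixels into subject and background sets.

```python
def inside(poly, x, y):
    """Ray-casting point-in-polygon test against the refined contour."""
    n, hit = len(poly), False
    for i in range(n):
        (x0, y0), (x1, y1) = poly[i], poly[(i + 1) % n]
        if (y0 > y) != (y1 > y) and x < x0 + (y - y0) * (x1 - x0) / (y1 - y0):
            hit = not hit
    return hit

def resegment(width, height, contour):
    """Split pixel coordinates into subject and background sets
    using the radar-derived contour."""
    subject, background = [], []
    for y in range(height):
        for x in range(width):
            (subject if inside(contour, x, y) else background).append((x, y))
    return subject, background

# 4x4 image, square contour covering the four centre pixels.
subj, bg = resegment(4, 4, [(0.5, 0.5), (2.5, 0.5), (2.5, 2.5), (0.5, 2.5)])
print(len(subj), len(bg))  # prints: 4 12
```

Each set can then be handed to its own processing pass (e.g. sharpening for `subj`, blurring for `bg`).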
In this way, the image acquired by the image acquisition module can be corrected to generate an image in which the acquisition target and the region other than the acquisition target are accurately segmented, improving the final imaging effect.
For example, the image of the acquisition target needs to be processed, and the image of the region other than the acquisition target is not processed.
Specifically, in some embodiments, the third image comprises: the main body area where the acquisition target is imaged;
the adjusting the image information of the third image according to the contour information includes:
and performing image processing on the main body area according to the contour information.
Here, the image processing includes: the image is subjected to a beautifying process, or the image is subjected to a recognition process, or the image is subjected to a blurring process, or the like.
Take as an example a case in which the image of the acquisition target needs to be highlighted while the image of the region other than the acquisition target is weakened.
In some embodiments, the third image comprises: the main body area where the acquisition target is imaged;
the adjusting the image information of the third image according to the contour information includes: and sharpening the main body area according to the contour information.
The third image further includes: a background region outside the acquisition target image;
the method further comprises the following steps:
blurring the background area.
Through this embodiment, based on the accurate contour information obtained, the main body area where the acquisition target is imaged can be sharpened, and the background region outside the image of the acquisition target can be blurred. Because accurate contour information is used, the sharpened main body area and the blurred background area can be determined more accurately, which reduces problems such as blurred or misaligned edges of the acquisition target in the original image caused by inaccurate contour information, and finally improves the imaging effect.
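The sharpen-subject/blur-background step can be sketched as below. This is a hedged toy version, not the patent's algorithm: the mask is assumed to come from the radar contour, sharpening is done by unsharp masking against a 3x3 local mean, blurring by a 3x3 box filter, and clamping to the valid pixel range is omitted for brevity.

```python
def local_mean(img, x, y):
    """Mean of the 3x3 neighbourhood of (x, y), clipped at the borders."""
    h, w = len(img), len(img[0])
    vals = [img[j][i]
            for j in range(max(0, y - 1), min(h, y + 2))
            for i in range(max(0, x - 1), min(w, x + 2))]
    return sum(vals) / len(vals)

def sharpen_and_blur(img, mask, amount=1.0):
    """Sharpen pixels where mask is True (subject), box-blur the rest."""
    h, w = len(img), len(img[0])
    out = [[0.0] * w for _ in range(h)]
    for y in range(h):
        for x in range(w):
            m = local_mean(img, x, y)
            if mask[y][x]:                       # subject: unsharp mask
                out[y][x] = img[y][x] + amount * (img[y][x] - m)
            else:                                # background: box blur
                out[y][x] = m
    return out

# Left column is "subject", right column is "background".
out = sharpen_and_blur([[0.0, 12.0], [0.0, 12.0]],
                       [[True, False], [True, False]])
```

With an accurate mask, the sharpened and blurred regions meet exactly on the contour, which is the point of the embodiment.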
In other embodiments, the method further comprises: determining an image area between the edge contour of the image formed by the acquisition target before adjustment and the edge contour of the image formed by the acquisition target after adjustment;
If the image area belongs to the region within the edge contour of the adjusted image formed by the acquisition target, image processing is performed on the image area in the same manner as the image formed by the acquisition target was processed before the adjustment. For example, if the image formed by the acquisition target was processed with beautification before the adjustment, beautification is performed on the image area; if it was processed with sharpening, sharpening is performed on the image area.
In other embodiments, the method further comprises: if the image area belongs to the region outside the edge contour of the adjusted image formed by the acquisition target, performing image processing on the image area in the same manner as the region outside the edge contour was processed before the adjustment. For example, if that region was processed with blurring before the adjustment, blurring is performed on the image area.
Therefore, only the image area between the edge contour of the image formed by the acquisition target before the adjustment and the edge contour of the image formed by the acquisition target after the adjustment needs to be correspondingly processed, so that the problems of edge blurring or edge dislocation and the like of the image formed by the acquisition target in the original image are solved, and the imaging effect is finally improved.
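The boundary-only reprocessing described above can be sketched as follows (names and the doubling/zeroing operations are illustrative assumptions): only pixels whose subject/background classification changed between the pre- and post-adjustment contours are reprocessed, using the new classification to pick the operation.

```python
def reprocess_boundary(img, old_mask, new_mask, subject_op, background_op):
    """Reapply processing only where the pre/post-adjustment contours
    disagree; the adjusted mask decides which operation applies."""
    out = [row[:] for row in img]
    for y in range(len(img)):
        for x in range(len(img[0])):
            if old_mask[y][x] != new_mask[y][x]:
                op = subject_op if new_mask[y][x] else background_op
                out[y][x] = op(img[y][x])
    return out

img = [[1, 2], [3, 4]]
old = [[True, False], [False, False]]
new = [[True, True], [False, False]]      # pixel (1, 0) became subject
out = reprocess_boundary(img, old, new,
                         subject_op=lambda v: v * 10,   # stand-in for sharpening
                         background_op=lambda v: 0)     # stand-in for blurring
print(out)  # [[1, 20], [3, 4]]
```

Pixels whose classification did not change keep their original processing, which is what limits the extra work to the band between the two contours.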
Further, the present disclosure also provides a specific embodiment to further understand the image capturing method provided by the embodiment of the present disclosure.
When image acquisition is performed, the background region other than the acquisition target often needs to be blurred so as to highlight the acquisition target. In the related art, an image is segmented using a deep learning model to complete the blurring of the background region. However, background blurring completed in this way often suffers from problems such as inaccurate edges between the acquisition target and the blurred background, caused by inaccurate image segmentation.
This embodiment addresses the frequently encountered problems of deviated blurring edges when capturing an acquisition target, and of image details of the acquisition target being wrongly included or lost.
Referring specifically to fig. 5, fig. 5 is a schematic diagram illustrating yet another scenario of an image capture method according to an exemplary embodiment, as shown in fig. 5,
when the mobile terminal 50 starts the image acquisition module to acquire an image, firstly, the direction of the acquisition target 31 is determined according to the focusing area of the image acquisition; starting a radar sensor, and transmitting radar waves to the direction of the acquisition target 31 to acquire the contour information of the acquisition target 31; and according to the contour information, optimizing a first image obtained by currently carrying out image acquisition on an acquisition target, and finally acquiring a second image including the acquisition target.
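The radar step above can be illustrated with a minimal sketch (not the patent's signal processing; all names and parameters here are assumptions): a round-trip echo delay maps to a target distance via d = c·t/2, and a distance plus a beam azimuth gives one contour point.

```python
import math

C = 299_792_458.0  # speed of light, m/s

def echo_distance(delay_s):
    """Round-trip echo delay -> target distance: d = c * t / 2."""
    return C * delay_s / 2.0

def contour_point(delay_s, azimuth_rad):
    """One contour point in the sensor plane, from a delay and a beam azimuth."""
    d = echo_distance(delay_s)
    return (d * math.cos(azimuth_rad), d * math.sin(azimuth_rad))

print(round(echo_distance(2e-9), 4))  # 2 ns round trip -> about 0.2998 m
```

Sweeping the azimuth over the direction of the acquisition target 31 and collecting the resulting points would yield the contour information used in step 64.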
Specifically, referring to fig. 6, fig. 6 is a specific flowchart illustrating an image capturing method according to an exemplary embodiment, and as shown in fig. 6, the method includes:
step 61: the mobile phone enters a focusing shooting mode.
Here, a mobile phone is taken as an example of the mobile terminal. When the mobile phone enters the focusing shooting mode, this indicates that the mobile phone is performing image acquisition. That is, step 61 may correspond to "when image acquisition is performed" in step 101 described in the above embodiment.
Step 62: transmitting radar waves;
and step 63: and monitoring and acquiring the contour information of the target by using the radar wave.
Here, steps 62 and 63 may correspond to "transmitting radar waves and detecting echoes of the radar waves" in step 101 and "determining the contour information of the acquisition target according to the transmission parameters of the radar waves and the echo parameters of the echoes" in step 102 described in the above embodiment.
Step 64: optimizing image blurring information according to the contour information;
step 65: and outputting the shot picture.
Here, the shot photograph is the second image described in the above embodiment, and optimizing the image blurring information according to the contour information means adjusting, according to the contour information, the image information of the first image formed by the acquisition target. In this way, image content that does not actually belong to the acquisition target can be blurred, and image content that does belong to the acquisition target can be sharpened, thereby optimizing the image blurring information.
In this embodiment, the contour information of the acquisition target is obtained through radar waves during image acquisition, so that the obtained contour information is more accurate, the imaging effect of the image is improved, and the problem of edge blurring in the original image is solved. Meanwhile, the imaging of the acquisition target can be supplemented, solving the problem of blurred and misaligned edge information in image segmentation.
FIG. 7 is a block diagram illustrating an image capture device according to an exemplary embodiment. Referring to fig. 7, the apparatus is applied to a mobile terminal and includes: a transmission and detection module 71, a first determination module 72, an adjustment module 73 and an acquisition module 74;
the transmitting and detecting module 71 is configured to transmit radar waves and detect echoes of the radar waves when image acquisition is performed;
the first determining module 72 is configured to determine contour information of an acquisition target according to the emission parameters of the radar waves and the echo parameters of the echoes;
the adjusting module 73 is configured to adjust image information of the first image formed by the acquisition target according to the contour information;
an acquisition module 74 configured to acquire a second image comprising the acquisition target image based on the adjusted image information.
In some embodiments, the adjusting module 73 is further configured to:
and adjusting the edge contour of the preview image of the acquisition target presented in the preview page of the mobile terminal according to the contour information.
In some embodiments, the obtaining module 74 further includes:
an adjusting submodule configured to adjust a focusing area on the preview page for the acquisition target based on the adjusted edge profile;
an acquisition sub-module configured to acquire the second image including the acquisition target image using the adjusted focus region.
In some embodiments, the apparatus further comprises:
a display module configured to display a reference graphic object on the preview page, wherein the shape of the reference graphic object is consistent with the shape of the adjusted edge contour.
In some embodiments, the first image comprises: a third image of the acquisition target acquired after previewing;
the adjusting module 73 is further configured to:
and adjusting the image information of the third image according to the contour information.
In some embodiments, the third image comprises: a subject region imaged by the acquisition target;
the adjusting module 73 is further configured to:
and sharpening the main body area according to the contour information.
In some embodiments, the third image further comprises: a background region outside the acquisition target image;
the device further comprises:
a blurring module configured to blur the background region.
With regard to the apparatus in the above-described embodiment, the specific manner in which each module performs the operation has been described in detail in the embodiment related to the method, and will not be elaborated here.
Fig. 8 is a block diagram illustrating a mobile terminal 800 according to an example embodiment. For example, the mobile terminal 800 may be a mobile phone, a computer, a digital broadcast terminal, a messaging device, a game console, a tablet device, a medical device, a fitness device, a personal digital assistant, and the like.
Referring to fig. 8, the mobile terminal 800 may include one or more of the following components: a processing component 802, a memory 804, a power component 806, a multimedia component 808, an audio component 810, an input/output (I/O) interface 812, a sensor component 814, and a communication component 816.
The processing component 802 generally controls overall operation of the mobile terminal 800, such as operations associated with display, telephone calls, data communications, camera operations, and recording operations. The processing component 802 may include one or more processors 820 to execute instructions to perform all or a portion of the steps of the methods described above. Further, the processing component 802 can include one or more modules that facilitate interaction between the processing component 802 and other components. For example, the processing component 802 can include a multimedia module to facilitate interaction between the multimedia component 808 and the processing component 802.
The memory 804 is configured to store various types of data to support operation of the mobile terminal 800. Examples of such data include instructions for any application or method operating on the mobile terminal 800, contact data, phonebook data, messages, pictures, videos, and the like. The memory 804 may be implemented by any type or combination of volatile or non-volatile memory devices such as Static Random Access Memory (SRAM), electrically erasable programmable read-only memory (EEPROM), erasable programmable read-only memory (EPROM), programmable read-only memory (PROM), read-only memory (ROM), magnetic memory, flash memory, magnetic or optical disks.
The power component 806 provides power for the various components of the mobile terminal 800. The power component 806 may include a power management system, one or more power sources, and other components associated with generating, managing, and distributing power for the mobile terminal 800.
The multimedia component 808 includes a screen providing an output interface between the mobile terminal 800 and a user. In some embodiments, the screen may include a Liquid Crystal Display (LCD) and a Touch Panel (TP). If the screen includes a touch panel, the screen may be implemented as a touch screen to receive an input signal from a user. The touch panel includes one or more touch sensors to sense touch, slide, and gestures on the touch panel. The touch sensor may not only sense the boundary of a touch or slide action, but also detect the duration and pressure associated with the touch or slide operation. In some embodiments, the multimedia component 808 includes a front facing camera and/or a rear facing camera. The front camera and/or the rear camera may receive external multimedia data when the mobile terminal 800 is in an operation mode, such as a photographing mode or a video mode. Each front camera and rear camera may be a fixed optical lens system or have focusing and optical zoom capabilities.
The audio component 810 is configured to output and/or input audio signals. For example, the audio component 810 includes a Microphone (MIC) configured to receive external audio signals when the mobile terminal 800 is in an operational mode, such as a call mode, a recording mode, and a voice recognition mode. The received audio signals may further be stored in the memory 804 or transmitted via the communication component 816. In some embodiments, the audio component 810 also includes a speaker for outputting audio signals.
The I/O interface 812 provides an interface between the processing component 802 and peripheral interface modules, which may be keyboards, click wheels, buttons, etc. These buttons may include, but are not limited to: a home button, a volume button, a start button, and a lock button.
The sensor component 814 includes one or more sensors for providing various aspects of state assessment for the mobile terminal 800. For example, the sensor component 814 may detect an open/closed state of the mobile terminal 800 and the relative positioning of components, such as the display and keypad of the mobile terminal 800. The sensor component 814 may also detect a change in position of the mobile terminal 800 or a component of the mobile terminal 800, the presence or absence of user contact with the mobile terminal 800, the orientation or acceleration/deceleration of the mobile terminal 800, and a change in temperature of the mobile terminal 800. The sensor component 814 may include a proximity sensor configured to detect the presence of a nearby object without any physical contact. The sensor component 814 may also include a light sensor, such as a CMOS or CCD image sensor, for use in imaging applications. In some embodiments, the sensor component 814 may also include an acceleration sensor, a gyroscope sensor, a magnetic sensor, a pressure sensor, or a temperature sensor.
The communication component 816 is configured to facilitate communications between the mobile terminal 800 and other devices in a wired or wireless manner. The mobile terminal 800 may access a wireless network based on a communication standard, such as WiFi, 2G or 3G, or a combination thereof. In an exemplary embodiment, the communication component 816 receives a broadcast signal or broadcast related information from an external broadcast management system via a broadcast channel. In an exemplary embodiment, the communication component 816 further includes a Near Field Communication (NFC) module to facilitate short-range communications. For example, the NFC module may be implemented based on Radio Frequency Identification (RFID) technology, infrared data association (IrDA) technology, Ultra Wideband (UWB) technology, Bluetooth (BT) technology, and other technologies.
In an exemplary embodiment, the mobile terminal 800 may be implemented by one or more Application Specific Integrated Circuits (ASICs), Digital Signal Processors (DSPs), Digital Signal Processing Devices (DSPDs), Programmable Logic Devices (PLDs), Field Programmable Gate Arrays (FPGAs), controllers, micro-controllers, microprocessors or other electronic components for performing the above-described methods.
In an exemplary embodiment, a non-transitory computer-readable storage medium comprising instructions, such as the memory 804 comprising instructions, executable by the processor 820 of the mobile terminal 800 to perform the above-described method is also provided. For example, the non-transitory computer readable storage medium may be a ROM, a Random Access Memory (RAM), a CD-ROM, a magnetic tape, a floppy disk, an optical data storage device, and the like.
A non-transitory computer readable storage medium, wherein instructions of the storage medium, when executed by a processor of a mobile terminal, enable the mobile terminal to perform the image capturing method according to the above embodiments.
Other embodiments of the invention will be apparent to those skilled in the art from consideration of the specification and practice of the invention disclosed herein. This application is intended to cover any variations, uses, or adaptations of the invention following, in general, the principles of the invention and including such departures from the present disclosure as come within known or customary practice within the art to which the invention pertains. It is intended that the specification and examples be considered as exemplary only, with a true scope and spirit of the invention being indicated by the following claims.
It will be understood that the invention is not limited to the precise arrangements described above and shown in the drawings and that various modifications and changes may be made without departing from the scope thereof. The scope of the invention is limited only by the appended claims.

Claims (16)

1. An image acquisition method is applied to a mobile terminal, and comprises the following steps:
when image acquisition is carried out, transmitting radar waves and detecting echoes of the radar waves;
determining the contour information of the acquired target according to the transmitting parameters of the radar waves and the echo parameters of the echo;
adjusting the image information of a first image formed by the acquisition target according to the contour information;
and acquiring a second image including the acquired target image based on the adjusted image information.
2. The method of claim 1, wherein said adjusting image information of the first image of the acquisition target based on the contour information comprises:
and adjusting the edge contour of the preview image of the acquisition target presented in the preview page of the mobile terminal according to the contour information.
3. The method of claim 2, wherein the obtaining a second image including the acquisition target based on the adjusted image information comprises:
adjusting a focusing area on the preview page for the acquisition target based on the adjusted edge profile;
acquiring the second image including the acquisition target image by using the adjusted focusing area.
4. A method according to claim 2 or 3, characterized in that the method further comprises:
and displaying a reference graphic object on the preview page, wherein the shape of the reference graphic object is consistent with the shape of the adjusted edge contour.
5. The method of claim 1, wherein the first image comprises: a third image of the acquisition target acquired after previewing;
the adjusting the image information of the first image formed by the acquisition target according to the contour information comprises:
and adjusting the image information of the third image according to the contour information.
6. The method of claim 5, wherein the third image comprises: a main body area where the acquisition target is imaged;
the adjusting the image information of the third image according to the contour information includes:
and sharpening the main body area according to the contour information.
7. The method of claim 5, wherein the third image further comprises: a background region outside the acquisition target image;
the method further comprises the following steps:
blurring the background area.
8. An image acquisition device, applied to a mobile terminal, comprising:
the transmitting and detecting module is configured to transmit radar waves and detect echoes of the radar waves when image acquisition is carried out;
the first determining module is configured to determine the contour information of an acquisition target according to the emission parameters of the radar waves and the echo parameters of the echo;
the adjusting module is configured to adjust image information of a first image formed by the acquisition target according to the contour information;
an acquisition module configured to acquire a second image including the acquisition target image based on the adjusted image information.
9. The apparatus of claim 8, wherein the adjustment module is further configured to:
and adjusting the edge contour of the preview image of the acquisition target presented in the preview page of the mobile terminal according to the contour information.
10. The apparatus of claim 9, wherein the obtaining module further comprises:
an adjusting submodule configured to adjust a focusing area on the preview page for the acquisition target based on the adjusted edge profile;
an acquisition sub-module configured to acquire the second image including the acquisition target image using the adjusted focus region.
11. The apparatus of claim 9 or 10, further comprising:
a display module configured to display a reference graphic object on the preview page, wherein the shape of the reference graphic object is consistent with the shape of the adjusted edge contour.
12. The apparatus of claim 8, wherein the first image comprises: a third image of the acquisition target acquired after previewing;
the adjustment module further configured to:
and adjusting the image information of the third image according to the contour information.
13. The apparatus of claim 12, wherein the third image comprises: the main body area where the acquisition target is imaged;
the adjustment module further configured to:
and sharpening the main body area according to the contour information.
14. The apparatus of claim 12, wherein the third image further comprises: a background region outside the acquisition target image;
the device further comprises:
a blurring module configured to blur the background region.
15. A mobile terminal, comprising:
a processor;
a memory for storing processor-executable instructions;
wherein the processor is configured to execute the executable instructions to implement the method steps of any one of claims 1 to 7.
16. A non-transitory computer-readable storage medium, on which a computer program is stored, characterized in that the program is executed by a processor to implement the method steps of any of claims 1 to 7.
CN202010054546.0A 2020-01-17 2020-01-17 Image acquisition method and device, mobile terminal and storage medium Active CN113138387B (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN202010054546.0A CN113138387B (en) 2020-01-17 2020-01-17 Image acquisition method and device, mobile terminal and storage medium

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
CN202010054546.0A CN113138387B (en) 2020-01-17 2020-01-17 Image acquisition method and device, mobile terminal and storage medium

Publications (2)

Publication Number Publication Date
CN113138387A true CN113138387A (en) 2021-07-20
CN113138387B CN113138387B (en) 2024-03-08

Family

ID=76808596

Family Applications (1)

Application Number Title Priority Date Filing Date
CN202010054546.0A Active CN113138387B (en) 2020-01-17 2020-01-17 Image acquisition method and device, mobile terminal and storage medium

Country Status (1)

Country Link
CN (1) CN113138387B (en)

Patent Citations (14)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US20030053689A1 (en) * 2001-08-27 2003-03-20 Fujitsu Limited Image processing method and systems
US20070230760A1 (en) * 2004-05-19 2007-10-04 Yasuo Omi Image Processing Device and Method
CN105450946A (en) * 2014-06-09 2016-03-30 华为技术有限公司 Image processing method and device and calculation device
CN105933607A (en) * 2016-05-26 2016-09-07 维沃移动通信有限公司 Photographing effect adjusting method of mobile terminal and mobile terminal
CN105933589A (en) * 2016-06-28 2016-09-07 广东欧珀移动通信有限公司 Image processing method and terminal
CN107197170A (en) * 2017-07-14 2017-09-22 维沃移动通信有限公司 A kind of exposal control method and mobile terminal
CN107395965A (en) * 2017-07-14 2017-11-24 维沃移动通信有限公司 A kind of image processing method and mobile terminal
CN107507239A (en) * 2017-08-23 2017-12-22 维沃移动通信有限公司 A kind of image partition method and mobile terminal
CN107370958A (en) * 2017-08-29 2017-11-21 广东欧珀移动通信有限公司 Image virtualization processing method, device and camera terminal
WO2019042216A1 (en) * 2017-08-29 2019-03-07 Oppo广东移动通信有限公司 Image blurring processing method and device, and photographing terminal
CN107948517A (en) * 2017-11-30 2018-04-20 广东欧珀移动通信有限公司 Preview screen virtualization processing method, device and equipment
CN107977940A (en) * 2017-11-30 2018-05-01 广东欧珀移动通信有限公司 background blurring processing method, device and equipment
CN109496423A (en) * 2018-10-15 2019-03-19 华为技术有限公司 Image display method and electronic equipment under a kind of photographed scene
CN109444839A (en) * 2018-11-29 2019-03-08 北京行易道科技有限公司 The acquisition methods and device of objective contour

Non-Patent Citations (2)

* Cited by examiner, † Cited by third party
Title
HANSU KIM: "Virtualization of video quality enhancement process for mobile device", 2014 INTERNATIONAL CONFERENCE ON INFORMATION AND COMMUNICATION TECHNOLOGY CONVERGENCE (ICTC), 15 December 2014 (2014-12-15) *
陈灿 (CHEN Can): "Research and Application of Interactive Image Measurement on Mobile Devices", China Master's Theses Full-text Database, Information Science and Technology, 15 February 2018 (2018-02-15) *

Also Published As

Publication number Publication date
CN113138387B (en) 2024-03-08

Similar Documents

Publication Publication Date Title
CN107025419B (en) Fingerprint template inputting method and device
CN108154465B (en) Image processing method and device
CN110557547B (en) Lens position adjusting method and device
CN111105454B (en) Method, device and medium for obtaining positioning information
CN111314597A (en) Terminal, focusing method and device
CN108154466B (en) Image processing method and device
CN109784164B (en) Foreground identification method and device, electronic equipment and storage medium
CN112200040A (en) Occlusion image detection method, device and medium
CN108717542B (en) Method and device for recognizing character area and computer readable storage medium
CN110650288B (en) Focusing control method and device, electronic equipment and computer readable storage medium
CN109509195B (en) Foreground processing method and device, electronic equipment and storage medium
CN107292901B (en) Edge detection method and device
CN106469446B (en) Depth image segmentation method and segmentation device
CN112188096A (en) Photographing method and device, terminal and storage medium
CN113138387B (en) Image acquisition method and device, mobile terminal and storage medium
CN114244999B (en) Automatic focusing method, device, image pickup apparatus and storage medium
CN112702514B (en) Image acquisition method, device, equipment and storage medium
US11252341B2 (en) Method and device for shooting image, and storage medium
CN113315903B (en) Image acquisition method and device, electronic equipment and storage medium
CN114612485A (en) Image clipping method and device and storage medium
CN109035136B (en) Image processing method and device and storage medium
CN114418865A (en) Image processing method, device, equipment and storage medium
CN114390189A (en) Image processing method, device, storage medium and mobile terminal
CN107707819B (en) Image shooting method, device and storage medium
CN113141458A (en) Image acquisition method and device and storage medium

Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination
GR01 Patent grant