CN112217992A - Image blurring method, image blurring device, mobile terminal, and storage medium - Google Patents

Image blurring method, image blurring device, mobile terminal, and storage medium

Info

Publication number: CN112217992A
Application number: CN202011049565.0A
Authority: CN (China)
Prior art keywords: image to be processed, position information, region to be blurred
Other languages: Chinese (zh)
Inventor: 周晨光 (Zhou Chenguang)
Current assignee: Oppo Chongqing Intelligent Technology Co Ltd
Original assignee: Oppo Chongqing Intelligent Technology Co Ltd
Priority and filing date: 2020-09-29
Publication date: 2021-01-12
Legal status: Pending


Classifications

    • H: ELECTRICITY
    • H04: ELECTRIC COMMUNICATION TECHNIQUE
    • H04N: PICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N 23/00: Cameras or camera modules comprising electronic image sensors; Control thereof
    • H04N 23/80: Camera processing pipelines; Components thereof
    • H04N 23/60: Control of cameras or camera modules
    • H04N 23/67: Focus control based on electronic image sensor signals
    • H04N 23/675: Focus control based on electronic image sensor signals comprising setting of focusing regions
    • H04N 5/00: Details of television systems
    • H04N 5/222: Studio circuitry; Studio devices; Studio equipment
    • H04N 5/262: Studio circuits, e.g. for mixing, switching-over, change of character of image, other special effects; Cameras specially adapted for the electronic generation of special effects

Landscapes

  • Engineering & Computer Science (AREA)
  • Multimedia (AREA)
  • Signal Processing (AREA)
  • Studio Devices (AREA)

Abstract

The application is applicable to the technical field of image processing, and provides an image blurring method, an image blurring device, a mobile terminal and a storage medium. The image blurring method is applied to a mobile terminal that comprises an infrared sensor and a camera, and comprises the following steps: acquiring an image to be processed acquired by the camera, wherein the image to be processed is an image of a target shooting scene; if target position information sent by the infrared sensor is received, determining a region to be blurred in the image to be processed according to the target position information, wherein the target position information refers to the position information, in the image to be processed, of an object radiating infrared rays in the target shooting scene; and performing blurring processing on the region to be blurred to obtain a blurred image. The method and the device can improve image blurring efficiency.

Description

Image blurring method, image blurring device, mobile terminal, and storage medium
Technical Field
The present application belongs to the field of image processing technologies, and in particular, to an image blurring method, an image blurring device, a mobile terminal, and a storage medium.
Background
With the development of mobile terminals, their functions have become more and more diverse. For example, when a mobile terminal is used to take pictures, it can perform blurring processing on the captured image so that the image has a clear sense of depth and layering. The current image blurring method generally obtains the region to be blurred in the image through image recognition. However, the image recognition algorithms involved are highly complex, so the blurring efficiency is low.
Disclosure of Invention
The application provides an image blurring method, an image blurring device, a mobile terminal and a storage medium, so as to improve the image blurring efficiency.
In a first aspect, an embodiment of the present application provides an image blurring method, which is applied to a mobile terminal, where the mobile terminal includes an infrared sensor and a camera, and the image blurring method includes:
acquiring an image to be processed acquired by the camera, wherein the image to be processed is an image of a target shooting scene;
if target position information sent by the infrared sensor is received, determining a region to be blurred in the image to be processed according to the target position information, wherein the target position information refers to position information of an object radiating infrared rays in the target shooting scene in the image to be processed;
and performing virtualization processing on the region to be virtualized to obtain a virtualized image.
In a second aspect, an embodiment of the present application provides an image blurring device, which is applied to a mobile terminal, where the mobile terminal includes an infrared sensor and a camera, and the image blurring device includes:
the image acquisition module is used for acquiring an image to be processed acquired by the camera, wherein the image to be processed is an image of a target shooting scene;
the area determining module is used for determining an area to be blurred in the image to be processed according to the target position information if the target position information sent by the infrared sensor is received, wherein the target position information refers to the position information of an object radiating infrared rays in the target shooting scene in the image to be processed;
and the blurring processing module is used for blurring the area to be blurred to obtain a blurred image.
In a third aspect, an embodiment of the present application provides a mobile terminal, which includes an infrared sensor, a camera, a memory, a processor, and a computer program stored in the memory and executable on the processor, where the processor implements the steps of the image blurring method according to the first aspect when executing the computer program.
In a fourth aspect, the present application provides a computer-readable storage medium, which stores a computer program, and when the computer program is executed by a processor, the computer program implements the steps of the image blurring method according to the first aspect.
In a fifth aspect, embodiments of the present application provide a computer program product, which, when run on a mobile terminal, causes the mobile terminal to perform the steps of the image blurring method according to the first aspect.
It can thus be seen that, by providing an infrared sensor in the mobile terminal, the application can acquire the position information of an infrared-radiating object in the image to be processed while the camera captures that image, determine the region to be blurred in the image according to this position information, and obtain the blurred image by blurring that region. In other words, the application identifies the region to be blurred using the position information, sent by the infrared sensor, of the object radiating infrared rays in the image to be processed. Since no image recognition is needed to determine the region to be blurred, the recognition efficiency of the region to be blurred is improved, and the image blurring efficiency is improved in turn.
Drawings
In order to illustrate the technical solutions in the embodiments of the present application more clearly, the drawings needed in the description of the embodiments or the prior art are briefly introduced below. Obviously, the drawings in the following description are only some embodiments of the present application, and those skilled in the art can obtain other drawings from them without inventive effort.
Fig. 1 is a schematic flow chart illustrating an implementation of an image blurring method according to an embodiment of the present application;
fig. 2 is a schematic flow chart illustrating an implementation of an image blurring method according to a second embodiment of the present application;
fig. 3 is a schematic flow chart illustrating an implementation of an image blurring method according to a third embodiment of the present application;
fig. 4 is a schematic structural diagram of an image blurring device according to a fourth embodiment of the present application;
fig. 5 is a schematic structural diagram of a mobile terminal according to a fifth embodiment of the present application;
fig. 6 is a schematic structural diagram of a mobile terminal according to a sixth embodiment of the present application.
Detailed Description
In the following description, for purposes of explanation and not limitation, specific details are set forth, such as particular system structures, techniques, etc. in order to provide a thorough understanding of the embodiments of the present application. It will be apparent, however, to one skilled in the art that the present application may be practiced in other embodiments that depart from these specific details. In other instances, detailed descriptions of well-known systems, devices, circuits, and methods are omitted so as not to obscure the description of the present application with unnecessary detail.
It will be understood that the terms "comprises" and/or "comprising," when used in this specification and the appended claims, specify the presence of stated features, integers, steps, operations, elements, and/or components, but do not preclude the presence or addition of one or more other features, integers, steps, operations, elements, components, and/or groups thereof.
In particular implementations, the mobile terminals described in the embodiments of the present application include, but are not limited to, portable devices such as mobile phones, laptop computers, or tablet computers having touch-sensitive surfaces (e.g., touch screen displays and/or touch pads). It should also be understood that in some embodiments, the device is not a portable communication device but a desktop computer having a touch-sensitive surface (e.g., a touch screen display and/or touchpad).
In the discussion that follows, a mobile terminal that includes a display and a touch-sensitive surface is described. However, it should be understood that the mobile terminal may include one or more other physical user interface devices such as a physical keyboard, mouse, and/or joystick.
The mobile terminal supports various applications, such as one or more of the following: a drawing application, a presentation application, a word processing application, a website creation application, a disc burning application, a spreadsheet application, a gaming application, a telephone application, a video conferencing application, an email application, an instant messaging application, an exercise support application, a photo management application, a digital camera application, a web browsing application, a digital music player application, and/or a digital video player application.
Various applications that may be executed on the mobile terminal may use at least one common physical user interface device, such as a touch-sensitive surface. One or more functions of the touch-sensitive surface and corresponding information displayed on the terminal can be adjusted and/or changed between applications and/or within respective applications. In this way, a common physical architecture (e.g., touch-sensitive surface) of the terminal can support various applications with user interfaces that are intuitive and transparent to the user.
It should be understood that the sequence numbers of the steps in the embodiments do not imply an execution order; the execution order of each process should be determined by its function and internal logic, and should not constitute any limitation on the implementation of the embodiments of the present application.
In order to explain the technical solution described in the present application, the following description will be given by way of specific examples.
Referring to fig. 1, a schematic flowchart of an image blurring method provided in the first embodiment of the present application: the method is applied to a mobile terminal that includes an infrared sensor and a camera and, as shown in the figure, may include the following steps:
step 101, acquiring an image to be processed, which is acquired by a camera.
The image to be processed is an image of a target shooting scene, and specifically may be a preview image displayed on a preview interface when the image of the target shooting scene is acquired, that is, a preview image of the target shooting scene. The target shooting scene may refer to any shooting scene.
Taking the preview image as an example: when the mobile terminal detects that its installed camera application has been started, a preview interface is displayed on the screen of the mobile terminal, and a preview image of the target shooting scene is displayed on that interface; this displayed preview image is the image to be processed of the target shooting scene.
And 102, if target position information sent by the infrared sensor is received, determining a region to be blurred in the image to be processed according to the target position information.
The target position information refers to position information of an object radiating infrared rays in a target shooting scene in an image to be processed, and specifically, the infrared sensor can obtain the position information of the object radiating infrared rays in the image to be processed according to the received radiation direction of the infrared rays.
The position information of the infrared ray-radiated object in the image to be processed may refer to coordinates of the image of the infrared ray-radiated object in a pixel coordinate system of the image to be processed.
The infrared sensor is a sensor that senses the infrared rays radiated by an object and performs measurement using the physical properties of infrared rays. Any object with a temperature above absolute zero (e.g., a human or an animal) radiates infrared light. When blurring the image to be processed, the focus is placed on a photographed subject so that everything other than that subject in the target shooting scene is blurred, and the subject is usually an object with temperature (i.e., an object capable of radiating infrared rays). Therefore, the infrared sensor provided in the mobile terminal detects whether an object radiating infrared rays exists in the target shooting scene; when such an object exists, the position information of its image in the image to be processed is obtained, and the region to be blurred in the image to be processed can be derived from this position information. An object being able to radiate infrared rays can be understood as the object having an infrared characteristic.
Because the infrared sensor is not influenced by ambient light, it can accurately sense whether an infrared-radiating object exists even in a complex shooting scene or a dark environment, and obtain the target position information accurately when such an object exists. This improves the accuracy of the target position information and, in turn, the accuracy of the region to be blurred.
When the infrared sensor senses that an object radiating infrared rays exists in the target shooting scene, it acquires the target position information and sends it to the processor; after receiving the target position information, the processor determines the region to be blurred in the image to be processed acquired by the camera. If the infrared sensor does not sense any infrared-radiating object in the target shooting scene, it can be determined that the image of the target shooting scene, i.e., the image to be processed, does not need blurring processing. It should be noted that, when sending the target position information to the processor, the infrared sensor may send it in binary form, which simplifies the computation needed to identify the region to be blurred and improves the identification efficiency. In the binary form, a two-dimensional matrix is set up based on the pixel coordinate system of the image to be processed: coordinates where an infrared-radiating object is detected are marked 1, and coordinates where no infrared-radiating object is detected are marked 0. When the processor receives this two-dimensional matrix, the region occupied by the infrared-radiating object in the image to be processed is obtained by collecting the positions of the 1 entries, and the region to be blurred is obtained by collecting the positions of the 0 entries.
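As a minimal sketch of how the processor side of this scheme might look (the NumPy representation, the function name, and the toy dimensions are illustrative assumptions, not taken from the patent):

```python
import numpy as np

def split_regions(ir_matrix: np.ndarray):
    """Split the binary matrix reported by the infrared sensor into a
    focus mask (entries equal to 1, where an infrared-radiating object
    was sensed) and a to-be-blurred mask (entries equal to 0)."""
    focus_mask = ir_matrix.astype(bool)
    blur_mask = ~focus_mask
    return focus_mask, blur_mask

# Toy 4x6 "sensor report": a warm object occupies the centre pixels.
ir_matrix = np.zeros((4, 6), dtype=np.uint8)
ir_matrix[1:3, 2:4] = 1
focus_mask, blur_mask = split_regions(ir_matrix)
```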
In this embodiment, the relative position of the infrared sensor and the camera in the mobile terminal is fixed. Since the camera position in the mobile terminal is usually fixed, the position of the infrared sensor needs to be calibrated when it is installed, to ensure that the infrared sensing view angle of the infrared sensor is the same as the image acquisition view angle of the camera. The infrared sensing range then matches the image acquisition range, and the position information determined from the radiation direction of the infrared rays is exactly the position information of the infrared-radiating object in the image to be processed. Specifically, a pixel coordinate system generally exists to describe the position of each pixel in the image to be processed, and a position coordinate system can likewise be set for the infrared-radiating object. By making the infrared sensing view angle identical to the image acquisition view angle, the position coordinate system coincides with the pixel coordinate system, so when the infrared sensor senses infrared rays, the coordinates of the radiating object obtained from the radiation direction are directly its coordinates in the pixel coordinate system of the image to be processed.
The image acquisition view angle of the camera refers to the range the camera can cover, and the infrared sensing view angle of the infrared sensor refers to the range the infrared sensor can sense.
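The patent itself only requires that the two view angles coincide. Under that assumption, a sensed radiation direction maps directly to pixel coordinates; the sketch below uses a simple pinhole projection, and the angle convention, field-of-view values, and resolution are illustrative assumptions:

```python
import math

def direction_to_pixel(yaw_deg: float, pitch_deg: float,
                       hfov_deg: float, vfov_deg: float,
                       width: int, height: int):
    """Map a radiation direction (angles from the optical axis; yaw to
    the right, pitch downward) to pixel coordinates, assuming the
    infrared sensor and camera share one view angle and a pinhole model."""
    fx = (width / 2) / math.tan(math.radians(hfov_deg / 2))
    fy = (height / 2) / math.tan(math.radians(vfov_deg / 2))
    u = width / 2 + fx * math.tan(math.radians(yaw_deg))
    v = height / 2 + fy * math.tan(math.radians(pitch_deg))
    return int(round(u)), int(round(v))

# A source 10 degrees right of and 5 degrees above the axis, with an
# assumed 78 x 49 degree field of view and a 1920x1080 preview image.
print(direction_to_pixel(10.0, -5.0, 78.0, 49.0, 1920, 1080))
```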
Optionally, determining the region to be blurred in the image to be processed according to the target position information includes:
determining a focus area in the image to be processed according to the target position information, wherein the focus area refers to an area of an object radiating infrared rays in the image to be processed;
and determining the area except the focus area in the image to be processed as the area to be blurred.
The focus area refers to an area which does not need to be subjected to blurring processing in the image to be processed, and the area to be blurred refers to an area which needs to be subjected to blurring processing in the image to be processed.
Optionally, after determining the focus area in the image to be processed, the method further includes:
acquiring contour information of a focusing area;
detecting whether an object radiating infrared rays is a target object or not according to the contour information of the focusing area;
correspondingly, determining the region other than the focus region in the image to be processed as the region to be blurred comprises:
and if the object radiating the infrared ray is the target object, determining the area except the focusing area in the image to be processed as the area to be blurred.
In one embodiment, at least one contour template of a target object may be preset. When a single contour template is preset: if the contour information of the focus area matches the template, the object radiating infrared rays is determined to be the target object; otherwise it is determined not to be the target object. When at least two different contour templates are preset: if the contour information of the focus area matches any one of them, the object radiating infrared rays is determined to be the target object whose template matched; if it matches none of them, the object is determined not to be a target object. Presetting contour templates of at least two different target objects allows blurring of images containing different target objects, adapting to different shooting scenes. The contour information of the focus area may refer to the contour information, in the image to be processed, of the object radiating infrared rays.
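A minimal sketch of this template-matching variant, assuming OpenCV's Hu-moment shape comparison (cv2.matchShapes) stands in for the patent's unspecified matching criterion; the threshold, names, and toy shapes are illustrative:

```python
import cv2
import numpy as np

MATCH_THRESHOLD = 0.15  # assumed cut-off; lower scores mean more similar

def match_target_object(focus_contour, templates):
    """Compare the focus-area contour against preset contour templates and
    return the best-matching target object's name, or None if no template
    is close enough (i.e. the infrared-radiating object is not a target)."""
    best_name, best_score = None, float("inf")
    for name, template in templates.items():
        score = cv2.matchShapes(focus_contour, template,
                                cv2.CONTOURS_MATCH_I1, 0.0)
        if score < best_score:
            best_name, best_score = name, score
    return best_name if best_score < MATCH_THRESHOLD else None

# Toy usage: a contour matched against itself scores 0.0 and passes.
mask = np.zeros((64, 64), np.uint8)
cv2.circle(mask, (32, 32), 20, 255, -1)
contour = cv2.findContours(mask, cv2.RETR_EXTERNAL,
                           cv2.CHAIN_APPROX_SIMPLE)[0][0]
print(match_target_object(contour, {"round_object": contour}))  # round_object
```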
In another embodiment, a target object detection model may be set in advance, and the contour information of the focus area may be input to the target object detection model, by which it is possible to detect whether or not an object that radiates infrared rays is a target object.
In the embodiment, whether the object radiating infrared rays is a target object is detected, and when the object radiating infrared rays is the target object, the region except the focus region in the image to be processed is determined to be the region to be blurred, so that the region to be blurred can be more accurately acquired.
Optionally, before determining the region to be blurred in the image to be processed according to the target position information if the target position information sent by the infrared sensor is received, the method further includes:
the virtualization processing function is started.
A blurring processing function option may be set in the camera application (for example, in its settings). When it is detected that the user has selected this option, the blurring processing function is started, so that the region to be blurred in the image to be processed is obtained and blurring processing is performed on it to produce a picture with a clear sense of layering.
And 103, performing blurring processing on the area to be blurred to obtain a blurred image.
The blurring processing may be performed on the image according to a preset blurring processing algorithm. Optionally, the blurring algorithm may be set by the user according to actual requirements, which is not limited herein.
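As one possible preset algorithm (a sketch, not the patent's prescribed method): Gaussian-blur the whole frame once and composite the blurred pixels back through the to-be-blurred mask, e.g. the blur_mask from the earlier sketch; the function name and default kernel size are assumptions.

```python
import cv2
import numpy as np

def blur_region(image: np.ndarray, blur_mask: np.ndarray,
                kernel_size: int = 21) -> np.ndarray:
    """Blur only the region to be blurred: blur the whole frame, then
    copy the blurred pixels back where blur_mask is True."""
    blurred = cv2.GaussianBlur(image, (kernel_size, kernel_size), 0)
    result = image.copy()
    result[blur_mask] = blurred[blur_mask]
    return result
```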
After the blurred image is obtained, the blurred image may be displayed on a preview interface, or may be stored as a picture, which is not limited herein.
According to the embodiment of the application, the region to be blurred in the image to be processed is identified through hardware in the mobile terminal (namely, the infrared sensor), so the identification is fast and no complex image recognition algorithm is needed; this improves the identification efficiency of the region to be blurred and, in turn, the image blurring efficiency.
Referring to fig. 2, a schematic flowchart of an image blurring method provided in the second embodiment of the present application: the method is applied to a mobile terminal and, as shown in the figure, may include the following steps:
step 201, acquiring an image to be processed acquired by a camera.
The step is the same as step 101, and reference may be made to the related description of step 101, which is not described herein again.
Step 202, if the target position information and the temperature value of the object radiating the infrared ray sent by the infrared sensor are received, detecting whether the temperature value of the object radiating the infrared ray is within a preset temperature range.
When receiving infrared rays, the infrared sensor can convert their radiation energy into an electric signal and thereby obtain the temperature value of the object radiating them. It should be noted that the object radiating infrared rays may consist of several parts with different temperature values (for example, the temperature of a person's head differs from that of their hands); the average of the parts' temperature values may be used as the temperature value of the object, or the temperature value of a single part may be used, which is not limited herein.
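A trivial sketch of that reduction (the function name and the choice of the first part as the representative reading are assumptions):

```python
def object_temperature(part_temperatures, use_average: bool = True) -> float:
    """Reduce the temperature readings of an object's parts (e.g. head,
    hands) to one value: their mean, or a single representative part."""
    if use_average:
        return sum(part_temperatures) / len(part_temperatures)
    return part_temperatures[0]
```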
Because the temperature values of different objects may differ (the higher an object's temperature, the more infrared rays it radiates; the lower its temperature, the fewer), detecting whether the temperature value of the infrared-radiating object is within a preset temperature range further screens the objects that need to be focused (i.e., the objects that should not be blurred) in the target shooting scene. The object needing focus can thus be obtained more accurately, reducing the probability that it is blurred.
Different preset temperature ranges can be set for different objects, so that each preset temperature range selects its corresponding object fairly accurately, enabling different objects to be recognized.
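A sketch of such a per-object check; the concrete ranges below are illustrative assumptions, since the patent leaves them to the implementer:

```python
# Hypothetical preset temperature ranges per object class, in Celsius.
PRESET_RANGES = {
    "person": (35.0, 38.0),
    "pet": (37.5, 39.5),
}

def in_preset_range(temperature: float) -> bool:
    """True if the sensed temperature of the infrared-radiating object
    falls within any preset range, i.e. the object should stay focused
    and the rest of the image becomes the region to be blurred."""
    return any(low <= temperature <= high
               for low, high in PRESET_RANGES.values())
```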
Step 203, if the temperature value of the object radiating the infrared ray is within the preset temperature range, determining the region to be blurred in the image to be processed according to the target position information.
If the temperature value of the object radiating infrared rays is within the preset temperature range, the object is determined to be one that needs to be focused, so the region to be blurred in the image to be processed needs to be determined, making the focus area (i.e., the image or region of the infrared-radiating object in the image to be processed) stand out more clearly.
If the temperature value of the object radiating infrared rays is not within the preset temperature range, it is determined that the object does not need to be highlighted in the image to be processed; that is, the image to be processed has no region to be blurred and does not need blurring processing.
It should be noted that, in the present application, the infrared sensor is used to identify the to-be-blurred region in the to-be-processed image, and the to-be-blurred region may be located in the background of the to-be-processed image or may be located in the foreground of the to-be-processed image.
And 204, performing blurring processing on the area to be blurred to obtain a blurred image.
The step is the same as step 103, and reference may be made to the related description of step 103, which is not described herein again.
According to this embodiment, before the region to be blurred is determined from the target position information, whether the temperature value of the infrared-radiating object is within the preset temperature range is checked. The objects that need to be focused in the target shooting scene can thus be further screened and obtained accurately, reducing the probability that such an object is blurred.
Referring to fig. 3, a schematic flowchart of an image blurring method provided in the third embodiment of the present application: the method is applied to a mobile terminal and, as shown in the figure, may include the following steps:
step 301, acquiring an image to be processed acquired by a camera.
The step is the same as step 101, and reference may be made to the related description of step 101, which is not described herein again.
Step 302, if target position information sent by the infrared sensor is received, determining a region to be blurred in the image to be processed according to the target position information.
The step is the same as step 102, and reference may be made to the related description of step 102, which is not repeated herein.
Step 303, obtaining the distance between the object radiating infrared rays and the mobile terminal.
A distance sensor can be arranged in the mobile terminal to measure the distance between the object radiating infrared rays in the target shooting scene and the mobile terminal. It should be noted that this distance may also be acquired in other ways, which is not limited herein.
And step 304, acquiring the blurring level of the region to be blurred according to the distance between the object radiating infrared rays and the mobile terminal.
Different blurring levels correspond to different blurring degrees.
Specifically, the correspondence between distance and blurring level may be preset, so that after the distance between the infrared-radiating object and the mobile terminal is obtained, the blurring level of the region to be blurred can be determined from the preset correspondence.
It should be noted that, when setting this correspondence, the larger the distance between the infrared-radiating object and the mobile terminal, the smaller the blurring level of the region to be blurred and the lighter the blurring applied to it; the smaller the distance, the larger the blurring level and the heavier the blurring. That is, distance and blurring level are inversely related.
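A sketch of one such inverse correspondence; the distance brackets and kernel sizes are assumptions, since the patent fixes only the inverse relation. kernel_for_level could feed the kernel_size of the earlier blur_region sketch.

```python
# Assumed distance brackets (metres) mapped to blurring levels.
DISTANCE_LEVELS = [(1.0, 3), (3.0, 2), (float("inf"), 1)]

def blurring_level(distance_m: float) -> int:
    """Closer infrared-radiating object -> higher blurring level."""
    for max_distance, level in DISTANCE_LEVELS:
        if distance_m <= max_distance:
            return level
    return 1

def kernel_for_level(level: int) -> int:
    """Map a blurring level to an odd Gaussian kernel size, so a higher
    level yields a stronger blur (levels 1..3 -> kernels 11, 21, 31)."""
    return 10 * level + 1
```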
And step 305, performing blurring processing on the region to be blurred according to the blurring level of the region to be blurred, to obtain a blurred image.
Specifically, the blurring degree of the region to be blurred may be obtained according to its blurring level, and the blurring processing is performed on the region to be blurred according to that degree.
The embodiment of the application obtains the blurring level of the region to be blurred from the distance between the infrared-radiating object and the mobile terminal, so the blurring degree changes with that distance; the blurring effect can therefore adapt to different shooting scenes, improving blurring accuracy.
Fig. 4 is a schematic structural diagram of an image blurring device according to the fourth embodiment of the present application; for convenience of description, only the parts related to the embodiment are shown. The image blurring device is applied to a mobile terminal that includes an infrared sensor and a camera, and the image blurring device includes:
the image acquisition module 41 is configured to acquire an image to be processed acquired by a camera, where the image to be processed is an image of a target shooting scene;
the area determining module 42 is configured to determine an area to be blurred in the image to be processed according to target position information if the target position information sent by the infrared sensor is received, where the target position information is position information of an object radiating infrared rays in a target shooting scene in the image to be processed;
and a blurring processing module 43, configured to perform blurring processing on the region to be blurred, so as to obtain a blurred image.
Optionally, the area determining module 42 is specifically configured to:
if target position information and the temperature value of the object radiating infrared rays sent by the infrared sensor are received, detecting whether the temperature value of the object radiating infrared rays is within a preset temperature range;
and if the temperature value of the object radiating the infrared ray is within the preset temperature range, determining the area to be blurred in the image to be processed according to the target position information.
Optionally, the area determining module 42 is specifically configured to:
determining a focus area in the image to be processed according to the target position information, wherein the focus area refers to an area of an object radiating infrared rays in the image to be processed;
and determining the area except the focus area in the image to be processed as the area to be blurred.
Optionally, the region determining module 42 is further configured to:
acquiring contour information of a focusing area;
detecting whether an object radiating infrared rays is a target object or not according to the contour information of the focusing area;
and if the object radiating the infrared ray is the target object, determining the area except the focusing area in the image to be processed as the area to be blurred.
Optionally, an infrared sensing view angle of the infrared sensor is the same as an image capturing view angle of the camera.
Optionally, the image blurring device further includes:
the distance acquisition module is used for acquiring the distance between an object radiating infrared rays and the mobile terminal;
the level acquisition module is used for acquiring the blurring level of the region to be blurred according to the distance between the object radiating infrared rays and the mobile terminal;
accordingly, the blurring processing module 43 is specifically configured to:
and performing virtualization treatment on the area to be virtualized according to the virtualization level of the area to be virtualized.
Optionally, the image blurring device further includes:
and the function starting module is used for starting the virtualization processing function.
The image blurring device provided in the embodiment of the present application can be applied to the foregoing method embodiments; for details, refer to the description of those embodiments, which is not repeated here.
Fig. 5 is a schematic structural diagram of a mobile terminal according to a fifth embodiment of the present application. The mobile terminal as shown in the figure may include: one or more processors 501 (only one shown); one or more input devices 502 (only one shown), one or more output devices 503 (only one shown), and a memory 504. The processor 501, the input device 502, the output device 503, and the memory 504 are connected by a bus 505. The memory 504 is used for storing instructions, and the processor 501 is used for executing the instructions stored in the memory 504 to implement the steps in the above-mentioned embodiments of the image blurring method.
It should be understood that in the embodiments of the present application, the processor 501 may be a central processing unit (CPU), or another general-purpose processor, a digital signal processor (DSP), an application-specific integrated circuit (ASIC), a field-programmable gate array (FPGA) or other programmable logic device, a discrete gate or transistor logic device, a discrete hardware component, and the like. A general-purpose processor may be a microprocessor, or any conventional processor.
The input device 502 may include a touch pad, a fingerprint sensor (for collecting fingerprint information of a user and direction information of a fingerprint), a microphone, a data receiving interface, a camera, an infrared sensor, and the like. The output device 503 may include a display (LCD, etc.), a speaker, a data transmission interface, and the like.
The memory 504 may include a read-only memory and a random access memory, and provides instructions and data to the processor 501. A portion of the memory 504 may also include non-volatile random access memory. For example, the memory 504 may also store device type information.
In a specific implementation, the processor 501, the input device 502, the output device 503, and the memory 504 described in this embodiment of the present application may execute the implementations described in the embodiments of the image blurring method provided in the present application, or the implementation described for the image blurring device of the fourth embodiment, which is not repeated here.
Fig. 6 is a schematic structural diagram of a mobile terminal according to a sixth embodiment of the present application. As shown in fig. 6, the mobile terminal 6 of this embodiment includes: one or more processors 60 (only one shown), a memory 61, and a computer program 62 stored in the memory 61 and executable on the at least one processor 60. The steps in the various image blurring method embodiments described above are implemented when the processor 60 executes the computer program 62.
The mobile terminal 6 may be a smartphone, a tablet computer, or the like, having a camera and an infrared sensor. The mobile terminal may include, but is not limited to, the processor 60 and the memory 61. Those skilled in the art will appreciate that fig. 6 is merely an example of the mobile terminal 6 and does not limit it; the mobile terminal may include more or fewer components than shown, combine some components, or use different components. For example, it may also include input-output devices, network access devices, buses, etc.
The processor 60 may be a central processing unit CPU, but may also be other general purpose processors, digital signal processors DSP, application specific integrated circuits ASIC, off-the-shelf programmable gate arrays FPGA or other programmable logic devices, discrete gate or transistor logic devices, discrete hardware components, etc. A general purpose processor may be a microprocessor or the processor may be any conventional processor or the like.
The memory 61 may be an internal storage unit of the mobile terminal 6, such as a hard disk or a memory of the mobile terminal 6. The memory 61 may also be an external storage device of the mobile terminal 6, such as a plug-in hard disk provided on the mobile terminal 6, a Smart Media Card (SMC), a Secure Digital (SD) Card, a Flash memory Card (Flash Card), and the like. Further, the memory 61 may also include both an internal storage unit of the mobile terminal 6 and an external storage device. The memory 61 is used for storing computer programs and other programs and data required by the mobile terminal. The memory 61 may also be used to temporarily store data that has been output or is to be output.
It will be apparent to those skilled in the art that, for convenience and brevity of description, only the above-mentioned division of the functional units and modules is illustrated, and in practical applications, the above-mentioned function distribution may be performed by different functional units and modules according to needs, that is, the internal structure of the apparatus is divided into different functional units or modules, so as to perform all or part of the functions described above. Each functional unit and module in the embodiments may be integrated in one processing unit, or each unit may exist alone physically, or two or more units are integrated in one unit, and the integrated unit may be implemented in a form of hardware, or in a form of software functional unit. In addition, specific names of the functional units and modules are only for convenience of distinguishing from each other, and are not used for limiting the protection scope of the present application. The specific working processes of the units and modules in the system may refer to the corresponding processes in the foregoing method embodiments, and are not described herein again.
In the above embodiments, the descriptions of the respective embodiments have respective emphasis, and reference may be made to the related descriptions of other embodiments for parts that are not described or illustrated in a certain embodiment.
Those of ordinary skill in the art will appreciate that the various illustrative elements and algorithm steps described in connection with the embodiments disclosed herein may be implemented as electronic hardware or combinations of computer software and electronic hardware. Whether such functionality is implemented as hardware or software depends upon the particular application and design constraints imposed on the implementation. Skilled artisans may implement the described functionality in varying ways for each particular application, but such implementation decisions should not be interpreted as causing a departure from the scope of the present application.
In the embodiments provided in the present application, it should be understood that the disclosed apparatus/mobile terminal and method may be implemented in other ways. For example, the above-described apparatus/mobile terminal embodiments are merely illustrative, and for example, a division of modules or units is merely a logical division, and an actual implementation may have another division, for example, multiple units or components may be combined or integrated into another system, or some features may be omitted, or not implemented. In addition, the shown or discussed mutual coupling or direct coupling or communication connection may be an indirect coupling or communication connection through some interfaces, devices or units, and may be in an electrical, mechanical or other form.
Units described as separate parts may or may not be physically separate, and parts displayed as units may or may not be physical units, may be located in one place, or may be distributed on a plurality of network units. Some or all of the units can be selected according to actual needs to achieve the purpose of the solution of the embodiment.
In addition, functional units in the embodiments of the present application may be integrated into one processing unit, or each unit may exist alone physically, or two or more units are integrated into one unit. The integrated unit can be realized in a form of hardware, and can also be realized in a form of a software functional unit.
The integrated modules/units, if implemented in the form of software functional units and sold or used as separate products, may be stored in a computer-readable storage medium. Based on this understanding, all or part of the flow in the methods of the embodiments described above may be implemented by a computer program, which is stored in a computer-readable storage medium and, when executed by a processor, implements the steps of the method embodiments. The computer program comprises computer program code, which may be in the form of source code, object code, an executable file, some intermediate form, or the like. The computer-readable medium may include: any entity or device capable of carrying computer program code, a recording medium, a USB flash drive, a removable hard disk, a magnetic disk, an optical disk, a computer memory, a read-only memory (ROM), a random access memory (RAM), an electrical carrier signal, a telecommunications signal, a software distribution medium, and the like. It should be noted that the content of the computer-readable medium may be increased or decreased as required by legislation and patent practice in a jurisdiction; for example, in some jurisdictions, the computer-readable medium does not include electrical carrier signals and telecommunications signals.
The embodiments of the present application also provide a computer program product; when the computer program product runs on the mobile terminal, the mobile terminal implements the steps in the method embodiments described above.
The above embodiments are only used to illustrate the technical solutions of the present application, and not to limit the same; although the present application has been described in detail with reference to the foregoing embodiments, it should be understood by those of ordinary skill in the art that: the technical solutions described in the foregoing embodiments may still be modified, or some technical features may be equivalently replaced; such modifications and substitutions do not substantially depart from the spirit and scope of the embodiments of the present application and are intended to be included within the scope of the present application.

Claims (10)

1. An image blurring method is applied to a mobile terminal, and is characterized in that the mobile terminal comprises an infrared sensor and a camera, and the image blurring method comprises the following steps:
acquiring an image to be processed acquired by the camera, wherein the image to be processed is an image of a target shooting scene;
if target position information sent by the infrared sensor is received, determining a region to be blurred in the image to be processed according to the target position information, wherein the target position information refers to position information of an object radiating infrared rays in the target shooting scene in the image to be processed;
and performing virtualization processing on the region to be virtualized to obtain a virtualized image.
2. The image blurring method according to claim 1, wherein the determining, according to the target position information, the region to be blurred in the image to be processed if the target position information sent by the infrared sensor is received comprises:
if the target position information sent by the infrared sensor and the temperature value of the object radiating the infrared ray are received, whether the temperature value of the object radiating the infrared ray is within a preset temperature range is detected;
and if the temperature value of the object radiating the infrared ray is within the preset temperature range, determining the area to be blurred in the image to be processed according to the target position information.
3. The image blurring method according to claim 1 or 2, wherein the determining the region to be blurred in the image to be processed according to the target position information comprises:
determining a focus area in the image to be processed according to the target position information, wherein the focus area refers to an area of the object radiating infrared rays in the image to be processed;
and determining the region except the focusing region in the image to be processed as the region to be blurred.
4. The image blurring method as claimed in claim 3, further comprising, after said determining the focus region in the image to be processed:
acquiring contour information of the focusing area;
detecting whether the object radiating infrared rays is a target object or not according to the contour information of the focusing area;
correspondingly, the determining that the region of the image to be processed except the focus region is the region to be blurred comprises:
and if the object radiating the infrared ray is the target object, determining that the region except the focusing region in the image to be processed is the region to be blurred.
5. An image blurring method as claimed in claim 1, wherein an infrared sensing view angle of said infrared sensor is the same as an image capturing view angle of said camera.
6. An image blurring method as claimed in claim 1, further comprising, before blurring the region to be blurred:
acquiring the distance between the object radiating infrared rays and the mobile terminal;
acquiring the blurring level of the region to be blurred according to the distance between the object radiating infrared rays and the mobile terminal;
correspondingly, the blurring processing of the region to be blurred comprises:
performing blurring processing on the region to be blurred according to the blurring level of the region to be blurred.
7. The image blurring method according to claim 1, before determining the region to be blurred in the image to be processed according to the target position information if the target position information sent by the infrared sensor is received, further comprising:
the virtualization processing function is started.
8. An image blurring device, applied to a mobile terminal, wherein the mobile terminal comprises an infrared sensor and a camera, and the image blurring device comprises:
the image acquisition module is used for acquiring an image to be processed acquired by the camera, wherein the image to be processed is an image of a target shooting scene;
the area determining module is used for determining an area to be blurred in the image to be processed according to the target position information if the target position information sent by the infrared sensor is received, wherein the target position information refers to the position information of an object radiating infrared rays in the target shooting scene in the image to be processed;
and the blurring processing module is used for blurring the area to be blurred to obtain a blurred image.
9. A mobile terminal comprising an infrared sensor, a camera, a memory, a processor and a computer program stored in said memory and executable on said processor, characterized in that said processor, when executing said computer program, implements the steps of the image blurring method according to any one of claims 1 to 7.
10. A computer-readable storage medium, in which a computer program is stored which, when being executed by a processor, carries out the steps of the image blurring method according to any one of claims 1 to 7.
CN202011049565.0A (priority date 2020-09-29, filing date 2020-09-29): Image blurring method, image blurring device, mobile terminal, and storage medium. Status: Pending. Publication: CN112217992A.

Priority Applications (1)

CN202011049565.0A (priority date 2020-09-29, filing date 2020-09-29): Image blurring method, image blurring device, mobile terminal, and storage medium

Publications (1)

CN112217992A (publication date: 2021-01-12)

Family

ID=74051478

Family Applications (1)

CN202011049565.0A (pending): Image blurring method, image blurring device, mobile terminal, and storage medium

Country Status (1)

CN: CN112217992A



Patent Citations (5)

* Cited by examiner, † Cited by third party

CN107038681A * (priority 2017-05-31, published 2017-08-11, assignee 广东欧珀移动通信有限公司): Image weakening method, device, computer-readable recording medium and computer equipment
CN107395965A * (priority 2017-07-14, published 2017-11-24, assignee 维沃移动通信有限公司): A kind of image processing method and mobile terminal
CN107704798A * (priority 2017-08-09, published 2018-02-16, assignee 广东欧珀移动通信有限公司): Image weakening method, device, computer-readable recording medium and computer equipment
CN108769505A * (priority 2018-03-30, published 2018-11-06, assignee 联想(北京)有限公司): A kind of image procossing set method and electronic equipment
CN110996078A * (priority 2019-11-25, published 2020-04-10, assignee 深圳市创凯智能股份有限公司): Image acquisition method, terminal and readable storage medium

Cited By (3)

* Cited by examiner, † Cited by third party

CN113129334A * (priority 2021-03-11, published 2021-07-16, assignee 宇龙计算机通信科技(深圳)有限公司): Object tracking method and device, storage medium and wearable electronic equipment
CN117152398A * (priority 2023-10-30, published 2023-12-01, assignee 深圳优立全息科技有限公司): Three-dimensional image blurring method, device, equipment and storage medium
CN117152398B * (priority 2023-10-30, published 2024-02-13, assignee 深圳优立全息科技有限公司): Three-dimensional image blurring method, device, equipment and storage medium


Legal Events

PB01: Publication
SE01: Entry into force of request for substantive examination
RJ01: Rejection of invention patent application after publication (application publication date: 2021-01-12)