CN111861965A - Image backlight detection method, image backlight detection device and terminal equipment

Image backlight detection method, image backlight detection device and terminal equipment

Info

Publication number
CN111861965A
CN111861965A
Authority
CN
China
Prior art keywords
image
target object
backlight
bounding box
detected
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Pending
Application number
CN201910276705.9A
Other languages
Chinese (zh)
Inventor
凌健 (Ling Jian)
边思雨 (Bian Siyu)
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
TCL Corp
TCL Research America Inc
Original Assignee
TCL Research America Inc
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by TCL Research America Inc filed Critical TCL Research America Inc
Priority to CN201910276705.9A
Publication of CN111861965A
Legal status: Pending

Classifications

    • G - PHYSICS
    • G06 - COMPUTING; CALCULATING OR COUNTING
    • G06T - IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T7/00 - Image analysis
    • G06T7/0002 - Inspection of images, e.g. flaw detection
    • H - ELECTRICITY
    • H04 - ELECTRIC COMMUNICATION TECHNIQUE
    • H04N - PICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N23/00 - Cameras or camera modules comprising electronic image sensors; Control thereof
    • H04N23/70 - Circuitry for compensating brightness variation in the scene
    • H04N23/71 - Circuitry for evaluating the brightness variation
    • G - PHYSICS
    • G06 - COMPUTING; CALCULATING OR COUNTING
    • G06T - IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T2207/00 - Indexing scheme for image analysis or image enhancement
    • G06T2207/10 - Image acquisition modality
    • G06T2207/10024 - Color image

Abstract

The application is applicable to the technical field of image processing, and provides an image backlight detection method, an image backlight detection device, a terminal device and a computer-readable storage medium. The method includes: acquiring the area where a target object is located in an image to be detected and the average brightness value of the area where the target object is located, wherein the range of the area is marked using a boundary frame; enlarging the boundary frame and acquiring the average brightness value of the region within the enlarged boundary frame; calculating the ratio of the average brightness value of the area where the target object is located to the average brightness value of the region within the enlarged boundary frame; and judging whether the ratio is smaller than a preset brightness threshold, and if so, marking that the target object in the image to be detected is in backlight. With the present application, whether an image is backlit can be detected quickly and effectively.

Description

Image backlight detection method, image backlight detection device and terminal equipment
Technical Field
The present application belongs to the field of image processing technologies, and in particular, to an image backlight detection method, an image backlight detection apparatus, a terminal device, and a computer-readable storage medium.
Background
When shooting with a device such as a mobile phone or a camera in daily life, shooting against the light usually yields an image in which the background is too bright and the target object is too dark, so the quality of the captured image is poor; it is therefore important to detect quickly and effectively whether an image is backlit. In the prior art, when detecting whether an image is backlit, the bright area and the dark area in the whole image are usually obtained, and whether the image is backlit is judged according to the bright area and the dark area.
Disclosure of Invention
In view of the above, embodiments of the present application provide an image backlight detection method, an image backlight detection apparatus, a terminal device, and a computer-readable storage medium, so as to quickly and efficiently detect whether an image is backlit.
A first aspect of an embodiment of the present application provides an image backlight detection method, where the image backlight detection method includes:
acquiring the area where a target object is located in an image to be detected and the average brightness value of the area where the target object is located, wherein the range of the area where the target object is located is marked using a boundary frame;
enlarging the boundary frame and acquiring the average brightness value of the region within the enlarged boundary frame;
calculating the ratio of the average brightness value of the area where the target object is located to the average brightness value of the region within the enlarged boundary frame;
and judging whether the ratio is smaller than a preset brightness threshold, and if so, marking that the target object in the image to be detected is in backlight.
A second aspect of an embodiment of the present application provides an image backlight detection apparatus, including:
a first brightness obtaining module, used for obtaining the area where a target object is located in an image to be detected and the average brightness value of the area where the target object is located, wherein the range of the area where the target object is located is marked using a boundary frame;
a second brightness obtaining module, used for enlarging the boundary frame and obtaining the average brightness value of the region within the enlarged boundary frame;
a ratio calculation module, used for calculating the ratio of the average brightness value of the area where the target object is located to the average brightness value of the region within the enlarged boundary frame;
and a backlight determining module, used for judging whether the ratio is smaller than a preset brightness threshold, and if the ratio is smaller than the brightness threshold, marking that the target object in the image to be detected is in backlight.
A third aspect of embodiments of the present application provides a terminal device, which includes a memory, a processor, and a computer program stored in the memory and executable on the processor, where the processor implements the steps of the image backlight detection method according to the first aspect when executing the computer program.
A fourth aspect of embodiments of the present application provides a computer-readable storage medium, which stores a computer program, and the computer program, when executed by a processor, implements the steps of the image backlight detection method according to the first aspect.
A fifth aspect of the present application provides a computer program product comprising a computer program which, when executed by one or more processors, performs the steps of the image backlight detection method as described in the first aspect above.
It can thus be seen that, in the scheme of the present application, the range of the area where the target object is located, marked with a boundary frame in the image to be detected, and the average brightness value of that area are acquired; the boundary frame is enlarged and the average brightness value of the region within the enlarged boundary frame is acquired; the ratio of the average brightness value of the area where the target object is located to the average brightness value of the region within the enlarged boundary frame is calculated; and whether the ratio is smaller than a preset brightness threshold is judged, the target object in the image to be detected being marked as in backlight when the ratio is smaller than the brightness threshold. In the present application, the target object in the image to be detected is detected first, the backlight degree of the image is calculated on the basis of the detected target object, and whether the target object in the image is backlit is judged according to the backlight degree; detecting the bright area and the dark area in the whole image to be detected is thereby avoided, which shortens computation time and reduces computation cost, so that whether an image is backlit can be detected quickly and effectively.
Drawings
In order to more clearly illustrate the technical solutions in the embodiments of the present application, the drawings needed in the description of the embodiments or the prior art are briefly introduced below. Obviously, the drawings described below show only some embodiments of the present application, and those skilled in the art can obtain other drawings from them without inventive effort.
Fig. 1 is a schematic flow chart illustrating an implementation of an image backlight detection method according to an embodiment of the present application;
FIG. 2 is an exemplary diagram of marking a target object in an image to be detected using a bounding box;
fig. 3 is a schematic flow chart illustrating an implementation of an image backlight detection method according to a second embodiment of the present application;
FIG. 4 is a schematic diagram of an image backlight detecting apparatus according to a third embodiment of the present application;
fig. 5 is a schematic diagram of a terminal device according to a fourth embodiment of the present application.
Detailed Description
In the following description, for purposes of explanation and not limitation, specific details are set forth, such as particular system structures, techniques, etc. in order to provide a thorough understanding of the embodiments of the present application. It will be apparent, however, to one skilled in the art that the present application may be practiced in other embodiments that depart from these specific details. In other instances, detailed descriptions of well-known systems, devices, circuits, and methods are omitted so as not to obscure the description of the present application with unnecessary detail.
It will be understood that the terms "comprises" and/or "comprising," when used in this specification and the appended claims, specify the presence of stated features, integers, steps, operations, elements, and/or components, but do not preclude the presence or addition of one or more other features, integers, steps, operations, elements, components, and/or groups thereof.
In particular implementations, the terminal devices described in the embodiments of the present application include, but are not limited to, portable devices such as mobile phones, laptop computers, or tablet computers having touch-sensitive surfaces (e.g., touch-screen displays and/or touch pads). It should also be understood that, in some embodiments, the device is not a portable communication device but a desktop computer having a touch-sensitive surface (e.g., a touch-screen display and/or a touchpad).
In the discussion that follows, a terminal device that includes a display and a touch-sensitive surface is described. However, it should be understood that the terminal device may include one or more other physical user interface devices such as a physical keyboard, mouse, and/or joystick.
The terminal device supports various applications, such as one or more of the following: a drawing application, a presentation application, a word processing application, a website creation application, a disc burning application, a spreadsheet application, a gaming application, a telephone application, a video conferencing application, an email application, an instant messaging application, an exercise support application, a photo management application, a digital camera application, a web browsing application, a digital music player application, and/or a digital video player application.
Various applications that may be executed on the terminal device may use at least one common physical user interface device, such as a touch-sensitive surface. One or more functions of the touch-sensitive surface and corresponding information displayed on the terminal can be adjusted and/or changed between applications and/or within respective applications. In this way, a common physical architecture (e.g., touch-sensitive surface) of the terminal can support various applications with user interfaces that are intuitive and transparent to the user.
It should be understood that the sequence numbers of the steps in the embodiments do not imply an order of execution; the execution order of each process should be determined by its function and internal logic, and should not constitute any limitation on the implementation of the embodiments of the present application.
In order to explain the technical solution described in the present application, the following description will be given by way of specific examples.
Referring to fig. 1, which is a schematic flow chart of an implementation of an image backlight detection method provided in the first embodiment of the present application, the method being applied to a terminal device, the image backlight detection method may include the following steps:
step S101, obtaining the area of a target object in an image to be detected and the average brightness value of the area of the target object, wherein the range of the area of the target object is marked by using a boundary frame.
In the embodiment of the present application, the area where the target object is located in the image to be detected may be obtained first, and the range of that area is marked using a bounding box. As shown in fig. 2, the target object is detected as a person with a probability of 0.99, so the target object in fig. 2 can be determined to be the person; frame A is the bounding box marking the area where the person is located, image A is the image of the area within frame A, and image B is the image of the area within frame B. The image to be detected refers to an image for which it is to be determined whether it is in a backlight state. Optionally, the area where the target object is located may refer to an area containing only the target object, or to the area within the bounding box, which is not limited here. The area within the bounding box may contain not only the target object but also some non-target content, as shown in frame A in fig. 2.
In this embodiment of the application, when obtaining the average brightness value of the area where the target object is located, the brightness value of each pixel and the total number of pixels in that area may be obtained, the brightness values of all pixels in the area accumulated, and the accumulated value divided by the total number of pixels to give the average brightness value. Since the image to be detected is usually an RGB three-channel image, the average brightness value of the area where the target object is located may also be calculated according to the formula Y1 = (0.299 × R1) + (0.587 × G1) + (0.114 × B1), where Y1 is the average brightness value of the area where the target object is located; R1 is the average brightness value of the area on the R channel, taken as the average of the brightness values of all pixels of the area on the R channel; G1 is the average brightness value of the area on the G channel, taken as the average of the brightness values of all pixels of the area on the G channel; and B1 is the average brightness value of the area on the B channel, taken as the average of the brightness values of all pixels of the area on the B channel.
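As an illustration of this step (not part of the patent text), the following Python sketch computes the average brightness of a region with the luma formula above; the array layout, the (x1, y1, x2, y2) box convention and the function name are assumptions:

```python
import numpy as np

def region_mean_luminance(image_rgb: np.ndarray, box) -> float:
    """Average brightness of the pixels inside `box` of an RGB image.

    `image_rgb` is an H x W x 3 array; `box` gives (x1, y1, x2, y2) pixel
    corners. The per-channel means R1, G1, B1 are combined with
    Y1 = 0.299*R1 + 0.587*G1 + 0.114*B1; because the formula is linear,
    this equals averaging the per-pixel luminance values directly.
    """
    x1, y1, x2, y2 = box
    region = image_rgb[y1:y2, x1:x2].astype(np.float64)
    r1 = region[..., 0].mean()  # average brightness on the R channel
    g1 = region[..., 1].mean()  # average brightness on the G channel
    b1 = region[..., 2].mean()  # average brightness on the B channel
    return 0.299 * r1 + 0.587 * g1 + 0.114 * b1
```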
Optionally, the boundary frame is a bounding box.
In the embodiment of the present application, a bounding box is a boundary frame used for marking the range of the region where the target object is located in the image to be detected. The region where the target object is located, marked by the bounding box, can be obtained through a trained deep learning model; that is, the image to be detected is input into the trained deep learning model for one forward propagation, and an image marked with the bounding box is output, as shown in fig. 2. The deep learning model includes, but is not limited to, a convolutional neural network.
In the embodiment of the present application, a plurality of sample images may first be obtained, and a target frame enclosing the region where the target object is located may be marked manually in each sample image. The sample images are input into a deep learning model, which identifies the target object in each sample image and a bounding box enclosing the region where it is located. For each sample image, the mapping relationship between the bounding box and the target frame is obtained, and the deep learning model is adjusted according to this mapping relationship so that the bounding box it outputs comes closer to the target frame, thereby completing the training of the deep learning model.
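For illustration only, the forward propagation described above might look as follows; the patent does not prescribe a particular network, so torchvision's Faster R-CNN is an assumed stand-in and `detect_target_box` is a hypothetical helper:

```python
import torch
from torchvision.models.detection import fasterrcnn_resnet50_fpn

# Assumed stand-in detector; any trained deep learning model that outputs
# a bounding box for the target object fits the description above.
model = fasterrcnn_resnet50_fpn(pretrained=True).eval()

def detect_target_box(image_tensor: torch.Tensor):
    """Run one forward propagation; return the highest-scoring box (x1, y1, x2, y2)."""
    with torch.no_grad():
        out = model([image_tensor])[0]  # dict with 'boxes', 'labels', 'scores'
    if len(out["boxes"]) == 0:
        return None  # no target object detected
    best = out["scores"].argmax()
    return out["boxes"][best].tolist()
```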
Step S102, the boundary frame is enlarged, and the average brightness value of the area in the enlarged boundary frame is obtained.
In the embodiment of the present application, the bounding box may be enlarged N times, specifically as follows: first, the four-dimensional vector (w1, h1, x, y) of the bounding box is obtained, where (x, y) are the coordinates of the center point of the bounding box, w1 is the width of the bounding box, and h1 is its height; then w1 and h1 are enlarged N times, that is, the four-dimensional vector of the enlarged bounding box is (w1 × N, h1 × N, x, y), where × denotes multiplication and N is a number greater than 1, such as 1.5 or 2. Frame B in fig. 2 is an enlarged bounding box. It should be noted that, if the enlarged bounding box exceeds the range of the image to be detected, the overlapping region of the enlarged bounding box and the image to be detected is taken as the region within the enlarged bounding box.
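A minimal sketch of this enlargement step, assuming the (w1, h1, x, y) vector above and a hypothetical helper name; the clipping at the end implements the overlap rule for boxes that spill outside the image:

```python
def enlarge_box(box_whxy, n: float, img_w: int, img_h: int):
    """Enlarge a (w1, h1, x, y) bounding box N times about its center point.

    Returns the (x1, y1, x2, y2) pixel corners of the region within the
    enlarged bounding box, clipped to the image so that only the overlap
    with the image to be detected is kept.
    """
    w1, h1, x, y = box_whxy      # width, height, center coordinates
    w2, h2 = w1 * n, h1 * n      # enlarged vector is (w1 * N, h1 * N, x, y)
    x1 = max(0, round(x - w2 / 2))
    y1 = max(0, round(y - h2 / 2))
    x2 = min(img_w, round(x + w2 / 2))
    y2 = min(img_h, round(y + h2 / 2))
    return x1, y1, x2, y2
```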
In this embodiment of the application, when obtaining the average brightness value of the region within the enlarged bounding box, the brightness value of each pixel and the total number of pixels in that region may be obtained, the brightness values of all pixels in the region accumulated, and the accumulated value divided by the total number of pixels to give the average brightness; since the image to be detected is usually an RGB three-channel image, the average brightness value of the region within the enlarged bounding box may also be calculated according to the formula Y2 = (0.299 × R2) + (0.587 × G2) + (0.114 × B2), where Y2 is the average brightness value of the region within the enlarged bounding box; R2 is the average brightness value of the region on the R channel, taken as the average of the brightness values of all pixels of the region on the R channel; G2 is the average brightness value of the region on the G channel, taken as the average of the brightness values of all pixels of the region on the G channel; and B2 is the average brightness value of the region on the B channel, taken as the average of the brightness values of all pixels of the region on the B channel.
Step S103, calculating the ratio of the average brightness value of the area where the target object is located to the average brightness value of the enlarged area in the boundary frame.
In the embodiment of the present application, the ratio Y1/Y2 between the average brightness value Y1 of the area where the target object is located and the average brightness value Y2 of the region within the enlarged bounding box may be used as the backlight degree of the image to be detected, and whether the target object in the image to be detected is in backlight is judged according to this backlight degree.
And step S104, judging whether the ratio is smaller than a preset brightness threshold value, and if the ratio is smaller than the brightness threshold value, marking that the target object in the image to be detected is in backlight.
In this embodiment of the application, the ratio calculated in step S103 may be compared with a brightness threshold to determine whether the target object in the image to be detected is in backlight: if the ratio is smaller than the brightness threshold, it may be determined that the target object in the image to be detected is in backlight; if the ratio is greater than or equal to the brightness threshold, it may be determined that the target object is not in backlight. The brightness threshold may be preset according to sample images, or set by the user according to actual requirements, which is not limited here.
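Putting steps S101 to S104 together, reusing the helpers sketched earlier (the default threshold here is a placeholder; the patent obtains the actual value from backlit sample images, as described next):

```python
def is_backlit(image_rgb, box_whxy, n=1.5, threshold=0.5):
    """Return (backlit?, ratio) for one image and one target bounding box."""
    img_h, img_w = image_rgb.shape[:2]
    w1, h1, x, y = box_whxy
    inner = (max(0, round(x - w1 / 2)), max(0, round(y - h1 / 2)),
             min(img_w, round(x + w1 / 2)), min(img_h, round(y + h1 / 2)))
    outer = enlarge_box(box_whxy, n, img_w, img_h)
    y1 = region_mean_luminance(image_rgb, inner)  # region of the target object
    y2 = region_mean_luminance(image_rgb, outer)  # region within the enlarged box
    ratio = y1 / y2
    return ratio < threshold, ratio
```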
Optionally, after step S103, the embodiment of the present application further includes:
and acquiring the brightness threshold according to a plurality of sample images in backlight.
In this embodiment of the application, a plurality of sample images known to be in backlight may be selected. For each sample image, the average brightness value of the area where the target object is located is obtained and recorded as the first average brightness value; the area where the target object is located is likewise marked with a boundary frame; the boundary frame is enlarged, and the average brightness value of the region within the enlarged boundary frame is obtained and recorded as the second average brightness value; and the ratio of the first average brightness value to the second average brightness value corresponding to each sample image is calculated. The brightness threshold can then be determined from the ratios obtained over all sample images.
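A sketch of this calibration; the text does not spell out how the per-sample ratios are reduced to one threshold, so taking their maximum (every known backlit sample then falls below the threshold) is one plausible, assumed choice:

```python
def calibrate_threshold(samples, n=1.5):
    """`samples` is an iterable of (image_rgb, box_whxy) pairs known to be backlit.

    Computes the ratio of the first to the second average brightness value
    for each sample, then reduces the ratios to a single threshold.
    """
    ratios = [is_backlit(img, box, n, threshold=float("inf"))[1]
              for img, box in samples]
    return max(ratios)  # assumed reduction; a mean or quantile would also work
```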
According to this embodiment, the target object in the image to be detected is detected first, the backlight degree of the image is calculated on the basis of the detected target object, and whether the target object in the image is backlit is judged according to the backlight degree. Detecting the bright area and the dark area in the whole image to be detected is thereby avoided, shortening computation time and reducing computation cost, so that whether an image is backlit can be detected quickly and effectively.
Referring to fig. 3, which is a schematic flow chart of an implementation of an image backlight detection method provided in the second embodiment of the present application, the method being applied to a terminal device, the image backlight detection method may include the following steps:
step S301, obtaining an area where a target object is located in an image to be detected and an average brightness value of the area where the target object is located, wherein the image to be detected is a preview picture in a camera, and a boundary frame is used for marking the range of the area where the target object is located.
In this embodiment of the application, after the terminal device starts the camera function, a preview picture of the captured scene is usually displayed on the screen of the terminal device, and this preview picture may be used as the image to be detected to determine whether the target object in the preview picture is in a backlight state.
Step S302, the bounding box is enlarged, and the average brightness value of the area in the enlarged bounding box is obtained.
The step is the same as step S102, and reference may be made to the related description of step S102, which is not repeated herein.
Step S303, calculating the ratio of the average brightness value of the region where the target object is located to the average brightness value of the region in the enlarged bounding box.
The step is the same as step S103, and reference may be made to the related description of step S103, which is not described herein again.
Step S304, judging whether the ratio is smaller than a preset brightness threshold value, and if the ratio is smaller than the brightness threshold value, marking that a target object in the image to be detected is in backlight.
The step is the same as step S104, and reference may be made to the related description of step S104, which is not repeated herein.
Step S305, prompting the user to adjust the shooting angle of the camera or performing backlight compensation processing.
In the embodiment of the application, when the target object in the preview picture is marked as in backlight (that is, the preview picture is in a backlight scene), the user may be prompted to adjust the shooting angle of the camera so that the target object changes from backlit to non-backlit; alternatively, backlight compensation processing may be performed to enhance the brightness of the area where the target object is located. The backlight compensation processing includes, but is not limited to, increasing the exposure value of the camera: correspondences between different ratios and different exposure values may be preset, and when it is determined that the target object in the image to be detected is in backlight, the exposure value corresponding to the ratio calculated in step S303 is found from the preset correspondences, and the current exposure value of the camera is adjusted to that exposure value.
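A sketch of such a preset correspondence between ratios and exposure values; all numbers are illustrative assumptions, since the patent only requires that the mapping be preset:

```python
import bisect

# Hypothetical calibration table: a smaller ratio means stronger backlight
# and therefore a larger exposure compensation.
RATIO_UPPER_EDGES = [0.2, 0.3, 0.4, 0.5]     # upper edges of the ratio bands
EXPOSURE_VALUES = [2.0, 1.5, 1.0, 0.5, 0.0]  # exposure adjustment per band

def exposure_for_ratio(ratio: float) -> float:
    """Look up the preset exposure value for the measured backlight ratio."""
    return EXPOSURE_VALUES[bisect.bisect_left(RATIO_UPPER_EDGES, ratio)]
```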
On the basis of the first embodiment, this embodiment adds that the image to be detected is a preview picture in the camera and that the user is prompted to adjust the shooting angle of the camera or backlight compensation processing is performed, so that the backlight phenomenon can be mitigated while the camera is shooting and an image with better picture quality can be captured.
Fig. 4 is a schematic diagram of an image backlight detection device provided in the third embodiment of the present application, and for convenience of description, only the relevant portions of the third embodiment of the present application are shown.
The image backlight detection device includes:
a first brightness obtaining module 41, configured to obtain an area where a target object is located in an image to be detected and an average brightness value of the area where the target object is located, where a range of the area where the target object is located is marked by using a bounding box;
a second brightness obtaining module 42, configured to amplify the bounding box and obtain an average brightness value of an area in the amplified bounding box;
a ratio calculating module 43, configured to calculate the ratio between the average brightness value of the area where the target object is located and the average brightness value of the region within the enlarged bounding box;
and a backlight determining module 44, configured to determine whether the ratio is smaller than a preset brightness threshold, and if the ratio is smaller than the brightness threshold, mark that the target object in the image to be detected is in backlight.
Optionally, the image to be detected is a preview image in the camera.
Optionally, the image backlight detection device further includes:
and the processing module 45 is configured to prompt a user to adjust a shooting angle of the camera or execute backlight compensation processing.
Optionally, the image backlight detection device further includes:
a threshold obtaining module 46, configured to obtain the brightness threshold according to multiple sample images in the backlight.
Optionally, the boundary frame is a bounding box.
Optionally, the second brightness acquiring module 42 includes:
an obtaining unit, configured to obtain a four-dimensional vector (w1, h1, x, y) of the bounding box, where (x, y) represents coordinates of a center point of the bounding box, w1 represents a width of the bounding box, and h1 represents a height of the bounding box;
and an enlarging unit, configured to enlarge the width w1 and the height h1 of the bounding box N times, wherein the four-dimensional vector of the enlarged bounding box is (w1 × N, h1 × N, x, y), and N is a number greater than 1.
Optionally, the second brightness obtaining module 42 further includes:
and a determining unit, configured to determine, if the enlarged bounding box exceeds the range of the image to be detected, the overlapping region of the enlarged bounding box and the image to be detected as the region within the enlarged bounding box.
The apparatus provided in the embodiment of the present application may be applied to the first method embodiment and the second method embodiment, and for details, reference is made to the description of the first method embodiment and the second method embodiment, and details are not repeated here.
Fig. 5 is a schematic diagram of a terminal device according to a fourth embodiment of the present application. As shown in fig. 5, the terminal device 5 of this embodiment includes: a processor 50, a memory 51 and a computer program 52 stored in the memory 51 and executable on the processor 50. The processor 50, when executing the computer program 52, implements the steps in the above embodiments of the image backlight detection method, such as steps S101 to S104 shown in fig. 1. Alternatively, the processor 50, when executing the computer program 52, implements the functions of the modules/units in the above device embodiments, such as the functions of the modules 41 to 46 shown in fig. 4.
Illustratively, the computer program 52 may be partitioned into one or more modules/units, which are stored in the memory 51 and executed by the processor 50 to accomplish the present application. The one or more modules/units may be a series of computer program instruction segments capable of performing specific functions, which are used to describe the execution process of the computer program 52 in the terminal device 5. For example, the computer program 52 may be divided into a first brightness obtaining module, a second brightness obtaining module, a ratio calculating module, a backlight determining module, a processing module, and a threshold obtaining module, and the specific functions of the modules are as follows:
The first brightness obtaining module is used for obtaining the area of a target object in an image to be detected and the average brightness value of the area of the target object, wherein the range of the area of the target object is marked by using a boundary frame;
the second brightness obtaining module is used for amplifying the boundary frame and obtaining the average brightness value of the region in the amplified boundary frame;
a ratio calculation module, configured to calculate the ratio between the average brightness value of the area where the target object is located and the average brightness value of the region within the enlarged bounding box;
and the backlight determining module is used for determining that the target object in the image to be detected is in backlight if the ratio is smaller than the brightness threshold.
Optionally, the image to be detected is a preview image in the camera.
Optionally, the processing module is configured to prompt a user to adjust a shooting angle of the camera or execute backlight compensation processing.
Optionally, the threshold obtaining module is configured to obtain the brightness threshold according to a plurality of sample images in the backlight.
Optionally, the boundary frame is a bounding box.
Optionally, the second brightness obtaining module includes:
an obtaining unit, configured to obtain a four-dimensional vector (w1, h1, x, y) of the bounding box, where (x, y) represents coordinates of a center point of the bounding box, w1 represents a width of the bounding box, and h1 represents a height of the bounding box;
and an enlarging unit, configured to enlarge the width w1 and the height h1 of the bounding box N times, wherein the four-dimensional vector of the enlarged bounding box is (w1 × N, h1 × N, x, y), and N is a number greater than 1.
Optionally, the second brightness obtaining module further includes:
and a determining unit, configured to determine, if the enlarged bounding box exceeds the range of the image to be detected, the overlapping region of the enlarged bounding box and the image to be detected as the region within the enlarged bounding box.
The terminal device 5 may be a mobile phone, a notebook, a palm computer, a camera, or the like. The terminal device may include, but is not limited to, a processor 50, a memory 51. Those skilled in the art will appreciate that fig. 5 is merely an example of a terminal device 5 and does not constitute a limitation of terminal device 5 and may include more or fewer components than shown, or some components may be combined, or different components, e.g., the terminal device may also include input-output devices, network access devices, buses, etc.
The Processor 50 may be a Central Processing Unit (CPU), another general-purpose processor, a Digital Signal Processor (DSP), an Application Specific Integrated Circuit (ASIC), a Field-Programmable Gate Array (FPGA) or other programmable logic device, a discrete gate or transistor logic device, a discrete hardware component, etc. A general-purpose processor may be a microprocessor, or the processor may be any conventional processor or the like.
The memory 51 may be an internal storage unit of the terminal device 5, such as a hard disk or a memory of the terminal device 5. The memory 51 may also be an external storage device of the terminal device 5, such as a plug-in hard disk, a Smart Media Card (SMC), a Secure Digital (SD) Card, a Flash memory Card (Flash Card), and the like, which are provided on the terminal device 5. Further, the memory 51 may also include both an internal storage unit and an external storage device of the terminal device 5. The memory 51 is used for storing the computer program and other programs and data required by the terminal device. The memory 51 may also be used to temporarily store data that has been output or is to be output.
It will be apparent to those skilled in the art that, for convenience and brevity of description, only the above-mentioned division of the functional units and modules is illustrated, and in practical applications, the above-mentioned function distribution may be performed by different functional units and modules according to needs, that is, the internal structure of the apparatus is divided into different functional units or modules to perform all or part of the above-mentioned functions. Each functional unit and module in the embodiments may be integrated in one processing unit, or each unit may exist alone physically, or two or more units are integrated in one unit, and the integrated unit may be implemented in a form of hardware, or in a form of software functional unit. In addition, specific names of the functional units and modules are only for convenience of distinguishing from each other, and are not used for limiting the protection scope of the present application. The specific working processes of the units and modules in the system may refer to the corresponding processes in the foregoing method embodiments, and are not described herein again.
In the above embodiments, the descriptions of the respective embodiments have respective emphasis, and reference may be made to the related descriptions of other embodiments for parts that are not described or illustrated in a certain embodiment.
Those of ordinary skill in the art will appreciate that the various illustrative elements and algorithm steps described in connection with the embodiments disclosed herein may be implemented as electronic hardware or combinations of computer software and electronic hardware. Whether such functionality is implemented as hardware or software depends upon the particular application and design constraints imposed on the implementation. Skilled artisans may implement the described functionality in varying ways for each particular application, but such implementation decisions should not be interpreted as causing a departure from the scope of the present application.
In the embodiments provided in the present application, it should be understood that the disclosed apparatus/terminal device and method may be implemented in other ways. For example, the above-described embodiments of the apparatus/terminal device are merely illustrative, and for example, the division of the modules or units is only one logical division, and there may be other divisions when actually implemented, for example, a plurality of units or components may be combined or integrated into another system, or some features may be omitted, or not executed. In addition, the shown or discussed mutual coupling or direct coupling or communication connection may be an indirect coupling or communication connection through some interfaces, devices or units, and may be in an electrical, mechanical or other form.
The units described as separate parts may or may not be physically separate, and parts displayed as units may or may not be physical units, may be located in one place, or may be distributed on a plurality of network units. Some or all of the units can be selected according to actual needs to achieve the purpose of the solution of the embodiment.
In addition, functional units in the embodiments of the present application may be integrated into one processing unit, or each unit may exist alone physically, or two or more units are integrated into one unit. The integrated unit can be realized in a form of hardware, and can also be realized in a form of a software functional unit.
The integrated modules/units, if implemented in the form of software functional units and sold or used as separate products, may be stored in a computer readable storage medium. Based on such understanding, all or part of the flow in the method of the embodiments described above can be realized by a computer program, which can be stored in a computer-readable storage medium and can realize the steps of the embodiments of the methods described above when the computer program is executed by a processor. Wherein the computer program comprises computer program code, which may be in the form of source code, object code, an executable file or some intermediate form, etc. The computer-readable medium may include: any entity or device capable of carrying the computer program code, recording medium, usb disk, removable hard disk, magnetic disk, optical disk, computer Memory, Read-Only Memory (ROM), Random Access Memory (RAM), electrical carrier wave signals, telecommunications signals, software distribution medium, and the like. It should be noted that the computer readable medium may contain content that is subject to appropriate increase or decrease as required by legislation and patent practice in jurisdictions, for example, in some jurisdictions, computer readable media does not include electrical carrier signals and telecommunications signals as is required by legislation and patent practice.
The above-mentioned embodiments are only used for illustrating the technical solutions of the present application, and not for limiting the same; although the present application has been described in detail with reference to the foregoing embodiments, it should be understood by those of ordinary skill in the art that: the technical solutions described in the foregoing embodiments may still be modified, or some technical features may be equivalently replaced; such modifications and substitutions do not substantially depart from the spirit and scope of the embodiments of the present application and are intended to be included within the scope of the present application.

Claims (10)

1. An image backlight detection method, characterized in that the image backlight detection method comprises:
acquiring the area where a target object is located in an image to be detected and the average brightness value of the area where the target object is located, wherein the range of the area where the target object is located is marked using a boundary frame;
enlarging the boundary frame and acquiring the average brightness value of the region within the enlarged boundary frame;
calculating the ratio of the average brightness value of the area where the target object is located to the average brightness value of the region within the enlarged boundary frame;
and judging whether the ratio is smaller than a preset brightness threshold, and if so, marking that the target object in the image to be detected is in backlight.
2. The image backlight detection method according to claim 1, wherein the image to be detected is a preview picture in a camera.
3. The image backlight detection method of claim 2, further comprising, after marking the target object in the image to be detected in backlight:
and prompting a user to adjust the shooting angle of the camera or execute backlight compensation processing.
4. The image backlight detection method of claim 1, further comprising:
and acquiring the brightness threshold according to a plurality of sample images in backlight.
5. The image backlight detection method of claim 1, wherein the boundary frame is a bounding box.
6. The image backlight detection method of claim 1, wherein said enlarging the boundary frame comprises:
obtaining a four-dimensional vector (w1, h1, x, y) of the boundary frame, wherein (x, y) represents the coordinates of the center point of the boundary frame, w1 represents the width of the boundary frame, and h1 represents the height of the boundary frame;
and enlarging the width w1 and the height h1 of the boundary frame N times, wherein the four-dimensional vector of the enlarged boundary frame is (w1 × N, h1 × N, x, y), and N is a number greater than 1.
7. The image backlight detection method of claim 1, further comprising:
and if the enlarged boundary frame exceeds the range of the image to be detected, determining the overlapped area of the enlarged boundary frame and the image to be detected as the area in the enlarged boundary frame.
8. An image backlight detection device, characterized by comprising:
a first brightness obtaining module, used for obtaining the area where a target object is located in an image to be detected and the average brightness value of the area where the target object is located, wherein the range of the area where the target object is located is marked using a boundary frame;
a second brightness obtaining module, used for enlarging the boundary frame and obtaining the average brightness value of the region within the enlarged boundary frame;
a ratio calculation module, used for calculating the ratio of the average brightness value of the area where the target object is located to the average brightness value of the region within the enlarged boundary frame;
and a backlight determining module, used for judging whether the ratio is smaller than a preset brightness threshold, and if the ratio is smaller than the brightness threshold, marking that the target object in the image to be detected is in backlight.
9. A terminal device comprising a memory, a processor and a computer program stored in the memory and executable on the processor, wherein the processor implements the steps of the image backlight detection method according to any one of claims 1 to 7 when executing the computer program.
10. A computer-readable storage medium, in which a computer program is stored, which, when being executed by a processor, carries out the steps of the image backlighting detection method according to any one of claims 1 to 7.
CN201910276705.9A 2019-04-08 2019-04-08 Image backlight detection method, image backlight detection device and terminal equipment Pending CN111861965A (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN201910276705.9A CN111861965A (en) 2019-04-08 2019-04-08 Image backlight detection method, image backlight detection device and terminal equipment


Publications (1)

Publication Number Publication Date
CN111861965A (en) 2020-10-30

Family

ID=72951889

Family Applications (1)

Application Number Title Priority Date Filing Date
CN201910276705.9A Pending CN111861965A (en) 2019-04-08 2019-04-08 Image backlight detection method, image backlight detection device and terminal equipment

Country Status (1)

Country Link
CN (1) CN111861965A (en)

Cited By (2)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN112949423A (en) * 2021-02-07 2021-06-11 深圳市优必选科技股份有限公司 Object recognition method, object recognition device, and robot
CN114007019A (en) * 2021-12-31 2022-02-01 杭州魔点科技有限公司 Method and system for predicting exposure based on image brightness in backlight scene

Citations (5)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
JPH05176220A (en) * 1991-12-25 1993-07-13 Matsushita Electric Ind Co Ltd Automatic exposure controller
JP2010102426A (en) * 2008-10-22 2010-05-06 Mitsubishi Electric Corp Image processing apparatus and image processing method
JP2015114465A (en) * 2013-12-11 2015-06-22 キヤノン株式会社 Imaging device, control method therefor, and control program
CN106973236A (en) * 2017-05-24 2017-07-21 上海与德科技有限公司 A kind of filming control method and device
CN108734676A (en) * 2018-05-21 2018-11-02 Oppo广东移动通信有限公司 Image processing method and device, electronic equipment, computer readable storage medium




Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination