CN110086987B - Camera visual angle cutting method and device and storage medium - Google Patents

Camera visual angle cutting method and device and storage medium

Info

Publication number
CN110086987B
CN110086987B (application CN201910324884.9A)
Authority
CN
China
Prior art keywords
camera
area
shielding
visual angle
scale
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Active
Application number
CN201910324884.9A
Other languages
Chinese (zh)
Other versions
CN110086987A (en)
Inventor
于洋 (Yu Yang)
Current Assignee
Huizhou TCL Mobile Communication Co Ltd
Original Assignee
Huizhou TCL Mobile Communication Co Ltd
Priority date
Filing date
Publication date
Application filed by Huizhou TCL Mobile Communication Co Ltd filed Critical Huizhou TCL Mobile Communication Co Ltd
Priority to CN201910324884.9A priority Critical patent/CN110086987B/en
Publication of CN110086987A publication Critical patent/CN110086987A/en
Application granted granted Critical
Publication of CN110086987B publication Critical patent/CN110086987B/en

Classifications

    • H ELECTRICITY
    • H04 ELECTRIC COMMUNICATION TECHNIQUE
    • H04N PICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N17/00 Diagnosis, testing or measuring for television systems or their details
    • H04N17/002 Diagnosis, testing or measuring for television systems or their details for television cameras
    • H ELECTRICITY
    • H04 ELECTRIC COMMUNICATION TECHNIQUE
    • H04N PICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N23/00 Cameras or camera modules comprising electronic image sensors; Control thereof
    • H04N23/60 Control of cameras or camera modules
    • H04N23/62 Control of parameters via user interfaces

Landscapes

  • Engineering & Computer Science (AREA)
  • Multimedia (AREA)
  • Signal Processing (AREA)
  • Human Computer Interaction (AREA)
  • Health & Medical Sciences (AREA)
  • Biomedical Technology (AREA)
  • General Health & Medical Sciences (AREA)
  • Studio Devices (AREA)
  • Telephone Function (AREA)

Abstract

The application relates to a camera visual angle cutting method, a device and a storage medium, wherein the camera visual angle cutting method is applied to a mobile terminal and comprises the following steps: receiving a shooting request of a preset application; starting a camera of the mobile terminal according to the shooting request; detecting whether a shielding area exists in a preview picture shot by the camera; and if so, cutting the visual angle of the camera according to the shielding area. In this way, the visual angle of the camera can be adaptively cut, so that the problem of visual angle interference caused by the assembly tolerance of the camera is avoided and the imaging effect of the camera is improved.

Description

Camera visual angle cutting method and device and storage medium
Technical Field
The application relates to the technical field of display, in particular to a method and a device for cutting a visual angle of a camera and a storage medium.
Background
At present, front and rear cameras have become a mainstream standard configuration of mobile terminal products, and the appearance camera holes demanded by users are ever smaller.
However, when the appearance camera hole demanded by users and the market is minimized, the hole can interfere with the imaging visual angle cone of the camera due to the assembly tolerance of the camera, which in turn degrades the imaging effect of the camera.
Disclosure of Invention
The application aims to provide a camera visual angle cutting method, a camera visual angle cutting device and a storage medium, so as to avoid the problem of visual angle interference caused by the assembly tolerance of a camera and thereby improve the imaging effect of the camera.
In order to solve the above problem, an embodiment of the present application provides a method for clipping a camera view angle, where the method for clipping a camera view angle is applied to a mobile terminal, and includes: receiving a shooting request of a preset application; starting a camera of the mobile terminal according to the shooting request; detecting whether a shielding area exists in a preview picture shot by a camera; if yes, cutting the visual angle of the camera according to the shielding area.
The step of cutting the visual angle of the camera according to the shielding area specifically includes: determining a target rectangular area according to the shielding area and a preset length-width ratio, wherein the target rectangular area is located in an unshielded area of the preview picture and is the rectangular area which meets the preset length-width ratio and has the largest area; determining cutting parameters according to the target rectangular area; and cutting the visual angle of the camera based on the cutting parameters.
The step of detecting whether a shielding area exists in the preview picture specifically includes: reading three-primary-color data corresponding to a plurality of mark areas; and detecting whether a shielding area exists in the preview picture according to the three-primary-color data.
The plurality of mark areas are four scale marks, one end of each scale mark being located at the geometric center of the preview picture and the other end being located at one of the four vertex points of the preview picture. The step of reading the three-primary-color data corresponding to the plurality of mark areas specifically includes: reading the three-primary-color data of the scale points on the four scale marks. The step of detecting whether a shielding area exists in the preview picture according to the three-primary-color data specifically includes: calculating the tristimulus values of the scale points according to the three-primary-color data; for each scale mark, sequentially judging whether the difference between the tristimulus values of two adjacent scale points on the scale mark is greater than a first preset threshold value; when the difference between the tristimulus values of two adjacent scale points on the scale mark is greater than the first preset threshold value, taking the scale point of the two that is closer to the geometric center as a shielding critical point; and determining the shielding area in the preview picture according to the shielding critical point.
The step of determining the target rectangular area according to the shielded area and the preset length-width ratio specifically comprises the following steps: determining a rectangular region set to be selected according to the shielding critical point, the shielding region and a preset length-width ratio, wherein the rectangular regions to be selected in the rectangular region set to be selected correspond to the shielding critical point one by one; and calculating the area of the rectangular region to be selected in the rectangular region set to be selected, and screening the rectangular region to be selected with the largest area as a target rectangular region.
The step of reading the three-primary-color data corresponding to the plurality of mark areas specifically includes: reading three-primary-color data of pixel points in four vertex angle areas; the step of detecting whether a shielding area exists in the preview picture according to the three primary color data specifically comprises the following steps: calculating the tristimulus values of the pixel points according to the tristimulus data; judging whether the tristimulus values of the pixel points are smaller than a second preset threshold value or not; if yes, the pixel point is taken as a black pixel point; and counting the number of black pixel points in each vertex angle region, and taking the vertex angle region with the maximum number of the black pixel points as a shielding region.
The step of determining the target rectangular area according to the shielded area and the preset length-width ratio specifically comprises the following steps: traversing the pixel points in the shielding region to obtain a boundary pixel point set between the black pixel point and other pixel points in the shielding region; fitting based on the boundary pixel point set to obtain a boundary line, and taking an intersection point of the boundary line and a diagonal line of the preview picture as a reference point; and determining a target rectangular area according to the reference point, the shielded area and the preset length-width ratio.
After the step of detecting whether the preview picture shot by the camera has the shielding area, the method further comprises the following steps: if the preview picture has a shielding area, determining the current visual angle of the camera according to the shielding area; judging whether the current visual angle is larger than a preset visual angle or not; if yes, the step of cutting the visual angle of the camera according to the shielding area is executed.
In order to solve the above problem, an embodiment of the present application further provides a camera visual angle cutting device, which is applied to a mobile terminal and includes: a receiving module for receiving a shooting request of a preset application; a starting module for starting a camera of the mobile terminal according to the shooting request; a detection module for detecting whether a shielding area exists in a preview picture shot by the camera; and a cutting module for cutting the visual angle of the camera according to the shielding area when the shielding area exists in the preview picture.
In order to solve the above problem, an embodiment of the present application further provides a computer-readable storage medium, where a plurality of instructions are stored, where the instructions are adapted to be loaded by a processor to execute any one of the above camera angle cropping methods.
The beneficial effect of this application is as follows. Different from the prior art, the camera visual angle cutting method provided by the application receives a shooting request of a preset application, starts the camera of the mobile terminal according to the shooting request, then detects whether a shielding area exists in a preview picture shot by the camera, and, if so, cuts the visual angle of the camera according to the shielding area. The visual angle of the camera can thus be adaptively cut, so that the problem of visual angle interference caused by the assembly tolerance of the camera is avoided and the imaging effect of the camera is improved.
Drawings
In order to illustrate the technical solutions in the embodiments of the present application more clearly, the drawings needed in the description of the embodiments are briefly introduced below. The drawings in the following description are only some embodiments of the present application; those skilled in the art can obtain other drawings based on these drawings without creative effort.
Fig. 1 is a schematic flow chart of a camera viewing angle cutting method provided in an embodiment of the present application;
fig. 2 is a schematic structural diagram of assembly deviation of a camera provided in the embodiment of the present application;
FIG. 3 is a schematic diagram of a preview screen captured by the camera in FIG. 2;
fig. 4 is a schematic diagram of a state in which a preset picture is shot by a camera according to an embodiment of the present application;
fig. 5 is another schematic flow chart of a camera view angle cropping method provided in the embodiment of the present application;
FIG. 6 is a schematic flow chart of S14 in FIG. 1;
fig. 7 is a schematic structural diagram of a preview screen provided in an embodiment of the present application;
fig. 8 is another schematic flow chart of a camera view angle cropping method provided in the embodiment of the present application;
fig. 9 is a schematic flowchart of S141 in fig. 6;
fig. 10 is another schematic structural diagram of a preview screen provided in an embodiment of the present application;
fig. 11 is another schematic flow chart of a camera view angle cropping method provided in the embodiment of the present application;
fig. 12 is another schematic flow chart of S141 in fig. 6;
fig. 13 is another schematic flow chart diagram of a camera view angle cropping method provided in the embodiment of the present application;
fig. 14 is a schematic structural diagram of a camera viewing angle cutting device according to an embodiment of the present application;
fig. 15 is a schematic structural diagram of a mobile terminal according to an embodiment of the present application;
fig. 16 is another schematic structural diagram of a mobile terminal according to an embodiment of the present application.
Detailed Description
The present application provides a camera visual angle cutting method, device and storage medium. To make the purpose, technical solution and technical effect of the present application clearer, the application is described in further detail below. It should be understood that the specific embodiments described herein only explain the present application and are not intended to limit it.
At present, the appearance camera holes demanded by mobile terminal users are ever smaller, but when the appearance camera hole demanded by users and the market is minimized, it can interfere with the imaging visual angle cone of the camera due to the assembly tolerance of the camera, which in turn degrades the imaging effect of the camera. To solve this technical problem, the present application provides a camera visual angle cutting method, so as to avoid the problem of visual angle interference caused by the assembly tolerance of the camera and thereby improve the imaging effect of the camera.
Referring to fig. 1, fig. 1 is a schematic flow chart of a camera view angle clipping method according to an embodiment of the present application, which is applied to a camera view angle clipping device, where the camera view angle clipping device is located in a mobile terminal, and a specific flow of the camera view angle clipping method may be as follows:
s11: and receiving a shooting request of a preset application.
The mobile terminal receives a shooting request sent by a preset application. The preset application may be an application program for camera visual angle testing that is pre-installed during assembly of the complete mobile terminal. After the complete machine is assembled, the visual angle of the camera can be tested by opening this preset application installed on the mobile terminal. Specifically, after the preset application is opened, it sends a shooting request to the mobile terminal.
S12: and starting a camera of the mobile terminal according to the shooting request.
After receiving a photographing request sent by a preset application, the mobile terminal starts a camera to photograph. Specifically, after the camera is started, an image within the visual angle range of the camera can be shot, but the shooting shutter is not triggered, that is, the display screen of the mobile terminal displays a preview picture shot by the camera in real time.
S13: whether a shielded area exists in a preview screen shot by the camera is detected, if so, S14 is executed, otherwise, the process is passed and the process is ended.
When the camera is assembled according to the prior art, assembly tolerance causes the camera to deviate to a certain degree, so that the photosensitive area of the camera may be partially covered by the camera light inlet hole, which affects the imaging effect of the camera. For example, as shown in fig. 2, due to assembly tolerance, the two right-side top corners of the photosensitive area 21 of the camera are partially covered by the camera light inlet hole 22. Correspondingly, as shown in fig. 2 and 3, since the photosensitive area 21 of the camera is partially covered by the camera light inlet hole 22, a shielded area 311 and an unshielded area 312 exist in the preview picture 31 shot by the camera.
The preview image may be obtained by taking a reference object by the camera, and specifically, as shown in fig. 4, the reference object 41 may be a circular scale, a color picture or a gray scale picture which is placed in advance within the range a of the angle of view taken by the camera 42. In this embodiment, the masked region corresponds to a black or dark region in the preview screen.
In one embodiment, the preview screen may include a plurality of mark areas, as shown in fig. 5, S13 may specifically include:
s131: reading three primary color data corresponding to the plurality of mark areas.
If the three primary color data of each pixel point in the preview picture is read, the calculation amount is large. Therefore, in order to reduce the amount of computation, only the three primary color data corresponding to the plurality of mark regions in the preview screen may be read.
The three primary color data includes R, G and B values of the pixel points. Considering that the shaded areas in the preview screen are generally located at four corners of the preview screen, the plurality of marked areas may be distributed at the four corners of the preview screen.
S132: whether or not a mask area exists in the preview screen is detected from the three primary color data, and if so, S14 is executed, otherwise, the process is passed and ended.
In a preview picture shot by a camera, generally, adjacent pixel points have similar three primary color data. Therefore, when the three primary color data of a pixel point is obviously lower than the three primary color data of other adjacent pixel points, the pixel point can be considered as a shielded pixel point. And the plurality of shielded pixel points form a shielding area in the preview picture.
When the preview picture is a grayscale picture, S131 may instead be: reading the gray scale data corresponding to the plurality of mark areas; and S132 may instead be: detecting whether a shielding area exists in the preview picture according to the gray scale data. However, since the three-primary-color data of the pixel points is more conducive to accurately detecting the shielded area in the preview picture, in this embodiment the preview picture is preferably a color picture.
S14: and cutting the visual angle of the camera according to the shielding area.
In this embodiment, the viewing angle of the camera may be adaptively tailored according to the shielded area, so as to avoid the shielded area appearing in the preview screen due to the shielded photosensitive area of the camera.
In an embodiment, referring to fig. 6, S14 may specifically include:
s141: and determining a target rectangular area according to the shielded area and the preset length-width ratio, wherein the target rectangular area is located in an unshielded area of the preview picture and is the rectangular area which has the largest area and meets the preset length-width ratio in the unshielded area.
For example, the area of the preview picture except the shaded area is determined as the non-shaded area, and one or more rectangular areas are formed in the non-shaded area according to the preset length-width ratio. Then, the rectangular region having the largest area is set as a target rectangular region. The preset length-width ratio is the length-width ratio of the display screen of the mobile terminal, namely the length-width ratio of the preview picture.
S142: and determining a cutting parameter according to the target rectangular area.
For example, four vertex point coordinates of the target rectangular area are obtained, and the cutting parameters are determined according to the four vertex point coordinates.
S143: and based on the cutting parameters, cutting the visual angle of the camera.
For example, based on the four vertex point coordinates, a preview picture shot by a camera is cut and then output to a display screen of the mobile terminal for display. Therefore, the visual angle of the camera is cut based on the rectangular area which has the largest area in the unshaded area and meets the preset length-width ratio, the minimum visual angle cutting can be realized, and the imaging effect of the camera can be exerted to the maximum extent.
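Steps S141 to S143 above can be sketched as follows. This is an illustrative outline only; the function and parameter names (crop_params, frame_w, frame_h) are assumptions and do not appear in the patent.

```python
def crop_params(rect_vertices, frame_w, frame_h):
    """Derive crop parameters (left, top, width, height) from the four
    vertex point coordinates of the target rectangular area (S142),
    clamped to the camera frame so the crop never exceeds it (S143)."""
    xs = [v[0] for v in rect_vertices]
    ys = [v[1] for v in rect_vertices]
    left, top = min(xs), min(ys)
    width = min(max(xs) - left, frame_w - left)
    height = min(max(ys) - top, frame_h - top)
    return left, top, width, height
```

For example, a 1600 by 900 target rectangle offset by (100, 50) inside a 1920 by 1080 frame yields the crop (100, 50, 1600, 900).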
In one embodiment, as shown in fig. 7, the preview picture 71 includes a plurality of mark areas 711, and the plurality of mark areas 711 may be four scale marks a/b/c/d. One end of each of the four scale marks a/b/c/d is located at the geometric center of the preview picture 71, and the other ends are located at the four vertex points of the preview picture 71. Since the preview picture 71 is rectangular, the four scale marks a/b/c/d coincide with the diagonals of the preview picture 71. As shown in fig. 8, S131 may specifically include:
s1311: reading the three primary color data of the scale points on the four scale marks.
For example, with continued reference to fig. 7, the scale mark a may include 10 scale points a1 to a10, the scale mark b may include 10 scale points b1 to b10, the scale mark c may include 10 scale points c1 to c10, and the scale mark d may include 10 scale points d1 to d10. The three-primary-color data (R value, G value, B value) of the scale points X on the four scale marks a/b/c/d are shown in Table 1:
table 1 tricolor data and tristimulus values of scale points on scale marks
(Table 1 is reproduced as an image in the original publication; the per-point R, G, B values and tristimulus values are not recoverable here.)
In some embodiments, in consideration of the fact that the probability that the mask region appears in the region near the vertex point in the preview screen is high, in order to improve the accuracy of determining the mask region, the distribution density of the scale points on the scale lines may be set such that the distribution density of the scale points close to the vertex point is higher than the distribution density of the scale points far away from the vertex point.
With continuing reference to fig. 8, S132 may specifically include:
s1321: and calculating the tristimulus values of the scale points according to the tristimulus data.
With continued reference to Table 1 above, the tristimulus value Y of a scale point X is equal to the sum of the R value, G value and B value of that scale point. For example, if the three-primary-color data of the scale point a1 is R value 225, G value 243 and B value 23, then the tristimulus value Y of the scale point a1 is 491.
S1322: and for each scale mark, sequentially judging whether the difference between the three primary color values of two adjacent scale points on the scale mark is greater than a first preset threshold value, if so, executing S1323, and if not, passing and ending.
For example, for each scale mark, it is sequentially judged whether the difference between the tristimulus values of two adjacent scale points on the scale mark is greater than 100. Specifically, for each scale mark, this judgment may proceed along the direction from the geometric center of the preview picture to the vertex point of the preview picture. For example, for the scale mark a, it is sequentially judged, in the direction from the geometric center to the vertex point, whether the difference between a1 and a2 is greater than 100, whether the difference between a2 and a3 is greater than 100, and so on, up to whether the difference between a8 and a9 is greater than 100 and whether the difference between a9 and a10 is greater than 100.
S1323: and when the difference between the three primary color values of two adjacent scale points on the scale mark is greater than a first preset threshold value, taking the scale point close to the geometric center of the two adjacent scale points as a shielding critical point.
In S1322 above, for each scale mark, once the difference between the tristimulus values of two adjacent scale points on the scale mark is detected to be greater than the first preset threshold value, judgment of the remaining scale points on that scale mark stops, and the scale point of the two that is closer to the geometric center is taken as the shielding critical point.
For example, referring to Table 1 above, for the scale mark a it is sequentially judged, in the direction from the geometric center of the preview picture to its vertex point, whether the difference between a1 and a2 is greater than 100, whether the difference between a2 and a3 is greater than 100, and so on; when the difference between a8 and a9 is found to be greater than 100, the judgment stops and a8 is taken as the shielding critical point. For the scale mark b, it is sequentially judged in the same direction whether the difference between b1 and b2 is greater than 100, whether the difference between b2 and b3 is greater than 100, and so on, up to whether the difference between b9 and b10 is greater than 100; since no difference between the tristimulus values of two adjacent scale points on the scale mark b is greater than 100, it can be determined that no shielding critical point exists on the scale mark b. In the same way, it can be determined that there is no shielding critical point on the scale mark c and that there is a shielding critical point d9 on the scale mark d.
S1324: and determining a shielding area in the preview picture according to the shielding critical point.
Continuing the above example, the area enclosed by the sequentially connected scale point a8, scale point b10, scale point c10 and scale point d9 may be taken as the unshielded area; correspondingly, the area of the preview picture outside the unshielded area is the shielding area.
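As a minimal sketch of steps S1321 to S1324 for a single scale mark, the scan below assumes the scale points are ordered from the geometric center toward the vertex point; the names tristimulus, find_critical_point and FIRST_THRESHOLD are illustrative and do not appear in the patent.

```python
FIRST_THRESHOLD = 100  # the first preset threshold used in the example

def tristimulus(rgb):
    """Tristimulus value Y of a scale point: the sum of its R, G and B values (S1321)."""
    r, g, b = rgb
    return r + g + b

def find_critical_point(scale_points):
    """scale_points: (R, G, B) tuples ordered from the geometric center
    to the vertex point. Returns the index of the shielding critical
    point, i.e. the member of the first offending adjacent pair that is
    closer to the center (S1322/S1323), or None if the mark is unshielded."""
    ys = [tristimulus(p) for p in scale_points]
    for i in range(len(ys) - 1):
        if abs(ys[i] - ys[i + 1]) > FIRST_THRESHOLD:
            return i
    return None
```

With the Table 1 example, tristimulus((225, 243, 23)) gives 491 for scale point a1; a mark whose tristimulus values drop sharply between its 8th and 9th points yields index 7, matching the a8 critical point in the text (0-based index versus the 1-based labels).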
Further, as shown in fig. 9, S141 may specifically include:
s1411: and determining a rectangular region set to be selected according to the shielding critical point, the shielding region and a preset length-width ratio, wherein the rectangular regions to be selected in the rectangular region set to be selected correspond to the shielding critical point one to one.
Specifically, for each scale mark, if there is a mask critical point, the mask critical point is used as a connection point of the scale mark, and if there is no mask critical point, an end point of the scale mark at the vertex angle position of the preview screen is used as a connection point of the scale mark. And determining the area enclosed by the connecting points of the four scale marks which are sequentially connected according to the clockwise direction or the anticlockwise direction as the non-shielding area. Then, for each shielding critical point, the shielding critical point is taken as a vertex point, and a rectangular region to be selected with the largest area meeting the preset length-width ratio is made in the non-shielding region. The number of the shielding critical points is at least one, each shielding critical point corresponds to one rectangular region to be selected, and the rectangular regions to be selected corresponding to the at least one shielding critical point form a rectangular region set to be selected.
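Step S1411 might be sketched as below, under the simplifying assumptions that the frame has size frame_w by frame_h, that each shielding critical point lies on a diagonal near its shielded corner, and that its candidate rectangle uses the critical point as one vertex and extends toward the diagonally opposite corner while keeping the preset length-width ratio. All names are illustrative, not identifiers from the patent.

```python
def candidate_rect(critical, frame_w, frame_h, corner):
    """One candidate rectangular area per shielding critical point (S1411).
    critical: (x, y) of the critical point (origin at the top-left, y down);
    corner: which frame vertex its scale mark ends at ('tl', 'tr', 'bl', 'br').
    Returns (width, height) of the aspect-preserving candidate rectangle."""
    x, y = critical
    # extent available from the critical point toward the opposite corner
    avail_w = frame_w - x if corner in ('tl', 'bl') else x
    avail_h = frame_h - y if corner in ('tl', 'tr') else y
    # shrink uniformly so the rectangle keeps the frame's length-width ratio
    scale = min(avail_w / frame_w, avail_h / frame_h)
    return frame_w * scale, frame_h * scale

def largest_candidate(rects):
    """S1412: pick the candidate rectangle with the largest area."""
    return max(rects, key=lambda wh: wh[0] * wh[1])
```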
S1412: and calculating the area of the rectangular region to be selected in the rectangular region set to be selected, and screening the rectangular region to be selected with the largest area as a target rectangular region.
In another embodiment, as shown in fig. 10, the preview picture 81 includes a plurality of mark areas 811, and the plurality of mark areas 811 may be four vertex angle areas A/B/C/D, located at the four corners of the preview picture 81 respectively. In this embodiment, the preview picture 81 is preferably a solid-color picture; for example, the preview picture 81 may be a yellow picture. As shown in fig. 11, S131 may specifically include:
s1312: reading the three primary color data of the pixel points in the four vertex angle areas.
For example, with continued reference to fig. 10, the three-primary-color data of all the pixel points located in the four vertex angle areas A/B/C/D is read, wherein the three-primary-color data includes the R value, G value and B value of each pixel point.
With continued reference to fig. 11, S132 may specifically include:
s1325: and calculating the tristimulus values of the pixel points according to the tristimulus data.
And the tristimulus values of the pixel points are equal to the sum of the R values, the G values and the B values of the pixel points.
S1326: and judging whether the tristimulus values of the pixel points are smaller than a second preset threshold value, if so, executing S1327, and if not, passing and ending.
For example, it is determined whether the tristimulus values of the pixels are less than 510.
S1327: and taking the pixel points as black pixel points.
For example, if the tristimulus value of a pixel is less than 510, the pixel is regarded as a black pixel.
S1328: and counting the number of black pixel points in each vertex angle region, and taking the vertex angle region with the maximum number of the black pixel points as a shielding region.
For example, with continued reference to fig. 10, if the number of black pixel points in the vertex angle area A is 100, the number in the vertex angle area B is 20, the number in the vertex angle area C is 25 and the number in the vertex angle area D is 30, then the vertex angle area A has the largest number of black pixel points, and the vertex angle area A of the preview picture 81 is determined to be the shielding area.
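Steps S1325 to S1328 can be outlined as follows; the threshold value 510 is the example's second preset threshold, and the function names are assumptions.

```python
SECOND_THRESHOLD = 510  # the second preset threshold used in the example

def is_black(rgb):
    """S1326/S1327: a pixel point counts as black when its tristimulus
    value (R + G + B) is smaller than the second preset threshold."""
    return sum(rgb) < SECOND_THRESHOLD

def shielded_corner(corner_regions):
    """S1328: corner_regions maps a vertex angle area name to its list of
    (R, G, B) pixel points; the area with the most black pixel points is
    taken as the shielding area."""
    counts = {name: sum(is_black(p) for p in pixels)
              for name, pixels in corner_regions.items()}
    return max(counts, key=counts.get)
```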
Further, as shown in fig. 12, S141 may specifically include:
s1413: and traversing the pixel points in the shielding region to obtain a boundary pixel point set between the black pixel point and other pixel points in the shielding region.
Specifically, the pixel points in the shielding area may be represented as P(x, y), where x and y are the row coordinate and the column coordinate of the pixel point respectively. For each row of pixel points, it is sequentially judged, in order of the coordinate x from small to large, whether the difference between the tristimulus values of two adjacent pixel points P(x, y) and P(x-1, y) is greater than a third preset threshold value (for example, 50); when such a difference greater than the third preset threshold value is found in the row, the pixel point with the larger tristimulus value of the two adjacent pixel points is taken as the boundary pixel point Cy of that row. In this way a plurality of boundary pixel points Cy can be obtained, and these boundary pixel points Cy form the boundary pixel point set.
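Step S1413 might look like the following sketch, which, for simplicity, treats the shielding area as a 2D grid region[y][x] of (R, G, B) pixel points and scans each row in increasing column order; the names and layout are illustrative assumptions.

```python
THIRD_THRESHOLD = 50  # the third preset threshold used in the example

def boundary_pixels(region):
    """S1413: scan each row of the shielding area; the first adjacent pair
    whose tristimulus values differ by more than the third preset threshold
    marks the boundary, and the brighter pixel of the pair is recorded as
    that row's boundary pixel point."""
    points = []
    for y, row in enumerate(region):
        ys = [sum(p) for p in row]
        for x in range(1, len(ys)):
            if abs(ys[x] - ys[x - 1]) > THIRD_THRESHOLD:
                bx = x if ys[x] > ys[x - 1] else x - 1
                points.append((bx, y))
                break  # at most one boundary pixel point per row
    return points
```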
S1414: and fitting based on the boundary pixel point set to obtain a boundary line, and taking the intersection point of the boundary line and the diagonal line of the preview picture as a reference point.
For example, linear fitting is performed based on a plurality of boundary pixel points in the boundary pixel point set to obtain a boundary straight line. Then, an intersection of the boundary line and a diagonal line of the preview screen is calculated, and the intersection is set as a reference point.
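The fitting and intersection of S1414 might look like the following sketch. It assumes the shielding region is vertex angle region A (top-left), so that in (row, column) coordinates the relevant diagonal passes through the origin with slope width/height; `np.polyfit` performs the least-squares linear fit, and the boundary is assumed not to be parallel to the diagonal:

```python
import numpy as np

def reference_point(points, width, height):
    """Fit a straight line col = a*row + b through the boundary pixel
    points, then intersect it with the preview diagonal col = k*row
    through the top-left corner, returning the reference point."""
    rows = np.array([p[0] for p in points], dtype=float)
    cols = np.array([p[1] for p in points], dtype=float)
    a, b = np.polyfit(rows, cols, 1)  # least-squares linear fit
    k = width / height                # diagonal slope in (row, col) terms
    r = b / (k - a)                   # solve a*r + b = k*r
    return r, k * r
```

With a vertical boundary at column 3 in a square preview, the reference point lands at (3, 3) on the diagonal.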
S1415: and determining a target rectangular area according to the reference point, the shielded area and the preset length-width ratio.
Specifically, a region enclosed between the boundary line and the vertex angle where the shielded region is located is determined as an actual shielded region, and correspondingly, a region other than the actual shielded region in the preview screen is set as an unmasked region. And then, a rectangular region with the largest area meeting the preset length-width ratio is made on the unmasked region by taking the reference point as a vertex point, and the rectangular region is determined as a target rectangular region. The preset length-width ratio is the length-width ratio of the display screen of the mobile terminal, namely the length-width ratio of the preview picture.
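For the top-left-corner case, the construction of S1415 reduces to shrinking one side of the area remaining below and to the right of the reference point so that the preset length-width ratio holds. The following is a sketch under that assumption; the coordinate convention and parameter names are illustrative:

```python
def target_rectangle(ref, width, height, aspect=16 / 9):
    """Largest rectangle with the preset aspect ratio (width/height) whose
    top-left vertex is the reference point and which stays inside the
    preview; assumes the shielding region is the top-left vertex angle."""
    rx, ry = ref  # reference point as (col, row)
    avail_w, avail_h = width - rx, height - ry
    if avail_w / avail_h > aspect:
        h = avail_h          # height is the binding constraint
        w = aspect * h
    else:
        w = avail_w          # width is the binding constraint
        h = w / aspect
    return rx, ry, w, h
```

With the reference point at the corner itself, the target rectangle is the whole preview; moving the reference point inward shrinks it proportionally.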
In another embodiment, as shown in fig. 13, after S13, the method may further include:
S15: and if the preview picture has the shielding area, determining the current visual angle of the camera according to the shielding area.
Specifically, the area except the shielded area in the preview picture is taken as the unshielded area, a rectangular area with the largest area meeting the preset length-width ratio is made in the unshielded area, the diagonal length of the rectangular area is calculated, and the current visual angle of the camera is then obtained from the imaging depth of the camera as: current visual angle = 2 × arctan(diagonal length / (2 × imaging depth)). For example, if the diagonal length of the rectangular area is 3.086, that is, the diagonal length of the imaging visual angle of the camera is 3.086, and the imaging depth of the camera is 2.0, the current visual angle of the camera is calculated to be approximately 75 degrees.
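The worked numbers above are consistent with the pinhole relation angle = 2·arctan(diagonal / (2 × imaging depth)); a minimal sketch:

```python
import math

def current_view_angle(diag, depth):
    """Diagonal visual angle in degrees from the imaging diagonal length
    and the imaging depth: angle = 2 * atan(diag / (2 * depth))."""
    return math.degrees(2 * math.atan(diag / (2 * depth)))
```

For a diagonal of 3.086 and an imaging depth of 2.0, this evaluates to roughly 75 degrees, matching the example.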
S16: and judging whether the current visual angle is larger than the preset visual angle; if so, S14 is executed; otherwise, the check fails and the process ends.
The preset visual angle may be the maximum visual angle of the camera promised by the specification of the mobile terminal under ideal conditions. For example, the preset visual angle may be 69 degrees. That is, when the current visual angle of the camera is less than or equal to the preset visual angle, the assembly deviation of the camera is too large: although cutting the visual angle of the camera can prevent the shielding area from appearing in the preview picture, the cut visual angle would not exceed the preset visual angle and therefore cannot meet the visual angle requirement of the mobile terminal camera. Therefore, when the shielding area exists in the preview picture and the current visual angle of the camera is not greater than the preset visual angle, the mobile terminal needs to be repaired or scrapped.
Further, with continued reference to fig. 13, after executing S14, execution may return to S13 to ensure that no more occlusion regions appear in the preview screen captured by the camera after the perspective cropping.
Different from the prior art, in the camera visual angle cutting method provided in this embodiment, a shooting request of a preset application is received, the camera of the mobile terminal is started according to the shooting request, whether a shielding area exists in the preview picture shot by the camera is detected, and if the shielding area exists, the visual angle of the camera is cut according to the shielding area. The visual angle of the camera can thus be adaptively cut, which avoids the problem of visual angle interference caused by the assembly tolerance of the camera and improves the imaging effect of the camera.
Referring to fig. 14, fig. 14 is a schematic structural diagram of a camera view angle cutting device according to an embodiment of the present application. As shown in fig. 14, the camera view cropping device 600, applied to a mobile terminal, includes: a receiving module 601, a starting module 602, a detecting module 603, and a clipping module 604, wherein:
(1) receiving module 601
The receiving module 601 is configured to receive a shooting request of a preset application.
(2) Start module 602
The starting module 602 is configured to start a camera of the mobile terminal according to the shooting request.
(3) Detection module 603
The detecting module 603 is configured to detect whether a masked area exists in a preview picture captured by the camera.
(4) Cropping module 604
The cropping module 604 is configured to crop the view angle of the camera according to the masked area when the masked area exists in the preview screen.
It should be noted that, in a specific implementation, the above modules may be implemented as independent entities, or may be combined arbitrarily to be implemented as one or several entities, and the specific implementation of the above modules may refer to the foregoing method embodiments, which are not described herein again.
Different from the prior art, in the camera visual angle cutting device provided in this embodiment, a shooting request of a preset application is received, the camera of the mobile terminal is started according to the shooting request, whether a shielding area exists in the preview picture shot by the camera is detected, and if the shielding area exists, the visual angle of the camera is cut according to the shielding area. The visual angle of the camera can thus be adaptively cut, which avoids the problem of visual angle interference caused by the assembly tolerance of the camera and improves the imaging effect of the camera.
Referring to fig. 15, an embodiment of the present application further provides a mobile terminal, where the mobile terminal may be a smart phone, a tablet computer, or the like. As shown in fig. 15, the mobile terminal 400 includes a processor 401, a memory 402. The processor 401 is electrically connected to the memory 402.
The processor 401 is a control center of the mobile terminal 400, connects various parts of the entire mobile terminal using various interfaces and lines, and performs various functions of the mobile terminal and processes data by running or loading an application program stored in the memory 402 and calling data stored in the memory 402, thereby integrally monitoring the mobile terminal.
In this embodiment, the processor 401 in the mobile terminal 400 loads instructions corresponding to processes of one or more application programs into the memory 402 according to the following steps, and the processor 401 runs the application programs stored in the memory 402, thereby implementing various functions:
receiving a shooting request of a preset application;
starting a camera of the mobile terminal according to the shooting request;
detecting whether a shielding area exists in a preview picture shot by a camera;
if yes, cutting the visual angle of the camera according to the shielding area.
Fig. 16 shows a specific structural block diagram of a mobile terminal provided in an embodiment of the present application, where the mobile terminal 300 may be used to implement the camera view angle cropping method provided in the foregoing embodiment. The mobile terminal 300 may be a smart phone or a tablet computer.
The RF circuit 310 is used for receiving and transmitting electromagnetic waves, and for performing interconversion between electromagnetic waves and electrical signals, thereby communicating with a communication network or other devices. The RF circuit 310 may include various existing circuit elements for performing these functions, such as an antenna, a radio frequency transceiver, a digital signal processor, an encryption/decryption chip, a Subscriber Identity Module (SIM) card, memory, and so forth. The RF circuit 310 may communicate with various networks such as the internet, an intranet, or a wireless network, or may communicate with other devices over a wireless network. The wireless network may comprise a cellular telephone network, a wireless local area network, or a metropolitan area network. The wireless network may use various communication standards, protocols, and technologies, including, but not limited to, Global System for Mobile Communication (GSM), Enhanced Data GSM Environment (EDGE), Wideband Code Division Multiple Access (WCDMA), Code Division Multiple Access (CDMA), Time Division Multiple Access (TDMA), Wireless Fidelity (Wi-Fi) (e.g., Institute of Electrical and Electronics Engineers (IEEE) standards IEEE 802.11a, IEEE 802.11b, IEEE 802.11g, and/or IEEE 802.11n), Voice over Internet Protocol (VoIP), Worldwide Interoperability for Microwave Access (Wi-MAX), other suitable protocols for e-mail, instant messaging, and short messages, and any other suitable communication protocol, even including protocols that have not yet been developed.
The memory 320 may be used to store software programs and modules, such as program instructions/modules corresponding to the camera angle cropping method and the camera angle cropping device in the foregoing embodiments, and the processor 380 executes various functional applications and data processing, i.e., implements data transmission functions, by running the software programs and modules stored in the memory 320. The memory 320 may include high speed random access memory and may also include non-volatile memory, such as one or more magnetic storage devices, flash memory, or other non-volatile solid-state memory. In some examples, the memory 320 may further include memory located remotely from the processor 380, which may be connected to the mobile terminal 300 via a network. Examples of such networks include, but are not limited to, the internet, intranets, local area networks, mobile communication networks, and combinations thereof.
The input unit 330 may be used to receive input numeric or character information and generate keyboard, mouse, joystick, optical or trackball signal inputs related to user settings and function control. In particular, the input unit 330 may include a touch-sensitive surface 331 as well as other input devices 332. The touch-sensitive surface 331, also referred to as a touch screen or touch pad, may collect touch operations by a user on or near the touch-sensitive surface 331 (e.g., operations by a user on or near the touch-sensitive surface 331 using a finger, a stylus, or any other suitable object or attachment), and drive the corresponding connection device according to a predetermined program. Alternatively, the touch sensitive surface 331 may comprise two parts, a touch detection means and a touch controller. The touch detection device detects the touch direction of a user, detects a signal brought by touch operation and transmits the signal to the touch controller; the touch controller receives touch information from the touch sensing device, converts the touch information into touch point coordinates, sends the touch point coordinates to the processor 380, and can receive and execute commands sent by the processor 380. In addition, the touch-sensitive surface 331 may be implemented using various types of resistive, capacitive, infrared, and surface acoustic waves. The input unit 330 may comprise other input devices 332 in addition to the touch sensitive surface 331. In particular, other input devices 332 may include, but are not limited to, one or more of a physical keyboard, function keys (such as volume control keys, switch keys, etc.), a trackball, a mouse, a joystick, and the like.
The display unit 340 may be used to display information input by or provided to the user and various graphical user interfaces of the mobile terminal 300, which may be made up of graphics, text, icons, video, and any combination thereof. The Display unit 340 may include a Display panel 341, and optionally, the Display panel 341 may be configured in the form of an LCD (Liquid Crystal Display), an OLED (Organic Light-Emitting Diode), or the like. Further, touch-sensitive surface 331 may overlay display panel 341, and when touch-sensitive surface 331 detects a touch operation thereon or thereabout, communicate to processor 380 to determine the type of touch event, and processor 380 then provides a corresponding visual output on display panel 341 in accordance with the type of touch event. Although in fig. 16, touch-sensitive surface 331 and display panel 341 are implemented as two separate components for input and output functions, in some embodiments, touch-sensitive surface 331 and display panel 341 may be integrated for input and output functions.
The mobile terminal 300 may also include at least one sensor 350, such as a light sensor, a motion sensor, and other sensors. Specifically, the light sensor may include an ambient light sensor that may adjust the brightness of the display panel 341 according to the brightness of ambient light, and a proximity sensor that may turn off the display panel 341 and/or the backlight when the mobile terminal 300 is moved to the ear. As one of the motion sensors, the gravity acceleration sensor can detect the magnitude of acceleration in each direction (generally, three axes), can detect the magnitude and direction of gravity when the mobile phone is stationary, and can be used for applications of recognizing the posture of the mobile phone (such as horizontal and vertical screen switching, related games, magnetometer posture calibration), vibration recognition related functions (such as pedometer and tapping), and the like; as for other sensors such as a gyroscope, a barometer, a hygrometer, a thermometer, and an infrared sensor, which may be further configured on the mobile terminal 300, detailed descriptions thereof are omitted.
Audio circuitry 360, speaker 361, and microphone 362 may provide an audio interface between a user and the mobile terminal 300. The audio circuit 360 may transmit the electrical signal converted from the received audio data to the speaker 361, and the audio signal is converted by the speaker 361 and output; on the other hand, the microphone 362 converts the collected sound signal into an electrical signal, which is received by the audio circuit 360 and converted into audio data, which is then processed by the audio data output processor 380 and then transmitted to, for example, another terminal via the RF circuit 310, or the audio data is output to the memory 320 for further processing. The audio circuit 360 may also include an earbud jack to provide communication of a peripheral headset with the mobile terminal 300.
The mobile terminal 300, which may assist the user in e-mail, web browsing, streaming media access, etc., through the transmission module 370 (e.g., a Wi-Fi module), provides the user with wireless broadband internet access. Although fig. 16 shows the transmission module 370, it is understood that it does not belong to the essential constitution of the mobile terminal 300 and may be omitted entirely within the scope not changing the essence of the invention as needed.
The processor 380 is a control center of the mobile terminal 300, connects various parts of the entire mobile phone using various interfaces and lines, and performs various functions of the mobile terminal 300 and processes data by operating or executing software programs and/or modules stored in the memory 320 and calling data stored in the memory 320, thereby integrally monitoring the mobile phone. Optionally, processor 380 may include one or more processing cores; in some embodiments, processor 380 may integrate an application processor, which primarily handles operating systems, user interfaces, applications, etc., and a modem processor, which primarily handles wireless communications. It will be appreciated that the modem processor described above may not be integrated into processor 380.
The mobile terminal 300 also includes a power supply 390 (e.g., a battery) that provides power to the various components and, in some embodiments, may be logically coupled to the processor 380 via a power management system to manage charging, discharging, and power consumption management functions via the power management system. The power supply 390 may also include any component including one or more of a dc or ac power source, a recharging system, a power failure detection circuit, a power converter or inverter, a power status indicator, and the like.
Although not shown, the mobile terminal 300 may further include a camera (e.g., a front camera, a rear camera), a bluetooth module, etc., which will not be described herein. Specifically, in this embodiment, the display unit of the mobile terminal is a touch screen display, the mobile terminal further includes a memory, and one or more programs, where the one or more programs are stored in the memory and configured to be executed by the one or more processors, and the one or more programs include instructions for:
receiving a shooting request of a preset application;
starting a camera of the mobile terminal according to the shooting request;
detecting whether a shielding area exists in a preview picture shot by a camera;
if yes, cutting the visual angle of the camera according to the shielding area.
In specific implementation, the above modules and units may be implemented as independent entities, or may be combined arbitrarily and implemented as one or several entities, and specific implementations of the above modules and units may refer to the foregoing method embodiments, and are not described herein again.
It will be understood by those skilled in the art that all or part of the steps of the methods of the above embodiments may be performed by instructions or by associated hardware controlled by the instructions, which may be stored in a computer readable storage medium and loaded and executed by a processor. To this end, the present application provides a storage medium, in which a plurality of instructions are stored, where the instructions can be loaded by a processor to execute the steps in any one of the camera angle cropping methods provided in the present application.
Wherein the storage medium may include: read Only Memory (ROM), Random Access Memory (RAM), magnetic or optical disks, and the like.
Since the instructions stored in the storage medium can execute the steps in any of the camera view angle cropping methods provided in the embodiments of the present application, beneficial effects that can be achieved by any of the camera view angle cropping methods provided in the embodiments of the present application can be achieved, which are detailed in the foregoing embodiments and will not be described herein again.
The above operations can be implemented in the foregoing embodiments, and are not described in detail herein.
The above description is only exemplary of the present application and should not be taken as limiting the present application, as any modification, equivalent replacement, or improvement made within the spirit and principle of the present application should be included in the protection scope of the present application.

Claims (10)

1. A camera visual angle cutting method is applied to a mobile terminal and comprises the following steps:
receiving a shooting request of a preset application, wherein the preset application is an application program which is installed on the mobile terminal in advance and used for testing the visual angle of the camera;
starting a camera of the mobile terminal according to the shooting request;
detecting whether a shielding area exists in a preview picture shot by the camera, wherein the shielding area is formed by partially shielding a photosensitive area of the camera by a camera light inlet;
and if so, cutting the visual angle of the camera according to the shielding area.
2. The method for cropping a camera view angle according to claim 1, wherein the step of cropping a camera view angle according to the masked region specifically comprises:
determining a target rectangular area according to the shielded area and a preset length-width ratio, wherein the target rectangular area is located in an unshielded area of the preview picture and is a rectangular area which meets the preset length-width ratio and has the largest area;
determining a cutting parameter according to the target rectangular area;
and based on the cutting parameters, cutting the visual angle of the camera.
3. The camera perspective cropping method of claim 2, wherein the preview screen comprises a plurality of marked areas,
the step of detecting whether a masked region exists in the preview picture specifically includes:
reading three-primary-color data corresponding to the plurality of mark areas;
and detecting whether a shielding area exists in the preview picture according to the three primary color data.
4. The camera viewing angle cropping method according to claim 3, wherein the plurality of marked areas are four scale lines, one end of each of the four scale lines is located at a geometric center of the preview screen, and the other end of each of the four scale lines is located at each of four vertex points of the preview screen,
the step of reading the three primary color data corresponding to the plurality of mark regions specifically includes:
reading three-primary-color data of the scale points on the four scale marks;
the detecting whether a shielding area exists in the preview picture according to the three primary color data specifically includes:
calculating the tristimulus values of the scale points according to the three primary color data;
for each scale mark, sequentially judging whether the difference between the three primary color values of two adjacent scale points on the scale mark is greater than a first preset threshold value;
when the difference between the tristimulus values of two adjacent scale points on the scale mark is larger than the first preset threshold value, taking the scale point close to the geometric center of the two adjacent scale points as a shielding critical point;
and determining a shielding area in the preview picture according to the shielding critical point.
5. The camera view clipping method according to claim 4, wherein the step of determining the target rectangular region according to the masked region and the preset aspect ratio specifically includes:
determining a rectangular region set to be selected according to the shielding critical point, the shielding region and a preset length-width ratio, wherein the rectangular regions to be selected in the rectangular region set to be selected correspond to the shielding critical point one by one;
and calculating the area of the rectangular region to be selected in the rectangular region set to be selected, and screening the rectangular region to be selected with the largest area as a target rectangular region.
6. The method for cropping a camera view angle according to claim 3, wherein said plurality of marked regions are four corner regions, said four corner regions being located at four corners of said preview screen,
the step of reading the three primary color data corresponding to the plurality of mark regions specifically includes:
reading three primary color data of pixel points in the four vertex angle areas;
the step of detecting whether a mask area exists in the preview picture according to the three primary color data specifically includes:
calculating the tristimulus values of the pixel points according to the three primary color data;
judging whether the tristimulus values of the pixel points are smaller than a second preset threshold value or not;
if so, taking the pixel point as a black pixel point;
and counting the number of the black pixel points in each vertex angle area, and taking the vertex angle area with the maximum number of the black pixel points as the shielding area.
7. The method for cropping the angle of view of a camera according to claim 6, wherein the step of determining a target rectangular region according to the masked region and a preset aspect ratio specifically comprises:
traversing the pixel points in the shielding region to obtain a boundary pixel point set between the black pixel point and other pixel points in the shielding region;
fitting based on the boundary pixel point set to obtain a boundary line, and taking an intersection point of the boundary line and a diagonal line of the preview picture as a reference point;
and determining a target rectangular area according to the reference point, the shielding area and a preset length-width ratio.
8. The method for cropping a camera view angle according to claim 1, wherein after the step of detecting whether a masked region exists in a preview picture taken by the camera, the method further comprises:
if the shielding area exists in the preview picture, determining the current visual angle of the camera according to the shielding area;
judging whether the current visual angle is larger than a preset visual angle or not;
and if so, executing the step of cutting the visual angle of the camera according to the shielding area.
9. A camera visual angle cutting device, characterized in that it is applied to a mobile terminal and comprises:
the mobile terminal comprises a receiving module, a judging module and a judging module, wherein the receiving module is used for receiving a shooting request of a preset application, and the preset application is an application program which is installed on the mobile terminal in advance and used for testing the visual angle of a camera;
the starting module is used for starting a camera of the mobile terminal according to the shooting request;
the detection module is used for detecting whether a shielding area exists in a preview picture shot by the camera, wherein the shielding area is formed by shielding a photosensitive area of the camera by a camera light inlet part;
and the cutting module is used for cutting the visual angle of the camera according to the shielding area when the shielding area exists in the preview picture.
10. A computer-readable storage medium having stored thereon a plurality of instructions adapted to be loaded by a processor to perform the camera perspective cropping method of any of claims 1-8.
CN201910324884.9A 2019-04-22 2019-04-22 Camera visual angle cutting method and device and storage medium Active CN110086987B (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN201910324884.9A CN110086987B (en) 2019-04-22 2019-04-22 Camera visual angle cutting method and device and storage medium

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
CN201910324884.9A CN110086987B (en) 2019-04-22 2019-04-22 Camera visual angle cutting method and device and storage medium

Publications (2)

Publication Number Publication Date
CN110086987A CN110086987A (en) 2019-08-02
CN110086987B true CN110086987B (en) 2021-01-05

Family

ID=67416033

Family Applications (1)

Application Number Title Priority Date Filing Date
CN201910324884.9A Active CN110086987B (en) 2019-04-22 2019-04-22 Camera visual angle cutting method and device and storage medium

Country Status (1)

Country Link
CN (1) CN110086987B (en)

Families Citing this family (2)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN110691235B (en) * 2019-09-11 2021-02-23 华为技术有限公司 Detection method, device and equipment
CN113301250A (en) * 2021-05-13 2021-08-24 Oppo广东移动通信有限公司 Image recognition method and device, computer readable medium and electronic equipment

Citations (5)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN105144703A (en) * 2012-11-30 2015-12-09 派尔高公司 Window blanking for pan/tilt/zoom camera
CN105657274A (en) * 2016-02-29 2016-06-08 广东欧珀移动通信有限公司 Control method, control device and electronic device
CN107024828A (en) * 2017-03-29 2017-08-08 深圳市未来媒体技术研究院 A kind of camera system for being total to photocentre, panoramic seamless splicing component and method
CN107968859A (en) * 2017-12-26 2018-04-27 广东欧珀移动通信有限公司 Export module and electronic device
CN108288264A (en) * 2017-12-26 2018-07-17 横店集团东磁有限公司 A kind of dirty test method of wide-angle camera module

Family Cites Families (6)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US7424218B2 (en) * 2005-07-28 2008-09-09 Microsoft Corporation Real-time preview for panoramic images
US9137444B2 (en) * 2011-09-26 2015-09-15 Sony Corporation Image photography apparatus for clipping an image region
KR20180040451A (en) * 2016-10-12 2018-04-20 엘지전자 주식회사 Mobile terminal and operating method thereof
CN106815846B (en) * 2016-12-30 2019-07-02 歌尔科技有限公司 The dark angle of image cuts method of determining range, image vignetting compensation method and device
JP6938237B2 (en) * 2017-06-21 2021-09-22 キヤノン株式会社 Information processing equipment, information processing methods and programs
CN108833752A (en) * 2018-06-29 2018-11-16 上海与德通讯技术有限公司 Picture Generation Method and electronic equipment

Patent Citations (5)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN105144703A (en) * 2012-11-30 2015-12-09 派尔高公司 Window blanking for pan/tilt/zoom camera
CN105657274A (en) * 2016-02-29 2016-06-08 广东欧珀移动通信有限公司 Control method, control device and electronic device
CN107024828A (en) * 2017-03-29 2017-08-08 深圳市未来媒体技术研究院 A kind of camera system for being total to photocentre, panoramic seamless splicing component and method
CN107968859A (en) * 2017-12-26 2018-04-27 广东欧珀移动通信有限公司 Export module and electronic device
CN108288264A (en) * 2017-12-26 2018-07-17 横店集团东磁有限公司 A kind of dirty test method of wide-angle camera module

Also Published As

Publication number Publication date
CN110086987A (en) 2019-08-02

Similar Documents

Publication Publication Date Title
EP3370204B1 (en) Method for detecting skin region and device for detecting skin region
US10510136B2 (en) Image blurring method, electronic device and computer device
CN105828068A (en) Method and device for carrying out occlusion detection on camera and terminal device
CN109002787B (en) Image processing method and device, storage medium and electronic equipment
US20160112701A1 (en) Video processing method, device and system
CN103871051A (en) Image processing method, device and electronic equipment
CN111083386B (en) Image processing method and electronic device
CN111031178A (en) Video stream clipping method and electronic equipment
CN106204552A (en) The detection method of a kind of video source and device
US10636122B2 (en) Method, device and nonvolatile computer-readable medium for image composition
CN110086987B (en) Camera visual angle cutting method and device and storage medium
CN109561255B (en) Terminal photographing method and device and storage medium
CN109639981B (en) Image shooting method and mobile terminal
CN108628534B (en) Character display method and mobile terminal
CN114143523A (en) Brightness adjusting method and device, projection equipment and storage medium
CN110740265B (en) Image processing method and terminal equipment
CN112489082A (en) Position detection method, position detection device, electronic equipment and readable storage medium
CN113676663B (en) Camera white balance adjustment method and device, storage medium and terminal equipment
CN105184750A (en) Method and device of denoising real-time video images on mobile terminal
CN111064886B (en) Shooting method of terminal equipment, terminal equipment and storage medium
CN110933305B (en) Electronic equipment and focusing method
CN116363174A (en) Parameter calibration method, storage medium, co-processing chip and electronic equipment
CN107194363B (en) Image saturation processing method and device, storage medium and computer equipment
CN112489104B (en) Ranging method, ranging device, electronic equipment and readable storage medium
CN112468725B (en) Photo shooting method and device, storage medium and mobile terminal

Legal Events

Date Code Title Description
PB01 Publication
PB01 Publication
SE01 Entry into force of request for substantive examination
SE01 Entry into force of request for substantive examination
GR01 Patent grant
GR01 Patent grant