Detailed Description
In order to make the objects, technical solutions and advantages of the present application more apparent, the present application will be further described in detail with reference to the accompanying drawings and examples. It should be understood that the specific embodiments described herein are for purposes of illustration only and are not intended to limit the present application.
The present application provides a method for capturing an evidence image. It should be noted that the method for capturing an evidence image provided by the present application applies to the field investigation of any kind of natural resource, including but not limited to mountains, rivers, forests, animal habitats, and points of interest.
In addition, the present application does not limit the execution subject of the method for capturing an evidence image. Optionally, the execution subject may be a capturing system for the evidence image. Specifically, the execution subject may be a processing terminal in the capturing system of the evidence image.
As shown in fig. 1, in an embodiment of the present application, the method for capturing the evidence image includes the following steps S100 to S620:
S100, establishing a three-dimensional space coordinate system.
Specifically, the purpose of the present application is to display the camera shooting range (i.e., the range covered by the camera preview window) on the display interface of the mobile terminal in linkage with the camera. Therefore, a three-dimensional space coordinate system is first established as a reference.
S200, acquiring camera parameters and shooting azimuth parameters, and calculating coordinates of all vertexes of the first graph according to the camera parameters and the shooting azimuth parameters; the first graph is a graph mapped by a camera shooting range in a three-dimensional space coordinate system.
Specifically, the camera parameters are hardware parameters of the camera itself, which may include one or more of aperture size, lens focal length, camera blind zone data.
The shooting azimuth parameter is a parameter related to the position and the angle of the camera. The shooting azimuth parameters may include one or more of camera placement point coordinates, azimuth angle, shooting spatial position, and pitch angle.
Please refer to fig. 2. The camera placement point is point O. After the coordinates of point O are determined, a ray OE is drawn with point O as the starting point. The drawing direction of the ray OE is determined by the azimuth angle, since the azimuth angle determines the shooting direction of the camera. Assuming that due north is the reference direction, if the azimuth angle is 0 degrees, then OE points due north, as shown in fig. 2. Two line segments are perpendicular to the ray OE: a near line segment CD and a far line segment AB, which correspond to the closest and farthest distances the camera can capture. The lengths of the near line segment CD and the far line segment AB are related to the camera blind-zone data.
The length FE, i.e., the distance between the near line segment CD and the far line segment AB, can then be derived.
For example, if the CD length is 10 meters and the AB length is 80 meters, the FE length is 70 meters. It should be noted that in actual shooting, the actual length of FE is affected by the magnitude of the pitch angle, which tilts the camera lens up or down. The applicant has verified through a limited number of experiments that when the pitch angle is 0 degrees, the actual length of FE is 70 meters, and when the pitch angle is 45 degrees, the actual length of FE becomes 10 meters. Thus: actual length of FE = (actual length of FE at a pitch angle of 0 degrees) − (4/3) × pitch angle (in degrees).
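The pitch-angle correction above can be sketched as follows (an illustrative sketch; the function name is hypothetical, while the 4/3 coefficient and the 70-meter baseline are the example values given in the text):

```python
def effective_fe_length(fe_at_zero_pitch_m: float, pitch_deg: float) -> float:
    """Actual FE length = FE length at 0-degree pitch - (4/3) * pitch angle."""
    return fe_at_zero_pitch_m - (4.0 / 3.0) * pitch_deg

print(effective_fe_length(70.0, 0))   # 70.0, matching the 0-degree experiment
print(effective_fe_length(70.0, 45))  # ~10.0, matching the 45-degree experiment
```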
Further, the coordinates of point A and point B may be calculated according to the actual length of FE. According to the camera blind-zone data and the coordinates of point O, the length of segment OF can be obtained, and finally the coordinates of points C and D can be calculated.
At this time, the coordinates of points A, B, C and D have all been obtained, and the closed figure enclosed by them is the first graph, that is, the graph onto which the camera shooting range is mapped in the three-dimensional space coordinate system.
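Under the simplifying assumption of a flat ground plane and an isosceles trapezoid symmetric about the ray OE, the vertex calculation of S200 can be sketched as follows (function and parameter names are illustrative; the widths of CD and AB are taken as inputs here rather than derived from the blind-zone data):

```python
import math

def first_graph_vertices(o, azimuth_deg, near_m, far_m, near_width_m, far_width_m):
    """Sketch of S200: map the camera range onto the plane as trapezoid ABDC.

    o                -- (x, y) coordinates of the camera placement point O
    azimuth_deg      -- azimuth angle, 0 degrees = due north (+y axis)
    near_m / far_m   -- closest / farthest distances the camera covers
    near/far_width_m -- lengths of line segments CD and AB (assumed inputs)
    """
    ox, oy = o
    rad = math.radians(azimuth_deg)
    dx, dy = math.sin(rad), math.cos(rad)   # unit vector along ray OE
    px, py = dy, -dx                        # unit vector perpendicular to OE

    def offset(dist, half_w, side):
        # point at `dist` along OE, shifted `half_w` to one side of the axis
        return (ox + dx * dist + side * px * half_w,
                oy + dy * dist + side * py * half_w)

    a = offset(far_m, far_width_m / 2, -1)
    b = offset(far_m, far_width_m / 2, +1)
    c = offset(near_m, near_width_m / 2, -1)
    d = offset(near_m, near_width_m / 2, +1)
    return a, b, c, d
```

With O at the origin, azimuth 0 degrees, near/far distances 10 m and 80 m, and widths 10 m and 80 m (the example values above), the far edge AB lies 80 m due north of O.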
S300, importing the coordinates of all the vertices of the first graph into a GIS map of the mobile terminal so as to display the first graph in the GIS map. The outer border of the first graph is displayed as a solid border of a first color.
Specifically, the mobile terminal of this embodiment displays the first graph through a GIS (Geographic Information System) map. The coordinates of points A, B, C and D are obtained in step S200, and these four coordinates are imported into the GIS map of the mobile terminal so as to display the first graph in the GIS map. To distinguish the first graph from other graphs introduced later, this step also displays the outer border of the first graph as a solid border of a first color. The first color may be white.
S400, importing the coordinates of all the vertices of the pattern spot to be demonstrated into the GIS map of the mobile terminal, so as to display the pattern spot to be demonstrated in the GIS map. The outer border of the pattern spot to be demonstrated is displayed as a solid border of a second color.
Specifically, the coordinates of all the vertices of the pattern spot to be demonstrated are known and are imported into the GIS map of the mobile terminal so as to display the pattern spot in the GIS map. To distinguish it from the first graph, the outer border of the pattern spot to be demonstrated is displayed as a solid border of the second color. The second color may be red.
S500, calculating the overlapping area of the first graph and the pattern spot to be demonstrated, and judging whether this overlapping area is smaller than a first preset percentage of the area of the first graph.
Specifically, with the coordinates of all the vertices of both the pattern spot to be demonstrated and the first graph known, the overlapping area of the two can be calculated, and it can be judged whether this overlapping area is smaller than a first preset percentage of the area of the first graph. The first preset percentage may be 10%.
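One way to realize this overlap test for convex polygons is Sutherland–Hodgman clipping combined with the shoelace area formula (an assumed implementation choice; the application does not prescribe a particular algorithm, and all names here are illustrative):

```python
def clip(subject, clip_poly):
    """Sutherland-Hodgman clipping of two convex polygons (CCW vertex order)."""
    def inside(p, a, b):
        # p is on the interior (left) side of directed edge a->b
        return (b[0]-a[0])*(p[1]-a[1]) - (b[1]-a[1])*(p[0]-a[0]) >= 0

    def intersect(p1, p2, a, b):
        # intersection of line p1-p2 with the infinite line through a-b
        x1, y1, x2, y2 = p1[0], p1[1], p2[0], p2[1]
        x3, y3, x4, y4 = a[0], a[1], b[0], b[1]
        denom = (x1-x2)*(y3-y4) - (y1-y2)*(x3-x4)
        t = ((x1-x3)*(y3-y4) - (y1-y3)*(x3-x4)) / denom
        return (x1 + t*(x2-x1), y1 + t*(y2-y1))

    output = list(subject)
    for i in range(len(clip_poly)):
        a, b = clip_poly[i], clip_poly[(i+1) % len(clip_poly)]
        input_list, output = output, []
        for j in range(len(input_list)):
            p, q = input_list[j], input_list[(j+1) % len(input_list)]
            if inside(q, a, b):
                if not inside(p, a, b):
                    output.append(intersect(p, q, a, b))
                output.append(q)
            elif inside(p, a, b):
                output.append(intersect(p, q, a, b))
        if not output:
            return []
    return output

def area(poly):
    """Polygon area by the shoelace formula."""
    return abs(sum(poly[i][0]*poly[(i+1) % len(poly)][1] -
                   poly[(i+1) % len(poly)][0]*poly[i][1]
                   for i in range(len(poly)))) / 2

def overlap_too_small(first, spot, threshold=0.10):
    """S500 judgment: True if overlap(first, spot) < threshold * area(first)."""
    inter = clip(first, spot)
    overlap = area(inter) if len(inter) >= 3 else 0.0
    return overlap < threshold * area(first)
```

For example, a 2×2 first graph overlapping a shifted 2×2 pattern spot by a unit square gives an overlap ratio of 25%, which passes the 10% threshold.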
S610, if the overlapping area of the first graph and the pattern spot to be demonstrated is greater than or equal to the first preset percentage of the area of the first graph, controlling the camera to enter a photographable state.
Specifically, as shown in fig. 3, if the overlapping area of the first graph and the pattern spot to be demonstrated is greater than or equal to the first preset percentage of the area of the first graph, it is determined that the shooting condition is met, and the camera is controlled to enter the photographable state.
S620, controlling the camera to execute the shooting action, and taking the captured image as the evidence image.
Specifically, the shooting action may be performed by controlling the shutter key of the camera to be pressed. Alternatively, a shooting button on the GIS map of the user terminal may be set to a "clickable" state, so that the user may press the shooting button, trigger the shutter key of the camera, and perform the shooting action. At this point, the capture of the evidence image is completed.
In this embodiment, the camera shooting range is converted into the first graph and mapped onto the GIS map. By calculating the overlapping area of the first graph and the pattern spot to be demonstrated, the spatial position relationship between the camera shooting range and the pattern spot is determined, which in turn determines whether to execute the shooting action. This increases the validity of the evidence image and greatly improves the efficiency of capturing evidence images.
In an embodiment of the present application, after S500, the method for capturing the proof image further includes the following S710 to S720:
S710, if the overlapping area of the first graph and the pattern spot to be demonstrated is smaller than the first preset percentage of the area of the first graph, controlling the camera to enter a non-photographable state.
Specifically, if the overlapping area of the first graph and the pattern spot to be demonstrated is smaller than the first preset percentage of the area of the first graph, it is determined that the shooting condition is not met, and the camera is controlled to enter the non-photographable state. This may be accomplished by having the camera data interface enter a non-callable state. Optionally, when the camera data interface enters the non-callable state, the shutter key of the camera is disabled, that is, when the shutter key receives a shooting instruction, it cannot change from the unpressed state to the pressed state.

S720, displaying a first movement prompt identifier on the GIS map. The first movement prompt identifier is used for prompting that the first graph should move closer to the pattern spot to be demonstrated.

Specifically, as shown in fig. 4, the first movement prompt identifier may be an arrow graphic pointing to the pattern spot to be demonstrated. The first movement prompt identifier may remind the user that the camera shooting range needs to be adjusted. The algorithm for the first movement prompt identifier may be: obtain the physical center point of the first graph and the physical center point of the pattern spot to be demonstrated, connect the two center points with a straight line, translate the line to one side, and add an arrow pointing to the pattern spot to be demonstrated.
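The center-to-center construction of the first movement prompt identifier might be sketched as follows (the vertex-average centroid used here is a simplification of the "physical center point" in the text; function and variable names are illustrative):

```python
import math

def movement_hint_arrow(first_poly, spot_poly):
    """Sketch of the first movement prompt: an arrow from the center of the
    first graph toward the center of the pattern spot to be demonstrated.
    Returns the arrow's start point and its unit direction vector."""
    def centroid(poly):
        # vertex average as a simple stand-in for the physical center point
        n = len(poly)
        return (sum(x for x, _ in poly) / n, sum(y for _, y in poly) / n)

    (sx, sy), (tx, ty) = centroid(first_poly), centroid(spot_poly)
    dx, dy = tx - sx, ty - sy
    length = math.hypot(dx, dy)
    if length == 0:
        return (sx, sy), (0.0, 0.0)   # centers coincide: no hint needed
    return (sx, sy), (dx / length, dy / length)
```

The UI layer would then translate this arrow to one side and render it, as described above.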
In this embodiment, when the overlapping area of the first graph and the pattern spot to be demonstrated is smaller than the first preset percentage of the area of the first graph, the camera is controlled to enter the non-photographable state, preventing the user from capturing meaningless pictures; meanwhile, displaying the first movement prompt identifier on the GIS map prompts the user to adjust the camera shooting range.
In an embodiment of the present application, S710 includes the following steps:
S711, controlling the virtual shooting button displayed in the GIS map to enter a non-clickable state.
Specifically, when the camera enters the non-photographable state, the shooting button on the GIS map of the user terminal may optionally be set to a "non-clickable" state. Entering the non-clickable state blocks the button-trigger function, that is, the action of pressing the camera shutter key cannot be triggered.
In this embodiment, when the overlapping area of the first graph and the pattern spot to be demonstrated is smaller than the first preset percentage of the area of the first graph, the virtual shooting button displayed in the GIS map is controlled to enter the non-clickable state. This prevents the camera from continuing to shoot and, at the same time, indicates to the user on the GIS map that shooting is not possible.
In an embodiment of the present application, after S710, the method further includes the following step:

S730, controlling the outer border of the first graph displayed in the GIS map to change from the solid border of the first color to a dashed border of a third color.
Specifically, the purpose of this step is also to enhance the prompting effect of the camera entering the non-photographable state. As shown in fig. 4, the outer border of the first graph is a dashed border.
In an embodiment of the present application, after S620, the following S631 to S632 are further included:
S631, after the capture of the evidence image is completed, acquiring the camera parameters, shooting azimuth parameters, and shooting time node of the evidence image.
S632, storing the evidence image, the shooting time node, the camera parameters of the evidence image, and the shooting azimuth parameters of the evidence image correspondingly in the server.
Specifically, after the capture of an evidence image is finished, this step stores the camera parameters, shooting azimuth parameters, shooting time node, and the evidence image itself as a mapping, which facilitates subsequent comparison and processing.
Optionally, the evidence image, the shooting time node, the camera parameter of the evidence image and the shooting azimuth parameter of the evidence image may be stored in a storage medium of a mobile device in communication with the camera, and then uploaded to the server.
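The mapping stored in S632 might look like the following record, keyed by the shooting time node that S810 later uses as an index (all field names and example values are illustrative, not part of the described method):

```python
import json
import time

def build_evidence_record(image_path, camera_params, azimuth_params,
                          shot_time=None):
    """Sketch of S632: bundle the evidence image with its camera parameters,
    shooting azimuth parameters, and shooting time node under one mapping,
    ready to be stored locally and then uploaded to the server."""
    shot_time = shot_time if shot_time is not None else time.time()
    return {
        "shot_time": shot_time,            # index used later in S810
        "image": image_path,
        "camera_params": camera_params,    # e.g. focal length, blind-zone data
        "azimuth_params": azimuth_params,  # e.g. point O, azimuth, pitch angle
    }

record = build_evidence_record("evidence_0001.jpg",
                               {"focal_length_mm": 35},
                               {"azimuth_deg": 0, "pitch_deg": 0},
                               shot_time=1700000000)
print(json.dumps(record, sort_keys=True))
```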
In an embodiment of the present application, the method further includes the following S810 to S851:
S810, acquiring the camera parameters and shooting azimuth parameters of the evidence image of the previous shooting time node from the server.
Specifically, since the shooting time node of an evidence image is stored after its capture is completed, the camera parameters and shooting azimuth parameters of the evidence image of the previous shooting time node can be obtained from the server by using the shooting time node as an index.

S820, calculating the coordinates of all the vertices of the second graph according to the camera parameters and shooting azimuth parameters of the evidence image of the previous shooting time node. The second graph is the graph onto which the camera shooting range of the evidence image of the previous shooting time node is mapped in the three-dimensional space coordinate system.
Specifically, the working principle of this step is consistent with S200, and will not be described herein.
S830, the coordinates of all vertexes of the second graph are imported into a GIS map of the mobile terminal, so that the second graph is displayed in the GIS map; the outer frame of the second graph is displayed as a solid line frame of a fourth color.
Specifically, the working principle of this step is consistent with S300 and will not be repeated here. To distinguish the second graph from the first graph, the outer border of the second graph is displayed as a solid border of a fourth color. Optionally, the fourth color may be blue.
S840, calculating the overlapping area of the first graph and the second graph, and judging whether the overlapping area of the first graph and the second graph is smaller than a second preset percentage of the area of the first graph.
Specifically, the working principle of this step is consistent with S500. However, the comparison here is between the graph onto which the camera shooting range of the current shooting time node is mapped in the three-dimensional space coordinate system (i.e., the first graph) and the graph onto which the camera shooting range of the previous shooting time node is mapped (i.e., the second graph), so as to prevent repeated shooting and improve the efficiency of capturing evidence images.
S851, if the overlapping area of the first graph and the second graph is smaller than a second preset percentage of the area of the first graph, confirming that the camera is not shooting repeatedly, controlling the camera to enter the photographable state, and executing S620.
Specifically, the second preset percentage may be set to 80%. If the overlapping area of the first graph and the second graph is smaller than the second preset percentage of the area of the first graph, it is confirmed that the camera is not shooting repeatedly; the camera can then be controlled to enter the photographable state and capture a new evidence image.
Specifically, S500 to S620 may be performed first and S810 to S851 afterwards, or S810 to S851 first and S500 to S620 afterwards. However, S810 to S851 must be performed after S400; that is, the coordinates of all the vertices of the first graph must be calculated and the first graph displayed on the GIS map before the coordinates of all the vertices of the second graph can be calculated and the overlapping area of the first graph and the second graph compared.
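Whatever the execution order, the judgments of S500 and S840 combine into a single shooting decision, which can be sketched as follows (threshold values are the examples from the text; function and parameter names are illustrative):

```python
def photographable(first_spot_overlap, first_area, prev_overlap,
                   first_pct=0.10, second_pct=0.80):
    """Combine S500 and S840: the camera may shoot only if the first graph
    overlaps the pattern spot enough AND does not overlap the previous
    shot's range (the second graph) too much."""
    covers_spot = first_spot_overlap >= first_pct * first_area   # S610 branch
    not_repeat = prev_overlap < second_pct * first_area          # S851 branch
    return covers_spot and not_repeat
```

For a first graph of area 4.0, an overlap of 1.0 with the pattern spot satisfies the 10% condition, while an overlap of 3.5 with the second graph exceeds the 80% bound and forces the non-photographable state.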
In this embodiment, the camera parameters and shooting azimuth parameters of the evidence image of the previous shooting time node are obtained from the server, the coordinates of all the vertices of the second graph are calculated, the overlapping area of the first graph and the second graph is calculated, and it is judged whether this overlapping area is smaller than the second preset percentage of the area of the first graph. In this way, it can be judged whether the camera is shooting repeatedly, and the user can be guided to capture evidence images more reasonably and efficiently.
In an embodiment of the present application, after S840, the following S852 to S853 are further included:
S852, if the overlapping area of the first graph and the second graph is greater than or equal to the second preset percentage of the area of the first graph, confirming that the camera is shooting repeatedly, and executing S710.
Specifically, if the overlapping area of the first graph and the second graph is greater than or equal to the second preset percentage of the area of the first graph, repeated shooting is confirmed, S710 is executed, and the camera is controlled to enter the non-photographable state. The camera shooting range then needs to be adjusted to prevent repeated capture of the same evidence image.
S853, displaying a second movement prompt identifier on the GIS map, wherein the second movement prompt identifier is used for prompting that the shooting range should move away from the second graph and closer to the pattern spot to be demonstrated.
Specifically, as shown in fig. 5, the second movement prompt identifier may be a curved arrow leading away from the second graph and toward the pattern spot to be demonstrated. The second movement prompt identifier may remind the user that the camera shooting range needs to be adjusted. In fig. 5, the regular pentagon with a solid border is the pattern spot to be demonstrated. The triangle with a solid border is the first graph; it is displayed with a solid border because the overlapping area of the first graph and the pattern spot to be demonstrated is greater than or equal to the first preset percentage. The triangle with a dashed border is the second graph; because the overlapping area of the first graph and the second graph is greater than or equal to the second preset percentage of the area of the first graph, repeated shooting is confirmed and the second graph is displayed with a dashed border.
In this embodiment, when the overlapping area of the first graph and the second graph is greater than or equal to the second preset percentage of the area of the first graph, the camera is controlled to enter the non-photographable state, preventing the user from capturing meaningless repeated pictures; meanwhile, displaying the second movement prompt identifier on the GIS map prompts the user to adjust the camera shooting range.
In an embodiment of the present application, after S620, the following S641 to S643 are further included:
S641, selecting a tracking point in the first graph.
S642, calculating the coordinates of the tracking points in the GIS map according to the camera parameters.
S643, mapping the coordinates of the tracking point in the GIS map to the camera shooting range, and displaying the tracking point in the camera shooting preview interface.
Specifically, the purpose of this embodiment is to achieve point-to-point mapping between the camera shooting range and the GIS map of the user terminal. The ultimate goal is that when any point on the GIS map is clicked as the tracking point, if the point is within the camera shooting range, it can be displayed in the camera preview window, assisting the user in judging the validity of the tracking point.
The specific implementation is as follows:
As shown in fig. 6: 1) define point T as the key point of interest. 2) With the coordinates of the four points A, B, C and D known (inheriting the algorithm for calculating them in the foregoing embodiment, not repeated here), and taking the upper-left corner A as the reference point, when the tracking point T is clicked within the camera shooting range (i.e., the trapezoid in fig. 6), the length of the line segment AT can be calculated from the distance between the two points; combined with the angle value of ∠BAT, the length of the line segment ET can then be obtained using trigonometric functions and vector algorithms. The distance ET is not measured directly because the vector algorithm is more convenient. Similarly, the length of the line segment FD can be obtained from the length of the line segment AD and the angle value of ∠BAD, so the longitudinal ratio of point T in the trapezoid is YRate = ET/FD.
3) Based on the longitudinal ratio YRate of point T in the trapezoid, the coordinates of points H and I can be determined, and the transverse ratio of the click point T in the trapezoid can be calculated as XRate = HT/HI (with the upper-left corner as the reference point);
4) According to the longitudinal ratio YRate and the transverse ratio XRate, the position of the click point T can be conveniently located in the GIS map of the user terminal (i.e., the rectangle shown in fig. 7).
Therefore, any point on the GIS map of the user terminal can be clicked, and if the point is in the shooting range of the camera, the point can be displayed in the preview window of the camera, so that the user is assisted in judging the effectiveness of the tracking point.
Conversely, if a point is clicked in the camera preview window (the point then being within the camera shooting range), the point can be displayed on the GIS map of the user terminal, likewise assisting the user in judging the validity of the tracking point.
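Under the simplifying assumption of an axis-aligned, symmetric trapezoid (axis along x = 0, far edge AB at far_y, near edge CD at near_y), the ratio computation of steps 2) to 4) can be sketched as follows (a simplification of the trigonometric/vector construction in the text; names are illustrative):

```python
def trapezoid_ratios(t, near_y, far_y, near_half_w, far_half_w):
    """Steps 2)-3): ratios of tracking point T inside the trapezoid.

    The trapezoid is assumed axis-aligned and symmetric about x = 0, with
    the far edge AB at y = far_y and the near edge CD at y = near_y.
    Returns (YRate, XRate), measured from the upper-left reference corner.
    """
    tx, ty = t
    y_rate = (far_y - ty) / (far_y - near_y)           # 0 at AB, 1 at CD
    # half-width of the chord HI through T, interpolated between AB and CD
    half_w = far_half_w + (near_half_w - far_half_w) * y_rate
    x_rate = (tx + half_w) / (2 * half_w)              # 0 at H, 1 at I
    return y_rate, x_rate

def to_gis_rect(y_rate, x_rate, rect_w, rect_h):
    """Step 4): locate T in the rectangular GIS view of the user terminal."""
    return (x_rate * rect_w, y_rate * rect_h)
```

With the example trapezoid above (near edge at 10 m with half-width 5 m, far edge at 80 m with half-width 40 m), a click on the axis midway between the edges maps to the center of the GIS rectangle.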
In an embodiment of the present application, when the camera enters the photographable state, the outer frame of the first graphic is controlled to be changed from the solid line frame of the first color to the dotted line frame of the fifth color.
Specifically, in S851, if the overlapping area of the first graph and the second graph is smaller than the second preset percentage of the area of the first graph, it is determined that the camera is not shooting repeatedly. When, at the same time, the overlapping area of the first graph and the pattern spot to be demonstrated is greater than or equal to the first preset percentage, that is, when both shooting conditions are satisfied, the camera is controlled to enter the photographable state. Before S620 is performed, the method further includes controlling the outer border of the first graph to change from the solid border of the first color to a dashed border of a fifth color. The fifth color may be yellow.
Fig. 5 likewise illustrates this case: the regular pentagon with a solid border is the pattern spot to be demonstrated, the triangle with a solid border is the first graph, and the triangle with a dashed border is the second graph. The fifth color itself, however, cannot be represented in fig. 5.
After the outer border of the first graph is controlled to change from the solid border of the first color to the dashed border of the fifth color, S620 is performed to capture the evidence image.

In this embodiment, the outer border of the first graph is controlled to change from the solid border of the first color to the dashed border of the fifth color, so that while the camera shooting range is shown to the user, the user is also informed that both shooting conditions are satisfied: the range overlaps the pattern spot to be demonstrated, and the shot is not a repeat.
The present application further provides a capturing system for the evidence image.
As shown in fig. 8, in an embodiment of the present application, the capturing system for the evidence image includes a camera 100, a processing terminal 200, a mobile terminal 300, and a server 400. The processing terminal 200 is communicatively connected to the camera 100 and is used to perform the method for capturing the evidence image described above. The mobile terminal 300 and the server 400 are each communicatively connected to the processing terminal 200.
The technical features of the above embodiments may be combined arbitrarily, and the steps of the method are not limited to the described execution sequence. For brevity, not all possible combinations of the technical features in the above embodiments are described; however, as long as there is no contradiction among the combined technical features, they should be considered within the scope of this specification.
The above examples only represent a few embodiments of the present application, which are described in more detail and are not to be construed as limiting the scope of the present application. It should be noted that it would be apparent to those skilled in the art that various modifications and improvements could be made without departing from the spirit of the present application, which would be within the scope of the present application. Accordingly, the scope of protection of the present application shall be subject to the appended claims.