SUMMARY OF THE UTILITY MODEL
The technical problem to be solved by the utility model is to address the above-mentioned defect of the prior art, in which follow shooting requires a large amount of manpower and material resources, by providing a three-dimensional automatic positioning and following shooting system that saves manpower and material resources while ensuring the controllability of the output picture.
The technical solution adopted by the utility model to solve its technical problem is to construct a three-dimensional automatic positioning and following shooting system, which comprises an aircraft and a main camera arranged on the aircraft, wherein the three-dimensional automatic positioning and following shooting system further comprises:
an auxiliary camera arranged on the aircraft and used for capturing an image of a shooting area, wherein a reference point on a target object in the shooting area carries light of a preset wavelength band;
a narrow-band filter arranged in front of the auxiliary camera;
a distance measuring module arranged on the aircraft and used for measuring the distance to the target object in real time;
a control module arranged on the aircraft and connected with the auxiliary camera, the distance measuring module and the main camera; moreover,
the distance measuring module, the auxiliary camera and the main camera are arranged in close proximity, and the central axes of the three are parallel.
In the three-dimensional automatic positioning and following shooting system, the system further comprises:
a narrow-band light source disposed on a reference point of the target object; or,
a reflector arranged on a reference point of the target object, the reference point being located within the irradiation range of a narrow-band light source.
In the three-dimensional automatic positioning and following shooting system, the ranging module is at least one of the following: a laser ranging module, an ultrasonic ranging module, or a microwave ranging module.
In the three-dimensional automatic positioning and following shooting system, the system further comprises an auxiliary display module arranged on a remote monitoring device and connected with the control module in a wireless manner.
In the three-dimensional automatic positioning and following shooting system, the system further comprises a main display module arranged on a remote monitoring device and connected with the control module in a wireless manner.
By implementing the technical solution of the utility model, since the narrow-band filter is arranged in front of the auxiliary camera, only the light of the preset wavelength band on the reference point passes through when the auxiliary camera shoots, so that only the image of the reference point is captured in the picture. Meanwhile, the ranging module measures the distance to the target object. Therefore, the position of the target object can be determined from the position of the reference point image in the captured picture together with the measured distance value, and when the target object moves in three-dimensional space, the aircraft can be controlled to move so that it follows the target object in three-dimensional space at all times, enabling the main camera on the aircraft to automatically follow and shoot the target object. For example, when a user is skiing or taking a self-portrait, follow shooting with the three-dimensional automatic positioning and following shooting system of the utility model saves manpower and material resources while ensuring the controllability of the output picture.
Detailed Description
Fig. 1 is a logic diagram of the first embodiment of the three-dimensional automatic positioning and following shooting system of the utility model. The three-dimensional automatic positioning and following shooting system of this embodiment includes an aircraft (not shown), a control module 11 disposed on the aircraft, a main camera 12, a distance measuring module 15, an auxiliary camera 13, and a narrow-band filter 14 disposed in front of the auxiliary camera 13. Moreover, the auxiliary camera 13, the distance measuring module 15 and the main camera 12 are disposed in close proximity with their central axes parallel, so that the shooting ranges of the auxiliary camera 13 and the main camera 12 are approximately consistent.
In this embodiment, the auxiliary camera 13 is used to capture an image of a shooting area, in which a reference point on a target object (such as a person, an animal, or a car) carries light of a preset wavelength band.
In this embodiment, the distance measuring module 15 is used for measuring the distance to the target object in real time, and may be, for example, a laser ranging module, an ultrasonic ranging module, or a microwave ranging module.
In this embodiment, the control module 11 is configured to process the image captured by the auxiliary camera 13 and the distance value measured by the distance measuring module 15, determine the two-dimensional position of the target object relative to the aircraft from the processed image, determine the three-dimensional position of the target object relative to the aircraft from the processed distance value, and predict the three-dimensional position of the target object relative to the aircraft at the next moment from the three-dimensional positions determined over a period of time before the current moment; for example, the position prediction may be performed by a least-squares method or a Markov prediction method. The control module then controls the aircraft to move correspondingly, so that the current position of the aircraft and the predicted three-dimensional position conform to a preset rule and the mapping of the reference point in the processed image falls within a locking window at a specific position. It should be noted that, since a gravity sensor, a gyroscope, and an acceleration sensor are installed on the aircraft, the position and attitude information of the aircraft itself can be acquired.
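For illustration only (code of this kind forms no part of the utility model), the following Python/NumPy sketch shows one way the least-squares prediction mentioned above could be realized: a low-order polynomial is fitted to the recently determined relative positions and extrapolated to the next moment. The function name, polynomial degree, and sample values are assumptions made for the example.

```python
import numpy as np

def predict_next_position(timestamps, positions, t_next, degree=2):
    """Predict the target's next 3D position relative to the aircraft by fitting
    a low-order polynomial to recent observations (least squares).
    `positions` is an (N, 3) array of x/y/z samples taken at `timestamps`."""
    timestamps = np.asarray(timestamps, dtype=float)
    positions = np.asarray(positions, dtype=float)
    prediction = np.empty(3)
    for axis in range(3):
        # Fit each axis independently; np.polyfit solves the least-squares problem.
        coeffs = np.polyfit(timestamps, positions[:, axis], deg=degree)
        prediction[axis] = np.polyval(coeffs, t_next)
    return prediction

# Example: relative positions sampled every 0.1 s over the last second.
t = np.arange(0.0, 1.0, 0.1)
xyz = np.stack([2.0 + 1.5 * t, 0.5 * t**2, 3.0 + 0.0 * t], axis=1)
print(predict_next_position(t, xyz, t_next=1.1))
```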
Further, if the control module 11 finds that the three-dimensional position of the target object relative to the aircraft at the current moment is inconsistent with the predicted three-dimensional position, it controls the aircraft to retreat to its position at the previous moment and to perform automatic following again.
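A minimal sketch of this consistency check follows (purely illustrative; the tolerance value and function names are assumptions, not specified by the utility model).

```python
import numpy as np

def check_and_recover(measured_pos, predicted_pos, previous_pos, tolerance=0.5):
    """If the measured relative position deviates too far from the prediction,
    fall back to the position of the previous moment so following can restart.
    The 0.5 m tolerance is an assumed value for illustration only."""
    error = np.linalg.norm(np.asarray(measured_pos) - np.asarray(predicted_pos))
    if error > tolerance:
        return np.asarray(previous_pos)   # retreat to the previous position
    return np.asarray(measured_pos)       # prediction confirmed, keep following
```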
Regarding the light of the preset wavelength band on the reference point, it may be light with a wavelength greater than 760 nm (e.g., infrared light) or light with a wavelength less than 380 nm (e.g., ultraviolet light).
Furthermore, in some embodiments, the light of the reference point is emitted directly by a narrow-band light source; for example, an infrared emitting tube or an ultraviolet emitting tube is disposed on the reference point, and a corresponding fluorescent agent may also be disposed on the reference point. In other embodiments, the light of the reference point may instead be reflected by a reflector: a reflector is arranged on the reference point, a narrow-band light source is arranged elsewhere, and the reflector is located within the irradiation range of the narrow-band light source. For example, in the embodiment shown in Fig. 2, a narrow-band light source 16 is provided on the aircraft, and a reflector (not shown) is provided on the reference point of the target object, the reflector being located within the illumination range of the narrow-band light source 16.
In addition, whether the light of the preset wavelength band is emitted or reflected at the reference point of the target object, the emitted or reflected light may preferably be light with a specific flashing sequence, obtained by optically modulating the light wave of the narrow-band light source with identification information. After the auxiliary camera 13 captures the flash sequence, the control module 11 is further configured to detect the flash sequence emitted or reflected by the reference point, demodulate it to obtain the identification information, and start processing the image captured by the auxiliary camera only when the obtained identification information is determined to be consistent with the preset identification information. If the two are inconsistent, the light is treated as interference, so that interference from other light on the target object with automatic positioning and following can be avoided.
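As an illustration of how such a flash sequence might be checked against preset identification information, the sketch below assumes simple on/off keying (bright frame = 1, dark frame = 0); the modulation scheme, threshold, and sample values are assumptions, since the utility model does not specify them.

```python
def demodulate_flash_sequence(frame_brightness, threshold):
    """Turn the per-frame brightness of a candidate reference point into a bit
    string by on/off keying: a bright frame maps to '1', a dark frame to '0'."""
    return "".join("1" if b >= threshold else "0" for b in frame_brightness)

def matches_preset_id(frame_brightness, preset_id, threshold):
    """Accept the candidate point only if its demodulated flash sequence
    contains the preset identification pattern."""
    return preset_id in demodulate_flash_sequence(frame_brightness, threshold)

# Example: brightness of one candidate point over 12 consecutive frames.
samples = [210, 30, 210, 210, 25, 28, 205, 215, 30, 205, 30, 210]
print(matches_preset_id(samples, preset_id="101100", threshold=128))  # True
```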
The principle of three-dimensional automatic positioning and following is explained below with reference to the examples shown in Figs. 3A, 3B, and 3C. First, since the narrow-band filter is arranged in front of the auxiliary camera 13, only the light of the preset wavelength band on the reference point of the target object passes through when the auxiliary camera 13 shoots, so that only the image 1 of the reference point is captured in the picture, which is then processed. Meanwhile, the ranging module 15 measures the distance to the target object, and the measured distance value is then processed. The two-dimensional position of the target object relative to the aircraft at the current moment can be determined from the processed image, and the three-dimensional position of the target object relative to the aircraft can then be determined from the processed distance value. As shown in Fig. 3A, the current position of the aircraft is at point O, a three-dimensional coordinate system is established with the current position O of the aircraft as the coordinate origin, and the three-dimensional position of the target object relative to the aircraft is at point M. As the target object moves, its three-dimensional position relative to the aircraft at the next moment can be predicted from the three-dimensional positions determined over a period of time before the current moment; for example, it is predicted that the target object moves to point M' relative to the aircraft at the next moment. The aircraft is then controlled to move correspondingly (to point O') so that the current position of the aircraft and the predicted three-dimensional position (point M') conform to a preset rule, the preset rule being, for example, that the difference between the three-dimensional coordinates of the aircraft position and the target object position is a fixed value. Of course, in other embodiments, the preset rule may also be that the three-dimensional coordinate difference between the aircraft position and the target object position is a value that changes according to a specific rule.
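To make the geometry concrete, the following sketch (illustrative only, assuming a pinhole camera model with invented focal length, pixel coordinates, and offset values that the utility model does not specify) shows one way the reference point's position in the image and the measured distance could be combined into the point M of Fig. 3A, and how a constant-offset preset rule could be turned into a movement command for the aircraft.

```python
import numpy as np

def target_position_relative_to_aircraft(pixel, image_center, focal_length_px, distance):
    """Recover the target's 3D position in the aircraft-centred frame (point M in
    Fig. 3A) from the reference point's pixel coordinates and the measured range,
    assuming a simple pinhole camera model."""
    dx = (pixel[0] - image_center[0]) / focal_length_px
    dy = (pixel[1] - image_center[1]) / focal_length_px
    bearing = np.array([dx, dy, 1.0])
    bearing /= np.linalg.norm(bearing)        # unit vector toward the target
    return distance * bearing

def aircraft_move_target(target_pos, desired_offset):
    """Displacement O -> O' (in the aircraft-centred frame) that keeps the
    coordinate difference between target and aircraft at a fixed offset
    (the example preset rule)."""
    return np.asarray(target_pos) - np.asarray(desired_offset)

# Example: reference point 40 px right and 20 px above the image centre, 12 m away;
# the aircraft is required to stay 3 m behind and 2 m above the target.
M = target_position_relative_to_aircraft((360, 220), (320, 240), 800.0, 12.0)
print(aircraft_move_target(M, desired_offset=(0.0, 2.0, -3.0)))
```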
Furthermore, the aircraft is controlled to move so that the image 1 of the reference point in the processed image falls within the locking window 2 at a specific position. With reference to Figs. 3B and 3C, the locking window 2 is preset so that its center is located at the center of the image, and the locking window 2 is rectangular. Before automatic positioning and following, the image 1 of the reference point is located at the upper left of the locking window 2, as shown in Fig. 3B; at this point the aircraft is controlled to move toward the upper left until the image 1 of the reference point falls within the locking window 2, as shown in Fig. 3C. It should be noted that the above is only one embodiment of the utility model; in other embodiments, the position of the locking window 2 may be preset at other places in the image, the user may set the position of the locking window through a user interface, and the shape of the locking window 2 may also be circular, elliptical, hexagonal, or the like.
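A minimal sketch of the locking-window check described above follows (illustrative only; the window size, image resolution, and the mapping from pixel error to aircraft motion are assumptions not fixed by the utility model).

```python
def inside_locking_window(point, window_center, window_width, window_height):
    """Return True when the reference point image lies inside the rectangular
    locking window of Figs. 3B and 3C."""
    x, y = point
    cx, cy = window_center
    return abs(x - cx) <= window_width / 2 and abs(y - cy) <= window_height / 2

def pixel_error(point, window_center):
    """Pixel offset between the reference point image and the window centre; the
    control module maps this error to an aircraft motion command (the sign of
    the mapping depends on how the camera is mounted)."""
    return (window_center[0] - point[0], window_center[1] - point[1])

# Example: image 1 at the upper left of a 100 x 80 px window centred in a 640 x 480 frame.
print(inside_locking_window((250, 180), (320, 240), 100, 80))   # False -> keep adjusting
print(pixel_error((250, 180), (320, 240)))                      # (70, 60)
```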
In some cases, the light from the reference point on the target object may produce one or more reflection points, so the image captured by the auxiliary camera may contain, in addition to the image of the reference point, images of those reflection points. As shown in Fig. 4, 1 is the image of the reference point, and 1' and 1" are images of reflection points of the reference point; the brightness of the reflection-point images 1' and 1" is lower than that of the reference-point image 1. To achieve accurate following, the control module therefore first determines, according to the brightness values, which image in the picture is the image of the reference point (in Fig. 4, image 1), and then adjusts the movement of the aircraft according to the relationship between the position of the reference point and the position of the locking window.
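A brief sketch of this brightness-based selection (illustrative only; detection of the candidate spots, e.g. by blob extraction, is assumed to have happened already, and the sample values are invented):

```python
def pick_reference_point(candidates):
    """Given candidate bright spots as (centroid, mean_brightness) pairs detected
    in the filtered image, choose the brightest one as the reference point; the
    dimmer spots are treated as reflections (such as 1' in Fig. 4)."""
    return max(candidates, key=lambda c: c[1])[0]

# Example: three detected spots; the true reference point is the brightest.
spots = [((310, 235), 245.0), ((150, 300), 120.0), ((420, 100), 95.0)]
print(pick_reference_point(spots))   # -> (310, 235)
```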
Fig. 5 is a logic diagram of the third embodiment of the three-dimensional automatic positioning and following shooting system of the utility model. Compared with the embodiment shown in Fig. 1, the three-dimensional automatic positioning and following shooting system of this embodiment further includes an auxiliary display module 18 and a main display module 17, both of which are disposed on a remote monitoring device and connected with the control module 11 in a wireless manner.
The control module 11 is configured to superimpose the locking window on the processed image captured by the auxiliary camera. The auxiliary display module 18 is used for displaying the superimposed image and further displaying the measured distance value, so that the user or tester can conveniently check the automatic positioning and following process in real time.
The control module 11 is also configured to superimpose the processed image captured by the auxiliary camera on the processed image captured by the main camera. The main display module 17 is used for displaying the superimposed image. In this way, the user can see the target object and the mapping of the reference point on one display screen at the same time; furthermore, when the reference point determined in the image is found to be inconsistent with the actual situation, the user can correct it by reselecting the reference point, and the system performs automatic positioning and following again. Of course, in other embodiments, the control module 11 may also superimpose the locking window together with the image captured by the main camera and the image captured by the auxiliary camera.
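As an illustration of such superimposition for display (a sketch only: the frames are assumed to be grayscale and of equal resolution, and the blend weight and window coordinates are invented, since the utility model does not specify how the images are combined):

```python
import numpy as np

def overlay_for_display(main_frame, aux_frame, window_tl, window_br, alpha=0.3):
    """Blend the processed auxiliary camera image onto the main camera image and
    draw the locking window outline, producing a composite frame for the display."""
    composite = ((1 - alpha) * main_frame + alpha * aux_frame).astype(np.uint8)
    x0, y0 = window_tl
    x1, y1 = window_br
    composite[y0:y1 + 1, [x0, x1]] = 255   # vertical edges of the locking window
    composite[[y0, y1], x0:x1 + 1] = 255   # horizontal edges of the locking window
    return composite

# Example with synthetic grayscale frames of equal size.
main = np.zeros((480, 640), dtype=np.uint8)
aux = np.full((480, 640), 200, dtype=np.uint8)
composite = overlay_for_display(main, aux, window_tl=(270, 200), window_br=(370, 280))
```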
The above description covers only preferred embodiments of the utility model and is not intended to limit the utility model; various modifications and changes may be made by those skilled in the art. Any modification, equivalent replacement, or improvement made within the spirit and principle of the utility model shall be included within the scope of the claims of the utility model.