CN210327775U - Optical unmanned aerial vehicle monitoring system based on three-dimensional light field technology - Google Patents

Optical unmanned aerial vehicle monitoring system based on three-dimensional light field technology

Info

Publication number
CN210327775U
Authority
CN
China
Prior art keywords
camera
unmanned aerial vehicle
monitoring system
light field
Prior art date
Legal status
Active
Application number
CN201920464236.9U
Other languages
Chinese (zh)
Inventor
李应樵
李莉华
Current Assignee
Shenzhen Visual Sense Power Technology Co., Ltd.
Original Assignee
Shenzhen Visual Sense Power Technology Co., Ltd.
Priority date
Filing date
Publication date
Application filed by Shenzhen Visual Sense Power Technology Co., Ltd.
Priority to CN201920464236.9U
Application granted
Publication of CN210327775U

Landscapes

  • Closed-Circuit Television Systems (AREA)

Abstract

The utility model discloses an unmanned aerial vehicle monitoring system based on light field technology, in which a first camera continuously acquires image information in a monitoring area; a second camera, a light field camera comprising a fly-eye lens, acquires the light field information of the unmanned aerial vehicle once the acquired image information is judged to show an unmanned aerial vehicle; a vertical rotating platform and a horizontal rotating platform are disposed perpendicular to each other, and the first and second cameras rotate synchronously under the control of the vertical and horizontal rotating platforms; and a computer processor calculates the depth information of the unmanned aerial vehicle from the acquired light field information to obtain the position of the unmanned aerial vehicle. The optical unmanned aerial vehicle monitoring system based on three-dimensional light field technology provided by the utility model can isolate vibration during monitoring, thereby improving efficiency and accuracy in monitoring or detecting unmanned aerial vehicles.

Description

Optical unmanned aerial vehicle monitoring system based on three-dimensional light field technology
Technical Field
The utility model belongs to the field of unmanned aerial vehicle monitoring, and in particular relates to an unmanned aerial vehicle monitoring system based on light field technology.
Background
With the development of unmanned aerial vehicle technology, there is broad demand for improved monitoring systems for unmanned aerial vehicles. The prior art mostly adopts monitoring systems that combine radar and cameras. Radar monitoring is easily deceived by stealth technology and performs poorly at low altitude, while the cameras used generally have low resolution. Chinese patent application 201810128587.2 discloses an unmanned aerial vehicle monitoring system and a supervision method thereof. In that method, software scans images in an area, a first camera and a second camera form stereoscopic vision to judge whether a suspicious target exists in the images, and the suspicious target is tracked and photographed by calculating its accurate position. That technology is mainly an improvement in the software part. Chinese patent application 201810396720.2 discloses a method and apparatus for detecting an unmanned aerial vehicle, and an electronic device. That method likewise controls the cameras on a detection platform from the software side: a rotation instruction is sent to a motor of a rotary table so that the motor drives a plurality of cameras on the rotary table to rotate by a preset angle; a stop instruction is sent to the motor so that the rotary table stops after rotating by the preset angle; when the plurality of cameras have been stationary for a preset time, they are controlled to shoot once to obtain a plurality of images; image recognition is performed on the plurality of images to determine whether an unmanned aerial vehicle exists in the monitoring area; and if no unmanned aerial vehicle exists in the monitoring area, the above steps are executed again.
There is a need for a new high-resolution, stable monitoring system that obtains clear stereo images, so as to improve efficiency and accuracy in monitoring or detecting unmanned aerial vehicles.
SUMMARY OF THE UTILITY MODEL
An object of the utility model is to provide a new monitoring system that is high in resolution and stable, and that obtains clear stereo images, thereby improving efficiency and accuracy in the process of monitoring or detecting unmanned aerial vehicles.
The utility model discloses an unmanned aerial vehicle monitoring system based on light field technology, in which a first camera continuously acquires image information in a monitoring area; a second camera, a light field camera comprising a fly-eye lens, acquires the light field information of the unmanned aerial vehicle when the acquired image information is judged to show an unmanned aerial vehicle; a vertical rotating platform and a horizontal rotating platform are disposed perpendicular to each other, the vertical rotating platform controlling the first camera and the second camera to rotate in the direction perpendicular to the horizontal rotating platform, and the horizontal rotating platform controlling the first camera and the second camera to rotate in the horizontal direction, so that the first and second cameras rotate synchronously under the control of the vertical and horizontal rotating platforms; and a computer processor analyzes and judges whether the acquired image information in the monitoring area shows an unmanned aerial vehicle, and calculates the depth information of the unmanned aerial vehicle from the acquired light field information to obtain the position of the unmanned aerial vehicle.
In one aspect of the utility model, the vertical rotating platform controls the rotation range of the first camera and the second camera to an elevation angle of 15 to 45 degrees, and the horizontal rotating platform controls their rotation range to 0 to 120 degrees. The second camera further includes a telephoto lens group with a diagonal field of view of about 1° and a camera photosensitive element, the fly-eye lens being disposed between the telephoto lens group and the camera photosensitive element and close to the camera photosensitive element. The fly-eye lens is a microlens array, arranged either linearly or hexagonally; in the hexagonal arrangement, each row is staggered relative to the previous row. Each microlens of the hexagonal microlens array has a width of 60 μm, but there are two microlenses within every 90 μm.
The light field image I(x, y) can be formulated as:

I(x, y) = ∫∫ L_F(u, v, x, y) du dv (1)

where L_F(u, v, x, y) denotes the light traveling along the ray that intersects the main lens at (u, v) and the microlens plane at (x, y), and a full aperture is used. Fig. 3(d) is a schematic diagram of the principle of computing a refocused image by shifting the sub-aperture images of the light field imaging system of the utility model, and the refocused image is computed by shifting the sub-aperture images in the manner shown in Fig. 3(d). The shifted light field function can be expressed, in the standard refocusing parameterization, as:

I_α(x, y) = ∫∫ L_F(u, v, u + (x − u)/α, v + (y − v)/α) du dv (2)

where α is the ratio of the distance of the refocused image plane to that of the original image plane.
the utility model provides an optics unmanned aerial vehicle monitoring system of three-dimensional light field technique can keep apart the vibrations in the monitoring process to efficiency and accuracy are improved at monitoring or survey unmanned aerial vehicle's in-process.
Drawings
In order to explain the technical solutions in the embodiments of the utility model more clearly, the drawings required by the embodiments are briefly described below. The drawings described below are only examples of the utility model; a person skilled in the art could derive other drawings from them without inventive effort.
Fig. 1 is a schematic structural diagram of the unmanned aerial vehicle monitoring system of the utility model.
Fig. 2 is a schematic structural diagram of the second camera of the unmanned aerial vehicle monitoring system of the utility model.
Fig. 3(a) and 3(b) are schematic diagrams of the light field imaging system of the utility model.
Fig. 3(c) is an example of a processed light field image.
Fig. 3(d) is a schematic diagram of the principle of computing a refocused image by shifting sub-aperture images in the light field imaging system of the utility model.
Fig. 4(a) is a microscope image of the hexagonal microlens array of the second camera 103 of the unmanned aerial vehicle monitoring system of the utility model.
Fig. 4(b) is a white light interferometer measurement of the same hexagonal microlens array.
Fig. 5 is a working principle diagram of the unmanned aerial vehicle monitoring system of the utility model.
Detailed Description
Specific embodiments of the present invention will now be described with reference to the accompanying drawings. The present invention may, however, be embodied in many different forms and should not be construed as being limited to the embodiments set forth herein. These embodiments are provided so that this disclosure will be thorough and complete, and will fully convey the scope of the invention to those skilled in the art. The terminology used in the detailed description of the embodiments illustrated in the accompanying drawings is not intended to be limiting of the invention.
Fig. 1 is a schematic structural diagram of the unmanned aerial vehicle monitoring system 100 of the utility model. The system 100 includes a case 101 housing a processor and a rotating platform controller, a first camera 102, a second camera 103, a telephoto lens 104, a wide-angle lens 105, a vertical rotating platform 106, a horizontal rotating platform 107, a vertical support panel 108, and a horizontal support plate 109. The first camera 102 and the second camera 103 are high-resolution cameras whose axes are parallel to each other; they are fixed on the horizontal support plate 109 by a first fixing rod 110. The vertical rotating platform 106 is fixed on one side of the horizontal support plate 109; its axis center is fixed to the vertical support panel 108, its axis is horizontal, and it can rotate around that axis in the direction of arrow A or in the opposite direction. Since the vertical rotating platform 106, the horizontal support plate 109, and the first fixing rod 110 are fixed to one another, when the vertical rotating platform 106 rotates around its axis it drives the first fixing rod 110 fixed to the horizontal support plate 109, so that the first camera 102 and the second camera 103 move together, that is, rotate synchronously around the vertical rotating platform in the direction of arrow A or the opposite direction; in one embodiment, the rotation range may be an elevation angle of 15-45 degrees. Further, the vertical support panel 108 is fixed to a second fixing rod 111, and the second fixing rod 111 is fixed to the horizontal rotating platform 107. As shown in Fig. 1, driven by the second fixing rod 111, the assembly on the horizontal support plate 109, including the first camera 102, the second camera 103, the telephoto lens 104, the wide-angle lens 105, the vertical rotating platform 106, and the vertical support panel 108, can rotate around the axis of the horizontal rotating platform 107 in the direction of arrow B or the opposite direction; the axis of the horizontal rotating platform 107 is perpendicular to its plane of rotation. Similarly, since the second fixing rod 111 is fixed to the horizontal rotating platform 107, when the second fixing rod 111 rotates on the horizontal rotating platform 107 in the direction of arrow B or the opposite direction, the first camera 102 and the second camera 103 are driven to move together, that is, to rotate synchronously around the horizontal rotating platform. In one embodiment, the rotation range may be whatever the horizontal rotating platform allows, particularly preferably 0-120 degrees.
The horizontal rotating platform 107 is a vibration-free optical platform placed on the case 101. The case 101 houses the computer workstation, which includes a processor and a rotating platform controller. The horizontal rotating platform 107 provides a horizontal plane, so that the camera system fixed on it can track the unmanned aerial vehicle while remaining relatively fixed and undisturbed by vibration.
The two high-resolution video cameras 102, 103 may use the same camera body, for example an ultra-high-definition 4K (4096 × 2160 pixels) video camera, or different camera bodies. The first camera 102 is a wide-angle camera with a wide shooting range, an exaggerated sense of distance, and a wide focusing range. A wide-angle camera magnifies nearby objects and shrinks distant ones, so the image is distorted toward its edges; the wide angle also allows any point in the image to be brought to the most appropriate focal length, making the picture clearer, which may be called fully automatic focusing. The wide-angle first camera 102 may use a photosensitive element with a single-pixel size in the range of 1-8 microns, and includes a wide-angle or diagonal lens. The second camera 103 is an ultra-long-range camera that tracks the flying unmanned aerial vehicle with a super-telephoto lens having a focal length in the range of 100-2200 mm; the second camera 103 is explained in detail below. The computer workstation inside the case 101 of the optical unmanned aerial vehicle monitoring system 100 processes the acquired information, monitors the flight of the unmanned aerial vehicle, and raises an alarm in time. The case 101 mainly protects the optical unmanned aerial vehicle monitoring system 100 outdoors, for example in an airport environment.
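Though the patent gives no optical formulas, the field of view implied by these focal lengths follows from the pinhole relation FOV = 2·arctan(w / (2f)). A minimal sketch in Python, assuming a 36 mm-wide (full-frame) sensor purely for illustration:

```python
import math

def horizontal_fov_deg(sensor_width_mm: float, focal_length_mm: float) -> float:
    """Horizontal field of view of a lens: FOV = 2 * atan(w / (2 * f))."""
    return math.degrees(2 * math.atan(sensor_width_mm / (2 * focal_length_mm)))

# Assumed 36 mm sensor width, for illustration only.
for f_mm in (100, 1000, 2200):  # points across the 100-2200 mm range above
    print(f"f = {f_mm:4d} mm -> horizontal FOV ≈ {horizontal_fov_deg(36.0, f_mm):5.2f}°")
```

At f = 2200 mm this gives roughly 0.9°, consistent with the roughly 1° telephoto lens group described for the second camera below.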
Fig. 2 is a schematic structural diagram of the second camera 103 of the unmanned aerial vehicle monitoring system of the utility model. A microlens array 201 is disposed between the photosensitive element 202 of the second camera 103 (i.e., the light field camera) and the camera lenses 204 and 205. When magnified on a display, the picture taken by the second camera 103 consists of circular sub-images equally spaced in both the vertical and horizontal directions, each sub-image corresponding to one microlens. The second camera 103 may be a light field camera and further comprises a super-telephoto lens group 204, 205 with a diagonal field of view of about 1°; the super-telephoto lens group 204, 205 may use lenses in the focal length range of 100-2200 mm. In one embodiment, the second camera 103 further comprises the microlens array 201, whose presence makes the second camera 103 a compound-eye camera, for example an ultra-high-definition (UHD) 4K compound-eye camera; the microlens array 201 is designed according to the image sensor specification and the optical path. A camera with a fly-eye lens is operated in the same way as an ordinary camera; before processing, the whole image, composed of the small image under each lenslet, can be seen by magnifying the captured picture. Light field computing software can refocus the image onto different focal planes, and because of this refocusing property a light field image yields the depth information of the photographed scene. Because the fly-eye lens is arranged in front of the camera's photosensitive element, the captured image records the light field information.
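To make the sub-image structure concrete, the sketch below cuts a raw compound-eye frame into one sub-image per microlens on a regular (linear) grid; the 16-pixel pitch and zero offsets are hypothetical values chosen for illustration, not parameters from the patent:

```python
import numpy as np

def extract_subimages(raw: np.ndarray, pitch: int, x0: int = 0, y0: int = 0) -> np.ndarray:
    """Cut a raw light field frame into per-microlens sub-images.

    raw   : 2-D grayscale frame from the sensor behind the microlens array
    pitch : sub-image spacing in pixels (one microlens per pitch x pitch tile)
    x0,y0 : pixel offset of the first tile corner
    Returns an array of shape (rows, cols, pitch, pitch).
    """
    h, w = raw.shape
    rows, cols = (h - y0) // pitch, (w - x0) // pitch
    tiles = raw[y0:y0 + rows * pitch, x0:x0 + cols * pitch]
    return tiles.reshape(rows, pitch, cols, pitch).swapaxes(1, 2)

frame = np.random.rand(2160, 4096)         # stand-in for a 4K sensor frame
subs = extract_subimages(frame, pitch=16)  # assumed 16-pixel microlens pitch
print(subs.shape)                          # (135, 256, 16, 16)
```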
Fig. 3(a) and 3(b) are schematic diagrams of the light field imaging system of the utility model, showing the mechanism of a light field imaging system with a microlens array 302 in front of a CMOS sensor 301. In Fig. 3(a), all light rays passing through a pixel pass through its parent microlens and through the conjugate square (sub-aperture) on the main lens 303. In Fig. 3(b), all light rays passing through a sub-aperture are focused onto the corresponding pixels under the different microlenses; these pixels form the picture seen through that sub-aperture.
The light field image I(x, y) can be formulated as:

I(x, y) = ∫∫ L_F(u, v, x, y) du dv (1)

where L_F(u, v, x, y) denotes the light traveling along the ray that intersects the main lens at (u, v) and the microlens plane at (x, y), and a full aperture is used. Fig. 3(d) is a schematic diagram of the principle of computing a refocused image by shifting the sub-aperture images of the light field imaging system of the utility model, and the refocused image is computed by shifting the sub-aperture images in the manner shown in Fig. 3(d). The shifted light field function can be expressed, in the standard refocusing parameterization, as:

I_α(x, y) = ∫∫ L_F(u, v, u + (x − u)/α, v + (y − v)/α) du dv (2)

where α is the ratio of the distance of the refocused image plane to that of the original image plane.
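As a numerical illustration of equation (2), the sketch below implements the standard shift-and-add refocusing: each sub-aperture image is translated in proportion to its (u, v) offset from the aperture center and the results are averaged. This is a generic sketch of the well-known technique, not code from the patent; with α = 1 the shifts vanish and the sum reduces to the full-aperture integral of equation (1).

```python
import numpy as np
from scipy.ndimage import shift as nd_shift

def refocus(subaperture: np.ndarray, alpha: float) -> np.ndarray:
    """Shift-and-add refocusing of a light field.

    subaperture : array of shape (U, V, H, W); subaperture[u, v] is the
                  sub-aperture image seen through main-lens position (u, v)
    alpha       : ratio of the new focal plane distance to the original one
    """
    U, V, H, W = subaperture.shape
    uc, vc = (U - 1) / 2.0, (V - 1) / 2.0   # aperture center
    acc = np.zeros((H, W))
    for u in range(U):
        for v in range(V):
            # Shift grows with the sub-aperture's distance from the center.
            du = (u - uc) * (1.0 - 1.0 / alpha)
            dv = (v - vc) * (1.0 - 1.0 / alpha)
            acc += nd_shift(subaperture[u, v], (du, dv), order=1, mode="nearest")
    return acc / (U * V)

lf = np.random.rand(9, 9, 64, 64)    # synthetic 9x9 sub-aperture light field
print(refocus(lf, alpha=1.2).shape)  # (64, 64) image focused on a new plane
```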
Light field imaging technology makes it possible to refocus images and to estimate a depth map of the scene. A basic depth range is calculated from the light field, and the position of the unmanned aerial vehicle is determined by combining this depth with the target's position in the image.
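One hedged sketch of that last step: given the depth from the light field and the target's pixel coordinates, a pinhole back-projection yields the drone's 3-D position in the camera frame. The intrinsic parameters below are assumed values for illustration, not figures from the patent:

```python
import numpy as np

def backproject(pixel_xy, depth_m, fx, fy, cx, cy):
    """Pinhole back-projection: pixel coordinates + depth -> 3-D point."""
    x, y = pixel_xy
    return np.array([(x - cx) / fx * depth_m,   # lateral offset (m)
                     (y - cy) / fy * depth_m,   # vertical offset (m)
                     depth_m])                  # range (m)

fx = fy = 150_000.0          # assumed focal length in pixels (long telephoto)
cx, cy = 2048.0, 1080.0      # principal point of a 4096 x 2160 sensor
drone_px = (2300.0, 900.0)   # detected drone center in the image
depth_m = 850.0              # depth from the light field depth map
print(backproject(drone_px, depth_m, fx, fy, cx, cy))
# -> [ 1.428 -1.02  850. ]: offsets and range of the target in meters
```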
In semiconductor manufacturing for chip-on-board applications, a compound eye may likewise be used to find the maximum loop height of an aluminum bond wire, the first bonding height on the chip, and the second bonding height on the substrate. Fig. 3(c) is an example of a processed light field image. In Fig. 3(c), a larger positive value (in μm) means a virtual focal plane closer to the objective lens; the focal plane at the objective lens surface is calibrated to 0 μm. The top-left image of Fig. 3(c) is the top wire layer, the top-right image the middle layer, the bottom-left image the bottom metal layer, and the bottom-right image the all-in-focus image. Autofocus software will be developed to capture all wire images without any mechanical movement of the vertical axis, and real-time AOI software will be developed and used in conjunction with the autofocus software. The user interface will display the image taken by the camera and the all-in-focus image, and will mark any detected defects.
Compared with the existing approach of obtaining distance through a calibrated reference object, obtaining distance through depth information is simpler and more convenient, and yields both clear images and accurate distance information.
Accordingly, the microlens array 201 is attached only to the second camera 103 and not to the first camera 102. In designing the parameters of the microlens array 201, the pixel pitch and the sensor size are the governing factors. The microlens array may be aligned linearly or hexagonally. The hexagonal microlens array differs slightly from the linear array: each microlens has a width of 60 μm, but there are two microlenses within every 90 μm. The second camera 103 also includes a camera photosensitive element 202 adjacent to the microlens array 201; the photosensitive element 202 may be, for example, a CMOS or CCD image sensor that converts the sensed light signal into an electrical signal. When the second camera 103 finds a subject 206, the subject forms a virtual image 203 through the super-telephoto lens group 204, 205, and after processing by the microlens array 201 the virtual image 203 forms a high-definition electrical image signal on the photosensitive element 202.
Fig. 4(a) is a microscope image of the hexagonal microlens array of the second camera 103 of the unmanned aerial vehicle monitoring system of the utility model; the coordinate units are microns. Fig. 4(b) is a white light interferometer measurement of the same microlens array. In one example, the microlens array uses a linear arrangement. However, a linear arrangement leaves large gaps between the microlenses, and these gaps produce no image. Therefore, in another example, the microlens array is arranged hexagonally, with each row staggered relative to the previous row, which greatly reduces the gaps between microlenses and improves the utilization efficiency of the image sensor behind them. When the light field information is calculated, the program must be able to extract every sub-image; with a linear alignment this is done by generating transverse and longitudinal reference lines that pass through the centers of the transverse and longitudinal sub-images, after which all sub-images can be captured and the depth information calculated. For the hexagonal microlens array shown in Fig. 4(a), the microlenses are staggered, so a new alignment method is required for the software to accurately extract, compare, and analyze the image of each microlens. As shown in Fig. 4(a), the hexagonal microlens array pattern is much denser than the linear pattern: although linearly aligned in the vertical direction, alternate rows are translated in the horizontal direction. The unmanned aerial vehicle monitoring system of the utility model can therefore adopt the hexagonal microlens array, because the denser hexagonal pattern provides a higher-resolution light field imaging system.
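The staggered geometry can be generated directly: successive rows are offset by half a pitch horizontally, with the standard hexagonal row spacing of pitch·√3/2 ≈ 52 μm. A minimal sketch of such a center grid, as would be used to locate each sub-image before extraction (only the 60 μm pitch comes from the text above; the array extent is an assumed value):

```python
import numpy as np

def hex_centers(pitch_um: float, rows: int, cols: int) -> np.ndarray:
    """(x, y) centers of a hexagonally packed microlens array, in microns.

    Odd rows are shifted half a pitch horizontally; the row spacing is
    pitch * sqrt(3) / 2, the standard hexagonal close-packing geometry.
    """
    dy = pitch_um * np.sqrt(3) / 2
    pts = [(c * pitch_um + (pitch_um / 2 if r % 2 else 0.0), r * dy)
           for r in range(rows) for c in range(cols)]
    return np.array(pts)

centers = hex_centers(60.0, rows=4, cols=5)  # 60 um pitch as stated above
print(centers[:6])
# Row spacing ≈ 51.96 um; odd rows start at x = 30 um, reproducing the
# interleaved pattern of Fig. 4(a).
```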
Fig. 5 is a working flow chart 500 of the unmanned aerial vehicle monitoring system of the utility model. In step 501, the unmanned aerial vehicle monitoring system is started and monitoring begins. In step 502, the first camera 102 operates continuously, constantly sending the video image information acquired over a wide range to the computer workstation. Since the first camera 102 is a wide-angle camera, it can scan the protected area and capture high-definition 4K video; because the scanning range is large, the resolution of any single target in the acquired image is low and a clear image of the unmanned aerial vehicle cannot be obtained. The video captured by the wide-angle first camera 102 is processed continuously by the computer. After an unmanned aerial vehicle enters the camera's field of view, the computer detects it with a monitoring algorithm based on machine learning. If no unmanned aerial vehicle is detected, no control signal is generated. Once an unmanned aerial vehicle is detected, its position coordinates are compared with the coordinates of the camera field-of-view center; if the deviation exceeds the threshold in the horizontal or vertical direction, a control signal is generated and delivered to the platform controller through a USB interface. In step 503, the acquired image is compared with a database stored in the computer workstation to determine whether it matches the shape of an unmanned aerial vehicle in the database. If the target matches, then once the detected unmanned aerial vehicle is at the center of the camera field of view, a signal is generated to control the horizontal rotating platform 107 and the vertical rotating platform 106 to rotate to the proper positions, and the second camera 103 takes a picture containing the light field image of the scene, which is then submitted to the light field processing software; that is, the drone monitoring system 100 drives the second camera 103 and focuses the telephoto lens on the locked target to track it, thereby obtaining a high-resolution image and the light field information of the target unmanned aerial vehicle, and the flow proceeds to step 504. In case of no match, the flow returns to step 502 and the video image information acquired by the first camera 102 continues to be sent to the computer workstation. In step 504, after the drone monitoring system 100 again uses the database to verify this information, the depth and position information of the light field image is calculated and the user or the control tower is alerted; that is, once the high-resolution image shows a high degree of similarity to the drone shape information in the database, an alarm signal is sent to the monitoring system, and the drone depth and position information is also sent back to the surveillance center.
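A minimal sketch of this detect-and-center loop under stated assumptions: grab_frame, detect_drone, and send_pan_tilt are hypothetical stand-ins for the camera driver, the machine-learning detector, and the USB platform controller, since the patent describes the behavior of Fig. 5 but not a programming interface; the dead-band and gain values are likewise assumed.

```python
import random
from dataclasses import dataclass
from typing import Optional

@dataclass
class Detection:
    x: float  # drone center in the image, pixels
    y: float

def grab_frame():
    """Placeholder: would return a 4K frame from the wide-angle camera 102."""
    return None

def detect_drone(frame) -> Optional[Detection]:
    """Placeholder: would run the machine-learning detector on the frame."""
    return Detection(random.uniform(0, 4096), random.uniform(0, 2160))

def send_pan_tilt(pan_deg: float, tilt_deg: float) -> None:
    """Placeholder: would send a control signal over the USB interface."""
    pass

FOV_CENTER = (2048.0, 1080.0)  # field-of-view center of a 4096 x 2160 frame
THRESHOLD_PX = 50.0            # assumed horizontal/vertical dead-band
GAIN = 0.001                   # assumed degrees of rotation per pixel of error

def track_once() -> bool:
    """One pass of the Fig. 5 loop; True once the target is centered."""
    hit = detect_drone(grab_frame())          # step 502: scan and detect
    if hit is None:
        return False                          # no drone -> no control signal
    ex, ey = hit.x - FOV_CENTER[0], hit.y - FOV_CENTER[1]
    if abs(ex) > THRESHOLD_PX or abs(ey) > THRESHOLD_PX:
        send_pan_tilt(GAIN * ex, -GAIN * ey)  # rotate platforms 107/106
        return False
    return True  # centered: hand off to the light field camera 103

for _ in range(100_000):  # bounded loop for the sketch
    if track_once():
        break
```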
Reference herein to "one embodiment," "an embodiment," or "one or more embodiments" means that a particular feature, structure, or characteristic described in connection with the embodiment is included in at least one embodiment of the utility model. Moreover, instances of the phrase "in one embodiment" do not necessarily all refer to the same embodiment.
The above description only illustrates the technical solutions of the utility model, and any person skilled in the art may modify and change the above embodiments without departing from the spirit and scope of the utility model. Therefore, the scope of the utility model should be determined by the appended claims. The utility model has been elucidated above by way of example; however, other embodiments than those described are equally possible within the scope of this disclosure, and the different features and steps of the utility model may be combined in other ways than those described. More generally, those of ordinary skill in the art will readily appreciate that all parameters, dimensions, materials, and configurations described herein are exemplary and depend upon the specific application for which the teachings of the utility model are used.

Claims (7)

1. An unmanned aerial vehicle monitoring system based on light field technology, comprising:
a first camera for continuously obtaining image information in a monitoring area; characterized in that the unmanned aerial vehicle monitoring system further comprises:
a second camera, which is a light field camera including a fly-eye lens, configured to obtain light field information of the unmanned aerial vehicle when the obtained image information is determined to show an unmanned aerial vehicle; and
a vertical rotating platform and a horizontal rotating platform disposed perpendicular to each other;
the vertical rotating platform being used to control the first camera and the second camera to rotate in the direction perpendicular to the horizontal rotating platform; and
the horizontal rotating platform being used to control the first camera and the second camera to rotate in the horizontal direction;
wherein the first and second cameras can rotate synchronously under the control of the vertical and horizontal rotating platforms;
and a computer processor for analyzing and judging whether the obtained image information in the monitoring area shows an unmanned aerial vehicle, and for calculating depth information of the unmanned aerial vehicle from the obtained light field information so as to obtain the position of the unmanned aerial vehicle.
2. The drone monitoring system of claim 1, wherein the vertical rotating platform controls the rotation range of the first and second cameras to be 15-45 degrees in elevation.
3. The drone monitoring system of claim 1, wherein the horizontal rotation platform controls a range of rotation of the first and second cameras to be 0-120 degrees.
4. The drone monitoring system of any one of claims 1-3, wherein the second camera further comprises a telephoto lens group with a diagonal field of view of about 1° and a camera photosensitive element, the fly-eye lens being disposed between the telephoto lens group and the camera photosensitive element and proximate to the camera photosensitive element.
5. The drone monitoring system of claim 4 wherein the fly eye lens is a microlens array.
6. The drone monitoring system of claim 5, wherein the microlens array is in a linear arrangement or a hexagonal arrangement, and in the hexagonal arrangement each row of the microlens array is offset from the previous row.
7. The drone monitoring system of claim 6, wherein each microlens of the hexagonal microlens array is 60 μm wide, but there are two microlenses within 90 μm.
CN201920464236.9U 2019-04-08 2019-04-08 Optical unmanned aerial vehicle monitoring system based on three-dimensional light field technology Active CN210327775U (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN201920464236.9U CN210327775U (en) 2019-04-08 2019-04-08 Optical unmanned aerial vehicle monitoring system based on three-dimensional light field technology

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
CN201920464236.9U CN210327775U (en) 2019-04-08 2019-04-08 Optical unmanned aerial vehicle monitoring system based on three-dimensional light field technology

Publications (1)

Publication Number Publication Date
CN210327775U (en) 2020-04-14

Family

ID=70135562

Family Applications (1)

Application Number Title Priority Date Filing Date
CN201920464236.9U Active CN210327775U (en) 2019-04-08 2019-04-08 Optical unmanned aerial vehicle monitoring system based on three-dimensional light field technology

Country Status (1)

Country Link
CN (1) CN210327775U (en)

Cited By (2)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
WO2020207185A1 (en) * 2019-04-08 2020-10-15 深圳市视觉动力科技有限公司 Three-dimensional light field technology-based optical unmanned aerial vehicle monitoring system
US11978222B2 (en) 2019-04-08 2024-05-07 Shenzhen Vision Power Technology Co., Ltd. Three-dimensional light field technology-based optical unmanned aerial vehicle monitoring system

Similar Documents

Publication Publication Date Title
US11978222B2 (en) Three-dimensional light field technology-based optical unmanned aerial vehicle monitoring system
CN105744163B (en) A kind of video camera and image capture method based on depth information tracking focusing
CN109859272B (en) Automatic focusing binocular camera calibration method and device
CN105574847B (en) Camera system and image registration method thereof
WO2006050430A2 (en) Optical tracking system using variable focal length lens
CN110132226B (en) System and method for measuring distance and azimuth angle of unmanned aerial vehicle line patrol
CN111080705B (en) Calibration method and device for automatic focusing binocular camera
CN104079916A (en) Panoramic three-dimensional visual sensor and using method
WO2020207172A1 (en) Method and system for optical monitoring of unmanned aerial vehicles based on three-dimensional light field technology
CN110602376B (en) Snapshot method and device and camera
Li Real-time spherical stereo
KR101836882B1 (en) All directional Camera and Pan Tilt Zoom Camera Linkage Possible Photographing Apparatus and Method Thereof
CN210327775U (en) Optical unmanned aerial vehicle monitoring system based on three-dimensional light field technology
Li et al. A cooperative camera surveillance method based on the principle of coarse-fine coupling boresight adjustment
CN109765747A (en) A kind of aerial image focusing test method, focus detection system and camera
WO2023098362A1 (en) Target area security and monitoring system based on hundred-million-level pixel camera
Lu et al. Image-based system for measuring objects on an oblique plane and its applications in 2-D localization
CN108537831B (en) Method and device for performing CT imaging on additive manufacturing workpiece
CN112702513B (en) Double-optical-pan-tilt cooperative control method, device, equipment and storage medium
US20130076868A1 (en) Stereoscopic imaging apparatus, face detection apparatus and methods of controlling operation of same
CN211047088U (en) Positionable panoramic three-dimensional imaging system
Somanath et al. Single camera stereo system using prism and mirrors
Sueishi et al. Mirror-based high-speed gaze controller calibration with optics and illumination control
KR101091564B1 (en) Omnidirectional camera
KR102598630B1 (en) Object tracking pan-tilt apparatus based on ultra-wide camera and its operation method

Legal Events

Date Code Title Description
GR01 Patent grant