CN211205210U - Four-dimensional hyperspectral depth imaging system - Google Patents

Four-dimensional hyperspectral depth imaging system

Info

Publication number
CN211205210U
CN211205210U CN201922286118.6U
Authority
CN
China
Prior art keywords
hyperspectral
dimensional
information
lens
plane
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Active
Application number
CN201922286118.6U
Other languages
Chinese (zh)
Inventor
何赛灵
李常青
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Zhejiang University ZJU
Original Assignee
Zhejiang University ZJU
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Zhejiang University ZJU filed Critical Zhejiang University ZJU
Priority to CN201922286118.6U
Application granted
Publication of CN211205210U
Legal status: Active (current)


Abstract

The utility model discloses a four-dimensional hyperspectral depth imaging system comprising a calibration camera, a hyperspectral atlas module and a three-dimensional topography scanning module. The calibration camera acquires surface information of an object, the hyperspectral atlas module acquires hyperspectral information of the surface, and the three-dimensional topography scanning module acquires three-dimensional topography information and distance information. The three-dimensional topography scanning module is calibrated against the calibration camera, and the hyperspectral atlas module is calibrated against the calibration camera, so that each pixel of the calibration camera corresponds to one piece of three-dimensional topography information and one piece of hyperspectral information. The utility model can acquire the spectrum and the 3D topography of an object simultaneously for each single pixel; the structure is compact, the degree of integration is high, and operation is simple and convenient.

Description

Four-dimensional hyperspectral depth imaging system
Technical Field
The utility model belongs to the fields of three-dimensional topography mapping and spectral imaging. By calibrating an auxiliary camera against a hyperspectral atlas module and a three-dimensional topography scanning module, it realizes a four-dimensional hyperspectral depth imaging system.
Background
A light-sheet topography instrument built according to the Scheimpflug principle can effectively acquire the three-dimensional topography of an object's surface, while a hyperspectral imager can detect fluorescence spectrum data of the surface; however, at present no equipment system integrates the two technologies so that a single scan acquires both the topography and the hyperspectral information of an object as four-dimensional data.
In the prior art, three-dimensional topography scanning and hyperspectral imaging are performed separately; to acquire the spectral information of a given point in space, the spectral information and the spatial position can only be brought into one-to-one correspondence through naked-eye estimation or through complicated subsequent inversion processing.
SUMMARY OF THE UTILITY MODEL
In order to overcome the defects of the prior art, the utility model aims to provide a four-dimensional hyperspectral depth imaging system.
A four-dimensional hyperspectral depth imaging system comprises a calibration camera, a hyperspectral atlas module and a three-dimensional topography scanning module. The calibration camera acquires surface information of an object, the hyperspectral atlas module acquires hyperspectral information of the surface, and the three-dimensional topography scanning module acquires three-dimensional topography information and distance information. The three-dimensional topography scanning module is calibrated against the calibration camera, and the hyperspectral atlas module is calibrated against the calibration camera, so that each pixel of the calibration camera corresponds to one piece of three-dimensional topography information and one piece of hyperspectral information.
The hyperspectral atlas module comprises a laser, a beam splitter, a long-pass filter, an imaging lens, a slit, a first aspheric lens, a prism-grating group, a second aspheric lens and an area-array camera, arranged in sequence along the optical path. Excitation light emitted by the laser horizontally enters the beam splitter, which is set at 45 degrees; the reflected excitation light irradiates the surface of the object to be detected, which generates a fluorescence signal in the linear region illuminated by the excitation light. The fluorescence signal passes through the beam splitter and the long-pass filter and is then imaged onto the slit by the imaging lens, while the diffusely reflected excitation light is filtered out by the beam splitter and the long-pass filter. After the fluorescence signal passes through the slit, the first aspheric lens collimates it into parallel light, the prism-grating group disperses and splits the light, and the second aspheric lens focuses the dispersed fluorescence onto the area-array camera, yielding fluorescence spectrum data for each pixel.
The prism grating group consists of two optical wedges and a blazed grating.
The three-dimensional topography scanning module may be any three-dimensional imaging system, including structured-light systems, light-sheet topography instruments and binocular stereo vision systems.
The three-dimensional topography scanning module adopts a light-sheet topography instrument, which comprises an imaging lens, an optical filter and an area-array detector.
When the extension planes of the shot plane, the image plane and the lens plane intersect in a single straight line, the light-sheet topography instrument images the shot plane sharply over its full extent, and targets at different distances on the shot plane correspond one-to-one to points on the image plane. Let the focal length of the lens be f, the distance from the lens center to the shot plane be L, the included angle between the lens plane and the shot plane be theta, and let the shot plane correspond to the distance direction z. Assuming the calibration distance is z_0, the distance is calculated by formula 1 below:
(Formula 1 is shown only as an image in the original and is not reproduced here.)
where P is the pixel index of the target point to be measured, P_0 is the pixel corresponding to the calibration distance z_0, and Pixpitch is the pixel pitch. The parameter P_0 is determined by formula 2 below:
(Formula 2 is shown only as an image in the original and is not reproduced here.)
the utility model has the advantages that:
1. the spectrum of the object and the four-dimensional information of the 3D appearance can be acquired simultaneously according to a single pixel point;
2. the structure is compact, the integration degree is high, and the operation is simple and convenient;
3. the spectral information in space can be resolved down to the individual image pixel, realizing one-to-one correspondence between spectral information and spatial position information; the system can be widely applied to various mapping scenarios, such as hyperspectral detection of diseased and pest-infested plants in farmland.
Drawings
Fig. 1 is a schematic structural view of the present invention;
FIG. 2 is a schematic view of the Scheimpflug principle;
In the figures: object to be detected 1, beam splitter 2, long-pass filter 3, imaging lens 4, slit 5, first aspheric lens 6, prism-grating group 7, second aspheric lens 8, area-array camera 9, optical filter and area-array detector 10, calibration camera 11, laser 12.
Detailed Description
The invention is further explained below with reference to the drawings and examples.
As shown in fig. 1, the laser 12 emits laser light; the excitation light horizontally enters a beam splitter 2 set at 45 degrees, is reflected, and then irradiates the object 1 to be detected. The object 1 to be detected is irradiated by the laser and generates a fluorescence signal in the linear region illuminated by the excitation light. After passing through the beam splitter 2 and the long-pass filter 3, the fluorescence signal is imaged by the imaging lens 4 onto a slit 5 with a width of 50 µm. In this process, the small amount of excitation light diffusely reflected by the sample surface is filtered out by the beam splitter 2 and the long-pass filter 3 and is not received by the lens. After the fluorescence signal passes through the slit 5, the first aspheric lens 6 collimates it into parallel light, which is then dispersed and split by the prism-grating group 7. The prism-grating group 7 consists of two optical wedges with an apex angle of 9.72 degrees and a blazed grating with a blaze angle of 17.5 degrees and 300 rulings. The dispersed fluorescence is focused by the second aspheric lens 8 onto an area-array camera 9 (CMOS) to obtain the final fluorescence spectrum data.
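The readout implied by this optical path is a pushbroom arrangement: one axis of the area-array camera 9 resolves position along the slit, the other resolves wavelength after dispersion. The patent text does not describe how detector columns are calibrated to wavelengths, so the following is only a minimal sketch assuming a hypothetical polynomial calibration against a few known spectral lines; the function names and values are illustrative, not part of the utility model.

```python
import numpy as np

def wavelength_axis(n_cols, ref_cols, ref_wavelengths_nm, deg=2):
    """Map detector columns to wavelengths by fitting a polynomial
    through a few columns whose wavelengths are known (hypothetical
    calibration data, e.g. laser or lamp lines)."""
    coeffs = np.polyfit(ref_cols, ref_wavelengths_nm, deg)
    return np.polyval(coeffs, np.arange(n_cols))

# Example with made-up calibration points for a 1024-column sensor:
wl = wavelength_axis(1024, ref_cols=[112, 519, 902],
                     ref_wavelengths_nm=[450.0, 550.0, 650.0])
```

With this axis, each detector row (taking rows as the slit direction) gives the fluorescence spectrum of one point along the illuminated line on the sample.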
On the other hand, the light-sheet topography instrument composed of the imaging lens 4 and the optical filter and area-array detector 10 can scan and acquire the three-dimensional topography of the object according to the Scheimpflug principle; a schematic diagram of the Scheimpflug principle is shown in fig. 2.
A linear light source emits the light beam; the extension planes of the shot plane, the wide-angle imaging lens and the area-array detector intersect in a single straight line, so targets at different distances on the shot plane correspond one-to-one to points on the area-array detector and form a fully sharp image.
In the light-sheet topography instrument, the extension planes of the shot plane, the image plane and the lens plane intersect in a straight line, so objects at different distances on the shot plane correspond one-to-one to points on the image plane. Here the focal length of the lens is f, the distance from the lens center to the shot plane is L, the included angle between the lens plane and the shot plane is theta, the shot plane corresponds to the distance direction z, and the calibration distance is z_0. The distance is calculated by formula 1 below:
(Formula 1 is shown only as an image in the original and is not reproduced here.)
where P is the pixel index of the target point to be measured, P_0 is the pixel corresponding to the calibration distance z_0, and Pixpitch is the pixel pitch. The parameter P_0 is determined by formula 2 below:
(Formula 2 is shown only as an image in the original and is not reproduced here.)
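Because formulas 1 and 2 appear only as images in the source text, the exact expressions are not reproduced here. The only property needed for a working sketch, however, is stated above: in the Scheimpflug geometry each distance along the shot plane maps to a unique detector pixel, so distance can be recovered by interpolating a measured pixel-to-distance calibration table. The following stand-in is therefore not the patent's formula 1, and the calibration values are hypothetical.

```python
import numpy as np

def distance_from_pixel(p, calib_pixels, calib_distances_m):
    """Invert the one-to-one pixel/distance mapping of the light-sheet
    (Scheimpflug) geometry by interpolating a calibration table.
    Generic stand-in for formula 1, not the patent's expression."""
    calib_pixels = np.asarray(calib_pixels, dtype=float)
    calib_distances_m = np.asarray(calib_distances_m, dtype=float)
    order = np.argsort(calib_pixels)
    return np.interp(p, calib_pixels[order], calib_distances_m[order])

# Hypothetical calibration: pixels 100-900 observed for targets at 0.5-2.5 m.
z = distance_from_pixel(412, calib_pixels=[100, 300, 500, 700, 900],
                        calib_distances_m=[0.5, 1.0, 1.5, 2.0, 2.5])
```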
the calibration camera 11 and the area-array camera 9, the area-array camera 9 and the optical filter and area-array detector 10 are respectively calibrated, pixel-by-pixel mapping of calibration camera image pixel information, hyperspectral information and three-dimensional morphology information is achieved, and therefore scanning of object morphology and spectral thinking information is achieved.
The area-array camera 9 can also be rectified using the video stream captured by the calibration camera 11: while the hyperspectral data are being collected, the calibration camera records video, from which the pixel offset between frames caused by jitter is extracted; this offset is then used as a compensation value, yielding a scene corrected for jitter-induced displacement.
Assume that the start frame of the video is f_s, the end frame is f_e, the current frame is f_i, and a subsequent frame is f_{i+j}. The offsets are then:
(The offset formulas are shown only as images in the original and are not reproduced here.)
where N is the number of feature points retained after match screening, x_{1n} is the abscissa of feature point n in the current frame, x_{2n} is its abscissa in frame i+j, y_{1n} is its ordinate in the current frame, y_{2n} is its ordinate in frame i+j, Hor is the horizontal pixel offset of frame i+j relative to the start frame, and Ver is the vertical pixel offset of frame i+j relative to the start frame.
Deciding whether to compensate and correct: if the vertical offset A_Ver(f_{i+j}) is less than the threshold T_sh, set j = j + 1; if it is greater than the threshold, the current frame f_i is moved to frame i+j of the video. The procedure iterates until the current frame f_i reaches the end frame.
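The offset formulas are given only as images, so the sketch below implements the reading suggested by the variable definitions: the horizontal and vertical offsets are taken as the mean coordinate differences over the N screened feature-point matches, and the current frame is advanced once the vertical offset exceeds the threshold T_sh. The accumulation strategy, the match_fn callback and all names are assumptions, not the patent's exact procedure.

```python
import numpy as np

def frame_offset(pts_current, pts_later):
    """Mean displacement of N matched feature points between the current
    frame and a later frame (assumed reading of the offset formulas)."""
    d = np.asarray(pts_later, dtype=float) - np.asarray(pts_current, dtype=float)
    hor, ver = d.mean(axis=0)
    return hor, ver

def jitter_scan(frames, match_fn, t_sh):
    """Walk the video: keep increasing j while the vertical offset stays
    below t_sh, otherwise record the offset and move the current frame
    forward to frame i+j (hedged sketch of the described iteration)."""
    i, offsets = 0, []
    while i < len(frames) - 1:
        j = 1
        while i + j < len(frames):
            pts_cur, pts_next = match_fn(frames[i], frames[i + j])
            hor, ver = frame_offset(pts_cur, pts_next)
            if abs(ver) < t_sh:
                j += 1                      # below threshold: keep scanning
            else:
                offsets.append((i + j, hor, ver))
                break                       # above threshold: advance frame
        i += j
    return offsets
```

The recorded (frame index, Hor, Ver) triples can then serve as the compensation values mentioned above.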
Unified coordinate system: through calibration, the rotation and translation matrices of the coordinate transformation are found. Because the image plane of the hyperspectral imager is parallel to the image plane of the calibration camera, the rotation matrix is the identity matrix, and the relationship is:
(The coordinate-transformation relation is shown only as an image in the original and is not reproduced here.)
where (x_s, y_s) are coordinates in the stitched image of the hyperspectral imager, (x_c, y_c) are coordinates in the stitched image of the calibration camera, R is the rotation matrix, S is the scaling vector, and T is the translation vector.
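The relation itself is shown only as an image, but the description fixes its ingredients: with the two image planes parallel, the rotation R is the identity, so the mapping needs only a per-axis scale S and a translation T. The sketch below assumes the form (x_c, y_c) = S * (x_s, y_s) + T; the direction and exact form of the patent's equation are not confirmed by the text, and the numerical values are hypothetical.

```python
import numpy as np

def hyperspec_to_calib(xy_s, scale, translation):
    """Map stitched hyperspectral-image coordinates (x_s, y_s) to
    calibration-camera coordinates (x_c, y_c), assuming identity rotation
    (parallel image planes), per-axis scaling S and translation T."""
    return (np.asarray(xy_s, dtype=float) * np.asarray(scale, dtype=float)
            + np.asarray(translation, dtype=float))

# Hypothetical calibration values:
xc, yc = hyperspec_to_calib([640.0, 212.0], scale=[0.92, 0.92],
                            translation=[15.0, -8.0])
```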

Claims (6)

1. A four-dimensional hyperspectral depth imaging system is characterized in that: the device comprises a calibration camera (11), a hyperspectral atlas module and a three-dimensional morphology scanning module; the calibration camera (11) acquires surface information of an object, the hyperspectral atlas module acquires hyperspectral information of the surface, and the three-dimensional topography scanning module acquires three-dimensional topography information and distance information; the three-dimensional morphology scanning module and the calibration camera (11) are calibrated, and the hyperspectral atlas module and the calibration camera (11) are calibrated, so that each pixel of the calibration camera (11) corresponds to one piece of three-dimensional morphology information and one piece of hyperspectral information.
2. The four-dimensional hyperspectral depth imaging system according to claim 1, wherein: the hyperspectral atlas module comprises a laser (12), a beam splitter (2), a long-pass filter (3), an imaging lens (4), a slit (5), a first aspheric lens (6), a prism-grating group (7), a second aspheric lens (8) and an area-array camera (9), arranged in sequence along the optical path; excitation light emitted by the laser (12) horizontally enters the beam splitter (2), which is set at 45 degrees; the reflected excitation light irradiates the surface of the object to be detected (1), which generates a fluorescence signal in the linear region illuminated by the excitation light; the fluorescence signal passes through the beam splitter (2) and the long-pass filter (3) and is then imaged onto the slit (5) by the imaging lens (4), while the diffusely reflected excitation light is filtered out by the beam splitter (2) and the long-pass filter (3);
after the fluorescence signal passes through the slit (5), the first aspheric lens (6) collimates it into parallel light, the prism-grating group (7) disperses and splits the light, and the second aspheric lens (8) focuses the dispersed fluorescence onto the area-array camera (9) to obtain fluorescence spectrum data corresponding to each pixel.
3. The four-dimensional hyperspectral depth imaging system according to claim 2, wherein: the prism grating group (7) is composed of two optical wedges and a blazed grating.
4. The four-dimensional hyperspectral depth imaging system according to claim 1, wherein: the three-dimensional topography scanning module is any three-dimensional imaging system, including structured-light systems, light-sheet topography instruments and binocular stereo vision systems.
5. A four-dimensional hyperspectral depth imaging system according to claim 1 or 4, wherein: the three-dimensional topography scanning module adopts a light-sheet topography instrument, which comprises an imaging lens (4) and an optical filter and area-array detector (10).
6. The four-dimensional hyperspectral depth imaging system according to claim 5, wherein, when the extension planes of the shot plane, the image plane and the lens plane intersect in a single straight line, the light-sheet topography instrument images the shot plane sharply over its full extent, and targets at different distances on the shot plane correspond one-to-one to points on the image plane, wherein the focal length of the lens is f, the distance from the lens center to the shot plane is L, the included angle between the lens plane and the shot plane is theta, the shot plane corresponds to the distance direction z, and, assuming the calibration distance is z_0, the distance is calculated by formula 1 below:
(Formula 1 is shown only as an image in the original and is not reproduced here.)
where P is the pixel index of the target point to be measured, P_0 is the pixel corresponding to the calibration distance z_0, and Pixpitch is the pixel pitch; the parameter P_0 is determined by formula 2 below:
(Formula 2 is shown only as an image in the original and is not reproduced here.)
CN201922286118.6U 2019-12-18 2019-12-18 Four-dimensional hyperspectral depth imaging system Active CN211205210U (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN201922286118.6U CN211205210U (en) 2019-12-18 2019-12-18 Four-dimensional hyperspectral depth imaging system

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
CN201922286118.6U CN211205210U (en) 2019-12-18 2019-12-18 Four-dimensional hyperspectral depth imaging system

Publications (1)

Publication Number Publication Date
CN211205210U true CN211205210U (en) 2020-08-07

Family

ID=71855712

Family Applications (1)

Application Number Title Priority Date Filing Date
CN201922286118.6U Active CN211205210U (en) 2019-12-18 2019-12-18 Four-dimensional hyperspectral depth imaging system

Country Status (1)

Country Link
CN (1) CN211205210U (en)


Legal Events

Date Code Title Description
GR01 Patent grant