Disclosure of Invention
The invention aims to provide a structured-light-based 3D (three-dimensional) stereo imaging capsule endoscope system and method that can display the internal information of the alimentary canal in an all-round manner, making it convenient for doctors to diagnose more accurately and effectively.
To achieve this purpose, the invention provides a structured-light-based 3D stereo imaging capsule endoscope system, characterized in that it comprises a capsule shell, a controller, a light-emitting subsystem for emitting 3D imaging structured light, an illuminating device for illuminating the inside of the digestive tract cavity of a subject, an imaging subsystem for shooting images of the inside of the digestive tract of the subject, and a capsule positioning subsystem for acquiring capsule position information and capsule posture information, wherein the controller, the light-emitting subsystem, the illuminating device, the imaging subsystem and the capsule positioning subsystem are all arranged in the capsule shell; the light-emitting subsystem control signal output end of the controller is connected with the signal input end of the light-emitting subsystem, the illumination light control signal output end of the controller is connected with the control signal input end of the illuminating device, the imaging signal output end of the imaging subsystem is connected with the imaging signal input end of the controller, and the positioning information communication end of the capsule positioning subsystem is connected with the positioning information communication end of the controller.
An endoscopic imaging method using the system, comprising the steps of:
Step 1: placing the capsule shell in the alimentary canal;
Step 2: the upper computer controls the controller through the data transmission subsystem, and the controller controls the illuminating device to turn on;
Step 3: the controller controls the light source to emit white light, the white light emitted by the light source irradiates the structured light generating module, the structured light generating module filters the white light to form the required 3D imaging structured light, and the 3D imaging structured light irradiates the surface of a target object in the alimentary canal to form a 3D imaging light band;
Step 4: the imaging subsystem images the 3D imaging light band on the surface of the target object in the alimentary canal and transmits the imaging information to the controller, the controller transmits the imaging information to the upper computer through the data transmission subsystem, the upper computer decodes the imaging information by using the coding information corresponding to the arrangement order of the optical filters and the shading baffles in the structured light generating module, and the spatial coordinates of the target object surface in 3D imaging are calculated from the decoded information;
Step 5: the capsule positioning subsystem transmits the obtained posture and position information of the capsule shell to the controller, the controller transmits the posture and position information of the capsule shell to the upper computer through the data transmission subsystem, and the upper computer converts the spatial coordinates of the target object in 3D imaging into world coordinates according to the spatial coordinates of the target object surface in 3D imaging and the posture and position information of the capsule shell at that moment, forming point cloud data;
Step 6: obtaining corresponding point cloud data at different positions of the capsule shell in the alimentary canal by the method of steps 2 to 5;
Step 7: the upper computer registers and fuses the point cloud data obtained at different positions in the alimentary canal to obtain complete point cloud information, performs three-dimensional modeling on all the point cloud data of the entire alimentary canal, and displays the three-dimensional model.
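For clarity, the data flow of steps 2 to 7 can be outlined in code. The sketch below is purely illustrative and is not part of the claimed system: the callables decode, triangulate, to_world and fuse are hypothetical stand-ins for the decoding, coordinate calculation, coordinate conversion and registration/fusion operations described in the steps above.

```python
from typing import Callable, Iterable, List, Sequence, Tuple

Point = Tuple[float, float, float]  # (x, y, z) in world coordinates

def reconstruct_tract(
    frames: Iterable,                                        # frames with image + capsule pose
    decode: Callable[[object], list],                        # step 4: decode light bands by color
    triangulate: Callable[[list], List[Point]],              # step 4: coordinates from the light plane
    to_world: Callable[[List[Point], object], List[Point]],  # step 5: apply capsule pose
    fuse: Callable[[Sequence[List[Point]]], List[Point]],    # step 7: register and fuse slices
) -> List[Point]:
    """Illustrative outline of steps 2-7: each frame captured at one capsule
    position yields a point cloud slice in world coordinates (steps 4-6);
    all slices are then registered and fused into one cloud (step 7)."""
    slices: List[List[Point]] = []
    for frame in frames:  # steps 2-3 (illumination, structured light) run on the capsule itself
        bands = decode(frame)
        points_cam = triangulate(bands)
        slices.append(to_world(points_cam, frame))
    return fuse(slices)
```

The upper computer would then build and display the 3D model from the fused point cloud, as stated in step 7.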
The invention has the beneficial effects that:
1. the invention can observe three-dimensional information inside the alimentary canal without additional auxiliary equipment;
2. the invention can obtain complete three-dimensional information of the digestive tract and carry out three-dimensional modeling;
3. the invention replaces a conventional camera with the light-emitting subsystem and the imaging subsystem, and improves the precision of 3D imaging by using structured light to assist the calculation;
4. the invention can obtain distance information, and thereby the actual size of a lesion area in the digestive tract, which facilitates diagnosis by doctors;
5. the invention uses structured-light-based 3D imaging, can observe the information inside the alimentary canal in an all-round, multi-angle manner, and improves the detection rate.
Drawings
FIG. 1 is a schematic side sectional view of the present invention;
FIG. 2 is a schematic top view of the present invention;
FIG. 3 is a functional block diagram of the present invention;
FIG. 4 is a schematic diagram of the 3D imaging setup process based on structured light technology according to the present invention;
FIG. 5 is a schematic structural diagram of the structured light generating module according to the present invention.
In the figures: 1, capsule shell; 1.1, transparent half shell; 1.2, non-transparent half shell; 2, light-emitting subsystem; 2.1, light source; 2.2, structured light generating module; 2.3, optical filter; 2.4, shading baffle; 3, imaging subsystem; 3.1, image sensor; 3.2, optical lens; 4, illuminating device; 5, controller; 6, data transmission subsystem; 7, capsule positioning subsystem; 8, power supply; 9, flexible circuit mounting plate; 9.1, upper horizontal flexible circuit mounting plate; 9.2, vertical flexible circuit mounting plate; 9.3, lower horizontal flexible circuit mounting plate.
Detailed Description
The invention is described in further detail below with reference to the following figures and specific examples:
The 3D stereoscopic imaging capsule endoscope system based on structured light as shown in FIGS. 1-3 comprises a capsule shell 1, a controller 5, a light-emitting subsystem 2 for emitting 3D imaging structured light, an illuminating device 4 for illuminating the inside of the digestive tract cavity of a subject, an imaging subsystem 3 for shooting images of the inside of the digestive tract of the subject, and a capsule positioning subsystem 7 for acquiring capsule position information and capsule posture information, wherein the controller 5, the light-emitting subsystem 2, the illuminating device 4, the imaging subsystem 3 and the capsule positioning subsystem 7 are all arranged in the capsule shell 1; the light-emitting subsystem control signal output end of the controller 5 is connected with the signal input end of the light-emitting subsystem 2, the illumination light control signal output end of the controller 5 is connected with the control signal input end of the illuminating device 4, the imaging signal output end of the imaging subsystem 3 is connected with the imaging signal input end of the controller 5, and the positioning information communication end of the capsule positioning subsystem 7 is connected with the positioning information communication end of the controller 5.
In the above technical solution, the light-emitting subsystem 2 and the imaging subsystem 3 are located at the two ends of the capsule shell 1 and face the same side, and the distance between the light-emitting subsystem 2 and the imaging subsystem 3 ranges from 15 mm to 25 mm. To ensure that the light-emitting subsystem 2 and the imaging subsystem 3 work simultaneously and to avoid the influence of digestive tract movement, the delay between the two is not more than 1 ms.
In the above technical solution, the light-emitting subsystem 2 irradiates the object to be photographed inside the alimentary canal with 3D structured light, the imaging subsystem 3 images the object inside the alimentary canal, and the capsule positioning subsystem 7 records the position and attitude data of the capsule. After receiving the data from the imaging subsystem 3 and the capsule positioning subsystem 7, the data transmission subsystem 6 processes the data and sends it to the external upper computer; the upper computer solves the 3D information of the object inside the alimentary canal and then performs 3D display, which facilitates all-round observation of the object inside the alimentary canal.
In the above technical solution, the capsule shell 1 is formed by integrally connecting a transparent half shell 1.1 of the upper half part and a non-transparent half shell 1.2 of the lower half part. The light outlet of the structured light generating module 2.2 faces the transparent half shell 1.1, and the light outlet of the illuminating device 4 faces the transparent half shell 1.1. The transparent half shell 1.1 allows the light of the light-emitting subsystem 2 to pass out of the capsule and allows the imaging subsystem 3 to receive light through it.
In the above technical solution, the light-emitting subsystem 2 includes a light source 2.1 and a structured light generating module 2.2; the light-emitting subsystem control signal output end of the controller 5 is connected with the signal input end of the light source 2.1, the optical axis of the light source 2.1 is coaxial with the axis of the optical signal input end of the structured light generating module 2.2, and the included angle α between the light emitting surface of the structured light generating module 2.2 and the axis of the capsule shell 1 is 0 to 45 degrees, which makes the pattern captured by the imaging subsystem 3 clearer.
In the above technical solution, the structured light generating module 2.2 is formed by alternately arranging a plurality of optical filters 2.3 and a plurality of shading baffles 2.4 (i.e., a shading baffle is arranged between every two adjacent optical filters), and the color arrangement order of the plurality of optical filters 2.3 is determined according to a preset 3D imaging structured light coding order. As shown in FIG. 5, the colors of the plurality of optical filters 2.3 are arranged in the order green, blue, red, yellow and cyan.
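As an illustration of how this color order can act as a code, the sketch below (not part of the patent; the reference RGB values are assumed, and would in practice come from calibration) maps the mean color of an imaged light band back to the index of the optical filter, and hence of the light plane, that produced it.

```python
import numpy as np

# Assumed reference RGB values for the filter colors of FIG. 5 (illustrative only).
FILTER_CODE = [
    ("green",  (0, 255, 0)),
    ("blue",   (0, 0, 255)),
    ("red",    (255, 0, 0)),
    ("yellow", (255, 255, 0)),
    ("cyan",   (0, 255, 255)),
]

def decode_band(mean_rgb):
    """Return the index of the light plane (optical filter 2.3) whose
    reference color is closest to the band's mean RGB value."""
    rgb = np.asarray(mean_rgb, dtype=float)
    distances = [np.linalg.norm(rgb - np.asarray(ref)) for _, ref in FILTER_CODE]
    return int(np.argmin(distances))

# Example: a greenish band decodes to plane index 0.
assert decode_band((20, 230, 15)) == 0
```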
In the above technical solution, each optical filter 2.3 and each shading baffle 2.4 corresponds to one light plane; to ensure system precision, each light plane must have a small imaging width on the object surface, so the widths of the optical filters 2.3 and the shading baffles 2.4 are equal and less than 1 mm.
In the above technical solution, the system further includes a data transmission subsystem 6 for communicating with an upper computer, and the communication end of the controller 5 is connected to the data transmission subsystem 6. The data transmission subsystem 6 comprises an ASIC (application-specific integrated circuit) chip and a wireless transmission module; the ASIC chip compresses the image data and then transmits it to the upper computer through the wireless transmission module.
In the above technical solution, a flexible circuit mounting plate 9 is fixed in the capsule housing 1, the flexible circuit mounting plate 9 includes an upper horizontal flexible circuit mounting plate 9.1, a vertical flexible circuit mounting plate 9.2 and a lower horizontal flexible circuit mounting plate 9.3, the upper horizontal flexible circuit mounting plate 9.1 and the lower horizontal flexible circuit mounting plate 9.3 are connected by the vertical flexible circuit mounting plate 9.2, and the upper horizontal flexible circuit mounting plate 9.1 and the lower horizontal flexible circuit mounting plate 9.3 are parallel to the axis of the capsule housing 1; the flexible circuit mounting plate 9 makes the capsule as soft as possible, which is convenient for the patient to swallow.
The light-emitting subsystem 2, the illuminating device 4 and the imaging subsystem 3 are mounted on the upper horizontal flexible circuit mounting plate 9.1, and the data transmission subsystem 6, the capsule positioning subsystem 7, the power supply 8 and the controller 5 are mounted on the lower horizontal flexible circuit mounting plate 9.3; the power supply 8 supplies power to the light-emitting subsystem 2, the imaging subsystem 3, the controller 5, the data transmission subsystem 6 and the capsule positioning subsystem 7, respectively.
In the above technical solution, the imaging subsystem 3 includes a CMOS (complementary metal-oxide-semiconductor) image sensor 3.1 and an optical lens 3.2; the optical lens 3.2 is installed at the image sensing end of the CMOS image sensor 3.1 (the CMOS image sensor 3.1 has the advantages of small size and low power consumption, and is suitable for being placed in a capsule), and the signal output end of the CMOS image sensor 3.1 is connected to the imaging signal input end of the controller 5. The diagonal dimension of the CMOS image sensor 3.1 is 5.658 mm, and the CMOS image sensor 3.1 collects images at a frame rate of 2 frames/s.
An endoscopic imaging method using the system comprises the following steps:
Step 1: placing the capsule shell 1 in the alimentary canal;
Step 2: the upper computer controls the controller 5 through the data transmission subsystem 6, the controller 5 controls the illuminating device 4 to turn on, the imaging subsystem 3 images the interior of the digestive tract, and the illuminating device 4 is turned off after imaging is finished;
Step 3: the controller 5 controls the light source 2.1 to emit white light, the white light emitted by the light source 2.1 irradiates the structured light generating module 2.2, the structured light generating module 2.2 filters the white light to form the required 3D imaging structured light, and the 3D imaging structured light irradiates the surface of a target object in the alimentary canal to form a 3D imaging light band;
Step 4: the imaging subsystem 3 images the 3D imaging light band on the surface of the target object in the alimentary canal, as shown in FIG. 4, and transmits the imaging information to the controller 5; the controller 5 transmits the imaging information to the upper computer through the data transmission subsystem 6, the upper computer decodes the imaging information by using the coding information corresponding to the arrangement order of the optical filters 2.3 and the shading baffles 2.4 in the structured light generating module 2.2, and the spatial coordinates of the target object surface in 3D imaging are calculated from the decoded information;
Step 5: the capsule positioning subsystem 7 transmits the obtained attitude and position information of the capsule shell 1 at that moment to the controller 5, the controller 5 transmits this attitude and position information to the upper computer through the data transmission subsystem 6, the upper computer converts the spatial coordinates of the target object in 3D imaging into world coordinates according to the spatial coordinates of the target object surface in 3D imaging and the attitude and position information of the capsule shell 1 at that moment, forming point cloud data, and the upper computer filters and denoises the point cloud data (an illustrative sketch of this coordinate conversion is given after the steps);
Step 6: obtaining corresponding point cloud data at different positions of the capsule shell 1 in the alimentary canal by the method of steps 2 to 5;
Step 7: the upper computer registers and fuses the point cloud data obtained at different positions in the alimentary canal to obtain complete point cloud information, performs three-dimensional modeling on all the point cloud data of the entire alimentary canal, and displays the three-dimensional model.
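To illustrate the coordinate conversion of step 5, the sketch below transforms points from the capsule (camera) frame into world coordinates using the capsule position and attitude reported by the positioning subsystem 7; the representation of the attitude as roll/pitch/yaw angles is an assumption made for the example, not a detail taken from the patent.

```python
import numpy as np
from scipy.spatial.transform import Rotation

def camera_points_to_world(points_cam, position, attitude_deg):
    """Convert 3D points from the capsule frame to world coordinates.

    points_cam   : (N, 3) array of coordinates obtained from formula (1)
    position     : (3,)  capsule position reported by positioning subsystem 7
    attitude_deg : (3,)  capsule attitude as roll, pitch, yaw in degrees
                   (an assumed attitude representation for this sketch)
    """
    R = Rotation.from_euler("xyz", attitude_deg, degrees=True).as_matrix()
    return np.asarray(points_cam, dtype=float) @ R.T + np.asarray(position, dtype=float)
```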
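The patent does not specify how the point cloud slices are registered in step 7; the sketch below uses a basic point-to-point ICP refinement (the coarse alignment already comes from the capsule pose) followed by a simple voxel-grid fusion, as one conventional way to carry out this step.

```python
import numpy as np
from scipy.spatial import cKDTree

def best_rigid_transform(src, dst):
    """Least-squares rigid transform (Kabsch) mapping src onto dst."""
    c_src, c_dst = src.mean(axis=0), dst.mean(axis=0)
    H = (src - c_src).T @ (dst - c_dst)
    U, _, Vt = np.linalg.svd(H)
    R = Vt.T @ U.T
    if np.linalg.det(R) < 0:          # guard against reflections
        Vt[-1] *= -1
        R = Vt.T @ U.T
    t = c_dst - R @ c_src
    return R, t

def icp_refine(src, dst, iterations=20):
    """Refine the alignment of an overlapping slice src against dst.
    A few iterations suffice because the capsule pose already provides
    a coarse alignment."""
    dst = np.asarray(dst, dtype=float)
    aligned = np.asarray(src, dtype=float).copy()
    tree = cKDTree(dst)
    for _ in range(iterations):
        _, idx = tree.query(aligned)                   # nearest neighbours in dst
        R, t = best_rigid_transform(aligned, dst[idx])
        aligned = aligned @ R.T + t
    return aligned

def fuse(slices, voxel=1.0):
    """Concatenate registered slices and thin them on a voxel grid so
    that overlapping regions are not stored twice."""
    points = np.vstack(slices)
    keys = np.floor(points / voxel).astype(np.int64)
    _, keep = np.unique(keys, axis=0, return_index=True)
    return points[keep]
```

The fused point cloud can then be meshed and displayed by the upper computer as the three-dimensional model of the alimentary canal, as described in step 7.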
Before step 1 of the above technical solution, initial parameter calibration needs to be performed, including parameter calibration of the CMOS image sensor 3.1 and system parameter calibration. The parameter calibration of the CMOS image sensor 3.1 obtains the lens distortion and the intrinsic parameters including the focal length; the system calibration obtains the distortion parameters of the light-emitting subsystem 2 and the parameters of the light planes generated by the light-emitting subsystem 2.
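The patent states only that the light plane parameters are obtained during system calibration; as an assumed illustration, the coefficients A, B, C, D of each plane can be fitted by least squares to 3D points measured on a calibration target intersected by that plane.

```python
import numpy as np

def fit_light_plane(points):
    """Fit A, B, C, D with Ax + By + Cz + D = 0 to 3D calibration points.
    The plane normal is the right singular vector of the centred points
    with the smallest singular value (a standard least-squares plane fit)."""
    pts = np.asarray(points, dtype=float)
    centroid = pts.mean(axis=0)
    _, _, Vt = np.linalg.svd(pts - centroid)
    A, B, C = Vt[-1]                  # unit normal of the plane
    D = -Vt[-1] @ centroid
    return A, B, C, D
```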
In step 3 of the above technical solution, one light plane Ax + By + Cz + D = 0 formed by the structured light generating module 2.2 forms one light band on the surface of the object (the object is irradiated by a plurality of light planes at the same time, which increases the processing speed). The imaging subsystem 3 images the light band, which contains the shape information of the object surface, and the spatial coordinates (x, y, z) of the target surface in 3D imaging are calculated using the following formula (1):

x = x'z / f,  y = y'z / f,  z = -Df / (Ax' + By' + Cf)    (1)

where f is the focal length of the imaging subsystem 3, x' and y' are the pixel coordinates of the target in the imaging subsystem 3, x, y and z are the spatial coordinates in 3D imaging, the light plane Ax + By + Cz + D = 0 is expressed in the 3D imaging reference frame, and the coefficients A, B, C and D are obtained during system calibration.
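Applied per decoded pixel, formula (1) can be evaluated directly; a minimal sketch (illustrative only, with the plane coefficients assumed to come from the system calibration described above):

```python
def triangulate_pixel(x_p, y_p, f, plane):
    """Evaluate formula (1): intersect the viewing ray of pixel (x', y')
    (pinhole model, focal length f) with the light plane
    Ax + By + Cz + D = 0, returning the 3D point (x, y, z)."""
    A, B, C, D = plane
    z = -D * f / (A * x_p + B * y_p + C * f)
    return x_p * z / f, y_p * z / f, z
```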
Details not described in this specification are well known to those skilled in the art.