CN108392165A - Method and apparatus for an endoscope with distance measurement for object scaling - Google Patents
Method and apparatus for an endoscope with distance measurement for object scaling
- Publication number
- CN108392165A CN108392165A CN201810091804.5A CN201810091804A CN108392165A CN 108392165 A CN108392165 A CN 108392165A CN 201810091804 A CN201810091804 A CN 201810091804A CN 108392165 A CN108392165 A CN 108392165A
- Authority
- CN
- China
- Prior art keywords
- image
- capsule
- normal images
- camera
- range information
- Prior art date
- Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
- Withdrawn
Links
Classifications
-
- A—HUMAN NECESSITIES
- A61—MEDICAL OR VETERINARY SCIENCE; HYGIENE
- A61B—DIAGNOSIS; SURGERY; IDENTIFICATION
- A61B1/00—Instruments for performing medical examinations of the interior of cavities or tubes of the body by visual or photographical inspection, e.g. endoscopes; Illuminating arrangements therefor
- A61B1/04—Instruments for performing medical examinations of the interior of cavities or tubes of the body by visual or photographical inspection, e.g. endoscopes; Illuminating arrangements therefor combined with photographic or television appliances
- A61B1/041—Capsule endoscopes for imaging
-
- A—HUMAN NECESSITIES
- A61—MEDICAL OR VETERINARY SCIENCE; HYGIENE
- A61B—DIAGNOSIS; SURGERY; IDENTIFICATION
- A61B1/00—Instruments for performing medical examinations of the interior of cavities or tubes of the body by visual or photographical inspection, e.g. endoscopes; Illuminating arrangements therefor
- A61B1/00002—Operational features of endoscopes
- A61B1/00004—Operational features of endoscopes characterised by electronic signal processing
- A61B1/00009—Operational features of endoscopes characterised by electronic signal processing of image signals during a use of endoscope
-
- A—HUMAN NECESSITIES
- A61—MEDICAL OR VETERINARY SCIENCE; HYGIENE
- A61B—DIAGNOSIS; SURGERY; IDENTIFICATION
- A61B1/00—Instruments for performing medical examinations of the interior of cavities or tubes of the body by visual or photographical inspection, e.g. endoscopes; Illuminating arrangements therefor
- A61B1/00163—Optical arrangements
-
- A—HUMAN NECESSITIES
- A61—MEDICAL OR VETERINARY SCIENCE; HYGIENE
- A61B—DIAGNOSIS; SURGERY; IDENTIFICATION
- A61B1/00—Instruments for performing medical examinations of the interior of cavities or tubes of the body by visual or photographical inspection, e.g. endoscopes; Illuminating arrangements therefor
- A61B1/04—Instruments for performing medical examinations of the interior of cavities or tubes of the body by visual or photographical inspection, e.g. endoscopes; Illuminating arrangements therefor combined with photographic or television appliances
- A61B1/05—Instruments for performing medical examinations of the interior of cavities or tubes of the body by visual or photographical inspection, e.g. endoscopes; Illuminating arrangements therefor combined with photographic or television appliances characterised by the image sensor, e.g. camera, being in the distal end portion
-
- A—HUMAN NECESSITIES
- A61—MEDICAL OR VETERINARY SCIENCE; HYGIENE
- A61B—DIAGNOSIS; SURGERY; IDENTIFICATION
- A61B1/00—Instruments for performing medical examinations of the interior of cavities or tubes of the body by visual or photographical inspection, e.g. endoscopes; Illuminating arrangements therefor
- A61B1/06—Instruments for performing medical examinations of the interior of cavities or tubes of the body by visual or photographical inspection, e.g. endoscopes; Illuminating arrangements therefor with illuminating arrangements
- A61B1/0605—Instruments for performing medical examinations of the interior of cavities or tubes of the body by visual or photographical inspection, e.g. endoscopes; Illuminating arrangements therefor with illuminating arrangements for spatially modulated illumination
-
- A—HUMAN NECESSITIES
- A61—MEDICAL OR VETERINARY SCIENCE; HYGIENE
- A61B—DIAGNOSIS; SURGERY; IDENTIFICATION
- A61B1/00—Instruments for performing medical examinations of the interior of cavities or tubes of the body by visual or photographical inspection, e.g. endoscopes; Illuminating arrangements therefor
- A61B1/06—Instruments for performing medical examinations of the interior of cavities or tubes of the body by visual or photographical inspection, e.g. endoscopes; Illuminating arrangements therefor with illuminating arrangements
- A61B1/0646—Instruments for performing medical examinations of the interior of cavities or tubes of the body by visual or photographical inspection, e.g. endoscopes; Illuminating arrangements therefor with illuminating arrangements with illumination filters
-
- A—HUMAN NECESSITIES
- A61—MEDICAL OR VETERINARY SCIENCE; HYGIENE
- A61B—DIAGNOSIS; SURGERY; IDENTIFICATION
- A61B5/00—Measuring for diagnostic purposes; Identification of persons
- A61B5/0059—Measuring for diagnostic purposes; Identification of persons using light, e.g. diagnosis by transillumination, diascopy, fluorescence
- A61B5/0077—Devices for viewing the surface of the body, e.g. camera, magnifying lens
-
- A—HUMAN NECESSITIES
- A61—MEDICAL OR VETERINARY SCIENCE; HYGIENE
- A61B—DIAGNOSIS; SURGERY; IDENTIFICATION
- A61B5/00—Measuring for diagnostic purposes; Identification of persons
- A61B5/0059—Measuring for diagnostic purposes; Identification of persons using light, e.g. diagnosis by transillumination, diascopy, fluorescence
- A61B5/0082—Measuring for diagnostic purposes; Identification of persons using light, e.g. diagnosis by transillumination, diascopy, fluorescence adapted for particular medical purposes
- A61B5/0084—Measuring for diagnostic purposes; Identification of persons using light, e.g. diagnosis by transillumination, diascopy, fluorescence adapted for particular medical purposes for introduction into the body, e.g. by catheters
-
- A—HUMAN NECESSITIES
- A61—MEDICAL OR VETERINARY SCIENCE; HYGIENE
- A61B—DIAGNOSIS; SURGERY; IDENTIFICATION
- A61B5/00—Measuring for diagnostic purposes; Identification of persons
- A61B5/07—Endoradiosondes
- A61B5/073—Intestinal transmitters
-
- A—HUMAN NECESSITIES
- A61—MEDICAL OR VETERINARY SCIENCE; HYGIENE
- A61B—DIAGNOSIS; SURGERY; IDENTIFICATION
- A61B5/00—Measuring for diagnostic purposes; Identification of persons
- A61B5/103—Detecting, measuring or recording devices for testing the shape, pattern, colour, size or movement of the body or parts thereof, for diagnostic purposes
- A61B5/1032—Determining colour for diagnostic purposes
-
- A—HUMAN NECESSITIES
- A61—MEDICAL OR VETERINARY SCIENCE; HYGIENE
- A61B—DIAGNOSIS; SURGERY; IDENTIFICATION
- A61B5/00—Measuring for diagnostic purposes; Identification of persons
- A61B5/103—Detecting, measuring or recording devices for testing the shape, pattern, colour, size or movement of the body or parts thereof, for diagnostic purposes
- A61B5/107—Measuring physical dimensions, e.g. size of the entire body or parts thereof
- A61B5/1072—Measuring physical dimensions, e.g. size of the entire body or parts thereof measuring distances on the body, e.g. measuring length, height or thickness
-
- A—HUMAN NECESSITIES
- A61—MEDICAL OR VETERINARY SCIENCE; HYGIENE
- A61B—DIAGNOSIS; SURGERY; IDENTIFICATION
- A61B5/00—Measuring for diagnostic purposes; Identification of persons
- A61B5/103—Detecting, measuring or recording devices for testing the shape, pattern, colour, size or movement of the body or parts thereof, for diagnostic purposes
- A61B5/107—Measuring physical dimensions, e.g. size of the entire body or parts thereof
- A61B5/1076—Measuring physical dimensions, e.g. size of the entire body or parts thereof for measuring dimensions inside body cavities, e.g. using catheters
-
- A—HUMAN NECESSITIES
- A61—MEDICAL OR VETERINARY SCIENCE; HYGIENE
- A61B—DIAGNOSIS; SURGERY; IDENTIFICATION
- A61B5/00—Measuring for diagnostic purposes; Identification of persons
- A61B5/42—Detecting, measuring or recording for evaluating the gastrointestinal, the endocrine or the exocrine systems
-
- A—HUMAN NECESSITIES
- A61—MEDICAL OR VETERINARY SCIENCE; HYGIENE
- A61B—DIAGNOSIS; SURGERY; IDENTIFICATION
- A61B5/00—Measuring for diagnostic purposes; Identification of persons
- A61B5/72—Signal processing specially adapted for physiological signals or for diagnostic purposes
- A61B5/7225—Details of analog processing, e.g. isolation amplifier, gain or sensitivity adjustment, filtering, baseline or drift compensation
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06T—IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
- G06T7/00—Image analysis
- G06T7/50—Depth or shape recovery
- G06T7/521—Depth or shape recovery from laser ranging, e.g. using interferometry; from the projection of structured light
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06T—IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
- G06T2207/00—Indexing scheme for image analysis or image enhancement
- G06T2207/10—Image acquisition modality
- G06T2207/10016—Video; Image sequence
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06T—IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
- G06T2207/00—Indexing scheme for image analysis or image enhancement
- G06T2207/10—Image acquisition modality
- G06T2207/10024—Color image
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06T—IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
- G06T2207/00—Indexing scheme for image analysis or image enhancement
- G06T2207/10—Image acquisition modality
- G06T2207/10068—Endoscopic image
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06T—IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
- G06T2207/00—Indexing scheme for image analysis or image enhancement
- G06T2207/30—Subject of image; Context of image processing
- G06T2207/30004—Biomedical image processing
- G06T2207/30092—Stomach; Gastric
Abstract
The present invention relates to methods and apparatus for an endoscope with distance measurement for object scaling. Methods and apparatus for capturing images of a scene using a capsule device comprising a camera are disclosed. As the capsule device travels through the human gastrointestinal (GI) tract, the camera captures an image sequence. Also while the capsule device travels through the human GI tract, structured light is projected onto one or more objects in the field of view of the camera to capture structured-light images. The structured-light images are interleaved with the regular images in the image sequence. Distance information relating selected objects in the images to the capsule camera is derived. Both the image sequence and the distance information are output. A method of determining the size of an object of interest using the distance information is also disclosed. In another method, the distance information is used to scale objects or to adjust intensity.
Description
Cross Reference to Related Applications
The present invention is a continuation of PCT Patent Application No. PCT/US17/15668, filed on January 30, 2017, and claims priority to the PCT patent application. The present invention is also a continuation-in-part of U.S. Patent Application No. 14/884,788, filed on October 16, 2015, and claims priority to the U.S. patent application. The PCT patent application and the U.S. patent application are incorporated herein by reference in their entirety.
Technical field
The present invention relates to endoscopes for capturing images of the human gastrointestinal (GI) tract for examination purposes. In particular, the endoscope is able to measure the distance to objects in the field of view of the camera. The distance information can then be used to process the captured image sequence, for example, to measure the size of an object of interest or to stitch the image sequence so as to reduce viewing time.
Background
Devices for obtaining images of body cavities or passages are known in the art, including endoscopes and autonomous encapsulated cameras. Endoscopes are flexible or rigid tube systems that enter the body through an orifice or surgical opening, typically into the esophagus via the mouth or into the colon via the rectum. An image is formed at the distal end using a lens and transmitted to the proximal end, outside the body, through a lens-relay system or a coherent fiber-optic bundle. A conceptually similar instrument may record an image electronically at the distal end, for example using a CCD or CMOS array, and transfer the image data as an electrical signal to the proximal end through a cable. Endoscopes allow a physician control over the field of view and are therefore well-accepted diagnostic tools.
Capsule endoscopes are an alternative in vivo endoscope developed in recent years. For a capsule endoscope, a camera is housed in a swallowable capsule, together with a radio transmitter for transmitting data, primarily comprising images recorded by the digital camera, to a base-station receiver or transceiver and data recorder outside the body. The capsule may also include a radio receiver for receiving instructions or other data from a base-station transmitter. Instead of radio-frequency transmission, lower-frequency electromagnetic signals may be used. Power may be supplied inductively from an external inductor to an internal inductor within the capsule, or from a battery within the capsule.
An autonomous capsule camera system with on-board data storage is disclosed in U.S. Patent No. 7,983,458, granted on July 19, 2011, entitled "In Vivo Autonomous Camera with On-Board Data Storage or Digital Wireless Transmission in Regulatory Approved Band". A capsule camera with on-board storage archives the captured images in on-board non-volatile memory. The capsule camera is retrieved upon its exit from the human body. The images stored in the non-volatile memory of the retrieved capsule camera are then accessed through an output port on the capsule camera.
When an endoscope is used to image the human GI tract, one main purpose is to identify any possible anomaly. If an anomaly is found, it is further of interest to determine the characteristics of the anomaly, such as its size. The captured images are examined by a medical professional for inspection or diagnosis. The number of images captured is usually 25,000 or more. Even for a skilled professional, a long examination time is required to review the images. Therefore, image stitching has been used to reduce the number of images to view. For example, in PCT Patent Application Publication No. WO2014/193670A2, published on December 4, 2014, image stitching is disclosed for images captured using a capsule camera. It is desirable to develop methods or apparatus that can further improve the efficiency of image stitching.
Summary of the Invention
Methods and apparatus for capturing images of a scene using a capsule camera are disclosed. After the capsule camera is swallowed by a patient, the capsule camera captures an image sequence as it travels through the human GI tract. Also, while the capsule camera travels through the human GI tract, structured-light images are captured. The regular images and either the structured-light images or information derived from the structured-light images are output. Distance information relating objects in the regular images to the capsule camera can be derived. Correlation information between the distance information and the corresponding regular images is output. The correlation information may correspond to the frame number or capture time of the corresponding regular image.
The present invention also discloses a method of determining the size of an object of interest in an image. Regular images and structured-light images captured by a capsule camera are received. The distance information relating objects in the regular images to the camera is derived from the structured-light images. The size of an object of interest in a selected regular image can be determined according to the pixel data of the target object and the distance information. The size of the object of interest is determined according to the image size of the object of interest in the selected regular image, scaled by the object distance to the capsule camera and the focal length of the capsule camera. The image size of the object of interest is measured by the number of pixels within the object of interest in the selected regular image.
The present invention also discloses a method of stitching the regular images into a stitched image sequence using information comprising the distance information. Again, the distance information can be derived from the structured-light images. In one embodiment, the distance information is used to scale objects in the regular images in order to stitch the regular images. In another embodiment, the distance information is used to adjust the image intensity of the regular images in order to stitch the image sequence.
Brief Description of the Drawings
Fig. 1 illustrates an example of the relationship between the size of an object, the size of the corresponding object image, the object distance, and the focal length of the camera.
Fig. 2A and Fig. 2B illustrate an example of the different sizes of the object image of the same object in two images captured at two different object distances.
Fig. 3 illustrates an exemplary flowchart of capturing regular images and structured-light images according to an embodiment of the present invention, where the structured-light images are used to derive the distance information relating objects in the regular images to the camera.
Fig. 4 illustrates an exemplary flowchart of determining the size of an object of interest in a regular image according to the pixel data of the object and the distance information, according to an embodiment of the present invention.
Fig. 5 illustrates an exemplary flowchart of stitching regular images into a stitched image sequence using information comprising the distance information, according to an embodiment of the present invention.
Detailed Description
It will be readily understood that the components of the present invention, as described and illustrated in the figures herein, may be arranged and designed in a wide variety of different configurations. Thus, the following more detailed description of the embodiments of the present invention, as represented in the figures, is not intended to limit the scope of the invention as claimed, but is merely representative of selected embodiments of the invention. References throughout this specification to "one embodiment", "an embodiment", or similar language mean that a particular feature, structure, or characteristic described in connection with the embodiment may be included in at least one embodiment of the present invention. Thus, appearances of the phrases "in one embodiment" or "in an embodiment" in various places throughout this specification do not necessarily all refer to the same embodiment.
Furthermore, the described features, structures, or characteristics may be combined in any suitable manner in one or more embodiments. One skilled in the relevant art will recognize, however, that the invention can be practiced without one or more of the specific details, or with other methods, components, and so forth. In other instances, well-known structures or operations are not shown or described in detail to avoid obscuring aspects of the invention. The illustrated embodiments of the invention will be best understood by reference to the drawings, wherein like parts are designated by like numerals throughout. The following description is intended only by way of example, and simply illustrates certain selected embodiments of apparatus and methods that are consistent with the invention as claimed.
An endoscope is typically inserted into the human body through a natural opening such as the mouth or anus. Therefore, it is preferable for the endoscope to be small in size so as to be minimally invasive. As mentioned above, endoscopes can be used to examine the human gastrointestinal (GI) tract. The captured image sequence may be viewed to identify any possible anomaly. If an anomaly is found, it is of interest to identify the characteristics of the anomaly, such as its size. Accordingly, the present invention discloses an endoscope that includes a distance-measuring device to measure the distances between the camera and different locations of objects in the field of view of the camera.
Various devices are known for measuring the distance between the camera and different locations of objects in the field of view of the camera. For example, one class of distance-measuring devices determines the distance according to the time of flight (ToF) or the phase shift of a light source. The light source can be a laser or a light-emitting diode (LED). A light sensor is used to detect the returning light. The time difference or phase difference between the light emitted by the light source and the light received is used to determine the distance. Ultrasound is another signal source that can be used to measure the distance between an object and the camera for GI imaging applications. Such distance-measuring devices are well known, and there is ample literature describing distance measurement using light or ultrasonic sources based on ToF or phase shift. Therefore, the details of distance-measuring devices using light or ultrasonic sources based on ToF or phase shift are omitted in this disclosure.
If a light source is used to measure distance, the light used for distance measurement can be interleaved with the flash light that illuminates the GI tract during image capture. In this case, the light for distance measurement and the flash light for image capture are not applied at the same time, or at least one of the light sources is substantially dimmed. The distance information can be stored separately or together with the related images. The distance information may be captured before or after the related images. If the distance information is stored separately, correlation information of the related images (referred to as correlation information in this disclosure) is also stored so that the distance information can be used properly. The correlation information can be the capture time of the related image, such as a frame time or a frame number. If ultrasound is used to measure the distance, the ultrasonic distance measurement can occur at the same time as the image capture with the flash light illuminating the GI tract.
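The ToF principle referenced above can be illustrated with a brief sketch. This is an illustration added for clarity, not part of the patent: the distance follows from the round-trip time of the emitted light pulse as D = c·Δt/2.

```python
# Illustrative sketch of time-of-flight (ToF) ranging; the numeric
# values below are hypothetical, not taken from the patent.

SPEED_OF_LIGHT = 299_792_458.0  # meters per second

def tof_distance(round_trip_seconds: float) -> float:
    """Object distance from the round-trip time of a light pulse: D = c*dt/2."""
    return SPEED_OF_LIGHT * round_trip_seconds / 2.0

# A 0.5 ns round trip corresponds to roughly 7.5 cm:
distance_m = tof_distance(0.5e-9)
```

At the centimeter-scale distances typical of the GI tract, round-trip times are sub-nanosecond, which is one practical reason phase-shift variants may be preferred over direct pulse timing.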
Although distance-measuring devices using light or ultrasound based on ToF or phase shift are known, fitting such a device into an endoscope remains a challenging and costly task, because a large device is not suitable for endoscope applications. Therefore, other distance-measuring approaches are based on image processing of images captured by the image sensor. For example, one technique for capturing depth information uses color filters with suitably narrow passbands placed on top of selected sensor pixels, so that color information and depth information are captured at the same time. An ambient light source would project only insignificant energy within the filter passband onto the sensor. For the case of RGB pixels, a fourth type of pixel can be added to capture light with a spectrum within the passband of the filter placed over those pixels. Structured light with a spectrum substantially within the passband can then be projected onto the scene. However, this approach reduces the spatial resolution of the images or video captured with such an image sensor, and requires an unconventional color filter.
Another technique obtains depth information and 3D topology through visible structured-light patterns captured by an RGB sensor. However, the real-time images and/or video will be corrupted by the structured light added onto them. When structured light is captured, the depth or shape information of objects in the scene can be derived. The depth or shape information is then used with the images captured immediately before or after the structured-light image. Since regular images are captured by a capsule endoscope at a fairly slow frame rate (e.g., 5 frames per second), the scene in the image captured with structured light and the scene in the corresponding regular image can differ noticeably due to movement of the endoscope or peristalsis of the intestine. To improve the accuracy of the depth information derived from structured-light images, a shortened frame period for the structured-light image is disclosed in U.S. Patent Application No. 14/884,788, filed on October 16, 2015. Since the structured-light image is then close in time to the corresponding regular image, the derived depth information should be more accurate than depth information derived from a structured-light image with a longer frame period.
When a distance-measuring device based on structured light is used, the distance information is derived from the structured-light images. In other words, the raw distance information is stored in the form of structured-light images. In this case, the distance information (i.e., the structured-light images) can be stored separately or together with the related images captured with regular light. The distance information may be captured before or after the related images. If the distance information is stored separately, the correlation information of the related images (i.e., the correlation information) is also stored so that the distance information can be used properly. Techniques for deriving depth information from structured-light images are known in the art, and the details of depth derivation from structured-light images are omitted in this disclosure.
In an endoscope, the focal length is known by design. If the distance between an object and the camera (also referred to in this disclosure as the object distance) can be determined, the size of the object can be determined using only geometry. Fig. 1 illustrates a simplified example of determining object size from the object-camera distance. In the camera system, the image sensor is located at the focal plane 120 behind the lens 110. The camera can capture a scene subtending angle α in the field of view. The focal length f is the distance between the lens and the image sensor. The focal length is usually fixed for endoscope applications and is known by design. However, as the capsule endoscope travels through the GI tract, the object distance D varies with the position of the capsule endoscope and its relative angle to the GI wall being imaged. If the distance D is known, the size of an object can be determined from the captured image by measuring the size of the object image in the image. For example, if an object 130 with height H is at distance D from the camera, the object height H can be derived from the object image height h in the image according to

H = (D / f) · h.
In the above equation, h is measured from the image, the focal length f is known by design, and the distance D is determined by a selected distance-measuring device mentioned above. Therefore, if the distance can be determined, the object size can be derived. The physical size of the object image in the image could be measured. However, since the image is captured digitally, it may be more convenient to describe the size measurement in terms of the number of pixels. The physical size of the image sensor surface and the optical footprint on it are known, and the number of pixels is known (e.g., 320x240). Therefore, the size of an object image in the image can be measured as a number of pixels and converted to the physical object-image size in the image.
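The pixel-based measurement described above can be sketched as follows. The sensor pixel pitch and the example values are illustrative assumptions, not values from the patent; only the relation H = h·D/f is taken from the text.

```python
# Hedged sketch of object-size estimation from pixel counts, per the
# similar-triangles relation H = h * D / f described above. The pixel
# pitch and example values are hypothetical.

def object_height_m(pixels: int, pixel_pitch_m: float,
                    distance_m: float, focal_m: float) -> float:
    """Physical object height from its measured height in pixels.

    pixels        -- object image height in pixels
    pixel_pitch_m -- physical size of one sensor pixel (meters)
    distance_m    -- object distance D from the distance-measuring device
    focal_m       -- camera focal length f (known by design)
    """
    h = pixels * pixel_pitch_m          # image height on the sensor
    return h * distance_m / focal_m     # H = h * D / f

# Example: 60 px tall, 3 um pitch, D = 40 mm, f = 2 mm -> H = 3.6 mm
H = object_height_m(60, 3e-6, 0.04, 0.002)
```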
As indicated above, the object image size in an image is related to the actual object size and its distance from the camera. A relatively small object at a closer distance can appear to have the same size in the image as a larger object at a greater distance. For example, object 140 is smaller than object 130 but closer, and appears to have the same height as object 130 in the image. Therefore, the distance is important information for determining the object size. Accordingly, the distance-measuring devices disclosed above enable object size to be determined from images captured using an endoscope.
Distance information is also useful for image stitching. In the case of an ideal physical-object model, the variation of object sizes in the captured images could be handled implicitly by the registration process. Corresponding objects in different images can be identified and registered, because the different object sizes in different images are assumed to be taken into account by the registration process. Objects with different sizes in a target image frame would be matched to the corresponding objects in the reference frame. Similarly, a global motion model would be applied to the target image to scale the objects so that the images can be properly scaled and stitched. However, images of objects in the GI tract environment usually deviate considerably from the ideal physical-object model. In addition, when variables such as distance are involved in the optimization process, the iterative process may not always converge, or may converge to local minima. As is known in the art, an iterative process is typically used as part of the overall registration process. In one embodiment, the distance information is used for scaling. In particular, the distance information can be used to assist image registration and thus improve registration accuracy.
Fig. 2A and Fig. 2B illustrate an example of the different sizes of the object image of the same object in two images captured at two different object distances. In Fig. 2A, illustration 210 corresponds to the situation where the capsule 211 is at a greater distance from an object of interest 213 in the GI tract 212. Image 220 corresponds to the image captured for illustration 210. In Fig. 2B, illustration 230 corresponds to the situation where the capsule 211 is at a closer distance to the object of interest 213 in the GI tract 212. Image 240 corresponds to the image captured for illustration 230. Since image 240 is captured with the camera closer to the GI wall, the object image in image 240 appears larger than the object image in image 220. Therefore, the distance information can be used to scale the objects in the two images.
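One way to read the scaling step above: under the pinhole model, the image size of an object is proportional to 1/D, so an image captured at distance d_target can be resized by the factor d_target/d_ref to match a reference image captured at d_ref. The nearest-neighbor resize below is a minimal NumPy sketch of this idea, not the patent's registration method.

```python
# Hedged sketch of distance-based scaling prior to registration.
# Image size of an object is proportional to 1/D, so resizing the
# target by d_target / d_ref matches object sizes to the reference.
import numpy as np

def scale_to_reference(target: np.ndarray, d_target: float,
                       d_ref: float) -> np.ndarray:
    """Nearest-neighbor resize of `target` by d_target / d_ref."""
    s = d_target / d_ref
    h, w = target.shape[:2]
    new_h, new_w = max(1, round(h * s)), max(1, round(w * s))
    # Map each output pixel back to its nearest source pixel.
    rows = (np.arange(new_h) / s).astype(int).clip(0, h - 1)
    cols = (np.arange(new_w) / s).astype(int).clip(0, w - 1)
    return target[np.ix_(rows, cols)]
```

A capsule frame captured at half the reference distance would be shrunk to half its linear size before the two frames are registered and stitched.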
In the GI tract environment, images are always captured using a light source that illuminates the field of view. When the camera is closer to the intestinal wall, the objects being imaged will be brighter and the image intensity will be higher. On the other hand, when the camera is farther away from the intestinal wall, the objects being imaged will be darker and the image intensity will be lower. Therefore, the overall intensity of an image is related to the distance between the objects in the field of view and the camera. Since the object-camera distance is usually very short in the GI tract environment, the variation in the overall image intensity can be quite large. Such large intensity variations can degrade registration performance and thus reduce stitching efficiency.
Therefore, in another embodiment of the present invention, the image intensities of two images to be registered are adjusted according to the distance. Pixel intensity is generally inversely proportional to the square of the distance, or follows some other functional form. The relationship between pixel intensity and distance may also be tabulated instead of being expressed in functional form. The intensity of the image captured at the closer distance can be adjusted downward to match that of the other image captured at the longer distance. Alternatively, the intensity of the image captured at the longer distance can be adjusted upward to match that of the other image captured at the closer distance. After the intensities are adjusted to compensate for the variation caused by different distances, registration and image stitching should perform better.
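The inverse-square adjustment just described can be sketched as follows. This is illustrative only (the patent also allows other functional forms or a lookup table); the function name and the 8-bit clipping range are assumptions.

```python
import numpy as np

def match_intensity(image: np.ndarray, d_image: float, d_target: float) -> np.ndarray:
    """Adjust pixel intensities of an image captured at distance d_image
    to the illumination level expected at distance d_target, assuming
    intensity falls off as 1/distance**2 (8-bit range assumed for clipping)."""
    # gain < 1 darkens a closer (brighter) image toward a farther one;
    # gain > 1 brightens a farther image toward a closer one.
    gain = (d_image / d_target) ** 2
    return np.clip(image.astype(np.float64) * gain, 0.0, 255.0)
```

A tabulated intensity–distance relationship, as the text mentions, would simply replace the `gain` expression with a table lookup.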
Fig. 3 illustrates an exemplary flowchart of capturing an image sequence together with structured-light images according to an embodiment of the present invention, where the distance information is derived from the structured-light images. In step 310, the capsule device is administered to a patient. In step 320, non-structured light is projected from a non-structured light source onto the scene in the field of view of the camera. In step 330, regular images formed on the common image plane of the camera are captured using the camera while the capsule device travels through the human gastrointestinal tract. In step 340, structured light is projected from a structured light source onto the scene in the field of view of the camera. In step 350, structured-light images are captured using the camera while the capsule device travels through the human gastrointestinal tract, where the structured-light images are interleaved with the regular images in order to derive distance information related to objects in the regular images with respect to the camera. In step 360, the structured-light images, or information derived from the structured-light images, are outputted. In step 370, the distance information is outputted. Deriving the distance information from the structured-light image implies that individual pieces of distance information can be determined for more than one location in the image or the field of view; the present invention covers one or more points in the image.
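The interleaving of structured-light frames with regular frames (steps 330 through 350) can be pictured as a capture schedule. The sketch below is a hypothetical illustration — the patent does not prescribe a particular interleaving ratio, and `sl_period` is our assumption.

```python
def capture_schedule(n_frames: int, sl_period: int = 4):
    """Illustrative schedule: every sl_period-th slot is a structured-light
    frame used for distance derivation; all other slots carry regular
    images. Returns a list of (frame_index, kind) pairs."""
    return [(i, "structured" if (i + 1) % sl_period == 0 else "regular")
            for i in range(n_frames)]
```

Each regular frame can then be associated with the nearest structured-light frame, which is one way to realize the correlation between distance information and corresponding regular images mentioned elsewhere in this document.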
Fig. 4 illustrates an exemplary flowchart of determining the size of an object of interest in an image sequence according to the image sequence and the distance information, according to an embodiment of the present invention. In step 410, regular images captured by the capsule camera are received while the capsule camera travels through the human gastrointestinal tract. In step 420, distance information related to a target object in a selected regular image with respect to the capsule camera is determined. In step 430, the size of the object of interest in the selected regular image is determined according to the pixel data of the target object and the distance information. In step 440, size information regarding the size of the object of interest is outputted.
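The size determination in steps 420 and 430 reduces to pinhole geometry: the physical extent equals the on-sensor extent (pixel count times pixel pitch) scaled by object distance over focal length. A minimal sketch follows, with hypothetical parameter names and example values (the patent gives no pixel pitch or focal length figures).

```python
def object_size_mm(pixel_extent: int, pixel_pitch_mm: float,
                   focal_length_mm: float, distance_mm: float) -> float:
    """Estimate the physical extent of an object of interest.

    pixel_extent    -- extent of the object in pixels in the regular image
    pixel_pitch_mm  -- sensor pixel pitch (mm per pixel)
    focal_length_mm -- camera focal length
    distance_mm     -- object distance derived from the structured light

    Pinhole model: size_on_sensor / focal_length = size_in_scene / distance.
    """
    size_on_sensor_mm = pixel_extent * pixel_pitch_mm
    return size_on_sensor_mm * distance_mm / focal_length_mm
```

For example, an object spanning 100 pixels on a sensor with 2 micron pitch, imaged through a 2 mm focal length at a 20 mm object distance, would measure about 2 mm.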
Fig. 5 illustrates an exemplary flowchart of stitching an image sequence using information that includes the distance information, and generating a stitched image sequence, according to an embodiment of the present invention. In step 510, regular images captured by the capsule camera are received while the capsule camera travels through the human gastrointestinal tract. In step 520, distance information related to one or more objects in the regular images captured by the capsule camera, with respect to the capsule camera, is derived while the capsule camera travels through the human gastrointestinal tract. In step 530, the regular images are stitched using information that includes the distance information, so as to generate a stitched image sequence. In step 540, the stitched image sequence is outputted.
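Before the stitching of step 530, the two distance-based corrections described in this section — geometric scaling and inverse-square intensity compensation — can be applied together. The following is an illustrative sketch under those assumptions (the function name is ours); a real pipeline would continue with feature matching, registration, and blending.

```python
import numpy as np

def harmonize(img_close: np.ndarray, d_close: float, d_far: float) -> np.ndarray:
    """Map an image captured at distance d_close onto the scale and
    brightness expected at distance d_far, so that registration for
    stitching compares like with like."""
    s = d_close / d_far          # apparent size ~ 1/distance
    gain = s ** 2                # brightness ~ 1/distance**2
    h, w = img_close.shape
    nh, nw = max(1, int(round(h * s))), max(1, int(round(w * s)))
    rows = np.clip((np.arange(nh) / s).astype(int), 0, h - 1)
    cols = np.clip((np.arange(nw) / s).astype(int), 0, w - 1)
    shrunk = img_close[np.ix_(rows, cols)].astype(np.float64)
    return np.clip(shrunk * gain, 0.0, 255.0)
```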
The above description is presented to enable a person of ordinary skill in the art to practice the present invention as provided in the context of a particular application and its requirements. Various modifications to the described embodiments will be apparent to those skilled in the art, and the general principles defined herein may be applied to other embodiments. Therefore, the present invention is not intended to be limited to the particular embodiments shown and described, but is to be accorded the widest scope consistent with the principles and novel features disclosed herein. In the above detailed description, various specific details are set forth in order to provide a thorough understanding of the present invention. Nevertheless, those skilled in the art will understand that the present invention may be practiced without such details.
The present invention may be embodied in other specific forms without departing from its spirit or essential characteristics. The described examples are to be considered in all respects as illustrative only and not restrictive. The scope of the invention is therefore indicated by the appended claims rather than by the foregoing description. All changes that come within the meaning and range of equivalency of the claims are to be embraced within their scope.
Claims (20)
1. A method of capturing images of a scene using a capsule device comprising a camera, the method comprising:
administering the capsule device to a patient;
projecting non-structured light from a non-structured light source onto the scene in a field of view of the camera;
capturing, by the camera, regular images formed on a common image plane of the camera while the capsule device travels through the human gastrointestinal tract;
projecting structured light from a structured light source onto the scene in the field of view of the camera;
capturing, by the camera, structured-light images formed on the common image plane of the camera while the capsule device travels through the human gastrointestinal tract, wherein the structured-light images are interleaved with the regular images for deriving distance information related to objects in the regular images with respect to the camera; and
outputting the regular images, and outputting the structured-light images or information derived from the structured-light images.
2. The method of claim 1, further comprising deriving, from the structured-light images, the distance information related to the objects in the regular images with respect to the camera, wherein the information derived from the structured-light images corresponds to the distance information.
3. The method of claim 2, wherein correlation information between the distance information and corresponding regular images is outputted.
4. The method of claim 3, wherein the correlation information corresponds to image frame numbers or capture instances of the corresponding regular images.
5. An endoscope for capturing images of a scene using a capsule device, the capsule device comprising:
a camera;
a non-structured light source;
a structured light source;
one or more processors coupled to the camera and the structured light source;
one or more output interfaces coupled to the one or more processors;
wherein the one or more processors are configured to:
capture regular images using the camera, by projecting non-structured light from the non-structured light source, while the capsule device travels through the human gastrointestinal tract;
capture structured-light images using the camera, by projecting structured light from the structured light source onto the scene in the field of view of the camera, while the capsule device travels through the human gastrointestinal tract, wherein the structured-light images are interleaved with the regular images for deriving distance information related to objects in the regular images with respect to the camera;
output the regular images through the one or more output interfaces; and
output the structured-light images or information derived from the structured-light images; and
a housing adapted to be swallowed, wherein the housing encloses the camera and the one or more processors in a sealed environment.
6. The capsule device of claim 5, wherein the one or more processors are further configured to derive, from the structured-light images, the distance information related to the objects in the regular images with respect to the camera, wherein the information derived from the structured-light images corresponds to the distance information.
7. The capsule device of claim 6, wherein the one or more output interfaces further output correlation information between the distance information and corresponding regular images.
8. The capsule device of claim 7, wherein the correlation information corresponds to image frame numbers or capture instances of the corresponding regular images.
9. A method of processing images captured using a capsule camera, the method comprising:
receiving regular images captured by the capsule camera while the capsule camera travels through the human gastrointestinal tract;
determining distance information of a target object in a selected regular image with respect to the capsule device, wherein the distance information is derived from structured-light images captured by the capsule device while the capsule device travels through the human gastrointestinal tract;
determining a size of the target object in the selected regular image according to pixel data of the target object and the distance information; and
outputting size information regarding the size of the target object.
10. The method of claim 9, wherein the size of the target object is determined according to an image size of the target object in the selected regular image, scaled by the object distance from the capsule camera and the focal length of the capsule camera.
11. The method of claim 10, wherein the image size of the target object is measured in terms of the number of pixels within the target object in the selected regular image.
12. The method of claim 9, further comprising receiving the structured-light images from the capsule camera and deriving the distance information from the structured-light images, wherein the structured-light images are captured using the capsule camera by projecting structured light onto the scene in the field of view of the capsule camera.
13. A method of processing images captured by a capsule camera, the method comprising:
receiving regular images captured by the capsule device while the capsule device travels through the human gastrointestinal tract;
determining distance information of one or more target objects in the regular images with respect to the capsule device, wherein the distance information is derived from structured-light images captured by the capsule device while the capsule device travels through the human gastrointestinal tract;
stitching the regular images using information that includes the distance information, to generate a stitched image sequence; and
outputting the stitched image sequence.
14. The method of claim 13, further comprising receiving the structured-light images from the capsule camera and deriving the distance information from the structured-light images, wherein the structured-light images are captured using the capsule camera by projecting structured light onto the scene in the field of view of the capsule camera.
15. The method of claim 14, wherein the distance information is used to scale the one or more target objects in the regular images for stitching the regular images.
16. The method of claim 14, wherein the distance information is used to adjust the image intensity of at least one regular image for stitching the regular images.
17. The method of claim 13, wherein correlation information between the distance information and corresponding regular images is also received and used for stitching the regular images.
18. An apparatus for processing images captured using a capsule camera, the apparatus comprising one or more electronic circuits or processors configured to:
receive regular images captured by the capsule camera while the capsule camera travels through the human gastrointestinal tract;
determine distance information of one or more target objects in the regular images with respect to the capsule device, wherein the distance information is derived from structured-light images captured by the capsule device while the capsule device travels through the human gastrointestinal tract;
stitch the regular images using information that includes the distance information, to generate a stitched image sequence; and
output the stitched image sequence.
19. The apparatus of claim 18, wherein the one or more electronic circuits or processors are further configured to receive the structured-light images from the capsule camera and to derive the distance information from the structured-light images, wherein the structured-light images are captured using the capsule camera by projecting structured light onto the scene in the field of view of the capsule camera.
20. The apparatus of claim 18, wherein correlation information between the distance information and corresponding regular images is also received and used for stitching the regular images.
Applications Claiming Priority (2)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
PCT/US2017/015668 WO2018140062A1 (en) | 2017-01-30 | 2017-01-30 | Method and apparatus for endoscope with distance measuring for object scaling |
USPCT/US2017/015668 | 2017-01-30 |
Publications (1)
Publication Number | Publication Date |
---|---|
CN108392165A true CN108392165A (en) | 2018-08-14 |
Family
ID=62978402
Family Applications (1)
Application Number | Title | Priority Date | Filing Date |
---|---|---|---|
CN201810091804.5A Withdrawn CN108392165A (en) | 2017-01-30 | 2018-01-30 | Method and apparatus for endoscope with distance measuring for object scaling
Country Status (3)
Country | Link |
---|---|
JP (1) | JP2018130537A (en) |
CN (1) | CN108392165A (en) |
WO (1) | WO2018140062A1 (en) |
Families Citing this family (1)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
GB2610457A (en) * | 2021-03-31 | 2023-03-08 | Nvidia Corp | Generation of bounding boxes |
Citations (5)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
CN1921797A (en) * | 2004-02-18 | 2007-02-28 | 国立大学法人大阪大学 | Endoscope system |
CN102063714A (en) * | 2010-12-23 | 2011-05-18 | 南方医科大学 | Method for generating body cavity full-view image based on capsule endoscope images |
CN103720483A (en) * | 2012-10-11 | 2014-04-16 | 三星电子株式会社 | X-ray apparatus and X-ray image obtaining method |
CN103815858A (en) * | 2014-02-26 | 2014-05-28 | 上海齐正微电子有限公司 | Capsular endoscope with multiple built-in sensors |
CN105996961A (en) * | 2016-04-27 | 2016-10-12 | 安翰光电技术(武汉)有限公司 | 3D stereo-imaging capsule endoscope system based on structured light and method for same |
Family Cites Families (8)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
JP4767685B2 (en) * | 2005-12-28 | 2011-09-07 | オリンパスメディカルシステムズ株式会社 | In-subject observation system |
US8636653B2 (en) * | 2008-06-09 | 2014-01-28 | Capso Vision, Inc. | In vivo camera with multiple sources to illuminate tissue at different distances |
US8617058B2 (en) * | 2008-07-09 | 2013-12-31 | Innurvation, Inc. | Displaying image data from a scanner capsule |
US20130002842A1 (en) * | 2011-04-26 | 2013-01-03 | Ikona Medical Corporation | Systems and Methods for Motion and Distance Measurement in Gastrointestinal Endoscopy |
JP2013013481A (en) * | 2011-07-01 | 2013-01-24 | Panasonic Corp | Image acquisition device and integrated circuit |
JP2013063179A (en) * | 2011-09-16 | 2013-04-11 | Olympus Medical Systems Corp | Observation system |
JP2014161355A (en) * | 2013-02-21 | 2014-09-08 | Olympus Corp | Image processor, endoscope device, image processing method and program |
CN107072498B (en) * | 2015-06-30 | 2019-08-20 | 奥林巴斯株式会社 | Image processing apparatus, capsule-type endoscope system and endoscopic system |
2017
- 2017-01-30 WO PCT/US2017/015668 patent/WO2018140062A1/en active Application Filing

2018
- 2018-01-30 JP JP2018013654A patent/JP2018130537A/en active Pending
- 2018-01-30 CN CN201810091804.5A patent/CN108392165A/en not_active Withdrawn
Cited By (2)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
CN109730683A (en) * | 2018-12-21 | 2019-05-10 | 重庆金山医疗器械有限公司 | Endoscope object size calculation method and analysis system |
CN109730683B (en) * | 2018-12-21 | 2021-11-05 | 重庆金山医疗技术研究院有限公司 | Endoscope target size calculation method and analysis system |
Also Published As
Publication number | Publication date |
---|---|
JP2018130537A (en) | 2018-08-23 |
WO2018140062A1 (en) | 2018-08-02 |
Similar Documents
Publication | Publication Date | Title |
---|---|---|
US10402992B2 (en) | Method and apparatus for endoscope with distance measuring for object scaling | |
CN107920722B (en) | Reconstruction by object detection for images captured from a capsule camera | |
CN108139207B (en) | Single image sensor for capturing mixed structured light and regular images | |
CN105939451B (en) | Image exposure processing system and method for capsule endoscope system | |
US10736559B2 (en) | Method and apparatus for estimating area or volume of object of interest from gastrointestinal images | |
US7684599B2 (en) | System and method to detect a transition in an image stream | |
US8582853B2 (en) | Device, system and method for automatic detection of contractile activity in an image frame | |
CN109381152A (en) | Method and apparatus for estimating area or volume of object of interest from gastrointestinal images | |
CN105531720B (en) | System and method for size estimation of in-vivo objects | |
CN107529969B (en) | Image processing apparatus and endoscopic system | |
JP6064106B1 (en) | Image processing apparatus, capsule endoscope system, and endoscope system | |
US20090202117A1 (en) | Device, system and method for measurement and analysis of contractile activity | |
CN1572228A (en) | Endoscope device | |
CN106618454A (en) | Capsule endoscope system | |
US9412054B1 (en) | Device and method for determining a size of in-vivo objects | |
CN114983317A (en) | Method and apparatus for travel distance measurement of capsule camera in gastrointestinal tract | |
CN104939793A (en) | Variable-focus 3-D capsule endoscope system based on liquid lens | |
JP6501800B2 (en) | Reconstruction of images from in vivo multi-camera capsules with confidence matching | |
JP5408843B2 (en) | Image processing apparatus and image processing program | |
US20200082510A1 (en) | Method and Apparatus of Sharpening of Gastrointestinal Images Based on Depth Information | |
CN108392165A (en) | Method and apparatus for endoscope with distance measuring for object scaling | |
US10785428B2 (en) | Single image sensor for capturing mixed structured-light images and regular images | |
US20200104983A1 (en) | Method and Apparatus of Sharpening of Gastrointestinal Images Based on Depth Information | |
JPWO2010050400A1 (en) | Image processing apparatus and image processing method | |
CN111956168A (en) | Capsule endoscope system and distance measuring method for capsule endoscope |
Legal Events
Date | Code | Title | Description |
---|---|---|---|
PB01 | Publication | ||
SE01 | Entry into force of request for substantive examination | ||
WW01 | Invention patent application withdrawn after publication |
Application publication date: 20180814 |