WO2008101129A1 - System and method for projection of subsurface structure onto an object's surface - Google Patents

System and method for projection of subsurface structure onto an object's surface

Info

Publication number
WO2008101129A1
Authority
WO
WIPO (PCT)
Prior art keywords
image
pixel
value
body tissue
imaging system
Prior art date
Application number
PCT/US2008/054029
Other languages
French (fr)
Inventor
Jeff D. Berryhill
Carlos Vrancken
Peter Meenen
Original Assignee
Luminetx Corporation
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Luminetx Corporation filed Critical Luminetx Corporation
Priority to JP2009550140A priority Critical patent/JP2010517733A/en
Priority to US12/526,820 priority patent/US20100177184A1/en
Priority to MX2009008653A priority patent/MX2009008653A/en
Priority to EP08729923A priority patent/EP2114253A1/en
Priority to CN200880012041A priority patent/CN101686820A/en
Priority to KR1020097018766A priority patent/KR20090113324A/en
Publication of WO2008101129A1 publication Critical patent/WO2008101129A1/en

Classifications

    • GPHYSICS
    • G02OPTICS
    • G02BOPTICAL ELEMENTS, SYSTEMS OR APPARATUS
    • G02B21/00Microscopes
    • G02B21/36Microscopes arranged for photographic purposes or projection purposes or digital imaging or video purposes including associated control and data processing arrangements
    • AHUMAN NECESSITIES
    • A61MEDICAL OR VETERINARY SCIENCE; HYGIENE
    • A61BDIAGNOSIS; SURGERY; IDENTIFICATION
    • A61B5/00Measuring for diagnostic purposes; Identification of persons
    • A61B5/0059Measuring for diagnostic purposes; Identification of persons using light, e.g. diagnosis by transillumination, diascopy, fluorescence
    • AHUMAN NECESSITIES
    • A61MEDICAL OR VETERINARY SCIENCE; HYGIENE
    • A61BDIAGNOSIS; SURGERY; IDENTIFICATION
    • A61B5/00Measuring for diagnostic purposes; Identification of persons
    • A61B5/48Other medical applications
    • A61B5/4887Locating particular structures in or on the body
    • A61B5/489Blood vessels
    • GPHYSICS
    • G02OPTICS
    • G02BOPTICAL ELEMENTS, SYSTEMS OR APPARATUS
    • G02B23/00Telescopes, e.g. binoculars; Periscopes; Instruments for viewing the inside of hollow bodies; Viewfinders; Optical aiming or sighting devices
    • G02B23/24Instruments or systems for viewing the inside of hollow bodies, e.g. fibrescopes
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06TIMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T5/00Image enhancement or restoration
    • G06T5/73Deblurring; Sharpening
    • G06T5/75Unsharp masking
    • AHUMAN NECESSITIES
    • A61MEDICAL OR VETERINARY SCIENCE; HYGIENE
    • A61BDIAGNOSIS; SURGERY; IDENTIFICATION
    • A61B5/00Measuring for diagnostic purposes; Identification of persons
    • A61B5/44Detecting, measuring or recording for evaluating the integumentary system, e.g. skin, hair or nails
    • A61B5/441Skin evaluation, e.g. for skin disorder diagnosis

Definitions

  • the present invention is generally directed to generation of diffuse infrared light. More particularly, the invention is directed to a system for illuminating an object with diffuse infrared light, producing a video image of buried structure beneath the surface of the object based on reflected infrared light, and then projecting an image of the buried structure onto the surface of the object.
  • an imaging system and method illuminates body tissue with infrared light to enhance visibility of subcutaneous blood vessels, and generates an image of the body tissue and the subcutaneous blood vessels based on reflected infrared light.
  • the system includes an infrared illumination source for generating the infrared light.
  • the system further includes an imaging device for receiving the infrared light reflected from the body tissue and for generating an enhanced image of the body tissue based on the reflected infrared light.
  • the enhanced image is produced by contrast enhancement techniques involving applications of an unsharp mask.
  • the system further includes a projector for receiving an output signal from the imaging device and for projecting the enhanced image onto the imaged body tissue.
  • the contrast enhancement techniques include application of first and second blur filters each having a different resolution.
  • the blur filters are used for generating first and second unsharp masks.
  • the blur filters include application of an "averaging window" to each pixel in the image to generate a blurred image.
  • the contrast enhancement techniques include adjustment of the window sizes of blur filters used to generate the unsharp mask.
  • the contrast enhancement techniques include application of a threshold to pixel data and setting the value of each pixel to a preset value when the pixel data is below the threshold.
  • the contrast enhancement techniques include application of an offset to pixel data whereby each pixel is adjusted higher or lower by a set amount. Further, if after application of the offset an adjusted pixel value is outside of the allowable range (e.g., 0-255), the value is clipped to the nearest limit of that range.
  • the contrast enhancement techniques include application of linear scaling to the image as a final contrast adjustment.
  • the contrast enhancement techniques include using the absolute values of pixel data during execution of one or more processing steps.
  • the contrast enhancement techniques include application of a maximum filter window that sets the value of a target pixel to the maximum value of any pixels within the window.
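The pixel-level techniques listed above are simple point and window operations. A minimal C++ sketch of three of them follows (thresholding, offsetting with clipping to the 0-255 range, and the maximum filter window), assuming an 8-bit grayscale image stored row-major in a std::vector; the function names and layout are illustrative, not the patent's own listing.

    #include <algorithm>
    #include <cstdint>
    #include <vector>

    // Threshold: set each pixel below `thresh` to the preset value.
    void applyThreshold(std::vector<uint8_t>& img, uint8_t thresh, uint8_t preset) {
        for (auto& p : img)
            if (p < thresh) p = preset;
    }

    // Offset: adjust every pixel higher or lower by a set amount, clipping
    // any result that falls outside the allowable 0-255 range.
    void applyOffset(std::vector<uint8_t>& img, int offset) {
        for (auto& p : img)
            p = static_cast<uint8_t>(std::clamp(static_cast<int>(p) + offset, 0, 255));
    }

    // Maximum filter: each target pixel takes the maximum value of any pixel
    // within a (2*r+1) x (2*r+1) window centered on it (borders clamped here).
    std::vector<uint8_t> maxFilter(const std::vector<uint8_t>& img, int w, int h, int r) {
        std::vector<uint8_t> out(img.size());
        for (int y = 0; y < h; ++y)
            for (int x = 0; x < w; ++x) {
                uint8_t m = 0;
                for (int dy = -r; dy <= r; ++dy)
                    for (int dx = -r; dx <= r; ++dx) {
                        int yy = std::clamp(y + dy, 0, h - 1);
                        int xx = std::clamp(x + dx, 0, w - 1);
                        m = std::max(m, img[yy * w + xx]);
                    }
                out[y * w + x] = m;
            }
        return out;
    }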
  • selection means can be provided for allowing selection of a contrast enhancement technique or a combination of contrast enhancement techniques to be executed from a plurality of contrast enhancement techniques.
  • the systems and methods of the present invention can be used to identify the location of vascular structures.
  • the present invention addresses a situation, wherein some medical procedures and treatments require a medical practitioner to locate a blood vessel in a patient's arm or other appendage. In the prior art, this could be a difficult task, especially when the blood vessel lies under a significant deposit of subcutaneous fat. The performance of previous imaging systems designed to aid in finding such blood vessels has been lacking. It is therefore the technical problem underlying the present invention to provide an apparatus and method for enhancing the visual contrast between subcutaneous blood vessels and surrounding tissue.
  • the medical device comprises an imaging device for receiving diffuse light reflected from an object and for producing an input image and generating an enhanced image therefrom and a video projector for projecting a visible light image of the buried structure onto the surface of the object.
  • the technical idea underlying the invention is a conceptual change: the inclusion of new contrast enhancement techniques that aid in locating the edges of buried structures by making them appear with sharper contrast against the surrounding tissue.
  • the apparatus also comprises an infrared light source for illuminating the body tissue with infrared light which reflects from the body tissue and is imaged by the imaging device.
  • contrast enhancement may be achieved by, in addition to unsharp masking, adding a value to each pixel value of the input image, using a threshold to set all values above or below the threshold to a preset value, or taking the absolute value of each pixel value.
  • FIG. 1 depicts an imaging system for viewing an object under infrared illumination according to a preferred embodiment of the invention
  • FIGS. 2a and 2b are perspective views of an imaging system using diffuse infrared light according to a preferred embodiment of the invention
  • FIGS. 3 and 4 are cross-sectional views of the imaging system according to a preferred embodiment of the invention.
  • FIG. 5 is a functional block diagram of the imaging system according to a preferred embodiment of the invention.
  • FIG. 6a is a perspective view of an imaging system using diffuse infrared light according to an alternative embodiment of the invention.
  • FIG. 6b is a cross-sectional view of the imaging system of FIG. 6a;
  • FIG. 7a is a perspective view of an imaging system using diffuse infrared light according to another embodiment of the invention.
  • FIG. 7b is a cross-sectional view of the imaging system of FIG. 7a;
  • FIG. 8 is an isometric view of yet another aspect of an imaging system
  • FIG. 9 is a front view of a portion of the imaging system as viewed in the direction of the arrows taken along line A-A of FIG. 8;
  • FIG. 10 is a cross-sectional side view taken along line B-B of FIG. 9 and,
  • FIG. 11 is a block diagram of an imaging system
  • FIG. 12 is a perspective internal view of a third version of the imaging system of the present invention.
  • FIG. 13 is an internal view of a fourth version of the imaging system of the present invention with some parts shown in section for purposes of explanation.
  • FIG. 14 is a diagrammatic view of the fourth version of the imaging system of the present invention.
  • FIG. 15 is an internal view of a fifth version of the imaging system of the present invention, which uses ambient lighting to illuminate the viewed object.
  • FIGS. 16a and 16b, taken together in sequence, are a program listing for artifact removal image processing of the received image.
  • FIGS. 17a, 17b, 17c, 17d, 17e, and 17f, taken together in sequence, are a program listing in the C++ programming language for artifact removal image processing of the received image.
  • FIG. 18 is a diagrammatic perspective view showing how a pair of laser pointers is used to position the object to be viewed.
  • FIG. 19 is a diagrammatic perspective view showing the calibration procedure for the imaging system of the present invention.
  • FIGS. 20a, 20b, and 20c are photographs of a processed image of subcutaneous blood vessels projected onto body tissue that covers the blood vessels.
  • FIG. 21 is a photograph of a projected image having a text border therearound.
  • FIG. 22 is another photograph of a projected image having a text border therearound, similar to FIG. 21 but in which the viewed object has been moved out of position, showing how the text border becomes out-of-focus to indicate that the object is not positioned properly.
  • FIG. 23 shows a text border image that is combined with a projected image for joint projection onto the object to ensure proper positioning.
  • FIG. 24 is a photograph of a processed image of subsurface veins projected onto a hand by the present invention, similar to Fig. 20 (which omits the text border) and Fig. 21 but showing how the text border becomes out of focus to indicate that the hand is not positioned properly.
  • FIG. 25a and FIG. 25b are computer listings showing the solution for bilinear transformation coefficients of the calibration procedure for the imaging system of the present invention.
  • FIG. 26 is a program listing in the C++ programming language, which performs the run-time correction to the viewed image of the object using coefficients determined during the calibration procedure.
  • Fig. 27A is a flow chart of one method for contrast enhancing an image of an object according to an embodiment of the present invention.
  • Fig. 27B is an image of a test target (gradient) along with a plot of the pixel values for a selected section of the gradient.
  • Fig. 27C is an image of the test gradient after being enhanced by the process set forth in Fig. 27A along with a plot of the post processed pixel values for the selected section of the gradient.
  • Fig. 28A is a flow chart of another method for enhancing the contrast of an image to provide improved dimensional detail, according to an embodiment of the present invention.
  • Fig. 28B is an image of the test gradient after being enhanced by the process set forth in Fig. 28A along with a plot of the post processed pixel values for the selected section of the gradient.
  • Fig. 28C includes images of an enhanced image of subcutaneous vessels projected back on a human arm.
  • Fig. 29A is a flow chart of another method for enhancing the contrast of an image, according to an embodiment of the present invention.
  • Fig. 29B is an image of the test gradient after being enhanced by the process set forth in Fig. 29A along with a plot of the post processed pixel values for the selected section of the gradient.
  • Fig. 29C includes images of an enhanced image of subcutaneous vessels projected back on a human arm.
  • Fig. 30A is a flow chart of another method for enhancing the contrast of an image, according to an embodiment of the present invention.
  • Fig. 30B is an image of the test gradient after being enhanced by the process set forth in Fig. 30A along with a plot of the post processed pixel values for the selected section of the gradient.
  • the inventor has determined that when an area of body tissue having a significant deposit of subcutaneous fat is imaged in the near-infrared range under illumination of highly diffuse infrared light, there is significantly higher contrast between the blood vessels and surrounding flesh than when the tissue is viewed under direct infrared illumination. Although the invention should not be limited by any particular theory of operation, it appears that most of the diffuse infrared light reflected by the subcutaneous fat is directed away from the viewing direction. Thus, when highly diffuse infrared light is used to illuminate the tissue, the desired visual contrast between the blood vessels and the surrounding flesh is maintained.
  • Shown in FIG. 1, the imaging system 2 includes an illumination system 10 that illuminates the object 32 with infrared light from multiple different illumination directions.
  • the system 10 includes multiple infrared light providers 10a-10f, each providing infrared light to the object 32 from a different illumination direction. The directions of arrival of the infrared light from each light provider 10a-10f are represented in FIG. 1.
  • the directions of arrival of the infrared light range from perpendicular or near perpendicular to the surface of the object 32, to parallel or near parallel to the surface of the object 32.
  • because the infrared illumination arrives at the object 32 from such a wide range of illumination directions, the infrared illumination is highly diffuse.
  • the light providers 10a-10f are preferably light reflecting surfaces that direct light from a single illumination source toward the object 32. In other embodiments, the light providers 10a-10f are individual illumination sources, or combinations of illumination sources and reflectors.
  • the imaging system 2 also includes an imaging device 38, such as a video camera, for viewing the object 32.
  • the imaging device 38 views the object 32 from a viewing direction which is represented in FIG. 1 by the arrow 6.
  • the imaging device 38 receives the diffuse infrared light reflected from the object 32, and generates an electronic video image of the object 32 based on the reflected infrared light.
  • Shown in FIGS. 2a and 2b is a preferred embodiment of the illumination system 10.
  • FIG. 3 depicts a cross-sectional view of the system 10 corresponding to the section A-A as shown in FIGS. 2a-b.
  • the system 10 preferably includes an illumination source 12.
  • In a preferred embodiment of the invention, the illumination source 12 includes a cold mirror 34 disposed between the lamp 26 and the input aperture 18 of the outer enclosure 16.
  • the cold mirror 34 reflects substantially all light having wavelengths outside a selected infrared range of wavelengths.
  • the selected range includes wavelengths from approximately 700 to 1100 nanometers.
  • an infrared transmitting filter 36 further attenuates light having wavelengths outside the selected infrared range while transmitting light having wavelengths within the selected infrared range.
  • the light that passes through the cold mirror 34 and the filter 36 into the outer enclosure 16 is infrared light having wavelengths within the selected infrared range.
  • the illumination source 12 could be configured to generate infrared light.
  • the illumination source 12 could consist of an infrared light-emitting diode (LED) or an array of infrared LEDs.
  • the configuration of the illumination source 12 shown in FIG. 3 and described above is a preferred embodiment only, and the invention is not limited to any particular configuration of the illumination source 12.
  • a preferred embodiment of the invention includes a lens 40 used in conjunction with the video imaging device 38 to produce a video image of the object 32 based on diffuse light reflected from the object 32.
  • the imaging device 38 of this embodiment is a charge-coupled device (CCD) video camera 38 manufactured by Cohu, having model number 631520010000.
  • the lens 40 of the preferred embodiment is a 25 mm f/0.95 movie camera lens manufactured by Angenieux.
  • the camera 38 and lens 40 of the preferred embodiment are disposed within the tubular section 24a of the inner reflector 24. As shown in FIG. 3, the open end of the tubular section 24a forms an aperture toward which the camera 38 and lens 40 are pointed. In this manner, the hollow light guide 22 is substantially centered within the field of view of the camera 38.
  • the camera 38 receives light reflected from the object 32 that enters the light guide 22, travels through the enclosure 16, and enters the open end of the section 24a.
  • the preferred embodiment of the invention includes an infrared-transmitting filter 42 disposed in the open end of the tubular section 24a.
  • This filter 42 receives light reflected from the object 32, and any other light that may enter the enclosure 16, and substantially eliminates all light having wavelengths outside the infrared range of approximately 700 to 1100 nanometers.
  • the light that passes through the filter 42 and into the lens 40 is infrared light within the selected wavelength range. Therefore, the camera 38 primarily receives infrared light which originates from within the illumination system 10 and which is reflected from the object 32.
  • Based on the light reflected from the object 32, the camera 38 generates a video image of the object 32 in the form of an electrical video signal.
  • the video signal is preferably provided to an image enhancement board 44, such as a board manufactured by DigiVision having a model number ICE-3000.
  • the board 44 generates an enhanced video image signal based on the video signal from the camera 38.
  • the enhanced video image signal is provided to a video capture and display card 46, such as a model 20-TD Live card manufactured by Miro.
  • the card 46 captures still images from the image signal which may be saved in digital format on a digital storage device.
  • the card 46 also formats the video image signal for real-time display on a video monitor 48.
  • the illumination system 10 could use other means for generating diffuse infrared light in accordance with the invention.
  • the light providers 10a-10f of FIG. 1 could be embodied by a ring-light strobe light.
  • a circular array of LEDs could be used to illuminate a plastic transmitting diffuser placed near the surface of the object 32.
  • the light providers 10a-10f would correspond to the individual LEDs in the array.
  • the imaging system 2 includes a video projector 50 for illuminating the object 32 with an image of the object 32 to enhance the visual contrast between lighter and darker areas of the object 32.
  • the features of an object can be visually enhanced for an observer when the features of a projected visible-light image of the object overlay the corresponding features of the object.
  • the overlaid visible-light image causes the bright features of the object to appear brighter while the dark areas remain the same.
  • The embodiment of the invention shown in FIGS. 6a and 6b provides diffuse infrared light (represented by the rays 52) to the object 32 in a manner similar to that described previously. However, in this embodiment the optical path of the illuminating light is folded, such that the exit aperture 23 of the light guide 22 is rotated by 90 degrees relative to the exit aperture shown in FIGS. 1-3.
  • a beam separator such as a hot mirror 54, receives infrared light 52 from the interior of the light diffusing structure 14 and reflects the infrared light 52 into the light guide 22 and toward the object 32.
  • the hot mirror 54 also receives an infrared image of the object 32 (represented by the ray 56) and reflects it toward the camera 38.
  • the hot mirror 54 receives the visible-light image (represented by the ray 58) from the projector 50 and transmits it into the light guide 22 and toward the object 32.
  • the video output signal from the video camera 38 is provided as a video input signal to the projector 50.
  • the projector 50 projects the visible- light image 58 of the object 32 toward the hot mirror 54.
  • the hot mirror 54 receives the visible-light image 58 and transmits it into the light guide 22 toward the object 32.
  • the features in the projected visible-light image 58 are made to overlay the corresponding features of the object 32. This is generally achieved when the projected visible-light image 58 is coaxial with the infrared image of the object 32 (represented by the ray 56) received by the camera 38.
  • FIGS. 7a and 7b depict an alternative embodiment of the invention for use as a contrast enhancing illuminator.
  • the embodiment of FIGS. 7a-b operates in a fashion similar to the embodiment of FIGS. 6a and 6b. However, in the embodiment of FIGS. 7a-b, the camera 38 is located outside the light diffusing structure 14.
  • the hot mirror 54 shown in FIGS. 7a-b is rotated by 90 degrees clockwise relative to its position in FIGS. 6a-b. Otherwise, the hot mirror 54 serves a similar function as that described above in reference to FIGS. 6a-b.
  • the infrared-transmitting filter 42 is mounted in a wall of the light guide 22.
  • a reflective panel 60 is provided in this embodiment to further direct the light from the illumination source 12 into the light guide 22 and toward the exit aperture 23.
  • the panel 60 is a flat reflective sheet having an orifice therein to allow light to pass between the object 32 and the camera 38 and projector 50.
  • A preferred embodiment of a relatively compact and highly reliable imaging system 70 is depicted in FIGS. 8-11.
  • the imaging system 70 is most preferably configured to illuminate an object 71, such as body tissue and the like, and to produce a video image of the object 71 based upon infrared light reflected from the object 71.
  • the imaging system 70 preferably includes a housing 72 which contains the imaging features of the system 70.
  • the housing 72 preferably has a substantially rectangular configuration.
  • the housing 72 preferably has a length of between about three and about five inches and a width of about three and one-half inches.
  • the imaging system 70 can be configured in a variety of ways and the invention should not be limited by any specific examples or embodiments discussed herein.
  • the housing is depicted as being substantially rectangular; however, circular, polygonal, and other geometries and sizes are feasible as well.
  • An imaging device 74 such as a video camera having a lens 75, and video processing components reside within the housing 72.
  • the imaging device 74 and video processing components operate to detect infrared light and to process the detected infrared light from the object 71.
  • the imaging device 74 produces an image based on the detected infrared light reflected from the object 71, as described herein.
  • the imaging device 74 is preferably mounted within an aperture 76 of mounting wall 78, with the lens 75 extending into the housing interior 77, as described further below. More particularly, the camera 74 is preferably centrally and symmetrically mounted within the housing 72. This preferred symmetrical camera location tends to maximize the amount of light detected by the camera, which enhances the image produced by the system 70, thereby enhancing the illumination of blood vessels disposed below subcutaneous fat in body tissue.
  • the housing 72 most preferably contains various components operable to transmit diffuse light from the system 70 toward the object 71.
  • Arrows 80 represent diffuse light transmitted by the system 70.
  • Arrows 82 represent the light reflected from the object 71.
  • the wall 78 contains a number of infrared light emitting diodes (LEDs) 84 disposed in a LED array 85 for emitting infrared light.
  • the LED array 85 defines a LED plane of reference.
  • each LED 84 When activated, each LED 84 preferably transmits light at a wavelength of about 740 nanometers (nm). In the preferred embodiment, each LED 84 is manufactured by Roithner Lasertechnik of Austria under model number ELD-740-524.
  • LEDs 84 are mounted on a circuit board 86 located adjacent to wall 78. As shown in FIG. 9, there are most preferably eight groups 92, 94 of LEDs 84 concentrically arranged about the imaging device 74. The concentric LED arrangement tends to provide maximal dispersion and transmission of diffuse light from the system 70. It is preferred that each group 92, 94 of LEDs 84 contain at least ten LEDs 84. However, the system 70 can include more or fewer LEDs within a particular group depending upon a desired implementation of the system 70. Furthermore, the system 70 can include more or fewer groups of LEDs in the LED array 85.
  • There are preferably four groups 92 of LEDs 84 located about the corner regions 96 of the LED array 85. Most preferably, at least fifteen LEDs 84 are disposed in each corner region 96 of the LED array 85. There are preferably four groups 94 of LEDs 84 disposed in lateral regions 98 of the LED array 85. Each lateral region 98 is located substantially between two corner regions 96. Most preferably, at least ten LEDs 84 are disposed in each lateral region 98 of the LED array 85.
  • the LED array 85 is most preferably disposed on circuit board 86.
  • the circuit board 86 includes control circuitry that controls the activation of one or more LEDs 84 within a particular group or groups 92, 94 of LEDs 84 in the LED array 85.
  • a power source 88 and a control system 90 are electrically connected to the circuit board 86. It will be appreciated that it is also possible to control the LEDs without using a control system 90; that is, power source 88 can be switched "on" or "off" to activate and deactivate the LED array 85.
  • pulse modulation techniques can also be used in conjunction with power source 88 to activate and deactivate one or more of the LEDs 84 in the LED array 85 according to a preferred duty cycle, herein defined as the LED "on" time relative to the LED "off" time.
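As a minimal illustration of the duty-cycle arithmetic (reading "on time relative to off time" in the conventional sense of on time over total period; the millisecond units are an assumption):

    // Illustrative sketch, not from the patent: duty cycle for pulse-modulated
    // LED drive, e.g. 5 ms on and 15 ms off gives a 25% duty cycle.
    double dutyCycle(double onTimeMs, double offTimeMs) {
        return onTimeMs / (onTimeMs + offTimeMs);
    }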
  • the LED array 85 is electrically connected via circuit board 86 to the power source 88 and control system 90.
  • the control system 90 includes control features for controlling the LED array 85 to emit infrared light toward an object 71.
  • the control system 90 can enable one or more of the LEDs 84 in a group or groups of the LED array 85 to emit light continuously or intermittently. That is, one LED 84 or a plurality of LEDs 84 can be selected and controlled to emit infrared light intermittently or continuously toward the object 71.
  • the system 70 can be configured to transmit infrared light from the LED array in various permutations and combinations of LEDS 84 and/or LED groups 92, 94.
  • a first diffusion layer 100 is disposed adjacent to the emitting surfaces 102 of the LEDs 84 in the LED array 85.
  • the first diffusion layer 100 is glued, such as using known adhesives, onto the emitting surfaces 102 of the LED array 85, thereby operating to diffuse the light emitted by one or more LEDs 84 in the LED array 85.
  • the first diffusion layer 100 is most preferably a holographic twenty degree diffuser, such as a product having identification code LSD20PC10-F10x10/PSA, manufactured by Physical Optics Corporation of Torrance, Calif. Most preferably, the first diffusion layer 100 has a length of about three and one-half inches, a width of about three and one-half inches, and a thickness of about 0.10 inches.
  • the first diffusion layer 100 diffuses the infrared light emitted from the LED array 85, thereby providing a first amount of diffusion to the emitted infrared light.
  • the interior surfaces 104 of the housing 72 are shown in FIG. 10.
  • a second diffusion layer 106 is spaced apart from the first diffusion layer 100 by a distance LDD. Most preferably, the distance LDD between the first and second diffusion layers 100 and 106 is about three inches.
  • the second diffusion layer 106 is most preferably a holographic twenty degree diffuser, similar to or the same as the above-described first diffusion layer 100.
  • the second diffusion layer 106 has a preferred length of about three and one-half inches, a width of about three and one- half inches, and a thickness of about 0.10 inches.
  • the second diffusion layer 106 further diffuses the already diffuse light reflected from the interior surfaces 104 and provided by the first diffusion layer 100.
  • the first and second diffusion layers are substantially planar, that is, the layers 100 and 106 each define a planar geometry.
  • a backing material 108, such as material sold under the trademark LUCITE and manufactured by DuPont of Wilmington, Delaware, is disposed adjacent to the second diffusion layer 106. Most preferably, the backing material has a thickness of about 0.125 inches.
  • a visible polarizer 110 is disposed adjacent to the backing material 108. The visible polarizer 110 is most preferably manufactured by Visual Pursuits of Vernon Hills, Illinois, under part number VP-GS-12U, and has a thickness of about 0.075 inches.
  • the system 70 is operable to produce various levels of diffusion as the emitted light progresses through the first diffusion layer 100, reflects off of the interior surfaces 104 of the first compartment 72a, and continues to progress through the second diffusion layer 106, backing material 108, and polarizer 110.
  • a level of diffusion results after the emitted light passes through the first diffusion layer 100.
  • Another level of diffusion results from the reflection from the interior surface 104 of the first compartment 72a of the already diffused light provided by the first diffuser layer 100.
  • Yet another level of diffusion results after the diffuse light passes through the second diffusion layer 106.
  • the visible polarizer 110 preferably includes a central portion 112, most preferably in the shape of a circle having about a one-inch diameter.
  • the central portion 112 geometry most preferably coincides with the shape and dimension of the camera lens 75.
  • the polarization of the central portion 112 is preferably rotated approximately ninety degrees with respect to the polarization of the surrounding area 114 of the polarizer 110.
  • the camera lens 75 contacts the backing material 108.
  • the positional location of the lens 75 within the housing 70 preferably coincides with or shares the same central axis as the central portion 112 of the polarizer 110.
  • the central portion 112 of the polarizer 110 coinciding with the front of the lens 75 tends to remove any surface glare ("specular reflection") in the resulting camera image.
  • the first diffusion layer 100, interior surfaces 104, second diffusion layer 106, backing material 108, and visible polarizer 110 define a diffusing system 116 (FIG. 10) for providing diffuse light to an object 71.
  • the diffusing structure can include more or fewer components and the invention is not to be limited by any specific examples or embodiments disclosed herein.
  • the diffusing system 116 can include either the first or the second diffusion layers 100, 106, with or without the polarizer 110, or can include the first and second diffusion layers 100, 106 without the polarizer 110.
  • the system 70 operates to transmit diffuse light 80 toward an object 71 and produce a video image of the object 71 with the imaging device 74, as described above. More particularly, once the power source 88 is enabled, one or more of the LEDs 84 in the LED array 85 emit infrared light from the emitting surface(s) 102.
  • the first diffusion layer 100 provides a first amount of diffusion to the emitted infrared light.
  • the interior surfaces 104 further diffuse the diffuse light emanating from the first diffusion layer 100.
  • the second diffusion layer 106 further diffuses the already diffuse light which is then transmitted through the backing material 108 and the polarizer before illuminating the object 71 .
  • the object 71 reflects the emitted diffuse light 80, producing diffuse reflected light 82 that is captured by the imaging device 74.
  • the imaging device 74 then produces a video image of the object 71. Accordingly, by emitting diffuse light via the unique diffusion-providing arrangement described above, the system 70 aids in locating and differentiating between different material properties of the object 71, such as between blood vessels and tissue.
  • the planes defined by the first or second diffusing layers 100 and 106 can be adjusted to not be parallel with respect to one another, thereby providing different levels of diffuse light from the system 70.
  • the plane defined by the LED array 85 is most preferably in substantial parallel relation with respect to the plane defined by the first diffusing layer 100.
  • the planes defined by LED array 85 and the first diffusing layer 100 can be varied to accommodate various operational conditions, as will be appreciated by those skilled in the art. Accordingly, it is expressly intended that the foregoing description and the accompanying drawings are illustrative of preferred embodiments only, not limiting thereto, and that the true spirit and scope of the present invention be determined by reference to the appended claims.
  • FIGS. 20a, 20b, and 20c are photographs of test subjects showing processed images of subcutaneous blood vessels being projected onto the surface of each subject's body tissue which covers the viewed blood vessels.
  • Additional embodiments will now be described showing a variety of configurations of illumination sources, imaging devices for viewing the image of buried structure beneath the surface of the illuminated object, and projectors for projecting the processed image back onto the surface of the object. Because all of the embodiments of the present invention have many structural features in common, only the differences between the structures need be discussed in detail, it being understood that similar structural features of all the embodiments perform similar functions. Those skilled in the art will readily recognize the similar structural features that appear in all embodiments of the present invention.
  • an observer using the present invention is not subject to the parallax errors that otherwise occur with prior art devices if an observer were to view from off-axis.
  • An important feature of all embodiments is that the image of buried structure viewed by the imaging device should be substantially within a first spectrum lying outside a second spectrum of the image that is projected back onto the surface of the object, thereby causing the imaging device to be blind to the image that is projected back onto the surface of the object.
  • the substantial non-overlap of the spectrum of the viewed image of the buried structure with the spectrum of the projected image of the buried structure effectively decouples the image processing of the buried structure's image from interference by the projected image. Because the projected image is in the visible light spectrum and the illumination of the object for the imaging device is in the infrared spectrum, a substantial non-overlap of the two spectrums is maintained.
  • the object can be illuminated by broad- spectrum ambient light, and an infrared filter is placed in front of the imaging device to remove all spectral components outside the infrared spectrum, thereby causing the imaging device to only see the infrared component of the broad-spectrum diffuse light reflected from the object.
  • a third preferred embodiment 130 of the imaging system is shown in FIG. 12.
  • a well-known CCD camera with lens 132 is used as the imaging device, as in all embodiments.
  • a second polarizing filter 134 is interposed between the CCD camera and the reflected light from the viewed object, as previously described for earlier embodiments, so as to reduce specular reflection from the surface of the object.
  • the illumination source, first polarizing filter, holographic illumination diffuser ring, and optically-neutral glass cover, all generally at 136, are best described below in the discussion of the fourth embodiment of the imaging system shown in FIGS. 13 and 14, which has the same structure 136, shown there in cross-section.
  • the third preferred embodiment includes a well-known video projector 138 or so-called "light engine” for projecting a visible image onto the object O under examination.
  • Video projector 138 includes a high-intensity green LED illumination source 140 which emits light into well-known prism assembly 142, thereby causing the emitted light to fold back, by internal reflection within prism assembly 142, and be directed rearwardly toward well-known Digital Light Processing (“DLP") device 144, also known as a Digital Mirror Device (“DMD”), having an array of closely-packed small mirrors that can individually shift the direction of the light beam reflected therefrom so as to either cause the light beam to be directed toward the target object through well-known projection lens 146 or to cause the light beam to not be directed toward the target object, thereby turning the emitted light beam off on a pixel-by-pixel basis in a manner well-known to those skilled in the art.
  • prism assembly 142 permits a more compact apparatus for the various embodiments of the imaging system, and the use of such prism assemblies is well known to those skilled in the art of video projectors.
  • a well-known so-called "hot mirror” 148 is interposed at 45 degrees to intercept the infrared light reflected from the viewed object and reflect that infrared light downward to camera 132.
  • "Hot mirror” 148 acts as a mirror to longer wavelengths of light (such as infrared light) but higher-frequency light, such as the green light from projector 138, passes through without reflection and toward the viewed object.
  • Imaging system 130 further has first and second lasers 150, 152 for ensuring that the target is properly located for in-focus viewing by camera 132, as hereinafter described.
  • Fourth embodiment 154 is mounted upon a pole 156 that extends upwardly from a mobile cart 158, allowing the imaging system 154 to be easily transported.
  • a fine-focus stage 160 allows imaging system 154 to be raised or lowered so that it is properly positioned above the target object O.
  • video projector 162 is provided with a 525 nm green LED illumination source ("photon engine") 164 for illuminating the DMD/DLP chip 166.
  • a suitable photon engine 164 for use with the fourth embodiment is the Teledyne Lighting model PE09-G illuminator, having an output intensity of 85 lumens.
  • DMD chip 166 may be a Texas Instruments part number 0.7SVGA SDR DMD chip having a resolution of 848 x 600 pixels and a mirror tilt angle of ten degrees and a frame rate of 30 Hz.
  • Well-known prism assembly 168 as before, internally reflects the light from photon engine 164 toward DMD chip 166 and then directs the light reflected from DMD chip 166 toward object O.
  • DMD chip 166 is controlled by a well-known drive electronics board 167 which may be made by Optical Sciences Corporation.
  • a condenser lens 170, such as a BK7 biconvex lens, part number 013-2790-AZ55, sold by OptoSigma, having a BBAR/AR surface coating for 425-675 nm light.
  • As the projector light emerges from prism assembly 168, it passes through well-known projection lens 172, a Besler part number 8680 medium format enlarger lens, and then through well-known "hot mirror" (high-pass filter) 174, which reflects the received infrared light image from the object O through second polarizing filter 178 and then to camera 176.
  • a suitable camera 176 is the Firefly Camera, part number FIRE-BW-XX, sold by Point Grey Research, which uses a 640 x 480 CCD chip, part number Sony ICX084AL, and which communicates its images to computer (“CPU") 180 through an IEEE-1394 ("FireWire") interface.
  • computer 180 has a number of interfaces signals 181 that communicate with the imaging system in a manner well-known to those skilled in the art.
  • the fourth embodiment also has first and second lasers 150, 152 for ensuring that the target O is properly located for in-focus viewing by camera 176.
  • fourth embodiment 154 has an assembly 136 that includes infrared illumination source 182, first polarizing filter 184 (which is ring-shaped with a center hole therethrough so as not to affect the projected image from projector 162 or the viewed image of the object), holographic illumination diffuser ring 186 (which likewise has a center hole therethrough for passage of the projected image from projector 162 and of the viewed image of the object, and which diffuses the light from LEDs 190), and optically-neutral glass cover 188.
  • Infrared illumination source 182 is a group of LEDs preferably arranged in a select pattern, such as a circular ring having a centrally-disposed hole through which the projected image and the viewed object's image passes.
  • the LEDs are preferably 740 nm near-infrared LEDs 190 that illuminate the object O, and research has determined that such a structure provides sufficient diffused infrared light for satisfactory illumination of object O.
  • a fifth embodiment 192 of the imaging system of the present invention will now be explained.
  • the fifth embodiment does not provide an integral diffuse infrared illumination source (e.g., illumination source 182 with a ring of LEDs 190) for illuminating the object, but instead views the object as illuminated by ambient light L (or the sun S) that has a broader spectrum than the integral diffuse infrared illumination sources heretofore disclosed.
  • Although ambient light has some infrared spectral components and is quite diffuse, those infrared spectral components are generally of lower intensity than the infrared light produced by the diffuse infrared illumination sources heretofore disclosed. Accordingly, a better (i.e., more sensitive) image device camera is required for this embodiment, with better optics than the previously-described embodiments.
  • the fifth embodiment 192 includes video projector 162, including a green "photon engine” 164, prism assembly 168, projector lens 172, and DMD chip 166.
  • fifth embodiment 192 includes a "fold mirror” 194 that folds the beam at a right angle within the projector between the photon engine 164 and prism assembly 168.
  • fifth embodiment 192 includes a "hot mirror” 174.
  • Fifth embodiment 192 further has an infrared filter 196 interposed in the optical path between the imaging device (camera 198) and object O so as to filter out all but the infrared component of the image viewed by camera 198.
  • Camera 198 is preferably a Basler CMOS camera, model A600-HDR, made by Basler Vision Technologies of Germany, which has an IEEE 1394 ("FireWire") interface and allows capture of images with up to a 112 dB dynamic range.
  • An advantage of the fifth embodiment is that it can be (and should be) used in a brightly-illuminated room.
  • FIGS. 16a and 16b, taken together in sequence, are a program listing for artifact removal image processing of the received image.
  • the same artifact removal procedure is performed twice, and then a well-known adaptive edge enhancement procedure is performed, such as, for example, unsharp masking, followed by a smoothing to clean up image artifacts produced by the hair removal.
  • the program listing is well-commented and explains to those skilled in the art the image processing steps that are applied to the image.
  • the received image, having integer pixel values in the range (0...255), is converted to floating point values between 0.0 and 1.0, inclusive.
  • the resulting image is then smoothed (blurred) using a Gaussian convolution having a sigma of 8 pixels. This is a fairly small value of sigma, and leaves small features, such as narrow hairs, in the resulting smoothed image.
  • a "difference image” is created which is the original image minus the Gaussian-smoothed image, producing a zero-centered set of values from -1 .0 to 1 .0.
  • the original image ("im1"), having pixel values ranging from 0.0 to 1.0, is then "boosted" at every "hair pixel" location by 0.015. Because this is a highly non-linear operation, the amount of "boost" is quite small, just 1.5%.
  • This same set of operations (Gaussian smoothing with a sigma of 8 pixels, creation of a difference image, identifying negative pixel locations, and “boosting” the image where negative pixels (small features and noise) are found) are performed again, and the resulting image is then smoothed again with a Gaussian convolution having a sigma of 64 pixels.
  • a third difference image is created, which is the again-"boosted" image minus the smoothed image, and an image is created that is formed from the absolute value of every pixel in the third difference image.
  • the resulting absolute value image is then smoothed with a Gaussian convolution having a sigma of 64 pixels, and the third difference image is then divided by the smoothed absolute value image, and the resulting divided image is smoothed with a Gaussian convolution having a sigma of 4 pixels.
  • the foregoing Artifact Removal algorithm allows the contrast to be set by the contrast of the subcutaneous vein (the subsurface structure of interest), ignoring the artifacts (hairs), and thereby prepares the image for adaptive unsharp masking edge enhancement to set the contrast of the final image.
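The sequence above is easier to follow in code. The following C++ sketch paraphrases the FIGS. 16a-b listing under stated assumptions: a gaussianBlur helper is assumed (only its sigma argument matters here), and the small epsilon guard against division by zero is added for safety; the 0.015 boost and the sigmas of 8, 64, and 4 pixels come from the text.

    #include <cmath>
    #include <cstddef>
    #include <vector>

    // Assumed helper: Gaussian convolution with the given sigma (in pixels).
    std::vector<float> gaussianBlur(const std::vector<float>& img,
                                    int w, int h, float sigma);

    // Hedged paraphrase of the artifact (hair) removal sequence; the input
    // image im1 holds floating point values between 0.0 and 1.0.
    std::vector<float> removeArtifacts(std::vector<float> im1, int w, int h) {
        for (int pass = 0; pass < 2; ++pass) {           // same operations performed twice
            std::vector<float> smooth = gaussianBlur(im1, w, h, 8.0f);
            for (std::size_t i = 0; i < im1.size(); ++i)
                if (im1[i] - smooth[i] < 0.0f)           // negative pixel = small feature/noise
                    im1[i] += 0.015f;                    // tiny, highly non-linear "boost"
        }
        std::vector<float> smooth64 = gaussianBlur(im1, w, h, 64.0f);
        std::vector<float> diff(im1.size()), absDiff(im1.size());
        for (std::size_t i = 0; i < im1.size(); ++i) {
            diff[i] = im1[i] - smooth64[i];              // third difference image
            absDiff[i] = std::fabs(diff[i]);             // absolute value image
        }
        std::vector<float> absSmooth = gaussianBlur(absDiff, w, h, 64.0f);
        for (std::size_t i = 0; i < diff.size(); ++i)
            diff[i] /= (absSmooth[i] + 1e-6f);           // divide by smoothed |difference|
        return gaussianBlur(diff, w, h, 4.0f);           // final light smoothing
    }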
  • FIGS. 17a, 17b, 17c, 17d, 17e, and 17f, taken together in sequence, are a program listing in the C++ programming language for artifact removal image processing of the received image. It is based upon the research/investigation program shown in FIGS. 16a and 16b, but instead uses the Intel image processing library to perform the mathematical operations more quickly.
  • any or all of the embodiments of the present invention preferably include a mechanism for keeping the image of the buried structure, as seen by the imaging device, in focus to the image device camera with a proper lens-to-subject distance thereto.
  • a first embodiment of this mechanism uses a pair of lasers 150, 152, each laser respectively emitting a beam 200, 202, with beams 200 and 202 being non-parallel with respect to each other and thus being directed toward the object from different angles, such that the two laser beams only converge to the same spot 204 and intersect when the target is at the proper lens-to-subject distance from the imaging device, as shown by the position of intersecting plane 206.
  • the two laser beams will not intersect at a single point 204 but instead will appear on the surface of the object as a first pair of visible dots 212, 214 (for plane 208) or as a second pair of visible dots 216, 218 (for plane 210), indicating that the buried structure is not in focus to the imaging device camera, and that the distance from the object to the apparatus should be changed to bring the viewed image of the buried structure into focus.
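The underlying geometry is simple to quantify. Under the illustrative assumption that the two beams are each tilted inward by an angle theta from the viewing axis and cross at the in-focus distance, the separation of the two visible dots grows linearly with the focus error:

    #include <cmath>

    // Illustrative geometry sketch (an assumption, not the patent's math):
    // beams crossing at zFocus appear as two dots whose separation is
    // proportional to the lens-to-subject error at any other distance z.
    double dotSeparation(double z, double zFocus, double theta) {
        return 2.0 * std::fabs(z - zFocus) * std::tan(theta);
    }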
  • Lasers 150 and 152 may also be seen in FIGS. 12, 13, and 14. Suitable lasers for use with the present invention are the model LM-03 laser modules made by Roithner Lasertechnik, of Vienna, Austria.
  • a second embodiment of the target positioning mechanism adds a recognizable visible light pattern, such as a text border, independent of the buried structure being observed, to the projected image for mutual projection therewith.
  • the projected recognizable pattern will only be recognized by the human viewer as being in focus on the surface of the target object when the target is at the desired distance from the projector, thereby causing the buried structure beneath the surface of the target to also be at the proper lens-to-subject distance from the imaging device.
  • cartoon figures appealing to children could be provided as an incentive for children to properly position their body parts for viewing subcutaneous blood vessels, or a hospital's or clinic's logo or name could be used for the pattern.
  • FIG. 22 is another photograph of a projected image having a text border therearound, similar to FIG. 21 but in which the viewed object has been moved out of position, showing how the text border becomes out-of-focus to indicate that the object is not positioned properly with respect to the image device camera.
  • FIG. 23 shows a text border image that is combined with a projected image for joint projection onto the object to ensure proper positioning. Because of the image reversal that occurs in some embodiments of the invention as images reflect inside the prism structure heretofore described, this text border image is shown reversed but appears unreversed when projected. The projected image is appropriately cropped before combining with the text border so that the text border remains sharp and distinct when projected.
  • FIG. 24 is a photograph of a processed image of subsurface veins projected onto a hand by the present invention, similar to FIG. 20 (which omits the text border) and FIG. 21 but showing how the text border becomes out of focus to indicate that the hand is not positioned properly.
  • a calibration method is provided wherein the video projector 138 (or 162, or any of the projectors of the present invention) projects a green target pattern 220 onto a fluorescent screen 222, which converts the projected four-dot green target pattern 220 into deep red light that is visible to the infrared imaging device 132.
  • a computer program records the observed position of the viewed pattern of four projected dots P1, P2, P3, and P4, in Cartesian coordinates, i.e., (x1, y1), (x2, y2), (x3, y3), and (x4, y4), versus the desired or "true" position of the dots if alignment were correct, i.e., (X1, Y1), (X2, Y2), (X3, Y3), and (X4, Y4), and calculates calibration coefficients (a, b, c, d, g, h, k, f) to be used in the bi-linear transformation equations (the arguments to the "solve" function in FIG. 25a and FIG. 25b).
  • FIG. 25a and FIG. 25b show the use of the MAPLE 9 computer equation solving program to solve for the bilinear transformation coefficients as a function of the values measured during calibration. These calibration coefficients are used during operation of the device to transform the coordinate system of the image (x, y) into the corrected coordinate system (X, Y) necessary to produce a calibrated image.
  • FIG. 26 shows how these coordinates, once calculated during calibration, are used as parameters to a well-known image processing library mathematical routine provided by the integrated circuit company Intel for use with its processors, to achieve high performance image alignment correction using the bilinear transformation equation.
  • the run-time calculations are done using scaled integer arithmetic, rather than floating point arithmetic, for faster processing of the image.
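For illustration, one common form of the eight-coefficient bi-linear transformation is sketched below in C++. The exact term ordering solved for in FIGS. 25a-b is not reproduced in this text, so the form shown is an assumption: four observed/true point pairs supply the eight equations needed for the eight unknowns, and, as noted above, the run-time version would use scaled integer rather than floating point arithmetic.

    // Hedged sketch: applying a bi-linear transformation with calibration
    // coefficients (a, b, c, d, g, h, k, f); the term ordering is assumed.
    struct BilinearCoeffs { double a, b, c, d, g, h, k, f; };

    // Map an observed camera coordinate (x, y) to the corrected coordinate
    // (X, Y) used to warp the image prior to projection.
    void bilinearTransform(const BilinearCoeffs& c, double x, double y,
                           double& X, double& Y) {
        X = c.a * x + c.b * y + c.c * x * y + c.d;
        Y = c.g * x + c.h * y + c.k * x * y + c.f;
    }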
  • the calibration procedure projects a test pattern 220, consisting of four dots P1, P2, P3, and P4, each having a 25-pixel radius (as viewed by the imaging device camera), at the corners of a rectangle having dimensions of 320 x 240 pixels (as viewed by the imaging device camera), onto the fluorescent screen 222.
  • the camera 132 might have a resolution of 640 x 480 pixels
  • the projector 138 might have a resolution of 1024 x 780 pixels.
  • a test pattern of four spaced-apart dots P1 , P2, P3, and P4 is projected within a first spectrum, preferably using green light, onto a fluorescent screen 222, which then fluoresces and produces light within a second spectrum, preferably light adjacent or within the infrared spectrum, such as red light, that is visible to the image device camera 132, even through the infrared transmitting filter through which the image device camera views its target object.
  • Calibration software then measures the observed position of the four dots and computes the correction coefficients (a, b, c, d, g, f, h, k) for the bi-linear transformation equation, and then uses those coefficients as parameters to the bi-linear transformation in order to correct misalignment errors (rotation, translation, and magnification) between the image device camera and the projector by warping the image prior to projection so that the projected image is corrected for misalignment.
  • a received image may be visually enhanced by various image processing techniques before being projected back onto a target.
  • For example, an artifact removal process is described that employs, inter alia, an unsharp mask: a blurred version of the object image is produced and is subtracted from an original object image (i.e., a focused image) to produce an edge-enhanced image. Additional techniques can be applied according to embodiments of the present invention.
  • Fig. 27A is a flow chart of a method for contrast enhancing an image of an object according to an embodiment of the present invention.
  • image data is received at an image processing device, e.g., from the camera.
  • the image data may be processed in known digital formats, such as, e.g., pixel data on a 0-255 gray scale.
  • a blurred image is generated by application of a blur filter, such as, e.g., Gaussian blurring. This blurring may be implemented in the spatial domain via convolution, or in the frequency domain to enhance computational speed.
  • the resulting blurred image is subtracted (e.g., pixel-by-pixel) from the original image at step 27-3, resulting in the unsharp mask (27-4).
  • the absolute value (ABS) of the unsharp mask is taken (27-5) and another blur filter is applied thereto (27-6).
  • the unsharp mask is divided (27-7) by the blurred ABS of the unsharp mask, and the result of the operation is adjusted to generate the final enhanced image (27-8).
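Steps 27-1 through 27-8 can be pulled together in a short C++ sketch. The 192- and 96-pixel window sizes are taken from the paragraphs below; blur() stands for the averaging-window filter described next, and the epsilon guard against division by zero is an added assumption.

    #include <cmath>
    #include <cstddef>
    #include <vector>

    // Averaging-window blur, assumed here and sketched later in this section.
    std::vector<float> blur(const std::vector<float>& img, int w, int h, int window);

    // Hedged sketch of the Fig. 27A pipeline on a float grayscale image.
    std::vector<float> enhanceFig27A(const std::vector<float>& img, int w, int h) {
        const float eps = 1e-6f;                             // assumed divide-by-zero guard
        std::vector<float> blurred = blur(img, w, h, 192);   // 27-2: first blur filter
        std::vector<float> mask(img.size()), absMask(img.size());
        for (std::size_t i = 0; i < img.size(); ++i) {
            mask[i] = img[i] - blurred[i];                   // 27-3/27-4: unsharp mask
            absMask[i] = std::fabs(mask[i]);                 // 27-5: ABS of the mask
        }
        std::vector<float> blurredAbs = blur(absMask, w, h, 96); // 27-6: second, smaller blur
        std::vector<float> out(img.size());
        for (std::size_t i = 0; i < img.size(); ++i)
            out[i] = mask[i] / (blurredAbs[i] + eps);        // 27-7: divide by blur(|mask|)
        return out;                                          // 27-8: then linearly rescale
    }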
  • the blurred image is created by applying an "averaging window" to each pixel in the image.
  • An averaging window is a window having a kernel size smaller than that of the image being processed.
  • the averaging window is centered on each pixel of the image, and the value of the pixel of interest is set to the average value of all the pixels within the window. For example, in an image having a resolution of 640X480 pixels, it has been determined that a 192X192 sized averaging window produces a good result as a first blur filter.
  • when the averaging window is applied to pixels near the border of the image, such that the averaging window extends beyond the image boundary, the pixels in the window are mirrored in order to fill the averaging window. In this manner, the blurred image is created.
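A sketch of such an averaging window with mirrored borders follows; an odd window extent (2r+1) centered on each pixel is used to approximate the even 192X192 size the text reports, and the function names are illustrative.

```cpp
#include <vector>

// Reflects an out-of-range index back into [0, n-1], as the text
// describes for windows that extend beyond the image border.
inline int mirror(int i, int n) {
    if (i < 0) i = -i - 1;
    if (i >= n) i = 2 * n - i - 1;
    return i;
}

// Averaging-window blur: each output pixel is the mean of the window
// centered on it, with out-of-bounds samples mirrored to fill the window.
std::vector<float> averageWindow(const std::vector<float>& in, int w, int h, int win) {
    std::vector<float> out(in.size());
    const int r = win / 2;
    const double area = double(2 * r + 1) * double(2 * r + 1);
    for (int y = 0; y < h; ++y)
        for (int x = 0; x < w; ++x) {
            double sum = 0.0;
            for (int dy = -r; dy <= r; ++dy)
                for (int dx = -r; dx <= r; ++dx)
                    sum += in[mirror(y + dy, h) * w + mirror(x + dx, w)];
            out[y * w + x] = static_cast<float>(sum / area);
        }
    return out;
}
```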
  • the blur filter is applied two different times. It has been determined that better results are obtained when the second application of the blur filter uses a different window size than the first, preferably a smaller size. It was determined that if the first averaging window has a kernel size of 192X192 pixels, then a second averaging window of 96X96 pixels results in an effective increase in sharpness of the image.
  • final contrast adjustment can be accomplished by performing linear scaling.
  • the division function performed prior to this step results in a 16-bit signed integer. This value can be scaled back to an 8-bit unsigned integer using min and max values.
  • minimum (Min) and maximum (Max) parameters determine the spread and hence, the degree of contrast increase.
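The final scaling step might be sketched as follows, mapping the 16-bit signed division result to an 8-bit unsigned display range; the function name and the clipping of values outside the chosen (Min, Max) spread are assumptions.

```cpp
#include <cstddef>
#include <cstdint>
#include <vector>

// Linear rescale of the 16-bit signed division result to an 8-bit
// unsigned display range; values outside [Min, Max] are clipped, and a
// narrower (Min, Max) spread yields a stronger contrast stretch.
std::vector<uint8_t> linearScale(const std::vector<int16_t>& in, int16_t Min, int16_t Max) {
    std::vector<uint8_t> out(in.size());
    const double span = double(Max) - double(Min);
    for (std::size_t i = 0; i < in.size(); ++i) {
        const double v = (in[i] - Min) * 255.0 / span;
        out[i] = static_cast<uint8_t>(v < 0.0 ? 0.0 : (v > 255.0 ? 255.0 : v));
    }
    return out;
}
```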
  • Results of the image processing can be appreciated from Figs. 27B-C.
  • Fig. 27B is an image of a test target (gradient) along with a plot of the pixel values for a selected section of the gradient.
  • Fig. 27C is an image of the test gradient after being enhanced by the process set forth in Fig. 27A along with a plot of the post processed pixel values for the selected section of the gradient. As can be seen, the appearance of detail is created, such as the darkened center lines of the gradient lines.
  • FIG. 28A is a flow chart of another method for enhancing the contrast of an image to provide improved dimensional detail, according to an embodiment of the present invention.
  • the image to be processed is received.
  • a blurred image is generated by application of a blur filter at step 28-2, as already described above.
  • the blurred image is subtracted from the original image at step 28-3, resulting in the unsharp mask (28-4).
  • the ABS of the unsharp mask is taken (28-5) and the blur filter is applied thereto (28-6).
  • the ABS of the unsharp mask is divided (28-7) by the blurred ABS of the unsharp mask, and the result of the operation is adjusted to generate the final enhanced image (28-8).
  • for the method of Fig. 28A, it was determined that employing first and second averaging windows of 76X76 and 40X40 pixels, respectively, achieved superior results.
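The Fig. 28A variant differs from the Fig. 27A sketch only in the numerator of the division and in the window sizes; a sketch follows, assuming the Img alias and boxBlur helper from the Fig. 27A sketch above.

```cpp
#include <cmath>
#include <cstddef>
#include <vector>

using Img = std::vector<float>;                          // as in the Fig. 27A sketch
Img boxBlur(const Img& in, int w, int h, int win);       // defined in the Fig. 27A sketch

// Fig. 28A variant: the numerator of the division is the absolute value
// of the unsharp mask rather than the signed mask, with the smaller
// 76- and 40-pixel windows the text reports.
Img enhance28A(const Img& orig, int w, int h) {
    const Img blurred = boxBlur(orig, w, h, 76);         // first blur (28-2)
    Img absMask(orig.size());
    for (std::size_t i = 0; i < orig.size(); ++i)
        absMask[i] = std::fabs(orig[i] - blurred[i]);    // |unsharp mask| (28-4/28-5)
    const Img blurredAbs = boxBlur(absMask, w, h, 40);   // second blur (28-6)
    Img out(orig.size());
    for (std::size_t i = 0; i < orig.size(); ++i)
        out[i] = absMask[i] / (blurredAbs[i] + 1e-6f);   // divide (28-7); epsilon added
    return out;
}
```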
  • Fig. 28B is an image of the test gradient after being enhanced by the process set forth in Fig. 28A along with a plot of the post processed pixel values for the selected section of the gradient. As can be seen, the appearance of detail is created, such as the darkened edges of the gradient lines.
  • Fig. 28C includes images of an enhanced image of subcutaneous vessels projected back on a human arm. The top image is a result of processing according to the method of Fig. 27A while the bottom image is a result of processing according to the method of Fig. 28A.
  • One having skill in the art should appreciate the distinctly different results each of the methods produce and understand that the techniques could be preferred for different applications.
  • Fig. 28D includes images of a human target body part during steps of the process of Fig. 28A.
  • the top left image is a raw image of the target body part.
  • the top right image is a blurred image of the target body part.
  • the middle left image is the result of subtracting the blurred image from the raw image.
  • the middle right image is the result of the process of Fig. 28A, having enhanced dimensional detail.
  • the bottom two plots are cross-sectional plots of the pixel data for the two images respectively above them.
  • Fig. 29A is a flow chart of another method for enhancing the contrast of an image, according to an embodiment of the present invention.
  • the image is received at step 29-1 from the camera.
  • a blurred image is generated by application of a blur filter at step 29-2.
  • the blurred image is subtracted from the original image at step 29-3, resulting in the unsharp mask (29-4).
  • the absolute value of the unsharp mask is taken (29-5) and the blur filter is applied thereto (29-6).
  • the unsharp mask is divided (29-7) by the blurred ABS of the unsharp mask, and the result of the operation is adjusted to generate the enhanced image (29-8).
  • each pixel is compared against a brightness threshold at step 29-9. If the pixel is below the threshold, the pixel is set to the maximum level (e.g., 255 on a contrast scale of 0-255).
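Step 29-9 might be sketched as follows; the function name is illustrative, and the code follows the text literally in setting below-threshold pixels to the maximum level.

```cpp
#include <cstdint>
#include <vector>

// Step 29-9 as the text describes it: every pixel whose brightness falls
// below the threshold is forced to the maximum level (255 on a 0-255 scale).
void thresholdToMax(std::vector<uint8_t>& img, uint8_t threshold) {
    for (auto& p : img)
        if (p < threshold) p = 255;
}
```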
  • Fig. 29B is an image of the test gradient after being enhanced by the process set forth in Fig. 29A along with a plot of the post processed pixel values for the selected section of the gradient. As can be seen, the appearance of detail is created by extreme contrast between the darkened edges of the gradient lines with the bright center.
  • One having skill in the art should appreciate the distinctly different results each of the methods produce and understand that the techniques could be preferred for different applications.
  • Fig. 30A is a flow chart of another method for enhancing the contrast of an image, according to an embodiment of the present invention.
  • the image to be processed is received at step 30-1 from the camera.
  • a blurred image is generated by application of a blur filter at step 30-2.
  • the blurred image is subtracted from the original image at step 30-3, resulting in the unsharp mask (30-4).
  • the absolute value of the unsharp mask is taken (30-5) and the blur filter is applied thereto (30-6).
  • the unsharp mask is divided (30-7) by the blurred ABS of the unsharp mask, and the result of the operation is adjusted to generate the enhanced image (30-8).
  • each pixel of the image is adjusted (reduced or increased) by an offset.
  • the value is reduced by a constant value and resulting negative values are "rolled over.” For example, using a gray scale of 0-255 and a constant of 30, an image value of 25 is reduced to -5 which is out of the allowable range and is rolled over to 250. If an offset is used to increase the pixel values, pixel values roll over from 255 to 0.
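A sketch of the offset-with-rollover step follows. The wrap convention (adding or subtracting 255) is inferred from the text's worked example, in which 25 - 30 = -5 rolls over to 250; a strict modulo-256 wrap would give 251 instead, so this detail is an interpretation.

```cpp
#include <cstdint>
#include <vector>

// Offset adjustment with "rollover" on a 0-255 gray scale. The +/-255
// wrap follows the text's example (25 - 30 = -5 rolls over to 250).
void applyOffset(std::vector<uint8_t>& img, int offset) {
    for (auto& p : img) {
        int v = static_cast<int>(p) + offset;            // offset may be negative
        if (v < 0)   v += 255;                           // e.g. -5  -> 250
        if (v > 255) v -= 255;                           // e.g. 285 -> 30
        p = static_cast<uint8_t>(v);
    }
}
```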
  • Fig. 30B is an image of the test gradient after being enhanced by the process set forth in Fig. 30A along with a plot of the post-processed pixel values for the selected section of the gradient.
  • Fig. 29C includes images of an enhanced image of subcutaneous vessels projected back on a human arm. The top image is a result of processing according to the method of Fig. 27A while the bottom image is a result of processing according to the method of Fig. 30A.
  • noise or interference in the image caused by hair on the body can be reduced by adding a step to the above processes that first applies a "maximum filter" to the image before applying the rest of the process steps.
  • the maximum filter is similar to the blur filter but instead of applying an averaging window to each pixel, a maximum window is applied.
  • the maximum window identifies the maximum value of any pixel in the window covering the pixel of interest and sets the pixel of interest to that maximum. It has been determined that a maximum window of the size 12X12 pixels centered on each pixel of interest achieves good results.
  • the maximum window filter can be applied to the method of Fig. 27A in order to reduce the influence of hair on the image.
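A sketch of the maximum filter follows (illustrative names; an odd 13-pixel extent approximates the 12X12 window so it can be centered, and borders are clamped, a detail the text does not specify).

```cpp
#include <algorithm>
#include <cstdint>
#include <vector>

// Maximum filter: each output pixel becomes the largest value within the
// window centered on it, so bright skin overwrites thin dark hairs before
// the contrast-enhancement steps run.
std::vector<uint8_t> maxFilter(const std::vector<uint8_t>& in, int w, int h, int win = 13) {
    std::vector<uint8_t> out(in.size());
    const int r = win / 2;
    for (int y = 0; y < h; ++y)
        for (int x = 0; x < w; ++x) {
            uint8_t m = 0;
            for (int dy = -r; dy <= r; ++dy)
                for (int dx = -r; dx <= r; ++dx) {
                    const int sx = std::min(std::max(x + dx, 0), w - 1);
                    const int sy = std::min(std::max(y + dy, 0), h - 1);
                    m = std::max(m, in[sy * w + sx]);
                }
            out[y * w + x] = m;
        }
    return out;
}
```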
  • Digital image processing can be performed by known conventional means, such as by a combination of hardware, software and/or firmware using logarithmic video signals or digital video signals.
  • processing is performed programmatically in a known computer language such as C.
  • the present invention is not limited, however, to any particular computing arrangement.

Abstract

An imaging system and method illuminates body tissue with infrared light to enhance visibility of a vascular structure, and generates an image of the body tissue and the subcutaneous blood vessels based on reflected infrared light. The system includes an infrared illumination source for generating the infrared light and a structure for diffusing the infrared light. The system further includes an imaging device for receiving the infrared light reflected from the body tissue and for generating an enhanced image of the body tissue based on the reflected infrared light. The enhanced image is produced by contrast enhancement techniques involving applications of an unsharp mask. The system further includes a projector for receiving an output signal from the imaging device and for projecting the enhanced image onto the body tissue.

Description

TITLE OF THE INVENTION:
SYSTEM AND METHOD FOR PROJECTION OF SUBSURFACE STRUCTURE ONTO AN OBJECT'S SURFACE
BACKGROUND OF THE INVENTION
Technical Field
[0001] The present invention is generally directed to generation of diffuse infrared light. More particularly, the invention is directed to a system for illuminating an object with diffuse infrared light, producing a video image of buried structure beneath the surface of the object based on reflected infrared light, and then projecting an image of the buried structure onto the surface of the object.
Description of the Related Art
[0002] Many medical procedures and treatments require a medical practitioner to locate a blood vessel in a patient's body, such as in their arm or other appendage. Identifying the location of a blood vessel can be a difficult task, especially when the blood vessel is small and/or the vessel is under a significant deposit of subcutaneous fat or other tissue. The performance of previous imaging systems designed to aid in finding such blood vessels has been poor.
[0003] Assignee for the present invention owns U.S. Patent No. 5,969,754
(the "754 patent"), entitled CONTRAST ENHANCING ILLUMINATOR, which describes a system for viewing subcutaneous blood vessels. In that system, diffuse infrared light is projected onto a target body part and the reflected light therefrom is used to generate an image of the subcutaneous vessels, which can be projected back onto the target body part. The entire contents of the 754 patent are incorporated herein by reference.
[0004] Although the image is enhanced before projection is made back onto the target, additional enhancement techniques are desired for varying applications. Further, when the image is projected back upon a target body position, the quality of the image can suffer due to factors such as the tone and texture of human skin, the amount of human hair on the target body part, etc. Accordingly, the systems and methods of the 754 patent could be improved.
[0005] U.S. Patent No. 6,556,858 (the "'858 patent"), entitled DIFFUSE
INFRARED LIGHT IMAGING SYSTEM, and pending U.S. Patent No. 7,239,909 (the "'909 patent"), entitled IMAGING SYSTEM USING DIFFUSE INFRARED LIGHT, were also filed by the assignee. The contents of the '858 patent and the '909 patent are hereby incorporated by reference in their entirety.
[0006] Although the '858 patent and '909 patent improved upon systems and methods of the 754 patent, there exists a need for further improved systems and methods for enhancing the visual contrast between subcutaneous blood vessels and surrounding tissue.
SUMMARY OF THE INVENTION
[0007] It is therefore an object of the present invention to overcome disadvantages of the prior art by providing systems and methods for enhancing the visual contrast between a vascular structure and surrounding tissue. [0008] In accordance with an embodiment of the present invention, an imaging system and method illuminates body tissue with infrared light to enhance visibility of subcutaneous blood vessels, and generates an image of the body tissue and the subcutaneous blood vessels based on reflected infrared light. The system includes an infrared illumination source for generating the infrared light. The system further includes an imaging device for receiving the infrared light reflected from the body tissue and for generating an enhanced image of the body tissue based on the reflected infrared light. The enhanced image is produced by contrast enhancement techniques involving applications of an unsharp mask. The system further includes a projector for receiving an output signal from the imaging device and for projecting the enhanced image onto the imaged body tissue.
[0009] According to another embodiment, the contrast enhancement techniques include application of first and second blur filters each having a different resolution. The blur filters are used for generating first and second unsharp masks. The blur filters include application of an "averaging window" to each pixel in the image to generate a blurred image. [0010] According to another embodiment, the contrast enhancement techniques include adjustment of the window sizes of blur filters used to generate the unsharp mask.
[0011] According to another embodiment, the contrast enhancement techniques include application of a threshold to pixel data and setting the value of each pixel to a preset value when the pixel data is below the threshold.
[0012] According to another embodiment, the contrast enhancement techniques include application of an offset to pixel data whereby each pixel is adjusted higher or lower by a set amount. Further, if after application of the offset, an adjusted pixel value is outside of the allowable range (e.g., 0-255), the value is
"rolled over" to an allowable value.
[0013] According to another embodiment, the contrast enhancement techniques include application of linear scaling to the image as a final contrast adjustment. [0014] According to another embodiment, the contrast enhancement techniques include using the absolute values of pixel data during execution of one or more processing steps.
[0015] According to another embodiment, the contrast enhancement techniques include application of a maximum filter window that sets the value of a target pixel to the maximum value of any pixels within the window.
[0016] According to another embodiment, selection means can be provided for allowing selection of a contrast enhancement technique or a combination of contrast enhancement techniques to be executed from a plurality of contrast enhancement techniques. [0017] According to another embodiment, the systems and methods of the present invention can be used to identify the location of vascular structures.
Accordingly, procedures involving locating or avoiding vascular structures in the body can be performed with application of the system and method of the present invention. [0018] Further applications and advantages of various aspects and embodiments of the present invention are discussed below with reference to the drawing figures.
TECHNICAL ASPECTS OF THE INVENTION
[0019] From a technical point of view, the present invention addresses a situation, wherein some medical procedures and treatments require a medical practitioner to locate a blood vessel in a patient's arm or other appendage. In the prior art, this could be a difficult task, especially when the blood vessel lies under a significant deposit of subcutaneous fat. The performance of previous imaging systems designed to aid in finding such blood vessels has been lacking. It is therefore the technical problem underlying the present invention to provide an apparatus and method for enhancing the visual contrast between subcutaneous blood vessels and surrounding tissue.
[0020] This problem is solved by an apparatus to enhance the visibility of a buried structure beneath the surface of an object. The apparatus comprises an imaging device for receiving diffuse light reflected from an object, for producing an input image, and for generating an enhanced image therefrom, and a video projector for projecting a visible light image of the buried structure onto the surface of the object.
[0021] The technical idea underlying the invention is a conceptual change by including new contrast enhancement techniques that aid in the location of the edges of buried structures by making them appear with a sharper contrast to the surrounding tissue. As a result, the difficult task of locating a blood vessel in a patient's arm or other appendage is much easier because the blood vessel becomes visible as an image projected on the skin.
[0022] Preferably, the apparatus also comprises an infrared light source for illuminating the body tissue with infrared light which reflects from the body tissue and is imaged by the imaging device.
[0023] In preferred embodiments of this invention, contrast enhancement may be achieved by, in addition to unsharp masking, adding a value to each pixel value of the input image, using a threshold to set all values above or below the threshold to a preset value, or taking the absolute value of each pixel value.
BRIEF DESCRIPTION OF THE DRAWINGS
[0024] FIG. 1 depicts an imaging system for viewing an object under infrared illumination according to a preferred embodiment of the invention; [0025] FIGS. 2a and 2b are perspective views of an imaging system using diffuse infrared light according to a preferred embodiment of the invention;
[0026] FIGS. 3 and 4 are cross-sectional views of the imaging system according to a preferred embodiment of the invention; [0027] FIG. 5 is a functional block diagram of the imaging system according to a preferred embodiment of the invention;
[0028] FIG. 6a is a perspective view of an imaging system using diffuse infrared light according to an alternative embodiment of the invention;
[0029] FIG. 6b is a cross-sectional view of the imaging system of FIG. 6a; [0030] FIG. 7a is a perspective view of an imaging system using diffuse infrared light according to another embodiment of the invention;
[0031] FIG. 7b is a cross-sectional view of the imaging system of FIG. 7a;
[0032] FIG. 8 is an isometric view of yet another aspect of an imaging system;
[0033] FIG. 9 is a front view of a portion of the imaging system as viewed in the direction of the arrows taken along line A-A of FIG. 8;
[0034] FIG. 10 is a cross-sectional side view taken along line B-B of FIG. 9; and
[0035] FIG. 11 is a block diagram of an imaging system;
[0036] FIG. 12 is a perspective internal view of a third version of the imaging system of the present invention;
[0037] FIG. 13 is an internal view of a fourth version of the imaging system of the present invention with some parts shown in section for purposes of explanation.
[0038] FIG. 14 is a diagrammatic view of the fourth version of the imaging system of the present invention. [0039] FIG. 15 is an internal view of a fifth version of the imaging system of the present invention, which uses ambient lighting to illuminate the viewed object.
[0040] FIGS. 16a and 16b, taken together in sequence, are a program listing for artifact removal image processing of the received image.
[0041] FIGS. 17a, 17b, 17c, 17d, 17e, and 17f, taken together in sequence, are a program listing in the C++ programming language for artifact removal image processing of the received image.
[0042] FIG. 18 is a diagrammatic perspective view showing how a pair of laser pointers is used to position the object to be viewed. [0043] FIG. 19 is a diagrammatic perspective view showing the calibration procedure for the imaging system of the present invention. [0044] FIGS. 20a, 20b, and 20c are photographs of a processed image of subcutaneous blood vessels projected onto body tissue that covers the blood vessels. [0045] FIG. 21 is a photograph of a projected image having a text border therearound.
[0046] FIG. 22 is another photograph of a projected image having a text border therearound, similar to FIG. 21 but in which the viewed object has been moved out of position, showing how the text border becomes out-of-focus to indicate that the object is not positioned properly.
[0047] FIG. 23 shows a text border image that is combined with a projected image for joint projection onto the object to ensure proper positioning. [0048] FIG. 24 is a photograph of a processed image of subsurface veins projected onto a hand by the present invention, similar to Fig. 20 (which omits the text border) and Fig. 21 but showing how the text border becomes out of focus to indicate that the hand is no positioned properly.
[0049] FIG. 25a and FIG. 25b are computer listings showing the solution for bilinear transformation coefficients of the calibration procedure for the imaging system of the present invention. [0050] FIG. 26 is a program listing in the C++ programming language, which performs the run-time correction to the viewed image of the object using coefficients determined during the calibration procedure.
[0051] Fig. 27A is a flow chart of one method for contrast enhancing an image of an object according to an embodiment of the present invention. [0052] Fig. 27B is an image of a test target (gradient) along with a plot of the pixel values for a selected section of the gradient.
[0053] Fig. 27C is an image of the test gradient after being enhanced by the process set forth in Fig. 27A along with a plot of the post processed pixel values for the selected section of the gradient. [0054] Fig. 28A is a flow chart of another method for enhancing the contrast of an image to provide improved dimensional detail, according to an embodiment of the present invention. [0055] Fig. 28B is an image of the test gradient after being enhanced by the process set forth in Fig. 28A along with a plot of the post processed pixel values for the selected section of the gradient.
[0056] Fig. 28C includes images of an enhanced image of subcutaneous vessels projected back on a human arm.
[0057] Fig. 29A is a flow chart of another method for enhancing the contrast of an image, according to an embodiment of the present invention.
[0058] Fig. 29B is an image of the test gradient after being enhanced by the process set forth in Fig. 29A along with a plot of the post processed pixel values for the selected section of the gradient.
[0059] Fig. 29C includes images of an enhanced image of subcutaneous vessels projected back on a human arm.
[0060] Fig. 30A is a flow chart of another method for enhancing the contrast of an image, according to an embodiment of the present invention. [0061] Fig. 30B is an image of the test gradient after being enhanced by the process set forth in Fig. 30A along with a plot of the post processed pixel values for the selected section of the gradient.
DETAILED DESCRIPTION OF THE PREFERRED EMBODIMENTS
[0062] While the present invention may be embodied in many different forms, a number of illustrative embodiments are described herein with the understanding that the present disclosure is to be considered as providing examples of the principles of the invention and such examples are not intended to limit the invention to the embodiments described and/or illustrated herein.
[0063] Skin and some other body tissues reflect infrared light in the near-infrared range of about 700 to 900 nanometers while blood absorbs radiation in this range. Thus, in video images of body tissue taken under infrared illumination, blood vessels appear as dark lines against a lighter background of surrounding flesh. However, due to the reflective nature of subcutaneous fat, blood vessels that are disposed below significant deposits of such fat can be difficult or impossible to see when illuminated by direct light, that is, light that arrives generally from a single direction.
[0064] The inventor has determined that when an area of body tissue having a significant deposit of subcutaneous fat is imaged in the near-infrared range under illumination of highly diffuse infrared light, there is significantly higher contrast between the blood vessels and surrounding flesh than when the tissue is viewed under direct infrared illumination. Although the invention should not be limited by any particular theory of operation, it appears that most of the diffuse infrared light reflected by the subcutaneous fat is directed away from the viewing direction. Thus, when highly diffuse infrared light is used to illuminate the tissue, the desired visual contrast between the blood vessels and the surrounding flesh is maintained.
[0065] Shown in FIG. 1 is an imaging system 2 for illuminating an object 32, such as body tissue, with highly diffuse infrared light, and for producing a video image of the object 32 based upon infrared light reflected from the object 32. As described in detail herein, when the object 32 is body tissue, blood vessels that are disposed below subcutaneous fat in the tissue may be clearly seen in a video image produced by the system 2.
[0066] The imaging system 2 includes an illumination system 10 that illuminates the object 32 with infrared light from multiple different illumination directions. The system 10 includes multiple infrared light providers 10a-10f, each providing infrared light to the object 32 from a different illumination direction. The directions of arrival of the infrared light from each light provider 10a-10f are represented in FIG. 1 by the rays 4a-4f. As shown in FIG. 1, the directions of arrival of the infrared light range from perpendicular or near perpendicular to the surface of the object 32, to parallel or near parallel to the surface of the object 32. In this embodiment, since the infrared illumination arrives at the object 32 from such a wide range of illumination directions, the infrared illumination is highly diffuse.
[0067] As described in greater detail hereinafter, the light providers 10a-10f are preferably light reflecting surfaces that direct light from a single illumination source toward the object 32. In other embodiments, the light providers 10a-10f are individual illumination sources, or combinations of illumination sources and reflectors.
[0068] The imaging system 2 also includes an imaging device 38, such as a video camera, for viewing the object 32. The imaging device 38 views the object 32 from a viewing direction which is represented in FIG. 1 by the arrow 6. The imaging device 38 receives the diffuse infrared light reflected from the object 32, and generates an electronic video image of the object 32 based on the reflected infrared light.
[0069] Shown in FIGS. 2a and 2b is a preferred embodiment of the illumination system 10. FIG. 3 depicts a cross-sectional view of the system 10 corresponding to the section A-A as shown in FIGS. 2a-b. The system 10 preferably includes an illumination source 12.
[0070] In a preferred embodiment of the invention, as depicted in FIG. 3, the illumination source 12 includes a cold mirror 34 disposed between the lamp 26 and the input aperture 18 of the outer enclosure 16. The cold mirror 34 reflects substantially all light having wavelengths outside a selected infrared range of wavelengths. Preferably, the selected range includes wavelengths from approximately 700 to 1100 nanometers. Immediately proximate the cold mirror 34, and disposed between the cold mirror 34 and the input aperture 18, is an infrared transmitting filter 36 which further attenuates light having wavelengths outside the selected infrared range while transmitting light having wavelengths within the selected infrared range. Thus, the light that passes through the cold mirror 34 and the filter 36 into the outer enclosure 16 is infrared light having wavelengths within the selected infrared range.
[0071] It should be appreciated that there are other ways that the illumination source 12 could be configured to generate infrared light. For example, the illumination source 12 could consist of an infrared light-emitting diode (LED) or an array of infrared LEDs. Thus, the configuration of the illumination source 12 shown in FIG. 3 and described above is a preferred embodiment only, and the invention is not limited to any particular configuration of the illumination source 12. [0072] As shown in FIG. 4, a preferred embodiment of the invention includes a lens 40 used in conjunction with the video imaging device 38 to produce a video image of the object 32 based on diffuse light reflected from the object 32.
Preferably, the imaging device 38 of this embodiment is a charge-coupled device (CCD) video camera 38 manufactured by Cohu, having model number 631520010000. The lens 40 of the preferred embodiment is a 25 mm f-0.95 movie camera lens manufactured by Angenieux. [0073] The camera 38 and lens 40 of the preferred embodiment are disposed within the tubular section 24a of the inner reflector 24. As shown in FIG. 3, the open end of the tubular section 24a forms an aperture toward which the camera 38 and lens 40 are pointed. In this manner, the hollow light guide 22 is substantially centered within the field of view of the camera 38. Thus, the camera 38 receives light reflected from the object 32 that enters the light guide 22, travels through the enclosure 16, and enters the open end of the section 24a.
[0074] As shown in FIG. 4, the preferred embodiment of the invention includes an infrared-transmitting filter 42 disposed in the open end of the tubular section 24a. This filter 42 receives light reflected from the object 32, and any other light that may enter the enclosure 16, and substantially eliminates all light having wavelengths outside the infrared range of approximately 700 to 1100 nanometers. Thus, the light that passes through the filter 42 and into the lens 40 is infrared light within the selected wavelength range. Therefore, the camera 38 primarily receives infrared light which originates from within the illumination system 10 and which is reflected from the object 32.
[0075] Based on the light reflected from the object 32, the camera 38 generates a video image of the object 32 in the form of an electrical video signal. As shown in FIG. 5, the video signal is preferably provided to an image enhancement board 44, such as a board manufactured by DigiVision having a model number ICE-3000. The board 44 generates an enhanced video image signal based on the video signal from the camera 38. The enhanced video image signal is provided to a video capture and display card 46, such as a model 20-TD Live card manufactured by Miro. The card 46 captures still images from the image signal which may be saved in digital format on a digital storage device. The card 46 also formats the video image signal for real-time display on a video monitor 48.
[0076] It should be appreciated that the illumination system 10 could use other means for generating diffuse infrared light in accordance with the invention. For example, the light providers 10a-10f of FIG. 1 could be embodied by a ring-light strobe light. Alternatively, a circular array of LEDs could be used to illuminate a plastic transmitting diffuser placed near the surface of the object 32. In the latter embodiment, the light providers 10a-10f would correspond to the individual LEDs in the array.
[0077] In an alternative embodiment of the invention depicted in FIGS. 6a and 6b, the imaging system 2 includes a video projector 50 for illuminating the object 32 with an image of the object 32 to enhance the visual contrast between lighter and darker areas of the object 32. As described in the 754 patent, the features of an object can be visually enhanced for an observer when the features of a projected visible-light image of the object overlay the corresponding features of the object. The overlaid visible-light image causes the bright features of the object to appear brighter while the dark areas remain the same.
[0078] The embodiment of the invention shown in FIGS. 6a and 6b provides diffuse infrared light (represented by the rays 52) to the object 32 in a manner similar to that described previously. However, in the embodiment shown in FIGS. 6a and 6b, the optical path of the illuminating light is folded, such that the exit aperture 23 of the light guide 22 is rotated by 90 degrees relative to the exit aperture shown in FIGS. 1-3.
[0079] As shown in FIG. 6b, a beam separator, such as a hot mirror 54, receives infrared light 52 from the interior of the light diffusing structure 14 and reflects the infrared light 52 into the light guide 22 and toward the object 32. The hot mirror 54 also receives an infrared image of the object 32 (represented by the ray 56) and reflects it toward the camera 38. The hot mirror 54 receives the visible-light image (represented by the ray 58) from the projector 50 and transmits it into the light guide 22 and toward the object 32.
[0080] As explained in greater detail in U.S. Pat. No. 5,969,754, the video output signal from the video camera 38 is provided as a video input signal to the projector 50. Based on the video input signal, the projector 50 projects the visible-light image 58 of the object 32 toward the hot mirror 54. The hot mirror 54 receives the visible-light image 58 and transmits it into the light guide 22 toward the object 32. By proper alignment of the projected visible-light image 58 from the projector 50 with the infrared image 56 of the object 32 which is sensed by the camera 38, the features in the projected visible-light image 58 are made to overlay the corresponding features of the object 32. This is generally achieved when the projected visible-light image 58 is coaxial with the infrared image of the object 32 (represented by the ray 56) received by the camera 38.
[0081] When the object 32 is body tissue, and the invention is used to find subcutaneous blood vessels in the body tissue, the blood vessels appear as dark lines in the projected visible-light image 58. Thus, when the visible-light image 58 is projected onto the body tissue, the subcutaneous blood vessels will lie directly beneath the dark lines in the projected visible-light image 58. In this manner, the invention significantly improves a medical practitioner's ability to find subcutaneous blood vessels while minimizing discomfort for the patient.
[0082] FIGS. 7a and 7b depict an alternative embodiment of the invention for use as a contrast enhancing illuminator. The embodiment of FIGS. 7a-b operates in a fashion similar to the embodiment of FIGS. 6a and 6b. However, in the embodiment of FIGS. 7a-b, the camera 38 is located outside the light diffusing structure 14. To accommodate the different location of the camera 38, the hot mirror 54 shown in FIGS. 7a-b is rotated by 90 degrees clockwise relative to its position in FIGS. 6a-b. Otherwise, the hot mirror 54 serves a similar function as that described above in reference to FIGS. 6a-b. Also to accommodate the different camera location, the infrared-transmitting filter 42 is mounted in a wall of the light guide 22. A reflective panel 60 is provided in this embodiment to further direct the light from the illumination source 12 into the light guide 22 and toward the exit aperture 23. Preferably, the panel 60 is a flat reflective sheet having an orifice therein to allow light to pass between the object 32 and the camera 38 and projector 50.
[0083] A preferred embodiment of a relatively compact and highly reliable imaging system 70 is depicted in FIGS. 8-11. The imaging system 70 is most preferably configured to illuminate an object 71, such as body tissue and the like, and to produce a video image of the object 71 based upon infrared light reflected from the object 71. The imaging system 70 preferably includes a housing 72 which contains the imaging features of the system 70.
[0084] As shown in FIG. 8, the housing 72 preferably has a substantially rectangular configuration. The housing 72 preferably has a length of between about three and about five inches and a width of about three and one-half inches. It will be appreciated by those skilled in the art that the imaging system 70 can be configured in a variety of ways and the invention should not be limited by any specific examples or embodiments discussed herein. For example, in FIG. 8 the housing is depicted as being substantially rectangular; however, circular, polygonal, and other geometries and sizes are feasible as well.
[0085] An imaging device 74, such as a video camera having a lens 75, and video processing components reside within the housing 72. The imaging device 74 and video processing components operate to detect infrared light and to process the detected infrared light from the object 71. The imaging device 74 produces an image based on the detected infrared light reflected from the object 71, as described herein. As shown in FIGS. 8 and 9, the imaging device 74 is preferably mounted within an aperture 76 of mounting wall 78, with the lens 75 extending into the housing interior 77, as described further below. More particularly, the camera 74 is preferably centrally and symmetrically mounted within the housing 72. This preferred symmetrical camera location tends to maximize the amount of light detected by the camera, which enhances the image produced by the system 70, thereby enhancing the illumination of blood vessels disposed below subcutaneous fat in body tissue.
[0086] The housing 72 most preferably contains various components operable to transmit diffuse light from the system 70 toward the object 71. Arrows 80 represent diffuse light transmitted by the system 70. Arrows 82 represent the light reflected from the object 71. As shown in FIG. 9, as viewed in the direction of the arrows along the section line A-A of FIG. 8, the wall 78 contains a number of infrared light emitting diodes (LEDs) 84 disposed in a LED array 85 for emitting infrared light. The LED array 85 defines a LED plane of reference. When activated, each LED 84 preferably transmits light at a wavelength of about 740 nanometers (nm). In the preferred embodiment, each LED 84 is manufactured by Roithner Lasertechnik of Austria under model number ELD-740-524.
[0087] As shown in FIG. 10, and according to the preferred embodiment, the
LEDs 84 are mounted on a circuit board 86 located adjacent to wall 78. As shown in FIG. 9, there are most preferably eight groups 92, 94 of LEDs 84 concentrically arranged about the imaging device 74. The concentric LED arrangement tends to provide maximal dispersion and transmission of diffuse light from the system 70. It is preferred that each group 92, 94 of LEDs 84 contain at least ten LEDs 84. However, the system 70 can include more or fewer LEDs within a particular group depending upon a desired implementation of the system 70. Furthermore, the system 70 can include more or fewer groups of LEDs in the LED array 85.
[0088] With continuing reference to FIG. 9, there are four groups 92 of LEDs
84 located about the corner regions 96 of the LED array 85. Most preferably, at least fifteen LEDs 84 are disposed in each corner region 96 of the LED array 85. There are preferably four groups 94 of LEDs 84 disposed in lateral regions 98 of the LED array 85. Each lateral region 98 is located substantially between each corner region 96. Most preferably, at least ten LEDs 84 are disposed in each lateral region 98 of the LED array 85.
[0089] As described above, the LED array 85 is most preferably disposed on circuit board 86. In conjunction with the control system 90, the circuit board 86 includes control circuitry that controls the activation of one or more LEDs 84 within a particular group or groups 92, 94 of LEDs 84 in the LED array 85. As shown in the block diagram of FIG. 11, a power source 88 and a control system 90, such as a microprocessor or similar control device, are electrically connected to the circuit board 86. It will be appreciated that it is also possible to control the LEDs without using a control system 90; that is, power source 88 can be switched "on" or "off" to activate and deactivate the LED array 85. It will be appreciated that pulse modulation techniques can also be used in conjunction with power source 88 to activate and deactivate one or more of the LEDs 84 in the LED array 85 according to a preferred duty cycle, herein defined as the LED "on" time relative to the LED "off" time.
[0090] As shown in the block diagram of FIG. 11, in a preferred embodiment of the imaging system 70, the LED array 85 is electrically connected via circuit board 86 to the power source 88 and control system 90. The control system 90 includes control features for controlling the LED array 85 to emit infrared light toward an object 71. As described herein, the control system 90 can enable one or more of the LEDs 84 in a group or groups of the LED array 85 to emit light continuously or intermittently. That is, one LED 84 or a plurality of LEDs 84 can be selected and controlled to emit infrared light intermittently or continuously toward the object 71. Thus, the system 70 can be configured to transmit infrared light from the LED array in various permutations and combinations of LEDs 84 and/or LED groups 92, 94.
[0091] Referring now to FIG. 10, a first diffusion layer 100 is disposed adjacent to the emitting surfaces 102 of the LEDs 84 in the LED array 85. According to a preferred embodiment, the first diffusion layer 100 is glued, such as using known adhesives, onto the emitting surfaces 102 of the LED array 85, thereby operating to diffuse the light emitted by one or more LEDs 84 in the LED array 85. The first diffusion layer 100 is most preferably a holographic twenty degree diffuser, such as a product having identification code LSD20PC10-F10x10/PSA, manufactured by Physical Optics Corporation of Torrance, Calif. Most preferably, the first diffusion layer 100 has a length of about three and one-half inches, a width of about three and one-half inches, and a thickness of about 0.10 inches. When one or more of the LEDs 84 in the LED array 85 are activated, the first diffusion layer 100 diffuses the infrared light emitted from the LED array 85, thereby providing a first amount of diffusion to the emitted infrared light.
[0092] The interior surfaces 104 of the housing 72 are shown in FIG. 10. Most preferably, the interior surfaces 104 are coated with a reflective coating, such as white paint or the like, which reflects and further diffuses the already diffuse light produced by the first diffusion layer 100. With continuing reference to FIG. 10, a second diffusion layer 106 is spaced apart from the first diffusion layer 100 by a distance LDD. Most preferably, the distance LDD between the first and second diffusion layers 100 and 106 is about three inches. The second diffusion layer 106 is most preferably a holographic twenty degree diffuser, similar to or the same as the above-described first diffusion layer 100. The second diffusion layer 106 has a preferred length of about three and one-half inches, a width of about three and one-half inches, and a thickness of about 0.10 inches.
[0093] The second diffusion layer 106 further diffuses the already diffuse light reflected from the interior surfaces 104 and provided by the first diffusion layer 100. As shown in FIG. 8, the first and second diffusion layers are substantially planar, that is, the layers 100 and 106 each define a planar geometry.
[0094] With continuing reference to FIG. 10, a backing material 108, such as
material sold under the trademark LUCITE and manufactured by DuPont of Wilmington, Delaware, is disposed adjacent to the second diffusion layer 106. Most preferably, the backing material has a thickness of about 0.125 inches. A visible polarizer 110 is disposed adjacent to the backing material 108. The visible polarizer 110 is most preferably manufactured by Visual Pursuits of Vernon Hills, Illinois, under part number VP-GS-12U, and having a thickness of about 0.075 inches.
[0095] Thus, the system 70 is operable to produce various levels of diffusion as the emitted light progresses through the first diffusion layer 100, reflects off of the interior surfaces 104 of the first compartment 72a, and continues to progress through the second diffusion layer 106, backing material 108, and polarizer 110. Thus, a level of diffusion results after the emitted light passes through the first diffusion layer 100. Another level of diffusion results from the reflection from the interior surface 104 of the first compartment 72a of the already diffused light provided by the first diffuser layer 100. Yet another level of diffusion results after the diffuse light passes through the second diffusion layer 106.
[0096] As shown in FIG. 8, the visible polarizer 110 preferably includes a central portion 112, most preferably in the shape of a circle having about a one-inch diameter. The central portion 112 geometry most preferably coincides with the shape and dimension of the camera lens 75. The polarization of the central portion 112 is preferably rotated approximately ninety degrees with respect to the polarization of the surrounding area 114 of the polarizer 110. In the preferred embodiment, the camera lens 75 contacts the backing material 108. As shown in FIG. 8, the positional location of the lens 75 within the housing 72 preferably coincides with or shares the same central axis as the central portion 112 of the polarizer 110. The central portion 112 of the polarizer 110 coinciding with the front of the lens 75 tends to remove any surface glare ("specular reflection") in the resulting camera image.
[0097] As shown in FIG. 10, the backing material 108 and the visible polarizer 110 have planar surfaces which preferably include a similar planar orientation with respect to the planes defined by the first and second diffusion layers 100, 106. According to a most preferred embodiment, the first diffusion layer 100, interior surfaces 104, second diffusion layer 106, backing material 108, and visible polarizer 110 define a diffusing system 116 (FIG. 10) for providing diffuse light to an object 71. It will be appreciated that the diffusing structure can include more or fewer components and the invention is not to be limited by any specific examples or embodiments disclosed herein. For example, the diffusing system 116 can include either the first or the second diffusion layers 100, 106, with or without the polarizer 110, or can include the first and second diffusion layers 100, 106 without the polarizer 110.
[0098] Once actuated, the system 70 operates to transmit diffuse light 80 toward an object 71 and produce a video image of the object 71 with the imaging device 74, as described above. More particularly, once the power source 88 is enabled, one or more of the LEDs 84 in the LED array 85 emit infrared light from the emitting surface(s) 102. The first diffusion layer 100 provides a first amount of diffusion to the emitted infrared light. The interior surfaces 104 further diffuse the diffuse light emanating from the first diffusion layer 100. The second diffusion layer 106 further diffuses the already diffuse light, which is then transmitted through the backing material 108 and the polarizer 110 before illuminating the object 71. As described above, the object 71 reflects the emitted diffuse light 80, producing diffuse reflected light 82 that is captured by the imaging device 74. The imaging device 74 then produces a video image of the object 71. Accordingly, by emitting diffuse light according to a unique diffusion providing system 70, the system 70 aids in locating and differentiating between different material properties of the object 71, such as between blood vessels and tissue.
[0099] It is contemplated, and will be apparent to those skilled in the art from the preceding description and the accompanying drawings, that modifications and/or changes may be made in the embodiments of the invention. For example, the planes defined by the first or second diffusing layers 100 and 106 can be adjusted to not be parallel with respect to one another, thereby providing different levels of diffuse light from the system 70. Furthermore, the plane defined by the LED array 85 is most preferably in substantial parallel relation with respect to the plane defined by the first diffusing layer 100. However, the planes defined by LED array 85 and the first diffusing layer 100 can be varied to accommodate various operational conditions, as will be appreciated by those skilled in the art. Accordingly, it is expressly intended that the foregoing description and the accompanying drawings are illustrative of preferred embodiments only, not limiting thereto, and that the true spirit and scope of the present invention be determined by reference to the appended claims.
[00100] FIGS. 20a, 20b, and 20c are photographs of test subjects showing processed images of subcutaneous blood vessels being projected onto the surface of each subject's body tissue which covers the viewed blood vessels.
[00101] Additional embodiments will now be described showing a variety of configurations of illumination sources, imaging devices for viewing the image of buried structure beneath the surface of the illuminated object, and projectors for projecting the processed image back onto the surface of the object. Because all of the embodiments of the present invention have many structural features in common, only the differences between the structures need be discussed in detail, it being understood that similar structural features of all the embodiments perform similar functions. Those skilled in the art will readily recognize the similar structural features that appear in all embodiments of the present invention.
[00102] Because of the present invention's departure from the prior art by projecting the image of the buried structure back onto the surface of the object (rather than onto a screen or monitor that is remote from the surface of the object), an observer using the present invention is not subject to the parallax errors that otherwise occur with prior art devices if an observer were to view from off-axis. An important feature of all embodiments is that the image of buried structure viewed by the image device should be substantially within a first spectrum outside a second spectrum of the image that is projected back onto the surface of the object, thereby causing the imaging device to be blind to the image that is projected back onto the surface of the object. The substantial non-overlap of the spectrum of the viewed image of the buried structure with the spectrum of the projected image of the buried structure effectively decouples the image processing of the buried structure's image from interference by the projected image. Because the projected image is in the visible light spectrum and the illumination of the object for the imaging device is in the infrared spectrum, a substantial non-overlap of the two spectrums is maintained. In another herein-disclosed embodiment, rather than illuminating the object with light that is primarily in the infrared spectrum, the object can be illuminated by broad-spectrum ambient light, and an infrared filter is placed in front of the imaging device to remove all spectral components outside the infrared spectrum, thereby causing the imaging device to only see the infrared component of the broad-spectrum diffuse light reflected from the object.
[00103] A third preferred embodiment 130 of the imaging system is shown in FIG. 12. A well-known CCD camera with lens 132 is used as the imaging device, as in all embodiments. A second polarizing filter 134 is interposed between the CCD camera and the reflected light from the viewed object, as previously described for earlier embodiments, so as to reduce specular reflection from the surface of the object. The illumination source, first polarizing filter, holographic illumination diffuser ring, and optically-neutral glass cover, all generally at 136, are best described below in the discussion of the fourth embodiment of the imaging system shown in FIGS. 13 and 14, which has the same structure 136, shown in cross-section for that embodiment.
[00104] As with all embodiments, the third preferred embodiment includes a well-known video projector 138 or so-called "light engine" for projecting a visible image onto the object O under examination. A desirable feature of the video projector 138 is high output light intensity, because the intensity of the output of the projector's light is a determining factor in how well the projected image can be viewed under normal room illumination. Video projector 138 includes a high-intensity green LED illumination source 140 which emits light into well-known prism assembly 142, thereby causing the emitted light to fold back, by internal reflection within prism assembly 142, and be directed rearwardly toward well-known Digital Light Processing ("DLP") device 144, also known as a Digital Mirror Device ("DMD"), having an array of closely-packed small mirrors that can individually shift the direction of the light beam reflected therefrom so as to either cause the light beam to be directed toward the target object through well-known projection lens 146 or to cause the light beam to not be directed toward the target object, thereby turning the emitted light beam off on a pixel-by-pixel basis in a manner well-known to those skilled in the art. It shall be understood that prism assembly 142 permits a more compact apparatus for the various embodiments of the imaging system, and the use of such prism assemblies is well known to those skilled in the art of video projectors. [00105] As with the prior-described embodiments, a well-known so-called "hot mirror" 148 is interposed at 45 degrees to intercept the infrared light reflected from the viewed object and reflect that infrared light downward to camera 132. "Hot mirror" 148 acts as a mirror to longer wavelengths of light (such as infrared light) but higher-frequency light, such as the green light from projector 138, passes through without reflection and toward the viewed object. [00106] Imaging system 130 further has first and second lasers 150, 152 for ensuring that the target is properly located for in-focus viewing by camera 132, as hereinafter described.
[00107] Referring now to FIGS. 13 and 14, a fourth embodiment 154 of the imaging system of the present invention will now be explained.
[00108] Fourth embodiment 154 is mounted upon a pole 156 that extends upwardly from a mobile cart 158, allowing the imaging system 154 to be easily transported. A fine-focus stage 160 allows imaging system 154 to be raised or lowered so that it is properly positioned above the target object O. As with all embodiments, video projector 162 is provided with a 525 nm green LED illumination source ("photon engine") 164 for illuminating the DMD/DLP chip 166. A suitable photon engine 164 for use with the fourth embodiment is the Teledyne Lighting model PE09-G illuminator, having an output intensity of 85 lumens. DMD chip 166 may be a Texas Instruments part number 0.7SVGA SDR DMD chip having a resolution of 848 x 600 pixels, a mirror tilt angle of ten degrees, and a frame rate of 30 Hz. Well-known prism assembly 168, as before, internally reflects the light from photon engine 164 toward DMD chip 166 and then directs the light reflected from DMD chip 166 toward object O. DMD chip 166 is controlled by a well-known drive electronics board 167 which may be made by Optical Sciences Corporation.
[00109] Interposed between photon engine 164 and prism assembly 168 is a condenser lens 170 such as a BK7 biconvex lens, part number 013-2790-AZ55, sold by OptoSigma, having a BBAR/AR coated surface coating for 425-675 nm light. As the projector light emerges from prism assembly 168, it passes through well-known projection lens 172, Besler part number 8680 medium format enlarger lens, and then through well-known "hot-mirror" (high pass filter) 174, which reflects the received infrared light image from the object O through second polarizing filter 178 and then to camera 176. A suitable camera 176 is the Firefly camera, part number FIRE-BW-XX, sold by Point Grey Research, which uses a 640 x 480 CCD chip, part number Sony ICX084AL, and which communicates its images to computer ("CPU") 180 through an IEEE-1394 ("FireWire") interface. It should be noted that computer 180 has a number of interface signals 181 that communicate with the imaging system in a manner well-known to those skilled in the art. As briefly mentioned for the third embodiment, the fourth embodiment also has first and second lasers 150, 152 for ensuring that the target O is properly located for in-focus viewing by camera 176.
[00110] As with third embodiment 130 shown in FIG. 12, and with reference to FIGS. 12, 13, and 14, fourth embodiment 154 has an assembly 136 that includes infrared illumination source 182; first polarizing filter 184, which is ring-shaped with a center hole therethrough so as not to affect the projected image from projector 162 or the viewed image of the object; holographic illumination diffuser ring 186, which likewise has a center hole therethrough for passage of the projected image from projector 162 and of the viewed image of the object and which diffuses the light from LEDs 190; and optically-neutral glass cover 188. Infrared illumination source 182 is a group of LEDs preferably arranged in a select pattern, such as a circular ring having a centrally-disposed hole through which the projected image and the viewed object's image pass. The LEDs are preferably 740 nm near-infrared LEDs 190 that illuminate the object O, and research has determined that such a structure provides sufficient diffused infrared light for satisfactory illumination of object O. [00111] Referring to FIG. 15, a fifth embodiment 192 of the imaging system of the present invention will now be explained. The significant difference between this fifth embodiment and the other embodiments is that the fifth embodiment does not provide an integral diffuse infrared illumination source (e.g., illumination source 182 with a ring of LEDs 190) for illuminating the object, but instead views the object as illuminated by ambient light L (or the sun S) that has a broader spectrum than the integral diffuse infrared illumination sources heretofore disclosed. While ambient light has some infrared spectral components and is quite diffuse, those infrared spectral components are generally of lower intensity than the infrared light produced by the diffuse infrared illumination sources heretofore disclosed. Accordingly, a better (i.e., more sensitive) image device camera is required for this embodiment, with better optics than the previously-described embodiments. [00112] Like the other embodiments, the fifth embodiment 192 includes video projector 162, including a green "photon engine" 164, prism assembly 168, projector lens 172, and DMD chip 166. To permit a compact design, fifth embodiment 192, as could any of the embodiments, includes a "fold mirror" 194 that folds the beam at a right angle within the projector between the photon engine 164 and prism assembly 168. Also like the other embodiments, fifth embodiment 192 includes a "hot mirror" 174. [00113] Fifth embodiment 192 further has an infrared filter 196 interposed in the optical path between the imaging device (camera 198) and object O so as to filter out all but the infrared component of the image viewed by camera 198. Camera 198 is preferably a Basler CMOS camera, model A600-HDR, made by Basler Vision Technologies of Germany, which has an IEEE 1394 ("FireWire") interface and allows capture of images with up to a 112 dB dynamic range. An advantage of the fifth embodiment is that it can be (and should be) used in a brightly-illuminated room. [00114] Experimental testing has revealed that some persons have arms or legs that are so covered with surface hair that it is difficult to see with clarity the projected subcutaneous structure that is projected onto the surface of the skin. Investigation has revealed that all hairs, even white hairs, look black in the near infrared.
Hence, image processing is performed on the received image in order to remove small dark artifacts, such as hairs, from the image while retaining larger dark objects to maintain the visibility of the veins. FIGS. 16a and 16b, taken together in sequence, are a program listing for artifact removal image processing of the received image. The same artifact removal procedure is performed twice, and then a well-known adaptive edge enhancement procedure is performed, such as, for example, unsharp masking, followed by a smoothing to clean up image artifacts produced by the hair removal. The program listing is well-commented and explains to those skilled in the art the image processing steps that are applied to the image. [00115] The received image, having integer pixel values in the range (0...255), is converted to floating point values between 0.0 and 1.0, inclusive. The resulting image is then smoothed (blurred) using a Gaussian convolution having a sigma of 8 pixels. This is a fairly small value of sigma, and it leaves small features, such as narrow hairs, in the resulting smoothed image. A "difference image" is created which is the original image minus the Gaussian-smoothed image, producing a zero-centered set of values from -1.0 to 1.0. Hairs, even white hairs, appear black in the near infrared, so negative pixel values are indicative of hairs, and those negative-value pixels are thus replaced with the corresponding pixels from the Gaussian-smoothed image. This is the first step in the processing of the received image. Next, an array of values is created for the image, such that all pixel locations where the original "difference image" was negative (the "hair" locations) are set to 1.0, and all other pixel locations are set to zero, thereby creating an array populated by 0.0 or 1.0 values, with every "hair pixel" having a value of 1.0 and all others having a zero value. The original image ("iml"), having pixel values ranging from 0.0 to 1.0, is then "boosted" at every "hair pixel" location by 0.015. Because this is a highly non-linear operation, the amount of "boost" is quite small, just 1.5%. [00116] This same set of operations (Gaussian smoothing with a sigma of 8 pixels, creation of a difference image, identification of negative pixel locations, and "boosting" of the image where negative pixels (small features and noise) are found) is performed again, and the resulting image is then smoothed again with a Gaussian convolution having a sigma of 64 pixels. A third difference image is created, which is the again-"boosted" image minus the smoothed image, and an image is created that is formed from the absolute value of every pixel in the third difference image. The resulting absolute value image is then smoothed with a Gaussian convolution having a sigma of 64 pixels, the third difference image is then divided by the smoothed absolute value image, and the resulting divided image is smoothed with a Gaussian convolution having a sigma of 4 pixels. [00117] The foregoing Artifact Removal algorithm allows the contrast to be set by the contrast of the subcutaneous vein (the subsurface structure of interest), ignoring the artifacts (hairs), and thereby prepares the image for adaptive unsharp masking edge enhancement to set the contrast of the final image. Parameters such as sigma values, thresholds, etc., may be varied depending on the age of the subject, degree of pigmentation, etc. [00118] FIGS.
17a, 17b, 17c, 17d, 17e, and 17f, taken together in sequence, are a program listing in the C++ programming language for artifact removal image processing of the received image, which is based upon the research/investigation program shown in FIGS. 16a and 16b but instead uses the Intel image processing library to perform the mathematical operations more quickly.
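By way of illustration only, the following minimal C++ sketch (our own, not the listings of FIGS. 16a-16b or 17a-17f) shows a single hair-suppression pass as described in paragraph [00115]: a Gaussian blur with a sigma of 8 pixels, a zero-centered difference image, and replacement plus a 0.015 boost of the negative ("hair") pixels. The row-major float image layout and the naive separable convolution are assumptions made to keep the sketch self-contained.

// Sketch of one hair-suppression pass (single-channel, row-major float
// image with values in [0.0, 1.0]). The separable Gaussian below is a
// naive illustration, not an optimized implementation.
#include <cmath>
#include <cstddef>
#include <vector>

using Image = std::vector<float>;

static Image gaussianBlur(const Image& src, int w, int h, float sigma) {
    int r = static_cast<int>(std::ceil(3.0f * sigma));
    std::vector<float> k(2 * r + 1);
    float sum = 0.0f;
    for (int i = -r; i <= r; ++i)
        sum += k[i + r] = std::exp(-(i * i) / (2.0f * sigma * sigma));
    for (float& v : k) v /= sum;  // normalize the kernel
    auto clampi = [](int v, int lo, int hi) { return v < lo ? lo : (v > hi ? hi : v); };
    Image tmp(src.size()), dst(src.size());
    for (int y = 0; y < h; ++y)   // horizontal pass
        for (int x = 0; x < w; ++x) {
            float a = 0.0f;
            for (int i = -r; i <= r; ++i)
                a += k[i + r] * src[y * w + clampi(x + i, 0, w - 1)];
            tmp[y * w + x] = a;
        }
    for (int y = 0; y < h; ++y)   // vertical pass
        for (int x = 0; x < w; ++x) {
            float a = 0.0f;
            for (int i = -r; i <= r; ++i)
                a += k[i + r] * tmp[clampi(y + i, 0, h - 1) * w + x];
            dst[y * w + x] = a;
        }
    return dst;
}

// One pass: sigma = 8 leaves narrow hairs in the blurred image, so the
// original-minus-blurred difference is negative exactly where hairs darken
// the image; those pixels are replaced with the smoothed values and then
// boosted by 0.015 (1.5%).
static void hairSuppressionPass(Image& im, int w, int h) {
    Image smooth = gaussianBlur(im, w, h, 8.0f);
    for (std::size_t i = 0; i < im.size(); ++i)
        if (im[i] - smooth[i] < 0.0f)  // zero-centered difference image
            im[i] = smooth[i] + 0.015f;
}

The full procedure described in paragraphs [00115] and [00116] runs this pass twice and then applies the sigma-64 smoothing, absolute-value, and division stages.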
[00119] Any or all of the embodiments of the present invention preferably include a mechanism for keeping the image of the buried structure, as seen by the imaging device, in focus to the image device camera at a proper lens-to-subject distance. As seen best in FIG. 18, a first embodiment of this mechanism uses a pair of lasers 150, 152, respectively emitting beams 200 and 202, with beams 200 and 202 being non-parallel with respect to each other and thus being directed toward the object from different angles, such that the two laser beams only converge to the same spot 204 and intersect when the target is at the proper lens-to-subject distance from the imaging device, as shown by the position of intersecting plane 206. If the target is closer to the apparatus than the proper lens-to-subject distance, as shown by plane 208, or if the target is further from the apparatus than the proper lens-to-subject distance, as shown by plane 210, the two laser beams will not intersect at a single point 204 but instead will appear on the surface of the object as a first pair of visible dots 212, 214 (for plane 208) or as a second pair of visible dots 216, 218 (for plane 210), indicating that the buried structure is not in focus to the imaging device camera and that the distance from the object to the apparatus should be changed to bring the viewed image of the buried structure into focus. Lasers 150 and 152 may also be seen in FIGS. 12, 13, and 14. Suitable lasers for use with the present invention are the model LM-03 laser modules made by Roithner Lasertechnik, of Vienna, Austria.
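The convergence geometry can be made concrete with a short worked example; the following C++ sketch is our own illustration, and the baseline and convergence distance in it are assumed values, not dimensions from the patent.

// Illustrative geometry for the two-laser focus aid: lasers separated by
// baseline b and aimed so their beams cross at distance D. At an actual
// target distance d, each beam lands at x = +/-(b/2)*(1 - d/D), so the
// visible dot separation is s = b*|1 - d/D|, shrinking to zero exactly at
// the in-focus distance d = D. The values of b and D are assumed.
#include <cmath>
#include <cstdio>

int main() {
    const double b = 100.0;  // assumed laser baseline, mm
    const double D = 300.0;  // assumed designed convergence distance, mm
    for (double d : {250.0, 300.0, 350.0})
        std::printf("target at %.0f mm -> dot separation %.1f mm\n",
                    d, b * std::fabs(1.0 - d / D));
    return 0;
}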
[00120] A second embodiment of the target positioning mechanism adds a recognizable visible light pattern, such as a text border, independent of the buried structure being observed, to the projected image for mutual projection therewith. The projected recognizable pattern will only be recognized by the human viewer as being in focus on the surface of the target object when the target is at the desired distance from the projector, thereby causing the buried structure beneath the surface of the target also to be at the proper lens-to-subject distance from the imaging device. If desired, cartoon figures appealing to children could be provided as an incentive for children to properly position their body parts for viewing subcutaneous blood vessels, or a hospital's or clinic's logo or name could be used for the pattern. While the projected image of the buried structure is often somewhat blurred from image processing removal of artifacts, humans can quickly tell whether a well-known or recognizable visible light pattern is out of focus. An advantage of this second embodiment of the target positioning mechanism, namely, the projection of a recognizable visible light pattern rather than the use of lasers, is that lasers pose a possible hazard of injury, such as blindness, if proper safety precautions are not used. [00121] The photograph of FIG. 21 shows a projected image having a text border therearound.
[00122] FIG. 22 is another photograph of a projected image having a text border therearound, similar to FIG. 21 but in which the viewed object has been moved out of position, showing how the text border becomes out-of-focus to indicate that the object is not positioned properly with respect to the image device camera. [00123] FIG. 23 shows a text border image that is combined with a projected image for joint projection onto the object to ensure proper positioning. Because of the image reversal that occurs in some embodiments of the invention as images reflect inside the prism structure heretofore described, this text border image is shown reversed but appears unreversed when projected. The projected image is appropriately cropped before combining with the text border so that the text border remains sharp and distinct when projected.
[00124] FIG. 24 is a photograph of a processed image of subsurface veins projected onto a hand by the present invention, similar to FIG. 20 (which omits the text border) and FIG. 21 but showing how the text border becomes out of focus to indicate that the hand is not positioned properly.
[00125] As shown in FIG. 19, a calibration method is provided wherein the video projector 138 (or 162, or any of the projectors of the present invention) projects a green target pattern 220 onto a fluorescent screen 222, which converts the projected four-dot green target pattern 220 into deep red light that is visible to the infrared imaging device 132. A computer program records the observed position of the viewed pattern of four projected dots P1, P2, P3, and P4, in Cartesian coordinates, i.e., (x1, y1), (x2, y2), (x3, y3), and (x4, y4), versus the desired or "true" position of the dots if alignment were correct, i.e., (X1, Y1), (X2, Y2), (X3, Y3), and (X4, Y4), and calculates calibration coefficients (a, b, c, d, g, h, k, f) to be used in the bi-linear transformation equations (the arguments to the "solve" function in FIGS. 25a and 25b) to correct magnification, rotation, and translation misalignment between the imaging device and the projector. FIGS. 25a and 25b show the use of the MAPLE 9 computer equation solving program to solve for the bilinear transformation coefficients as a function of the values measured during calibration. These calibration coefficients are used during operation of the device to transform the coordinate system of the image (x, y) into the corrected coordinate system (X, Y) necessary to produce a calibrated image. FIG. 26 shows how these coefficients, once calculated during calibration, are used as parameters to a well-known image processing library mathematical routine, provided by the integrated circuit company Intel for use with its processors, to achieve high-performance image alignment correction using the bilinear transformation equation. The run-time calculations are done using scaled integer arithmetic, rather than floating point arithmetic, for faster processing of the image. [00126] The calibration procedure projects a test pattern 220, consisting of four dots P1, P2, P3, and P4, each having a 25-pixel radius (as viewed by the imaging device camera), at the corners of a 320 x 240 pixel rectangle (as viewed by the imaging device camera), onto the fluorescent screen 222. For example, the camera 132 might have a resolution of 640 x 480 pixels, whereas the projector 138 might have a resolution of 1024 x 780 pixels. Experimental testing for dot radii varying from 4 to 50 pixels showed that the standard deviation of 100 samples decreased rapidly from a dot radius of 5 pixels to about 25 pixels, and then decreased much more slowly out to a radius of 50 pixels. [00127] To practice the calibration method of the present invention, a test pattern of four spaced-apart dots P1, P2, P3, and P4 is projected within a first spectrum, preferably using green light, onto a fluorescent screen 222, which then fluoresces and produces light within a second spectrum, preferably light adjacent or within the infrared spectrum, such as red light, that is visible to the image device camera 132, even through the infrared transmitting filter through which the image device camera views its target object.
Calibration software then measures the observed position of the four dots and computes the correction coefficients (a, b, c, d, g, f, h, k) for the bi-linear transformation equation, and then uses those coefficients as parameters to the bi-linear transformation in order to correct misalignment errors (rotation, translation, and magnification) between the image device camera and the projector by warping the image prior to projection so that the projected image is corrected for misalignment. It should be noted that this procedure allows for correction of magnification errors that are different in the horizontal and vertical directions, and also allows for correction of translation errors that are different in the horizontal and vertical directions.
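By way of illustration, the following minimal C++ sketch (our own, not the MAPLE worksheet of FIGS. 25a and 25b) solves for one coordinate's coefficients, assuming the bilinear form X = a*x + b*y + c*x*y + d (and likewise Y with g, h, k, and f), which is consistent with the eight coefficients named above; the dot coordinates in it are illustrative numbers only.

// Sketch of the calibration solve, assuming the bilinear form
//   X = a*x + b*y + c*x*y + d   (and Y likewise with g, h, k, f).
// The four observed dot centers give four equations in (a, b, c, d),
// solved here by Gauss-Jordan elimination; repeating with the true Y
// values yields (g, h, k, f).
#include <array>
#include <cmath>
#include <cstdio>
#include <utility>

using Vec4 = std::array<double, 4>;
using Mat4 = std::array<Vec4, 4>;

static Vec4 solve4(Mat4 M, Vec4 rhs) {
    for (int col = 0; col < 4; ++col) {
        int piv = col;  // partial pivoting for numerical safety
        for (int r = col + 1; r < 4; ++r)
            if (std::fabs(M[r][col]) > std::fabs(M[piv][col])) piv = r;
        std::swap(M[col], M[piv]);
        std::swap(rhs[col], rhs[piv]);
        for (int r = 0; r < 4; ++r) {
            if (r == col) continue;
            double f = M[r][col] / M[col][col];
            for (int c2 = col; c2 < 4; ++c2) M[r][c2] -= f * M[col][c2];
            rhs[r] -= f * rhs[col];
        }
    }
    Vec4 coef;
    for (int i = 0; i < 4; ++i) coef[i] = rhs[i] / M[i][i];
    return coef;
}

int main() {
    // Observed dot centers (x, y) and desired "true" X positions
    // (illustrative numbers only).
    const double x[4] = {100, 420, 100, 420}, y[4] = {80, 80, 320, 320};
    const double X[4] = {102.5, 425.0, 99.0, 421.5};
    Mat4 M;
    Vec4 rhs;
    for (int i = 0; i < 4; ++i) {
        M[i] = {x[i], y[i], x[i] * y[i], 1.0};  // row: [x, y, x*y, 1]
        rhs[i] = X[i];
    }
    Vec4 coef = solve4(M, rhs);
    std::printf("a=%g b=%g c=%g d=%g\n", coef[0], coef[1], coef[2], coef[3]);
    return 0;
}

Applying the resulting coefficients per pixel is then simply X = a*x + b*y + c*x*y + d, which the routine of FIG. 26 evaluates in scaled integer arithmetic for speed.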
[00128] Testing has shown that this calibration procedure can correct misalignments as great as +/-25.4 mm to within about half of the image camera's pixel size. The alignment is best for image portions near the test pattern's four dots, but remains remarkably good over the entire image. [00129] It should be understood that features of any of these embodiments may be used with another in a way that will now be understood in view of the foregoing disclosure. For example, any embodiment could choose to illuminate the object using infrared components within ambient lighting, rather than providing a separate diffuse infrared illumination source, and/or could choose between a laser target positioner and a recognizable pattern that is combined with the projected image of the buried structure for maintaining a desired distance from the image device camera to the object.
[00130] As described above, in the system and method of the present invention, a received image may be visually enhanced by various image processing techniques before being projected back onto a target. For example, an artifact removal process is described that employs, inter alia, an unsharp mask, in which a blurred version of the object image is produced and subtracted from the original object image (i.e., a focused image) to produce an edge-enhanced image. Additional techniques can be applied according to embodiments of the present invention. [00131] Fig. 27A is a flow chart of a method for enhancing the contrast of an image of an object according to an embodiment of the present invention. At step 27-1, image data is received at an image processing device, e.g., from the camera. The image data may be processed in known digital formats, such as, e.g., pixel data on a 0-255 gray scale. At step 27-2, a blurred image is generated by application of a blur filter, such as, e.g., Gaussian blurring. This blurring may be performed via convolution in the spatial domain or, to enhance computational speed, in the frequency domain. The resulting blurred image is subtracted (e.g., pixel-by-pixel) from the original image at step 27-3, resulting in the unsharp mask (27-4). The absolute value (ABS) of the unsharp mask is taken (27-5) and another blur filter is applied thereto (27-6). The unsharp mask is divided (27-7) by the blurred ABS of the unsharp mask, and the result of the operation is adjusted to generate the final enhanced image (27-8).
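By way of illustration, the following minimal C++ sketch (our own, not from the patent) walks the steps of Fig. 27A on a single-channel, row-major floating point image; the crude box average used for the blur filter and the simple rescale used for the final adjustment are assumptions made to keep the sketch short, and border mirroring is deferred to the sketch following the next paragraph.

// Sketch of the Fig. 27A pipeline: blur (27-2), subtract to form the
// unsharp mask (27-3/27-4), blur the absolute value of the mask
// (27-5/27-6), divide (27-7), and rescale for display (27-8). This box
// average ignores out-of-image samples rather than mirroring them,
// purely to keep the sketch short.
#include <cmath>
#include <cstddef>
#include <vector>

using Image = std::vector<float>;

static Image boxBlur(const Image& src, int w, int h, int win) {
    Image dst(src.size());
    int r = win / 2;
    for (int y = 0; y < h; ++y)
        for (int x = 0; x < w; ++x) {
            double sum = 0.0;
            int n = 0;
            for (int dy = -r; dy <= r; ++dy)
                for (int dx = -r; dx <= r; ++dx) {
                    int yy = y + dy, xx = x + dx;
                    if (yy < 0 || yy >= h || xx < 0 || xx >= w) continue;
                    sum += src[yy * w + xx];
                    ++n;
                }
            dst[y * w + x] = static_cast<float>(sum / n);
        }
    return dst;
}

static Image enhance27A(const Image& im, int w, int h) {
    Image blurred = boxBlur(im, w, h, 192);           // step 27-2 (192X192 window)
    Image mask(im.size()), absMask(im.size());
    for (std::size_t i = 0; i < im.size(); ++i) {
        mask[i] = im[i] - blurred[i];                 // steps 27-3/27-4
        absMask[i] = static_cast<float>(std::fabs(mask[i]));  // step 27-5
    }
    Image blurredAbs = boxBlur(absMask, w, h, 96);    // step 27-6 (smaller window)
    Image out(im.size());
    for (std::size_t i = 0; i < im.size(); ++i) {
        float q = mask[i] / (blurredAbs[i] + 1e-6f);  // step 27-7; epsilon guards /0
        out[i] = 0.5f * (q + 1.0f);                   // step 27-8: crude rescale
    }
    return out;
}

A production implementation would favor a running-sum box filter or the Intel image processing library mentioned elsewhere in this disclosure over the O(window-squared) loops shown here.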
[00132] According to an embodiment of the present invention, the blurred image is created by applying an "averaging window" to each pixel in the image. An averaging window is a window having a kernel size smaller than that of the image being processed. The averaging window is centered on each pixel of the image, and the value of the pixel of interest is set to the average value of all the pixels within the window. For example, in an image having a resolution of 640X480 pixels, it has been determined that a 192X192 sized averaging window produces a good result as a first blur filter. When the averaging window is applied to pixels in the exterior part of the image such that the averaging window extends beyond the image boundary, the image pixels are mirrored in order to fill the averaging window. [00133] By applying the averaging window to each pixel in the image, the blurred image is created. In the method of Fig. 27A, the blur filter is applied two different times. It has been determined that better results are obtained when the second application of the blur filter uses a different window size than the first, preferably a smaller size. It was determined that if the first averaging window has a kernel size of 192X192 pixels, then a second averaging window having the size 96X96 pixels results in an effective increase in sharpness of the image. One skilled in the art will understand that, if processing occurs in the spatial domain, smaller kernels may be processed more quickly than larger kernels and that the present invention is not limited to any particular kernel sizes. [00134] According to an embodiment of the present invention, final contrast adjustment (e.g., 27-8) can be accomplished by performing linear scaling. For example, in one embodiment, the division function performed prior to this step results in a 16-bit signed integer. This value can be scaled back to an 8-bit unsigned integer using min and max values. During the scaling, minimum (Min) and maximum (Max) parameters determine the spread and, hence, the degree of contrast increase. The scaling formula used to map the source pixel p to the destination pixel p' is: p' = dst_Min + k*(p - src_Min), where k = (dst_Max - dst_Min)/(src_Max - src_Min).
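A minimal C++ sketch of the two details just described, the mirrored border fill and the linear scaling formula, follows; rounding an even window size up to the next odd size so the window stays centered is our assumption.

// Sketch of the mirrored border fill and the linear scaling formula
// described above.
#include <algorithm>
#include <cstdint>
#include <vector>

// Reflect an out-of-range coordinate back into [0, n-1].
static int mirror(int i, int n) {
    while (i < 0 || i >= n) i = (i < 0) ? -i - 1 : 2 * n - i - 1;
    return i;
}

static std::vector<float> averageWindow(const std::vector<float>& src,
                                        int w, int h, int win) {
    std::vector<float> dst(src.size());
    int r = win / 2, side = 2 * r + 1;  // even sizes rounded up to odd (assumption)
    for (int y = 0; y < h; ++y)
        for (int x = 0; x < w; ++x) {
            double sum = 0.0;
            for (int dy = -r; dy <= r; ++dy)
                for (int dx = -r; dx <= r; ++dx)
                    sum += src[mirror(y + dy, h) * w + mirror(x + dx, w)];
            dst[y * w + x] = static_cast<float>(sum / (double(side) * side));
        }
    return dst;
}

// Linear scaling p' = dst_Min + k*(p - src_Min), with
// k = (dst_Max - dst_Min)/(src_Max - src_Min), mapping a signed 16-bit
// division result back to an 8-bit unsigned pixel.
static std::uint8_t scalePixel(std::int16_t p, std::int16_t srcMin, std::int16_t srcMax,
                               std::uint8_t dstMin = 0, std::uint8_t dstMax = 255) {
    double k = double(dstMax - dstMin) / double(srcMax - srcMin);
    double v = dstMin + k * (p - srcMin);
    return static_cast<std::uint8_t>(std::clamp(v, double(dstMin), double(dstMax)));
}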
[00135] Results of the image processing can be appreciated from Figs. 27B-C. Fig. 27B is an image of a test target (gradient) along with a plot of the pixel values for a selected section of the gradient. Fig. 27C is an image of the test gradient after being enhanced by the process set forth in Fig. 27A along with a plot of the post processed pixel values for the selected section of the gradient. As can be seen, the appearance of detail is created, such as the darkened center lines of the gradient lines.
[00136] Finer detail can be obtained by applying the method of Fig. 27A with smaller averaging window sizes for the blur steps. It has been determined that a finer image can be obtained by employing a first averaging window of a size 96X96 pixels in step 27-2 and a second averaging window of a size 48X48 pixels in step 27-6.
[00137] Further image processing can be employed to create a more visually useful image of the subcutaneous vessels. For example, contrast enhancing techniques can generate an image of vessels having more defined edges or a well-defined center. Fig. 28A is a flow chart of another method for enhancing the contrast of an image to provide improved dimensional detail, according to an embodiment of the present invention. At step 28-1, the image to be processed is received. A blurred image is generated by application of a blur filter at step 28-2, as already described above. The blurred image is subtracted from the original image at step 28-3, resulting in the unsharp mask (28-4). The ABS of the unsharp mask is taken (28-5) and the blur filter is applied thereto (28-6). The ABS of the unsharp mask is divided (28-7) by the blurred ABS of the unsharp mask, and the result of the operation is adjusted to generate the final enhanced image (28-8). [00138] In the method of Fig. 28A, it was determined that employing first and second averaging windows of the sizes 76X76 and 40X40 pixels, respectively, achieved superior results.
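The arithmetic difference from Fig. 27A is confined to the divide step 28-7; a minimal C++ sketch of that step follows, assuming the unsharp mask's absolute value and its blurred counterpart have already been computed as in the earlier sketch.

// Fig. 28A's divide step (28-7): unlike Fig. 27A, the absolute value of
// the unsharp mask, not the signed mask, is divided by the blurred
// absolute value, emphasizing both edges of a vessel.
#include <cstddef>
#include <vector>

static std::vector<float> divideStep28A(const std::vector<float>& absMask,
                                        const std::vector<float>& blurredAbs) {
    std::vector<float> out(absMask.size());
    for (std::size_t i = 0; i < absMask.size(); ++i)
        out[i] = absMask[i] / (blurredAbs[i] + 1e-6f);  // epsilon guards /0
    return out;
}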
[00139] Results of the image processing can be appreciated from Figs. 28B-D. Fig. 28B is an image of the test gradient after being enhanced by the process set forth in Fig. 28A, along with a plot of the post-processed pixel values for the selected section of the gradient. As can be seen, the appearance of detail is created, such as the darkened edges of the gradient lines. Fig. 28C includes images of an enhanced image of subcutaneous vessels projected back onto a human arm. The top image is a result of processing according to the method of Fig. 27A, while the bottom image is a result of processing according to the method of Fig. 28A. One having skill in the art should appreciate the distinctly different results each of the methods produces and understand that the techniques could be preferred for different applications.
[00140] Fig. 28D includes images of a human target body part during steps of the process of Fig. 28A. The top left image is a raw image of the target body part. The top right image is a blurred image of the target body part. The middle left image is the result of subtracting the blurred image from the raw image. The middle right image is the result of the process of Fig. 28A, having enhanced dimensional detail. The bottom two plots are cross-sectional plots of the pixel data for the two images respectively above the plots. [00141] Fig. 29A is a flow chart of another method for enhancing the contrast of an image, according to an embodiment of the present invention. The image is received at step 29-1 from the camera. A blurred image is generated by application of a blur filter at step 29-2. The blurred image is subtracted from the original image at step 29-3, resulting in the unsharp mask (29-4). The absolute value of the unsharp mask is taken (29-5) and the blur filter is applied thereto (29-6). The unsharp mask is divided (29-7) by the blurred ABS of the unsharp mask, and the result of the operation is adjusted to generate the enhanced image (29-8). Next, each pixel is compared against a threshold brightness at step 29-9. If the pixel is below the threshold, the pixel is set to the maximum level (e.g., 255 on a contrast scale of 0-255).
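A minimal C++ sketch of the added thresholding step 29-9 follows; the particular threshold value is an assumed placeholder, since the text does not fix one.

// Thresholding step 29-9 on a 0-255 gray scale: any pixel darker than
// the threshold is driven to full brightness. The threshold of 128 is an
// assumed placeholder.
#include <cstdint>
#include <vector>

static void thresholdToMax(std::vector<std::uint8_t>& img, std::uint8_t threshold = 128) {
    for (std::uint8_t& p : img)
        if (p < threshold) p = 255;  // below threshold -> maximum level
}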
[00142] In the method of Fig. 29A, it was determined that employing first and second averaging windows of the sizes 96X96 and 40X40 pixels, respectively, achieved superior results. [00143] Results of the image processing can be appreciated from Figs. 29B-C. Fig. 29B is an image of the test gradient after being enhanced by the process set forth in Fig. 29A, along with a plot of the post-processed pixel values for the selected section of the gradient. As can be seen, the appearance of detail is created by extreme contrast between the darkened edges of the gradient lines and the bright center. One having skill in the art should appreciate the distinctly different results each of the methods produces and understand that the techniques could be preferred for different applications.
[00144] Fig. 30A is a flow chart of another method for enhancing the contrast of an image, according to an embodiment of the present invention. The image to be processed is received at step 30-1 from the camera. A blurred image is generated by application of a blur filter at step 30-2. The blurred image is subtracted from the original image at step 30-3, resulting in the unsharp mask (30-4). The absolute value of the unsharp mask is taken (30-5) and the blur filter is applied thereto (30-6). The unsharp mask is divided (30-7) by the blurred ABS of the unsharp mask, and the result of the operation is adjusted to generate the enhanced image (30-8). Next, each pixel of the image is adjusted (reduced or increased) by an offset. In one embodiment, the value is reduced by a constant value and resulting negative values are "rolled over." For example, using a gray scale of 0-255 and a constant of 30, an image value of 25 is reduced to -5, which is out of the allowable range and is rolled over to 250. If an offset is used to increase the pixel values, pixel values roll over from 255 to 0.
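A minimal C++ sketch of the offset-with-roll-over step follows; it reproduces the -5-to-250 example given above, and its handling of increases past 255 is our reading of the stated convention rather than an explicit formula from the text.

// Offset with roll-over on a 0-255 gray scale, reproducing the example
// above (25 - 30 = -5, rolled over to 250). The branch for increases is
// our reading of "roll over from 255 to 0".
#include <cstdint>
#include <vector>

static void offsetRollOver(std::vector<std::uint8_t>& img, int offset) {
    for (std::uint8_t& p : img) {
        int v = int(p) - offset;        // a negative offset increases values
        if (v < 0) v = 255 + v;         // e.g. -5 -> 250, per the text
        else if (v > 255) v = v - 255;  // increases wrap past 255
        p = static_cast<std::uint8_t>(v);
    }
}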
[00145] In the method of Fig. 30A, it was determined that employing first and second averaging windows of the sizes 96X96 and 40X40 pixels, respectively, achieved superior results. [00146] Results of the image processing can be appreciated from Fig. 30B, which is an image of the test gradient after being enhanced by the process set forth in Fig. 30A, along with a plot of the post-processed pixel values for the selected section of the gradient. As can be seen, the appearance of detail is created by extreme contrast between the darkened edges of the gradient lines and the bright center. Fig. 29C includes images of an enhanced image of subcutaneous vessels projected back onto a human arm. The top image is a result of processing according to the method of Fig. 27A, while the bottom image is a result of processing according to the method of Fig. 30A. One having skill in the art should appreciate the distinctly different results each of the methods produces and understand that the techniques could be preferred for different applications.
[00147] According to another embodiment of the present invention, noise or interference in the image caused by hair on the body can be reduced by adding a step to the above processes that first applies a "maximum filter" to the image before applying the rest of the process steps. The maximum filter is similar to the blur filter, but instead of applying an averaging window to each pixel, a maximum window is applied. The maximum window identifies the maximum value of any pixel in the window covering the pixel of interest and sets the pixel of interest to that maximum. It has been determined that a maximum window of the size 12X12 pixels centered on each pixel of interest achieves good results. [00148] According to one embodiment, the maximum window filter can be applied to the method of Fig. 27A in order to reduce the influence of hair on the image. It was determined that employing first and second averaging windows of the sizes 192X192 and 96X96 pixels, respectively, achieved superior results. [00149] Digital image processing can be performed by known conventional means, such as by a combination of hardware, software and/or firmware using logarithmic video signals or digital video signals. In embodiments of the present invention, processing is performed programmatically in a known computer language such as C. The present invention is not limited, however, to any particular computing arrangement.
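By way of illustration, a minimal C++ sketch of such a maximum filter follows (our own sketch, not the patent's listing); clamped image borders are an assumption, since the text does not specify edge handling for this filter.

// Maximum filter: each pixel is replaced by the brightest value inside a
// window centered on it, so thin dark hairs are overwritten by their
// brighter surroundings before contrast enhancement runs. Borders are
// clamped (an assumption).
#include <algorithm>
#include <cstdint>
#include <vector>

static std::vector<std::uint8_t> maxFilter(const std::vector<std::uint8_t>& src,
                                           int w, int h, int win = 12) {
    std::vector<std::uint8_t> dst(src.size());
    int r = win / 2;
    for (int y = 0; y < h; ++y)
        for (int x = 0; x < w; ++x) {
            std::uint8_t m = 0;
            for (int dy = -r; dy <= r; ++dy)
                for (int dx = -r; dx <= r; ++dx) {
                    int yy = std::clamp(y + dy, 0, h - 1);
                    int xx = std::clamp(x + dx, 0, w - 1);
                    m = std::max(m, src[yy * w + xx]);
                }
            dst[y * w + x] = m;
        }
    return dst;
}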
[00150] Thus, a number of preferred embodiments have been fully described above with reference to the drawing figures. Although the invention has been described based upon these preferred embodiments, it would be apparent to those of skill in the art that certain modifications, variations, and alternative constructions could be made to the described embodiments within the spirit and scope of the invention.


THE CLAIMS:
1. An imaging system comprising:
(a) an imaging device receiving infrared light which has been reflected from an area of body tissue in the form of an input image and generating an enhanced image of said area of body tissue, wherein the generation of said enhanced image comprises contrast enhancement including the application of an unsharp mask to said input image; and (b) a projector which receives said enhanced image and projects said enhanced image onto said area of body tissue.
2. The imaging system of claim 1 further comprising an infrared light source for generating infrared light towards said area of body tissue.
3. The imaging system of claim 1 wherein said contrast enhancement further comprises the application of first and second blur filters each having a different resolution.
4. The imaging system of claim 3 wherein said first and second blur filters comprise the application of an averaging window to each pixel of said input image.
5. The imaging system of claim 1 wherein said contrast enhancement further comprises adjustment of blur filters used to generate the unsharp mask.
6. The imaging system of claim 1 wherein said input image is comprised of pixel data and wherein said contrast enhancement further comprises the application of a threshold to said pixel data such that when the value of a pixel is below the threshold, the value of that pixel is changed to a preset value.
7. The imaging system of claim 1 wherein said input image is comprised of pixel data and wherein said contrast enhancement further comprises the offsetting of each pixel value by a set amount to create an adjusted pixel value and if any adjusted pixel value falls outside a preset range, that value is rolled over to a value within the preset range.
8. The imaging system of claim 1 wherein said contrast enhancement further comprises the application of linear scaling.
9. The imaging system of claim 1 wherein said input image is comprised of pixel data and wherein said contrast enhancement further comprises the step of using the absolute values of each pixel value during the execution of one or more processing steps.
10. The imaging system of claim 1 wherein said input image is comprised of pixel data and wherein said contrast enhancement further comprises the application of a maximum filter window that sets the value of a target pixel to the maximum value of any pixels within the window.
11. The imaging system of claim 1 wherein said imaging device has at least two possible contrast enhancement options and further comprising a selector such that a user may use the selector to select one or more contrast enhancement options for said imaging device to apply to generate said enhanced image.
12. The imaging system of claim 1 wherein said area of body tissue comprises body tissue containing vascular structures and the enhanced image contains data allowing a user to locate said vascular structures.
13. A method for enhancing the visibility of buried structures inside body tissue comprising:
(a) receiving infrared light reflected from said body tissue to create an input image; (b) enhancing the contrast of said input image to create an enhanced image containing representations of said buried structures; (c) projecting said enhanced image onto said body tissue such that said representations of said buried structures overlay said buried structures.
14. The method of claim 13 further comprising the initial step of illuminating said body tissue with infrared light.
15. The method of claim 13 wherein the step of enhancing the contrast further comprises the application of first and second blur filters each having a different resolution.
16. The method of claim 15 wherein said first and second blur filters comprise the application of an averaging window to each pixel of said input image.
17. The method of claim 13 wherein the step of enhancing the contrast further comprises adjustment of blur filters used to generate the unsharp mask.
18. The method of claim 13 wherein said input image is comprised of pixel data and wherein the step of enhancing the contrast further comprises the application of a threshold to said pixel data such that when the value of a pixel is below the threshold, the value of that pixel is changed to a preset value.
19. The method of claim 13 wherein said input image is comprised of pixel data and wherein the step of enhancing the contrast further comprises the offsetting of each pixel value by a set amount to create an adjusted pixel value and if any adjusted pixel value falls outside a preset range, that value is rolled over to a value within the preset range.
20. The method of claim 13 wherein the step of enhancing the contrast further comprises the application of linear scaling.
21. The method of claim 13 wherein said input image is comprised of pixel data and wherein the step of enhancing the contrast further comprises the step of using the absolute values of each pixel value during the execution of one or more processing steps.
22. The method of claim 13 wherein said input image is comprised of pixel data and wherein the step of enhancing the contrast further comprises the application of a maximum filter window that sets the value of a target pixel to the maximum value of any pixels within the window.
23. The method of claim 13 further comprising the step of selecting one or more contrast enhancement options for said imaging device to apply to generate said enhanced image.
24. The method of claim 13 wherein said area of body tissue comprises body tissue containing vascular structures and further comprising the step of locating said vascular structures.