WO2016039002A1 - Imaging device and imaging method - Google Patents

Imaging device and imaging method Download PDF

Info

Publication number
WO2016039002A1
WO2016039002A1 (PCT/JP2015/069607)
Authority
WO
WIPO (PCT)
Prior art keywords
relational expression
focusing lens
image
focus
accuracy index
Prior art date
Application number
PCT/JP2015/069607
Other languages
French (fr)
Japanese (ja)
Inventor
足立 晋
Original Assignee
株式会社島津製作所
Priority date
Filing date
Publication date
Application filed by 株式会社島津製作所
Publication of WO2016039002A1 publication Critical patent/WO2016039002A1/en

Classifications

    • A: HUMAN NECESSITIES
    • A61: MEDICAL OR VETERINARY SCIENCE; HYGIENE
    • A61B: DIAGNOSIS; SURGERY; IDENTIFICATION
    • A61B10/00: Other methods or instruments for diagnosis, e.g. instruments for taking a cell sample, for biopsy, for vaccination diagnosis; Sex determination; Ovulation-period determination; Throat striking implements
    • H: ELECTRICITY
    • H04: ELECTRIC COMMUNICATION TECHNIQUE
    • H04N: PICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N23/00: Cameras or camera modules comprising electronic image sensors; Control thereof
    • H04N23/60: Control of cameras or camera modules
    • H04N23/67: Focus control based on electronic image sensor signals
    • G: PHYSICS
    • G01: MEASURING; TESTING
    • G01N: INVESTIGATING OR ANALYSING MATERIALS BY DETERMINING THEIR CHEMICAL OR PHYSICAL PROPERTIES
    • G01N21/00: Investigating or analysing materials by the use of optical means, i.e. using sub-millimetre waves, infrared, visible or ultraviolet light
    • G01N21/62: Systems in which the material investigated is excited whereby it emits light or causes a change in wavelength of the incident light
    • G01N21/63: Systems in which the material investigated is optically excited
    • G01N21/64: Fluorescence; Phosphorescence

Definitions

  • The present invention relates to an imaging apparatus and an imaging method for irradiating a fluorescent dye injected into a subject with excitation light and photographing the fluorescence generated from the dye.
  • In near-infrared fluorescence imaging, indocyanine green (ICG) is injected into the affected area as a fluorescent dye.
  • When the indocyanine green is irradiated with near-infrared light of 750 to 810 nm (nanometers) as excitation light, it emits fluorescence in the near-infrared region with a peak at about 800 nm.
  • This fluorescence is photographed by a camera capable of detecting near-infrared light, and the image is displayed on a display unit such as a liquid crystal display panel.
  • Patent Document 1 discloses a data collection method in which a near-infrared fluorescence intensity distribution image, obtained by irradiating indocyanine green excitation light onto a living organ to which indocyanine green has been administered, is compared with a cancer lesion distribution image, obtained by applying X-rays, nuclear magnetic resonance, or ultrasound to the organ before administration; data for regions detected in the fluorescence intensity distribution image but not in the cancer lesion distribution image are collected as secondary cancer lesion region data.
  • Patent Document 2 discloses a surgical support method in which excitation light and visible light are alternately irradiated onto a subject to which an angiographic contrast agent has been administered, fluorescence images and visible images are alternately acquired by an imaging unit, a blood vessel image is extracted from the fluorescence image by thresholding with a predetermined threshold, and a composite image superimposing the visible image and the extracted blood vessel image is created.
  • Patent Document 3 discloses a hill-climbing autofocus device: the lens is driven in an arbitrary direction from the focusing start point, contrast values are obtained at n points along the way, and the initial drive direction is determined by a majority decision comparing the n contrast values with the contrast value at the start point.
  • The autofocus device of Patent Document 3 calculates a focus value, a parameter corresponding to contrast, from the captured image and moves the focusing lens to the position where the focus value is maximized.
  • Fluorescence from indocyanine green is very weak compared to visible light. For this reason, when this fluorescence is imaged by the image sensor, the noise component of the image sensor occupies a large proportion of the fluorescence image signal.
  • FIG. 8 is a graph schematically showing the relationship between the rotational position of the motor for moving the focusing lens and the focus value at that time.
  • In this graph, the vertical and horizontal axes are in arbitrary units (a.u.).
  • the present invention has been made to solve the above-described problems, and an object of the present invention is to provide an imaging apparatus capable of performing accurate focusing even when photographing fluorescence.
  • According to a first aspect, an imaging apparatus for capturing a fluorescent image of a subject comprises: an excitation light source that irradiates the subject with an excitation light beam for exciting a fluorescent dye injected into the subject; an imaging element capable of capturing the fluorescent image generated from the fluorescent dye when irradiated with the excitation light beam; a focusing mechanism that focuses the fluorescent image onto the imaging element by movement of a focusing lens; a motor for moving the focusing lens; a relational expression calculation unit that moves the focusing lens by the motor and obtains a relational expression representing the relationship between the position of the focusing lens and the focus value of the fluorescent image captured by the imaging element; an accuracy index calculation unit that calculates an accuracy index of the relational expression; and a focusing lens position control unit that, when the accuracy index is equal to or greater than a predetermined value, determines the region where the in-focus position exists based on the relational expression and moves the focusing lens to the position within this region where the focus value is maximized.
  • According to a second aspect, the relational expression is a linear expression, and the focusing lens position control unit determines that the region where the slope of the straight line represented by the linear expression is inverted is the region where the in-focus position exists.
  • According to a third aspect, the focus value is calculated by applying a sharpening filter to the fluorescence image captured by the imaging element.
  • According to a fourth aspect, an imaging method in which an excitation light beam for exciting a fluorescent dye injected into a subject is irradiated toward the subject and the fluorescence generated from the dye is focused onto an imaging element by moving a focusing lens comprises: an excitation light irradiation step of irradiating the subject with the excitation light beam; a relational expression calculation step of photographing the fluorescent image generated from the dye while moving the focusing lens and obtaining a relational expression representing the relationship between the position of the focusing lens and the focus value of the fluorescent image; an accuracy index calculation step of calculating an accuracy index of the relational expression; and a focusing step of repeating the relational expression calculation step until the accuracy index reaches a set value and, once the accuracy index is equal to or greater than the set value, determining the region where the in-focus position exists based on the relational expression and moving the focusing lens to the position within this region where the focus value is maximized.
  • the focus value can be easily calculated by the sharpening filter.
  • FIG. 1 is a schematic diagram of an imaging apparatus according to the present invention.
  • FIG. 2 is a perspective view of an illumination/photographing unit 12.
  • FIG. 3 is a schematic diagram of a camera 21.
  • FIG. 4 is a block diagram showing the main control system of the imaging apparatus according to the present invention.
  • FIG. 5 is a functional block diagram of a fluorescence image processing unit 32.
  • FIG. 6 is a flowchart showing the imaging method according to the present invention.
  • FIG. 7 is an explanatory diagram of a Laplacian filter.
  • FIG. 8 is a graph schematically showing the relationship between the rotational position of the motor for moving the focusing lens and the focus value at that time.
  • FIG. 1 is a schematic diagram of an imaging apparatus according to the present invention.
  • The imaging apparatus includes a main body 10 provided with an input unit 11 such as a touch panel and incorporating a control unit 30 described later, an illumination/photographing unit 12 movably supported by an arm 13, a display unit 14 such as a liquid crystal display panel, and a treatment table 16 on which a patient 17 is placed.
  • the illumination / imaging unit 12 is not limited to the one supported by the arm 13, and may be carried by the surgeon in hand.
  • the display unit 14 may be a glasses-type wearable device instead of a liquid crystal display panel.
  • FIG. 2 is a perspective view of the illumination / photographing unit 12 described above.
  • The illumination/photographing unit 12 includes a camera 21 capable of detecting near-infrared light and visible light, a visible light source 22 consisting of a large number of LEDs arranged around the outer periphery of the camera 21, and an excitation light source 23 consisting of a large number of LEDs arranged around the outer periphery of the visible light source 22.
  • the visible light source 22 emits visible light.
  • The excitation light source 23 emits near-infrared light with a wavelength of 760 nm, which serves as excitation light for exciting indocyanine green.
  • FIG. 3 is a schematic diagram of the camera 21.
  • the camera 21 includes a focusing lens 71 that reciprocates for focusing, a wavelength selection filter 53, a visible light image sensor 51, and a fluorescence image sensor 52.
  • the visible light image sensor 51 and the fluorescence image sensor 52 are composed of a CMOS or a CCD.
  • Visible light and fluorescence incident on the camera 21 coaxially along the optical axis L pass through the focusing lens 71 constituting the focusing mechanism, and then reach the wavelength selection filter 53.
  • Of the visible light and fluorescence incident coaxially, the visible light is reflected by the wavelength selection filter 53 and enters the visible light image sensor 51.
  • the fluorescence passes through the wavelength selection filter 53 and enters the fluorescence imaging device 52. At this time, the visible light is focused on the visible light imaging element 51 and the fluorescence is focused on the fluorescent imaging element 52 by the action of the focusing mechanism including the focusing lens 71.
  • FIG. 4 is a block diagram showing a main control system of the imaging apparatus according to the present invention.
  • This imaging apparatus includes a control unit 30 that controls the entire apparatus and is composed of a CPU that performs logical operations, a ROM that stores the operation programs necessary for controlling the apparatus, a RAM that temporarily stores data during control, and the like.
  • The control unit 30 includes a visible image processing unit 31 for processing visible images, a fluorescence image processing unit 32 for processing fluorescence images generated from indocyanine green, and an image synthesizing unit 33 that synthesizes the visible image and the fluorescence image.
  • the control unit 30 is connected to the input unit 11 and the display unit 14 described above.
  • the control unit 30 is connected to an illumination / photographing unit 12 including a camera 21, a visible light source 22, and an excitation light source 23. Further, the control unit 30 is connected to a motor 39 for moving the focusing lens 71 described above.
  • the visible image processing unit 31 in the control unit 30 transmits a control signal to the visible light source 22 and the visible light imaging device 51 and receives an image signal of a visible image from the visible light imaging device 51.
  • the fluorescence image processing unit 32 in the control unit 30 transmits a control signal to the excitation light source 23 and the fluorescence imaging element 52, receives an image signal of the fluorescence image from the fluorescence imaging element 52, and The rotation of the motor 39 that moves the focusing lens 71 is controlled.
  • When performing an operation using the imaging apparatus configured as described above, indocyanine green is first injected into the patient 17 lying supine on the treatment table 16. Near-infrared light is then emitted from the excitation light source 23 toward the subject including the affected part, and visible light is emitted from the visible light source 22. As described above, the near-infrared light emitted from the excitation light source 23 is 760 nm light, which acts as excitation light causing indocyanine green to emit fluorescence. As a result, the indocyanine green generates fluorescence in the near-infrared region with a peak at about 800 nm.
  • the camera 21 captures the vicinity of the affected part of the patient 17.
  • the camera 21 can detect fluorescence and visible light.
  • An image obtained by irradiating the patient 17 with visible light and photographing this with the camera 21 becomes a visible image
  • an image obtained by irradiating the patient 17 with near-infrared light and photographing fluorescence from indocyanine green with the camera 21 is a fluorescent image.
  • a fluorescent image captured by the camera 21 at a predetermined frame rate is processed by the fluorescent image processing unit 32
  • a visible image captured by the camera 21 at a predetermined frame rate is processed by the visible image processing unit 31.
  • the processed fluorescence image data and visible image data are synthesized by the image synthesis unit 33, and a synthesized image in which the visible image and the fluorescence image are fused is created.
  • On the display unit 14, the fluorescent image, the visible image, and the composite image are displayed either simultaneously in divided areas or selectively.
  • noise is superimposed on an image captured by the visible light image sensor 51 and the fluorescence image sensor 52.
  • This noise becomes a particular problem when photographing fluorescence from indocyanine green, which is very weak compared with visible light: the amplification factor of the signal from the fluorescence image sensor 52 must be increased, and the amplified noise affects the focusing operation.
  • a relational expression representing a relation between the position of the focusing lens 71 and the focus value of the fluorescent image captured by the fluorescent imaging element 52 is calculated by the relational expression calculation unit.
  • The accuracy index of the relational expression is calculated by the accuracy index calculation unit. When the accuracy index is equal to or greater than a predetermined value, the region where the in-focus position exists is determined based on the relational expression, and the focusing lens 71 is moved to the position within this region where the focus value is maximized.
  • FIG. 5 is a functional block diagram of the fluorescent image processing unit 32 described above.
  • The fluorescence image processing unit 32 includes an arithmetic unit 41 that executes the various calculations described later, an image memory 48 for storing fluorescence images, a motor driver 49 for driving the motor 39, and a bus controller 46.
  • the computing unit 41, the image memory 48, and the bus controller 46 are connected by a data bus 47.
  • The arithmetic unit 41 includes a relational expression calculation unit 42 that moves the focusing lens 71 by the motor 39 and obtains a relational expression representing the relationship between the position of the focusing lens 71 and the focus value of the fluorescence image captured by the fluorescence imaging element 52; an accuracy index calculation unit 43 that calculates an accuracy index of this relational expression; and a lens position control unit 44 that, when the accuracy index calculated by the accuracy index calculation unit 43 is equal to or greater than a predetermined value, determines the region where the in-focus position exists based on the relational expression and moves the focusing lens 71 to the position within this region where the focus value is maximized.
  • FIG. 6 is a flowchart showing a focusing operation at the time of photographing a fluorescent image by the imaging apparatus according to the present invention.
  • First, the motor 39 shown in FIG. 4 is driven to start moving the focusing lens 71 shown in FIG. 3 from its current position toward the farther of the two ends of its stroke (step S1).
  • Focus values are then acquired at n positions while moving the focusing lens 71 in that direction (step S2).
  • Each focus value is calculated by applying a sharpening filter to the fluorescence image captured by the fluorescence imaging element 52 in the camera 21 at that lens position.
  • FIG. 7 is an explanatory diagram of a Laplacian filter as a sharpening filter.
  • In this embodiment, a 3 × 3 Laplacian filter is used as the sharpening filter.
  • The Laplacian filter is a filter that extracts edge information from the captured fluorescent image.
  • the focus value at this position is calculated by applying the sharpening filter shown in FIG. 7 to each pixel of the fluorescence image captured by the fluorescence imaging element 52.
  • the coefficient and matrix size of the Laplacian filter can be changed as appropriate.
  • another sharpening filter may be used instead of the Laplacian filter.
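  • The Laplacian focus measure described above can be sketched as follows. This is an illustrative reconstruction, not the patent's exact implementation: the kernel coefficients and the choice of summing absolute filter responses over the image are common conventions, since the patent's FIG. 7 is not reproduced here.

```python
# Hypothetical sketch: convolve a 3x3 Laplacian kernel over the fluorescence
# image and sum the absolute responses. A sharper (better-focused) image has
# stronger edges and therefore yields a larger focus value.

LAPLACIAN_3X3 = [
    [0, 1, 0],
    [1, -4, 1],
    [0, 1, 0],
]

def focus_value(image):
    """Sum of absolute Laplacian responses over interior pixels.

    `image` is a 2-D list of grayscale intensities (rows x cols).
    """
    rows, cols = len(image), len(image[0])
    total = 0
    for y in range(1, rows - 1):
        for x in range(1, cols - 1):
            response = sum(
                LAPLACIAN_3X3[dy][dx] * image[y - 1 + dy][x - 1 + dx]
                for dy in range(3)
                for dx in range(3)
            )
            total += abs(response)
    return total

# A sharp edge produces a larger focus value than a smooth gradient.
sharp = [[0, 0, 100, 100]] * 4
smooth = [[0, 33, 66, 100]] * 4
assert focus_value(sharp) > focus_value(smooth)
```

As the bullet above notes, the coefficients and matrix size can be changed, and any other sharpening filter could be substituted for `focus_value`'s kernel.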
  • Next, the coefficient representing the slope in this relational expression is evaluated (step S4); that is, it is determined whether the slope is rising, falling, or cannot be determined.
  • Specifically, a linear function (the patent's Formulas 1 and 2) is fitted by the least squares method to the n focus values y_i (i = 1, 2, 3, ..., n) of the fluorescence images captured by the fluorescence imaging element 52 and the corresponding positions x_i of the focusing lens 71.
  • A coefficient of determination R² representing the accuracy of this fit is given by the patent's Formulas 4 and 5. The value of R² lies between 0 and 1; the closer it is to 1, the higher the statistical confidence in the fitted line.
  • In step S4, three cases are distinguished: when the value of the coefficient of determination R² is equal to or greater than a predetermined threshold (e.g., 0.8) and the slope coefficient a is positive, "rising" is determined; when R² is equal to or greater than the threshold and the coefficient a is negative, "falling" is determined; and when R² is below the threshold, "determination impossible" is determined.
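  • The fitting and confidence check above can be sketched with the standard least-squares and R² formulas. The patent's own Formulas 1-5 are not reproduced on this page, so the function names and exact expressions below are our reconstruction of a textbook linear fit, not the patent's figures.

```python
# Fit y = a*x + b to n (lens position, focus value) pairs by least squares,
# then compute R^2 = 1 - SS_res / SS_tot as a confidence measure: near 1
# means the fitted slope can be trusted for the rising/falling decision.

def fit_line(xs, ys):
    """Least-squares slope a and intercept b for y = a*x + b."""
    n = len(xs)
    mx = sum(xs) / n
    my = sum(ys) / n
    sxy = sum((x - mx) * (y - my) for x, y in zip(xs, ys))
    sxx = sum((x - mx) ** 2 for x in xs)
    a = sxy / sxx
    b = my - a * mx
    return a, b

def r_squared(xs, ys, a, b):
    """Coefficient of determination of the fitted line."""
    my = sum(ys) / len(ys)
    ss_res = sum((y - (a * x + b)) ** 2 for x, y in zip(xs, ys))
    ss_tot = sum((y - my) ** 2 for y in ys)
    return 1.0 - ss_res / ss_tot

xs = [1, 2, 3, 4, 5]
ys = [2.1, 3.9, 6.2, 8.0, 9.9]   # noisy but clearly rising focus values
a, b = fit_line(xs, ys)
r2 = r_squared(xs, ys, a, b)
assert a > 0 and r2 > 0.8         # step S4 would determine "rising" here
```

With noisy fluorescence data a low R² flags the fit as untrustworthy, which is exactly the "determination impossible" branch of step S4.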
  • When "falling" is determined in the slope determination step (step S4), the moving direction of the focusing lens 71 is reversed (step S3) and step S2 is executed again. When "rising" is determined, the process proceeds to step S6 described later. When "determination impossible" is determined, one of the n data points indicating the relationship between the focus value y_i and the position x_i of the focusing lens 71 is discarded, the next data point is acquired to update the focus data (step S5), and step S4 is repeated until a determination becomes possible.
  • When "rising" is determined in the slope determination step (step S4), one of the n data points indicating the relationship between the focus value y_i and the position x_i of the focusing lens 71 is discarded, the next data point is acquired to update the focus data (step S6), and the slope determination is performed again (step S7).
  • When "rising" or "determination impossible" is determined in step S7, one of the n data points indicating the relationship between the focus value y_i and the position x_i of the focusing lens 71 is again discarded and the next data point acquired, and steps S6 and S7 are repeated until a "falling" determination is obtained.
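  • The sliding-window loop of steps S2-S7 can be sketched as follows. All names, the window size, and the 0.8 threshold are illustrative (the patent gives 0.8 only as an example), and the fit helper is folded in so the sketch is self-contained; it is a reading of the flowchart, not the patent's code.

```python
# Slide a window of the latest n (lens position, focus value) samples, fit a
# line to each window, and classify the slope as RISING, FALLING, or
# UNDECIDED. The in-focus region is where a trusted slope flips from rising
# to falling, even though individual noisy focus values never give a clean
# maximum.
from collections import deque

def fit_slope_r2(points):
    """Least-squares slope and R^2 of focus value vs. lens position."""
    n = len(points)
    mx = sum(p[0] for p in points) / n
    my = sum(p[1] for p in points) / n
    sxy = sum((x - mx) * (y - my) for x, y in points)
    sxx = sum((x - mx) ** 2 for x, _ in points)
    a = sxy / sxx
    b = my - a * mx
    ss_res = sum((y - (a * x + b)) ** 2 for x, y in points)
    ss_tot = sum((y - my) ** 2 for _, y in points)
    r2 = 1.0 if ss_tot == 0 else 1.0 - ss_res / ss_tot
    return a, r2

def classify(points, threshold=0.8):  # 0.8: the example threshold in the text
    a, r2 = fit_slope_r2(points)
    if r2 < threshold:
        return "UNDECIDED"            # step S5/S7: discard oldest, fetch next
    return "RISING" if a > 0 else "FALLING"

def find_peak_region(samples, n=5):
    """Return the (start, end) lens positions of the window where the fitted
    slope inverts from rising to falling, or None if it never does."""
    window = deque(maxlen=n)          # oldest sample dropped automatically
    was_rising = False
    for pos, fv in samples:
        window.append((pos, fv))
        if len(window) < n:
            continue
        state = classify(list(window))
        if state == "RISING":
            was_rising = True
        elif state == "FALLING" and was_rising:
            return window[0][0], window[-1][0]
    return None

# Noisy hill-shaped focus curve peaking near position 10.
samples = [(x, 100 - (x - 10) ** 2 + (1 if x % 2 else -1)) for x in range(21)]
region = find_peak_region(samples)
assert region is not None and region[0] <= 10 <= region[1]
```

Near the peak the window's slope is flat and R² collapses, so the loop keeps sliding (UNDECIDED) until the falling side gives a confident fit again; the lens would then be driven to the maximum focus value inside the returned region.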
  • In the above embodiment, the slope determination is performed using a linear function; however, other functions such as a quadratic function may be used.
  • In the above embodiment, focusing of the fluorescent image has been described; the focusing result obtained for the fluorescent image may also be used for the visible image.
  • the fluorescent image and the visible image may be focused by separate optical systems.
  • In the above embodiment, the excitation light source 23 emits near-infrared light with a wavelength of 760 nm; however, any light source that emits near-infrared light with a wavelength of about 750 nm to 810 nm, capable of exciting indocyanine green to generate fluorescence, can be used.
  • In the above embodiment, indocyanine green is used as the material containing the fluorescent dye, and irradiating it with 760 nm near-infrared light as excitation light produces fluorescence of approximately 800 nm from the indocyanine green; however, light other than near-infrared light may also be used.
  • Alternatively, 5-ALA (5-aminolevulinic acid) can be used as the fluorescent dye.
  • When 5-ALA is used, the 5-ALA that has entered the body of the patient 17 changes into protoporphyrin IX (PpIX), a fluorescent substance.
  • When visible light of about 400 nm is irradiated toward the protoporphyrin, red visible light is emitted from it as fluorescence. Therefore, when 5-ALA is used, an excitation light source that emits visible light with a wavelength of about 400 nm may be used, and the red visible light emitted as fluorescence from the protoporphyrin may be photographed.

Abstract

 A relational expression indicating the relationship between the position of a focusing lens and the focus value of a fluorescence image captured by a fluorescence imaging element is computed by a relational expression computing unit 42. A precision index of this relational expression is computed by a precision index computing unit 43. When the precision index is equal to or greater than a predetermined value, a lens position control unit 44 determines, on the basis of the relational expression, the region in which the focal position is present, and the focusing lens is moved to the position within that region at which the focus value is maximized.

Description

Imaging apparatus and imaging method
 The present invention relates to an imaging apparatus and an imaging method for irradiating a fluorescent dye injected into a subject with excitation light and photographing the fluorescence generated from the dye.
 In recent years, a technique called near-infrared fluorescence imaging has been used in surgery. In this technique, indocyanine green (ICG) is injected into the affected area as a fluorescent dye. When the indocyanine green is irradiated with near-infrared light of 750 to 810 nm (nanometers) as excitation light, it emits fluorescence in the near-infrared region with a peak at about 800 nm. This fluorescence is photographed by a camera capable of detecting near-infrared light, and the image is displayed on a display unit such as a liquid crystal display panel. Near-infrared fluorescence imaging makes it possible to observe blood vessels, lymph vessels, and the like located at depths of up to about 20 mm from the body surface.
 Patent Document 1 discloses a data collection method in which a near-infrared fluorescence intensity distribution image, obtained by irradiating indocyanine green excitation light onto a living organ to which indocyanine green has been administered, is compared with a cancer lesion distribution image, obtained by applying X-rays, nuclear magnetic resonance, or ultrasound to the organ before administration; data for regions detected in the fluorescence intensity distribution image but not in the cancer lesion distribution image are collected as secondary cancer lesion region data.
 Patent Document 2 discloses a surgical support method in which excitation light and visible light are alternately irradiated onto a subject to which an angiographic contrast agent has been administered, fluorescence images and visible images are alternately acquired by an imaging unit, a blood vessel image is extracted from the fluorescence image by thresholding with a predetermined threshold, and a composite image superimposing the visible image and the extracted blood vessel image is created.
 When such a subject is photographed by an image sensor, the image of the subject must be focused onto the sensor. Patent Document 3 discloses a hill-climbing autofocus device: the lens is driven in an arbitrary direction from the focusing start point, contrast values are obtained at n points along the way, and the initial drive direction is determined by a majority decision comparing the n contrast values with the contrast value at the start point. This device calculates a focus value, a parameter corresponding to contrast, from the captured image and moves the focusing lens to the position where the focus value is maximized.
Patent Document 1: International Publication No. WO 2009/139466. Patent Document 2: JP 2009-226072 A. Patent Document 3: Japanese Patent No. 3179138.
 Fluorescence from indocyanine green is very weak compared with visible light. For this reason, when this fluorescence is captured by an image sensor, the sensor's noise component accounts for a large proportion of the fluorescence image signal.
 FIG. 8 is a graph schematically showing the relationship between the rotational position of the motor that moves the focusing lens and the focus value at each position. In this graph, the vertical and horizontal axes are in arbitrary units (a.u.).
 As shown in FIG. 8, because the fluorescence is weak and the signal from the image sensor contains a large noise component, the relationship between the motor's rotational position and the focus value fluctuates. Consequently, with a configuration such as the autofocus device of Patent Document 3, it is difficult to determine the position where the focus value is maximized, and focusing cannot be performed accurately.
 The present invention has been made to solve this problem, and its object is to provide an imaging apparatus capable of accurate focusing even when photographing fluorescence.
 In the first invention, an imaging apparatus for capturing a fluorescent image of a subject comprises: an excitation light source that irradiates the subject with an excitation light beam for exciting a fluorescent dye injected into the subject; an imaging element capable of capturing the fluorescent image generated from the fluorescent dye when irradiated with the excitation light beam; a focusing mechanism that focuses the fluorescent image onto the imaging element by movement of a focusing lens; a motor for moving the focusing lens; a relational expression calculation unit that moves the focusing lens by the motor and obtains a relational expression representing the relationship between the position of the focusing lens and the focus value of the fluorescent image captured by the imaging element; an accuracy index calculation unit that calculates an accuracy index of the relational expression; and a focusing lens position control unit that, when the accuracy index is equal to or greater than a predetermined value, determines the region where the in-focus position exists based on the relational expression and moves the focusing lens to the position within this region where the focus value is maximized.
 第2の発明では、前記関係式は一次式であり、前記焦点合わせ用レンズ位置制御部は、前記一次式で示される直線の傾きが反転した領域を前記合焦点位置が存在する領域と判定する。 In the second invention, the relational expression is a linear expression, and the focusing lens position control unit determines that an area where the inclination of the straight line represented by the linear expression is inverted is an area where the in-focus position exists. .
 第3の発明では、前記フォーカス値は、前記撮像素子により撮像した蛍光画像に先鋭化フィルターを適用することにより演算される。 In the third invention, the focus value is calculated by applying a sharpening filter to the fluorescence image captured by the image sensor.
 第4の発明では、被写体に注入された蛍光色素を励起させるための励起光線を前記被写体に向けて照射し、前記蛍光色素から発生する蛍光を焦点合わせ用レンズの移動により撮像素子に対して焦点合わせして励起光の画像を得るイメージング方法において、前記被写体に対して励起光線を照射する励起光照射工程と、前記焦点合わせ用レンズを移動されながら前記蛍光色素から発生する蛍光画像を撮影し、前記焦点合わせ用レンズの位置と前記蛍光画像のフォーカス値との関係を表す関係式を求める関係式演算工程と、前記関係式演算工程で演算した関係式の精度指標を演算する精度指標演算工程と、前記精度指標演算工程で演算した精度指標が設定値となるまで前記関係式演算工程を継続して実行し、精度指標が設定値以上となったときに、前記関係式演算工程にて演算した関係式に基づいて合焦点位置が存在する領域を判定し、この領域内で前記フォーカス値が最大となる位置に前記焦点合わせ用レンズを移動させる焦点合わせ工程と、を含む。 In the fourth aspect of the invention, an excitation beam for exciting the fluorescent dye injected into the subject is irradiated toward the subject, and the fluorescence generated from the fluorescent dye is focused on the image sensor by moving the focusing lens. In the imaging method for obtaining an image of excitation light together, an excitation light irradiation step of irradiating the subject with an excitation light beam, and a fluorescent image generated from the fluorescent dye while moving the focusing lens, A relational expression calculating step for obtaining a relational expression representing a relation between the position of the focusing lens and the focus value of the fluorescent image; and an accuracy index calculating step for calculating a precision index of the relational expression calculated in the relational expression calculating step; The relational expression calculation step is continuously executed until the accuracy index calculated in the accuracy index calculation step reaches a set value, and the accuracy index becomes equal to or greater than the set value. The focusing step of determining a region where the in-focus position exists based on the relational expression calculated in the relational expression calculating step, and moving the focusing lens to a position where the focus value is maximum in this region And including.
 第1および第4の発明によれば、微弱な蛍光を撮影する場合であっても、精度指標を利用することにより、ノイズの影響を防止して、正確な焦点合わせを実行することが可能となる。 According to the first and fourth inventions, even when weak fluorescence is photographed, it is possible to prevent the influence of noise and perform accurate focusing by using an accuracy index. Become.
 第2の発明によれば、一次式を利用して、容易に合焦点位置を判定することが可能となる。 According to the second invention, it is possible to easily determine the in-focus position using the primary expression.
 第3の発明によれば、先鋭化フィルターにより、フォーカス値を容易に演算することが可能となる。 According to the third invention, the focus value can be easily calculated by the sharpening filter.
FIG. 1 is a schematic diagram of an imaging apparatus according to the present invention. FIG. 2 is a perspective view of the illumination/imaging unit 12. FIG. 3 is a schematic diagram of the camera 21. FIG. 4 is a block diagram showing the main control system of the imaging apparatus according to the present invention. FIG. 5 is a functional block diagram of the fluorescence image processing unit 32. FIG. 6 is a flowchart showing the imaging method according to the present invention. FIG. 7 is an explanatory diagram of a Laplacian filter. FIG. 8 is a graph schematically showing the relationship between the rotational position of the motor that moves the focusing lens and the focus value at each position.
 Embodiments of the present invention are described below with reference to the drawings. FIG. 1 is a schematic diagram of an imaging apparatus according to the present invention.
 This imaging apparatus comprises a main body 10 that has an input unit 11 such as a touch panel and incorporates a control unit 30 and other components described later, an illumination/imaging unit 12 movably supported by an arm 13, a display unit 14 composed of a liquid crystal display panel, and a treatment table 16 on which a patient 17 is placed. The illumination/imaging unit 12 is not limited to one supported by the arm 13 and may instead be carried in the operator's hand. A glasses-type wearable device may also be used as the display unit 14 in place of the liquid crystal display panel.
 FIG. 2 is a perspective view of the illumination/imaging unit 12 described above.
 The illumination/imaging unit 12 comprises a camera 21 capable of detecting near-infrared light and visible light, a visible light source 22 composed of a large number of LEDs arranged around the periphery of the camera 21, and an excitation light source 23 composed of a large number of LEDs arranged around the periphery of the visible light source 22. The visible light source 22 emits visible light. The excitation light source 23 emits near-infrared light with a wavelength of 760 nm, which serves as excitation light for exciting indocyanine green.
 FIG. 3 is a schematic diagram of the camera 21.
 The camera 21 comprises a focusing lens 71 that reciprocates for focusing, a wavelength selection filter 53, a visible-light image sensor 51, and a fluorescence image sensor 52. The visible-light image sensor 51 and the fluorescence image sensor 52 are each composed of a CMOS or CCD device. Visible light and fluorescence entering the camera 21 coaxially along its optical axis L pass through the focusing lens 71, which constitutes the focusing mechanism, and then reach the wavelength selection filter 53. Of the coaxially incident visible light and fluorescence, the visible light is reflected by the wavelength selection filter 53 and enters the visible-light image sensor 51, while the fluorescence passes through the wavelength selection filter 53 and enters the fluorescence image sensor 52. At this time, by the action of the focusing mechanism including the focusing lens 71, the visible light is focused onto the visible-light image sensor 51 and the fluorescence is focused onto the fluorescence image sensor 52.
 FIG. 4 is a block diagram showing the main control system of the imaging apparatus according to the present invention.
 This imaging apparatus comprises a control unit 30 that controls the entire apparatus and is composed of a CPU that executes logical operations, a ROM that stores the operating programs required to control the apparatus, a RAM in which data are temporarily stored during control, and the like. The control unit 30 comprises a visible image processing unit 31 for processing visible images, a fluorescence image processing unit 32 for processing images of the fluorescence generated by indocyanine green, and an image compositing unit 33 for compositing the visible and fluorescence images.
 The control unit 30 is connected to the input unit 11 and the display unit 14 described above. It is also connected to the illumination/imaging unit 12, which comprises the camera 21, the visible light source 22, and the excitation light source 23, and to the motor 39 for moving the focusing lens 71 described above. The visible image processing unit 31 in the control unit 30 transmits control signals to the visible light source 22 and the visible-light image sensor 51 and receives visible image signals from the visible-light image sensor 51. The fluorescence image processing unit 32 in the control unit 30 transmits control signals to the excitation light source 23 and the fluorescence image sensor 52, receives fluorescence image signals from the fluorescence image sensor 52, and further controls the rotation of the motor 39 that moves the focusing lens 71.
 When surgery is performed using the imaging apparatus configured as described above, indocyanine green is injected into the patient 17 lying supine on the treatment table 16. The excitation light source 23 then irradiates the subject, including the affected area, with near-infrared light, and the visible light source 22 irradiates it with visible light. As described above, the near-infrared light emitted by the excitation light source 23 is 760 nm light, which acts as excitation light that causes indocyanine green to fluoresce. As a result, the indocyanine green emits fluorescence in the near-infrared region with a peak at approximately 800 nm.
 The vicinity of the affected area of the patient 17 is then photographed by the camera 21, which, as described above, can detect both fluorescence and visible light. An image obtained by irradiating the patient 17 with visible light and photographing it with the camera 21 is the visible image, and an image obtained by irradiating the patient 17 with near-infrared light and photographing the fluorescence from the indocyanine green with the camera 21 is the fluorescence image. The fluorescence images captured by the camera 21 at a predetermined frame rate are processed by the fluorescence image processing unit 32, and the visible images captured at a predetermined frame rate are processed by the visible image processing unit 31. The processed fluorescence image data and visible image data are then composited by the image compositing unit 33 to create a composite image in which the visible and fluorescence images are fused. The fluorescence image, the visible image, and the composite image are displayed on the display unit 14 either simultaneously in separate regions or selectively.
 In such a configuration, noise is superimposed on the images captured by the visible-light image sensor 51 and the fluorescence image sensor 52. This noise becomes a problem particularly when the amplification factor of the signal from the fluorescence image sensor 52 is increased in order to photograph the fluorescence from indocyanine green, which is extremely weak compared with visible light, and it interferes with accurate autofocus operation.
 For this reason, in the imaging apparatus according to the present invention, a relational expression representing the relationship between the position of the focusing lens 71 and the focus value of the fluorescence image captured by the fluorescence image sensor 52 is calculated by a relational-expression calculation unit, and an accuracy index of that relational expression is calculated by an accuracy-index calculation unit. When the accuracy index is equal to or greater than a predetermined value, the region in which the in-focus position exists is determined on the basis of the relational expression, and the focusing lens 71 is moved to the in-focus position, i.e., the position within that region at which the focus value is maximized.
 FIG. 5 is a functional block diagram of the fluorescence image processing unit 32 described above.
 The fluorescence image processing unit 32 comprises a processor 41 that executes the various calculations described later, an image memory 48 for storing fluorescence images, a motor driver 49 for driving the motor 39 described above, and a bus controller 46. The processor 41, the image memory 48, and the bus controller 46 are connected by a data bus 47. As its functional configuration, the processor 41 comprises: a relational-expression calculation unit 42 that moves the focusing lens 71 by means of the motor 39 and obtains a relational expression representing the relationship between the position of the focusing lens 71 and the focus value of the fluorescence image captured by the fluorescence image sensor 52; an accuracy-index calculation unit 43 that calculates an accuracy index of the relational expression calculated by the relational-expression calculation unit 42; and a lens position control unit 44 that, when the accuracy index calculated by the accuracy-index calculation unit 43 for the relational expression obtained by the relational-expression calculation unit 42 is equal to or greater than a predetermined value, determines the region in which the in-focus position exists on the basis of that relational expression and moves the focusing lens 71 to the position within that region at which the focus value is maximized.
 Next, the focusing operation performed by the imaging apparatus described above when photographing fluorescence images is described. FIG. 6 is a flowchart showing the focusing operation performed when photographing fluorescence images with the imaging apparatus according to the present invention.
 When the focusing operation is executed, the motor 39 shown in FIG. 4 is driven to start moving the focusing lens 71 shown in FIG. 3 from its current position toward the farther of the two ends of its stroke (step S1).
 Next, focus values are acquired at n positions while the focusing lens 71 is moved in that direction (step S2). The focus values are obtained at n positions from the fluorescence images captured by the fluorescence image sensor 52 of the camera 21 while the focusing lens 71 is moved. Each focus value is calculated by applying a sharpening filter to the fluorescence image captured by the fluorescence image sensor 52.
 FIG. 7 is an explanatory diagram of the Laplacian filter used as the sharpening filter.
 In this embodiment, a 3 × 3 Laplacian filter is used as the sharpening filter. A sharpening filter, also called an edge-enhancement filter, extracts the edge information of the captured fluorescence image. The focus value at each lens position is calculated by applying the sharpening filter shown in FIG. 7 to each pixel of the fluorescence image captured by the fluorescence image sensor 52. The coefficients and matrix size of the Laplacian filter may be changed as appropriate, and another sharpening filter may be used in place of the Laplacian filter.
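As an illustration, the focus-value computation described above can be sketched as follows. This is a minimal sketch, not the patented implementation: the particular kernel coefficients and the rule of summing the absolute filter responses are assumptions, since the text only specifies that a 3 × 3 Laplacian filter is applied to each pixel of the fluorescence image.

```python
# Sketch of a focus value obtained with a 3x3 Laplacian (sharpening) filter.
# The kernel below and the sum-of-absolute-responses rule are assumptions;
# the patent only states that a 3x3 Laplacian filter is applied per pixel.

LAPLACIAN_3X3 = [
    [0,  1, 0],
    [1, -4, 1],
    [0,  1, 0],
]

def focus_value(image):
    """Sum of absolute Laplacian responses over the interior pixels.

    `image` is a list of rows of grey-level values. A sharply focused
    image has strong edges and therefore a large focus value.
    """
    h, w = len(image), len(image[0])
    total = 0.0
    for y in range(1, h - 1):
        for x in range(1, w - 1):
            resp = sum(
                LAPLACIAN_3X3[dy + 1][dx + 1] * image[y + dy][x + dx]
                for dy in (-1, 0, 1)
                for dx in (-1, 0, 1)
            )
            total += abs(resp)
    return total
```

A uniform image contains no edges and yields a focus value of zero, while any intensity transition yields a positive contribution, so the value grows as the fluorescence image comes into focus.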
 Referring again to FIG. 6, the relational-expression calculation unit 42 shown in FIG. 5 next obtains, as a linear expression, the relational expression representing the relationship between the position of the focusing lens 71 and the focus value of the fluorescence image captured by the fluorescence image sensor 52, and evaluates the coefficient representing the slope of that expression (step S4). That is, it determines whether the slope is rising, falling, or impossible to determine.
 More specifically, when the relational expression is first obtained, the n focus values y_i (i = 1, 2, 3, ..., n) of the fluorescence images captured by the fluorescence image sensor 52 and the corresponding positions x_i (i = 1, 2, 3, ..., n) of the focusing lens 71 are fitted by the least-squares method to the linear function expressed by Equations 1 and 2 below.
[Equation 1]
$$S = \sum_{i=1}^{n} \left\{ y_i - \left( a x_i + b \right) \right\}^2$$

[Equation 2]
$$y = a x + b$$
 Here, the coefficient a, which indicates the slope of the linear expression shown in Equation 2, is calculated by the accuracy-index calculation unit 43 shown in FIG. 5 on the basis of Equation 3 below.
[Equation 3]
$$a = \frac{n \sum_{i=1}^{n} x_i y_i - \sum_{i=1}^{n} x_i \sum_{i=1}^{n} y_i}{n \sum_{i=1}^{n} x_i^2 - \left( \sum_{i=1}^{n} x_i \right)^2}$$
 The coefficient of determination R², which represents the accuracy of this fit, is expressed by Equations 4 and 5 below. The value of R² lies between 0 and 1; the closer it is to 1, the higher the statistical reliability.
[Equation 4]
$$R^2 = 1 - \frac{\sum_{i=1}^{n} \left( y_i - \hat{y}_i \right)^2}{\sum_{i=1}^{n} \left( y_i - \bar{y} \right)^2}$$

[Equation 5]
$$\hat{y}_i = a x_i + b, \qquad \bar{y} = \frac{1}{n} \sum_{i=1}^{n} y_i$$
 In the slope determination step (step S4), the slope is judged to be rising if the value of the coefficient of determination R² exceeds a predetermined threshold (for example, 0.8) and the coefficient a is positive; falling if R² exceeds the threshold and a is negative; and undeterminable if R² is equal to or less than the threshold.
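The slope-determination rule just described can be sketched as follows. This is a hedged sketch: the standard least-squares formulas for the slope a and the coefficient of determination R² are assumed, and the 0.8 threshold is the example value given in the text.

```python
# Sketch of step S4: fit focus value vs. lens position with least squares,
# gate the decision on the coefficient of determination R^2, then judge
# the slope sign. The R^2 threshold (0.8) is the example from the text.

def judge_slope(xs, ys, r2_threshold=0.8):
    """Return 'up', 'down', or 'undecidable' for lens positions xs
    and the corresponding focus values ys."""
    n = len(xs)
    mean_x = sum(xs) / n
    mean_y = sum(ys) / n
    sxx = sum((x - mean_x) ** 2 for x in xs)
    sxy = sum((x - mean_x) * (y - mean_y) for x, y in zip(xs, ys))
    a = sxy / sxx                       # slope of the fitted line y = a*x + b
    b = mean_y - a * mean_x
    ss_res = sum((y - (a * x + b)) ** 2 for x, y in zip(xs, ys))
    ss_tot = sum((y - mean_y) ** 2 for y in ys)
    r2 = 1.0 - ss_res / ss_tot if ss_tot > 0 else 0.0
    if r2 <= r2_threshold:              # fit too unreliable (noisy data)
        return "undecidable"
    return "up" if a > 0 else "down"
```

A clean monotonic series is classified by its slope sign, while a noisy series whose fit explains little of the variance is rejected as undecidable, which is how the accuracy index suppresses noise-driven focus decisions.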
 When the slope determination step (step S4) yields "falling", the direction of movement of the focusing lens 71 is reversed (step S3) and step S2 described above is executed again. When it yields "rising", the process proceeds to step S6 described later. When it yields "undeterminable", one of the n data points representing the relationship between the focus value y_i and the position x_i of the focusing lens 71 is discarded and the next data point is acquired, thereby updating the focus data (step S5), and step S4 is repeated until a determination becomes possible.
 When the slope determination step (step S4) yields "rising", one of the n data points representing the relationship between the focus value y_i and the position x_i of the focusing lens 71 is discarded and the next data point is acquired, thereby updating the focus data (step S6), after which the slope determination is executed again (step S7). If the result is "rising" or "undeterminable", one of the n data points is again discarded and the next one acquired, and steps S6 and S7 are repeated until a determination of "falling" is obtained.
 When a determination of "falling" is obtained, the lens position control unit 44 shown in FIG. 5 judges that the in-focus position exists in the region over which the slope changed from rising to falling. The focusing lens 71 is then moved to the position at which the focus value was maximal during steps S6 and S7, i.e., during the transition from rising to falling. The focusing lens 71 thereby reaches the in-focus position, and the focusing of the fluorescence image is complete.
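The overall search of steps S2 through S7 can be sketched as follows, with a hypothetical `measure(pos)` standing in for driving the motor to a lens position and computing the focus value of one captured frame. For brevity this sketch judges the slope by its least-squares sign alone; the R²-based reliability gate described above is omitted.

```python
# Sketch of the window-sliding search (steps S2-S7): keep a window of n
# (position, focus value) samples, slide it along the lens stroke, and once
# the fitted slope has gone from rising to falling, move to the sampled
# position with the largest focus value. `measure(pos)` is a hypothetical
# stand-in for moving the lens and evaluating one filtered frame; the R^2
# reliability gate from the text is omitted here for brevity.

def slope(samples):
    """Least-squares slope of focus value over lens position."""
    n = len(samples)
    mx = sum(p for p, _ in samples) / n
    my = sum(v for _, v in samples) / n
    num = sum((p - mx) * (v - my) for p, v in samples)
    den = sum((p - mx) ** 2 for p, _ in samples)
    return num / den

def find_focus(measure, positions, n=5):
    """Return the sampled position with the largest focus value once the
    window's slope has reversed from rising to falling."""
    window = [(p, measure(p)) for p in positions[:n]]
    best = max(window, key=lambda pv: pv[1])
    seen_rise = False
    for p in positions[n:]:
        s = slope(window)
        if seen_rise and s < 0:
            break                        # slope reversed: peak lies behind us
        if s > 0:
            seen_rise = True
        window.pop(0)                    # discard the oldest sample (step S6)
        sample = (p, measure(p))         # acquire the next sample
        window.append(sample)
        if sample[1] > best[1]:
            best = sample
    return best[0]
```

On a single-peaked focus curve the search stops shortly after passing the peak and returns the best sampled position; if the curve never falls, it simply returns the best position seen over the whole stroke.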
 In the embodiment described above, the slope determination is performed using a linear function, but other functions, such as a quadratic function, may also be used.
 The embodiment described above deals with the focusing of the fluorescence image, but the focusing result obtained for the fluorescence image may also be applied to the visible image. Alternatively, the fluorescence image and the visible image may be focused by separate optical systems.
 In the embodiment described above, the excitation light source 23 emits near-infrared light with a wavelength of 760 nm; however, any excitation light source 23 that emits near-infrared light of approximately 750 nm to 810 nm, capable of exciting indocyanine green so that it fluoresces, may be used.
 The embodiment described above uses indocyanine green as the material containing a fluorescent dye and irradiates it with 760 nm near-infrared light as excitation light, causing the indocyanine green to emit fluorescence in the near-infrared region with a peak at approximately 800 nm; however, light other than near-infrared light may also be used.
 For example, 5-ALA (5-aminolevulinic acid) can be used as the fluorescent dye. When 5-ALA is used, the 5-ALA that has entered the body of the patient 17 is converted into protoporphyrin (protoporphyrin IX/PpIX), which is a fluorescent substance. When this protoporphyrin is irradiated with visible light of approximately 400 nm, it emits red visible light as fluorescence. Therefore, when 5-ALA is used, an excitation light source that emits visible light with a wavelength of approximately 400 nm may be employed, and, as a confirmation light source, one that emits the red visible light emitted as fluorescence by the protoporphyrin may be used.
 10  main body
 11  input unit
 12  illumination/imaging unit
 13  arm
 14  display unit
 16  treatment table
 17  patient
 21  camera
 22  visible light source
 23  excitation light source
 30  control unit
 31  visible image processing unit
 32  fluorescence image processing unit
 33  image compositing unit
 41  processor
 42  relational-expression calculation unit
 43  accuracy-index calculation unit
 44  lens position control unit
 51  visible-light image sensor
 52  fluorescence image sensor
 71  focusing lens

Claims (4)

  1.  An imaging apparatus for capturing a fluorescence image of a subject, comprising:
     an excitation light source that irradiates the subject with an excitation beam for exciting a fluorescent dye injected into the subject;
     an image sensor capable of capturing the fluorescence image generated by the fluorescent dye when it is irradiated with the excitation beam;
     a focusing mechanism that focuses the fluorescence image generated by the fluorescent dye onto the image sensor by movement of a focusing lens;
     a motor for moving the focusing lens;
     a relational-expression calculation unit that moves the focusing lens by means of the motor and obtains a relational expression representing a relationship between the position of the focusing lens and the focus value of the fluorescence image captured by the image sensor;
     an accuracy-index calculation unit that calculates an accuracy index of the relational expression obtained by the relational-expression calculation unit; and
     a focusing-lens position control unit that, when the accuracy index calculated by the accuracy-index calculation unit is equal to or greater than a predetermined value, determines a region in which an in-focus position exists on the basis of the relational expression obtained by the relational-expression calculation unit and moves the focusing lens to a position within the region at which the focus value is maximized.
  2.  The imaging apparatus according to claim 1, wherein
     the relational expression is a linear expression, and
     the focusing-lens position control unit determines a region in which the slope of the straight line represented by the linear expression reverses to be the region in which the in-focus position exists.
  3.  The imaging apparatus according to claim 1, wherein
     the focus value is calculated by applying a sharpening filter to the fluorescence image captured by the image sensor.
  4.  An imaging method in which a subject is irradiated with an excitation beam for exciting a fluorescent dye injected into the subject and the fluorescence generated by the fluorescent dye is focused onto an image sensor by movement of a focusing lens to obtain an image of the excitation light, the method comprising:
     an excitation-light irradiation step of irradiating the subject with the excitation beam;
     a relational-expression calculation step of capturing fluorescence images generated by the fluorescent dye while moving the focusing lens and obtaining a relational expression representing a relationship between the position of the focusing lens and the focus value of the fluorescence image;
     an accuracy-index calculation step of calculating an accuracy index of the relational expression calculated in the relational-expression calculation step; and
     a focusing step of continuing the relational-expression calculation step until the accuracy index calculated in the accuracy-index calculation step reaches a set value and, when the accuracy index becomes equal to or greater than the set value, determining a region in which an in-focus position exists on the basis of the relational expression calculated in the relational-expression calculation step and moving the focusing lens to a position within the region at which the focus value is maximized.
PCT/JP2015/069607 2014-09-09 2015-07-08 Imaging device and imaging method WO2016039002A1 (en)

Applications Claiming Priority (2)

Application Number Priority Date Filing Date Title
JP2014183328 2014-09-09
JP2014-183328 2014-09-09

Publications (1)

Publication Number Publication Date
WO2016039002A1 true WO2016039002A1 (en) 2016-03-17

Family

ID=55458764

Family Applications (1)

Application Number Title Priority Date Filing Date
PCT/JP2015/069607 WO2016039002A1 (en) 2014-09-09 2015-07-08 Imaging device and imaging method

Country Status (1)

Country Link
WO (1) WO2016039002A1 (en)

Citations (3)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
JP2001036799A (en) * 1999-07-23 2001-02-09 Mitsubishi Electric Corp Method and device for adjusting position of optical lens for fixed focus type image pickup device and computer readable recording medium storage program concerned with the method
JP2012154896A (en) * 2011-01-28 2012-08-16 Jikei Univ Trace light imaging device and trace light imaging method
JP2013109271A (en) * 2011-11-24 2013-06-06 Keyence Corp Image processor, focus adjustment method, and computer program

Cited By (2)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US20180220894A1 (en) * 2017-02-07 2018-08-09 Shimadzu Corporation Time intensity curve measuring apparatus
US11937898B2 (en) * 2017-02-07 2024-03-26 Shimadzu Corporation Time intensity curve measuring apparatus

Similar Documents

Publication Publication Date Title
JP6319448B2 (en) Imaging device
JP2011193983A5 (en)
JP6319449B2 (en) Imaging device
JP6758287B2 (en) Control device and medical imaging system
JP2018128294A (en) Area of interest tracking device
KR101992016B1 (en) fundus fluorescence image acquisition apparatus with optical source and focus automatic control function, and method thereof
WO2016035450A1 (en) Imaging device
JP6485275B2 (en) Imaging device
JP6556466B2 (en) Laser therapy device
WO2016039002A1 (en) Imaging device and imaging method
US11931008B2 (en) Treatment support device and method of setting region of interest
JP6708143B2 (en) Time intensity curve measuring device
JP2018128759A (en) Shininess removing device
KR101959394B1 (en) Imaging device for multi-exposure laser speckle image
JP6432533B2 (en) Imaging device
JP2018061708A (en) Imaging device
JP6711638B2 (en) Ophthalmic equipment
JP2019048111A (en) Imaging method
JP7200755B2 (en) IMAGING DEVICE AND METHOD OF OPERATION OF IMAGING DEVICE
JP2018195247A (en) Image processing apparatus and imaging apparatus
WO2016039001A1 (en) Imaging device
JP2012225826A (en) Interference light measuring apparatus
JP2019066398A (en) Imaging device
US20230076477A1 (en) Imaging device and imaging method
JP2018124664A (en) Image processing device and imaging device

Legal Events

Date Code Title Description
121 Ep: the epo has been informed by wipo that ep was designated in this application

Ref document number: 15840609

Country of ref document: EP

Kind code of ref document: A1

NENP Non-entry into the national phase

Ref country code: DE

122 Ep: pct application non-entry in european phase

Ref document number: 15840609

Country of ref document: EP

Kind code of ref document: A1

NENP Non-entry into the national phase

Ref country code: JP