WO2016039002A1 - Imaging device and imaging method - Google Patents

Imaging device and imaging method

Info

Publication number
WO2016039002A1
WO2016039002A1 PCT/JP2015/069607
Authority
WO
WIPO (PCT)
Prior art keywords
relational expression
focusing lens
image
focus
accuracy index
Prior art date
Application number
PCT/JP2015/069607
Other languages
English (en)
Japanese (ja)
Inventor
足立 晋
Original Assignee
株式会社島津製作所
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by 株式会社島津製作所
Publication of WO2016039002A1

Links

Images

Classifications

    • HELECTRICITY
    • H04ELECTRIC COMMUNICATION TECHNIQUE
    • H04NPICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N23/00Cameras or camera modules comprising electronic image sensors; Control thereof
    • H04N23/60Control of cameras or camera modules
    • H04N23/67Focus control based on electronic image sensor signals
    • AHUMAN NECESSITIES
    • A61MEDICAL OR VETERINARY SCIENCE; HYGIENE
    • A61BDIAGNOSIS; SURGERY; IDENTIFICATION
    • A61B10/00Other methods or instruments for diagnosis, e.g. instruments for taking a cell sample, for biopsy, for vaccination diagnosis; Sex determination; Ovulation-period determination; Throat striking implements
    • GPHYSICS
    • G01MEASURING; TESTING
    • G01NINVESTIGATING OR ANALYSING MATERIALS BY DETERMINING THEIR CHEMICAL OR PHYSICAL PROPERTIES
    • G01N21/00Investigating or analysing materials by the use of optical means, i.e. using sub-millimetre waves, infrared, visible or ultraviolet light
    • G01N21/62Systems in which the material investigated is excited whereby it emits light or causes a change in wavelength of the incident light
    • G01N21/63Systems in which the material investigated is excited whereby it emits light or causes a change in wavelength of the incident light optically excited
    • G01N21/64Fluorescence; Phosphorescence

Definitions

  • the present invention relates to an imaging apparatus and an imaging method for photographing fluorescence generated from a fluorescent dye by irradiating a fluorescent dye injected into a subject with excitation light.
  • In such imaging, indocyanine green (ICG) is used as a fluorescent dye and is injected into the affected area.
  • When irradiated with near-infrared excitation light of 750 to 810 nm (nanometers), indocyanine green emits fluorescence in the near-infrared region having a peak at about 800 nm.
  • This fluorescence is photographed by a camera capable of detecting near-infrared light, and the image is displayed on a display unit such as a liquid crystal display panel.
  • Patent Document 1 discloses a data collection method in which a near-infrared fluorescence intensity distribution image, obtained by irradiating excitation light to a living organ to which indocyanine green has been administered, is compared with a cancer lesion distribution image obtained by applying X-rays, nuclear magnetic resonance, or ultrasound to the same organ before administration; data of regions that are detected in the near-infrared fluorescence intensity distribution image but not in the cancer lesion distribution image are collected as secondary cancer lesion data.
  • In another prior-art technique, excitation light and visible light are alternately irradiated on a subject to which an angiographic contrast agent has been administered, and a fluorescence image and a visible image are alternately acquired by an imaging unit.
  • In this surgical support method, the fluorescence image is subjected to threshold processing with a predetermined threshold to extract a blood-vessel image, and a composite image in which the visible image and the extracted blood-vessel image are superimposed is created.
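  • The threshold processing described above can be sketched in a few lines. This is a minimal illustration, not the patented method's actual implementation; the function name, the list-of-lists image representation, and the threshold value are assumptions made for the example:

```python
def extract_vessels(fluor, threshold):
    # 1 where the fluorescence intensity reaches the threshold, else 0;
    # the resulting binary mask approximates the blood-vessel image
    return [[1 if px >= threshold else 0 for px in row] for row in fluor]

fluor = [[0.1, 0.9],
         [0.7, 0.2]]
print(extract_vessels(fluor, 0.5))  # [[0, 1], [1, 0]]
```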
  • Patent Document 3 discloses a hill-climbing autofocus device in which a lens is driven in an arbitrary direction from a focusing-operation start point, contrast values are obtained at n points along the way, and the initial drive direction is determined by majority decision, comparing the contrast values at the n points with the contrast value at the start point.
  • In other words, the autofocus device described in Patent Document 3 calculates a focus value, a parameter corresponding to contrast, from a captured image and moves the focusing lens to the position where the focus value is maximized.
  • Fluorescence from indocyanine green is very weak compared to visible light. For this reason, when this fluorescence is captured by an image sensor, the sensor's noise component occupies a large proportion of the fluorescence image signal.
  • FIG. 8 is a graph schematically showing the relationship between the rotational position of the motor for moving the focusing lens and the focus value at that time.
  • In FIG. 8, the vertical axis and the horizontal axis are in arbitrary units (a.u.).
  • The present invention has been made to solve the above-described problem, and an object thereof is to provide an imaging apparatus capable of accurate focusing even when photographing fluorescence.
  • An imaging apparatus according to the present invention for capturing a fluorescence image of a subject comprises: an excitation light source that irradiates the subject with an excitation light beam for exciting a fluorescent dye injected into the subject; an imaging element capable of capturing a fluorescence image generated from the fluorescent dye when it is irradiated with the excitation light beam; a focusing mechanism that focuses the fluorescence generated from the fluorescent dye on the imaging element by movement of a focusing lens; a motor for moving the focusing lens; a relational expression calculation unit that, while the focusing lens is moved by the motor, obtains a relational expression representing the relationship between the position of the focusing lens and the focus value of the fluorescence image captured by the imaging element; an accuracy index calculation unit that calculates an accuracy index of the relational expression obtained by the relational expression calculation unit; and a focusing lens position control unit that, when the accuracy index calculated by the accuracy index calculation unit is equal to or greater than a predetermined value, determines the region where the in-focus position exists based on the relational expression and moves the focusing lens to the position in this region where the focus value is maximized.
  • In one aspect, the relational expression is a linear expression, and the focusing lens position control unit determines that the region where the slope of the straight line represented by the linear expression inverts is the region where the in-focus position exists.
  • In one aspect, the focus value is calculated by applying a sharpening filter to the fluorescence image captured by the imaging element.
  • An imaging method according to the present invention, in which an excitation light beam for exciting a fluorescent dye injected into a subject is irradiated toward the subject and the fluorescence generated from the fluorescent dye is focused on an imaging element by moving a focusing lens, comprises: an excitation light irradiation step of irradiating the subject with the excitation light beam; a relational expression calculation step of obtaining, while moving the focusing lens, a relational expression representing the relationship between the position of the focusing lens and the focus value of the fluorescence image generated from the fluorescent dye; and an accuracy index calculation step of calculating an accuracy index of the relational expression obtained in the relational expression calculation step.
  • The relational expression calculation step is executed repeatedly until the accuracy index calculated in the accuracy index calculation step reaches a set value; once the accuracy index is equal to or greater than the set value, the region where the in-focus position exists is determined based on the relational expression, and the focusing lens is moved to the position in this region where the focus value is maximized.
  • the focus value can be easily calculated by the sharpening filter.
  • FIG. 1 is a schematic diagram of an imaging apparatus according to the present invention.
  • FIG. 2 is a perspective view of an illumination/photographing unit 12.
  • FIG. 3 is a schematic diagram of a camera 21.
  • FIG. 4 is a block diagram showing the main control system of the imaging apparatus according to the present invention.
  • FIG. 5 is a functional block diagram of a fluorescence image processing unit 32.
  • FIG. 6 is a flowchart showing the imaging method according to the present invention.
  • FIG. 7 is an explanatory diagram of a Laplacian filter.
  • FIG. 8 is a graph schematically showing the relationship between the rotational position of the motor for moving the focusing lens and the focus value at that time.
  • FIG. 1 is a schematic diagram of an imaging apparatus according to the present invention.
  • The imaging apparatus comprises a main body 10 that has an input unit 11 such as a touch panel and incorporates a control unit 30 and the like described later, an illumination/photographing unit 12 movably supported by an arm 13, a display unit 14 such as a liquid crystal display panel, and a treatment table 16 on which a patient 17 is placed.
  • the illumination / imaging unit 12 is not limited to the one supported by the arm 13, and may be carried by the surgeon in hand.
  • the display unit 14 may be a glasses-type wearable device instead of a liquid crystal display panel.
  • FIG. 2 is a perspective view of the illumination / photographing unit 12 described above.
  • The illumination/photographing unit 12 includes a camera 21 capable of detecting near-infrared light and visible light, a visible light source 22 composed of a large number of LEDs arranged on the outer periphery of the camera 21, and an excitation light source 23 composed of a large number of LEDs arranged on the outer periphery of the visible light source 22.
  • the visible light source 22 emits visible light.
  • The excitation light source 23 emits near-infrared light having a wavelength of 760 nm, which serves as excitation light for exciting indocyanine green.
  • FIG. 3 is a schematic diagram of the camera 21.
  • the camera 21 includes a focusing lens 71 that reciprocates for focusing, a wavelength selection filter 53, a visible light image sensor 51, and a fluorescence image sensor 52.
  • the visible light image sensor 51 and the fluorescence image sensor 52 are composed of a CMOS or a CCD.
  • Visible light and fluorescence enter the camera 21 coaxially along the optical axis L, pass through the focusing lens 71 constituting the focusing mechanism, and reach the wavelength selection filter 53.
  • Of the coaxially incident visible light and fluorescence, the visible light is reflected by the wavelength selection filter 53 and enters the visible light image sensor 51.
  • The fluorescence passes through the wavelength selection filter 53 and enters the fluorescence image sensor 52. By the action of the focusing mechanism including the focusing lens 71, the visible light is focused on the visible light image sensor 51 and the fluorescence is focused on the fluorescence image sensor 52.
  • FIG. 4 is a block diagram showing a main control system of the imaging apparatus according to the present invention.
  • This imaging apparatus comprises a control unit 30 that controls the entire apparatus and is composed of a CPU that performs logical operations, a ROM that stores the operation programs necessary for controlling the apparatus, a RAM that temporarily stores data during control, and the like.
  • The control unit 30 includes a visible image processing unit 31 for processing the visible image, a fluorescence image processing unit 32 for processing the fluorescence image generated from indocyanine green, and an image synthesizing unit 33 that synthesizes the visible image and the fluorescence image.
  • the control unit 30 is connected to the input unit 11 and the display unit 14 described above.
  • the control unit 30 is connected to an illumination / photographing unit 12 including a camera 21, a visible light source 22, and an excitation light source 23. Further, the control unit 30 is connected to a motor 39 for moving the focusing lens 71 described above.
  • the visible image processing unit 31 in the control unit 30 transmits a control signal to the visible light source 22 and the visible light imaging device 51 and receives an image signal of a visible image from the visible light imaging device 51.
  • the fluorescence image processing unit 32 in the control unit 30 transmits a control signal to the excitation light source 23 and the fluorescence imaging element 52, receives an image signal of the fluorescence image from the fluorescence imaging element 52, and The rotation of the motor 39 that moves the focusing lens 71 is controlled.
  • When performing an operation using the imaging apparatus having the above-described configuration, indocyanine green is first injected into the patient 17 lying supine on the treatment table 16. Then, near-infrared light is emitted from the excitation light source 23 toward the subject including the affected part, and visible light is emitted from the visible light source 22. As described above, the near-infrared light emitted from the excitation light source 23 is 760 nm light acting as excitation light that causes indocyanine green to emit fluorescence. As a result, the indocyanine green generates fluorescence in the near-infrared region having a peak at about 800 nm.
  • the camera 21 captures the vicinity of the affected part of the patient 17.
  • the camera 21 can detect fluorescence and visible light.
  • An image obtained by irradiating the patient 17 with visible light and photographing with the camera 21 is a visible image, while an image obtained by irradiating the patient 17 with near-infrared light and photographing the fluorescence from indocyanine green with the camera 21 is a fluorescence image.
  • a fluorescent image captured by the camera 21 at a predetermined frame rate is processed by the fluorescent image processing unit 32
  • a visible image captured by the camera 21 at a predetermined frame rate is processed by the visible image processing unit 31.
  • the processed fluorescence image data and visible image data are synthesized by the image synthesis unit 33, and a synthesized image in which the visible image and the fluorescence image are fused is created.
  • On the display unit 14, the fluorescence image, the visible image, and the composite image are displayed simultaneously in divided areas, or selectively.
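  • The patent does not detail how the image synthesizing unit 33 fuses the two images; a common approach is a per-pixel weighted blend, sketched below on grayscale values. The function name and the fixed weight alpha are illustrative assumptions, not the patented implementation:

```python
def fuse(visible, fluor, alpha=0.5):
    # Per-pixel weighted blend of a grayscale visible image and a
    # fluorescence image of the same size (alpha weights the overlay)
    return [[(1 - alpha) * v + alpha * f
             for v, f in zip(vrow, frow)]
            for vrow, frow in zip(visible, fluor)]

vis = [[100, 100]]
flu = [[0, 200]]
print(fuse(vis, flu))  # [[50.0, 150.0]]
```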
  • noise is superimposed on an image captured by the visible light image sensor 51 and the fluorescence image sensor 52.
  • The presence of this noise becomes a problem particularly when photographing the fluorescence from indocyanine green, which is very weak compared to visible light: the amplification factor of the signal from the fluorescence image sensor 52 must be increased, so the noise affects the focusing operation.
  • For this reason, in the imaging apparatus according to the present invention, a relational expression representing the relationship between the position of the focusing lens 71 and the focus value of the fluorescence image captured by the fluorescence image sensor 52 is calculated by a relational expression calculation unit.
  • An accuracy index of the relational expression is then calculated by an accuracy index calculation unit; when this accuracy index is equal to or greater than a predetermined value, the region where the in-focus position exists is determined based on the relational expression, and the focusing lens 71 is moved to the position in this region where the focus value is maximized.
  • FIG. 5 is a functional block diagram of the fluorescent image processing unit 32 described above.
  • The fluorescence image processing unit 32 includes an arithmetic unit 41 that executes the various calculations described later, an image memory 48 for storing fluorescence images, a motor driver 49 for driving the motor 39, and a bus controller 46.
  • the computing unit 41, the image memory 48, and the bus controller 46 are connected by a data bus 47.
  • The arithmetic unit 41 includes a relational expression calculation unit 42 that obtains a relational expression representing the relationship between the position of the focusing lens 71, moved by the motor 39, and the focus value of the fluorescence image captured by the fluorescence image sensor 52, and an accuracy index calculation unit 43 that calculates an accuracy index of that relational expression.
  • When the accuracy index calculated by the accuracy index calculation unit 43 is equal to or greater than a predetermined value, a focusing lens position control unit 44 determines the region where the in-focus position exists based on the relational expression and moves the focusing lens 71 to the position in this region where the focus value is maximized.
  • FIG. 6 is a flowchart showing a focusing operation at the time of photographing a fluorescent image by the imaging apparatus according to the present invention.
  • First, the motor 39 shown in FIG. 4 is driven to start moving the focusing lens 71 shown in FIG. 3 from its current position toward the farther of the two ends of its stroke (step S1).
  • Next, while the focusing lens 71 is moved in that direction, focus values are acquired at n positions (step S2).
  • Each focus value is calculated by applying a sharpening filter to the fluorescence image captured by the fluorescence image sensor 52 in the camera 21.
  • FIG. 7 is an explanatory diagram of a Laplacian filter as a sharpening filter.
  • a 3 ⁇ 3 Laplacian filter is used as the sharpening filter.
  • The Laplacian filter is a kind of sharpening filter: it extracts edge information from the captured fluorescence image.
  • The focus value at each lens position is calculated by applying the Laplacian filter shown in FIG. 7 to each pixel of the fluorescence image captured by the fluorescence image sensor 52.
  • the coefficient and matrix size of the Laplacian filter can be changed as appropriate.
  • another sharpening filter may be used instead of the Laplacian filter.
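  • As an illustration of how such a focus value might be computed, the sketch below convolves a common 3×3 Laplacian kernel over the image and sums the absolute responses. The kernel coefficients and the use of a sum of absolute responses are common choices for a contrast-based focus measure, not necessarily those of the patent (which notes the coefficients and matrix size can be changed):

```python
# One common 3x3 Laplacian kernel (coefficients are illustrative)
LAPLACIAN = [[0,  1, 0],
             [1, -4, 1],
             [0,  1, 0]]

def focus_value(img):
    """Focus value: sum of absolute Laplacian responses over interior pixels."""
    h, w = len(img), len(img[0])
    total = 0.0
    for y in range(1, h - 1):
        for x in range(1, w - 1):
            resp = sum(LAPLACIAN[dy + 1][dx + 1] * img[y + dy][x + dx]
                       for dy in (-1, 0, 1) for dx in (-1, 0, 1))
            total += abs(resp)
    return total

flat  = [[5, 5, 5, 5]] * 4          # uniform region: no edges
edged = [[0, 0, 9, 9]] * 4          # sharp vertical edge
print(focus_value(flat), focus_value(edged))  # 0.0 36.0
```

A sharper image produces stronger edge responses and therefore a larger focus value, which is what the hill-climbing search maximizes.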
  • Next, the coefficient representing the slope in this relational expression is determined (step S4); that is, it is determined whether the slope is rising, falling, or cannot be determined.
  • Specifically, the n focus values y_i (i = 1, 2, 3, …, n) of the fluorescence images captured by the fluorescence image sensor 52 and the corresponding positions x_i of the focusing lens 71 are fitted to a linear function y = ax + b by the least-squares method (Formulas 1 and 2).
  • The coefficient of determination R², which represents the accuracy of this fit, is expressed by Formulas 4 and 5.
  • The value of the coefficient of determination R² lies between 0 and 1; the closer it is to 1, the higher the statistical confidence in the fit.
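  • The referenced formulas are not reproduced above; for reference, the standard least-squares fit of the line y = ax + b and the standard coefficient of determination, which the description appears to use, are (these are the textbook definitions and may differ in arrangement from the patent's Formulas 1, 2, 4, and 5):

```latex
a = \frac{\sum_{i=1}^{n}(x_i-\bar{x})(y_i-\bar{y})}{\sum_{i=1}^{n}(x_i-\bar{x})^2},
\qquad
b = \bar{y} - a\,\bar{x},
\qquad
R^2 = 1 - \frac{\sum_{i=1}^{n}\bigl(y_i-(a x_i + b)\bigr)^2}{\sum_{i=1}^{n}(y_i-\bar{y})^2}
```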
  • In step S4, when the value of the coefficient of determination R² is equal to or greater than a predetermined threshold (e.g., 0.8) and the coefficient a is positive, the slope is determined to be rising; when R² is equal to or greater than the threshold and the coefficient a is negative, the slope is determined to be falling; and when R² is below the threshold, the slope is determined to be indeterminable.
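  • The fit and the three-way decision of step S4 can be sketched as follows. This is a minimal illustration (the patent gives no code); the function names are ours, and 0.8 is the example threshold mentioned above:

```python
def fit_line(xs, ys):
    # Least-squares fit of y = a*x + b, plus the coefficient of determination R^2
    n = len(xs)
    mx, my = sum(xs) / n, sum(ys) / n
    sxx = sum((x - mx) ** 2 for x in xs)
    sxy = sum((x - mx) * (y - my) for x, y in zip(xs, ys))
    a = sxy / sxx
    b = my - a * mx
    ss_res = sum((y - (a * x + b)) ** 2 for x, y in zip(xs, ys))
    ss_tot = sum((y - my) ** 2 for y in ys)
    r2 = 1.0 - ss_res / ss_tot
    return a, b, r2

def classify_slope(xs, ys, r2_threshold=0.8):
    # Step-S4-style decision: trust the slope only when the fit is reliable
    a, _, r2 = fit_line(xs, ys)
    if r2 < r2_threshold:
        return "indeterminable"
    return "rising" if a > 0 else "falling"

xs = [1, 2, 3, 4, 5]
print(classify_slope(xs, [2, 4, 6, 8, 10]))  # rising (perfect fit, R^2 = 1)
print(classify_slope(xs, [9, 2, 8, 1, 7]))   # indeterminable (noisy, low R^2)
```

Gating on R² is what keeps a noise-dominated window of focus values from triggering a spurious direction decision.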
  • When "falling" is determined in the slope determination step (step S4), the moving direction of the focusing lens 71 is reversed (step S3) and step S2 is executed again. When "rising" is determined, the process proceeds to step S6 described later. When "indeterminable" is determined, one of the n data points relating the focus value y_i to the focusing-lens position x_i is discarded, the next data point is acquired to update the data set (step S5), and step S4 is repeated until a determination becomes possible.
  • When "rising" is determined in the slope determination step (step S4), one of the n data points relating the focus value y_i to the focusing-lens position x_i is discarded, the next data point is acquired to update the data set (step S6), and the slope determination is performed again (step S7).
  • While "rising" or "indeterminable" is determined in step S7, one of the n data points is discarded and the next is acquired, and steps S6 and S7 are repeated until "falling" is obtained.
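  • Taken together, steps S1 through S7 amount to sliding a window of n samples along the lens stroke until the fitted slope inverts, which brackets the in-focus position. The sketch below is a simplified model of that behavior: it omits the R² gating described above, assumes a noise-free synthetic focus curve, and all names and parameters are ours:

```python
def slope(xs, ys):
    # Least-squares slope of ys against xs
    n = len(xs)
    mx, my = sum(xs) / n, sum(ys) / n
    num = sum((x - mx) * (y - my) for x, y in zip(xs, ys))
    den = sum((x - mx) ** 2 for x in xs)
    return num / den

def autofocus(measure, start, step, n=5, lo=0, hi=100):
    # measure(pos) -> focus value at lens position pos
    pos, xs, ys = start, [], []
    for _ in range(n):                      # initial window of n samples
        xs.append(pos); ys.append(measure(pos)); pos += step
    if slope(xs, ys) < 0:                   # started past the peak: reverse
        step, xs, ys = -step, [], []
        for _ in range(n):
            xs.append(pos); ys.append(measure(pos)); pos += step
    best = max(zip(ys, xs))
    while slope(xs, ys) > 0 and lo <= pos <= hi:
        xs.pop(0); ys.pop(0)                # discard oldest sample, acquire next
        xs.append(pos); ys.append(measure(pos)); pos += step
        best = max(best, max(zip(ys, xs)))
    return best[1]                          # position of the maximum focus value

# Synthetic focus curve peaking at lens position 60
print(autofocus(lambda p: -(p - 60) ** 2, start=10, step=5))  # 60
```

Fitting a line over the whole window, rather than comparing adjacent samples as in plain hill climbing, is what makes the direction decision robust to the sensor noise discussed around FIG. 8.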
  • In the embodiment described above, the slope determination is performed using a linear function; however, other functions such as a quadratic function may be used.
  • In the embodiment described above, focusing of the fluorescence image has been described; the focusing result obtained for the fluorescence image may also be used for the visible image.
  • the fluorescent image and the visible image may be focused by separate optical systems.
  • In the embodiment described above, the excitation light source 23 emits near-infrared light having a wavelength of 760 nm; however, any light source that irradiates near-infrared light of about 750 nm to 810 nm, capable of exciting indocyanine green so that it generates fluorescence, can be used.
  • In the embodiment described above, indocyanine green is used as the material containing a fluorescent dye, and irradiating it with 760 nm near-infrared light as excitation light produces fluorescence of approximately 800 nm from the indocyanine green; however, light other than near-infrared light may be used.
  • For example, 5-ALA (5-aminolevulinic acid) can be used as the fluorescent dye.
  • When 5-ALA is used, the 5-ALA that has entered the body of the patient 17 changes into protoporphyrin IX (PpIX), a fluorescent substance.
  • When visible light of about 400 nm is irradiated toward this protoporphyrin, red visible light is emitted from it as fluorescence. Therefore, when 5-ALA is used, an excitation light source that emits visible light having a wavelength of about 400 nm may be used, and the red visible light emitted as fluorescence from the protoporphyrin is what is observed.

Landscapes

  • Health & Medical Sciences (AREA)
  • Engineering & Computer Science (AREA)
  • Life Sciences & Earth Sciences (AREA)
  • General Health & Medical Sciences (AREA)
  • Multimedia (AREA)
  • Signal Processing (AREA)
  • Pathology (AREA)
  • Physics & Mathematics (AREA)
  • Biomedical Technology (AREA)
  • Biochemistry (AREA)
  • Chemical & Material Sciences (AREA)
  • General Physics & Mathematics (AREA)
  • Immunology (AREA)
  • Nuclear Medicine, Radiotherapy & Molecular Imaging (AREA)
  • Analytical Chemistry (AREA)
  • Heart & Thoracic Surgery (AREA)
  • Medical Informatics (AREA)
  • Molecular Biology (AREA)
  • Surgery (AREA)
  • Animal Behavior & Ethology (AREA)
  • Public Health (AREA)
  • Veterinary Medicine (AREA)
  • Investigating, Analyzing Materials By Fluorescence Or Luminescence (AREA)

Abstract

 A relational expression representing the relationship between the position of a focusing lens and a focus value of a fluorescence image captured by a fluorescence imaging element is calculated by a relational expression calculation unit 42. An accuracy index of the relational expression calculated by the relational expression calculation unit 42 is calculated by an accuracy index calculation unit 43. When the accuracy index calculated by the accuracy index calculation unit 43 is equal to or greater than a predetermined value, a region in which the in-focus position exists is determined by a lens position control unit 44 based on the relational expression, and the focusing lens is moved to the position in that region where the focus value is maximized.
PCT/JP2015/069607 2014-09-09 2015-07-08 Dispositif d'imagerie et méthode d'imagerie WO2016039002A1 (fr)

Applications Claiming Priority (2)

Application Number Priority Date Filing Date Title
JP2014-183328 2014-09-09
JP2014183328 2014-09-09

Publications (1)

Publication Number Publication Date
WO2016039002A1 (fr) 2016-03-17

Family

ID=55458764

Family Applications (1)

Application Number Title Priority Date Filing Date
PCT/JP2015/069607 WO2016039002A1 (fr) 2014-09-09 2015-07-08 Dispositif d'imagerie et méthode d'imagerie

Country Status (1)

Country Link
WO (1) WO2016039002A1 (fr)



Patent Citations (3)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
JP2001036799A (ja) * 1999-07-23 2001-02-09 Mitsubishi Electric Corp Optical lens position adjustment method for a fixed-focus imaging device, computer-readable recording medium recording a program for the method, and optical lens position adjustment apparatus for a fixed-focus imaging device
JP2012154896A (ja) * 2011-01-28 2012-08-16 Jikei Univ Faint-light imaging apparatus and faint-light imaging method
JP2013109271A (ja) * 2011-11-24 2013-06-06 Keyence Corp Image processing apparatus, focus adjustment method, and computer program

Cited By (2)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US20180220894A1 (en) * 2017-02-07 2018-08-09 Shimadzu Corporation Time intensity curve measuring apparatus
US11937898B2 (en) * 2017-02-07 2024-03-26 Shimadzu Corporation Time intensity curve measuring apparatus

Similar Documents

Publication Publication Date Title
JP6319448B2 (ja) Imaging apparatus
JP2011193983A5 (fr)
JP6319449B2 (ja) Imaging apparatus
JP6758287B2 (ja) Control device and medical imaging system
JP2018128294A (ja) Region-of-interest tracking device
KR101992016B1 (ko) Fundus fluorescence image acquisition apparatus and method with automatic light source and focus control
JP6485275B2 (ja) Imaging apparatus
WO2016035450A1 (fr) Imaging device
JP6556466B2 (ja) Laser treatment device
WO2016039002A1 (fr) Imaging device and imaging method
US11931008B2 (en) Treatment support device and method of setting region of interest
JP6708143B2 (ja) Time intensity curve measuring device
JP2018128759A (ja) Shine-region removal device
KR101959394B1 (ko) Multi-exposure laser speckle image acquisition apparatus
JP6432533B2 (ja) Imaging apparatus
JP2018061708A (ja) Imaging apparatus
JP6711638B2 (ja) Ophthalmic apparatus
JP2019048111A (ja) Imaging method
JP7200755B2 (ja) Imaging apparatus and method of operating imaging apparatus
JP2018195247A (ja) Image processing device and imaging device
JP6480769B2 (ja) Imaging device, imaging system, and support member used in imaging device
WO2016039001A1 (fr) Imaging device
JP2012225826A (ja) Interference light measuring device
JP2019066398A (ja) Imaging apparatus
US20230076477A1 (en) Imaging device and imaging method

Legal Events

Date Code Title Description
121 Ep: the epo has been informed by wipo that ep was designated in this application

Ref document number: 15840609

Country of ref document: EP

Kind code of ref document: A1

NENP Non-entry into the national phase

Ref country code: DE

122 Ep: pct application non-entry in european phase

Ref document number: 15840609

Country of ref document: EP

Kind code of ref document: A1

NENP Non-entry into the national phase

Ref country code: JP