WO2013114884A1 - Program for identifying the position of a subject, device for identifying the position of a subject, and camera - Google Patents

Program for identifying the position of a subject, device for identifying the position of a subject, and camera

Info

Publication number
WO2013114884A1
WO2013114884A1 (PCT/JP2013/000530)
Authority
WO
WIPO (PCT)
Prior art keywords
subject position
evaluation value
specifying
procedure
target image
Application number
PCT/JP2013/000530
Other languages
English (en)
Japanese (ja)
Inventor
啓之 阿部
Original Assignee
株式会社ニコン (Nikon Corporation)
Application filed by 株式会社ニコン (Nikon Corporation)
Publication of WO2013114884A1

Classifications

    • G - PHYSICS
    • G06 - COMPUTING; CALCULATING OR COUNTING
    • G06T - IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T 7/00 - Image analysis
    • G06T 7/70 - Determining position or orientation of objects or cameras
    • G06T 7/73 - Determining position or orientation of objects or cameras using feature-based methods
    • G - PHYSICS
    • G02 - OPTICS
    • G02B - OPTICAL ELEMENTS, SYSTEMS OR APPARATUS
    • G02B 7/00 - Mountings, adjusting means, or light-tight connections, for optical elements
    • G02B 7/28 - Systems for automatic generation of focusing signals
    • G02B 7/36 - Systems for automatic generation of focusing signals using image sharpness techniques, e.g. image processing techniques for generating autofocus signals
    • G - PHYSICS
    • G06 - COMPUTING; CALCULATING OR COUNTING
    • G06T - IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T 2207/00 - Indexing scheme for image analysis or image enhancement
    • G06T 2207/10 - Image acquisition modality
    • G06T 2207/10024 - Color image
    • G - PHYSICS
    • G06 - COMPUTING; CALCULATING OR COUNTING
    • G06T - IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T 2207/00 - Indexing scheme for image analysis or image enhancement
    • G06T 2207/20 - Special algorithmic details
    • G06T 2207/20021 - Dividing image into blocks, subimages or windows

Definitions

  • The present invention relates to a subject position specifying program, a subject position specifying device, and a camera.
  • A known imaging apparatus performs focus adjustment processing on an AF area selected by the user (see, for example, Patent Document 1).
  • The subject position specifying program causes a computer to execute: a division procedure for dividing a single image into a plurality of divided images based on color information or luminance information of the target image; a binarization procedure for binarizing each of the plurality of divided images using the color information or luminance information to generate a plurality of binarized images; a first evaluation value calculation procedure for calculating, for each of the plurality of binarized images, a first evaluation value used for specifying a subject position in the target image; a first subject position specifying procedure for specifying a subject position in the target image based on the first evaluation value; a second evaluation value calculation procedure for calculating, for each of the plurality of binarized images, a second evaluation value used for specifying the subject position in the target image again, based on the subject position specified in the first subject position specifying procedure; and a second subject position specifying procedure for specifying the subject position in the target image again based on the second evaluation value.
  • The subject position in the target image may be specified again based on both the second evaluation value and the first evaluation value.
  • The first evaluation value may be calculated based on at least a value related to the area of a white pixel region composed of white pixels in the binarized image and a value related to the distance between the white pixel region and a predetermined reference region.
  • The second evaluation value may be calculated based on at least the value related to the area and a value related to the distance between the white pixel region and a region based on the subject position specified in the first subject position specifying procedure.
  • Another subject position specifying program causes a computer to execute: a binarization procedure for generating a binarized image by binarizing pixels with a predetermined threshold value; a first determination procedure for determining, for first pixels exceeding the predetermined threshold value, the degree of clustering of the first pixels with reference to a predetermined position in the binarized image; a second determination procedure for determining the degree of clustering of the first pixels with reference to a predetermined position in the cluster of first pixels found by the first determination procedure; and a determination procedure for determining a subject position based on the determination result of the second determination procedure.
  • A subject position specifying device includes: a division unit that divides a single image into a plurality of divided images based on color information or luminance information of the target image; a binarization unit that binarizes each of the plurality of divided images using the color information or luminance information to generate a plurality of binarized images; a first evaluation value calculation unit that calculates, for each of the plurality of binarized images, a first evaluation value used to specify a subject position in the target image; a first subject position specifying unit that specifies a subject position in the target image based on the first evaluation value; a second evaluation value calculation unit that calculates, for each of the plurality of binarized images, a second evaluation value used to specify the subject position in the target image again, based on the subject position specified by the first subject position specifying unit; and a second subject position specifying unit that specifies the subject position again based on the second evaluation value.
  • The camera according to the present invention executes: a division procedure for dividing a single image into a plurality of divided images based on color information or luminance information of the target image; a binarization procedure for binarizing each of the plurality of divided images using the color information or luminance information to generate a plurality of binarized images; a first evaluation value calculation procedure for calculating, for each of the plurality of binarized images, a first evaluation value used for specifying a subject position in the target image; a first subject position specifying procedure for specifying a subject position in the target image based on the first evaluation value; and a second evaluation value calculation procedure for calculating, for each of the plurality of binarized images, a second evaluation value used for specifying the subject position in the target image again, based on the subject position specified in the first subject position specifying procedure.
  • FIG. 1 is a block diagram showing the configuration of a camera according to an embodiment.
  • The camera 100 includes an operation member 101, a lens 102, an image sensor 103, a control device 104, a memory card slot 105, and a monitor 106.
  • The operation member 101 includes various input members operated by the user, such as a power button, a release button, a zoom button, a cross key, an enter button, a play button, and a delete button.
  • The lens 102 is composed of a plurality of optical lenses, but is represented by a single lens in FIG. 1.
  • The image sensor 103 is, for example, a CCD or CMOS sensor; it captures the subject image formed by the lens 102 and outputs the resulting image signal to the control device 104.
  • The control device 104 generates image data in a predetermined image format, for example JPEG (hereinafter referred to as "main image data"), based on the image signal input from the image sensor 103. The control device 104 also generates display image data, for example thumbnail image data, based on the generated main image data. It then generates an image file containing the main image data, the thumbnail image data, and header information, and outputs the file to the memory card slot 105. In the present embodiment, both the main image data and the thumbnail image data are assumed to be image data expressed in the RGB color system.
  • The memory card slot 105 accepts a memory card as a storage medium and writes the image file output from the control device 104 to the memory card.
  • The memory card slot 105 also reads image files stored on the memory card based on instructions from the control device 104.
  • The monitor 106 is a liquid crystal monitor (rear monitor) mounted on the back of the camera 100; it displays images stored on the memory card, setting menus for the camera 100, and the like. When the user sets the camera 100 to shooting mode, the control device 104 outputs display image data acquired from the image sensor 103 in time series to the monitor 106, so that a through image (live view) is displayed on the monitor 106.
  • The control device 104 includes a CPU, a memory, and other peripheral circuits, and controls the camera 100.
  • The memory constituting the control device 104 includes SDRAM and flash memory.
  • The SDRAM is volatile memory used as work memory into which the CPU loads programs for execution, and as buffer memory for temporarily recording data.
  • The flash memory is non-volatile memory that records the program executed by the control device 104, various parameters read during program execution, and the like.
  • The control device 104 specifies the position of the subject in the image based on the color information or luminance information of the image.
  • FIG. 2A shows the target image itself, and FIG. 2B marks the main subject for the purpose of explanation.
  • The flower portion indicated by the frame F1 in FIG. 2B is the portion the user photographed as the main subject.
  • The processing shown in FIG. 3 is executed by the control device 104 as a program that starts when input of image data from the image sensor 103 begins.
  • In step S101, the control device 104 converts the target image into a YCbCr-format image, generating a Y-component image (Y plane image), a Cr-component image (Cr plane image), and a Cb-component image (Cb plane image).
  • Specifically, the target image expressed in the RGB color system is converted into a luminance component (Y component) and color difference components (Cb and Cr components) in the YCbCr color space using equations (1) to (3).
  • For the target image, the control device 104 generates a luminance image composed of the Y component as the Y plane image using equation (1), and color difference images composed of the Cb and Cr components as the Cb plane image and Cr plane image using equations (2) and (3), respectively.
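  • As an illustration of step S101, the following sketch generates the three plane images in Python. Since equations (1) to (3) are not reproduced in this text, the standard ITU-R BT.601 conversion is assumed in their place; the function name and numpy formulation are illustrative rather than the patent's own implementation.

        import numpy as np

        def to_ycbcr_planes(rgb):
            # Split an RGB image (H x W x 3 array) into Y, Cb, and Cr plane
            # images. BT.601 coefficients are assumed here; the patent's
            # equations (1)-(3) may use different constants.
            rgb = rgb.astype(np.float64)
            r, g, b = rgb[..., 0], rgb[..., 1], rgb[..., 2]
            y  =  0.299 * r + 0.587 * g + 0.114 * b          # Y plane (luminance)
            cb = -0.169 * r - 0.331 * g + 0.500 * b + 128.0  # Cb plane (color difference)
            cr =  0.500 * r - 0.419 * g - 0.081 * b + 128.0  # Cr plane (color difference)
            return y, cb, cr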
  • In step S102, the control device 104 binarizes each of the Y plane image, Cr plane image, and Cb plane image generated in step S101, producing nine binarized images in total.
  • Specifically, the control device 104 examines the density values of all pixels in each of the Y plane image, Cr plane image, and Cb plane image generated in step S101, and calculates the average (Ave) and the standard deviation (σ) of the density values. The control device 104 then binarizes the Y plane image, Cb plane image, and Cr plane image using these averages and standard deviations.
  • FIG. 4 is a diagram schematically showing a binarization method for a Y plane image, a Cb plane image, and a Cr plane image.
  • The control device 104 generates three binarized images for each of the Y plane image, Cb plane image, and Cr plane image, that is, nine binarized images in total.
  • In each threshold, Ave denotes the average density value described above and σ denotes the standard deviation of the density values; α and β are predetermined coefficients.
  • FIG. 5 shows an example of the nine binarized images.
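  • The exact thresholds of FIG. 4 are not reproduced in this text; one plausible reading, sketched below, is that each plane image is split into a high, a middle, and a low density band around Ave using σ and the coefficients α and β, each band yielding one binarized image. The banding scheme and the default coefficient values are assumptions.

        def binarize_plane(plane, alpha=0.6, beta=0.6):
            # Produce three binary images from one plane image (a numpy array
            # from the earlier sketch) using thresholds built from the mean
            # (Ave) and standard deviation (sigma).
            ave, sigma = plane.mean(), plane.std()
            high = plane > ave + alpha * sigma   # densities well above average
            low  = plane < ave - beta * sigma    # densities well below average
            mid  = ~high & ~low                  # densities near the average
            return [high, mid, low]              # three boolean images

        # Applied to the Y, Cb, and Cr planes this yields the nine binarized images:
        # images = [b for p in to_ycbcr_planes(rgb) for b in binarize_plane(p)]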
  • In step S103, the control device 104 performs a labeling process on the nine binarized images generated in step S102.
  • Specifically, for each of the nine binarized images, the control device 104 extracts sets of connected white pixels and sets of connected black pixels as labeling regions.
  • In step S104, the control device 104 calculates the area of each island (white pixel region) detected by the labeling process in step S103.
  • Islands larger or smaller than certain sizes may be excluded at this point.
  • For example, islands whose area is 60% or more of the entire binarized image, or 1% or less of it, may be excluded; a sketch of the labeling and this area filter follows.
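  • A minimal sketch of steps S103 and S104, continuing the sketches above: connected white-pixel regions are labeled with scipy and filtered by the example area ratios. Restricting attention to white-pixel islands and the 1%-60% band follows the text; the function name and the (mask, area) data layout are illustrative assumptions.

        from scipy import ndimage

        def label_islands(binary, min_ratio=0.01, max_ratio=0.60):
            # Label connected white-pixel regions ("islands") and keep only
            # those whose area ratio lies strictly between the example bounds.
            labels, count = ndimage.label(binary)
            total = binary.size
            islands = []
            for lab in range(1, count + 1):
                mask = labels == lab
                area = int(mask.sum())
                if min_ratio * total < area < max_ratio * total:
                    islands.append((mask, area))
            return islands   # list of (boolean mask, area) pairs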
  • In step S105, the control device 104 calculates the moment of inertia of each island (white pixel region) detected by the labeling process in step S103.
  • Specifically, the control device 104 calculates, for the islands in the nine binarized images generated in step S102, the moment of inertia about the center of the screen. With this process, the moment of inertia about the screen center is obtained for each island in each binarized image.
  • The method of calculating the moment of inertia is well known and is not described in detail; for example, it can be calculated as the sum over all pixels of (squared distance from the screen center) × (density value, 0 or 1).
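  • Continuing the earlier sketches, and assuming white pixels have density 1 and all others 0, the sum can be written as:

        def moment_of_inertia(mask, center=None):
            # Sum of squared pixel distances from a reference point over an
            # island's white pixels; defaults to the screen center, as in
            # step S105.
            h, w = mask.shape
            if center is None:
                center = ((h - 1) / 2.0, (w - 1) / 2.0)
            ys, xs = np.nonzero(mask)
            return float(((ys - center[0]) ** 2 + (xs - center[1]) ** 2).sum())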
  • In step S106, the control device 104 calculates a first evaluation value for each island based on the area of each island calculated in step S104 and the moment of inertia of each island calculated in step S105.
  • The control device 104 calculates the first evaluation value by the following equation (4):
  • First evaluation value = (area of each island calculated in step S104) ÷ {α × (moment of inertia of each island calculated in step S105)} … (4)
  • Here, α is a predetermined coefficient.
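  • A sketch of steps S106 and S107 for one binarized image follows. The division in equation (4) is reconstructed from garbled text and from the stated behavior (the value grows with the area and shrinks as the moment of inertia grows), so the exact form is an assumption.

        def first_evaluation(mask, area, alpha=1.0):
            # Equation (4), as reconstructed: larger area and smaller moment
            # of inertia about the screen center give a larger value.
            return area / (alpha * moment_of_inertia(mask))

        def representative_island(islands, alpha=1.0):
            # Step S107: the island with the largest first evaluation value
            # becomes the representative island of its binarized image.
            return max(islands, key=lambda it: first_evaluation(*it, alpha=alpha))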
  • In step S107, the control device 104 ranks the islands in the binarized images generated in step S102 based on the first evaluation value calculated in step S106.
  • Specifically, the control device 104 compares the first evaluation values calculated in step S106 for the islands identified in step S103, and designates the island with the largest first evaluation value in each binarized image as its representative island.
  • FIG. 6 shows an example of the nine binarized images with their representative islands identified.
  • The control device 104 then ranks the representative islands of the nine binarized images in descending order of their first evaluation values.
  • The result of ranking the representative islands based on the first evaluation value is, for example, as shown in FIG. 7: the representative island of M(Y2) first, followed by the representative islands of M(Y1), M(B3), M(R1), M(R2), and M(Y3), in first to sixth place.
  • The first evaluation value calculated in step S106 increases as the area of the island increases and as the moment of inertia of the island decreases. Ranking by the first evaluation value therefore places higher those representative islands that contain many white pixels (a large area, hence a high likelihood of being a subject) and lie close to the center of the screen.
  • In step S108, the control device 104 specifies the first subject position based on the ranking performed in step S107.
  • Specifically, the control device 104 specifies the position of the top-ranked representative island as the subject position in the target image. As shown in FIG. 7, the representative island of M(Y2) is ranked first, so the control device 104 specifies the envelope frame F2 of the representative island in M(Y2) as the first subject position.
  • FIGS. 8A and 8B show the envelope frame F2 overlaid on FIGS. 2A and 2B. As they show, the first subject position specified from the first evaluation value tends to favor positions close to the center of the screen, and may therefore fall at a position different from the actual main subject.
  • In step S109, the control device 104 recalculates the moment of inertia for the higher-ranked representative islands among the representative islands of the binarized images identified in step S107.
  • Specifically, the control device 104 recalculates the moment of inertia of each of these representative islands about the center of gravity of the representative island.
  • The details of the calculation are the same as in step S105 described above, except for the reference point: in step S105 the moment of inertia is calculated assuming the subject is at the center of the screen, whereas in step S109 it is recalculated treating the position of the representative island ranked in step S107 as the subject position.
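  • A sketch of step S109, reusing the moment-of-inertia helper with the island's own center of gravity as the reference point:

        def recentered_inertia(mask):
            # Step S109: the reference point is the island's center of
            # gravity rather than the center of the screen.
            ys, xs = np.nonzero(mask)
            return moment_of_inertia(mask, center=(ys.mean(), xs.mean()))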
  • In step S110, the control device 104 calculates a second evaluation value based on the area of each island calculated in step S104 and the moment of inertia of each island recalculated in step S109.
  • The control device 104 calculates the second evaluation value by the following equation (5):
  • Second evaluation value = (area of each island calculated in step S104) ÷ {β × (moment of inertia of each island calculated in step S109)} … (5)
  • Here, β is a predetermined coefficient.
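  • As with equation (4), the division in equation (5) is a reconstruction of garbled text; a sketch:

        def second_evaluation(mask, area, beta=1.0):
            # Equation (5), as reconstructed: area over the recalculated
            # (recentered) moment of inertia, scaled by beta.
            return area / (beta * recentered_inertia(mask))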
  • In step S111, the control device 104 ranks the representative islands of the binarized images identified in step S107 based on the second evaluation value calculated in step S110.
  • Specifically, the control device 104 re-ranks the representative islands so that a representative island with a higher second evaluation value receives a higher rank.
  • The result of ranking the representative islands based on the second evaluation value is, for example, as shown in FIG. 9: the representative island of M(R1) first, followed by the representative islands of M(R2), M(Y2), M(Y1), M(B3), and M(Y3), in first to sixth place.
  • The second evaluation value calculated in step S110 re-evaluates the first evaluation value in light of the subject position specified from the first evaluation value. Ranking by the second evaluation value therefore produces an ordering that better reflects the content of the image.
  • In step S112, the control device 104 specifies the second subject position based on the ranking performed in step S111.
  • Specifically, the control device 104 specifies the position of the top-ranked representative island as the subject position in the target image. As shown in FIG. 9, the representative island of M(R1) is ranked first, so the control device 104 specifies the envelope frame F3 of the representative island in M(R1) as the second subject position.
  • FIGS. 10A and 10B show the envelope frame F3 overlaid on FIGS. 2A and 2B.
  • The control device 104 then ends the series of processes shown in FIG. 3.
  • In the above description, the second subject position is specified based only on the second evaluation value.
  • Alternatively, the second subject position may be specified based on both the first evaluation value and the second evaluation value.
  • In that case, the first and second evaluation values may be treated equally, or they may be combined with appropriate weighting into an overall evaluation.
  • For example, a score is assigned according to the ranking based on the first evaluation value, and another score is assigned according to the ranking based on the second evaluation value; the two scores are then added to obtain a total score for each binarized image.
  • The final subject position may then be specified based on the binarized image with the highest total score, as in the sketch below.
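  • A sketch of this variation follows; the particular scoring scheme (points equal to the number of binarized images minus the rank) and the equal default weights are illustrative assumptions.

        def total_scores(first_evals, second_evals, w1=1.0, w2=1.0):
            # Convert each list of evaluation values into rank-based points,
            # then add the (optionally weighted) points per binarized image.
            n = len(first_evals)
            def rank_points(vals):
                order = sorted(range(n), key=lambda i: vals[i], reverse=True)
                pts = [0] * n
                for rank, i in enumerate(order):
                    pts[i] = n - rank   # the best-ranked image gets n points
                return pts
            p1 = rank_points(first_evals)
            p2 = rank_points(second_evals)
            return [w1 * a + w2 * b for a, b in zip(p1, p2)]

        # scores = total_scores(firsts, seconds); the image with max(scores)
        # gives the final subject position.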
  • The subject position may also be specified in consideration of other conditions in addition to the first and second evaluation values. Which evaluation values and conditions are used to identify the subject position may be designated by the user or selected automatically by the control device 104.
  • As described above, the control device 104 divides one image into a plurality of divided images based on the color information or luminance information of the target image, and binarizes each of the plurality of divided images using the color information or luminance information to generate a plurality of binarized images. For each of the plurality of binarized images, it calculates a first evaluation value used for specifying the subject position in the target image, and specifies the subject position based on the calculated first evaluation value. Based on the specified subject position, it then calculates, for each of the plurality of binarized images, a second evaluation value used for specifying the subject position again, and re-specifies the subject position in the target image based on the calculated second evaluation value.
  • The subject position in the target image can thereby be specified with high accuracy. Even when the user cannot frame the subject precisely, for example when the subject moves quickly or a snapshot is taken, the first subject position is identified on the assumption that the subject lies near the center of the screen, and the second subject position is then specified based on that result, so the subject position can still be specified with high accuracy.
  • The control device 104 may re-specify the subject position in the target image based on both the first evaluation value and the second evaluation value, which allows the subject position to be specified in a better-balanced manner.
  • The control device 104 calculates the first evaluation value based on at least a value related to the area of the white pixel region composed of white pixels in the binarized image and a value related to the distance between the white pixel region and a predetermined reference region, and calculates the second evaluation value based on at least the value related to the area and a value related to the distance between the white pixel region and the region based on the subject position specified from the first evaluation value.
  • The subject position can thus be specified with high accuracy, taking into account the area and position of each island.
  • The binarization process and the labeling process described in the above embodiment are examples, and the present invention is not limited to them.
  • Any method may be used to divide one image into a plurality of divided images based on color information or luminance information, any method may be used for binarization based on color information or luminance information, and any method may be used to identify islands in the labeling process.
  • In the above embodiment, the image data of the target image is expressed in the RGB color system; however, the present invention can be applied to other data formats as well by performing color space conversion or similar processing as appropriate.
  • The methods of calculating the evaluation values described in the above embodiment are examples, and the present invention is not limited to them.
  • Any evaluation value may be used as long as it is calculated based on a value related to the area of the white pixel region composed of white pixels in the binarized image and a value related to the distance between the white pixel region and some reference region.
  • The present invention is not limited to the configurations of the above embodiment as long as the characteristic functions of the present invention are not impaired. The above embodiment may also be combined with several of the modifications.
  • DESCRIPTION OF SYMBOLS: 100 Camera; 101 Operation member; 102 Lens; 103 Image sensor; 104 Control device; 105 Memory card slot; 106 Monitor

Landscapes

  • Engineering & Computer Science (AREA)
  • Computer Vision & Pattern Recognition (AREA)
  • Physics & Mathematics (AREA)
  • General Physics & Mathematics (AREA)
  • Theoretical Computer Science (AREA)
  • Studio Devices (AREA)

Abstract

The position of a subject within an image is identified with high accuracy by having a computer execute: a division procedure for dividing a single image into a plurality of divided images based on color information or luminance information of a target image; a binarization procedure for binarizing each of the plurality of divided images using the color information or luminance information to generate a plurality of binarized images; a first evaluation value calculation procedure for calculating, for each of the plurality of binarized images, a first evaluation value used for identifying a subject position within the target image; a first subject position identification procedure for identifying the subject position within the target image based on the first evaluation value; a second evaluation value calculation procedure for calculating, for each of the plurality of binarized images, based on the subject position identified in the first subject position identification procedure, a second evaluation value used for re-identifying the subject position within the target image; and a second subject position identification procedure for re-identifying the subject position within the target image based on the second evaluation value.
PCT/JP2013/000530 2012-02-01 2013-01-31 Program for identifying the position of a subject, device for identifying the position of a subject, and camera WO2013114884A1 (fr)

Applications Claiming Priority (2)

Application Number Priority Date Filing Date Title
JP2012019959 2012-02-01
JP2012-019959 2012-02-01

Publications (1)

Publication Number Publication Date
WO2013114884A1 (fr) 2013-08-08

Family

ID=48904928

Family Applications (1)

Application Number Title Priority Date Filing Date
PCT/JP2013/000530 WO2013114884A1 (fr) 2012-02-01 2013-01-31 Program for identifying the position of a subject, device for identifying the position of a subject, and camera

Country Status (1)

Country Link
WO (1) WO2013114884A1 (fr)

Cited By (3)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
JP2015106304A (ja) * 2013-11-29 2015-06-08 株式会社ニコン Subject specifying device, imaging device, and program
JP2015148906A (ja) * 2014-02-05 2015-08-20 株式会社ニコン Image processing device, imaging device, and image processing program
JP2015148905A (ja) * 2014-02-05 2015-08-20 株式会社ニコン Subject detection device, imaging device, and program

Patent Citations (1)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
JP2011019177A (ja) * 2009-07-10 2011-01-27 Nikon Corp Subject position specifying program and camera

Similar Documents

Publication Publication Date Title
JP5246078B2 (ja) Subject position specifying program and camera
WO2010128579A1 (fr) Electronic camera, image processing device, and image processing method
WO2013114884A1 (fr) Program for identifying the position of a subject, device for identifying the position of a subject, and camera
JP5691617B2 (ja) Subject specifying device and subject tracking device
JP2012039210A (ja) Subject specifying program and camera
US9699371B1 (en) Image processing system with saliency integration and method of operation thereof
JP4771536B2 (ja) Imaging device and method for selecting the face to be the main subject
JP5381498B2 (ja) Image processing device, image processing program, and image processing method
JP5240305B2 (ja) Subject specifying program and camera
JP5434057B2 (ja) Image display device and image display program
JP6776532B2 (ja) Image processing device, imaging device, electronic apparatus, and image processing program
JP2018152095A (ja) Image processing device, imaging device, and image processing program
JP6326841B2 (ja) Image processing device, imaging device, and image processing program
JP4770965B2 (ja) Image matching device and camera
JP2011147076A (ja) Image processing device, imaging device, and program
JP2010245923A (ja) Subject tracking device and camera
JP4661957B2 (ja) Image matching device and camera
JP2010197968A (ja) Focus evaluation device, camera, and program
JP6405638B2 (ja) Subject detection device, imaging device, and program
JP6318661B2 (ja) Subject detection device, imaging device, and image processing program
JP2019036972A (ja) Subject detection device, imaging device, and program
JP6326840B2 (ja) Image processing device, imaging device, and image processing program
JP2019008830A (ja) Subject detection device, imaging device, and program
JP2018141991A (ja) Subject detection device
JP2009140285A (ja) Image identification device, image identification method, and imaging device

Legal Events

Date Code Title Description
121 Ep: the epo has been informed by wipo that ep was designated in this application

Ref document number: 13743264

Country of ref document: EP

Kind code of ref document: A1

NENP Non-entry into the national phase

Ref country code: DE

122 Ep: pct application non-entry in european phase

Ref document number: 13743264

Country of ref document: EP

Kind code of ref document: A1

NENP Non-entry into the national phase

Ref country code: JP