WO2016151869A1 - Information processing apparatus, information processing method, and program - Google Patents

Information processing apparatus, information processing method, and program

Info

Publication number
WO2016151869A1
WO2016151869A1 (PCT/JP2015/059822, JP2015059822W)
Authority
WO
WIPO (PCT)
Prior art keywords
projector
cameras
input
controlling
external display
Prior art date
Application number
PCT/JP2015/059822
Other languages
English (en)
Inventor
Xiao Peng
Original Assignee
Nec Corporation
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Nec Corporation filed Critical Nec Corporation
Priority to PCT/JP2015/059822 priority Critical patent/WO2016151869A1/fr
Publication of WO2016151869A1 publication Critical patent/WO2016151869A1/fr

Classifications

    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06FELECTRIC DIGITAL DATA PROCESSING
    • G06F3/00Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
    • G06F3/01Input arrangements or combined input and output arrangements for interaction between user and computer
    • G06F3/011Arrangements for interaction with the human body, e.g. for user immersion in virtual reality
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06FELECTRIC DIGITAL DATA PROCESSING
    • G06F1/00Details not covered by groups G06F3/00 - G06F13/00 and G06F21/00
    • G06F1/16Constructional details or arrangements
    • G06F1/1613Constructional details or arrangements for portable computers
    • G06F1/163Wearable computers, e.g. on a belt
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06FELECTRIC DIGITAL DATA PROCESSING
    • G06F3/00Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
    • G06F3/01Input arrangements or combined input and output arrangements for interaction between user and computer
    • G06F3/011Arrangements for interaction with the human body, e.g. for user immersion in virtual reality
    • G06F3/014Hand-worn input/output arrangements, e.g. data gloves
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06FELECTRIC DIGITAL DATA PROCESSING
    • G06F3/00Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
    • G06F3/01Input arrangements or combined input and output arrangements for interaction between user and computer
    • G06F3/017Gesture based interaction, e.g. based on a set of recognized hand gestures
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06FELECTRIC DIGITAL DATA PROCESSING
    • G06F3/00Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
    • G06F3/01Input arrangements or combined input and output arrangements for interaction between user and computer
    • G06F3/03Arrangements for converting the position or the displacement of a member into a coded form
    • G06F3/0304Detection arrangements using opto-electronic means
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06FELECTRIC DIGITAL DATA PROCESSING
    • G06F3/00Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
    • G06F3/01Input arrangements or combined input and output arrangements for interaction between user and computer
    • G06F3/03Arrangements for converting the position or the displacement of a member into a coded form
    • G06F3/041Digitisers, e.g. for touch screens or touch pads, characterised by the transducing means
    • G06F3/042Digitisers, e.g. for touch screens or touch pads, characterised by the transducing means by opto-electronic means
    • HELECTRICITY
    • H04ELECTRIC COMMUNICATION TECHNIQUE
    • H04NPICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N9/00Details of colour television systems
    • H04N9/12Picture reproducers
    • H04N9/31Projection devices for colour picture display, e.g. using electronic spatial light modulators [ESLM]
    • H04N9/3141Constructional details thereof
    • H04N9/3173Constructional details thereof wherein the projection device is specially adapted for enhanced portability
    • HELECTRICITY
    • H04ELECTRIC COMMUNICATION TECHNIQUE
    • H04NPICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N9/00Details of colour television systems
    • H04N9/12Picture reproducers
    • H04N9/31Projection devices for colour picture display, e.g. using electronic spatial light modulators [ESLM]
    • H04N9/3191Testing thereof
    • H04N9/3194Testing thereof including sensor feedback

Definitions

  • the present invention relates to an information processing apparatus, an information processing method, and a program.
  • wearable devices such as smart watches have become the new hot spot in consumer electronics.
  • the smart watch is computerized and can perform many functions besides telling the time, such as making phone calls and playing audio and video.
  • the wearable device usually contains powerful computing hardware.
  • many sensors, such as a camera (image sensor) and an accelerometer, are integrated into wearable devices to support various features.
  • the wearable device can also connect to the outside world through communication devices and protocols, such as WiFi and near field communication (NFC).
  • the human-machine interface (HMI) in a wearable device is very important to its application scope and user experience.
  • wearable devices should be easy to fix on the human body; for example, a smart watch should be fastened on the wrist, so wearable devices must be small and light enough.
  • these size and weight limitations mean that the HMI of a wearable device cannot be made as rich as that of a smart phone. Thus, current wearable devices suffer from the simple HMI problem.
  • the HMI is limited to simple button-press operations.
  • the simple HMI problem mainly has two aspects: first, little information can be displayed on the screen at one time; second, only simple touch operations are possible on the screen.
  • the drawbacks of this invention include two aspects:
  • the purpose of the present invention is to solve the simple HMI problem for wearable devices.
  • One aspect of the present invention provides an information processing apparatus comprising at least one projector, at least two cameras and a control unit for controlling said at least one projector to display an image on a surface outside of the apparatus and controlling the at least two cameras to generate a virtual input.
  • Another aspect of the present invention provides a method for controlling an information processing apparatus comprising at least one projector and at least two cameras, comprising controlling said at least one projector to display an image on a surface outside of the apparatus and controlling said at least two cameras to generate a virtual input.
  • FIG. 1 is a schematic diagram illustrating the basic architecture of the first embodiment.
  • FIG. 2 is a schematic diagram illustrating the working process of the first embodiment.
  • FIG. 3 is a schematic diagram illustrating the construction of the second embodiment.
  • FIG. 4 is a schematic diagram illustrating the architecture of the second embodiment.
  • FIG. 5 is a flow chart illustrating the working process of the second embodiment.
  • FIG. 6 is a schematic diagram illustrating the continuous and discontinuous area in the second embodiment.
  • FIG. 7A is a schematic diagram illustrating changing the external display area and shape according to the user and application requirement.
  • FIG. 7B is a schematic diagram illustrating changing the external display area and shape according to the user and application requirement.
  • FIG. 7C is a schematic diagram illustrating changing the external display area and shape according to the user and application requirement.
  • FIG. 8A is a schematic diagram illustrating the relative position calculation of the projector and the surface in the surface plane.
  • FIG. 8B is a schematic diagram illustrating the relative position calculation of the projector and the surface in the vertical plane.
  • FIG. 9 is a schematic diagram illustrating making the projection into the required shape with the mask setting in the second embodiment.
  • FIG. 10 is a schematic diagram illustrating the feature point detection.
  • FIG. 11A is a flow chart illustrating the working process for detecting the input operation start.
  • FIG. 11B is a flow chart illustrating the working process for detecting the input operation end.
  • FIG. 12 is a schematic diagram illustrating the tracking for recognizing the handwriting.
  • FIG. 13 is a schematic diagram illustrating the construction of the third embodiment.
  • FIG. 14 is a schematic diagram illustrating the architecture of the third embodiment.
  • FIG. 15 is a flow chart illustrating the working process of the third embodiment.
  • FIG. 16 is a schematic diagram illustrating making the projection into the required shape with the background imitation in the third embodiment.
  • FIG. 17 is a schematic diagram illustrating the construction of the fourth embodiment.
  • FIG. 18 is a schematic diagram illustrating the architecture of the fourth embodiment.
  • FIG. 19 is a schematic diagram illustrating making the projection into the required shape with the background imitation in the fourth embodiment.
  • FIG. 20 is a schematic diagram illustrating the difference between the fingertip and the actual operation point.
  • FIG. 21 is a schematic diagram illustrating the detection error when the fingertip does not face towards the light transmitter and receiver.
  • Fig. 1 illustrates the basic block diagram of a wearable device 100 according to the first embodiment of the present invention.
  • in the wearable device 100 there are at least two cameras 101 for recognizing the operation in virtual input. There is some space between the cameras.
  • At least one projector 102 is used to project an image on an external display area 120.
  • An internal HMI 104 includes the internal output device, such as the screen of the wearable device 100, and the internal input devices, such as the buttons of the wearable device 100.
  • the internal HMI can also be realized as wireless or voice control.
  • a control unit 103 is used to control the cameras, the projector and the internal HMI.
  • control unit 103 receives the image signals captured by the cameras 101 and recognizes the operation information from the virtual input. If necessary, control unit 103 can modify the parameters of the cameras 101.
  • the apparatus can include other devices for assisting the external display and virtual input.
  • the communication device can help the wearable device communicate with external devices.
  • the various sensors can provide the information for controlling the cameras and projectors.
  • control unit 103 controls the area, the angle, the shape and other parameters of the projection. Control unit 103 transmits the projection content to the projector.
  • control unit 103 receives the input information from the internal HMI 104 and transmits the output information to the internal HMI 104.
  • the control unit 103 is implemented in a computing device, such as a Central Processing Unit (CPU). There is a memory device for storing the control program.
  • Fig. 2 illustrates the working process of the external display and virtual input.
  • the cameras capture images that contain the surface to be projected on.
  • the continuous area which is suitable for external display on the surface is detected (S203).
  • the external display area and shape are determined according to user and application requirements (S205).
  • the relative position of the projector and the surface is calculated in the surface plane and vertical plane.
  • the parameters, such as the projection angle and the distortion correction, of the projector are set (S207).
  • the external display content is projected on the surface by the projector (S209).
  • the cameras capture the images in order to detect the feature points of the virtual input object, such as the fingertip (S211).
  • the feature points are detected by calculating the relative positions of the input object and the surface.
  • the predefined number of points which have the smallest distance between the input object and the surface are regarded as the feature points. From the viewpoint of the cameras, the input object always appears in the operation area, so judging the start and end of the actual operation is quite important. It is difficult to recognize directly whether the input object touches the surface or not. Instead, the relative positions of the feature points and the surface are considered. When the input object approaches the surface, the relative distances between the feature points become smaller.
  • when this happens, the input operation start is detected.
  • conversely, when the input object moves away from the surface, the relative distances between the feature points become larger.
  • then the input operation end is detected (a minimal sketch of this start/end decision follows below).
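The start/end decision above can be prototyped from the feature points alone. The following is a minimal sketch, not the patent's actual implementation; the spread measure, the ratio thresholds, and the class name are illustrative assumptions.

```python
# Illustrative sketch: the spread of the fingertip feature points shrinks while the
# finger approaches the surface and grows again when it lifts off.
import numpy as np

def feature_point_spread(points):
    """Mean pairwise distance between feature points (N x 2 or N x 3 array)."""
    pts = np.asarray(points, dtype=float)
    diffs = pts[:, None, :] - pts[None, :, :]
    dists = np.linalg.norm(diffs, axis=-1)
    n = len(pts)
    return dists.sum() / (n * (n - 1)) if n > 1 else 0.0

class TouchDetector:
    def __init__(self, start_ratio=0.7, end_ratio=1.3):
        self.baseline = None            # spread while the finger hovers
        self.touching = False
        self.start_ratio = start_ratio
        self.end_ratio = end_ratio

    def update(self, points):
        """Returns 'start', 'end', or None for the current frame."""
        spread = feature_point_spread(points)
        if self.baseline is None:
            self.baseline = spread
            return None
        if not self.touching and spread < self.start_ratio * self.baseline:
            self.touching = True
            return 'start'
        if self.touching and spread > self.end_ratio * self.baseline:
            self.touching = False
            return 'end'
        if not self.touching:
            # slowly track the hover spread so pose and lighting drift is absorbed
            self.baseline = 0.9 * self.baseline + 0.1 * spread
        return None
```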
  • the feature points are tracked for recognizing the operation of the virtual input object between the operation start and end (S213).
  • input information which is entered into the wearable device is recognized (S215). In an actual implementation, this process is carried out repeatedly in real time.
  • Fig. 3 illustrates the architecture of the second embodiment for the case of smart watch 300.
  • Fig. 4 illustrates a block diagram of the smart watch 300.
  • smart watch 300 contains the computing device, the memory device and sensors 408 in the watch body.
  • One projector 102 is mounted in the watch body for projecting an image on the external display area 120.
  • the mask 307 is mounted on the projector 102 for changing the shape of external display area 120.
  • Two cameras 301, 302 are mounted at both sides of the projector 102 for recognizing the operation in virtual input.
  • a screen 304 and the buttons 305 in the watch body form the internal HMI.
  • the external display area 120 is on the surface of the back of the hand 310.
  • the virtual input object is the opposite hand 320.
  • the control unit 303 is implemented in the computing device and memory device. The control unit 303 can use the information from sensors 408 for assisting the external display and virtual input.
  • Fig. 5 illustrates the working process of this embodiment.
  • the cameras 301, 302 capture images that contain the back of the hand to be projected on.
  • the continuous area which is suitable for external display on the surface is detected (S503).
  • the external display area 120 and its shape are determined according to user and application requirements (S505).
  • the relative position of the projector 102 and the surface is calculated in the surface plane and vertical plane. Based on the external display area and shape determination, and the relative position of the projector and the surface, the parameters of the projector are set (S507).
  • An image is projected to the surface by the projector 102 (S509).
  • the shape of the external display is determined by setting the mask.
  • the cameras capture the images in order to detect the feature points of the virtual input object which is the fingertip (S511).
  • the feature points are tracked for recognizing the operation of the virtual input object (S513).
  • the input information entered into the smart watch is recognized (S515). In an actual implementation, this process is carried out repeatedly in real time.
  • Fig. 6 shows the example of continuous area 601 and discontinuous area 602.
  • a continuous area is an area that contains no edge or boundary within its closed region.
  • the area in the solid line is the continuous area of the back of the hand.
  • the area in the dashed line is an example of a discontinuous area. Since the discontinuous area 602 contains the space between the fingers, projecting onto it would introduce distortion.
  • the shape and the area of the external display are determined within the continuous area (one way such an area could be detected is sketched below).
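One plausible way to find an edge-free ("continuous") region such as area 601 with a standard vision library is sketched below. The patent does not prescribe this method; the Canny thresholds and helper name are assumptions, and edges are taken to mark where a discontinuous area such as 602 begins.

```python
# Sketch: the largest disc on the surface image that contains no edge pixel.
import cv2
import numpy as np

def largest_continuous_disc(image_bgr):
    gray = cv2.cvtColor(image_bgr, cv2.COLOR_BGR2GRAY)
    gray = cv2.GaussianBlur(gray, (5, 5), 0)
    edges = cv2.Canny(gray, 50, 150)              # finger gaps, hand outline, etc.
    free = cv2.bitwise_not(edges)                 # 255 wherever there is no edge
    # distance from every pixel to the nearest edge pixel
    dist = cv2.distanceTransform(free, cv2.DIST_L2, 5)
    _, radius, _, center = cv2.minMaxLoc(dist)
    return center, radius                         # largest edge-free disc (centre, radius)
```

The returned centre and radius can then be clipped or reshaped according to the user and application requirement (S205/S505).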
  • Figs. 7A-7C show an example of changing the position and shape of the external display area 120.
  • the email is projected on the back of hand 320.
  • the projection area 702 is enlarged according to the email content (see Fig. 7B).
  • the shape is set from rectangle 702 to circle 703 (see Fig. 7C).
  • FIGs. 8A and 8B illustrate the example that the relative position is calculated in the surface plane 801 and the vertical plane 802.
  • camera 301 and camera 302 capture different images of the back of hand.
  • the relative position of projector in the surface plane 801 is calculated through analyzing the two different images.
  • camera 301 and camera 302 capture different images of the back of hand.
  • the relative position of projector in the vertical plane 802 is calculated through analyzing the two different images.
  • based on the calculated relative position, the parameters such as the projection angle and the distortion correction are set (a sketch of the position estimation and pre-warp is given below).
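A hedged sketch of this geometry step follows: the two cameras act as a stereo pair, so matched surface features give depth from disparity, a plane fit gives the tilt of the back of the hand, and the distortion correction can then be applied as a perspective pre-warp of the projected frame. The focal length, baseline, principal point, matched points, and quad corners are all assumed inputs, not values stated in the patent.

```python
import numpy as np
import cv2

def surface_pose(points_left, points_right, f_px, baseline_m, cx, cy):
    """points_*: N x 2 pixel coordinates of the same surface features in each camera.
    Assumes rectified views and non-zero disparity."""
    pl = np.asarray(points_left, dtype=float)
    pr = np.asarray(points_right, dtype=float)
    disparity = pl[:, 0] - pr[:, 0]
    z = f_px * baseline_m / disparity              # depth of each matched point
    x = (pl[:, 0] - cx) * z / f_px                 # back-projection into the camera frame
    y = (pl[:, 1] - cy) * z / f_px
    A = np.column_stack([x, y, np.ones_like(x)])
    (a, b, c), *_ = np.linalg.lstsq(A, z, rcond=None)   # plane z = a*x + b*y + c
    normal = np.array([-a, -b, 1.0])
    return z.mean(), normal / np.linalg.norm(normal)    # mean distance, surface normal

def keystone_prewarp(frame, src_quad, dst_quad):
    """Pre-warp the projected frame so it lands undistorted on the tilted surface;
    the quads would be derived from the estimated pose and the chosen display area."""
    H = cv2.getPerspectiveTransform(np.float32(src_quad), np.float32(dst_quad))
    return cv2.warpPerspective(frame, H, (frame.shape[1], frame.shape[0]))
```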
  • the mask 307 mounted on the projector 102 is configured manually or automatically. As shown in the example in Fig. 9, with the mask 307, the required hexagonal external display area 901 is projected on the back of the hand 320 (a software analogue of this masking is sketched below).
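The embodiment shapes the display optically with the physical mask 307. The same effect can be emulated in software by blanking every projected pixel outside the required outline, as in this illustrative sketch; the hexagon vertices are made up.

```python
import numpy as np
import cv2

def apply_shape_mask(frame, polygon_px):
    """Black out everything outside the given polygon before sending it to the projector."""
    mask = np.zeros(frame.shape[:2], dtype=np.uint8)
    cv2.fillPoly(mask, [np.asarray(polygon_px, dtype=np.int32)], 255)
    return cv2.bitwise_and(frame, frame, mask=mask)

hexagon = [(200, 60), (320, 120), (320, 240), (200, 300), (80, 240), (80, 120)]
# shaped = apply_shape_mask(content_frame, hexagon)
```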
  • the index finger of the opposite hand 320 is detected.
  • the relative positions of the feature points in the surface plane 801 and the vertical plane 802 are calculated with the two cameras 301, 302, using the same principle as for calculating the relative position of the projector 102.
  • the predefined number of points which have the smallest distance between the finger 1001 and the surface 1002 are detected as the feature points (one way to select them is sketched below).
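A small sketch of selecting that predefined number of nearest points, assuming 3-D points on the finger and the surface plane fitted in the previous sketch are already available; the count of five is an arbitrary placeholder.

```python
import numpy as np

def nearest_feature_points(finger_xyz, plane_abc, count=5):
    """finger_xyz: N x 3 points on the finger; plane z = a*x + b*y + c."""
    a, b, c = plane_abc
    pts = np.asarray(finger_xyz, dtype=float)
    # distance of each finger point to the surface plane
    dist = np.abs(a * pts[:, 0] + b * pts[:, 1] + c - pts[:, 2]) / np.sqrt(a * a + b * b + 1.0)
    order = np.argsort(dist)
    return pts[order[:count]], dist[order[:count]]
```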
  • the input operation start is detected (S1105).
  • the input operation end is detected (S1115).
  • the input information is recognized. For example, in Fig. 12, the handwriting input "A" 1201 can be recognized and entered into the smart watch 300 (a sketch of turning the tracked trajectory into a glyph image follows below).
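Between the detected operation start and end, the tracked contact points form a trajectory. The sketch below rasterises such a trajectory into a small glyph image (e.g. the handwritten "A" 1201) that any character classifier could consume; the classifier itself is outside the scope of this sketch, and the image size is an arbitrary choice.

```python
import numpy as np
import cv2

def stroke_to_glyph(points_xy, size=32, margin=2):
    """Normalise the tracked stroke and draw it into a size x size grayscale image."""
    pts = np.asarray(points_xy, dtype=float)
    pts -= pts.min(axis=0)                                 # normalise position
    scale = (size - 2 * margin) / max(pts.max(), 1e-6)     # normalise scale
    pts = (pts * scale + margin).astype(np.int32)
    glyph = np.zeros((size, size), dtype=np.uint8)
    cv2.polylines(glyph, [pts], isClosed=False, color=255, thickness=1)
    return glyph
```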
  • Fig. 14 illustrates a block diagram of the smart watch 1300.
  • the smart watch 1300 contains the computing device, the memory device, the sensors 408 and communication unit 1301 in the watch body.
  • One projector is mounted in the watch body for projecting the external display.
  • Two cameras 301, 302 are mounted at both sides of the projector 102 for recognizing the operation in virtual input.
  • the screen and the buttons in the watch body form the internal HMI 104.
  • the external display area is on the surface of the back of the hand.
  • the virtual input object is the opposite hand.
  • the control unit 303 is implemented in the computing device and memory device. The control unit 303 can use the information from sensors for assisting the external display and virtual input.
  • the communication unit 1401 connects to a remote computing device 1410, such as the cloud, which assists with the computing and with controlling the external display and virtual input.
  • Fig. 15 illustrates the working process of this embodiment.
  • the cameras 301, 302 capture images that contain the back of the hand to be projected on (S1501).
  • the continuous area which is suitable for external display on the surface is detected (S1503).
  • the external display area and shape are determined according to user and application requirements (S1505).
  • the image information, such as the color and brightness, of the selected area on the surface is recorded.
  • the relative position of the projector 102 and the surface is calculated in the surface plane and vertical plane.
  • the parameters of the projector are set (S1507).
  • the external display content (image) is projected to the surface by the projector 102 (S1509).
  • the cameras 301, 302 capture the images in order to detect the feature points of the virtual input object which is the fingertip (S1511).
  • the feature points are tracked for recognizing the operation of the virtual input object (S1513).
  • the input information entered into the smart watch 1300 is recognized (S1515). In an actual implementation, this process is carried out repeatedly in real time.
  • Fig. 16 illustrates the example of the background imitation for projecting the required shape on the surface.
  • the background imitation does not require changing the actual projection area.
  • the content to be displayed in the external display area 1601 is projected.
  • an imitation image that resembles the projection surface is projected onto the remaining area, which is called the "background imitation area".
  • the effect is that the background imitation area 1602 looks the same as the rest of the surface, so the user experience is not affected (a sketch of this composition follows below).
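A minimal sketch of composing a projector frame with background imitation, assuming a camera crop of the bare surface and a mask of the content region are available. A single median colour is used here for simplicity, whereas the embodiment records the colour and brightness of the selected area; all names are illustrative.

```python
import numpy as np

def compose_with_background_imitation(content, surface_patch, content_mask):
    """content: projector frame; surface_patch: camera crop of the bare surface;
    content_mask: uint8 mask, 255 where real content should appear."""
    imitation_color = np.median(surface_patch.reshape(-1, 3), axis=0)
    frame = np.empty_like(content)
    frame[:] = imitation_color.astype(content.dtype)        # imitation everywhere
    frame[content_mask > 0] = content[content_mask > 0]     # real content inside area 1601
    return frame
```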
  • Fig. 18 illustrates a block diagram of the smart wristband 1700.
  • the smart wristband 1700 contains the computing device, the memory device and sensors in the wristband body 1707.
  • Two projectors 1701, 1702 are mounted in the wristband body 1707 for projecting the external display.
  • Two cameras 1703, 1704 are mounted for recognizing the operation in virtual input.
  • the indicator lights 1705 and the buttons 1706 in the wristband body 1707 form the internal HMI.
  • the external display area is positioned on the back of the hand.
  • the dedicated stylus 1710 can be used as a virtual input object.
  • the control unit 1803 is implemented in the computing device and memory device.
  • the control unit 1803 can use the information from sensors 1808 for assisting the external display and virtual input.
  • each camera is connected to one computing device (CD) 1801, 1802, which is used to distribute the computing and to assist the control performed by the control unit 1803.
  • Fig. 19 illustrates the example of the background imitation for projecting the required shape on the surface.
  • the background imitation does not require changing the actual projection area.
  • the content to be displayed in the external display area 1901 is projected from projector 1701.
  • an imitation image that resembles the projection surface is projected onto the remaining area, which is called the "background imitation area".
  • the content of the background imitation area 1902 is projected from projector 1702.
  • the effect is that the background imitation area 1902 looks the same as the rest of the surface, so the user experience is not affected (a sketch of splitting the frame between the two projectors follows below).
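For this two-projector variant, a composed frame such as the one from the previous sketch can simply be split so that projector 1701 receives only the content region 1901 and projector 1702 only the imitation region 1902. This is a sketch under that assumption, not the embodiment's actual projector interface.

```python
import numpy as np

def split_for_two_projectors(composed_frame, content_mask):
    """Return (content-only frame, imitation-only frame) from one composed frame."""
    content_only = np.where(content_mask[..., None] > 0, composed_frame, 0)
    imitation_only = np.where(content_mask[..., None] > 0, 0, composed_frame)
    return content_only, imitation_only      # fed to projector 1701 and 1702 respectively
```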

Landscapes

  • Engineering & Computer Science (AREA)
  • Theoretical Computer Science (AREA)
  • General Engineering & Computer Science (AREA)
  • Human Computer Interaction (AREA)
  • Physics & Mathematics (AREA)
  • General Physics & Mathematics (AREA)
  • Multimedia (AREA)
  • Signal Processing (AREA)
  • Computer Hardware Design (AREA)
  • User Interface Of Digital Computer (AREA)

Abstract

The invention relates to an information processing apparatus. This information processing apparatus comprises at least one projector, at least two cameras, and a control unit configured to control said projector to display an image on a surface outside of the apparatus, and to control said cameras to generate a virtual input.
PCT/JP2015/059822 2015-03-23 2015-03-23 Information processing apparatus, information processing method, and program WO2016151869A1 (fr)

Priority Applications (1)

Application Number Priority Date Filing Date Title
PCT/JP2015/059822 WO2016151869A1 (fr) 2015-03-23 2015-03-23 Information processing apparatus, information processing method, and program

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
PCT/JP2015/059822 WO2016151869A1 (fr) 2015-03-23 2015-03-23 Information processing apparatus, information processing method, and program

Publications (1)

Publication Number Publication Date
WO2016151869A1 true WO2016151869A1 (fr) 2016-09-29

Family

ID=56978046

Family Applications (1)

Application Number Title Priority Date Filing Date
PCT/JP2015/059822 WO2016151869A1 (fr) 2015-03-23 2015-03-23 Information processing apparatus, information processing method, and program

Country Status (1)

Country Link
WO (1) WO2016151869A1 (fr)

Patent Citations (4)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US20120293402A1 (en) * 2011-05-17 2012-11-22 Microsoft Corporation Monitoring interactions between two or more objects within an environment
US20140078378A1 (en) * 2011-05-25 2014-03-20 Obzerv Technologies Unc. Active Imaging Device Having Field of View and Field of Illumination With Corresponding Rectangular Aspect Ratios
WO2013028280A2 (fr) * 2011-08-19 2013-02-28 Qualcomm Incorporated Sélection dynamique de surfaces dans le monde réel pour y projeter des informations
US20140292648A1 (en) * 2013-04-02 2014-10-02 Fujitsu Limited Information operation display system, display program, and display method

Cited By (2)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
JP2022131024A (ja) Display method and program
JP7287409B2 (ja) Display method and program

Similar Documents

Publication Publication Date Title
EP3090331B1 (fr) Systems with techniques for user interface control
CN110199251B (zh) Display device and remote operation control device
WO2017215375A1 (fr) Information input device and method
US20160320855A1 (en) Touch fee interface for augmented reality systems
WO2017036035A1 (fr) Screen control method and device
WO2020103526A1 (fr) Photographing method and device, storage medium, and terminal device
WO2018072339A1 (fr) Virtual reality headset and method for switching display information of a virtual reality headset
WO2021035646A1 (fr) Wearable device and control method therefor, gesture recognition method, and control system
JP2014048937A (ja) Gesture recognition device, control method therefor, display apparatus, and control program
JP2017146927A (ja) Control device, control method, and program
WO2019033322A1 (fr) Handheld control device, and tracking and positioning method and system
CN109839827B (zh) Gesture recognition smart home control system based on full-space position information
US11816924B2 (en) Method for behaviour recognition based on line-of-sight estimation, electronic equipment, and storage medium
WO2018198499A1 (fr) Information processing apparatus, information processing method, and recording medium
US11886643B2 (en) Information processing apparatus and information processing method
CN105306819A (zh) Gesture-control-based photographing method and device
JP4985531B2 (ja) Mirror system
JP4870651B2 (ja) Information input system and information input method
WO2021004413A1 (fr) Handheld input device and method and apparatus for controlling turn-off of an indicator icon of the handheld input device
JP2016071401A (ja) Position detection device, projector, and position detection method
WO2016151869A1 (fr) Information processing apparatus, information processing method, and program
US20220244788A1 (en) Head-mounted display
US9761009B2 (en) Motion tracking device control systems and methods
US11054941B2 (en) Information processing system, information processing method, and program for correcting operation direction and operation amount
CN114327047B (zh) Device control method, device control apparatus, and storage medium

Legal Events

Date Code Title Description
121 Ep: the epo has been informed by wipo that ep was designated in this application

Ref document number: 15886421

Country of ref document: EP

Kind code of ref document: A1

NENP Non-entry into the national phase

Ref country code: DE

122 Ep: pct application non-entry in european phase

Ref document number: 15886421

Country of ref document: EP

Kind code of ref document: A1