WO2009054619A2 - Augmented reality computer device - Google Patents
Augmented reality computer device
- Publication number
- WO2009054619A2 (PCT/KR2008/005842)
- Authority
- WO
- WIPO (PCT)
- Prior art keywords
- image
- rectangle
- mark
- camera
- computer
- Prior art date
Links
Classifications
-
- G—PHYSICS
- G02—OPTICS
- G02B—OPTICAL ELEMENTS, SYSTEMS OR APPARATUS
- G02B27/00—Optical systems or apparatus not provided for by any of the groups G02B1/00 - G02B26/00, G02B30/00
- G02B27/01—Head-up displays
- G02B27/017—Head mounted
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06F—ELECTRIC DIGITAL DATA PROCESSING
- G06F3/00—Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
- G06F3/01—Input arrangements or combined input and output arrangements for interaction between user and computer
- G06F3/011—Arrangements for interaction with the human body, e.g. for user immersion in virtual reality
- G06F3/012—Head tracking input arrangements
-
- G—PHYSICS
- G02—OPTICS
- G02B—OPTICAL ELEMENTS, SYSTEMS OR APPARATUS
- G02B27/00—Optical systems or apparatus not provided for by any of the groups G02B1/00 - G02B26/00, G02B30/00
- G02B27/01—Head-up displays
- G02B27/0101—Head-up displays characterised by optical features
- G02B2027/0138—Head-up displays characterised by optical features comprising image capture systems, e.g. camera
-
- G—PHYSICS
- G02—OPTICS
- G02B—OPTICAL ELEMENTS, SYSTEMS OR APPARATUS
- G02B27/00—Optical systems or apparatus not provided for by any of the groups G02B1/00 - G02B26/00, G02B30/00
- G02B27/01—Head-up displays
- G02B27/0101—Head-up displays characterised by optical features
- G02B2027/014—Head-up displays characterised by optical features comprising information/image processing systems
Definitions
- The present invention relates to an augmented reality computer device.
- The object of the present invention is to provide an enlarged display, through a head-mounted display and by means of augmented reality, for the small physical display of a small (handheld, mobile) computer such as a mobile phone, navigator, mobile game machine, or UMPC (ultra-mobile PC).
- Augmented reality is the technology of overlaying a computer graphic image onto a real video image.
- The overlaid computer graphic image is usually attached to a physical mark in the video image, so that the overlaid image looks like a real physical thing. For example, a person wears the mark on the head, a camera captures the person, the image processing portion recognizes the mark in the video and overlays a three-dimensional monster face image onto the mark, and the final output video shows the person with a monster head.
- The mark usually contains a black rectangle on a white background.
- The image processing portion recognizes the rectangle in the video, produces the three-dimensional distance and direction between the camera and the mark by analyzing the shape (distortion) and size of the rectangle in the captured video, and overlays the correspondingly transformed three-dimensional monster face image onto the video.
- The transformation includes translation, scaling, and rotation, which are well-known techniques in three-dimensional game programming.
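The translation, scaling, and rotation mentioned above can be sketched as an editorial illustration (not part of the patent disclosure; the values below are made up, and a real renderer would compose 4x4 homogeneous matrices):

```python
import math

def transform_point(x, y, scale, angle_rad, tx, ty):
    """Apply scaling, then rotation, then translation to a 2-D point --
    the same composition used to place a virtual image onto a mark."""
    # scaling
    x, y = x * scale, y * scale
    # rotation about the origin
    xr = x * math.cos(angle_rad) - y * math.sin(angle_rad)
    yr = x * math.sin(angle_rad) + y * math.cos(angle_rad)
    # translation
    return xr + tx, yr + ty

# a unit corner scaled 2x, rotated 90 degrees, then moved by (10, 5)
px, py = transform_point(1.0, 0.0, 2.0, math.pi / 2, 10.0, 5.0)
```

In practice the three steps are fused into a single matrix so that every pixel of the overlaid image undergoes one multiply, which is why the patent can treat the transformation as a well-known game-programming primitive.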
- The present invention captures video in the direction of its user's gaze, using a camera attached to the head-mounted display (or to the user's head), and overlays an enlarged screen image of the mobile computer onto the video.
- The head-mounted display of the present invention is safer (it is easier to escape from an accident) than an ordinary head-mounted display: an ordinary head-mounted display covers the whole field of view of its user, so the user must take it off in order to watch the real world, whereas the user of the present invention can watch the real world and the virtual overlaid screen at the same time without taking off the head-mounted display.
- FIG.1 shows the composition of the present invention.
- FIG.2 shows the mark on the display of the mobile computer.
- FIG.3 shows the modified mark on the display of the mobile computer.
- The augmented reality computer device of the present invention comprises a camera portion capturing the image of the mark of the mobile computer,
- an image processing portion obtaining the distance and direction between the camera and the mark,
- an image synthesis portion synthesizing the image by transforming the screen shot image of the mobile computer and overlaying it onto the video image captured by the camera portion, where the transformation and the overlaying position are determined by the distance and direction between the camera and the mark,
- and a head-mounted display portion outputting the synthesized image.
- FIG.1 shows the composition of the present invention.
- Stereo cameras (ca1, ca2) are attached to the head-mounted display (ds) in order to capture video in the direction of the eye's view.
- The display (mo) of the mobile computer is captured by the said stereo cameras (ca1, ca2), and the captured video (cd) is transferred to the main portion (con).
- An image processing program in the main portion (con) analyzes the received video (cd), extracts feature points of the mark (for example, the rectangle of the display (mo) of the mobile computer, the rectangle of the case of the mobile computer, an LED attached to the mobile computer, or an image on the display (mo) of the mobile computer), and outputs the direction and distance (three-dimensional or two-dimensional position) between the camera and the mark by recognizing the size and shape (distortion) of the mark.
- There is well-known technology for calculating the three-dimensional distance and direction between a camera and a mark whose size and shape are known. For example, by analyzing the size and shape of a rectangular mark (the vertices of the rectangle are four feature points) in the captured image, the three-dimensional distance and direction can be obtained as the solution of the perspective 4-point problem. Detailed information can be found at http://homepages.inf.ed.ac.uk/rbf/CVonline/LOCAL_COPIES/MARBLE/high/pia/solving.htm
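The simplest degenerate case of this relation can be sketched as follows (an editorial illustration under a pinhole camera model with a fronto-parallel mark; the focal length and mark size are hypothetical values, and the full perspective 4-point solution, which also recovers orientation from the distortion of the four vertices, is not reproduced here):

```python
def distance_from_mark(focal_px, mark_size_m, imaged_size_px):
    """Pinhole-camera relation: an object of known physical size W,
    imaged at w pixels by a camera of focal length f (in pixels),
    lies at depth Z = f * W / w."""
    return focal_px * mark_size_m / imaged_size_px

# a 5 cm wide mark imaged 40 px wide by a camera with f = 800 px
z = distance_from_mark(800.0, 0.05, 40.0)  # -> 1.0 metre
```

Tilting the mark shrinks and skews the imaged rectangle, which is exactly the extra information the perspective 4-point formulation exploits to recover direction as well as distance.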
- The three-dimensional direction and distance between the mark and the camera can also be obtained by analyzing the stereo video images (left and right images).
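The stereo route rests on the standard disparity-to-depth relation; the sketch below is illustrative only (rectified cameras assumed, with a hypothetical baseline and focal length, not values from the patent):

```python
def depth_from_disparity(focal_px, baseline_m, disparity_px):
    """For a rectified stereo pair, a feature seen at x_left and x_right
    has disparity d = x_left - x_right, and depth Z = f * B / d,
    where B is the distance between the two camera centres."""
    return focal_px * baseline_m / disparity_px

# a mark feature found 14 px apart in the left and right images,
# cameras 6 cm apart (roughly eye spacing), focal length 700 px
z = depth_from_disparity(700.0, 0.06, 14.0)  # -> 3.0 metres
```

Repeating this for several feature points of the mark yields its 3-D position and, from the relative positions of the points, its direction.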
- The main portion (con) receives the screen shot video image (md) signal from the mobile computer (much as a beam projector receives a video signal from a computer).
- The main portion (con) may be embedded in the mobile computer.
- The image synthesis portion (a microcontroller, computer, or digital signal processor) in the main portion synthesizes the output video by transforming (scaling, translating, rotating, and distorting (affine transformation)) the received video image (md) of the mobile computer and overlaying it onto the video image (cd) captured by the camera, where the overlaying position and the transformation are determined by the three-dimensional distance and direction between the camera (ca1, ca2) and the mark of the mobile computer.
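The final paste of the transformed screen image (md) into the camera frame (cd) can be sketched with toy row-major images (an editorial illustration; the affine warp that would precede the paste is omitted, and the positions are made up):

```python
def overlay(background, patch, top, left):
    """Copy an (already transformed) screen-shot patch into the camera
    frame at the position determined by the mark; images are row-major
    lists of pixel values."""
    out = [row[:] for row in background]  # leave the input frame intact
    for i, prow in enumerate(patch):
        for j, p in enumerate(prow):
            out[top + i][left + j] = p
    return out

cam = [[0] * 6 for _ in range(4)]   # captured frame (cd), all background
shot = [[9, 9], [9, 9]]             # scaled screen image (md)
frame = overlay(cam, shot, 1, 2)    # position chosen from the mark pose
```

A hardware image synthesis portion would do the same per-pixel copy, but with the warp and paste fused so the composited frame is produced at video rate.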
- The synthesized video is output through the head-mounted display (ds), and the wearer of the head-mounted display can see an enlarged virtual display (amo) of the mobile computer, appearing attached to the mobile computer.
- FIG.1 shows the large virtual display (amo) as if it were attached to the mobile computer (mobile phone).
- The wearer of the head-mounted display of the present invention can watch his environment quickly and find a way out simply by laying down the mobile computer (laying down the mobile computer removes the virtual large display (amo) from the wearer's view). The wearer of a conventional head-mounted display, by contrast, must take off the head-mounted display to watch his environment, because the video image of the mobile computer covers the wearer's entire view. Therefore the head-mounted display of the present invention is safer to use than a conventional one.
- A physical mark (such as an LED) may be attached to the mobile computer, and a virtual image of a keyboard, mouse, or stylus pen may be overlaid onto the final synthesized image.
- The wearer of the head-mounted display of the present invention can use the virtual keyboard or mouse to control the mobile computer if a function for recognizing the wearer's hand gestures is added to the image processing portion.
- FIG.2 shows an example of the mark displayed on the display (mo) of the mobile computer.
- The mark contains a small rectangle (mk2), a big torus-like rectangle (mk1) sharing a common center, and a direction-representing rectangle (mk3).
- Such a mark can be produced by outputting a graphic image onto the display of the mobile computer.
- The image processing portion can detect the rectangles (mk1, mk2) by recognizing the edges of the rectangles with the Hough transformation (a well-known line detection technique) and analyzing the relative positions of the vertices, which are the crossing points of the edge lines; the direction of the mark can be determined by recognizing the additional rectangle (mk3). By inserting the positions of the vertices of a rectangle (mk1 or mk2) into the formula of the perspective 4-point problem, the three-dimensional direction and distance between the camera and the rectangle can be obtained.
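The Hough voting scheme behind this edge detection can be sketched minimally (an editorial illustration with a synthetic edge; a production system would use an optimized implementation such as OpenCV's HoughLines, and far finer theta/rho quantization):

```python
import math

def hough_peak(points, thetas):
    """Minimal Hough transform: every edge point votes for each
    candidate line (theta, rho) passing through it, with
    rho = x*cos(theta) + y*sin(theta); the strongest bin wins."""
    votes = {}
    for (x, y) in points:
        for t in thetas:
            rho = round(x * math.cos(t) + y * math.sin(t))
            votes[(t, rho)] = votes.get((t, rho), 0) + 1
    return max(votes, key=votes.get)

# edge points of a vertical rectangle side lying on x = 5
edge = [(5, y) for y in range(10)]
theta, rho = hough_peak(edge, [0.0, math.pi / 4, math.pi / 2])
```

All ten points agree on the bin (theta = 0, rho = 5), so the vertical line is recovered even if a few edge pixels were missed; intersecting four such peak lines yields the rectangle's vertices for the perspective 4-point step.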
- The rectangle is only one example of a mark; the present invention places no limitation on modifications of the mark.
- FIG.3 shows a mark modified from the mark of FIG.2.
- Mark 1 of FIG.3 contains a direction-representing inner vertex mark (mk4), which corresponds to the mark (mk3) of FIG.2.
- The boundary rectangle (mo) of the display of the mobile computer can also be used as a mark.
- The image processing portion can recognize the shape of the mobile computer and extract the mark by comparing the three-dimensional model data of the housing of the mobile computer, stored in memory, with the captured video image. If stereo cameras (ca1, ca2) are implemented, the mark can also be extracted from the stereo video by comparing the left and right video images.
Landscapes
- Engineering & Computer Science (AREA)
- Physics & Mathematics (AREA)
- General Engineering & Computer Science (AREA)
- General Physics & Mathematics (AREA)
- Theoretical Computer Science (AREA)
- Human Computer Interaction (AREA)
- Optics & Photonics (AREA)
- Processing Or Creating Images (AREA)
Abstract
The invention concerns an augmented reality computer device. The device comprises: a camera portion for capturing the image of a mark of a mobile computer; an image processing portion for obtaining the distance between the camera and the mark and the camera/mark direction; an image synthesis portion for synthesizing the image by transforming the screen-capture image of the mobile computer and overlaying it onto the video image captured by the camera portion, the transformation and the overlay position being determined by the distance separating the camera from the mark and by the camera/mark direction; and a head-mounted display portion for outputting the synthesized image. The invention aims to obtain, through an augmented reality process, an enlarged display via the head-mounted display from the small physical display of a small computer (handheld or mobile computer), notably a mobile phone, a navigator, a mobile video game device, or an ultra-mobile PC. Augmented reality is a technology consisting of overlaying a computer graphic image onto a real video image. The overlaid computer graphic image is usually attached to a physical mark in the video image, and the overlaid image appears real.
Applications Claiming Priority (6)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
KR20070106350 | 2007-10-22 | ||
KR10-2007-0106350 | 2007-10-22 | ||
KR20070108593 | 2007-10-28 | ||
KR10-2007-0108593 | 2007-10-28 | ||
KR10-2008-0097458 | 2008-10-05 | ||
KR1020080097458A KR20090040839A (ko) | 2007-10-22 | 2008-10-05 | Augmented reality computer device |
Publications (2)
Publication Number | Publication Date |
---|---|
WO2009054619A2 (fr) | 2009-04-30 |
WO2009054619A3 (fr) | 2009-06-04 |
Family
ID=40580210
Family Applications (1)
Application Number | Title | Priority Date | Filing Date |
---|---|---|---|
PCT/KR2008/005842 WO2009054619A2 (fr) | 2007-10-22 | 2008-10-05 | Augmented reality computer device |
Country Status (1)
Country | Link |
---|---|
WO (1) | WO2009054619A2 (fr) |
Citations (3)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
US20040104935A1 (en) * | 2001-01-26 | 2004-06-03 | Todd Williamson | Virtual reality immersion system |
KR20050082348A (ko) * | 2004-02-18 | 2005-08-23 | Korea Advanced Institute of Science and Technology | Head-mounted display device using augmented reality technology |
US20050285878A1 (en) * | 2004-05-28 | 2005-12-29 | Siddharth Singh | Mobile platform |
- 2008-10-05 WO PCT/KR2008/005842 patent/WO2009054619A2/fr active Application Filing
Non-Patent Citations (1)
Title |
---|
ANTONIAC P.: 'AUGMENTED REALITY BASED USER INTERFACE FOR MOBILE APPLICATIONS AND SERVICES', 2005, UNIVERSITY OF OULU * |
Cited By (46)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
WO2011160114A1 (fr) * | 2010-06-18 | 2011-12-22 | Minx, Inc. | Augmented reality |
ES2391112A1 (es) * | 2010-07-16 | 2012-11-21 | Universidad Politécnica de Madrid | Head-mounted spatial projection system for augmented reality |
EP2617202A2 (fr) * | 2010-09-20 | 2013-07-24 | Kopin Corporation | Bluetooth or other wireless interface with power management for a head-mounted display |
CN103890836A (zh) * | 2010-09-20 | 2014-06-25 | Kopin Corporation | Bluetooth or other wireless interface with power management for a head-mounted display |
EP2617202A4 (fr) * | 2010-09-20 | 2015-01-21 | Kopin Corp | Bluetooth or other wireless interface with power management for a head-mounted display |
US9152378B2 (en) | 2010-09-20 | 2015-10-06 | Kopin Corporation | Bluetooth or other wireless interface with power management for head mounted display |
US9710967B2 (en) | 2011-08-31 | 2017-07-18 | Nintendo Co., Ltd. | Information processing program, information processing system, information processing apparatus, and information processing method, utilizing augmented reality technique |
US20130050194A1 (en) * | 2011-08-31 | 2013-02-28 | Nintendo Co., Ltd. | Information processing program, information processing system, information processing apparatus, and information processing method, utilizing augmented reality technique |
US8922588B2 (en) * | 2011-08-31 | 2014-12-30 | Nintendo Co., Ltd. | Information processing program, information processing system, information processing apparatus, and information processing method, utilizing augmented reality technique |
EP2826007A2 (fr) * | 2012-03-15 | 2015-01-21 | Crown Packaging Technology, Inc | Device, system and method for facilitating interaction between a wireless communication device and a package |
EP2741480A1 (fr) * | 2012-12-07 | 2014-06-11 | BlackBerry Limited | Mobile device, system and method for controlling a heads-up display |
US9323057B2 (en) | 2012-12-07 | 2016-04-26 | Blackberry Limited | Mobile device, system and method for controlling a heads-up display |
US10080114B1 (en) | 2014-07-11 | 2018-09-18 | Google Llc | Detection and ranking of entities from mobile onscreen content |
US11907739B1 (en) | 2014-07-11 | 2024-02-20 | Google Llc | Annotating screen content in a mobile environment |
US9762651B1 (en) | 2014-07-11 | 2017-09-12 | Google Inc. | Redaction suggestion for sharing screen content |
US9788179B1 (en) | 2014-07-11 | 2017-10-10 | Google Inc. | Detection and ranking of entities from mobile onscreen content |
US9811352B1 (en) | 2014-07-11 | 2017-11-07 | Google Inc. | Replaying user input actions using screen capture images |
US9824079B1 (en) | 2014-07-11 | 2017-11-21 | Google Llc | Providing actions for mobile onscreen content |
US9886461B1 (en) | 2014-07-11 | 2018-02-06 | Google Llc | Indexing mobile onscreen content |
US9916328B1 (en) | 2014-07-11 | 2018-03-13 | Google Llc | Providing user assistance from interaction understanding |
US10963630B1 (en) | 2014-07-11 | 2021-03-30 | Google Llc | Sharing screen content in a mobile environment |
US11573810B1 (en) | 2014-07-11 | 2023-02-07 | Google Llc | Sharing screen content in a mobile environment |
US9582482B1 (en) | 2014-07-11 | 2017-02-28 | Google Inc. | Providing an annotation linking related entities in onscreen content |
US11347385B1 (en) | 2014-07-11 | 2022-05-31 | Google Llc | Sharing screen content in a mobile environment |
US10652706B1 (en) | 2014-07-11 | 2020-05-12 | Google Llc | Entity disambiguation in a mobile environment |
US10592261B1 (en) | 2014-07-11 | 2020-03-17 | Google Llc | Automating user input from onscreen content |
US10244369B1 (en) | 2014-07-11 | 2019-03-26 | Google Llc | Screen capture image repository for a user |
US10248440B1 (en) | 2014-07-11 | 2019-04-02 | Google Llc | Providing a set of user input actions to a mobile device to cause performance of the set of user input actions |
US10491660B1 (en) | 2014-07-11 | 2019-11-26 | Google Llc | Sharing screen content in a mobile environment |
US11704136B1 (en) | 2014-07-11 | 2023-07-18 | Google Llc | Automatic reminders in a mobile environment |
US9965559B2 (en) | 2014-08-21 | 2018-05-08 | Google Llc | Providing automatic actions for mobile onscreen content |
US9703541B2 (en) | 2015-04-28 | 2017-07-11 | Google Inc. | Entity action suggestion on a mobile device |
US10970646B2 (en) | 2015-10-01 | 2021-04-06 | Google Llc | Action suggestions for user-selected content |
US10178527B2 (en) | 2015-10-22 | 2019-01-08 | Google Llc | Personalized entity repository |
US11716600B2 (en) | 2015-10-22 | 2023-08-01 | Google Llc | Personalized entity repository |
US11089457B2 (en) | 2015-10-22 | 2021-08-10 | Google Llc | Personalized entity repository |
US10055390B2 (en) | 2015-11-18 | 2018-08-21 | Google Llc | Simulated hyperlinks on a mobile device based on user intent and a centered selection of text |
US10733360B2 (en) | 2015-11-18 | 2020-08-04 | Google Llc | Simulated hyperlinks on a mobile device |
CN108885856A (zh) * | 2016-03-29 | 2018-11-23 | Sony Corporation | Information processing device, information processing method, and program |
US10535005B1 (en) | 2016-10-26 | 2020-01-14 | Google Llc | Providing contextual actions for mobile onscreen content |
US11734581B1 (en) | 2016-10-26 | 2023-08-22 | Google Llc | Providing contextual actions for mobile onscreen content |
US11237696B2 (en) | 2016-12-19 | 2022-02-01 | Google Llc | Smart assist for repeated actions |
US11860668B2 (en) | 2016-12-19 | 2024-01-02 | Google Llc | Smart assist for repeated actions |
CN108762482A (zh) * | 2018-04-16 | 2018-11-06 | Peking University | Method and system for data interaction between a large screen and augmented reality glasses |
US11348320B2 (en) | 2020-04-02 | 2022-05-31 | Samsung Electronics Company, Ltd. | Object identification utilizing paired electronic devices |
US12026593B2 (en) | 2020-10-15 | 2024-07-02 | Google Llc | Action suggestions for user-selected content |
Also Published As
Publication number | Publication date |
---|---|
WO2009054619A3 (fr) | 2009-06-04 |
Similar Documents
Publication | Publication Date | Title |
---|---|---|
WO2009054619A2 (fr) | Augmented reality computer device | |
US11262835B2 (en) | Human-body-gesture-based region and volume selection for HMD | |
US11676349B2 (en) | Wearable augmented reality devices with object detection and tracking | |
JP6323040B2 (ja) | Image processing apparatus, image processing method, and program | |
US10089794B2 (en) | System and method for defining an augmented reality view in a specific location | |
KR101171660B1 (ko) | Pointing device for augmented reality | |
CN107851299B (zh) | Information processing device, information processing method, and program | |
CN112581571B (zh) | Control method and apparatus for a virtual avatar model, electronic device, and storage medium | |
KR20090040839A (ko) | Augmented reality computer device | |
JP2014029656A (ja) | Image processing apparatus and image processing method | |
JP2014029566A (ja) | Image processing apparatus, image processing method, and image processing program | |
JP2010272078A (ja) | Electronic information board system, electronic information board control apparatus, and cursor control method | |
GB2345538A (en) | Optical tracker | |
KR101582225B1 (ko) | Interactive augmented reality service system and method | |
CN117897682A (zh) | Displaying digital media content on a physical surface | |
JP2005165864A (ja) | Command input method, image display method, and image display apparatus | |
Sobota et al. | Mixed reality: a known unknown | |
WO2023017623A1 (fr) | Information processing device, information processing method, and program | |
JP2005301479A (ja) | Command input device using the projected presenter's motions | |
ZHANG et al. | Opportunistic Interfaces for Augmented Reality: Transforming Everyday Objects into Tangible 6DoF Interfaces Using Ad hoc UI | |
Luo et al. | Research and simulation on virtual movement based on kinect | |
Piechaczek et al. | Popular strategies and methods for using augmented reality | |
CN117496097A (zh) | Digital sand table display system based on AR augmented reality | |
CN115731495A (zh) | Method for manipulating objects in a three-dimensional virtual scene | |
Shoaei Shirehjini | Smartphones as Visual Prosthesis |
Legal Events
Date | Code | Title | Description |
---|---|---|---|
121 | Ep: the epo has been informed by wipo that ep was designated in this application |
Ref document number: 08840914 Country of ref document: EP Kind code of ref document: A2 |
|
NENP | Non-entry into the national phase in: |
Ref country code: DE |
|
122 | Ep: pct application non-entry in european phase |
Ref document number: 08840914 Country of ref document: EP Kind code of ref document: A2 |