EP3673331A1 - System für virtuelle Realität (Virtual reality system) - Google Patents
Virtual reality system
- Publication number
- EP3673331A1 (Application EP18848273.1A)
- Authority
- EP
- European Patent Office
- Prior art keywords
- virtual reality
- projectors
- projector
- reality system
- controller unit
- Prior art date
- Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
- Withdrawn
Classifications
-
- G—PHYSICS
- G03—PHOTOGRAPHY; CINEMATOGRAPHY; ANALOGOUS TECHNIQUES USING WAVES OTHER THAN OPTICAL WAVES; ELECTROGRAPHY; HOLOGRAPHY
- G03B—APPARATUS OR ARRANGEMENTS FOR TAKING PHOTOGRAPHS OR FOR PROJECTING OR VIEWING THEM; APPARATUS OR ARRANGEMENTS EMPLOYING ANALOGOUS TECHNIQUES USING WAVES OTHER THAN OPTICAL WAVES; ACCESSORIES THEREFOR
- G03B21/00—Projectors or projection-type viewers; Accessories therefor
- G03B21/14—Details
-
- G—PHYSICS
- G03—PHOTOGRAPHY; CINEMATOGRAPHY; ANALOGOUS TECHNIQUES USING WAVES OTHER THAN OPTICAL WAVES; ELECTROGRAPHY; HOLOGRAPHY
- G03B—APPARATUS OR ARRANGEMENTS FOR TAKING PHOTOGRAPHS OR FOR PROJECTING OR VIEWING THEM; APPARATUS OR ARRANGEMENTS EMPLOYING ANALOGOUS TECHNIQUES USING WAVES OTHER THAN OPTICAL WAVES; ACCESSORIES THEREFOR
- G03B37/00—Panoramic or wide-screen photography; Photographing extended surfaces, e.g. for surveying; Photographing internal surfaces, e.g. of pipe
- G03B37/04—Panoramic or wide-screen photography; Photographing extended surfaces, e.g. for surveying; Photographing internal surfaces, e.g. of pipe with cameras or projectors providing touching or overlapping fields of view
-
- G—PHYSICS
- G06—COMPUTING OR CALCULATING; COUNTING
- G06F—ELECTRIC DIGITAL DATA PROCESSING
- G06F3/00—Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
- G06F3/01—Input arrangements or combined input and output arrangements for interaction between user and computer
- G06F3/011—Arrangements for interaction with the human body, e.g. for user immersion in virtual reality
-
- H—ELECTRICITY
- H04—ELECTRIC COMMUNICATION TECHNIQUE
- H04N—PICTORIAL COMMUNICATION, e.g. TELEVISION
- H04N9/00—Details of colour television systems
- H04N9/12—Picture reproducers
- H04N9/31—Projection devices for colour picture display, e.g. using electronic spatial light modulators [ESLM]
- H04N9/3141—Constructional details thereof
- H04N9/3147—Multi-projection systems
-
- H—ELECTRICITY
- H04—ELECTRIC COMMUNICATION TECHNIQUE
- H04N—PICTORIAL COMMUNICATION, e.g. TELEVISION
- H04N9/00—Details of colour television systems
- H04N9/12—Picture reproducers
- H04N9/31—Projection devices for colour picture display, e.g. using electronic spatial light modulators [ESLM]
- H04N9/3191—Testing thereof
-
- G—PHYSICS
- G03—PHOTOGRAPHY; CINEMATOGRAPHY; ANALOGOUS TECHNIQUES USING WAVES OTHER THAN OPTICAL WAVES; ELECTROGRAPHY; HOLOGRAPHY
- G03B—APPARATUS OR ARRANGEMENTS FOR TAKING PHOTOGRAPHS OR FOR PROJECTING OR VIEWING THEM; APPARATUS OR ARRANGEMENTS EMPLOYING ANALOGOUS TECHNIQUES USING WAVES OTHER THAN OPTICAL WAVES; ACCESSORIES THEREFOR
- G03B2206/00—Systems for exchange of information between different pieces of apparatus, e.g. for exchanging trimming information, for photo finishing
Definitions
- The present invention relates to a virtual reality system and particularly to controlling the virtual reality system.
- A virtual reality system is a complex system, which includes projectors and a projector control unit.
- The virtual reality system can be controlled by a user operating the projector control unit via a control application executed on an operating system running on the projector control unit.
- The projectors are installed at locations near the roof, for unobstructed projection of the virtual reality environment.
- A user should be trained to use the control application on the projector control unit and to troubleshoot any problems with the projectors, the projector control unit and the control application. Since the projectors are typically near the roof and out of sight, a closer inspection for troubleshooting the projectors is difficult.
- Remote controls can provide control of the projectors, but only one projector at a time.
- The controller unit and the projectors may not be close enough to the user for the user to observe whether they are on or off, e.g. whether the fans are running or not.
- FIGURE 1 illustrates an example of a virtual reality system in accordance with at least some embodiments of the present invention.
- FIGURE 2 illustrates an example of a method for a virtual reality system in accordance with at least some embodiments.
- FIGURE 3 illustrates an example of a method for a virtual reality system in accordance with at least some embodiments.
- FIGURE 4 illustrates an example of a sequence in accordance with at least some embodiments.
- FIGURE 5 illustrates an example of a view generated by the virtual reality system on a surface arrangement.
- FIGURE 6 illustrates an example of a user input device installed to a virtual reality system in accordance with at least some embodiments.
- FIGURE 1 illustrates an example of a virtual reality system (VRS) 100 in accordance with at least some embodiments of the present invention.
- The virtual reality system comprises a projector system comprising a projector controller unit 104 and projectors 106a, 106b, 106c controlled by the projector controller unit for projecting a virtual reality environment on a surface arrangement comprising at least three surfaces.
- The virtual reality system comprises a user input device 108 operatively connected to the projector system for receiving an input from a user for controlling the projector system.
- The VRS may further comprise one or more additional devices 110, 112.
- The additional devices may be utilized for complementing the virtual reality environment provided by the VRS.
- The additional devices may be capable of adjusting and/or improving a true-to-life experience provided by the VRS to a user viewing the virtual reality environment. Additionally or alternatively, the additional devices may serve as user input devices and/or user output devices.
- Examples of the user input devices 108 comprise devices that are capable of receiving user input such as touch or voice and capable of issuing a command to the VRS in response to the user input.
- A user input device capable of receiving touch of the user may be a button, a push-button, a switch, a key, a keypad or a touch screen.
- A user input device capable of receiving voice of the user may be a microphone.
- The user input device may be incorporated in another device, for example a computing device such as a tablet computer or a smart phone. In this way the user input device may serve both for using the applications and/or operating system of the computing device and for operating the virtual reality system.
- Examples of the user output devices comprise devices that are capable of displaying viewable information to the user or capable of haptic communication with the user.
- The user output device may be a display or a touch screen.
- The user output devices may be capable of outputting information to the user in response to an operational state of the VRS and/or a user input.
- The additional devices may comprise sensors. Examples of the sensors comprise temperature sensors, optical sensors, still cameras and video cameras.
- The user input device 108, the additional devices 110, 112, the projector controller 104 and the projectors 106a, 106b, 106c may be connected to a control system 114 of the VRS.
- The control system may be a dedicated system and physically separate from the projector controller.
- The control system may be integrated into the projector controller, whereby the projector controller may control all the devices in the VRS.
- Connections illustrated in FIGURE 1 by straight lines between the devices of the VRS may be implemented by cable connections or wireless connections that are capable of data communications for transfer of information such as messages and commands between the devices.
- Examples of the cable connections comprise an Ethernet connection and an RS-232 serial port connection.
- Examples of the wireless connections comprise a wireless local area network connection according to IEEE 802.11, Zigbee and Bluetooth.
- Connections (not illustrated) between the projectors and the projector controller may be capable of carrying a video signal for projecting the virtual reality environment by the projectors.
- Examples of the connections between the projectors and the projector controller comprise composite video, SCART, Video Graphics Array (VGA), Digital Visual Interface (DVI), High-Definition Multimedia Interface (HDMI) and Mobile High-Definition Link (MHL) connections.
- The VRS 100 may be a virtual cave.
- A virtual cave refers to a room, part of a room or part of a space in a building, where a virtual reality view may be presented on the surface arrangement comprising at least three surfaces, viewable to a person within the virtual cave.
- The surfaces may comprise one or more wall surfaces, a floor surface and/or a ceiling surface.
- Three planar wall surfaces may be hinged together to form a corner, and a floor surface in front of the wall surfaces may serve as the fourth surface of the VRS for presenting a virtual reality view.
- The projector controller may serve as a source of video signal for the projectors such that when the video signals are received by the projectors and projected onto the surfaces, a virtual reality environment may be generated.
- The video signals may be stereo video signals, for example.
- The virtual reality view may be viewed by a person, when the person is observing the projections on the wall, floor and/or ceiling surfaces.
- The virtual cave may have a viewing location. The viewing location is a location on the floor space of the virtual cave, where the virtual reality view may be optimally viewed by a person.
- The VRS may comprise a virtual reality viewing accessory for viewing the virtual reality view.
- The virtual reality viewing accessory may be worn by the person for viewing the virtual reality view. Examples of the virtual reality viewing accessory comprise virtual reality glasses.
- Virtual reality glasses may be adapted for viewing the projected stereo video signal.
- The virtual reality glasses may comprise lenses/films of different colors in front of the left and right eyes.
- FIGURE 2 illustrates an example of a method for a virtual reality system in accordance with at least some embodiments.
- The method may be performed by a control system of the virtual reality system illustrated in FIGURE 1, for example.
- Phase 202 comprises determining, by the virtual reality system, in response to receiving an input from the user by the user input device, whether the projectors and the projector controller are turned on or off.
- Phase 204 comprises turning on, by the virtual reality system, the projectors and the projector controller unit, when the projectors and the projector controller unit are turned off.
- The current operational states, e.g. turned on or turned off, of the projectors and the projector controller unit may be determined, and the current operational states may be used to determine a next operational state that is applied to the projectors and the projector controller unit.
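The phase 202/204 logic above can be sketched as follows. This is a minimal illustration, not the claimed implementation; the `Device` class, its `powered` flag and `turn_on` method are assumptions introduced for the example.

```python
from dataclasses import dataclass

@dataclass
class Device:
    """Hypothetical handle for a projector or the projector controller unit."""
    name: str
    powered: bool = False

    def turn_on(self) -> None:
        self.powered = True

def handle_user_input(controller: Device, projectors: list[Device]) -> None:
    """Phase 202: determine whether the devices are turned on or off.
    Phase 204: turn them on when they are all turned off."""
    devices = [controller, *projectors]
    if all(not d.powered for d in devices):
        for d in devices:
            d.turn_on()
```

A single input on the user input device thus powers the whole projector system on without the user having to operate the control application.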
- FIGURE 3 illustrates an example of a method for a virtual reality system in accordance with at least some embodiments.
- The method may be performed by a control system of the VRS after the VRS, or at least part of the virtual reality system, is turned on, for example according to the method of FIGURE 2.
- Phase 302 comprises receiving a user input from the user by the user input device.
- Phase 304 comprises determining whether the projectors and the projector controller are turned on or off, similarly to phase 202. It should be appreciated that each of the projectors and the projector controller may be turned on or off individually. Accordingly, all or only a part of the projectors and the projector controller may be turned on.
- Phase 306 comprises turning off, by the virtual reality system, the projectors and the projector controller unit, when the projectors and the projector controller unit, or at least part of them, are turned on.
- Otherwise, phase 308 may be performed and the projectors and the projector controller unit may be turned on by the virtual reality system in accordance with phase 204.
- Phase 306 is performed in response to receiving a subsequent user input, after a first user input, within a time limit in phase 310.
- Phase 310 may follow after the projectors and the projector controller unit, or at least part of them, are turned on.
- The method may comprise determining in phase 310 whether the time limit has passed since a previous input from the user by the user input device in phase 302. If the time limit has not passed, the method may proceed to phase 306 for turning off the projectors and the projector controller unit in response to the subsequent user input. On the other hand, if the time limit has passed, the turning off may be omitted and the method may proceed from phase 310 to phase 304 to determine whether the projectors and the projector controller are turned on or off.
- The time of the user input may be registered for facilitating the determination of whether the time limit has passed when a subsequent user input, or information indicating a user input, is received.
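One way to read phases 302-310 is as a small state machine: when the system is off, an input turns it on; when it is at least partly on, a second input within the time limit confirms turning it off. The sketch below assumes a 10-second time limit and timestamped inputs, neither of which is fixed by the description.

```python
class ToggleController:
    """Sketch of phases 302-310 of the described method."""

    def __init__(self, time_limit: float = 10.0):
        self.time_limit = time_limit      # assumed value; not specified in the text
        self.any_on = False               # phase 304: is any device turned on?
        self._last_press: float | None = None

    def on_user_input(self, now: float) -> str:
        if not self.any_on:               # phase 304 -> phase 308: turn on
            self.any_on = True
            self._last_press = None
            return "turned on"
        if (self._last_press is not None
                and now - self._last_press <= self.time_limit):
            self.any_on = False           # phase 310 -> phase 306: confirmed off
            self._last_press = None
            return "turned off"
        self._last_press = now            # treat as a first input; open the window
        return "awaiting confirmation"
```

An input arriving after the time limit has passed is treated as a new first input, so the system cannot be turned off by a single accidental press.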
- FIGURE 4 illustrates an example of a sequence in accordance with at least some embodiments.
- The sequence is illustrated with reference to the items of FIGURE 1.
- Phase 404 comprises receiving, by the control system 114, information indicating a user input from the user input device 108.
- The user input may be received for example as described in phase 302.
- The information indicating the user input may comprise analog or digital data.
- The data may be a message indicating an input of the user.
- Phase 406 may comprise determining operational states of one or more devices of the VRS.
- An operational state may comprise information indicating whether a device is turned on, turned off, in standby and/or has an error. Accordingly, phase 406 may be performed in accordance with phase 202 and phase 304.
- When the device is turned on, the device may be operational in the VRS.
- When the device is operational in the VRS, it may contribute directly or indirectly to generating the virtual reality environment and/or supporting the operation of the VRS. In this state, the device is consuming electrical power.
- When the device is turned off, the device does not contribute directly or indirectly to generating the virtual reality environment and/or supporting the operation of the VRS, and the device is not consuming electrical power.
- When the device is in standby, the device may not contribute directly or indirectly to generating the virtual reality environment and/or supporting the operation of the VRS, but the device may be turned on more quickly than if the device was turned off. In the standby state the power consumption of the device is less than if the device was turned on, but higher than if the device was turned off.
- A startup sequence of the device from the standby state to the turned-on state may be shorter in time than a startup sequence of the device from the turned-off state to the turned-on state.
- The shorter time may be achieved by keeping at least some parts of the device powered in the standby state, such that the number of operations in the startup sequence from the standby state to the turned-on state is less than the number of operations in the startup sequence from the turned-off state to the turned-on state.
- Determining the operational state in phase 406 may comprise determining whether the projectors and the projector controller are turned on or off, similarly to phase 202. Alternatively or additionally, one or more other states may be determined. It should be appreciated that, instead or in addition, states of other devices, such as the additional devices 110, 112 of the VRS, may be determined.
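The operational states and the shorter standby startup described above can be summarized in code. The state names follow the text; the concrete startup steps are illustrative assumptions, since the description only says that the standby startup comprises fewer operations.

```python
from enum import Enum, auto

class OperationalState(Enum):
    TURNED_ON = auto()   # contributes to the VR environment; full power draw
    STANDBY = auto()     # parts kept powered; faster startup, reduced power draw
    TURNED_OFF = auto()  # no contribution; no power draw
    ERROR = auto()

# Illustrative startup sequences (the concrete steps are assumptions).
# Fewer steps from STANDBY models the shorter startup time described above.
STARTUP_STEPS = {
    OperationalState.TURNED_OFF: [
        "power main supply", "boot firmware", "warm up lamp", "sync video input",
    ],
    OperationalState.STANDBY: ["warm up lamp", "sync video input"],
}
```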
- After phase 406, a next operational state may be determined for one or more of the devices of the VRS.
- Phase 410 may comprise turning on the projectors and the projector controller unit, when the projectors and the projector controller unit are turned off.
- Phase 410 may comprise turning off the projectors and the projector controller unit, when the projectors and the projector controller unit, or at least part of them, are turned on.
- Phase 408 may comprise receiving a subsequent user input within a time limit.
- Phase 410 may comprise turning off the projectors and the projector controller unit in response to receiving the subsequent user input in phase 408. Since the devices are turned off in response to the user input within a time limit, turning off the devices is essentially confirmed by the user by the subsequent user input before phase 410 is executed. If the user input is not received within the time limit, phase 410 may be omitted and the user input may be determined to be a first user input, for example the user input communicated in phase 404.
- Phases 406 and 410 may comprise messaging for obtaining information, such as information indicating operational states, and commands for applying next operational states.
- The messaging may comprise requests and responses. Responses may be omitted, for example for a command to a device to apply a next operational state.
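The request/response messaging of phases 406 and 410 might look like the following sketch. The description does not specify a message format; the JSON encoding and the field names are assumptions introduced for illustration.

```python
import json

def state_request(device_id: str) -> str:
    """Request for which a response carrying the device's operational state
    is expected (phase 406)."""
    return json.dumps({"type": "get_state", "device": device_id})

def apply_state_command(device_id: str, next_state: str) -> str:
    """Command applying a next operational state (phase 410); as noted above,
    a response may be omitted for such a command."""
    return json.dumps({"type": "set_state", "device": device_id,
                       "state": next_state, "expects_response": False})
```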
- FIGURE 5 illustrates an example of a view 502 generated by the virtual reality system on a surface arrangement.
- The view may comprise a message 504 indicating to the user that a confirmation of a user input is needed.
- The message may read, for example, "Do you want to turn off the system?".
- The message may be displayed, for example, after a first user input is received and it is determined that the projectors and the projector controller unit, or at least part of them, are turned on, for example in accordance with phase 304. Accordingly, the message may be displayed by the virtual reality system during a time limit for receiving a subsequent user input for turning off the projectors and the projector controller.
- If the subsequent user input is received within the time limit, the projectors and the projector control unit may be turned off, for example in accordance with phases 306 and 410. However, if the time limit passes without the subsequent user input, the message may be removed. In this way the virtual reality system displays a message during the time limit for indicating a need for the subsequent user input, whereby the user may be guided to give a further user input by the user input device to confirm that the devices are to be turned off.
- The devices in the VRS may be other devices than the projectors and the projector control unit, and the next operational state may be another state than the turned-off state. Examples of the operational states comprise turned on, turned off, standby and/or error.
- FIGURE 6 illustrates an example of a user input device installed to a virtual reality system in accordance with at least some embodiments.
- The virtual reality system may be in accordance with the virtual reality system described with FIGURE 1.
- The VRS may comprise a surface arrangement comprising surfaces 602 on which a video signal from the projectors may be projected.
- A user input device 604 may be installed on a side 606 of at least one of the surfaces. In this way the video signal is not projected on the user input device, and user contact with the surface in connection with operating the user input device may be prevented.
- The side of the surface 602 may be a vertical side of the surface. The side may face a direction away from the projectors.
- An embodiment concerns a computer program comprising executable code which, when executed by a virtual reality system comprising a projector system comprising a projector controller unit and projectors controlled by the projector controller unit for projecting a virtual reality environment on a surface arrangement comprising at least three surfaces, and a user input device connected to the projector system, is configured to cause one or more functionalities according to an embodiment.
- An apparatus, virtual reality system, control system and projector control unit may comprise memory and processor.
- Memory may comprise random-access memory and/or permanent memory.
- Memory may comprise at least one RAM chip.
- Memory may comprise solid-state, magnetic, optical and/or holographic memory, for example.
- Memory may be at least in part accessible to processor, a computer or other means for controlling.
- Memory may be at least in part comprised in processor.
- Memory may be means for storing information.
- Memory may be non-transitory computer readable medium.
- Memory may comprise computer instructions that processor is configured to execute. The computer instructions may be executable code of a computer program.
- When the memory stores computer instructions configured to cause the processor to perform certain actions, the processor and/or its at least one processing core may be considered to be configured to perform said certain actions.
- Memory may be at least in part external to device but accessible to device.
- Processor may comprise, for example, a single- or multi-core processor wherein a single-core processor comprises one processing core and a multi-core processor comprises more than one processing core.
- Processor may comprise more than one processor.
- a processing core may comprise, for example, a Cortex-A8 processing core manufactured by ARM Holdings or a Steamroller processing core produced by Advanced Micro Devices Corporation.
- Processor may comprise for example Qualcomm Snapdragon and/or Intel Atom processor.
- Processor may comprise at least one application-specific integrated circuit, ASIC.
- Processor may comprise at least one field- programmable gate array, FPGA.
- Processor may be means for performing method steps in device.
- Processor may be configured, at least in part by computer instructions, to perform actions.
Landscapes
- Engineering & Computer Science (AREA)
- Physics & Mathematics (AREA)
- General Physics & Mathematics (AREA)
- Multimedia (AREA)
- Signal Processing (AREA)
- General Engineering & Computer Science (AREA)
- Theoretical Computer Science (AREA)
- Human Computer Interaction (AREA)
- Controls And Circuits For Display Device (AREA)
- Digital Computer Display Output (AREA)
Applications Claiming Priority (2)
| Application Number | Priority Date | Filing Date | Title |
|---|---|---|---|
| FI20175749A (FI127603B) | 2017-08-23 | 2017-08-23 | Virtual Reality System |
| PCT/FI2018/050598 (WO2019038478A1) | 2017-08-23 | 2018-08-23 | VIRTUAL REALITY SYSTEM |
Publications (2)
| Publication Number | Publication Date |
|---|---|
| EP3673331A1 (de) | 2020-07-01 |
| EP3673331A4 (de) | 2021-05-19 |
Family
ID=62103890
Family Applications (1)
| Application Number | Title | Priority Date | Filing Date |
|---|---|---|---|
| EP18848273.1A (EP3673331A4, withdrawn) | System für virtuelle Realität (Virtual reality system) | 2017-08-23 | 2018-08-23 |
Country Status (3)
| Country | Link |
|---|---|
| EP (1) | EP3673331A4 (de) |
| FI (1) | FI127603B (de) |
| WO (1) | WO2019038478A1 (de) |
Family Cites Families (6)
| Publication number | Priority date | Publication date | Assignee | Title |
|---|---|---|---|---|
| JP2000338941A (ja) | 1999-05-27 | 2000-12-08 | Seiko Epson Corp | Projection display device |
| JP6149394B2 (ja) * | 2012-03-21 | 2017-06-21 | Seiko Epson Corporation | Projector and projector system |
| US9298071B2 (en) | 2012-07-12 | 2016-03-29 | Cj Cgv Co., Ltd. | Multi-projection system |
| JP6127443B2 (ja) | 2012-10-19 | 2017-05-17 | Casio Computer Co., Ltd. | Projection apparatus and projection state adjustment method |
| KR101489261B1 (ko) | 2013-08-26 | 2015-02-04 | CJ CGV Co., Ltd. | Apparatus and method for managing theater parameters |
| CA2969226A1 (en) | 2014-12-03 | 2016-06-09 | Barco, Inc. | Systems and methods for an immersion theater environment with dynamic screens |
- 2017
  - 2017-08-23: FI application FI20175749A, patent FI127603B (en), active (IP Right Grant)
- 2018
  - 2018-08-23: EP application EP18848273.1A, publication EP3673331A4 (de), not active (Withdrawn)
  - 2018-08-23: WO application PCT/FI2018/050598, publication WO2019038478A1 (en), not active (Ceased)
Also Published As
| Publication number | Publication date |
|---|---|
| WO2019038478A1 (en) | 2019-02-28 |
| EP3673331A4 (de) | 2021-05-19 |
| FI20175749A (fi) | 2018-04-27 |
| FI20175749A7 (fi) | 2018-04-27 |
| FI127603B (en) | 2018-10-15 |
| FI20175749L (fi) | 2018-04-27 |
Similar Documents
| Publication | Publication Date | Title |
|---|---|---|
| EP3026875B1 (de) | | Method and apparatus for setting the operating state of a smart household appliance |
| US10564833B2 | | Method and apparatus for controlling devices |
| CN103154856B (zh) | | Context-dependent dynamic range control for gesture recognition |
| US20170019518A1 | | Method and apparatus for controlling devices |
| US20170322709A1 | | Split-screen display method, apparatus and medium |
| CN112327653B (zh) | | Device control method, device control apparatus and storage medium |
| US20150128050A1 | | User interface for internet of everything environment |
| WO2022158820A1 | | Systems and methods for manipulating views and shared objects in XR space |
| EP2930705A1 (de) | | Method and apparatus for controlling a smart terminal |
| CN108885495A (zh) | | Electronic device and method for providing information in an electronic device |
| CN115509361B (zh) | | Virtual space interaction method, apparatus, device and medium |
| US10270614B2 | | Method and device for controlling timed task |
| JP6496805B2 (ja) | | Method, apparatus, mobile terminal, program and recording medium for displaying a WiFi signal icon |
| CN107452119A (zh) | | Real-time virtual reality navigation method and system |
| US20180139790A1 | | Methods, apparatuses and storage medium for controlling a wireless connection |
| WO2021143899A1 (zh) | | Call connection method, apparatus, storage medium and terminal |
| US20160123622A1 | | Air purification notification method and apparatus, user equipment and system |
| WO2019137155A1 (zh) | | Method and apparatus for switching screen display mode, storage medium and electronic apparatus |
| EP3425533A1 (de) | | Displaying a page |
| US11157085B2 | | Method and apparatus for switching display mode, mobile terminal and storage medium |
| US20240066716A1 | | Interactive method, electronic device, and storage medium |
| EP3995942B1 (de) | | Method, apparatus and storage medium for small-screen window display |
| EP3291489B1 (de) | | Method and apparatus for device identification |
| EP3673331A1 (de) | | Virtual reality system |
| CN109196860B (zh) | | Multi-view image control method and related apparatus |
Legal Events
| Date | Code | Title | Description |
|---|---|---|---|
| | STAA | Information on the status of an ep patent application or granted ep patent | Free format text: STATUS: THE INTERNATIONAL PUBLICATION HAS BEEN MADE |
| | PUAI | Public reference made under article 153(3) epc to a published international application that has entered the european phase | Free format text: ORIGINAL CODE: 0009012 |
| | STAA | Information on the status of an ep patent application or granted ep patent | Free format text: STATUS: REQUEST FOR EXAMINATION WAS MADE |
| 2020-03-05 | 17P | Request for examination filed | Effective date: 20200305 |
| | AK | Designated contracting states | Kind code of ref document: A1; Designated state(s): AL AT BE BG CH CY CZ DE DK EE ES FI FR GB GR HR HU IE IS IT LI LT LU LV MC MK MT NL NO PL PT RO RS SE SI SK SM TR |
| | AX | Request for extension of the european patent | Extension state: BA ME |
| | DAV | Request for validation of the european patent (deleted) | |
| | DAX | Request for extension of the european patent (deleted) | |
| 2021-04-19 | A4 | Supplementary search report drawn up and despatched | Effective date: 20210419 |
| | RIC1 | Information provided on ipc code assigned before grant | Ipc: G03B 37/04 (20210101) AFI20210413BHEP; Ipc: G03B 21/00 (20060101) ALI20210413BHEP; Ipc: G08C 17/02 (20060101) ALI20210413BHEP |
| | GRAP | Despatch of communication of intention to grant a patent | Free format text: ORIGINAL CODE: EPIDOSNIGR1 |
| | STAA | Information on the status of an ep patent application or granted ep patent | Free format text: STATUS: GRANT OF PATENT IS INTENDED |
| 2023-01-17 | INTG | Intention to grant announced | Effective date: 20230117 |
| | STAA | Information on the status of an ep patent application or granted ep patent | Free format text: STATUS: THE APPLICATION IS DEEMED TO BE WITHDRAWN |
| 2023-05-31 | 18D | Application deemed to be withdrawn | Effective date: 20230531 |