WO2020043764A1 - An odour-based interactive system - Google Patents

An odour-based interactive system

Info

Publication number
WO2020043764A1
WO2020043764A1 (application PCT/EP2019/072930)
Authority
WO
WIPO (PCT)
Prior art keywords
odour
virtual
augmented reality
containers
valves
Prior art date
Application number
PCT/EP2019/072930
Other languages
French (fr)
Inventor
Jonas OLOFSSON
Peter LUNDÉN
Simon NIEDENTHAL
Original Assignee
Olofsson Jonas
Lunden Peter
Niedenthal Simon
Priority date
Filing date
Publication date
Application filed by Olofsson Jonas, Lunden Peter, Niedenthal Simon
Publication of WO2020043764A1


Classifications

    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06FELECTRIC DIGITAL DATA PROCESSING
    • G06F3/00Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
    • G06F3/01Input arrangements or combined input and output arrangements for interaction between user and computer
    • G06F3/011Arrangements for interaction with the human body, e.g. for user immersion in virtual reality
    • AHUMAN NECESSITIES
    • A61MEDICAL OR VETERINARY SCIENCE; HYGIENE
    • A61LMETHODS OR APPARATUS FOR STERILISING MATERIALS OR OBJECTS IN GENERAL; DISINFECTION, STERILISATION OR DEODORISATION OF AIR; CHEMICAL ASPECTS OF BANDAGES, DRESSINGS, ABSORBENT PADS OR SURGICAL ARTICLES; MATERIALS FOR BANDAGES, DRESSINGS, ABSORBENT PADS OR SURGICAL ARTICLES
    • A61L9/00Disinfection, sterilisation or deodorisation of air
    • A61L9/015Disinfection, sterilisation or deodorisation of air using gaseous or vaporous substances, e.g. ozone
    • A61L9/02Disinfection, sterilisation or deodorisation of air using gaseous or vaporous substances, e.g. ozone using substances evaporated in the air by heating or combustion
    • A61L9/03Apparatus therefor
    • A61L9/035Apparatus therefor emanating multiple odours
    • AHUMAN NECESSITIES
    • A63SPORTS; GAMES; AMUSEMENTS
    • A63FCARD, BOARD, OR ROULETTE GAMES; INDOOR GAMES USING SMALL MOVING PLAYING BODIES; VIDEO GAMES; GAMES NOT OTHERWISE PROVIDED FOR
    • A63F13/00Video games, i.e. games using an electronically generated display having two or more dimensions
    • A63F13/25Output arrangements for video game devices
    • A63F13/28Output arrangements for video game devices responding to control signals received from the game device for affecting ambient conditions, e.g. for vibrating players' seats, activating scent dispensers or affecting temperature or light
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06FELECTRIC DIGITAL DATA PROCESSING
    • G06F3/00Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
    • G06F3/01Input arrangements or combined input and output arrangements for interaction between user and computer
    • AHUMAN NECESSITIES
    • A61MEDICAL OR VETERINARY SCIENCE; HYGIENE
    • A61BDIAGNOSIS; SURGERY; IDENTIFICATION
    • A61B5/00Measuring for diagnostic purposes; Identification of persons
    • A61B5/40Detecting, measuring or recording for evaluating the nervous system
    • A61B5/4005Detecting, measuring or recording for evaluating the nervous system for evaluating the sensory system
    • A61B5/4011Evaluating olfaction, i.e. sense of smell
    • AHUMAN NECESSITIES
    • A61MEDICAL OR VETERINARY SCIENCE; HYGIENE
    • A61LMETHODS OR APPARATUS FOR STERILISING MATERIALS OR OBJECTS IN GENERAL; DISINFECTION, STERILISATION OR DEODORISATION OF AIR; CHEMICAL ASPECTS OF BANDAGES, DRESSINGS, ABSORBENT PADS OR SURGICAL ARTICLES; MATERIALS FOR BANDAGES, DRESSINGS, ABSORBENT PADS OR SURGICAL ARTICLES
    • A61L2209/00Aspects relating to disinfection, sterilisation or deodorisation of air
    • A61L2209/10Apparatus features
    • A61L2209/13Dispensing or storing means for active compounds
    • AHUMAN NECESSITIES
    • A63SPORTS; GAMES; AMUSEMENTS
    • A63FCARD, BOARD, OR ROULETTE GAMES; INDOOR GAMES USING SMALL MOVING PLAYING BODIES; VIDEO GAMES; GAMES NOT OTHERWISE PROVIDED FOR
    • A63F13/00Video games, i.e. games using an electronically generated display having two or more dimensions
    • A63F13/20Input arrangements for video game devices
    • A63F13/21Input arrangements for video game devices characterised by their sensors, purposes or types
    • AHUMAN NECESSITIES
    • A63SPORTS; GAMES; AMUSEMENTS
    • A63FCARD, BOARD, OR ROULETTE GAMES; INDOOR GAMES USING SMALL MOVING PLAYING BODIES; VIDEO GAMES; GAMES NOT OTHERWISE PROVIDED FOR
    • A63F2300/00Features of games using an electronically generated display having two or more dimensions, e.g. on a television screen, showing representations related to the game
    • A63F2300/80Features of games using an electronically generated display having two or more dimensions, e.g. on a television screen, showing representations related to the game specially adapted for executing a specific type of game
    • A63F2300/8082Virtual reality
    • BPERFORMING OPERATIONS; TRANSPORTING
    • B65CONVEYING; PACKING; STORING; HANDLING THIN OR FILAMENTARY MATERIAL
    • B65DCONTAINERS FOR STORAGE OR TRANSPORT OF ARTICLES OR MATERIALS, e.g. BAGS, BARRELS, BOTTLES, BOXES, CANS, CARTONS, CRATES, DRUMS, JARS, TANKS, HOPPERS, FORWARDING CONTAINERS; ACCESSORIES, CLOSURES, OR FITTINGS THEREFOR; PACKAGING ELEMENTS; PACKAGES
    • B65D2203/00Decoration means, markings, information elements, contents indicators
    • B65D2203/12Audible, olfactory or visual signalling means
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06FELECTRIC DIGITAL DATA PROCESSING
    • G06F3/00Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
    • G06F3/01Input arrangements or combined input and output arrangements for interaction between user and computer
    • G06F3/011Arrangements for interaction with the human body, e.g. for user immersion in virtual reality
    • G06F3/014Hand-worn input/output arrangements, e.g. data gloves
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06KGRAPHICAL DATA READING; PRESENTATION OF DATA; RECORD CARRIERS; HANDLING RECORD CARRIERS
    • G06K11/00Methods or arrangements for graph-reading or for converting the pattern of mechanical parameters, e.g. force or presence, into electrical signal
    • G06K11/06Devices for converting the position of a manually-operated writing or tracing member into an electrical signal
    • HELECTRICITY
    • H04ELECTRIC COMMUNICATION TECHNIQUE
    • H04LTRANSMISSION OF DIGITAL INFORMATION, e.g. TELEGRAPHIC COMMUNICATION
    • H04L67/00Network arrangements or protocols for supporting network services or applications
    • H04L67/01Protocols
    • H04L67/131Protocols for games, networked simulations or virtual reality


Abstract

It is provided an apparatus, a system and a method for releasing and interacting with odours in a virtual/augmented reality system. The apparatus comprises: a housing arranged for holding one or more odour containers; one or more odour containers, each containing an odour source; a valve at an inlet of each of the one or more odour containers and a valve at an outlet of each of the one or more odour containers, wherein the pair of valves of each of the one or more odour containers is manoeuvred synchronously; and a control unit arranged to: receive odour control signals from a virtual/augmented reality system; form a valve control signal for said at least one pair of valves and transmit said valve control signal to the at least one pair of valves; transmit location information of the apparatus and timing information of a trigger event of activation of the odour release to the virtual/augmented reality system; and act as an access point.

Description

AN ODOUR-BASED INTERACTIVE SYSTEM
TECHNICAL FIELD
The invention relates to a method, user device, system, computer program and computer program product for an interactive odour generation in a virtual reality (VR) or augmented reality (AR) environment.
BACKGROUND
Augmented reality is an interactive computer-generated experience of a real-world environment taking place in a simulated environment or a combination of a simulated and a real-world environment. Digital environments, including computer games and virtual or augmented reality environments, usually emphasise visual and auditory stimuli. The simulations mostly incorporate visual, auditory or tactile sensory feedback. Current virtual or augmented reality technology most commonly uses virtual reality headsets with head-mounted displays in order to provide a user with an experience of looking around in the artificial world, moving around and interacting with virtual features or items. The headsets may comprise head motion tracking sensors, which may include gyroscopes, accelerometers, eye tracking sensors, gaming controllers, etc. Other input devices used in virtual or augmented reality environments may include speech recognition systems and/or gesture recognition systems embedded in a peripheral device. Based on the data received from sensors and from other input devices, a virtual/augmented reality environment generating computer analyses the received data (sensed visual, audio and other data) in order to generate and position augmented or virtual reality environments.
SUMMARY
Virtual reality systems may be used for education, training, learning and testing. Current solutions do not create interactive systems that engage the sense of smell for smell-based interactions. It is an objective to improve the virtual/augmented reality environment by also incorporating olfactory sensory feedback. Thereby the system dynamics may be improved.
According to a first aspect, it is provided an apparatus for releasing and interacting with odours. The apparatus comprises: a housing arranged for holding one or more odour containers; one or more odour containers, each odour container containing an odour source; a valve at an inlet of each of the one or more odour containers and a valve at an outlet of each of the one or more odour containers, wherein the pair of valves of each of the one or more odour containers is manoeuvred synchronously; a servomotor arranged to actuate the at least one pair of valves; and a control unit. The control unit is arranged to: receive odour control signals from a virtual/augmented reality system; form a valve control signal for said at least one pair of valves and transmit said valve control signal to the at least one pair of valves; control the servomotor; transmit location information of the apparatus and timing information of a trigger event of activation of an odour release to the virtual/augmented reality system; and act as an access point.
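Purely as an illustration of how the control unit of the first aspect could be organised in software, the sketch below models an odour control signal and the synchronous handling of a valve pair; all class, field and method names are hypothetical and not taken from the disclosure.

```python
from dataclasses import dataclass


@dataclass
class OdourControlSignal:
    """One odour release request from the VR/AR system (hypothetical message layout)."""
    container_id: int   # which odour container to address
    intensity: float    # 0.0 (closed) to 1.0 (fully open)
    duration_s: float   # how long the valve pair stays open


class ControlUnit:
    """Minimal sketch of the control-unit responsibilities listed above."""

    def on_odour_control_signal(self, signal: OdourControlSignal) -> None:
        # Form a valve control signal and pass it on to the addressed valve pair;
        # both valves of the pair receive the same setting (synchronous manoeuvring).
        opening = max(0.0, min(1.0, signal.intensity))
        self.drive_valve_pair(signal.container_id, opening, signal.duration_s)

    def drive_valve_pair(self, container_id: int, opening: float, duration_s: float) -> None:
        ...  # command the servomotor that actuates the inlet and outlet valves

    def report_trigger_event(self, location, trigger_time) -> None:
        ...  # send apparatus location and trigger-event timing back to the VR/AR system
```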
The apparatus may further comprise a fan comprising an air distributor generating an airflow through the housing towards an outlet of the housing, through the one or more pairs of valves and a mixer area.
The odour source may be absorbed in a porous material.
The apparatus may further comprise a trigger button for activating the odour release.
The control unit may be arranged to form a fan control signal for the fan for controlling the speed of the airflow generated by the fan.
According to a second aspect, it is provided a method performed by a VR/AR system in a VR/AR based environment. The method comprises: receiving location information from an apparatus for releasing and interacting with odours; receiving timing information of a trigger event of activation of the odour release; determining a control signal for release of odours based on pre-determined rules, on the received location information of the apparatus for releasing and interacting with odours and on the timing information of the trigger event of activation of the odour release; and transmitting the determined control signals to the control unit of the apparatus.
The method further comprises the steps of: receiving a user response upon the activation of the odour release; comparing the user response information with a desired user response to an odour stimulus; based on said comparison, determining the user performance; and, based on the user performance, providing a new interactive virtual/augmented reality based system with a modified odour stimulus.
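A minimal sketch of how the steps of the second aspect could be orchestrated on the VR/AR side is given below; the apparatus, rules and vr_state objects and their methods are illustrative assumptions, not the claimed implementation.

```python
def run_odour_interaction_step(vr_state, apparatus, rules):
    """One pass of the second-aspect method (illustrative sketch, hypothetical helpers)."""
    # 1) Receive location and trigger-event timing from the apparatus.
    location = apparatus.receive_location()
    trigger_time = apparatus.receive_trigger_time()

    # 2) Determine an odour control signal from the pre-determined rules.
    control_signal = rules.select_odour(location=location,
                                        trigger_time=trigger_time,
                                        scene=vr_state.current_scene)

    # 3) Transmit the control signal to the control unit of the apparatus.
    apparatus.send_control_signal(control_signal)

    # 4) Receive the user's response to the released odour and score it.
    response = vr_state.wait_for_user_response()
    performance = rules.compare(response, rules.desired_response(control_signal))

    # 5) Feed the performance back as a modified odour stimulus / environment.
    vr_state.apply_feedback(performance)
    return performance
```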
According to a third aspect, it is provided a computer program for a virtual/augmented reality based system, the computer program comprising computer program code which causes the system to perform a method according to the second aspect.
According to a fourth aspect, it is provided a computer program product comprising a computer program according to the third aspect and a computer readable means on which the computer program is stored.
According to a fifth aspect, it is provided a virtual/augmented reality based system comprising: a computing device; a memory storing a VR/AR program for the VR/AR based system; means for providing a visual, auditory and tactile simulation; an apparatus for releasing and interacting with odours according to the first aspect; and a VR/AR program configured to perform according to the third aspect.
BRIEF DESCRIPTION OF THE DRAWINGS
The invention is now described, by way of example, with reference to the accompanying drawings, in which: Fig 1 discloses an example of an apparatus for releasing and interacting with odours.
Fig 2 discloses an example of a virtual or augmented reality environment system comprising an apparatus for releasing and interacting with odours.
DETAILED DESCRIPTION
The invention will now be described more fully hereinafter with reference to the accompanying drawings, in which certain embodiments of the invention are shown. This invention may, however, be embodied in many different forms and should not be construed as limited to the
embodiments set forth herein; rather, these embodiments are provided by way of example so that this disclosure will be thorough and complete, and will fully convey the scope of the invention to those skilled in the art. Like numbers refer to like elements throughout the description.
The invention relates to a virtual reality (VR) or augmented reality (AR) system, wherein the VR/AR system comprises an apparatus for releasing and interacting with odours.
Fig 1 discloses an example of an apparatus 100 for releasing and interacting with odours. The apparatus 100 comprises a housing 110 arranged for holding one or more odour containers 130. The one or more odour containers 130 comprise an odour source which may be absorbed in a porous material. An odour source can be liquid or non-liquid.
The apparatus may further comprise one or a plurality of fans 120. The fan may be a single fan 120. An air flow is generated by the single fan 120. The speed of the fan can be controlled by a control unit 200.
A system of valves 140, 150 controls the airflow and the delivery of odours. There are in the illustrated example two valves for each odour container 130. An inlet valve 140 is arranged at the inlet of the odour container. An outlet valve 150 is arranged at the outlet of the odour container. The valves effectively close the container when no odour should be delivered from that container. For each of the one or more odours, the pair of valves 140, 150 is manoeuvred synchronously and proportionally shifted between one scented airflow (through the one or more containers 130) and one clean airflow 135. The valves 140, 150 are controlled by the control unit 200 to mix clean air with the scented air in order to keep the two airflows, the scented one and the clean one, balanced and the total airflow constant. The apparatus 100 can deliver one odour or more odours in an arbitrary mixture, with the intensity of each odour being controlled continuously.
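One possible reading of this proportional shifting, assuming a single shared clean-air path and one opening fraction per odour channel, is the mixing rule sketched below: the clean airflow 135 takes up whatever fraction the scented paths do not use, so the summed flow stays constant. The function and its exact scaling are illustrative, not taken from the disclosure.

```python
def mix_airflows(intensities):
    """Given desired odour intensities in [0, 1], return valve openings such that
    the total airflow (scented + clean) stays constant.

    intensities: dict mapping container id -> desired intensity of that odour.
    Returns (scented_openings, clean_air_opening).
    """
    # Clamp each requested intensity to the physically possible range.
    scented = {cid: max(0.0, min(1.0, x)) for cid, x in intensities.items()}

    total_scented = sum(scented.values())
    if total_scented > 1.0:
        # Scale down proportionally so the combined scented flow never exceeds
        # the outlet capacity; relative odour ratios are preserved.
        scented = {cid: x / total_scented for cid, x in scented.items()}
        total_scented = 1.0

    # The clean airflow makes up the remainder, keeping the total constant.
    clean_air = round(1.0 - total_scented, 3)
    return scented, clean_air


# Example: 30 % of odour 1 and 20 % of odour 2 leaves 50 % clean air.
print(mix_airflows({1: 0.3, 2: 0.2}))   # ({1: 0.3, 2: 0.2}, 0.5)
```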
In practice, the one or more pairs of valves 140, 150 are in the illustrated example actuated by one or more actuators 160, controlled by the control unit 200. The control unit is arranged to communicate with the rest of the environment.
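Since the valve pairs are actuated by servomotor-type actuators 160, forming a valve control signal can amount to mapping an opening fraction onto a servo position. The sketch below assumes a standard hobby servo driven by a 1-2 ms pulse at 50 Hz; the helper name and pulse range are assumptions, not part of the disclosure.

```python
def opening_to_pulse_us(opening: float,
                        closed_pulse_us: int = 1000,
                        open_pulse_us: int = 2000) -> int:
    """Map a valve opening fraction (0 = closed, 1 = fully open) to a servo
    pulse width in microseconds, assuming a standard 1-2 ms hobby servo."""
    opening = max(0.0, min(1.0, opening))
    return round(closed_pulse_us + opening * (open_pulse_us - closed_pulse_us))


# Both valves of a pair receive the same command, so they move synchronously.
print(opening_to_pulse_us(0.0))   # 1000 -> valve pair closed
print(opening_to_pulse_us(0.5))   # 1500 -> half open
print(opening_to_pulse_us(1.0))   # 2000 -> fully open
```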
The control unit 200 is set up as an access point (e.g. a Wi-Fi access point) and is capable of receiving input signals from the VR/AR system 300 by Wi-Fi or any other wireless networking technology. The communication can be based on the Open Sound Control (OSC) protocol. The control unit is arranged to process the received signals and to transmit the processed signals to the valve mechanism 140, 150, so that a specific odour or a combination of odours can be released by controlling the one or more pairs of valves 140, 150. As understood from the above, the airflow is directed through the containers 130 inside the apparatus 100 and toward an outlet 180 of the apparatus, thereby odorising the air delivered at the outlet 180.
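As the disclosure mentions the Open Sound Control (OSC) protocol, the control unit's receive path could be sketched as below using the python-osc package; the OSC address /odour/release, its argument layout and the port number are assumptions for illustration only.

```python
from pythonosc.dispatcher import Dispatcher
from pythonosc.osc_server import ThreadingOSCUDPServer


def handle_odour_release(address, container_id, intensity):
    """Called when the VR/AR system sends an odour control message."""
    # In the real apparatus this would form a valve control signal and
    # drive the servomotor of the addressed valve pair.
    print(f"{address}: open container {int(container_id)} at intensity {float(intensity):.2f}")


dispatcher = Dispatcher()
dispatcher.map("/odour/release", handle_odour_release)

if __name__ == "__main__":
    # The control unit listens on the wireless network it provides as access point.
    server = ThreadingOSCUDPServer(("0.0.0.0", 9000), dispatcher)
    server.serve_forever()
```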
Moreover, the control unit 200 is capable of obtaining location information, either by location equipment or relative to other user input or output devices, by a proximity sensor or any other type of equipment able to detect the presence of nearby objects. The control unit is further arranged to transmit the obtained location information of the apparatus and the timing information of an odour stimulus trigger event.
Fig 2 illustrates a VR/AR system that is responsive also to odour stimuli. The system determines how the user actions match the desired behavioural response to the odour stimulus. This information is used to determine user performance (e.g. correct vs incorrect). After the user performance outcome is evaluated, it is translated into feedback that is presented to the user in terms of a modified virtual reality environment (e.g. an accumulated score, increased or decreased difficulty, etc.) where the subsequent stages of interaction are shaped based on the determined user performance. Hence, the user interaction determines outcomes. The user does not have to make any decisions about the choice of a specific odour or a combination of odours that will be released. The decisions are made based on the pre-determined rules of the VR/AR system. The pre-determined rules are based on how the user actions match the desired behavioural response to the odour stimulus, in connection with the user's position and timing.
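Read this way, the pre-determined rules map a (user action, position, timing) tuple to a correctness judgement that then shapes the next stimulus; the sketch below is an illustrative interpretation with hypothetical names, not the claimed scoring scheme.

```python
def evaluate_response(user_action, desired_action, response_time_s, max_time_s=5.0):
    """Compare the user's action with the desired behavioural response to the
    odour stimulus and return a simple performance record."""
    correct = (user_action == desired_action) and (response_time_s <= max_time_s)
    return {"correct": correct, "response_time_s": response_time_s}


def adapt_stimulus(stimulus, performance):
    """Shape the next stage of interaction from the determined performance,
    here by changing odour intensity (a stand-in for score or difficulty)."""
    intensity = stimulus["intensity"]
    # Correct answers make the task harder (weaker odour); errors make it easier.
    intensity *= 0.8 if performance["correct"] else 1.25
    return {**stimulus, "intensity": max(0.05, min(1.0, intensity))}


stimulus = {"odour": "lavender", "intensity": 0.5}
performance = evaluate_response("sniff_and_name_lavender", "sniff_and_name_lavender", 2.3)
print(adapt_stimulus(stimulus, performance))   # next stimulus at intensity 0.4
```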
The apparatus 100 is in the illustrated example attached to a hand controller 250 comprised in the VR/AR system. In another example, the user holds the apparatus. The user may place an outlet of the apparatus 100 in close vicinity of the nose in order to smell objects visible or sensible in the VR/AR environment. Triggering of an odour stimulus may be controlled by the user pushing a trigger button on the hand controller 250, or a trigger button integrated in the apparatus. Triggering of the odour stimulus may also be controlled by the VR/AR system. The VR/AR system is able to sense the position of the apparatus 100 relative to the user by e.g. a hand controller 250 and a head mounted display that the user is wearing, or any other user input or user output devices of the VR/AR system, most commonly wearables. Hence, either the user actively chooses to trigger an olfactory experience, or the VR/AR system triggers the olfactory experience based on the user's position and the position of the apparatus. The apparatus can also be stationary and located, for example, on a table in front of the user. The control unit 200, which is placed in the apparatus in the illustrated example, transmits the trigger event information either from the VR/AR system 300 to the apparatus 100 in order to steer the valves, and/or from the apparatus 100 or hand controller to the VR/AR system if the event was triggered by the user pushing the trigger button. The control unit is set up as an access point and communicates via a wireless network 400 with the VR/AR system 300.
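In the system-triggered case this reduces to a proximity check between the tracked apparatus and the head-mounted display, combined with the trigger-button state; the distance threshold and positions below are arbitrary illustrative values.

```python
import math


def should_trigger_odour(apparatus_pos, hmd_pos, button_pressed,
                         sniff_distance_m=0.20):
    """Trigger an odour stimulus when the user presses the trigger button or
    when the apparatus outlet is close enough to the nose (HMD position)."""
    distance = math.dist(apparatus_pos, hmd_pos)
    return button_pressed or distance <= sniff_distance_m


# Controller held about 12 cm from the headset, no button press -> system triggers the release.
print(should_trigger_odour((0.05, 1.50, 0.30), (0.00, 1.60, 0.35), button_pressed=False))  # True
```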
The VR/AR system 300 determines how the user actions match the desired behavioural response to the odour stimulus. This information is used to determine user performance (e.g. correct vs incorrect). After the outcome is evaluated, it is translated into feedback that is presented to the user in terms of a modified virtual reality environment (e.g. an accumulated score, increased or decreased difficulty, etc.) where the subsequent stages of interaction are shaped based on the determined user performance. Hence, the user interaction determines outcomes. The user does not have to make any decisions about the choice of a specific odour or a combination of odours that will be released. The decisions are based on pre-determined rules.
The invention can be used in education, training, learning, testing or anywhere else where incorporating an odour stimulus can improve the experience of virtual reality, and thereby the results of the education, training, etc.
The invention has mainly been described above with reference to a few embodiments. However, as it is readily appreciated by a person skilled in the art, other embodiments than the ones disclosed above are equally possible within the scope of the invention, as defined by the appended patent claims.

Claims

1. An apparatus for releasing and interacting with odours (100),
wherein the apparatus comprises:
- a housing (110) arranged for holding one or more odour
containers;
- one or more odour containers (130); the odour container
containing an odour source;
- a valve (140) at an inlet of each one or more odour containers, and a valve (150) at an outlet of each one or more odour containers, wherein the pair of valves (140, 150) of each one or more odour containers are manoeuvred synchronously;
- a servomotor (160) arranged to actuate the at least one or more pair of valves (140,150); and
- a control unit (200) arranged to:
a) receive odour control signals from a virtual/augmented reality system,
- b) form a valve control signal for said at least one pair of valves and transmit said valve control signal to the at least one pair of valves;
- c) control the servomotor (160);
- d) transmit location information of the apparatus and timing information of a trigger event of activation of an odour release to a virtual/ augmented reality (VR/AR) system (300), and e) act as an access point.
2. The apparatus according to claim 1, wherein the apparatus further comprises a fan (120) comprising an air distributor (125) generating an airflow through the housing (110) towards an outlet (180) of the housing (110), through the one or more pair of valves (140, 150) and a mixer area (170).
3. The apparatus according to any of the preceding claims, wherein the odour source is absorbed in a porous material.
4. The apparatus according to any of the preceding claims, wherein the apparatus further comprises: a trigger button for activating the odour release.
5. The apparatus according to any of the preceding claims, wherein the control unit (200) is further arranged to: form a fan control signal for the fan for controlling the speed of the airflow generated by the fan (120).
6. A method performed by a virtual/ augmented reality (AR/VR) system in an AR/VR based environment, wherein the method comprises the steps of:
- receiving location information from an apparatus for releasing and interacting with odours,
- receiving timing information of a trigger event of activation of an odour release;
- determining a control signal for release of odours based on the pre-determined rules, based on the received location
information of the apparatus for releasing and interacting with odours and based on timing information of the trigger event of activation of the odour release,
- transmitting the determined control signal to the control unit of the apparatus,
- receiving user response upon the activation of the odour release,
- comparison of the user response information with a desired user response to an odour stimulus,
- based on the said comparison, determining the user
performance, and
- based on the user performance, providing new interactive virtual /augmented reality based system with a modified odour stimulus.
7. A computer program for a virtual /augmented reality based system, the computer program comprising computer program code, which causes the system to perform a method according to claim 6.
8. A computer program product comprising a computer program according to claim 7 and a computer readable means on which the computer program is stored.
9. A virtual/ augmented reality based system comprising:
- a computing device
- a memory storing a virtual/augmented reality program for the virtual/ augmented reality based system;
- means for providing a visual, auditory and tactile simulation;
- an apparatus for releasing and interacting with odours according to claim 1; and
- a virtual/augmented reality program configured to perform the method steps according to claim 6.
PCT/EP2019/072930 2018-08-31 2019-08-28 An odour-based interactive system WO2020043764A1 (en)

Applications Claiming Priority (2)

Application Number Priority Date Filing Date Title
SE1851036-2 2018-08-31
SE1851036A SE1851036A1 (en) 2018-08-31 2018-08-31 An odour-based interactive system

Publications (1)

Publication Number Publication Date
WO2020043764A1 (en)

Family

ID=67809466

Family Applications (1)

Application Number Title Priority Date Filing Date
PCT/EP2019/072930 WO2020043764A1 (en) 2018-08-31 2019-08-28 An odour-based interactive system

Country Status (2)

Country Link
SE (1) SE1851036A1 (en)
WO (1) WO2020043764A1 (en)

Citations (3)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US20160021325A1 (en) * 2013-03-07 2016-01-21 Olorama Tecnología S.L. (LLC) Procedure and Device for the Emission of Scents in Audiovisual Productions
WO2017019630A1 (en) * 2015-07-24 2017-02-02 5Th Screen Digital Inc. Digital aroma cassette cartridge and matrix dispersion system for remote controls
WO2017165295A1 (en) * 2016-03-21 2017-09-28 Eye Labs, LLC Scent dispersal systems for head-mounted displays


Also Published As

Publication number Publication date
SE1851036A1 (en) 2020-03-01

Similar Documents

Publication Publication Date Title
US11435825B2 (en) Haptic interaction method, tool and system
CN108170262B (en) Haptic surround function
JP2020004395A (en) Real-world haptic interaction for virtual reality user
CN107077229B (en) Human-machine interface device and system
US20180232051A1 (en) Automatic localized haptics generation system
KR20190122559A (en) Systems and methods for providing dynamic haptic playback for an augmented or virtual reality environments
WO2017076224A1 (en) User interaction method and system based on virtual reality
WO2019155710A1 (en) Control device, control method, and program
KR20130101395A (en) Cognitive rehabilitation system and method using tangible interaction
JP2019145103A (en) Systems and methods for haptifying virtual objects using smart stickers
US10943445B2 (en) Systems and methods for providing haptic effects with airflow and thermal stimulation
JP2019008751A (en) Information processing method, program, and information processing device
US10065113B1 (en) Virtual reality system with enhanced sensory effects
WO2020043764A1 (en) An odour-based interactive system
JP6446150B1 (en) Program, information processing apparatus, and method
CN107735827A (en) Using the augmented reality with physical object to change the method and apparatus of User Status
Prekopcsák et al. Design and development of an everyday hand gesture interface
CN110609615A (en) System and method for integrating haptic overlays in augmented reality
US10049596B2 (en) Apparatus for recognizing intention of horse-riding simulator user and method thereof
Johnson et al. Shapes: A multi-sensory environment for the B/VI and hearing impaired community
CN113874238A (en) Display system for a motor vehicle
KR102324640B1 (en) Forest experience system using mixed reality
WO2021059642A1 (en) Information processing device, control method, and program
US20230143099A1 (en) Breathing rhythm restoration systems, apparatuses, and interfaces and methods for making and using same
WO2023127403A1 (en) System for improving realistic sensations and program for improving realistic sensations in vr

Legal Events

Date Code Title Description
121 Ep: the epo has been informed by wipo that ep was designated in this application

Ref document number: 19761816

Country of ref document: EP

Kind code of ref document: A1

NENP Non-entry into the national phase

Ref country code: DE

32PN Ep: public notification in the ep bulletin as address of the addressee cannot be established

Free format text: NOTING OF LOSS OF RIGHTS PURSUANT TO RULE 112(1) EPC (EPO FORM 1205A DATED 31.05.2021)

122 Ep: pct application non-entry in european phase

Ref document number: 19761816

Country of ref document: EP

Kind code of ref document: A1