GB2532025A - A mixed-reality system for intelligent virtual object interaction - Google Patents

A mixed-reality system for intelligent virtual object interaction

Info

Publication number
GB2532025A
GB2532025A
Authority
GB
United Kingdom
Prior art keywords
real
virtual
mixed
world
objects
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Withdrawn
Application number
GB1419702.4A
Other versions
GB201419702D0 (en)
Inventor
Colin Kelly
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
VIRTUAL COHERENCE Ltd
Original Assignee
VIRTUAL COHERENCE Ltd
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by VIRTUAL COHERENCE Ltd filed Critical VIRTUAL COHERENCE Ltd
Priority to GB1419702.4A
Publication of GB201419702D0
Priority to PCT/GB2015/053346
Publication of GB2532025A

Classifications

    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06T IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T19/00 Manipulating 3D models or images for computer graphics
    • G06T19/006 Mixed reality
    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06F ELECTRIC DIGITAL DATA PROCESSING
    • G06F3/00 Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
    • G06F3/01 Input arrangements or combined input and output arrangements for interaction between user and computer
    • G06F3/011 Arrangements for interaction with the human body, e.g. for user immersion in virtual reality
    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06F ELECTRIC DIGITAL DATA PROCESSING
    • G06F3/00 Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
    • G06F3/01 Input arrangements or combined input and output arrangements for interaction between user and computer
    • G06F3/03 Arrangements for converting the position or the displacement of a member into a coded form
    • G06F3/0304 Detection arrangements using opto-electronic means
    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06F ELECTRIC DIGITAL DATA PROCESSING
    • G06F30/00 Computer-aided design [CAD]
    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06F ELECTRIC DIGITAL DATA PROCESSING
    • G06F2111/00 Details relating to CAD techniques
    • G06F2111/12 Symbolic schematics
    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06T IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T2207/00 Indexing scheme for image analysis or image enhancement
    • G06T2207/10 Image acquisition modality
    • G06T2207/10016 Video; Image sequence
    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06T IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T2207/00 Indexing scheme for image analysis or image enhancement
    • G06T2207/30 Subject of image; Context of image processing
    • G06T2207/30204 Marker
    • G PHYSICS
    • G09 EDUCATION; CRYPTOGRAPHY; DISPLAY; ADVERTISING; SEALS
    • G09B EDUCATIONAL OR DEMONSTRATION APPLIANCES; APPLIANCES FOR TEACHING, OR COMMUNICATING WITH, THE BLIND, DEAF OR MUTE; MODELS; PLANETARIA; GLOBES; MAPS; DIAGRAMS
    • G09B1/00 Manually or mechanically operated educational appliances using elements forming, or bearing, symbols, signs, pictures, or the like which are arranged or adapted to be arranged in one or more particular ways
    • G09B1/02 Manually or mechanically operated educational appliances using elements forming, or bearing, symbols, signs, pictures, or the like which are arranged or adapted to be arranged in one or more particular ways and having a support carrying or adapted to carry the elements
    • G09B1/04 Manually or mechanically operated educational appliances using elements forming, or bearing, symbols, signs, pictures, or the like which are arranged or adapted to be arranged in one or more particular ways and having a support carrying or adapted to carry the elements, the elements each bearing a single symbol or a single combination of symbols
    • G09B1/06 Manually or mechanically operated educational appliances using elements forming, or bearing, symbols, signs, pictures, or the like which are arranged or adapted to be arranged in one or more particular ways and having a support carrying or adapted to carry the elements, the elements each bearing a single symbol or a single combination of symbols and being attachable to, or mounted on, the support

Landscapes

  • Engineering & Computer Science (AREA)
  • Theoretical Computer Science (AREA)
  • General Engineering & Computer Science (AREA)
  • Physics & Mathematics (AREA)
  • General Physics & Mathematics (AREA)
  • Computer Hardware Design (AREA)
  • Computer Graphics (AREA)
  • Human Computer Interaction (AREA)
  • Software Systems (AREA)
  • Geometry (AREA)
  • Evolutionary Computation (AREA)
  • User Interface Of Digital Computer (AREA)
  • Processing Or Creating Images (AREA)

Abstract

A mixed-reality system is provided for intelligent virtual object interaction. The system comprises a plurality of real-world objects (116) provided in a real-world environment, each comprising a distinguishable identifier, and at least one first sensor adapted to capture and output real-time image data. The system further comprises a processor device, having installed application software, adapted to receive and process said real-time image data so as to assign a plurality of virtual objects in a virtual-reality environment to the respective identifier of each one of said plurality of real-world objects. Each one of said plurality of virtual objects is associated with a predetermined set of attributes defining at least one function. The system further comprises a display device operably coupled to said processor device and adapted to display real-world content superimposed with augmented-reality content, and wherein said plurality of virtual objects are interactable in the virtual-reality environment and in the real-world environment. Furthermore, virtual objects are operably coupled in the virtual-reality environment when their virtual boundaries intersect; in this way the system may be used to simulate, for example, electric circuits.

Description

A MIXED-REALITY SYSTEM FOR INTELLIGENT VIRTUAL OBJECT INTERACTION
The present invention relates generally to the field of mixed reality technologies including virtual reality, augmented reality and real-world reality. In particular, the present invention relates to a mixed-reality system providing for intelligent interaction with and between virtual objects.
Introduction
Mixed reality is a technology that allows, for example, virtual imagery or objects to be mixed with a real-world physical environment, so that physical and digital (i.e. virtual) objects can coexist in real time. Mixed reality generally encompasses augmented reality (AR), where elements of a real-world environment are augmented in a live direct or indirect view of that real-world environment.
Currently, AR is predominantly used as a marketing tool, for example, to enhance product previews so as to allow a customer to view what may be inside a product's packaging without having to open it, or as an aid in selecting products from a catalogue or through a kiosk.
In contrast, virtual reality (VR) fully replaces the real world with a simulated one (e.g. digital), so that the user interacts with virtual objects in the VR environment.
Also, AR and VR are increasingly utilised in general education to complement, for example, a standard curriculum by superimposing text, graphics, video and/or audio onto a user's real-time environment. In particular, textbooks, flashcards and other educational reading material may contain embedded "markers" that, when scanned by an AR device, present supplementary information to the user. The "markers" that trigger the supplementary information when recognised by the computer system may be any type of computer-readable code, recognisable shape or colour. In addition to simply providing supplementary information, VR may be utilised in education to conduct virtual experiments, i.e. the user can participate interactively within the VR environment through a computer interface. For example, VR and AR may aid a user in understanding chemistry by helping the user not only to visualise the spatial structure of a molecule but also to interact with a virtual model generated in a VR environment and/or provided as AR positioned at a marker held in the user's hand.
Moreover, educational experiments and/or simulations are currently limited to being conducted either fully in the real-world environment, i.e. utilising real-world objects, or entirely in a VR environment using, for example, a simulation software program that is executed and controlled through a computer.
However, conducting educational experiments and/or simulations in a real-world environment requires a significant amount of resources, e.g. electronic components when experimenting with electronic circuits, which are not only expensive but also susceptible to damage and/or wear and tear. On the other hand, conducting experiments and/or simulations entirely in the VR environment by using software programs does not provide the real-world interaction a user (e.g. a student) may need to fully engage with the subject matter of the experiment. This tactile interaction can be particularly helpful in the education of younger students.
Accordingly, it is an object of the present invention to provide a system that is adapted to provide the user with a fully functional experimental set-up in the VR and AR environment, but which also allows real-world interaction.
Summary of the Invention
Preferred embodiment(s) of the invention seek to overcome one or more of the above disadvantages of the prior art.
According to a first embodiment of the invention there is provided a mixed-reality system for intelligent virtual object interaction comprising: a plurality of real-world objects provided in a real-world environment, each comprising a distinguishable identifier; at least one first sensor adapted to capture and output real-time image data; a processor device, having installed an application software, adapted to receive and process said real-time image data so as to assign a plurality of virtual objects in a virtual-reality environment to respective said predetermined identifier of each one of said plurality of real-world objects, each one of said plurality of virtual objects being associated with a predetermined set of attributes defining at least one function; a display device operably coupled to said processor device and adapted to display real-world content superimposed with augmented-reality content, and wherein said plurality of virtual objects are interactable in the virtual-reality environment and in the real-world environment.
This provides the advantage that real-world objects within the AR environment can be provided with a "sense of their environment" and purpose, allowing the user to interact with physical objects in the real-world environment and observe a virtual response of the associated virtual objects in the AR environment. In addition, the real-world objects can be of any shape and form, therefore significantly minimising the manufacturing and/or replacement costs of the real-world objects. In particular, the real-world object and its identifier are the only real-world components of the associated virtual object and its assigned function in the AR environment; thus, the virtual object does not experience any physical damage and/or wear and tear, and a damaged real-world object may simply be replaced by attaching an identifier to another suitable real-world object.
In a particular example, the real-world object may be a simple disc or tile made of plastic having a unique AR code placed on its top surface so that it is readable by a computer system when in use. The unique AR code may assign a specific electronic component to the disc and generate a VR object that is displayed as an AR object on the disc in the VR environment (e.g. on the screen of the computer). The user may combine several discs, each having a VR object assigned, to form an electronic circuit in the VR environment. "Combining" in the real-world environment may simply mean placing the discs next to each other (e.g. touching or in close vicinity). The computer system will recognise the identifiers and associated functions, as well as the relative location between the discs, and functionally couple the VR objects.
For example, a virtual motor may be coupled to a virtual switch and a virtual power source. The display of the computer system will show a real-time image (or video) of the discs laid out in front of the image sensor together with their associated virtual objects. The computer system will be able to recognise whether or not two or more virtual objects are coupled and create the virtual circuit. The user may then actuate the switch either through the VR environment (e.g. touching the screen image of the virtual switch on the display device) or by touching the respective disc in the real-world environment. Once the switch is virtually actuated, the motor may be powered on or off.
Advantageously, a virtual boundary region of predetermined dimension may be defined in the periphery of each one of said plurality of real-world objects by said application software. Preferably, the plurality of virtual objects may be operably coupled in said virtual-reality environment when respective said virtual boundary region of at least one of said plurality of real-world objects intersects with said virtual boundary region of at least one other of said plurality of real-world objects. Even more preferably, at least one function of said predetermined set of attributes may be affected by the relative position between respective said plurality of real-world objects.
Advantageously, at least one of said plurality of virtual objects may be actuatable in said virtual-reality environment by engaging corresponding one of said plurality of real-world objects in said real-world environment. Additionally, at least one of said plurality of virtual objects may be actuatable in said virtual-reality environment by engaging at least one of said plurality of virtual objects in said virtual-reality environment.
Advantageously, the processor device may be adapted to track movement of said identifier of any one of said plurality of real-world objects in said real-world environment and synchronize movement of associated one of said plurality of virtual objects in said virtual-reality environment. This provides the advantage of a realistically perceived AR environment, i.e. the user can move the real-world objects and the computer system will be able to move the VR objects associated with the real-world objects in real time providing the illusion that the user is moving the VR object in the real-world environment.
The mixed-reality system may further comprise a base member adapted to receive and removably secure said plurality of real-world objects in a predetermined position. This provides the advantage that the real-world objects can be positioned securely in a predetermined pattern.
Advantageously, the predetermined identifier may be a computer-readable code.
Preferably, the computer readable code may be an Augmented Reality (AR) code. Even more advantageously, the first sensor may be a video image sensor adapted to output live video data. This provides the advantage that the real-world object can be tracked in real time.
Advantageously, the processor device may be comprised in said display device. Preferably, the display device may be a mobile display device. This provides the advantage of improved mobility during use, allowing the user to easily apply the mixed-reality system from different angles/positions.
Advantageously, the mixed-reality system may further comprise at least one transceiver operably coupled to said processor device and adapted to receive and transmit electromagnetic signals.
Advantageously, the mixed-reality system may further comprise at least one user input device. Preferably the user input device may comprise any one of a mouse, pointer device, gesture recognition unit, push button device, touch-sensitive surface and voice recognition unit.
Additionally, the mixed-reality system may further comprise at least one second sensor coupled to at least one of said plurality of real-world objects and adapted to detect a predetermined quantity and provide a corresponding signal to said processor device.
Brief Description of the Drawings
Preferred embodiments of the present invention will now be described, by way of example only and not in any limitative sense, with reference to the accompanying drawings, in which: Figure 1 shows a perspective view of a simplified example of the mixed-reality system of the present invention in use, including a tablet computer, a base plate, and example circuit chips placed on the base plate; Figure 2 shows a perspective view of an example base plate having pegs adapted to interlock with respective holes of the marker chips; Figure 3 shows a perspective close-up view of an example circuit chip secured to the base plate; Figure 4 shows a perspective close-up view of an example circuit chip secured to the base plate and augmented with its associated motor component as seen on the screen of the computer device; Figure 5 shows a plurality of example markers for a variety of circuit components; Figure 6 illustrates examples of associated virtual reality objects that are augmented to respective circuit chips, e.g. (a) a linear potentiometer, (b) a switch and (c) a contact switch; Figure 7 illustrates the four boundary regions (colliders) positioned around a virtual reality object defining the coupling region of the virtual reality object; Figure 8 illustrates two virtual reality objects that are coupled by intersecting respective boundary regions of the virtual reality objects; Figure 9 illustrates a first and third marker chip positioned on the base plate so that the boundary regions are not intersecting, i.e. no coupling exists in the virtual reality; Figure 10 illustrates a second marker chip position in-between the first and third marker chip so that the boundary regions of the second marker chip intersect with respective boundary regions of the first and third marker chip, i.e. a coupling between all three associated virtual reality objects exists in the virtual reality; Figure 11 shows a flow chart of a first example user interaction with the circuit chips in the real world and its associated function in the virtual reality; Figure 12 shows a flow chart of a second example user interaction with the circuit chips in the real world and its associated function in the virtual reality, and Figure 13 shows a flow chart of a third example user interaction with the circuit chips in the real world and its associated function in the virtual reality.
Detailed description of the preferred embodiment(s)
The exemplary embodiments of this invention will be described in relation to an experimental electronic circuit set for use in education. However, it should be appreciated that, in general, the system and method of this invention will work equally well for any other educational or experimental subject matter, such as, for example, chemistry, biology, computer logic, mechanics, engineering and many more.
For purposes of explanation, it should be appreciated that the terms 'determine', 'calculate' and 'compute', and variations thereof, as used herein, are used interchangeably and include any type of methodology, process, mathematical operation or technique. In addition, the terms 'virtual' and 'virtual reality' (VR), and variations thereof, are used interchangeably and refer to any computer-simulated environment and the computer-generated objects/content functioning within that environment. Virtual content that relates to one or more real-world objects and that may be provided on a display device may be referred to as Augmented Reality (AR) content. The terms 'top' and 'bottom' refer to a position during use, i.e. a base plate placed on a planar surface has a bottom surface (the surface facing towards the planar surface) and a top surface (the surface facing away from the planar surface), or a chip positioned on a base plate in use has a top surface (facing away from the base plate) and a bottom surface (facing towards the base plate).
Referring now to Figures 1 to 4, an example embodiment of a mixed-reality system 100 is shown during use. The system 100 comprises a tablet computer 102 preferably configured to run either Android (a mobile operating system based on the Linux kernel) or iOS (a mobile operating system by Apple Inc.). However, it is understood by the person skilled in the art that any other computer device, either mobile or desktop, may be used, and that the computer device may be configured to run any other suitable operating system. The tablet computer 102 has, for example, software packages such as the Unity 3D game engine and an Augmented Reality plug-in pre-installed. However, it is understood by the skilled person that any other software application suitable for generating objects and content functioning within a VR environment may be used.
The tablet computer 102 further comprises at least one imaging sensor (not shown), such as a video camera positioned at the rear of the tablet computer 102, and has a touchscreen 104 adapted to allow the user to interact with the VR objects 106. In addition, the mixed-reality system 100 further comprises a base plate 108 and a plurality of chips 110. The base plate 108 may be manufactured from Perspex and has a predetermined number of pegs 112 protruding from its top surface and adapted to matingly interlock with respective holes 114 of the chips 110.
In this particular embodiment, the base plate 108 is formed from grey Perspex at 245 mm x 266 mm x 3 mm having linearly arranged pegs 112 of about 5 mm diameter and 3 mm height. The base plate of this particular embodiment can accommodate 5 x 6 chips 110, wherein four pegs 112 matingly interlock with respective four holes of the chips 110. However, it is understood by the person skilled in the art that the base plate 108 may be of any suitable dimension, colour and material, and that the pegs 112 and corresponding holes 114 may be of any suitable number, pattern arrangement and shape.
The chips 110 may be made from white coloured Perspex at 42 mm x 42 mm x 5 mm. A hole 114 of about 5 mm diameter is positioned in each corner of each circuit chip 110. An Augmented Reality (AR) code 116 is printed in a centre square of the top surface of each chip 110. In this particular example the chip 110 may also be referred to as circuit chip 110, referring to its use as an electronic circuit component.
Figure 2 shows an example base plate 108 with four circuit chips 110 lined up in a row and secured by the pegs 112.
The tablet computer 102 has installed a software application adapted to receive and process data from any one of the tablet computer's interfaces with the user, e.g. the imaging sensor (not shown), the touchscreen, a mouse, a pointer device, a gesture recognition unit, a push button device, a touch-sensitive surface and/or a voice recognition unit. During use, the application software receives a continuous stream of imaging data (e.g. video images) from the imaging sensor (e.g. video camera) and continuously scans the imaging data for any computer-readable marker(s) (i.e. AR codes 116). A computer-readable library of AR codes, together with each code's associated VR object and predefined function, is pre-installed on the tablet computer 102 and accessible by the software application. When an AR code 116 has been detected and recognised, the associated VR circuit component 106 is read from the library and displayed on the screen 104 as an AR object (i.e. augmenting the displayed real-world chip 110). Figure 3 shows the circuit chip 110 with an AR code 116 for a VR motor component in the real-world environment, and Figure 4 shows the real-world circuit chip 110 augmented with the associated VR object 106 (i.e. motor) as displayed on the screen 104.
Continuous scanning of the imaging data received from the imaging sensor allows the software application to track any recognised AR code 116 so that the real-world object (i.e. circuit chip 110) and the VR object 106 (i.e. associated electronic component) remain in sync even if the circuit chip 110 is moved to a different position on the base plate 108.
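By way of illustration only, the detect-and-synchronise loop described above may be sketched as follows (Python; this sketch is not part of the original disclosure - the detected-marker input and the scene object are hypothetical stand-ins for the AR plug-in and the VR environment of the application software):

    from dataclasses import dataclass

    @dataclass
    class DetectedMarker:
        """One AR code 116 found in the current video frame (illustrative)."""
        marker_id: str   # decoded code, e.g. "05"
        x: float         # chip position in the camera image
        y: float
        angle: float     # in-plane rotation of the chip

    # Pre-installed library: AR code -> VR object definition (entries illustrative)
    VR_LIBRARY = {
        "01": {"component": "Battery"},
        "02": {"component": "Switch"},
        "05": {"component": "Motor"},
    }

    def process_frame(detected_markers, scene):
        """Keep each VR object 106 in sync with its real-world chip 110."""
        for marker in detected_markers:
            definition = VR_LIBRARY.get(marker.marker_id)
            if definition is None:
                continue  # unrecognised code: nothing to augment
            vr_object = scene.get_or_create(marker.marker_id, definition)
            # Moving the chip moves the augmented component in real time.
            vr_object.set_pose(marker.x, marker.y, marker.angle)

In such a sketch, re-running process_frame on every incoming video frame is what keeps the augmented component locked to its chip as the chip is moved around the base plate 108.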
Figure 5 shows a variety of AR codes 116 that may be placed or printed on the top surface of the chips 110, and Figure 6 shows examples of respective VR object representations generated in the VR environment of the tablet computer 102, e.g. (a) a linear potentiometer, (b) a switch and (c) a contact switch. Consequently, it is possible to provide AR codes 116 and associated VR objects for any imaginable real-world component or function, therefore potentially providing the user with an endless number of VR objects 106 of any subject matter to experiment with. The acquisition costs of the mixed-reality components are also kept to a minimum because, in order to provide a new set of components, the respective AR codes 116 are simply attached or printed onto a chip 110 and the associated VR library data is loaded onto the tablet computer 102 for use by the application software.
As mentioned before, each AR code 116 is linked to a VR object 106 having its own set of attributes and functions within the VR environment. In this particular example embodiment of electronic circuit components, the associated attributes and functions of the VR object 106 mimic the attributes and functions of its real-world counterpart. In other words, the library data includes information on the specific laws of electronics (e.g. conductivity, voltage, current), as well as the laws of nature.
For example, the type of information associated with the VR motor component may be:
Component: Motor
Marker ID: 05
Circuit Type: Output
Conduct: True
Resistance: 20 Ohms
Minimum Voltage: 1.5 V
Maximum Voltage: 12 V
External Input: False
External Input Type: NA
External Input Result: NA
Therefore, because the VR environment and VR objects obey the predefined laws (e.g. laws of nature, electronics etc., as found in the real world), the mixed-reality system can replace the real-world electronic components that would normally be required to experiment in, for example, a classroom environment.
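Purely as an illustrative sketch, one plausible encoding of such a library entry is a simple record whose fields mirror the attribute list above (the record type and helper function below are assumptions, not taken from the disclosure):

    from dataclasses import dataclass

    @dataclass(frozen=True)
    class ComponentRecord:
        component: str
        marker_id: str
        circuit_type: str        # "Source", "Control", "Output", ...
        conducts: bool
        resistance_ohms: float
        min_voltage: float
        max_voltage: float
        external_input: bool = False
        external_input_type: str = "NA"
        external_input_result: str = "NA"

    MOTOR = ComponentRecord(
        component="Motor", marker_id="05", circuit_type="Output",
        conducts=True, resistance_ohms=20.0,
        min_voltage=1.5, max_voltage=12.0,
    )

    def responds(record: ComponentRecord, supplied_voltage: float) -> bool:
        # A component only operates inside its rated voltage window,
        # mimicking the behaviour of its real-world counterpart.
        return (record.conducts and
                record.min_voltage <= supplied_voltage <= record.max_voltage)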
Furthermore, the library data also includes information on possible interactions of one VR object 106 with another VR object 106. For example, a VR motor may be operably coupled to a VR power source, so that the VR power source drives the VR motor in the VR environment (displayed on the screen 104). The library data may also include information on a particular response of a VR object 106 when coupled to another VR object, e.g. a VR motor may be coupled to a VR 9V power source causing the VR motor to output a specific revolution.
Referring now to Figures 7 to 10, boundary regions 118 are defined around each VR object 106 within the VR environment, therefore allowing the system 100 to detect whether or not a VR object 106 is operably coupled to another VR object 106 within the VR environment. The application software is configured to detect an intersection of one or more boundary regions 118 from different VR objects 106 and operably couple the respective VR objects 106 in the VR environment. It is understood that the boundary regions 118 may be defined with any suitable dimension and/or at any relative position around the VR object 106.
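As an illustrative sketch only, and assuming for simplicity that each boundary region 118 is an axis-aligned rectangle in the plane of the base plate, the coupling test may be expressed as follows:

    from dataclasses import dataclass

    @dataclass(frozen=True)
    class BoundaryRegion:
        x_min: float
        y_min: float
        x_max: float
        y_max: float

        def intersects(self, other: "BoundaryRegion") -> bool:
            # Two rectangles overlap unless one lies entirely to one
            # side of the other.
            return not (self.x_max < other.x_min or other.x_max < self.x_min or
                        self.y_max < other.y_min or other.y_max < self.y_min)

    def coupled(regions_a, regions_b) -> bool:
        """Two VR objects 106 are operably coupled when any of their
        boundary regions 118 intersect."""
        return any(a.intersects(b) for a in regions_a for b in regions_b)

Keeping one boundary region per side (the four colliders of Figure 7) additionally lets the software distinguish a left-hand neighbour from a right-hand one, which matters for the position-dependent behaviour described below.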
For example, a circuit chip 110 associated with a battery (M1) may be positioned next to a circuit chip 110 associated with a switch (M2) so that the defined boundary regions 118 intersect with each other. The application software may now start a "conversation" between the VR objects 106, which is provided here in a simplified and illustrative format:
M2: a component on my left has connected to me - "hello, what are you?"
M1: a component on my right has connected to me - "hello, I am a battery."
M2: "can you provide me with power?"
M1: "yes, I currently have 3V."
M1 & M2: connection made with power
When another circuit chip 110 associated with a motor (M3) is positioned on the base plate 108 so that the boundary region 118 of the switch (M2) intersects with the boundary region 118 of the motor (M3), the "conversation" may be as follows:
M3: a component on my left is connected to me - "hello, what are you?"
M2: a component on my right is connected to me - "I am a switch."
M3: "can you provide me with power?"
M2: "no, I am currently open circuit."
M2 & M3: connection made without power
The software application of the mixed-reality system 100 is further configured to allow user interaction with any one of the VR objects, either via an interface of the tablet computer 102 or via the real-world chips 110 positioned on the base plate 108. For example, a VR switch may be actuated by simply pressing the touchscreen 104 at the location of the VR switch. The software application will then detect the user input through the touchscreen 104 and actuate the corresponding VR switch in the VR environment. Alternatively, in order to actuate the VR switch, the user may touch the real-world circuit chip 110. The software application is configured to detect the finger movement and/or covering of the AR code 116 and, in response, actuate the corresponding VR switch. However, it is understood by the person skilled in the art that any other user input may be used to actuate any one of the VR objects in the VR environment. For example, the VR object may be actuated using any one of a mouse, pointer device, gesture recognition unit (via the image sensor), push button device and a voice recognition unit.
Continuing now with the previous example, upon detection of a user interaction with at least one of the VR objects 106 (e.g. the VR switch), the application software may now start a "conversation" as follows:
M2: "my circuit is now closed, I can now provide 3V power"
M2 & M3: connection made with power
The VR motor (M3) may now run at a predetermined revolution (associated with 3V) in the VR environment, which is displayed on the screen 104, until the user actuates the VR switch again and opens the circuit, or simply removes one or more of the circuit chips 110 from the base plate 108, or moves the boundary regions 118 out of engagement. The "conversation" of the software application may now be as follows:
M1: a component on my right has disconnected
M3: a component on my left has disconnected
M3: that component was providing me with power - power lost
M1 & M3: connection lost
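The effect of these "conversations" can be illustrated with a minimal sketch (illustrative only; the disclosure does not prescribe an implementation), in which each component answers the question "can you provide me with power?" by consulting the neighbour coupled on its left:

    class Component:
        def __init__(self, name, provides_volts=0.0, closed=True):
            self.name = name
            self.provides_volts = provides_volts  # > 0 for a power source
            self.closed = closed                  # False models an open switch
            self.left = None                      # neighbour coupled via colliders

        def available_power(self) -> float:
            """Answer "can you provide me with power?" by asking up the chain."""
            if not self.closed:
                return 0.0                        # "no, I am currently open circuit"
            if self.provides_volts > 0:
                return self.provides_volts        # "yes, I currently have 3V"
            return self.left.available_power() if self.left else 0.0

    battery = Component("M1 battery", provides_volts=3.0)
    switch = Component("M2 switch", closed=False)
    motor = Component("M3 motor")
    switch.left, motor.left = battery, switch

    print(motor.available_power())  # 0.0 -> connection made without power
    switch.closed = True            # user touches the switch chip or the screen
    print(motor.available_power())  # 3.0 -> the VR motor now runs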
Figures 11 to 13 illustrate flowcharts of example user interactions with the mixed-reality system 100 of the present invention and the associated functions within the VR environment.
In addition to the functions described above, the software application may also be adapted to detect the relative position of one VR object 106 with regard to another VR object 106 and provide a predetermined function and/or attribute depending on that relative position. Also, the actuatability of a VR object 106 may be affected by the relative position of one VR object with regard to another VR object (e.g. boundary region 118 intersection on the left or right side).
It has been shown that the mixed-reality system 100 of the present invention provides the user with an AR environment allowing real-world and VR interaction by providing the real-world objects with a "sense of their environment and purpose", i.e. the system adds a kind of artificial intelligence to the real-world objects 110 within the VR environment, allowing the user to "experiment" with real-world objects and see a VR outcome/response.
It will be appreciated by persons skilled in the art that the above embodiment has been described by way of example only and not in any limitative sense, and that various alterations and modifications are possible without departing from the scope of the invention as defined by the appended claims.

Claims (17)

  1. A mixed-reality system for intelligent virtual object interaction comprising: a plurality of real-world objects provided in a real-world environment, each comprising a distinguishable identifier; at least one first sensor adapted to capture and output real-time image data; a processor device, having installed an application software, adapted to receive and process said real-time image data so as to assign a plurality of virtual objects in a virtual-reality environment to respective said predetermined identifier of each one of said plurality of real-world objects, each one of said plurality of virtual objects being associated with a predetermined set of attributes defining at least one function; a display device operably coupled to said processor device and adapted to display real-world content superimposed with augmented-reality content, and wherein said plurality of virtual objects are interactable in the virtual-reality environment and in the real-world environment.
  2. A mixed-reality system according to claim 1, wherein a virtual boundary region of predetermined dimension is defined in the periphery of each one of said plurality of real-world objects by said application software.
  3. A mixed-reality system according to claim 2, wherein said plurality of virtual objects are operably coupled in said virtual-reality environment when respective said virtual boundary region of at least one of said plurality of real-world objects intersects with said virtual boundary region of at least one other of said plurality of real-world objects.
  4. A mixed-reality system according to claim 3, wherein said at least one function of said predetermined set of attributes is affected by the relative position between respective said plurality of real-world objects.
  5. A mixed-reality system according to any one of the preceding claims, wherein at least one of said plurality of virtual objects is actuatable in said virtual-reality environment by engaging corresponding one of said plurality of real-world objects in said real-world environment.
  6. A mixed-reality system according to any one of the preceding claims, wherein at least one of said plurality of virtual objects is actuatable in said virtual-reality environment by engaging at least one of said plurality of virtual objects in said virtual-reality environment.
  7. A mixed-reality system according to any one of the preceding claims, wherein said processor device is adapted to track movement of said identifier of any one of said plurality of real-world objects in said real-world environment and synchronize movement of associated one of said plurality of virtual objects in said virtual-reality environment.
  8. A mixed-reality system according to any one of the preceding claims, further comprising a base member adapted to receive and removably secure said plurality of real-world objects in a predetermined position.
  9. A mixed-reality system according to any one of the preceding claims, wherein said predetermined identifier is a computer-readable code.
  10. A mixed-reality system according to claim 9, wherein said computer-readable code is an Augmented Reality (AR) code.
  11. A mixed-reality system according to any one of the preceding claims, wherein said first sensor is a video imaging sensor adapted to output live video data.
  12. A mixed-reality system according to any one of the preceding claims, wherein said processor device is comprised in said display device.
  13. A mixed-reality system according to claim 12, wherein said display device is a mobile display device.
  14. A mixed-reality system according to any one of the preceding claims, further comprising at least one transceiver operably coupled to said processor device and adapted to receive and transmit electromagnetic signals.
  15. A mixed-reality system according to any one of the preceding claims, further comprising at least one user input device.
  16. A mixed-reality system according to claim 15, wherein said user input device comprises any one of a mouse, pointer device, gesture recognition unit, push button device, touch-sensitive surface and voice recognition unit.
  17. A mixed-reality system according to claim 16, further comprising at least one second sensor coupled to at least one of said plurality of real-world objects and adapted to detect a predetermined quantity and provide a corresponding signal to said processor device.
GB1419702.4A 2014-11-05 2014-11-05 A mixed-reality system for intelligent virtual object interaction Withdrawn GB2532025A (en)

Priority Applications (2)

Application Number Priority Date Filing Date Title
GB1419702.4A GB2532025A (en) 2014-11-05 2014-11-05 A mixed-reality system for intelligent virtual object interaction
PCT/GB2015/053346 WO2016071690A1 (en) 2014-11-05 2015-11-04 A mixed-reality system for intelligent virtual object interaction

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
GB1419702.4A GB2532025A (en) 2014-11-05 2014-11-05 A mixed-reality system for intelligent virtual object interaction

Publications (2)

Publication Number Publication Date
GB201419702D0 GB201419702D0 (en) 2014-12-17
GB2532025A true GB2532025A (en) 2016-05-11

Family

ID=52118739

Family Applications (1)

Application Number Title Priority Date Filing Date
GB1419702.4A Withdrawn GB2532025A (en) 2014-11-05 2014-11-05 A mixed-reality system for intelligent virtual object interaction

Country Status (2)

Country Link
GB (1) GB2532025A (en)
WO (1) WO2016071690A1 (en)

Families Citing this family (5)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US10445935B2 (en) 2017-05-26 2019-10-15 Microsoft Technology Licensing, Llc Using tracking to simulate direct tablet interaction in mixed reality
US10943399B2 (en) 2017-08-28 2021-03-09 Microsoft Technology Licensing, Llc Systems and methods of physics layer prioritization in virtual environments
WO2020083944A1 (en) * 2018-10-22 2020-04-30 Unity IPR ApS Method and system for addressing and segmenting portions of the real world for visual digital authoring in a mixed reality environment
WO2020136615A1 (en) * 2018-12-28 2020-07-02 Pankaj Uday Raut A system and a method for generating a head mounted device based artificial intelligence (ai) bot
US11610363B2 (en) * 2020-12-31 2023-03-21 Oberon Technologies, Inc. Systems and methods for virtual reality environments

Patent Citations (2)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US20050245302A1 (en) * 2004-04-29 2005-11-03 Microsoft Corporation Interaction between objects and a virtual environment display
EP2267595A2 (en) * 2008-02-12 2010-12-29 Gwangju Institute of Science and Technology Tabletop, mobile augmented reality system for personalization and cooperation, and interaction method using augmented reality

Non-Patent Citations (4)

* Cited by examiner, † Cited by third party
Title
"A Pilot Study of the Effectiveness of Augmented Reality to Enhance the Use of Remote Labs in Electrical Engineering Education", MejÃas Borrero A; Andà jar Mà rquez J M, Journal of Science Education and Technology, 22/10/2011, Vol 21, pages 540-557 *
"Augmented reality internet labs versus hands-on and virtual labs: A comparative study", Shatha Abu Shanab; Salaheddin Odeh; Rami Hodrob; Mahasen Anabtawi, 2012 Int Conf on Interactive Mobile and Computer Aided Learning (IMCL), 6/11/2012, pages 17-21 *
"Mixed reality with hyper-bonds-A means for remote labs", Wilhelm Bruns F; Erbe H -H, CONTROL ENGINEERING PRACTICE, 18/9/2007, GB, Vol 15, pages 1435-1444 *
"Remote augmented reality engineering labs", Salaheddin Odeh; Shatha Abu Shanab; Mahasen Anabtawi; Rami Hodrob, 2012 IEEE Global Engineering Education Conference (EDUCON), 17/4/2012, pages 1-6 *

Also Published As

Publication number Publication date
WO2016071690A1 (en) 2016-05-12
GB201419702D0 (en) 2014-12-17

Similar Documents

Publication Publication Date Title
WO2016071690A1 (en) A mixed-reality system for intelligent virtual object interaction
Hornecker Beyond affordance: tangibles' hybrid nature
Seo et al. Direct hand touchable interactions in augmented reality environments for natural and intuitive user experiences
Jetter et al. "In VR, everything is possible!": Sketching and simulating spatially-aware interactive spaces in virtual reality
Follmer et al. deForm: an interactive malleable surface for capturing 2.5D arbitrary objects, tools and touch
Magnenat-Thalmann et al. Haptics in virtual reality and multimedia
CN107077229B (en) Human-machine interface device and system
US20170177077A1 (en) Three-dimension interactive system and method for virtual reality
Borst et al. Evaluation of a haptic mixed reality system for interactions with a virtual control panel
Yan Augmented reality instructions for construction toys enabled by accurate model registration and realistic object/hand occlusions
Simeone Substitutional reality: Towards a research agenda
EP3206144A3 (en) Interactive modeling and simulation for factory layout
Monteiro et al. Teachable reality: Prototyping tangible augmented reality with everyday objects by leveraging interactive machine teaching
Chamaret et al. Integration and evaluation of haptic feedbacks: from CAD models to virtual prototyping
CN205039917U (en) Sea floor world analog system based on CAVE system
Lassagne et al. Performance evaluation of passive haptic feedback for tactile HMI design in CAVEs
CN116645850A (en) Physical interactive playback of recordings
Rehman et al. Gestures and marker based low-cost interactive writing board for primary education
Ismail et al. Implementation of natural hand gestures in holograms for 3D object manipulation
Bordegoni et al. Requirements for an enactive tool to support skilled designers in aesthetic surfaces definition
Butnariu et al. DEVELOPMENT OF A NATURAL USER INTERFACE FOR INTUITIVE PRESENTATIONS IN EDUCATIONAL PROCESS.
Bruno et al. Mixed prototyping environment with different video tracking techniques
Desnoyers-Stewart et al. Mixed reality MIDI keyboard
Ferrise et al. Virtualization of industrial consumer products for haptic interaction design
KR20100132322A (en) Method and system for motion learning using motioncapure and three dimensional graphics

Legal Events

Date Code Title Description
732E Amendments to the register in respect of changes of name or changes affecting rights (sect. 32/1977)

Free format text: REGISTERED BETWEEN 20181115 AND 20181130

WAP Application withdrawn, taken to be withdrawn or refused ** after publication under section 16(1)