WO2020165885A1 - Computer-implemented method and system for providing interaction rules in mixed reality - Google Patents

Computer-implemented method and system for providing interaction rules in mixed reality

Info

Publication number
WO2020165885A1
Authority
WO
WIPO (PCT)
Prior art keywords
mixed reality
virtual
real
wearable device
physical
Prior art date
Application number
PCT/IB2020/052565
Other languages
French (fr)
Inventor
Purav Shah
Mahesh Gadhvi
Veera Raghavan
Akshay Avasthi
Original Assignee
Quaqua Experiences Pvt. Ltd.
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Quaqua Experiences Pvt. Ltd.
Publication of WO2020165885A1 publication Critical patent/WO2020165885A1/en

Classifications

    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06FELECTRIC DIGITAL DATA PROCESSING
    • G06F3/00Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
    • G06F3/01Input arrangements or combined input and output arrangements for interaction between user and computer
    • G06F3/011Arrangements for interaction with the human body, e.g. for user immersion in virtual reality
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06VIMAGE OR VIDEO RECOGNITION OR UNDERSTANDING
    • G06V20/00Scenes; Scene-specific elements
    • G06V20/20Scenes; Scene-specific elements in augmented reality scenes
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06VIMAGE OR VIDEO RECOGNITION OR UNDERSTANDING
    • G06V20/00Scenes; Scene-specific elements
    • G06V20/70Labelling scene content, e.g. deriving syntactic or semantic representations

Abstract

Exemplary embodiments of the present disclosure are directed towards a system for providing interaction rules in mixed reality. The system comprises a wearable device 102 wirelessly connected to a computing device 110 via a network 108; the network 108 facilitates the communication and interaction between the wearable device 102 and the computing device 110, and the wearable device 102 is worn by a user. A cloud server 106 is configured to receive a virtual object and transmit the virtual object to the wearable device 102 over the network 108; the cloud server 106 is associated with the mixed reality immersive experience and provides the interactions and communications to the user on the computing device 110 by using the wearable device 102. The wearable device 102 comprises a processing device 104 configured to transform a physical object into a virtual object with a virtual time stamp and virtual dimensional information of space.

Description

“COMPUTER-IMPLEMENTED METHOD AND SYSTEM FOR PROVIDING
INTERACTION RULES IN MIXED REALITY”
TECHNICAL FIELD
[001] The disclosed subject matter relates generally to mixed reality systems. More particularly, the present disclosure relates to a computer-implemented method and system for providing interaction rules in mixed reality to users.
BACKGROUND
[002] In ancient times, human beings interpreted what happened around them in the world as stories, and some of them started creating stories, perhaps the first form of virtual reality. The stories gave human beings a sense of experiencing various kinds of happenings. The stories belonged to languages, which have always had classes of nouns to recognize and identify physical entities like water, mountains, animals, characters, etc., and classes of verbs to represent the various kinds of physical forces that existed, like push, pull, get, weigh, lift, the role of a character, etc. Mixed reality is also referred to as augmenting virtual entities over a real-world physical environment. Users aspire to make the real experience enhanced over real. In the world of mixed reality, physical and virtual objects or entities co-exist, and this calls for a new way of human training and interface standards, which presents human beings with a new challenge of interacting, as the virtual and physical worlds are governed by different universal rules. The brain cannot distinguish between real and virtual objects unless an action is performed to measure the physicality of the objects, such as the force.
[003] Various challenges still exist, such as, for example, creating a natural experience for the user while interacting with a mixed environment of real-world objects. Therefore, an intuitive system is needed to provide a mixed reality interface standard of interactions and communications to the user.
[004] In light of the aforementioned discussion, there exists a need for a system with novel methodologies that would overcome the above-mentioned disadvantages.
SUMMARY
[005] The following presents a simplified summary of the disclosure in order to provide a basic understanding to the reader. This summary is not an extensive overview of the disclosure and it does not identify key/critical elements of the invention or delineate the scope of the invention. Its sole purpose is to present some concepts disclosed herein in a simplified form as a prelude to the more detailed description that is presented later.
[006] An objective of the present disclosure is directed towards enhancing the mixed reality experience with a more immersive experience.
[007] Another objective of the present disclosure is directed towards providing a mixed reality interface standard of interactions and communication which primarily caters to mixed reality travel, virtual travel, and physical travel, together or discretely.
[008] Another objective of the present disclosure is directed towards generating a realistic interpretation of the motion or action of the users.
[009] Another objective of the present disclosure is directed towards providing interaction with mixed reality by capturing an image of a real-world object using an image capturing device positioned in the wearable device.
[0010] According to an exemplary aspect, a system comprises a wearable device wirelessly connected to a computing device via a network, where the network facilitates the communication and interaction between the wearable device and the computing device, and the wearable device is worn by a user.
[0011] According to another exemplary aspect, the system further comprises a cloud server configured to receive a virtual object and transmit the virtual object to the wearable device over the network; the cloud server is associated with the mixed reality immersive experience and provides the interactions and communications to the user on the computing device by using the wearable device. The wearable device comprises a processing device configured to transform a physical object into a virtual object with a virtual time stamp and virtual dimensional information of space.
BRIEF DESCRIPTION OF THE DRAWINGS
[0012] FIG. 1 is a diagram depicting a schematic representation of a mixed reality environment, in accordance with one or more exemplary embodiments.
[0013] FIG. 2 is a block diagram depicting the wearable device 102 shown in FIG. 1, in accordance with one or more exemplary embodiments.
[0014] FIG. 3 is a block diagram depicting a schematic representation of the mixed reality interfacing module 112 shown in FIG. 1, in accordance with one or more exemplary embodiments.
[0015] FIG. 4 is a flowchart depicting an exemplary method for providing the interaction between the computing device and the user, in accordance with one or more exemplary embodiments.
[0016] FIG. 5 is a block diagram illustrating the details of a digital processing system in which various aspects of the present disclosure are operative by execution of appropriate software instructions.
DETAILED DESCRIPTION OF EXAMPLE EMBODIMENTS
[0017] It is to be understood that the present disclosure is not limited in its application to the details of construction and the arrangement of components set forth in the following description or illustrated in the drawings. The present disclosure is capable of other embodiments and of being practiced or of being carried out in various ways. Also, it is to be understood that the phraseology and terminology used herein are for the purpose of description and should not be regarded as limiting.
[0018] The use of “including”, “comprising”, or “having” and variations thereof herein is meant to encompass the items listed thereafter and equivalents thereof as well as additional items. The terms “a” and “an” herein do not denote a limitation of quantity, but rather denote the presence of at least one of the referenced item. Further, the use of the terms “first”, “second”, “third”, and the like, herein does not denote any order, quantity, or importance, but rather is used to distinguish one element from another.
[0019] Referring to FIG. 1, a block diagram 100 depicts an example environment in which aspects of the present disclosure can be implemented. Specifically, FIG. 1 depicts a schematic representation of a mixed reality environment, in accordance with one or more exemplary embodiments. The environment 100 provides interaction rules in mixed reality, where physical objects and virtual objects or entities co-exist. The term mixed reality means a reality having at least one real-world object and at least one virtual object, which a user of the mixed reality space may perceive as interacting with one another. The environment 100 provides a mixed reality interface standard of interactions and communication to the user. The environment 100 depicts a wearable device 102, a processing device 104, a cloud server 106, a network 108, and a computing device 110. The computing device 110 includes a mixed reality interfacing module 112. The environment 100 provides the mixed reality, virtual reality, and physical experience together or distinctly. The wearable device 102 may be worn by the user to acquire the mixed reality experience. The user may include, but is not limited to, an individual, a person, a group, and so forth. The environment 100 facilitates the communications and interactions between the users and the computing device 110 via the network 108. The network 108 may include, but is not limited to, an Internet of Things (IoT) device network, an Ethernet, a wireless local area network (WLAN), a wide area network (WAN), a Bluetooth low energy network, a ZigBee network, a Wi-Fi communication network (e.g., wireless high-speed internet), a combination of networks, a cellular service such as a 4G (e.g., LTE, mobile WiMAX) or 5G cellular data service, an RFID module, an NFC module, or wired cables, such as the world-wide-web based Internet. Other types of networks may use Transport Control Protocol/Internet Protocol (TCP/IP) or device addresses (e.g., network-based MAC addresses, or those provided in a proprietary networking protocol such as Modbus TCP, or by using appropriate data feeds to obtain data from various web services, including retrieving XML data from an HTTP address and then traversing the XML for a particular node), and so forth, without limiting the scope of the present disclosure. The cloud server 106 may refer to a cloud or a physical server located at a remote location; it is associated with the mixed reality immersive experience and provides the interactions and communications to the users on the computing device 110 by using the wearable device 102.
[0020] The wearable device 102 may be wirelessly connected to the computing device 110 via the network 108. The network 108 facilitates the communication and interaction between the wearable device 102 and the computing device 110. The wearable device 102 is worn by a user. The cloud server 106 may be configured to receive a virtual object and transmit the virtual object to the wearable device 102 over the network 108; the cloud server 106 may be associated with the mixed reality immersive experience and provides the interactions and communications to the user on the computing device 110 by using the wearable device 102. The wearable device 102 comprises the processing device 104 configured to transform a physical object into a virtual object with a virtual time stamp and virtual dimensional information of space.
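By way of illustration only, the following Python sketch shows one way a processing device could attach a virtual time stamp and virtual dimensional information of space while transforming a physical object into a virtual object, as described in paragraph [0020]; the class names, fields, and units are assumptions introduced here, not details taken from the disclosure.
```python
from dataclasses import dataclass
from datetime import datetime, timezone
from typing import Tuple


@dataclass
class PhysicalObject:
    """A real-world object as observed by the wearable device (hypothetical model)."""
    name: str
    position_m: Tuple[float, float, float]  # x, y, z in the real space, in metres
    size_m: Tuple[float, float, float]      # width, height, depth, in metres


@dataclass
class VirtualObject:
    """Virtual counterpart carrying a virtual time stamp and dimensional information of space."""
    name: str
    virtual_timestamp: str                  # the "virtual time stamp"
    dimensions_m: Tuple[float, float, float]
    position_m: Tuple[float, float, float]
    origin: str = "real"                    # tagged so later modules know it was transformed from real


def transform_to_virtual(obj: PhysicalObject) -> VirtualObject:
    """Hypothetical transform of the kind attributed to the processing device 104."""
    return VirtualObject(
        name=obj.name,
        virtual_timestamp=datetime.now(timezone.utc).isoformat(),
        dimensions_m=obj.size_m,
        position_m=obj.position_m,
    )


if __name__ == "__main__":
    chair = PhysicalObject("chair", position_m=(1.2, 0.0, 3.4), size_m=(0.5, 0.9, 0.5))
    print(transform_to_virtual(chair))
```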
[0021] Referring to FIG. 2, a block diagram 200 depicts the wearable device 102 shown in FIG. 1, in accordance with one or more exemplary embodiments. The wearable device 102 includes the processing device 104, an image capturing device 202, a physical sensor 204, a visual sensor 206, and an artificial sensor 208. The processing device 104 may include, but is not limited to, a microcontroller (for example, ARM7 or ARM11), a Raspberry Pi, a microprocessor, a digital signal processor, a microcomputer, a field programmable gate array, a programmable logic device, a state machine, a logic circuitry, or an Arduino board. The image capturing device 202 may be configured to capture the physical objects from a physical environment and real-world objects within the environment. For example, the image capturing device 202 may include a depth camera, a visible light camera, an infrared light camera, and/or a position tracking camera, and so forth. The physical sensor 204 may be configured to transform a physical object into a virtual object, forcing the eyes to see the virtual object and allowing the brain to interpret that virtual object as real. The visual sensor 206 may be capable of processing and fusing images of a scene from multiple viewpoints. The primary human senses may be connected to the artificial sensor 208, establishing the calibration of the artificial sensor 208 with the human senses. The calibration may account for, but is not limited to, camera focal length, size of objects, color, and so forth.
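The component wiring described above can be pictured with a small sketch; the class names and the naive per-pixel averaging used to "fuse" viewpoints are assumptions for illustration, not the sensor fusion actually performed by the visual sensor 206.
```python
from typing import List

Frame = List[List[float]]  # a tiny grayscale image represented as nested lists


class ImageCapturingDevice:
    """Stand-in for the image capturing device 202 (e.g., a depth or visible-light camera)."""

    def capture(self, viewpoint: int) -> Frame:
        # Hypothetical capture: a constant 2x2 frame whose brightness encodes the viewpoint.
        return [[float(viewpoint), float(viewpoint)],
                [float(viewpoint), float(viewpoint)]]


class VisualSensor:
    """Stand-in for the visual sensor 206, which fuses images of a scene from multiple viewpoints."""

    def fuse(self, frames: List[Frame]) -> Frame:
        rows, cols = len(frames[0]), len(frames[0][0])
        # Naive fusion: per-pixel average across viewpoints (illustrative only).
        return [[sum(f[r][c] for f in frames) / len(frames) for c in range(cols)]
                for r in range(rows)]


class WearableDevice:
    """Stand-in for the wearable device 102: capture from several viewpoints, then fuse."""

    def __init__(self) -> None:
        self.camera = ImageCapturingDevice()
        self.visual_sensor = VisualSensor()

    def capture_scene(self, viewpoints: List[int]) -> Frame:
        frames = [self.camera.capture(v) for v in viewpoints]
        return self.visual_sensor.fuse(frames)


if __name__ == "__main__":
    print(WearableDevice().capture_scene([1, 2, 3]))  # fused 2x2 frame of 2.0s
```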
[0022] Referring to FIG. 3, a block diagram 300 depicts a schematic representation of the mixed reality interfacing module 112 shown in FIG. 1, in accordance with one or more exemplary embodiments. The mixed reality interfacing module 112 may include a bus 301, an entity detection module 302, a space recognition module 304, an entity tagging module 306, an interaction management module 308, and a central database 310. The bus 301 may include a path that permits communication among the modules of the mixed reality interfacing module 112. The central database 310 may be configured to store the transformation rules for the mixed reality interactions. The virtual objects may be combined with real objects, and virtual users with real users, by applying the transformation rules using the central database 310 to obtain the transformed data. The transformation rules may include: physical operations (for example, push, pull, and weigh may be performed only on the real objects); coupled virtual object to real object, in which the real object viewed by the image capturing device 202 in real time is coupled to what is seen in the image capturing device 202; and coupled virtual-real users, in which the virtual users are coupled with real users who exist as part of the real space. Real users in the real space may be limited by the capacity of the real space, whereas virtual users may not be limited by the capacity of the real space, the virtual space, or the mixed reality space (for example, the user may invite an unlimited number of users to view or operate in a mixed reality space, or may share it with an unlimited number of virtual users). The transformation rules may thus combine the virtual objects with the real objects viewed by the image capturing device 202, and the virtual users with the real users, to obtain the transformed data. The transformed data may be sent to an intermediate channel, after which the mixed reality interfacing module 112 indexes the entities and tags the space. The mixed reality interfacing module 112 may be configured to create an output file and to provide the interactions between the computing device 110 and the user. The mixed reality interfacing module 112 may be configured to encrypt the created output file and then decrypt the output file. The mixed reality interfacing module 112 may receive the decrypted output file.
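As a rough, hypothetical encoding of the transformation rules stored in the central database 310, the sketch below checks the two constraints named above; the rule set, function names, and capacity handling are assumptions, not the disclosed schema.
```python
from dataclasses import dataclass

# Hypothetical encoding of the transformation rules stored in the central database 310.
PHYSICAL_OPERATIONS = {"push", "pull", "weigh"}


@dataclass
class Entity:
    name: str
    kind: str  # "real" or "virtual"


def operation_allowed(entity: Entity, operation: str) -> bool:
    """Physical operations (push, pull, weigh) may be performed only on real objects."""
    if operation in PHYSICAL_OPERATIONS:
        return entity.kind == "real"
    return True


def admit_user(space_kind: str, current_real_users: int, real_capacity: int,
               user_kind: str) -> bool:
    """Real users are limited by the capacity of the real space; virtual users are not."""
    if user_kind == "real" and space_kind == "real":
        return current_real_users < real_capacity
    return True  # virtual users are not capacity-limited in real, virtual, or mixed spaces


if __name__ == "__main__":
    print(operation_allowed(Entity("table", "real"), "push"))       # True
    print(operation_allowed(Entity("avatar", "virtual"), "weigh"))  # False
    print(admit_user("real", current_real_users=10, real_capacity=10, user_kind="real"))     # False
    print(admit_user("real", current_real_users=10, real_capacity=10, user_kind="virtual"))  # True
```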
[0023] The entity detection module 302 may be configured to detect the multiple entities of objects captured by the image capturing device 202. The entities may include, but are not limited to: virtual; real; real-virtual (transformed real to virtual); virtual-virtual (virtual entity properties transformed, or transformed to another virtual entity); virtual-real (transformed virtual to real); real-real (transformed real to real); and so forth. The space recognition module 304 may be configured to recognize the spaces captured by the image capturing device 202. The spaces may include, but are not limited to, real spaces, virtual spaces, and mixed reality spaces. The mixed reality spaces further include augmented real over virtual, augmented virtual over real, and shared spaces. The entity tagging module 306 may be configured to tag the objects captured by the image capturing device 202. The entity tagging module 306 may be configured to tag every entity as virtual, real, virtual-real, or real-virtual, and by the time and space it belongs to right from its origin, and so forth. The entity tagging module 306 may also be configured to identify whether the entity is a sensor entity or a non-sensor entity. The interaction management module 308 may be configured to provide the interaction or the communication interface between the computing device 110 and the user. The interactions may include, but are not limited to, human operations (for example, physical forces or actions), machine operations (for example, physical force or machine actions), a visual-keyboard interface (for example, a text-based command interface), a visual-mouse interface (for example, mouse-based commands, click commands), a visual-touch interface (for example, touch-based or click-command based), visual-haptic (for example, a motion interface), a voice-based interface (for example, voice commands), virtual physics (for example, a gesture-based interface), a visual-gaze-based interface (for example, communicating using gaze or focus point), a mixed interface (for example, a combination of one or more interactions), and so forth.
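A minimal sketch of how entity and space tags of the kind described in this paragraph could be represented is shown below; the enum values and field names are assumptions drawn from the categories listed above, not a disclosed data model.
```python
from dataclasses import dataclass
from datetime import datetime, timezone
from enum import Enum


class EntityKind(Enum):
    VIRTUAL = "virtual"
    REAL = "real"
    VIRTUAL_REAL = "virtual-real"
    REAL_VIRTUAL = "real-virtual"


class SpaceKind(Enum):
    REAL = "real"
    VIRTUAL = "virtual"
    MIXED = "mixed"  # augmented real over virtual, augmented virtual over real, shared spaces


@dataclass
class EntityTag:
    """Tag of the kind the entity tagging module 306 might attach (field names are assumptions)."""
    kind: EntityKind
    origin_space: SpaceKind
    origin_time: str        # the entity is tagged by time right from its origin
    is_sensor_entity: bool  # sensor entity vs. non-sensor entity


def tag_entity(kind: EntityKind, space: SpaceKind, is_sensor_entity: bool) -> EntityTag:
    return EntityTag(kind=kind, origin_space=space,
                     origin_time=datetime.now(timezone.utc).isoformat(),
                     is_sensor_entity=is_sensor_entity)


if __name__ == "__main__":
    print(tag_entity(EntityKind.REAL_VIRTUAL, SpaceKind.MIXED, is_sensor_entity=False))
```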
[0024] Referring to FIG. 4, a flowchart 400 depicts an exemplary method for providing the interaction between the computing device and the user, in accordance with one or more exemplary embodiments. As an option, the method 400 may be carried out in the context of the details of FIG. 1, FIG. 2, and FIG. 3. However, the method 400 may also be carried out in any desired environment. Further, the aforementioned definitions apply equally to the description below.
[0025] The exemplary method 400 commences at step 402 by capturing the physical objects of a scene from a physical environment by the image capturing device. Thereafter, at step 404, the captured physical objects are transformed into virtual objects by the physical sensor, and the captured physical objects of the scene are processed from multiple viewpoints by the visual sensor. Thereafter, at step 406, the virtual objects are combined with real objects, and virtual users with real users, by applying the transformation rules using the central database to obtain the transformed data. At step 408, it is determined whether the transformation rules have been applied so that the transformed data can be sent. If the answer at step 408 is NO, the process returns to step 406. If the answer at step 408 is YES, the transformed data is sent through the intermediate channel at step 410. Thereafter, at step 412, the entities are indexed by the entity detection module and the space is tagged by the space recognition module. Thereafter, at step 414, the output file is created and transferred to the mixed reality interfacing module. Thereafter, at step 416, the interactions between the computing device and the user are provided by the mixed reality interfacing module.
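The flow of FIG. 4 can be summarized as a linear pipeline. The sketch below mirrors steps 402 through 416 with hypothetical stand-in functions; none of the helper names or data shapes come from the disclosure, and the output-file step only hints at where the encryption mentioned in paragraph [0022] would occur.
```python
import json
from typing import Any, Dict, List

# A linear sketch of the flow of FIG. 4 (steps 402-416). Every helper below is a
# hypothetical stand-in: the disclosure names the steps but not their implementations.

def capture_physical_objects() -> List[Dict[str, Any]]:                          # step 402
    return [{"name": "chair", "kind": "real"}]

def transform_to_virtual(objs: List[Dict[str, Any]]) -> List[Dict[str, Any]]:    # step 404
    return [{**o, "kind": "real-virtual"} for o in objs]

def apply_transformation_rules(objs: List[Dict[str, Any]]) -> List[Dict[str, Any]]:  # steps 406/408
    # Combine virtual objects with real objects / virtual users with real users.
    return [{**o, "rules_applied": True} for o in objs]

def send_through_intermediate_channel(objs: List[Dict[str, Any]]) -> List[Dict[str, Any]]:  # step 410
    return objs  # placeholder for the intermediate channel

def index_and_tag(objs: List[Dict[str, Any]]) -> List[Dict[str, Any]]:           # step 412
    return [{**o, "index": i, "space": "mixed"} for i, o in enumerate(objs)]

def create_output_file(objs: List[Dict[str, Any]]) -> bytes:                     # step 414
    return json.dumps(objs).encode("utf-8")  # a real system would also encrypt this payload

def provide_interactions(output_file: bytes) -> None:                            # step 416
    print("interfacing module received:", json.loads(output_file.decode("utf-8")))


if __name__ == "__main__":
    data = capture_physical_objects()
    data = transform_to_virtual(data)
    data = apply_transformation_rules(data)
    if all(o.get("rules_applied") for o in data):   # decision at step 408
        data = send_through_intermediate_channel(data)
    provide_interactions(create_output_file(index_and_tag(data)))
```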
[0026] Referring to FIG. 5, a block diagram 500 illustrates the details of a digital processing system 500 in which various aspects of the present disclosure are operative by execution of appropriate software instructions. The digital processing system 500 may correspond to the computing device 110 (or any other system in which the various features disclosed above can be implemented).
[0027] Digital processing system 500 may contain one or more processors such as a central processing unit (CPU) 510, a random access memory (RAM) 520, a secondary memory 530, a graphics controller 560, a display unit 570, a network interface 580, and an input interface 590. All the components except display unit 570 may communicate with each other over communication path 550, which may contain several buses, as is well known in the relevant arts. The components of FIG. 5 are described below in further detail.
[0028] CPU 510 may execute instructions stored in RAM 520 to provide several features of the present disclosure. CPU 510 may contain multiple processing units, with each processing unit potentially being designed for a specific task. Alternatively, CPU 510 may contain only a single general-purpose processing unit.
[0029] RAM 520 may receive instructions from secondary memory 530 using communication path 550. RAM 520 is shown currently containing software instructions, such as those used in threads and stacks, constituting shared environment 525 and/or user programs 526. Shared environment 525 includes operating systems, device drivers, virtual machines, etc., which provide a (common) run time environment for execution of user programs 526.
[0030] Graphics controller 560 generates display signals (e.g., in RGB format) to display unit 570 based on data/instructions received from CPU 510. Display unit 570 contains a display screen to display the images defined by the display signals. Input interface 590 may correspond to a keyboard and a pointing device (e.g., touch-pad, mouse) and may be used to provide inputs. Network interface 580 provides connectivity to a network (e.g., using Internet Protocol), and may be used to communicate with other systems (such as those shown in FIG. 1) connected to the network 108.
[0031] Secondary memory 530 may contain hard drive 535, flash memory 536, and removable storage drive 537. Secondary memory 530 may store the data and software instructions (e.g., for performing the actions noted above with respect to the Figures), which enable digital processing system 500 to provide several features in accordance with the present disclosure.
[0032] Some or all of the data and instructions may be provided on removable storage unit 540, and the data and instructions may be read and provided by removable storage drive 537 to CPU 510. A floppy drive, a magnetic tape drive, a CD-ROM drive, a DVD drive, flash memory, and a removable memory chip (PCMCIA card, EEPROM) are examples of such a removable storage drive 537.
[0033] Removable storage unit 540 may be implemented using a medium and storage format compatible with removable storage drive 537 such that removable storage drive 537 can read the data and instructions. Thus, removable storage unit 540 includes a computer-readable (storage) medium having stored therein computer software and/or data. However, the computer (or machine, in general) readable medium can be in other forms (e.g., non-removable, random access, etc.).
[0034] In this document, the term "computer program product" is used to generally refer to removable storage unit 540 or hard disk installed in hard drive 535. These computer program products are means for providing software to digital processing system 500. CPU 510 may retrieve the software instructions, and execute the instructions to provide various features of the present disclosure described above.
[0035] The term “storage media/medium” as used herein refers to any non-transitory media that store data and/or instructions that cause a machine to operate in a specific fashion. Such storage media may comprise non-volatile media and/or volatile media. Non-volatile media includes, for example, optical disks, magnetic disks, or solid-state drives, such as secondary memory 530. Volatile media includes dynamic memory, such as RAM 520. Common forms of storage media include, for example, a floppy disk, a flexible disk, a hard disk, a solid-state drive, magnetic tape, or any other magnetic data storage medium, a CD-ROM, any other optical data storage medium, any physical medium with patterns of holes, a RAM, a PROM, an EPROM, a FLASH-EPROM, an NVRAM, or any other memory chip or cartridge.
[0036] Storage media is distinct from but may be used in conjunction with transmission media. Transmission media participates in transferring information between storage media. For example, transmission media includes coaxial cables, copper wire, and fiber optics, including the wires that comprise bus (communication path) 550. Transmission media can also take the form of acoustic or light waves, such as those generated during radio-wave and infra-red data communications.
[0037] Reference throughout this specification to “one embodiment”, “an embodiment”, or similar language means that a particular feature, structure, or characteristic described in connection with the embodiment is included in at least one embodiment of the present disclosure. Thus, appearances of the phrases “in one embodiment”, “in an embodiment”, and similar language throughout this specification may, but do not necessarily, all refer to the same embodiment.
[0038] Furthermore, the described features, structures, or characteristics of the disclosure may be combined in any suitable manner in one or more embodiments. In the above description, numerous specific details are provided such as examples of programming, software modules, user selections, network transactions, database queries, database structures, hardware modules, hardware circuits, hardware chips, etc., to provide a thorough understanding of embodiments of the disclosure.
[0039] Although the present disclosure has been described in terms of certain preferred embodiments and illustrations thereof, other embodiments and modifications to preferred embodiments may be possible that are within the principles and spirit of the invention. The above descriptions and figures are therefore to be regarded as illustrative and not restrictive.
[0040] Thus the scope of the present disclosure is defined by the appended claims and includes both combinations and sub-combinations of the various features described hereinabove as well as variations and modifications thereof, which would occur to persons skilled in the art upon reading the foregoing description.

Claims

CLAIMS
We Claim:
1. A system for providing interaction rules in mixed reality, comprising: a wearable device 102 wirelessly connected to a computing device 110 via a network 108, whereby the network 108 facilitates the communication and interaction between the wearable device 102 and the computing device 110 and the wearable device 102 is worn by a user; and a cloud server 106 is configured to receive a virtual object and transmit the virtual object to the wearable device 102 over the network 108 and the cloud server 106 is associated with mixed reality immersive experience and provides the interactions and communications to the user on the computing device 110 by using the wearable device 102, whereby the wearable device 102 comprises a processing device 104 configured to transform a physical object to a virtual object with a virtual time stamp and a virtual dimensional information of space.
2. The system as claimed in claim 1, wherein the wearable device 102 comprises an image capturing device 202 configured to capture physical objects from a physical environment and real-world objects within the environment.
3. The system as claimed in claim 1, wherein the wearable device 102 further comprises a physical sensor 204 configured to transform a physical object to a virtual object and forcing the eyes to see the virtual object and allowing the brain to interpret that virtual object as real.
4. The system as claimed in claim 1, wherein the wearable device 102 further comprises a visual sensor 206 capable of processing and fusing images of a scene from multiple viewpoints.
5. The system as claimed in claim 1, wherein the wearable device 102 also comprises artificial sensors 208 configured to establish the calibration with the human senses.
6. The system as claimed in claim 1, wherein the computing device 110 comprises a mixed reality interfacing module 112 configured to provide transformation rules for the mixed reality interactions.
7. The system as claimed in claim 6, wherein the mixed reality interfacing module 112 comprises an entity detection module 302 configured to detect the multiple entities of objects captured by the image capturing device 202.
8. The system as claimed in claim 6, wherein the mixed reality interfacing module 112 comprises a space recognition module 304 configured to recognize the spaces captured by the image capturing device 202.
9. The system as claimed in claim 6, wherein the mixed reality interfacing module 112 comprises an entity tagging module 306 configured to identify the entity as a sensor entity and a non-sensor entity.
10. The system as claimed in claim 9, wherein the entity tagging module 306 configured to tag every entity as a virtual, real, virtual-real, real-virtual and by time and, space belongs right from its origin.
11. The system as claimed in claim 6, wherein the mixed reality interfacing module 112 comprises an interaction management module 308 configured to provide interaction and the communication interface between the computing device 110 and the user.
12. The system as claimed in claim 6, wherein the mixed reality interfacing module 112 comprises a central database 310 configured to store the transformation rules for the mixed reality interactions.
13. A method for providing interaction rules in mixed reality, comprising: capturing physical objects of a scene by an image capturing device 202 from a physical environment; transforming the captured physical objects to virtual objects by a physical sensor 204 and processing the captured physical objects of the scene from multiple viewpoints by a visual sensor 206; combining the virtual object to real objects, virtual users to real users by applying the transformation rules using a central database 310 to obtain the transformed data; and sending the transformed data to an intermediate channel and indexing the entities and then tagging the space by a mixed reality interfacing module 112, whereby the mixed reality interfacing module 112 configured to create an output file and providing the interactions between the computing device 110 and the user by the mixed reality interfacing module 112.
14. The method as claimed in claim 13, wherein the mixed reality interfacing module 112 encrypts the created output file and then decrypts the output file.
15. The method as claimed in claim 14, wherein the mixed reality interfacing module 112 receives the output file.
16. A computer program product comprising a non-transitory computer-readable medium having a computer-readable program code embodied therein to be executed by one or more processors, the program code including instructions to: capture physical objects of a scene by an image capturing device 202 from a physical environment; transform the captured physical objects to virtual objects by a physical sensor 204 and processing the captured physical objects of the scene from multiple viewpoints by a visual sensor 206; combine the virtual object to real objects, virtual users to real users by applying the transformation rules using a central database 310 to obtain the transformed data; and send the transformed data to an intermediate channel and indexing the entities and then tagging the space by a mixed reality interfacing module 112, whereby the mixed reality interfacing module 112 configured to create an output file and providing the interactions between the computing device 110 and the user by the mixed reality interfacing module 112.
PCT/IB2020/052565 2019-02-13 2020-03-20 Computer-implemented method and system for providing interaction rules in mixed reality WO2020165885A1 (en)

Applications Claiming Priority (2)

Application Number Priority Date Filing Date Title
IN201941005749 2019-02-13
IN201941005749 2019-02-13

Publications (1)

Publication Number Publication Date
WO2020165885A1 true WO2020165885A1 (en) 2020-08-20

Family

ID=72044208

Family Applications (1)

Application Number Title Priority Date Filing Date
PCT/IB2020/052565 WO2020165885A1 (en) 2019-02-13 2020-03-20 Computer-implemented method and system for providing interaction rules in mixed reality

Country Status (1)

Country Link
WO (1) WO2020165885A1 (en)

Cited By (5)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN112364478A (en) * 2020-09-30 2021-02-12 深圳市为汉科技有限公司 Virtual reality-based testing method and related device
CN112527101A (en) * 2020-11-09 2021-03-19 义乌市输变电工程有限公司 Remote control method and device for variable electric field
CN112527100A (en) * 2020-11-09 2021-03-19 义乌市输变电工程有限公司 Remote assistance method and device based on intelligent wearable equipment
WO2023019982A1 (en) * 2021-08-17 2023-02-23 广州博冠信息科技有限公司 Same-screen interaction control method and apparatus, and electronic device and storage medium
WO2024049585A1 (en) * 2022-08-31 2024-03-07 Snap Inc. Timelapse of generating a collaborative object

Patent Citations (2)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
WO2012135554A1 (en) * 2011-03-29 2012-10-04 Qualcomm Incorporated System for the rendering of shared digital interfaces relative to each user's point of view
WO2017020132A1 (en) * 2015-08-04 2017-02-09 Yasrebi Seyed-Nima Augmented reality in vehicle platforms

Similar Documents

Publication Publication Date Title
WO2020165885A1 (en) Computer-implemented method and system for providing interaction rules in mixed reality
US10664060B2 (en) Multimodal input-based interaction method and device
Betancourt et al. The evolution of first person vision methods: A survey
US9563272B2 (en) Gaze assisted object recognition
US20220256647A1 (en) Systems and Methods for Collaborative Edge Computing
US20220237812A1 (en) Item display method, apparatus, and device, and storage medium
US20190087647A1 (en) Method and apparatus for facial recognition
JP2020509504A (en) Image tagging method, apparatus, and electronic device
JP2018505462A (en) Avatar selection mechanism
CN108307214B (en) Method and apparatus for controlling a device
US10963277B2 (en) Network error detection using virtual reality display devices
TW200844795A (en) Controlling a document based on user behavioral signals detected from a 3D captured image stream
US20160364008A1 (en) Smart glasses, and system and method for processing hand gesture command therefor
Kim et al. Watch & Do: A smart iot interaction system with object detection and gaze estimation
US20190050068A1 (en) Causing specific location of an object provided to a device
KR102094953B1 (en) Method for eye-tracking and terminal for executing the same
US10101885B1 (en) Interact with TV using phone camera and touch
Milazzo et al. KIND‐DAMA: A modular middleware for Kinect‐like device data management
Rumiński et al. Performance analysis of interaction between smart glasses and smart objects using image-based object identification
Kopinski et al. Touchless interaction for future mobile applications
Bhowmik Natural and intuitive user interfaces with perceptual computing technologies
KR20200066133A (en) Electronic device implementing mobile device and gateway device operating on platform
Ganesan et al. Deep learning based smart survilance robot
US20240096319A1 (en) Gaze-based command disambiguation
WO2024066977A1 (en) Palm-based human-computer interaction method, and apparatus, device, medium and program product

Legal Events

Date Code Title Description
121 Ep: the epo has been informed by wipo that ep was designated in this application

Ref document number: 20756726

Country of ref document: EP

Kind code of ref document: A1

NENP Non-entry into the national phase

Ref country code: DE

122 Ep: pct application non-entry in european phase

Ref document number: 20756726

Country of ref document: EP

Kind code of ref document: A1

32PN Ep: public notification in the ep bulletin as address of the adressee cannot be established

Free format text: NOTING OF LOSS OF RIGHTS PURSUANT TO RULE 112(1) EPC (EPO FORM 1205A DATED 03.11.2022)
