WO2020070756A1 - A system and a method for providing a virtual tour - Google Patents

A system and a method for providing a virtual tour

Info

Publication number
WO2020070756A1
Authority
WO
WIPO (PCT)
Prior art keywords
interest
place
unit
location
user
Application number
PCT/IN2019/050729
Other languages
French (fr)
Inventor
Gils JAMES
Joseph James MAPPUMCHERY
Original Assignee
James Gils
Application filed by James Gils filed Critical James Gils
Publication of WO2020070756A1 publication Critical patent/WO2020070756A1/en

Classifications

    • G: PHYSICS
    • G06: COMPUTING; CALCULATING OR COUNTING
    • G06F: ELECTRIC DIGITAL DATA PROCESSING
    • G06F3/00: Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
    • G06F3/01: Input arrangements or combined input and output arrangements for interaction between user and computer
    • G06F3/011: Arrangements for interaction with the human body, e.g. for user immersion in virtual reality
    • A: HUMAN NECESSITIES
    • A63: SPORTS; GAMES; AMUSEMENTS
    • A63G: MERRY-GO-ROUNDS; SWINGS; ROCKING-HORSES; CHUTES; SWITCHBACKS; SIMILAR DEVICES FOR PUBLIC AMUSEMENT
    • A63G31/00: Amusement arrangements
    • A63G31/16: Amusement arrangements creating illusions of travel
    • G: PHYSICS
    • G06: COMPUTING; CALCULATING OR COUNTING
    • G06F: ELECTRIC DIGITAL DATA PROCESSING
    • G06F3/00: Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
    • G06F3/01: Input arrangements or combined input and output arrangements for interaction between user and computer
    • G06F3/011: Arrangements for interaction with the human body, e.g. for user immersion in virtual reality
    • G06F3/012: Head tracking input arrangements
    • G: PHYSICS
    • G06: COMPUTING; CALCULATING OR COUNTING
    • G06F: ELECTRIC DIGITAL DATA PROCESSING
    • G06F3/00: Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
    • G06F3/01: Input arrangements or combined input and output arrangements for interaction between user and computer
    • G06F3/011: Arrangements for interaction with the human body, e.g. for user immersion in virtual reality
    • G06F3/013: Eye tracking input arrangements
    • G: PHYSICS
    • G06: COMPUTING; CALCULATING OR COUNTING
    • G06F: ELECTRIC DIGITAL DATA PROCESSING
    • G06F3/00: Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
    • G06F3/01: Input arrangements or combined input and output arrangements for interaction between user and computer
    • G06F3/017: Gesture based interaction, e.g. based on a set of recognized hand gestures
    • G: PHYSICS
    • G06: COMPUTING; CALCULATING OR COUNTING
    • G06F: ELECTRIC DIGITAL DATA PROCESSING
    • G06F3/00: Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
    • G06F3/01: Input arrangements or combined input and output arrangements for interaction between user and computer
    • G06F3/03: Arrangements for converting the position or the displacement of a member into a coded form
    • G06F3/033: Pointing devices displaced or positioned by the user, e.g. mice, trackballs, pens or joysticks; Accessories therefor
    • G06F3/0346: Pointing devices with detection of the device orientation or free movement in a 3D space, e.g. 3D mice, 6-DOF [six degrees of freedom] pointers using gyroscopes, accelerometers or tilt-sensors
    • G: PHYSICS
    • G06: COMPUTING; CALCULATING OR COUNTING
    • G06T: IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T19/00: Manipulating 3D models or images for computer graphics
    • G06T19/003: Navigation within 3D models or images

Definitions

  • FIG. 7 illustrates a flow ? chart describing a method for providing the virtual tour, in accordance with an embodiment of FIGS. 1-6 of the present disclosure
  • a place of interest and the guide 110-B are selected by the user 102-B using the first application 102-A of the first device 102, for providing the virtual tour of the place of interest. It may be noted that the first device 102 is associated with the user 102-B.
  • the second application 110-A of the second device 110 is operatively connected to the first application 102-A of the first device 102. It may be noted that the second device 110 is associated with the guide 110-B.
  • the second device 110 includes the second application 110-A configured to operatively connect to the first application 102-A of the first device 102.
  • the place of interest is captured by the location capturing unit 114. In one embodiment, the location capturing unit 114 is integrated to the second device 110. In another embodiment, the location capturing unit 114 is disposed separate from the second device 110.
  • the second application 110-A is configured to use the camera of the second device 110 to capture a view of the place of interest.
  • the captured view of the place of interest may include, for example, one or more media streams of the place of interest.
  • the second application 110-A is configured to use the ambient condition sensors for detecting the ambient conditions associated with the place of interest.
  • the ambient condition sensors may be used to detect parameters such as, but not limited to, temperature, humidity, air-pressure, or a combination thereof of the place of interest.
  • the camera is stationed at the place of interest, or the camera can be carried by the guide 110-B at the place of interest, to capture the view of the place of interest.
  • the sensors may be connected to the second device 110 via a Bluetooth or Wi-Fi connection.
  • one or more media streams of the selected place of interest captured by the location capturing unit 114 are rendered to the user 102-B by the computer-generated simulation device 104. It may be noted that the computer-generated simulation device 104 is operatively connected to the first device 102 and is used to render the one or more media streams of the selected place of interest.
  • a virtual condition associated with the selected place of interest is rendered to the user 102-B by the location mimic unit 106.
  • the virtual condition associated with the selected place of interest is captured by the location capturing unit 114.
  • the location mimic unit 106 is operatively connected to the first device 102 and renders the virtual condition associated with the selected place of interest to the user 102-B.
  • the virtual condition includes one or more ambient conditions, a type of the place of interest, or a combination thereof.
  • FIG. 8J are example screens shots illustrating the first application for providing the virtual tour, in accordance with an example embodiment of the present disclosure.
  • the first application provided on the first device 102 enables the user 102-B to select an option of either 'explore the world' or 'be a guide'. On selecting the option 'explore the world', the first application further presents the user 102-B with the options 'explore for free' and 'hire an explorer'.
  • on selecting the option 'hire an explorer', the user 102-B is provided with the options 'guide for free' or 'guide for credit'. Further, on selecting either of these options, the first application redirects the user 102-B to a Google map to choose the location of a place of interest for the virtual tour. Upon choosing the location of the place of interest, the first application starts searching for a guide 110-B near the location of the place of interest for the user 102-B (see the matching sketch after this list).
  • upon finding the guide 110-B at the location of the place of interest, the first application confirms the availability of the guide 110-B near the location of the place of interest to the user 102-B and also sends a request to the guide 110-B to connect with the user 102-B. The guide 110-B can then accept the request for connecting with the user 102-B. The first application further displays a message to the guide 110-B, stating: "You are online now. Please wait and stay on the page while an explorer requests you." Upon receiving the request from the user, the guide connects with the user and provides a virtual tour of the place of interest to the user.
  • a plurality of first devices may be connected to a single second device for rendering the media streams of a place of interest and for rendering a virtual condition associated with the selected place of interest. For example, if ten users would like to experience the virtual tour of a certain place, then all ten users may simultaneously receive the media stream from the second device 110 located in proximity to the place of interest.
  • the personalization of the virtual tour is accomplished by the location capturing unit, via the remote server, by selectively presenting a scene to each of the plurality of users depending on their gestures.
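
As a closing illustration, the 'search for a guide near the chosen location' step described in the screenshots above could be realized with a plain great-circle distance filter over registered guides. This sketch is not from the patent; the guide records, coordinates, and 5 km radius are illustrative assumptions.

```python
# Illustrative guide-matching sketch: filter online guides by distance to the
# chosen place of interest. All data below is hypothetical.
import math


def haversine_km(lat1, lon1, lat2, lon2):
    """Great-circle distance between two (lat, lon) points in kilometres."""
    p1, p2 = math.radians(lat1), math.radians(lat2)
    dp = math.radians(lat2 - lat1)
    dl = math.radians(lon2 - lon1)
    a = math.sin(dp / 2) ** 2 + math.cos(p1) * math.cos(p2) * math.sin(dl / 2) ** 2
    return 6371.0 * 2 * math.asin(math.sqrt(a))


guides = [   # hypothetical online guides with their current coordinates
    {"id": "guide-1", "lat": 27.1751, "lon": 78.0421},
    {"id": "guide-2", "lat": 28.6139, "lon": 77.2090},
]
place = {"lat": 27.1733, "lon": 78.0408}   # the chosen place of interest

nearby = [g for g in guides
          if haversine_km(place["lat"], place["lon"], g["lat"], g["lon"]) < 5.0]
print([g["id"] for g in nearby])           # ['guide-1']
```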

Landscapes

  • Engineering & Computer Science (AREA)
  • General Engineering & Computer Science (AREA)
  • Theoretical Computer Science (AREA)
  • Physics & Mathematics (AREA)
  • General Physics & Mathematics (AREA)
  • Human Computer Interaction (AREA)
  • Software Systems (AREA)
  • Radar, Positioning & Navigation (AREA)
  • Remote Sensing (AREA)
  • Computer Graphics (AREA)
  • Computer Hardware Design (AREA)
  • User Interface Of Digital Computer (AREA)

Abstract

The system (100) comprises a first device (102) associated with a user (102-B), having a first application (102-A) for enabling the user (102-B) to select a place of interest and a guide (110-B). The system (100) also comprises a second device (110) associated with the guide (110-B). The second device (110) comprises a second application (110-A) configured to operatively connect to the first application (102-A) and a location capturing unit (114) configured to capture the place of interest. The system (100) comprises a computer-generated simulation device (104) configured to operatively connect to the first device (102) and render one or more media streams of the selected place of interest. The system (100) also comprises a location mimic unit (106) configured to operatively connect to the first device (102) and render a virtual condition associated with the selected place of interest to the user (102-B).

Description

A SYSTEM AND A METHOD FOR PROVIDING A VIRTUAL TOUR
FIELD OF TECHNOLOGY
[001] The present disclosure generally relates to virtual reality and, more particularly, to a system and a method for providing a virtual tour of a place of interest.
BACKGROUND
[002] Usage of the internet today has given rise to an accelerated form of personal interaction through instant messaging, social networking, internet forums, virtual touring, and the like. An individual can hold a video conference, connect with other people at far-off locations, watch online television, buy an article from an online store, and remotely participate in an activity in a distant environment without being physically present at that location. The activity of remotely participating at a place different from a person's true location is referred to as a telepresence application. With the help of telepresence, the individual can feel as if actually participating in the remote-location activities.
[003] Currently, in practice, there are a plurality of applications implementing telepresence technologies, for example, virtual diagnostics, virtual gaming, telemedicine, virtual tours, and the like. Among these, the virtual tour is gaining importance as a means of fulfilling people's desire to experience world travel. Existing telepresence technologies used for providing virtual tours implement webcams, teleconferencing devices, remotely operated mobile teleconferencing robots, and the like. With the existing technologies, users are restricted to an audio-visual dialog between persons located at different remote locations. Further, users are restricted to viewing along the direction in which the guide points the device at the remote location. Also, the existing applications restrict the user from getting a virtual experience of the ambient conditions of the remote location.
SUMMARY
[004] This summary is provided to introduce, in a simple manner, a selection of concepts that are further described in the detailed description of the disclosure. This summary is not intended to identify key or essential inventive concepts of the subject matter, nor is it intended to determine the scope of the disclosure.
[005] To overcome at least one of the above-mentioned problems, there exists a need for a telepresence application that provides a user a realistic virtual experience of the ambient conditions of a remote location. A system for a virtual tour is needed that provides the user a 360 degree view of the scene at the remote location.
[006] This summary is provided to introduce aspects related to method(s) and system(s) for providing virtual reality based telepresence applications.
[007] Briefly, according to an exemplary embodiment, a system for a virtual tour of a place of interest is provided. The system includes a first device associated with a user. The first device includes a first application for enabling the user to select the place of interest and a guide for providing the virtual tour of the place of interest. The system also includes a second device associated with the guide. The second device includes a second application configured to operatively connect to the first application of the first device and a location capturing unit configured to capture the place of interest. Further, the system includes a computer-generated simulation device configured to operatively connect to the first device and render one or more media streams of the selected place of interest, captured by the location capturing unit, to the user. In addition, the system includes a location mimic unit configured to operatively connect to the first device and render a virtual condition associated with the selected place of interest, captured by the location capturing unit, to the user.
[008] Briefly, according to an exemplary embodiment, a method for providing a virtual tour of a place of interest is disclosed. The method includes the step of enabling a user, by a first application of a first device, to select a place of interest and a guide for providing the virtual tour of the place of interest. The method also includes the step of operatively connecting a second application of a second device associated with the guide to the first application of the first device. The method further includes the step of capturing the place of interest by a location capturing unit. In addition, the method includes the step of rendering one or more media streams of the selected place of interest, captured by the location capturing unit, to the user by a computer-generated simulation device operatively connected to the first device. Moreover, the method includes the step of rendering a virtual condition associated with the selected place of interest, captured by the location capturing unit, by a location mimic unit operatively connected to the first device.
[009] The summary above is illustrative only and is not intended to be in any way limiting. Further aspects, exemplary embodiments, and features will become apparent by reference to the drawings and the following detailed description.
BRIEF DESCRIPTION OF THE FIGURES
[0010] These and other features, aspects, and advantages of the example embodiments will become better understood when the following detailed description is read with reference to the accompanying drawings in which like characters represent like parts throughout the drawings, wherein:
[0011] FIG. 1 illustrates a block diagram of a system 100 for providing a virtual tour in accordance with an embodiment of the present disclosure;
[0012] FIG. 2 illustrates a block diagram of the first device 102, the computer-generated simulation device 104, and the location mimic unit 106 in accordance with an embodiment of FIG. 1;
[0013] FIG. 3 illustrates an exploded block diagram of the second device 110, in accordance with an embodiment of FIG. 1 of the present disclosure;
[0014] FIG. 4 illustrates a block diagram of a control unit 402 connected to the first device 102 and the location mimic unit 106, in accordance with an embodiment of FIG. 1 of the present disclosure;
[0015] FIG. 5 illustrates a processing system 502 for transmitting and receiving data signals to and from the second device in accordance with another embodiment of the present disclosure;
[0016] FIG. 6 illustrates a feedback process of the system 100 in accordance with an exemplary embodiment of FIGS. 1-5 of the present disclosure;
[0017] FIG. 7 illustrates a flow chart describing a method for providing the virtual tour, in accordance with an embodiment of FIGS. 1-6 of the present disclosure; and
[0018] FIG. 8A, FIG. 8B, FIG. 8C, FIG. 8D, FIG. 8E, FIG. 8F, FIG. 8G, FIG. 8H, FIG. 8I and FIG. 8J are example screenshots illustrating the first application for providing the virtual tour, in accordance with an example embodiment of the present disclosure.
[0019] Further, skilled artisans will appreciate that elements in the figures are illustrated for simplicity and may not have necessarily been drawn to scale. Furthermore, in terms of the construction of the device, one or more components of the device may have been represented in the figures by conventional symbols, and the figures may show only those specific details that are pertinent to understanding the embodiments of the present invention so as not to obscure the figures with details that will be readily apparent to those of ordinary skill in the art having the benefit of the description herein.
DETAILED DESCRIPTION
[0020] For the purpose of promoting an understanding of the principles of the invention, reference will now be made to the embodiments illustrated in the figures, and specific language will be used to describe the same. It will nevertheless be understood that no limitation of the scope of the invention is thereby intended; such alterations and further modifications in the illustrated system, and such further applications of the principles of the invention as illustrated therein, are contemplated as would normally occur to one skilled in the art to which the invention relates.
[0021] It will be understood by those skilled in the art that the foregoing general description and the following detailed description are exemplary and explanatory of the invention and are not intended to be restrictive thereof.
[0022] The terms "comprises", "comprising", or any other variations thereof, are intended to cover a non-exclusive inclusion, such that a process or method that comprises a list of steps does not include only those steps but may include other steps not expressly listed or inherent to such process or method. Similarly, one or more devices or sub-systems or elements or structures or components preceded by "comprises... a" does not, without more constraints, preclude the existence of other or additional devices, sub-systems, elements, structures, or components. Appearances of the phrases "in an embodiment", "in another embodiment" and similar language throughout this specification may, but do not necessarily, all refer to the same embodiment.
[0023] Unless otherwise defined, all technical and scientific terms used herein have the same meaning as commonly understood by one of ordinary skill in the art to which this invention belongs. The system, methods, and examples provided herein are illustrative only and not intended to be limiting.
[0024] The terms 'augmented reality' and 'virtual reality' used throughout this disclosure refer to the techniques known in the art for computer-generated enhancements atop an existing reality and computer-generated simulation of a real-life environment, respectively. As such, the system and methods of the present disclosure are applicable for creating content to be viewed in both augmented reality and virtual reality. Also, the term 'image' refers to images, videos, and other such multimedia content.
[0025] In addition to the illustrative aspects, exemplary embodiments, and features described above, further aspects and exemplary embodiments of the present disclosure will become apparent by reference to the drawings and the following detailed description.
[0026] In one embodiment of the present disclosure, a system for providing a virtual tour of a place of interest is provided. The system provides the virtual tour of the place of interest to a user. The system includes a first device, a second device, a computer-generated simulation device, and a location mimic unit. The first device is associated with the user and the second device is associated with a guide. The computer-generated simulation device and the location mimic unit are operatively connected to the first device. The first device includes a first application that enables the user to select the place of interest and the guide for providing the virtual tour of the place of interest. The second device includes a second application and a location capturing unit. The second application is configured to operatively connect to the first application of the first device. The location capturing unit is configured to capture the place of interest. The computer-generated simulation device is used to render one or more media streams of the selected place of interest to the user. The one or more media streams are captured by the location capturing unit. The location mimic unit renders a virtual condition associated with the selected place of interest to the user.
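For orientation, the following Python sketch wires these components together in miniature. It is illustrative only: the class names, fields, and sample values are assumptions of this description, not taken from the patent.

```python
# Minimal sketch of the component wiring described in [0026]; all names and
# values are hypothetical stand-ins for the patent's elements.
from dataclasses import dataclass
from typing import Optional


@dataclass
class FirstApplication:
    """User-side application (102-A in the figures)."""
    selected_place: Optional[str] = None
    selected_guide: Optional[str] = None

    def select_tour(self, place: str, guide: str) -> None:
        self.selected_place = place
        self.selected_guide = guide


@dataclass
class LocationCapturingUnit:
    """Guide-side camera plus ambient sensors (114)."""
    def capture(self) -> dict:
        # Placeholder readings; a real unit would sample hardware here.
        return {"video_frame": b"\x00" * 64,
                "temperature_c": 28.5, "humidity_pct": 70.0}


@dataclass
class SimulationDevice:
    """VR/AR headset (104): renders the media streams."""
    def render(self, media: bytes) -> None:
        print(f"rendering {len(media)} bytes of media")


@dataclass
class LocationMimicUnit:
    """Cooling / heating / blower rig (106): renders ambient conditions."""
    def render_condition(self, conditions: dict) -> None:
        print(f"mimicking ambient conditions: {conditions}")


# One tour step: a guide-side capture flows to the two user-side renderers.
app = FirstApplication()
app.select_tour(place="Fort Kochi", guide="guide-110B")
capture = LocationCapturingUnit().capture()
SimulationDevice().render(capture["video_frame"])
LocationMimicUnit().render_condition(
    {k: v for k, v in capture.items() if k != "video_frame"})
```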
[0027] FIG. 1 illustrates a block diagram of a system 100 for providing a virtual tour in accordance with an embodiment of the present disclosure. The system 100 includes a first device 102 associated with a user 102-B, a computer-generated simulation device 104 operatively connected to the first device 102, a location mimic unit 106 operatively connected to the first device 102, a second device 110 associated with a guide 110-B, a location capturing unit 114 connected to the second device 110 through a network 112, and a communication network 108 for providing communication between the first device 102 and the second device 110.
[0028] In one embodiment, the communication network 108 is hosted on a cloud and is configured for providing communication between the first device 102 and the second device 110. As disclosed earlier, the first device 102 is associated with the user 102-B and the second device 110 is associated with the guide 110-B. The user 102-B, herein, refers to a person who wishes to have the virtual tour of the place of interest. The user 102-B can be at any location in the world, whereas the guide 110-B is a person who provides the virtual tour of the place of interest to the user 102-B. The guide 110-B is present at the place of interest. The place of interest, herein, refers to a place of which the user 102-B intends to have the virtual tour with the help of the guide 110-B. Herein, the first device 102 and the second device 110 include, but are not limited to, smart phones. The first and second devices 102, 110 can be any devices which have internet connections and are able to receive and transmit data bidirectionally.
[0029] In the same embodiment of the present disclosure, the first device 102 includes a first application 102-A. The first application 102-A enables the user 102-B to select, through a user interface, the place of interest and the guide 110-B for providing the virtual tour of the place of interest.
[0030] The second device 110 includes a second application 110-A. The second application 110-A is configured to operatively connect to the first application 102-A of the first device 102 through the communication network 108. In other words, the user 102-B and the guide 110-B are interconnected via the first and second applications 102-A, 110-A of the first and second devices 102, 110 through the communication network 108.
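The pairing of the two applications through the communication network 108 can be pictured as a small message broker. The sketch below is an assumed, in-memory stand-in: the broker class, method names, and message format are hypothetical, not part of the patent.

```python
# Assumed sketch of the cloud-hosted communication network (108) pairing the
# two applications; a real system would use a networked signaling service.
class CommunicationNetwork:
    def __init__(self):
        self.sessions = {}

    def connect(self, user_id: str, guide_id: str) -> str:
        session_id = f"{user_id}:{guide_id}"
        self.sessions[session_id] = []   # message queue shared by both apps
        return session_id

    def send(self, session_id: str, sender: str, payload: dict) -> None:
        self.sessions[session_id].append((sender, payload))

    def receive(self, session_id: str, recipient: str) -> list:
        # Deliver every queued message not sent by the recipient itself.
        inbox = [p for s, p in self.sessions[session_id] if s != recipient]
        self.sessions[session_id] = [
            (s, p) for s, p in self.sessions[session_id] if s == recipient
        ]
        return inbox


network = CommunicationNetwork()
session = network.connect(user_id="user-102B", guide_id="guide-110B")
network.send(session, "user-102B", {"type": "tour_request", "place": "Old Town"})
print(network.receive(session, "guide-110B"))  # [{'type': 'tour_request', ...}]
```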
[0031] The location capturing unit 114 is configured to capture the place of interest. In one embodiment of the present disclosure, the location capturing unit 114 is integrated to the second device 110. In another embodiment of the present disclosure, the location capturing unit 114 is disposed separate from the second device 110.
[0032] The location capturing unit 114 may include, but is not limited to, a camera and an ambient condition sensor (not shown in FIG. 1). The number of cameras and ambient condition sensors may vary depending on the application. In one embodiment, the camera and the ambient condition sensor are integrated to the second device 110. In yet another embodiment, the camera and the ambient condition sensor are disposed separate from the second device 110.
[0033] In one example in which the location capturing unit 114 is integrated to the second device 110, the second application 110-A is configured to use the camera of the second device 110 to capture a view of the place of interest. The captured view of the place of interest may include one or more media streams, such as video and audio streams, associated with the place of interest. Similarly, the second application 110-A is configured to use the ambient condition sensor integrated to the second device 110 for detecting ambient conditions. In embodiments where a plurality of ambient condition sensors are integrated to the second device 110, such ambient condition sensors may be used to detect parameters such as, but not limited to, temperature, humidity, air-pressure, or a combination thereof associated with the place of interest.
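One plausible shape for a single capture tick, packaging a camera frame together with the ambient-sensor readings for transmission, is sketched below; the function names and stand-in values are assumptions, not taken from the patent.

```python
# Illustrative sketch only: one tick of the location capturing unit (114)
# packaging a frame and ambient readings into a transmittable payload.
import json
import time


def read_camera_frame() -> bytes:
    return b"\x00" * 1024          # stand-in for an encoded video frame


def read_ambient_sensors() -> dict:
    # Stand-in values; real sensors would be polled here.
    return {"temperature_c": 31.2, "humidity_pct": 64.0, "pressure_hpa": 1008.3}


def capture_place_of_interest() -> dict:
    """Bundle one frame with the ambient readings captured alongside it."""
    return {
        "timestamp": time.time(),
        "frame": read_camera_frame().hex(),   # hex so the payload is JSON-safe
        "ambient": read_ambient_sensors(),
    }


payload = capture_place_of_interest()
print(json.dumps(payload["ambient"]))
```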
[0034] Further, the one or more media streams of the place of interest are transmitted via the communication network 108 and the first device 102 to the computer-generated simulation device 104. The computer-generated simulation device 104 renders the one or more media streams of the selected place of interest to the user 102-B. Similarly, the sensed ambient conditions of the place of interest are transmitted via the communication network 108 and the first device 102 to the location mimic unit 106. The location mimic unit 106 renders a virtual condition associated with the selected place of interest to the user 102-B. For example, the location mimic unit 106 may be used to render a virtual condition representative of temperature, humidity, air-pressure, or a combination thereof associated with the place of interest to the user 102-B.
[0035] In yet another embodiment, if the location capturing unit 114 is disposed separate from the second device 110, the camera is stationed at the place of interest, or the camera can be carried by the guide 110-B at the place of interest. As discussed earlier, the number of cameras may vary depending on the application. In one specific embodiment, a camera is configured to connect to a remote server via a network 112 such that the camera can be actuated so as to capture an entire 360 degree view of the place of interest.
[0036] Similarly, if the location capturing unit 114 is disposed separate from the second device 110, the one or more ambient condition sensors are disposed separately at the place of interest. The one or more ambient condition sensors may be connected to the second device 110 via a Bluetooth connection or through a Wi-Fi connection. The sensed ambient conditions of the place of interest are transmitted over the communication network 108 to the location mimic unit 106.
[0037] The computer-generated simulation device 104 may include a VR (Virtual Reality) enabled device, an AR (Augmented Reality) enabled device, or a combination thereof. In the illustrated embodiment, the user 102-B is shown wearing the computer-generated simulation device, for example, a VR headset. As discussed earlier, in one embodiment, the computer-generated simulation device 104 communicates with the communication network 108 via the first device 102. In another embodiment, the computer-generated simulation device 104 may communicate with the communication network 108 directly. The computer-generated simulation device 104 receives the one or more media streams continuously from the location capturing unit 114, and renders a virtual reality view of the place of interest to the user 102-B. In addition, the location mimic unit 106 renders a virtual condition associated with the selected place of interest to the user 102-B.
[0038] FIG. 2 illustrates a block diagram of the first device 102, the computer-generated simulation device 104, and the location mimic unit 106 in accordance with an embodiment of FIG. 1. As mentioned earlier, the computer-generated simulation device 104 and the location mimic unit 106 are operatively connected to the first device 102, in accordance with an embodiment of the present disclosure. The first device 102 includes a streaming unit 102-A1, an audio/video (A/V) receiver unit 102-A2, a head tracking unit 102-A3, a feedback unit 102-A4, a billing unit 102-A5, a processor 102-A6, and a memory 102-A7. The computer-generated simulation device 104 includes a streaming unit 104-1 and a detecting-monitoring unit 104-2. The location mimic unit 106 includes a cooling unit 106-1, an induction heating unit 106-2, and an air blower unit 106-3. It should be noted herein that FIG. 2 is an exemplary embodiment and should not be construed as a limitation of the invention.
[0039] In one example scenario, the first device 102 is a smartphone running the first application 102-A having the streaming unit 102-A1, the audio/video receiver unit 102-A2, the head tracking unit 102-A3, the feedback unit 102-A4, and the billing unit 102-A5. The streaming unit 102-A1 streams the media content received from the camera via the remote server (not shown in FIG. 2). The streaming unit 102-A1 may be configured to stream the media content, for example, the audio data, the video data, and any associated images transmitted from the location capturing unit 114 at the place of interest, either on the first device 102 itself or on a display unit of the computer-generated simulation device 104. The computer-generated simulation device 104 includes a VR (Virtual Reality) enabled device, an AR (Augmented Reality) enabled device, or a combination thereof.
[0040] The computer-generated simulation device 104 includes the detecting-monitoring unit 104-2 configured to detect a user's gesture. In the same example scenario, the audio/video receiver unit 102-A2 is configured to receive the data from the camera via the remote server in response to the request for rendering the virtual reality of the place of interest. In one embodiment, the audio/video receiver 102-A2 is configured to receive one or more ambient conditions, a type of the place of interest, or a combination thereof associated with the scene rendered on the VR headset, i.e. the scene transmitted from the place of interest. The ambient conditions include, but are not limited to, temperature, air pressure, wind conditions, sunlight, humidity, and the like. The ambient conditions and the type of the place of interest (for example, tides in an ocean, a snowfall, and the like) are further processed by the processor 102-A6 of the first device 102 for generating instructions for the location mimic unit 106, which is operatively connected to the first device 102. The location mimic unit 106, based on the instructions received from the processor 102-A6, generates a virtual condition associated with the selected place of interest for the user 102-B. The virtual condition is representative of one or more ambient conditions, a type of the place of interest, or a combination thereof. The location mimic unit 106 is used to generate, for example, a tactile, visual, or audio signal representative of a virtual condition of the place of interest for the user 102-B.
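A hedged sketch of this step follows: the processor's translation of received ambient conditions into actuator instructions for the location mimic unit. The comfort baseline, thresholds, and command format below are illustrative assumptions, not values from the patent.

```python
# Assumed sketch of processor 102-A6 mapping ambient readings to commands
# for the mimic unit's heater, cooler, and blower (106-1 .. 106-3).
COMFORT_TEMP_C = 24.0   # hypothetical baseline around the user


def instructions_for_mimic_unit(ambient: dict) -> list[dict]:
    commands = []
    temp = ambient.get("temperature_c", COMFORT_TEMP_C)
    if temp > COMFORT_TEMP_C:
        # Place is warmer than the user's room: warm the user proportionally.
        commands.append({"unit": "induction_heater",
                         "level": min((temp - COMFORT_TEMP_C) / 10, 1.0)})
    elif temp < COMFORT_TEMP_C:
        commands.append({"unit": "cooler",
                         "level": min((COMFORT_TEMP_C - temp) / 10, 1.0)})
    wind = ambient.get("wind_mps", 0.0)
    if wind > 0:
        commands.append({"unit": "air_blower", "level": min(wind / 20, 1.0)})
    return commands


print(instructions_for_mimic_unit({"temperature_c": 31.0, "wind_mps": 6.0}))
# [{'unit': 'induction_heater', 'level': 0.7}, {'unit': 'air_blower', 'level': 0.3}]
```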
[0041] As mentioned earlier, the location mimic unit 106 includes the cooling unit 106-1, the induction heating unit 106-2, and the air blower 106-3. The cooling unit 106-1, the induction heating unit 106-2, and the air blower 106-3 are configured for providing the virtual reality of the ambient conditions and the type of the place of interest to the user 102-B. In one embodiment, if a VR headset is used as the computer-generated simulation device, such a VR headset is configured to continuously monitor user gestures, such as eye movement and head movement, while in operation and communicate data associated with the user movements to the location capturing unit 114 via the remote server. The location mimic unit 106 provides the necessary visual, tactile, and audio signals to the user 102-B while rendering a virtual reality scene so as to match one or more actual conditions associated with the place of interest. For example, if the scene rendered on the VR headset depicts an ocean during a storm, the air blower 106-3 of the location mimic unit 106 is configured to generate, for example, an audio signal representative of the storm.
[0042] In the same embodiment, referring again to the first device 102, the memory 102-A7 is configured to store user information, ambient conditions data, data associated with the type of the place of interest, and any other historical data. Further, in the same embodiment, the head tracking unit 102-A3 is configured to continuously monitor user gestures during the rendering of the virtual reality of the place of interest. The gestures may include, for example, head movement, eye movement, hand movement, or the like. Each of such gestures is monitored to detect a current position and an orientation of the user 102-B in response to the virtual reality being rendered. The movement tracking of a user is of importance in a virtual reality environment, since the user is rendered the entire 360 degree field of view, and any change in movement of the user at any point in time during rendering of the virtual reality provides an indication of the regions within the virtual reality content that are of particular interest to the user. For example, during a five-minute video stream of a historical monument rendered to a user 102-B, if after the second minute the user wishes to go back and view the video frames of the first minute, the system would automatically render the scene from the first minute upon detecting the user's gestures. Moreover, in the same embodiment, with reference to the first device 102, the feedback unit 102-A4 is configured to provide a feedback representative of the user's gesture based on the rendered one or more media streams of the selected place of interest. In addition, the feedback unit 102-A4 is also configured to provide a feedback representative of a comparison of a virtual condition with an actual condition associated with the selected place of interest based on the captured place of interest. Also, in the same embodiment, the billing unit 102-A5 is configured for accounting of costs associated with the time period spent by the user 102-B using the virtual reality.
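The seek-back behaviour in the monument example above can be sketched as a gesture-to-playback mapping. The gesture vocabulary and the one-minute step below are assumptions, since the patent leaves both open.

```python
# Minimal sketch (assumed names): a recognised head-tracking gesture maps to
# a new playback position in the buffered tour stream.
class StreamPlayer:
    def __init__(self, duration_s: float):
        self.duration_s = duration_s
        self.position_s = 0.0

    def advance(self, dt: float) -> None:
        self.position_s = min(self.position_s + dt, self.duration_s)

    def on_gesture(self, gesture: str) -> None:
        # One possible mapping; a real unit would classify sensor data first.
        if gesture == "look_back":
            self.position_s = max(self.position_s - 60.0, 0.0)   # rewind a minute
        elif gesture == "look_forward":
            self.position_s = min(self.position_s + 60.0, self.duration_s)


player = StreamPlayer(duration_s=300.0)   # the five-minute monument stream
player.advance(120.0)                     # two minutes into the stream
player.on_gesture("look_back")
print(player.position_s)                  # 60.0, rewound one full minute
```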
[0043] FIG. 3 illustrates an exploded block diagram of the second device 110, in accordance with an embodiment of FIG. 1 of the present disclosure. In the illustrated embodiment, the second device 110 includes the location capturing unit 114, an audio/video (A/V) transmission unit 110-A2, a feedback unit 110-A3, a billing unit 110-A4, a processor 110-A5, and a memory 110-A6. In particular, the second device 110 is associated with the guide. In another embodiment, the location capturing unit 114 may be disposed separate from the second device 110.
[0044] In one embodiment, the second application 110-A is configured to use the camera 114-1 of the second device 110 to capture a view of the place of interest. Similarly, the second application 110-A is configured to use the plurality of ambient condition sensors 114-2 integrated to the second device 110 for detecting the ambient conditions of the place of interest. As mentioned earlier, the sensors may be used to detect, for example, temperature, humidity, air-pressure, or a combination thereof of the place of interest.
[0045] In the same embodiment, the audio/video transmission unit 110-A2 is configured to transmit the one or more media streams of the place of interest, captured by the location capturing unit 114, to the audio/video receiver 102-A2 of the first device 102. The configuration of the audio/video transmission unit 110-A2 of the second device 110 is similar to that of the audio/video receiver 102-A2 of the first device 102. Similarly, the configurations of the feedback unit 110-A3, the billing unit 110-A4, the processor 110-A5 and the memory 110-A6 of the second device 110 are similar to those of the corresponding components of the first device 102.
[0046] FIG. 4 illustrates a block diagram of a control unit 402 connected to the first device 102 and the location mimic unit 106, in accordance with an embodiment of FIG. 1 of the present disclosure. The control unit 402 is configured to control the location mimic unit 106 to render the virtual condition, associated with the selected place of interest and captured by the location capturing unit 114, to the user 102-B. In another embodiment, the first device 102 includes the control unit 402, which is configured to operatively connect to the location mimic unit 106.
[0047] FIG. 5 illustrates a processing system 502 for transmitting and receiving data signals to and from the second device 110, in accordance with another embodiment of the present disclosure. It should be noted herein that the processing system 502 is in addition to the processor 110-A5 shown in FIG. 3. In such an embodiment, the second device 110 may not include the ambient condition sensors 114-2. In the illustrated embodiment, the ambient condition sensors may be disposed outside the second device 110. Specifically, the plurality of sensors includes a temperature and humidity sensor 506, an accelerometer and gyrometer sensor 508, and an air flow sensor 504. In some embodiments, some sensors may be integrated to the second device 110 and other sensors may be disposed separate from the second device 110. All such permutations and combinations are envisioned within the scope of the invention. The sensed ambient conditions of the place of interest are received by the processing system 510 of the first device 102 and reproduced using the location mimic unit 106, in particular the induction heating unit 106-2, the cooling unit 106-1 and the air blower 106-3 of the location mimic unit 106.
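As a hedged illustration of the guide-side data path in FIG. 5, the sketch below packages readings from the temperature and humidity sensor 506 and the air flow sensor 504 into a payload for transmission to the first device 102. The reader functions, the JSON wire format and send() are stand-ins; the disclosure does not prescribe a transport or format.

```python
# An illustrative sketch of packaging sensor readings for transmission.
# Reader functions return placeholder values; send() stands in for the
# real transport (e.g. a socket to the remote server).
import json, time

def read_temperature_humidity():   # temperature and humidity sensor 506
    return 28.5, 0.74              # placeholder values
def read_air_flow():               # air flow sensor 504
    return 6.2                     # placeholder, m/s

def build_payload() -> bytes:
    temp_c, humidity = read_temperature_humidity()
    payload = {
        "timestamp": time.time(),
        "temperature_c": temp_c,
        "humidity": humidity,
        "air_flow_ms": read_air_flow(),
    }
    return json.dumps(payload).encode("utf-8")

def send(data: bytes) -> None:
    # Stand-in for the real transport.
    print(data.decode("utf-8"))

send(build_payload())
```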
[0048] FIG. 6 illustrates a feedback process of the system 100 in accordance with an exemplary embodiment of FIGS. 1-5 of the present disclosure.
[0049] Herein, the combination of the feedback unit 110-A3 and the feedback unit 102-A4 is used to monitor, in real time, guide sensor data 604 associated with the guide 110-B from the location capturing unit 114, and user sensor data 606 associated with the user 102-B from the detecting and monitoring unit 104-2 and the head tracking unit 102-A3 (for example, the user's gesture based on the rendered one or more media streams of the selected place of interest, and the comparison of the virtual condition with an actual condition associated with the selected place of interest based on the captured place of interest). The comparison of the guide sensor data 604 and the user sensor data 606 is represented by step 602. In one embodiment, the control unit 402 controls the actuators 512, the air blower 106-3, the induction heating unit 106-2 and the cooling unit 106-1 of the location mimic unit 106 based on the comparison of the user sensor data 606, associated with the virtual reality condition, with the guide sensor data 604, representative of the actual condition associated with the selected place of interest. Specifically, if there is a mismatch between the virtual reality condition and the actual condition, the control unit 402 controls the actuators 512, the air blower 106-3, the induction heating unit 106-2 and the cooling unit 106-1 of the location mimic unit 106 to rectify the mismatch.
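The mismatch rectification of step 602 can be read as a simple closed control loop: the guide sensor data 604 acts as the setpoint and the user sensor data 606 as the measured value. The sketch below shows one plausible, deliberately simplified realisation for temperature alone; the 0.5 degree tolerance and the step size are assumptions for illustration.

```python
# A minimal closed-loop sketch of mismatch rectification: guide sensor data
# is the setpoint, user sensor data the measured value. Thresholds and step
# size are illustrative assumptions.
TOLERANCE_C = 0.5

def rectify(guide_temp_c: float, user_temp_c: float) -> str:
    """Return which actuator the control unit 402 should drive."""
    error = guide_temp_c - user_temp_c
    if error > TOLERANCE_C:
        return "heating"   # induction heating unit 106-2
    if error < -TOLERANCE_C:
        return "cooling"   # cooling unit 106-1
    return "hold"          # conditions match within tolerance

# Simulated loop: the user's room (22 C) converges toward the guide's 30 C scene.
user_temp = 22.0
for _ in range(20):
    action = rectify(30.0, user_temp)
    if action == "heating":
        user_temp += 0.5
    elif action == "cooling":
        user_temp -= 0.5
    else:
        break
print(round(user_temp, 1), action)  # 30.0 'hold'
```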
[0050] FIG. 7 illustrates a flow chart describing a method for providing the virtual tour, in accordance with an embodiment of FIGS. 1-6 of the present disclosure.
[0051] At step 702, a place of interest and the guide 110-B are selected by the user 102-B, using the first application 102-A of the first device 102, for the virtual tour of the place of interest. It may be noted that the first device 102 is associated with the user 102-B.
[0052] At step 704, the second application 110-A of the second device 110 is operatively connected to the first application 102-A of the first device 102. It may be noted that the second device 110 is associated with the guide 110-B, and that its second application 110-A is configured to operatively connect to the first application 102-A of the first device 102.

[0053] At step 706, the place of interest is captured by the location capturing unit 114. In one embodiment, the location capturing unit 114 is integrated to the second device 110. In another embodiment, the location capturing unit 114 is disposed separate from the second device 110.
[0054] Further, if the location capturing unit 114 is integrated to the second device 110, the second application 110-A is configured to use the camera of the second device 110 to capture a view of the place of interest. The captured view of the place of interest includes, for example, one or more media streams of the place of interest. Similarly, if the location capturing unit 114 is integrated to the second device 110, the second application 110-A is configured to use the ambient condition sensors for detecting the ambient conditions associated with the place of interest. The ambient condition sensors may be used to detect parameters such as, but not limited to, temperature, humidity, air pressure, or a combination thereof, of the place of interest.
[0055] Similarly, if the location capturing unit 114 is disposed separate from the second device 110, the camera is either stationed at the place of interest or carried by the guide 110-B at the place of interest, to capture the view of the place of interest. Similarly, the sensors may be connected to the second device 110 through a Bluetooth or Wi-Fi connection.
[0056] At step 708, one or more media streams of the selected place of interest, captured by the location capturing unit 114, are rendered to the user 102-B by the computer-generated simulation device 104. It may be noted that the computer-generated simulation device 104 is operatively connected to the first device 102 and is used to render the one or more media streams of the selected place of interest.
[0057] At step 710, a virtual condition associated with the selected place of interest is rendered to the user 102-B by the location mimic unit 106. The virtual condition associated with the selected place of interest is captured by the location capturing unit 114. It may be noted that the location mimic unit 106 is operatively connected to the first device 102 and renders the virtual condition associated with the selected place of interest to the user 102-B. The virtual condition includes one or more ambient conditions, a type of the place of interest, or a combination thereof.
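Purely as an illustrative summary, the steps of FIG. 7 may be condensed into the following orchestration sketch, in which every function body is a hypothetical stand-in for the unit named in its comment.

```python
# The method of FIG. 7 condensed into a hypothetical orchestration sketch;
# every function body is a stand-in for the unit named in the comment.
def select_place_and_guide():                  # step 702, first application 102-A
    return "historical monument", "guide-110B"
def connect(guide):                            # step 704, second application 110-A
    return {"guide": guide, "connected": True}
def capture(place, session):                   # step 706, location capturing unit 114
    return f"media stream of {place}", {"temperature_c": 18.0}
def render_media(media):                       # step 708, simulation device 104
    print("rendering:", media)
def render_conditions(conditions):             # step 710, location mimic unit 106
    print("mimicking:", conditions)

def provide_virtual_tour():
    place, guide = select_place_and_guide()
    session = connect(guide)
    media, conditions = capture(place, session)
    render_media(media)
    render_conditions(conditions)

provide_virtual_tour()
```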
[0058] FIG. 8A to FIG. 8I and FIG. 8J are example screen shots illustrating the first application for providing the virtual tour, in accordance with an example embodiment of the present disclosure. The first application provided on the first device 102 enables the user 102-B to select an option of either 'explore the world' or 'be a guide'. On selecting the option 'explore the world', the first application further presents the user 102-B with the options 'explore for free' and 'hire an explorer'. On selecting the option 'hire an explorer', the user 102-B is presented with the options 'guide for free' and 'guide for credit'. Further, on selecting either of the options 'guide for free' or 'guide for credit', the first application redirects the user 102-B to a Google Maps view to choose the location of a place of interest for the virtual tour. Upon choosing the location of the place of interest, the first application starts searching for a guide 110-B near the location of the place of interest for the user 102-B. Upon finding the guide 110-B at the location of the place of interest, the first application confirms the availability of the guide 110-B near the location of the place of interest to the user 102-B, and also sends a request to the guide 110-B to connect with the user 102-B. The guide 110-B can then accept the request to connect with the user 102-B. The first application further displays a message to the guide 110-B, stating: "You are online now. Please wait and stay on the page while an explorer request you". Upon receiving the request from the user, the guide connects with the user and provides a virtual tour of the place of interest to the user.
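The guide search step described above amounts to a nearest-neighbour query over online guides. The following sketch shows one plausible realisation using the standard haversine great-circle distance; the guide list, the 5 km search radius and the data layout are illustrative assumptions.

```python
# A sketch of the guide search step: given the location chosen on the map,
# find the nearest available guide. The guide records and the 5 km radius
# are hypothetical.
from math import asin, cos, radians, sin, sqrt

def haversine_km(lat1, lon1, lat2, lon2):
    """Great-circle distance between two (lat, lon) points in kilometres."""
    dlat, dlon = radians(lat2 - lat1), radians(lon2 - lon1)
    a = sin(dlat / 2) ** 2 + cos(radians(lat1)) * cos(radians(lat2)) * sin(dlon / 2) ** 2
    return 2 * 6371.0 * asin(sqrt(a))

def nearest_guide(place, guides, radius_km=5.0):
    """Return the closest online guide within radius_km of the place, else None."""
    candidates = [
        (haversine_km(place[0], place[1], g["lat"], g["lon"]), g)
        for g in guides if g["online"]
    ]
    in_range = [c for c in candidates if c[0] <= radius_km]
    return min(in_range, key=lambda c: c[0], default=(None, None))[1]

guides = [
    {"name": "A", "lat": 48.8584, "lon": 2.2945, "online": True},  # near the place
    {"name": "B", "lat": 48.9000, "lon": 2.4000, "online": True},  # too far away
]
print(nearest_guide((48.8583, 2.2944), guides))  # guide "A" is matched
```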
[0059] In one embodiment, a plurality of first devices may be connected to a single second device for rendering the media streams of a place of interest and for rendering a virtual condition associated with the selected place of interest. For example, if ten users would like to experience the virtual tour of a certain place, all ten users may simultaneously receive the media stream from the second device 110 located in proximity to the place of interest. In one embodiment, personalization of the virtual tour is accomplished by the location capturing unit, via the remote server, selectively presenting a scene to each of the plurality of users depending on that user's gestures.
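One way to picture this fan-out is that a single captured 360-degree frame is shared among all connected first devices, while each user's headset renders only the viewport selected by that user's head orientation. The sketch below reduces frame handling to column indices; the 90-degree field of view and the column-per-degree layout are assumptions for illustration.

```python
# A sketch of the one-guide, many-users fan-out: a shared 360-degree frame,
# with each user's view cropped by that user's head yaw. Frame handling is
# simplified to column indices; indices past 359 wrap modulo FRAME_COLUMNS.
FRAME_COLUMNS = 360          # one column per degree of the panorama
FIELD_OF_VIEW = 90           # degrees visible to each user

def viewport(yaw_deg: float) -> range:
    """Columns of the shared frame that this user's headset should render."""
    start = int(yaw_deg - FIELD_OF_VIEW / 2) % FRAME_COLUMNS
    return range(start, start + FIELD_OF_VIEW)

# Ten users receive the same stream but look in different directions.
users = {f"user{i}": i * 36.0 for i in range(10)}  # yaw per user, in degrees
for name, yaw in users.items():
    v = viewport(yaw)
    print(name, "columns", v.start, "to", (v.start + FIELD_OF_VIEW - 1) % FRAME_COLUMNS)
```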
[0060] Consider an example where the user 102-B is interested in experiencing a virtual tour of a place of interest. During such an experience, the user 102-B may like to view a scene of the place of interest. The user 102-B may also like to experience a virtual feel of the ambient conditions of the place of interest.
[0061] Further, it is to be noted that while the embodiments of the present disclosure are described with reference to a virtual tour application, various other telepresence applications may be implemented using the exemplary systems and methods of the present disclosure. For example, the exemplary systems and methods of the present disclosure may be implemented in the fields of telemedicine, virtual diagnostics, virtual gaming, and the like.
[0062] While specific language has been used to describe the disclosure, any limitations arising on account of the same are not intended. As would be apparent to a person skilled in the art, various working modifications may be made to the method in order to implement the inventive concept as taught herein.
[0063] The figures and the foregoing description give examples of embodiments. Those skilled in the art will appreciate that one or more of the described elements may well be combined into a single functional element. Alternatively, certain elements may be split into multiple functional elements. Elements from one embodiment may be added to another embodiment. For example, orders of processes described herein may be changed and are not limited to the manner described herein. Moreover, the actions of any flow diagram need not be implemented in the order shown; nor do all of the acts necessarily need to be performed. Also, those acts that are not dependent on other acts may be performed in parallel with the other acts. The scope of embodiments is by no means limited by these specific examples. Numerous variations, whether explicitly given in the specification or not, such as differences in structure, dimension, and use of material, are possible. The scope of embodiments is at least as broad as given by the following claims.

Claims

WE CLAIM:
1. A system (100) for providing a virtual tour of a place of interest, the system (100) comprising:
a first device (102) associated with a user (102-B), wherein the first device (102) comprises a first application (102-A) for enabling the user (102-B) to select the place of interest and a guide (110-B) for providing the virtual tour of the place of interest; a second device (110) associated with the guide (110-B), wherein the second device (110) comprises a second application (110-A) configured to operatively connect to the first application (102-A) of the first device (102) and a location capturing unit (114) configured to capture the place of interest;
a computer-generated simulation device (104) configured to operatively connect to the first device (102) and render one or more media streams of the selected place of interest, captured by the location capturing unit (114), to the user (102-B); and
a location mimic unit (106) configured to operatively connect to the first device (102) and render a virtual condition associated with the selected place of interest captured by the location capturing unit (114) to the user (102-B).
2. The system (100) as claimed in claim 1, further comprising a feedback unit for providing a feedback representative of:
a user’s gesture based on the rendered one or more media streams of the selected place of interest; and
comparison of the virtual condition with an actual condition associated with the selected place of interest based on the captured place of interest.
3. The system (100) as claimed in claim 2, wherein the computer-generated simulation device (104) is configured to detect a user's gesture.
4. The system (100) as claimed in claim 1, wherein the virtual condition comprises one or more ambient conditions, a type of the place of interest, or a combination thereof.
5. The system (100) as claimed in claim 1, wherein the location capturing unit (114) is integrated to the second device (110).
6. The system (100) as claimed in claim 1, wherein the location capturing unit (114) is disposed separate from the second device (110).
7. The system (100) as claimed in claim 1, wherein the location capturing unit (114) comprises a camera and an ambient condition sensor.
8. The system (100) as claimed in claim 1, wherein the computer-generated simulation device (104) comprises a VR (Virtual Reality) enabled device, an AR (Augmented Reality) enabled device, or a combination thereof.
9. The system (100) as claimed in claim 1, wherein the location mimic unit (106) comprises an induction heating unit (106-2), a cooling unit (106-1), and an air blower (106-3).
10. The system (100) as claimed in claim 1, further comprising a control unit (402) connected to the first device (102) and the location mimic unit (106), wherein the control unit (402) is configured to control the location mimic unit (106) to render a virtual condition associated with the selected place of interest captured by the location capturing unit (114) to the user (102-B).
11. The system (100) as claimed in claim 1, wherein the first device (102) comprises a control unit (402) configured to operatively connect to the location mimic unit (106), wherein the control unit (402) is configured to control the location mimic unit (106) to render a virtual condition associated with the selected place of interest captured by the location capturing unit (114) to the user (102-B).
12. The system (100) as claimed in claim 1, wherein the one or more media streams comprises a video stream, an audio stream, or a combination thereof.
13. A method for providing a virtual tour of a place of interest, the method comprising steps of:
enabling a user (102-B), by a first application (102-A) of a first device (102), to select a place of interest and a guide (110-B) for providing the virtual tour of the place of interest;
operatively connecting a second application (110-A) of a second device (110) associated with the guide (110-B) to the first application (102-A) of the first device (102);
capturing the place of interest, by a location capturing unit (114);
rendering one or more media streams of the selected place of interest captured by the location capturing unit (114) to the user (102-B), by a computer-generated simulation device (104) operatively connected to the first device (102); and
rendering a virtual condition associated with the selected place of interest, captured by the location capturing unit (114), by a location mimic unit (106) operatively connected to the first device (102).
14. The method as claimed in claim 13, further comprising providing, by a feedback unit, a feedback representative of:
a user’s gesture based on the rendered one or more media streams of the selected place of interest; and
comparison of the virtual condition with an actual condition associated with the selected place of interest based on the captured place of interest.
15. The method as claimed in claim 13, wherein the first device (102) comprises a control unit (402) configured to operatively connect to the location mimic unit (106), wherein the control unit (402) is configured to control the location mimic unit (106) to render a virtual condition associated with the selected place of interest captured by the location capturing unit (114) to the user (102-B).
PCT/IN2019/050729 2018-10-03 2019-10-03 A system and a method for providing a virtual tour WO2020070756A1 (en)

Applications Claiming Priority (2)

Application Number Priority Date Filing Date Title
IN201841012607 2018-10-03
IN201841012607 2018-10-03

Publications (1)

Publication Number Publication Date
WO2020070756A1 true WO2020070756A1 (en) 2020-04-09

Family

ID=70054584

Family Applications (1)

Application Number Title Priority Date Filing Date
PCT/IN2019/050729 WO2020070756A1 (en) 2018-10-03 2019-10-03 A system and a method for providing a virtual tour

Country Status (1)

Country Link
WO (1) WO2020070756A1 (en)

Citations (3)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US8860787B1 (en) * 2011-05-11 2014-10-14 Google Inc. Method and apparatus for telepresence sharing
WO2017095647A1 (en) * 2015-12-03 2017-06-08 Microsoft Technology Licensing, Llc Immersive telepresence
WO2017151402A1 (en) * 2016-02-29 2017-09-08 Microsoft Technology Licensing, Llc Immersive interactive telepresence



Legal Events

Date Code Title Description
121 Ep: the epo has been informed by wipo that ep was designated in this application

Ref document number: 19869826

Country of ref document: EP

Kind code of ref document: A1

NENP Non-entry into the national phase

Ref country code: DE

122 Ep: pct application non-entry in european phase

Ref document number: 19869826

Country of ref document: EP

Kind code of ref document: A1