CN112738498B - Virtual tour system and method - Google Patents

Virtual tour system and method

Info

Publication number
CN112738498B
CN112738498B (application CN202011553781.9A)
Authority
CN
China
Prior art keywords
image
video image
user
viewing direction
direction information
Prior art date
Legal status (assumed; not a legal conclusion)
Active
Application number
CN202011553781.9A
Other languages
Chinese (zh)
Other versions
CN112738498A (en)
Inventor
张永忠
刘长城
Current Assignee
BOE Technology Group Co Ltd
Original Assignee
BOE Technology Group Co Ltd
Priority date (assumed; not a legal conclusion)
Filing date
Publication date
Application filed by BOE Technology Group Co Ltd filed Critical BOE Technology Group Co Ltd
Priority to CN202011553781.9A
Publication of CN112738498A
Application granted
Publication of CN112738498B


Classifications

    • H04N 13/00 Stereoscopic video systems; Multi-view video systems; Details thereof
    • H04N 13/275 Image signal generators from 3D object models, e.g. computer-generated stereoscopic image signals
    • H04N 13/296 Image signal generators; Synchronisation or control thereof
    • H04N 13/332 Image reproducers; Displays for viewing with the aid of special glasses or head-mounted displays [HMD]
    • H04N 13/344 Displays for viewing with the aid of special glasses or head-mounted displays [HMD] with head-mounted left-right displays
    • H04N 13/398 Image reproducers; Synchronisation or control thereof

Landscapes

  • Engineering & Computer Science (AREA)
  • Multimedia (AREA)
  • Signal Processing (AREA)
  • Processing Or Creating Images (AREA)
  • User Interface Of Digital Computer (AREA)

Abstract

The application discloses a virtual tour system and a virtual tour method. The detection module detects the user's viewing direction information in real time, the viewing direction information being the change of the user's current viewing direction relative to a reference viewing direction. The image acquisition module controls, according to the viewing direction information, the acquisition of an environment video image within the range indicated by that information. The display module acquires and displays a composite video image, obtained by compositing a partial image of the carrier and a partial image of the user's body, both extracted from the environment video image, with a scenic spot video image of the place to be visited. By displaying the composite video image through the wearable device, the application makes the user feel personally present at the scene, improving the realism and immersion of the tour.

Description

Virtual tour system and method
Technical Field
The application relates to the technical field of virtual reality, in particular to a virtual tour system and a virtual tour method.
Background
Travel, as a form of leisure, broadens people's horizons and brings joy after busy work, enriching daily life. However, traveling to distant places costs money and time, and is above all physically demanding, so many people cannot take part in this form of travel.
At present, the related art tours a virtual scenic spot by simulating the user riding in a vehicle. However, there are places within a scenic spot where a vehicle cannot plausibly travel, producing a strong sense of incongruity and reducing realism and immersion.
Disclosure of Invention
In view of the above drawbacks or shortcomings of the related art, it is desirable to provide a virtual tour system and method that allow a user to tour a virtual scenic spot without any sense of incongruity, thereby improving the realism and immersion of the tour.
In a first aspect, the present application provides a virtual tour system comprising a carrier for carrying a user and a wearable device worn on the user's head, wherein the wearable device comprises a detection module, an image acquisition module and a display module;
the detection module is used for detecting the user's viewing direction information in real time, the viewing direction information being the change of the user's current viewing direction relative to a reference viewing direction;
the image acquisition module is used for controlling, according to the viewing direction information, the acquisition of an environment video image within the range indicated by the viewing direction information;
the display module is used for acquiring and displaying a composite video image, the composite video image being obtained by compositing the partial image of the carrier and the partial image of the user's body, both extracted from the environment video image, with a scenic spot video image of the place to be visited.
Optionally, in some embodiments of the present application, the image acquisition module is further configured to acquire the environment video image when the viewing direction information reaches a trigger angle, and not to operate when the viewing direction information does not reach the trigger angle.
Optionally, in some embodiments of the application, the system further comprises:
a data processor, configured to subtract a predetermined angle from the viewing angle at which the image acquisition module last began acquiring the partial image of the carrier or the partial image of the user's body, and to use the result as the start trigger angle for the next activation of the image acquisition module.
Optionally, in some embodiments of the application, the system further comprises:
a data processor, configured to acquire the viewing direction information and the environment video image, extract the partial image of the carrier and the partial image of the user's body from the environment video image, and render and composite the scenic spot video image with the partial image of the carrier and the partial image of the user's body to obtain the composite video image.
Optionally, in some embodiments of the present application, the detection module includes a rotation angle sensing sub-module for sensing a rotation angle change of the head and a pitch angle sensing sub-module for sensing a pitch angle change of the head.
Optionally, in some embodiments of the present application, the wearable device further includes a play control module, where the play control module is configured to determine a play speed of the composite video image according to a body inclination angle of the user.
Optionally, in some embodiments of the application, the system further comprises:
an unmanned aerial vehicle provided with a panoramic camera, the panoramic camera being used to collect the scenic spot video image of the place to be visited.
Optionally, in some embodiments of the application, the system further comprises:
an unmanned aerial vehicle whose appearance is similar to that of the carrier, and a cartoon device disposed on the unmanned aerial vehicle, an image acquisition device being provided at the eye position of the cartoon device;
the cartoon device is used for receiving user action information collected by the wearable device and performing corresponding actions according to the action information.
In a second aspect, the application provides a virtual tour method performed by means of a wearable device worn on a user's head and a carrier for carrying the user, the method comprising:
the detection module detects the user's viewing direction information in real time, the viewing direction information being the change of the user's current viewing direction relative to a reference viewing direction;
the image acquisition module acquires an environment video image in the range corresponding to the viewing direction information;
the data processor acquires a scenic spot video image of the place to be visited and the environment video image, extracts the partial image of the carrier and the partial image of the user's body from the environment video image, and composites the partial image of the carrier, the partial image of the user's body and the scenic spot video image of the place to be visited to obtain a composite video image;
and the display module acquires and displays the composite video image.
Optionally, in some embodiments of the present application, the method further comprises:
determining whether the viewing direction information reaches the start trigger angle of the image acquisition module;
when the viewing direction information reaches the trigger angle, controlling the image acquisition module to acquire the environment video image;
and when the viewing direction information does not reach the trigger angle, controlling the image acquisition module not to operate.
Optionally, in some embodiments of the present application, the start trigger angle is the viewing angle at which the image acquisition module last began acquiring a partial image of the carrier or a partial image of the user's body, minus a predetermined angle.
In a third aspect, the present application provides a virtual tour method comprising:
acquiring an environment video image and a scenic spot video image of the place to be visited, wherein the environment video image and the scenic spot video image are sent by the wearable device;
extracting a partial image of the carrier and a partial image of the user's body from the environment video image;
compositing the partial image of the carrier, the partial image of the user's body and the scenic spot video image of the place to be visited to obtain a composite video image;
and sending the composite video image to the wearable device for display by a display module of the wearable device.
From the above technical solutions, the embodiment of the present application has the following advantages:
the embodiment of the application provides a virtual tour system and a virtual tour method, wherein wearable equipment in the virtual tour system can control and collect an environment video image in the range of viewing direction information according to the change of the viewing direction information of a user, and display a local image of a carrier extracted from the environment video image, a synthesized video image of a body local image of the user and a scenic spot video image of a place to be browsed, so that the user can be on the scene, and meanwhile, the carrier can also give the psychological hint that the user can fly, and can be more attached to various scenes of the virtual scenic spot without violating and feeling, thereby improving the realism and immersion of tour.
Drawings
Other features, objects and advantages of the present application will become more apparent upon reading of the detailed description of non-limiting embodiments, made with reference to the accompanying drawings in which:
fig. 1 is a schematic structural diagram of a virtual tour system according to an embodiment of the present application;
FIG. 2 is a schematic illustration of a flying blanket according to an embodiment of the present application;
fig. 3 is a schematic diagram of a wearable device according to an embodiment of the present application;
FIG. 4 is a schematic diagram of another virtual tour system according to an embodiment of the present application;
FIG. 5 is a schematic diagram of a virtual tour system according to an embodiment of the present application, in which the data processor is disposed in the wearable device;
FIG. 6 is a schematic diagram of a virtual tour system according to an embodiment of the present application, in which the data processor is disposed at a server;
FIG. 7 is a schematic diagram of a virtual tour system according to another embodiment of the present application, in which the wearable device further includes a play control module;
FIG. 8 is a schematic diagram of a virtual tour system according to another embodiment of the present application, further including an unmanned aerial vehicle;
fig. 9 is a schematic flow chart of a virtual tour method according to an embodiment of the present application;
FIG. 10 is a flowchart of another virtual tour method according to an embodiment of the present application;
fig. 11 is a schematic structural diagram of a server according to an embodiment of the present application.
Detailed Description
In order to make the present application better understood by those skilled in the art, the following description will clearly and completely describe the technical solutions in the embodiments of the present application with reference to the accompanying drawings, and it is apparent that the described embodiments are only some embodiments of the present application, not all embodiments. All other embodiments, which can be made by those skilled in the art based on the embodiments of the application without making any inventive effort, are intended to be within the scope of the application.
The terms "first," "second," "third," "fourth" and the like in the description and in the claims and in the above drawings, if any, are used for distinguishing between similar objects and not necessarily for describing a particular sequential or chronological order. It is to be understood that the data so used may be interchanged where appropriate such that the described embodiments of the application may be implemented in other sequences than those illustrated or otherwise described herein.
Furthermore, the terms "comprises," "comprising," and "having," and any variations thereof, are intended to cover a non-exclusive inclusion, such that a process, method, system, article, or apparatus that comprises a list of steps or modules is not necessarily limited to those steps or modules that are expressly listed or inherent to such process, method, article, or apparatus.
For ease of understanding and description, the virtual tour system and method according to embodiments of the present application are described in detail below with reference to fig. 1 through 11.
Fig. 1 is a schematic structural diagram of a virtual tour system according to an embodiment of the present application. The virtual tour system 10 comprises a vehicle 11 for carrying a user and a wearable device 12 worn on the head of the user, the wearable device 12 comprising a detection module 121, an image acquisition module 122 and a display module 123.
It should be noted that, in the embodiment of the present application, the carrier 11 may include, but is not limited to, a flying blanket, a flying saucer, an airship, and the like, and is preferably a small carrier that is easily imagined as being controlled by human thought, such as a flying blanket or a flying saucer. Optionally, in some embodiments of the present application, the color of the carrier 11 contrasts sharply with the surrounding environment, which facilitates recognition and image extraction and improves processing efficiency. Illustratively, the flying blanket shown in fig. 2 is provided with a seat, where the seat may be formed by bending part of the flying blanket's structure. In actual use, the user sits on the seat for the virtual tour while video images of the flying blanket advancing and ascending are played, giving the user the psychological suggestion of flight and improving immersion. The wearable device 12 may include, but is not limited to, a helmet, VR (Virtual Reality) glasses, AR (Augmented Reality) glasses, and the like. For example, as shown in fig. 3, the wearable device 12 includes a display screen 1 for displaying the scenic spot video image of the place to be visited, earphones 2 for playing sound effects, and cameras 3 for capturing the environment video image, the cameras 3 being provided at positions corresponding to the two eyes on the wearable device 12.
Specifically, the detection module 121 is configured to detect the user's viewing direction information in real time, the viewing direction information being the change of the user's current viewing direction relative to a reference viewing direction. It should be noted that, in the embodiment of the present application, the reference viewing direction is the viewing direction when the head is in a reference state; for example, the reference state may include, but is not limited to, a head rotation angle reference and a head pitch angle reference. Further, the head rotation angle reference is the straight-ahead direction when the head is not rotated left or right, for example the vertical plane, perpendicular to the horizontal plane, in which the line of sight lies when the user looks straight ahead; the head pitch angle reference is the horizontal plane in which the line of sight lies when the head is not pitched, for example the horizontal plane when the user looks straight ahead. Optionally, as shown in fig. 4, the detection module 121 in some embodiments of the present application may include a rotation angle sensing sub-module 1211 for sensing changes in the rotation angle of the head and a pitch angle sensing sub-module 1212 for sensing changes in the pitch angle of the head.
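As a minimal illustration only, the following Python sketch shows one way the viewing direction information could be computed from the two sensing sub-modules; the class and field names are hypothetical and do not appear in this disclosure.

    from dataclasses import dataclass

    @dataclass
    class ViewingDirection:
        """Change of the current viewing direction relative to the reference state."""
        rotation_deg: float  # left/right head rotation relative to the rotation angle reference
        pitch_deg: float     # up/down head pitch relative to the pitch angle reference

    class DetectionModule:
        def __init__(self, rotation_ref_deg: float = 0.0, pitch_ref_deg: float = 0.0):
            # Reference state: the user looks straight ahead, head neither rotated nor pitched.
            self.rotation_ref_deg = rotation_ref_deg
            self.pitch_ref_deg = pitch_ref_deg

        def detect(self, raw_rotation_deg: float, raw_pitch_deg: float) -> ViewingDirection:
            # Viewing direction information is the change relative to the reference directions.
            return ViewingDirection(
                rotation_deg=raw_rotation_deg - self.rotation_ref_deg,
                pitch_deg=raw_pitch_deg - self.pitch_ref_deg,
            )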
The image acquisition module 122 is configured to control, according to the viewing direction information, the acquisition of an environment video image within the range indicated by the viewing direction information. Optionally, in some embodiments of the present application, the image acquisition module 122 is further configured to acquire the environment video image when the viewing direction information reaches a trigger angle, and not to operate when the viewing direction information does not reach the trigger angle. It should be noted that, in the embodiment of the present application, the image acquisition module 122 remains powered on even while it is not transmitting the environment video image; "not operating" means that it does not send the environment video image out. The environment video image captures the real environment around the user while the virtual tour system 10 is in use; for example, the real environment may include a partial image of the carrier 11 and a partial image of the user's body. Further, the virtual tour system 10 further includes a data processor 124, the data processor 124 being configured to subtract a predetermined angle from the viewing angle at which the image acquisition module 122 last began acquiring a partial image of the carrier 11 or of the user's body, and to use the result as the start trigger angle for the next activation of the image acquisition module 122. It should be noted that the start trigger angle thus varies with the viewing angle at which the partial image of the carrier 11 or of the user's body was last acquired, rather than being completely fixed, so that the device adapts to the heights, body shapes and sitting postures of different users. Subtracting the predetermined angle when determining the start trigger angle ensures that the partial image of the carrier 11 or of the user's body is acquired in time when the user adjusts posture or moves, avoiding the discomfort an inconsistent composite video image would cause. The predetermined angle may be 3 degrees, 5 degrees, or the like; the specific value can be tuned from accumulated user data, with a larger value chosen early on and adjusted later as data accumulates. Alternatively, as shown in fig. 5, the data processor 124 may be disposed within the wearable device 12; or, as shown in fig. 6, the data processor 124 may be disposed at a server, with a communication connection established between the server and the wearable device 12. The communication connection may include, but is not limited to, a wired interface connection or a wireless connection such as Wi-Fi, wireless broadband, WiMAX, ultra-wideband (UWB), Bluetooth, and so on.
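A minimal Python sketch of the adaptive start-trigger-angle logic described above, assuming the downward pitch angle is measured in degrees; the class name, the method names and the 5-degree default margin are illustrative assumptions, not part of the disclosure.

    class TriggerController:
        """Decides when the image acquisition module should transmit the environment video image."""

        def __init__(self, initial_trigger_deg: float, margin_deg: float = 5.0):
            # margin_deg is the predetermined angle (e.g. 3 or 5 degrees) subtracted so that
            # the carrier/body partial image is captured in time when the user shifts posture.
            self.trigger_deg = initial_trigger_deg
            self.margin_deg = margin_deg

        def should_acquire(self, pitch_down_deg: float) -> bool:
            # Transmit the environment video image only once the viewing direction
            # reaches the current start trigger angle.
            return pitch_down_deg >= self.trigger_deg

        def on_partial_image_acquired(self, pitch_down_deg: float) -> None:
            # Next start trigger angle = viewing angle at which the carrier or body partial
            # image last began to be acquired, minus the predetermined angle.
            self.trigger_deg = pitch_down_deg - self.margin_deg

Because the trigger angle is re-derived from the last observed acquisition angle rather than fixed, the same logic adapts to users of different heights, body shapes and sitting postures.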
The display module 123 is configured to acquire and display a composite video image, the composite video image being obtained by compositing the partial image of the carrier 11 and the partial image of the user's body, both extracted from the environment video image, with the scenic spot video image of the place to be visited. Optionally, in some embodiments of the present application the compositing is performed by the data processor 124: the data processor 124 first acquires the viewing direction information and the environment video image, then extracts the partial image of the carrier 11 and the partial image of the user's body from the environment video image, and renders and composites the scenic spot video image of the place to be visited with the partial image of the carrier 11 and the partial image of the user's body to obtain the composite video image, for example by performing a partial superposition rendering of the partial images over the scenic spot video image according to the occlusion relationship.
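As a sketch only, the occlusion-based superposition described above might look like the following NumPy routine; the function name and the assumption that the foreground mask comes from thresholding on the carrier's high-contrast colour are illustrative, not taken from the disclosure.

    import numpy as np

    def composite_frame(scene_frame: np.ndarray,
                        env_frame: np.ndarray,
                        foreground_mask: np.ndarray) -> np.ndarray:
        """Superimpose the carrier/body partial image onto a scenic spot frame.

        scene_frame:     H x W x 3 scenic spot video frame (background).
        env_frame:       H x W x 3 environment video frame containing carrier and body.
        foreground_mask: H x W boolean array marking carrier/body pixels, e.g. obtained
                         by thresholding on the carrier's contrasting colour.
        """
        out = scene_frame.copy()
        # Per the occlusion relationship, carrier and body pixels occlude the scenery.
        out[foreground_mask] = env_frame[foreground_mask]
        return out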
Optionally, as shown in fig. 7, the wearable device 12 in some embodiments of the present application further includes a play control module 125, the play control module 125 being configured to determine the play speed of the composite video image, i.e. the virtual tour speed, according to the body tilt angle of the user. For example, the embodiment of the present application collects the forward body tilt angle through an acceleration sensor module provided on the wearable device 12.
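One plausible mapping from forward tilt to play speed is a clamped linear ramp, sketched below in Python; the dead zone, maximum tilt and maximum speed values are invented for illustration and do not appear in the disclosure.

    def play_speed(tilt_forward_deg: float,
                   dead_zone_deg: float = 5.0,
                   max_tilt_deg: float = 30.0,
                   max_speed: float = 3.0) -> float:
        """Map the accelerometer-measured forward body tilt to a playback speed factor."""
        if tilt_forward_deg <= dead_zone_deg:
            return 1.0  # near-upright posture: play at normal speed
        tilt = min(tilt_forward_deg, max_tilt_deg)
        frac = (tilt - dead_zone_deg) / (max_tilt_deg - dead_zone_deg)
        return 1.0 + frac * (max_speed - 1.0)  # lean further forward to tour faster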
Optionally, as shown in fig. 8, in some embodiments of the present application the virtual tour system 10 further includes an unmanned aerial vehicle 13 on which a panoramic camera 131 is disposed, the panoramic camera 131 being used to collect the scenic spot video image of the place to be visited. The benefit of this arrangement is that footage captured along stepped paths has no up-and-down bobbing, which better matches the real feel of a flying virtual tour. In addition, the unmanned aerial vehicle 13 is further provided with a ranging sensor for measuring ground clearance, and the panoramic camera 131 may be kept at a height of 2 to 2.5 meters above the ground. In actual use, the wearable device 12 can receive the real-time scenic spot video image transmitted by the panoramic camera 131, or can store the scenic spot video image transmitted by the panoramic camera 131 locally and call it up directly once the user selects a virtual sightseeing destination, improving processing efficiency.
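To keep the panoramic camera within the 2-to-2.5-meter band mentioned above, the drone could close a simple feedback loop on the ranging sensor's ground-clearance reading. The proportional controller below is a sketch under that assumption; the gain and rate limit are made-up values.

    def climb_rate_command(ground_clearance_m: float,
                           target_m: float = 2.25,   # midpoint of the 2 to 2.5 m band
                           gain: float = 0.8,
                           max_rate_m_s: float = 0.5) -> float:
        """Proportional altitude hold: positive output climbs, negative descends."""
        error = target_m - ground_clearance_m  # below the target height -> climb
        rate = gain * error
        # Clamp the correction so the camera does not visibly bob up and down.
        return max(-max_rate_m_s, min(max_rate_m_s, rate))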
Optionally, in some embodiments of the present application, the virtual tour system 10 further includes a drone whose appearance is similar to that of the carrier 11, together with a cartoon device disposed on the drone, an image acquisition device being provided at the eye position of the cartoon device. It should be noted that, in the embodiment of the present application, the cartoon device may include, but is not limited to, a doll with a cartoon or humanoid appearance; the cartoon device is configured to receive the user action information collected by the wearable device 12 and to perform corresponding actions according to that information, making interaction with other tourists possible and enhancing the realism and immersion of the tour.
The embodiment of the application provides a virtual tour system in which the wearable device can, according to changes in the user's viewing direction information, control the acquisition of an environment video image within the range indicated by that information, and display a composite video image combining the partial image of the carrier and the partial image of the user's body extracted from the environment video image with the scenic spot video image of the place to be visited. The user thus feels personally present at the scene; at the same time, the carrier gives the user the psychological suggestion of being able to fly, so that the user fits naturally into the various scenes of the virtual scenic spot without any sense of incongruity, improving the realism and immersion of the tour.
Based on the foregoing embodiments, embodiments of the present application provide a virtual tour method. The method is performed by means of a carrier 11 for carrying a user and a wearable device 12 worn on the user's head. Fig. 9 is a schematic flow chart of a virtual tour method according to an embodiment of the present application. The method is applied to the wearable device 12 and comprises the following steps:
s101, detecting user viewing direction information in real time by a detection module, wherein the viewing direction information is a change result of the current viewing direction of the user relative to a reference viewing direction.
It should be noted that, in the embodiment of the present application, the reference viewing direction refers to the viewing direction when the head is in the reference state, for example, the reference state may include, but is not limited to, a head rotation angle reference and a head pitch angle reference.
S102, an image acquisition module acquires an environment video image in a range corresponding to the viewing direction information.
Optionally, in some embodiments of the present application, it is further determined whether the viewing direction information reaches the start trigger angle of the image acquisition module. When the viewing direction information reaches the trigger angle, the image acquisition module is controlled to acquire the environment video image; when it does not, the image acquisition module is controlled not to operate. It should be noted that, in the embodiment of the present application, the start trigger angle is obtained by subtracting the predetermined angle from the viewing angle at which the image acquisition module last began acquiring the partial image of the carrier or of the user's body, which improves processing efficiency.
S103, the data processor acquires the scenic spot video image of the place to be visited and the environment video image, extracts the partial image of the carrier and the partial image of the user's body from the environment video image, and composites the partial image of the carrier, the partial image of the user's body and the scenic spot video image of the place to be visited to obtain a composite video image.
Illustratively, the embodiment of the application performs a partial superposition rendering of the partial image of the carrier and the partial image of the user's body over the scenic spot video image according to the occlusion relationship.
S104, the display module acquires and displays the composite video image.
It should be noted that, in this embodiment, the descriptions of the same steps and the same content as those in other embodiments may refer to the descriptions in other embodiments, and are not repeated here.
The embodiment of the application provides a virtual tour method in which the wearable device can, according to changes in the user's viewing direction information, control the acquisition of an environment video image within the range indicated by that information, and display a composite video image combining the partial image of the carrier and the partial image of the user's body extracted from the environment video image with the scenic spot video image of the place to be visited. The user thus feels personally present at the scene; at the same time, the carrier gives the user the psychological suggestion of being able to fly, so that the user fits naturally into the various scenes of the virtual scenic spot without any sense of incongruity, improving the realism and immersion of the tour.
Based on the foregoing embodiments, the embodiments of the present application provide another virtual tour method. Fig. 10 is a schematic flow chart of another virtual tour method according to an embodiment of the present application. The method is applied to the data processor 124 (for example, a server) and includes the following steps:
s201, acquiring an environment video image and a scenic spot video image of a place to be visited, wherein the environment video image and the scenic spot video image are sent by the wearable equipment.
It should be noted that, in the embodiment of the present application, the environment video image captures the real environment around the user while the virtual tour system is in use; for example, the real environment may include a partial image of the carrier and a partial image of the user's body.
S202, extracting a partial image of the carrier and a partial image of the user's body from the environment video image.
S203, compositing the partial image of the carrier, the partial image of the user's body and the scenic spot video image of the place to be visited to obtain a composite video image.
Illustratively, the embodiment of the application performs a partial superposition rendering of the partial image of the carrier and the partial image of the user's body over the scenic spot video image according to the occlusion relationship.
S204, sending the composite video image to the wearable device for display by a display module of the wearable device.
It should be noted that, in this embodiment, the descriptions of the same steps and the same content as those in other embodiments may refer to the descriptions in other embodiments, and are not repeated here.
The embodiment of the application provides a virtual tour method in which the wearable device can, according to changes in the user's viewing direction information, control the acquisition of an environment video image within the range indicated by that information, and display a composite video image combining the partial image of the carrier and the partial image of the user's body extracted from the environment video image with the scenic spot video image of the place to be visited. The user thus feels personally present at the scene; at the same time, the carrier gives the user the psychological suggestion of being able to fly, so that the user fits naturally into the various scenes of the virtual scenic spot without any sense of incongruity, improving the realism and immersion of the tour.
Based on the foregoing embodiments, embodiments of the present application provide a server. Referring to fig. 11, the server 1000 may vary considerably in configuration and performance, and may include one or more central processing units (CPU) 1001 (e.g., one or more processors), a memory 1002, and one or more storage media 1005 (e.g., one or more mass storage devices) storing applications 1003 or data 1004. The memory 1002 and the storage medium 1005 may be transitory or persistent. The program stored on the storage medium 1005 may include one or more modules (not shown), each of which may include a series of instructions operating on the server. Still further, the central processing unit 1001 may be configured to communicate with the storage medium 1005 and to execute, on the server 1000, the series of instruction operations stored in the storage medium 1005.
The server 1000 may also include one or more power supplies 1006, one or more wired or wireless network interfaces 1007, one or more input/output interfaces 1008, and/or one or more operating systems 1009, such as Windows Server™, Mac OS X™, Unix™, Linux™, FreeBSD™, and the like.
In particular, the processes described above with reference to the flowcharts of figs. 9 and 10 may be implemented as computer software programs according to embodiments of the present application. For example, corresponding to the embodiment of fig. 10, the present application includes a computer program product comprising a computer program carried on a computer-readable medium, the computer program being executed by the CPU 1001 to implement the following steps:
acquiring an environment video image and a scenic spot video image of the place to be visited, wherein the environment video image and the scenic spot video image are sent by the wearable device;
extracting a partial image of the carrier and a partial image of the user's body from the environment video image;
compositing the partial image of the carrier, the partial image of the user's body and the scenic spot video image of the place to be visited to obtain a composite video image;
and sending the composite video image to the wearable device for display by a display module of the wearable device.
As another aspect, embodiments of the present application provide a computer readable storage medium storing program code for performing any one of the foregoing virtual tour methods according to the respective embodiments.
As a further aspect, embodiments of the present application provide a computer program product comprising instructions which, when run on a computer, cause the computer to perform any one of the implementations of the virtual tour method of the various embodiments described above.
It will be clearly understood by those skilled in the art that, for convenience and brevity of description, the specific working processes of the systems, apparatuses and modules described above may refer to the corresponding processes in the foregoing method embodiments, which are not repeated herein.
In the several embodiments provided in the present application, it should be understood that the disclosed systems, devices, and methods may be implemented in other manners. For example, the apparatus embodiments described above are merely illustrative, and for example, the division of the modules is merely a logical function division, and there may be additional divisions when actually implemented, for example, multiple modules or components may be combined or integrated into another system, or some features may be omitted or not performed. Alternatively, the coupling or direct coupling or communication connection shown or discussed with each other may be an indirect coupling or communication connection via some interfaces, devices or modules, which may be in electrical, mechanical, or other forms. The modules illustrated as separate components may or may not be physically separate, and components shown as modules may or may not be physical units, may be located in one place, or may be distributed over multiple network units. Some or all of the modules may be selected according to actual needs to achieve the purpose of the solution of this embodiment.
In addition, each functional module in the embodiments of the present application may be integrated in one processing unit, or each module may exist alone physically, or two or more units may be integrated in one module. The integrated units may be implemented in hardware or in software functional units. And the integrated units, if implemented in the form of software functional units and sold or used as stand-alone products, may be stored in a computer-readable storage medium.
Based on such understanding, the technical solution of the present application may be embodied essentially or partly in the form of a software product, or all or part of the technical solution, which is stored in a storage medium, and includes several instructions for causing a computer device (which may be a personal computer, a server, or a network device, etc.) to perform all or part of the steps of the virtual tour method according to the embodiments of the present application. And the aforementioned storage medium includes: a U-disk, a removable hard disk, a Read-Only Memory (ROM), a random access Memory (Random Access Memory, RAM), a magnetic disk, or an optical disk, or other various media capable of storing program codes.
It should be noted that the above embodiments are only for illustrating the technical solution of the present application, and are not limiting; although the application has been described in detail with reference to the foregoing embodiments, it will be understood by those of ordinary skill in the art that: the technical scheme described in the foregoing embodiments can be modified or some technical features thereof can be replaced by equivalents; such modifications and substitutions do not depart from the spirit and scope of the technical solutions of the embodiments of the present application.

Claims (11)

1. A virtual tour system, the system comprising: a carrier for carrying a user and a wearable device worn on the user's head, wherein the carrier comprises a flying blanket or a flying saucer, and the wearable device comprises a detection module, an image acquisition module, a display module and a play control module;
the detection module is used for detecting the user's viewing direction information in real time, the viewing direction information being the change of the user's current viewing direction relative to a reference viewing direction;
the image acquisition module is used for controlling, according to the viewing direction information, the acquisition of an environment video image within the range indicated by the viewing direction information;
the display module is used for acquiring and displaying a composite video image, the composite video image being obtained by compositing the partial image of the carrier and the partial image of the user's body, both extracted from the environment video image, with a scenic spot video image of the place to be visited;
and the play control module is used for determining the play speed of the composite video image according to the body tilt angle of the user.
2. The virtual tour system of claim 1, wherein the image acquisition module is further configured to acquire the environment video image when the viewing direction information reaches a trigger angle, and not to operate when the viewing direction information does not reach the trigger angle.
3. The virtual tour system according to claim 2, wherein the system further comprises:
a data processor, configured to subtract a predetermined angle from the viewing angle at which the image acquisition module last began acquiring the partial image of the carrier or the partial image of the user's body, and to use the result as the start trigger angle for the next activation of the image acquisition module.
4. The virtual tour system according to claim 1, wherein the system further comprises:
a data processor, configured to acquire the viewing direction information and the environment video image, extract the partial image of the carrier and the partial image of the user's body from the environment video image, and render and composite the scenic spot video image with the partial image of the carrier and the partial image of the user's body to obtain the composite video image.
5. The virtual tour system according to claim 1, wherein the detection module includes a rotation angle sensing sub-module for sensing a rotation angle change of the head and a pitch angle sensing sub-module for sensing a pitch angle change of the head.
6. The virtual tour system according to claim 1, wherein the system further comprises:
an unmanned aerial vehicle provided with a panoramic camera, the panoramic camera being used to collect the scenic spot video image of the place to be visited.
7. The virtual tour system according to claim 1, wherein the system further comprises:
an unmanned aerial vehicle whose appearance is similar to that of the carrier, and a cartoon device disposed on the unmanned aerial vehicle, an image acquisition device being provided at the eye position of the cartoon device;
the cartoon device is used for receiving user action information collected by the wearable device and performing corresponding actions according to the action information.
8. A virtual tour method performed by a wearable device worn on a user's head and a carrier for carrying the user, the carrier comprising a flying blanket or a flying saucer, the method comprising:
the detection module detects the user's viewing direction information in real time, the viewing direction information being the change of the user's current viewing direction relative to a reference viewing direction;
the image acquisition module acquires an environment video image in the range corresponding to the viewing direction information;
the data processor acquires a scenic spot video image of the place to be visited and the environment video image, extracts the partial image of the carrier and the partial image of the user's body from the environment video image, and composites the partial image of the carrier, the partial image of the user's body and the scenic spot video image of the place to be visited to obtain a composite video image;
and the display module acquires and displays the composite video image, the play speed of the composite video image being determined according to the body tilt angle of the user.
9. The virtual tour method according to claim 8, characterized in that the method further comprises:
determining whether the viewing direction information reaches the start trigger angle of the image acquisition module;
when the viewing direction information reaches the trigger angle, controlling the image acquisition module to acquire the environment video image;
and when the viewing direction information does not reach the trigger angle, controlling the image acquisition module not to operate.
10. The virtual tour method according to claim 9, wherein the start trigger angle is the viewing angle at which the image acquisition module last began acquiring a partial image of the carrier or a partial image of the user's body, minus a predetermined angle.
11. A virtual tour method, the method comprising:
acquiring an environment video image and a scenic spot video image of the place to be visited, wherein the environment video image and the scenic spot video image are sent by a wearable device;
extracting a partial image of a carrier and a partial image of a user's body from the environment video image, wherein the carrier comprises a flying blanket or a flying saucer;
compositing the partial image of the carrier, the partial image of the user's body and the scenic spot video image of the place to be visited to obtain a composite video image;
and sending the composite video image to the wearable device for display by a display module of the wearable device, the play speed of the composite video image being determined according to the body tilt angle of the user.
CN202011553781.9A 2020-12-24 2020-12-24 Virtual tour system and method Active CN112738498B (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN202011553781.9A CN112738498B (en) 2020-12-24 2020-12-24 Virtual tour system and method

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
CN202011553781.9A CN112738498B (en) 2020-12-24 2020-12-24 Virtual tour system and method

Publications (2)

Publication Number Publication Date
CN112738498A CN112738498A (en) 2021-04-30
CN112738498B (en) 2023-12-08

Family

ID=75615481

Family Applications (1)

Application Number Title Priority Date Filing Date
CN202011553781.9A Active CN112738498B (en) 2020-12-24 2020-12-24 Virtual tour system and method

Country Status (1)

Country Link
CN (1) CN112738498B (en)

Families Citing this family (1)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN113534835B (en) * 2021-07-01 2022-05-31 湘南学院 Tourism virtual remote experience system and method

Citations (5)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN107223271A (en) * 2016-12-28 2017-09-29 深圳前海达闼云端智能科技有限公司 A kind of data display processing method and device
CN107895330A (en) * 2017-11-28 2018-04-10 特斯联(北京)科技有限公司 A kind of visitor's service platform that scenario building is realized towards smart travel
CN110412765A (en) * 2019-07-11 2019-11-05 Oppo广东移动通信有限公司 Augmented reality image capturing method, device, storage medium and augmented reality equipment
CN110673734A (en) * 2019-09-30 2020-01-10 京东方科技集团股份有限公司 Virtual tourism method, client, server, system and image acquisition equipment
CN110930517A (en) * 2019-08-19 2020-03-27 泉州师范学院 Panoramic video interaction system and method

Family Cites Families (1)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN105955456B (en) * 2016-04-15 2018-09-04 深圳超多维科技有限公司 The method, apparatus and intelligent wearable device that virtual reality is merged with augmented reality


Also Published As

Publication number Publication date
CN112738498A (en) 2021-04-30

Similar Documents

Publication Publication Date Title
JP6276882B1 (en) Information processing method, apparatus, and program for causing computer to execute information processing method
US10409365B2 (en) Method of providing a virtual space image subjected to blurring processing based on detected displacement, and system therefor
CN106170083B (en) Image processing for head mounted display device
AU2016262576B2 (en) Privacy-sensitive consumer cameras coupled to augmented reality systems
CN106659932B (en) Sensory stimulus management in head mounted displays
JP2020535878A (en) Extension of virtual reality video game with friend avatar
US20180330536A1 (en) Method of providing virtual space, program for executing the method on computer, and information processing apparatus for executing the program
EP3383036A2 (en) Information processing device, information processing method, and program
US20190005732A1 (en) Program for providing virtual space with head mount display, and method and information processing apparatus for executing the program
US20190005731A1 (en) Program executed on computer for providing virtual space, information processing apparatus, and method of providing virtual space
CN108416832B (en) Media information display method, device and storage medium
JP6470859B1 (en) Program for reflecting user movement on avatar, information processing apparatus for executing the program, and method for distributing video including avatar
CN114797085A (en) Game control method and device, game terminal and storage medium
JP7085578B2 (en) Information processing device, user guide presentation method, and head-mounted display
CN112738498B (en) Virtual tour system and method
CN111638798A (en) AR group photo method, AR group photo device, computer equipment and storage medium
CN110651304B (en) Information processing device, information processing method, and program
EP3346375B1 (en) Program, recording medium, content provision device, and control method
JP6775669B2 (en) Information processing device
US20220230400A1 (en) Image processing apparatus, image distribution system, and image processing method
JP6566209B2 (en) Program and eyewear
JP5864789B1 (en) Railway model viewing device, method, program, dedicated display monitor, scene image data for composition
US10668379B2 (en) Computer-readable recording medium, computer apparatus, image display method
JP7044846B2 (en) Information processing equipment
US20240078767A1 (en) Information processing apparatus and information processing method

Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination
GR01 Patent grant