WO2018103233A1 - Viewing method, apparatus and system based on virtual reality technology - Google Patents

Viewing method, apparatus and system based on virtual reality technology

Info

Publication number
WO2018103233A1
WO2018103233A1 PCT/CN2017/077987 CN2017077987W
Authority
WO
WIPO (PCT)
Prior art keywords
video image
current
camera
viewing angle
user
Prior art date
Application number
PCT/CN2017/077987
Other languages
English (en)
French (fr)
Inventor
杨帆
孙华柱
崔溪远
Original Assignee
深圳创维-Rgb电子有限公司
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by 深圳创维-Rgb电子有限公司
Priority to AU2017370476A priority Critical patent/AU2017370476B2/en
Priority to US16/323,676 priority patent/US20190208174A1/en
Publication of WO2018103233A1 publication Critical patent/WO2018103233A1/zh


Classifications

    • HELECTRICITY
    • H04ELECTRIC COMMUNICATION TECHNIQUE
    • H04NPICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N13/00Stereoscopic video systems; Multi-view video systems; Details thereof
    • H04N13/10Processing, recording or transmission of stereoscopic or multi-view image signals
    • H04N13/106Processing image signals
    • H04N13/111Transformation of image signals corresponding to virtual viewpoints, e.g. spatial image interpolation
    • H04N13/117Transformation of image signals corresponding to virtual viewpoints, e.g. spatial image interpolation the virtual viewpoint locations being selected by the viewers or determined by viewer tracking
    • HELECTRICITY
    • H04ELECTRIC COMMUNICATION TECHNIQUE
    • H04NPICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N13/00Stereoscopic video systems; Multi-view video systems; Details thereof
    • H04N13/30Image reproducers
    • H04N13/366Image reproducers using viewer tracking
    • H04N13/371Image reproducers using viewer tracking for tracking viewers with different interocular distances; for tracking rotational head movements around the vertical axis
    • HELECTRICITY
    • H04ELECTRIC COMMUNICATION TECHNIQUE
    • H04NPICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N13/00Stereoscopic video systems; Multi-view video systems; Details thereof
    • H04N13/30Image reproducers
    • H04N13/366Image reproducers using viewer tracking
    • H04N13/378Image reproducers using viewer tracking for tracking rotational head movements around an axis perpendicular to the screen
    • HELECTRICITY
    • H04ELECTRIC COMMUNICATION TECHNIQUE
    • H04NPICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N23/00Cameras or camera modules comprising electronic image sensors; Control thereof
    • H04N23/60Control of cameras or camera modules
    • H04N23/698Control of cameras or camera modules for achieving an enlarged field of view, e.g. panoramic image capture
    • HELECTRICITY
    • H04ELECTRIC COMMUNICATION TECHNIQUE
    • H04NPICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N5/00Details of television systems
    • H04N5/222Studio circuitry; Studio devices; Studio equipment
    • H04N5/2224Studio circuitry; Studio devices; Studio equipment related to virtual studio applications
    • HELECTRICITY
    • H04ELECTRIC COMMUNICATION TECHNIQUE
    • H04NPICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N7/00Television systems
    • H04N7/18Closed-circuit television [CCTV] systems, i.e. systems in which the video signal is not broadcast
    • H04N7/181Closed-circuit television [CCTV] systems, i.e. systems in which the video signal is not broadcast for receiving images from a plurality of remote sources

Definitions

  • the present disclosure relates to the field of virtual reality technologies, for example, to a viewing method, apparatus, and system based on virtual reality technology.
  • VR (Virtual Reality)
  • the present disclosure provides a viewing method, device and system based on virtual reality technology, which can realize that a user can watch a lot of places in the world in real time without leaving the house, and has the effect of viewing the scene in front of the window.
  • the embodiment provides a viewing method based on virtual reality technology, including: acquiring current location information of the user relative to the display device, and calculating a current viewing angle of the user according to the current location information, where the current viewing angle includes a horizontal viewing angle and a vertical viewing angle; acquiring a video image captured in real time by the camera device at the user-specified location; processing the video image according to the current location information and the current viewing angle; and transmitting the processed video image to a display device for display.
  • the camera device is a 360° panoramic camera.
  • processing the video image according to the current location information and the current viewing angle includes: if the current viewing angle is smaller than the camera's field of view, cropping the video image according to the current location information and the current viewing angle; and scaling the cropped video image according to the resolution of the display device.
  • the method further includes: if the video image corresponding to the current viewing angle is beyond the range of the video image captured by the camera device, adjusting the acquisition angle of the camera device.
  • acquiring the video image captured in real time by the camera device at the user-specified location may include: acquiring a video image captured in real time after the camera device adjusts its acquisition angle.
  • adjusting the acquisition angle of the camera device includes: calculating the amount of change between the current viewing angle and the viewing angle at the previous moment; determining, according to the amount of change, whether the video image corresponding to the current viewing angle exceeds the range of the video image captured by the camera device; and if it does, adjusting the acquisition angle of the camera device.
  • the present embodiment provides a viewing device based on virtual reality technology, including: a current viewing angle acquisition unit, configured to acquire current location information of the user relative to the display device and calculate the user's current viewing angle based on the current location information, where the current viewing angle includes a horizontal viewing angle and a vertical viewing angle;
  • a video image acquisition unit, configured to acquire a video image captured in real time by the camera device at the user-specified location;
  • a video image processing unit, configured to process the video image according to the current location information and the current viewing angle; and
  • the video image transmitting unit is configured to send the processed video image to the display device for display.
  • the device may further include a camera device acquisition angle adjusting unit, configured to adjust the acquisition angle of the camera device if the video image corresponding to the current viewing angle exceeds the range of video images captured by the camera device; in this case, the video image acquisition unit is configured to acquire a video image captured in real time after the camera device adjusts its acquisition angle.
  • the video image processing unit is further configured to: if the current viewing angle is smaller than the camera's field of view, crop the video image according to the current viewing angle, and scale the cropped video image according to the resolution of the display device.
  • the embodiment further provides a management server, wherein the management server is configured with the virtual reality technology-based viewing device according to any one of the above.
  • the current viewing angle acquisition unit is configured to: acquire the current location information of the user relative to the display device sent by the virtual reality device, and calculate the user's current viewing angle according to the current location information, where the current location information is collected by a local camera device disposed on the display device and sent to the virtual reality device. The video image acquisition unit is configured to: receive a video image captured in real time by a remote camera device at the user-specified location, where the remote camera devices are disposed in a plurality of scenic spots. The video image transmitting unit is configured to: send the processed video image to the display device, via the virtual reality device, for display.
  • the present embodiment further provides a virtual reality device configured with the virtual reality technology-based viewing device according to any one of the above. The current viewing angle acquisition unit is configured to: acquire the user's current position information relative to the display device, sent by a local camera device disposed on the display device, and calculate the user's current viewing angle according to the current position information. The video image acquisition unit is configured to: receive, through the management server, a video image captured in real time by a remote camera device at the user-specified location, where the remote camera devices are disposed in a plurality of scenic spots. The camera device angle adjustment unit is configured to: send an acquisition angle adjustment instruction for the remote camera device at the user-specified location to the management server, instructing the management server to adjust the acquisition angle of the remote camera device.
  • the embodiment further provides a viewing system based on virtual reality technology, including a remote camera device, a virtual reality device, a display device, and a local camera device, and the foregoing management server.
  • the embodiment further provides a viewing system based on virtual reality technology, including a management server, a remote camera device, a display device, and a local camera device, and the above virtual reality device.
  • the embodiment further provides a computer readable storage medium storing computer executable instructions for executing any of the above-described real-time viewing methods based on virtual reality technology.
  • the embodiment provides a management server including one or more processors, a memory, and one or more programs, the one or more programs being stored in the memory and, when executed by the one or more processors, performing any of the above viewing methods based on virtual reality technology.
  • the embodiment further provides a virtual reality device including one or more processors, a memory, and one or more programs, the one or more programs being stored in the memory and, when executed by the one or more processors, performing any of the above viewing methods based on virtual reality technology.
  • by acquiring the user's current location information relative to the display device, calculating the user's current viewing angle according to that information, processing the video image captured in real time by the camera device at the user-specified location according to the current location information and the current viewing angle, and sending the processed video image to the display device for display, the scenic-spot video image collected in real time is processed according to the user's location information: the video image displayed by the display device changes as the user's position changes, just as if the user were standing in front of a window viewing the scenery.
  • the user can enjoy the scenery of many places in the world without leaving the house.
  • the user experience is good and the effect is realistic.
  • FIG. 1 is a flowchart of a viewing method based on virtual reality technology provided by this embodiment.
  • 2a is a schematic plan view of a lateral field of view of a user provided by this embodiment.
  • 2b is a schematic plan view of the longitudinal field of view of the user provided by the embodiment.
  • FIG. 3 is a schematic diagram of video image cropping provided by this embodiment.
  • FIG. 4 is a flowchart of a view method based on virtual reality technology provided by the embodiment.
  • FIG. 5 is a block diagram showing the structure of a viewing device based on virtual reality technology provided by this embodiment.
  • FIG. 6 is a block diagram showing the structure of a viewing device based on virtual reality technology provided by this embodiment.
  • FIG. 7 is a structural block diagram of a viewing system based on virtual reality technology provided by this embodiment.
  • FIG. 8 is a structural block diagram of a viewing system based on virtual reality technology provided by this embodiment.
  • FIG. 9 is a schematic structural diagram of hardware of a management server or a virtual reality device according to this embodiment.
  • FIG. 1 is a flowchart of a viewing method based on virtual reality technology provided by this embodiment. The method can be executed by a virtual reality technology-based viewing device configured in the management server, or by one configured in the virtual reality device. Referring to FIG. 1, the method may include S110-S140.
  • the current relative position information of the user's face and the display device is acquired in real time, and the current viewing angle of the user is calculated according to the relative position information.
  • a camera is disposed at the upper end of the display device, and it is possible to detect in real time whether there is a user in front of the display device and a change in the position of the user.
  • the camera is used to track the change of the user's face position, and the current location information of the user is obtained in time.
  • the camera is mounted on the display device, and its shooting angle of view is a known parameter.
  • the average distance between a user's two eyes is also a known parameter. The image taken by the camera can be processed to obtain the planar position of the user's face relative to the display device, so the user's distance from the display screen can be estimated from the camera parameters and the distance between the user's two eyes in the image.
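  • as an illustrative sketch of the estimate described above (not code from the patent), the pinhole-camera relation "distance = focal length in pixels × real interocular distance / interocular distance in the image" can be written as follows; the function and parameter names are assumptions:

```python
def estimate_distance(focal_length_px, real_ipd_m, pixel_ipd):
    """Estimate the user's distance from the camera (in metres).

    focal_length_px: camera focal length expressed in pixels (a known parameter)
    real_ipd_m:      average real distance between the user's eyes (~0.063 m)
    pixel_ipd:       measured distance between the eyes in the image, in pixels
    """
    if pixel_ipd <= 0:
        raise ValueError("pixel interocular distance must be positive")
    # Pinhole model: image size / focal length = real size / distance.
    return focal_length_px * real_ipd_m / pixel_ipd

# e.g. a 1000 px focal length and eyes 63 px apart in the image
# give an estimated distance of 1000 * 0.063 / 63 = 1.0 m
```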
  • the user's position information may be represented by establishing a three-dimensional coordinate system; the origin may be chosen as the center of the display device or as its lower-left corner.
  • the position of the user's face in front of the display device can then be determined by its position parameters in this three-dimensional coordinate system.
  • infrared ranging and a camera face-tracking algorithm can also be combined to determine the user's location information.
  • the current viewing angle may include a horizontal viewing angle and a vertical viewing angle.
  • 2a is a schematic plan view of a lateral field of view of a user provided by this embodiment.
  • as shown in FIG. 2a, the horizontal viewing angle α is the angle subtended at the user's face position Q (for example, at the user's eyes) by the two points A and B at the left and right edges of the display device, in the same horizontal plane as the user's face. Q represents the user's face position, AB is the horizontal length of the display, and the angle α is the horizontal viewing angle.
  • FIG. 2b is a schematic plan view of the longitudinal field of view of the user provided by this embodiment. As shown in FIG. 2b, the vertical viewing angle β is the angle subtended at the user's face position Q (for example, at the user's eyes) by the two points C and D at the upper and lower edges of the display device, in the same vertical plane as the user's face. Q represents the user's face position, CD is the vertical width of the display, and the angle β is the vertical viewing angle. In the actual calculation, for both the horizontal and the vertical viewing angle, the midpoint between the two eyes can be selected as the apex of the angle, or the left or right eye can be used as the reference.
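  • the geometry of FIGs. 2a and 2b can be sketched as below; this is an illustrative computation, not code from the patent, and the coordinate convention (origin at the display centre, z the distance from the screen) and all names are assumptions:

```python
import math

def viewing_angles(x, y, z, width, height):
    """Horizontal angle AQB and vertical angle CQD (in degrees) subtended by
    a display of the given width/height at the user's face position Q.

    x, y: horizontal/vertical offset of Q from the display centre
    z:    distance of Q from the display plane (must be positive)
    """
    if z <= 0:
        raise ValueError("the user must be in front of the display (z > 0)")
    # A and B lie at the left/right display edges in the same horizontal plane
    # as Q, so only the horizontal offset x matters for the horizontal angle.
    alpha = math.atan2(width / 2 - x, z) + math.atan2(width / 2 + x, z)
    # C and D lie at the top/bottom edges in the same vertical plane as Q.
    beta = math.atan2(height / 2 - y, z) + math.atan2(height / 2 + y, z)
    return math.degrees(alpha), math.degrees(beta)

# a user centred in front of a 1.0 m wide screen at 0.5 m distance sees a
# 90° horizontal viewing angle; moving farther away shrinks the angle
```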
  • a video image captured in real time by an imaging device of a location designated by the user is acquired.
  • the user can select the attraction he or she wants to enjoy, and the video image of that attraction is retrieved in real time from the camera device matched with the attraction.
  • the camera device may be a camera, for example a 360-degree panoramic camera whose horizontal and vertical fields of view are both greater than 180°, so that the user's viewing angle falls within the camera's field of view no matter how the user moves.
  • a wide-angle camera can also be used.
  • the acquisition of the video image is in real time, and the acquisition interval may be as fine as milliseconds or even microseconds, so that the video image seen by the user in front of the display device matches the current scene of the scenic spot: the flow of visitors and the movement of the wind can be displayed on the display device in real time.
  • the video image is processed according to the current location information and a current viewing angle.
  • the video image is cropped according to the current location information and the current viewing angle, and the cropped video image is scaled according to the resolution of the display device.
  • for example, suppose the camera is a 360° panoramic camera whose horizontal and vertical fields of view are both 180°, the user's current horizontal viewing angle is 50°, and the vertical viewing angle is 30°. Since the video image captured by the camera device covers more than the image to be displayed on the display device, the video image needs to be cropped according to the current viewing angle and the user's current position information.
  • FIG. 3 is a schematic diagram of video image cropping provided by this embodiment.
  • when the user moves, the user's current viewing angle also changes; for example, compared with the viewing angle at the first position M, the viewing angle at the second position N has become smaller.
  • the video image cropped by the virtual reality device also needs to be adjusted according to the second position N of the user.
  • when the user is at the first position M, the video image displayed by the display device is the content of region A cropped from the video image captured by the remote camera device; when the user is at the second position N, it is the content of region B.
  • the video image captured by the remote camera device is cropped according to the user's current location information (the user is currently at the second position N), the cropped content is scaled and sent to the display device, and the display device displays the processed video image.
  • because the video image changes with the user's positional movement, the user sees it as if looking at the scenery in front of a window: the scenery seen from a window differs as the viewer's position changes.
  • the user's position information relative to the display device is represented in a three-dimensional coordinate system with the center point of the display device (such as the center point O of the display device shown in FIG. 3) as the origin; a two-dimensional coordinate system is established with the center point of the video image captured by the camera as the origin. When the user's position coordinates are (x, y, z), the coordinates of the center point of the cropped video image are (-x, -y).
  • the center point of the cropped video image also moves accordingly, and the appropriate video image is cropped according to the current viewing angle.
  • the resolution of the cropped video image does not necessarily match the resolution of the display device, so the cropped video image can also be scaled according to the display device's resolution, so that the processed video image displays well on the display device.
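  • a minimal sketch of this crop-and-scale step, assuming the coordinate mapping above (crop centre at (-x, -y) relative to the frame centre) and a crop size proportional to the ratio of the viewing angle to the camera's field of view; the function and parameter names are illustrative, not from the patent:

```python
def crop_and_scale_params(x, y, alpha, beta, cam_hfov, cam_vfov,
                          frame_w, frame_h, disp_w, disp_h):
    """Return the crop rectangle (left, top, w, h) in frame pixels and the
    (sx, sy) scale factors that map the crop onto the display resolution.

    x, y:              user offset from the display centre, already converted
                       into frame pixels (an assumed pre-processing step)
    alpha, beta:       current horizontal/vertical viewing angles (degrees)
    cam_hfov/cam_vfov: camera field of view (degrees)
    """
    # Cropping applies only when the viewing angle is smaller than the FOV.
    if alpha >= cam_hfov or beta >= cam_vfov:
        raise ValueError("viewing angle must be smaller than the camera FOV")
    # Crop size: the fraction of the camera's FOV the user currently sees.
    crop_w = round(frame_w * alpha / cam_hfov)
    crop_h = round(frame_h * beta / cam_vfov)
    # Crop centre mirrors the user's offset: user at (x, y) -> centre (-x, -y),
    # expressed here with the frame's top-left corner as the pixel origin.
    left = round(frame_w / 2 - x - crop_w / 2)
    top = round(frame_h / 2 - y - crop_h / 2)
    sx, sy = disp_w / crop_w, disp_h / crop_h
    return (left, top, crop_w, crop_h), (sx, sy)
```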
  • if the user moves beyond a preset range, the display device may remind the user that viewing at the current location is not possible or not ideal.
  • a camera can be set on the display device to collect the user's current position information. If the user moves beyond the camera's shooting angle, it can be determined that the user has exceeded the preset range, and an alert is given.
  • two cameras may be provided on the display device to increase the shooting range.
  • the processed video image is transmitted to the display device for display.
  • the cropped video image in S130 is transmitted to the display device for display.
  • the display device may be a liquid crystal display.
  • the virtual reality technology-based viewing method acquires the user's current location information relative to the display device, calculates the user's current viewing angle according to that information, processes the video image captured in real time by the camera device at the user-specified location according to the current location information and the current viewing angle, and sends the processed video image to the display device for display. The scenic-spot video image collected in real time is thus processed according to the user's location information, just as if the user were standing in front of a window viewing the scenery, which satisfies the user's need to enjoy scenery around the world without leaving the house.
  • the user experience is good and the effect is realistic.
  • FIG. 4 is a flowchart of a viewing method based on virtual reality technology provided by this embodiment. This embodiment builds on the foregoing method and is a specific implementation for the case where the field of view of the camera device cannot cover the required video image. Referring to FIG. 4, the method may include S201-S205.
  • the current viewing angle of view includes a horizontal viewing angle and a vertical viewing angle.
  • the angle of view of the camera is limited.
  • the angle of view of a general outdoor camera is 150°.
  • the lens of the camera can be rotated, and the shooting range can be enlarged by rotating the camera lens.
  • the amount of change between the current viewing angle and the viewing angle at the previous moment may be calculated, and whether the video image corresponding to the current viewing angle exceeds the range of the video image captured by the camera is determined according to this amount of change.
  • for example, if the video image cropped according to the user's previous position information is already at the edge of the video image captured by the camera, then when the user moves again, it can be determined that the next cropped video image will be out of the range of the video image taken by the camera.
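  • the out-of-range check and the resulting adjustment can be sketched as follows; the pixels-per-degree conversion and all names are illustrative assumptions, not the patent's exact method:

```python
def camera_pan_adjustment(left, top, crop_w, crop_h,
                          frame_w, frame_h, cam_hfov, cam_vfov):
    """Decide whether a crop rectangle exceeds the captured frame and, if so,
    estimate how many degrees the rotatable camera lens should pan/tilt.

    left, top, crop_w, crop_h: crop rectangle in frame pixels
    frame_w, frame_h:          captured frame size in pixels
    cam_hfov, cam_vfov:        camera field of view in degrees
    """
    px_per_deg_h = frame_w / cam_hfov   # horizontal pixels per degree of FOV
    px_per_deg_v = frame_h / cam_vfov
    pan = tilt = 0.0
    if left < 0:                        # crop spills off the left edge
        pan = left / px_per_deg_h       # negative: pan the lens left
    elif left + crop_w > frame_w:       # crop spills off the right edge
        pan = (left + crop_w - frame_w) / px_per_deg_h
    if top < 0:
        tilt = top / px_per_deg_v
    elif top + crop_h > frame_h:
        tilt = (top + crop_h - frame_h) / px_per_deg_v
    out_of_range = pan != 0.0 or tilt != 0.0
    return out_of_range, pan, tilt
```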
  • the video image is processed according to the current location information and a current viewing angle of view.
  • the processed video image is transmitted to the display device for display.
  • the virtual reality technology-based viewing method acquires the user's current location information relative to the display device and calculates the user's current viewing angle according to that information; if the video image corresponding to the current viewing angle exceeds the range of the video image captured by the camera device, the acquisition angle of the camera device is adjusted and the video image captured in real time after the adjustment is acquired; the video image is then processed according to the current location information and the current viewing angle, and the processed video image is sent to the display device for display. The scenic-spot video image collected in real time is processed according to the user's location information, just as if the user were standing in front of a window viewing the scenery, and the acquisition angle of the remote camera is adjusted to expand the viewing range, which satisfies the user's need to enjoy scenery around the world without leaving the house. The user experience is good and the effect is realistic.
  • FIG. 5 is a block diagram showing the structure of a viewing device based on virtual reality technology provided by this embodiment.
  • the viewing device based on the virtual reality technology may be configured on the management server or may be configured on the virtual reality device.
  • the apparatus 10 may include a current viewing angle acquisition unit 100, a video image acquisition unit 101, a video image processing unit 102, and a video image transmitting unit 103.
  • the current viewing angle acquisition unit 100 is configured to acquire the user's current position information relative to the display device, and calculate the user's current viewing angle according to the current position information, where the current viewing angle includes a horizontal viewing angle and a vertical viewing angle.
  • the video image acquisition unit 101 is configured to acquire a video image captured in real time by an imaging device of a location designated by the user.
  • the video image processing unit 102 is configured to process the video image according to the current position information and the current viewing angle; specifically, if the current viewing angle is smaller than the camera's field of view, the video image is cropped according to the current viewing angle, and the cropped video image is scaled according to the resolution of the display device.
  • the video image transmitting unit 103 is configured to transmit the processed video image to the display device for display.
  • the virtual reality technology-based viewing device acquires the user's current location information relative to the display device, calculates the user's current viewing angle according to that information, processes the video image captured in real time by the camera device at the user-specified location according to the current location information and the current viewing angle, and sends the processed video image to the display device for display. The scenic-spot video image collected in real time is processed according to the user's location information, just as if the user were standing in front of a window viewing the scenery, which satisfies the user's need to enjoy scenery around the world without leaving the house.
  • the user experience is good and the effect is realistic.
  • FIG. 6 is a block diagram showing the structure of a viewing device based on virtual reality technology provided by this embodiment.
  • this device builds on the aforementioned device and can adjust the acquisition angle of the camera device.
  • the apparatus further includes a camera acquisition angle adjustment unit 104.
  • the camera acquisition angle adjustment unit 104 is configured to adjust the acquisition angle of the camera device if the video image corresponding to the current viewing angle of view exceeds the range of the video image captured by the camera device.
  • the video image acquiring unit 101 is configured to acquire a video image captured in real time after the camera adjusts the acquisition angle.
  • the viewing device based on the virtual reality technology adjusts the collection angle of the camera device when the range of the video image captured by the camera device cannot meet the requirements of the user's moving range, thereby expanding the viewing range and further improving the user's viewing effect. Experience is good.
  • FIG. 7 is a structural block diagram of a viewing system based on virtual reality technology provided by this embodiment.
  • the system includes a management server 1, a remote camera device 2, a virtual reality device 3, a display device 4, and a local camera device 5. The management server 1 is configured with the virtual reality technology-based viewing device 10 described above.
  • the remote camera devices 2 are disposed at famous scenic spots around the world, and are configured to send real-time video images to the management server.
  • the local imaging device 5 is disposed on the display device 4, and is configured to collect position information of a user's face in front of the display device 4 with respect to the display device 4, and send the position information to the virtual reality device 3.
  • the virtual reality device 3 is configured to acquire the location information of the user's face relative to the display device 4 and transmit it to the management server 1, and to receive the video image processed by the management server 1 and send it to the display device 4 for display.
  • in this system, the remote camera device sends real-time video images to the management server; the virtual reality device acquires, through the local camera device, the location information of the user's face relative to the display device and sends it to the management server; the management server calculates the user's viewing angle from the position information sent by the virtual reality device, crops the video image according to the position information and viewing angle, and sends the cropped video image to the display device, through the virtual reality device, for display. The scenic video images collected in real time are thus processed according to the user's location information, just as if the user were standing in front of a window viewing the scenery, meeting the user's need to enjoy scenery around the world without leaving home. The user experience is good and the effect is realistic.
  • the viewing device based on virtual reality technology is configured on the management server, and all video image processing operations are completed in the management server, which reduces the amount of data transmitted and requires low network bandwidth.
  • the remote camera device 2 is a 360° panoramic camera; its large shooting range can accommodate arbitrary user positions.
  • the local camera 5 is a wide-angle camera with a large shooting range, so that the user has a large moving range.
  • the management server is a cloud server connected to the virtual reality device through a network, and the remote camera devices are connected to the management server through a network; the management server manages the video images returned by remote camera devices around the world. The network may be the Internet or a 3G/4G network.
  • FIG. 8 is a structural block diagram of a viewing system based on virtual reality technology provided by this embodiment.
  • the system includes a management server 2, a remote camera 3, a virtual reality device 1, a display device 4, and a local camera 5.
  • the virtual reality device 1 is configured with the virtual reality-based real-time viewing apparatus described above.
  • the remote camera device 3 is disposed in a plurality of scenic spots, and is configured to send a real-time video image to the management server 2.
  • the local imaging device 5 is disposed on the display device 4, and is configured to collect position information of a user's face in front of the display device 4 with respect to the display device, and send the position information to the virtual reality device 1.
  • the management server 2 is configured to adjust the capture angle of the remote camera 3 according to a capture-angle adjustment command sent by the virtual reality device 1, and to send the real-time video image captured by the remote camera to the virtual reality device 1 for processing.
  • the display device 4 is arranged to display a video image processed by the virtual reality device 1.
  • in this system, the remote camera device captures real-time video images and sends them to the management server; the management server adjusts the capture angle of the remote camera device according to the capture-angle adjustment command sent by the virtual reality device, and sends the real-time video image captured by the remote camera device to the virtual reality device for processing. Live scenic-spot video is thus processed according to the user's position information, as if the user were standing at a window looking at the scenery, meeting the user's need to enjoy scenery around the world without leaving home, with a good user experience and a realistic effect.
  • the virtual reality-based viewing apparatus is configured on the virtual reality device, and all video image processing is completed in the virtual reality device, which reduces the workload of the management server and places low demands on the server, helping to reduce cost.
  • the remote camera device 3 is a 360° panoramic camera; its large shooting range accommodates any position the user may take.
  • the local camera 5 is a wide-angle camera with a large shooting range, giving the user a large range of movement.
  • the management server is a cloud server connected to the virtual reality device through a network, and the remote camera device is connected to the management server through a network; the management server manages the video images returned by remote camera devices around the world, and the network may be the Internet or a 3G/4G network.
  • the embodiment further provides a computer readable storage medium storing computer executable instructions for executing any of the above-described virtual reality based viewing methods.
  • the management server or the virtual reality device may include one or more processors 310 (FIG. 9 takes one processor 310 as an example) and a memory 320.
  • the management server or virtual reality device may further include: an input device 330 and an output device 340.
  • the processor 310, the memory 320, the input device 330, and the output device 340 may be connected by a bus or in other ways; FIG. 9 takes a bus connection as an example.
  • the memory 320 is a computer readable storage medium that can be used to store software programs, computer executable programs, and modules.
  • the processor 310 performs various functional applications and data processing by running the software programs, instructions, and modules stored in the memory 320, thereby implementing the virtual reality-based viewing method of any of the above method embodiments.
  • the memory 320 may include a program storage area and a data storage area, where the program storage area may store an operating system and an application required for at least one function, and the data storage area may store data created according to the use of the management server or the virtual reality device, and the like.
  • Memory 320 can be a non-transitory computer storage medium or a transitory computer storage medium.
  • the non-transitory computer storage medium may include volatile memory such as random access memory (RAM), and may also include non-volatile memory, such as at least one magnetic disk storage device, flash memory device, or other non-volatile solid-state storage device.
  • memory 320 can optionally include memory located remotely from processor 310; such remote memory can be connected to the management server or virtual reality device over a network. Examples of such networks include the Internet, intranets, local area networks, mobile communication networks, and combinations thereof.
  • the input device 330 can be configured to receive input numeric or character information and to generate key signal inputs related to user settings and function control of the management server or virtual reality device.
  • the output device 340 can include a display device such as a display screen.
  • This embodiment may also include a communication device 350 for transmitting and/or receiving information over a communication network.
  • when executed, the program may include the flow of an embodiment of the method described above, where the non-transitory computer-readable storage medium may be a magnetic disk, an optical disc, a read-only memory (ROM), a random access memory (RAM), or the like.
  • the present disclosure provides a virtual reality-based viewing method, apparatus, and system that process live scenic-spot video images according to the user's position information, so that users can watch scenic spots in multiple places in real time without leaving home, achieving the effect of viewing scenery through a window.

Abstract

A virtual reality-based viewing method, apparatus, and system. The method includes: acquiring current position information of a user relative to a display device and calculating the user's current viewing field-of-view angle from the current position information, the current viewing angle including a horizontal viewing angle and a vertical viewing angle; acquiring a video image captured in real time by a camera device at a user-specified location; processing the video image according to the current position information and the current viewing angle; and sending the processed video image to the display device for display.

Description

Virtual Reality-Based Viewing Method, Apparatus, and System
Technical Field
The present disclosure relates to the field of virtual reality technology, and for example to a virtual reality-based viewing method, apparatus, and system.
Background
Virtual reality (VR) technology has developed rapidly in recent years, and VR-related products have multiplied accordingly. Conventional virtual reality systems require the user to wear dedicated headgear; although this provides a full-view experience, such systems are complex to use and expensive, and may have adverse effects on the user's health, which limits their applicability.
Summary
The present disclosure provides a virtual reality-based viewing method, apparatus, and system that allow a user to watch scenic spots around the world in real time without leaving home, with the effect of viewing scenery through a window.
This embodiment provides a virtual reality-based viewing method, including: acquiring current position information of a user relative to a display device and calculating the user's current viewing field-of-view angle from the current position information, the current viewing angle including a horizontal viewing angle and a vertical viewing angle; acquiring a video image captured in real time by a camera device at a user-specified location; processing the video image according to the current position information and the current viewing angle; and sending the processed video image to the display device for display.
Optionally, the camera device is a 360° panoramic camera.
Optionally, processing the video image according to the current position information and current viewing angle includes: if the current viewing angle is smaller than the field-of-view angle of the camera device, cropping the video image according to the current position information and current viewing angle; and scaling the cropped video image according to the resolution of the display device.
Optionally, after the user's current viewing angle is calculated from the current position information, the method may further include: if the video image corresponding to the current viewing angle exceeds the range of the video image captured by the camera device, adjusting the capture angle of the camera device.
Acquiring the video image captured in real time by the camera device at the user-specified location may include: acquiring the video image captured in real time after the camera device has adjusted its capture angle.
Optionally, adjusting the capture angle of the camera device if the video image corresponding to the current viewing angle exceeds the range of the video image captured by the camera device includes: calculating the change in field-of-view angle between the current viewing angle and the previous viewing angle; judging from that change whether the video image corresponding to the current viewing angle exceeds the range of the video image captured by the camera device; and if it does, adjusting the capture angle of the camera device.
This embodiment provides a virtual reality-based viewing apparatus, including: a current viewing angle acquisition unit configured to acquire current position information of a user relative to a display device and calculate the user's current viewing field-of-view angle from the current position information, the current viewing angle including a horizontal viewing angle and a vertical viewing angle; a video image acquisition unit configured to acquire a video image captured in real time by a camera device at a user-specified location; a video image processing unit configured to process the video image according to the current position information and the current viewing angle; and a video image sending unit configured to send the processed video image to the display device for display.
Optionally, the apparatus may further include a camera capture-angle adjustment unit configured to adjust the capture angle of the camera device if the video image corresponding to the current viewing angle exceeds the range of the video image captured by the camera device; the video image acquisition unit is then configured to acquire the video image captured in real time after the camera device has adjusted its capture angle.
Optionally, the video image processing unit is further configured to crop the video image according to the current viewing angle if the current viewing angle is smaller than the field-of-view angle of the camera device, and to scale the cropped video image according to the resolution of the display device.
This embodiment also provides a management server configured with the virtual reality-based viewing apparatus of any of the above, wherein
the current viewing angle acquisition unit is configured to acquire, from a virtual reality device, the user's current position information relative to the display device and calculate the user's current viewing angle from it, the current position information being collected by a local camera device mounted on the display device and sent to the virtual reality device; the video image acquisition unit is configured to receive video images captured in real time by a remote camera device at a user-specified location, the remote camera devices being deployed at scenic spots in multiple places; and the video image sending unit is configured to send the processed video image through the virtual reality device to the display device for display.
This embodiment also provides a virtual reality device configured with the virtual reality-based viewing apparatus of any of the above, wherein the current viewing angle acquisition unit is configured to acquire the user's current position information relative to the display device, sent by a local camera device mounted on the display device, and to calculate the user's current viewing angle from it; the video image acquisition unit is configured to receive, through the management server, video images captured in real time by a remote camera device at a user-specified location, the remote camera devices being deployed at scenic spots in multiple places; and the camera capture-angle adjustment unit is configured to send the management server a capture-angle adjustment command for the remote camera device at the user-specified location, instructing the management server to adjust the capture angle of the remote camera device.
This embodiment also provides a virtual reality-based viewing system including a remote camera device, a virtual reality device, a display device, and a local camera device, together with the above management server.
This embodiment also provides a virtual reality-based viewing system including a management server, a remote camera device, a display device, and a local camera device, together with the above virtual reality device.
This embodiment also provides a computer-readable storage medium storing computer-executable instructions for performing any of the above virtual reality-based real-time viewing methods.
This embodiment provides a management server including one or more processors, a memory, and one or more programs stored in the memory which, when executed by the one or more processors, perform any of the above virtual reality-based viewing methods.
This embodiment also provides a virtual reality device including one or more processors, a memory, and one or more programs stored in the memory which, when executed by the one or more processors, perform any of the above virtual reality-based viewing methods.
By acquiring the user's current position information relative to the display device, calculating the user's current viewing angle from it, processing the video image captured in real time by a camera device at a user-specified location according to the current position information and current viewing angle, and sending the processed video image to the display device for display, the scheme processes live scenic-spot video according to the user's position information. As when the user stands at a window looking at the scenery, the displayed video image changes as the user's position changes, meeting the user's need to enjoy scenery around the world without leaving home, with a good user experience and a realistic effect.
Brief Description of the Drawings
FIG. 1 is a flowchart of the virtual reality-based viewing method provided by this embodiment.
FIG. 2a is a schematic plan view of the user's horizontal viewing angle provided by this embodiment.
FIG. 2b is a schematic plan view of the user's vertical viewing angle provided by this embodiment.
FIG. 3 is a schematic diagram of video image cropping provided by this embodiment.
FIG. 4 is a flowchart of the virtual reality-based viewing method provided by this embodiment.
FIG. 5 is a structural block diagram of the virtual reality-based viewing apparatus provided by this embodiment.
FIG. 6 is a structural block diagram of the virtual reality-based viewing apparatus provided by this embodiment.
FIG. 7 is a structural block diagram of the virtual reality-based viewing system provided by this embodiment.
FIG. 8 is a structural block diagram of the virtual reality-based viewing system provided by this embodiment.
FIG. 9 is a schematic diagram of the hardware structure of a management server or virtual reality device provided by this embodiment.
Detailed Description
To clarify the technical solutions of the present disclosure and the technical effects achieved, the solutions are described below with reference to the drawings. The described embodiments are only some, not all, of the embodiments of the present disclosure. Where no conflict arises, the following embodiments and their technical features may be combined with one another.
FIG. 1 is a flowchart of the virtual reality-based viewing method provided by this embodiment. The method may be executed by a virtual reality-based viewing apparatus configured in a management server, or by such an apparatus configured in a virtual reality device. Referring to FIG. 1, the method may include S110 to S140.
In S110, current position information of the user relative to the display device is acquired, and the user's current viewing field-of-view angle, comprising a horizontal viewing angle and a vertical viewing angle, is calculated from the current position information.
Optionally, the current relative position information of the user's face with respect to the display device is acquired in real time, and the user's current viewing angle is calculated from the relative position information.
Optionally, a camera is mounted at the top of the display device to detect in real time whether a user is present in front of the display device and how the user's position changes. By tracking changes in the position of the user's face, the camera provides the user's current position information promptly. Once the camera is installed on the display device, its mounting position and its shooting field-of-view angle are known parameters, as is the average distance between a user's two eyes. The plane position of the user's face relative to the display device can therefore be derived by processing the camera image, and the user's distance from the display screen can be estimated from the camera parameters and the distance between the user's eyes in the image. In addition, the user's position information can be expressed in a three-dimensional coordinate system whose origin may be chosen as the centre of the display device or as its lower-left corner.
There are multiple ways of acquiring the user's face information. For example, with three camera devices mounted on the display device, the position of a user's face in front of the display device can be determined from the position parameters of the three devices.
In addition, infrared ranging can be combined with a camera-based face-tracking algorithm to determine the user's position information.
The current viewing angle may include a horizontal viewing angle and a vertical viewing angle. FIG. 2a is a schematic plan view of the user's horizontal viewing angle provided by this embodiment. As shown in FIG. 2a, the horizontal viewing angle α is the angle subtended at the user's face position Q (for example, the user's eyes) by the two points A and B on the left and right edges of the display device that lie in the same horizontal plane as the user's face; AB is the horizontal length of the television. FIG. 2b is a schematic plan view of the user's vertical viewing angle provided by this embodiment. As shown in FIG. 2b, the vertical viewing angle β is the angle subtended at the user's face position Q by the two points C and D on the top and bottom edges of the display device that lie in the same vertical plane as the user's face; CD is the vertical width of the television. In actual calculation, the apex of either angle may be taken at the midpoint between the two eyes, or at the left or right eye, as the situation requires.
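In the geometry of FIG. 2a and FIG. 2b, each viewing angle is the angle a screen edge subtends at the viewer's position. A hedged sketch of that computation (the function and its planar simplification are mine, not the disclosure's):

```python
import math

def viewing_fov(extent, offset, distance):
    """Angle in degrees subtended by a screen edge of length `extent`
    at a viewpoint offset `offset` from the screen's centre line and
    `distance` from the screen plane (angle AQB in FIG. 2a, CQD in FIG. 2b)."""
    half = extent / 2.0
    return math.degrees(math.atan2(half + offset, distance)
                        + math.atan2(half - offset, distance))

# a 1.0 m wide screen viewed head-on from 0.5 m subtends 2*atan(1) = 90 degrees
alpha = viewing_fov(1.0, 0.0, 0.5)
```

Stepping back (larger `distance`) or sideways (larger `offset`) shrinks the angle, which is why the crop window described later must track the user.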
In S120, a video image captured in real time by a camera device at a user-specified location is acquired.
The user may select a scenic spot to view, and the camera device matched to that spot is invoked to capture video images of the spot in real time.
Optionally, the camera device may be a camera, for example a 360-degree panoramic camera whose horizontal and vertical fields of view both exceed 180°, so that however the user moves, the user's viewing angle falls within the camera's range. A wide-angle camera may also be used.
Optionally, the video image is captured in real time, at intervals that may be as fine as microseconds or milliseconds, so that the video image the user sees while moving in front of the display device is the live scene currently at the spot; the movement of people, or even a breeze stirring the grass, is shown on the display device in real time.
In S130, the video image is processed according to the current position information and the current viewing angle.
Optionally, if the current viewing angle is smaller than the field-of-view angle of the camera device, the video image is cropped according to the current position information and current viewing angle, and the cropped video image is scaled according to the resolution of the display device.
When the camera device is a 360° panoramic camera, suppose its horizontal and vertical fields of view are both 180°, while the user's current horizontal viewing angle is 50° and vertical viewing angle is 30°. Since the video image captured by the camera device is larger than the image shown on the display device, the video image must be cropped according to the current viewing angle and the user's current position information.
FIG. 3 is a schematic diagram of video image cropping provided by this embodiment. Referring to FIG. 3, when the user in front of the display device moves from a first position M to a second position N, the user's current viewing angle changes too: compared with the viewing angle at the first position M, the viewing angle at the second position N becomes smaller.
The video image cropped by the virtual reality device therefore also has to be adjusted according to the user's second position N. Before the move (the user at the first position M), the display device shows the content of region A cropped from the video image captured by the remote camera device; after the move (the user at the second position N), it shows the content of region B. The video image captured by the remote camera device is cropped according to the user's current position information (the user now being at the second position N), the cropped content is scaled, and the result is sent to the display device, which displays the processed video image.
Because the video image changes as the user moves, the user sees what one would see looking out of a window: as the user's position changes, so does the scenery seen through the window.
As an example, the user's position information relative to the display device is expressed in a three-dimensional coordinate system with the centre of the display device (point 0 of the display device in FIG. 3) as origin, and a two-dimensional coordinate system is set up with the midpoint of the camera's video image as origin. When the user's position coordinates are (x, y, z), the centre of the cropped video image has coordinates (-x, -y). As the user's position changes, the crop centre moves accordingly, and a suitable video image is cropped out according to the current viewing angle. The resolution of the cropped video image does not necessarily match the resolution of the display device, so the cropped image can also be scaled according to the display resolution so that the processed video image displays well on the display device.
When the user's current position exceeds a preset range, the display device can issue a reminder informing the user that viewing is impossible or unfavourable at the current position. For example, a camera mounted on the display device can collect the user's current position information; if the user moves outside that camera's shooting field of view, the user can be determined to have exceeded the preset range and an alert is raised.
Optionally, to enlarge the user's activity area, two cameras may be mounted on the display device to increase the captured range.
In S140, the processed video image is sent to the display device for display.
The video image cropped in S130 is sent to the display device for display. Optionally, the display device may be a liquid crystal display.
The virtual reality-based viewing method provided by this embodiment acquires the user's current position information relative to the display device, calculates the user's current viewing angle from it, processes the video image captured in real time by a camera device at a user-specified location according to the current position information and current viewing angle, and sends the processed video image to the display device for display. Live scenic-spot video is thus processed according to the user's position information, as if the user were standing at a window looking at the scenery, meeting the user's need to enjoy scenery around the world without leaving home, with a good user experience and a realistic effect.
FIG. 4 is a flowchart of the virtual reality-based viewing method provided by this embodiment. Building on the preceding method, this embodiment gives a specific implementation for the case in which the video image captured within the camera device's field of view cannot meet the demand. Referring to FIG. 4, the method may include S201 to S205.
In S201, current position information of the user relative to the display device is acquired, and the user's current viewing field-of-view angle is calculated from the current position information.
Optionally, the current viewing angle includes a horizontal viewing angle and a vertical viewing angle.
In S202, if the video image corresponding to the current viewing angle exceeds the range of the video image captured by the camera device, the capture angle of the camera device is adjusted.
With the camera lens fixed, the camera's field of view is limited; a typical outdoor camera, for example, has a field of view of 150°. To widen the shooting range while the camera base stays fixed, the camera lens can be rotated, extending the captured range through lens rotation.
Optionally, the change in field-of-view angle between the current viewing angle and the viewing angle at the previous moment may be calculated, and whether the video image corresponding to the current viewing angle exceeds the range of the video image captured by the camera device is judged from that change.
For example, with a camera whose field of view is 150°, if the video image cropped for the user's previous position already lies at the edge of the captured video image, then when the user moves again it can be determined that the next cropped video image will exceed the range of the video image captured by the camera.
If the video image corresponding to the current viewing angle exceeds the range of the video image captured by the camera device, the capture angle of the camera device is adjusted.
If it is judged that the video image to be cropped next already exceeds the range of the video image the camera captures, the lens angle of the camera is adjusted so that a video image meeting the cropping requirement is captured.
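The out-of-range test and capture-angle adjustment of S202 can be sketched per axis as below; the pixel margin and the sign convention for the pan direction are my own assumptions:

```python
def pan_adjustment(crop_left, crop_width, frame_width, margin=0.0):
    """How far (in frame pixels) the next crop window would overrun the
    captured frame horizontally. Negative: rotate the lens toward the
    left edge; positive: toward the right; 0.0: no adjustment needed."""
    if crop_left < margin:
        return crop_left - margin
    right = crop_left + crop_width
    if right > frame_width - margin:
        return right - (frame_width - margin)
    return 0.0
```

The same check applies vertically; a non-zero result triggers the lens rotation of S202, after which S203 fetches frames at the new angle.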
In S203, the video image captured in real time after the camera device has adjusted its capture angle is acquired.
In S204, the video image is processed according to the current position information and the current viewing angle.
In S205, the processed video image is sent to the display device for display.
For the operations of S201 to S205, refer to FIG. 1 and the content of the embodiment corresponding to FIG. 1.
The virtual reality-based viewing method provided by this embodiment acquires the user's current position information relative to the display device and calculates the user's current viewing angle from it; if the video image corresponding to the current viewing angle exceeds the range of the video image captured by the camera device, the capture angle of the camera device is adjusted and the video image captured in real time at the new angle is acquired; the video image is then processed according to the current position information and current viewing angle and the processed video image is sent to the display device for display. Live scenic-spot video is thus processed according to the user's position information, as if the user were standing at a window looking at the scenery; if the captured video image cannot meet the cropping requirement, the capture angle of the remote camera device is adjusted, which widens the viewing range and meets the user's need to enjoy scenery around the world without leaving home, with a good user experience and a realistic effect.
FIG. 5 is a structural block diagram of the virtual reality-based viewing apparatus provided by this embodiment. The apparatus may be configured in a management server or in a virtual reality device. Referring to FIG. 5, the apparatus 10 may include a current viewing angle acquisition unit 100, a video image acquisition unit 101, a video image processing unit 102, and a video image sending unit 103.
The current viewing angle acquisition unit 100 is configured to acquire current position information of the user relative to the display device and calculate the user's current viewing field-of-view angle from the current position information, the current viewing angle including a horizontal viewing angle and a vertical viewing angle.
The video image acquisition unit 101 is configured to acquire a video image captured in real time by a camera device at a user-specified location.
The video image processing unit 102 is configured to process the video image according to the current position information and the current viewing angle; specifically, if the current viewing angle is smaller than the field-of-view angle of the camera device, it crops the video image according to the current viewing angle and scales the cropped video image according to the resolution of the display device.
The video image sending unit 103 is configured to send the processed video image to the display device for display.
For details of the apparatus, see the description of the method shown in FIG. 1.
The virtual reality-based viewing apparatus provided by this embodiment acquires the user's current position information relative to the display device, calculates the user's current viewing angle from it, processes the video image captured in real time by a camera device at a user-specified location according to the current position information and current viewing angle, and sends the processed video image to the display device for display. Live scenic-spot video is thus processed according to the user's position information, as if the user were standing at a window looking at the scenery, meeting the user's need to enjoy scenery around the world without leaving home, with a good user experience and a realistic effect.
FIG. 6 is a structural block diagram of the virtual reality-based viewing apparatus provided by this embodiment. Building on the preceding apparatus, this apparatus can adjust the capture angle of the camera device. Referring to FIG. 6, the apparatus further includes a camera capture-angle adjustment unit 104.
The camera capture-angle adjustment unit 104 is configured to adjust the capture angle of the camera device if the video image corresponding to the current viewing angle exceeds the range of the video image captured by the camera device.
The video image acquisition unit 101 is configured to acquire the video image captured in real time after the camera device has adjusted its capture angle.
When the range of the video image captured by the camera device cannot cover the user's range of movement, this virtual reality-based viewing apparatus adjusts the capture angle of the camera device, widening the viewing range and further improving the user's viewing experience.
FIG. 7 is a structural block diagram of the virtual reality-based viewing system provided by this embodiment. Referring to FIG. 7, the system includes a management server 1, a remote camera device 2, a virtual reality device 3, a display device 4, and a local camera device 5; the management server 1 is configured with the virtual reality-based real-time viewing apparatus 10 described above.
The remote camera devices 2 are deployed at famous scenic spots around the world and are configured to capture real-time video images and send them to the management server.
The local camera device 5 is mounted on the display device 4 and is configured to collect position information of the face of a user in front of the display device 4 relative to the display device 4, and to send that position information to the virtual reality device 3.
The virtual reality device 3 is configured to acquire the position information of the user's face relative to the display device 4 and send it to the management server 1, and to receive the video image processed by the management server 1 and send it to the display device 4 for display.
In this system, the remote camera device captures real-time video images and sends them to the management server; the virtual reality device acquires the position of the user's face relative to the display device through the local camera device and sends that position information to the management server; the management server calculates the user's viewing field-of-view angle from the position information sent by the virtual reality device, crops the video image according to the position information and viewing angle, and sends the cropped video image through the virtual reality device to the display device for display. Live scenic-spot video is thus processed according to the user's position information, as if the user were standing at a window looking at the scenery, meeting the user's need to enjoy scenery around the world without leaving home, with a good user experience and a realistic effect. In this embodiment, the virtual reality-based viewing apparatus is configured on the management server and all video image processing is completed there, which reduces the amount of data transmitted and places low demands on network bandwidth.
Optionally, the remote camera device 2 is a 360° panoramic camera, whose large shooting range accommodates any position the user may take, and the local camera device 5 is a wide-angle camera with a large shooting range, giving the user a large range of movement.
Optionally, the management server is a cloud server connected to the virtual reality device through a network, and the remote camera device is connected to the management server through a network. The management server manages the video images returned by remote camera devices around the world, and the network may be the Internet or a 3G/4G network.
FIG. 8 is a structural block diagram of the virtual reality-based viewing system provided by this embodiment. Referring to FIG. 8, the system includes a management server 2, a remote camera device 3, a virtual reality device 1, a display device 4, and a local camera device 5; the virtual reality device 1 is configured with the virtual reality-based real-time viewing apparatus 10 described above.
The remote camera devices 3 are deployed at scenic spots in multiple places and are configured to capture real-time video images and send them to the management server 2.
The local camera device 5 is mounted on the display device 4 and is configured to collect position information of the face of a user in front of the display device 4 relative to the display device, and to send that position information to the virtual reality device 1.
The management server 2 is configured to adjust the capture angle of the remote camera device 3 according to a remote-camera capture-angle adjustment command sent by the virtual reality device 1, and to send the real-time video image captured by the remote camera device to the virtual reality device 1 for processing.
The display device 4 is configured to display the video image processed by the virtual reality device 1.
In this system, the remote camera device captures real-time video images and sends them to the management server; the management server adjusts the capture angle of the remote camera device according to the capture-angle adjustment command sent by the virtual reality device, and sends the real-time video image captured by the remote camera device to the virtual reality device for processing. Live scenic-spot video is thus processed according to the user's position information, as if the user were standing at a window looking at the scenery, meeting the user's need to enjoy scenery around the world without leaving home, with a good user experience and a realistic effect. In this embodiment, the virtual reality-based viewing apparatus is configured on the virtual reality device and all video image processing is completed there, which reduces the workload of the management server and places low demands on the server, helping to reduce cost.
Optionally, the remote camera device 3 is a 360° panoramic camera, whose large shooting range accommodates any position the user may take, and the local camera device 5 is a wide-angle camera with a large shooting range, giving the user a large range of movement.
Optionally, the management server is a cloud server connected to the virtual reality device through a network, and the remote camera device is connected to the management server through a network. The management server manages the video images returned by remote camera devices around the world, and the network may be the Internet or a 3G/4G network.
This embodiment also provides a computer-readable storage medium storing computer-executable instructions for performing any of the above virtual reality-based viewing methods.
FIG. 9 is a schematic diagram of the hardware structure of the management server or virtual reality device of this embodiment. As shown in FIG. 9, the management server or virtual reality device may include one or more processors 310 (FIG. 9 takes one processor 310 as an example) and a memory 320.
The management server or virtual reality device may further include an input device 330 and an output device 340.
The processor 310, memory 320, input device 330, and output device 340 may be connected by a bus or in other ways; FIG. 9 takes a bus connection as an example.
As a computer-readable storage medium, the memory 320 can store software programs, computer-executable programs, and modules. The processor 310 performs various functional applications and data processing by running the software programs, instructions, and modules stored in the memory 320, thereby implementing the virtual reality-based viewing method of any of the above method embodiments. The memory 320 may include a program storage area and a data storage area: the program storage area may store an operating system and an application required for at least one function, and the data storage area may store data created according to the use of the management server or virtual reality device, and the like. The memory 320 may be a non-transitory computer storage medium or a transitory computer storage medium, and may include volatile memory such as random access memory (RAM) as well as non-volatile memory, such as at least one magnetic disk storage device, flash memory device, or other non-volatile solid-state storage device. In some embodiments, the memory 320 may optionally include memory located remotely from the processor 310; such remote memory can be connected to the management server or virtual reality device over a network. Examples of such networks include the Internet, intranets, local area networks, mobile communication networks, and combinations thereof.
The input device 330 can receive input numeric or character information and generate key-signal inputs related to user settings and function control of the management server or virtual reality device. The output device 340 may include a display device such as a display screen.
This embodiment may further include a communication device 350 for transmitting and/or receiving information over a communication network.
Finally, it should be noted that those of ordinary skill in the art will understand that all or part of the flow of the above method embodiments can be completed by a computer program instructing the relevant hardware. The program can be stored in a non-transitory computer-readable storage medium and, when executed, may include the flow of an embodiment of the method described above, where the non-transitory computer-readable storage medium may be a magnetic disk, an optical disc, a read-only memory (ROM), a random access memory (RAM), or the like.
Industrial Applicability
The present disclosure provides a virtual reality-based viewing method, apparatus, and system that process live scenic-spot video images according to the user's position information, so that users can watch scenic spots in multiple places in real time without leaving home, achieving the effect of viewing scenery through a window.

Claims (13)

  1. A virtual reality-based viewing method, comprising:
    acquiring current position information of a user relative to a display device, and calculating the user's current viewing field-of-view angle from the current position information, the current viewing angle comprising a horizontal viewing angle and a vertical viewing angle;
    acquiring a video image captured in real time by a camera device at a user-specified location;
    processing the video image according to the current position information and the current viewing angle; and
    sending the processed video image to the display device for display.
  2. The method of claim 1, wherein the camera device is a 360° panoramic camera.
  3. The method of claim 1, wherein processing the video image according to the current position information and current viewing angle comprises:
    if the current viewing angle is smaller than the field-of-view angle of the camera device, cropping the video image according to the current position information and current viewing angle; and
    scaling the cropped video image according to the resolution of the display device.
  4. The method of claim 1, further comprising, after the user's current viewing angle is calculated from the current position information:
    if the video image corresponding to the current viewing angle exceeds the range of the video image captured by the camera device, adjusting the capture angle of the camera device;
    wherein acquiring the video image captured in real time by the camera device at the user-specified location comprises:
    acquiring the video image captured in real time after the camera device has adjusted its capture angle.
  5. The method of claim 4, wherein adjusting the capture angle of the camera device if the video image corresponding to the current viewing angle exceeds the range of the video image captured by the camera device comprises:
    calculating the change in field-of-view angle between the current viewing angle and the previous viewing angle;
    judging, from the change in field-of-view angle, whether the video image corresponding to the current viewing angle exceeds the range of the video image captured by the camera device; and
    if the video image corresponding to the current viewing angle exceeds the range of the video image captured by the camera device, adjusting the capture angle of the camera device.
  6. A virtual reality-based viewing apparatus, comprising:
    a current viewing angle acquisition unit configured to acquire current position information of a user relative to a display device and calculate the user's current viewing field-of-view angle from the current position information, the current viewing angle comprising a horizontal viewing angle and a vertical viewing angle;
    a video image acquisition unit configured to acquire a video image captured in real time by a camera device at a user-specified location;
    a video image processing unit configured to process the video image according to the current position information and the current viewing angle; and
    a video image sending unit configured to send the processed video image to the display device for display.
  7. The apparatus of claim 6, further comprising:
    a camera capture-angle adjustment unit configured to adjust the capture angle of the camera device if the video image corresponding to the current viewing angle exceeds the range of the video image captured by the camera device; and
    the video image acquisition unit is further configured to acquire the video image captured in real time after the camera device has adjusted its capture angle.
  8. The apparatus of claim 6, wherein the video image processing unit is configured to crop the video image according to the current viewing angle if the current viewing angle is smaller than the field-of-view angle of the camera device, and to scale the cropped video image according to the resolution of the display device.
  9. A management server configured with the virtual reality-based viewing apparatus of any one of claims 6 to 8, wherein
    the current viewing angle acquisition unit is configured to acquire, from a virtual reality device, the user's current position information relative to the display device and calculate the user's current viewing angle from the current position information, the current position information being collected by a local camera device mounted on the display device and sent to the virtual reality device;
    the video image acquisition unit is configured to receive video images captured in real time by a remote camera device at a user-specified location, the remote camera devices being deployed at scenic spots in multiple places; and
    the video image sending unit is configured to send the processed video image through the virtual reality device to the display device for display.
  10. A virtual reality device configured with the virtual reality-based viewing apparatus of any one of claims 6 to 8, wherein
    the current viewing angle acquisition unit is configured to acquire the user's current position information relative to the display device, sent by a local camera device mounted on the display device, and to calculate the user's current viewing angle from the current position information;
    the video image acquisition unit is configured to receive, through the management server, video images captured in real time by a remote camera device at a user-specified location, the remote camera devices being deployed at scenic spots in multiple places; and
    the camera capture-angle adjustment unit is configured to send the management server a capture-angle adjustment command for the remote camera device at the user-specified location, instructing the management server to adjust the capture angle of the remote camera device.
  11. A virtual reality-based viewing system, comprising a remote camera device, a virtual reality device, a display device, and a local camera device, together with the management server of claim 9.
  12. A virtual reality-based viewing system, comprising a management server, a remote camera device, a display device, and a local camera device, together with the virtual reality device of claim 10.
  13. A computer-readable storage medium storing computer-executable instructions for performing the viewing method of any one of claims 1 to 5.
PCT/CN2017/077987 2016-12-09 2017-03-24 基于虚拟现实技术的观景方法、装置及系统 WO2018103233A1 (zh)

Priority Applications (2)

Application Number Priority Date Filing Date Title
AU2017370476A AU2017370476B2 (en) 2016-12-09 2017-03-24 Virtual reality-based viewing method, device, and system
US16/323,676 US20190208174A1 (en) 2016-12-09 2017-03-24 Virtual reality-based viewing method, device, and system

Applications Claiming Priority (2)

Application Number Priority Date Filing Date Title
CN201611130324.2A CN106791385A (zh) 2016-12-09 2016-12-09 一种基于虚拟现实技术的观景方法、装置及系统
CN201611130324.2 2016-12-09

Publications (1)

Publication Number Publication Date
WO2018103233A1 true WO2018103233A1 (zh) 2018-06-14

Family

ID=58879727

Family Applications (1)

Application Number Title Priority Date Filing Date
PCT/CN2017/077987 WO2018103233A1 (zh) 2016-12-09 2017-03-24 基于虚拟现实技术的观景方法、装置及系统

Country Status (4)

Country Link
US (1) US20190208174A1 (zh)
CN (1) CN106791385A (zh)
AU (1) AU2017370476B2 (zh)
WO (1) WO2018103233A1 (zh)

Families Citing this family (10)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN107317987B (zh) * 2017-08-14 2020-07-03 歌尔股份有限公司 虚拟现实的显示数据压缩方法和设备、系统
TWI653882B (zh) 2017-11-23 2019-03-11 宏碁股份有限公司 視訊裝置及其三維物件編解碼方法
CN108650522B (zh) * 2018-05-29 2020-10-27 青岛一舍科技有限公司 基于自动控制的可即时获取高清照片的直播系统
CN110856107B (zh) * 2018-08-21 2023-08-22 上海擎感智能科技有限公司 智能导游方法、系统、服务器及车辆
CN109741464B (zh) * 2019-01-08 2023-03-24 三星电子(中国)研发中心 用于显示实景的方法和装置
US20210349308A1 (en) * 2020-05-05 2021-11-11 Szu Wen FAN System and method for video processing using a virtual reality device
CN111683077B (zh) * 2020-06-02 2021-05-04 硅谷数模(苏州)半导体有限公司 虚拟现实设备及数据的处理方法
CN111741287B (zh) * 2020-07-10 2022-05-17 南京新研协同定位导航研究院有限公司 一种mr眼镜利用位置信息触发内容的方法
CN112995501A (zh) * 2021-02-05 2021-06-18 歌尔科技有限公司 摄像头的控制方法、装置、电子设备及存储介质
CN114286005A (zh) * 2021-12-29 2022-04-05 合众新能源汽车有限公司 一种车辆天窗的图像显示方法及装置

Citations (5)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN103345357A (zh) * 2013-07-31 2013-10-09 关鸿亮 一种基于移动设备传感器实现自动街景展示的方法
CN103384865A (zh) * 2011-02-22 2013-11-06 高通股份有限公司 基于用户相对于移动平台的位置来提供经纠正的视图
CN104902263A (zh) * 2015-05-26 2015-09-09 深圳市圆周率软件科技有限责任公司 一种图像信息展现系统和方法
US20160086306A1 (en) * 2014-09-19 2016-03-24 Sony Computer Entertainment Inc. Image generating device, image generating method, and program
CN106162035A (zh) * 2015-05-12 2016-11-23 Lg电子株式会社 移动终端

Family Cites Families (2)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US7883415B2 (en) * 2003-09-15 2011-02-08 Sony Computer Entertainment Inc. Method and apparatus for adjusting a view of a scene being displayed according to tracked head motion
US9389682B2 (en) * 2012-07-02 2016-07-12 Sony Interactive Entertainment Inc. Methods and systems for interaction with an expanded information space

Patent Citations (5)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN103384865A (zh) * 2011-02-22 2013-11-06 高通股份有限公司 基于用户相对于移动平台的位置来提供经纠正的视图
CN103345357A (zh) * 2013-07-31 2013-10-09 关鸿亮 一种基于移动设备传感器实现自动街景展示的方法
US20160086306A1 (en) * 2014-09-19 2016-03-24 Sony Computer Entertainment Inc. Image generating device, image generating method, and program
CN106162035A (zh) * 2015-05-12 2016-11-23 Lg电子株式会社 移动终端
CN104902263A (zh) * 2015-05-26 2015-09-09 深圳市圆周率软件科技有限责任公司 一种图像信息展现系统和方法

Also Published As

Publication number Publication date
US20190208174A1 (en) 2019-07-04
AU2017370476B2 (en) 2020-11-05
AU2017370476A1 (en) 2019-04-11
CN106791385A (zh) 2017-05-31

Similar Documents

Publication Publication Date Title
WO2018103233A1 (zh) 基于虚拟现实技术的观景方法、装置及系统
US9965026B2 (en) Interactive video display method, device, and system
KR102363364B1 (ko) 파노라마 비디오의 상호작용적 전송을 위한 방법 및 시스템
CN114527872B (zh) 虚拟现实交互系统、方法及计算机存储介质
CN103905792B (zh) 一种基于ptz监控摄像机的3d定位方法及装置
CN108259921B (zh) 一种基于场景切换的多角度直播系统及切换方法
KR20140053885A (ko) 모바일 컴퓨팅 디바이스에서의 파노라마 비디오 이미징을 위한 장치 및 방법
CN105163158A (zh) 一种图像处理方法和装置
US10827117B2 (en) Method and apparatus for generating indoor panoramic video
KR102069930B1 (ko) 이머전 통신 클라이언트, 서버 및 컨텐츠 뷰를 획득하는 방법
CN101720027A (zh) 可变焦阵列摄像机协同获取不同分辨率多目标视频方法
CN110602383B (zh) 监控摄像头的位姿调节方法、装置、终端及存储介质
JP6359572B2 (ja) 画像送信装置、情報処理端末、画像送信方法、情報処理方法、プログラム及び情報記憶媒体
WO2019085829A1 (zh) 控制系统的处理方法、装置、存储介质和电子装置
KR20170044451A (ko) 헤드 마운트 디스플레이를 이용한 원격지 카메라 제어 시스템 및 방법
US11062422B2 (en) Image processing apparatus, image communication system, image processing method, and recording medium
CN108449545B (zh) 一种监控系统及其应用方法
CN114442805A (zh) 一种监控场景展示方法、系统、电子设备及存储介质
CN115668913A (zh) 现场演出的立体显示方法、装置、介质及系统
CN111325201A (zh) 影像处理方法、装置与可移动设备、无人机遥控器及系统
KR101410985B1 (ko) 감시카메라를 이용한 감시시스템 및 감시장치와 그 감시방법
US10425608B2 (en) Image processing method and camera
CN112640419B (zh) 跟随方法、可移动平台、设备和存储介质
CN103139457A (zh) 一种图像获得控制方法及电子设备
WO2023241495A1 (zh) 拍摄方法及其装置

Legal Events

Date Code Title Description
121 Ep: the epo has been informed by wipo that ep was designated in this application

Ref document number: 17878572

Country of ref document: EP

Kind code of ref document: A1

ENP Entry into the national phase

Ref document number: 2017370476

Country of ref document: AU

Date of ref document: 20170324

Kind code of ref document: A

NENP Non-entry into the national phase

Ref country code: DE

122 Ep: pct application non-entry in european phase

Ref document number: 17878572

Country of ref document: EP

Kind code of ref document: A1