CN115103114A - Panoramic video view tracking method, device, equipment and medium - Google Patents


Info

Publication number
CN115103114A
CN115103114A (application CN202210683281.XA; granted as CN115103114B)
Authority
CN
China
Prior art keywords: panoramic, view angle, user, target, angle
Prior art date
Legal status: Granted
Application number: CN202210683281.XA
Other languages: Chinese (zh)
Other versions: CN115103114B (en)
Inventor
李志�
刘江波
李照威
王怀亮
于洪达
王亚星
Current Assignee: BOE Technology Group Co Ltd
Original Assignee: BOE Technology Group Co Ltd
Priority date
Filing date
Publication date
Application filed by BOE Technology Group Co Ltd filed Critical BOE Technology Group Co Ltd
Priority to CN202210683281.XA
Publication of CN115103114A
Application granted
Publication of CN115103114B
Current legal status: Active

Landscapes

  • User Interface Of Digital Computer (AREA)

Abstract

The application discloses a method, apparatus, device, and medium for tracking the view angle of a panoramic video. The method includes: when a user is in an effective identification area, performing view angle tracking on the user to determine the user's target view angle; determining a target image area in the panoramic spherical coordinate system of the panoramic video based on the target view angle and a view angle range, where the view angle range is the angle formed by the view angle origin and the edges of the panoramic playback device, and the view angle origin is the midpoint of the effective identification area; and displaying the target image area panoramically. By automatically adjusting the displayed target image area through view angle tracking, the method gives the user an immersive viewing experience.

Description

Panoramic video view tracking method, device, equipment and medium
Technical Field
The present disclosure relates generally to the field of panoramic display technologies, and in particular, to a method, an apparatus, a device, and a medium for tracking a view angle of a panoramic video.
Background
Panoramic video is an immersive virtual-reality technology that is increasingly favored by users for its unique mode of interactive experience. With the rapid development of liquid crystal displays, screen resolution has become high enough, and screen image quality clear and fine enough, to play panoramic video well. In the related art, however, when a liquid crystal display is reused to play panoramic video, the user must drag the picture manually, and no immersive experience can be provided.
Disclosure of Invention
In view of the above drawbacks or deficiencies in the prior art, it is desirable to provide a method, apparatus, device, and medium for tracking the view angle of a panoramic video that automatically adjust the displayed target image area through view angle tracking and thereby provide the user with an immersive experience.
In a first aspect, an embodiment of the present application provides a view tracking method for panoramic video, including:
when the user is in the effective identification area, carrying out visual angle tracking on the user to determine a target visual angle of the user;
determining a target image area in a panoramic spherical coordinate system of the panoramic video based on the target view angle and a view angle range, wherein the view angle range is an angle formed by a view angle origin and the edge of the panoramic playing equipment, and the view angle origin is the midpoint of the effective identification area;
and carrying out panoramic display on the target image area.
In some embodiments, the effective identification area is a preset distance in front of the panoramic playing device, where the preset distance is a preset proportion of a distance between the panoramic playing device and an obstacle.
In some embodiments, human eye movement in the face image is detected by an optical flow method, and view angle tracking is performed according to the detected eye movement.
In some embodiments, the target perspective comprises a horizontal perspective and a vertical perspective;
the horizontal visual angle is an included angle between the sight line direction of the user in the horizontal direction and the vertical direction of the panoramic playing device;
the vertical visual angle is an included angle between the sight direction of the user in the vertical direction and the vertical direction of the panoramic playing device.
In some embodiments, the determining the target image area in the panoramic spherical coordinate system of the panoramic video based on the target view angle and the view angle range comprises:
constructing a panoramic spherical coordinate system of the panoramic video into an external sphere of the effective identification area;
determining the target image area on the panoramic spherical surface based on the target view angle and the view angle range.
In some embodiments, a coordinate region within the view angle range centered on the target view angle on the panoramic spherical coordinate system is taken as the target image region.
In some embodiments, the target image area is zoomed according to the size of the panorama playing device, and then is panoramically displayed.
In a second aspect, an embodiment of the present application provides a view tracking apparatus for panoramic video, including:
the tracking module is used for tracking the visual angle of the user to determine the target visual angle of the user when the user is in the effective identification area;
the determining module is used for determining a target image area in a panoramic spherical coordinate system of the panoramic video based on the target view angle and a view angle range, wherein the view angle range is an angle formed by a view angle origin and the edge of the panoramic playing equipment, and the view angle origin is the midpoint of the effective identification area;
and the display module is used for carrying out panoramic display on the target image area.
In a third aspect, embodiments of the present application provide an electronic device, which includes a memory, a processor, and a computer program stored on the memory and executable on the processor, and the processor executes the computer program to implement the method described in the embodiments of the present application.
In a fourth aspect, the present application provides a computer-readable storage medium, on which a computer program is stored, which when executed by a processor implements the method as described in the embodiments of the present application.
According to the view angle tracking method for panoramic video provided by the embodiments of the application, when the user is located in the effective identification area, the user's view angle is tracked and a target image area is determined and displayed in the panoramic spherical coordinate system of the panoramic video. The displayed image thus follows the user's eye movement, giving the user an immersive feeling when watching panoramic video and effectively improving the user experience.
Additional aspects and advantages of the invention will be set forth in part in the description which follows and, in part, will be obvious from the description, or may be learned by practice of the invention.
Drawings
Other features, objects and advantages of the present application will become more apparent upon reading of the detailed description of non-limiting embodiments made with reference to the following drawings:
fig. 1 is a diagram illustrating an implementation environment architecture of a view tracking method for panoramic video according to an embodiment of the present application;
fig. 2 is a schematic flowchart illustrating a view tracking method for panoramic video according to an embodiment of the present application;
FIG. 3 illustrates a schematic plan view of an effective identification area according to an embodiment of the present application;
FIG. 4 illustrates a perspective view of an effective identification area according to an embodiment of the present application;
fig. 5 is a flowchart illustrating a view tracking method for panoramic video according to another embodiment of the present application;
FIG. 6 is a schematic diagram illustrating a position relationship between an effective recognition area and a panoramic spherical coordinate system according to an embodiment of the present application;
fig. 7 is a schematic diagram illustrating how the target image area is determined when the face rotates in the horizontal direction with the pitch angle fixed, according to an embodiment of the present application;
fig. 8 is a schematic structural diagram illustrating a view tracking apparatus for panoramic video according to an embodiment of the present application;
fig. 9 shows a schematic structural diagram of a computer system suitable for implementing the electronic device or the server according to the embodiment of the present application.
Detailed Description
The present application will be described in further detail with reference to the following drawings and examples. It is to be understood that the specific embodiments described herein are merely illustrative of the relevant invention and not restrictive of the invention. It should be noted that, for convenience of description, only the portions related to the present invention are shown in the drawings.
It should be noted that the embodiments and features of the embodiments in the present application may be combined with each other without conflict. The present application will be described in detail below with reference to the embodiments with reference to the attached drawings.
A specific implementation environment of the perspective tracking method for panoramic video provided by the present application is shown in fig. 1. Fig. 1 is a diagram illustrating an implementation environment architecture of a view tracking method for panoramic video according to an embodiment of the present application.
As shown in fig. 1, the implementation environment architecture includes: the panoramic playing device 101, the server 102 and the human face acquisition device 103. Fig. 1 illustrates an example of a built-in camera with a human face acquisition device 103 as a panoramic playing device 101.
The panorama playback device 101 is a display device capable of playing panoramic video, such as a liquid crystal display screen or the like. The panorama playback device 101 may be disposed on a wall surface. The face collecting device 103 may be a camera, and the face collecting device 103 may be disposed on the panorama playback apparatus 101 or may be disposed independently, that is, the face collecting device 103 may be a built-in camera of the panorama playback apparatus 101 or an independent camera directly or indirectly connected to the panorama playback apparatus 101.
When the face acquisition device 103 is directly connected to the panoramic playback device 101, the face acquisition device 103 sends the acquired face image to the panoramic playback device 101, so that the panoramic playback device 101 sends the face image to the server 102 and the server 102 performs view tracking analysis on the face image to obtain a target view. When the face acquisition device 103 is indirectly connected to the panoramic playing device 101, the face acquisition device 103 sends the acquired face image to the server 102, so that the server 102 performs view tracking analysis on the face image to obtain a target view.
The server 102 receives the face image sent by the panoramic playing device 101 or the face acquisition device 103, performs view tracking analysis on the face image to determine a target view angle, determines a target image area in a panoramic spherical coordinate system of the panoramic video based on the target view angle and the view angle range, and controls the panoramic playing device 101 to perform panoramic display on the target area.
Optionally, the server 102 may provide multiple panoramic videos, such as panoramic videos of the same location at different times or of different locations, from which the user may select one for display. Preferably, the user's selection preferences can be recorded, so that panoramic videos matching those preferences are recommended when the user makes a selection. The present application does not specifically limit how panoramic videos are provided and selected.
The server 102 may be an independent physical server, a server cluster or a distributed system formed by a plurality of physical servers, or a cloud server providing basic cloud computing services such as a cloud service, a cloud database, cloud computing, a cloud function, a cloud storage, a network service, cloud communication, middleware service, a domain name service, a security service, a CDN, and a big data and artificial intelligence platform.
The panorama playback device 101 and the server 102 are connected directly or indirectly through wired or wireless communication. Optionally, the wireless or wired networks described above use standard communication techniques and/or protocols. The network is typically the Internet, but may be any network, including but not limited to a local area network (LAN), a metropolitan area network (MAN), a wide area network (WAN), a mobile, wired, or wireless network, a private network, or any combination of virtual private networks.
The view angle tracking method for the panoramic video can be implemented by a view angle tracking device for the panoramic video, and the view angle tracking device for the panoramic video can be installed on a panoramic playing device or a server.
To further explain the technical solutions provided by the embodiments of the present application, detailed descriptions are given below in conjunction with the accompanying drawings and the specific embodiments. Although the embodiments provide the method steps shown in the following examples or drawings, the method may include more or fewer steps based on conventional or non-inventive labor. For steps with no logically necessary causal relationship, the order of execution is not limited to that provided by the embodiments; in an actual process or device, the steps may be executed sequentially or in parallel, following the order shown in the embodiments or drawings.
Referring to fig. 2, fig. 2 is a flowchart illustrating a view tracking method for panoramic video according to an embodiment of the present application. As shown in fig. 2, the method includes:
step 201, when the user is in the effective identification area, performing view tracking on the user to determine a target view of the user.
In one or more embodiments, the effective identification area is within a preset distance in front of the panoramic playback device, where the preset distance is a preset proportion of the distance between the panoramic playback device and the obstacle.
The preset proportion can be a proportion value preset by a user or a proportion value obtained through limited experiments. Preferably, in one or more embodiments, the preset ratio may be 2/3.
For example, as shown in fig. 3, BD is the length of the panoramic playback device, AC = BD, OR is the distance between the panoramic playback device and the obstacle (here, a bed), O is the face acquisition device, OQ is the preset distance, i.e., 2/3 of the distance between the panoramic playback device and the obstacle, and P is the midpoint of OQ.
That is, the effective identification area is the region in front of the panoramic playback device and away from obstacles such as a bed or a desk. This effectively prevents the view-following effect of panoramic playback from disturbing the user's normal work and life: view angle following is performed only when the user leaves the work and living areas and actively approaches the panoramic playback device.
Specifically, the position of a user is determined according to a face image acquired by a face acquisition device, then whether the position is in an effective identification area is judged, if yes, visual angle tracking is carried out according to the face image to determine a target visual angle of the user, and if not, the face image of the user is continuously acquired to carry out position judgment.
The position can be determined from the relationship between the size of the face in the face image and the sizes and positions of other known objects in the room, or by a neural network model applied to the face image.
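The preset-distance rule and the in-area check described above can be sketched as a simple helper (hypothetical function and parameter names, not from the patent; the 2/3 default is the ratio named in the embodiment):

```python
def preset_distance(device_to_obstacle: float, ratio: float = 2 / 3) -> float:
    """Preset distance: a fixed proportion (2/3 in the described embodiment)
    of the distance between the playback device and the nearest obstacle."""
    return device_to_obstacle * ratio

def in_effective_area(user_distance: float, user_offset: float,
                      device_length: float, device_to_obstacle: float) -> bool:
    """True if the user stands inside the effective identification area:
    directly in front of the device (lateral offset within half its length)
    and closer than the preset distance."""
    d = preset_distance(device_to_obstacle)
    return 0.0 < user_distance <= d and abs(user_offset) <= device_length / 2
```

Only when this check passes would view angle tracking proceed; otherwise the system keeps acquiring face images and re-checking the position.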
In one or more embodiments, the detection of human eye movement in the face image is determined by an optical flow method, and the tracking of the view angle is performed according to the detection result of the human eye movement.
The optical flow method determines the motion of an object from feature changes in the horizontal and vertical directions; in the embodiments of the present application, only the motion of the human eyes needs to be judged.
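The patent names only "an optical flow method"; one common concrete choice is a single Lucas-Kanade least-squares step, sketched here with NumPy over a grayscale patch (e.g. a crop around the eyes). This is an illustrative reconstruction, not the patent's implementation:

```python
import numpy as np

def lk_flow(prev: np.ndarray, curr: np.ndarray) -> tuple:
    """Single-window Lucas-Kanade step: estimate the dominant (dx, dy)
    motion between two grayscale patches by solving the least-squares
    system [Ix Iy] v = -It, where Ix, Iy are spatial gradients and It
    is the temporal difference."""
    prev = prev.astype(float)
    curr = curr.astype(float)
    Iy, Ix = np.gradient(prev)   # spatial gradients (rows = y, cols = x)
    It = curr - prev             # temporal gradient
    A = np.stack([Ix.ravel(), Iy.ravel()], axis=1)
    b = -It.ravel()
    v, *_ = np.linalg.lstsq(A, b, rcond=None)
    return float(v[0]), float(v[1])  # horizontal, vertical motion in pixels
```

The horizontal and vertical components returned here correspond to the "feature changes in the horizontal direction and the vertical direction" the paragraph above refers to.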
It should be noted that the viewing angle is an angle formed by the line of sight and the vertical direction of the panoramic playing device, and the target viewing angle is an angle formed by a new line of sight generated by the movement of human eyes and the vertical direction of the panoramic playing device.
Wherein the target view comprises a horizontal view and a vertical view; the horizontal visual angle is an included angle between the sight direction of the user in the horizontal direction and the vertical direction of the panoramic playing device; the vertical visual angle is an included angle between the sight direction of the user in the vertical direction and the vertical direction of the panoramic playing device.
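The horizontal/vertical decomposition above can be sketched using the x/y/z axes of fig. 4 (x toward the screen normal, y vertical, z parallel to the screen). The gaze-vector representation and function name are assumptions for illustration:

```python
import math

def view_angles(gaze: tuple) -> tuple:
    """Decompose a gaze direction vector (x, y, z) into a horizontal and
    a vertical view angle in degrees, each measured from the direction
    perpendicular to the playback device."""
    x, y, z = gaze
    horizontal = math.degrees(math.atan2(z, x))               # in the horizontal plane
    vertical = math.degrees(math.atan2(y, math.hypot(x, z)))  # elevation
    return horizontal, vertical
```

A gaze straight at the screen gives (0, 0); rotating the line of sight changes the corresponding component.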
In one or more embodiments, the effective identification area is a three-dimensional spatial region.
Specifically, the effective identification area is the spatial region covered by the panoramic playback device within the preset distance. As shown in fig. 4, ABCD is the plane determined by the length of the panoramic playback device and the preset distance described above, H is the width of the panoramic playback device, x is the direction perpendicular to the panoramic playback device in the horizontal plane, y is the vertical direction, and z is the direction parallel to the panoramic playback device in the horizontal plane. The center of the effective identification area is the view angle origin.
Step 202, determining a target image area in a panoramic spherical coordinate system of the panoramic video based on a target view angle and a view angle range, wherein the view angle range is an angle formed by a view angle origin and the edge of the panoramic playing device, and the view angle origin is the midpoint of the effective identification area.
Panoramic video is video shot in all directions over 360 degrees with a 3D camera; when watching it, the user can freely adjust the view up, down, left, and right.
The coordinate system to which the panoramic video corresponds is a panoramic spherical coordinate system, i.e. the coordinates of the images of the panoramic video correspond to location points in the panoramic spherical coordinate system.
That is to say, the coordinate area corresponding to the target image is determined in the panoramic spherical coordinate system of the panoramic video according to the target view angle and the view angle range, and then the image corresponding to the coordinates is displayed in a panoramic manner.
In one or more embodiments, as shown in fig. 5, step 202, determining a target image area in a panoramic spherical coordinate system of the panoramic video based on the target view angle and the view angle range, includes:
step 2021, construct the panoramic spherical coordinate system of the panoramic video into an external ball of the effective identification area.
It should be understood that, when the panoramic spherical coordinate system is constructed as a sphere circumscribing the effective identification area, the distance from the midpoint of the effective identification area to any vertex can be taken as the radius of the circumscribed sphere. The panoramic spherical coordinate system of the panoramic video is then scaled to this radius, so that it coincides with the circumscribed sphere of the effective identification area, which achieves the purpose of constructing the panoramic spherical coordinate system of the panoramic video as the circumscribed sphere of the effective identification area.
For example, as shown in fig. 6, the cube is the effective identification area, and the sphere circumscribing the cube is the panoramic spherical coordinate system.
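The radius construction in step 2021 amounts to half the space diagonal of the effective identification area. A minimal sketch, assuming a cuboid area (fig. 6 shows the special case of a cube); the function and parameter names are illustrative:

```python
import math

def circumscribed_radius(length: float, height: float, depth: float) -> float:
    """Radius of the sphere circumscribing a cuboid effective identification
    area: the distance from the cuboid's centre (the view angle origin) to
    any vertex, i.e. half the space diagonal."""
    return 0.5 * math.sqrt(length**2 + height**2 + depth**2)
```

For the 2 x 2 x 2 cube of fig. 6 this gives sqrt(3), the familiar circumscribed-sphere radius of a unit-half-edge cube.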
Step 2022, determining a target image area on the panoramic spherical surface based on the target view angle and the view angle range.
Preferably, a coordinate area within a view angle range centered on the target view angle on the panoramic spherical coordinate system is set as the target image area.
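This "window centred on the target view angle" can be sketched as an angular bounding box on the sphere (an illustrative sketch; the use of degrees and the dict representation are assumptions, not the patent's data format):

```python
def target_region(target_h: float, target_v: float,
                  fov_h: float, fov_v: float) -> dict:
    """Coordinate window on the panoramic sphere: the angular region of
    size fov_h x fov_v (the view angle range) centred on the target view
    angle (target_h, target_v). All angles in degrees."""
    return {
        "h_min": target_h - fov_h / 2, "h_max": target_h + fov_h / 2,
        "v_min": target_v - fov_v / 2, "v_max": target_v + fov_v / 2,
    }
```

The four boundary values returned here play the role of the four boundary coordinates of the target image area mentioned later in this section.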
For example, fig. 7 is a schematic diagram illustrating how the target image area is determined when the face rotates in the horizontal direction with the pitch angle fixed. In diagrams (a)-(e), O is the view angle origin, ∠MON is the view angle range in the horizontal direction, OQ is the bisector of ∠MON, i.e., the user's current main line of sight, T is the position corresponding to the user's main line of sight, BD is the length of the panoramic playback device, P is the user's current position, and arc WU is the target image area determined in the horizontal direction.
That is, in fig. 7(a), the user stands at the view angle origin, the target view angle is 0, and the view angle range is the angle formed by the view angle origin and the edges of the panoramic playback device; at this point the edge points B, D of the panoramic playback device coincide with the view angle range points M, N and the target image area edges W, U. When the user moves closer to the panoramic playback device, i.e., from point O to point P in (b), the arc WU cut from the panoramic spherical coordinate system by the view angle range ∠MON shortens; correspondingly, when the user moves away from the panoramic playback device, i.e., from point O to point P in (c), the arc WU lengthens. If the user stays at the view angle origin and only rotates the line of sight, so that the target view angle becomes ∠α as in (d) and (e), the length of the arc WU cut by ∠MON is the same as in (a), and only its direction changes with the target view angle ∠α.
It should be understood that when the face rotates in the vertical direction with the horizontal direction fixed, the situation is analogous to rotation in the horizontal direction with the pitch angle fixed, so the description is omitted here. It should also be understood that, when determining the target image area, the results in the horizontal and vertical directions must be combined to obtain the final target image area.
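The arc-length behaviour described for fig. 7 follows from circle geometry. The sketch below assumes the user stands on the axis OQ at signed distance d from the view angle origin (positive toward the screen) with half view-angle theta; this derivation is a reconstruction for illustration, not taken from the patent:

```python
import math

def arc_halfangle(R: float, d: float, theta: float) -> float:
    """Half the central angle subtended by arc WU on a sphere of radius R,
    for a viewer at signed axial distance d from the centre looking along
    the axis with half view-angle theta (all angles in radians).
    Derived from the law of sines in triangle O-P-W."""
    return theta - math.asin(d * math.sin(theta) / R)

def arc_length(R: float, d: float, theta: float) -> float:
    """Length of the visible arc WU on the panoramic sphere of radius R."""
    return 2 * R * arc_halfangle(R, d, theta)
```

At d = 0 the arc equals 2*R*theta; it shortens as the user approaches the screen (d > 0) and lengthens as the user retreats (d < 0), matching diagrams (b) and (c).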
Preferably, after the target image area is determined, four boundary coordinate values and two target view angles of the target image area are recorded.
And step 203, displaying the target image area in a panoramic mode.
Optionally, the target image area is zoomed according to the size of the panorama playing device, and then is displayed in a panorama.
From the example in fig. 7, it can be seen that only when the user stays at the view angle origin is the size of the target area the same as the image range normally displayed by the panoramic playback device. If the user moves within the effective identification area, the size of the determined target area changes; the target image area then needs to be adjusted to the size that the panoramic playback device can display, i.e., zoomed so that it fills the panoramic playback device. This realizes the depth effect of panoramic video and improves the user's immersive experience.
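The zooming step above can be sketched as choosing a scale factor from the per-axis ratios (the "fill the screen" choice of the larger ratio is an assumption; the patent does not specify how overflow on the other axis is handled):

```python
def zoom_factor(region_w: float, region_h: float,
                screen_w: float, screen_h: float) -> float:
    """Scale factor that makes the target image region fill the playback
    device: the larger of the two per-axis ratios, so the scaled region
    covers the full screen (any overflow on the other axis is cropped)."""
    return max(screen_w / region_w, screen_h / region_h)
```

Applying this factor to the target image area before display keeps the screen full as the user moves, which is what produces the depth effect described above.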
In summary, the view angle tracking method for the panoramic video, provided by the embodiment of the application, can determine and display the target image area in the panoramic spherical coordinate system of the panoramic video by tracking the view angle of the user when the user is in the effective identification area, and can achieve the effect of eye movement following the image, so that the user can feel personally on the scene when watching the panoramic video, and thus the user experience is effectively improved.
It should be noted that while the operations of the method of the present invention are depicted in the drawings in a particular order, this does not require or imply that the operations must be performed in this particular order, or that all of the illustrated operations must be performed, to achieve desirable results.
Fig. 8 shows a schematic structural diagram of a view tracking apparatus for panoramic video according to an embodiment of the present application.
As shown in fig. 8, a viewing angle tracking apparatus 10 for panoramic video according to an embodiment of the present application includes:
the tracking module 11 is configured to perform view tracking on the user to determine a target view of the user when the user is in the effective identification area;
a determining module 12, configured to determine a target image area in a panoramic spherical coordinate system of a panoramic video based on the target view angle and a view angle range, where the view angle range is an angle formed by an original point of the view angle and an edge of a panoramic playing device, and the original point of the view angle is a midpoint of the effective identification area;
and a display module 13, configured to perform panoramic display on the target image area.
In some embodiments, the effective identification area is a preset distance in front of the panoramic playing device, where the preset distance is a preset proportion of a distance between the panoramic playing device and an obstacle.
In some embodiments, human eye movement in the face image is detected by an optical flow method, and view angle tracking is performed according to the detected eye movement.
In some embodiments, the target perspective comprises a horizontal perspective and a vertical perspective;
the horizontal visual angle is an included angle between the sight line direction of the user in the horizontal direction and the vertical direction of the panoramic playing device;
the vertical visual angle is an included angle between the sight direction of the user in the vertical direction and the vertical direction of the panoramic playing device.
In some embodiments, the determining the target image area in the panoramic spherical coordinate system of the panoramic video based on the target view angle and the view angle range comprises:
constructing a panoramic spherical coordinate system of the panoramic video into an external sphere of the effective identification area;
determining the target image area on the panoramic spherical surface based on the target view angle and the view angle range.
In some embodiments, a coordinate region within the view angle range centered on the target view angle on the panoramic spherical coordinate system is taken as the target image region.
In some embodiments, the target image area is zoomed according to the size of the panorama playing device, and then is panoramically displayed.
It should be understood that the units or modules recited in the apparatus 10 correspond to the various steps of the method described with reference to fig. 2. Thus, the operations and features described above with respect to the method also apply to the apparatus 10 and the units contained therein, and are not described again here. The apparatus 10 may be implemented in advance in a browser or another application of the electronic device, or may be loaded into such an application by downloading or the like. Corresponding units in the apparatus 10 may cooperate with units in the electronic device to implement the solutions of the embodiments of the present application.
The division into several modules or units mentioned in the above detailed description is not mandatory. Indeed, the features and functionality of two or more modules or units described above may be embodied in one module or unit, according to embodiments of the present disclosure. Conversely, the features and functions of one module or unit described above may be further divided into embodiments by a plurality of modules or units.
In summary, the view angle tracking apparatus for panoramic video provided by the embodiments of the application tracks the user's view angle when the user is in the effective identification area and determines and displays a target image area in the panoramic spherical coordinate system of the panoramic video, so that the displayed image follows the user's eye movement, giving the user an immersive feeling when watching panoramic video and effectively improving the user experience.
Referring now to fig. 9, fig. 9 illustrates a schematic diagram of a computer system suitable for use in implementing an electronic device or server according to embodiments of the present application.
as shown in fig. 9, the computer system includes a Central Processing Unit (CPU)901, which can perform various appropriate actions and processes in accordance with a program stored in a Read Only Memory (ROM)902 or a program loaded from a storage section 908 into a Random Access Memory (RAM) 903. In the RAM903, various programs and data necessary for operation instructions of the system are also stored. The CPU901, ROM902, and RAM903 are connected to each other via a bus 904. An input/output (I/O) interface 905 is also connected to bus 904.
The following components are connected to the I/O interface 905: an input section 906 including a keyboard, a mouse, and the like; an output section 907 including a Cathode Ray Tube (CRT) or Liquid Crystal Display (LCD), a speaker, and the like; a storage section 908 including a hard disk and the like; and a communication section 909 including a network interface card such as a LAN card or a modem. The communication section 909 performs communication processing via a network such as the Internet. A drive 910 is also connected to the I/O interface 905 as necessary. A removable medium 911, such as a magnetic disk, an optical disk, a magneto-optical disk, or a semiconductor memory, is mounted on the drive 910 as necessary, so that a computer program read therefrom is installed into the storage section 908 as needed.
In particular, according to embodiments of the present application, the process described above with reference to the flowchart of fig. 2 may be implemented as a computer software program. For example, embodiments of the present application include a computer program product comprising a computer program embodied on a computer-readable medium, the computer program comprising program code for performing the method illustrated by the flowchart. In such an embodiment, the computer program may be downloaded and installed from a network through the communication section 909, and/or installed from the removable medium 911. When the computer program is executed by the Central Processing Unit (CPU) 901, the above-described functions defined in the system of the present application are performed.
It should be noted that the computer-readable medium shown in the present application may be a computer-readable signal medium, a computer-readable storage medium, or any combination of the two. A computer-readable storage medium may be, for example, but is not limited to, an electronic, magnetic, optical, electromagnetic, infrared, or semiconductor system, apparatus, or device, or any combination of the foregoing. More specific examples of the computer-readable storage medium may include, but are not limited to: an electrical connection having one or more wires, a portable computer diskette, a hard disk, a Random Access Memory (RAM), a Read-Only Memory (ROM), an Erasable Programmable Read-Only Memory (EPROM or flash memory), an optical fiber, a portable Compact Disc Read-Only Memory (CD-ROM), an optical storage device, a magnetic storage device, or any suitable combination of the foregoing. In the present application, a computer-readable storage medium may be any tangible medium that can contain or store a program for use by or in connection with an instruction execution system, apparatus, or device. A computer-readable signal medium, by contrast, may include a propagated data signal with computer-readable program code embodied therein, for example, in baseband or as part of a carrier wave. Such a propagated data signal may take many forms, including, but not limited to, an electromagnetic signal, an optical signal, or any suitable combination thereof. A computer-readable signal medium may also be any computer-readable medium other than a computer-readable storage medium that can communicate, propagate, or transport a program for use by or in connection with an instruction execution system, apparatus, or device. Program code embodied on a computer-readable medium may be transmitted using any appropriate medium, including but not limited to: wireless, wire, optical fiber cable, RF, and the like, or any suitable combination of the foregoing.
The flowchart and block diagrams in the figures illustrate the architecture, functionality, and operation of possible implementations of systems, methods, and computer program products according to various embodiments of the present application. In this regard, each block in the flowchart or block diagrams may represent a module, segment, or portion of code, which comprises one or more executable instructions for implementing the specified logical function(s). It should also be noted that, in some alternative implementations, the functions noted in the blocks may occur out of the order noted in the figures. For example, two blocks shown in succession may, in fact, be executed substantially concurrently, or the blocks may sometimes be executed in the reverse order, depending upon the functionality involved. It will also be noted that each block of the block diagrams and/or flowchart illustration, and combinations of blocks therein, can be implemented by special-purpose hardware-based systems that perform the specified functions or acts, or by combinations of special-purpose hardware and computer instructions.
The units or modules described in the embodiments of the present application may be implemented by software or by hardware. The described units or modules may also be provided in a processor, which may be described as: a processor comprising a tracking module, a determination module, and a display module. The names of these units or modules do not, in some cases, limit the units or modules themselves; for example, the tracking module may also be described as "a module that performs view angle tracking on the user to determine the target view angle of the user when the user is in the effective identification area".
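For illustration only, the tracking/determination/display split described above could be sketched as three cooperating classes. This is a hypothetical sketch, not the patent's implementation: the class and method names, the stub angle returned by the tracker, and the degree-based region representation are all assumptions.

```python
from typing import Optional, Tuple

# A region on the panoramic sphere: (horizontal bounds, vertical bounds), in degrees.
Region = Tuple[Tuple[float, float], Tuple[float, float]]

class TrackingModule:
    """Tracks the user's view angle while the user is in the
    effective identification area (the angle here is a stub)."""
    def track(self, in_area: bool) -> Optional[Tuple[float, float]]:
        if not in_area:
            return None          # no tracking outside the area
        return (30.0, -10.0)     # placeholder (horizontal, vertical) angle

class DeterminationModule:
    """Maps a target view angle plus the view angle range to a
    region on the panoramic sphere, centred on the view angle."""
    def determine(self, angle: Tuple[float, float], view_range: float) -> Region:
        h, v = angle
        half = view_range / 2.0
        return ((h - half, h + half), (v - half, v + half))

class DisplayModule:
    """Stands in for the panoramic rendering of the region."""
    def display(self, region: Region) -> str:
        (h0, h1), (v0, v1) = region
        return f"azimuth {h0}..{h1}, elevation {v0}..{v1}"
```

A usage pass would chain the three: track the angle, determine the region, then display it.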
As another aspect, the present application also provides a computer-readable storage medium, which may be included in the electronic device described in the above embodiments, or which may exist separately without being assembled into the electronic device. The computer-readable storage medium stores one or more programs which, when executed by one or more processors, perform the view angle tracking method for panoramic video described herein.
The above description presents only preferred embodiments of the application and is illustrative of the principles of the technology employed. Those skilled in the art will appreciate that the scope of the disclosure is not limited to the particular combinations of features described above, but also encompasses other solutions formed by any combination of the above features or their equivalents without departing from the spirit of the disclosure, for example, solutions in which the above features are replaced with (but not limited to) features having similar functions disclosed in the present application.

Claims (10)

1. A view angle tracking method for panoramic video, comprising:
when a user is in an effective identification area, performing view angle tracking on the user to determine a target view angle of the user;
determining a target image area in a panoramic spherical coordinate system of the panoramic video based on the target view angle and a view angle range, wherein the view angle range is the angle formed between a view angle origin and the edge of the panoramic playing device, the view angle origin being the midpoint of the effective identification area; and
performing panoramic display of the target image area.
2. The view angle tracking method according to claim 1, wherein
the effective identification area lies within a preset distance in front of the panoramic playing device, the preset distance being a preset proportion of the distance between the panoramic playing device and an obstacle.
3. The view angle tracking method according to claim 1, further comprising:
detecting human eye motion in the face image using an optical flow method, and performing view angle tracking according to the result of the human eye motion detection.
4. The view angle tracking method according to claim 3, wherein the target view angle comprises a horizontal view angle and a vertical view angle;
the horizontal view angle is the angle between the user's gaze direction in the horizontal direction and the direction perpendicular to the panoramic playing device; and
the vertical view angle is the angle between the user's gaze direction in the vertical direction and the direction perpendicular to the panoramic playing device.
5. The view angle tracking method according to claim 1, wherein the effective identification area is a three-dimensional spatial area, and determining the target image area in the panoramic spherical coordinate system of the panoramic video based on the target view angle and the view angle range comprises:
constructing the panoramic spherical coordinate system of the panoramic video as the circumscribed sphere of the effective identification area; and
determining the target image area on the panoramic spherical surface based on the target view angle and the view angle range.
6. The method according to claim 5, further comprising:
taking, as the target image area, the coordinate area on the panoramic spherical coordinate system that is centered on the target view angle and lies within the view angle range.
7. The method according to claim 1, further comprising:
scaling the target image area according to the size of the panoramic playing device before performing the panoramic display.
8. A view angle tracking apparatus for panoramic video, comprising:
a tracking module configured to perform view angle tracking on a user to determine a target view angle of the user when the user is in an effective identification area;
a determination module configured to determine a target image area in a panoramic spherical coordinate system of the panoramic video based on the target view angle and a view angle range, wherein the view angle range is the angle formed between a view angle origin and the edge of the panoramic playing device, the view angle origin being the midpoint of the effective identification area; and
a display module configured to perform panoramic display of the target image area.
9. An electronic device comprising a memory, a processor, and a computer program stored on the memory and executable on the processor, wherein the processor, when executing the program, implements the view angle tracking method for panoramic video according to any one of claims 1 to 7.
10. A computer-readable storage medium on which a computer program is stored, wherein the program, when executed by a processor, implements the view angle tracking method for panoramic video according to any one of claims 1 to 7.
CN202210683281.XA 2022-06-16 2022-06-16 Viewing angle tracking method, device, equipment and medium for panoramic video Active CN115103114B (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN202210683281.XA CN115103114B (en) 2022-06-16 2022-06-16 Viewing angle tracking method, device, equipment and medium for panoramic video


Publications (2)

Publication Number Publication Date
CN115103114A true CN115103114A (en) 2022-09-23
CN115103114B CN115103114B (en) 2024-06-14

Family

ID=83291070

Family Applications (1)

Application Number Title Priority Date Filing Date
CN202210683281.XA Active CN115103114B (en) 2022-06-16 2022-06-16 Viewing angle tracking method, device, equipment and medium for panoramic video

Country Status (1)

Country Link
CN (1) CN115103114B (en)

Citations (20)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN103207664A (en) * 2012-01-16 2013-07-17 联想(北京)有限公司 Image processing method and equipment
CN104320688A (en) * 2014-10-15 2015-01-28 小米科技有限责任公司 Video play control method and device
CN105847379A (en) * 2016-04-14 2016-08-10 乐视控股(北京)有限公司 Tracking method and tracking apparatus for panoramic video moving direction
US20160301862A1 (en) * 2015-04-10 2016-10-13 Finwe Oy Method and system for tracking an interest of a user within a panoramic visual content
CN106331732A (en) * 2016-09-26 2017-01-11 北京疯景科技有限公司 Method for generating panoramic content, method for displaying panoramic content and corresponding devices
CN106534827A (en) * 2016-12-19 2017-03-22 暴风集团股份有限公司 Method and system for playing panoramic video based on user perspective
CN106598428A (en) * 2016-11-29 2017-04-26 宇龙计算机通信科技(深圳)有限公司 Method and system for playing panoramic video, and terminal equipment
CN107396077A (en) * 2017-08-23 2017-11-24 深圳看到科技有限公司 Virtual reality panoramic video stream projecting method and equipment
CN107888987A (en) * 2016-09-29 2018-04-06 华为技术有限公司 A kind of panoramic video player method and device
CN107945231A (en) * 2017-11-21 2018-04-20 江西服装学院 A kind of 3 D video playback method and device
DE102017009149A1 (en) * 2016-11-04 2018-05-09 Avago Technologies General Ip (Singapore) Pte. Ltd. Record and playback 360-degree object tracking videos
CN108650500A (en) * 2018-04-02 2018-10-12 北京奇艺世纪科技有限公司 A kind of panoramic video processing method and processing device
CN108648257A (en) * 2018-04-09 2018-10-12 腾讯科技(深圳)有限公司 Acquisition methods, device, storage medium and the electronic device of panorama
JP2019101563A (en) * 2017-11-29 2019-06-24 大日本印刷株式会社 Information processing apparatus, information processing system, information processing method, and program
CN110956583A (en) * 2018-09-26 2020-04-03 华为技术有限公司 Spherical image processing method and device and server
CN111163306A (en) * 2018-11-08 2020-05-15 华为技术有限公司 VR video processing method and related device
CN113242384A (en) * 2021-05-08 2021-08-10 聚好看科技股份有限公司 Panoramic video display method and display equipment
CN113870213A (en) * 2021-09-24 2021-12-31 深圳市火乐科技发展有限公司 Image display method, image display device, storage medium, and electronic apparatus
CN114299258A (en) * 2021-12-21 2022-04-08 山东大学 Real walking roaming system and method based on panoramic video
CN114500970A (en) * 2020-11-13 2022-05-13 聚好看科技股份有限公司 Panoramic video image processing and displaying method and device


Non-Patent Citations (1)

* Cited by examiner, † Cited by third party
Title
陈立栋; 徐玮; 包卫东; 张茂军; 熊志辉: "Design and Implementation of a Panoramic Image Browser", Journal of Chinese Computer Systems (小型微型计算机系统), no. 03, 15 March 2008 (2008-03-15) *

Also Published As

Publication number Publication date
CN115103114B (en) 2024-06-14

Similar Documents

Publication Publication Date Title
US10096157B2 (en) Generation of three-dimensional imagery from a two-dimensional image using a depth map
US9626790B1 (en) View-dependent textures for interactive geographic information system
US9460555B2 (en) System and method for three-dimensional visualization of geographical data
US9286718B2 (en) Method using 3D geometry data for virtual reality image presentation and control in 3D space
CN108594999B (en) Control method and device for panoramic image display system
CN112073748B (en) Panoramic video processing method and device and storage medium
US11776142B2 (en) Structuring visual data
JPH09139956A (en) Apparatus and method for analyzing and emphasizing electronic scene
CN113223130B (en) Path roaming method, terminal equipment and computer storage medium
US20190335166A1 (en) Deriving 3d volumetric level of interest data for 3d scenes from viewer consumption data
TWI786157B (en) Apparatus and method for generating a tiled three-dimensional image representation of a scene
WO2009093136A2 (en) Image capture and motion picture generation
US9025007B1 (en) Configuring stereo cameras
JP2019509526A (en) Optimal spherical image acquisition method using multiple cameras
CN113286138A (en) Panoramic video display method and display equipment
CN114926612A (en) Aerial panoramic image processing and immersive display system
US11831853B2 (en) Information processing apparatus, information processing method, and storage medium
KR101588935B1 (en) A method using 3d geometry data for virtual reality image presentation and control in 3d space
US20220165015A1 (en) Image signal representing a scene
JP2022522504A (en) Image depth map processing
CN115103114B (en) Viewing angle tracking method, device, equipment and medium for panoramic video
CN110910482B (en) Method, system and readable storage medium for video data organization and scheduling
CN117197319B (en) Image generation method, device, electronic equipment and storage medium
JP7447403B2 (en) Information processing device, information processing system, information processing method and program
CN113452954B (en) Behavior analysis method, apparatus, device and medium

Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination
GR01 Patent grant