CN113223130B - Path roaming method, terminal equipment and computer storage medium - Google Patents


Info

Publication number
CN113223130B
Authority
CN
China
Prior art keywords: camera, path, video stream, roaming, parameters
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Active
Application number
CN202110287704.1A
Other languages: Chinese (zh)
Other versions: CN113223130A
Inventors: 梁红霞 (Liang Hongxia), 李乾坤 (Li Qiankun), 卢维 (Lu Wei)
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Zhejiang Dahua Technology Co Ltd
Original Assignee
Zhejiang Dahua Technology Co Ltd
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Zhejiang Dahua Technology Co Ltd
Priority to CN202110287704.1A
Publication of CN113223130A
Application granted
Publication of CN113223130B
Legal status: Active


Classifications

    • G06T 15/00: 3D [Three Dimensional] image rendering
    • G06T 19/20: Editing of 3D images, e.g. changing shapes or colours, aligning objects or positioning parts
    • G06T 7/70: Determining position or orientation of objects or cameras
    • G06T 2207/10021: Stereoscopic video; stereoscopic image sequence
    • Y02D 30/70: Reducing energy consumption in wireless communication networks

Landscapes

  • Engineering & Computer Science (AREA)
  • Physics & Mathematics (AREA)
  • General Physics & Mathematics (AREA)
  • Theoretical Computer Science (AREA)
  • Computer Graphics (AREA)
  • Computer Hardware Design (AREA)
  • Architecture (AREA)
  • General Engineering & Computer Science (AREA)
  • Software Systems (AREA)
  • Computer Vision & Pattern Recognition (AREA)
  • Closed-Circuit Television Systems (AREA)
  • Two-Way Televisions, Distribution Of Moving Picture Or The Like (AREA)
  • Studio Devices (AREA)

Abstract

The application discloses a path roaming method, a terminal device, and a computer storage medium. The path roaming method comprises the following steps: acquiring an initial path for path roaming; moving the camera along the initial path and acquiring, according to a preset rule, camera position parameters and camera viewpoint parameters at a plurality of target viewing angles during the movement; obtaining the optimal video stream corresponding to each target viewing angle by using the camera viewpoint parameters; and calculating, by using the camera position parameters and the camera viewpoint parameters, a tour path for loading the optimal video stream into the three-dimensional scene where the corresponding target viewing angle is located. In this way, video streams are loaded into the three-dimensional scene, dynamic simulation is achieved, and the user's observation efficiency is improved.

Description

Path roaming method, terminal equipment and computer storage medium
Technical Field
The present invention relates to the field of video monitoring technologies, and in particular, to a path roaming method, a terminal device, and a computer storage medium.
Background
Against the background of the national digital-twin initiative, building a three-dimensional scene and overlaying all kinds of monitoring data for viewing within that scene, so as to solve in-scene inspection problems in power, transportation, and campus settings, has become a popular research topic.
After geographic information data is presented in a three-dimensional scene, a roaming mechanism needs to be established to enable seamless, accurate browsing of the scene. Roaming is a browsing mode for three-dimensional scenes, realized by changing the position and viewpoint of the camera. There are two main kinds of roaming in three-dimensional scenes. One is interactive roaming, in which the user controls the roaming position and viewpoint at will by operating a mouse, keyboard, or other interactive device. The other has the user roam along a predefined track: the track is a curve in three-dimensional space, and the coordinates of points between the control points on the curve are calculated by interpolation. This is path roaming, also called automatic roaming.
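As a hedged illustration of the interpolation just described, the sketch below interpolates a camera position between control points of a track. The control points, the parameterization, and the use of linear rather than spline interpolation are all illustrative assumptions, not details from the patent:

```python
def interpolate_track(control_points, t):
    """Interpolate a 3D point at parameter t in [0, 1] along a polyline
    of control points (a linear stand-in for the curve interpolation
    used by a real roaming track)."""
    n = len(control_points) - 1
    if t >= 1.0:
        return control_points[-1]
    s = t * n                # arc parameter measured in segments
    i = int(s)               # index of the current segment
    f = s - i                # fraction of the way through that segment
    (x0, y0, z0), (x1, y1, z1) = control_points[i], control_points[i + 1]
    return (x0 + f * (x1 - x0), y0 + f * (y1 - y0), z0 + f * (z1 - z0))

track = [(0.0, 0.0, 0.0), (10.0, 0.0, 0.0), (10.0, 10.0, 0.0)]
print(interpolate_track(track, 0.25))  # → (5.0, 0.0, 0.0)
```

Swapping the linear blend for a cubic spline over the same control points would yield the smooth curve typical of automatic roaming.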
However, current technology simply plans a path from a start position to an end position so that the camera flies through the scene. No real-time monitoring video is fused in, so the realism is poor and the integration is weak.
Disclosure of Invention
The application provides a path roaming method, terminal equipment and a computer storage medium.
In order to solve the above technical problem, one technical solution adopted by the present application is to provide a path roaming method, comprising:
acquiring an initial path of path roaming;
moving the camera according to the initial path, and acquiring camera position parameters and camera viewpoint parameters under a plurality of target visual angles in the moving process according to a preset rule;
obtaining an optimal video stream corresponding to a target viewing angle by utilizing the camera viewpoint parameters;
and calculating, by using the camera position parameters and the camera viewpoint parameters, a tour path for loading the optimal video stream into the three-dimensional scene where the corresponding target viewing angle is located.
Wherein the acquiring, according to a preset rule, camera position parameters and camera viewpoint parameters at a plurality of target viewing angles during the movement comprises:
setting a plurality of roaming path points on the initial path according to the preset rule;
and during the movement, acquiring the camera position parameter and the camera viewpoint parameter when the camera moves to each roaming path point.
Wherein the acquiring, according to a preset rule, camera position parameters and camera viewpoint parameters at a plurality of target viewing angles during the movement alternatively comprises:
setting a plurality of roaming time points according to the preset rule;
and during the movement, when the roaming time of the camera reaches a roaming time point, acquiring the camera position parameter and the camera viewpoint parameter at the target viewing angle at that moment.
Wherein the obtaining the optimal video stream corresponding to the target viewing angle by using the camera viewpoint parameters comprises:
acquiring video streams collected by a plurality of monitoring devices at the corresponding target viewing angle;
acquiring position parameters of the plurality of monitoring devices;
judging, by using the position parameters of the monitoring devices and the camera viewpoint parameters, whether there is occlusion between each monitoring device's position and the camera viewpoint;
and if there is no occlusion, taking the video stream collected by the unoccluded monitoring device as the optimal video stream.
Wherein the taking the video stream collected by the unoccluded monitoring device as the optimal video stream comprises:
acquiring the target in the video streams collected by the unoccluded monitoring devices;
calculating the number of imaging pixels of the target in each video by using the size of the target and the camera device parameters;
and taking the video stream with the largest number of imaging pixels as the optimal video stream.
Wherein the camera device parameters include a camera focal length and a camera working distance;
and the taking the video stream with the largest number of imaging pixels as the optimal video stream comprises:
taking the video stream with the largest ratio of camera focal length to camera working distance as the optimal video stream.
Wherein the loading the optimal video stream into the three-dimensional scene where the corresponding target viewing angle is located comprises:
creating a presentation plane at the target viewing angle;
and updating the texture map on the presentation plane according to the video frame sequence of the pictures in the optimal video stream.
Wherein the calculating, by using the camera position parameters and the camera viewpoint parameters, a tour path for loading the optimal video stream into the three-dimensional scene where the corresponding target viewing angle is located comprises:
calculating, by using the camera position parameters and the camera viewpoint parameters, the path and the motion speed of the pictures in the optimal video stream from the camera position to the camera viewpoint;
and setting the transition animation of the pictures in the optimal video stream to an ease-in/ease-out form.
In order to solve the above technical problem, another technical solution adopted by the present application is to provide a terminal device comprising a processor and a memory; the memory stores a computer program, and the processor is configured to execute the computer program to implement the steps of the path roaming method described above.
In order to solve the above technical problem, another technical solution adopted by the present application is to provide a computer storage medium storing a computer program which, when executed, implements the steps of the path roaming method described above.
Compared with the prior art, the beneficial effect of the present application is as follows: the terminal device acquires an initial path for path roaming; moves the camera along the initial path and acquires, according to a preset rule, camera position parameters and camera viewpoint parameters at a plurality of target viewing angles during the movement; obtains the optimal video stream corresponding to each target viewing angle by using the camera viewpoint parameters; and calculates, by using the camera position parameters and the camera viewpoint parameters, a tour path for loading the optimal video stream into the three-dimensional scene where the corresponding target viewing angle is located. In this way, video streams are loaded into the three-dimensional scene, dynamic simulation is achieved, and the user's observation efficiency is improved.
Drawings
In order to more clearly illustrate the technical solutions of the embodiments of the present invention, the drawings required for the description of the embodiments will be briefly described below, and it is apparent that the drawings in the following description are only some embodiments of the present invention, and other drawings may be obtained according to these drawings without inventive effort to those of ordinary skill in the art.
FIG. 1 is a flow chart of an embodiment of a path roaming method provided in the present application;
FIG. 2 is a schematic diagram of a specific flow of S103 in the path roaming method shown in FIG. 1;
FIG. 3 is a schematic diagram of one embodiment of a no-occlusion case provided herein;
FIG. 4 is a schematic diagram of the relationship between the number of pixels and the target size provided in the present application;
fig. 5 is a schematic structural diagram of an embodiment of a terminal device provided in the present application;
fig. 6 is a schematic structural diagram of another embodiment of a terminal device provided in the present application;
fig. 7 is a schematic structural diagram of an embodiment of a computer storage medium provided in the present application.
Detailed Description
The following description of the embodiments of the present invention will be made clearly and fully with reference to the accompanying drawings, in which it is evident that the embodiments described are only some, but not all embodiments of the invention. All other embodiments, which can be made by those skilled in the art based on the embodiments of the invention without making any inventive effort, are intended to be within the scope of the invention.
In order to solve the problems caused by the prior art's failure to integrate real-time monitoring video, the present application provides a path roaming method. Referring to fig. 1, fig. 1 is a flow chart illustrating an embodiment of the path roaming method provided in the present application. In three-dimensional augmented reality, real two-dimensional monitoring video is often combined with virtual objects (such as three-dimensional models) to achieve an augmented-reality effect. Therefore, compared with ordinary path roaming in a three-dimensional scene, the path roaming method of the embodiment of the present application uses a real-time video stream in place of the observation point during roaming, so that the actual situation is reflected to the user clearly and intuitively.
The path roaming method is applied to terminal equipment, wherein the terminal equipment can be a server, mobile equipment or a system formed by mutually matching the server and the mobile equipment. Accordingly, each part, such as each unit, sub-unit, module, and sub-module, included in the terminal device may be all disposed in the server, may be all disposed in the mobile device, or may be disposed in the server and the mobile device, respectively.
Further, the server may be hardware or software. When the server is hardware, the server may be implemented as a distributed server cluster formed by a plurality of servers, or may be implemented as a single server. When the server is software, it may be implemented as a plurality of software or software modules, for example, software or software modules for providing a distributed server, or may be implemented as a single software or software module, which is not specifically limited herein.
Specifically, in the embodiment of the present application, the terminal device may be a monitoring device with computing capability, or other image acquisition terminals with computing capability, which will not be described herein.
As shown in fig. 1, the path roaming method of the present embodiment specifically includes the following steps:
s101: an initial path for path roaming is obtained.
The terminal device in the embodiment of the present application may obtain the initial path of the path roaming through an existing path-planning method for path roaming. Specifically, the terminal device receives a starting viewpoint and an ending viewpoint input by a user instruction, wherein the starting viewpoint comprises the position and pose of the camera at the start point, and the ending viewpoint comprises the position and pose of the camera at the end point.
The user instruction needs to include the focal position of the camera at the start point and the focal position at the end point. In addition, the user may set the time required for the path roaming, or use the time recommended by the system; for example, the flight speed of the camera is preset, and the required flight time is obtained from the straight-line distance between the start point and the end point. The user may determine the focal positions of the camera at the start point and the end point either by picking points directly in the three-dimensional scene or by entering the coordinate values of the start-point focus and the end-point focus. Here, the focus is the point at which the camera's line of sight intersects the three-dimensional scene closest to the camera, and the viewpoint is a description of the position and pose of the camera.
It will be appreciated that, to obtain the starting and ending viewpoints of the camera, at least one of the camera's position or pose at the start point and the end point is required in addition to the focal positions there, so that the starting and ending viewpoints can be obtained by forward or backward viewpoint calculation, and the initial path of the camera's path roaming can then be computed.
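The system-recommended roaming time described above (straight-line distance divided by a preset flight speed) can be sketched as follows; the function name, units, and example numbers are illustrative assumptions, not details from the patent:

```python
import math

def recommended_roaming_time(start, end, flight_speed):
    """Recommended path-roaming duration: straight-line distance between
    the start and end points divided by the preset flight speed."""
    distance = math.dist(start, end)  # Euclidean distance in scene units
    return distance / flight_speed

# e.g. points 100 scene units apart at 20 units/s -> a 5 s roaming time
print(recommended_roaming_time((0, 0, 0), (100, 0, 0), 20.0))  # → 5.0
```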
In the embodiment of the application, the three-dimensional scene can be a plane three-dimensional scene, a spherical three-dimensional scene or an ellipsoidal three-dimensional scene.
S102: and moving the camera according to the initial path, and acquiring camera position parameters and camera viewpoint parameters under a plurality of target visual angles in the moving process according to a preset rule.
The terminal device traverses the three-dimensional scene by moving the camera along the initial path of the path roaming, and sequentially records and stores the camera position parameter P_viewpoint(x1, y1, z1) and the camera viewpoint parameter Q_viewpoint(x1, y1, z1) at each of the plurality of target viewing angles.
The target viewing angles may be established either manually, with the user selecting them on the initial path of the path roaming, or automatically according to a preset rule. Specifically, the terminal device may set a plurality of roaming path points on the initial path according to a user instruction; for example, the user instruction may specify the distance between roaming path points, and the terminal device selects roaming path points at that spacing, starting from the start of the initial path. After the roaming path points are determined, the terminal device automatically acquires the camera position parameter and the camera viewpoint parameter whenever the camera reaches a roaming path point while traversing the three-dimensional scene.
In other embodiments, the user instruction may instead specify a plurality of roaming time points: the camera moves along the initial path at a preset roaming speed, and whenever the roaming time reaches a specified roaming time point, the terminal device automatically acquires the camera position parameter and the camera viewpoint parameter at the target viewing angle of that moment.
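Both preset rules, the spacing rule of the previous paragraph and the time rule of this one, can be sketched as simple sampling routines; the function names and example values are illustrative assumptions:

```python
def waypoints_by_spacing(path_length, spacing):
    """Distances along the initial path at which roaming path points are
    placed, starting from the path start (the spacing-based rule)."""
    d, points = 0.0, []
    while d <= path_length:
        points.append(d)
        d += spacing
    return points

def waypoints_by_time(total_time, interval):
    """Roaming time points at a fixed interval (the time-based rule)."""
    return [i * interval for i in range(int(total_time / interval) + 1)]

print(waypoints_by_spacing(100.0, 25.0))  # → [0.0, 25.0, 50.0, 75.0, 100.0]
print(waypoints_by_time(10.0, 2.5))       # → [0.0, 2.5, 5.0, 7.5, 10.0]
```

At each sampled point or instant, the terminal device would record the corresponding P_viewpoint and Q_viewpoint.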
S103: obtaining the optimal video stream corresponding to the target viewing angle by using the camera viewpoint parameters.
The terminal device calculates the optimal video stream for observing the camera viewpoint according to the camera viewpoint parameters. The optimal video stream may be established manually, with the user selecting and clipping it from the monitoring video collected by the monitoring devices, or automatically, by screening the monitoring video segments collected by all monitoring devices according to a preset rule.
Specifically, the terminal device may calculate the optimal video stream for observing the camera viewpoint through the method shown in fig. 2, and fig. 2 is a specific flowchart of S103 in the path roaming method shown in fig. 1.
As shown in fig. 2, S103 in the above path roaming method may specifically include the following substeps:
s301: and acquiring the position parameters of a plurality of monitoring devices.
The terminal device can judge whether a monitoring device is occluded while observing the target. First, the terminal device needs to acquire the position parameters P_camera of all monitoring devices that monitor the observed target under the camera viewpoint.
S302: judging whether there is occlusion between each monitoring device's position and the camera viewpoint by using the position parameters of the monitoring devices and the camera viewpoint parameters.
The embodiment of the present application judges occlusion as follows: taking the position P_camera of each monitoring device in the three-dimensional scene as the endpoint, a ray is emitted along the direction of the line connecting the two points P_camera and Q_viewpoint. If the intersection of the ray corresponding to a monitoring device with the actual space of the three-dimensional scene is Q_viewpoint, there is no occlusion between the monitoring device and the target; as shown in FIG. 3, the Q_viewpoint plane is the plane of the target, and the two points P_camera and Q_viewpoint can be connected directly by a straight line. If the intersection of the ray with the actual space of the three-dimensional scene is not Q_viewpoint, the monitoring device is occluded from the target. When it is determined that a monitoring device is not occluded from the target, step S303 is entered:
S303: acquiring the video stream collected by the unoccluded monitoring device.
An unoccluded monitoring device can monitor and record both the global features and the detailed features of the target well, so the terminal device can take the video stream collected by a monitoring device that is not occluded from the target as the optimal video stream. If only one monitoring device is judged unoccluded, the terminal device can directly take the video stream it collects as the optimal video stream; if several monitoring devices are judged unoccluded, the terminal device can further determine the optimal video stream by comparing the number of imaging pixels of the target in each monitoring video.
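The occlusion judgment of S302 can be sketched as follows. This is a deliberate simplification: the scene's geometry is approximated here by bounding spheres, whereas the patent intersects the ray with the actual three-dimensional scene; all names and coordinates are illustrative:

```python
import math

def is_occluded(p_camera, q_viewpoint, occluders):
    """Return True if the segment from the monitoring-device position
    p_camera to the camera viewpoint q_viewpoint hits an occluder.
    Each occluder is approximated as a (center, radius) sphere."""
    px, py, pz = p_camera
    qx, qy, qz = q_viewpoint
    dx, dy, dz = qx - px, qy - py, qz - pz
    seg_len2 = dx * dx + dy * dy + dz * dz
    for (cx, cy, cz), r in occluders:
        # parameter of the point on the segment closest to the sphere center
        t = ((cx - px) * dx + (cy - py) * dy + (cz - pz) * dz) / seg_len2
        t = max(0.0, min(1.0, t))
        closest = (px + t * dx, py + t * dy, pz + t * dz)
        if math.dist(closest, (cx, cy, cz)) < r:
            return True  # the ray is blocked before reaching Q_viewpoint
    return False

# an obstacle sits directly between the device and the viewpoint:
print(is_occluded((0, 0, 0), (10, 0, 0), [((5, 0, 0), 1.0)]))  # → True
print(is_occluded((0, 5, 0), (10, 5, 0), [((5, 0, 0), 1.0)]))  # → False
```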
S304: acquiring the target in the video streams collected by the unoccluded monitoring devices.
The terminal device acquires the region where the target is located in each of the video streams collected by the monitoring devices judged to be unoccluded, and detects the imaging size of the target in the monitoring video by a target detection algorithm.
S305: the size of the target and camera device parameters are used to calculate the number of pixels imaged by the target in the video.
The terminal device can then calculate the number of imaging pixels of the target in the monitoring video from the imaging size of the target and the camera device parameters. Specifically, referring to fig. 4, fig. 4 is a schematic diagram of the relationship between the pixel count and the target size provided in the present application. As shown in fig. 4, the camera device parameters include the camera focal length and the camera working distance, and the number of imaging pixels of the target in the monitoring video is calculated as:

imaging pixel count = target size × (f / WD)

where f is the camera focal length, representing the distance between the imaging plane and the lens plane, and WD is the camera working distance, representing the distance between the target plane and the lens plane.
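The formula above, and the subsequent selection of the stream with the most target pixels, can be sketched as follows. The numeric values are hypothetical, and the patent's formula leaves units abstract: converting the image-side size it yields into a true pixel count would additionally require the sensor's pixel pitch:

```python
def imaging_pixel_count(target_size, f, wd):
    """Imaging extent of the target per the formula above:
    imaging pixel count = target size x (f / WD)."""
    return target_size * (f / wd)

def best_stream(streams):
    """Among unoccluded streams given as (name, focal_length,
    working_distance), pick the one with the largest f/WD ratio,
    i.e. the most target pixels for a fixed target size."""
    return max(streams, key=lambda s: s[1] / s[2])[0]

# hypothetical: 2000 mm target, 50 mm focal length, 10000 mm working distance
print(imaging_pixel_count(2000.0, 50.0, 10000.0))  # → 10.0
print(best_stream([("cam_a", 50.0, 10000.0), ("cam_b", 35.0, 4000.0)]))  # → cam_b
```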
S306: taking the video stream with the largest number of imaging pixels as the optimal video stream.
The terminal device compares the numbers of imaging pixels in the video streams collected by the monitoring devices judged to be unoccluded, and takes the video stream with the largest number of imaging pixels as the optimal video stream. The largest imaging pixel count means that the target occupies the largest proportion of the monitoring picture and that its features are rendered most completely, so the imaging pixel count serves as a good criterion for judging the optimal video stream.
In addition, in other embodiments, when the size of the target is fixed, the larger the value of f/WD, the more pixels the target occupies in the monitoring video of the device and the clearer the observed target. The terminal device can therefore directly select, from all unoccluded video streams, the one with the largest ratio of camera focal length to camera working distance as the optimal video stream, omitting the calculation of the imaging pixel count.
It should be noted that, in the embodiment of the present application, selecting the optimal video stream for observing the target takes two steps: first, the unoccluded video streams are selected; then, among all unoccluded video streams, the one with the largest ratio of focal length to working distance is selected as the optimal video stream. In summary, the determination of the optimal video stream rests on two criteria: first, no occlusion while observing the target; second, the largest number of target pixels in the video. In other embodiments, either criterion may be used alone, or their order may be adjusted, which will not be described herein.
S104: calculating, by using the camera position parameters and the camera viewpoint parameters, a tour path for loading the optimal video stream into the three-dimensional scene where the corresponding target viewing angle is located.
After determining the optimal video stream, the terminal device creates a rectangular plane at the camera viewpoint Q_viewpoint to display the optimal video stream for observing the camera viewpoint. Specifically, the terminal device may use the optimal video stream as a texture map, that is, extract frame-by-frame pictures from the monitoring video as texture maps for the three-dimensional model, and then continuously update the texture map according to the frame sequence of the monitoring video so as to reflect the real-time monitoring picture.
First, the terminal device can crop the optimal video stream so that its length and width are powers of two (2^n). Then, the terminal device maps the texture map of the transformed two-dimensional video stream into three-dimensional space. Suppose a point on the video texture has coordinates T(u, v), with value ranges 0 ≤ u ≤ 1 and 0 ≤ v ≤ 1, and is mapped onto a planar rectangle at z = 0 whose edges are parallel to the coordinate axes. The mapped coordinates are T'(x', y', z'), with value ranges x1 ≤ x' < x2, y1 ≤ y' < y2, and z' = 0. The mapping relation can be expressed as:

x' = x1 + u·(x2 - x1),  y' = y1 + v·(y2 - y1),  z' = 0
in the above manner, the terminal device maps the texture mapped to the plane rectangle of z=0 to the viewpoint Q of the camera viewpoint On a rectangular plane created by the camera, the view is taken from the viewpoint Q of the camera viewpoint The intersection point (x ', y ', z ') of the reflected ray at the created rectangular planar visible point (x, y, z) with the planar rectangular surface at z=0 serves as the mapping point for point (x, y, z).
In the embodiment of the present application, the monitoring video collected by the front-end camera is mapped into the three-dimensional scene by the video texture-mapping technique, and the camera can be viewed and invoked within the three-dimensional scene, which greatly improves interactivity.
Finally, the terminal device also needs to calculate the tour path for loading the optimal video stream into the three-dimensional scene where the corresponding target viewing angle is located. Specifically, the tour path is calculated so that the camera transitions smoothly through each pair of camera position parameter P_viewpoint(x1, y1, z1) and camera viewpoint parameter Q_viewpoint(x1, y1, z1). The specific implementation uses a cubic interpolation function (t^3): the first half of the tour path can be set to accelerate from 0, and the second half to decelerate to 0, so the animation takes an ease-in/ease-out form.
The pseudo code for calculating the tour path of the optimal video stream is specifically as follows:

function easeInOutCubic(t, b, c, d) {
    // first half of the duration: accelerate from rest (cubic ease-in)
    if ((t /= d / 2) < 1) return c / 2 * t * t * t + b;
    // second half of the duration: decelerate to rest (cubic ease-out)
    return c / 2 * ((t -= 2) * t * t + 2) + b;
}
where t represents the current time (the patent normalizes t to the interval [0, 1]; in general 0 ≤ t ≤ d), b represents the initial value, c represents the total change in value, and d represents the duration.
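A direct transliteration of the pseudo code, assuming t runs over [0, d], lets its ease-in/ease-out behaviour be checked numerically: it starts at b, is exactly halfway through the change c at t = d/2, and ends at b + c:

```python
def ease_in_out_cubic(t, b, c, d):
    """Cubic ease-in/ease-out: accelerate from rest over the first half
    of the duration d, decelerate to rest over the second half, moving
    from the initial value b to b + c."""
    t = t / (d / 2)
    if t < 1:
        return c / 2 * t ** 3 + b
    t -= 2
    return c / 2 * (t ** 3 + 2) + b

print(ease_in_out_cubic(0.0, 0.0, 100.0, 1.0))  # → 0.0
print(ease_in_out_cubic(0.5, 0.0, 100.0, 1.0))  # → 50.0
print(ease_in_out_cubic(1.0, 0.0, 100.0, 1.0))  # → 100.0
```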
In the embodiment of the present application, the terminal device acquires an initial path for path roaming; moves the camera along the initial path and acquires, according to a preset rule, camera position parameters and camera viewpoint parameters at a plurality of target viewing angles during the movement; obtains the optimal video stream corresponding to each target viewing angle by using the camera viewpoint parameters; and calculates, by using the camera position parameters and the camera viewpoint parameters, a tour path for loading the optimal video stream into the three-dimensional scene where the corresponding target viewing angle is located. In this way, video monitoring is merged into the three-dimensional scene, and its real-time nature makes up for the lack of dynamic realism in the three-dimensional simulation scene, truly achieving dynamic simulation and improving observation efficiency. In addition, the method maximizes camera utilization: by setting tour points and loading video streams to form the path roaming, it uses the cameras to the fullest extent and provides more comprehensive spatio-temporal information. Moreover, the path roaming method has a wide range of application: a three-dimensional scene fused with monitoring video forms path roaming that fits reality and connects closely with industries such as power, transportation, and campuses, enabling broad application.
It will be appreciated by those skilled in the art that, in the methods of the specific embodiments described above, the order in which the steps are written does not imply a strict order of execution; the actual execution order should be determined by the function of each step and any inherent logic.
In order to implement the path roaming method of the foregoing embodiment, the present application further provides a terminal device, and specifically please refer to fig. 5, fig. 5 is a schematic structural diagram of an embodiment of the terminal device provided in the present application.
As shown in fig. 5, the terminal device 400 of the present embodiment includes a path acquisition module 41, a parameter calibration module 42, a video acquisition module 43, and a path calculation module 44. Wherein:
the path acquisition module 41 is configured to acquire an initial path of the path roaming.
And the parameter calibration module 42 is configured to move the camera according to the initial path, and acquire camera position parameters and camera viewpoint parameters at a plurality of target viewing angles in the moving process according to a preset rule.
The video obtaining module 43 is configured to obtain an optimal video stream corresponding to the target viewing angle by using the camera viewpoint parameter.
And a path calculation module 44, configured to calculate a tour path for loading the optimal video stream into the three-dimensional scene where the corresponding target viewing angle is located by using the camera position parameter and the camera viewpoint parameter.
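A minimal sketch of how the four modules might be wired together is given below. All class, method, and parameter names here are the editor's illustrative assumptions; only the data flow between the four modules is taken from the description above.

```javascript
// Illustrative sketch of the terminal device 400 pipeline from fig. 5.
// The four callbacks stand in for modules 41-44; their internals are
// placeholders supplied by the caller, not the patent's implementation.
class PathRoamingDevice {
  constructor({ acquirePath, calibrate, selectStream, computeTour }) {
    this.acquirePath = acquirePath;   // path acquisition module 41
    this.calibrate = calibrate;       // parameter calibration module 42
    this.selectStream = selectStream; // video acquisition module 43
    this.computeTour = computeTour;   // path calculation module 44
  }

  roam() {
    // 41: acquire the initial path of the path roaming.
    const initialPath = this.acquirePath();
    // 42: camera position/viewpoint parameters at each target viewing angle.
    const views = this.calibrate(initialPath);
    // 43 + 44: pick the optimal video stream per view, then compute the
    // tour path that loads it into the three-dimensional scene.
    return views.map((view) => {
      const bestStream = this.selectStream(view.viewpoint);
      return this.computeTour(view.position, view.viewpoint, bestStream);
    });
  }
}
```

The design keeps each module independently replaceable, which mirrors the modular structure of fig. 5: for example, the stream-selection strategy (module 43) can change without touching path acquisition or tour calculation.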
In order to implement the path roaming method of the foregoing embodiment, the present application further provides another terminal device, specifically referring to fig. 6, fig. 6 is a schematic structural diagram of another embodiment of the terminal device provided in the present application.
As shown in fig. 6, the terminal device 500 of the present embodiment includes a processor 51, a memory 52, an input-output device 53, and a bus 54.
The processor 51, the memory 52 and the input/output device 53 are respectively connected to the bus 54, and the memory 52 stores a computer program, and the processor 51 is configured to execute the computer program to implement the path roaming method of the above embodiment.
In the present embodiment, the processor 51 may also be referred to as a CPU (Central Processing Unit). The processor 51 may be an integrated circuit chip with signal processing capabilities. The processor 51 may also be a general-purpose processor, a digital signal processor (DSP), an application-specific integrated circuit (ASIC), a field-programmable gate array (FPGA) or other programmable logic device, a discrete gate or transistor logic device, or discrete hardware components. The processor 51 may also be a GPU (Graphics Processing Unit), also called a display core, vision processor, or display chip: a microprocessor dedicated to image computation on personal computers, workstations, game consoles, and some mobile devices (such as tablet computers and smart phones). The GPU converts and drives the display information required by the computer system, provides line-scanning signals to the display, and controls the display's correct output; it is an important element connecting the display to the motherboard and one of the key devices for human-machine interaction. The graphics card is an important component of the host computer, taking on the task of outputting and displaying graphics, which is especially important for professional graphic design work. The general-purpose processor may be a microprocessor, or the processor 51 may be any conventional processor or the like.
The present application also provides a computer storage medium, as shown in fig. 7, where the computer storage medium 600 is configured to store a computer program 61, and the computer program 61, when executed by a processor, is configured to implement a method as described in an embodiment of a path roaming method of the present application.
The method referred to in the path roaming method embodiments of the present application, when implemented in the form of a software functional unit and sold or used as a stand-alone product, may be stored in a device such as a computer-readable storage medium. Based on such an understanding, the technical solution of the present application, or the part of it that contributes to the prior art, may be embodied in whole or in part in the form of a software product stored in a storage medium, including several instructions for causing a computer device (which may be a personal computer, a server, a network device, etc.) or a processor to perform all or part of the steps of the methods according to the embodiments of the present invention. The aforementioned storage medium includes various media capable of storing program code, such as a USB flash drive, a removable hard disk, a read-only memory (ROM), a random access memory (RAM), a magnetic disk, or an optical disk.
The foregoing description is only of embodiments of the present invention, and is not intended to limit the scope of the invention, and all equivalent structures or equivalent processes using the descriptions and the drawings of the present invention or directly or indirectly applied to other related technical fields are included in the scope of the present invention.

Claims (8)

1. A path roaming method, the path roaming method comprising:
acquiring an initial path of path roaming;
moving the camera according to the initial path, and acquiring camera position parameters and camera viewpoint parameters under a plurality of target visual angles in the moving process according to a preset rule;
obtaining an optimal video stream corresponding to a target viewing angle by utilizing the camera viewpoint parameters;
calculating a patrol path for loading the optimal video stream into a three-dimensional scene where a corresponding target visual angle is located by using the camera position parameter and the camera viewpoint parameter;
the obtaining the optimal video stream corresponding to the target viewing angle by using the camera viewpoint parameter comprises the following steps:
acquiring video streams acquired by a plurality of monitoring devices at corresponding target visual angles;
acquiring position parameters of the plurality of monitoring devices;
judging whether shielding exists between the positions of the monitoring devices and the camera view point or not by utilizing the position parameters of the monitoring devices and the camera view point parameters;
if no shielding exists, judging whether the monitoring equipment without shielding exists or not;
if yes, taking the video stream acquired by the non-shielding monitoring equipment as the optimal video stream;
if not, acquiring targets in the video stream acquired by all the non-shielding monitoring equipment; calculating the imaging pixel number of the target in the video by using the size of the target and camera equipment parameters; taking the video stream with the largest imaging pixel number as the optimal video stream;
the step of judging whether shielding exists between the positions of the plurality of monitoring devices and the camera viewpoint by using the position parameters of the plurality of monitoring devices and the camera viewpoint parameters comprises the following steps:
setting an endpoint by using the position parameter of the monitoring equipment, and setting a camera viewpoint by using the camera viewpoint parameter;
generating rays according to the connecting line direction of the end points and the camera view points;
judging whether the intersection point of the ray and the actual space is the camera viewpoint;
if yes, determining that no shielding exists between the position of the monitoring equipment and the camera viewpoint.
2. The method of path roaming according to claim 1, wherein,
the obtaining the camera position parameters and the camera viewpoint parameters under a plurality of target viewing angles in the moving process according to a preset rule comprises the following steps:
setting a plurality of roaming path points on the initial path according to the preset rule;
and in the moving process, acquiring a camera position parameter and a camera viewpoint parameter when the camera moves to each roaming path point.
3. The method of path roaming according to claim 2, wherein,
the obtaining the camera position parameters and the camera viewpoint parameters under a plurality of target viewing angles in the moving process according to a preset rule comprises the following steps:
setting a plurality of roaming time points according to the preset rule;
and in the moving process, when the roaming time of the camera reaches the roaming time point, acquiring the camera position parameter and the camera viewpoint parameter under the target view angle at the moment.
4. The method of path roaming according to claim 1, wherein,
the camera equipment parameters comprise a camera focal length and a camera working distance;
the step of using the video stream with the largest imaging pixel number as the optimal video stream comprises the following steps:
and taking the video stream with the largest ratio of the focal length of the camera to the working distance of the camera as the optimal video stream.
5. The method of path roaming according to claim 1, wherein,
the loading the optimal video stream to the three-dimensional scene where the corresponding target view angle is located includes:
creating a presentation plane at the target viewing angle;
and updating the texture mapping on the display plane according to the video frame sequence of the pictures in the optimal video stream.
6. The method of path roaming according to claim 5, wherein,
the calculating, by using the camera position parameter and the camera viewpoint parameter, a tour path for loading the optimal video stream into a three-dimensional scene where a corresponding target viewing angle is located, includes:
calculating a path of a picture in the optimal video stream from a camera position to a camera viewpoint and a motion speed by using the camera position parameter and the camera viewpoint parameter;
and setting transition animation of the pictures in the optimal video stream into a slow-in and slow-out form.
7. A terminal device, characterized in that the terminal device comprises a processor and a memory; the memory has stored therein a computer program for executing the computer program to implement the steps of the path roaming method according to any of the claims 1-6.
8. A computer storage medium storing a computer program which when executed implements the steps of the path roaming method according to any one of claims 1 to 6.
CN202110287704.1A 2021-03-17 2021-03-17 Path roaming method, terminal equipment and computer storage medium Active CN113223130B (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN202110287704.1A CN113223130B (en) 2021-03-17 2021-03-17 Path roaming method, terminal equipment and computer storage medium

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
CN202110287704.1A CN113223130B (en) 2021-03-17 2021-03-17 Path roaming method, terminal equipment and computer storage medium

Publications (2)

Publication Number Publication Date
CN113223130A CN113223130A (en) 2021-08-06
CN113223130B true CN113223130B (en) 2023-07-28

Family

ID=77083827

Family Applications (1)

Application Number Title Priority Date Filing Date
CN202110287704.1A Active CN113223130B (en) 2021-03-17 2021-03-17 Path roaming method, terminal equipment and computer storage medium

Country Status (1)

Country Link
CN (1) CN113223130B (en)

Families Citing this family (5)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN113870429B (en) * 2021-09-29 2022-04-26 生态环境部卫星环境应用中心 Web end three-dimensional scene posture self-adaptive path roaming method considering terrain fluctuation
CN114898076B (en) * 2022-03-29 2023-04-21 北京城市网邻信息技术有限公司 Model label adding method and device, electronic equipment and storage medium
CN115114835B (en) * 2022-08-23 2023-01-31 深圳市城市交通规划设计研究中心股份有限公司 Road network data roaming display system and method, electronic equipment and storage medium
CN116452718B (en) * 2023-06-15 2023-09-12 山东捷瑞数字科技股份有限公司 Path planning method, system, device and storage medium for scene roaming
CN117809000B (en) * 2024-02-28 2024-05-10 四川省公路规划勘察设计研究院有限公司 Highway path roaming method and equipment based on Gaussian filter algorithm

Citations (1)

Publication number Priority date Publication date Assignee Title
CN110659385A (en) * 2019-09-12 2020-01-07 中国测绘科学研究院 Fusion method of multi-channel video and three-dimensional GIS scene

Family Cites Families (8)

Publication number Priority date Publication date Assignee Title
CN104599243B (en) * 2014-12-11 2017-05-31 北京航空航天大学 A kind of virtual reality fusion method of multiple video strems and three-dimensional scenic
CN107316344B (en) * 2017-05-18 2020-08-14 深圳市佳创视讯技术股份有限公司 Method for planning roaming path in virtual-real fusion scene
CN107809611A (en) * 2017-09-28 2018-03-16 福建四创软件有限公司 A kind of 720 ° of Panoramic Warping method for real-time monitoring of hydraulic engineering and system
CN108833863A (en) * 2018-07-24 2018-11-16 河北德冠隆电子科技有限公司 Method for previewing is checked in the virtual camera monitoring monitoring of four-dimensional outdoor scene traffic simulation
CN111402374B (en) * 2018-12-29 2023-05-23 曜科智能科技(上海)有限公司 Multi-path video and three-dimensional model fusion method, device, equipment and storage medium thereof
CN111147768A (en) * 2019-12-25 2020-05-12 北京恒峰致远科技有限公司 Intelligent monitoring video review method for improving review efficiency
CN111640173B (en) * 2020-05-09 2023-04-21 杭州群核信息技术有限公司 Cloud rendering method and system for home roaming animation based on specific path
CN112017270A (en) * 2020-08-28 2020-12-01 南昌市国土资源勘测规划院有限公司 Live-action three-dimensional visualization online application system

Patent Citations (1)

Publication number Priority date Publication date Assignee Title
CN110659385A (en) * 2019-09-12 2020-01-07 中国测绘科学研究院 Fusion method of multi-channel video and three-dimensional GIS scene

Also Published As

Publication number Publication date
CN113223130A (en) 2021-08-06

Similar Documents

Publication Publication Date Title
CN113223130B (en) Path roaming method, terminal equipment and computer storage medium
US6717586B2 (en) Apparatus, method, program code, and storage medium for image processing
US11776142B2 (en) Structuring visual data
US11044398B2 (en) Panoramic light field capture, processing, and display
CN112184873B (en) Fractal graph creation method, fractal graph creation device, electronic equipment and storage medium
CN110648274A (en) Fisheye image generation method and device
CN108629799B (en) Method and equipment for realizing augmented reality
US9025007B1 (en) Configuring stereo cameras
CN114926612A (en) Aerial panoramic image processing and immersive display system
CA3199128A1 (en) Systems and methods for augmented reality video generation
CN114358112A (en) Video fusion method, computer program product, client and storage medium
CN115486091A (en) System and method for video processing using virtual reality devices
JP2004326179A (en) Image processing device, image processing method, image processing program, and recording medium storing it
US9807302B1 (en) Offset rolling shutter camera model, and applications thereof
US20220392121A1 (en) Method for Improved Handling of Texture Data For Texturing and Other Image Processing Tasks
CN108510433B (en) Space display method and device and terminal
CN113810755B (en) Panoramic video preview method and device, electronic equipment and storage medium
Rajan et al. A realistic video avatar system for networked virtual environments
CN110910482B (en) Method, system and readable storage medium for video data organization and scheduling
CN114900743A (en) Scene rendering transition method and system based on video plug flow
JP2023504846A (en) Encoding and Decoding Views on Volumetric Image Data
Lin et al. Fast intra-frame video splicing for occlusion removal in diminished reality
US12073574B2 (en) Structuring visual data
CN116433848B (en) Screen model generation method, device, electronic equipment and storage medium
CN117274558B (en) AR navigation method, device and equipment for visual positioning and storage medium

Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination
GR01 Patent grant