CN117519484A - Virtual scene interaction method, device, equipment and medium based on track ball - Google Patents

Virtual scene interaction method, device, equipment and medium based on track ball

Info

Publication number
CN117519484A
Authority
CN
China
Prior art keywords
rot
virtual scene
virtual
information
virtual camera
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Pending
Application number
CN202311579274.6A
Other languages
Chinese (zh)
Inventor
万杨阳
温子颢
陈翰林
高昊天
郎玥
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Yixing Xingfan Tianjin Technology Co ltd
Original Assignee
Yixing Xingfan Tianjin Technology Co ltd
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Yixing Xingfan Tianjin Technology Co ltd filed Critical Yixing Xingfan Tianjin Technology Co ltd
Priority to CN202311579274.6A priority Critical patent/CN117519484A/en
Publication of CN117519484A publication Critical patent/CN117519484A/en
Pending legal-status Critical Current

Classifications

    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06FELECTRIC DIGITAL DATA PROCESSING
    • G06F3/00Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
    • G06F3/01Input arrangements or combined input and output arrangements for interaction between user and computer
    • G06F3/011Arrangements for interaction with the human body, e.g. for user immersion in virtual reality
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06FELECTRIC DIGITAL DATA PROCESSING
    • G06F3/00Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
    • G06F3/01Input arrangements or combined input and output arrangements for interaction between user and computer
    • G06F3/03Arrangements for converting the position or the displacement of a member into a coded form
    • G06F3/033Pointing devices displaced or positioned by the user, e.g. mice, trackballs, pens or joysticks; Accessories therefor
    • G06F3/0354Pointing devices displaced or positioned by the user, e.g. mice, trackballs, pens or joysticks; Accessories therefor with detection of 2D relative movements between the device, or an operating part thereof, and a plane or surface, e.g. 2D mice, trackballs, pens or pucks
    • G06F3/03543Mice or pucks
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06FELECTRIC DIGITAL DATA PROCESSING
    • G06F3/00Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
    • G06F3/01Input arrangements or combined input and output arrangements for interaction between user and computer
    • G06F3/03Arrangements for converting the position or the displacement of a member into a coded form
    • G06F3/033Pointing devices displaced or positioned by the user, e.g. mice, trackballs, pens or joysticks; Accessories therefor
    • G06F3/0354Pointing devices displaced or positioned by the user, e.g. mice, trackballs, pens or joysticks; Accessories therefor with detection of 2D relative movements between the device, or an operating part thereof, and a plane or surface, e.g. 2D mice, trackballs, pens or pucks
    • G06F3/03549Trackballs

Landscapes

  • Engineering & Computer Science (AREA)
  • General Engineering & Computer Science (AREA)
  • Theoretical Computer Science (AREA)
  • Human Computer Interaction (AREA)
  • Physics & Mathematics (AREA)
  • General Physics & Mathematics (AREA)
  • Processing Or Creating Images (AREA)

Abstract

The embodiment of the invention discloses a virtual scene interaction method, device, equipment and medium based on a track ball, belonging to the technical field of computers. The application scene is 360-degree surround observation of an observed product controlled by the track ball; unlike existing functional schemes, the scene product is fixed and it is the observer's viewing angle that moves. To realize this function, the mouse movement amount is detected every frame in real time: the per-frame movement of the mouse along the x and y axes of screen space is mapped to rotation of the observation camera in the Yaw and Pitch directions, and the movement of the mouse is mapped to the rotation of the track ball, so that rotating the track ball rotates the camera around the observed object without dead angles.

Description

Virtual scene interaction method, device, equipment and medium based on track ball
Technical Field
The embodiment of the invention relates to the technical field of computers, in particular to a virtual scene interaction method, device, equipment and medium based on a track ball.
Background
The disclosure of this background section is only intended to increase the understanding of the general background of the invention and is not necessarily to be construed as an admission or any form of suggestion that this information forms the prior art already known to those of ordinary skill in the art.
Interactive display of three-dimensional virtual scenes has become necessary display equipment for science popularization venues, as it can display different scenes in an all-round manner under user control. Corresponding virtual display devices already exist, but when the display result is switched, only an image change is shown, because the switch happens between planar graphics or between adjacent graphics, so a smooth transition cannot be achieved. For example, in a science and technology exhibition hall, when the view switches between different stars, the first star interface is simply replaced directly by the second star interface; the spatial difference between the two and the visual effect of traversing that space are not conveyed, so the experience is poor.
Disclosure of Invention
Therefore, the embodiment of the invention provides a virtual scene interaction method based on a track ball, aiming to solve the problem in the prior art that image transitions are incoherent because the switching between virtual interaction spaces is abrupt.
In order to achieve the above object, the embodiments of the present invention provide the following technical solutions:
in a first aspect of an embodiment of the present invention, there is provided a virtual scene interaction method based on a track ball, the method including:
acquiring a virtual scene, and determining vector track information applied to an identification point controlled by a track ball in the virtual scene;
mapping the vector track information of the identification points to the YAW and Pitch directions of the virtual camera;
reading the mapped complete image information and uploading a result picture;
and judging the position of the virtual camera, and calling a corresponding feature set result by combining the result picture.
Further, the method for determining the identification point vector track information comprises the following steps:
confirming the X-axis and Y-axis coordinate system of the virtual scene, and identifying the movement amounts Delta X and Delta Y of the mouse along the X and Y axes in real time;
calculating the movement coefficients ROT X and ROT Y of the X and Y axes according to the following formulas:
ROT X += Delta X * 50
ROT Y += Delta Y * (-20).
Further, the method for mapping the vector track information comprises:
setting two moving tracks, YAW (transverse moving track) and Pitch (longitudinal moving track), for the virtual camera;
setting a vector V_yaw on the YAW-direction spline for ROT X;
setting a vector V_pitch on the Pitch-direction spline for ROT Y;
superimposing the vectors in both directions: V = V_yaw + V_pitch;
assigning V to the virtual camera;
obtaining the tangent direction T on the YAW-direction spline at the virtual camera;
rotating the virtual camera by 90 degrees about the Z axis through a rotation matrix algorithm;
finally, adding an infinite movement setting to the virtual camera: when the virtual camera approaches the spline end point it is teleported to the spline start point, and when it approaches the spline start point it is teleported to the spline end point.
Further, the position judgment of the virtual camera comprises partitioning the virtual scene and realizing a gradual transition between two adjacent areas, wherein the feature results of the feature set are respectively assigned to different partitions as follows:
A_t for ROT X = 0-3000
A_j for ROT X = 3000-6000
A_t for ROT X = 6000-9000
A_s for ROT X = 9000-12000
A_d for ROT X = 12000-18000
A_m for ROT X = 19000-22000
A_h for ROT X = 22000-25000.
Further, the called feature result comprises a UMG interface and audio, and the method for realizing a gradual transition of the UMG interface between feature results comprises:
controlling the transparency of the UMG interface through the parameter Alpha:
Alpha = a - (X - b)², clamped to the range 0-1,
wherein a controls the fade-in and fade-out speed, a taking the value 9 for one feature result and 2.25 for the other feature results;
b controls the fade-in and fade-out interval and is the midpoint of the ROT X interval corresponding to the feature result.
Further, the method for the audio gradual transition between different feature results comprises:
controlling the volume of the planet interaction audio through the parameter Volume:
Volume = a - (X - b)²,
wherein a controls the fade-in and fade-out speed, a taking the value 9 for one feature result and 2.25 for the other feature results;
b controls the fade-in and fade-out interval and is the midpoint of the ROT X interval corresponding to the feature result.
In a second aspect of the embodiments of the present invention, there is provided a device for virtual scene interaction based on a track ball, comprising:
an acquisition unit, configured to acquire virtual scene image information, identification point track information and feature set information;
a processing and mapping unit, configured to confirm the correspondence of the identification point track information in the virtual scene image and map it to the virtual camera;
a comparison unit, configured to confirm, according to the mapping information, the position information of the virtual camera relative to the virtual scene and further determine the corresponding feature result;
a calling unit, configured to call the corresponding interface and audio information according to the determined feature result.
Further, the apparatus comprises a database unit comprising audio information corresponding to each of the feature results.
In a third aspect of the embodiments of the present invention, there is provided an electronic device comprising an input device and an output device, and further comprising:
a processor adapted to implement one or more instructions; and
a computer readable storage medium having stored thereon at least one instruction, at least one program, a code set, or an instruction set, which is loaded and executed by the processor to implement the steps of the aforementioned trackball-based virtual scene interaction method.
In a fourth aspect of the embodiments of the present invention, there is provided a computer readable storage medium having stored thereon at least one instruction, at least one program, a code set, or an instruction set, which is loaded and executed by a processor to implement the steps of the aforementioned trackball-based virtual scene interaction method.
According to the embodiments of the invention, the method has the following advantages: the application scene is 360-degree surround observation of an observed product controlled by the track ball, but unlike existing functional schemes, the scene product is fixed and it is the observer's viewing angle that moves. To realize this function, the mouse movement amount is detected every frame in real time: the per-frame movement of the mouse along the x and y axes of screen space is mapped to rotation of the observation camera in the Yaw and Pitch directions, and the movement of the mouse is mapped to the rotation of the track ball, so that rotating the track ball rotates the camera around the observed object without dead angles.
Drawings
In order to more clearly illustrate the embodiments of the present invention or the technical solutions in the prior art, the drawings used in the description of the embodiments or the prior art are briefly described below. It will be apparent to those skilled in the art that the drawings described below are merely exemplary, and that other drawings may be derived from them without inventive effort.
The structures, proportions, sizes, etc. shown in this specification are provided only for illustration and description and are not intended to limit the scope of the invention, which is defined by the claims; any structural modification, change in proportion, or adjustment of size that does not affect the efficacy or attainable objectives of the invention shall fall within the scope of the technical disclosure.
Fig. 1 is a schematic flow chart of a virtual scene interaction method based on a track ball according to an embodiment of the present invention;
fig. 2 is a schematic structural diagram of a virtual scene interaction device based on a track ball according to an embodiment of the present invention;
FIG. 3 is a topology diagram of a product device in an embodiment of the invention;
FIG. 4 is a schematic diagram of spline lines of virtual cameras YAW (lateral movement track) and Pitch (longitudinal movement track) in an embodiment of the invention;
fig. 5 is a schematic diagram of an algorithm function for controlling transparency parameter Alpha of a UMG interface according to an embodiment of the present invention.
Detailed Description
Other advantages and benefits of the present invention will become apparent to those skilled in the art from the following detailed description, which illustrates the invention by way of certain specific embodiments, but not all embodiments. All other embodiments obtained by those skilled in the art based on the embodiments of the invention without inventive effort fall within the scope of the invention.
Terms such as "upper", "lower", "left", "right" and "middle" are used in this specification for convenience of description and are not intended to limit the scope of the invention; changes or adjustments of their relative relationships, without substantial modification of the technical content, are also considered within the scope of the invention.
Generally, interactive display of three-dimensional virtual scenes has become necessary display equipment for science popularization venues, as it can display different scenes in an all-round manner under user control. Corresponding virtual display devices already exist, but when the display result is switched, only an image change is shown, because the switch happens between planar graphics or between adjacent graphics, so a smooth transition cannot be achieved. For example, in a science and technology exhibition hall, when the observation switches between different stars, the first star interface is simply replaced directly by the second star interface; the spatial difference between the two and the visual effect of traversing that space are not conveyed, so the experience is poor.
The application scene of the invention is 360-degree surround observation of an observed product controlled by a track ball, but unlike prior functional schemes, the scene product is fixed and it is the observer's viewing angle that moves. To realize this function, the mouse movement amount is detected every frame in real time: the per-frame movement of the mouse along the x and y axes of screen space is mapped to rotation of the observation camera in the Yaw and Pitch directions, and the movement of the mouse is mapped to the rotation of the track ball, so that rotating the track ball rotates the camera around the observed object without dead angles.
The system structure of virtual scene interaction in the embodiment of the invention is applied to the observation of, and interaction with, a plurality of planets of the solar system. As shown in fig. 3, the product consists of a track ball, a computer host, a speaker and a projector, and its control logic is shown in the figure. The cursor of the host computer is controlled by the track ball, and the system controls the observation camera by mapping the movement coefficient of the cursor onto the observation camera in the virtual scene. The change of the observation camera's position changes the video and audio output of the virtual scene, thereby controlling the picture projected by the projector and the sound played by the speaker. In this way the viewing angle, the UMG and the audio are converted in coordination under the control of the track ball, providing the experiencer with an immersive virtual reality interaction experience of cosmic exploration.
The method is applied to a space interactive experience project in a space science and technology achievement tour exhibition. A virtual scene of outer space is built in Unreal Engine, containing space scenes of the solar system and China's Tiangong space station. A darkroom experience space is built on the exhibition site, equipped with a gray projection wall, a projector, a track ball, a bearing table and a speaker, all connected to a master control computer in the operation room. The projector and the gray projection wall display the virtual scene, and the track ball controls the mouse cursor of the master control computer. With the method described by the present invention, an exhibition experiencer can adjust the viewing angle in the virtual space by rotating the track ball, thereby observing the Tiangong space station through 360 degrees. Meanwhile, when the viewing angle rotates to a star at the periphery of the solar system, a data card corresponding to that star is displayed and its introduction audio is played; thanks to the fade-in/fade-out algorithm for transparency and volume, the transition between the UMG and the audio file is smooth and natural.
Exemplary method
Fig. 1 is a schematic flow chart of a virtual scene interaction method based on a track ball according to an embodiment of the present invention. The method is realized by the following steps:
step 11: acquiring a virtual scene, and determining vector track information applied to an identification point controlled by a track ball in the virtual scene; the moving coefficients of the X axis and the Y axis of the screen mouse controlled by the track ball are read through a program, and the step provides a data basis for virtually observing the moving direction and the moving speed of the camera.
The method for determining the identification point vector track information comprises the following steps:
confirming the X-axis and Y-axis coordinate system of the virtual scene, and identifying the movement amounts Delta X and Delta Y of the mouse along the X and Y axes in real time;
calculating the movement coefficients ROT X and ROT Y of the X and Y axes according to the following formulas:
ROT X += Delta X * 50
ROT Y += Delta Y * (-20).
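As a concrete illustration of this step, the following is a minimal standalone C++ sketch of the coefficient accumulation. The actual project implements this inside Unreal Engine, so the names used here (RotCoefficients, accumulate) are illustrative assumptions, not the patent's code.

```cpp
// Standalone sketch of step 11: per-frame mouse deltas are scaled and
// accumulated into the movement coefficients ROT X and ROT Y.
#include <cstdio>

struct RotCoefficients {
    double rotX = 0.0; // accumulated horizontal movement coefficient
    double rotY = 0.0; // accumulated vertical movement coefficient

    // Apply one frame's mouse movement using the formulas above:
    //   ROT X += Delta X * 50
    //   ROT Y += Delta Y * (-20)  (negated, per the description's sign)
    void accumulate(double deltaX, double deltaY) {
        rotX += deltaX * 50.0;
        rotY += deltaY * -20.0;
    }
};

int main() {
    RotCoefficients rot;
    // Simulated per-frame mouse deltas read from the trackball-driven cursor.
    const double frames[][2] = {{2.0, 0.5}, {1.0, -0.25}, {3.0, 0.0}};
    for (const auto& f : frames) {
        rot.accumulate(f[0], f[1]);
        std::printf("ROT X = %.1f, ROT Y = %.1f\n", rot.rotX, rot.rotY);
    }
}
```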
Step 12: mapping the vector track information of the identification points to the YAW and Pitch directions of the virtual camera; this step gives the track ball control over the observation camera.
the method for mapping vector track information comprises the following steps of
Two moving rails YAW (transverse moving rail) and Pitch (longitudinal moving rail) are provided for the virtual camera;
vectors set on YAW directional spline for ROT X
Vectors set on Pitch direction spline for ROT Y
The vectors in both directions are superimposed:
will beEndowing the virtual camera with the virtual camera;
obtaining tangential direction on YAW direction spline of virtual camera
The virtual camera is rotated by 90 degrees along the Z-axis direction through a rotation matrix algorithm:
finally, adding infinite movement setting to the virtual camera, wherein the virtual camera is transmitted to a spline starting point when approaching the spline ending point; when the virtual camera approaches the spline start point, it will be transferred to the spline end point.
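The following standalone C++ sketch illustrates this mapping under simplifying assumptions: the YAW spline is modeled as a horizontal circle (matching the circular-arc track of fig. 4) and the Pitch spline as a vertical line. The radius, the spline length in ROT X units, and all names are assumptions for illustration; the patent builds the splines in Unreal Engine and does not give these values.

```cpp
// Sketch of step 12: superimpose the YAW and Pitch spline vectors, take the
// YAW tangent, and rotate it 90 degrees about Z to get the camera's facing.
#include <cmath>
#include <cstdio>

struct Vec3 { double x, y, z; };

const double kPi = 3.14159265358979323846;
const double kRadius = 1000.0;        // assumed radius of the YAW track
const double kYawSplineLen = 25000.0; // assumed: one full lap in ROT X units

int main() {
    double rotX = 4500.0, rotY = 300.0; // example accumulated coefficients

    // "Infinite movement": wrap from the spline end back to the start.
    rotX = std::fmod(rotX, kYawSplineLen);
    if (rotX < 0) rotX += kYawSplineLen;

    // Vector V_yaw on the YAW-direction spline for ROT X (circular arc).
    double angle = 2.0 * kPi * rotX / kYawSplineLen;
    Vec3 vYaw = {kRadius * std::cos(angle), kRadius * std::sin(angle), 0.0};

    // Vector V_pitch on the Pitch-direction spline for ROT Y (vertical line).
    Vec3 vPitch = {0.0, 0.0, rotY};

    // Superimpose both vectors and assign the result V to the camera.
    Vec3 camPos = {vYaw.x + vPitch.x, vYaw.y + vPitch.y, vYaw.z + vPitch.z};

    // Tangent direction T on the YAW spline at the camera's position.
    Vec3 tangent = {-std::sin(angle), std::cos(angle), 0.0};

    // Rotate the tangent 90 degrees about Z: Rz(90) * (x, y, z) = (-y, x, z),
    // so the camera faces the centre of the track (the observed object).
    Vec3 forward = {-tangent.y, tangent.x, tangent.z};

    std::printf("camera pos (%.1f, %.1f, %.1f), forward (%.2f, %.2f, %.2f)\n",
                camPos.x, camPos.y, camPos.z, forward.x, forward.y, forward.z);
}
```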
As shown in fig. 4, the schematic illustrates the moving track set up for the observation camera in the virtual scene, wherein the circular-arc transverse track is the YAW-direction track, and the longitudinal straight line perpendicular to the YAW track is the Pitch-direction track.
Step 13: reading the mapped complete image information and uploading a result picture. The picture seen by the observation camera is read and displayed by a program, realizing the function that the experiencer controls the track ball to change the observation viewing angle.
the observing camera has the function of the UE, and can change the visual angle when the virtual scene is output through the observing camera.
Step 14: judging the position of the virtual camera and, in combination with the result picture, calling the corresponding feature set result (a planet). By monitoring the position of the observation camera, the star appearing in the picture is determined, the UMG (the user interface system in Unreal Engine) corresponding to that star is called, and the interactive explanation audio corresponding to the star is called at the same time.
The position judgment of the virtual camera comprises partitioning the virtual scene and realizing a gradual transition between two adjacent areas, wherein the stars (feature results) of the feature set are respectively assigned to different partitions as follows:
A_t for ROT X = 0-3000
A_j for ROT X = 3000-6000
A_t for ROT X = 6000-9000
A_s for ROT X = 9000-12000
A_d for ROT X = 12000-18000
A_m for ROT X = 19000-22000
A_h for ROT X = 22000-25000.
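A minimal sketch of this partition lookup is given below. The partition table is copied from the description (including the duplicated A_t label and the 18000-19000 gap, both left as in the source); the types and function names are illustrative assumptions.

```cpp
// Sketch of the position judgment in step 14: the accumulated ROT X value is
// matched against the fixed partitions to select a feature result (planet).
#include <cstdio>
#include <string>
#include <vector>

struct Partition {
    double lo, hi;       // ROT X interval [lo, hi)
    std::string feature; // feature result (planet) shown in this zone
};

const std::vector<Partition> kPartitions = {
    {0, 3000, "A_t"},     {3000, 6000, "A_j"},   {6000, 9000, "A_t"},
    {9000, 12000, "A_s"}, {12000, 18000, "A_d"}, {19000, 22000, "A_m"},
    {22000, 25000, "A_h"},
};

const Partition* lookup(double rotX) {
    for (const auto& p : kPartitions)
        if (rotX >= p.lo && rotX < p.hi) return &p;
    return nullptr; // falls in the undefined 18000-19000 gap
}

int main() {
    for (double rotX : {1500.0, 7000.0, 18500.0, 23000.0}) {
        const Partition* p = lookup(rotX);
        std::printf("ROT X = %.0f -> %s\n", rotX,
                    p ? p->feature.c_str() : "(no feature)");
    }
}
```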
The called feature result comprises a UMG interface and audio, and a gradual transition of both is realized. The method for the gradual transition of the graphical interface between different stars comprises:
controlling the transparency of the UMG interface through the parameter Alpha:
Alpha = a - (X - b)², clamped to the range 0-1,
wherein a controls the fade-in and fade-out speed, a taking the value 9 for the earth and 2.25 for the remaining planets;
b sets the fade-in and fade-out interval and is the midpoint of the ROT X interval corresponding to the star.
Taking the UMG interface call of the earth star as an example (with x denoting ROT X in thousands, a = 2.25 and b = 1.5; the interval boundaries below follow from the clamp at Alpha = 1):
when 0 ≤ x < 1.5 - √1.25 or 1.5 + √1.25 < x ≤ 3:
Alpha = 2.25 - (x - 1.5)²;
when 1.5 - √1.25 ≤ x ≤ 1.5 + √1.25:
Alpha = 1.
The method for the audio gradual transition between different stars comprises:
controlling the volume of the planet interaction audio through the parameter Volume:
Volume = a - (X - b)²,
wherein a controls the fade-in and fade-out speed, a taking the value 9 for the earth and 2.25 for the remaining planets;
b sets the fade-in and fade-out interval and is the midpoint of the ROT X interval corresponding to the star.
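Since Alpha and Volume follow the same clamped parabola, both can be driven by one function. The sketch below assumes the clamp to the 0-1 range stated above and uses the earth example values (a = 2.25, b = 1.5, with x in thousands of ROT X units); it is an illustration, not the project's Unreal Engine implementation.

```cpp
// Sketch of the shared fade curve driving both UMG transparency (Alpha)
// and narration volume (Volume): f(x) = a - (x - b)^2, clamped to [0, 1].
#include <algorithm>
#include <cstdio>

// x is ROT X in thousands, b the midpoint of the feature's interval,
// a the fade-in/fade-out speed.
double fade(double x, double a, double b) {
    double v = a - (x - b) * (x - b);
    return std::clamp(v, 0.0, 1.0);
}

int main() {
    // Earth example from the description: a = 2.25, b = 1.5 (interval 0-3).
    const double a = 2.25, b = 1.5;
    for (double x = 0.0; x <= 3.0; x += 0.5) {
        double alpha = fade(x, a, b);  // UMG transparency
        double volume = fade(x, a, b); // interaction-audio volume
        std::printf("x = %.1f  Alpha = %.2f  Volume = %.2f\n",
                    x, alpha, volume);
    }
}
```

At the interval boundaries (x = 0 and x = 3) the curve reaches 0, and it saturates at 1 around the midpoint, which matches the fade-in/fade-out behavior described for fig. 5.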
As shown in fig. 5, with this algorithm, when the observation camera enters each star area, the transparency of the corresponding UMG page increases from 0 to 1 according to the function, i.e., a fade-in effect. While the UMG is fully displayed, the transparency stays at 1. When the observation camera leaves the current planet area, the transparency decreases from 1 to 0 according to the function, realizing a fade-out effect, and a new fade-in begins when the next planet area is entered. In this way, the UMG in the virtual scene achieves a smooth switching transition.
In the schematic, the curves are the function y = a - (x - b)²;
the line parallel to the horizontal axis is the function y = 1;
the sections where the curve intersects and overlaps the line parallel to the horizontal axis form the algorithm function controlling the transparency parameter Alpha of the UMG interface:
Alpha = a - (x - b)², clamped to the range 0-1.
the star change, music and UI interface in the scene are all related to the core parameter of the movement amount of each frame of the mouse, and soft gradual transition among all contents can be controlled through the parameter, so that higher ornamental value is realized. In addition, the additional elements such as the planet, music, UI interface and the like in the scene of the scheme support replacement, other elements can be added subsequently, and the rotation track and the radius of the camera can be customized according to the observed product, so that the method has strong universality and expandability.
Exemplary apparatus
In a second aspect of the embodiment of the present invention, as shown in fig. 2, there is provided a device for virtual scene interaction based on a track ball, comprising:
the acquisition unit 21, configured to acquire virtual scene image information, identification point track information and feature set information;
the processing and mapping unit 22, configured to confirm the correspondence of the identification point track information in the virtual scene image and map it to the virtual camera;
the comparison unit 23, configured to confirm, according to the mapping information, the position information of the virtual camera relative to the virtual scene and further determine the corresponding star;
the calling unit 24, configured to call the corresponding interface and audio information according to the determined planet information.
The device further comprises a database unit 25, which contains the audio information corresponding to each planet.
Exemplary equipment
In a third aspect of the embodiments of the present invention, there is provided an electronic device comprising an input device and an output device, and further comprising:
a processor adapted to implement one or more instructions; and
a computer readable storage medium having stored thereon at least one instruction, at least one program, a code set, or an instruction set, which is loaded and executed by the processor to implement the steps of the trackball-based virtual scene interaction method described above.
In a fourth aspect of the embodiments of the present invention, there is provided a computer readable storage medium having stored thereon at least one instruction, at least one program, a code set, or an instruction set, which is loaded and executed by a processor to implement the steps of the aforementioned trackball-based virtual scene interaction method.
The methods of the above embodiments may be implemented by software plus a necessary general hardware platform, or by hardware, the former being the preferred implementation in many cases. Based on this understanding, the technical solution of the present invention, or the part of it contributing to the prior art, may essentially be embodied in the form of a software product, and the implementation of the method is not limited by a particular hardware platform; the method embodiments therefore apply equally to the electronic device and the storage medium, and achieve the same or similar beneficial effects.
While the invention has been described in detail through the foregoing general description and specific examples, modifications and improvements will be apparent to those skilled in the art. Accordingly, such modifications or improvements made without departing from the spirit of the invention are intended to fall within the claimed scope of the invention.

Claims (10)

1. A method of virtual scene interaction based on a trackball, the method comprising:
acquiring a virtual scene, and determining vector track information applied to an identification point controlled by a track ball in the virtual scene;
mapping the vector track information of the identification points to the YAW and Pitch directions of the virtual camera;
reading the mapped complete image information and uploading a result picture;
and judging the position of the virtual camera, and calling a corresponding feature set result by combining the result picture.
2. The virtual scene interaction method based on the track ball as claimed in claim 1, wherein the method for determining the identification point vector track information comprises the following steps:
confirming the X-axis and Y-axis coordinate system of the virtual scene, and identifying the movement amounts Delta X and Delta Y of the mouse along the X and Y axes in real time;
calculating the movement coefficients ROT X and ROT Y of the X and Y axes according to the following formulas:
ROT X += Delta X * 50
ROT Y += Delta Y * (-20).
3. The virtual scene interaction method based on the track ball as claimed in claim 2, wherein the method for mapping the vector track information comprises:
setting two moving tracks, YAW and Pitch, for the virtual camera;
setting a vector V_yaw on the YAW-direction spline for ROT X;
setting a vector V_pitch on the Pitch-direction spline for ROT Y;
superimposing the vectors in both directions: V = V_yaw + V_pitch;
assigning V to the virtual camera;
obtaining the tangent direction T on the YAW-direction spline at the virtual camera;
rotating the virtual camera by 90 degrees about the Z axis through a rotation matrix algorithm;
finally, adding an infinite movement setting to the virtual camera: when the virtual camera approaches the spline end point it is teleported to the spline start point, and when it approaches the spline start point it is teleported to the spline end point.
4. The virtual scene interaction method based on the track ball as claimed in claim 1, wherein the position judgment of the virtual camera comprises partitioning the virtual scene and realizing a gradual transition between two adjacent areas, wherein the feature results of the feature sets are respectively assigned to different partitions as follows:
A_t for ROT X = 0-3000
A_j for ROT X = 3000-6000
A_t for ROT X = 6000-9000
A_s for ROT X = 9000-12000
A_d for ROT X = 12000-18000
A_m for ROT X = 19000-22000
A_h for ROT X = 22000-25000.
5. The virtual scene interaction method based on the track ball as claimed in claim 4, wherein the called feature result comprises a UMG interface and audio, and the method for realizing a gradual transition of the UMG interface between different feature results comprises:
controlling the transparency of the UMG interface through the parameter Alpha:
Alpha = a - (X - b)², clamped to the range 0-1,
wherein a controls the fade-in and fade-out speed, a taking the value 9 for one feature result and 2.25 for the other feature results;
b controls the fade-in and fade-out interval and is the midpoint of the ROT X interval corresponding to the feature result.
6. The trackball-based virtual scene interaction method of claim 4, wherein the method for the audio gradual transition between different feature results comprises:
controlling the volume of the planet interaction audio through the parameter Volume:
Volume = a - (X - b)²,
wherein a controls the fade-in and fade-out speed, a taking the value 9 for one feature result and 2.25 for the other feature results;
b controls the fade-in and fade-out interval and is the midpoint of the ROT X interval corresponding to the feature result.
7. A device for virtual scene interaction based on a track ball, comprising:
an acquisition unit, configured to acquire virtual scene image information, identification point track information and feature set information;
a processing and mapping unit, configured to confirm the correspondence of the identification point track information in the virtual scene image and map it to the virtual camera;
a comparison unit, configured to confirm, according to the mapping information, the position information of the virtual camera relative to the virtual scene and further determine the corresponding feature result;
a calling unit, configured to call the corresponding interface and audio information according to the determined feature result.
8. The apparatus of claim 7, further comprising a database unit that includes audio information corresponding to each of the feature results.
9. An electronic device comprising an input device and an output device, further comprising a processor adapted to implement one or more instructions; and
a computer readable storage medium having stored thereon at least one instruction, at least one program, a code set, or an instruction set, which is loaded and executed by the processor to implement the steps of the method of any one of claims 1-6.
10. A computer readable storage medium having stored thereon at least one instruction, at least one program, a code set, or an instruction set, which is loaded and executed by a processor to implement the steps of the method of any one of claims 1-6.
CN202311579274.6A 2023-11-24 2023-11-24 Virtual scene interaction method, device, equipment and medium based on track ball Pending CN117519484A (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN202311579274.6A CN117519484A (en) 2023-11-24 2023-11-24 Virtual scene interaction method, device, equipment and medium based on track ball


Publications (1)

Publication Number Publication Date
CN117519484A true CN117519484A (en) 2024-02-06

Family

ID=89762347

Family Applications (1)

Application Number Title Priority Date Filing Date
CN202311579274.6A Pending CN117519484A (en) 2023-11-24 2023-11-24 Virtual scene interaction method, device, equipment and medium based on track ball

Country Status (1)

Country Link
CN (1) CN117519484A (en)


Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination