CN114138120A - Virtual visual angle tracking method and system applied to virtual reality - Google Patents
Virtual visual angle tracking method and system applied to virtual reality
- Publication number: CN114138120A (application CN202111501788.0A)
- Authority: CN (China)
- Prior art keywords: target, coordinate, dimensional, coordinates, virtual
- Prior art date: 2021-12-09
- Legal status: Granted (the legal status is an assumption and is not a legal conclusion; Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed)
Classifications
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06F—ELECTRIC DIGITAL DATA PROCESSING
- G06F3/00—Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
- G06F3/01—Input arrangements or combined input and output arrangements for interaction between user and computer
- G06F3/011—Arrangements for interaction with the human body, e.g. for user immersion in virtual reality
- G06F3/013—Eye tracking input arrangements
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06F—ELECTRIC DIGITAL DATA PROCESSING
- G06F3/00—Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
- G06F3/01—Input arrangements or combined input and output arrangements for interaction between user and computer
- G06F3/048—Interaction techniques based on graphical user interfaces [GUI]
- G06F3/0481—Interaction techniques based on graphical user interfaces [GUI] based on specific properties of the displayed interaction object or a metaphor-based environment, e.g. interaction with desktop elements like windows or icons, or assisted by a cursor's changing behaviour or appearance
- G06F3/04817—Interaction techniques based on graphical user interfaces [GUI] based on specific properties of the displayed interaction object or a metaphor-based environment, e.g. interaction with desktop elements like windows or icons, or assisted by a cursor's changing behaviour or appearance using icons
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06T—IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
- G06T7/00—Image analysis
- G06T7/70—Determining position or orientation of objects or cameras
Landscapes
- Engineering & Computer Science (AREA)
- Theoretical Computer Science (AREA)
- General Engineering & Computer Science (AREA)
- Physics & Mathematics (AREA)
- General Physics & Mathematics (AREA)
- Human Computer Interaction (AREA)
- Computer Vision & Pattern Recognition (AREA)
- User Interface Of Digital Computer (AREA)
Abstract
The invention discloses a virtual visual angle tracking method and system applied to virtual reality, and relates to the technical field of virtual reality. The method comprises the following steps: acquiring, in real time, the world three-dimensional coordinates of a virtual target object and the three-dimensional coordinates of the user's visual origin; calculating target relative coordinates from the world three-dimensional coordinates of the virtual target object and the three-dimensional coordinates of the user's visual origin; filtering the target relative coordinates to obtain target two-dimensional plane coordinates; calculating a target orientation rotation value from the target two-dimensional plane coordinates; generating target three-dimensional coordinates from the target two-dimensional plane coordinates and a preset fixed longitudinal value; and generating an indication icon according to the target three-dimensional coordinates and the target orientation rotation value. The invention can give 360-degree orientation guidance prompts for targets outside the field of view in a virtual reality scene, improving the quality of the user experience.
Description
Technical Field
The invention relates to the technical field of virtual reality, and in particular to a virtual visual angle tracking method and system applied to virtual reality.
Background
Virtual reality, as a 720-degree panoramic form of content display, brings immersion but also makes it easier to lose one's sense of direction than with traditional 2D content. At present, few virtual reality products on the market offer schemes that intuitively guide the viewing angle and direction; they generally rely on distinguishing the color and brightness of a watched target to attract the user's gaze. For targets outside the field of view, no guidance is provided to the user, who can easily fall into the confusion of losing the target.
Disclosure of Invention
In order to overcome, or at least partially solve, the above problems, embodiments of the present invention provide a virtual visual angle tracking method and system applied to virtual reality, which can give 360-degree orientation guidance prompts for targets outside the field of view in a virtual reality scene, thereby improving the quality of the user experience.
Embodiments of the invention are realized as follows:
In a first aspect, an embodiment of the present invention provides a virtual visual angle tracking method applied in virtual reality, comprising the following steps:
acquiring, in real time, the world three-dimensional coordinates of a virtual target object and the three-dimensional coordinates of the user's visual origin;
calculating target relative coordinates from the world three-dimensional coordinates of the virtual target object and the three-dimensional coordinates of the user's visual origin;
filtering the target relative coordinates to obtain target two-dimensional plane coordinates;
calculating a target orientation rotation value from the target two-dimensional plane coordinates;
generating target three-dimensional coordinates from the target two-dimensional plane coordinates and a preset fixed longitudinal value;
and generating an indication icon according to the target three-dimensional coordinates and the target orientation rotation value.
To solve the technical problem in the prior art that targets outside the field of view give the user no guidance, so that the user easily falls into the confusion of losing the target, the method calculates the coordinates of the virtual target object relative to the visual origin from its world three-dimensional coordinates; filters out the longitudinal value of those coordinates to obtain two-dimensional coordinates relative to the visual origin, i.e. a vector from the visual center; calculates from this vector the rotation value toward the target; adds back a fixed longitudinal value to obtain three-dimensional coordinates within the field of view again; and then updates the coordinates and rotation value of the prompt icon, thereby accurately indicating the target's direction to the user. In a virtual reality scene, the approximate direction of the target is thus perceived through the orientation of the prompt icon: even when the target is not within the field of view, its direction can be sensed from the icon, guiding the user to turn so that the established flow is completed in an orderly and smooth manner. This greatly reduces the confusion of first-time users and improves the experience quality: 360-degree orientation guidance prompts can be given for targets outside the field of view in a virtual reality scene.
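Purely as an illustration (our own sketch, not code from the patent), the steps above can be written out as follows; the Y-up axis convention, the yaw reference along +Z, and names such as `update_indicator` and `fixed_longitudinal` are assumptions of this sketch:

```python
import math

def update_indicator(target_world, visual_origin, fixed_longitudinal=1.5):
    """Minimal sketch of the claimed pipeline. Assumes a Y-up world
    coordinate system with yaw measured from the +Z (forward) axis;
    the patent does not fix these conventions."""
    # Target relative coordinates: world coordinates minus visual-origin coordinates
    rx = target_world[0] - visual_origin[0]
    ry = target_world[1] - visual_origin[1]  # longitudinal value, discarded below
    rz = target_world[2] - visual_origin[2]

    # Filter out the longitudinal (vertical) value -> target 2D plane coordinates
    plane_x, plane_z = rx, rz

    # Target orientation rotation value via a trigonometric formula
    yaw_degrees = math.degrees(math.atan2(plane_x, plane_z))

    # Re-add a preset fixed longitudinal value so the icon gets
    # three-dimensional coordinates back inside the field of view
    target_3d = (plane_x, fixed_longitudinal, plane_z)

    # The indication icon is then placed and rotated with these two values
    return target_3d, yaw_degrees
```

In a rendering loop, the real-time acquisition step simply amounts to calling such a routine every frame with the current coordinates of the target and of the user's viewpoint.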
Based on the first aspect, in some embodiments of the present invention, calculating the target relative coordinates from the world three-dimensional coordinates of the virtual target object and the three-dimensional coordinates of the user's visual origin comprises the following step:
calculating the difference between the world three-dimensional coordinates of the virtual target object and the three-dimensional coordinates of the user's visual origin to obtain the target relative coordinates.
Based on the first aspect, in some embodiments of the present invention, filtering the target relative coordinates to obtain the target two-dimensional plane coordinates comprises the following step:
filtering out the longitudinal value of the target relative coordinates to obtain the target two-dimensional plane coordinates.
Based on the first aspect, in some embodiments of the present invention, calculating the target orientation rotation value from the target two-dimensional plane coordinates comprises the following step:
calculating the target orientation rotation value from the target two-dimensional plane coordinates using a trigonometric formula.
Based on the first aspect, in some embodiments of the present invention, the virtual visual angle tracking method applied in virtual reality further comprises the following step:
generating interface prompt information according to the indication icon.
Based on the first aspect, in some embodiments of the present invention, the virtual visual angle tracking method applied in virtual reality further comprises the following step:
marking the indication icon with a preset icon color to obtain a new indication icon.
In a second aspect, an embodiment of the present invention provides a virtual visual angle tracking system applied in virtual reality, comprising a basic coordinate acquisition module, a relative coordinate module, a coordinate filtering module, an orientation calculation module, a target coordinate generation module, and an icon indication module, wherein:
the basic coordinate acquisition module is used for acquiring, in real time, the world three-dimensional coordinates of a virtual target object and the three-dimensional coordinates of the user's visual origin;
the relative coordinate module is used for calculating target relative coordinates from the world three-dimensional coordinates of the virtual target object and the three-dimensional coordinates of the user's visual origin;
the coordinate filtering module is used for filtering the target relative coordinates to obtain target two-dimensional plane coordinates;
the orientation calculation module is used for calculating a target orientation rotation value from the target two-dimensional plane coordinates;
the target coordinate generation module is used for generating target three-dimensional coordinates from the target two-dimensional plane coordinates and a preset fixed longitudinal value;
and the icon indication module is used for generating an indication icon according to the target three-dimensional coordinates and the target orientation rotation value.
To solve the technical problem in the prior art that targets outside the field of view give the user no guidance, so that the user easily falls into the confusion of losing the target, the system acquires the world three-dimensional coordinates of the virtual target object through the basic coordinate acquisition module; calculates the coordinates relative to the visual origin through the relative coordinate module; filters out the longitudinal value of those coordinates through the coordinate filtering module to obtain two-dimensional coordinates relative to the visual origin, i.e. a vector from the visual center; calculates the rotation value toward the target through the orientation calculation module; adds back a fixed longitudinal value through the target coordinate generation module to obtain three-dimensional coordinates within the field of view again; and updates the coordinates and rotation value of the prompt icon through the icon indication module, thereby accurately indicating the target's direction to the user. In a virtual reality scene, the approximate direction of the target is thus perceived through the orientation of the prompt icon: even when the target is not within the field of view, its direction can be sensed from the icon, guiding the user to turn so that the established flow is completed in an orderly and smooth manner. This greatly reduces the confusion of first-time users and improves the experience quality: 360-degree orientation guidance prompts can be given for targets outside the field of view.
Based on the second aspect, in some embodiments of the present invention, the relative coordinate module comprises a difference calculation sub-module configured to calculate the difference between the world three-dimensional coordinates of the virtual target object and the three-dimensional coordinates of the user's visual origin to obtain the target relative coordinates.
In a third aspect, an embodiment of the present application provides an electronic device comprising a memory for storing one or more programs, and a processor. The one or more programs, when executed by the processor, implement the method of any embodiment of the first aspect described above.
In a fourth aspect, embodiments of the present application provide a computer-readable storage medium on which a computer program is stored; when executed by a processor, the program implements the method of any embodiment of the first aspect described above.
Embodiments of the invention have at least the following advantages or beneficial effects:
Embodiments of the invention provide a virtual visual angle tracking method and system applied to virtual reality, which solve the technical problem in the prior art that targets outside the field of view give the user no guidance, so that the user easily falls into the confusion of losing the target. 360-degree orientation guidance prompts can be given for targets outside the field of view in a virtual reality scene, improving the quality of the user experience.
Drawings
In order to more clearly illustrate the technical solutions of the embodiments of the present invention, the drawings needed in the embodiments are briefly described below. It should be understood that the following drawings illustrate only some embodiments of the present invention and should therefore not be considered as limiting its scope; for those skilled in the art, other related drawings can be derived from these drawings without inventive effort.
Fig. 1 is a flowchart of a virtual visual angle tracking method applied in virtual reality according to an embodiment of the present invention;
Fig. 2 is a schematic block diagram of a virtual visual angle tracking system applied in virtual reality according to an embodiment of the present invention;
Fig. 3 is a block diagram of an electronic device according to an embodiment of the present invention.
Icon: 100. a basic coordinate acquisition module; 200. a relative coordinate module; 210. a difference value calculation submodule; 300. a coordinate filtering module; 400. an orientation calculation module; 500. a target coordinate generation module; 600. an icon indicating module; 101. a memory; 102. a processor; 103. a communication interface.
Detailed Description
In order to make the objects, technical solutions and advantages of the embodiments of the present invention clearer, the technical solutions in the embodiments of the present invention will be clearly and completely described below with reference to the drawings in the embodiments of the present invention, and it is obvious that the described embodiments are some, but not all, embodiments of the present invention. The components of embodiments of the present invention generally described and illustrated in the figures herein may be arranged and designed in a wide variety of different configurations.
Thus, the following detailed description of the embodiments of the present invention, presented in the figures, is not intended to limit the scope of the invention, as claimed, but is merely representative of selected embodiments of the invention. All other embodiments, which can be derived by a person skilled in the art from the embodiments given herein without making any creative effort, shall fall within the protection scope of the present invention.
It should be noted that: like reference numbers and letters refer to like items in the following figures, and thus, once an item is defined in one figure, it need not be further defined and explained in subsequent figures.
It is noted that, herein, relational terms such as first and second may be used solely to distinguish one entity or action from another, without necessarily requiring or implying any actual such relationship or order between those entities or actions. Also, the terms "comprises", "comprising", or any other variation thereof are intended to cover a non-exclusive inclusion, such that a process, method, article, or apparatus that comprises a list of elements includes not only those elements but may also include other elements not expressly listed or inherent to such process, method, article, or apparatus. Without further limitation, an element introduced by the phrase "comprising a ..." does not exclude the presence of other identical elements in the process, method, article, or apparatus that comprises the element.
Examples
As shown in Fig. 1, in a first aspect, an embodiment of the present invention provides a virtual visual angle tracking method applied in virtual reality, comprising the following steps:
S1, acquiring, in real time, the world three-dimensional coordinates of a virtual target object and the three-dimensional coordinates of the user's visual origin;
S2, calculating target relative coordinates from the world three-dimensional coordinates of the virtual target object and the three-dimensional coordinates of the user's visual origin;
Further, the difference between the world three-dimensional coordinates of the virtual target object and the three-dimensional coordinates of the user's visual origin is calculated to obtain the target relative coordinates.
S3, filtering the target relative coordinates to obtain target two-dimensional plane coordinates;
Further, the longitudinal value of the target relative coordinates is filtered out to obtain the target two-dimensional plane coordinates.
S4, calculating a target orientation rotation value from the target two-dimensional plane coordinates;
Further, the target orientation rotation value is calculated from the target two-dimensional plane coordinates using a trigonometric formula.
S5, generating target three-dimensional coordinates from the target two-dimensional plane coordinates and a preset fixed longitudinal value;
S6, generating an indication icon according to the target three-dimensional coordinates and the target orientation rotation value.
Further, the virtual visual angle tracking method applied in virtual reality also comprises the following step: generating interface prompt information according to the indication icon.
Further, the virtual visual angle tracking method applied in virtual reality also comprises the following step: marking the indication icon with a preset icon color to obtain a new indication icon.
To solve the technical problem in the prior art that targets outside the field of view give the user no guidance, so that the user easily falls into the confusion of losing the target, the method obtains the world three-dimensional coordinates of the virtual target object through three-dimensional software, and subtracts the three-dimensional coordinates of the visual origin from them to obtain the target relative coordinates. The longitudinal value of the relative coordinates is then filtered out to obtain their two-dimensional plane coordinates, and the target orientation rotation value is calculated from those plane coordinates with a trigonometric formula: a trigonometric function value is computed from the coordinate values, and the corresponding angle, i.e. the target orientation rotation value, is found by searching a trigonometric function table. A fixed longitudinal value (the distance from the origin to the target field of view, i.e. the viewing distance of the user's eyes) is then added on the basis of the target two-dimensional plane coordinates to obtain three-dimensional coordinates within the field of view again. Finally, the coordinates and rotation value of the prompt icon are updated according to the target three-dimensional coordinates and the target orientation rotation value, making the icon visible and thereby prompting the user.
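The table-lookup detail above can be made concrete with a toy sketch (again our own construction, with an assumed angle convention of 0 degrees along +Z growing toward +X; the patent specifies no particular table): a tangent value is computed from the plane coordinates, the nearest angle is read from a precomputed table, and the quadrant is resolved from the coordinate signs. `math.atan2` yields the same 360-degree result in closed form:

```python
import math

# Hypothetical coarse trigonometric function table:
# tangent value -> angle in degrees, one entry per degree of the first quadrant.
TAN_TABLE = [(math.tan(math.radians(a)), a) for a in range(90)]

def angle_from_table(x, z):
    """Recover the in-quadrant angle from the tangent table, then resolve
    the quadrant from the signs of x and z (0 deg = +Z, growing toward +X)."""
    if z == 0:
        base = 90.0
    else:
        tan_value = abs(x / z)
        # search the table for the entry nearest the computed tangent value
        base = min(TAN_TABLE, key=lambda entry: abs(entry[0] - tan_value))[1]
    if z >= 0:
        return base if x >= 0 else -base
    return 180.0 - base if x >= 0 else base - 180.0

# Sanity check against the closed-form equivalent:
for x, z in [(1.0, 1.0), (-2.0, 0.5), (3.0, -4.0)]:
    assert abs(angle_from_table(x, z) - math.degrees(math.atan2(x, z))) < 1.0
```

An engine would normally call `atan2` directly; the table merely makes the "search according to a trigonometric function table" step of the paragraph above explicit.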
In a virtual reality scene, the approximate direction of the target is thus perceived through the orientation of the prompt icon: even when the target is not within the field of view, its direction can be sensed from the icon, guiding the user to turn so that the established flow is completed in an orderly and smooth manner. This greatly reduces the confusion of first-time users and improves the experience quality, and an even more pronounced guiding effect is achieved when combined with UI prompts and with color and brightness highlighting of the target. 360-degree orientation guidance prompts can be given for targets outside the field of view in a virtual reality scene, improving the quality of the user experience.
As shown in Fig. 2, in a second aspect, an embodiment of the present invention provides a virtual visual angle tracking system applied in virtual reality, comprising a basic coordinate acquisition module 100, a relative coordinate module 200, a coordinate filtering module 300, an orientation calculation module 400, a target coordinate generation module 500, and an icon indication module 600, wherein:
the basic coordinate acquisition module 100 is configured to acquire, in real time, the world three-dimensional coordinates of a virtual target object and the three-dimensional coordinates of the user's visual origin;
the relative coordinate module 200 is configured to calculate target relative coordinates from the world three-dimensional coordinates of the virtual target object and the three-dimensional coordinates of the user's visual origin;
Further, the relative coordinate module 200 includes a difference calculation sub-module 210 configured to calculate the difference between the world three-dimensional coordinates of the virtual target object and the three-dimensional coordinates of the user's visual origin to obtain the target relative coordinates.
the coordinate filtering module 300 is configured to filter the target relative coordinates to obtain target two-dimensional plane coordinates;
the orientation calculation module 400 is configured to calculate a target orientation rotation value from the target two-dimensional plane coordinates;
the target coordinate generation module 500 is configured to generate target three-dimensional coordinates from the target two-dimensional plane coordinates and a preset fixed longitudinal value;
and the icon indication module 600 is configured to generate an indication icon according to the target three-dimensional coordinates and the target orientation rotation value.
To solve the technical problem in the prior art that targets outside the field of view give the user no guidance, so that the user easily falls into the confusion of losing the target, the system acquires the world three-dimensional coordinates of the virtual target object through the basic coordinate acquisition module 100 and calculates the coordinates relative to the visual origin through the relative coordinate module 200. The longitudinal value of those coordinates is filtered out by the coordinate filtering module 300 to obtain two-dimensional coordinates relative to the visual origin, i.e. a vector from the visual center; the orientation calculation module 400 then calculates the rotation value toward the target, and the target coordinate generation module 500 adds back the fixed longitudinal value to obtain three-dimensional coordinates within the field of view again. The icon indication module 600 finally updates the coordinates and rotation value of the prompt icon, accurately indicating the target's direction to the user. In a virtual reality scene, the approximate direction of the target is thus perceived through the orientation of the prompt icon: even when the target is not within the field of view, its direction can be sensed from the icon, guiding the user to turn so that the established flow is completed in an orderly and smooth manner. This greatly reduces the confusion of first-time users and improves the experience quality, and an even more pronounced guiding effect is achieved when combined with UI prompts and with color and brightness highlighting of the target.
As shown in Fig. 3, in a third aspect, an embodiment of the present application provides an electronic device comprising a memory 101 for storing one or more programs, and a processor 102. The one or more programs, when executed by the processor 102, implement the method of any embodiment of the first aspect described above.
The device also comprises a communication interface 103. The memory 101, the processor 102, and the communication interface 103 are electrically connected to one another, directly or indirectly, to enable the transfer and interaction of data; for example, these components may be connected through one or more communication buses or signal lines. The memory 101 may be used to store software programs and modules, which the processor 102 executes to perform the various functional applications and data processing. The communication interface 103 may be used to communicate signaling or data with other node devices.
The memory 101 may be, but is not limited to, random access memory (RAM), read-only memory (ROM), programmable read-only memory (PROM), erasable programmable read-only memory (EPROM), electrically erasable programmable read-only memory (EEPROM), and the like.
The processor 102 may be an integrated circuit chip having signal processing capabilities. It may be a general-purpose processor, including a central processing unit (CPU), a network processor (NP), and the like; it may also be a digital signal processor (DSP), an application-specific integrated circuit (ASIC), a field-programmable gate array (FPGA) or other programmable logic device, a discrete gate or transistor logic device, or discrete hardware components.
In the embodiments provided in the present application, it should be understood that the disclosed method and system can also be implemented in other ways. The embodiments described above are merely illustrative; for example, the flowcharts and block diagrams in the figures illustrate the architecture, functionality, and operation of possible implementations of methods, systems, and computer program products according to various embodiments of the present application. In this regard, each block in the flowcharts or block diagrams may represent a module, segment, or portion of code comprising one or more executable instructions for implementing the specified logical function(s). It should also be noted that, in some alternative implementations, the functions noted in a block may occur out of the order noted in the figures: two blocks shown in succession may in fact be executed substantially concurrently, or sometimes in the reverse order, depending on the functionality involved. It will also be noted that each block of the block diagrams and/or flowcharts, and combinations of blocks therein, can be implemented by special-purpose hardware-based systems that perform the specified functions or acts, or by combinations of special-purpose hardware and computer instructions.
In addition, functional modules in the embodiments of the present application may be integrated together to form an independent part, or each module may exist separately, or two or more modules may be integrated to form an independent part.
In a fourth aspect, embodiments of the present application provide a computer-readable storage medium on which a computer program is stored; when executed by the processor 102, the program implements the method of any embodiment of the first aspect described above. If implemented in the form of software functional modules and sold or used as a stand-alone product, the functions may be stored in a computer-readable storage medium. Based on this understanding, the technical solution of the present application, or the portions of it that contribute to the prior art, may be embodied in the form of a software product stored in a storage medium and including instructions for causing a computer device (which may be a personal computer, a server, or a network device) to execute all or part of the steps of the methods of the embodiments of the present application. The aforementioned storage media include various media capable of storing program code, such as a USB flash drive, a removable hard disk, read-only memory (ROM), random access memory (RAM), a magnetic disk, or an optical disk.
The above is only a preferred embodiment of the present invention and is not intended to limit it; various modifications and changes may occur to those skilled in the art. Any modification, equivalent replacement, or improvement made within the spirit and principles of the present invention shall be included within its scope of protection.
It will be evident to those skilled in the art that the present application is not limited to the details of the foregoing illustrative embodiments, and that the present application may be embodied in other specific forms without departing from the spirit or essential attributes thereof. The present embodiments are therefore to be considered in all respects as illustrative and not restrictive, the scope of the application being indicated by the appended claims rather than by the foregoing description, and all changes which come within the meaning and range of equivalency of the claims are therefore intended to be embraced therein. Any reference sign in a claim should not be construed as limiting the claim concerned.
Claims (10)
1. A virtual visual angle tracking method applied in virtual reality, characterized by comprising the following steps:
acquiring, in real time, the world three-dimensional coordinates of a virtual target object and the three-dimensional coordinates of the user's visual origin;
calculating target relative coordinates from the world three-dimensional coordinates of the virtual target object and the three-dimensional coordinates of the user's visual origin;
filtering the target relative coordinates to obtain target two-dimensional plane coordinates;
calculating a target orientation rotation value from the target two-dimensional plane coordinates;
generating target three-dimensional coordinates from the target two-dimensional plane coordinates and a preset fixed longitudinal value;
and generating an indication icon according to the target three-dimensional coordinates and the target orientation rotation value.
2. The method as claimed in claim 1, characterized in that calculating the target relative coordinates from the world three-dimensional coordinates of the virtual target object and the three-dimensional coordinates of the user's visual origin comprises the following step:
calculating the difference between the world three-dimensional coordinates of the virtual target object and the three-dimensional coordinates of the user's visual origin to obtain the target relative coordinates.
3. The method as claimed in claim 1, characterized in that filtering the target relative coordinates to obtain the target two-dimensional plane coordinates comprises the following step:
filtering out the longitudinal value of the target relative coordinates to obtain the target two-dimensional plane coordinates.
4. The method as claimed in claim 1, characterized in that calculating the target orientation rotation value from the target two-dimensional plane coordinates comprises the following step:
calculating the target orientation rotation value from the target two-dimensional plane coordinates using a trigonometric formula.
5. The method as claimed in claim 1, characterized by further comprising the following step:
generating interface prompt information according to the indication icon.
6. The method as claimed in claim 1, characterized by further comprising the following step:
marking the indication icon with a preset icon color to obtain a new indication icon.
7. A virtual visual angle tracking system applied in virtual reality, characterized by comprising a basic coordinate acquisition module, a relative coordinate module, a coordinate filtering module, an orientation calculation module, a target coordinate generation module, and an icon indication module, wherein:
the basic coordinate acquisition module is used for acquiring, in real time, the world three-dimensional coordinates of a virtual target object and the three-dimensional coordinates of the user's visual origin;
the relative coordinate module is used for calculating target relative coordinates from the world three-dimensional coordinates of the virtual target object and the three-dimensional coordinates of the user's visual origin;
the coordinate filtering module is used for filtering the target relative coordinates to obtain target two-dimensional plane coordinates;
the orientation calculation module is used for calculating a target orientation rotation value from the target two-dimensional plane coordinates;
the target coordinate generation module is used for generating target three-dimensional coordinates from the target two-dimensional plane coordinates and a preset fixed longitudinal value;
and the icon indication module is used for generating an indication icon according to the target three-dimensional coordinates and the target orientation rotation value.
8. The system as claimed in claim 7, characterized in that the relative coordinate module comprises a difference calculation sub-module configured to calculate the difference between the world three-dimensional coordinates of the virtual target object and the three-dimensional coordinates of the user's visual origin to obtain the target relative coordinates.
9. An electronic device, comprising:
a memory for storing one or more programs;
a processor;
the one or more programs, when executed by the processor, implement the method of any of claims 1-6.
10. A computer-readable storage medium, on which a computer program is stored which, when being executed by a processor, carries out the method according to any one of claims 1-6.
Priority Applications (1)

| Application Number | Priority Date | Filing Date | Title |
| --- | --- | --- | --- |
| CN202111501788.0A (granted as CN114138120B) | 2021-12-09 | 2021-12-09 | Virtual visual angle tracking method and system applied to virtual reality |
Applications Claiming Priority (1)

| Application Number | Priority Date | Filing Date | Title |
| --- | --- | --- | --- |
| CN202111501788.0A (granted as CN114138120B) | 2021-12-09 | 2021-12-09 | Virtual visual angle tracking method and system applied to virtual reality |
Publications (2)

| Publication Number | Publication Date |
| --- | --- |
| CN114138120A | 2022-03-04 |
| CN114138120B | 2023-02-14 |
Family
ID=80385490
Family Applications (1)

| Application Number | Title | Priority Date | Filing Date |
| --- | --- | --- | --- |
| CN202111501788.0A (granted as CN114138120B, Active) | Virtual visual angle tracking method and system applied to virtual reality | 2021-12-09 | 2021-12-09 |
Country Status (1)

| Country | Link |
| --- | --- |
| CN | CN114138120B |
Cited By (2)

| Publication number | Priority date | Publication date | Assignee | Title |
| --- | --- | --- | --- | --- |
| CN115454255A * | 2022-10-09 | 2022-12-09 | 如你所视(北京)科技有限公司 | Article display switching method and device, electronic equipment and storage medium |
| CN116185244A * | 2023-04-27 | 2023-05-30 | 山东捷瑞数字科技股份有限公司 | Method, system, device and medium for operating object rotation in three-dimensional scene |
Citations (5)

| Publication number | Priority date | Publication date | Assignee | Title |
| --- | --- | --- | --- | --- |
| JP2006285833A * | 2005-04-04 | 2006-10-19 | Sony Corp | Information processor and method, recording medium and program |
| CN105378625A * | 2013-06-25 | 2016-03-02 | 微软技术许可有限责任公司 | Indicating out-of-view augmented reality images |
| CN107957781A * | 2017-12-13 | 2018-04-24 | 北京小米移动软件有限公司 | Method for information display and device |
| US20180348854A1 * | 2017-05-31 | 2018-12-06 | Occipital, Inc. | Six degree of freedom tracking with scale recovery and obstacle avoidance |
| CN111672126A * | 2020-05-29 | 2020-09-18 | 腾讯科技(深圳)有限公司 | Information display method, device, equipment and storage medium |
- 2021-12-09: CN application CN202111501788.0A filed; granted as patent CN114138120B (Active)
Patent Citations (6)

| Publication number | Priority date | Publication date | Assignee | Title |
| --- | --- | --- | --- | --- |
| JP2006285833A * | 2005-04-04 | 2006-10-19 | Sony Corp | Information processor and method, recording medium and program |
| CN105378625A * | 2013-06-25 | 2016-03-02 | 微软技术许可有限责任公司 | Indicating out-of-view augmented reality images |
| US20180348854A1 * | 2017-05-31 | 2018-12-06 | Occipital, Inc. | Six degree of freedom tracking with scale recovery and obstacle avoidance |
| CN107957781A * | 2017-12-13 | 2018-04-24 | 北京小米移动软件有限公司 | Method for information display and device |
| CN111672126A * | 2020-05-29 | 2020-09-18 | 腾讯科技(深圳)有限公司 | Information display method, device, equipment and storage medium |
| WO2021238870A1 * | 2020-05-29 | 2021-12-02 | 腾讯科技(深圳)有限公司 | Information display method and apparatus, and device and storage medium |
Non-Patent Citations (1)

| Title |
| --- |
| TORBEN SCHINKE et al., "Visualization of Off-Screen Objects", Proceedings of the 12th International Conference * |
Cited By (3)

| Publication number | Priority date | Publication date | Assignee | Title |
| --- | --- | --- | --- | --- |
| CN115454255A * | 2022-10-09 | 2022-12-09 | 如你所视(北京)科技有限公司 | Article display switching method and device, electronic equipment and storage medium |
| CN115454255B | 2022-10-09 | 2024-02-13 | 如你所视(北京)科技有限公司 | Switching method and device for article display, electronic equipment and storage medium |
| CN116185244A * | 2023-04-27 | 2023-05-30 | 山东捷瑞数字科技股份有限公司 | Method, system, device and medium for operating object rotation in three-dimensional scene |
Also Published As

| Publication number | Publication date |
| --- | --- |
| CN114138120B | 2023-02-14 |
Similar Documents

| Publication | Title |
| --- | --- |
| CN114138120B | Virtual visual angle tracking method and system applied to virtual reality |
| US10192364B2 | Augmented reality product preview |
| CN110568447B | Visual positioning method, device and computer readable medium |
| US9805505B2 | Estimation of object properties in 3D world |
| CN112729327B | Navigation method, navigation device, computer equipment and storage medium |
| US11880956B2 | Image processing method and apparatus, and computer storage medium |
| US20210041945A1 | Machine learning based gaze estimation with confidence |
| JP6548967B2 | Image processing apparatus, image processing method and program |
| CN111950521A | Augmented reality interaction method and device, electronic equipment and storage medium |
| CN107862733B | Large-scale scene real-time three-dimensional reconstruction method and system based on sight updating algorithm |
| US20160119615A1 | Three dimensional data visualization |
| KR102310994B1 | Computing apparatus and method for providing 3-dimensional interaction |
| CN114202640A | Data acquisition method and device, computer equipment and storage medium |
| CN114387347A | Method and device for determining external parameter calibration, electronic equipment and medium |
| CN114758075A | Method, apparatus and storage medium for generating three-dimensional label |
| CN113256799A | Three-dimensional face model training method and device |
| CN111569421A | Virtual scene change synchronization method and system, VR playing equipment and storage medium |
| CN108779979A | Relief map |
| CN109934058B | Face image processing method, face image processing device, electronic apparatus, storage medium, and program |
| CN117611664A | Image information and inertial navigation data fusion method |
| EP3088991A1 | Wearable device and method for enabling user interaction |
| CN112114659B | Method and system for determining a fine gaze point of a user |
| JP6487545B2 | Recognition calculation device, recognition calculation method, and recognition calculation program |
| CN109003245B | Coordinate processing method and device and electronic equipment |
| CN110390686A | Naked eye 3D display method and system |
Legal Events

| Date | Code | Title | Description |
| --- | --- | --- | --- |
| | PB01 | Publication | |
| | SE01 | Entry into force of request for substantive examination | |
| | TA01 | Transfer of patent application right | Effective date of registration: 2022-04-27. Address after: 100000, 01-1901, 16/F, Building 1, Yard 8, Gongtiyu North Road, Chaoyang District, Beijing; applicant after: Beijing Bingzhen Shanzhu Technology Co., Ltd. Address before: 041000, No. 1704, Unit 4, Yingcaiyuan Community, Caishenlou North Street, Yaodu District, Linfen City, Shanxi Province; applicant before: Zhang Senmiao |
| | GR01 | Patent grant | |