CN114138120B - Virtual visual angle tracking method and system applied to virtual reality - Google Patents

Virtual visual angle tracking method and system applied to virtual reality

Info

Publication number: CN114138120B
Authority: CN (China)
Application number: CN202111501788.0A
Other languages: Chinese (zh)
Other versions: CN114138120A
Inventor: 张森淼
Current Assignee: Beijing Bingzhen Shanzhu Technology Co ltd
Original Assignee: Beijing Bingzhen Shanzhu Technology Co ltd
Application filed by: Beijing Bingzhen Shanzhu Technology Co ltd
Priority: CN202111501788.0A
Prior art keywords: target, coordinate, dimensional, virtual, coordinates
Legal status: Active (granted)

Classifications

    • G — PHYSICS
    • G06 — COMPUTING; CALCULATING OR COUNTING
    • G06F — ELECTRIC DIGITAL DATA PROCESSING
    • G06F 3/00 — Input arrangements for transferring data to be processed into a form capable of being handled by the computer; output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
    • G06F 3/01 — Input arrangements or combined input and output arrangements for interaction between user and computer
    • G06F 3/011 — Arrangements for interaction with the human body, e.g. for user immersion in virtual reality
    • G06F 3/013 — Eye tracking input arrangements
    • G06F 3/048 — Interaction techniques based on graphical user interfaces [GUI]
    • G06F 3/0481 — Interaction techniques based on specific properties of the displayed interaction object or a metaphor-based environment, e.g. interaction with desktop elements like windows or icons, or assisted by a cursor's changing behaviour or appearance
    • G06F 3/04817 — Interaction techniques using icons
    • G06T — IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T 7/00 — Image analysis
    • G06T 7/70 — Determining position or orientation of objects or cameras

Abstract

The invention discloses a virtual visual angle tracking method and system applied to virtual reality, relating to the technical field of virtual reality. The method comprises the following steps: acquiring, in real time, the world three-dimensional coordinate of a virtual target object and the three-dimensional coordinate of the user's visual origin; calculating a target relative coordinate from the virtual target object's world three-dimensional coordinate and the user's visual origin three-dimensional coordinate; filtering the target relative coordinate to obtain a target two-dimensional plane coordinate; calculating a target orientation rotation value from the target two-dimensional plane coordinate; generating a target three-dimensional coordinate from the target two-dimensional plane coordinate and a preset fixed longitudinal value; and generating an indication icon from the target three-dimensional coordinate and the target orientation rotation value. The invention provides a 360-degree orientation guidance prompt for targets outside the field of view in a virtual reality scene, improving the quality of the user experience.

Description

Virtual visual angle tracking method and system applied to virtual reality
Technical Field
The invention relates to the technical field of virtual reality, in particular to a virtual visual angle tracking method and a virtual visual angle tracking system applied to virtual reality.
Background
Virtual reality presents content as a 720-degree panorama. While this brings immersion, it also makes the user lose their sense of direction more easily than with traditional 2D content. Few virtual reality products on the market offer intuitive viewing-angle or direction guidance; they generally rely on color and brightness contrast on a visible target to attract the user's gaze. For a target outside the field of view, no guidance can be given at all, and the user easily becomes disoriented and loses the target.
Disclosure of Invention
In order to overcome the above problems, or at least partially solve them, embodiments of the present invention provide a virtual perspective tracking method and system applied in virtual reality, which can give a 360-degree orientation guidance prompt for a target outside the field of view in a virtual reality scene, thereby improving the quality of the user experience.
The embodiment of the invention is realized by the following steps:
in a first aspect, an embodiment of the present invention provides a virtual perspective tracking method applied in virtual reality, including the following steps:
acquiring a world three-dimensional coordinate of a virtual target object and a user vision origin three-dimensional coordinate in real time;
calculating to obtain a target relative coordinate according to the world three-dimensional coordinate of the virtual target object and the three-dimensional coordinate of the user visual origin;
filtering the relative coordinates of the target to obtain two-dimensional plane coordinates of the target;
calculating to obtain a target orientation rotation value according to the target two-dimensional plane coordinate;
generating a target three-dimensional coordinate according to the target two-dimensional plane coordinate and a preset fixed longitudinal value;
and generating an indication icon according to the target three-dimensional coordinate and the target orientation rotation value.
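The six steps above can be sketched as follows. This is a minimal illustration, not the patented implementation: the coordinate convention (y as the longitudinal/vertical axis), the function name, and the `fixed_height` default are all assumptions.

```python
import math

def track_target(target_world, origin_world, fixed_height=1.5):
    """Sketch of the claimed pipeline; names and conventions are assumed."""
    # Step 2: target relative coordinate = world coordinate minus visual origin
    rel = tuple(t - o for t, o in zip(target_world, origin_world))
    # Step 3: filter out the longitudinal (here: y) value -> 2-D plane coordinate
    plane = (rel[0], rel[2])
    # Step 4: target orientation rotation value (yaw about the vertical axis)
    yaw_degrees = math.degrees(math.atan2(plane[0], plane[1]))
    # Step 5: re-attach a preset fixed longitudinal value -> icon's 3-D coordinate
    icon_coordinate = (plane[0], fixed_height, plane[1])
    # Step 6: the indication icon would be positioned and rotated from these values
    return icon_coordinate, yaw_degrees
```

For a target one unit to the right and one unit ahead of the visual origin, the sketch places the icon at the fixed height and rotates it 45 degrees toward the target.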
In the prior art, no guidance can be given for a target outside the field of view, and the user easily becomes disoriented and loses the target. To solve this problem, the method calculates the target's three-dimensional coordinate relative to the visual origin from the virtual target object's world three-dimensional coordinate; filters out the longitudinal value to obtain a two-dimensional coordinate relative to the visual origin, i.e. a vector relative to the visual center; calculates a rotation value toward the target from that vector; adds back a preset fixed longitudinal value to obtain a three-dimensional coordinate within the field of view; and finally updates the coordinate and rotation value of the prompt icon, thereby accurately indicating the target's direction to the user. In a virtual reality scene, even when the target is not in the field of view, the user can sense its approximate direction from the orientation of the prompt icon, which guides the user to turn toward it. Established procedures are thus completed in an orderly, smooth manner, the confusion of first-time users is greatly reduced, and experience quality is improved. In short, a 360-degree orientation guidance prompt can be provided for targets outside the field of view in a virtual reality scene, improving the quality of the user experience.
Based on the first aspect, in some embodiments of the present invention, the method for calculating the target relative coordinates according to the virtual target object world three-dimensional coordinates and the user visual origin three-dimensional coordinates includes the following steps:
and calculating the difference value between the world three-dimensional coordinate of the virtual target object and the three-dimensional coordinate of the visual origin of the user to obtain the relative coordinate of the target.
Based on the first aspect, in some embodiments of the present invention, the method for filtering the relative coordinates of the target to obtain the two-dimensional plane coordinates of the target includes the following steps:
and filtering the longitudinal value of the relative coordinate of the target to obtain the two-dimensional plane coordinate of the target.
Based on the first aspect, in some embodiments of the present invention, the method for calculating the target orientation rotation value according to the target two-dimensional plane coordinates includes the following steps:
and calculating by adopting a trigonometric function formula according to the target two-dimensional plane coordinate to obtain a target orientation rotation value.
Based on the first aspect, in some embodiments of the present invention, the virtual perspective tracking method applied in virtual reality further includes the following steps:
and generating interface prompt information according to the indication icon.
Based on the first aspect, in some embodiments of the present invention, the virtual perspective tracking method applied in virtual reality further includes the following steps:
and marking the indicating icon according to the preset icon color to obtain a new indicating icon.
In a second aspect, an embodiment of the present invention provides a virtual perspective tracking system applied in virtual reality, including a basic coordinate obtaining module, a relative coordinate module, a coordinate filtering module, an orientation calculating module, a target coordinate generating module, and an icon indicating module, where:
the basic coordinate acquisition module is used for acquiring a three-dimensional world coordinate of the virtual target object and a three-dimensional user visual origin coordinate in real time;
the relative coordinate module is used for calculating to obtain a target relative coordinate according to the virtual target object world three-dimensional coordinate and the user vision origin three-dimensional coordinate;
the coordinate filtering module is used for filtering the relative coordinates of the target to obtain two-dimensional plane coordinates of the target;
the orientation calculation module is used for calculating a target orientation rotation value according to the target two-dimensional plane coordinates;
the target coordinate generating module is used for generating a target three-dimensional coordinate according to the target two-dimensional plane coordinate and a preset fixed longitudinal value;
and the icon indicating module is used for generating an indicating icon according to the target three-dimensional coordinate and the target orientation rotation value.
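The module decomposition above can be sketched as a small class with one method per claimed module. All names are illustrative assumptions; the patent does not prescribe an API:

```python
import math

class PerspectiveTrackingSystem:
    """Hypothetical sketch mirroring the claimed modules."""

    def __init__(self, fixed_longitudinal_value=1.5):   # preset value (assumed)
        self.fixed = fixed_longitudinal_value

    def relative_coordinate(self, target, origin):      # relative coordinate module
        return tuple(t - o for t, o in zip(target, origin))

    def filter_coordinate(self, rel):                   # coordinate filtering module
        return (rel[0], rel[2])                         # drop the longitudinal value

    def orientation(self, plane):                       # orientation calculation module
        return math.degrees(math.atan2(plane[0], plane[1]))

    def target_coordinate(self, plane):                 # target coordinate generation module
        return (plane[0], self.fixed, plane[1])

    def indicate(self, target, origin):                 # icon indication module
        plane = self.filter_coordinate(self.relative_coordinate(target, origin))
        return self.target_coordinate(plane), self.orientation(plane)
```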
In the prior art, no guidance can be given for a target outside the field of view, and the user easily becomes disoriented and loses the target. To solve this problem, the system acquires the world three-dimensional coordinate of the virtual target object through the basic coordinate acquisition module; calculates the three-dimensional coordinate relative to the visual origin through the relative coordinate module; filters out its longitudinal value through the coordinate filtering module to obtain a two-dimensional coordinate relative to the visual origin, i.e. a vector relative to the visual center; calculates a rotation value toward the target through the orientation calculation module; adds back a preset fixed longitudinal value through the target coordinate generation module to obtain a three-dimensional coordinate within the field of view; and updates the coordinate and rotation value of the prompt icon through the icon indication module, thereby accurately indicating the target's direction to the user. In a virtual reality scene, even when the target is not in the field of view, the user can sense its approximate direction from the orientation of the prompt icon, which guides the user to turn toward it. Established procedures are thus completed in an orderly, smooth manner, the confusion of first-time users is greatly reduced, and experience quality is improved. A 360-degree orientation guidance prompt can be provided for targets outside the field of view in a virtual reality scene, improving the quality of the user experience.
Based on the second aspect, in some embodiments of the present invention, the relative coordinate module includes a difference calculation sub-module, configured to calculate a difference between the world three-dimensional coordinates of the virtual target object and the three-dimensional coordinates of the visual origin of the user, so as to obtain the target relative coordinates.
In a third aspect, an embodiment of the present application provides an electronic device, including a memory for storing one or more programs and a processor. The one or more programs, when executed by the processor, implement the method of any of the first aspects described above.
In a fourth aspect, embodiments of the present application provide a computer-readable storage medium, on which a computer program is stored, which, when executed by a processor, implements the method according to any one of the first aspect described above.
The embodiment of the invention at least has the following advantages or beneficial effects:
the embodiment of the invention provides a virtual visual angle tracking method and a virtual visual angle tracking system applied to virtual reality, which solve the technical problems that in the prior art, a guide function or function cannot be realized for an experiencer for an object outside a visual field, and the experiencer is easy to get into a confused feeling of losing the object. The orientation guidance prompt of 360 degrees can be carried out on the target outside the field of view in the virtual reality scene, and the user experience quality is improved.
Drawings
In order to more clearly illustrate the technical solutions of the embodiments of the present invention, the drawings needed in the embodiments are briefly described below. It should be understood that the following drawings illustrate only some embodiments of the present invention and therefore should not be considered limiting of its scope. Those skilled in the art can obtain other related drawings from these drawings without inventive effort.
Fig. 1 is a flowchart of a virtual perspective tracking method applied to virtual reality according to an embodiment of the present invention;
FIG. 2 is a schematic block diagram of a virtual perspective tracking system for virtual reality according to an embodiment of the present invention;
fig. 3 is a block diagram of an electronic device according to an embodiment of the present invention.
Icon: 100. a basic coordinate acquisition module; 200. a relative coordinate module; 210. a difference value calculation submodule; 300. a coordinate filtering module; 400. an orientation calculation module; 500. a target coordinate generation module; 600. an icon indicating module; 101. a memory; 102. a processor; 103. a communication interface.
Detailed Description
In order to make the objects, technical solutions and advantages of the embodiments of the present invention clearer, the technical solutions in the embodiments of the present invention will be clearly and completely described below with reference to the drawings in the embodiments of the present invention, and it is obvious that the described embodiments are some, but not all, embodiments of the present invention. The components of embodiments of the present invention generally described and illustrated in the figures herein may be arranged and designed in a wide variety of different configurations.
Thus, the following detailed description of the embodiments of the present invention, presented in the figures, is not intended to limit the scope of the invention, as claimed, but is merely representative of selected embodiments of the invention. All other embodiments, which can be derived by a person skilled in the art from the embodiments given herein without making any creative effort, shall fall within the protection scope of the present invention.
It should be noted that: like reference numbers and letters refer to like items in the following figures, and thus, once an item is defined in one figure, it need not be further defined and explained in subsequent figures.
It is noted that, herein, relational terms such as first and second may be used solely to distinguish one entity or action from another, without necessarily requiring or implying any actual such relationship or order between such entities or actions. The terms "comprises," "comprising," or any other variation thereof are intended to cover a non-exclusive inclusion, such that a process, method, article, or apparatus that comprises a list of elements does not include only those elements but may include other elements not expressly listed or inherent to such process, method, article, or apparatus. Without further limitation, an element introduced by the phrase "comprising a …" does not exclude the presence of additional identical elements in the process, method, article, or apparatus that comprises the element.
Examples
As shown in fig. 1, in a first aspect, an embodiment of the present invention provides a virtual perspective tracking method applied in virtual reality, including the following steps:
s1, acquiring a world three-dimensional coordinate of a virtual target object and a three-dimensional coordinate of a user visual origin in real time;
s2, calculating to obtain a target relative coordinate according to the world three-dimensional coordinate of the virtual target object and the three-dimensional coordinate of the user visual origin;
further, the difference value of the world three-dimensional coordinates of the virtual target object and the three-dimensional coordinates of the visual origin of the user is calculated to obtain the relative coordinates of the target.
S3, filtering the relative coordinates of the target to obtain two-dimensional plane coordinates of the target;
further, the longitudinal value of the target relative coordinate is filtered to obtain the target two-dimensional plane coordinate.
S4, calculating according to the target two-dimensional plane coordinates to obtain a target orientation rotation value;
and further, calculating by adopting a trigonometric function formula according to the target two-dimensional plane coordinate to obtain a target orientation rotation value.
S5, generating a target three-dimensional coordinate according to the target two-dimensional plane coordinate and a preset fixed longitudinal value;
and S6, generating an indication icon according to the target three-dimensional coordinate and the target orientation rotation value.
Further, the virtual perspective tracking method applied to virtual reality further comprises the following steps: and generating interface prompt information according to the indication icon.
Further, the virtual perspective tracking method applied to virtual reality further comprises the following steps: and marking the indicating icon according to the preset icon color to obtain a new indicating icon.
In the prior art, no guidance can be given for a target outside the field of view, and the user easily becomes disoriented and loses the target. To solve this problem, the method obtains the world three-dimensional coordinate of the virtual target object from the three-dimensional software and subtracts the three-dimensional coordinate of the visual origin to obtain the target relative coordinate. The longitudinal value of the relative coordinate is then filtered out to obtain its two-dimensional plane coordinate. Based on this plane coordinate, a trigonometric function value is calculated, and the corresponding angle is found from a trigonometric function table; this angle is the target orientation rotation value. Finally, the coordinate and rotation value of the prompt icon are updated according to the target three-dimensional coordinate and the target orientation rotation value, making the icon visible and thereby prompting the user.
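The table-lookup step described in this embodiment can be sketched as follows; the 1-degree table granularity and the sign handling for the quadrant are assumptions not stated in the text.

```python
import math

# Tangent table at 1-degree steps over (-89, 89); the quadrant is then fixed
# using the sign of the forward coordinate, since tangent repeats every 180 degrees.
TAN_TABLE = [(deg, math.tan(math.radians(deg))) for deg in range(-89, 90)]

def angle_from_table(x, z):
    if z == 0:
        return 90.0 if x > 0 else 270.0
    ratio = x / z  # the trigonometric function value computed from the coordinates
    deg, _ = min(TAN_TABLE, key=lambda entry: abs(entry[1] - ratio))
    return float(deg) % 360.0 if z > 0 else (deg + 180.0) % 360.0
```

In practice `math.atan2` would give the same angle without a table; the lookup is shown only to mirror the trigonometric-table search described above.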
In a virtual reality scene, even when the target is not in the field of view, the user can sense its approximate direction from the orientation of the prompt icon, which guides the user to turn toward it. Established procedures are thus completed in an orderly, smooth manner, the confusion of first-time users is greatly reduced, and experience quality is improved; combined with UI prompts and color and brightness highlighting of the target, an even more pronounced guiding effect is achieved. A 360-degree orientation guidance prompt can be provided for targets outside the field of view in a virtual reality scene, improving the quality of the user experience.
As shown in fig. 2, in a second aspect, an embodiment of the present invention provides a virtual perspective tracking system applied in virtual reality, including a basic coordinate obtaining module 100, a relative coordinate module 200, a coordinate filtering module 300, an orientation calculating module 400, a target coordinate generating module 500, and an icon indicating module 600, wherein:
a basic coordinate acquiring module 100, configured to acquire a three-dimensional world coordinate of a virtual target object and a three-dimensional user visual origin coordinate in real time;
the relative coordinate module 200 is used for calculating a target relative coordinate according to the world three-dimensional coordinate of the virtual target object and the three-dimensional coordinate of the user visual origin;
further, the relative coordinate module 200 includes a difference calculation sub-module 210 for calculating a difference between the world three-dimensional coordinates of the virtual target object and the three-dimensional coordinates of the visual origin of the user to obtain the target relative coordinates.
The coordinate filtering module 300 is configured to filter the target relative coordinates to obtain target two-dimensional plane coordinates;
the orientation calculation module 400 is configured to calculate a target orientation rotation value according to the target two-dimensional plane coordinate;
the target coordinate generating module 500 is used for generating a target three-dimensional coordinate according to the target two-dimensional plane coordinate and a preset fixed longitudinal value;
and an icon indicating module 600, configured to generate an indicating icon according to the target three-dimensional coordinate and the target orientation rotation value.
In the prior art, no guidance can be given for a target outside the field of view, and the user easily becomes disoriented and loses the target. To solve this problem, the system acquires the world three-dimensional coordinate of the virtual target object through the basic coordinate acquisition module 100; calculates the three-dimensional coordinate relative to the visual origin through the relative coordinate module 200; filters out its longitudinal value through the coordinate filtering module 300 to obtain a two-dimensional coordinate relative to the visual origin, i.e. a vector relative to the visual center; calculates a rotation value toward the target through the orientation calculation module 400; adds back a preset fixed longitudinal value through the target coordinate generation module 500 to obtain a three-dimensional coordinate within the field of view; and updates the coordinate and rotation value of the prompt icon through the icon indication module 600, thereby accurately indicating the target's direction to the user. In a virtual reality scene, even when the target is not in the field of view, the user can sense its approximate direction from the orientation of the prompt icon, which guides the user to turn toward it. Established procedures are thus completed in an orderly, smooth manner, the confusion of first-time users is greatly reduced, and experience quality is improved; combined with UI prompts and color and brightness highlighting of the target, an even more pronounced guiding effect is achieved. A 360-degree orientation guidance prompt can be provided for targets outside the field of view in a virtual reality scene, improving the quality of the user experience.
As shown in fig. 3, in a third aspect, an embodiment of the present application provides an electronic device, including a memory 101 for storing one or more programs and a processor 102. The one or more programs, when executed by the processor 102, implement the method of any of the first aspects described above.
Also included is a communication interface 103, and the memory 101, processor 102 and communication interface 103 are electrically connected to each other, directly or indirectly, to enable transfer or interaction of data. For example, the components may be electrically connected to each other via one or more communication buses or signal lines. The memory 101 may be used to store software programs and modules, and the processor 102 executes the software programs and modules stored in the memory 101 to thereby execute various functional applications and data processing. The communication interface 103 may be used for communicating signaling or data with other node devices.
The memory 101 may be, but is not limited to, a Random Access Memory (RAM), a Read-Only Memory (ROM), a Programmable Read-Only Memory (PROM), an Erasable Programmable Read-Only Memory (EPROM), an Electrically Erasable Programmable Read-Only Memory (EEPROM), and the like.
The processor 102 may be an integrated circuit chip having signal-processing capability. It may be a general-purpose processor, including a Central Processing Unit (CPU), a Network Processor (NP), and the like; it may also be a Digital Signal Processor (DSP), an Application-Specific Integrated Circuit (ASIC), a Field-Programmable Gate Array (FPGA) or other programmable logic device, a discrete gate or transistor logic device, or discrete hardware components.
In the embodiments provided in the present application, it should be understood that the disclosed method and system can be implemented in other ways. The embodiments described above are merely illustrative; for example, the flowcharts and block diagrams in the figures illustrate the architecture, functionality, and operation of possible implementations of systems, methods, and computer program products according to various embodiments of the present application. In this regard, each block in the flowchart or block diagrams may represent a module, segment, or portion of code, which comprises one or more executable instructions for implementing the specified logical function(s). It should also be noted that, in some alternative implementations, the functions noted in a block may occur out of the order noted in the figures: two blocks shown in succession may in fact be executed substantially concurrently, or sometimes in the reverse order, depending on the functionality involved. Each block of the block diagrams and/or flowchart illustration, and combinations of such blocks, can be implemented by special-purpose hardware-based systems that perform the specified functions or acts, or by combinations of special-purpose hardware and computer instructions.
In addition, functional modules in the embodiments of the present application may be integrated together to form an independent part, or each module may exist separately, or two or more modules may be integrated to form an independent part.
In a fourth aspect, embodiments of the present application provide a computer-readable storage medium, on which a computer program is stored, which, when executed by the processor 102, implements the method according to any one of the first aspects described above. The functions, if implemented in the form of software functional modules and sold or used as a stand-alone product, may be stored in a computer-readable storage medium. Based on such understanding, the technical solution of the present application, or the portions thereof that substantially contribute to the prior art, may be embodied in the form of a software product stored in a storage medium and including instructions for causing a computer device (which may be a personal computer, a server, or a network device) to execute all or part of the steps of the method according to the embodiments of the present application. The aforementioned storage medium includes: a U-disk, a removable hard disk, a Read-Only Memory (ROM), a Random Access Memory (RAM), a magnetic disk, an optical disk, and other media capable of storing program code.
The present invention has been described in terms of the preferred embodiment, and it is not intended to be limited to the embodiment. Any modification, equivalent replacement, or improvement made within the spirit and principle of the present invention should be included in the protection scope of the present invention.
It will be evident to those skilled in the art that the application is not limited to the details of the foregoing illustrative embodiments, and that the present application may be embodied in other specific forms without departing from the spirit or essential attributes thereof. The present embodiments are therefore to be considered in all respects as illustrative and not restrictive, the scope of the application being indicated by the appended claims rather than by the foregoing description, and all changes which come within the meaning and range of equivalency of the claims are therefore intended to be embraced therein. Any reference sign in a claim should not be construed as limiting the claim concerned.

Claims (8)

1. A virtual visual angle tracking method applied to virtual reality, characterized by comprising the following steps:
acquiring, in real time, world three-dimensional coordinates of a virtual target object and three-dimensional coordinates of a user's visual origin;
calculating target relative coordinates according to the world three-dimensional coordinates of the virtual target object and the three-dimensional coordinates of the user's visual origin;
filtering the target relative coordinates to obtain target two-dimensional plane coordinates, specifically: filtering out the longitudinal value of the target relative coordinates to obtain the target two-dimensional plane coordinates;
calculating a target orientation rotation value according to the target two-dimensional plane coordinates;
generating target three-dimensional coordinates according to the target two-dimensional plane coordinates and a preset fixed longitudinal value, the fixed longitudinal value being the distance from the user's visual origin to the target field of view;
generating an indication icon according to the target three-dimensional coordinates and the target orientation rotation value;
and further comprising the step of:
marking the indication icon with a preset icon color to obtain a new indication icon.
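The steps of claim 1 can be illustrated with a short sketch. This is not the patented implementation; all function and parameter names are assumptions, the longitudinal axis is assumed to be Y, and the trigonometric formula of claim 3 is assumed to be a quadrant-aware arctangent:

```python
import math

def make_indicator(target_world, eye_origin, fixed_y, icon_color="red"):
    """Illustrative sketch of the claimed pipeline (names are assumptions).

    target_world / eye_origin: (x, y, z) tuples, with y as the longitudinal axis.
    fixed_y: the preset fixed longitudinal value, i.e. the distance from the
    user's visual origin to the target field of view.
    """
    # Step 1: target relative coordinates = world coordinates minus
    # visual-origin coordinates (the difference of claim 2).
    rel = tuple(t - e for t, e in zip(target_world, eye_origin))

    # Step 2: filter out the longitudinal (y) value, keeping the
    # two-dimensional plane coordinates (x, z).
    plane = (rel[0], rel[2])

    # Step 3: target orientation rotation value via a trigonometric
    # formula (claim 3); atan2 is one such quadrant-aware choice.
    rotation = math.degrees(math.atan2(plane[0], plane[1]))

    # Step 4: rebuild target three-dimensional coordinates from the plane
    # coordinates and the preset fixed longitudinal value.
    target_3d = (plane[0], fixed_y, plane[1])

    # Step 5: the indication icon carries position, rotation, and the
    # preset icon color used to mark it.
    return {"position": target_3d, "rotation": rotation, "color": icon_color}
```

For example, with a target at (10, 2, 5), a visual origin at (4, 1, 1), and a fixed longitudinal value of 1.5, the relative coordinates are (6, 1, 4), the plane coordinates (6, 4), and the icon is placed at (6, 1.5, 4) with a rotation of about 56.31 degrees.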
2. The method as claimed in claim 1, wherein calculating the target relative coordinates according to the world three-dimensional coordinates of the virtual target object and the three-dimensional coordinates of the user's visual origin comprises:
calculating the difference between the world three-dimensional coordinates of the virtual target object and the three-dimensional coordinates of the user's visual origin to obtain the target relative coordinates.
3. The method as claimed in claim 1, wherein calculating the target orientation rotation value according to the target two-dimensional plane coordinates comprises:
calculating the target orientation rotation value from the target two-dimensional plane coordinates using a trigonometric function formula.
4. The method as claimed in claim 1, further comprising the step of:
generating interface prompt information according to the indication icon.
5. A virtual visual angle tracking system applied to virtual reality, characterized by comprising a basic coordinate acquisition module, a relative coordinate module, a coordinate filtering module, an orientation calculation module, a target coordinate generation module and an icon indication module, wherein:
the basic coordinate acquisition module is used for acquiring, in real time, world three-dimensional coordinates of a virtual target object and three-dimensional coordinates of a user's visual origin;
the relative coordinate module is used for calculating target relative coordinates according to the world three-dimensional coordinates of the virtual target object and the three-dimensional coordinates of the user's visual origin;
the coordinate filtering module is used for filtering the target relative coordinates to obtain target two-dimensional plane coordinates, specifically: filtering out the longitudinal value of the target relative coordinates to obtain the target two-dimensional plane coordinates;
the orientation calculation module is used for calculating a target orientation rotation value according to the target two-dimensional plane coordinates;
the target coordinate generation module is used for generating target three-dimensional coordinates according to the target two-dimensional plane coordinates and a preset fixed longitudinal value, the fixed longitudinal value being the distance from the user's visual origin to the target field of view;
the icon indication module is used for generating an indication icon according to the target three-dimensional coordinates and the target orientation rotation value, and is further used for marking the indication icon with a preset icon color to obtain a new indication icon.
6. The system as claimed in claim 5, wherein the relative coordinate module comprises a difference calculation sub-module for calculating the difference between the world three-dimensional coordinates of the virtual target object and the three-dimensional coordinates of the user's visual origin to obtain the target relative coordinates.
7. An electronic device, comprising:
a memory for storing one or more programs;
a processor;
the one or more programs, when executed by the processor, implement the method of any of claims 1-4.
8. A computer-readable storage medium on which a computer program is stored which, when executed by a processor, implements the method according to any one of claims 1-4.
CN202111501788.0A 2021-12-09 2021-12-09 Virtual visual angle tracking method and system applied to virtual reality Active CN114138120B (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN202111501788.0A CN114138120B (en) 2021-12-09 2021-12-09 Virtual visual angle tracking method and system applied to virtual reality

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
CN202111501788.0A CN114138120B (en) 2021-12-09 2021-12-09 Virtual visual angle tracking method and system applied to virtual reality

Publications (2)

Publication Number Publication Date
CN114138120A CN114138120A (en) 2022-03-04
CN114138120B true CN114138120B (en) 2023-02-14

Family

ID=80385490

Family Applications (1)

Application Number Title Priority Date Filing Date
CN202111501788.0A Active CN114138120B (en) 2021-12-09 2021-12-09 Virtual visual angle tracking method and system applied to virtual reality

Country Status (1)

Country Link
CN (1) CN114138120B (en)

Families Citing this family (2)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN115454255B (en) * 2022-10-09 2024-02-13 如你所视(北京)科技有限公司 Switching method and device for article display, electronic equipment and storage medium
CN116185244B (en) * 2023-04-27 2023-08-01 山东捷瑞数字科技股份有限公司 Method, system, device and medium for operating object rotation in three-dimensional scene

Citations (4)

Publication number Priority date Publication date Assignee Title
JP2006285833A (en) * 2005-04-04 2006-10-19 Sony Corp Information processor and method, recording medium and program
CN105378625A (en) * 2013-06-25 2016-03-02 微软技术许可有限责任公司 Indicating out-of-view augmented reality images
CN107957781A (en) * 2017-12-13 2018-04-24 北京小米移动软件有限公司 Method for information display and device
CN111672126A (en) * 2020-05-29 2020-09-18 腾讯科技(深圳)有限公司 Information display method, device, equipment and storage medium

Family Cites Families (1)

Publication number Priority date Publication date Assignee Title
US10809795B2 (en) * 2017-05-31 2020-10-20 Occipital, Inc. Six degree of freedom tracking with scale recovery and obstacle avoidance

Patent Citations (5)

Publication number Priority date Publication date Assignee Title
JP2006285833A (en) * 2005-04-04 2006-10-19 Sony Corp Information processor and method, recording medium and program
CN105378625A (en) * 2013-06-25 2016-03-02 微软技术许可有限责任公司 Indicating out-of-view augmented reality images
CN107957781A (en) * 2017-12-13 2018-04-24 北京小米移动软件有限公司 Method for information display and device
CN111672126A (en) * 2020-05-29 2020-09-18 腾讯科技(深圳)有限公司 Information display method, device, equipment and storage medium
WO2021238870A1 (en) * 2020-05-29 2021-12-02 腾讯科技(深圳)有限公司 Information display method and apparatus, and device and storage medium

Non-Patent Citations (1)

Title
"Visualization of Off-Screen Objects"; Torben Schinke et al.; Proceedings of the 12th International Conference; 2010-09-30; pp. 313-316 *

Also Published As

Publication number Publication date
CN114138120A (en) 2022-03-04

Similar Documents

Publication Publication Date Title
CN114138120B (en) Virtual visual angle tracking method and system applied to virtual reality
US10755485B2 (en) Augmented reality product preview
CN110568447B (en) Visual positioning method, device and computer readable medium
US9430874B2 (en) Estimation of object properties in 3D world
US11880956B2 (en) Image processing method and apparatus, and computer storage medium
CN112148197A (en) Augmented reality AR interaction method and device, electronic equipment and storage medium
CN111950521A (en) Augmented reality interaction method and device, electronic equipment and storage medium
US20210041945A1 (en) Machine learning based gaze estimation with confidence
CN112729327A (en) Navigation method, navigation device, computer equipment and storage medium
CN107862733B (en) Large-scale scene real-time three-dimensional reconstruction method and system based on sight updating algorithm
EP3275182B1 (en) Methods and systems for light field augmented reality/virtual reality on mobile devices
CN114202640A (en) Data acquisition method and device, computer equipment and storage medium
CN109389634A (en) Virtual shopping system based on three-dimensional reconstruction and augmented reality
CN113262490A (en) Virtual object marking method and device, processor and electronic device
CN110853488A (en) POI (Point of interest) label display method, device and equipment
CN114387347A (en) Method and device for determining external parameter calibration, electronic equipment and medium
CN110458857B (en) Central symmetry primitive detection method and device, electronic equipment and readable storage medium
CN108779979A (en) Relief map
CN114758075A (en) Method, apparatus and storage medium for generating three-dimensional label
CN111569421A (en) Virtual scene change synchronization method and system, VR playing equipment and storage medium
CN109816406B (en) Article marking method, device, equipment and medium
CN107730534B (en) Target object tracking method and device
US10965930B2 (en) Graphical user interface for indicating off-screen points of interest
JP2015008394A (en) Information terminal device
EP3088991A1 (en) Wearable device and method for enabling user interaction

Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination
TA01 Transfer of patent application right

Effective date of registration: 20220427

Address after: 100000 01-1901, 16 / F, building 1, yard 8, gonggonggongtiyu North Road, Chaoyang District, Beijing

Applicant after: Beijing Bingzhen Shanzhu Technology Co.,Ltd.

Address before: 041000 No. 1704, unit 4, yingcaiyuan community, caishenlou North Street, Yaodu District, Linfen City, Shanxi Province

Applicant before: Zhang Senmiao

GR01 Patent grant