CN113448445B - Target position tracking method and system based on virtual reality - Google Patents

Publication number: CN113448445B
Authority: CN (China)
Prior art keywords: user, virtual reality, acceleration, speed, point
Legal status: Active
Application number: CN202111017680.4A
Original language: Chinese (zh)
Other versions: CN113448445A
Inventor: 黄聪明 (Huang Congming)
Assignee (original and current): Shenzhen Chengzhi Technology Co ltd
Application filed by Shenzhen Chengzhi Technology Co ltd; priority to CN202111017680.4A; publication of application CN113448445A; application granted; publication of grant CN113448445B; legal status: Active.

Classifications

    • G: PHYSICS
    • G06: COMPUTING; CALCULATING OR COUNTING
    • G06F: ELECTRIC DIGITAL DATA PROCESSING
    • G06F 3/00: Input arrangements for transferring data to be processed into a form capable of being handled by the computer; output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
    • G06F 3/01: Input arrangements or combined input and output arrangements for interaction between user and computer
    • G06F 3/011: Arrangements for interaction with the human body, e.g. for user immersion in virtual reality
    • G06F 2203/00: Indexing scheme relating to G06F 3/00 - G06F 3/048
    • G06F 2203/01: Indexing scheme relating to G06F 3/01
    • G06F 2203/012: Walk-in-place systems for allowing a user to walk in a virtual environment while constraining him to a given position in the physical environment

Abstract

The invention provides a virtual-reality-based target position tracking method comprising the following steps: first, a positioning center is established, front, rear, left and right location points are determined at a set distance, the position the user perceives for himself in virtual reality at each point is determined, and a spatial position compensation model is established; the user is then made to move at a specific speed, the speed the user perceives for himself in virtual reality is determined, and a speed compensation model is established; the user is likewise made to move at a specific acceleration, the acceleration the user perceives for himself in virtual reality is determined, and an acceleration compensation model is established. The method and system can accurately track and locate, in real time, the position at which the user perceives the virtual scene, improving the experience of real-virtual interaction.

Description

Target position tracking method and system based on virtual reality
Technical Field
The invention belongs to the field of virtual reality, and particularly relates to a target position tracking method and system based on virtual reality.
Background
Today, experiencing a desired virtual reality environment in person at any time is still a relatively complicated process: existing VR experiences require either specific equipment, such as a head-mounted device or a handheld controller tracked by infrared cameras, or a specific environment, such as a dedicated area. These limitations greatly reduce the comfort of the user experience. A true VR experience should allow position tracking anywhere, at any time, with displacement updated promptly.
Existing VR tracking systems fall into two main categories: outside-in position tracking systems and inside-out tracking systems. Both, however, attend only to the user's position in physical space and ignore the fact that human visual perception differs between physical space and virtual space. Even if the user's pose in physical space is correctly located and fed back to the VR system in real time, the user still cannot correctly perceive his own position in virtual space; consequently, the positional relationship between a virtual target and the user's body is wrong, and interaction cannot proceed correctly.
Disclosure of Invention
The invention aims to overcome the above defects in the prior art by providing a virtual-reality-based target position tracking method. By establishing a model of the difference between the user's actual position and the position the user perceives for himself in virtual space, and a model of the difference between the user's actual position change and the perceived position change, a real-time tracking and positioning compensation model is established. The position at which the user perceives the virtual scene can thus be correctly tracked and located in real time, improving the experience of real-virtual interaction and relieving the dizziness caused by inconsistent perception.
The invention adopts the following technical scheme:
a target position tracking method based on virtual reality comprises the following steps:
firstly, establishing a positioning center, determining a front position point, a rear position point, a left position point and a right position point according to a set distance, determining the self position of a user in virtual reality when the user is at each position, and establishing a spatial position compensation model according to the front position point, the rear position point, the left position point, the right position point and the self position of the user in virtual reality at each position;
enabling a user to move at a specific speed, determining the speed of the user perceived by the user in the virtual reality, and establishing a speed compensation model according to the specific speed and the speed of the user perceived by the user in the virtual reality;
enabling a user to move at a specific acceleration, determining the self acceleration sensed by the user in the virtual reality, and establishing an acceleration compensation model according to the specific acceleration and the self acceleration sensed by the user in the virtual reality;
and a real-time tracking and positioning compensation model is established according to the established space position compensation model, the speed compensation model and the acceleration compensation model, and the change of the position of the user perceived in the virtual reality can be determined in real time by combining the captured position change information of the user.
Specifically, a positioning center is established; front, rear, left and right location points are determined at the set distance; the position the user perceives for himself in virtual reality at each location point is determined; and the spatial position compensation model is established from the four location points and the perceived positions at each of them, specifically:
[Formula image: spatial position compensation model. The original equation is an image and could not be recovered; symbols below are assigned for readability.]
where x_F is the abscissa of the determined front location point and x'_F is the abscissa of the position the user perceives for himself in virtual reality at the front location point; x_B is the abscissa of the determined rear location point and x'_B is the perceived abscissa at the rear location point; y_L is the ordinate of the determined left location point and y'_L is the perceived ordinate at the left location point; y_R is the ordinate of the determined right location point and y'_R is the perceived ordinate at the right location point.
Specifically, the user is made to move at a specific speed, the speed the user perceives for himself in virtual reality is determined, and the speed compensation model is established from the specific speed and the perceived speed, specifically:
[Formula image: speed compensation model. The original equation is an image and could not be recovered.]
where v is the speed at which the user moves, and v' is the speed the user perceives for himself in virtual reality while moving at speed v.
Specifically, the user is made to move at a specific acceleration, the acceleration the user perceives for himself in virtual reality is determined, and the acceleration compensation model is established from the specific acceleration and the perceived acceleration, specifically:
[Formula image: acceleration compensation model. The original equation is an image and could not be recovered.]
where a is the acceleration at which the user moves, and a' is the acceleration the user perceives for himself in virtual reality while moving at acceleration a.
Specifically, the real-time tracking and positioning compensation model is established from the spatial position, speed and acceleration compensation models, specifically:
[Formula image: real-time tracking and positioning compensation model. The original equation is an image and could not be recovered.]
where t_0 is the start time and t is the current time.
Another aspect of an embodiment of the present invention provides a target position tracking system based on virtual reality, including:
a spatial position compensation model establishing unit: establishes a positioning center, determines front, rear, left and right location points at a set distance, determines the position the user perceives for himself in virtual reality at each location point, and establishes a spatial position compensation model from the four location points and the perceived positions;
a speed compensation model establishing unit: makes the user move at a specific speed, determines the speed the user perceives for himself in virtual reality, and establishes a speed compensation model from the specific speed and the perceived speed;
an acceleration compensation model establishing unit: makes the user move at a specific acceleration, determines the acceleration the user perceives for himself in virtual reality, and establishes an acceleration compensation model from the specific acceleration and the perceived acceleration;
a real-time perceptual position determination unit: establishes a real-time tracking and positioning compensation model from the three compensation models and, combined with captured information about the user's position changes, determines in real time the change in the position the user perceives in virtual reality.
Specifically, the spatial position compensation model establishing unit is configured to establish a positioning center, determine front, rear, left and right location points at the set distance, determine the position the user perceives for himself in virtual reality at each location point, and establish the spatial position compensation model from the four location points and the perceived positions, specifically:
[Formula image: spatial position compensation model. The original equation is an image and could not be recovered; symbols below are assigned for readability.]
where x_F is the abscissa of the determined front location point and x'_F is the abscissa of the position the user perceives for himself in virtual reality at the front location point; x_B is the abscissa of the determined rear location point and x'_B is the perceived abscissa at the rear location point; y_L is the ordinate of the determined left location point and y'_L is the perceived ordinate at the left location point; y_R is the ordinate of the determined right location point and y'_R is the perceived ordinate at the right location point.
Specifically, the speed compensation model establishing unit makes the user move at a specific speed, determines the speed the user perceives for himself in virtual reality, and establishes the speed compensation model from the specific speed and the perceived speed, specifically:
[Formula image: speed compensation model. The original equation is an image and could not be recovered.]
where v is the speed at which the user moves, and v' is the speed the user perceives for himself in virtual reality while moving at speed v.
Specifically, the acceleration compensation model establishing unit is configured to make the user move at a specific acceleration, determine the acceleration the user perceives for himself in virtual reality, and establish the acceleration compensation model from the specific acceleration and the perceived acceleration, specifically:
[Formula image: acceleration compensation model. The original equation is an image and could not be recovered.]
where a is the acceleration at which the user moves, and a' is the acceleration the user perceives for himself in virtual reality while moving at acceleration a.
Specifically, the real-time sensing position determining unit is configured to establish the real-time tracking and positioning compensation model from the spatial position, speed and acceleration compensation models, specifically:
[Formula image: real-time tracking and positioning compensation model. The original equation is an image and could not be recovered.]
where t_0 is the start time and t is the current time.
As can be seen from the above description, compared with the prior art the present invention has the following advantages:
(1) The invention provides a virtual-reality-based target position tracking method: a positioning center is established; front, rear, left and right location points are determined at a set distance; the position the user perceives for himself in virtual reality at each location point is determined, and a spatial position compensation model is established; the user is made to move at a specific speed and at a specific acceleration, the perceived speed and acceleration are determined, and speed and acceleration compensation models are established; a real-time tracking and positioning compensation model is then established from the three models, so that, combined with captured information about the user's position changes, the change in the position the user perceives in virtual reality can be determined in real time. By modeling both the difference between the user's actual position and the position the user perceives for himself in virtual space, and the difference between the actual and perceived position changes, the method correctly tracks and locates, in real time, the position at which the user perceives the virtual scene, improving the experience of real-virtual interaction and relieving the vertigo caused by inconsistent perception.
(2) The invention establishes spatial position, speed and acceleration compensation models and, from them, a real-time tracking and positioning compensation model, so the difference between the user's actual position and the position perceived in virtual reality is computed more comprehensively, improving accuracy.
Drawings
Fig. 1 is a flowchart of a target position tracking method based on virtual reality according to an embodiment of the present invention.
Fig. 2 is a block diagram of a target position tracking system based on virtual reality according to an embodiment of the present invention.
Detailed Description
The invention is further described below by means of specific embodiments.
The invention provides a virtual-reality-based target position tracking method. By establishing a model of the difference between the user's actual position and the position the user perceives for himself in virtual space, and a model of the difference between the user's actual position change and the perceived position change, a real-time tracking and positioning compensation model is established, so that the position at which the user perceives the virtual scene is correctly tracked and located in real time, the experience of real-virtual interaction is improved, and the vertigo caused by inconsistent perception is relieved.
Fig. 1 is a flowchart of the virtual-reality-based target position tracking method according to an embodiment of the present invention. The method comprises the following steps:
S101: first, a positioning center is established, and front, rear, left and right location points are determined at a set distance; the position the user perceives for himself in virtual reality at each location point is determined; and a spatial position compensation model is established from the four location points and the perceived positions at each of them.
The set distance serves only to fix a relative scale, so its value can be chosen according to the size of the experimental site; considering the error magnitude, this embodiment sets it to 1 m.
The spatial position compensation model is established from the front, rear, left and right location points and the position the user perceives for himself in virtual reality at each of them, specifically:
[Formula image: spatial position compensation model. The original equation is an image and could not be recovered; symbols below are assigned for readability.]
where x_F is the abscissa of the determined front location point and x'_F is the abscissa of the position the user perceives for himself in virtual reality at the front location point; x_B is the abscissa of the determined rear location point and x'_B is the perceived abscissa at the rear location point; y_L is the ordinate of the determined left location point and y'_L is the perceived ordinate at the left location point; y_R is the ordinate of the determined right location point and y'_R is the perceived ordinate at the right location point.
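The patent's spatial compensation formula itself is an image and is not reproduced here. As a minimal sketch, under the assumption that the model reduces to per-axis offsets averaged over the paired calibration points (an assumed form, not the patent's stated formula; function and parameter names are illustrative), calibration and correction might look like:

```python
import numpy as np

def fit_spatial_offset(front, front_vr, back, back_vr,
                       left, left_vr, right, right_vr):
    """Average per-axis offset between each physical calibration point
    (front/back for x, left/right for y) and the position the user
    perceived for himself in VR at that point. Assumed model form."""
    dx = ((front[0] - front_vr[0]) + (back[0] - back_vr[0])) / 2.0
    dy = ((left[1] - left_vr[1]) + (right[1] - right_vr[1])) / 2.0
    return np.array([dx, dy])

def compensate_position(perceived_xy, offset):
    """Shift a perceived VR position back toward the physical position."""
    return np.asarray(perceived_xy, dtype=float) + offset
```

With a 1 m calibration distance, if the user consistently perceives himself 0.1 m short along x and 0.05 m short along y, a perceived position of (0.9, 0.95) is corrected to (1.0, 1.0).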
for determining the self position perceived by the user in the virtual reality at each position, the following is specifically performed:
firstly, determining a calibration point in virtual reality by comparing and analyzing a perspective projection principle of a virtual camera and an imaging model of human eyes, and when a user is at a position point, moving the position until a fixation point of the human eyes in the virtual reality is coincident with the calibration point, wherein the final positioning of the user is the self position perceived by the user in the virtual reality at the position.
The gaze point is a binocular eye line direction obtained by a sight tracking algorithm based on a pupil corneal reflection technology, and then the closest point of two sights in the space is solved according to space analytic geometric knowledge, wherein the point is the gaze point.
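The closest-point construction described above can be written directly with spatial analytic geometry: minimize the distance between the two sight lines and take the midpoint of the shortest connecting segment. A sketch (function and parameter names are illustrative, not from the patent):

```python
import numpy as np

def gaze_point(p_left, d_left, p_right, d_right):
    """Midpoint of the shortest segment between the two sight lines.

    Each line is given by an eye position p and a gaze direction d, as a
    pupil-corneal-reflection gaze tracker would report them.
    """
    p1, d1 = np.asarray(p_left, float), np.asarray(d_left, float)
    p2, d2 = np.asarray(p_right, float), np.asarray(d_right, float)
    r = p1 - p2
    a, b, c = d1 @ d1, d1 @ d2, d2 @ d2
    d, e = d1 @ r, d2 @ r
    denom = a * c - b * b
    if abs(denom) < 1e-12:
        # Sight lines are (nearly) parallel; fall back to projecting p1.
        t1, t2 = 0.0, e / c
    else:
        # Least-squares parameters minimising |(p1 + t1*d1) - (p2 + t2*d2)|
        t1 = (b * e - c * d) / denom
        t2 = (a * e - b * d) / denom
    closest1 = p1 + t1 * d1
    closest2 = p2 + t2 * d2
    return (closest1 + closest2) / 2.0
```

For two sight lines that actually intersect, the midpoint is the intersection itself; with measurement noise the lines are skew and the midpoint is the natural gaze-point estimate.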
S102: the user is made to move at a specific speed, the speed the user perceives for himself in virtual reality is determined, and a speed compensation model is established from the specific speed and the perceived speed, specifically:
[Formula image: speed compensation model. The original equation is an image and could not be recovered.]
where v is the speed at which the user moves, and v' is the speed the user perceives for himself in virtual reality while moving at speed v.
Considering the error magnitude, this embodiment sets the specific speed to 1 m/s. The speed the user perceives for himself in virtual reality is determined as follows: when the user moves at 1 m/s, the measured movement speed of the eye in virtual reality is taken as the speed the user perceives for himself in virtual reality.
S103: the user is made to move at a specific acceleration, the acceleration the user perceives for himself in virtual reality is determined, and an acceleration compensation model is established from the specific acceleration and the perceived acceleration, specifically:
[Formula image: acceleration compensation model. The original equation is an image and could not be recovered.]
where a is the acceleration at which the user moves, and a' is the acceleration the user perceives for himself in virtual reality while moving at acceleration a.
Considering the error magnitude, this embodiment sets the specific acceleration to 1 m/s². The acceleration the user perceives for himself in virtual reality is determined as follows: when the user moves at an acceleration of 1 m/s², the measured movement acceleration of the eye in virtual reality is taken as the acceleration the user perceives for himself in virtual reality.
S104: a real-time tracking and positioning compensation model is established from the spatial position, speed and acceleration compensation models, so that, combined with captured information about the user's position changes, the change in the position the user perceives in virtual reality can be determined in real time.
Specifically, the real-time tracking and positioning compensation model is established from the three compensation models, specifically:
[Formula image: real-time tracking and positioning compensation model. The original equation is an image and could not be recovered.]
where t_0 is the start time and t is the current time.
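The combined real-time formula is an image in the original. Under the assumptions that the spatial model yields an additive offset and that the speed and acceleration models yield scale factors k_v and k_a (assumed forms; all names are illustrative), the tracking step over the interval from t_0 to t might look like:

```python
import numpy as np

def perceived_position(p0, v, a, t0, t, offset, k_v, k_a):
    """Position the user is expected to perceive in VR at time t, given
    the captured physical start position p0, physical velocity v and
    acceleration a held since start time t0, plus the calibrated spatial
    offset and speed/acceleration compensation factors (assumed model)."""
    dt = t - t0
    p0 = np.asarray(p0, dtype=float)
    v = np.asarray(v, dtype=float)
    a = np.asarray(a, dtype=float)
    # Physical kinematics rescaled into perceived units, then shifted by
    # the perceived-vs-physical spatial offset.
    return p0 - offset + (v / k_v) * dt + 0.5 * (a / k_a) * dt ** 2
```

In this sketch the captured motion-tracking data supplies p0, v and a, while the three calibration steps (S101 to S103) supply offset, k_v and k_a.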
As shown in fig. 2, another embodiment of the present invention provides a target position tracking system based on virtual reality, including:
spatial position compensation model creation unit 201: firstly, establishing a positioning center, determining a front position point, a rear position point, a left position point and a right position point according to a set distance, determining the self position of a user in virtual reality when the user is at each position, and establishing a spatial position compensation model according to the front position point, the rear position point, the left position point, the right position point and the self position of the user in virtual reality at each position;
the set distance in this embodiment is to determine a relative value, so the set value can be determined according to the size of the experimental site, and the set distance is set to 1m in this embodiment in consideration of the error size;
specifically, the spatial position compensation model establishing unit is configured to establish a positioning center, determine a front position point, a rear position point, a left position point, and a right position point by a set distance, determine a self position of a user in virtual reality at each position, and establish a spatial position compensation model according to the front position point, the rear position point, the left position point, the right position point, and the self position of the user in virtual reality at each position, and specifically:
Figure 917049DEST_PATH_IMAGE001
wherein the content of the first and second substances,
Figure 150584DEST_PATH_IMAGE002
to determine the abscissa of the front location point,
Figure 558300DEST_PATH_IMAGE003
the abscissa of the position of the user perceived in the virtual reality at the previous position point;
Figure 332221DEST_PATH_IMAGE004
to determine the abscissa of the rear location point,
Figure 10458DEST_PATH_IMAGE005
the abscissa of the position of the user perceived in the virtual reality at the rear position point;
Figure 477212DEST_PATH_IMAGE006
to determine the ordinate of the left location point,
Figure 575487DEST_PATH_IMAGE007
is the ordinate of the user's own position perceived in the virtual reality at the left location point;
Figure 559623DEST_PATH_IMAGE008
to determine the ordinate of the right position point,
Figure 856481DEST_PATH_IMAGE009
is the ordinate of the user's own position perceived in the virtual reality at the right position point.
The position the user perceives for himself in virtual reality at each location point is determined as follows:
first, a calibration point is determined in virtual reality by comparing the perspective projection principle of the virtual camera with an imaging model of the human eye; then, with the user at a location point, the user's position is adjusted until the gaze point of the eyes in virtual reality coincides with the calibration point, and the user's final location is taken as the position the user perceives for himself in virtual reality at that location point.
The gaze point is obtained by first recovering each eye's line-of-sight direction with a gaze tracking algorithm based on the pupil-corneal reflection technique, and then, using spatial analytic geometry, solving for the point in space closest to the two sight lines; that point is the gaze point.
Speed compensation model creation unit 202: makes the user move at a specific speed, determines the speed the user perceives for himself in virtual reality, and establishes a speed compensation model from the specific speed and the perceived speed.
Specifically, the speed compensation model creation unit makes the user move at a specific speed, determines the perceived speed, and establishes the speed compensation model, specifically:
[Formula image: speed compensation model. The original equation is an image and could not be recovered.]
where v is the speed at which the user moves, and v' is the speed the user perceives for himself in virtual reality while moving at speed v.
Considering the error magnitude, this embodiment sets the specific speed to 1 m/s. The speed the user perceives for himself in virtual reality is determined as follows: when the user moves at 1 m/s, the measured movement speed of the eye in virtual reality is taken as the speed the user perceives for himself in virtual reality.
Acceleration compensation model creation unit 203: makes the user move at a specific acceleration, determines the acceleration the user perceives for himself in virtual reality, and establishes an acceleration compensation model from the specific acceleration and the perceived acceleration.
Specifically, the acceleration compensation model creation unit makes the user move at a specific acceleration, determines the perceived acceleration, and establishes the acceleration compensation model, specifically:
[Formula image: acceleration compensation model. The original equation is an image and could not be recovered.]
where a is the acceleration at which the user moves, and a' is the acceleration the user perceives for himself in virtual reality while moving at acceleration a.
In this embodiment, considering the magnitude of the error, the specific acceleration is set to 1 m/s². The acceleration the user perceives in the virtual reality is then determined as follows: when the user moves at an acceleration of 1 m/s², the measured movement acceleration of the eyes in the virtual reality is the acceleration the user perceives.
Real-time perceived position determination unit 204: establishes a real-time tracking and positioning compensation model from the established spatial position compensation model, speed compensation model and acceleration compensation model; combined with the captured position-change information of the user, the change in the position the user perceives in the virtual reality can be determined in real time.

Specifically, the real-time perceived position determining unit establishes the real-time tracking and positioning compensation model as:

[formula omitted: supplied as an image in the original publication]

where t₀ is the start time and t is the current time (symbols assigned here for readability).
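As an illustrative sketch only, a real-time model of this kind could start from the spatial offset, then integrate velocity and acceleration readings, each corrected by its own compensation factor, over the interval from t₀ to the current time t. The constant factors and the functional form below are assumptions, since the patent's combined formula is supplied as an image and is not reproduced:

```python
# Hypothetical sketch of a real-time tracking-and-positioning loop:
# starting from an initial (spatially compensated) position and integrating
# velocity/acceleration readings corrected by assumed constant factors
# k_v and k_a. The factors and form are NOT taken from the patent; this
# only illustrates combining the three compensation models over [t0, t].

def track_perceived_position(x0, samples, k_v, k_a):
    """samples: list of (dt, measured_velocity, measured_acceleration)."""
    x = x0
    for dt, v_meas, a_meas in samples:
        v = v_meas / k_v + (a_meas / k_a) * dt   # compensated velocity
        x += v * dt                              # integrate position
    return x

# One second of motion sampled at 10 Hz, constant perceived 0.92 m/s, a = 0:
samples = [(0.1, 0.92, 0.0)] * 10
x_end = track_perceived_position(0.0, samples, k_v=0.92, k_a=0.95)
```

With k_v = 0.92 the compensated velocity is 1 m/s throughout, so the tracked position after one second is 1 m.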
The invention provides a virtual-reality-based target position tracking method. A positioning center is first established; front, rear, left and right position points are then determined according to a set distance; the position the user perceives in the virtual reality is determined with the user at each of these points; and a spatial position compensation model is established from the four position points and the corresponding perceived positions. The user is then made to move at a specific speed, the speed the user perceives in the virtual reality is determined, and a speed compensation model is established from the two; likewise, the user is made to move at a specific acceleration, the perceived acceleration is determined, and an acceleration compensation model is established. Finally, a real-time tracking and positioning compensation model is built from the spatial position, speed and acceleration compensation models; combined with the captured position-change information of the user, the change in the position the user perceives in the virtual reality can be determined in real time. By modeling both the difference between the user's actual position and the position the user perceives in the virtual space, and the difference between the user's actual position change and the perceived position change, the method tracks and positions the user's perceived position in the virtual scene correctly in real time, improving the real-virtual interaction experience and relieving the vertigo caused by inconsistent perception.
The invention establishes a spatial position compensation model, a speed compensation model and an acceleration compensation model, from which the real-time tracking and positioning compensation model is determined; this accounts more comprehensively for the difference between the user's actual position and the position perceived in the virtual reality, improving accuracy.
The above description is only one embodiment of the present invention, but the design concept of the present invention is not limited thereto; any insubstantial modification made using this design concept shall fall within the scope of protection of the present invention.

Claims (2)

1. A target position tracking method based on virtual reality is characterized by comprising the following steps:
firstly, establishing a positioning center, determining a front position point, a rear position point, a left position point and a right position point according to a set distance, determining the self position of a user in virtual reality when the user is at each position, and establishing a spatial position compensation model according to the front position point, the rear position point, the left position point, the right position point and the self position of the user in virtual reality at each position;
the method specifically comprises:

[formula omitted: supplied as an image in the original publication]

wherein x_F is the abscissa of the determined front position point, and x′_F is the abscissa of the position the user perceives in the virtual reality at the front position point; x_B is the abscissa of the determined rear position point, and x′_B is the abscissa of the position perceived at the rear position point; y_L is the ordinate of the determined left position point, and y′_L is the ordinate of the position perceived at the left position point; y_R is the ordinate of the determined right position point, and y′_R is the ordinate of the position perceived at the right position point (symbols assigned here for readability);
enabling a user to move at a specific speed, determining the speed of the user perceived by the user in the virtual reality, and establishing a speed compensation model according to the specific speed and the speed of the user perceived by the user in the virtual reality;
specifically:

[formula omitted: supplied as an image in the original publication]

wherein v is the speed at which the user moves, and v′ is the speed the user perceives in the virtual reality when moving at speed v (symbols assigned here for readability);
enabling a user to move at a specific acceleration, determining the self acceleration sensed by the user in the virtual reality, and establishing an acceleration compensation model according to the specific acceleration and the self acceleration sensed by the user in the virtual reality;
specifically:

[formula omitted: supplied as an image in the original publication]

wherein a is the acceleration at which the user moves, and a′ is the acceleration the user perceives in the virtual reality when moving at acceleration a (symbols assigned here for readability);
establishing a real-time tracking and positioning compensation model according to the established space position compensation model, the speed compensation model and the acceleration compensation model, and determining the change of the position of the user perceived in the virtual reality in real time by combining the captured position change information of the user;
specifically:

[formula omitted: supplied as an image in the original publication]

wherein t₀ is the start time and t is the current time (symbols assigned here for readability).
2. A virtual reality-based target location tracking system, comprising:
a spatial position compensation model establishing unit: firstly, establishing a positioning center, determining a front position point, a rear position point, a left position point and a right position point according to a set distance, determining the self position of a user in virtual reality when the user is at each position, and establishing a spatial position compensation model according to the front position point, the rear position point, the left position point, the right position point and the self position of the user in virtual reality at each position; the method specifically comprises the following steps:
[formula omitted: supplied as an image in the original publication]

wherein x_F is the abscissa of the determined front position point, and x′_F is the abscissa of the position the user perceives in the virtual reality at the front position point; x_B is the abscissa of the determined rear position point, and x′_B is the abscissa of the position perceived at the rear position point; y_L is the ordinate of the determined left position point, and y′_L is the ordinate of the position perceived at the left position point; y_R is the ordinate of the determined right position point, and y′_R is the ordinate of the position perceived at the right position point (symbols assigned here for readability);
a speed compensation model establishing unit: enabling a user to move at a specific speed, determining the speed of the user perceived by the user in the virtual reality, and establishing a speed compensation model according to the specific speed and the speed of the user perceived by the user in the virtual reality; the method specifically comprises the following steps:
[formula omitted: supplied as an image in the original publication]

wherein v is the speed at which the user moves, and v′ is the speed the user perceives in the virtual reality when moving at speed v (symbols assigned here for readability);
an acceleration compensation model establishing unit: enabling a user to move at a specific acceleration, determining the self acceleration sensed by the user in the virtual reality, and establishing an acceleration compensation model according to the specific acceleration and the self acceleration sensed by the user in the virtual reality; the method specifically comprises the following steps:
[formula omitted: supplied as an image in the original publication]

wherein a is the acceleration at which the user moves, and a′ is the acceleration the user perceives in the virtual reality when moving at acceleration a (symbols assigned here for readability);
a real-time perceptual location determination unit: establishing a real-time tracking and positioning compensation model according to the established space position compensation model, the speed compensation model and the acceleration compensation model, and determining the change of the position of the user perceived in the virtual reality in real time by combining the captured position change information of the user;
specifically:

[formula omitted: supplied as an image in the original publication]

wherein t₀ is the start time and t is the current time (symbols assigned here for readability).
CN202111017680.4A 2021-09-01 2021-09-01 Target position tracking method and system based on virtual reality Active CN113448445B (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN202111017680.4A CN113448445B (en) 2021-09-01 2021-09-01 Target position tracking method and system based on virtual reality


Publications (2)

Publication Number Publication Date
CN113448445A CN113448445A (en) 2021-09-28
CN113448445B true CN113448445B (en) 2021-11-30

Family

ID=77819412

Family Applications (1)

Application Number Title Priority Date Filing Date
CN202111017680.4A Active CN113448445B (en) 2021-09-01 2021-09-01 Target position tracking method and system based on virtual reality

Country Status (1)

Country Link
CN (1) CN113448445B (en)

Citations (4)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
WO2007017595A2 (en) * 2005-08-09 2007-02-15 Total Immersion Method and devices for visualising a real passenger compartment in a synthesis environment
CN101034309A (en) * 2007-04-10 2007-09-12 南京航空航天大学 System and method for virtual implementing helmet anti-dazzle based on multiple acceleration transducers
CN110221691A (en) * 2019-05-13 2019-09-10 深圳电通信息技术有限公司 A kind of immersion virtual experience method, system and device
CN111007939A (en) * 2019-11-25 2020-04-14 华南理工大学 Virtual reality system space positioning method based on depth perception

Family Cites Families (1)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN110376550B (en) * 2018-04-12 2024-04-12 北京凌宇智控科技有限公司 Three-dimensional space positioning method and system based on position compensation



Similar Documents

Publication Publication Date Title
CN100487568C (en) Enhanced real natural interactive helmet with sight line follow-up function
EP0641132B1 (en) Stereoscopic image pickup apparatus
JP6084619B2 (en) Method for measuring geometric parameters of a spectacle wearer
WO2016021034A1 (en) Algorithm for identifying three-dimensional point of gaze
US20150103096A1 (en) Display device, head mount display, calibration method, calibration program and recording medium
CN111007939B (en) Virtual reality system space positioning method based on depth perception
US11570426B2 (en) Computer-readable non-transitory storage medium, web server, and calibration method for interpupillary distance
US20140240470A1 (en) Method, system and device for improving optical measurement of ophthalmic spectacles
WO2019046803A1 (en) Ray tracing system for optical headsets
KR102444666B1 (en) Method and apparatus for controlling 3d steroscopic image in vehicle
KR20170054511A (en) Method for determining optical parameters of a test subject with measurement accuracy in order to adapt a pair of eyeglasses to the test subject, and immobile video centering system
JP2019053603A (en) Display control program, apparatus and method
Wibirama et al. 3D gaze tracking on stereoscopic display using optimized geometric method
CN113448445B (en) Target position tracking method and system based on virtual reality
CN106708249B (en) Interaction method, interaction device and user equipment
CN116019693B (en) VR-based stereoscopic vision training method and device
CN111263133B (en) Information processing method and system
EP2772795A1 (en) Method, system and device for improving optical measurement of ophthalmic spectacles
JP2005081101A (en) System and methodology for detecting visual axis direction
CN109856802B (en) Pupil distance adjusting method and device and virtual display equipment
CN110706268B (en) Distance adjusting method and electronic equipment
Conti et al. Adjusting stereoscopic parameters by evaluating the point of regard in a virtual environment
CN108471939B (en) Pan zone measuring method and device and wearable display equipment
CN112752537A (en) Method for determining at least one geometric form parameter of a subject in a natural posture for determining a vision correction equipment
JP2021517654A (en) Display method and display device

Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination
GR01 Patent grant