CN117632339A - Method and device for controlling movement of particles in vehicle-mounted VR scene - Google Patents

Method and device for controlling movement of particles in vehicle-mounted VR scene

Info

Publication number
CN117632339A
CN117632339A
Authority
CN
China
Prior art keywords
particles
vehicle
scene
axis
determining
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Pending
Application number
CN202311616381.1A
Other languages
Chinese (zh)
Inventor
谭毅
郭颖山
包楠
王浩
任薛霖
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Chongqing Seres New Energy Automobile Design Institute Co Ltd
Original Assignee
Chongqing Seres New Energy Automobile Design Institute Co Ltd
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Chongqing Seres New Energy Automobile Design Institute Co Ltd filed Critical Chongqing Seres New Energy Automobile Design Institute Co Ltd
Priority to CN202311616381.1A priority Critical patent/CN117632339A/en
Publication of CN117632339A publication Critical patent/CN117632339A/en
Pending legal-status Critical Current

Landscapes

  • Navigation (AREA)

Abstract

The application relates to the technical field of vehicle-mounted VR, and provides a method and a device for controlling movement of particles in a vehicle-mounted VR scene. The method comprises the following steps: acquiring running state data of a vehicle, wherein the running state data comprises the running speed and the course angle of the vehicle; determining the opposite direction of the running speed, and determining any direction within a preset angle range centered on the opposite direction as the moving direction of particles in the vehicle-mounted Virtual Reality (VR) scene; determining, according to the running speed and the course angle, the movement amount of the particles in the vehicle-mounted VR scene after a preset time; and, according to the moving direction and the movement amount of the particles, controlling the particles in the vehicle-mounted VR scene to move in that direction by that amount. This solves the technical problem that users are prone to dizziness when using vehicle-mounted VR.

Description

Method and device for controlling movement of particles in vehicle-mounted VR scene
Technical Field
The application relates to the technical field of vehicle-mounted VR, in particular to a method and a device for controlling movement of particles in a vehicle-mounted VR scene.
Background
Vehicle-mounted Virtual Reality (VR) is a new application scenario for VR products in automobiles. When vehicle-mounted VR applications such as meditation and fragrance are used, granular particles representing fragrance and star particles of a Milky Way effect appear in the picture; these particles move along certain trajectories in the VR picture, making the picture more beautiful and vivid.
However, after the user puts on the VR device, the random movement of the particles may be inconsistent with the user's own body feeling while the vehicle is running, thereby causing dizziness.
Disclosure of Invention
In view of this, the embodiments of the application provide a method and a device for controlling the movement of particles in a vehicle-mounted VR scene, so as to solve the prior-art problem that a user is prone to dizziness when using vehicle-mounted VR.
In a first aspect of the embodiments of the present application, a method for controlling movement of particles in a vehicle-mounted VR scene is provided, including:
acquiring running state data of a vehicle, wherein the running state data comprises the running speed and the course angle of the vehicle;
determining the opposite direction of the running speed, and determining any direction in a preset angle range taking the opposite direction as the center as the moving direction of particles in the vehicle-mounted Virtual Reality (VR) scene;
according to the running speed and the course angle, determining the movement amount of particles in the vehicle-mounted VR scene after preset time;
according to the moving direction and the moving amount of the particles, the particles in the vehicle-mounted VR scene are controlled to move in the moving direction and the moving amount.
In a second aspect of the embodiments of the present application, there is provided a motion control device for particles in a vehicle-mounted VR scene, including:
the data collection module is used for obtaining running state data of the vehicle, wherein the running state data comprise the running speed and the course angle of the vehicle;
the direction determining module is used for determining the opposite direction of the running speed and determining any direction in a preset angle range taking the opposite direction as the center as the moving direction of the particles in the vehicle-mounted virtual reality VR scene;
the speed calculation module is used for determining the movement amount of particles in the vehicle-mounted VR scene after preset time according to the running speed and the course angle;
and the motion control module is used for controlling the particles in the vehicle-mounted VR scene to move in the moving direction and the moving amount according to the moving direction and the moving amount of the particles.
In a third aspect of the embodiments of the present application, there is provided an electronic device comprising a memory, a processor and a computer program stored in the memory and executable on the processor, the processor implementing the steps of the above method when executing the computer program.
In a fourth aspect of the embodiments of the present application, there is provided a readable storage medium storing a computer program which, when executed by a processor, implements the steps of the above method.
The beneficial effects of the embodiment of the application at least comprise:
acquiring running state data of a vehicle, wherein the running state data comprises the running speed and the course angle of the vehicle; determining the opposite direction of the running speed, and determining any direction within a preset angle range centered on the opposite direction as the moving direction of particles in the vehicle-mounted Virtual Reality (VR) scene; determining, according to the running speed and the course angle, the movement amount of particles in the vehicle-mounted VR scene after a preset time; and controlling the particles in the vehicle-mounted VR scene to move in the moving direction by the movement amount. In this way, the particles in the vehicle-mounted VR scene no longer move randomly: their moving direction tends to be consistent with the opposite direction of the running speed, and their movement amount is related to the running speed. The movement of the particles is therefore determined by the running direction and running speed of the vehicle, the movement track of the particles corresponds to the body feeling of the user, the user experience is improved, and the problem that the user is prone to dizziness when using the vehicle-mounted VR device while the vehicle is running is avoided.
Drawings
In order to more clearly illustrate the technical solutions of the embodiments of the present application, the following description will briefly introduce the drawings that are needed in the embodiments or the description of the prior art, it is obvious that the drawings in the following description are only some embodiments of the present application, and that other drawings may be obtained according to these drawings without inventive effort for a person skilled in the art.
FIG. 1 is a schematic diagram of a system architecture according to an embodiment of the present application;
fig. 2 is a flow chart of a method for controlling movement of particles in a vehicle-mounted VR scene according to an embodiment of the present application;
FIG. 3 is a schematic view of a particle provided in an embodiment of the present application in a three-dimensional coordinate system;
fig. 4 is a schematic structural diagram of a motion control device for particles in a vehicle-mounted VR scene provided in an embodiment of the present application;
fig. 5 is a schematic structural diagram of an electronic device according to an embodiment of the present application.
Detailed Description
In the following description, for purposes of explanation and not limitation, specific details are set forth, such as particular system configurations, techniques, etc. in order to provide a thorough understanding of the embodiments of the present application. It will be apparent, however, to one skilled in the art that the present application may be practiced in other embodiments that depart from these specific details. In other instances, detailed descriptions of well-known systems, devices, circuits, and methods are omitted so as not to obscure the description of the present application with unnecessary detail.
The terms first, second and the like in the description and in the claims, are used for distinguishing between similar objects and not necessarily for describing a particular sequential or chronological order. It is to be understood that the data so used may be interchanged where appropriate, such that embodiments of the present application may be implemented in sequences other than those illustrated or described herein, and that the objects identified by "first," "second," etc. are generally of the same type and are not limited to the number of objects, such as the first object may be one or more. Furthermore, in the description and claims, "and/or" means at least one of the connected objects, and the character "/", generally means that the associated object is an "or" relationship.
Furthermore, it should be noted that the terms "comprises," "comprising," or any other variation thereof, are intended to cover a non-exclusive inclusion, such that a process, method, article, or apparatus that comprises a list of elements does not include only those elements but may include other elements not expressly listed or inherent to such process, method, article, or apparatus. Without further limitation, an element defined by the phrase "comprising … …" does not exclude the presence of other like elements in a process, method, article, or apparatus that comprises an element.
The following describes in detail a method and an apparatus for controlling movement of particles in an in-vehicle VR scene according to an embodiment of the present application with reference to the accompanying drawings.
Fig. 1 is a schematic system configuration diagram of an embodiment of the present application. Specifically, the system structure may include a gyroscope 11 and a speed sensor 12, an adapter (adapter) middleware 13 of a vehicle-to-machine system, a VR device 14, and the like. Wherein the gyroscope 11 can perform timed acquisition and detection on the course angle of the current vehicle; the speed sensor 12 may collect and detect the running speed of the current vehicle at regular time; the adapter middleware 13 of the vehicle machine system can continuously process the acquired heading angle and speed of the vehicle and is connected with the VR equipment 14 in a wireless manner; the VR device may display motion trajectories of particles and various types of information to the user. The gyroscope 11 collects the course angle of the vehicle, the speed sensor 12 collects the speed of the vehicle, after the collection is completed, the current course angle and running speed of the vehicle CAN be sent to the adapter middleware 13 of the vehicle machine system through a controller area network bus (Controller Area Network, CAN) signal, the adapter middleware processes and converts the data, then the processed running speed and course angle are sent to the VR equipment through Bluetooth or a wireless network, the VR equipment CAN determine the moving direction and moving amount of the particles based on the data after receiving the data, then the movement of the particles in a VR scene is controlled based on the moving direction and moving amount of the particles, and the particles are displayed in a screen to be watched by a user.
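As a minimal illustrative sketch (the field names, the JSON encoding and the helper functions below are assumptions for illustration, not part of the disclosure), the running-state sample that the adapter middleware forwards to the VR device over Bluetooth or a wireless network can be pictured as a small message carrying the two signals used by the method:

```python
from dataclasses import dataclass
import json

@dataclass
class DrivingState:
    """Running-state sample forwarded by the adapter middleware (names assumed)."""
    speed_mps: float    # running speed from the speed sensor, in m/s
    heading_deg: float  # course angle from the gyroscope, 0-359.99 degrees from due north

def encode_for_vr(state: DrivingState) -> bytes:
    """Serialize a sample before sending it to the VR device over Bluetooth/Wi-Fi."""
    return json.dumps({"v": state.speed_mps, "yaw": state.heading_deg}).encode("utf-8")

def decode_on_vr(payload: bytes) -> DrivingState:
    """Inverse operation performed on the VR device side."""
    data = json.loads(payload.decode("utf-8"))
    return DrivingState(speed_mps=data["v"], heading_deg=data["yaw"])

sample = DrivingState(speed_mps=13.9, heading_deg=42.5)  # roughly 50 km/h, heading north-east
assert decode_on_vr(encode_for_vr(sample)) == sample
```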
Fig. 2 is a flow chart of a method for controlling movement of particles in a vehicle-mounted VR scene according to an embodiment of the present application. The method may be performed by the VR device side or the vehicle side. As shown in fig. 2, the method for controlling the movement of particles in the on-vehicle VR scene includes:
in step 201, driving state data of a vehicle is acquired, wherein the driving state data includes a driving speed and a heading angle of the vehicle.
Specifically, the running speed of the vehicle may be acquired and detected by a speed sensor mounted on the vehicle, and the heading angle of the vehicle may be acquired and detected by a gyroscope mounted on the vehicle. The course angle may be the angle between the current vehicle speed direction and true north: the vehicle position may be set as the origin, due north along the longitude line may be set as 0°, clockwise (rightward) rotation may be defined as the positive direction, and the angle range may take values from 0° to 359.99°. The heading angle can be controlled by controlling the front wheel rotation angle (steering wheel).
By acquiring the travel speed and heading angle of the vehicle, data support is provided for subsequent determination of the movement of the particles.
Step 202, determining the opposite direction of the running speed, and determining any direction within a preset angle range with the opposite direction as the center as the moving direction of the particles in the vehicle-mounted VR scene.
Specifically, the preset angle range may be smaller than 45 degrees, or may be smaller than 50 degrees.
In a real scene while a vehicle is running, the direction of the wind perceived by the user is opposite to the running direction of the vehicle, and the moving direction of real objects in the air perceived by the user is also opposite to the running direction of the vehicle. Therefore, the moving direction of the particles in the vehicle-mounted VR scene is set to any direction within a preset angle range, and the preset angle range is centered on the opposite direction of the running speed, that is, the opposite direction of the running speed serves as the center line. In this way, the moving direction of the particles tends to be consistent with the opposite direction of the running direction of the vehicle, the moving direction of the particles seen by the user in the VR scene matches the user's body feeling, and a mismatch between the two, which would cause dizziness, is avoided.
For example, as one example, the direction of movement of the particles in the in-vehicle VR scene is a direction that is 30 ° from the opposite direction of the travel speed, i.e., the direction of movement of the particles may be a left or right direction that is 30 degrees from the opposite direction of the travel speed, which causes the user's motion perception to tend to coincide with the direction of movement of the particles seen in the VR scene.
It should be noted that, the preset angle range may be set by default by the vehicle, or may be set by user according to self-experience, which is not limited herein.
Therefore, the moving direction of particles in the vehicle-mounted VR scene is set to be any direction within the preset angle range, and the preset angle range is centered on the opposite direction of the running speed, so that the moving feeling of a user is consistent with the trend of the opposite direction of the running direction of the vehicle, the experience of the user when the VR device is used is more real, the situation that the body feeling of the user collides with the scene seen by eyes is avoided, and dizziness is avoided.
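A minimal sketch of this step, assuming directions are handled as compass bearings in degrees and the preset angle range is expressed as a half-angle on either side of the opposite direction (both assumptions for illustration):

```python
import random

def particle_moving_direction(heading_deg: float, half_range_deg: float = 30.0) -> float:
    """Pick any direction within the preset angle range centered on the opposite
    of the vehicle's travel direction (all angles as compass bearings in degrees)."""
    opposite = (heading_deg + 180.0) % 360.0              # opposite direction of the running speed
    offset = random.uniform(-half_range_deg, half_range_deg)
    return (opposite + offset) % 360.0

# Vehicle heading 90 degrees (east): particles drift roughly west (270 +/- 30 degrees)
print(particle_moving_direction(90.0))
```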
Step 203, determining the movement amount of the particles in the vehicle-mounted VR scene after the preset time according to the running speed and the heading angle.
Specifically, the movement amount of the particles may be decomposed according to a three-dimensional coordinate system, and the travel speed may be decomposed according to a heading angle, so that a relationship between the travel speed and the movement amount of the particles in the on-vehicle VR scene may be obtained.
The movement amount of particles in the vehicle-mounted VR scene after the preset time is determined according to the running speed and the course angle, so that the movement speed of the particles in the vehicle-mounted VR scene is matched with the current vehicle speed, the synchronous change of the particle movement speed and the vehicle speed is ensured, the experience of using VR equipment is further improved, and the body feeling of a user is more consistent with the actual state.
Step 204, controlling the particles in the on-vehicle VR scene to move in the moving direction and the moving amount according to the moving direction and the moving amount of the particles.
By controlling the particles in the vehicle-mounted VR scene to move in the moving direction and the moving amount according to the moving direction and the moving amount of the particles, the association of the movement of the particles with the running track of the vehicle is realized, and therefore the movement track of the particles is synchronous with the body feeling of a user.
According to the technical solution provided by the embodiments of the application, running state data of the vehicle are obtained, wherein the running state data comprise the running speed and the course angle of the vehicle; the opposite direction of the running speed is determined, and any direction within a preset angle range centered on the opposite direction is determined as the moving direction of the particles in the vehicle-mounted VR scene; the movement amount of the particles in the vehicle-mounted VR scene after the preset time is determined according to the running speed and the course angle; and the particles in the vehicle-mounted VR scene are controlled to move in the moving direction by the movement amount. The particles in the vehicle-mounted VR scene no longer move randomly; instead, they move according to the current running state of the vehicle: the moving direction tends to be consistent with the opposite direction of the running speed, and the movement amount is also related to the running speed. The movement of the particles is therefore determined by the running direction and running speed of the vehicle, the movement track of the particles corresponds to the body feeling of the user, the user experience is improved, and the problem that the user is prone to dizziness when using the vehicle-mounted VR device while the vehicle is running is avoided.
In some embodiments, acquiring driving state data of a vehicle includes:
determining the refresh frequency of the display image in the vehicle-mounted VR scene; determining a generation period of the particles according to the refresh frequency; and acquiring the running state data every integer multiple of the generation period of the particles.
Specifically, the refresh frequency of the display image in the on-vehicle VR scene may be determined by the display screen on the on-vehicle VR device, and the value thereof may be 50Hz, 60Hz, 70Hz, or the like.
The system may determine the period of an image frame from the refresh frequency according to the formula T = 1/f, where f denotes the refresh frequency and T denotes the period of an image frame; the generation period of the particles may be the period length of at least one frame. For example, if the refresh frequency of the current in-vehicle VR device's display image is 50 Hz, the period of one frame is 20 ms, and the generation period of the particles may be selected as the period length of one frame, that is, the generation period of the particles may be 20 ms.
The integer multiple of the generation period may be set as needed; for example, it may be 1 time or 2 times the generation period, that is, the acquisition period for acquiring the running state data may be 20 ms, 40 ms, or the like.
The generation period of the particles is determined by utilizing the refresh frequency of the current display image, so that the acquisition period of the running state data of the vehicle is determined, the correlation between the current acquisition period of the data and the generation period of the particles is ensured, the movement of the particles can be changed according to the change of the running state of the vehicle at the first time, the movement of the particles is more in accordance with the running state of the vehicle, and the accuracy of the movement of the particles is improved.
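The relation T = 1/f and the integer-multiple acquisition period can be sketched as follows (function names and defaults are assumptions for illustration):

```python
def particle_generation_period_ms(refresh_hz: float, frames_per_generation: int = 1) -> float:
    """Generation period of the particles as a whole number of frame periods, with T = 1/f."""
    frame_period_ms = 1000.0 / refresh_hz
    return frames_per_generation * frame_period_ms

def acquisition_period_ms(refresh_hz: float, multiple: int = 1) -> float:
    """Acquisition period of the running state data: an integer multiple of the generation period."""
    return multiple * particle_generation_period_ms(refresh_hz)

print(particle_generation_period_ms(50))  # 20.0 ms, matching the 50 Hz example above
print(acquisition_period_ms(50, 2))       # 40.0 ms
```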
In some embodiments, determining the amount of movement of the particles in the in-vehicle VR scene per a preset time period based on the travel speed and the heading angle includes:
according to the running speed, the preset time and the course angle, determining velocity components and velocity correction amounts of particles in X-axis, Y-axis and Z-axis directions in the vehicle-mounted VR scene respectively; wherein, the X axis, the Y axis and the Z axis are three-dimensional coordinate systems established in the VR space in advance; and determining the movement amount of the particles according to the velocity components and the velocity correction amounts of the particles in the X-axis, Y-axis and Z-axis directions in the vehicle-mounted VR scene.
Specifically, the three-dimensional coordinate system may be established according to the right hand rule, and the Z-axis direction may be determined as the positive direction in which the vehicle is currently traveling, or may be determined to be consistent with the initial visual direction of the user, and in this case, the X-axis direction is the horizontal direction and the Y-axis direction is the vertical direction.
For example, when the Z-axis direction is the positive direction in which the vehicle is currently traveling, the velocity component in the Z-axis direction is the traveling velocity of the vehicle, the velocity component in the X-axis direction may be found using the tangent of the heading angle in the ZOX plane, and similarly the velocity component in the Y-axis direction may be found using the tangent of the heading angle in the ZOY plane.
In addition, specifically, when determining the movement amount of the particle according to the velocity component and the velocity correction amount of the particle in the X-axis, Y-axis, and Z-axis directions in the in-vehicle VR scene, the movement component in each axis direction may be calculated first according to the velocity component and the velocity correction amount in each axis direction, and then the movement component in each axis direction may be geometrically calculated to obtain the movement amount of the whole particle. Wherein, the calculation formula of the movement component in each axis direction is qt+c, q represents the velocity component in each axis direction, c represents the velocity correction amount in the corresponding axis direction, and t represents the preset time.
The running speed is decomposed into three perpendicular speed components in the three-dimensional coordinate system according to the course angle and corresponds to the vehicle-mounted VR scene, so that the movement speed of the particles is positively correlated with the running speed of the vehicle, the movement amount of the particles is determined through the speed component and the speed correction amount, the calculation process of the movement amount of the particles is more accurate and simple, and the calculation speed of calculating the particle track is improved.
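A brief sketch of this step: each axis contributes a movement component q·t plus its correction, and the components are combined geometrically (taken here as the Euclidean norm, an assumption consistent with the description above):

```python
import math

def axis_movement(q: float, correction: float, t: float) -> float:
    """Movement component along one axis: q*t + correction (q is the velocity component)."""
    return q * t + correction

def total_movement(components, corrections, t: float) -> float:
    """Geometric (Euclidean) combination of the X-, Y- and Z-axis movement components."""
    per_axis = [axis_movement(q, c, t) for q, c in zip(components, corrections)]
    return math.sqrt(sum(m * m for m in per_axis))

# a, c, e are the velocity components; b, d, f the corrections; t the preset time in seconds
print(total_movement(components=(2.0, 0.5, 10.0), corrections=(0.1, 0.0, 0.2), t=0.02))
```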
In some embodiments, determining velocity components and velocity corrections of particles in X-axis, Y-axis, and Z-axis directions in an in-vehicle VR scene based on a travel speed, a preset time, and a heading angle, respectively, includes:
the velocity components and velocity correction amounts of the particles in the X-axis, Y-axis and Z-axis directions in the vehicle-mounted VR scene are calculated by the following formulas:
wherein a represents a velocity component of the particles in an X-axis direction in the in-vehicle VR scene, b represents a velocity correction amount of the particles in the X-axis direction in the in-vehicle VR scene, c represents a velocity component of the particles in a Y-axis direction in the in-vehicle VR scene, d represents a velocity correction amount of the particles in the Y-axis direction in the in-vehicle VR scene, e represents a velocity component of the particles in a Z-axis direction in the in-vehicle VR scene, and f represents a velocity correction amount of the particles in the Z-axis direction in the in-vehicle VR scene; t represents a preset time; dv is the speed variation; alpha represents the proportionality coefficient of the running speed and the line-of-sight speed in the X-axis direction in the vehicle-mounted VR scene, beta represents the proportionality coefficient of the running speed and the line-of-sight speed in the Y-axis direction, and gamma represents the proportionality coefficient of the running speed and the line-of-sight speed in the Z-axis direction; the yaw horizontal angle represents the component on the horizontal angle after vector decomposition of the course angle, and the yaw vertical angle represents the component on the vertical angle after vector decomposition of the course angle.
Specifically, (a·t + b)/t, (c·t + d)/t and (e·t + f)/t may be expressed as the offset rates of the VR viewing distance in the X-axis, Y-axis and Z-axis directions. Taking the X axis as an example, according to the velocity component and the velocity correction amount in the X-axis direction, the movement distance a·t of the particles in the X-axis direction can be obtained after the preset time; because the conversion between the real scene and the VR viewing distance has a certain error, the current movement amount of the X axis is finally obtained as a·t + b after a certain correction, namely the velocity correction amount, and therefore the offset velocity in the X-axis direction is (a·t + b)/t. After the value of a·t + b is obtained, it may be divided into a·t and b according to a predetermined ratio; for example, assuming the predetermined ratio is 9:1 and the movement amount is 10, a·t is 9 and b is 1. Similarly, the offset rates (c·t + d)/t and (e·t + f)/t can be obtained in the Y-axis and Z-axis directions.
Further, (v+dv)·tan(yaw horizontal angle), (v+dv)·tan(yaw vertical angle) and (v+dv) may be expressed as the velocity components in the X-axis, Y-axis and Z-axis directions obtained by decomposing the real travel velocity of the vehicle according to the course angle; the course angle may be vector-decomposed into a horizontal angle and a vertical angle to obtain the yaw horizontal angle and the yaw vertical angle.
Alpha, beta and gamma are proportionality coefficients between the running speed of the vehicle and the particle velocity within the VR visual range; they relate the velocity in VR to the velocity in reality. The calculation may be derived when designing the application scene, and certain algorithmic rules may be added according to the position and direction of the user's viewing angle. In general, each proportionality coefficient may be a preset value or a dynamic value, but numerical calibration is required.
In addition, when the running state of the vehicle corresponds to the movement state of the particles through the formula, individual variability can be added to each particle, namely, a small amount of randomness is added after the corresponding relation, so that the movement tracks of the particles are not completely identical.
In this way, the velocity components and velocity correction amounts of the particles in the X-axis, Y-axis and Z-axis directions in the vehicle-mounted VR scene are obtained through calculation, the real running state of the vehicle and the movement velocity of the particles can be related, and the two correspond to each other numerically, which ensures the synchronization of the vehicle speed and the particle movement velocity and prevents the picture from appearing visually stiff and mechanical to the user.
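The sketch below illustrates one possible reading of these relations: the per-axis movement amount is taken as α·(v+dv)·tan(yaw horizontal angle)·t for the X axis (and analogously for the Y and Z axes), split into a component part and a correction part by a 9:1 ratio, with a small per-particle random term added so that trajectories are not identical. The default coefficients, the split ratio and the jitter magnitude are assumptions for illustration, not values fixed by the disclosure:

```python
import math
import random

def axis_movement_amounts(v, dv, yaw_h_deg, yaw_v_deg, t,
                          alpha=1.0, beta=1.0, gamma=1.0,
                          split_ratio=0.9, jitter=0.02):
    """Per-axis movement amounts (a*t + b, c*t + d, e*t + f) for one particle.

    v, dv                : running speed and speed variation
    yaw_h_deg, yaw_v_deg : horizontal and vertical components of the course angle (degrees)
    t                    : preset time (seconds)
    alpha, beta, gamma   : travel-speed to line-of-sight-rate coefficients (assumed values)
    split_ratio          : share attributed to the component term, e.g. 9:1
    jitter               : small individual variability so particle tracks differ
    """
    x_amt = alpha * (v + dv) * math.tan(math.radians(yaw_h_deg)) * t
    y_amt = beta * (v + dv) * math.tan(math.radians(yaw_v_deg)) * t
    z_amt = gamma * (v + dv) * t
    amounts = []
    for amt in (x_amt, y_amt, z_amt):
        amt *= 1.0 + random.uniform(-jitter, jitter)   # per-particle randomness
        component_part = split_ratio * amt             # e.g. a*t
        correction_part = (1.0 - split_ratio) * amt    # e.g. b
        amounts.append(component_part + correction_part)
    return tuple(amounts)

print(axis_movement_amounts(v=13.9, dv=0.3, yaw_h_deg=5.0, yaw_v_deg=0.0, t=0.02))
```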
In some embodiments, determining the amount of movement of the particle according to the velocity component and the velocity correction amount of the particle in the X-axis, Y-axis, and Z-axis directions in the in-vehicle VR scene, respectively, further comprises:
determining a gravitational acceleration and an air resistance value in a vehicle-mounted VR scene; updating the movement amount of the particles in the Y-axis direction according to the gravity acceleration; the movement amount of the particles in the Z-axis direction is updated based on the air resistance value.
Specifically, a gravitational acceleration component may be added during particle motion. The gravitational acceleration in the vehicle-mounted VR scene may be calculated from g and h, where g is the gravitational acceleration of the area where the vehicle is located and may take a value such as 9.8, 9.9 or 10.0, h is a gravity parameter obtained by proportionally converting the real scene into the VR scene and may be taken as the average weight of a human body, and Gy represents the gravitational acceleration in the vehicle-mounted VR scene.
When the movement amount of the particles in the Y-axis direction is updated based on the gravitational acceleration, the updated movement amount in the Y-axis direction may be c·t + d + Gy.
In addition, an air resistance component may be added during particle motion. The air resistance may be calculated by the formula Gz = p·t, where p is a preset value related to air resistance, which may be measured by a sensor pre-installed on the vehicle and converted into the VR line-of-sight space, and Gz represents the air resistance value in the vehicle-mounted VR scene. When the movement amount of the particles in the Z-axis direction is updated based on the air resistance value, the updated movement amount in the Z-axis direction is e·t + f + Gz.
In addition, it should be noted that, during the particle movement process, the particle can be detected, whether the particle collides is judged, and when the particle collision exists during the particle movement process, the particle track after the collision can be corrected by the momentum conservation law, and meanwhile, the collision can be set as an elastic collision or an inelastic collision; when it is detected that there is a collision tendency of the particles during the movement of the particles, the movement amount of the particles having a collision tendency may be increased or decreased by a preset value, which may be 0.5 unit number or 1 unit number.
By adding the gravity component and the air resistance component into the calculation result, the factors influencing the running state of the vehicle in the real environment are added into the particle running state, so that the movement of the particles is more real, the running process of the actual vehicle is more met, the consistency of the particle running state and the vehicle running state is improved, the user can take the movement of the particles as the reaction of the running state of the vehicle, the reference is provided for the perception of the user, and the condition that dizziness occurs when the user uses the vehicle-mounted VR equipment is reduced.
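A small sketch of this environment update, where the gravity term Gy is assumed to be supplied already converted into the VR scene (its exact derivation from g and h is not reproduced here) and the air resistance follows Gz = p·t:

```python
def apply_environment(y_movement, z_movement, t, gy, p):
    """Update the Y- and Z-axis movement amounts with gravity and air resistance.

    y_movement, z_movement : c*t + d and e*t + f before the update
    gy : gravity term already converted into the VR scene (derived from g and h)
    p  : preset air-resistance value, so that Gz = p * t
    """
    gz = p * t
    return y_movement + gy, z_movement + gz

# Illustrative values only; gy and p are assumptions, not values from the disclosure.
print(apply_environment(y_movement=0.01, z_movement=0.28, t=0.02, gy=-0.004, p=-0.05))
```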
In some embodiments, before determining the amount of movement of the particles in the in-vehicle VR scene according to the travel speed and the heading angle, further comprising:
controlling a default position in a three-dimensional coordinate system pre-established in the VR space to generate an initial particle; and generating N particles at predetermined intervals centered on the position of the initial particle, wherein the positions of the N particles are (X0+i, Y0+j, Z0+k), i, j and k are random values drawn from normally distributed VR line-of-sight space units, and N is greater than or equal to 1.
Specifically, as shown in fig. 3, the default position may be on a coordinate axis, may be at an origin, or may be at any position, and the embodiment is not limited specifically.
In addition, in this embodiment, N new particles may be generated around the initial particle, taking the initial particle as the center of a circle; the predetermined interval may be 20 ms, 40 ms, 60 ms, or the like, and the number N of particles generated each time may be a fixed value such as 10, 20 or 30, a default number of the system, or a value defined by the user. A preset area may also be defined in the coordinate system: when a particle moves out of the preset area, it may gradually disappear or disappear immediately, or it may continue to collide within the preset area and gradually disappear once the number of collisions reaches a certain value. i, j and k are random values drawn from normally distributed VR line-of-sight space units; to match the normal experience of users they may be limited to 10 to 100, and they may also be changed according to the application scene.
By establishing the three-dimensional coordinate system and determining the particle motion area in the coordinate system, the motion track of the particles in each period is approximately the same as that in the previous period, which ensures the continuity and consistency of the particles and prevents the motion area of the particles from changing greatly and suddenly, so that the movement of the particles conforms to the basic cognition of the user. At the same time, the motion tracks of individual particles differ from one another, so each particle has a certain distinctiveness, a visually stiff impression is avoided, and the interest of the particle movement is improved.
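An illustrative sketch of the particle generation step; the clamping of the offsets to the 10-100 range, their positive sign and the standard deviation used are assumptions for illustration:

```python
import random

def spawn_particles(origin, n=10, sigma=30.0, lo=10.0, hi=100.0):
    """Generate N particles around the initial particle at origin = (X0, Y0, Z0).

    The offsets i, j, k are drawn from a normal distribution in VR line-of-sight
    space units and limited to the 10-100 range mentioned above.
    """
    x0, y0, z0 = origin
    particles = []
    for _ in range(n):
        i, j, k = (min(max(abs(random.gauss(0.0, sigma)), lo), hi) for _ in range(3))
        particles.append((x0 + i, y0 + j, z0 + k))
    return particles

print(spawn_particles(origin=(0.0, 0.0, 0.0), n=3))
```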
In some embodiments, after determining the movement amount of the particles in the in-vehicle VR scene according to the travel speed and the heading angle, the method further includes:
determining a target position of the particle after the preset time according to the current position, the moving direction and the movement amount of the particle by the following formulas:
Xn = X(n-1) + (a·t + b)
Yn = Y(n-1) + (c·t + d)
Zn = Z(n-1) + (e·t + f)
wherein X(n-1) represents the current position of the particle on the X axis, Xn represents the position of the particle on the X axis after the preset time in the moving direction, Y(n-1) represents the current position of the particle on the Y axis, Yn represents the position of the particle on the Y axis after the preset time in the moving direction, Z(n-1) represents the current position of the particle on the Z axis, and Zn represents the position of the particle on the Z axis after the preset time in the moving direction; t represents the preset time; the X axis, the Y axis and the Z axis form a three-dimensional coordinate system established in advance in the VR space; a, c and e represent the velocity components of the particle in the X-axis, Y-axis and Z-axis directions in the vehicle-mounted VR scene, respectively, and b, d and f represent the velocity corrections of the particle in the X-axis, Y-axis and Z-axis directions in the vehicle-mounted VR scene, respectively.
Specifically, a·t + b, c·t + d and e·t + f may be calculated by the formulas given above. For example, assuming a·t + b = 1, c·t + d = 1, e·t + f = 1 and n = 1, then X(n-1) is X0, Y(n-1) is Y0 and Z(n-1) is Z0, the current coordinates are (X0, Y0, Z0), and X1 = X0 + 1, Y1 = Y0 + 1, Z1 = Z0 + 1, that is, the coordinates of the particle become (X0+1, Y0+1, Z0+1).
The current position of the particles is obtained through the formula, so that the position of each particle is accurately positioned, normal movement of the particles is ensured, and meanwhile, each particle is determined according to the previous position, so that when the movement state of the vehicle is changed, the movement track of the particles can be rapidly changed, and the consistency of the particles and the movement state of the vehicle is ensured.
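A minimal sketch of the position update Xn = X(n-1) + (a·t + b) and its Y and Z analogues:

```python
def step_position(position, movements):
    """Advance a particle by one preset-time step:
    X_n = X_(n-1) + (a*t + b), and likewise for the Y and Z axes."""
    return tuple(p + m for p, m in zip(position, movements))

pos = (0.0, 0.0, 0.0)                      # (X0, Y0, Z0)
pos = step_position(pos, (1.0, 1.0, 1.0))  # a*t+b = c*t+d = e*t+f = 1
print(pos)                                 # (1.0, 1.0, 1.0), matching the worked example above
```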
It should be understood that the sequence numbers of the steps in the foregoing embodiments do not imply an order of execution; the execution order of the processes should be determined by their functions and internal logic, and should not constitute any limitation on the implementation process of the embodiments of the present application.
Any combination of the above optional solutions may be adopted to form an optional embodiment of the present application, which is not described herein in detail.
The following are device embodiments of the present application, which may be used to perform method embodiments of the present application. For details not disclosed in the device embodiments of the present application, please refer to the method embodiments of the present application.
Fig. 4 is a schematic diagram of a motion control device for particles in a vehicle-mounted VR scene according to an embodiment of the present application. As shown in fig. 4, the apparatus includes:
a data collection module 401, configured to obtain driving status data of the vehicle, where the driving status data includes a driving speed and a heading angle of the vehicle;
the direction determining module 402 is configured to determine an opposite direction of the driving speed, and determine an arbitrary direction within a preset angle range centered on the opposite direction as a moving direction of the particles in the vehicle-mounted virtual reality VR scene;
the speed calculation module 403 is configured to determine, according to the running speed and the heading angle, a movement amount of the particles in the vehicle-mounted VR scene per preset time;
the motion control module 404 is configured to control the particles in the on-vehicle VR scene to move in the moving direction and the moving amount according to the moving direction and the moving amount of the particles.
In some embodiments, the data collection module 401 is further to:
determining the refresh frequency of the display image in the vehicle-mounted VR scene; determining a generation period of the particles according to the refresh frequency; and acquiring the running state data every integer multiple of the generation period of the particles.
In some embodiments, the rate calculation module 403 is further configured to:
according to the running speed, the preset time and the course angle, determining velocity components and velocity correction amounts of particles in X-axis, Y-axis and Z-axis directions in the vehicle-mounted VR scene respectively; wherein, the X axis, the Y axis and the Z axis are three-dimensional coordinate systems established in the VR space in advance; and determining the movement amount of the particles according to the velocity components and the velocity correction amounts of the particles in the X-axis, Y-axis and Z-axis directions in the vehicle-mounted VR scene.
In some embodiments, the rate calculation module 403 is further configured to:
the velocity components and velocity correction amounts of the particles in the X-axis, Y-axis and Z-axis directions in the vehicle-mounted VR scene are calculated by the following formulas:
wherein a represents a velocity component of the particles in an X-axis direction in the in-vehicle VR scene, b represents a velocity correction amount of the particles in the X-axis direction in the in-vehicle VR scene, c represents a velocity component of the particles in a Y-axis direction in the in-vehicle VR scene, d represents a velocity correction amount of the particles in the Y-axis direction in the in-vehicle VR scene, e represents a velocity component of the particles in a Z-axis direction in the in-vehicle VR scene, and f represents a velocity correction amount of the particles in the Z-axis direction in the in-vehicle VR scene; t represents a preset time; dv is the speed variation; α represents a proportionality coefficient of the travel speed to the line-of-sight rate in the X-axis direction in the in-vehicle VR scene, β represents a proportionality coefficient of the travel speed to the line-of-sight rate in the Y-axis direction, and γ represents a proportionality coefficient of the travel speed to the line-of-sight rate in the Z-axis direction.
In some embodiments, the motion control module 404 is further configured to:
determining a gravitational acceleration and an air resistance value in a vehicle-mounted VR scene; updating the movement amount of the particles in the Y-axis direction according to the gravity acceleration; the movement amount of the particles in the Z-axis direction is updated based on the air resistance value.
In some embodiments, the rate calculation module 403 is further configured to:
controlling a default position in a three-dimensional coordinate system pre-established in the VR space to generate an initial particle; and generating N particles at predetermined intervals centered on the position of the initial particle, wherein the positions of the N particles are (X0+i, Y0+j, Z0+k), i, j and k are random values drawn from normally distributed VR line-of-sight space units, and N is greater than or equal to 1.
In some embodiments, the rate calculation module 403 is further configured to:
determining a target position of the particle after the preset time according to the current position, the moving direction and the movement amount of the particle by the following formulas:
Xn = X(n-1) + (a·t + b)
Yn = Y(n-1) + (c·t + d)
Zn = Z(n-1) + (e·t + f)
wherein X(n-1) represents the current position of the particle on the X axis, Xn represents the position of the particle on the X axis after the preset time in the moving direction, Y(n-1) represents the current position of the particle on the Y axis, Yn represents the position of the particle on the Y axis after the preset time in the moving direction, Z(n-1) represents the current position of the particle on the Z axis, and Zn represents the position of the particle on the Z axis after the preset time in the moving direction; t represents the preset time; the X axis, the Y axis and the Z axis form a three-dimensional coordinate system established in advance in the VR space; a, c and e represent the velocity components of the particle in the X-axis, Y-axis and Z-axis directions in the vehicle-mounted VR scene, respectively, and b, d and f represent the velocity corrections of the particle in the X-axis, Y-axis and Z-axis directions in the vehicle-mounted VR scene, respectively.
It should be noted that, the apparatus provided in this embodiment can implement all the method steps executed by the method and can implement the same technical effects, which are not described herein again.
Fig. 5 is a schematic diagram of an electronic device 5 provided in an embodiment of the present application. As shown in fig. 5, the electronic apparatus 5 of this embodiment includes: a processor 501, a memory 502 and a computer program 503 stored in the memory 502 and executable on the processor 501. The steps of the various method embodiments described above are implemented by processor 501 when executing computer program 503. Alternatively, the processor 501, when executing the computer program 503, performs the functions of the modules/units in the above-described apparatus embodiments.
The electronic device 5 may be a desktop computer, a notebook computer, a palm computer, a cloud server, or the like. The electronic device 5 may include, but is not limited to, a processor 501 and a memory 502. It will be appreciated by those skilled in the art that fig. 5 is merely an example of the electronic device 5 and is not limiting of the electronic device 5 and may include more or fewer components than shown, or different components.
The processor 501 may be a central processing unit (Central Processing Unit, CPU) or other general purpose processor, digital signal processor (Digital Signal Processor, DSP), application specific integrated circuit (Application Specific Integrated Circuit, ASIC), field programmable gate array (Field-Programmable Gate Array, FPGA) or other programmable logic device, discrete gate or transistor logic device, discrete hardware components, or the like.
The memory 502 may be an internal storage unit of the electronic device 5, for example, a hard disk or a memory of the electronic device 5. The memory 502 may also be an external storage device of the electronic device 5, for example, a plug-in hard disk, a Smart Media Card (SMC), a Secure Digital (SD) Card, a Flash Card (Flash Card) or the like, which are provided on the electronic device 5. Memory 502 may also include both internal storage units and external storage devices of electronic device 5. The memory 502 is used to store computer programs and other programs and data required by the electronic device.
It will be apparent to those skilled in the art that, for convenience and brevity of description, only the above-described division of the functional units and modules is illustrated, and in practical application, the above-described functional distribution may be performed by different functional units and modules according to needs, i.e. the internal structure of the apparatus is divided into different functional units or modules to perform all or part of the above-described functions. The functional units and modules in the embodiment may be integrated in one processing unit, or each unit may exist alone physically, or two or more units may be integrated in one unit, where the integrated units may be implemented in a form of hardware or a form of a software functional unit.
The integrated modules/units may be stored in a readable storage medium if implemented in the form of software functional units and sold or used as stand-alone products. Based on such understanding, the present application implements all or part of the flow in the method of the above embodiment, or may be implemented by a computer program to instruct related hardware, where the computer program may be stored in a readable storage medium, where the computer program may implement the steps of the method embodiments described above when executed by a processor. The computer program may comprise computer program code, which may be in source code form, object code form, executable file or in some intermediate form, etc. The readable storage medium may include: any entity or device capable of carrying computer program code, a recording medium, a U disk, a removable hard disk, a magnetic disk, an optical disk, a computer Memory, a Read-Only Memory (ROM), a random access Memory (Random Access Memory, RAM), an electrical carrier signal, a telecommunications signal, a software distribution medium, and so forth.
The above embodiments are only for illustrating the technical solution of the present application, and are not limiting thereof; although the present application has been described in detail with reference to the foregoing embodiments, it should be understood by those of ordinary skill in the art that: the technical scheme described in the foregoing embodiments can be modified or some technical features thereof can be replaced by equivalents; such modifications and substitutions do not depart from the spirit and scope of the technical solutions of the embodiments of the present application, and are intended to be included in the scope of the present application.

Claims (10)

1. The method for controlling the movement of the particles in the vehicle-mounted VR scene is characterized by comprising the following steps of:
acquiring running state data of a vehicle, wherein the running state data comprises the running speed and the course angle of the vehicle;
determining the opposite direction of the running speed, and determining any direction in a preset angle range taking the opposite direction as a center as the moving direction of particles in the vehicle-mounted Virtual Reality (VR) scene;
according to the running speed and the course angle, determining the movement amount of particles in the vehicle-mounted VR scene after preset time;
and controlling particles in the vehicle-mounted VR scene to move in the moving direction and the moving amount according to the moving direction and the moving amount of the particles.
2. The method for controlling movement of particles in an in-vehicle VR scene according to claim 1, wherein said acquiring running state data of a vehicle comprises:
determining the refreshing frequency of a display image in the vehicle-mounted VR scene;
determining a generation period of the particles according to the refresh frequency;
and acquiring the driving state data every integer multiple of the generation period of the particles.
3. The method for controlling movement of particles in an in-vehicle VR scene according to claim 1, wherein determining the amount of movement of the particles in the in-vehicle VR scene per a preset time according to the travel speed and the heading angle comprises:
determining velocity components and velocity correction amounts of the particles in X-axis, Y-axis and Z-axis directions in a vehicle-mounted VR scene according to the running speed, the preset time and the course angle; wherein the X axis, the Y axis and the Z axis are three-dimensional coordinate systems established in the VR space in advance;
and determining the movement amount of the particles according to the velocity components and the velocity correction amounts of the particles in the X-axis direction, the Y-axis direction and the Z-axis direction in the vehicle-mounted VR scene.
4. The method for controlling movement of particles in an in-vehicle VR scene according to claim 3, wherein determining velocity components and velocity corrections of the particles in X-axis, Y-axis, and Z-axis directions in the in-vehicle VR scene according to the travel speed, the preset time, and the heading angle, respectively, comprises:
calculating velocity components and velocity correction amounts of the particles in the X-axis, Y-axis and Z-axis directions in the vehicle-mounted VR scene respectively through the following formulas:
(a·t + b)/t = α·(v + dv)·tan(yaw horizontal angle)
(c·t + d)/t = β·(v + dv)·tan(yaw vertical angle)
(e·t + f)/t = γ·(v + dv), where v represents the running speed;
wherein a represents a velocity component of the particles in an X-axis direction in the in-vehicle VR scene, b represents a velocity correction amount of the particles in the X-axis direction in the in-vehicle VR scene, c represents a velocity component of the particles in a Y-axis direction in the in-vehicle VR scene, d represents a velocity correction amount of the particles in the Y-axis direction in the in-vehicle VR scene, e represents a velocity component of the particles in a Z-axis direction in the in-vehicle VR scene, and f represents a velocity correction amount of the particles in the Z-axis direction in the in-vehicle VR scene; t represents the preset time; dv is the speed variation; alpha represents a proportionality coefficient of the running speed and the line-of-sight rate in the X-axis direction in the vehicle-mounted VR scene, beta represents a proportionality coefficient of the running speed and the line-of-sight rate in the Y-axis direction, and gamma represents a proportionality coefficient of the running speed and the line-of-sight rate in the Z-axis direction; the yaw horizontal angle represents the component on the horizontal angle after vector decomposition of the course angle, and the yaw vertical angle represents the component on the vertical angle after vector decomposition of the course angle.
5. The method for controlling movement of particles in an in-vehicle VR scene according to claim 3, wherein determining the movement amount of the particles according to the velocity components and the velocity correction amounts of the particles in the X-axis, Y-axis, and Z-axis directions in the in-vehicle VR scene, respectively, further comprises:
determining a gravitational acceleration and an air resistance value in the vehicle-mounted VR scene;
updating the movement amount of the particles in the Y-axis direction according to the gravity acceleration;
and updating the movement amount of the particles in the Z-axis direction according to the air resistance value.
6. The method for controlling the movement of the particles in the VR scene in the vehicle according to claim 1, further comprising, before determining the movement amount of the particles in the VR scene in the vehicle according to the travel speed and the heading angle:
controlling a default position in a three-dimensional coordinate system pre-established in a VR space to generate an initial particle;
generating N particles at predetermined intervals centered on the position of the initial particle, wherein the positions of the N particles are (X0+i, Y0+j, Z0+k), wherein i, j and k are random values drawn from normally distributed VR line-of-sight space units, and N is greater than or equal to 1.
7. The method for controlling the movement of the particles in the VR scene in the vehicle according to claim 1, wherein after determining the movement amount of the particles in the VR scene in the vehicle according to the travel speed and the heading angle, the method further comprises:
determining a target position of the particle after the preset time according to the current position of the particle, the moving direction and the movement amount by the following formulas:
Xn = X(n-1) + (a·t + b)
Yn = Y(n-1) + (c·t + d)
Zn = Z(n-1) + (e·t + f)
wherein X(n-1) represents the current position of the particle on the X axis, Xn represents the position of the particle on the X axis after the preset time in the moving direction, Y(n-1) represents the current position of the particle on the Y axis, Yn represents the position of the particle on the Y axis after the preset time in the moving direction, Z(n-1) represents the current position of the particle on the Z axis, and Zn represents the position of the particle on the Z axis after the preset time in the moving direction; t represents the preset time; the X axis, the Y axis and the Z axis form a three-dimensional coordinate system established in advance in the VR space; a, c and e represent velocity components of the particle in the X-axis, Y-axis and Z-axis directions in the vehicle-mounted VR scene, respectively, and b, d and f represent velocity corrections of the particle in the X-axis, Y-axis and Z-axis directions in the vehicle-mounted VR scene, respectively.
8. A motion control device for particles in a VR scene on board a vehicle, comprising:
the data collection module is used for obtaining running state data of the vehicle, wherein the running state data comprise the running speed and the course angle of the vehicle;
the direction determining module is used for determining the opposite direction of the running speed and determining any direction in a preset angle range taking the opposite direction as the center as the moving direction of the particles in the vehicle-mounted VR scene;
the speed calculation module is used for determining the movement amount of particles in the vehicle-mounted VR scene after preset time according to the running speed and the course angle;
and the motion control module is used for controlling the particles in the vehicle-mounted VR scene to move in the moving direction and the moving amount according to the moving direction and the moving amount of the particles.
9. An electronic device comprising a memory, a processor and a computer program stored in the memory and executable on the processor, characterized in that the processor implements the steps of the method according to any of claims 1 to 7 when the computer program is executed.
10. A readable storage medium storing a computer program, characterized in that the computer program when executed by a processor implements the steps of the method according to any one of claims 1 to 7.
CN202311616381.1A 2023-11-28 2023-11-28 Method and device for controlling movement of particles in vehicle-mounted VR scene Pending CN117632339A (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN202311616381.1A CN117632339A (en) 2023-11-28 2023-11-28 Method and device for controlling movement of particles in vehicle-mounted VR scene

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
CN202311616381.1A CN117632339A (en) 2023-11-28 2023-11-28 Method and device for controlling movement of particles in vehicle-mounted VR scene

Publications (1)

Publication Number Publication Date
CN117632339A true CN117632339A (en) 2024-03-01

Family

ID=90022912

Family Applications (1)

Application Number Title Priority Date Filing Date
CN202311616381.1A Pending CN117632339A (en) 2023-11-28 2023-11-28 Method and device for controlling movement of particles in vehicle-mounted VR scene

Country Status (1)

Country Link
CN (1) CN117632339A (en)


Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination