CN115278204A - Display device using method, device, equipment and storage medium - Google Patents

Display device using method, device, equipment and storage medium


Publication number
CN115278204A
Authority
CN
China
Prior art keywords
display device
determining
vehicle
relationship
positioning
Prior art date
Legal status
Pending
Application number
CN202210919960.2A
Other languages
Chinese (zh)
Inventor
沈继
刘娇
林仁义
庄世政
张弢
Current Assignee
Zhejiang Geely Holding Group Co Ltd
Zhejiang Zeekr Intelligent Technology Co Ltd
Original Assignee
Zhejiang Geely Holding Group Co Ltd
Zhejiang Zeekr Intelligent Technology Co Ltd
Priority date
Filing date
Publication date
Application filed by Zhejiang Geely Holding Group Co Ltd, Zhejiang Zeekr Intelligent Technology Co Ltd filed Critical Zhejiang Geely Holding Group Co Ltd
Priority to CN202210919960.2A priority Critical patent/CN115278204A/en
Publication of CN115278204A publication Critical patent/CN115278204A/en
Pending legal-status Critical Current

Classifications

    • H ELECTRICITY
    • H04 ELECTRIC COMMUNICATION TECHNIQUE
    • H04N PICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N 13/00 Stereoscopic video systems; Multi-view video systems; Details thereof
    • H04N 13/30 Image reproducers
    • H04N 13/332 Displays for viewing with the aid of special glasses or head-mounted displays [HMD]
    • G PHYSICS
    • G07 CHECKING-DEVICES
    • G07C TIME OR ATTENDANCE REGISTERS; REGISTERING OR INDICATING THE WORKING OF MACHINES; GENERATING RANDOM NUMBERS; VOTING OR LOTTERY APPARATUS; ARRANGEMENTS, SYSTEMS OR APPARATUS FOR CHECKING NOT PROVIDED FOR ELSEWHERE
    • G07C 5/00 Registering or indicating the working of vehicles
    • G07C 5/08 Registering or indicating performance data other than driving, working, idle, or waiting time, with or without registering driving, working, idle or waiting time
    • G07C 5/0808 Diagnosing performance data
    • G PHYSICS
    • G07 CHECKING-DEVICES
    • G07C TIME OR ATTENDANCE REGISTERS; REGISTERING OR INDICATING THE WORKING OF MACHINES; GENERATING RANDOM NUMBERS; VOTING OR LOTTERY APPARATUS; ARRANGEMENTS, SYSTEMS OR APPARATUS FOR CHECKING NOT PROVIDED FOR ELSEWHERE
    • G07C 5/00 Registering or indicating the working of vehicles
    • G07C 5/08 Registering or indicating performance data other than driving, working, idle, or waiting time, with or without registering driving, working, idle or waiting time
    • G07C 5/12 Registering or indicating performance data other than driving, working, idle, or waiting time, with or without registering driving, working, idle or waiting time in graphical form
    • H ELECTRICITY
    • H04 ELECTRIC COMMUNICATION TECHNIQUE
    • H04N PICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N 13/00 Stereoscopic video systems; Multi-view video systems; Details thereof
    • H04N 13/10 Processing, recording or transmission of stereoscopic or multi-view image signals
    • H04N 13/106 Processing image signals
    • H04N 13/122 Improving the 3D impression of stereoscopic images by modifying image signal contents, e.g. by filtering or adding monoscopic depth cues
    • H ELECTRICITY
    • H04 ELECTRIC COMMUNICATION TECHNIQUE
    • H04N PICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N 13/00 Stereoscopic video systems; Multi-view video systems; Details thereof
    • H04N 13/30 Image reproducers
    • H04N 13/366 Image reproducers using viewer tracking

Landscapes

  • Engineering & Computer Science (AREA)
  • Multimedia (AREA)
  • Signal Processing (AREA)
  • Physics & Mathematics (AREA)
  • General Physics & Mathematics (AREA)
  • Controls And Circuits For Display Device (AREA)

Abstract

The application discloses a display device using method, apparatus, device and storage medium, wherein the method comprises the following steps: determining a first positioning relationship between a vehicle and the earth's surface, and determining a second positioning relationship between the vehicle and the display device; determining target motion trajectory data of the display device relative to the earth's surface based on the first positioning relationship and the second positioning relationship; and determining an output picture of the display device based on the target motion trajectory data. The application solves the technical problem in the prior art that, when a display device is used in a running vehicle, the displayed picture does not match the actual motion state, so that the user easily feels uncomfortable.

Description

Display device using method, device, equipment and storage medium
Technical Field
The application relates to the technical field of intelligent automobiles, and in particular to a display device using method, a display device using apparatus, a device and a storage medium.
Background
At present, various display devices are integrated into smart cars to meet users' entertainment needs during a journey; in particular, display devices such as virtual reality glasses, augmented reality glasses and mixed reality glasses are integrated into smart cars so that users can play games while travelling.
However, most current display devices are designed for users in a static condition, and users are prone to discomfort when using them in a running smart car. For example, user A sails at sea on a yacht while user B drives through a mountain area in a car, and both use head-mounted display devices to enter a shooting game on a network server; because the motion trajectories of their respective vehicles are inconsistent with the displayed pictures, user A and user B may feel uncomfortable, or even faint.
Disclosure of Invention
The application mainly aims to provide a display device using method, apparatus, device and storage medium, so as to solve the technical problem in the prior art that a user easily feels uncomfortable when using a display device in a running vehicle.
In order to achieve the above object, the present application provides a display device using method, including:
determining a first positioning relationship between a vehicle and the earth's surface, and determining a second positioning relationship between the vehicle and the display device;
determining target motion trajectory data of the display device relative to the earth's surface based on the first positioning relationship and the second positioning relationship;
and determining an output picture of the display device based on the target motion trajectory data.
Optionally, the step of determining the target motion trajectory data of the display device relative to the earth's surface based on the first positioning relationship and the second positioning relationship includes:
determining first motion trajectory data of the vehicle relative to the earth's surface based on the first positioning relationship;
determining second motion trajectory data of the display device relative to the vehicle based on the second positioning relationship;
and determining the target motion trajectory data of the display device relative to the earth's surface based on the first motion trajectory data and the second motion trajectory data.
Optionally, the step of determining target motion trajectory data of the display device relative to the earth's surface based on the first positioning relationship and the second positioning relationship includes:
determining a third positioning relationship between the display device and the earth's surface based on the first positioning relationship and the second positioning relationship;
and determining target motion trajectory data of the display device relative to the earth's surface based on the third positioning relationship.
Optionally, the first positioning relationship is determined by a first sensor disposed in the vehicle and the second positioning relationship is determined by a second sensor disposed in the vehicle, wherein the first sensor and the second sensor are disposed separately.
Optionally, before the step of determining the first positioning relationship between the vehicle and the earth's surface, the method includes:
determining whether a start instruction for starting a virtual scene is detected;
and if the start instruction is detected, executing the step of determining the first positioning relationship between the vehicle and the earth's surface.
Optionally, the step of determining an output picture of the display device based on the target motion trajectory data includes:
determining a first environment picture of the environment where the vehicle is located, and determining a second environment picture of the environment where the display device is located;
fusing the first environment picture and the second environment picture to obtain a fused picture;
and determining the output picture of the display device based on the target motion trajectory data and the fused picture.
Optionally, the third positioning relationship between the display device and the earth's surface includes a 6DoF positional relationship between the display device and the earth's surface, the 6DoF positional relationship including a fore-aft positional relationship, a left-right positional relationship, an up-down positional relationship, a roll angle relationship, a pitch angle relationship, and a yaw angle relationship.
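For concreteness, the following sketch (in Python; illustrative only, not part of the original disclosure) models such a 6DoF positioning relationship as a small data structure. The field names and the units (metres and radians) are assumptions; the application does not specify them.

```python
from dataclasses import dataclass

@dataclass
class Pose6DoF:
    """A 6DoF positioning relationship between two entities.

    Units are assumed: translations in metres, angles in radians.
    Axis convention follows the application: Z front-back, X left-right,
    Y up-down; roll about Z, pitch about X, yaw about Y.
    """
    fore_aft: float
    left_right: float
    up_down: float
    roll: float
    pitch: float
    yaw: float

# Example: a display device 0.4 m ahead of and 0.6 m above the vehicle
# origin, turned 90 degrees to the left (all values assumed).
device_in_vehicle = Pose6DoF(0.4, 0.0, 0.6, 0.0, 0.0, 1.5708)
```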
The present application also provides a display device using apparatus, the display device using apparatus including:
the first determining module is used for determining a first positioning relationship between a vehicle and the earth's surface and determining a second positioning relationship between the vehicle and the display device;
the second determining module is used for determining target motion trajectory data of the display device relative to the earth's surface based on the first positioning relationship and the second positioning relationship;
and the third determining module is used for determining an output picture of the display device based on the target motion trajectory data.
The present application further provides a display device using device, which is an entity node device and includes: a memory, a processor, and a program of the display device using method that is stored on the memory and executable on the processor; when the program is executed by the processor, the steps of the display device using method described above are implemented.
The present application also provides a storage medium on which a program implementing the display device using method is stored; when the program is executed by a processor, the steps of the display device using method described above are implemented.
Compared with the prior art, in which a user easily feels discomfort when using a display device in a running vehicle, the display device using method, apparatus, device and storage medium of the application determine a first positioning relationship between the vehicle and the earth's surface and a second positioning relationship between the vehicle and the display device; determine target motion trajectory data of the display device relative to the earth's surface based on the first positioning relationship and the second positioning relationship; and determine an output picture of the display device based on the target motion trajectory data. In the application, the target motion trajectory data of the display device relative to the earth's surface is determined from the two positioning relationships, and the output picture of the display device is then determined. That is, the output picture presented to the user is associated with the target motion trajectory data: the output game picture matches the motion state of the user (taking into account the motion trajectory data of the display device relative to the earth's surface), and the user experience is thereby improved.
Drawings
The accompanying drawings, which are incorporated in and constitute a part of this specification, illustrate embodiments consistent with the present application and together with the description, serve to explain the principles of the application.
In order to more clearly illustrate the embodiments of the present application or the technical solutions in the prior art, the drawings used in the description of the embodiments or the prior art are briefly described below; for those of ordinary skill in the art, other drawings can be obtained from these drawings without inventive effort.
FIG. 1 is a schematic flow chart illustrating a first embodiment of a method for using a display device according to the present application;
FIG. 2 is a schematic flow chart illustrating a second embodiment of a method for using a display device according to the present application;
fig. 3 is a schematic device structure diagram of a hardware operating environment according to an embodiment of the present application;
FIG. 4 is a schematic diagram of a first scenario involved in a method for using a display apparatus according to the present application;
FIG. 5 is a schematic diagram of a second scenario involved in a method of using a display device according to the present application;
FIG. 6 is a diagram illustrating a third scenario related to a method for using a display device according to the present application;
FIG. 7 is a diagram illustrating a fourth scenario involved in a method for using a display device according to the present application;
FIG. 8 is a schematic diagram of a fifth scenario involved in a method for using a display device according to the present application;
fig. 9 is a schematic diagram of a sixth scenario involved in a method for using the display device according to the present application.
The objectives, features, and advantages of the present application will be further described with reference to the accompanying drawings.
Detailed Description
It should be understood that the specific embodiments described herein are merely illustrative of the application and are not intended to limit it.
In a first embodiment of the method for using a display device, referring to fig. 1, the method includes:
step S10, determining a first positioning relationship between a vehicle and the earth's surface, and determining a second positioning relationship between the vehicle and the display device;
step S20, determining target motion trajectory data of the display device relative to the earth's surface based on the first positioning relationship and the second positioning relationship;
and step S30, determining an output picture of the display device based on the target motion trajectory data.
In the present example, the meaning of several parameters needs to be explained:
earth's surface: not limited to ground level; it may also be in the air or in an underground tunnel;
vehicle: for example a parachute, a vertical elevator or a horse, as well as a train, a ship, and the like;
display device: any device having a display function, not limited to VR, AR and MR devices, PADs, cameras, and the like;
external sensor (first sensor): a sensor that records data such as the coordinate position and altitude of the vehicle on a map;
built-in sensor (second sensor): a sensor that records the vehicle's level, steering, elevation and the like, used to confirm the position of the display device.
Here, "external" and "built-in" are relative to the vehicle.
In the present embodiment, the specific positional relationship among the vehicle, the earth's surface, the display device, the external sensor and the built-in sensor is shown in fig. 4 and fig. 5, where fig. 4 is a perspective view and fig. 5 is a front view. In fig. 4 and fig. 5, (1) or 1 denotes the earth's surface, (2) or 2 denotes the vehicle, (3) or 3 denotes the display device, and (4) or 4 and (5) or 5 denote the sensors: (4) or 4 is the sensor (external sensor) disposed outside the vehicle and used to determine the first positioning relationship between the vehicle and the earth's surface, while (5) or 5 is the sensor (built-in sensor) provided in the vehicle and used to determine the second positioning relationship between the vehicle and the display device.
In the present embodiment, the workflow relationship among the vehicle, the earth's surface, the display device, the external sensor and the built-in sensor is shown in fig. 7.
In fig. 7, the vehicle (2) or 2 is provided with an external sensor (4) or 4; the vehicle (2) or 2 is also correspondingly provided with a vehicle camera (a device for inputting external pictures, such as a vehicle-mounted camera), a built-in sensor 5, and preset parameters of the vehicle, so that the motion trajectory data of the vehicle can be acquired accurately.
The preset parameters of the vehicle include, for example, the external dimensions of the automobile and the dimensions of the interior devices, which can be rendered 1:1 in the virtual world to prevent the user from being injured by accidental contact or collision while in the virtual space. As for the external sensor: like a radar or third-party GPS, it may be physically located on the vehicle (2) or operate independently from the outside. The corresponding built-in sensor is a sensor mounted in the vehicle for positioning the display device, for example determining whether the user is sitting in the front or rear row, on the left or the right.
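For illustration only, the following minimal Python sketch shows how 1:1 preset dimensions could be used to warn a user before a real contact or collision. The axis-aligned bounding-box model of the cabin and all dimensions are assumptions, not part of the application.

```python
def inside_cabin(point, cabin_min, cabin_max):
    """True while a tracked point (e.g. the user's hand) stays within the
    1:1 virtual cabin built from the vehicle's preset dimensions, so an
    application could warn the user before a real collision."""
    return all(lo <= p <= hi
               for p, lo, hi in zip(point, cabin_min, cabin_max))

# Example: a 2.0 m x 1.5 m x 1.2 m cabin (assumed dimensions, metres).
print(inside_cabin((0.4, 0.8, 0.3), (0, 0, 0), (2.0, 1.5, 1.2)))  # True
```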
In this embodiment, as shown in fig. 7, the display device is further provided with a display device camera (a device for inputting external pictures, such as the camera on a VR or PAD device) and a display of the display device (an interface for viewing rendered images and for viewing and interacting with games and the like), so that the motion trajectory data and the display picture of the display device can be acquired accurately.
In the present embodiment, the motion trajectory of the vehicle 2 may be: positioning and movement trajectory data of 6DoF changes of the automobile, such as moving forward, moving backward, steering, ascending and descending.
In this embodiment, as shown in fig. 7, the overall implementation flow may be:
after the motion trajectory data of the vehicle and/or the motion trajectory data of the display device are fused, output data for displaying an output picture is obtained; for example, the output data is XR application content, from which a rendered image is obtained.
Specifically, the motion trajectory data of the vehicle and/or of the display device are subjected to entity positioning and four-dimensional-parameter fusion processing, respectively, to obtain the output data (XR application content or rendered images).
In this embodiment, the specific development context may be:
Background one: at present, various display devices are integrated into smart cars to meet users' entertainment needs during a journey; in particular, display devices such as virtual reality glasses, augmented reality glasses and mixed reality glasses are integrated into smart cars so that users can play games while travelling.
However, most current display devices are designed for users in a static condition, and users are prone to discomfort when using them in a running smart car. For example, user A sails at sea on a yacht while user B drives through a mountain area in a car, and both use head-mounted display devices to enter a shooting game on a network server; because the motion trajectories of their respective vehicles are inconsistent with the displayed pictures, user A and user B may feel uncomfortable, or even faint.
Background two: at present, it is difficult to accurately record data on a user's physical state while a vehicle is running; that is, only data on the user's physical state on flat ground can be recorded.
For example, a user's blood pressure, heart rate and similar data can be recorded on flat ground, but such data is difficult to record accurately while the user is riding in a vehicle or on a ship.
In this embodiment, the target motion trajectory data of the display device relative to the earth's surface is determined from the first positioning relationship between the vehicle and the earth's surface and the second positioning relationship between the vehicle and the display device, and the output picture of the display device is then determined. That is, in this application, the output picture presented to the user is associated with the target motion trajectory data: the output game picture matches the motion state of the user (taking into account the motion trajectory data of the display device relative to the earth's surface), and the user experience is thereby improved.
In the present embodiment, because the target motion trajectory data of the display device relative to the earth's surface is determined, the correspondingly acquired data on the user's physical state is no longer limited to flat-ground conditions but is referenced to the running vehicle; that is, the user's physical state while the vehicle is running can be determined accurately.
In the present embodiment, the operating state of the vehicle, such as resistance, thrust, weightlessness, inertia and centrifugal force, is combined with the content of the display device in a 6DoF manner, which can improve the experience of more kinds of display content, such as sports, landscape appreciation and electronic games.
In this embodiment, virtual somatosensory interaction can be realized even when a user passes through a tunnel, a mountain area or a similar region (with no signal, when lost, and so on).
In this embodiment, the step of determining the first positioning relationship between the vehicle and the earth's surface is performed only when the virtual scene is opened; that is, in the present application, use of the display device in this way is optional, so as to meet the needs of users in different scenes.
In the present embodiment, the display device using method is applied to a display device using apparatus, which belongs to a display device using system; the system includes cameras of different types, for example a camera disposed on the running vehicle and a camera disposed on the display device.
The method comprises the following specific steps:
step S10, determining a first positioning relationship between a vehicle and the earth's surface, and determining a second positioning relationship between the vehicle and the display device;
In this embodiment, the first positioning relationship between the vehicle and the earth's surface may be determined as follows:
the first positioning relationship is determined by a sensor arranged on the vehicle, i.e. by the external sensor.
An external sensor, like a radar or third-party GPS, may be physically located on the vehicle 2 or operate independently from the outside.
In this embodiment, the external sensor may also be a sensor formed by combining a radar, a camera, and the like.
In the present embodiment, the specific positioning relationship among the vehicle, the earth's surface and the display device is shown in fig. 4 and fig. 5, where fig. 4 is a perspective view and fig. 5 is a front view. Specifically, in the case where there is no magnetic field disturbance when the display device using device is initialized:
the Z axis of the earth's surface (1) is parallel to the earth's meridians;
the X axis of the earth's surface (1) is parallel to the earth's latitudes;
the coordinate point O1 of the earth's surface (1) lies, together with the coordinate point O2 of the vehicle (2), on the Y axis of the earth's surface (1) (thereby determining the initialized earth-surface coordinate system).
In this embodiment, it should be noted that the first positioning relationship includes a 6DoF position relationship, specifically a front-back positional relationship, a left-right positional relationship, an up-down positional relationship, a roll angle relationship, a pitch angle relationship and a yaw angle relationship; the specific positioning relationship is shown in fig. 6.
The specific way of determining the 6DoF position relationship between the vehicle (2) and the earth's surface (1) is as follows (it should be noted that at least one external sensor (4) is arranged on the vehicle (2)):
the coordinate point O2 of the vehicle (2) is determined based on the external sensor (4); the data of the X, Y and Z axes of the earth's surface (1) (the earth-surface coordinate system) are generated based on the external sensor (4); and the Z-axis front-back direction, the X-axis left-right direction and the Y-axis up-down direction of the vehicle (2) are then determined.
In this embodiment, the roll angle relationship, the pitch angle relationship and the yaw angle relationship are also determined. Specifically, the Z-axis roll angle of the vehicle (2) is acquired by the built-in sensor (5) of the vehicle (2); the X-axis pitch angle of the vehicle (2) is acquired by the built-in sensor (5) of the vehicle (2); and the Y-axis yaw angle of the vehicle (2) is acquired by the external sensor (4).
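As a rough illustration of how these readings could be assembled into the first positioning relationship, the Python sketch below combines the external sensor (coordinate point O2 and Y-axis yaw) with the built-in sensor (Z-axis roll and X-axis pitch). The sensor objects and their method names are hypothetical, not part of the application.

```python
def first_positioning_relationship(external_sensor, builtin_sensor):
    """Assemble the vehicle-to-surface 6DoF relationship from the two
    sensors, following the division of labour described above.
    Both sensor objects and their methods are hypothetical."""
    x, y, z = external_sensor.coordinate_point_o2()  # O2 in the surface frame
    return {
        "fore_aft": z,                     # Z axis: front-back
        "left_right": x,                   # X axis: left-right
        "up_down": y,                      # Y axis: up-down
        "roll": builtin_sensor.roll(),     # Z-axis roll, built-in sensor (5)
        "pitch": builtin_sensor.pitch(),   # X-axis pitch, built-in sensor (5)
        "yaw": external_sensor.yaw(),      # Y-axis yaw, external sensor (4)
    }
```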
In this embodiment, it should be noted that the second positioning relationship likewise includes a 6DoF position relationship, specifically a front-back positional relationship, a left-right positional relationship, an up-down positional relationship, a roll angle relationship, a pitch angle relationship and a yaw angle relationship.
In this embodiment, the second positioning relationship is obtained by the built-in sensor, a sensor installed in the vehicle that can be used to position the display device, for example to determine whether the user sits in the front row or the rear row, and on the left side or the right side.
The specific way of determining the 6DoF relationship between the display device (3) and the vehicle (2) is as follows (it should be noted that at least one positioning device (the built-in sensor (5)) is placed inside the vehicle (2)):
the coordinate point O3 of the display device (3) is determined by the built-in sensor (5) of the vehicle (2), and the data of the X, Y and Z axes of the vehicle (2) are generated. Specifically, the Z-axis front-back direction of the display device (3) is acquired by the built-in sensor (5) of the vehicle (2); the X-axis left-right direction of the display device (3) is acquired (determined by the Z axis of the display device (3)); the Y-axis up-down direction of the display device (3) is acquired (determined by the Z and X axes of the display device (3)); and the Z-axis roll angle, the X-axis pitch angle and the Y-axis yaw angle of the display device (3) are also acquired.
Step S20, determining target motion trajectory data of the display device relative to the earth's surface based on the first positioning relationship and the second positioning relationship;
In this embodiment, the target motion trajectory data of the display device relative to the earth's surface is determined based on the first positioning relationship and the second positioning relationship; the purpose of determining this data is that the use of the display device then takes into account the user's operating data in dynamic situations.
In this embodiment, the motion trajectory may specifically be: positioning and movement trajectory data with 6DoF changes, such as a forward-backward trajectory, a steering trajectory, an ascending-descending trajectory, and the like.
The step of determining the target motion trajectory data of the display device relative to the earth's surface based on the first positioning relationship and the second positioning relationship includes:
step S21, determining first motion trajectory data of the vehicle relative to the earth's surface based on the first positioning relationship;
step S22, determining second motion trajectory data of the display device relative to the vehicle based on the second positioning relationship;
and step S23, determining the target motion trajectory data of the display device relative to the earth's surface based on the first motion trajectory data and the second motion trajectory data.
In this embodiment, the target motion trajectory data is determined by fusing the first motion trajectory data and the second motion trajectory data.
In this embodiment, the first motion trajectory data is determined from the front-back, left-right and up-down positional relationships and the roll, pitch and yaw angle relationships of the vehicle relative to the earth's surface over a period of time. For example, these relationships are obtained at the 1st second, and again at the 3rd, 5th, 7th and 10th seconds, thereby obtaining the first motion trajectory data of the vehicle relative to the earth's surface within 10 seconds.
In the same manner, the second motion trajectory data of the display device relative to the vehicle (at the 1st, 3rd, 5th, 7th and 10th seconds) is obtained.
After the first motion trajectory data and the second motion trajectory data are obtained, they are fused into the target motion trajectory data of the display device relative to the earth's surface (at the 1st, 3rd, 5th, 7th and 10th seconds).
Specifically, to meet real-time requirements, after the first motion trajectory data and the second motion trajectory data within a 10-millisecond window are obtained, the two sets of trajectory data are fused to obtain the target motion trajectory data of the display device relative to the earth's surface. In other words, in this embodiment only the first and second motion trajectory data over a short time window are obtained and fused, so the real-time requirement for determining the target motion trajectory data can be met.
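One plausible reading of this fusion (an assumption; the application does not spell out the mathematics) is sample-by-sample composition of rigid-body transforms: at each timestamp, the device-to-surface pose is the vehicle-to-surface pose composed with the device-to-vehicle pose. The Python sketch below uses homogeneous 4x4 matrices; the Y-X-Z rotation order is likewise an assumption.

```python
import numpy as np

def pose_matrix(x, y, z, roll, pitch, yaw):
    """4x4 homogeneous transform from a 6DoF sample (roll about Z,
    pitch about X, yaw about Y, matching the axis convention above)."""
    cr, sr = np.cos(roll), np.sin(roll)
    cp, sp = np.cos(pitch), np.sin(pitch)
    cy, sy = np.cos(yaw), np.sin(yaw)
    Rz = np.array([[cr, -sr, 0], [sr, cr, 0], [0, 0, 1]])   # roll
    Rx = np.array([[1, 0, 0], [0, cp, -sp], [0, sp, cp]])   # pitch
    Ry = np.array([[cy, 0, sy], [0, 1, 0], [-sy, 0, cy]])   # yaw
    T = np.eye(4)
    T[:3, :3] = Ry @ Rx @ Rz   # assumed composition order
    T[:3, 3] = (x, y, z)
    return T

def fuse_trajectories(first_samples, second_samples):
    """For 6DoF samples taken at the same instants (e.g. every 10 ms),
    compose vehicle->surface with device->vehicle to obtain the target
    trajectory of the device relative to the earth's surface."""
    return [pose_matrix(*veh) @ pose_matrix(*dev)
            for veh, dev in zip(first_samples, second_samples)]
```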
That is, a display device (3) such as a VR device or a PAD determines its positioning and movement trajectory data of 6DoF changes through the detection of the two sets of sensors (4) and (5).
In this embodiment, the step of determining the target motion trajectory data of the display device relative to the earth's surface based on the first positioning relationship and the second positioning relationship includes:
step S24, determining a third positioning relationship between the display device and the earth's surface based on the first positioning relationship and the second positioning relationship;
and step S25, determining the target motion trajectory data of the display device relative to the earth's surface based on the third positioning relationship.
In this embodiment, the third positioning relationship between the display device and the earth's surface is determined directly, and the target motion trajectory data of the display device relative to the earth's surface is then determined, instead of fusing the trajectory data at fixed time intervals.
Overall, as shown in fig. 4, the second positioning relationship between (2) and (3) is obtained through (5), and the first positioning relationship between (1) and (2) is obtained through (4).
The specific way of determining the third positioning relationship (a 6DoF positional relationship) between the display device and the earth's surface is as follows:
the coordinate point O3 of the display device (3) is determined based on (5), and the third positioning relationship of the display device is generated based on the first positioning relationship and the second positioning relationship, for example by determining the X, Y and Z axes of the display device;
specifically, the Z-axis front-back direction of the display device (3) can be determined;
determining the left and right directions of an X axis of a display device (3);
determining the up-down direction of the Y axis of the display device (3);
determining a Z-axis rolling angle of the display device (3);
determining the X-axis pitch angle of the display device (3);
and determining the Y-axis yaw angle of the display device (3).
Specifically, the front-back direction in the first positioning relationship is the p1-p2 direction and the front-back direction in the second positioning relationship is the p3-p4 direction; the p5-p6 direction in the third positioning relationship can then be obtained by performing a pre-stored logic operation on the p1-p2 and p3-p4 directions.
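In the planar case, this logic operation can be illustrated by simply composing yaw angles: the device's fore-aft direction relative to the earth's surface is the vehicle's heading combined with the device's heading inside the vehicle. A minimal Python sketch, with all angle values assumed for illustration:

```python
import numpy as np

def heading_vector(yaw):
    """Unit fore-aft direction in the surface plane for a yaw angle (radians)."""
    return np.array([np.sin(yaw), np.cos(yaw)])

vehicle_yaw = np.deg2rad(15.0)   # p1-p2 direction of the first relationship (assumed)
device_yaw = np.deg2rad(-15.0)   # p3-p4 direction of the second relationship (assumed)

# Composing the two yaws yields the p5-p6 direction of the third relationship;
# here the opposite rotations cancel, leaving the device pointing straight ahead.
p5_p6 = heading_vector(vehicle_yaw + device_yaw)
```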
In this embodiment, a specific scenario is further provided, namely the scenario in which the third positioning relationship is determined from the first positioning relationship and the second positioning relationship, as shown in fig. 9:
in fig. 9, (1) denotes the earth's surface, (2) the vehicle, (3) the display device, (4) the external sensor and (5) the built-in sensor. The sensor (5) is disposed on (2) and connected to the display device (3); (5) and (1) are not connected, and (3) and (4) are not connected; both (2) and (3) have a GPS positioning function. Specifically, as shown in image C2 of fig. 9, the user wearing (3) moves left (second positioning relationship) while (2) moves right (first positioning relationship); as shown in image C3 of fig. 9, after a certain distance, the positional relationship between (3) and (1) has not changed (third positioning relationship).
And step S30, determining an output picture of the display device based on the target motion trajectory data.
In this embodiment, the target motion trajectory data, the size and initial positioning data of the vehicle, and the size and initial positioning data of the display device are fused.
The step of determining an output picture of the display device based on the target motion trajectory data includes:
step S31, determining a first environment picture of the environment where the vehicle is located, and determining a second environment picture of the environment where the display device is located;
step S32, fusing the first environment picture and the second environment picture to obtain a fused picture;
and step S33, determining the output picture of the display device based on the target motion trajectory data and the fused picture.
In the present embodiment, how to obtain the output picture of the display device is described.
First, the first environment picture of the surrounding environment is collected by the camera of the vehicle, and the second environment picture is collected by the camera of the display device.
The first environment picture and the second environment picture are then fused to obtain the fused picture.
After the fused picture is obtained, the output picture of the display device is determined based on the target motion trajectory data and the fused picture; specifically, the 6DoF change data in the target motion trajectory data is further fused with the fused picture to obtain the output picture. In one approach, the target motion trajectory data is converted into a first four-dimensional parameter set, which is then fused with the fused picture into the output picture. Alternatively, the first motion trajectory data is converted into a second four-dimensional parameter set, and the output picture is fused from the first four-dimensional parameter set, the second four-dimensional parameter set and the fused picture. Alternatively again, in this embodiment the size and initial positioning data of the vehicle and of the display device are also determined, and the output picture is fused from the size and initial positioning data of the vehicle, the size and initial positioning data of the display device, the first four-dimensional parameter set, the second four-dimensional parameter set and the fused picture.
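Under stated assumptions, the Python sketch below illustrates two of these steps: a four-dimensional parameter set modelled as timestamped 6DoF samples, and an equal-weight blend of two environment pictures of identical resolution. The actual fusion and rendering logic is not disclosed at this level of detail, so this is illustrative only.

```python
import numpy as np

def four_dimensional_parameter_set(timestamps, poses):
    """Pair each 6DoF sample with its timestamp: 6DoF change over time,
    matching the 'four-dimensional parameter set' described above."""
    return list(zip(timestamps, poses))

def fuse_pictures(vehicle_frame, device_frame, alpha=0.5):
    """Weighted blend of the vehicle's and the display device's environment
    pictures (both frames assumed to be uint8 arrays of the same shape)."""
    blended = (alpha * vehicle_frame.astype(np.float32)
               + (1.0 - alpha) * device_frame.astype(np.float32))
    return np.clip(blended, 0, 255).astype(np.uint8)
```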
Compared with the prior art, in which a user easily feels discomfort when using a display device in a running vehicle, the display device using method, apparatus, device and storage medium of the application determine a first positioning relationship between the vehicle and the earth's surface and a second positioning relationship between the vehicle and the display device; determine target motion trajectory data of the display device relative to the earth's surface based on the first positioning relationship and the second positioning relationship; and determine an output picture of the display device based on the target motion trajectory data. In the application, the target motion trajectory data of the display device relative to the earth's surface is determined from the two positioning relationships, and the output picture of the display device is then determined. That is, the output picture presented to the user is associated with the target motion trajectory data: the output game picture matches the motion state of the user (taking into account the motion trajectory data of the display device relative to the earth's surface), and the user experience is thereby improved.
Further, based on the above-described embodiments of the present application, another embodiment of the present application is provided, in which the first positioning relationship is determined by a first sensor provided in a vehicle, and the second positioning relationship is determined by a second sensor provided in the vehicle, wherein the first sensor and the second sensor are provided separately.
In this embodiment, the first sensor (external sensor) and the second sensor (built-in sensor) are arranged separately. The purposes of this separate arrangement are: to avoid the mutual measurement interference that would arise if the two sensors were placed close to each other; to meet the practical requirements of the application, since the first sensor is arranged outside the vehicle and the second sensor inside it (the display device is generally used inside the vehicle); and to obtain the first positioning relationship more accurately, because a sensor outside the vehicle is not affected by the damping of the vehicle's interior.
Further, based on the above-mentioned embodiments in the present application, another embodiment of the present application is provided, in which,
before the step of determining the first positioning relationship between the vehicle and the earth's surface, the method comprises:
step A1, determining whether a start instruction for starting a virtual scene is detected;
and step A2, if the start instruction is detected, executing the step of determining the first positioning relationship between the vehicle and the earth's surface.
In this embodiment, the step of determining the first positioning relationship between the vehicle and the earth's surface is performed only when the virtual scene is opened; that is, in the present application, use of the display device in this way is optional, so as to meet the needs of users in different scenes.
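A minimal Python sketch of this gating, assuming a hypothetical event stream and a callback that kicks off the positioning steps (the instruction name is invented for illustration):

```python
def run_when_scene_opened(events, start_positioning):
    """Execute the positioning pipeline (beginning with the first
    positioning relationship) only if a start instruction is detected."""
    for event in events:
        if event == "start_virtual_scene":   # hypothetical instruction name
            start_positioning()
            return True
    return False   # no start instruction: do nothing
```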
In this embodiment, a specific scenario one is further provided:
as shown in fig. 8, if no start instruction for opening the virtual scene is detected, the technical solution of this embodiment need not be executed in real time. If the start instruction is detected, each sensor is started and positioning is performed (the first positioning relationship and the second positioning relationship are determined, and the third positioning relationship is further determined). It should be noted that the display device need not be inside the vehicle to be positioned: it can be positioned anywhere within the sensing range (for example, around the vehicle). After positioning is started (a companion may be preset in the virtual scene through a preset button, improving the realism of the experience), the output picture is synchronized with the current scene (including displacement synchronization); that is, the visual perspective effects of the real and virtual scenes are synchronized. For example, when the vehicle goes up a slope, the synchronized effect is that the user feels gravity; when the vehicle descends, the synchronized effect is that the user feels weightlessness. Phenomena such as the push-back feeling of acceleration and braking inertia are likewise synchronized in the virtual world.
In the present embodiment, as shown in fig. 7, a specific application scenario of the vehicle, the earth's surface and the display device is also provided. Specifically:
01. external sensor: a device or system for locating or determining a relationship with the earth's surface, such as a radar or third-party GPS; the external sensor may be physically located on the vehicle 2 or run independently outside it;
02. motion trajectory of the vehicle 2: for example, the positioning and movement trajectory data of 6DoF changes such as moving forward, moving backward, steering, ascending and descending;
03. preset parameters of the vehicle 2: such as the external dimensions of the automobile and the dimensions of the interior devices, which can be rendered 1:1 as visual objects in the virtual world to prevent the user from being injured by accidentally touching things while in the virtual space;
04. built-in sensor: a sensor mounted in the vehicle for positioning the display device, for example determining whether the user is sitting in the front or rear row, on the left or the right, as included in the preset A2 picture; the positional relationship between the users has been determined before they board the vehicle;
05. motion trajectory of the display device 3: display device equipment such as VR or a PAD determines the positioning and movement trajectory data of 6DoF changes within the detection range of the two sets of sensors;
06. entity positioning parameters: including the size and initial positioning data of the vehicle and the display device;
07. four-dimensional parameter set: the 6DoF data changes produced by all entity units (including the display device, the vehicle, and the like) as time changes;
08. camera of the vehicle 2: a device for inputting external pictures, such as a vehicle-mounted camera;
09. camera of the display device 3: a device for inputting external pictures, such as the camera on a VR or PAD device;
10. XR application: fuses the data of 08 and 09 and outputs the result, for example image content produced with MR technology;
11. rendered image: after the four-dimensional parameter set and the XR application content are combined, the picture is rendered according to the content requirements and then output;
12. the display device 3: where the rendered images are viewed; the user can view and interact with games and the like.
In this embodiment, it should be noted that the data of items 02 and 03 and the data of items 04 and 05 are processed through items 06 and 07, respectively, after which the content of item 10 and/or item 11 can be obtained.
Referring to fig. 3, fig. 3 is a schematic device structure diagram of a hardware operating environment according to an embodiment of the present application.
As shown in fig. 3, the display device using device may include: a processor 1001, such as a CPU, a memory 1005 and a communication bus 1002. The communication bus 1002 realizes connection and communication between the processor 1001 and the memory 1005. The memory 1005 may be a high-speed RAM memory or a non-volatile memory (e.g., a magnetic disk memory), and may alternatively be a storage device separate from the aforementioned processor 1001.
Optionally, the display device using device may further include a user interface, a network interface, a camera, an RF (Radio Frequency) circuit, a sensor, an audio circuit, a WiFi module, and the like. The user interface may comprise a display screen (Display) and an input sub-module such as a keyboard (Keyboard), and may optionally also comprise a standard wired interface and a wireless interface. The network interface may optionally include a standard wired interface and a wireless interface (e.g., a WI-FI interface).
It will be understood by those skilled in the art that the structure shown in fig. 3 does not constitute a limitation of the display device using device, which may include more or fewer components than those shown, a combination of some components, or a different arrangement of components.
As shown in fig. 3, the memory 1005, as a storage medium, may include an operating system, a network communication module and a display device using program. The operating system manages and controls the hardware and software resources of the display device using device, and supports the operation of the display device using program as well as other software and/or programs. The network communication module is used for communication among the components within the memory 1005, and with other hardware and software in the display device using system.
In the display device using device shown in fig. 3, the processor 1001 is configured to execute the display device using program stored in the memory 1005 to implement the steps of any one of the display device using methods described above.
The specific implementation of the display device using apparatus of the present application is substantially the same as the embodiments of the display device using method described above, and is not described herein again.
The present application also provides a display device using apparatus, which is configured for:
determining a first positioning relationship between a vehicle and the earth's surface, and determining a second positioning relationship between the vehicle and the display device;
determining target motion trajectory data of the display device relative to the earth's surface based on the first positioning relationship and the second positioning relationship;
and determining an output picture of the display device based on the target motion trajectory data.
Optionally, the step of determining the target motion trajectory data of the display device relative to the earth's surface based on the first positioning relationship and the second positioning relationship includes:
determining first motion trajectory data of the vehicle relative to the earth's surface based on the first positioning relationship;
determining second motion trajectory data of the display device relative to the vehicle based on the second positioning relationship;
and determining the target motion trajectory data of the display device relative to the earth's surface based on the first motion trajectory data and the second motion trajectory data.
Optionally, the step of determining target motion trajectory data of the display device relative to the earth's surface based on the first positioning relationship and the second positioning relationship includes:
determining a third positioning relationship between the display device and the earth's surface based on the first positioning relationship and the second positioning relationship;
and determining target motion trajectory data of the display device relative to the earth's surface based on the third positioning relationship.
Optionally, the first positioning relationship is determined by a first sensor disposed in the vehicle and the second positioning relationship is determined by a second sensor disposed in the vehicle, wherein the first sensor and the second sensor are disposed separately.
Optionally, before the step of determining the first positioning relationship between the vehicle and the earth's surface, the method includes:
determining whether a start instruction for starting a virtual scene is detected;
and if the start instruction is detected, executing the step of determining the first positioning relationship between the vehicle and the earth's surface.
Optionally, the step of determining an output picture of the display device based on the target motion trajectory data includes:
determining a first environment picture of the environment where the vehicle is located, and determining a second environment picture of the environment where the display device is located;
fusing the first environment picture and the second environment picture to obtain a fused picture;
and determining the output picture of the display device based on the target motion trajectory data and the fused picture.
Optionally, the third positioning relationship between the display device and the earth's surface includes a 6DoF positional relationship between the display device and the earth's surface, the 6DoF positional relationship including a fore-aft positional relationship, a left-right positional relationship, an up-down positional relationship, a roll angle relationship, a pitch angle relationship, and a yaw angle relationship.
The specific implementation of the display device using apparatus of the present application is substantially the same as the embodiments of the display device using method, and is not described herein again.
An embodiment of the application provides a storage medium storing one or more programs, which can further be executed by one or more processors to implement the steps of the display device using method described above.
The specific implementation of the storage medium of the present application is substantially the same as the embodiments of the method for using the display device, and is not described herein again.
It should be noted that, in this document, the terms "comprises", "comprising", or any other variation thereof, are intended to cover a non-exclusive inclusion, such that a process, method, article or apparatus that comprises a list of elements includes not only those elements but also other elements not expressly listed, or elements inherent to such a process, method, article or apparatus. Without further limitation, an element defined by the phrase "comprising a ..." does not exclude the presence of other identical elements in the process, method, article or apparatus that comprises the element.
The above-mentioned serial numbers of the embodiments of the present invention are merely for description and do not represent the merits of the embodiments.
Through the above description of the embodiments, those skilled in the art will clearly understand that the method of the above embodiments can be implemented by software plus a necessary general hardware platform, and certainly can also be implemented by hardware, but in many cases, the former is a better implementation manner. Based on such understanding, the technical solutions of the present invention may be embodied in the form of a software product, which is stored in a storage medium (such as ROM/RAM, magnetic disk, optical disk) and includes instructions for enabling a terminal device (such as a mobile phone, a computer, a server, an air conditioner, or a network device) to execute the method according to the embodiments of the present invention.
The above description is only a preferred embodiment of the present invention, and not intended to limit the scope of the present invention, and all modifications of equivalent structures and equivalent processes, which are made by using the contents of the present specification and the accompanying drawings, or directly or indirectly applied to other related technical fields, are included in the scope of the present invention.

Claims (10)

1. A method for using a display device, the method comprising:
determining a first positioning relationship between a vehicle and the earth's surface, and determining a second positioning relationship between the vehicle and the display device;
determining target motion trajectory data of the display device relative to the earth's surface based on the first positioning relationship and the second positioning relationship;
and determining an output picture of the display device based on the target motion trajectory data.
2. The method for using a display device according to claim 1, wherein the step of determining the target motion trajectory data of the display device relative to the earth's surface based on the first positioning relationship and the second positioning relationship comprises:
determining first motion trajectory data of the vehicle relative to the earth's surface based on the first positioning relationship;
determining second motion trajectory data of the display device relative to the vehicle based on the second positioning relationship;
and determining the target motion trajectory data of the display device relative to the earth's surface based on the first motion trajectory data and the second motion trajectory data.
3. The method for using a display device according to claim 1, wherein the step of determining the target motion trajectory data of the display device relative to the earth's surface based on the first positioning relationship and the second positioning relationship comprises:
determining a third positioning relationship between the display device and the earth's surface based on the first positioning relationship and the second positioning relationship;
and determining the target motion trajectory data of the display device relative to the earth's surface based on the third positioning relationship.
4. The method for using a display device according to claim 1, wherein the first positioning relationship is determined by a first sensor disposed in the vehicle and the second positioning relationship is determined by a second sensor disposed in the vehicle, wherein the first sensor and the second sensor are disposed separately.
5. The method for using a display device according to claim 1, wherein before the step of determining the first positioning relationship between the vehicle and the earth's surface, the method comprises:
determining whether a start instruction for starting a virtual scene is detected;
and if the start instruction is detected, executing the step of determining the first positioning relationship between the vehicle and the earth's surface.
6. The method for using a display device according to claim 1, wherein the step of determining an output picture of the display device based on the target motion trajectory data comprises:
determining a first environment picture of the environment in which the vehicle is located, and determining a second environment picture of the environment in which the display device is located;
fusing the first environment picture and the second environment picture to obtain a fused picture;
and determining the output picture of the display device based on the target motion trajectory data and the fused picture.
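(Illustrative note.) Claim 6 leaves the fusion method open; a per-pixel alpha blend is one plausible instance. A sketch assuming both pictures are uint8 numpy arrays of identical shape, with illustrative function and parameter names:

    import numpy as np

    def fuse_pictures(first_env: np.ndarray, second_env: np.ndarray,
                      alpha: float = 0.5) -> np.ndarray:
        """Weighted per-pixel blend of the two environment pictures
        (HxWxC uint8 arrays of identical shape)."""
        blended = (alpha * first_env.astype(np.float32)
                   + (1.0 - alpha) * second_env.astype(np.float32))
        return np.clip(blended, 0, 255).astype(np.uint8)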
7. The method for using a display device according to claim 3, wherein the third positioning relationship between the display device and the earth's surface comprises a 6DoF (six-degrees-of-freedom) positioning relationship between the display device and the earth's surface, the 6DoF positioning relationship comprising a fore-aft positional relationship, a left-right positional relationship, an up-down positional relationship, a roll angle relationship, a pitch angle relationship, and a yaw angle relationship.
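(Illustrative note.) The six quantities of claim 7 map naturally onto a plain record type; the field names below are illustrative, not taken from the patent:

    from dataclasses import dataclass

    @dataclass
    class Pose6DoF:
        """Six degrees of freedom: three translations and three rotations."""
        fore_aft: float    # translation along the longitudinal axis
        left_right: float  # translation along the lateral axis
        up_down: float     # translation along the vertical axis
        roll: float        # rotation about the longitudinal axis (radians)
        pitch: float       # rotation about the lateral axis (radians)
        yaw: float         # rotation about the vertical axis (radians)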
8. A display device using apparatus, characterized by comprising:
the first determining module is used for determining a first positioning relationship between a vehicle and the earth's surface, and determining a second positioning relationship between the vehicle and the display device;
the second determining module is used for determining target motion trajectory data of the display device relative to the earth's surface based on the first positioning relationship and the second positioning relationship;
and the third determining module is used for determining an output picture of the display device based on the target motion trajectory data.
9. A display device using equipment, characterized by comprising: a memory, a processor, and a program that is stored on the memory and implements the method for using a display device, wherein:
the memory is used for storing the program for implementing the method for using a display device;
and the processor is used for executing the program for implementing the method for using a display device, so as to implement the steps of the method for using a display device according to any one of claims 1 to 7.
10. A storage medium, having stored thereon a program for implementing the method for using a display device, wherein the program is executed by a processor to implement the steps of the method for using a display device according to any one of claims 1 to 7.

Priority Applications (1)

Application Number: CN202210919960.2A
Priority Date: 2022-07-27
Filing Date: 2022-07-27
Title: Display device using method, device, equipment and storage medium

Publications (1)

Publication Number: CN115278204A
Publication Date: 2022-11-01

Family

ID=83747717

Family Applications (1)

Application Number: CN202210919960.2A
Title: Display device using method, device, equipment and storage medium
Priority Date: 2022-07-27
Filing Date: 2022-07-27
Status: Pending

Country Status (1)

Country: CN
Publication: CN115278204A (en)

Patent Citations (13)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US20090002142A1 (en) * 2006-01-25 2009-01-01 Akihiro Morimoto Image Display Device
US20150097863A1 (en) * 2013-10-03 2015-04-09 Honda Motor Co., Ltd. System and method for dynamic in-vehicle virtual reality
US20170078638A1 (en) * 2013-10-03 2017-03-16 Honda Motor Co., Ltd. System and method for dynamic in-vehicle virtual reality
CN104977717A (en) * 2014-04-14 2015-10-14 哈曼国际工业有限公司 Head mounted display presentation adjustment
CN106462232A (en) * 2014-05-01 2017-02-22 微软技术许可有限责任公司 Determining coordinate frames in a dynamic environment
US20180040162A1 (en) * 2016-08-05 2018-02-08 Uber Technologies, Inc. Virtual reality experience for a vehicle
CN109716266A * 2016-09-23 2019-05-03 苹果公司 Immersive virtual display
WO2019025212A1 (en) * 2017-08-04 2019-02-07 Robert Bosch Gmbh Method for controlling a vr presentation in a means of locomotion, and vr presentation device
CN109426345A (en) * 2017-08-31 2019-03-05 北京网秦天下科技有限公司 VR equipment and its adaptive display method
US20190130878A1 (en) * 2017-10-31 2019-05-02 Uber Technologies, Inc. Systems and Methods for Presenting Virtual Content in a Vehicle
US20200192479A1 (en) * 2018-12-18 2020-06-18 Immersion Corporation Systems and methods for integrating environmental haptics in virtual reality
US10767997B1 (en) * 2019-02-25 2020-09-08 Qualcomm Incorporated Systems and methods for providing immersive extended reality experiences on moving platforms
CN110007760A (en) * 2019-03-28 2019-07-12 京东方科技集团股份有限公司 Display control method, display control unit and display device

Similar Documents

Publication Publication Date Title
US11484790B2 (en) Reality vs virtual reality racing
EP3794851B1 (en) Shared environment for vehicle occupant and remote user
US11057574B2 (en) Systems, methods, and computer-readable media for using a video capture device to alleviate motion sickness via an augmented display for a passenger
KR101906711B1 (en) Active window for vehicle infomatics and virtual reality
US10169923B2 (en) Wearable display system that displays a workout guide
US20210346805A1 (en) Extended environmental using real-world environment data
CN111480194B (en) Information processing device, information processing method, program, display system, and moving object
CN108572722B (en) System and method for supporting augmented reality applications on a transport vehicle
EP3338136A1 (en) Augmented reality in vehicle platforms
US20160005333A1 (en) Real Time Car Driving Simulator
CN110478901A (en) Exchange method and system based on augmented reality equipment
WO2015176599A1 (en) Interaction method, interaction apparatus and user equipment
US10912916B2 (en) Electronic display adjustments to mitigate motion sickness
US20180182261A1 (en) Real Time Car Driving Simulator
KR20200005740A (en) Mobile sensor device for a head worn visual output device usable in a vehicle, and a method for operating a display system
KR101813018B1 (en) Appartus for providing 3d contents linked to vehicle and method thereof
CN115793852A (en) Method for acquiring operation indication based on cabin area, display method and related equipment
CN113343457B (en) Automatic driving simulation test method, device, equipment and storage medium
CN115278204A (en) Display device using method, device, equipment and storage medium
CN113041619B (en) Control method, device, equipment and medium for virtual vehicle
KR101881227B1 (en) Flight experience method using unmanned aerial vehicle
JP5672942B2 (en) Video display device, video display method, and program
EP4086102B1 (en) Navigation method and apparatus, electronic device, readable storage medium and computer program product
JP2019174523A (en) Display device that prompts posture opposite to inertial force and display control program and method
CN115035239B (en) Method and device for building virtual environment, computer equipment and vehicle

Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination
RJ01 Rejection of invention patent application after publication

Application publication date: 20221101