CN114326114A - Head-mounted display device control method, apparatus, device, and readable storage medium - Google Patents

Head-mounted display device control method, apparatus, device, and readable storage medium

Info

Publication number
CN114326114A
CN114326114A (application number CN202111434134.0A)
Authority
CN
China
Prior art keywords
head
mounted display
display device
wearing
pressure
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Pending
Application number
CN202111434134.0A
Other languages
Chinese (zh)
Inventor
王政轩
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Goertek Technology Co Ltd
Original Assignee
Goertek Optical Technology Co Ltd
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Goertek Optical Technology Co Ltd filed Critical Goertek Optical Technology Co Ltd
Priority to CN202111434134.0A priority Critical patent/CN114326114A/en
Publication of CN114326114A publication Critical patent/CN114326114A/en
Pending legal-status Critical Current

Abstract

Embodiments of the disclosure provide a method, apparatus, device, and readable storage medium for controlling a head-mounted display device. The method includes: obtaining a selected feature vector, the feature vector comprising features that affect the wearing pressure of the head-mounted display device; obtaining a mapping function of the feature vector and the wearing pressure; determining a predicted wearing pressure of the head-mounted display device according to the head-mounted display device's vector value for the feature vector and the mapping function; and adjusting the current wearing pressure of the head-mounted display device to the predicted wearing pressure.

Description

Head-mounted display device control method, apparatus, device, and readable storage medium
Technical Field
Embodiments of the disclosure relate to the technical field of virtual reality (VR) devices, and in particular to a method, apparatus, and device for controlling a head-mounted display device, and a readable storage medium.
Background
As virtual reality (VR) technology advances, VR headsets are becoming increasingly popular, and higher demands are placed on their wearing comfort and convenience.
At present, VR headsets can be divided into two types according to how they are worn: strap-type and motorized. For a strap-type VR headset, the wearing pressure is adjusted by changing the tightness or length of the strap. This requires manual adjustment, which is inconvenient for the user, and the strap also has a short service life. A motorized VR headset does not require manual adjustment by the user, but it cannot automatically adapt to changes in the application scene, so the user experience is relatively poor.
Therefore, it is necessary to provide a new method for controlling a head-mounted display device to adaptively adjust the wearing pressure of the head-mounted display device.
Disclosure of Invention
An object of the embodiments of the present disclosure is to provide a technical solution for controlling a head-mounted display device, so as to adaptively adjust a wearing pressure of the head-mounted display device according to an application scene.
According to a first aspect of embodiments of the present disclosure, there is provided a method of controlling a head-mounted display device, the method including:
obtaining a selected feature vector, wherein the feature vector comprises features that affect the wearing pressure of the head-mounted display device;
obtaining a mapping function of the feature vector and the wearing pressure;
determining a predicted wearing pressure of the head-mounted display device according to the vector value of the head-mounted display device for the feature vector and the mapping function;
adjusting the current wearing pressure of the head-mounted display device to the predicted wearing pressure.
Optionally, the feature vector comprises a wearing state feature;
wherein the wearing state feature includes at least one of: the pressure of the user's face against a wearing portion of the head-mounted display device, the distance between the user's eyes and a screen of the head-mounted display device, and the user's wearing posture.
Optionally, obtaining the mapping function of the feature vector and the wearing pressure includes:
acquiring wearing state data of the head-mounted display device as a training sample;
and training to obtain the mapping function according to the training sample's vector value for the feature vector, the application scene information corresponding to the training sample, and the actual wearing pressure corresponding to the application scene.
Optionally, before acquiring the wearing state data of the head-mounted display device as a training sample, the method further includes:
providing a configuration interface for setting the head-mounted display device;
and obtaining, in response to a user's operation on the configuration interface, application scene information and the actual wearing pressure corresponding to the application scene information.
According to a second aspect of the embodiments of the present disclosure, there is provided a control apparatus of a head-mounted display device, the apparatus including:
a first obtaining module, configured to obtain a selected feature vector, where the feature vector includes features that affect wearing pressure of the head-mounted display device;
the second obtaining module is used for obtaining a mapping function of the feature vector and the wearing pressure;
a determining module, configured to determine a predicted wearing pressure of the head-mounted display device according to the vector value of the head-mounted display device for the feature vector and the mapping function;
and the adjusting module is used for adjusting the current wearing pressure of the head-mounted display device to the predicted wearing pressure.
Optionally, the feature vector comprises a wearing state feature; wherein the wearing state feature includes at least one of: the pressure of the user's face against a wearing portion of the head-mounted display device, the distance between the user's eyes and a screen of the head-mounted display device, and the user's wearing posture.
Optionally, the second obtaining module includes:
the sample acquisition unit is used for acquiring wearing state data of the head-mounted display device as a training sample;
and the training unit is used for training to obtain the mapping function according to the training sample's vector value for the feature vector, the application scene information corresponding to the training sample, and the actual wearing pressure corresponding to the application scene.
Optionally, the apparatus further comprises:
the display module is used for providing a configuration interface for setting the head-mounted display device;
and the third acquisition module is used for responding to the operation of the user on the configuration interface and acquiring the application scene information and the actual wearing pressure corresponding to the application scene information.
According to a third aspect of the embodiments of the present disclosure, there is provided a head mounted display device including a driving mechanism, the head mounted display device further including:
a memory for storing executable computer instructions;
a processor, configured to execute, under control of the executable computer instructions, the control method according to the first aspect of the embodiments of the present disclosure;
wherein the driving mechanism is connected to the processor so as to adjust the current wearing pressure of the head-mounted display device to a target wearing pressure.
According to a fourth aspect of embodiments of the present disclosure, there is provided a computer readable storage medium having stored thereon computer instructions which, when executed by a processor, perform the method according to the first aspect of embodiments of the present disclosure.
According to the embodiments of the disclosure, a feature vector that affects the wearing pressure of the head-mounted display device is obtained, and the predicted wearing pressure of the head-mounted display device is determined from the device's vector value for the feature vector and the mapping function of the feature vector and the wearing pressure, so that the current wearing pressure of the head-mounted display device can be adjusted to the predicted wearing pressure. In addition, because the mapping function is obtained by training on a large number of training samples, determining the predicted wearing pressure with the mapping function improves the accuracy of the resulting prediction.
Other features and advantages of the disclosed embodiments will become apparent from the following detailed description of exemplary embodiments, which proceeds with reference to the accompanying drawings.
Drawings
To describe the technical solutions of the embodiments of the present disclosure more clearly, the drawings required by the embodiments are briefly introduced below. It should be appreciated that the following drawings depict only certain embodiments of the disclosure and are therefore not to be considered limiting of its scope. Those skilled in the art can derive other related drawings from these drawings without inventive effort.
Fig. 1 is a schematic hardware configuration diagram of a head-mounted display device that can be used to implement a control method of an embodiment;
FIG. 2 is a flow diagram of a method of controlling a head mounted display device according to one embodiment;
FIG. 3 is a functional block diagram of a control apparatus of a head mounted display device according to one embodiment;
fig. 4 is a hardware configuration diagram of a head-mounted display device according to an embodiment.
Detailed Description
Various exemplary embodiments of the present disclosure will now be described in detail with reference to the accompanying drawings. It should be noted that: the relative arrangement of parts and steps, numerical expressions, and numerical values set forth in these embodiments do not limit the scope of the embodiments of the present disclosure unless specifically stated otherwise.
The following description of at least one exemplary embodiment is merely illustrative in nature and is in no way intended to limit the disclosure, its application, or uses.
Techniques, methods, and apparatus known to those of ordinary skill in the relevant art may not be discussed in detail but are intended to be part of the specification where appropriate.
In all examples shown and discussed herein, any particular value should be construed as merely illustrative, and not limiting. Thus, other examples of the exemplary embodiments may have different values.
It should be noted that: like reference numbers and letters refer to like items in the following figures, and thus, once an item is defined in one figure, further discussion thereof is not required in subsequent figures.
< hardware configuration >
Fig. 1 is a hardware configuration diagram of a head-mounted display device that can be used to implement the control method of an embodiment.
As shown in fig. 1, the head mounted display apparatus 1000 may include a processor 1100, a memory 1200, an interface device 1300, a communication device 1400, a display device 1500, an input device 1600, a microphone 1700, and a speaker 1800. The processor 1100 may include, but is not limited to, a central processing unit CPU, a microprocessor MCU, or the like. The memory 1200 includes, for example, a ROM (read only memory), a RAM (random access memory), a nonvolatile memory such as a hard disk, and the like. The interface device 1300 includes, for example, various bus interfaces such as a serial bus interface (including a USB interface), a parallel bus interface, and the like. Communication device 1400 is capable of wired or wireless communication, for example. The display device 1500 is, for example, a liquid crystal display, an LED display, a touch display, or the like. The input device 1600 includes, for example, a touch screen, a keyboard, a handle, and the like. The microphone 1700 may be used for inputting voice information. The speaker 1800 may be used to output voice information.
The head-mounted display device 1000 may be, for example, a VR (Virtual Reality) device, an AR (Augmented Reality) device, an MR (Mixed Reality) device, and the like, which is not limited in this disclosure.
In this embodiment, the memory 1200 of the head mounted display device 1000 is used to store instructions for controlling the processor 1100 to operate to implement or support the implementation of a control method of a head mounted display device according to any of the embodiments. The skilled person can design the instructions according to the solution disclosed in the present specification. How the instructions control the operation of the processor is well known in the art and will not be described in detail herein.
Those skilled in the art should understand that, although a plurality of components of the head-mounted display device 1000 are illustrated in Fig. 1, the head-mounted display device 1000 of the embodiments of this specification may include only some of these components, and may also include other components, which is not limited herein.
The head mounted display device 1000 shown in FIG. 1 is illustrative only and is not intended to limit the present description, its applications, or uses in any way.
Various embodiments and examples according to the present disclosure are described below with reference to the drawings.
< method examples >
Fig. 2 illustrates a control method of a head mounted display device according to an embodiment of the present disclosure, which may be implemented by, for example, the head mounted display device 1000 shown in fig. 1.
As shown in fig. 2, the method of controlling the head mounted display device provided by this embodiment may include the following steps S2100 to S2400.
Step S2100, obtaining a selected feature vector, wherein the feature vector includes features affecting the wearing pressure of the head-mounted display device.
In this embodiment, the feature vector may be a user-selected feature that affects the wearing pressure of the head-mounted display device.
In this embodiment, during the user's use of the head-mounted display device, the user's wearing state differs for different application scenes. For example, in an intense, fast-paced shooting game scene, the user is usually in a tense state, the pressure of the user's face against the wearing portion of the head-mounted display device becomes larger, and the user's head tends to lean forward, so the distance between the user's eyes and the screen of the head-mounted display device decreases. In a relaxed scene, for example a dance or music game scene, the user is usually in a relaxed state, the pressure of the user's face on the wearing portion may become smaller, and the user's head may lean back or the user may lie on their side, so the distance between the user's eyes and the screen of the head-mounted display device may increase. Based on this, features reflecting the wearing state can be taken as features that affect the wearing pressure of the head-mounted display device.
In one embodiment, the feature vector includes a wear state feature. The wearing state feature may include at least one of a pressure of a user's face with respect to a wearing portion of the head mounted display device, a distance between user's eyes and a screen of the head mounted display device, a user wearing posture.
In this embodiment, the head-mounted display device is provided with a wearing portion by which the head-mounted display device can be fixed to the head of the user during use of the head-mounted display device by the user. The wearing portion may be a band, for example.
In a specific example, the pressure between the wearing part of the head-mounted display device and the face of the user may be a pressure value of the face of the user relative to the wearing part detected by the pressure sensor.
In a specific implementation, at least one pressure sensor is arranged on the head-mounted display device, and the pressure value of the face of the user relative to the wearing part can be detected through the pressure sensor in the using process of the head-mounted display device. For example, in the case where one pressure sensor is provided on the head-mounted display device, the measurement value of the pressure sensor is taken as the pressure value of the user's face with respect to the wearing portion during use of the head-mounted display device. For example, in the case where a plurality of pressure sensors are provided on the head-mounted display device, an average value of measurement values of the plurality of pressure sensors is taken as a pressure value of the user's face with respect to the wearing portion during use of the head-mounted display device. It should be noted that, during the use of the head-mounted display device, the pressure sensor may measure the pressure of the face of the user relative to the wearing portion in real time, or may measure the pressure value of the face of the user relative to the wearing portion at a preset frequency.
In another specific example, the pressure between the wearing portion of the head mounted display device and the face of the user may be a pressure level of the face of the user relative to the wearing portion.
In this example, the pressure value of the user's face against the wearing portion may be divided into a plurality of pressure levels, with one pressure level corresponding to one pressure range. For example, the pressure value may be divided into 10 levels, where a higher pressure level corresponds to a larger pressure value: the most intense scene corresponds to pressure level 10, and the most relaxed scene corresponds to pressure level 1. In a specific implementation, during use of the head-mounted display device, the current pressure level of the user's face against the wearing portion is determined from the pressure value detected by the pressure sensor and the mapping between pressure values and pressure levels.
In this embodiment, the distance between the eyes of the user and the screen of the head-mounted display device can be detected by a distance sensor.
In a specific example, the distance between the eyes of the user and the screen of the head-mounted display device may be a distance value between the eyes of the user and the screen of the head-mounted display device detected by the distance sensor.
When the head-mounted display device is used, the distance value between the eyes of the user and the screen of the head-mounted display device can be detected through the distance sensor. For example, in the case where one distance sensor is provided on the head-mounted display device, a measurement value of the distance sensor is taken as a distance value between the eyes of the user and the screen of the head-mounted display device during use of the head-mounted display device. For example, in a case where a plurality of distance sensors are provided on the head mounted display device, an average value of measurement values of the plurality of distance sensors is taken as a distance value between the eyes of the user and the screen of the head mounted display device during use of the head mounted display device. It should be noted that, in the use process of the head-mounted display device, the distance sensor may measure the distance between the eyes of the user and the screen of the head-mounted display device in real time, or may measure the distance between the eyes of the user and the screen of the head-mounted display device at a preset frequency.
It should be noted that the distance sensor may be an infrared sensor, an ultrasonic sensor, and the like, which is not limited in this disclosure.
In another specific example, the distance between the eyes of the user and the screen of the head-mounted display device may be a distance grade between the eyes of the user and the screen of the head-mounted display device.
In this example, the distance value between the user's eye and the screen of the head mounted display device may be divided into a plurality of distance levels, one distance level corresponding to one distance range. For example, the distance value between the eyes of the user and the screen of the head-mounted display device is divided into 10 levels, and the distance between the eyes of the user and the screen of the head-mounted display device is larger as the distance level increases. In specific implementation, in the process of using the head-mounted display device, the current distance grade between the eyes of the user and the screen of the head-mounted display device is determined according to the distance value between the eyes of the user and the screen of the head-mounted display device detected by the distance sensor and the mapping relation between the distance value and the distance grade.
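As an illustration of the level quantization described above (not part of the patent text), the following Python sketch averages the readings of one or more sensors and maps the averaged value onto one of 10 levels. The sensor value ranges, bin boundaries, and example readings are assumptions made purely for illustration; the same helper applies to both the pressure levels and the distance levels.

    from typing import Sequence

    def average_reading(readings: Sequence[float]) -> float:
        """Average the measurements of one or more sensors, as in the
        single-sensor and multi-sensor cases described above."""
        return sum(readings) / len(readings)

    def to_level(value: float, lower: float, upper: float, num_levels: int = 10) -> int:
        """Map a raw sensor value to a level in [1, num_levels].

        `lower` and `upper` are assumed calibration bounds for the sensor;
        values outside the range are clamped. A higher level corresponds to
        a larger value, matching the 10-level schemes described above."""
        clamped = min(max(value, lower), upper)
        width = (upper - lower) / num_levels
        return min(int((clamped - lower) // width) + 1, num_levels)

    # Example: quantize an averaged face-pressure reading and an averaged
    # eye-to-screen distance reading (arbitrary, illustrative units).
    pressure_level = to_level(average_reading([2.4, 2.6, 2.5]), lower=0.0, upper=5.0)
    distance_level = to_level(average_reading([0.031, 0.033]), lower=0.0, upper=0.08)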
In this embodiment, the user's wearing posture may be a head posture, for example a head-forward posture, a head-backward posture, or a lying posture. The user's wearing posture may be detected by an inertial measurement unit (IMU) on the head-mounted display device.
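The patent does not specify how the wearing posture is derived from the IMU output. As one hedged possibility, the sketch below classifies the postures named above from pitch and roll angles assumed to be reported by the IMU; the sign conventions and threshold values are illustrative assumptions only.

    def classify_posture(pitch_deg: float, roll_deg: float) -> str:
        """Rough head-posture classification from assumed IMU angles (degrees).

        A positive pitch is taken to mean the head tilts forward, and a large
        roll magnitude is taken to mean the user is lying on their side."""
        if abs(roll_deg) > 60.0:
            return "lying"
        if pitch_deg > 15.0:
            return "head_forward"
        if pitch_deg < -15.0:
            return "head_backward"
        return "neutral"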
In this embodiment, the wearing state of the user differs across application scenes. Based on this, the predicted wearing pressure of the head-mounted display device can be obtained from the pressure of the user's face against the wearing portion, the distance between the user's eyes and the screen, and the user's wearing posture, and the wearing pressure of the head-mounted display device can then be adjusted adaptively according to the predicted wearing pressure, without manual adjustment by the user, which improves the user experience. In addition, dividing the pressure values into pressure levels and the eye-to-screen distances into distance levels, and predicting from these levels, reduces the amount of computation, improves the response speed, and reduces power consumption.
After step S2100, step S2200 is executed to obtain a mapping function of the feature vector and the wearing pressure.
In one embodiment, the step of obtaining the mapping function of the feature vector and the wearing pressure may further include: step S3100-step S3200.
And step S3100, acquiring wearing state data of the head-mounted display device as a training sample.
According to this step S3100, the mapping function may be trained by training samples, and a mapping relationship between the selected feature vector and the wearing pressure may be obtained.
It will be appreciated that, in general, the larger the number of training samples, the more accurate the training result; however, beyond a certain number of samples the gain in accuracy becomes slower and slower until it levels off. The number of training samples can therefore be chosen by weighing the required accuracy of the training result against the data processing cost.
In this embodiment, the wearing state data may be wearing state data generated when the user wears the head-mounted display device for the first time. The wearing state data may include the pressure of the user's face against the wearing portion detected by the pressure sensor, the distance between the user's eyes and the screen of the head-mounted display device detected by the distance sensor, and the user's wearing posture detected by the inertial measurement unit.
In a specific implementation, when the user wears the head-mounted display device for the first time, the actual wearing pressures corresponding to different application scene information are configured. After the configuration, the wearing pressure of the head-mounted display device is adjusted according to the configured actual wearing pressure for the current application scene, and the data of the pressure sensor, the distance sensor, and the inertial measurement unit are then read to obtain the wearing state data corresponding to that application scene. The application scene can then be switched to acquire the wearing state data corresponding to another application scene. Finally, the wearing state data are used as training samples to train the mapping function of the feature vector and the wearing pressure.
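The following sketch (not from the patent) shows one way the wearing-state samples described above might be recorded. It reuses the average_reading, to_level, and classify_posture helpers from the earlier sketches; read_pressure_sensors, read_distance_sensors, and read_imu_angles are hypothetical device-reading functions standing in for the sensors, and the value ranges are illustrative.

    from dataclasses import dataclass

    @dataclass
    class WearingSample:
        pressure_level: int     # quantized face pressure against the wearing portion
        distance_level: int     # quantized eye-to-screen distance
        posture: str            # e.g. "head_forward", derived from the IMU
        scene: str              # application scene active when the sample was taken
        actual_pressure: float  # user-configured wearing pressure for that scene

    def collect_sample(scene: str, scene_pressure: dict) -> WearingSample:
        # read_pressure_sensors(), read_distance_sensors() and read_imu_angles()
        # are hypothetical APIs for the sensors described in this embodiment.
        pressure_level = to_level(average_reading(read_pressure_sensors()), 0.0, 5.0)
        distance_level = to_level(average_reading(read_distance_sensors()), 0.0, 0.08)
        posture = classify_posture(*read_imu_angles())
        return WearingSample(pressure_level, distance_level, posture,
                             scene, scene_pressure[scene])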
In one embodiment, before acquiring the wearing state data of the head-mounted display device as a training sample, the method may further include: providing a configuration interface for setting the head-mounted display device; and obtaining, in response to a user's operation on the configuration interface, application scene information and the actual wearing pressure corresponding to the application scene information.
And the configuration interface is used for inputting the actual wearing pressure corresponding to the application scene information by the user.
The application scene information may be a type of application scene. The types of application scene may include intense, fast-paced scenes, for example an intense shooting game scene, and light entertainment scenes, such as music game scenes and dance game scenes.
The actual wearing pressure may be a wearing pressure corresponding to the application scenario information set by the user.
In a specific implementation, in an intense scene the user's movements are large, so a larger wearing pressure is generally needed to prevent the head-mounted display device from falling off and affecting the user's game experience. In a light entertainment scene the user's movements are smaller and the head-mounted display device is less likely to fall off, so a smaller wearing pressure can be set to improve wearing comfort. Based on this, the user can enter on the configuration interface the actual wearing pressure corresponding to intense scenes and the actual wearing pressure corresponding to light entertainment scenes.
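A minimal sketch of the kind of scene-to-pressure configuration that might result from the user's choices on the configuration interface; the scene names and pressure values are illustrative assumptions, not values from the patent.

    # Hypothetical outcome of the user's configuration: application scene
    # information -> actual wearing pressure (here expressed in pressure levels).
    scene_pressure = {
        "intense_shooter": 8.0,  # large movements: higher pressure so the headset stays on
        "music_game": 3.0,       # light entertainment: lower pressure for comfort
        "dance_game": 4.0,
    }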
In this embodiment, the user can configure the actual wearing pressures of different application scenarios as required, so that the requirements of different users can be met, and the user experience is better.
After step S3100, step S3200 is executed: the mapping function is obtained through training according to the training sample's vector value for the feature vector, the application scene information corresponding to the training sample, and the actual wearing pressure corresponding to the application scene.
In one example, the mapping function f(x) may be obtained using any multiple linear regression model.
For example, the multiple linear regression model may express the mapping function f(x) as a simple polynomial whose coefficients of each order are initially unknown; these coefficients can be determined by substituting the training samples' vector values for the feature vector and the corresponding actual wearing pressures into the polynomial, thereby obtaining the mapping function f(x). The multiple linear regression model may be, for example, an LR (linear regression) model or a Bayesian model, which is not limited herein.
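As one concrete, hedged realization of the regression described above, the sketch below fits a first-order mapping f(x) by ordinary least squares with NumPy and then evaluates it for a new vector value. The feature encoding, including the numeric posture codes, is an assumption for illustration; the patent only requires some multiple linear regression model.

    import numpy as np

    # Assumed encoding of the posture feature as a number for the regression.
    POSTURE_CODE = {"head_forward": 2.0, "neutral": 1.0, "head_backward": 0.0, "lying": -1.0}

    def fit_mapping_function(samples) -> np.ndarray:
        """Fit f(x) = w0 + w1*x1 + w2*x2 + w3*x3 by ordinary least squares, where
        x = [pressure_level, distance_level, posture_code] and the target is the
        actual wearing pressure configured for each sample's application scene."""
        X = np.array([[1.0, s.pressure_level, s.distance_level, POSTURE_CODE[s.posture]]
                      for s in samples])
        y = np.array([s.actual_pressure for s in samples])
        coeffs, *_ = np.linalg.lstsq(X, y, rcond=None)
        return coeffs

    def predict_wearing_pressure(coeffs: np.ndarray, x) -> float:
        """Evaluate f(x) for a new vector value x = [pressure_level, distance_level, posture_code]."""
        return float(coeffs[0] + np.dot(coeffs[1:], x))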
In this embodiment, the wearing state data of the head-mounted display device are acquired as training samples, and the mapping function is obtained through training according to the training samples' vector values for the feature vector, the application scene information corresponding to the training samples, and the actual wearing pressure corresponding to the application scene, so that the wearing pressure can be predicted with the mapping function. Because the mapping function is trained on a large number of training samples, using it to determine the predicted wearing pressure improves the accuracy of the prediction.
After step S2200, step S2300 is executed to determine a predicted wearing pressure of the head mounted display device according to the vector value of the head mounted display device for the feature vector and the mapping function.
The independent variable of the mapping function f(X) is the feature vector X, and the dependent variable f(X) is the wearing pressure determined by the feature vector X.
In this embodiment, a mapping function of the feature vector and the wearing pressure is obtained according to step S2200, and after the vector value of the head-mounted display device for the feature vector is obtained according to step S2300, the vector value may be substituted into the mapping function to obtain the predicted wearing pressure of the head-mounted display device.
And step S2400, adjusting the current wearing pressure of the head-mounted display device to a predicted wearing pressure.
In specific implementation, after the predicted wearing pressure is obtained, the driving mechanism of the head-mounted display device is controlled to work according to the predicted wearing pressure so as to adjust the current wearing pressure of the head-mounted display device to the predicted wearing pressure.
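The patent does not describe the interface to the drive mechanism, only that it is controlled so that the current wearing pressure becomes the predicted one. The following is a minimal sketch under that assumption; set_strap_pressure is a hypothetical driver command, and the dead-band threshold is illustrative.

    def adjust_wearing_pressure(current: float, predicted: float, set_strap_pressure) -> float:
        """Drive the (hypothetical) strap mechanism toward the predicted wearing pressure."""
        if abs(predicted - current) > 0.1:  # illustrative dead-band to avoid constant re-adjustment
            set_strap_pressure(predicted)   # hypothetical drive-mechanism command
            return predicted
        return current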
The following describes a control method of the head mounted display device according to the embodiment with a specific example.
First, when the user wears the head-mounted display device for the first time, a configuration interface is displayed. The configuration interface includes an application scene information option and a corresponding actual wearing pressure option, and the actual wearing pressure corresponding to the application scene information is obtained in response to the user's operation on the configuration interface.
And then, in the process that the user uses the head-mounted display device, acquiring the current application scene, and adjusting the wearing pressure of the head-mounted display device according to the actual wearing pressure corresponding to the current application scene. Meanwhile, detection data of the pressure sensor, the distance sensor, and the inertial measurement unit of the head-mounted display device are acquired to obtain wearing state data, i.e., the pressure of the user's face against the wearing part, the distance between the user's eyes and the screen of the head-mounted display device, and the user's wearing posture. It should be noted that the wearing state data corresponding to different application scenes, that is, the pressure of the face of the user against the wearing part, the distance between the eyes of the user and the screen of the head-mounted display device, and the wearing posture of the user, may be obtained.
Then, the wearing state data are used as training samples, and the mapping function is obtained through training according to the training samples' vector values for the feature vector, the application scene information corresponding to the training samples, and the actual wearing pressure corresponding to the application scene.
Then, when the user wears the head-mounted display device for the second time, the vector values of the head-mounted display device for the feature vectors, that is, the pressure of the user's face against the wearing portion, the distance between the user's eyes and the screen of the head-mounted display device, and the user wearing posture, are acquired. And determining the predicted wearing pressure of the head-mounted display equipment according to the vector value of the head-mounted display equipment to the feature vector and the mapping function, and adjusting the current wearing pressure of the head-mounted display equipment to be the predicted wearing pressure.
During use of the head-mounted display device, if the application scene changes, the vector value of the head-mounted display device for the feature vector changes accordingly; the new vector value, namely the pressure of the user's face against the wearing portion, the distance between the user's eyes and the screen of the head-mounted display device, and the user's wearing posture, is then acquired. The predicted wearing pressure of the head-mounted display device is determined according to this vector value and the mapping function, and the current wearing pressure of the head-mounted display device is adjusted to the predicted wearing pressure.
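Tying the earlier sketches together, the following end-to-end usage sketch mirrors the flow just described: collect samples during the first wear, fit the mapping function, and then predict and adjust when the device is worn again or the scene changes. All helper names are the illustrative ones introduced above, and current_pressure / set_strap_pressure stand for hypothetical device state and a hypothetical driver command.

    # First wear: the user has configured scene_pressure on the configuration
    # interface; wearing-state samples are collected while scenes are switched.
    samples = [collect_sample(scene, scene_pressure)
               for scene in scene_pressure
               for _ in range(50)]          # several readings per scene (illustrative count)
    coeffs = fit_mapping_function(samples)

    # Later wear, or an application-scene change: read the current vector value,
    # predict the wearing pressure, and drive the mechanism toward it.
    x = [to_level(average_reading(read_pressure_sensors()), 0.0, 5.0),
         to_level(average_reading(read_distance_sensors()), 0.0, 0.08),
         POSTURE_CODE[classify_posture(*read_imu_angles())]]
    predicted = predict_wearing_pressure(coeffs, x)
    current_pressure = adjust_wearing_pressure(current_pressure, predicted, set_strap_pressure)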
According to the embodiments of the disclosure, a feature vector that affects the wearing pressure of the head-mounted display device is obtained, and the predicted wearing pressure of the head-mounted display device is determined from the device's vector value for the feature vector and the mapping function of the feature vector and the wearing pressure, so that the current wearing pressure of the head-mounted display device can be adjusted to the predicted wearing pressure. In addition, because the mapping function is obtained by training on a large number of training samples, determining the predicted wearing pressure with the mapping function improves the accuracy of the resulting prediction.
< apparatus embodiment >
The embodiment of the present disclosure provides a control apparatus of a head-mounted display device, and as shown in fig. 3, the control apparatus 300 of the head-mounted display device may include a first obtaining module 310, a second obtaining module 320, a determining module 330, and an adjusting module 340.
The first obtaining module 310 may be configured to obtain a selected feature vector, where the feature vector includes features that affect the wearing pressure of the head-mounted display device.
The second obtaining module 320 may be configured to obtain a mapping function of the feature vector and the wearing pressure.
The determining module 330 may be configured to determine a predicted wearing pressure of the head-mounted display device according to the vector value of the head-mounted display device for the feature vector and the mapping function.
The adjustment module 340 may be configured to adjust a current wearing pressure of the head mounted display device to a predicted wearing pressure.
In one embodiment, the feature vector includes a wearing state feature; wherein the wearing state feature includes at least one of: the pressure of the user's face against a wearing portion of the head-mounted display device, the distance between the user's eyes and a screen of the head-mounted display device, and the user's wearing posture.
In one embodiment, the second obtaining module includes:
the sample acquisition unit is used for acquiring wearing state data of the head-mounted display device as a training sample;
and the training unit is used for training to obtain the mapping function according to the training sample's vector value for the feature vector, the application scene information corresponding to the training sample, and the actual wearing pressure corresponding to the application scene.
In one embodiment, the apparatus further comprises:
the display module is used for providing a configuration interface for setting the head-mounted display device;
and the third acquisition module is used for responding to the operation of the user on the configuration interface and acquiring the application scene information and the actual wearing pressure corresponding to the application scene information.
According to the embodiments of the disclosure, a feature vector that affects the wearing pressure of the head-mounted display device is obtained, and the predicted wearing pressure of the head-mounted display device is determined from the device's vector value for the feature vector and the mapping function of the feature vector and the wearing pressure, so that the current wearing pressure of the head-mounted display device can be adjusted to the predicted wearing pressure. In addition, because the mapping function is obtained by training on a large number of training samples, determining the predicted wearing pressure with the mapping function improves the accuracy of the resulting prediction.
< apparatus embodiment >
Fig. 4 is a hardware configuration diagram of a head-mounted display device according to an embodiment. As shown in FIG. 4, the head mounted display device 400 includes a memory 410, a processor 420, and a drive mechanism 430.
The memory 410 may be used to store executable computer instructions.
The processor 420 may be configured to execute, under control of the executable computer instructions, the control method of the head-mounted display device according to the method embodiments of the present disclosure.
The driving mechanism 430 is connected to the processor 420 to adjust the current wearing pressure of the head-mounted display device to a target wearing pressure.
The head-mounted display device 400 may be the head-mounted display device 1000 shown in fig. 1, or may be a device having another hardware structure, which is not limited herein. The head-mounted display device 400 may be, for example, a VR device, an AR device, an MR device, and the like, which is not limited in this disclosure.
The head-mounted display device 400 may also include a pressure sensor, a distance sensor, and an inertial measurement unit. Wherein the pressure sensor is used for detecting the pressure of the face of the user relative to the wearing part of the head-mounted display device. The distance sensor is used for detecting the distance between the eyes of the user and the screen of the head-mounted display device. The inertial measurement unit is used for detecting the wearing posture of the user.
In further embodiments, the head mounted display device 400 may comprise the control apparatus 300 of the above head mounted display device.
In one embodiment, the modules of the control apparatus 300 of the head-mounted display device above can be implemented by the processor 420 executing computer instructions stored in the memory 410.
< computer-readable storage Medium >
The disclosed embodiments also provide a computer readable storage medium having stored thereon computer instructions, which, when executed by a processor, perform the method for controlling a head-mounted display device provided by the disclosed embodiments.
The disclosed embodiments may be systems, methods, and/or computer program products. The computer program product may include a computer-readable storage medium having computer-readable program instructions embodied thereon for causing a processor to implement aspects of embodiments of the disclosure.
The computer readable storage medium may be a tangible device that can hold and store the instructions for use by the instruction execution device. The computer readable storage medium may be, for example, but not limited to, an electronic memory device, a magnetic memory device, an optical memory device, an electromagnetic memory device, a semiconductor memory device, or any suitable combination of the foregoing. More specific examples (a non-exhaustive list) of the computer readable storage medium would include the following: a portable computer diskette, a hard disk, a Random Access Memory (RAM), a read-only memory (ROM), an erasable programmable read-only memory (EPROM or flash memory), a Static Random Access Memory (SRAM), a portable compact disc read-only memory (CD-ROM), a Digital Versatile Disc (DVD), a memory stick, a floppy disk, a mechanical coding device, such as punch cards or in-groove projection structures having instructions stored thereon, and any suitable combination of the foregoing. Computer-readable storage media as used herein is not to be construed as transitory signals per se, such as radio waves or other freely propagating electromagnetic waves, electromagnetic waves propagating through a waveguide or other transmission medium (e.g., optical pulses through a fiber optic cable), or electrical signals transmitted through electrical wires.
The computer-readable program instructions described herein may be downloaded from a computer-readable storage medium to a respective computing/processing device, or to an external computer or external storage device via a network, such as the internet, a local area network, a wide area network, and/or a wireless network. The network may include copper transmission cables, fiber optic transmission, wireless transmission, routers, firewalls, switches, gateway computers and/or edge servers. The network adapter card or network interface in each computing/processing device receives computer-readable program instructions from the network and forwards the computer-readable program instructions for storage in a computer-readable storage medium in the respective computing/processing device.
The computer program instructions for carrying out operations of embodiments of the present disclosure may be assembly instructions, instruction set architecture (ISA) instructions, machine-dependent instructions, microcode, firmware instructions, state-setting data, or source code or object code written in any combination of one or more programming languages, including object-oriented programming languages such as Smalltalk and C++, and conventional procedural programming languages such as the "C" programming language or similar programming languages. The computer-readable program instructions may execute entirely on the user's computer, partly on the user's computer, as a stand-alone software package, partly on the user's computer and partly on a remote computer, or entirely on the remote computer or server. In the case of a remote computer, the remote computer may be connected to the user's computer through any type of network, including a local area network (LAN) or a wide area network (WAN), or the connection may be made to an external computer (for example, through the Internet using an Internet service provider). In some embodiments, electronic circuitry, such as a programmable logic circuit, a field-programmable gate array (FPGA), or a programmable logic array (PLA), can be personalized with state information of the computer-readable program instructions, and this electronic circuitry may execute the computer-readable program instructions to implement aspects of the disclosed embodiments.
Various aspects of embodiments of the present disclosure are described herein with reference to flowchart illustrations and/or block diagrams of methods, apparatus (systems) and computer program products according to embodiments of the disclosure. It will be understood that each block of the flowchart illustrations and/or block diagrams, and combinations of blocks in the flowchart illustrations and/or block diagrams, can be implemented by computer-readable program instructions.
These computer-readable program instructions may be provided to a processor of a general purpose computer, special purpose computer, or other programmable data processing apparatus to produce a machine, such that the instructions, which execute via the processor of the computer or other programmable data processing apparatus, create means for implementing the functions/acts specified in the flowchart and/or block diagram block or blocks. These computer-readable program instructions may also be stored in a computer-readable storage medium that can direct a computer, programmable data processing apparatus, and/or other devices to function in a particular manner, such that the computer-readable medium storing the instructions comprises an article of manufacture including instructions which implement the function/act specified in the flowchart and/or block diagram block or blocks.
The computer readable program instructions may also be loaded onto a computer, other programmable data processing apparatus, or other devices to cause a series of operational steps to be performed on the computer, other programmable apparatus or other devices to produce a computer implemented process such that the instructions which execute on the computer, other programmable apparatus or other devices implement the functions/acts specified in the flowchart and/or block diagram block or blocks.
The flowchart and block diagrams in the figures illustrate the architecture, functionality, and operation of possible implementations of systems, methods and computer program products according to various embodiments of the present disclosure. In this regard, each block in the flowchart or block diagrams may represent a module, segment, or portion of instructions, which comprises one or more executable instructions for implementing the specified logical function(s). In some alternative implementations, the functions noted in the block may occur out of the order noted in the figures. For example, two blocks shown in succession may, in fact, be executed substantially concurrently, or the blocks may sometimes be executed in the reverse order, depending upon the functionality involved. It will also be noted that each block of the block diagrams and/or flowchart illustration, and combinations of blocks in the block diagrams and/or flowchart illustration, can be implemented by special purpose hardware-based systems which perform the specified functions or acts, or combinations of special purpose hardware and computer instructions. It is well known to those skilled in the art that implementation by hardware, implementation by software, and implementation by a combination of software and hardware are equivalent.
Having described embodiments of the present disclosure, the foregoing description is intended to be exemplary, not exhaustive, and not limited to the disclosed embodiments. Many modifications and variations will be apparent to those of ordinary skill in the art without departing from the scope and spirit of the described embodiments. The terminology used herein is chosen in order to best explain the principles of the embodiments, the practical application, or improvements made to the technology in the marketplace, or to enable others of ordinary skill in the art to understand the embodiments disclosed herein. The scope of the embodiments of the present disclosure is defined by the appended claims.

Claims (10)

1. A method of controlling a head-mounted display device, the method comprising:
obtaining a selected feature vector, wherein the feature vector comprises features that affect the wearing pressure of the head-mounted display device;
obtaining a mapping function of the feature vector and the wearing pressure;
determining a predicted wearing pressure of the head-mounted display device according to the vector value of the head-mounted display device for the feature vector and the mapping function;
adjusting the current wearing pressure of the head-mounted display device to the predicted wearing pressure.
2. The method of claim 1, wherein the feature vector comprises a wearing state feature;
wherein the wearing state feature includes at least one of: the pressure of the user's face against a wearing portion of the head-mounted display device, the distance between the user's eyes and a screen of the head-mounted display device, and the user's wearing posture.
3. The method of claim 1, wherein obtaining the mapping function of the feature vector and the wearing pressure comprises:
acquiring wearing state data of the head-mounted display device as a training sample;
and training to obtain the mapping function according to the training sample's vector value for the feature vector, the application scene information corresponding to the training sample, and the actual wearing pressure corresponding to the application scene.
4. The method of claim 3, wherein before acquiring the wearing state data of the head-mounted display device as the training sample, the method further comprises:
providing a configuration interface for setting the head-mounted display device;
and obtaining, in response to a user's operation on the configuration interface, application scene information and the actual wearing pressure corresponding to the application scene information.
5. A control apparatus of a head-mounted display device, the apparatus comprising:
a first obtaining module, configured to obtain a selected feature vector, where the feature vector includes features that affect wearing pressure of the head-mounted display device;
the second obtaining module is used for obtaining a mapping function of the feature vector and the wearing pressure;
a determining module, configured to determine a predicted wearing pressure of the head-mounted display device according to the vector value of the head-mounted display device for the feature vector and the mapping function;
and the adjusting module is used for adjusting the current wearing pressure of the head-mounted display device to the predicted wearing pressure.
6. The apparatus of claim 5, wherein the feature vector comprises a wearing state feature;
wherein the wearing state feature includes at least one of: the pressure of the user's face against a wearing portion of the head-mounted display device, the distance between the user's eyes and a screen of the head-mounted display device, and the user's wearing posture.
7. The apparatus of claim 5, wherein the second obtaining module comprises:
the sample acquisition unit is used for acquiring wearing state data of the head-mounted display device as a training sample;
and the training unit is used for training to obtain the mapping function according to the training sample's vector value for the feature vector, the application scene information corresponding to the training sample, and the actual wearing pressure corresponding to the application scene.
8. The apparatus of claim 7, further comprising:
the display module is used for providing a configuration interface for setting the head-mounted display device;
and the third acquisition module is used for responding to the operation of the user on the configuration interface and acquiring the application scene information and the actual wearing pressure corresponding to the application scene information.
9. A head-mounted display device comprising a drive mechanism, the head-mounted display device further comprising:
a memory for storing executable computer instructions;
a processor, configured to execute, under control of the executable computer instructions, the control method according to any one of claims 1 to 4;
wherein the driving mechanism is connected to the processor so as to adjust the current wearing pressure of the head-mounted display device to a target wearing pressure.
10. A computer readable storage medium having stored thereon computer instructions which, when executed by a processor, perform the method of any of claims 1-4.
CN202111434134.0A 2021-11-29 2021-11-29 Head-mounted display device control method, device, equipment and readable storage medium Pending CN114326114A (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN202111434134.0A CN114326114A (en) 2021-11-29 2021-11-29 Head-mounted display device control method, device, equipment and readable storage medium

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
CN202111434134.0A CN114326114A (en) 2021-11-29 2021-11-29 Head-mounted display device control method, device, equipment and readable storage medium

Publications (1)

Publication Number Publication Date
CN114326114A true CN114326114A (en) 2022-04-12

Family

ID=81046939

Family Applications (1)

Application Number Title Priority Date Filing Date
CN202111434134.0A Pending CN114326114A (en) 2021-11-29 2021-11-29 Head-mounted display device control method, device, equipment and readable storage medium

Country Status (1)

Country Link
CN (1) CN114326114A (en)

Patent Citations (5)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN107179609A (en) * 2017-06-12 2017-09-19 京东方科技集团股份有限公司 Virtual reality glasses
US20180364754A1 (en) * 2017-06-19 2018-12-20 Scott Sullivan System for Auto-Securing Virtual Reality Headgear
US20200089003A1 (en) * 2019-08-30 2020-03-19 Lg Electronics Inc. Electronic device
CN112444996A (en) * 2019-09-03 2021-03-05 苹果公司 Head-mounted device with tension adjustment
CN112068315A (en) * 2020-11-11 2020-12-11 宁波圻亿科技有限公司 Adjusting method and system of AR helmet

Similar Documents

Publication Publication Date Title
US11250940B2 (en) Exercise feedback provision apparatus and method
US10254828B2 (en) Detection of improper viewing posture
JP2019072716A (en) Eccentric Rotating Mass Actuator Optimization for Haptic Effect
KR20180041642A (en) Scene analysis for improved eye tracking
KR101839441B1 (en) Head-mounted display controlled by tapping, method for controlling the same and computer program for controlling the same
CN106339076B (en) Control method and control device based on action recognition
WO2016185685A1 (en) Information processing apparatus, information processing method, and program
KR20190067227A (en) Input Controller Stabilization for Virtual Reality System
JP2017063916A (en) Apparatus, method and program for determining force sense to be presented
US20180144557A1 (en) Method and user terminal for providing hologram image-based message service, and hologram image display device
EP2463766A1 (en) Program, apparatus, system, and method for scrolling a displayed image.
CN107376341B (en) Data processing method and device for gamepad and gamepad
US11182953B2 (en) Mobile device integration with a virtual reality environment
JP2015215696A (en) Electronic equipment, program, and warning method in electronic equipment
CN113302672A (en) Speed-variable speech sounding machine
CN103631375A (en) Method and apparatus for controlling vibration intensity according to situation awareness in electronic device
US10464211B2 (en) Generating control signal for tele-presence robot
CN111741400B (en) Earphone position adjusting method, device, equipment and storage medium
KR102580521B1 (en) Electronic apparatus and method of adjusting sound volume thereof
CN114326114A (en) Head-mounted display device control method, device, equipment and readable storage medium
CN114339193A (en) Head-mounted display device control method, device, equipment and readable storage medium
US20190171587A1 (en) Communication mode control for wearable devices
US20220245447A1 (en) Systems and methods for quantization aware training of a neural network for heterogeneous hardware platform
CN115087957A (en) Virtual scene
JP2010104396A (en) Situational determination device, system, method and program

Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination
TA01 Transfer of patent application right
  Effective date of registration: 20221117
  Address after: No. 500, Songling Road, Laoshan District, Qingdao, Shandong 266101
  Applicant after: GOERTEK TECHNOLOGY Co.,Ltd.
  Address before: 261061 workshop 1, phase III, Geer Photoelectric Industrial Park, 3999 Huixian Road, Yongchun community, Qingchi street, high tech Zone, Weifang City, Shandong Province
  Applicant before: GoerTek Optical Technology Co.,Ltd.
RJ01 Rejection of invention patent application after publication
  Application publication date: 20220412