CN106774912B - Method and device for controlling VR equipment - Google Patents



Publication number
CN106774912B
Authority
CN
China
Prior art keywords
eyeball part
image
time
specified
eyeball
Prior art date
Legal status
Active
Application number
CN201611217984.4A
Other languages
Chinese (zh)
Other versions
CN106774912A (en)
Inventor
熊磊
阙鑫地
Current Assignee
Yulong Computer Telecommunication Scientific Shenzhen Co Ltd
Original Assignee
Yulong Computer Telecommunication Scientific Shenzhen Co Ltd
Priority date
Filing date
Publication date
Application filed by Yulong Computer Telecommunication Scientific Shenzhen Co Ltd filed Critical Yulong Computer Telecommunication Scientific Shenzhen Co Ltd
Priority to CN201611217984.4A priority Critical patent/CN106774912B/en
Publication of CN106774912A publication Critical patent/CN106774912A/en
Application granted granted Critical
Publication of CN106774912B publication Critical patent/CN106774912B/en


Classifications

    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06F ELECTRIC DIGITAL DATA PROCESSING
    • G06F3/00 Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
    • G06F3/01 Input arrangements or combined input and output arrangements for interaction between user and computer
    • G06F3/011 Arrangements for interaction with the human body, e.g. for user immersion in virtual reality
    • G06F3/013 Eye tracking input arrangements
    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06F ELECTRIC DIGITAL DATA PROCESSING
    • G06F2203/00 Indexing scheme relating to G06F3/00 - G06F3/048
    • G06F2203/048 Indexing scheme relating to G06F3/048
    • G06F2203/04806 Zoom, i.e. interaction techniques or interactors for controlling the zooming operation

Abstract

The invention discloses a method and a device for controlling a VR device, relates to the field of communications technology, and can provide the user with a flexible and convenient way of controlling a VR device. The method of the invention comprises the following steps: acquiring, through an infrared camera, the moving distance of a user's eyeball part within a certain time, the position of the eyeball part at the start moment of that time, and the position of the eyeball part at the end moment of that time; if the moving distance is greater than a distance threshold, determining a first image corresponding to the position of the eyeball part at the start moment and a second image corresponding to the position of the eyeball part at the end moment; and reducing the first image according to a first specified scale while enlarging the second image according to a second specified scale. The method is suitable for the control of VR devices.

Description

Method and device for controlling VR equipment
Technical Field
The present invention relates to the field of communications technologies, and in particular, to a method and an apparatus for controlling a VR device.
Background
With the popularization of Virtual Reality (VR) technology, VR devices have come into wide use. At present, a user can control a VR device through a VR helmet worn on the head or through a Bluetooth controller that communicates with the VR device; alternatively, the user can control the VR device directly through a touch pad or physical buttons.
However, each of these ways of controlling a VR device has problems. When the user controls the VR device through a worn VR helmet, complicated operation sequences force the user to rotate the head frequently in order to turn the gyroscope built into the helmet, which can cause eye fatigue or dizziness. When the user controls the VR device through a Bluetooth controller, a controller matched to the VR device must be purchased together with it and later maintained along with it, which imposes extra cost on the user. When the user controls the VR device directly through a touch pad or physical buttons, frequent touch operations are still needed while the VR helmet is worn, which detracts from the immersive experience that VR technology is meant to provide; moreover, the user's freedom of movement is often restricted while wearing the helmet, which further reduces operating efficiency.
Therefore, none of these three ways of controlling a VR device offers the user a flexible and convenient control method.
Disclosure of Invention
The invention provides a method and a device for controlling VR equipment, which can provide a flexible and convenient VR equipment control mode for a user.
To achieve this purpose, the embodiments of the invention adopt the following technical scheme:
In a first aspect, the invention provides a method of controlling a VR device, the method comprising:
acquiring, through an infrared camera, the moving distance of a user's eyeball part within a certain time, the position of the eyeball part at the start moment of that time, and the position of the eyeball part at the end moment of that time;
if the moving distance is greater than a distance threshold, determining a first image corresponding to the position of the eyeball part at the start moment and a second image corresponding to the position of the eyeball part at the end moment; and
reducing the first image according to a first specified scale and enlarging the second image according to a second specified scale.
In a second aspect, the present invention provides an apparatus for operating a VR device, the apparatus comprising:
an acquisition module, configured to acquire, through an infrared camera, the moving distance of a user's eyeball part within a certain time, the position of the eyeball part at the start moment of that time, and the position of the eyeball part at the end moment of that time;
a determining module, configured to determine, if the moving distance acquired by the acquisition module is greater than a distance threshold, a first image corresponding to the position of the eyeball part at the start moment and a second image corresponding to the position of the eyeball part at the end moment; and
a processing module, configured to reduce the first image determined by the determining module according to a first specified scale and to enlarge the second image determined by the determining module according to a second specified scale.
Compared with the prior art, in which a user controls a VR device through a worn VR helmet, through a Bluetooth controller that communicates with the VR device, or directly through a touch pad or physical buttons, the method and device for controlling a VR device provided by the invention track the gaze track of the user's eyeball part by means of an infrared camera. Once the moving distance of the eyeball part within a certain time exceeds a distance threshold, a first image and a second image, gazed at respectively at the start moment and the end moment of the gaze process, are determined; the first image is then reduced according to a first specified scale and the second image is enlarged according to a second specified scale. In other words, the invention reduces or enlarges the first and second images presented to the user by the VR device according to how the eyeball part moves while the user naturally watches the VR device. In this way, the user neither needs to rotate the head to turn the gyroscope of the VR helmet, nor needs to equip the VR device with an extra Bluetooth controller, touch pad, or physical buttons. The technical scheme provided by the invention can therefore offer the user a flexible and convenient way to control a VR device.
Drawings
To illustrate the technical solutions in the embodiments of the invention more clearly, the drawings needed in the embodiments are briefly described below. The drawings described below show only some embodiments of the invention, and those skilled in the art can derive other drawings from them without creative effort.
Fig. 1 is a flowchart of a method for operating a VR device according to an embodiment of the present invention;
fig. 2 is a schematic view of a display interface of a VR device according to an embodiment of the present invention;
fig. 3 is a schematic diagram of a display interface of another VR device according to an embodiment of the present invention;
fig. 4 is a schematic diagram of a display interface of another VR device according to an embodiment of the present invention;
fig. 5 is a schematic diagram of a display interface of another VR device according to an embodiment of the present invention;
fig. 6 is a schematic diagram of a display interface of another VR device according to an embodiment of the present invention;
fig. 7 is a schematic diagram of a display interface of another VR device according to an embodiment of the present invention;
fig. 8 is a schematic structural diagram of an apparatus for operating a VR device according to an embodiment of the present invention;
fig. 9 is a schematic structural diagram of a VR headset according to an embodiment of the present invention.
Detailed Description
The technical solutions in the embodiments of the invention are described below clearly and completely with reference to the drawings. The described embodiments are only some, not all, of the embodiments of the invention. All other embodiments derived by those skilled in the art from these embodiments without creative effort fall within the protection scope of the invention.
The embodiments of the invention can be applied to a VR helmet on which an Infrared (IR) camera is mounted. The infrared camera can identify, in real time, the eyeball part of the user wearing the VR helmet and track its movement over a period of time, so as to determine the distance the eyeball part moves within that period and its position at each moment. It should be noted that a position may be expressed as a relative position of the eyeball part with respect to a fixed reference object, or as a point coordinate in a specific coordinate system; the invention does not limit the specific representation of the position. The VR helmet can then decide, from the acquired parameters, when to enlarge or reduce each image. It should also be noted that the invention is applicable not only to VR helmets but also to other wearable devices equipped with an infrared camera, for example VR glasses with functions similar to those of a VR helmet.
An embodiment of the invention provides a method for controlling a VR device. As shown in fig. 1, the method may be applied to the above VR helmet equipped with an infrared camera, and may comprise:
101. Acquire, through the infrared camera, the moving distance of the user's eyeball part within a certain time, the position of the eyeball part at the start moment of that time, and the position of the eyeball part at the end moment of that time.
After the infrared camera captures an image at a certain moment, the user's eye region in the image can be identified, and the position of the eyeball part within the eye region can then be distinguished by its different display color. It should be noted that identification of the eyeball part is not limited to this approach; using an infrared camera is only one possible implementation.
Because the infrared camera can identify the position of the eyeball part at each sampling moment, in the invention, if a certain time is taken as a sampling period for controlling the VR device, the images corresponding to the eyeball positions acquired at the start moment and the end moment of the sampling period serve as the targets of the control process. The specific timing of the manipulation is described in step 102, and the specific manner of manipulation in step 103.
It should be noted that once the positions of the user's eyeball part at the start moment and the end moment are determined, the moving distance of the eyeball part can be derived from the relationship between the two positions. For example, if the two positions are expressed as coordinates, the moving distance may be the straight-line distance between the two sampled points in the coordinate system; alternatively, if both positions are expressed relative to the same reference object, the reference object is fixed, so the moving distance of the eyeball part can likewise be determined from the two relative positions.
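The straight-line-distance case can be sketched as follows. This is a minimal illustration under assumptions: the patent does not fix a coordinate representation, and the threshold value here is an arbitrary example.

```python
import math

def gaze_distance(start_pos, end_pos):
    """Straight-line distance between two sampled eyeball positions,
    each given as (x, y) coordinates (representation is an assumption)."""
    dx = end_pos[0] - start_pos[0]
    dy = end_pos[1] - start_pos[1]
    return math.hypot(dx, dy)

# Positions sampled at the start and end moments of one sampling period.
start = (120.0, 80.0)
end = (200.0, 140.0)

DISTANCE_THRESHOLD = 50.0  # assumed example value; the patent leaves this configurable
if gaze_distance(start, end) > DISTANCE_THRESHOLD:
    pass  # proceed to step 102: determine the first and second images
```

The same comparison against the threshold gates step 102 regardless of how the positions are represented.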
102. If the moving distance is greater than the distance threshold, determine a first image corresponding to the position of the eyeball part at the start moment and a second image corresponding to the position of the eyeball part at the end moment.
To prevent limited acquisition accuracy from causing misoperation during control of the VR device, in the invention the next operation proceeds only when the moving distance of the eyeball part exceeds the distance threshold; if the moving distance is less than or equal to the threshold, the user can be considered not to intend the following operation. The distance threshold may be preset by a technician when the VR helmet is provisioned, or set by the user according to personal operating habits or the sensitivity of the eyeball part. During setup, both the technician and the user may refer to historical data from other users' actual use, so that the chosen threshold provides the user with effective control opportunities while reducing the probability of misoperation.
103. Reduce the first image according to a first specified scale and enlarge the second image according to a second specified scale.
The second specified scale may be inversely proportional to the first specified scale.
In the invention, the manipulation in step 103 essentially renders the moving trajectory of the user's eyeball part on the display interface of the VR device, realizing interaction between the VR device and the user: the display effect the user expects is synchronized on the interface in real time. As the eyeball part moves, enlarging or reducing the images prompts the user that the current manipulation has been recognized by the VR device and is producing the corresponding effect.
For example, when the user is not manipulating the VR device, images 1 to 9 may be presented on its display interface as shown in fig. 2. Each image may represent the trigger entry of one function; that is, if the user completes a click on image 9, the VR device triggers the function corresponding to image 9. Alternatively, the display interface of the VR device may be divided into regions by function: the whole interface is partitioned into at least two regions, and when the user's eyeball part moves into the range framed by a region, the function corresponding to that region has an opportunity to be triggered.
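The region-division idea can be sketched by mapping a gaze position to one of images 1 to 9 laid out in a 3×3 grid as in fig. 2. The grid geometry and screen dimensions below are assumed examples; the patent does not fix interface layout.

```python
# Map a gaze position to one of images 1-9 laid out in a 3x3 grid.
# Screen size and grid shape are assumptions for illustration.
GRID_COLS, GRID_ROWS = 3, 3
SCREEN_W, SCREEN_H = 1920, 1080

def image_at(pos):
    """Return the 1-based index of the image whose region contains pos."""
    x, y = pos
    col = min(int(x * GRID_COLS / SCREEN_W), GRID_COLS - 1)
    row = min(int(y * GRID_ROWS / SCREEN_H), GRID_ROWS - 1)
    return row * GRID_COLS + col + 1

first_image = image_at((100, 100))     # top-left region
second_image = image_at((1800, 1000))  # bottom-right region
```

With this mapping, the images determined in step 102 are simply the grid cells containing the start-moment and end-moment eyeball positions.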
The invention also considers that the eyeball part may correspond to different images before and after it moves, so the image corresponding to the eyeball position at the previous moment may already be enlarged; for example, in fig. 3, image 4 is enlarged. After the movement, the image corresponding to the current eyeball position changes from image 4 to image 5. At this moment it is necessary not only to enlarge and display image 5, but also to restore the display scale of image 4, which the user no longer gazes at, so that image 4 is reduced relative to the previous moment, as shown in fig. 4.
Because the scales used for enlarging and reducing an image are inversely proportional, as the display interface changes with the movement track of the eyeball part, an image that is enlarged and then reduced (or reduced and then enlarged) returns to its original size and does not disturb the user's visual experience. That is, when an image has been enlarged at the second scale and the gaze of the eyeball part leaves it, the image is reduced at a first scale inversely proportional to the second scale, so that the image after reduction is the same size as before enlargement. It should be noted that if the user wants a zoom effect that is not visually size-preserving, the first and second scales need not be inverses of each other; this is not limited in the embodiments of the invention. For example, if the user wants to distinguish the gaze track of the eyeball part from positions that were never gazed at, the first scale may be close to, but not exactly, the inverse of the second scale, so that the movement track of the eyeball part remains visible while the visual disturbance caused by the scales not being exact inverses is kept as small as possible.
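The size-restoring bookkeeping of step 103 can be sketched as follows; the scale values are assumed examples, not values from the patent.

```python
# Sketch of the zoom bookkeeping in step 103: when gaze leaves an image,
# it is reduced by the inverse of the scale used to enlarge it, so its
# size is restored exactly. Scale values are assumptions for illustration.
SECOND_SCALE = 1.25              # enlargement applied to the newly gazed image
FIRST_SCALE = 1 / SECOND_SCALE   # reduction applied to the previously gazed image

sizes = {name: 1.0 for name in ("image4", "image5")}  # normalized display sizes

def gaze_moved(old, new):
    """Restore the previously gazed image and enlarge the new one."""
    sizes[old] *= FIRST_SCALE
    sizes[new] *= SECOND_SCALE

sizes["image4"] *= SECOND_SCALE  # image 4 is enlarged while gazed at (fig. 3)
gaze_moved("image4", "image5")   # gaze moves to image 5 (fig. 4)
# image4 is now back to its original size; image5 is enlarged
```

If a non-size-preserving effect is wanted, as the description allows, FIRST_SCALE can be set near but not exactly equal to 1 / SECOND_SCALE.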
By the above method, the gaze track of the user's eyeball part can be tracked by means of the infrared camera. When the moving distance of the eyeball part within a certain time exceeds the distance threshold, the first and second images, gazed at respectively at the start moment and the end moment of the gaze process, are determined; the first image is then reduced according to the first specified scale and the second image is enlarged according to the second specified scale. In other words, the invention reduces or enlarges the first and second images presented to the user by the VR device according to how the eyeball part moves while the user naturally watches the VR device. In this way, the user neither needs to rotate the head to turn the gyroscope of the VR helmet, nor needs to equip the VR device with an extra Bluetooth controller, touch pad, or physical buttons. The technical scheme provided by the invention can therefore offer the user a flexible and convenient way to control a VR device.
In one implementation of the embodiments of the invention, the time the eyeball part stays at the same position may be detected, and when the stay time reaches a time threshold, the function of the image corresponding to that position is triggered. On the basis of the implementation shown in fig. 1, the implementation shown in fig. 5 can thus also be realized: after step 103 reduces the first image according to the first specified scale and enlarges the second image according to the second specified scale, steps 104 and 105 may be executed:
104. Detect the time during which the eyeball part stays in the area corresponding to the second image.
105. When that time is greater than or equal to a time threshold, trigger the VR device to execute the function corresponding to the second image.
For example, as shown in fig. 4, if the user wants to trigger the function corresponding to image 5, the user may keep gazing at the area of image 5 until the time threshold is reached; the VR device then executes the function corresponding to image 5, which is equivalent to clicking on image 5 to access that function.
It should be noted that the time threshold may be preset by a technician or by the user, and an appropriate duration may be chosen according to the user's operating habits, for example 1.5 seconds. This means that when the user gazes continuously at the same image for 1.5 seconds, the VR device automatically triggers the function corresponding to that image, for example jumping to another display interface associated with the image or directly starting a certain video.
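Steps 104 and 105 amount to dwell-time detection over a stream of gaze samples. Below is a minimal sketch; the sampling interface and timestamps are assumptions, and only the 1.5-second example threshold comes from the description.

```python
TIME_THRESHOLD = 1.5  # seconds; the example duration from the description

class DwellDetector:
    """Trigger an image's function when gaze stays on it long enough.

    A sketch of steps 104-105; the per-sample update API is an assumption."""

    def __init__(self, threshold=TIME_THRESHOLD):
        self.threshold = threshold
        self.current = None   # image currently gazed at
        self.since = None     # timestamp when gaze arrived on it

    def update(self, image, now):
        """Feed one gaze sample; return the image to trigger, or None."""
        if image != self.current:
            self.current, self.since = image, now  # gaze moved: restart timer
            return None
        if now - self.since >= self.threshold:
            self.since = now  # reset so the function is not re-triggered every sample
            return image
        return None

d = DwellDetector()
d.update("image5", 0.0)
d.update("image5", 0.8)          # still below the threshold
fired = d.update("image5", 1.6)  # dwell reached: trigger image 5's function
```

Resetting the timer on each trigger prevents the same fixation from firing the function repeatedly, a detail the patent leaves unspecified.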
In this way, the VR device can both present the user's gaze track and, by collecting the time the user's eyeball part remains at the same position, determine whether the gaze duration satisfies the condition for triggering a click operation. If it does, the function corresponding to the enlarged image is triggered directly, allowing the user to control the VR device further.
Considering that the relative position between the infrared camera and the eyeball part may change greatly each time the user puts on the VR helmet, in one implementation of the embodiments of the invention calibration may be performed each time the helmet is worn, so as to reduce the manipulation error of controlling the VR device through the helmet. On the basis of the implementation shown in fig. 1 or fig. 5 (taking fig. 1 as an example), the implementation shown in fig. 6 can thus also be realized. Before step 101 acquires the moving distance of the user's eyeball part within a certain time and the positions of the eyeball part at the start and end moments of that time, the positional relationship between the infrared camera and the eyeball part may be calibrated, which may be implemented as steps 1061 to 1065:
1061. Acquire the position of the eyeball part at the specified time as it gazes at each designated image in turn according to a preset sequence.
The specified time is the start moment, or a moment in historical time at which the user gazed at the designated image with the highest resolution.
The preset sequence is a gazing order preset for calibration. As shown in fig. 7, it may run from left to right or from right to left, or follow some preset irregular order; the specific basis and manner of setting the preset sequence are not limited here.
1062. Acquire the positional relationship between every two designated images among all the designated images.
1063. Determine the positional relationship between the eyeball part and the gaze target according to the position of the eyeball part when each designated image is gazed at and the positional relationship between every two designated images.
It follows that there are at least two designated images, each located at a different position in the display interface. In the embodiments of the invention, the VR helmet may record the eyeball position corresponding to each designated image and the positional relationship between every two designated images, and from the recorded content obtain the correspondence between the position of the eyeball part and the gaze target, that is, the positional relationship between the eyeball position and each designated image.
1064. Determine the relative relationship according to the positional relationship between the position of the eyeball part and the gaze target and the position of the eyeball part when it currently gazes at the designated image.
In the invention, the relative relationship may be expressed by a mathematical formula or in other ways, for example by parameters such as the deviation angle, deviation direction, and deviation distance of the eyeball position relative to the infrared camera; this is not limited here.
Considering that the accuracy of the calibration process depends on the number of calibrations, the more calibrations are performed, the more accurate the result. In the invention, multiple references may therefore be set to carry out multiple calibrations. For example, fig. 7 shows a display interface of the VR device containing four designated images, images 1 to 4; different verification sequences may be applied over all the designated images to obtain multiple groups of verification results and thereby a more accurate relative relationship.
The positions of the designated images used for calibration are not limited to those shown in fig. 7; it suffices to ensure that each designated image appears at the same position when calibration is performed at the current time as it did at the specified time.
1065. Calibrate the positional relationship between the infrared camera and the eyeball part according to the relative relationship.
It should be noted that the user can adjust the position of the VR helmet worn on the head manually or by remote control, reducing as far as possible the values of the parameters representing the relative relationship, such as the deviation angle, deviation direction, and deviation distance, and thereby completing the calibration of the positional relationship.
In the invention, performing calibration in this way presents a better visual effect to the user: after calibration, the display interface presented before the user's eyes at the current moment is essentially the same as the interface corresponding to the previously stored optimal display condition, or essentially the same as the interface presented the last time the user used the VR device.
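One concrete way to realize the relative relationship of steps 1061 to 1065 is as an average deviation between the eyeball positions measured now and those recorded at the specified time for the same designated images. This is a sketch under assumptions: the patent does not fix a mathematical form, and also allows angle- and direction-based representations.

```python
# Sketch of steps 1061-1065: estimate the deviation of the current eyeball
# positions from the reference positions recorded at the specified time,
# averaged over all designated images. The offset model is an assumption.

def estimate_offset(reference, current):
    """Average (dx, dy) deviation over the designated images 1..n."""
    n = len(reference)
    dx = sum(c[0] - r[0] for r, c in zip(reference, current)) / n
    dy = sum(c[1] - r[1] for r, c in zip(reference, current)) / n
    return dx, dy

# Eyeball positions while gazing at designated images 1-4 (fig. 7),
# in a preset left-to-right order (coordinates are illustrative).
reference = [(100, 200), (300, 200), (500, 200), (700, 200)]
current = [(112, 195), (312, 195), (512, 195), (712, 195)]

dx, dy = estimate_offset(reference, current)
# The helmet position is then adjusted (step 1065) to drive (dx, dy) toward zero.
```

Using more designated images, or several gazing sequences, averages out noise, which matches the description's point that more calibrations give a more accurate result.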
An embodiment of the present invention provides an apparatus 20 for operating a VR device, as shown in fig. 8, where the apparatus 20 can be used to execute any one of the method processes shown in fig. 1, fig. 5, and fig. 6, and the apparatus 20 includes:
The acquisition module 21 is configured to acquire, through the infrared camera, the moving distance of the user's eyeball part within a certain time, the position of the eyeball part at the start moment of that time, and the position of the eyeball part at the end moment of that time.
The determining module 22 is configured to determine, if the moving distance acquired by the acquisition module 21 is greater than the distance threshold, a first image corresponding to the position of the eyeball part at the start moment and a second image corresponding to the position of the eyeball part at the end moment.
The processing module 23 is configured to reduce the first image determined by the determining module 22 according to a first specified scale and to enlarge the second image determined by the determining module 22 according to a second specified scale, where the second specified scale is inversely proportional to the first specified scale.
In one implementation manner of the embodiment of the present invention, the apparatus 20 further includes:
and the detection module 24 is used for detecting the time that the eyeball part stays in the corresponding area of the second image.
And the control module 25 is configured to trigger the virtual reality VR device to execute a function corresponding to the second image after the time detected by the detection module 24 is greater than or equal to the time threshold.
In one implementation manner of the embodiment of the present invention, the apparatus 20 further includes:
and the calibration module 26 is used for calibrating the position relation between the infrared camera and the eyeball part.
In an implementation manner of the embodiment of the present invention, the calibration module 26 is specifically configured to:
acquire the position of the eyeball part at the specified time as it gazes at each designated image in turn according to a preset sequence;
acquire the positional relationship between every two designated images among all the designated images;
determine the positional relationship between the eyeball part and the gaze target according to the position of the eyeball part when each designated image is gazed at and the positional relationship between every two designated images;
determine the relative relationship according to the positional relationship between the position of the eyeball part and the gaze target and the position of the eyeball part when it currently gazes at the designated image; and
calibrate the positional relationship between the infrared camera and the eyeball part according to the relative relationship.
Compared with the prior art, in which a user controls a VR device through a worn VR helmet, through a Bluetooth controller that communicates with the VR device, or directly through a touch pad or physical buttons, the device provided by the invention tracks the gaze track of the user's eyeball part by means of an infrared camera. Once the moving distance of the eyeball part within a certain time exceeds a distance threshold, the first and second images, gazed at respectively at the start moment and the end moment of the gaze process, are determined; the first image is then reduced according to a first specified scale and the second image is enlarged according to a second specified scale. In other words, the invention reduces or enlarges the first and second images presented to the user by the VR device according to how the eyeball part moves while the user naturally watches the VR device. In this way, the user neither needs to rotate the head to turn the gyroscope of the VR helmet, nor needs to equip the VR device with an extra Bluetooth controller, touch pad, or physical buttons. The technical scheme provided by the invention can therefore offer the user a flexible and convenient way to control a VR device.
An embodiment of the invention provides a VR headset 30 for operating a VR device. As shown in fig. 9, the VR headset 30 includes a communication interface 31 and a processor 32, and may also include a memory 33 and a bus 34. The bus 34 may be used to connect the communication interface 31, the processor 32 and the memory 33, and the memory 33 may be used to store data generated by the VR headset 30 during execution of the method flows shown in figs. 1, 5 and 6.
The communication interface 31 is configured to obtain, through the infrared camera, the distance the user's eyeball part moves within a certain time, the position of the eyeball part at the start of that time, and its position at the end of that time. The processor 32 is configured to determine, if the moving distance is greater than the distance threshold, a first image corresponding to the position of the eyeball part at the start time and a second image corresponding to its position at the end time, to reduce the first image according to a first specified proportion, and to enlarge the second image according to a second specified proportion. The processor 32 is further configured to detect how long the eyeball part stays in the area corresponding to the second image and, when that time is greater than or equal to a time threshold, to trigger the VR device to execute the function corresponding to the second image.
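The processor logic just described, the threshold test, the reduce/enlarge step, and the dwell-time trigger, can be sketched as follows. The threshold values, the scaling ratios and the `image_at` hit-test callback are hypothetical stand-ins of our own; the patent does not fix concrete values for any of them.

```python
import math

# Hypothetical constants: the patent specifies a distance threshold and a
# time threshold but leaves their concrete values open.
DISTANCE_THRESHOLD = 50.0  # minimum eye movement to count as a gaze shift
TIME_THRESHOLD = 1.5       # dwell time that triggers the second image's function

def process_gaze(start_pos, end_pos, image_at, dwell_time,
                 first_scale=0.5, second_scale=1.5):
    """Return the UI actions implied by one observed gaze movement.

    image_at is a caller-supplied lookup from a screen position to the image
    displayed there (a stand-in for the VR device's hit test). dwell_time is
    how long the eye has stayed on the destination image.
    """
    distance = math.hypot(end_pos[0] - start_pos[0], end_pos[1] - start_pos[1])
    if distance <= DISTANCE_THRESHOLD:
        return []  # movement too small: leave the display unchanged
    first = image_at(start_pos)   # image gazed at when the movement began
    second = image_at(end_pos)    # image gazed at when the movement ended
    actions = [("reduce", first, first_scale),
               ("enlarge", second, second_scale)]
    if dwell_time >= TIME_THRESHOLD:
        # dwell long enough: execute the function bound to the second image
        actions.append(("trigger", second))
    return actions
```

A caller would feed this function the two calibrated gaze positions and the measured dwell time, then dispatch the returned action tuples to the renderer.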
Before the communication interface 31 obtains the moving distance and the two positions, the processor 32 is further configured to: acquire the position of the eyeball part as it gazes at each specified image in sequence, in a preset order, at a specified time; acquire the positional relationship between every two specified images among all the specified images; determine the positional relationship between the eyeball part and a gaze target according to the position of the eyeball part when each specified image is gazed at and the positional relationship between every two specified images; determine a relative relationship according to that positional relationship and the position of the eyeball part when it currently gazes at the specified image; and calibrate the positional relationship between the infrared camera and the eyeball part according to the relative relationship.
Compared with the prior art, in which a user controls VR equipment through a worn VR helmet together with a Bluetooth controller that communicates with the VR equipment, or directly through a touch pad or physical keys, the VR headset provided here can track the gaze trajectory of the user's eyeball part by means of an infrared camera. Once the distance the eyeball part moves within a certain time exceeds a distance threshold, it determines the first and second images gazed at, respectively, at the start and end of the gaze movement, reduces the first image according to a first specified proportion, and enlarges the second image according to a second specified proportion. The invention thus reduces or enlarges the first and second images presented to the user by the VR equipment according to the movement of the eyeball part while the user watches the VR equipment normally. In this way, the user neither needs to rotate the head to drive the VR helmet's gyroscope nor needs to equip the VR device with an additional Bluetooth controller, touch pad, or physical keys. The technical scheme provided by the invention therefore offers the user a flexible and convenient way of controlling VR equipment.
The embodiments in the present specification are described in a progressive manner; for identical or similar parts, the embodiments may be referred to one another, and each embodiment focuses on its differences from the others. The apparatus embodiment in particular is described relatively briefly, since it is substantially similar to the method embodiment; for relevant details, reference may be made to the description of the method embodiment.
It will be understood by those skilled in the art that all or part of the processes of the methods of the embodiments described above can be implemented by a computer program, which can be stored in a computer-readable storage medium, and when executed, can include the processes of the embodiments of the methods described above. The storage medium may be a magnetic disk, an optical disk, a Read-Only Memory (ROM), a Random Access Memory (RAM), or the like.
The above description is only for the specific embodiment of the present invention, but the scope of the present invention is not limited thereto, and any changes or substitutions that can be easily conceived by those skilled in the art within the technical scope of the present invention are included in the scope of the present invention. Therefore, the protection scope of the present invention shall be subject to the protection scope of the claims.

Claims (6)

1. A method of operating a VR device, the method comprising:
acquiring, through an infrared camera, a moving distance of a user's eyeball part within a certain time, a position of the eyeball part at a start time of the certain time, and a position of the eyeball part at an end time of the certain time;
if the moving distance is greater than a distance threshold, determining a first image corresponding to the position of the eyeball part at the start time and a second image corresponding to the position of the eyeball part at the end time;
reducing the first image according to a first specified proportion, and enlarging the second image according to a second specified proportion;
before the acquiring, through the infrared camera, of the moving distance of the user's eyeball part within the certain time, the position of the eyeball part at the start time of the certain time, and the position of the eyeball part at the end time of the certain time, calibrating a positional relationship between the infrared camera and the eyeball part, which specifically includes:
acquiring the position of the eyeball part when the eyeball part gazes at each specified image in sequence, in a preset order, at a specified time;
acquiring the positional relationship between every two specified images among all the specified images;
determining the positional relationship between the eyeball part and a gaze target according to the position of the eyeball part when each specified image is gazed at and the positional relationship between every two specified images among all the specified images;
determining a relative relationship according to the positional relationship between the eyeball part and the gaze target and the position of the eyeball part when the eyeball part currently gazes at the specified image;
and calibrating the positional relationship between the infrared camera and the eyeball part according to the relative relationship.
2. The method of claim 1, wherein after the reducing of the first image according to the first specified proportion and the enlarging of the second image according to the second specified proportion, the method further comprises:
detecting a time for which the eyeball part stays in an area corresponding to the second image;
and when the time is greater than or equal to a time threshold, triggering the virtual reality VR device to execute a function corresponding to the second image.
3. The method of claim 1, wherein the specified time is the start time, or a time of highest image definition when the user gazed at the specified image during a historical period.
4. An apparatus for operating a VR device, the apparatus comprising:
an acquisition module, configured to acquire, through an infrared camera, a moving distance of a user's eyeball part within a certain time, a position of the eyeball part at a start time of the certain time, and a position of the eyeball part at an end time of the certain time;
a determining module, configured to determine, if the moving distance acquired by the acquisition module is greater than a distance threshold, a first image corresponding to the position of the eyeball part at the start time and a second image corresponding to the position of the eyeball part at the end time;
a processing module, configured to reduce the first image determined by the determining module according to a first specified proportion and to enlarge the second image determined by the determining module according to a second specified proportion;
a calibration module, configured to calibrate a positional relationship between the infrared camera and the eyeball part before the infrared camera acquires the moving distance of the user's eyeball part within the certain time, the position of the eyeball part at the start time of the certain time, and the position of the eyeball part at the end time of the certain time, and specifically configured to:
acquire the position of the eyeball part when the eyeball part gazes at each specified image in sequence, in a preset order, at a specified time;
acquire the positional relationship between every two specified images among all the specified images;
determine the positional relationship between the eyeball part and a gaze target according to the position of the eyeball part when each specified image is gazed at and the positional relationship between every two specified images among all the specified images;
determine a relative relationship according to the positional relationship between the eyeball part and the gaze target and the position of the eyeball part when the eyeball part currently gazes at the specified image;
and calibrate the positional relationship between the infrared camera and the eyeball part according to the relative relationship.
5. The apparatus of claim 4, further comprising:
a detection module, configured to detect a time for which the eyeball part stays in an area corresponding to the second image;
and a control module, configured to trigger the virtual reality VR device to execute a function corresponding to the second image when the time detected by the detection module is greater than or equal to a time threshold.
6. The apparatus of claim 4, wherein the specified time is the start time, or a time of highest image definition when the user gazed at the specified image during a historical period.
CN201611217984.4A 2016-12-26 2016-12-26 Method and device for controlling VR equipment Active CN106774912B (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN201611217984.4A CN106774912B (en) 2016-12-26 2016-12-26 Method and device for controlling VR equipment

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
CN201611217984.4A CN106774912B (en) 2016-12-26 2016-12-26 Method and device for controlling VR equipment

Publications (2)

Publication Number Publication Date
CN106774912A CN106774912A (en) 2017-05-31
CN106774912B true CN106774912B (en) 2020-04-07

Family

ID=58925912

Family Applications (1)

Application Number Title Priority Date Filing Date
CN201611217984.4A Active CN106774912B (en) 2016-12-26 2016-12-26 Method and device for controlling VR equipment

Country Status (1)

Country Link
CN (1) CN106774912B (en)

Families Citing this family (2)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN107797664B (en) * 2017-10-27 2021-05-07 Oppo广东移动通信有限公司 Content display method and device and electronic device
CN111338480A (en) * 2020-02-26 2020-06-26 惠州Tcl移动通信有限公司 Image updating method and device, storage medium and mobile terminal

Citations (6)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN1661536A (en) * 2004-02-23 2005-08-31 鸿富锦精密工业(深圳)有限公司 Non-linear and non-tree configured menu mode
CN103838374A (en) * 2014-02-28 2014-06-04 深圳市中兴移动通信有限公司 Message notification method and message notification device
CN104699250A (en) * 2015-03-31 2015-06-10 小米科技有限责任公司 Display control method, display control device and electronic equipment
CN104793750A (en) * 2015-04-30 2015-07-22 努比亚技术有限公司 Input method and device
CN104915099A (en) * 2015-06-16 2015-09-16 努比亚技术有限公司 Icon sorting method and terminal equipment
CN104951070A (en) * 2015-06-02 2015-09-30 无锡天脉聚源传媒科技有限公司 Method and device for manipulating device based on eyes

Family Cites Families (2)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CA2545202C (en) * 2003-11-14 2014-01-14 Queen's University At Kingston Method and apparatus for calibration-free eye tracking
US8879801B2 (en) * 2011-10-03 2014-11-04 Qualcomm Incorporated Image-based head position tracking method and system


Also Published As

Publication number Publication date
CN106774912A (en) 2017-05-31

Similar Documents

Publication Publication Date Title
JP7191714B2 (en) Systems and methods for direct pointing detection for interaction with digital devices
JP6677239B2 (en) Information processing apparatus, control method, and program
JP6652613B2 (en) Wearable device, control method, control program, and imaging device
JP4399513B2 (en) EEG interface system, EEG interface apparatus, method, and computer program
US9001006B2 (en) Optical-see-through head mounted display system and interactive operation
EP2956844B1 (en) Systems and methods of eye tracking calibration
JP6123694B2 (en) Information processing apparatus, information processing method, and program
KR20150032019A (en) Method and apparatus for providing user interface by using eye tracking
US20100189426A1 (en) System and method for human machine interface for zoom content on display
US10896545B1 (en) Near eye display interface for artificial reality applications
US20140085189A1 (en) Line-of-sight detection apparatus, line-of-sight detection method, and program therefor
US10877647B2 (en) Estimations within displays
CN106774912B (en) Method and device for controlling VR equipment
CN113495613B (en) Eyeball tracking calibration method and device
JPWO2020080107A1 (en) Information processing equipment, information processing methods, and programs
US10466780B1 (en) Systems and methods for eye tracking calibration, eye vergence gestures for interface control, and visual aids therefor
JP2016126687A (en) Head-mounted display, operation reception method, and operation reception program
CN112148112B (en) Calibration method and device, nonvolatile storage medium and processor
KR20190085466A (en) Method and device to determine trigger intent of user
KR102325684B1 (en) Eye tracking input apparatus thar is attached to head and input method using this
JP6982656B2 (en) Eye-pointing system operated by the eye
JP5887297B2 (en) Image processing apparatus and image processing program
CN113491502A (en) Eyeball tracking calibration inspection method, device, equipment and storage medium
CN112114657B (en) Method and system for collecting gaze point information
JP6876639B2 (en) Display control device, display control method and program

Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination
GR01 Patent grant