CN114973042A - VR handle position detection method, device, equipment and medium

VR handle position detection method, device, equipment and medium

Info

Publication number
CN114973042A
Authority
CN
China
Prior art keywords
handle
equipment
camera
current
information
Prior art date
Legal status
Pending
Application number
CN202210520123.2A
Other languages
Chinese (zh)
Inventor
尹伟
张方方
Current Assignee
Goertek Inc
Original Assignee
Goertek Inc
Priority date
Filing date
Publication date
Application filed by Goertek Inc
Priority to CN202210520123.2A
Publication of CN114973042A

Classifications

    • G: PHYSICS
    • G06: COMPUTING; CALCULATING OR COUNTING
    • G06V: IMAGE OR VIDEO RECOGNITION OR UNDERSTANDING
    • G06V20/00: Scenes; Scene-specific elements
    • G06V20/20: Scenes; Scene-specific elements in augmented reality scenes
    • G: PHYSICS
    • G06: COMPUTING; CALCULATING OR COUNTING
    • G06F: ELECTRIC DIGITAL DATA PROCESSING
    • G06F3/00: Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
    • G06F3/01: Input arrangements or combined input and output arrangements for interaction between user and computer
    • G06F3/011: Arrangements for interaction with the human body, e.g. for user immersion in virtual reality
    • G: PHYSICS
    • G06: COMPUTING; CALCULATING OR COUNTING
    • G06F: ELECTRIC DIGITAL DATA PROCESSING
    • G06F2203/00: Indexing scheme relating to G06F3/00 - G06F3/048
    • G06F2203/01: Indexing scheme relating to G06F3/01
    • G06F2203/012: Walk-in-place systems for allowing a user to walk in a virtual environment while constraining him to a given position in the physical environment

Abstract

The application discloses a VR handle position detection method, device, equipment and medium, and relates to the field of virtual reality. An image of the handle is acquired through a camera of the current VR device, where the VR device includes a head-mounted device and the handle; whether the handle is within the field angle range of the camera is determined; if not, a handle position confirmation request is sent to other devices with a camera function, so that position information of the head-mounted device and the handle of the current VR device is obtained through those devices; and the position of the handle is confirmed according to the position information, so that the position tracking of the handle can be corrected. In this way, after the handle moves out of the detection range of its own camera, the position of the handle is confirmed through other devices with a camera function, so that the specific position of the handle is determined and the position tracking of the handle can be corrected. This enlarges the range of motion of the handle and improves the user experience.

Description

VR handle position detection method, device, equipment and medium
Technical Field
The present application relates to the field of virtual reality, and in particular, to a VR handle position detecting method, apparatus, device, and medium.
Background
Virtual Reality (VR) technology combines the virtual with the real. Using data taken from real life, electronic signals generated by computer technology are combined with various output devices and converted into phenomena that people can perceive; these phenomena may be real objects that exist in reality or substances that cannot be seen with the naked eye, and they are expressed through three-dimensional models. Because these phenomena are not directly observed but are a real world simulated by computer technology, this is called virtual reality. Immersion is the most important characteristic of virtual reality technology: the user feels that he or she has become part of the environment created by the computer system. The immersion of virtual reality technology depends on the user's perception system; when the user perceives stimuli from the virtual world, including touch, taste, smell and motion perception, a resonance of thought is produced that causes psychological immersion, as if entering the real world.
Current VR products on the market place a light strip on the handle. During use, the camera of the head-mounted device photographs the light strip, the position and rotation angle of the handle are calculated from the images of the light strip, and detection of the position and rotation angle of the handle is achieved in combination with an inertial sensor. A VR handle implemented in this way suffers from camera blind spots, for example positions behind the head or behind the body that the camera cannot see; when the handle moves into such a blind spot, its position can no longer be accurately determined.
In view of the above problems, designing a VR handle position detection method that avoids the handle becoming impossible to locate in camera blind spots is an urgent problem to be solved by those skilled in the art.
Disclosure of Invention
The purpose of the application is to provide a VR handle position detection method, device, equipment and medium, which solve the problem that the position of the handle cannot be accurately determined when the handle moves into a camera blind spot during use of the VR device.
In order to solve the above technical problem, the present application provides a VR handle position detecting method, including:
acquiring an image of a handle through a camera of the current VR equipment; wherein the VR device comprises a head-mounted device and the handle;
judging whether the handle is in the field angle range of the camera or not;
if not, sending a handle position confirmation request to other equipment with a camera shooting function so as to obtain the position information of the head-mounted equipment and the handle of the current VR equipment through the other equipment with the camera shooting function;
and confirming the position of the handle according to the position information.
Preferably, the specific step of acquiring the position information of the headset and the handle of the current VR device by the other device with the camera function includes:
receiving the handle position confirmation request by each piece of other equipment with the camera shooting function;
respectively starting the cameras according to the handle position confirmation requests;
judging whether an image containing the identification information of the current VR equipment can be shot or not; wherein the identification information is respectively located on the surfaces of the head-mounted device and the handle;
if so, acquiring the position information of the head-mounted equipment and the handle of the current VR equipment through the camera respectively according to the identification information.
Preferably, before the sending of the handle position confirmation request to the other apparatus having the image capturing function, the method further includes:
judging whether other equipment with the camera shooting function meeting the preset requirement exists or not;
and if so, sending a handle position confirmation request to other equipment with the camera shooting function.
Preferably, if it is determined that the handle is within the field angle range of the camera, the method further includes:
acquiring distance information between the handle and the head-mounted equipment, and acquiring the flip angle of the handle;
confirming the position of the handle according to the distance information and the flip angle so as to conveniently track the position of the handle;
and returning to the step of acquiring the image of the handle through the camera of the current VR equipment.
Preferably, the specific step of position tracking the handle comprises:
acquiring acceleration information and angular velocity information of the handle;
and updating the position of the handle and the flip angle according to the acceleration information and the angular velocity information so as to track the position of the handle.
Preferably, the determining whether there is the other device having the image capturing function that meets a preset requirement includes:
judging whether there is other equipment with the camera shooting function within a preset distance of the current VR equipment;
if yes, confirming that the other equipment with the camera shooting function meets the preset requirement, and sending a handle position confirmation request to the other equipment with the camera shooting function.
Preferably, after the confirming that there is the other apparatus having the image capturing function which meets the preset requirement, the method further includes:
judging whether the obtained shooting angles of the other equipment with the camera shooting function which accords with the preset distance are different or not;
and if so, entering a step of sending a handle position confirmation request to other equipment with a camera shooting function.
In order to solve the above technical problem, the present application further provides a VR handle position detecting device, including:
the acquisition module is used for acquiring an image of the handle through a camera of the current VR equipment; wherein the VR device comprises a head-mounted device and the handle;
the judging module is used for judging whether the handle is in the field angle range of the camera or not; if not, triggering a sending module;
the sending module is configured to send a handle position confirmation request to other devices with a camera function, so as to obtain position information of the headset and the handle of the current VR device through the other devices with the camera function;
and the confirming module is used for confirming the position of the handle according to the position information.
In order to solve the above technical problem, the present application further provides a VR device, including:
a memory for storing a computer program;
and the processor is used for realizing the steps of the VR handle position detection method when executing the computer program.
In order to solve the above technical problem, the present application further provides a computer-readable storage medium, where a computer program is stored, and when the computer program is executed by a processor, the steps of the VR handle position detecting method are implemented.
According to the VR handle position detection method provided by the application, an image of the handle is acquired through a camera of the current VR device, where the VR device includes a head-mounted device and the handle; whether the handle is within the field angle range of the camera is determined; if not, a handle position confirmation request is sent to other devices with a camera function, so that position information of the head-mounted device and the handle of the current VR device is obtained through those devices; and the position of the handle is confirmed according to the position information, so that the position tracking of the handle can be corrected. In this way, after the handle moves out of the detection range of its own camera, the position of the handle is confirmed through other devices with a camera function, so that the specific position of the handle is determined and the position tracking of the handle can be corrected. This enlarges the range of motion of the handle and improves the user experience.
In addition, the embodiment of the application also provides a VR handle position detection device, VR equipment and a computer readable storage medium, and the effect is the same as above.
Drawings
In order to more clearly illustrate the embodiments of the present application, the drawings needed for the embodiments will be briefly described below, and it is obvious that the drawings in the following description are only some embodiments of the present application, and that other drawings can be obtained by those skilled in the art without inventive effort.
Fig. 1 is a flowchart of a VR handle position detection method according to an embodiment of the present disclosure;
fig. 2 is a schematic diagram of an application scenario of multiple VR devices according to an embodiment of the present application;
fig. 3 is a flowchart of another VR handle position detection method according to an embodiment of the present disclosure;
fig. 4 is a schematic structural diagram of a VR handle position detecting device according to an embodiment of the present disclosure;
fig. 5 is a schematic structural diagram of a VR device according to an embodiment of the present application.
Detailed Description
The technical solutions in the embodiments of the present application will be clearly and completely described below with reference to the drawings in the embodiments of the present application, and it is obvious that the described embodiments are only a part of the embodiments of the present application, and not all the embodiments. All other embodiments obtained by a person of ordinary skill in the art based on the embodiments in the present application without any creative effort belong to the protection scope of the present application.
The core of the application is to provide a VR handle position detection method, device, equipment and medium.
In order that those skilled in the art will better understand the disclosure, the following detailed description will be given with reference to the accompanying drawings.
Fig. 1 is a flowchart of a VR handle position detection method according to an embodiment of the present application. As shown in fig. 1, the method comprises:
s10: acquiring an image of a handle through a camera of the current VR equipment; wherein, the VR device includes a head-mounted device and a handle.
S11: judging whether the handle is in the field angle range of the camera or not; if not, the process proceeds to step S12.
S12: and sending a handle position confirmation request to other equipment with the camera shooting function so as to obtain the position information of the head-mounted equipment and the handle of the current VR equipment through the other equipment with the camera shooting function.
S13: and confirming the position of the handle according to the position information.
It can be understood that a VR device simulates the real world: by comprehensively using a computer graphics system and various interface devices for display and control, it provides an immersive experience in an interactive three-dimensional environment generated on a computer. Typically, a VR device consists mainly of a head-mounted device and a handle. The head-mounted device is the display of the VR device; it outputs a picture of the virtual environment produced by computer simulation so as to give the user a sense of immersion in that environment, and the VR presentation is mainly delivered through the VR head-mounted device. The handle is the controller of the VR device and performs specific control operations in the different uses of the VR device. Current VR products on the market place a light strip on the handle; during use, the camera of the head-mounted device photographs the light strip, the position and rotation angle of the handle are calculated from the images of the light strip, and detection of the position and rotation angle of the handle is achieved in combination with the inertial sensor of the VR device. A VR handle implemented in this way suffers from camera blind spots, for example positions behind the head or behind the body that the camera cannot see; when the handle moves into such a blind spot, its position cannot be accurately determined. Therefore, an embodiment of the present application provides a VR handle position detection method, which includes:
the method includes the steps that an image of a handle is collected through a camera of the current VR device, the camera of the current VR device is usually arranged on the head-mounted device, and can be specifically arranged on the surface of the head-mounted device or other positions of the head-mounted device. The camera can capture images of the handle at a preset capture frame rate, for example, 30fps, and the specific capture frame rate is not limited in this embodiment, depending on the specific implementation. The mode of the image of handle is gathered to the camera, shoots the lamp area of handle usually, just can confirm when shooting the lamp area and shoot the handle. In this embodiment, the light strip may be disposed on the surface of the handle or other positions of the handle, and meanwhile, the specific shape, size and color of the light strip are not limited, and are determined according to specific implementation conditions. In addition, the acquisition of the image of the handle can also be directly realized by identifying the handle, which is not limited in this embodiment and is determined according to specific implementation conditions.
While the camera of the current VR device acquires images of the handle, it is determined whether the handle is within the field angle range of the camera. The field of view (FOV), also called the field angle in optical engineering, determines the range of view of an optical instrument: the angle formed at the lens by the two edges of the maximum range through which the image of the measured target can pass is called the field angle. The larger the field angle, the larger the field of view and the smaller the optical magnification; a target object outside the field angle is not captured by the lens. Therefore, judging whether the handle is within the field angle range of the camera amounts to judging whether the camera can photograph the handle; as described in the above embodiment, this can specifically be judging whether the camera can photograph the light strip of the handle. If the handle is confirmed not to be within the field angle range of the camera, a handle position confirmation request is sent to other devices with a camera function, so that position information of the head-mounted device and the handle of the current VR device is obtained through those devices.
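As an illustration of this test, the sketch below phrases the field-angle check geometrically: given the last estimated handle position expressed in the camera's coordinate system, it checks whether the handle direction lies inside the camera's horizontal and vertical field angles. In practice the decision can simply be whether the light strip appears in the captured image; the FOV values and coordinate convention here are assumptions.

    import math

    def handle_in_field_angle(handle_pos_cam, horizontal_fov_deg=100.0, vertical_fov_deg=80.0):
        """Return True if a point given in the camera frame (+z along the optical axis)
        falls inside the camera's field angle range. The FOV values are illustrative."""
        x, y, z = handle_pos_cam
        if z <= 0:
            return False  # behind the camera, certainly out of view
        h_angle = math.degrees(math.atan2(abs(x), z))
        v_angle = math.degrees(math.atan2(abs(y), z))
        return h_angle <= horizontal_fov_deg / 2 and v_angle <= vertical_fov_deg / 2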
It can be understood that the other devices with a camera function are in the same application scene as the current VR device. Other devices with a camera function include, but are not limited to, other VR devices, Augmented Reality (AR) devices, and even mobile phones or tablet computers; it only needs to be ensured that the device can take pictures, and the specific type of the other devices with a camera function is not limited and depends on the specific implementation. Meanwhile, in a specific implementation, other devices of the same type may be used to simultaneously acquire the position information of the head-mounted device and the handle of the current VR device, for example two other VR devices; other devices of different types may also be selected, for example a mobile phone and an AR device; the specific way the other devices with a camera function are applied is not limited in this embodiment and depends on the specific implementation.
In addition, the other devices with a camera function need to be able to establish a communication connection with the current VR device, so that they can receive the handle position confirmation request sent by the current VR device and send position information back to it. The communication connection may be a Bluetooth Low Energy (BLE) connection, Near Field Communication (NFC), or a wireless network connection; it only needs to be ensured that the other devices with a camera function can communicate with the current VR device, which depends on the specific implementation.
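The application does not fix a transport protocol, so the following sketch only illustrates the idea of a handle position confirmation request sent over a shared wireless network; the message format and port number are hypothetical, and a BLE or NFC transport would serve the same purpose.

    import json
    import socket

    REQUEST_PORT = 47808  # hypothetical port number

    def send_handle_position_request(device_id):
        """Broadcast a handle position confirmation request to other devices on the local network."""
        request = json.dumps({"type": "handle_position_request", "device": device_id})
        with socket.socket(socket.AF_INET, socket.SOCK_DGRAM) as sock:
            sock.setsockopt(socket.SOL_SOCKET, socket.SO_BROADCAST, 1)
            sock.sendto(request.encode("utf-8"), ("255.255.255.255", REQUEST_PORT))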
Considering the application scenario of the current VR device, this embodiment provides a specific embodiment of the other devices with a camera function. Fig. 2 is a schematic diagram of an application scenario of multiple VR devices according to an embodiment of the present application. As shown in fig. 2, in some usage scenarios of VR devices, multiple users usually use multiple sets of VR devices to experience the same VR scene. Therefore, in this embodiment, when the camera of one VR device cannot capture an image of its own handle and locate the handle, the other devices with a camera function may specifically be the remaining VR devices with a camera function; that is, the handle is photographed by the remaining VR devices in the same scene to assist in locating it. As shown in fig. 2, for example, the current VR device A determines that its own handle is not within the field angle range of its camera, sends a handle position confirmation request to the remaining VR devices, namely VR device B and VR device C, and acquires the position information of the head-mounted device and the handle of the current VR device A through VR device B and VR device C.
It should be noted that, taking fig. 2 as an example, when VR device B and VR device C acquire the position information of the head-mounted device and the handle of VR device A that sent the handle position confirmation request, the cameras of VR device B and VR device C may photograph the light strips on the head-mounted device and the handle to acquire the position information; alternatively, the distances to the head-mounted device and the handle of VR device A may be measured through positioning signals between the VR devices, so as to acquire the position information of VR device A.
After the position information is obtained, the position of the handle is confirmed based on the position information. It can be understood that, since the position information of the head-mounted device and the handle of the current VR device is obtained by the other devices with a camera function, the confirmation may be performed on those devices, which determine the position of the handle from the position information and then send the specific position of the handle to the current VR device; alternatively, the position information may be sent directly to the current VR device, which confirms the position of the handle from the position information itself. The specific way the position of the handle is confirmed according to the position information is not limited and depends on the specific implementation.
In addition, after the current VR device confirms the position of the handle, the process may immediately return to step S10 to capture an image of the handle so as to confirm its position again; the image of the handle may also be captured after a preset period, which is not limited in this embodiment and depends on the specific implementation.
In this embodiment, an image of the handle is acquired through the camera of the current VR device, where the VR device includes a head-mounted device and the handle; whether the handle is within the field angle range of the camera is determined; if not, a handle position confirmation request is sent to other devices with a camera function, so that position information of the head-mounted device and the handle of the current VR device is obtained through those devices; and the position of the handle is confirmed according to the position information, so that the position tracking of the handle can be corrected. In this way, after the handle moves out of the detection range of its own camera, the position of the handle is confirmed through other devices with a camera function, so that the specific position of the handle is determined and the position tracking of the handle can be corrected. This enlarges the range of motion of the handle and improves the user experience.
On the basis of the above-described embodiment:
as a preferred embodiment, the specific step of acquiring the position information of the head-mounted device and the handle of the current VR device by using another device with a camera function includes:
other equipment with the camera shooting function receives a handle position confirmation request;
respectively starting the cameras according to the handle position confirmation requests;
judging whether an image containing identification information of the current VR equipment can be shot or not; the identification information is respectively positioned on the surfaces of the head-mounted equipment and the handle;
if yes, the position information of the head-mounted equipment and the handle of the current VR equipment is collected through the camera according to the identification information.
In the above embodiments, the specific manner of acquiring the location information is not limited, and is determined according to specific implementation situations. As a preferred embodiment, in this embodiment, the specific steps of acquiring the position information of the head-mounted device and the handle of the current VR device by another device having a camera function are as follows:
first, the other devices each having an imaging function receive a handle position confirmation request. The handle position confirmation request enables another device having an image pickup function to confirm which VR device has been requested, and thereby perform image pickup. After other equipment with the camera shooting function receives the handle position confirmation request, VR equipment needing to be shot can be confirmed, and the cameras are respectively started according to the handle position confirmation request. And starting shooting after each camera is started.
It should be noted that the images captured by another device with a camera function after its camera is started are not necessarily images of the head-mounted device and the handle of the current VR device, in which case the position information of the head-mounted device and the handle of the current VR device cannot be acquired. Therefore, during photographing, it is also necessary to determine whether an image containing the identification information of the current VR device can be captured, where the identification information is located on the surfaces of the head-mounted device and the handle respectively. The identification information may be a preset pattern, a certain color, or a light strip flashing at a preset frequency; it only needs to allow the other devices with a camera function to confirm that they are photographing the current VR device. The identification information is located on the surfaces of the head-mounted device and the handle, and its specific position on those surfaces is not limited and depends on the specific implementation.
When it is determined that an image containing the identification information of the current VR device has been captured, the position information of the head-mounted device and the handle of the current VR device is acquired through the camera according to the identification information. Specifically, when the other devices with a camera function are the remaining VR devices, taking VR device B and VR device C in fig. 2 as an example, after each of them receives the handle position confirmation request of the current VR device A, each starts photographing; by photographing the current VR device A, the position information of the head-mounted device and the handle of the current VR device A in the coordinate systems of VR device B and VR device C can be obtained respectively. The position information acquired by VR device B is: head-mounted device position information of the current VR device A (x1, y1, z1, θx1, θy1, θz1) and handle position information of the current VR device A (x2, y2, z2, θx2, θy2, θz2). The position information acquired by VR device C is: head-mounted device position information of the current VR device A (x3, y3, z3, θx3, θy3, θz3) and handle position information of the current VR device A (x4, y4, z4, θx4, θy4, θz4).
It can be understood that from this position information, the position of the head-mounted device and the handle of the current VR device A in the coordinate systems of VR device B and VR device C, including the handle coordinates and the handle flip angle, can be obtained respectively. Further, VR device B and VR device C respectively send the obtained position information to the current VR device A. After receiving the fed-back position information, VR device A converts the data through trigonometric functions to obtain the position information of the handle in its own coordinate system (Δx1, Δy1, Δz1, Δθx1, Δθy1, Δθz1), thereby confirming the position of the handle.
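The conversion "through trigonometric functions" can be read as a change of coordinate frame: the handle pose observed in VR device B's coordinate system is re-expressed relative to the observed pose of device A's head-mounted device. A minimal sketch of this step follows; the Euler-angle convention is an assumption, since the application does not specify one.

    import numpy as np

    def euler_to_matrix(rx, ry, rz):
        """Rotation matrix from Euler angles in radians (Z*Y*X order, assumed here)."""
        cx, sx = np.cos(rx), np.sin(rx)
        cy, sy = np.cos(ry), np.sin(ry)
        cz, sz = np.cos(rz), np.sin(rz)
        Rx = np.array([[1, 0, 0], [0, cx, -sx], [0, sx, cx]])
        Ry = np.array([[cy, 0, sy], [0, 1, 0], [-sy, 0, cy]])
        Rz = np.array([[cz, -sz, 0], [sz, cz, 0], [0, 0, 1]])
        return Rz @ Ry @ Rx

    def handle_in_headset_frame(headset_pose, handle_pose):
        """Both poses are (x, y, z, rx, ry, rz) measured in the observing device's
        coordinate system; the result is the handle position expressed in the
        coordinate system of the requesting device's head-mounted device."""
        p_head = np.array(headset_pose[:3])
        r_head = euler_to_matrix(*headset_pose[3:])
        p_handle = np.array(handle_pose[:3])
        return r_head.T @ (p_handle - p_head)

Applied to the values above, handle_in_headset_frame with the headset pose (x1, y1, z1, θx1, θy1, θz1) and the handle pose (x2, y2, z2, θx2, θy2, θz2) would yield the translational part (Δx1, Δy1, Δz1) derived from VR device B's observation.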
It should be noted that, because the position information received by VR device A comes from VR device B and VR device C respectively, when processing the position information the two sets may be averaged and the averaged data then converted to obtain the position information of the handle in its own coordinate system; alternatively, the reliability of the two sets of position information may be compared and the data conversion performed on the set with the higher reliability. The specific way the current VR device processes the received position information is not limited and depends on the specific implementation.
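A simple averaging of the two converted estimates could look like the sketch below; the equal weighting is an assumption, and the weights could instead reflect whichever observation is judged more reliable.

    import numpy as np

    def fuse_handle_positions(pos_from_b, pos_from_c, weight_b=0.5):
        """Combine two handle positions, both already expressed in the requesting
        device's own coordinate system."""
        pos_from_b = np.asarray(pos_from_b, dtype=float)
        pos_from_c = np.asarray(pos_from_c, dtype=float)
        return weight_b * pos_from_b + (1.0 - weight_b) * pos_from_c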
In this embodiment, each of the other devices with a camera function receives the handle position confirmation request; the cameras are respectively started according to the handle position confirmation request; it is determined whether an image containing the identification information of the current VR device can be captured, where the identification information is located on the surfaces of the head-mounted device and the handle respectively; if so, the position information of the head-mounted device and the handle of the current VR device is acquired through the cameras according to the identification information. In this way, the position information of the head-mounted device and the handle of the current VR device is collected through the other devices with a camera function, so that the current VR device can subsequently confirm the position of the handle according to the position information.
Fig. 3 is a flowchart of another VR handle position detection method according to an embodiment of the present application. As shown in fig. 3, in order to confirm whether there is any other device having an image capturing function that can provide position information of the handle, before transmitting a handle position confirmation request to the other device having an image capturing function, the method further includes:
s14: judging whether other equipment with a camera shooting function meeting the preset requirement exists or not; if yes, the process proceeds to step S12.
It can be understood that, since the confirmation of the handle position of the VR device in the present application is implemented through other devices with a camera function, in an actual application there may be no other device with a camera function in the same VR scene that can provide the handle position information, or the existing devices with a camera function may be unable to provide the position information of the handle of the current VR device. To prevent the current VR device from continuously sending handle position confirmation requests and increasing the power consumption of the device, before a handle position confirmation request is sent to other devices with a camera function, it is determined whether there are other devices with a camera function that meet a preset requirement. Only when it is confirmed that such devices exist is the handle position confirmation request sent to them.
In the present embodiment, the preset requirement is not limited. Considering that the clarity of the images that the other devices with a camera function capture of the current VR device directly affects the acquisition of the position information of its head-mounted device and handle, the preset requirement may be set so that the distance between the other device with a camera function and the current VR device satisfies a preset distance; other devices with a camera function that meet the distance requirement are thereby screened out, and when the distance requirement is met, a handle position confirmation request is sent to those devices to acquire the position information of the head-mounted device and the handle of the current VR device. The preset distance is not limited and depends on the specific implementation.
In addition, since the current VR device is in communication connection with the other devices with a camera function and they are usually in the same VR scene, there may be an association between the images captured by the current VR device and those captured by the other devices. Therefore, the current VR device and the other devices with a camera function may share the captured images; from the shared images it is determined whether another device with a camera function can photograph the current VR device, and if so, that device meets the preset requirement, and the current VR device sends it a handle position confirmation request to obtain the position information of its head-mounted device and handle.
It should be noted that when the other devices with a camera function meeting the preset requirement are obtained, there may be a large number of them. Acquiring the position information of the head-mounted device and the handle of the current VR device usually only requires a few other devices with a camera function; using a large number of qualifying devices for the acquisition would increase device power consumption and waste device resources. Therefore, when a large number of other devices with a camera function meet the preset requirement, the current VR device may be set to send handle position confirmation requests only to a preset number of them, so as to avoid increasing the amount of position calculation and the power consumption of the devices. The preset number is not limited in this embodiment and depends on the specific implementation.
In addition, when there are a large number of other devices with a camera function meeting the preset requirement, devices with different shooting angles may be screened out, or devices that are currently idle may be screened out, and the handle position confirmation request is then sent to those devices.
In this embodiment, before the handle position confirmation request is sent to the other devices with a camera function, it is determined whether there are other devices with a camera function meeting the preset requirement, and the request is sent only after such devices are confirmed to exist, which avoids the increase in device power consumption caused by the current VR device continuously sending handle position confirmation requests.
As shown in fig. 3, if the handle is determined to be within the field angle range of the camera, the method further includes:
s15: the distance information between the handle and the head-mounted equipment is acquired, and the turning angle of the handle is acquired.
S16: confirming the position of the handle according to the distance information and the turning angle so as to conveniently track the position of the handle; return to step S10.
In the above-described embodiment, when the handle is not within the field angle range of the camera, the position information of the handle is acquired by another apparatus having an image pickup function, thereby confirming the position of the handle. In the embodiment, when the handle is in the field angle range of the camera, the position of the handle is confirmed by the camera of the current VR device. The method comprises the following specific steps:
when the handle is in the field angle range of the camera, firstly, the distance information between the handle and the head-mounted equipment is obtained, and the overturning angle of the handle is obtained. Specifically, shooting through a camera to obtain an image of the lamp strip of the handle; by imaging the light strip, the distance of the handle from the head-mounted device (x5, y5, z5) and the flip angle of the handle at that time (θ x5, θ y5, θ z5) are calculated. Further, confirming the position of the handle according to the distance information and the overturning angle; the confirmation of the handle position is obtained through the distance information and the overturning angle through a trigonometric function. After the position of the handle is acquired, the process returns to step S10 to cyclically acquire the position of the handle.
It should be noted that, in the present application, whether the position of the handle is obtained through its own camera or through other devices with a camera function, the obtained position is used to correct the handle position so as to track the movement of the handle. The specific process of tracking the position of the handle is not limited in this embodiment and depends on the specific implementation.
In this embodiment, if it is determined that the handle is within the field angle range of the camera, distance information between the handle and the head-mounted device is acquired, and the flip angle of the handle is acquired. The position of the handle is confirmed according to the distance information and the flip angle, so that the position of the handle is obtained and can conveniently be tracked subsequently.
On the basis of the above-described embodiment:
as a preferred embodiment, the specific steps of position tracking the handle include:
acquiring acceleration information and angular velocity information of a handle;
and updating the position and the flip angle of the handle according to the acceleration information and the angular velocity information so as to track the position of the handle.
In the above embodiments, the specific process of position tracking of the handle is not limited and depends on the specific implementation. As a preferred embodiment, in this embodiment, acceleration information and angular velocity information of the handle are first acquired; specifically, the acceleration and angular velocity of the movement of the handle (Ax1, Ay1, Az1, ωx1, ωy1, ωz1) are acquired, and as the handle continues to move, continuous acceleration and angular velocity information of the handle (Ax2, Ay2, Az2, ωx2, ωy2, ωz2), ..., (Axn, Ayn, Azn, ωxn, ωyn, ωzn) is obtained. It can be understood that the acceleration information and angular velocity information are acquired by a sensor in the handle of the current VR device.
Further, the position and the flip angle of the handle are updated according to the acceleration information and the angular velocity information; that is, the real-time handle position and flip information (Xs1, Ys1, Zs1, θx1, θy1, θz1) is obtained by integrating the acceleration and angular velocity information (Ax1, Ay1, Az1, ωx1, ωy1, ωz1) of the handle obtained above, and the handle position information is updated. As the handle keeps moving, the latest integrated handle positions (Xs2, Ys2, Zs2, θx2, θy2, θz2), ..., (Xsn, Ysn, Zsn, θxn, θyn, θzn) are obtained continuously, thereby realizing position tracking of the handle.
In this embodiment, acceleration information and angular velocity information of the handle are acquired, and the position and the flip angle of the handle are updated according to the acceleration information and the angular velocity information, so that position tracking of the handle is realized.
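One dead-reckoning step of this update could look like the sketch below. Gravity compensation and rotating the accelerometer readings into the tracking frame are omitted for brevity; the drift that such integration accumulates is exactly what the camera-based position confirmation above is used to correct.

    import numpy as np

    def integrate_imu(position, velocity, flip_angle, accel, gyro, dt):
        """Update the handle position and flip angle from one accelerometer and
        gyroscope sample over a time step dt (simplified: no gravity compensation)."""
        velocity = np.asarray(velocity, dtype=float) + np.asarray(accel, dtype=float) * dt
        position = np.asarray(position, dtype=float) + velocity * dt
        flip_angle = np.asarray(flip_angle, dtype=float) + np.asarray(gyro, dtype=float) * dt
        return position, velocity, flip_angle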
On the basis of the above-described embodiment:
as a preferred embodiment, the determining whether there is another apparatus having an image capturing function that meets a preset requirement includes:
judging whether there is other equipment with a camera shooting function within a preset distance of the current VR equipment;
if yes, it is confirmed that there is another device having an image pickup function that meets the preset requirement, and the process proceeds to step S12.
In the foregoing embodiment, the preset requirement is not limited. As a preferred embodiment, in this embodiment the preset requirement is to determine whether there is another device with a camera function within a preset distance of the current VR device; when such a device exists, the preset requirement is met.
Specifically, because the clarity of the images that the other devices with a camera function capture of the current VR device directly affects the acquisition of the position information of its head-mounted device and handle, the preset requirement is set so that the distance between the other device with a camera function and the current VR device satisfies the preset distance, so as to screen out the devices that meet the distance requirement. When it is confirmed that there is another device with a camera function within the preset distance of the current VR device, a handle position confirmation request is sent to that device, so as to obtain the position information of the head-mounted device and the handle of the current VR device. In this embodiment, the preset distance is not limited and depends on the specific implementation.
In this embodiment, when it is determined that there is another device with a camera function within the preset distance of the current VR device, it is confirmed that there is another device with a camera function meeting the preset requirement, and the process proceeds to step S12 so as to subsequently obtain the position information of the handle.
On the basis of the above-described embodiment:
as a preferred embodiment, after confirming that there is another device having an image capturing function that meets a preset requirement, the method further includes:
judging whether the obtained shooting angles of other equipment with the camera shooting function which accords with the preset distance are different;
and if so, sending a handle position confirmation request to other equipment with the camera shooting function.
As can be seen from the above embodiment, when there are a large number of other devices with a camera function meeting the preset requirement, obtaining the position information of the head-mounted device and the handle of the current VR device through all of them would waste device power. As a preferred embodiment, in this embodiment, after it is confirmed that there are other devices with a camera function meeting the preset requirement, it is determined whether the shooting angles of the obtained devices within the preset distance are different; if so, a handle position confirmation request is sent to those devices.
It can be understood that in the above embodiment the devices within the preset distance are screened out by determining whether there is another device with a camera function within the preset distance. The shooting angles of these devices towards the current VR device usually differ. In some cases, however, the selected devices may all have the same shooting angle towards the current VR device, for example they may all be directly in front of it; in that case the position information of the head-mounted device and the handle acquired by the different devices differs little, which amounts to repeated calculation and increases device power consumption.
Therefore, after the devices within the preset distance are screened out, it is also necessary to determine whether their shooting angles are different; for example, if one device is behind the current VR device and another is at its side, the reliability of the obtained position information is higher. If the angles are different, the step of sending a handle position confirmation request to the other devices with a camera function is entered, that is, the request is sent to the devices that are within the preset distance and have different shooting angles.
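A possible screening rule combining the preset distance with shooting-angle diversity is sketched below; the candidate representation, thresholds and device count are illustrative assumptions.

    def select_helper_devices(candidates, max_distance_m, min_angle_gap_deg, max_count=2):
        """Pick observing devices that are close enough and view the requesting device
        from sufficiently different directions. Each candidate is a tuple
        (device_id, distance_m, bearing_deg)."""
        nearby = sorted((c for c in candidates if c[1] <= max_distance_m), key=lambda c: c[1])
        selected = []
        for device in nearby:
            angle_ok = all(abs((device[2] - chosen[2] + 180.0) % 360.0 - 180.0) >= min_angle_gap_deg
                           for chosen in selected)
            if angle_ok:
                selected.append(device)
            if len(selected) == max_count:
                break
        return [d[0] for d in selected]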
In this embodiment, after it is confirmed that there are other devices with a camera function meeting the preset requirement, it is determined whether the shooting angles of the obtained devices within the preset distance are different; if so, the process proceeds to step S12. This reduces the amount of position information of the current VR device that needs to be obtained by the other devices with a camera function, lowers the device power consumption, and makes the obtained position information more reliable.
In the above embodiments, the VR handle position detecting method is described in detail, and the application also provides embodiments corresponding to the VR handle position detecting device and the VR device.
Fig. 4 is a schematic structural diagram of a VR handle position detecting device according to an embodiment of the present application. As shown in fig. 4, the VR handle position detecting device includes:
the acquisition module 10 is used for acquiring an image of the handle through a camera of the current VR equipment; wherein, the VR device includes a head-mounted device and a handle.
The judging module 11 is used for judging whether the handle is in the field angle range of the camera; if not, the sending module is triggered.
A sending module 12, configured to send a handle position confirmation request to the other device with the camera function, so as to obtain position information of the handle and the head-mounted device of the current VR device through the other device with the camera function.
And the confirming module 13 is used for confirming the position of the handle according to the position information.
Since the embodiments of the apparatus portion and the method portion correspond to each other, please refer to the description of the embodiments of the method portion for the embodiments of the apparatus portion, which is not repeated here.
Fig. 5 is a schematic structural diagram of a VR device provided in an embodiment of the present application, and as shown in fig. 5, the VR device includes:
a memory 20 for storing a computer program;
the processor 21 is configured to execute the computer program to implement the steps of the VR handle position detection method as mentioned in the above embodiments.
The VR device provided by this embodiment may include, but is not limited to, a smart phone, a tablet computer, a notebook computer, or a desktop computer.
The processor 21 may include one or more processing cores, such as a 4-core processor, an 8-core processor, and the like. The Processor 21 may be implemented in at least one hardware form of a Digital Signal Processor (DSP), a Field-Programmable Gate Array (FPGA), and a Programmable Logic Array (PLA). The processor 21 may also include a main processor and a coprocessor, where the main processor is a processor for Processing data in an awake state, and is also called a Central Processing Unit (CPU); a coprocessor is a low power processor for processing data in a standby state. In some embodiments, the processor 21 may be integrated with a Graphics Processing Unit (GPU), which is responsible for rendering and drawing the content required to be displayed on the display screen. In some embodiments, the processor 21 may further include an Artificial Intelligence (AI) processor for processing computational operations related to machine learning.
The memory 20 may include one or more computer-readable storage media, which may be non-transitory. Memory 20 may also include high speed random access memory, as well as non-volatile memory, such as one or more magnetic disk storage devices, flash memory storage devices. In this embodiment, the memory 20 is at least used for storing the following computer program 201, wherein after being loaded and executed by the processor 21, the computer program can implement the relevant steps of the VR handle position detection method disclosed in any one of the foregoing embodiments. In addition, the resources stored in the memory 20 may also include an operating system 202, data 203, and the like, and the storage manner may be a transient storage manner or a permanent storage manner. Operating system 202 may include, among others, Windows, Unix, Linux, and the like. The data 203 may include, but is not limited to, data related to VR handle position detection methods.
In some embodiments, the VR device may also include a display 22, an input-output interface 23, a communication interface 24, a power supply 25, and a communication bus 26.
Those skilled in the art will appreciate that the configuration shown in fig. 5 does not constitute a limitation of VR devices and may include more or fewer components than shown.
Finally, the application also provides a corresponding embodiment of the computer readable storage medium. The computer-readable storage medium has stored thereon a computer program which, when being executed by a processor, carries out the steps as set forth in the above-mentioned method embodiments.
It is to be understood that if the method in the above embodiments is implemented in the form of software functional units and sold or used as a stand-alone product, it can be stored in a computer readable storage medium. Based on such understanding, the technical solutions of the present application may be embodied in the form of a software product, which is stored in a storage medium and executes all or part of the steps of the methods described in the embodiments of the present application, or all or part of the technical solutions. And the aforementioned storage medium includes: various media capable of storing program codes, such as a usb disk, a portable hard disk, a Read-Only Memory (ROM), a Random Access Memory (RAM), a magnetic disk, or an optical disk.
The VR handle position detection method, apparatus, device, and medium provided by the present application are described in detail above. The embodiments are described in a progressive manner in the specification, each embodiment focuses on differences from other embodiments, and the same and similar parts among the embodiments are referred to each other. The device disclosed by the embodiment corresponds to the method disclosed by the embodiment, so that the description is simple, and the relevant points can be referred to the method part for description. It should be noted that, for those skilled in the art, it is possible to make several improvements and modifications to the present application without departing from the principle of the present application, and such improvements and modifications also fall within the scope of the claims of the present application.
It is further noted that, in the present specification, relational terms such as first and second, and the like are used solely to distinguish one entity or action from another entity or action without necessarily requiring or implying any actual such relationship or order between such entities or actions. Also, the terms "comprises," "comprising," or any other variation thereof, are intended to cover a non-exclusive inclusion, such that a process, method, article, or apparatus that comprises a list of elements does not include only those elements but may include other elements not expressly listed or inherent to such process, method, article, or apparatus. Without further limitation, an element defined by the phrase "comprising an … …" does not exclude the presence of other identical elements in a process, method, article, or apparatus that comprises the element.

Claims (10)

1. A VR handle position detection method, comprising:
acquiring an image of a handle through a camera of the current VR equipment; wherein the VR device comprises a head-mounted device and the handle;
judging whether the handle is in the field angle range of the camera or not;
if not, sending a handle position confirmation request to other equipment with a camera shooting function so as to obtain the position information of the head-mounted equipment and the handle of the current VR equipment through the other equipment with the camera shooting function;
and confirming the position of the handle according to the position information.
2. The VR handle position detection method of claim 1, wherein the obtaining the position information of the head-mounted equipment and the handle of the current VR equipment through the other equipment with the camera shooting function comprises:
receiving, by each piece of the other equipment with the camera shooting function, the handle position confirmation request;
starting the respective cameras according to the handle position confirmation request;
judging whether an image containing identification information of the current VR equipment can be shot, wherein the identification information is located on the surfaces of the head-mounted equipment and the handle, respectively;
and if so, acquiring the position information of the head-mounted equipment and the handle of the current VR equipment through the respective cameras according to the identification information.
3. The VR handle position detection method of claim 2, further comprising, before the sending a handle position confirmation request to other equipment with the camera shooting function:
judging whether other equipment with the camera shooting function meeting the preset requirement exists or not;
and if so, sending a handle position confirmation request to other equipment with the camera shooting function.
4. The VR handle position detection method of claim 1, further comprising, if the handle is determined to be within the field angle range of the camera:
acquiring distance information between the handle and the head-mounted equipment, and acquiring a turnover angle of the handle;
confirming the position of the handle according to the distance information and the turnover angle, so as to facilitate tracking the position of the handle;
and returning to the step of acquiring the image of the handle through the camera of the current VR equipment.
5. The VR handle position detection method of claim 4, wherein the step of tracking the position of the handle comprises:
acquiring acceleration information and angular velocity information of the handle;
and updating the position of the handle and the turnover angle according to the acceleration information and the angular velocity information so as to track the position of the handle.
6. The VR handle position detection method of claim 3, wherein the judging whether other equipment with the camera shooting function meeting the preset requirement exists comprises:
judging whether there is other equipment with the camera shooting function whose distance from the current VR equipment is within a preset distance;
and if so, confirming that other equipment with the camera shooting function meeting the preset requirement exists, and sending the handle position confirmation request to the other equipment with the camera shooting function.
7. The VR handle position detection method of claim 6, further comprising, after confirming that other equipment with the camera shooting function meeting the preset requirement exists:
judging whether the shooting angles of the other equipment with the camera shooting function within the preset distance differ from one another;
and if so, sending the handle position confirmation request to the other equipment with the camera shooting function.
8. A VR handle position detection device, comprising:
the acquisition module is used for acquiring an image of a handle through a camera of current VR equipment, wherein the current VR equipment comprises head-mounted equipment and the handle;
the judging module is used for judging whether the handle is in the field angle range of the camera or not; if not, triggering a sending module;
the sending module is used for sending a handle position confirmation request to other equipment with a camera shooting function, so as to obtain position information of the head-mounted equipment and the handle of the current VR equipment through the other equipment with the camera shooting function;
and the confirming module is used for confirming the position of the handle according to the position information.
9. A VR device, comprising:
a memory for storing a computer program;
a processor for implementing the steps of the VR handle position detection method as claimed in any one of claims 1 to 7 when the computer program is executed.
10. A computer-readable storage medium, having stored thereon a computer program which, when executed by a processor, performs the steps of the VR handle position detection method of any of claims 1 to 7.
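For illustration only, the inertial update referred to in claims 4 and 5 (updating the handle position and turnover angle from acceleration and angular velocity between camera confirmations) could take a form similar to the following Python sketch. The simple dead-reckoning scheme and all names used here are assumptions made for the example and are not part of the claims.

from dataclasses import dataclass, field

@dataclass
class HandleState:
    position: list = field(default_factory=lambda: [0.0, 0.0, 0.0])   # metres, headset frame
    velocity: list = field(default_factory=lambda: [0.0, 0.0, 0.0])   # metres per second
    turnover_angle: float = 0.0                                        # degrees

def imu_update(state, accel, angular_velocity, dt):
    # One dead-reckoning step: integrate acceleration into velocity and
    # position, and angular velocity into the turnover angle.
    for axis in range(3):
        state.position[axis] += state.velocity[axis] * dt
        state.velocity[axis] += accel[axis] * dt
    state.turnover_angle += angular_velocity * dt
    return state

# Example: a 10 ms IMU sample while the handle is outside the camera's view.
state = imu_update(HandleState(), accel=(0.0, 0.2, 0.0), angular_velocity=90.0, dt=0.01)
print(state.position, state.turnover_angle)

A real implementation of this kind would accumulate drift, so the estimate would be re-anchored whenever the handle re-enters the camera's field angle range or a position report arrives from another camera-equipped device.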
CN202210520123.2A 2022-05-13 2022-05-13 VR handle position detection method, device, equipment and medium Pending CN114973042A (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN202210520123.2A CN114973042A (en) 2022-05-13 2022-05-13 VR handle position detection method, device, equipment and medium

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
CN202210520123.2A CN114973042A (en) 2022-05-13 2022-05-13 VR handle position detection method, device, equipment and medium

Publications (1)

Publication Number Publication Date
CN114973042A true CN114973042A (en) 2022-08-30

Family

ID=82982655

Family Applications (1)

Application Number Title Priority Date Filing Date
CN202210520123.2A Pending CN114973042A (en) 2022-05-13 2022-05-13 VR handle position detection method, device, equipment and medium

Country Status (1)

Country Link
CN (1) CN114973042A (en)

Citations (7)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN108737720A (en) * 2018-04-11 2018-11-02 努比亚技术有限公司 Wearable device image pickup method, wearable device and computer readable storage medium
CN109445596A (en) * 2018-11-02 2019-03-08 北京盈迪曼德科技有限公司 Integrated mixed reality head-mounted display system
CN109613983A (en) * 2018-12-26 2019-04-12 青岛小鸟看看科技有限公司 Positioning method and device for a handle in a head-mounted display system, and head-mounted display system
CN109671118A (en) * 2018-11-02 2019-04-23 北京盈迪曼德科技有限公司 Virtual reality multi-user interaction method, apparatus and system
CN110262667A (en) * 2019-07-29 2019-09-20 上海乐相科技有限公司 Virtual reality device and positioning method
CN113318435A (en) * 2021-04-27 2021-08-31 青岛小鸟看看科技有限公司 Control method and device of handle control tracker and head-mounted display equipment
CN113721767A (en) * 2021-08-30 2021-11-30 歌尔光学科技有限公司 Handle tracking method, device, system and medium

Similar Documents

Publication Publication Date Title
CN111126182B (en) Lane line detection method, lane line detection device, electronic device, and storage medium
KR102595150B1 (en) Method for controlling multiple virtual characters, device, apparatus, and storage medium
EP3136204B1 (en) Image processing device and image processing method
EP2352078B1 (en) Information processing apparatus, information processing method, information recording medium, and program
CN108989678B (en) Image processing method and mobile terminal
CN104536579A (en) Interactive three-dimensional scenery and digital image high-speed fusing processing system and method
CN111324250B (en) Three-dimensional image adjusting method, device and equipment and readable storage medium
CN111833461B (en) Method and device for realizing special effect of image, electronic equipment and storage medium
EP3128413A1 (en) Sharing mediated reality content
CN111062255A (en) Three-dimensional point cloud labeling method, device, equipment and storage medium
WO2022052620A1 (en) Image generation method and electronic device
CN110968190B (en) IMU for touch detection
CN113160427A (en) Virtual scene creating method, device, equipment and storage medium
JP2015118442A (en) Information processor, information processing method, and program
CN114026606A (en) Fast hand meshing for dynamic occlusion
CN111127541B (en) Method and device for determining vehicle size and storage medium
CN108961424B (en) Virtual information processing method, device and storage medium
CN111833459B (en) Image processing method and device, electronic equipment and storage medium
CN108140124B (en) Prompt message determination method and device and electronic equipment
CN114973042A (en) VR handle position detection method, device, equipment and medium
CN113432620B (en) Error estimation method and device, vehicle-mounted terminal and storage medium
CN112822398B (en) Shooting method and device and electronic equipment
CN112711335B (en) Virtual environment picture display method, device, equipment and storage medium
CN114201028B (en) Augmented reality system and method for anchoring display virtual object thereof
CN113935678A (en) Method, device, equipment and storage medium for determining multiple distribution terminals held by distributor

Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination