CN110888529A - Virtual reality scene control method, virtual reality equipment and control device thereof - Google Patents


Info

Publication number
CN110888529A
CN110888529A
Authority
CN
China
Prior art keywords
virtual reality
control device
signal
finger
displaying
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Granted
Application number
CN201911128776.0A
Other languages
Chinese (zh)
Other versions
CN110888529B
Inventor
袁新焰
胡明明
朱振华
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Allwinner Technology Co Ltd
Original Assignee
Allwinner Technology Co Ltd
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Allwinner Technology Co Ltd filed Critical Allwinner Technology Co Ltd
Priority to CN201911128776.0A
Publication of CN110888529A
Application granted
Publication of CN110888529B
Legal status: Active

Classifications

    • G: PHYSICS
    • G06: COMPUTING; CALCULATING OR COUNTING
    • G06F: ELECTRIC DIGITAL DATA PROCESSING
    • G06F 3/00: Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
    • G06F 3/01: Input arrangements or combined input and output arrangements for interaction between user and computer
    • G06F 3/011: Arrangements for interaction with the human body, e.g. for user immersion in virtual reality
    • G06F 3/017: Gesture based interaction, e.g. based on a set of recognized hand gestures
    • G06F 3/03: Arrangements for converting the position or the displacement of a member into a coded form
    • G06F 3/033: Pointing devices displaced or positioned by the user, e.g. mice, trackballs, pens or joysticks; Accessories therefor
    • G06F 3/0346: Pointing devices displaced or positioned by the user, with detection of the device orientation or free movement in a 3D space, e.g. 3D mice, 6-DOF [six degrees of freedom] pointers using gyroscopes, accelerometers or tilt-sensors

Landscapes

  • Engineering & Computer Science (AREA)
  • General Engineering & Computer Science (AREA)
  • Theoretical Computer Science (AREA)
  • Human Computer Interaction (AREA)
  • Physics & Mathematics (AREA)
  • General Physics & Mathematics (AREA)
  • Position Input By Displaying (AREA)
  • User Interface Of Digital Computer (AREA)

Abstract

The invention discloses a virtual reality scene control method, virtual reality equipment and a control device thereof. The method comprises the steps of receiving a form signal and a finger position signal sent by a virtual reality control device, displaying an image of the virtual reality control device on a display screen according to the form signal and the finger position signal, and displaying an image of a virtual finger; and receiving a control signal sent by the virtual reality control device, and executing a preset operation corresponding to the control signal. The virtual reality equipment provided by the invention has a processor and a memory, and the processor implements the virtual reality scene control method when executing a computer program. The invention enables the user to intuitively know the form of the control device of the virtual reality equipment and the position of the finger, preventing the user from pressing a wrong key or mistakenly operating the touch pad.

Description

Virtual reality scene control method, virtual reality equipment and control device thereof
Technical Field
The invention relates to the technical field of virtual reality, in particular to a virtual reality scene control method, virtual reality equipment for realizing the method and a control device used by the virtual reality equipment.
Background
A virtual reality (VR) device displays a virtual image through a display screen and incorporates sensor technology, for example acquiring the motion of the human body through sensors so that the displayed image varies with that motion. Because existing virtual reality equipment brings users a good visual experience, it has been widely accepted. However, most existing virtual reality devices are not provided with a control device such as a control handle, so the user experience is poor, and this backward interactive experience cannot meet increasingly complex interactive scene requirements.
At present, the most common virtual reality equipment is head-mounted, for example VR glasses or a VR helmet. Compared with smart devices such as smart phones and tablet computers, existing virtual reality equipment has no touch screen and therefore cannot respond quickly to clicks; that is, the user cannot provide input by tapping a touch screen during use. In addition, while using a virtual reality device the eyes usually cannot see real-world objects, and existing virtual reality devices have no eye-movement detection, so the eyes cannot participate in the whole interaction process, which affects the user experience.
For this reason, some virtual reality devices are currently provided with an interactive handle. For example, Chinese patent application No. CN201610266980 describes a handle for a virtual reality device; the handle is provided with a sensor and keys and can communicate wirelessly with the virtual reality device. When using such a control device, the user operates the handle by hand, and the handle sends a control signal to the virtual reality equipment so as to control its operation.
However, when using the virtual reality device the user usually wears VR glasses or a VR helmet, with the eyes looking at the display screen of the virtual reality device, so the shape of the handle and the positions of its keys are often difficult to observe, which makes the handle inconvenient to operate. Since the user cannot see the positions of the keys on the handle, it is often unclear which key a finger is pressing, and pressing the wrong key easily causes the virtual reality device to execute an operation other than the one the user intended, impairing usability.
Disclosure of Invention
The invention mainly aims to provide a virtual reality scene control method which is convenient for a user to watch a control device.
Another object of the present invention is to provide a virtual reality device capable of implementing the above virtual reality scene control method.
Still another object of the present invention is to provide a control device used with the above virtual reality apparatus.
In order to achieve the above main object, the virtual reality scene control method provided by the present invention includes receiving a form signal and a finger position signal sent by a virtual reality control device, displaying an image of the virtual reality control device on a display screen according to the form signal and the finger position signal, and displaying an image of a virtual finger; and receiving a control signal sent by the virtual reality control device, and executing a preset operation corresponding to the control signal.
According to this scheme, when a user operates the control device of the virtual reality equipment, such as the handle, an image of the control device and an image of the virtual finger are displayed on the display screen of the virtual reality equipment, so that the user knows the positional relationship between the finger and the control device and can operate the control device easily; for example, the key that the user intends to press can be pressed more accurately, so the virtual reality equipment executes the expected operation.
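For illustration only, the flow just described can be condensed into a short device-side sketch; the message fields, helper names and operation bindings below are assumptions and not part of the disclosed method.

```python
# Minimal sketch of the flow described above (assumed message layout and names).

def handle_controller_message(msg, display_state, preset_operations):
    if msg["type"] == "form":
        # Form signal: redraw the control device image with the reported pose.
        display_state["controller_pose"] = msg["pose"]
    elif msg["type"] == "finger_position":
        # Finger position signal: draw the virtual finger over the reported region.
        display_state["finger_region"] = msg["region"]
    elif msg["type"] == "control":
        # Control signal: execute the preset operation bound to its source region.
        preset_operations.get(msg["region"], lambda: None)()

state = {}
handle_controller_message({"type": "form", "pose": {"pitch": 15, "roll": 0}}, state, {})
handle_controller_message({"type": "finger_position", "region": "KEY_A"}, state, {})
handle_controller_message({"type": "control", "region": "KEY_A"}, state,
                          {"KEY_A": lambda: print("preset operation for key A")})
print(state)  # shows the pose and finger region that would drive the rendering
```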
Preferably, receiving the finger position signal includes: receiving an operation preparation signal sent by the virtual reality control device, and displaying a region to be operated in the control device image of the virtual reality equipment in a first display mode.
Therefore, when a user brings a finger close to a key to be controlled or touches the key, the key is displayed on a display screen of the virtual reality device in a preset display mode, and the user knows that the finger is about to press the key.
Further, receiving the finger position signal comprises: receiving an operation signal sent by the virtual reality control device, and displaying an operation area in the control device image of the virtual reality equipment in a second display mode.
Therefore, after a user presses a certain key or touches the control device touch screen and sends a control instruction, the key or the touch area is displayed on the display screen of the virtual reality device, so that the user can know the sent control instruction.
Further, displaying the region to be operated in the control device image of the virtual reality equipment in the first display mode includes: displaying the region to be operated in a first color; and displaying the operation region in the control device image of the virtual reality equipment in the second display mode includes: displaying the operation region in a second color.
Therefore, when the control device sends out the operation preparation signal or the operation signal, the to-be-operated area or the operation area is displayed in different colors, and a user can know the position of the finger on the control device more intuitively.
In a further aspect, the virtual reality control device includes at least one key; displaying the region to be operated in the control device image of the virtual reality equipment in the first display mode includes: displaying the key corresponding to the image of the virtual finger in a first color; and displaying the operation region in the control device image of the virtual reality equipment in the second display mode includes: displaying the key corresponding to the image of the virtual finger in a second color.
Therefore, the keys to be operated by the user or the keys pressed by the user are displayed in different colors respectively, so that the user can more intuitively know what the currently operated keys are, and the user can clearly know the operation executed by the virtual reality equipment.
In order to achieve the above another object, the present invention further provides a virtual reality device, which includes a processor, a memory and a display screen, wherein the memory stores a computer program, and the computer program implements the steps of the virtual reality scene control method when executed by the processor.
In order to achieve the above still another object, the present invention further provides a control device for virtual reality equipment, including a housing, wherein an operation area is arranged on the housing and a wireless signal transceiver is arranged in the housing; a form sensor and a finger position sensor are also arranged in the housing, and the wireless signal transceiver receives a form signal sent by the form sensor and a finger position signal sent by the finger position sensor and sends the form signal and the finger position signal to the virtual reality equipment.
By the scheme, the control device can send the form signal and the finger position signal to the virtual reality equipment, and after the virtual reality equipment receives the signal, the image and the finger position of the control device can be displayed on the display screen, so that a user can know the relative position between the finger and the control device conveniently, and the operation of the user is facilitated.
Preferably, when the finger position sensor detects that a finger is close to the operation area, the wireless signal transceiver sends an operation preparation signal to the virtual reality equipment, and when the finger position sensor detects that the finger presses or touches the operation area, the wireless signal transceiver sends an operation signal to the virtual reality equipment.
Therefore, when the fingers of the user approach the operation area or touch the operation area, the control device respectively sends the operation preparation signal and the operation signal to the virtual reality equipment, and the virtual reality equipment can respectively display the operation area in different display modes, so that the user can visually know the keys to be operated.
Drawings
FIG. 1 is a block diagram of a virtual reality device according to an embodiment of the present invention.
Fig. 2 is a schematic structural diagram of an embodiment of a virtual reality device control apparatus according to the present invention.
Fig. 3 is a schematic structural diagram of another view angle of the virtual reality device control apparatus according to an embodiment of the present invention.
Fig. 4 is a flowchart of a virtual reality scene control method according to an embodiment of the present invention.
Fig. 5 is a flowchart of displaying a virtual finger image in an embodiment of the virtual reality scene control method of the present invention.
The invention is further explained with reference to the drawings and the embodiments.
Detailed Description
The virtual reality scene control method of the invention is applied to virtual reality equipment, such as VR glasses or a VR helmet. The virtual reality equipment can receive signals from a control device, for example signals from a handle, and display an image of the virtual control device on a display screen according to those signals.
Referring to fig. 1, the virtual reality equipment of the present invention has a processor 10, a memory 11, a display screen 12 and a wireless signal transceiver 13. Preferably, a computer program is stored in the memory 11; after the processor 10 reads and executes the computer program, the virtual reality scene control method of the present invention can be implemented, and the specific steps of the method are described in detail below.
The display screen 12 of the virtual reality device may be an LED display screen or an LCD display screen, and may receive the signal sent by the processor 10, and display a corresponding image according to the signal sent by the processor 10, for example, display an image of a preset scene, and after the user wears VR glasses or a VR helmet, the user may view the corresponding image through the display screen 12, so that the user has a feeling of being personally on the scene.
Because the virtual reality device of the present invention needs to communicate with the control device, such as the control handle, the virtual reality device is provided with the wireless signal transceiver 13, in this embodiment, the wireless signal transceiver 13 is a bluetooth module, and correspondingly, the wireless signal transceiver on the control handle is also a bluetooth module. When the virtual reality device works, the wireless signal transceiver 13 communicates with the bluetooth module on the control handle, and realizes the transmission of wireless signals through the bluetooth module.
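The patent does not specify how signals are serialized over the Bluetooth link; the sketch below assumes a simple JSON payload purely to make the exchange concrete, and the field names are hypothetical.

```python
# Hypothetical wire format for the Bluetooth link between the control handle and
# the virtual reality device; JSON is an assumption made only for illustration.
import json

def encode_signal(signal_type, **fields):
    """Pack a handle-side signal (form, finger position, preparation, operation)."""
    return json.dumps({"type": signal_type, **fields}).encode("utf-8")

def decode_signal(raw_bytes):
    """Unpack a signal received by the wireless signal transceiver 13."""
    return json.loads(raw_bytes.decode("utf-8"))

packet = encode_signal("operation_preparation", region="KEY_A")
print(decode_signal(packet))  # {'type': 'operation_preparation', 'region': 'KEY_A'}
```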
Referring to fig. 2 and 3, in the present embodiment, the control device used in cooperation with the virtual reality equipment is a control handle 20. The control handle 20 has a casing 25; a touch pad 21 is disposed at one end of the casing 25, a key region 22 is disposed on the casing 25, and four keys 23 are disposed in the key region 22, for example keys A, B, C and D. When the user needs to control the virtual reality device, the control handle 20 can be held by hand, and the virtual reality device can be controlled by pressing the keys on the control handle 20.
In order to detect the form of the control handle 20, a plurality of form sensors are provided in it. The form sensors of the present embodiment are used to detect the form of the control handle 20; for example, sensors such as a gravity sensor, a multi-axis acceleration sensor and an electronic gyroscope are provided in the casing 25 to detect the rotational posture of the control handle 20, and these form sensors may constitute the inertial sensor system 27 of the present embodiment. Preferably, the inertial sensor system 27 is enclosed within the casing 25 and may be integrated into a module and arranged in a waterproof, sealed manner.
The control handle 20 needs to detect not only its own form but also the movement of the user's hand, for example whether the user's hand holds the control handle 20 and whether the user's finger presses a certain key. Therefore, the control handle 20 is further provided with a finger position sensor, which includes a temperature sensor 28 provided on the side wall of the casing 25. When the user holds the control handle 20 with a finger placed on the side of the casing 25, the temperature sensor 28 can detect the change in temperature and thereby determine that the user's finger is placed on the side wall of the handle 20. Of course, an infrared sensor may also be provided on the side wall of the casing 25 to detect whether a user's finger is placed there, or the infrared sensor may be used alone for detection.
Of course, the temperature sensor and the infrared sensor are not limited to those provided on the side wall of the control handle 20, and the temperature sensor and the infrared sensor may be provided at other positions of the casing 25 of the control handle 20, and the positions of the plurality of fingers may be determined by signals detected by the plurality of temperature sensors and the infrared sensor.
In addition, since the user must use a finger to press a key 23 when manipulating the control handle 20, a rectangular touch pad may be disposed on the key region 22; after the user's finger touches the key region 22, the touch sensor receives a touch signal of the user's finger, so as to determine the position of the finger on the key region 22. For example, a capacitive touch pad may be used; when the distance between the user's finger and the key region 22 is less than a preset distance, for example 2 cm, the capacitive touch pad detects the signal of the user's finger.
Further, a touch pad, for example a capacitive touch pad, is provided on the surface of each key 23, and when a user's finger approaches or touches the key 23, the touch pad can detect the motion of the finger. Since the user's finger has not pressed the key, it may be determined that the user has not actually issued an operation signal; but since the finger has approached or touched the key 23, indicating that the user may intend to press it, the touch pad may issue a signal, such as a signal that a finger is approaching a certain key 23. The processor in the control handle 20 receives the signal sent by the touch pad and forwards it to the virtual reality device through a wireless signal transceiver, such as a Bluetooth module.
Preferably, when the user approaches a certain key 23 but does not press the key 23, the signal sent by the control handle 20 to the virtual reality device is an operation preparation signal, and the operation preparation signal includes a signal that the finger of the user approaches the certain key 23.
If the user presses a certain key 23, it indicates that the user sends an actual operation signal, and at this time, after the processor in the control handle 20 receives the signal that the user presses a certain key 23, the processor sends the operation signal that the user presses a certain key 23 to the virtual reality device through the wireless signal transceiver.
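A minimal sketch of this key-sensing behaviour on the handle side follows; the event and signal names are assumptions, but the rule matches the text: a touch without a press yields an operation preparation signal, and a press yields an operation signal.

```python
# Handle-side sketch: turn key-surface sensing into the two signals described above.

def classify_key_event(key_id, touched, pressed):
    if pressed:
        # The key was actually pressed: send an operation signal naming the key.
        return {"type": "operation", "region": key_id}
    if touched:
        # Finger resting on or near the key surface: send an operation preparation signal.
        return {"type": "operation_preparation", "region": key_id}
    return None  # nothing to report

print(classify_key_event("KEY_A", touched=True, pressed=False))  # preparation
print(classify_key_event("KEY_A", touched=True, pressed=True))   # operation
```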
In addition, the circular touch pad 21 disposed on the top of the control handle 20 may receive a touch signal of a user's finger, for example, when the user's finger approaches a certain region of the touch pad 21, the touch pad 21 transmits a signal that the finger approaches the region to the virtual reality device. Preferably, when the touch panel 21 detects that the distance between the finger and the touch panel 21 is less than the first threshold value and greater than the second threshold value, it is determined that the user is ready to perform an operation on the touch panel 21, and at this time, an operation preparation signal may be issued. For example, when the distance between the user's finger and the touch panel 21 is between 2 cm and 5 cm, indicating that the user intends to perform a touch operation on the touch panel 21, but the user does not actually issue a control signal, the control handle 20 may issue an operation preparation signal to the virtual reality device.
If the finger of the user touches the touch pad 21, for example, slides to form a certain track, double-clicks the touch pad 21, etc., indicating that the user has sent an instruction for an actual operation, at this time, the touch pad 21 detects a specific position touched by the user on the touch pad 21, forms specific data of the touch track or a position of double-click, and sends an operation signal to the virtual reality device.
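The touch pad behaviour can be sketched the same way. The 2 cm and 5 cm bounds follow the example given above; the function and field names are assumptions.

```python
# Handle-side sketch for touch pad 21: hovering in the threshold band prepares an
# operation, while an actual touch producing a track or double click is an operation.

FIRST_THRESHOLD_CM = 5.0   # farther than this, the finger is ignored
SECOND_THRESHOLD_CM = 2.0  # closer than this, the finger counts as touching

def classify_touchpad_state(distance_cm, track=None):
    if track is not None:
        # Finger on the pad producing a slide track or a double click.
        return {"type": "operation", "track": track}
    if SECOND_THRESHOLD_CM < distance_cm < FIRST_THRESHOLD_CM:
        # Finger hovering between 2 cm and 5 cm: the user is preparing to operate.
        return {"type": "operation_preparation"}
    return None

print(classify_touchpad_state(3.5))                              # preparation signal
print(classify_touchpad_state(0.0, track=[(10, 12), (14, 30)]))  # operation with track data
```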
It is understood that, in the present embodiment, the touch pad 21 and the key area 22 constitute an operation area of the present embodiment, and the finger of the user performs pressing, touching, sliding, clicking, and other operations in the operation area and sends out a control signal.
In this way, the virtual reality device can display the virtual image of the control handle 20 on the display 12 after receiving the form signal of the control handle 20 sent by the control handle 20, and can display the image of the virtual finger on the display 12 after receiving the finger position signal of the user, and the user can know the relative position relationship between the finger and the control handle 20 through the image displayed on the display 12.
Moreover, the virtual reality device may display the region to be operated in a first display manner after receiving the operation preparation signal transmitted from the control handle 20, and display the operation region in a second display manner after receiving the operation signal, so that the user knows where the finger is placed on the control handle 20.
The virtual reality scene control method of the present invention is described with reference to fig. 4 and 5. First, step S1 is executed: the virtual reality device receives a form signal and a finger position signal of a control device, which in this embodiment is the control handle 20 used by the user. The inertial sensor system 27 provided in the control handle 20 detects the form of the control handle 20, for example whether the control handle 20 is placed horizontally, vertically or at an incline, and if it is inclined, the inclination angle with respect to the horizontal direction, the inclination angle in the vertical direction, and the like. In addition, the control handle 20 can detect whether its head portion, that is, the end where the touch pad 21 is located, faces upward or downward, and whether the side where the key region 22 is located faces upward or downward, and the inertial sensor system 27 transmits the detected signals to the virtual reality device.
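The embodiment only states that the inertial sensor system 27 reports the handle's orientation; one common way to obtain tilt angles from a gravity-dominated accelerometer sample is sketched below as an assumption, not as the disclosed computation.

```python
# Illustrative tilt estimate from a 3-axis accelerometer reading (assumed method).
import math

def tilt_from_accelerometer(ax, ay, az):
    """Return (pitch, roll) in degrees for a handle that is roughly at rest."""
    pitch = math.degrees(math.atan2(-ax, math.hypot(ay, az)))
    roll = math.degrees(math.atan2(ay, az))
    return pitch, roll

print(tilt_from_accelerometer(0.0, 0.0, 9.81))  # lying flat: (0.0, 0.0)
print(tilt_from_accelerometer(0.0, 9.81, 0.0))  # rolled 90 degrees about its long axis
```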
The control handle 20 also detects the position of the user's fingers and forms a finger position signal, for example detecting which area of the control handle 20 the user's fingers are touching and how many fingers are touching it. Since a number of temperature sensors, infrared sensors or touch pads are provided on the control handle 20, the positions of several of the user's fingers on the control handle 20 can be detected accurately.
After the virtual reality device receives the form signal and the finger position signal transmitted from the control handle 20, step S2 is executed to display the image of the control handle 20 and the image of the virtual finger on the display screen 12 based on those signals. Preferably, the images of the control handle 20 and the virtual finger are displayed in a semi-transparent manner in a preset area of the display screen 12, for example in one corner, and the image display area of the control handle 20 occupies no more than 20% of the area of the entire display screen 12, so as to reduce interference with the user's viewing of other images.
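How the overlay is sized and placed is not prescribed beyond the corner placement, semi-transparency and the 20% area cap; the sketch below makes one possible choice, and the helper name and alpha value are assumptions.

```python
# One possible placement of the handle overlay: bottom-right corner, capped at 20%
# of the screen area, drawn semi-transparently (alpha value assumed).

def controller_overlay_rect(screen_w, screen_h, max_area_ratio=0.20):
    """Return (x, y, w, h) for an overlay anchored in the bottom-right corner."""
    overlay_w, overlay_h = screen_w // 3, screen_h // 3
    while overlay_w * overlay_h > max_area_ratio * screen_w * screen_h:
        overlay_w, overlay_h = int(overlay_w * 0.9), int(overlay_h * 0.9)
    return screen_w - overlay_w, screen_h - overlay_h, overlay_w, overlay_h

OVERLAY_ALPHA = 0.5  # semi-transparent blending factor, an illustrative value
print(controller_overlay_rect(1920, 1080))  # (1280, 720, 640, 360), about 11% of the screen
```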
Since the finger position signal sent by the control handle 20 to the virtual reality device further includes the operation preparation signal and the operation signal, the corresponding regions on the control handle 20 may be displayed in different display manners so that the user can see which key the finger is pressing or which region it is touching. Referring to fig. 5, the virtual reality device first performs step S11 to determine whether an operation preparation signal has been received, and if so, performs step S12.
For example, the user rests a finger on the surface of key A but does not press it, which indicates that the user intends to press key A and issue the corresponding control instruction. The touch sensor on the control handle 20 detects the signal that the user's finger is touching key A, which is an operation preparation signal, and transmits the operation preparation signal to the virtual reality device.
After receiving the signal that the user's finger touches key A, the virtual reality device executes step S12 to display the region to be operated in the first display mode. In this embodiment, the to-be-operated area is the area corresponding to the operation preparation signal; for example, if the operation preparation signal indicates that a finger touches key A, the to-be-operated area is the area where key A is located. Therefore, in step S12, the area where key A is located may be displayed in a first preset color, such as green, to prompt the user that the key 23 touched by the finger is key A.
Then, step S13 is executed: the virtual reality device determines whether an operation signal is received; if so, step S14 is executed, otherwise the process returns to step S11. In the present embodiment, the operation signal is an actual control signal sent by the user through a finger press or touch operation, for example pressing key A, or a double-click or sliding operation performed on the touch pad 21.
If the virtual reality device receives the operation signal, step S14 is executed to display the operation region in the second display mode. In this embodiment, the operation area is the area corresponding to the operation signal; for example, after the user presses key A, the operation area is the area corresponding to key A. Therefore, in step S14, the operation region may be displayed in a second preset color, for example displaying the area where key A is located in red. It should be noted that the color used in the second display mode is different from the color used in the first display mode, and preferably the two colors have a large color difference, such as red and green, or blue and gray.
Of course, the first display mode and the second display mode need not differ in display color; other display modes may be used. For example, the first display mode may add a green border line to the contour of the region to be operated and display it in a blinking manner, while the second display mode adds a red border line to the contour of the operation region without blinking.
In addition, the region to be operated and the operation region are not necessarily the region where a key is located; they may be a certain region on the touch pad 21, such as the region a finger approaches or touches. If the finger slides on the touch pad 21 to form a sliding track, the region the sliding track passes through is taken as the operation region, and the region of the sliding track is displayed in the second display manner.
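Steps S11 to S14 amount to choosing a display mode per region from the last signal received for it. The colour and border values below follow the examples in the text; the data structures themselves are assumptions.

```python
# Sketch of steps S11-S14: map the last signal for a region to a display mode.

FIRST_DISPLAY_MODE = {"color": "green", "border": "green", "blinking": True}  # region to be operated
SECOND_DISPLAY_MODE = {"color": "red", "border": "red", "blinking": False}    # operated region

def display_mode_for(signal_type):
    if signal_type == "operation_preparation":
        return FIRST_DISPLAY_MODE   # S12: highlight the region the finger is about to operate
    if signal_type == "operation":
        return SECOND_DISPLAY_MODE  # S14: highlight the region that was actually operated
    return None                     # no highlight for this region

print(display_mode_for("operation_preparation"))
print(display_mode_for("operation"))
```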
Referring to fig. 4, after the virtual reality device generates and displays the virtual image of the control handle 20 and the virtual image of the finger, step S3 is executed to determine whether a control signal transmitted from the control handle 20 has been received, for example the user pressing a certain key, double-clicking or sliding on the touch pad 21 to form a track, or shaking the control handle 20. After the virtual reality device receives the control signal, step S4 is executed, and the preset operation corresponding to the control signal is performed. For example, when the user presses key A, an operation corresponding to key A is performed, such as displaying the next virtual scene, fast forwarding the displayed video by 5 seconds, or increasing the volume.
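Step S4 is essentially a lookup from the received control signal to a preset operation. The bindings below reuse the examples just given (next scene, 5-second fast forward, volume up); the function names are assumptions.

```python
# Sketch of step S4: dispatch a received control signal to its preset operation.

def show_next_scene():
    print("displaying the next virtual scene")

def fast_forward_5s():
    print("fast forwarding the displayed video by 5 seconds")

def volume_up():
    print("increasing the volume")

PRESET_OPERATIONS = {
    "KEY_A": show_next_scene,   # example bindings only
    "KEY_B": fast_forward_5s,
    "KEY_C": volume_up,
}

def execute_control_signal(signal):
    operation = PRESET_OPERATIONS.get(signal.get("region"))
    if operation is not None:
        operation()

execute_control_signal({"type": "operation", "region": "KEY_A"})
```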
Finally, step S5 is executed to determine whether to terminate control, for example when the user signals termination through the control handle 20 or removes the VR glasses or VR helmet. If the user instructs termination, control of the virtual reality device ends, and the control handle 20 no longer sends signals to the virtual reality device.
It can be seen that in the scheme of the present invention, the sensors on the control handle 20 detect its form and the position of the user's fingers and send the detected signals to the virtual reality device, and the virtual reality device displays the form of the control handle 20, the finger position, the region to be operated and the operation region on the display screen 12 according to the received signals. This lets the user view the position of the finger on the control handle 20 through the display screen 12 and avoids operation errors.
In addition, the invention displays the area to be operated and the operation area in different display modes, so that the user can very intuitively know the area currently touched by the finger, which key is pressed, or the track formed by the touch. Once the user finds that the finger is placed on the wrong key, the user can move the finger away in time, which effectively prevents the user from issuing wrong control instructions and greatly improves the user experience.
Of course, the above-mentioned embodiments are only preferred embodiments of the present invention, and many variations can be made in practical applications; for example, the shape of the control handle can be designed according to actual needs, or the form sensor and the finger position sensor arranged on the control handle can be adjusted according to use needs. These variations do not affect the implementation of the present invention and should be included in the protection scope of the present invention.

Claims (10)

1. The virtual reality scene control method is characterized by comprising the following steps:
receiving a form signal and a finger position signal sent by a virtual reality control device, displaying an image of the virtual reality control device on a display screen according to the form signal and the finger position signal, and displaying an image of a virtual finger;
and receiving a control signal sent by the virtual reality control device, and executing a preset operation corresponding to the control signal.
2. The virtual reality scene control method according to claim 1, wherein:
receiving the finger position signal comprises: and receiving an operation preparation signal sent by the virtual reality control device, and displaying a region to be operated in the virtual reality control device image in a first display mode.
3. The virtual reality scene control method according to claim 2, wherein:
receiving the finger position signal comprises: and receiving an operation signal sent by the virtual reality control device, and displaying an operation area in the control device image of the virtual reality equipment in a second display mode.
4. The virtual reality scene control method according to claim 3, wherein:
the method for displaying the area to be operated in the control device image of the virtual reality equipment in the first display mode comprises the following steps: displaying a region to be operated in a control device image of the virtual reality equipment in a first color;
the displaying the operation area in the control device image of the virtual reality equipment in the second display mode comprises the following steps: and displaying the operation area in the control device image of the virtual reality equipment in a second color.
5. The virtual reality scene control method of claim 3 or 4, wherein:
the virtual reality control device comprises at least one key;
the method for displaying the area to be operated in the control device image of the virtual reality equipment in the first display mode comprises the following steps: displaying a key corresponding to the image of the virtual finger in a first color;
the displaying the operation area in the control device image of the virtual reality equipment in the second display mode comprises the following steps: and displaying the key corresponding to the image of the virtual finger in a second color.
6. Virtual reality device, characterized in that it comprises a processor, a memory and a display screen, the memory storing a computer program which, when executed by the processor, carries out the steps of the virtual reality scene control method according to any one of claims 1 to 5.
7. A control device for virtual reality equipment, comprising
a shell, wherein an operation area is arranged on the shell, and a wireless signal transceiver is arranged in the shell;
characterized in that:
a form sensor and a finger position sensor are further arranged in the shell; the wireless signal transceiver receives a form signal sent by the form sensor and a finger position signal sent by the finger position sensor, and sends the form signal and the finger position signal to the virtual reality equipment.
8. The control device of a virtual reality apparatus according to claim 7, wherein:
when the finger position sensor detects that a finger approaches the operation area, the wireless signal transceiver sends an operation preparation signal to the virtual reality equipment.
9. The control device of a virtual reality apparatus according to claim 8, wherein:
when the finger position sensor detects that a finger presses or touches the operation area, the wireless signal transceiver sends an operation signal to the virtual reality equipment.
10. The control apparatus for a virtual reality device according to any one of claims 7 to 9, wherein:
the operation area is a key area, and the finger position sensor comprises a touch sensor arranged near the key area.
CN201911128776.0A 2019-11-18 2019-11-18 Virtual reality scene control method, virtual reality device and control device thereof Active CN110888529B (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN201911128776.0A CN110888529B (en) 2019-11-18 2019-11-18 Virtual reality scene control method, virtual reality device and control device thereof

Publications (2)

Publication Number Publication Date
CN110888529A 2020-03-17
CN110888529B 2023-11-21

Family

ID=69747868

Family Applications (1)

Application Number Title Priority Date Filing Date
CN201911128776.0A Active CN110888529B (en) 2019-11-18 2019-11-18 Virtual reality scene control method, virtual reality device and control device thereof

Country Status (1)

Country Link
CN (1) CN110888529B

Patent Citations (11)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN204291252U (en) * 2014-11-14 2015-04-22 西安中科微光医疗技术有限公司 A kind of panoramic map display system based on virtual implementing helmet
US20170293351A1 (en) * 2016-04-07 2017-10-12 Ariadne's Thread (Usa), Inc. (Dba Immerex) Head mounted display linked to a touch sensitive input device
CN105975061A (en) * 2016-04-26 2016-09-28 乐视控股(北京)有限公司 Control method and apparatus for virtual reality scene as well as handle
US20170336882A1 (en) * 2016-05-17 2017-11-23 Google Inc. Virtual/augmented reality input device
CN106354412A (en) * 2016-08-30 2017-01-25 乐视控股(北京)有限公司 Input method and device based on virtual reality equipment
CN106445166A (en) * 2016-10-19 2017-02-22 歌尔科技有限公司 Virtual reality helmet and method of switching display information of virtual reality helmet
CN106484119A (en) * 2016-10-24 2017-03-08 网易(杭州)网络有限公司 Virtual reality system and virtual reality system input method
CN106873785A (en) * 2017-03-31 2017-06-20 网易(杭州)网络有限公司 For the safety custody method and device of virtual reality device
CN107291359A (en) * 2017-06-06 2017-10-24 歌尔股份有限公司 A kind of input method, device and system
US20190033960A1 (en) * 2017-07-27 2019-01-31 Htc Corporation Method of Display User Movement in Virtual Reality System and Related Device
CN109407935A (en) * 2018-09-14 2019-03-01 歌尔科技有限公司 A kind of virtual reality display control method, device and system

Cited By (2)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN113778525A (en) * 2021-09-16 2021-12-10 中国南方电网有限责任公司超高压输电公司昆明局 Air switch control method and device, computer equipment and storage medium
CN113778525B (en) * 2021-09-16 2024-04-26 中国南方电网有限责任公司超高压输电公司昆明局 Air-break control monitoring method and device based on lora communication and computer equipment

Also Published As

Publication number Publication date
CN110888529B (en) 2023-11-21


Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination
GR01 Patent grant