CN112083796A - Control method, head-mounted device, mobile terminal and control system - Google Patents


Info

Publication number
CN112083796A
CN112083796A (application CN201910506685.XA)
Authority
CN
China
Prior art keywords
control
mobile terminal
head
display
processor
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Pending
Application number
CN201910506685.XA
Other languages
Chinese (zh)
Inventor
Du Peng (杜鹏)
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Guangdong Oppo Mobile Telecommunications Corp Ltd
Original Assignee
Guangdong Oppo Mobile Telecommunications Corp Ltd
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Guangdong Oppo Mobile Telecommunications Corp Ltd filed Critical Guangdong Oppo Mobile Telecommunications Corp Ltd
Priority to CN201910506685.XA priority Critical patent/CN112083796A/en
Publication of CN112083796A publication Critical patent/CN112083796A/en
Pending legal-status Critical Current

Classifications

    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06FELECTRIC DIGITAL DATA PROCESSING
    • G06F3/00Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
    • G06F3/01Input arrangements or combined input and output arrangements for interaction between user and computer
    • G06F3/017Gesture based interaction, e.g. based on a set of recognized hand gestures
    • GPHYSICS
    • G02OPTICS
    • G02BOPTICAL ELEMENTS, SYSTEMS OR APPARATUS
    • G02B27/00Optical systems or apparatus not provided for by any of the groups G02B1/00 - G02B26/00, G02B30/00
    • G02B27/01Head-up displays
    • G02B27/017Head mounted
    • GPHYSICS
    • G02OPTICS
    • G02BOPTICAL ELEMENTS, SYSTEMS OR APPARATUS
    • G02B27/00Optical systems or apparatus not provided for by any of the groups G02B1/00 - G02B26/00, G02B30/00
    • G02B27/01Head-up displays
    • G02B27/017Head mounted
    • G02B27/0172Head mounted characterised by optical features
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06FELECTRIC DIGITAL DATA PROCESSING
    • G06F3/00Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
    • G06F3/01Input arrangements or combined input and output arrangements for interaction between user and computer
    • G06F3/048Interaction techniques based on graphical user interfaces [GUI]
    • G06F3/0484Interaction techniques based on graphical user interfaces [GUI] for the control of specific functions or operations, e.g. selecting or manipulating an object, an image or a displayed text element, setting a parameter value or selecting a range
    • G06F3/04847Interaction techniques to control parameter settings, e.g. interaction with sliders or dials
    • GPHYSICS
    • G02OPTICS
    • G02BOPTICAL ELEMENTS, SYSTEMS OR APPARATUS
    • G02B27/00Optical systems or apparatus not provided for by any of the groups G02B1/00 - G02B26/00, G02B30/00
    • G02B27/01Head-up displays
    • G02B27/0101Head-up displays characterised by optical features
    • G02B2027/014Head-up displays characterised by optical features comprising information/image processing systems

Landscapes

  • Engineering & Computer Science (AREA)
  • Physics & Mathematics (AREA)
  • General Engineering & Computer Science (AREA)
  • General Physics & Mathematics (AREA)
  • Theoretical Computer Science (AREA)
  • Human Computer Interaction (AREA)
  • Optics & Photonics (AREA)

Abstract

The application discloses a control method, a head-mounted device, a mobile terminal, and a control system. The control method is used for a control system comprising a head-mounted device and a mobile terminal connected with the head-mounted device, the head-mounted device comprising a display. The control method comprises the following steps: the mobile terminal collects control information; the mobile terminal determines a control instruction for the display according to the control information; and the head-mounted device controls the display to display according to the control instruction. Because the control instruction for the display is determined from the control information collected by the mobile terminal, and the display is controlled according to that instruction, the head-mounted device can be controlled through the mobile terminal simply and conveniently, which helps improve the user experience.

Description

Control method, head-mounted device, mobile terminal and control system
Technical Field
The present application relates to the field of electronic technologies, and in particular, to a control method, a head-mounted device, a mobile terminal, and a control system.
Background
Head-mounted devices in the related art are generally controlled by keys or by gestures. For key-based control, keys may be arranged on the body of the head-mounted device, or the device may be paired with a control handle that carries keys. For gesture-based control, a depth camera, a binocular camera, a monocular camera, or the like is typically used to capture the user's gestures; image recognition is then performed based on algorithms such as machine learning, and the result is compared with preset gesture-action images to realize spatial gesture operation. The related art also uses an infrared laser transmitter and determines the gesture state by detecting the received infrared reflection.
However, key-based control requires the user to raise a hand to operate, which is inconvenient, and control through an external handle requires additional hardware. Gesture recognition through cameras or infrared laser transmitters is limited by the angle of the camera or infrared receiver and works only within a specific range; it must also be combined with complex image algorithms, which consumes considerable system resources. Moreover, cameras and infrared emitters are power-hungry, which is unfavorable for mobile portable devices.
Disclosure of Invention
The application provides a control method, a head-mounted device, a mobile terminal and a control system.
The embodiment of the application provides a control method. The control method is used for a control system; the control system comprises a head-mounted device and a mobile terminal connected with the head-mounted device, and the head-mounted device comprises a display. The control method comprises the following steps:
the mobile terminal collects control information;
the mobile terminal determines a control instruction of the display according to the control information;
and the head-mounted device controls the display to display according to the control instruction.
The embodiment of the application provides a control method. The control method is used for a head-mounted device; the head-mounted device comprises a display and is configured to be connected with a mobile terminal. The control method comprises the following steps:
acquiring a control instruction sent by the mobile terminal, wherein the control instruction is determined by the mobile terminal according to control information acquired by the mobile terminal;
and controlling the display to display according to the control instruction.
The embodiment of the application provides a control method. The control method is used for a mobile terminal; the mobile terminal is connected with a head-mounted device, and the head-mounted device comprises a display. The control method comprises the following steps:
acquiring control information acquired by the mobile terminal;
determining a control instruction of the display according to the control information;
and sending the control instruction to the head-mounted device so that the head-mounted device controls the display to display according to the control instruction.
The embodiment of the application provides a control system. The control system comprises a system processor, a head-mounted device, and a mobile terminal connected with the head-mounted device; the system processor is connected with the head-mounted device and the mobile terminal, and the head-mounted device comprises a display. The system processor is configured to collect control information through the mobile terminal, to determine a control instruction for the display according to the control information through the mobile terminal, and to control the display to display according to the control instruction through the head-mounted device.
The embodiment of the application provides a head-mounted device. The head-mounted device comprises a display and a device processor connected with the display. The device processor is configured to be connected with a mobile terminal, to acquire a control instruction sent by the mobile terminal, the control instruction being determined by the mobile terminal according to control information collected by the mobile terminal, and to control the display to display according to the control instruction.
The embodiment of the application provides a mobile terminal. The mobile terminal comprises a terminal processor configured to be connected with a head-mounted device that comprises a display. The terminal processor is configured to acquire control information collected by the mobile terminal, to determine a control instruction for the display according to the control information, and to send the control instruction to the head-mounted device so that the head-mounted device controls the display to display according to the control instruction.
With the control method, the head-mounted device, the mobile terminal, and the control system described above, the control instruction for the display is determined from the control information collected by the mobile terminal, and the display is controlled according to that instruction; the head-mounted device is thus controlled through the mobile terminal simply and conveniently, which helps improve the user experience.
Drawings
The foregoing and/or additional aspects and advantages of the present application will become apparent and readily appreciated from the following description of the embodiments, taken in conjunction with the accompanying drawings of which:
fig. 1 is a schematic perspective view of a head-mounted device according to an embodiment of the present application;
FIG. 2 is a schematic plan view of a head-mounted device according to another embodiment of the present application;
FIG. 3 is a schematic plan view of a partial structure of a head-mounted device according to an embodiment of the present application;
FIG. 4 is a schematic illustration of an adjustment process of the head-mounted device of an embodiment of the present application;
FIG. 5 is another schematic illustration of an adjustment process of the headset of an embodiment of the present application;
FIG. 6 is a schematic plan view of a partial structure of a head-mounted device according to another embodiment of the present application;
FIG. 7 is a schematic plan view of a partial structure of a head-mounted device according to yet another embodiment of the present application;
fig. 8 is a flowchart illustrating a control method of the control system according to the embodiment of the present application;
FIG. 9 is a block schematic diagram of a control system of an embodiment of the present application;
FIG. 10 is a schematic flow chart diagram of a control method of a control system according to another embodiment of the present application;
fig. 11 is a schematic view of a scenario of a control method of the control system according to the embodiment of the present application;
fig. 12 is a schematic view of another scenario of a control method of the control system according to the embodiment of the present application;
fig. 13 is a schematic view of still another scenario of a control method of the control system according to the embodiment of the present application;
fig. 14 is a schematic view of still another scenario of a control method of the control system according to the embodiment of the present application;
fig. 15 is a flowchart illustrating a control method of a control system according to another embodiment of the present application;
fig. 16 is a flowchart illustrating a control method of a control system according to still another embodiment of the present application;
fig. 17 is a flowchart illustrating a control method of a control system according to another embodiment of the present application;
fig. 18 is a flowchart illustrating a control method of a control system according to still another embodiment of the present application;
fig. 19 is a flowchart illustrating a control method of a control system according to still another embodiment of the present application;
fig. 20 is a flowchart illustrating a control method of the head-mounted device according to the embodiment of the present application;
fig. 21 is a flowchart illustrating a control method of a head-mounted device according to another embodiment of the present application;
fig. 22 is a flowchart illustrating a control method of a head-mounted device according to still another embodiment of the present application;
fig. 23 is a flowchart illustrating a control method of a head-mounted device according to still another embodiment of the present application;
fig. 24 is a flowchart illustrating a control method of a mobile terminal according to an embodiment of the present application;
fig. 25 is a flowchart illustrating a control method of a mobile terminal according to another embodiment of the present application;
fig. 26 is a flowchart illustrating a control method of a mobile terminal according to still another embodiment of the present application;
fig. 27 is a flowchart illustrating a control method of a mobile terminal according to still another embodiment of the present application.
Description of the reference symbols:
a control system 1000, a system processor 1001;
the head-mounted device 100, the device processor 101, the housing 20, the receiving slot 22, the housing top wall 24, the housing bottom wall 26, the notch 262, the housing side wall 28, the supporting member 30, the first bracket 32, the first bending portion 322, the second bracket 34, the second bending portion 342, the elastic band 36, the display 40, the diopter member 50, the refractive cavity 52, the light-transmissive liquid 54, the first film layer 56, the second film layer 58, the side wall 59, the flow passage 591, the adjustment mechanism 60, the switch 61, the cavity 62, the sliding slot 622, the sliding member 64, the driving member 66, the knob 662, the lead screw 664, the gear 666, the rack 668, the driving motor 669, the motor shaft 6691, the input device 6692, and the adjustment cavity 68;
mobile terminal 200, terminal processor 201, motion sensor 202, touch sensor 203.
Detailed Description
Reference will now be made in detail to embodiments of the present application, examples of which are illustrated in the accompanying drawings, in which the same or similar reference numerals refer to the same or similar elements or to elements having the same or similar functions throughout. The embodiments described below with reference to the drawings are exemplary; they are intended to explain the present application and should not be construed as limiting it.
Referring to fig. 1 and 2, a headset 100 according to an embodiment of the present disclosure includes a housing 20, a support member 30, a display 40, a diopter member 50, and an adjustment mechanism 60.
The housing 20 is an external component of the head-mounted device 100 and serves to protect and secure the internal components of the head-mounted device 100. By enclosing the internal components with the housing 20, direct damage to the internal components from external factors can be avoided.
Specifically, in this embodiment, the housing 20 may be used to house and secure at least one of the display 40, the diopter member 50, and the adjustment mechanism 60. In the example of fig. 2, the housing 20 is formed with a receiving slot 22, and the display 40 and the diopter member 50 are received in the receiving slot 22. The adjustment mechanism 60 is partially exposed from the housing 20.
The housing 20 further includes a housing top wall 24, a housing bottom wall 26, and housing side walls 28. The middle of the housing bottom wall 26 forms a notch 262 toward the housing top wall 24. Optionally, the housing 20 is generally "B" shaped. When the user wears the head-mounted device 100, the head-mounted device 100 can rest on the bridge of the user's nose through the notch 262, which ensures both the stability of the head-mounted device 100 and the wearing comfort of the user. The adjustment mechanism 60 may be partially exposed from the housing side wall 28 so that the user can adjust the diopter member 50.
In addition, the housing 20 may be machined from an aluminum alloy using a computer numerical control (CNC) machine tool, or may be injection-molded from polycarbonate (PC) or from PC and acrylonitrile butadiene styrene (ABS). The specific manufacturing method and material of the housing 20 are not limited herein.
The support member 30 is used to support the head-mounted device 100. When the user wears the head-mounted device 100, the head-mounted device 100 may be fixed on the user's head by the support member 30. In the example of fig. 2, the support member 30 includes a first bracket 32, a second bracket 34, and an elastic band 36.
The first bracket 32 and the second bracket 34 are symmetrically disposed about the notch 262. Specifically, the first bracket 32 and the second bracket 34 are rotatably disposed at the edge of the housing 20 and can be folded against the housing 20 for storage when the user does not need the head-mounted device 100. When the user needs the head-mounted device 100, the first bracket 32 and the second bracket 34 can be unfolded to perform their supporting function.
The first bracket 32 has a first bending portion 322 formed at the end away from the housing 20, and the first bending portion 322 is bent toward the housing bottom wall 26. In this way, when the user wears the head-mounted device 100, the first bending portion 322 can hook over the user's ear, so that the head-mounted device 100 does not easily slide off.
Similarly, the end of the second bracket 34 away from the housing 20 is formed with a second bent portion 342. The explanation and description of the second bending portion 342 can refer to the first bending portion 322, and are not repeated herein for avoiding redundancy.
The elastic band 36 detachably connects the first bracket 32 and the second bracket 34. In this way, when the user wears the head-mounted device 100 to perform strenuous activities, the head-mounted device 100 can be further fixed by the elastic band 36, and the head-mounted device 100 is prevented from loosening or even falling during strenuous activities. It is understood that in other examples, the elastic band 36 may be omitted.
In this embodiment, the display 40 includes an OLED display screen. An OLED display screen does not need a backlight, which helps keep the head-mounted device 100 light and thin. Moreover, an OLED screen has a wide viewing angle and low power consumption, which helps save power.
Of course, the display 40 may also be an LED display or a Micro LED display. These displays are merely examples and embodiments of the present application are not limited thereto.
Referring also to fig. 3, the diopter member 50 is disposed on one side of the display 40. The diopter member 50 includes a refractive cavity 52, a light-transmissive liquid 54, a first film layer 56, a second film layer 58, and a side wall 59.
The light-transmissive liquid 54 is disposed within the refractive cavity 52. The adjustment mechanism 60 is used to adjust the amount of the light-transmissive liquid 54, thereby adjusting the form of the diopter member 50. Specifically, the second film layer 58 is disposed opposite the first film layer 56, the side wall 59 connects the first film layer 56 and the second film layer 58, and the first film layer 56, the second film layer 58, and the side wall 59 together enclose the refractive cavity 52. The adjustment mechanism 60 adjusts the amount of the light-transmissive liquid 54 to change the shape of the first film layer 56 and/or the second film layer 58.
In this way, the dioptric function of the diopter member 50 is realized. Specifically, "changing the shape of the first film layer 56 and/or the second film layer 58" covers three cases: in the first case, the shape of the first film layer 56 is changed and the shape of the second film layer 58 is not; in the second case, the shape of the first film layer 56 is not changed and the shape of the second film layer 58 is; in the third case, the shapes of both the first film layer 56 and the second film layer 58 are changed. Note that, for convenience of explanation, the first case is taken as the example in this embodiment.
The first film layer 56 may be elastic. It will be appreciated that as the amount of the light-transmissive liquid 54 in the refractive cavity 52 changes, the pressure within the refractive cavity 52 changes, thereby changing the form of the diopter member 50.
In one example, the adjustment mechanism 60 reduces the amount of the light-transmissive liquid 54 in the refractive cavity 52; the pressure within the refractive cavity 52 decreases, the difference between the pressure outside the refractive cavity 52 and the pressure within it increases, and the first film layer 56 becomes more concave.
In another example, the adjustment mechanism 60 increases the amount of the light-transmissive liquid 54 in the refractive cavity 52; the pressure within the refractive cavity 52 increases, the difference between the pressure outside the refractive cavity 52 and the pressure within it decreases, and the first film layer 56 bulges further outward.
In this way, the form of the diopter member 50 is adjusted by adjusting the amount of the light-transmissive liquid 54.
The adjustment mechanism 60 is connected with the diopter member 50. The adjustment mechanism 60 is used to adjust the form of the diopter member 50, thereby adjusting the diopter of the diopter member 50. Specifically, the adjustment mechanism 60 includes a cavity 62, a sliding member 64, a driving member 66, an adjustment cavity 68, and a switch 61.
The sliding member 64 is slidably disposed in the cavity 62, and the driving member 66 is connected with the sliding member 64. The cavity 62 and the sliding member 64 together define the adjustment cavity 68, which communicates with the refractive cavity 52 through the side wall 59. The driving member 66 drives the sliding member 64 to slide relative to the cavity 62, changing the volume of the adjustment cavity 68 and thereby adjusting the amount of the light-transmissive liquid 54 in the refractive cavity 52.
In this way, the volume of the adjustment cavity 68 is adjusted by the sliding member 64 so as to adjust the amount of the light-transmissive liquid 54 in the refractive cavity 52. In one example, referring to FIG. 4, when the sliding member 64 slides away from the side wall 59, the volume of the adjustment cavity 68 increases, the pressure within the adjustment cavity 68 decreases, the light-transmissive liquid 54 in the refractive cavity 52 enters the adjustment cavity 68, and the first film layer 56 becomes increasingly concave.
In another example, referring to fig. 5, when the sliding member 64 slides toward the side wall 59, the volume of the adjustment cavity 68 decreases, the pressure within the adjustment cavity 68 increases, the light-transmissive liquid 54 in the adjustment cavity 68 enters the refractive cavity 52, and the first film layer 56 bulges outward.
The side wall 59 defines a flow passage 591 that communicates the adjustment cavity 68 with the refractive cavity 52. The adjustment mechanism 60 includes a switch 61 provided in the flow passage 591, and the switch 61 controls the open or closed state of the flow passage 591.
In this embodiment, there are two switches 61, both of which are one-way switches: one switch 61 controls the flow of the light-transmissive liquid 54 from the adjustment cavity 68 to the refractive cavity 52, and the other switch 61 controls the flow of the light-transmissive liquid 54 from the refractive cavity 52 to the adjustment cavity 68.
In this manner, the flow of the light-transmissive liquid 54 between the adjustment cavity 68 and the refractive cavity 52 is effected through the switches 61 so as to maintain the pressure balance on both sides of the side wall 59. As described above, a change in the volume of the adjustment cavity 68 changes the pressure within it, thereby causing the light-transmissive liquid 54 to flow between the adjustment cavity 68 and the refractive cavity 52. The switches 61 control the open or closed state of the flow passage 591, and thus the flow of the light-transmissive liquid 54 between the adjustment cavity 68 and the refractive cavity 52, thereby controlling the adjustment of the form of the diopter member 50.
In one example, referring to FIG. 4, the switch 61 that controls the flow of the light-transmissive liquid 54 from the refractive cavity 52 to the adjustment cavity 68 is opened; the sliding member 64 slides away from the side wall 59, the volume of the adjustment cavity 68 increases, the pressure within the adjustment cavity 68 decreases, the light-transmissive liquid 54 in the refractive cavity 52 passes through the switch 61 into the adjustment cavity 68, and the first film layer 56 becomes increasingly concave.
In another example, the switch 61 that controls the flow of the light-transmissive liquid 54 from the refractive cavity 52 to the adjustment cavity 68 is closed; even if the sliding member 64 slides away from the side wall 59, so that the volume of the adjustment cavity 68 increases and the pressure within it decreases, the light-transmissive liquid 54 in the refractive cavity 52 cannot enter the adjustment cavity 68, and the shape of the first film layer 56 does not change.
In yet another example, referring to FIG. 5, the switch 61 that controls the flow of the light-transmissive liquid 54 from the adjustment cavity 68 to the refractive cavity 52 is opened; the sliding member 64 slides toward the side wall 59, the volume of the adjustment cavity 68 decreases, the pressure within the adjustment cavity 68 increases, the light-transmissive liquid 54 in the adjustment cavity 68 enters the refractive cavity 52 through the switch 61, and the first film layer 56 bulges outward.
In yet another example, the switch 61 that controls the flow of the light-transmissive liquid 54 from the adjustment cavity 68 to the refractive cavity 52 is closed; even if the sliding member 64 slides toward the side wall 59, so that the volume of the adjustment cavity 68 decreases and the pressure within it increases, the light-transmissive liquid 54 in the adjustment cavity 68 cannot enter the refractive cavity 52, and the shape of the first film layer 56 does not change.
The driving member 66 may perform its function of driving the sliding member 64 to slide based on various structures and principles.
In the example of fig. 1, 2, 3, 4, and 5, the driving member 66 includes a knob 662 and a lead screw 664; the lead screw 664 connects the knob 662 and the sliding member 64, and the knob 662 drives the lead screw 664 so as to slide the sliding member 64 relative to the cavity 62.
In this manner, the sliding member 64 is driven by the knob 662 and the lead screw 664. Because the engagement of the lead screw 664 with the knob 662 converts the rotary motion of the knob 662 into linear motion of the lead screw 664, when the user rotates the knob 662, the lead screw 664 drives the sliding member 64 to slide relative to the cavity 62; the volume of the adjustment cavity 68 changes, and the amount of the light-transmissive liquid 54 in the refractive cavity 52 is thereby adjusted. The knob 662 may be exposed from the housing 20 so that the user can rotate it easily.
Specifically, a threaded portion is formed on the knob 662, a mating threaded portion is formed on the lead screw 664, and the knob 662 and the lead screw 664 are threadedly coupled.
While the knob 662 is rotated, the corresponding switch 61 may be opened. In this way, the light-transmissive liquid 54 can flow, and the pressure balance on both sides of the side wall 59 is ensured.
In one example, when the knob 662 is rotated clockwise, the sliding member 64 slides away from the side wall 59, and the switch 61 that controls the flow of the light-transmissive liquid 54 from the refractive cavity 52 to the adjustment cavity 68 is opened. In another example, when the knob 662 is rotated counterclockwise, the sliding member 64 slides toward the side wall 59, and the switch 61 that controls the flow of the light-transmissive liquid 54 from the adjustment cavity 68 to the refractive cavity 52 is opened.
Note that in the present embodiment, the rotation angle of the knob 662 is not correlated with the diopter of the diopter member 50; the user simply rotates the knob 662 to the position where the visual experience is optimal. Of course, in other embodiments, the rotation angle of the knob 662 may be correlated with the diopter of the diopter member 50. Whether the rotation angle of the knob 662 is correlated with the diopter of the diopter member 50 is not limited here.
Referring to fig. 6, the driving member 66 includes a gear 666 and a rack 668 meshing with the gear 666; the rack 668 connects the gear 666 and the sliding member 64, and the gear 666 drives the rack 668 to move so as to slide the sliding member 64 relative to the cavity 62.
In this way, the sliding member 64 is driven by the gear 666 and the rack 668. Because the engagement of the gear 666 with the rack 668 converts the rotation of the gear 666 into linear movement of the rack 668, when the user rotates the gear 666, the rack 668 drives the sliding member 64 to slide relative to the cavity 62; the volume of the adjustment cavity 68 changes, and the amount of the light-transmissive liquid 54 in the refractive cavity 52 is thereby adjusted. The gear 666 may be exposed from the housing 20 so that the user can rotate it easily.
Similarly, while the gear 666 is rotated, the corresponding switch 61 may be opened. In this way, the light-transmissive liquid 54 can flow, and the pressure balance on both sides of the side wall 59 is ensured.
In one example, when the gear 666 rotates clockwise, the portion of the rack 668 meshing with the gear 666 is drawn in, shortening the exposed length of the rack 668 and pulling the sliding member 64 away from the side wall 59; the switch 61 that controls the flow of the light-transmissive liquid 54 from the refractive cavity 52 to the adjustment cavity 68 is opened.
In another example, when the gear 666 rotates counterclockwise, the portion of the rack 668 meshing with the gear 666 is paid out, lengthening the exposed length of the rack 668 and pushing the sliding member 64 toward the side wall 59; the switch 61 that controls the flow of the light-transmissive liquid 54 from the adjustment cavity 68 to the refractive cavity 52 is opened.
Similarly, in this embodiment, the rotation angle of the gear 666 is not correlated with the diopter of the diopter member 50; the user simply rotates the gear 666 to the position where the visual experience is optimal. Of course, in other embodiments, the rotation angle of the gear 666 may be correlated with the diopter of the diopter member 50. Whether the rotation angle of the gear 666 is correlated with the diopter of the diopter member 50 is not limited here.
Referring to fig. 7, the driving member 66 includes a driving motor 669; a motor shaft 6691 of the driving motor 669 is connected with the sliding member 64, and the driving motor 669 drives the sliding member 64 to slide relative to the cavity 62.
In this manner, the sliding member 64 is driven by the driving motor 669. Specifically, the driving motor 669 may be a linear motor. A linear motor has a simple structure and generates linear motion directly, without an intermediate conversion mechanism, which reduces motion inertia and improves dynamic response and positioning accuracy. Driving the sliding member 64 with the driving motor 669 also makes the drive programmable. For example, the driving motor 669 can be calibrated against diopter values in advance; the user can then directly input a diopter value, and the driving motor 669 automatically drives the sliding member 64 to the corresponding position.
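Although the disclosure leaves the calibration itself unspecified, the idea can be illustrated with a short, hedged sketch: a pre-measured table maps diopter values to motor positions, and a user-entered diopter is converted to a target position by linear interpolation. All names, units, and values below are illustrative assumptions, not part of the disclosure.

```kotlin
import java.util.TreeMap

// Assumed calibration table measured in advance: diopter -> motor position (mm).
// A real device would store its own factory calibration.
val calibration = TreeMap(mapOf(-4.0 to 0.0, -2.0 to 1.5, 0.0 to 3.0, 2.0 to 4.5))

// Linearly interpolate the motor position for a user-entered diopter value,
// clamping to the calibrated range at either end.
fun motorPositionFor(diopter: Double): Double {
    val lower = calibration.floorEntry(diopter) ?: return calibration.firstEntry().value
    val upper = calibration.ceilingEntry(diopter) ?: return calibration.lastEntry().value
    if (lower.key == upper.key) return lower.value
    val t = (diopter - lower.key) / (upper.key - lower.key)
    return lower.value + t * (upper.value - lower.value)
}
```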
Further, the driving member 66 may also include an input device 6692, which includes, but is not limited to, a key, a knob, or a touch screen. In the example of fig. 7, the input device 6692 is a pair of keys disposed on opposite sides of the cavity 62. The keys may be exposed from the housing 20 so that the user can press them easily. A key controls the working time of the driving motor 669 according to the number of presses or the duration of pressing, thereby controlling the sliding distance of the sliding member 64.
Similarly, while the driving motor 669 operates, the corresponding switch 61 may be opened. In this way, the light-transmissive liquid 54 can flow, and the pressure balance on both sides of the side wall 59 is ensured.
In one example, the user presses one of the two keys so that the motor shaft 6691 extends; the motor shaft 6691 pushes the sliding member 64 toward the side wall 59, and the switch 61 that controls the flow of the light-transmissive liquid 54 from the adjustment cavity 68 to the refractive cavity 52 is opened.
In another example, the user presses the other of the two keys so that the motor shaft 6691 retracts; the motor shaft 6691 pulls the sliding member 64 away from the side wall 59, and the switch 61 that controls the flow of the light-transmissive liquid 54 from the refractive cavity 52 to the adjustment cavity 68 is opened.
It should be noted that the diopter member 50 is not limited to the structure of the refractive cavity 52, the light-transmissive liquid 54, the first film layer 56, the second film layer 58, and the side wall 59 described above; any structure that allows the diopter member 50 to achieve a diopter change may be used. For example, in other embodiments, the diopter member 50 includes a plurality of lenses and driving members for driving each lens from a storage position to a dioptric position. In this way, the diopter of the diopter member 50 can be changed through the combination of the plurality of lenses. Of course, the driving members can also move each lens that has been driven to the dioptric position along the dioptric optical axis, thereby changing the diopter of the diopter member 50.
Thus, the form of the diopter member 50 mentioned above includes its shape and its state: the structure based on the refractive cavity 52, the light-transmissive liquid 54, the first film layer 56, the second film layer 58, and the side wall 59 changes the diopter by changing the shape of the first film layer 56 and/or the second film layer 58, while the structure based on a plurality of lenses and driving members changes the diopter by changing the state of the lenses.
In summary, the present embodiment provides a head-mounted device 100 that includes a display 40, a diopter member 50, and an adjustment mechanism 60. The diopter member 50 is disposed on one side of the display 40. The adjustment mechanism 60 is connected with the diopter member 50 and is used to adjust the form of the diopter member 50 so as to adjust the diopter of the diopter member 50.
The head-mounted device 100 according to the embodiment of the application adjusts the form of the diopter member 50 through the adjustment mechanism 60 to adjust its diopter, so that a user with ametropia can clearly see the image displayed by the display 40, which helps improve the user experience.
Furthermore, in the head-mounted device 100 of the embodiment of the application, the diopter member 50 and the adjustment mechanism 60 can correct the diopter linearly, so that users with different refractive errors can all wear the device flexibly. Meanwhile, the diopter member 50 and the adjustment mechanism 60 are small in size and do not affect the wearing experience of the head-mounted device 100. The user does not need to purchase multiple pairs of lenses, which reduces cost.
Referring to fig. 1, 8 and 9, an embodiment of the present application provides a control method. The control method is used for a control system 1000. The control system 1000 includes a head-mounted device 100 and a mobile terminal 200 connected with the head-mounted device 100, and the head-mounted device 100 includes a display 40.
The control method comprises the following steps:
step S12: the mobile terminal 200 collects control information;
step S14: the mobile terminal 200 determines a control instruction of the display 40 according to the control information;
step S16: the head-mounted device 100 controls the display 40 to display according to the control instruction.
The embodiment of the application provides a control system 1000. The control system 1000 comprises a system processor 1001, a head-mounted device 100, and a mobile terminal 200 connected with the head-mounted device 100; the system processor 1001 is connected with the head-mounted device 100 and the mobile terminal 200, and the head-mounted device 100 comprises a display 40. The system processor 1001 is configured to collect control information through the mobile terminal 200, to determine a control instruction for the display 40 according to the control information through the mobile terminal 200, and to control the display 40 to display according to the control instruction through the head-mounted device 100.
With the control method of the control system 1000 and the control system 1000 of the embodiment of the application, the control instruction for the display 40 is determined from the control information collected by the mobile terminal 200, and the display 40 is controlled according to that instruction; the head-mounted device 100 is thus controlled through the mobile terminal 200 simply and conveniently, which helps improve the user experience.
Specifically, the control information collected by the mobile terminal 200 may refer to raw information output by a sensor of the mobile terminal 200, for example, signals output by a touch screen, data output by an angular velocity sensor, or data output by an acceleration sensor. The specific form of the control information is not limited herein.
A control instruction may refer to an instruction that can act directly on the display 40 and control it, for example, a continue-playing instruction, a shutdown instruction, a brightness-adjustment instruction, or an interface-switching instruction. The specific form of the control instruction is not limited herein. It will be appreciated that the control information is processed to extract the control instruction for the display 40.
In addition, the head-mounted device 100 may be an electronic device such as electronic glasses, a headset, and an electronic helmet. The specific form of the head-mounted device 100 is not limited herein.
The mobile terminal 200 may be a wearable device such as a smart band, a smart ring, a smart glove, or an electronic device such as a mobile phone, a tablet computer, or a personal computer. The specific form of the mobile terminal 200 is not limited herein.
Note that, for convenience of description, the control method of the control system 1000 according to the embodiment of the present application is explained by taking the case where the head-mounted device 100 is a pair of electronic glasses and the mobile terminal 200 is a mobile phone. This does not limit the specific forms of the head-mounted device 100 and the mobile terminal 200. In addition, in the example of fig. 1 and 2, the display 40 is binocular. It will be appreciated that the display 40 may also be monocular.
Referring to fig. 10, in some embodiments, step S14 includes:
step S142: the mobile terminal 200 determines a control gesture according to the control information;
step S144: the mobile terminal 200 determines a control instruction according to the control gesture.
In some embodiments, the system processor 1001 is configured to determine a control gesture from the control information through the mobile terminal 200, and to determine a control instruction according to the control gesture through the mobile terminal 200.
In this manner, the control gesture is determined from the control information, and the control instruction for the display 40 is determined in turn. Because the control information is raw information output by a sensor of the mobile terminal 200, the display 40 generally cannot be controlled directly from it; the control information therefore needs to be processed to obtain an instruction that can control the display 40.
The control system 1000 may pre-store a correspondence between control gestures and control instructions. In particular, the correspondence may be stored in the head-mounted device 100, in the mobile terminal 200, or in another component of the control system 1000.
The correspondence may be stored in the head-mounted device 100 and/or the mobile terminal 200 by the manufacturer before shipment, may be set by the user, or may be imported by the user from a device other than the head-mounted device 100 and the mobile terminal 200. The source of the correspondence is not limited herein.
In step S144, after the control gesture is determined, the control instruction corresponding to the control gesture can be looked up according to the correspondence, thereby determining the control instruction from the control gesture.
Specifically, the control gesture includes, but is not limited to, an angle of the mobile terminal 200, a position of the mobile terminal 200, a motion trajectory of the mobile terminal 200, and a sliding trajectory on the mobile terminal 200.
Referring to fig. 11, in one example, the mobile terminal 200 is a mobile phone. The plane of the display screen of the mobile terminal 200 is originally perpendicular to the horizontal plane, and the controlled character 401 in the display 40 is originally walking on the road 402. The user lays the mobile terminal 200 flat so that the plane of its display screen is parallel to the horizontal plane. From the control information generated by this action, the control gesture is determined to be: lay flat; and the control instruction can accordingly be determined to be: control the display 40 to show the controlled character lying down. The display 40 is controlled according to the control instruction to display the controlled character 401 lying on the road 402.
Referring to fig. 12, in another example, the mobile terminal 200 is a mobile phone. The mobile terminal 200 is originally separated from the head-mounted device 100, and the display 40 originally shows a paused video picture. The user brings the mobile terminal 200 into contact with the first bracket 32 of the head-mounted device 100, and the control gesture is determined to be: the mobile terminal contacts the first bracket. According to the control gesture, the control instruction can be determined to be: continue playing. The display 40 is controlled according to the control instruction to continue playing the paused video picture.
Referring to fig. 13, in another example, the mobile terminal 200 is a mobile phone. The user translates the mobile terminal 200 upward, and the control gesture is determined to be: translate upward. According to the control gesture, the control instruction can be determined to be: turn the page forward. The display 40 is controlled to turn the page forward according to the control instruction.
Referring to fig. 14, in another example, the mobile terminal 200 is a mobile phone. The user draws a "Z" on the touch screen of the mobile terminal 200, and from the control information thus generated the control gesture is determined to be: "Z". According to the control gesture, the control instruction can be determined to be: turn off the display. The display 40 is turned off according to the control instruction.
Of course, in other examples, determining the control instruction according to the control gesture may also cover the following scenarios: when the control gesture is a leftward translation of the mobile terminal, the control instruction is determined to be: return to the parent menu; when the control gesture is a rightward translation of the mobile terminal, the control instruction is determined to be: enter the submenu; when the control gesture is an upward translation of the mobile terminal, the control instruction is determined to be: scroll the page down; when the control gesture is a downward translation of the mobile terminal, the control instruction is determined to be: scroll the page up; when the control gesture is a small back-and-forth movement of the mobile terminal, the control instruction is determined to be: confirm the selection. The specific manner in which the control gesture is determined from the control information and the control instruction is determined from the control gesture is not limited herein.
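To make the pre-stored correspondence concrete, the following is a minimal Kotlin sketch of step S144 covering the scenarios above. All identifiers are hypothetical labels invented for illustration; the disclosure does not name them.

```kotlin
// Hypothetical labels for the control gestures and control instructions
// described above; the disclosure does not define these identifiers.
enum class ControlGesture {
    LAY_FLAT, CONTACT_FIRST_BRACKET, TRANSLATE_LEFT, TRANSLATE_RIGHT,
    TRANSLATE_UP, TRANSLATE_DOWN, SMALL_BACK_AND_FORTH, DRAW_Z
}

enum class ControlInstruction {
    SHOW_CHARACTER_LYING_DOWN, CONTINUE_PLAYING, RETURN_TO_PARENT_MENU,
    ENTER_SUBMENU, SCROLL_PAGE_DOWN, SCROLL_PAGE_UP, CONFIRM_SELECTION,
    TURN_OFF_DISPLAY
}

// Pre-stored correspondence between gestures and instructions; per the
// description it may live on the head-mounted device, the mobile terminal,
// or another component of the control system.
val correspondence: Map<ControlGesture, ControlInstruction> = mapOf(
    ControlGesture.LAY_FLAT to ControlInstruction.SHOW_CHARACTER_LYING_DOWN,
    ControlGesture.CONTACT_FIRST_BRACKET to ControlInstruction.CONTINUE_PLAYING,
    ControlGesture.TRANSLATE_LEFT to ControlInstruction.RETURN_TO_PARENT_MENU,
    ControlGesture.TRANSLATE_RIGHT to ControlInstruction.ENTER_SUBMENU,
    ControlGesture.TRANSLATE_UP to ControlInstruction.SCROLL_PAGE_DOWN,
    ControlGesture.TRANSLATE_DOWN to ControlInstruction.SCROLL_PAGE_UP,
    ControlGesture.SMALL_BACK_AND_FORTH to ControlInstruction.CONFIRM_SELECTION,
    ControlGesture.DRAW_Z to ControlInstruction.TURN_OFF_DISPLAY
)

// Step S144: query the pre-stored correspondence for the determined gesture.
fun determineInstruction(gesture: ControlGesture): ControlInstruction? =
    correspondence[gesture]
```

A gesture absent from the correspondence simply yields no instruction, which matches the idea that only gestures with a stored counterpart can control the display 40.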
In the present embodiment, the control gesture is determined from the control information, and the control instruction is determined based on the control gesture. It is understood that in other embodiments, control speech may be determined from the control information, and the control instruction may be determined from the control speech. The specific manner of determining the control instruction from the control information is not limited herein.
Referring to fig. 15, in some embodiments, the mobile terminal 200 includes a motion sensor 202, the control information includes motion information, and step S142 includes:
step S1422: the mobile terminal 200 determines motion information according to output data of the motion sensor 202;
step S1424: the mobile terminal 200 determines a control gesture according to the motion information.
In some embodiments, the mobile terminal 200 includes a motion sensor 202 and the control information includes motion information; the system processor 1001 is configured to determine the motion information from the output data of the motion sensor 202 through the mobile terminal 200, and to determine a control gesture from the motion information through the mobile terminal 200.
In this way, the control gesture is determined from the control information. Specifically, the motion sensor 202 may refer to a sensor that can detect the motion state and motion trajectory of the mobile terminal 200, such as an acceleration sensor that measures acceleration or a gyroscope that measures angular motion.
Here, "output data of the motion sensor 202" refers to raw data output by the motion sensor 202. Such as the angle of the output, acceleration, etc. In general, the motion information cannot be directly determined from the raw data output by the motion sensor 202. Thus, the raw data may be processed, e.g. noise reduced, calculated, etc., to determine motion information from the output data.
In the example of fig. 11, the motion information may be determined from the output data of motion sensor 202 as: the plane on which the display screen of the mobile terminal 200 is located is parallel to the horizontal plane, thereby determining that the control gesture is: and (4) flatly placing, and further determining that the control command is as follows: the control display 40 shows the controlled character lying flat. The display 40 is controlled to display the controlled character 401 lying on the road 402 according to the control instruction.
In the example of fig. 13, the motion information may be determined from the output data of motion sensor 202 as: the mobile terminal 200 translates upward, thereby determining that the control gesture is: and translating upwards to determine that the control command is forward page turning. The display 40 is controlled to turn the page forward according to the control command.
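As a hedged sketch of steps S1422 and S1424, reusing the hypothetical ControlGesture labels from the earlier sketch: a window of raw accelerometer samples is noise-reduced by averaging, and assumed thresholds classify the "lay flat" and "translate upward" cases. None of the names or values come from the disclosure.

```kotlin
// Raw accelerometer output in m/s^2; gyroscope data could be folded in similarly.
data class MotionSample(val ax: Double, val ay: Double, val az: Double)

// Step S1422: reduce noise in the raw output data by averaging a window,
// then step S1424: classify the averaged motion information as a gesture.
// Thresholds are illustrative assumptions, not values from the disclosure.
fun determineMotionGesture(window: List<MotionSample>): ControlGesture? {
    if (window.isEmpty()) return null
    val ax = window.map { it.ax }.average()
    val ay = window.map { it.ay }.average()
    val az = window.map { it.az }.average()
    return when {
        // Gravity concentrated on the z-axis: the plane of the display screen
        // is parallel to the horizontal plane, i.e. the "lay flat" gesture.
        az > 9.0 && kotlin.math.abs(ax) < 1.0 && kotlin.math.abs(ay) < 1.0 ->
            ControlGesture.LAY_FLAT
        // Sustained acceleration along the vertical axis beyond gravity
        // suggests an upward translation of the mobile terminal.
        ay > 11.0 -> ControlGesture.TRANSLATE_UP
        else -> null
    }
}
```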
Referring to fig. 16, in some embodiments, the mobile terminal 200 includes a touch sensor 203, the control information includes slide information, and the step S142 includes:
step S1426: the mobile terminal 200 determines slide information according to output data of the touch sensor 203;
step S1428: the mobile terminal 200 determines a control gesture according to the slide information.
In some embodiments, the mobile terminal 200 includes a touch sensor 203 and the control information includes slide information; the system processor 1001 is configured to determine the slide information from the output data of the touch sensor 203 through the mobile terminal 200, and to determine a control gesture from the slide information through the mobile terminal 200.
In this way, the control gesture is determined from the control information. Specifically, the touch sensor 203 may refer to a sensor that can detect the touch state of the mobile terminal 200, such as a touch screen.
Similarly, "output data of the touch sensor 203" here refers to the raw data output by the touch sensor 203, such as the electrical signals output by a touch screen. In general, the slide information cannot be determined directly from the raw data output by the touch sensor 203; the raw data may therefore be processed, e.g., noise-reduced and calculated, to determine the slide information from the output data.
In the example of fig. 14, the slide information may be determined from the output data of the touch sensor 203 as: a "Z" is drawn on the touch screen. The control gesture is thereby determined to be: "Z"; and the control instruction is further determined to be: turn off the display. The display 40 is turned off according to the control instruction.
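A sketch of steps S1426 and S1428 under the same hypothetical labels: successive touch points from the touch screen are reduced to coarse stroke directions, and a right / down-left / right sequence is matched as the "Z" slide. The segmentation heuristic is an assumption; a caller would map a true result to the "Z" control gesture and then query the correspondence as in step S144.

```kotlin
import kotlin.math.abs

// A raw touch point reported by the touch sensor (screen coordinates,
// with y increasing downward as is conventional for touch screens).
data class TouchPoint(val x: Float, val y: Float)

// Steps S1426/S1428: derive slide information (coarse stroke directions)
// from the raw touch output, then match the "Z" pattern: a rightward
// stroke, a down-left diagonal, and a rightward stroke again.
fun isZStroke(points: List<TouchPoint>): Boolean {
    val directions = points.zipWithNext().mapNotNull { (a, b) ->
        val dx = b.x - a.x
        val dy = b.y - a.y
        when {
            dx > 0 && abs(dy) < abs(dx) / 2 -> 'R' // mostly rightward
            dx < 0 && dy > 0 -> 'D'                // down-left diagonal
            else -> null                           // ignore jitter
        }
    }
    // Collapse consecutive duplicates: "RRRDDDRR" becomes "RDR".
    val shape = buildString {
        for (d in directions) if (isEmpty() || last() != d) append(d)
    }
    return shape == "RDR"
}
```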
Of course, the control information may also include position information. The position information may refer to the relative position of the mobile terminal 200 and the head-mounted device 100. The mobile terminal 200 may include a Hall sensor, the head-mounted device 100 may be provided with a magnetic member, and the position information may be determined from the output data of the Hall sensor; the control gesture is then determined from the position information.
For example, in the example of fig. 12, the position information may be determined from the output data of the Hall sensor as: the mobile terminal 200 is in contact with the first bracket 32 of the head-mounted device 100. The control gesture is thereby determined to be: the mobile terminal contacts the first bracket; and the control instruction is further determined to be: continue playing. The display 40 is controlled according to the control instruction to continue playing the paused video picture.
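For the position-information variant, a minimal sketch under stated assumptions: the Hall sensor's field reading of the magnetic member is compared with an assumed calibration threshold to infer contact with the first bracket 32. The threshold, unit, and names are illustrative only.

```kotlin
// Assumed calibration value for "terminal touching the bracket"; a real
// device would determine this empirically from its magnetic member.
const val CONTACT_FIELD_THRESHOLD = 3000.0 // microtesla, illustrative

// Infer the relative position of the mobile terminal from the Hall
// sensor's reading and map it to the corresponding control gesture.
fun determinePositionGesture(hallFieldMicroTesla: Double): ControlGesture? =
    if (hallFieldMicroTesla > CONTACT_FIELD_THRESHOLD)
        ControlGesture.CONTACT_FIRST_BRACKET
    else null
```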
Referring to fig. 17, in some embodiments, the head-mounted device 100 includes a device processor 101 connected with the display 40, the mobile terminal 200 includes a terminal processor 201 connected with the display 40, and the computing power of the terminal processor 201 is greater than that of the device processor 101. The control method includes:
step S18: the head-mounted device 100 determines performance requirements of the head-mounted device 100;
step S19: the head-mounted device 100 selects the device processor 101 or the terminal processor 201 to output signals for controlling the display 40 according to the performance requirements.
In some embodiments, the head-mounted device 100 includes a device processor 101 connected with the display 40, the mobile terminal 200 includes a terminal processor 201 connected with the display 40, and the computing power of the terminal processor 201 is greater than that of the device processor 101. The system processor 1001 is configured to determine the performance requirement of the head-mounted device 100 through the head-mounted device 100, and to select, through the head-mounted device 100, the device processor 101 or the terminal processor 201 to output the signals for controlling the display 40 according to the performance requirement.
In this way, the mobile terminal 200 can provide high performance for the head-mounted device 100 while the portability of the head-mounted device 100 and the comfort of wearing it are preserved. It will be appreciated that, because either the device processor 101 or the terminal processor 201 can be selected to output the signals for controlling the display 40 according to the performance requirement, the terminal processor 201 can share the computing load of the device processor 101 when the performance requirement is high, thereby ensuring high performance of the head-mounted device 100. The head-mounted device 100 therefore does not need to be equipped with a high-performance processor of its own, which is beneficial to its miniaturization and light weight and also reduces its heat generation.
Referring to fig. 18, in some embodiments, step S18 includes:
step S182: the head mounted device 100 determines a current application of the head mounted device 100;
step S184: the head-mounted device 100 determines performance requirements according to the current application.
In some embodiments, the system processor 1001 is configured to determine the current application of the head-mounted device 100 through the head-mounted device 100, and to determine the performance requirement according to the current application through the head-mounted device 100.
In this manner, the performance requirement of the head-mounted device 100 is determined. It will be appreciated that different applications require different performance, so the performance requirement may be determined according to the current application of the head-mounted device 100. In particular, the number of current applications of the head-mounted device 100 may be one, two, four, or any other number; the number of current applications is not limited herein. Note that the performance requirement of the current application may refer to the total performance requirement of all applications currently running.
Specifically, the head-mounted device 100 includes a first application and a second application, and the performance requirement of the first application is greater than that of the second application. The performance requirement includes a first performance requirement and a second performance requirement, the first being greater than the second. Step S184 includes:
in a case where the current application includes a first application, the head-mounted device 100 determines the performance requirement as a first performance requirement;
in a case where the current application is the second application, the head-mounted device 100 determines the performance requirement as the second performance requirement.
In this manner, the performance requirements are determined from the current applications. Note that "performance requirements" here refers to the resources, such as computing resources and power, that an application needs in order to run normally; in other words, the cost in device performance that the head-mounted device 100 pays while the application runs.
In this embodiment, the first application is a high-performance application, and the second application is a low-performance application. The first performance requirement is a high performance requirement and the second performance requirement is a low performance requirement.
It will be appreciated that there may be several current applications, and when they include a first application, even just one, the overall performance requirements may increase significantly. Thus, the performance requirement is determined to be the first performance requirement whenever the current applications include a first application. Similarly, when the current applications are all second applications, the performance requirement is determined to be the second performance requirement.
The first application may refer to an application whose content must be rendered and cannot be read and displayed directly, such as an application performing scene reconstruction. The second application may refer to an application that requires no rendering and can read and display its content directly, such as music, document, or video applications.
Specifically, the first application may include an augmented reality (AR) application. In this embodiment, the first application includes at least one of a gesture recognition application, an eye tracking application, a three-dimensional model application, a scene reconstruction application, a simultaneous localization and mapping (SLAM) application, and a six-degree-of-freedom (6DoF) application. The second application includes at least one of an information reminding application, a document reading application, and a video and audio playing application.
The information reminding application may remind the user of the current battery level, the time, notes, and the like. The document reading application may read Word, TXT, PPT, Excel, and similar documents. The video and audio playing application may play videos, music, and the like. The content presented by these applications does not need to be rendered and can be read and displayed directly.
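By way of a non-limiting illustration, the application-based classification described above might be sketched as follows; the application names, category sets, and the determine_performance_requirement helper are invented for this example and are not part of the disclosed method.

```python
# First applications: require rendering and cannot read/display content directly.
FIRST_APPS = {"gesture_recognition", "eye_tracking", "3d_model",
              "scene_reconstruction", "slam", "6dof"}

def determine_performance_requirement(current_apps):
    """Return the first (high) requirement if any first application runs,
    otherwise the second (low) requirement."""
    if any(app in FIRST_APPS for app in current_apps):
        return "first_performance_requirement"
    return "second_performance_requirement"

print(determine_performance_requirement({"slam", "document_reading"}))
# -> first_performance_requirement (one first application suffices)
print(determine_performance_requirement({"document_reading"}))
# -> second_performance_requirement
```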
In one example, the previous application is an information reminding application, so the performance requirement is the second performance requirement; the mobile terminal 200 provides no performance support for the head-mounted device 100, and the two are not connected. The current applications then come to include a SLAM application, so the performance requirement becomes the first performance requirement; the mobile terminal 200 is required to provide performance support, and the head-mounted device 100 and the mobile terminal 200 are connected. The SLAM application needs to calculate the motion posture of the head-mounted device 100 and information about the surrounding scene from the data acquired by the camera module and the sensors, and to re-render the corresponding picture from that information to realize scene reconstruction. Accordingly, the device processor 101 sends the data collected by the camera module and the sensors to the terminal processor 201; the terminal processor 201 calculates the motion posture and scene information from that data and re-renders the corresponding picture to control the display 40, thereby realizing scene reconstruction.
Of course, the device processor 101 may instead preprocess the data collected by the camera module and the sensors and send the preprocessed results to the terminal processor 201. For example, the device processor 101 may itself calculate the motion posture of the head-mounted device 100 and the information about the surrounding scene, transmit them to the terminal processor 201, and let the terminal processor 201 re-render the corresponding picture to control the display 40, likewise realizing scene reconstruction.
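The two offloading variants just described, sending raw sensor data versus sending device-side preprocessed results, might be sketched as follows; the class and method names (DeviceProcessor, TerminalProcessor, collect, preprocess, render) are illustrative placeholders rather than an API defined by this disclosure.

```python
class DeviceProcessor:
    """Stand-in for device processor 101 on the head-mounted device."""

    def collect(self):
        # Raw output of the camera module and motion sensors (placeholder).
        return {"frames": "raw camera frames", "imu": "raw IMU samples"}

    def preprocess(self, raw):
        # Optional on-device step: compute the headset's motion posture and
        # surrounding-scene information locally, so only compact results
        # need to cross the link to the terminal.
        return {"pose": "estimated 6DoF pose", "scene": "scene information"}


class TerminalProcessor:
    """Stand-in for terminal processor 201 on the mobile terminal."""

    def render(self, data):
        # Re-render the corresponding picture from the received data and
        # return the signal that drives display 40.
        return f"display signal rendered from {sorted(data)}"


device, terminal = DeviceProcessor(), TerminalProcessor()

# Variant 1: send raw data; the terminal does all computation and rendering.
signal = terminal.render(device.collect())

# Variant 2: preprocess on the device; send only pose and scene information.
signal = terminal.render(device.preprocess(device.collect()))
print(signal)
```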
In another example, the current application is Word, i.e., a document reading application, so the performance requirement is the second performance requirement, and the mobile terminal 200 is not required to provide performance support for the head-mounted device 100.
Alternatively, a value for the performance requirement may be computed from the current applications and compared against a requirement threshold: when the value exceeds the threshold, the performance requirement is determined to be the first performance requirement; when the value is less than or equal to the threshold, it is determined to be the second performance requirement. In this way the mobile terminal 200 can also be made to support the head-mounted device 100 when the performance requirements become too high because too many second applications are running. The specific manner of determining the performance requirements of the head-mounted device 100 is not limited herein.
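A minimal sketch of this threshold variant follows; the per-application cost values and the requirement threshold are invented for illustration.

```python
# Invented per-application cost values and threshold.
APP_COST = {"slam": 80, "document_reading": 5, "information_reminder": 2}
REQUIREMENT_THRESHOLD = 50

def requirement_by_threshold(current_apps):
    total = sum(APP_COST.get(app, 0) for app in current_apps)
    # Strictly greater than the threshold -> first (high) requirement.
    return "first" if total > REQUIREMENT_THRESHOLD else "second"

# A single high-cost application exceeds the threshold...
print(requirement_by_threshold(["slam"]))                    # first
# ...but so can many second applications taken together.
print(requirement_by_threshold(["document_reading"] * 15))   # first (75 > 50)
```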
Referring to FIG. 19, in some embodiments, the performance requirements include a first performance requirement and a second performance requirement, the first performance requirement being greater than the second performance requirement, and step S19 includes:
step S192: in the case that the performance requirement is the first performance requirement, the head-mounted device 100 selects the terminal processor 201 to output a signal for controlling the display 40 according to the performance requirement;
step S194: in the event that the performance requirement is the second performance requirement, the head mounted device 100 selects the device processor 101 to output a signal to control the display 40 according to the performance requirement.
In some embodiments, the performance requirements include a first performance requirement and a second performance requirement, the first performance requirement being greater than the second performance requirement, and the system processor 1001 is configured to select, via the head-mounted device 100, the terminal processor 201 to output a signal for controlling the display 40 when the performance requirement is the first performance requirement; and to select, via the head-mounted device 100, the device processor 101 to output a signal for controlling the display 40 when the performance requirement is the second performance requirement.
In this manner, the device processor 101 or the terminal processor 201 is selected to output the signals controlling the display 40 according to the performance requirements. It will be appreciated that when the performance requirement is the first performance requirement, the device processor 101 alone cannot meet it and support from the terminal processor 201 is required. Accordingly, the terminal processor 201 is selected to output the signals controlling the display 40, so that the mobile terminal 200 supports the head-mounted device 100 and the performance of the head-mounted device 100 is secured.
Similarly, when the performance requirement is the second performance requirement, the device processor 101 can meet it without support from the terminal processor 201, so the device processor 101 can be selected to output the signals controlling the display 40.
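The resulting selection logic might be sketched as follows; the requirement labels and processor identifiers are illustrative placeholders.

```python
def select_processor(requirement,
                     device_cpu="device_processor_101",
                     terminal_cpu="terminal_processor_201"):
    # High requirement: the terminal processor outputs the display signal
    # on behalf of the headset; otherwise the device processor suffices.
    return terminal_cpu if requirement == "first" else device_cpu

print(select_processor("first"))   # terminal_processor_201
print(select_processor("second"))  # device_processor_101
```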
The head-mounted device 100 and the mobile terminal 200 may be connected by wire or wirelessly. In one example, a first interface of the head-mounted device 100 and a second interface of the mobile terminal 200 are joined by a connection line, thereby implementing a wired connection. The connection line carries a plurality of wires, including power lines and signal lines: the power lines transmit electrical energy so that the mobile terminal 200 can provide power support for the head-mounted device 100, and the signal lines transmit the data through which the mobile terminal 200 provides computational support. The wired connection may be established automatically by the head-mounted device 100 or manually by the user.
The wireless connection between the head-mounted device 100 and the mobile terminal 200 may be implemented through a first wireless transceiver of the head-mounted device 100 and a second wireless transceiver of the mobile terminal 200, so that data is transmitted wirelessly between the two and the mobile terminal 200 can charge the head-mounted device 100 wirelessly. The wireless connection may also be implemented via Bluetooth, a wireless local area network (e.g., Wi-Fi), millimeter-wave communication, optical communication, and the like. The specific manner of wireless connection is not limited herein.
Referring to fig. 20, an embodiment of the present application provides a control method. The control method is used for the head-mounted device 100; the head-mounted device 100 includes a display 40 and is configured to connect to the mobile terminal 200. The control method includes the following steps:
step S22: acquiring a control instruction sent by the mobile terminal 200, wherein the control instruction is determined by the mobile terminal 200 according to the control information acquired by the mobile terminal 200;
step S24: and controlling the display 40 to display according to the control instruction.
An embodiment of the present application provides a head-mounted device 100. The head-mounted device 100 includes a display 40 and a device processor 101 connected to the display 40. The device processor 101 is connected to the mobile terminal 200 and is configured to acquire a control instruction sent by the mobile terminal 200, the control instruction being determined by the mobile terminal 200 according to the control information it collects, and to control the display 40 to display according to the control instruction.
With the above control method of the head-mounted device 100 and with the head-mounted device 100 itself, the control instruction for the display 40 is determined from the control information collected by the mobile terminal 200, and the display 40 is controlled accordingly, so that the head-mounted device 100 can be controlled through the mobile terminal 200 simply and conveniently, which helps improve the user experience.
Note that the explanations of the control method of the head-mounted device 100 and of the head-mounted device 100 may refer to the explanations of the control method of the control system 1000 and of the control system 1000 given above; they are not repeated here to avoid redundancy.
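A minimal sketch of the headset side of steps S22 and S24 follows, assuming a simple queue as a stand-in for the wired or wireless link and an invented instruction format; neither is specified by this disclosure.

```python
import queue

link = queue.Queue()  # stand-in for the wired or wireless connection

class Display:
    """Stand-in for display 40."""
    def apply(self, instruction):
        print(f"display executes: {instruction}")

def headset_step(display):
    # Step S22: acquire the control instruction sent by the mobile terminal.
    instruction = link.get()
    # Step S24: control the display according to the control instruction.
    display.apply(instruction)

link.put({"action": "page_turn", "direction": "next"})  # sent by the terminal
headset_step(Display())
```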
Referring to fig. 21, in some embodiments, the head-mounted device 100 includes a device processor 101 connected to the display 40, the mobile terminal 200 includes a terminal processor 201 connected to the display 40, and the computing power of the terminal processor 201 is greater than that of the device processor 101. The control method includes:
step S26: determining performance requirements of the head-mounted device 100;
step S28: the device processor 101 or the terminal processor 201 is selected to output a signal for controlling the display 40 according to the performance requirements.
In some embodiments, the head-mounted device 100 includes a device processor 101 coupled to the display 40, the mobile terminal 200 includes a terminal processor 201 coupled to the display 40, and the computing power of the terminal processor 201 is greater than that of the device processor 101. The device processor 101 is configured to determine the performance requirements of the head-mounted device 100, and to select the device processor 101 or the terminal processor 201 to output signals for controlling the display 40 according to the performance requirements.
Referring to fig. 22, in some embodiments, step S26 includes:
step S262: determining a current application of the head mounted device 100;
step S264: the performance requirements are determined according to the current application.
In some embodiments, the device processor 101 is configured to determine a current application of the head mounted device 100; and for determining performance requirements based on the current application.
Referring to FIG. 23, in some embodiments, the performance requirements include a first performance requirement and a second performance requirement, the first performance requirement being greater than the second performance requirement, and step S28 includes:
step S282: under the condition that the performance requirement is a first performance requirement, selecting the terminal processor 201 to output a signal for controlling the display 40 according to the performance requirement;
step S284: in the case that the performance requirement is the second performance requirement, selecting the device processor 101 to output a signal for controlling the display 40 according to the performance requirement.
In some embodiments, the performance requirements include a first performance requirement and a second performance requirement, the first performance requirement being greater than the second performance requirement. The device processor 101 is configured to select the terminal processor 201 to output a signal for controlling the display 40 when the performance requirement is the first performance requirement, and to select the device processor 101 to output a signal for controlling the display 40 when the performance requirement is the second performance requirement.
Referring to fig. 24, an embodiment of the present application provides a control method. The control method is used for the mobile terminal 200; the mobile terminal 200 is connected to the head-mounted device 100, which includes a display 40. The control method includes the following steps:
step S32: acquiring control information acquired by the mobile terminal 200;
step S34: determining a control instruction of the display 40 according to the control information;
step S36: and sending a control instruction to the head-mounted device 100 to enable the head-mounted device 100 to control the display 40 to display according to the control instruction.
An embodiment of the present application provides a mobile terminal 200. The mobile terminal 200 includes a terminal processor 201 connected to the head-mounted device 100, and the head-mounted device 100 includes a display 40. The terminal processor 201 is configured to acquire the control information collected by the mobile terminal 200, to determine a control instruction for the display 40 according to the control information, and to send the control instruction to the head-mounted device 100 so that the head-mounted device 100 controls the display 40 to display according to the control instruction.
With the above control method of the mobile terminal 200 and with the mobile terminal 200 itself, the control instruction for the display 40 is determined from the control information collected by the mobile terminal 200, and the display 40 is controlled accordingly, so that the head-mounted device 100 can be controlled through the mobile terminal 200 simply and conveniently, which helps improve the user experience.
Note that the explanations of the control method of the mobile terminal 200 and of the mobile terminal 200 may refer to the explanations of the control method of the control system 1000 and of the control system 1000 given above; they are not repeated here to avoid redundancy.
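A minimal sketch of the terminal side of steps S32 to S36 follows, under the same assumptions as the headset sketch above (queue transport, invented gesture names and instruction format).

```python
import queue

# Invented gesture-to-instruction table.
GESTURE_TO_INSTRUCTION = {
    "swipe_left":  {"action": "page_turn", "direction": "next"},
    "swipe_right": {"action": "page_turn", "direction": "previous"},
    "shake":       {"action": "return_home"},
}

def terminal_step(control_info, link):
    # Step S32: control information collected by the mobile terminal.
    gesture = control_info["gesture"]
    # Step S34: determine the control instruction for the display.
    instruction = GESTURE_TO_INSTRUCTION[gesture]
    # Step S36: send the control instruction to the head-mounted device.
    link.put(instruction)

link = queue.Queue()
terminal_step({"gesture": "swipe_left"}, link)
print(link.get())  # {'action': 'page_turn', 'direction': 'next'}
```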
Referring to fig. 25, in some embodiments, step S34 includes:
step S342: determining a control gesture according to the control information;
step S344: and determining a control instruction according to the control gesture.
In some embodiments, the terminal processor 201 is configured to determine a control gesture from the control information; and for determining a control instruction from the control gesture.
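The two-stage decomposition of step S34 might be sketched as follows; the recognizer rules, thresholds, and gesture names are invented for illustration and are not the disclosed algorithm.

```python
def determine_gesture(control_info):
    # Step S342: recognize a named gesture from raw control information.
    # Crude rules for illustration; real recognizers are sensor-specific
    # (see the motion and touch sketches below).
    if control_info.get("peak_acceleration", 0) > 25.0:
        return "shake"
    if control_info.get("dx", 0) < -50:
        return "swipe_left"
    return None

def determine_instruction(gesture):
    # Step S344: look the gesture up in an instruction table.
    table = {"shake": {"action": "return_home"},
             "swipe_left": {"action": "page_turn", "direction": "next"}}
    return table.get(gesture)

print(determine_instruction(determine_gesture({"dx": -120})))
# -> {'action': 'page_turn', 'direction': 'next'}
```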
Referring to fig. 26, in some embodiments, the mobile terminal 200 includes a motion sensor 202, the control information includes motion information, and step S342 includes:
step S3422: determining motion information from the output data of the motion sensor 202;
step S3424: a control gesture is determined from the motion information.
In some embodiments, the mobile terminal 200 includes a motion sensor 202, the control information includes motion information, and the terminal processor 201 is configured to determine the motion information from output data of the motion sensor 202; and for determining a control gesture from the motion information.
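One way the motion information might yield a control gesture is sketched below with an invented shake detector over accelerometer samples; the threshold and the detection rule are assumptions, not the disclosed method.

```python
import math

def gesture_from_motion(samples, shake_threshold=25.0):
    """samples: iterable of (ax, ay, az) accelerometer readings in m/s^2."""
    for ax, ay, az in samples:
        magnitude = math.sqrt(ax * ax + ay * ay + az * az)
        # Gravity alone contributes about 9.8 m/s^2; a vigorous shake
        # produces a clear spike above that.
        if magnitude > shake_threshold:
            return "shake"
    return None

print(gesture_from_motion([(0.1, 9.8, 0.2), (18.0, 20.5, 3.0)]))  # shake
```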
Referring to fig. 27, in some embodiments, the mobile terminal 200 includes the touch sensor 203, the control information includes sliding information, and step S342 includes:
step S3426: determining sliding information from the output data of the touch sensor 203;
step S3428: and determining a control gesture according to the sliding information.
In some embodiments, the mobile terminal 200 includes a touch sensor 203, the control information includes slide information, and the terminal processor 201 is configured to determine the slide information according to output data of the touch sensor 203; and for determining a control gesture from the slide information.
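Similarly, the sliding information might yield a control gesture as follows; the coordinate convention and the minimum travel distance are invented for illustration.

```python
def gesture_from_slide(start, end, min_travel=50):
    """start/end: (x, y) touch positions in pixels; origin at top-left."""
    dx, dy = end[0] - start[0], end[1] - start[1]
    if abs(dx) < min_travel and abs(dy) < min_travel:
        return None                        # too short to count as a swipe
    if abs(dx) >= abs(dy):                 # predominantly horizontal motion
        return "swipe_right" if dx > 0 else "swipe_left"
    return "swipe_down" if dy > 0 else "swipe_up"

print(gesture_from_slide((300, 400), (80, 410)))  # swipe_left
```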
It will be understood by those skilled in the art that all or part of the processes of the methods of the above embodiments may be implemented by a computer program instructing the relevant hardware. The program may be stored in a non-volatile computer-readable storage medium and, when executed, may include the processes of the method embodiments described above. The storage medium may be a magnetic disk, an optical disk, a read-only memory (ROM), or the like.
The above examples express only several embodiments of the present application, and while their description is specific and detailed, it should not be construed as limiting the scope of the present application. It should be noted that a person skilled in the art can make several variations and modifications without departing from the concept of the present application, all of which fall within the protection scope of the present application. Therefore, the protection scope of this patent shall be subject to the appended claims.

Claims (30)

1. A control method for a control system, the control system including a head-mounted device and a mobile terminal connected to the head-mounted device, the head-mounted device including a display, the control method comprising:
the mobile terminal collects control information;
the mobile terminal determines a control instruction of the display according to the control information;
and the head-mounted device controls the display to display according to the control instruction.
2. The method according to claim 1, wherein the determining, by the mobile terminal, the control instruction of the display according to the control information comprises:
the mobile terminal determines a control gesture according to the control information;
and the mobile terminal determines the control instruction according to the control gesture.
3. The control method according to claim 2, wherein the mobile terminal comprises a motion sensor, the control information comprises motion information, and the mobile terminal determines a control gesture according to the control information, comprising:
the mobile terminal determines the motion information according to the output data of the motion sensor;
and the mobile terminal determines the control gesture according to the motion information.
4. The control method according to claim 2, wherein the mobile terminal comprises a touch sensor, the control information comprises slide information, and the mobile terminal determines a control gesture according to the control information, comprising:
the mobile terminal determines the sliding information according to the output data of the touch sensor;
and the mobile terminal determines the control gesture according to the sliding information.
5. The control method according to claim 1, wherein the head-mounted device includes a device processor connected to the display, wherein the mobile terminal includes a terminal processor connected to the display, wherein a computing power of the terminal processor is greater than a computing power of the device processor, and wherein the control method includes:
the head-mounted device determining performance requirements of the head-mounted device;
and the head-mounted device selects the device processor or the terminal processor to output a signal for controlling the display according to the performance requirement.
6. The control method of claim 5, wherein the headset determines performance requirements of the headset, comprising:
the headset determines a current application of the headset;
the head-mounted device determines the performance requirements according to the current application.
7. The method of claim 5, wherein the performance requirements include a first performance requirement and a second performance requirement, the first performance requirement being greater than the second performance requirement, and wherein selecting the device processor or the terminal processor to output a signal to control the display by the head-mounted device based on the performance requirements comprises:
under the condition that the performance requirement is the first performance requirement, the head-mounted device selects the terminal processor to output a signal for controlling the display according to the performance requirement;
and under the condition that the performance requirement is the second performance requirement, the head-mounted device selects the device processor to output a signal for controlling the display according to the performance requirement.
8. A control method for a head-mounted device, wherein the head-mounted device comprises a display, and the head-mounted device is used for connecting a mobile terminal, the control method comprising:
acquiring a control instruction sent by the mobile terminal, wherein the control instruction is determined by the mobile terminal according to control information acquired by the mobile terminal;
and controlling the display to display according to the control instruction.
9. The control method according to claim 8, wherein the head-mounted device includes a device processor connected to the display, wherein the mobile terminal includes a terminal processor connected to the display, wherein the terminal processor has a computing power greater than that of the device processor, and wherein the control method includes:
determining performance requirements of the head-mounted device;
and selecting the equipment processor or the terminal processor to output a signal for controlling the display according to the performance requirement.
10. The control method of claim 9, wherein determining the performance requirements of the head-mounted device comprises:
determining a current application of the headset;
determining the performance requirement according to the current application.
11. The method of claim 9, wherein the performance requirements include a first performance requirement and a second performance requirement, the first performance requirement being greater than the second performance requirement, and wherein selecting the device processor or the terminal processor to output a signal to control the display based on the performance requirements comprises:
under the condition that the performance requirement is the first performance requirement, selecting the terminal processor to output a signal for controlling the display according to the performance requirement;
and under the condition that the performance requirement is the second performance requirement, selecting the device processor to output a signal for controlling the display according to the performance requirement.
12. A control method for a mobile terminal, wherein the mobile terminal is connected with a head-mounted device, the head-mounted device comprises a display, the control method comprises:
acquiring control information acquired by the mobile terminal;
determining a control instruction of the display according to the control information;
and sending the control instruction to the head-mounted device so that the head-mounted device controls the display to display according to the control instruction.
13. The method of claim 12, wherein determining the control command for the display based on the control information comprises:
determining a control gesture according to the control information;
determining the control instruction according to the control gesture.
14. The method of claim 13, wherein the mobile terminal comprises a motion sensor, wherein the control information comprises motion information, and wherein determining the control gesture based on the control information comprises:
determining the motion information according to the output data of the motion sensor;
determining the control gesture according to the motion information.
15. The control method according to claim 13, wherein the mobile terminal includes a touch sensor, the control information includes slide information, and determining a control gesture according to the control information includes:
determining the sliding information according to the output data of the touch sensor;
and determining the control gesture according to the sliding information.
16. A control system, characterized by comprising a system processor, a head-mounted device, and a mobile terminal connected to the head-mounted device, wherein the system processor is connected to the head-mounted device and the mobile terminal, the head-mounted device comprises a display, and the system processor is configured to collect control information through the mobile terminal; to determine, through the mobile terminal, a control instruction for the display according to the control information; and to control, through the head-mounted device, the display to display according to the control instruction.
17. The control system of claim 16, wherein the system processor is configured to determine, by the mobile terminal, a control gesture based on the control information; and the control instruction is determined by the mobile terminal according to the control gesture.
18. The control system of claim 17, wherein the mobile terminal includes a motion sensor, wherein the control information includes motion information, and wherein the system processor is configured to determine, by the mobile terminal, the motion information based on output data from the motion sensor; and the control gesture is determined according to the motion information through the mobile terminal.
19. The control system of claim 17, wherein the mobile terminal includes a touch sensor, wherein the control information includes swipe information, and wherein the system processor is configured to determine, by the mobile terminal, the swipe information from output data of the touch sensor; and the control gesture is determined by the mobile terminal according to the sliding information.
20. The control system of claim 16, wherein the head-mounted device includes a device processor coupled to the display, wherein the mobile terminal includes a terminal processor coupled to the display, wherein the terminal processor has a greater computing power than the device processor, and wherein the system processor is configured to determine performance requirements of the head-mounted device via the head-mounted device; and to select, via the head-mounted device, the device processor or the terminal processor to output a signal for controlling the display according to the performance requirements.
21. The control system of claim 20, wherein the system processor is configured to determine, by the headset, a current application of the headset; and means for determining, by the headset, the performance requirement according to the current application.
22. The control system of claim 20, wherein the performance requirements include a first performance requirement and a second performance requirement, the first performance requirement being greater than the second performance requirement, the system processor being configured to select, via the head-mounted device, the terminal processor to output a signal for controlling the display according to the performance requirement if the performance requirement is the first performance requirement; and to select, via the head-mounted device, the device processor to output a signal for controlling the display according to the performance requirement if the performance requirement is the second performance requirement.
23. A head-mounted device, characterized by comprising a display and a device processor connected to the display, wherein the device processor is connected to a mobile terminal and is configured to acquire a control instruction sent by the mobile terminal, the control instruction being determined by the mobile terminal according to control information collected by the mobile terminal; and to control the display to display according to the control instruction.
24. The headset of claim 23, wherein the headset comprises a device processor coupled to the display, wherein the mobile terminal comprises a terminal processor coupled to the display, wherein the terminal processor has a greater computing power than the device processor, and wherein the device processor is configured to determine performance requirements of the headset; and to select the device processor or the terminal processor to output a signal for controlling the display according to the performance requirements.
25. The headset of claim 24, wherein the device processor is configured to determine a current application of the headset; and for determining said performance requirement in dependence on said current application.
26. The headset of claim 24, wherein the performance requirements include a first performance requirement and a second performance requirement, the first performance requirement being greater than the second performance requirement, the device processor being configured to select the terminal processor to output a signal for controlling the display according to the performance requirement if the performance requirement is the first performance requirement; and to select the device processor to output a signal for controlling the display according to the performance requirement if the performance requirement is the second performance requirement.
27. A mobile terminal, characterized in that the mobile terminal comprises a terminal processor, the terminal processor is connected to a head-mounted device, the head-mounted device comprises a display, and the terminal processor is configured to acquire control information collected by the mobile terminal; to determine a control instruction for the display according to the control information; and to send the control instruction to the head-mounted device so that the head-mounted device controls the display to display according to the control instruction.
28. The mobile terminal of claim 27, wherein the terminal processor is configured to determine a control gesture based on the control information; and for determining the control instruction from the control gesture.
29. The mobile terminal of claim 28, wherein the mobile terminal comprises a motion sensor, wherein the control information comprises motion information, and wherein the terminal processor is configured to determine the motion information based on output data from the motion sensor; and for determining the control gesture from the motion information.
30. The mobile terminal of claim 28, wherein the mobile terminal comprises a touch sensor, wherein the control information comprises slide information, and wherein the terminal processor is configured to determine the slide information according to output data of the touch sensor; and means for determining the control gesture from the swipe information.
CN201910506685.XA 2019-06-12 2019-06-12 Control method, head-mounted device, mobile terminal and control system Pending CN112083796A (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN201910506685.XA CN112083796A (en) 2019-06-12 2019-06-12 Control method, head-mounted device, mobile terminal and control system

Publications (1)

Publication Number Publication Date
CN112083796A 2020-12-15

Family

ID=73733537

Family Applications (1)

Application Number Title Priority Date Filing Date
CN201910506685.XA Pending CN112083796A (en) 2019-06-12 2019-06-12 Control method, head-mounted device, mobile terminal and control system

Country Status (1)

Country Link
CN (1) CN112083796A (en)

Patent Citations (9)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
JP2011186856A (en) * 2010-03-09 2011-09-22 Nec Corp Mobile terminal to be used with head mounted display as external display device
CN101890719A (en) * 2010-07-09 2010-11-24 中国科学院深圳先进技术研究院 Robot remote control device and robot system
CN104866110A (en) * 2015-06-10 2015-08-26 深圳市腾讯计算机系统有限公司 Gesture control method, mobile terminal and system
KR20160149066A (en) * 2015-06-17 2016-12-27 엘지전자 주식회사 Mobile terminal and method for controlling the same
CN105892636A (en) * 2015-11-20 2016-08-24 乐视致新电子科技(天津)有限公司 Control method applied to head-mounted device and head-mounted device
CN106445098A (en) * 2016-07-29 2017-02-22 北京小米移动软件有限公司 Control method and control apparatus used for head-mounted device, and mobile device
CN106708412A (en) * 2017-01-17 2017-05-24 亿航智能设备(广州)有限公司 Method and device for controlling intelligent terminals
CN107122009A (en) * 2017-05-23 2017-09-01 北京小鸟看看科技有限公司 It is a kind of to realize mobile terminal and wear method that display device is interacted, wear display device, back splint and system
CN109407757A (en) * 2018-02-09 2019-03-01 北京小米移动软件有限公司 Display system

Cited By (3)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN114355785A (en) * 2022-01-04 2022-04-15 南方科技大学 Device control method and device, electronic device and readable storage medium
CN114355785B (en) * 2022-01-04 2024-04-26 南方科技大学 Device control method, device, electronic device and readable storage medium
CN115334274A (en) * 2022-08-17 2022-11-11 上海疆通科技有限公司 Remote assistance method and device based on augmented reality

Similar Documents

Publication Publication Date Title
CN212181161U (en) Wearable imaging device
US8706170B2 (en) Miniature communications gateway for head mounted display
US20180210544A1 (en) Head Tracking Based Gesture Control Techniques For Head Mounted Displays
CN110398840B (en) Method for adjusting optical center distance, head-mounted device and storage medium
US10261579B2 (en) Head-mounted display apparatus
CN107436683B (en) Display controller, electronic device and virtual reality device
US9122307B2 (en) Advanced remote control of host application using motion and voice commands
US7593757B2 (en) Mobile information apparatus
US8862186B2 (en) Lapel microphone micro-display system incorporating mobile information access system
US9500867B2 (en) Head-tracking based selection technique for head mounted displays (HMD)
US20140111427A1 (en) LifeBoard - Series Of Home Pages For Head Mounted Displays (HMD) That Respond to Head Tracking
WO2020238462A1 (en) Control method, wearable device, and storage medium
KR20160033605A (en) Apparatus and method for displying content
US11947728B2 (en) Electronic device for executing function based on hand gesture and method for operating thereof
CN112083796A (en) Control method, head-mounted device, mobile terminal and control system
CN110275620B (en) Interaction method, interaction device, head-mounted equipment and storage medium
KR20220100143A (en) Wearable electoronic device and input structure using motion sensor in the same
US20230152900A1 (en) Wearable electronic device for displaying virtual object and method of controlling the same
CN209951534U (en) System for subverting traditional visual habits and performing brain-eye coordination movement image training
CN112558847B (en) Method for controlling interface display and head-mounted display
CN111948807B (en) Control method, control device, wearable device and storage medium
CN112068693A (en) Control method, head-mounted device, server and computing system
US20240087221A1 (en) Method and apparatus for determining persona of avatar object in virtual space
JP2018092206A (en) Head-mounted display, program, and method for controlling head-mounted display
KR20240030863A (en) Electronic devices for preventing display damage and method thereof

Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination