WO2023178667A1 - Head-mounted display device and control method - Google Patents

Head-mounted display device and control method

Info

Publication number
WO2023178667A1
Authority
WO
WIPO (PCT)
Prior art keywords
head
display device
mounted display
user
touch panel
Prior art date
Application number
PCT/CN2022/083108
Other languages
English (en)
French (fr)
Inventor
黄敏
徐振华
Original Assignee
深圳市大疆创新科技有限公司
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by 深圳市大疆创新科技有限公司
Priority to PCT/CN2022/083108
Publication of WO2023178667A1

Classifications

    • G — PHYSICS
    • G02 — OPTICS
    • G02B — OPTICAL ELEMENTS, SYSTEMS OR APPARATUS
    • G02B27/00 — Optical systems or apparatus not provided for by any of the groups G02B1/00 - G02B26/00, G02B30/00
    • G02B27/01 — Head-up displays
    • G — PHYSICS
    • G06 — COMPUTING; CALCULATING OR COUNTING
    • G06F — ELECTRIC DIGITAL DATA PROCESSING
    • G06F3/00 — Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
    • G06F3/01 — Input arrangements or combined input and output arrangements for interaction between user and computer

Definitions

  • the present application relates to the technical field of head-mounted devices, and in particular, to a head-mounted display device and a control method.
  • buttons and other interactive components are usually provided on the head-mounted display device.
  • the user's hand may accidentally touch the interactive component (such as the touch panel) of the head-mounted display device that detects the user's input operation. Incorrect input then occurs, which affects the user's normal use and degrades the user's experience.
  • this application is proposed to provide a head-mounted display device and a control method that solve the above problems.
  • a head-mounted display device including:
  • a device main body having a wearing surface facing the user's face and a plurality of outer surfaces
  • an interactive component, at least one of the outer surfaces being provided with the interactive component;
  • a detection sensor, at least one detection sensor being provided on the wearing surface;
  • when the detection sensor detects that the device body is in a worn state, the device body controls the interactive component to transition to an unlocked state;
  • when the detection sensor detects that the device body is in a non-worn state, the device body controls the interactive component to transition to a locked state.
  • a method for controlling a head-mounted display device, where the head-mounted display device includes a touch panel for detecting touch operations of a user wearing the head-mounted display device and a sensor for outputting sensing data, the method including:
  • a head-mounted display device including a touch panel for detecting touch operations of a user wearing the head-mounted display device, a sensor for outputting sensing data, and a processor, wherein,
  • the processor is used to perform the following operations:
  • the technical solution provided by the embodiments of the present application can detect, through the detection sensor, whether the head-mounted display device is being worn, so that according to the detection result the device body controls the interactive component to switch to an unlocked state or a locked state. This makes the device convenient for the user to use while preventing misoperation of the interactive component, improving the user experience.
  • Figure 1 is a schematic structural diagram of the wearing surface of a head-mounted display device provided by an embodiment of the present application
  • Figure 2 is a schematic structural diagram of one side of the head-mounted display device provided by an embodiment of the present application.
  • Figure 3 is a schematic structural diagram of the front of a head-mounted display device provided by an embodiment of the present application.
  • Figure 4 is a schematic flowchart of an interaction process of an interactive component provided by an embodiment of the present application.
  • FIG. 5 is a schematic structural diagram of a wearing surface of a head-mounted display device provided by another embodiment of the present application.
  • As shown in Figures 1 to 3, an embodiment of the present application provides a head-mounted display device including: a device body 10, an interactive component, and a detection sensor 30.
  • the device body 10 has a wearing surface 11 facing the user's face and a plurality of outer surfaces 12 .
  • At least one outer side 12 is provided with an interactive component, which is used to detect the user's input operation.
  • the head-mounted display device can generate control instructions based on the user's operations detected by the interactive components, and perform work tasks according to the control instructions.
  • At least one detection sensor 30 is provided on the wearing surface 11. In some cases, the detection sensor 30 can be arranged at other positions.
  • When the detection sensor 30 detects that the device body 10 is in a worn state, the device body 10 controls the interactive component to transition to the unlocked state.
  • When the detection sensor 30 detects that the device body 10 is in a non-worn state, the device body 10 controls the interactive component to transition to the locked state.
  • the device body 10 includes but is not limited to video playback glasses, VR (Virtual Reality) glasses, and AR (Augmented Reality) glasses.
  • the device body 10 has a front, a back, a top, a bottom, a left and a right.
  • the back is facing the direction of the user, that is, the back is the wearing surface 11.
  • the front, top, bottom, left, and right surfaces are all outer surfaces 12 of the device body 10.
  • In one embodiment, the front, top, bottom, left, and right surfaces are all independent planes; together with the back surface, they give the device body 10 a roughly hexahedral structure.
  • In another embodiment, the back surface is an independent plane while the front, top, bottom, left, and right surfaces all lie on the same arc surface; together with the back surface, they give the device body 10 a roughly hemispherical structure.
  • the multiple outer sides 12 of the device body 10 can also have other structures, which are not specifically limited here.
  • the device body 10 is provided with a control system.
  • the control system is communicatively connected with the interactive component and the detection sensor 30.
  • interaction data from the interactive component can be transmitted to the control system, so that the control system controls the device body 10 to respond accordingly.
  • the detection sensor 30 sends the detected detection result to the control system, so that the control system controls the interactive component to convert between an unlocked state and a locked state.
  • the unlocked state of the interactive component means that the user can interact with the interactive component through contact and/or non-contact means.
  • For example, the user interacts with the interactive component through sliding, clicking, voice, etc.; the interactive component then sends sensing data to the control system so that the control system responds accordingly, thereby controlling the head-mounted display device.
  • the locked state can be implemented in at least two ways.
  • In one way, the user can still interact with the interactive component, and the interactive component can still recognize the user's interactive actions; however, the interactive component does not send sensing data to the control system, and the head-mounted display device does not respond to the user's interactive actions.
  • This state can also be called the interactive component being in a standby state.
  • In this state, if the interactive component recognizes a preset instruction, it can send the sensing data corresponding to that instruction to the control system, so that the control system responds accordingly to control the head-mounted display device.
  • In other words, when the interactive component is in the locked state, if the user's interactive action is a preset action, the interactive component can still send the corresponding sensing data to the control system after recognizing it. This method will be explained in detail in subsequent embodiments and is not described in detail here.
  • In the other way, the interactive component does not recognize the user's interactive actions at all, so it sends no sensing data to the control system; the head-mounted display device then does not respond to the user's touch operations.
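The unlocked / standby / fully-off behavior described above can be pictured as a small state machine. The sketch below is illustrative only — the class, state, and gesture names are assumptions, not part of the patent:

```python
from enum import Enum

class PanelState(Enum):
    UNLOCKED = "unlocked"   # all input is forwarded to the control system
    STANDBY = "standby"     # input is recognized, but only preset gestures are forwarded
    OFF = "off"             # recognition is switched off entirely

class InteractiveComponent:
    """Sketch of the two locking variants described in the text."""

    def __init__(self, preset_gestures=("two_finger_long_press",)):
        self.state = PanelState.UNLOCKED
        self.preset_gestures = set(preset_gestures)

    def handle_input(self, gesture):
        """Return the sensing data to send to the control system, or None."""
        if self.state is PanelState.UNLOCKED:
            return gesture                      # normal interaction
        if self.state is PanelState.STANDBY:
            # Locked variant 1: only preset gestures still reach the control system.
            return gesture if gesture in self.preset_gestures else None
        return None                             # locked variant 2: nothing is recognized

panel = InteractiveComponent()
panel.state = PanelState.STANDBY
assert panel.handle_input("swipe_down") is None
assert panel.handle_input("two_finger_long_press") == "two_finger_long_press"
```

In the standby variant, ordinary gestures are silently dropped while the preset gesture still reaches the control system, which is what later allows an active unlock while locked.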
  • the interactive components can be arranged on one outer side 12 or multiple outer sides 12 according to different requirements. For example, to facilitate the operation of the user's right hand, the interactive components can be arranged on the right side.
  • the number of detection sensors 30 can be set according to different requirements, such as one or more, and can be set at different positions on the wearing surface 11 according to different requirements to facilitate wearing detection.
  • Before the head-mounted display device is worn, the interactive component is in a locked state.
  • When the user picks up the device, the user's hand may touch the interactive component.
  • Because the interactive component is locked, no sensing information is sent out, and the device body 10 does not respond.
  • Once the detection sensor 30 detects that the device body 10 has been worn on the user's head and sends the detection result to the control system in the device body 10, the control system controls the interactive component to transition from the locked state into an unlocked state for user interaction.
  • the technical solution provided by the embodiments of the present application can detect, through the detection sensor 30, whether the head-mounted display device is worn, so that the device body 10 controls the interactive component to switch to an unlocked state or a locked state according to the detection result. This makes the device convenient to use while preventing misoperation of the interactive component, improving the user experience.
  • one way of arranging the detection sensor 30 is that the wearing surface 11 has an eyebrow area corresponding to the position of the user's eyebrows, and the detection sensor 30 is provided in the eyebrow area.
  • In this position, the detection sensor 30 can perform detection more conveniently, making the detection results more accurate.
  • the position of the detection sensor 30 will not affect other components on the device body 10, such as the setting of the lenses in the eye area.
  • the detection sensor 30 can also be arranged in the area around the eye area, the area above the nose area, the forehead area, etc. according to different needs.
  • the detection sensor 30 may be provided outside the device body 10 or inside the device body 10 according to different requirements.
  • the wearing surface 11 is provided with a detection hole for the detection sensor 30 to perform detection.
  • the detection sensor 30 transmits detection signals through the detection hole.
  • Arranging the detection sensor 30 inside the device body 10 protects it from interference by other objects and the human body, making detection more accurate; it also effectively prevents damage and extends the service life of the sensor.
  • the detection sensor 30 can be implemented in a variety of ways.
  • One implementation of the detection sensor 30 is that it includes one or more of a distance sensor, a brightness sensor, a temperature sensor, a vision sensor, an iris sensor, and a contact sensor.
  • the detection direction of the detection sensor 30 is toward the direction of the user's head when the head-mounted display device is worn.
  • the detection method of the distance sensor is to detect the distance of the object relative to the distance sensor.
  • When the detected distance of the object is less than a preset distance threshold, it means that the device body 10 is worn, and the device body 10 controls the interactive component to switch to an unlocked state for the user's use.
  • When the detected distance of the object is greater than or equal to the preset distance threshold, it means that the device body 10 is not worn, and the device body 10 controls the interactive component to switch to a locked state to prevent the interactive component from being accidentally touched.
  • the detection method of the brightness sensor is to detect the brightness of light.
  • When the detected brightness of the light is less than a preset brightness threshold, it means that the device body 10 is worn, and the device body 10 controls the interactive component to switch to an unlocked state for the user's use.
  • When the detected brightness of the light is greater than or equal to the preset brightness threshold, it means that the device body 10 is not worn, and the device body 10 controls the interactive component to switch to a locked state to prevent accidental touches of the interactive component and avoid unnecessary operation inputs.
  • the temperature sensor performs detection by measuring temperature. When the detected temperature is greater than or equal to a preset temperature threshold, it means that the device body 10 is worn, and the device body 10 controls the interactive component to switch to an unlocked state for the user's use. When the detected temperature is less than the temperature threshold, it means that the device body 10 is not worn, and the device body 10 controls the interactive component to switch to a locked state to prevent the interactive component from being accidentally touched.
  • the vision sensor and the iris sensor perform detection by detecting the user's eyes. When the user's eyes are detected within a preset detection range, it means that the device body 10 is worn, and the device body 10 controls the interactive component to switch to an unlocked state for the user's use. When the user's eyes cannot be detected, it means that the device body 10 is not worn, and the device body 10 controls the interactive component to switch to a locked state.
  • One implementation of the contact sensor is a pressure detection sensor.
  • When the detected pressure is greater than or equal to a preset pressure threshold, it means that the device body 10 is worn, and the device body 10 controls the interactive component to switch to an unlocked state for the user's use.
  • the detected pressure is less than the preset pressure threshold, it means that the device body 10 is not worn, and the device body 10 controls the interactive component to switch to a locked state.
  • the various implementation methods of the detection sensor 30 can be implemented separately or can be used in combination with each other, and are not specifically limited in the embodiments of the present application.
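The threshold comparisons for the distance, brightness, temperature, and pressure sensors above can be combined into a single wear-detection check. This is a hedged sketch: the threshold values, units, and the "all available sensors must agree" policy are illustrative assumptions, since the text allows the sensor types to be used separately or combined:

```python
def is_worn(readings,
            distance_threshold_mm=30,
            brightness_threshold_lux=10,
            temperature_threshold_c=30,
            pressure_threshold=0.5):
    """Decide whether the device body is worn from whichever sensors are present.

    Each rule mirrors the text: worn when distance/brightness fall BELOW their
    thresholds (the face blocks the sensor and the ambient light), and when
    temperature/pressure rise ABOVE theirs (skin contact).
    """
    votes = []
    if "distance_mm" in readings:
        votes.append(readings["distance_mm"] < distance_threshold_mm)
    if "brightness_lux" in readings:
        votes.append(readings["brightness_lux"] < brightness_threshold_lux)
    if "temperature_c" in readings:
        votes.append(readings["temperature_c"] >= temperature_threshold_c)
    if "pressure" in readings:
        votes.append(readings["pressure"] >= pressure_threshold)
    # Here we require every available sensor to agree; a real device might
    # instead use any single sensor, as the embodiments permit.
    return bool(votes) and all(votes)

assert is_worn({"distance_mm": 12, "temperature_c": 33}) is True
assert is_worn({"brightness_lux": 800}) is False   # bright ambient light: not worn
```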
  • the interactive component can be implemented in a variety of ways.
  • One possible implementation manner of the interactive component is that the interactive component is at least one of a touch panel 20 , a control button, and a sound collector.
  • When the interactive component is a sound collector, the user can issue control instructions through voice; the voice is collected by the sound collector, thereby realizing interaction with the device body 10.
  • If the detection sensor 30 detects that the device body 10 is not worn, the device body 10 controls the sound collector to switch to the locked state. At this time, if the user speaks to the sound collector, the sound collector does not send corresponding control instructions to the device body 10, effectively preventing the sound collector from being misoperated.
  • When the interactive component is a control button, the user can manipulate the control button by pressing, pushing, pulling, etc., thereby emitting corresponding sensing data and realizing interaction with the device body 10.
  • If the detection sensor 30 detects that the device body 10 is not worn, the device body 10 controls the control button to switch to a locked state. At this time, if the user presses, pushes, or pulls the control button, the control button does not send corresponding sensing data to the device body 10, effectively preventing the control button from being misoperated.
  • the touch panel 20 includes but is not limited to a capacitive touch panel.
  • A capacitive touch panel senses the human body approaching it: when a human body comes close, the capacitance between the panel and ground changes, generally on the pF (picofarad) level.
  • A microprocessor calculates the detected change in capacitance and filters out interference to finally determine whether a human body is close, thereby realizing the button function.
  • If the detection sensor 30 detects that the device body 10 is not worn, the device body 10 controls the touch panel 20 to switch to the locked state. At this time, if the user clicks, presses, or slides on the touch panel 20, the touch panel 20 does not send corresponding sensing data to the device body 10, effectively preventing the touch panel 20 from being misoperated.
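The capacitive sensing principle above — a pF-level capacitance change, filtered by a microprocessor to reject interference — might be sketched as follows. All values and names here are illustrative assumptions; real controllers do this in dedicated hardware:

```python
def touch_detected(samples_pf, baseline_pf, delta_threshold_pf=1.5, window=4):
    """Detect a finger from raw capacitance samples (in picofarads).

    A single noisy spike is filtered out by requiring the average change over
    a short window to exceed the threshold, just as the microprocessor in the
    text filters interference before deciding a human body is close.
    """
    if len(samples_pf) < window:
        return False
    recent = samples_pf[-window:]
    avg_delta = sum(recent) / window - baseline_pf
    return avg_delta >= delta_threshold_pf

baseline = 10.0  # pF, panel capacitance with no finger nearby
assert touch_detected([10.1, 9.9, 14.2, 10.0], baseline) is False  # one spike: noise
assert touch_detected([12.1, 12.3, 12.0, 12.4], baseline) is True  # sustained change
```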
  • The control button protrudes from the surface of the outer side 12, which helps the user locate it: accidental touches are avoided when it is not in use, and it can be accurately found and operated when needed.
  • When the interactive component is the touch panel 20, however, the touch panel 20 may protrude from the surface of the outer side 12 or may be flush with it, making it relatively difficult to locate.
  • Therefore, the outer side 12 on which the touch panel 20 is located is provided with a first identification structure 121 corresponding to the circumferential periphery of the touch panel 20. The user can position the touch panel 20 through the first identification structure 121. For example, after putting on the device body 10, the user feels around on the device body 10 with the hands; once the first identification structure 121 is found, the user can infer the relative position of the touch panel 20 from it, so as to avoid or operate the touch panel 20.
  • One possible arrangement is that the first identification structure 121 is connected to the circumferential edge of the touch panel 20; that is, the first identification structure 121 is set at the edge of the touch panel 20.
  • When the user touches the first identification structure 121, he or she knows that the edge of the touch panel 20 has been reached. To avoid an accidental touch, the user simply stops feeling "forward"; to operate the touch panel 20, the user continues feeling "forward" from the position of the first identification structure 121 to find it.
  • Another possible way to set the position of the first identification structure 121 is to space the first identification structure 121 away from the circumferential edge of the touch panel 20 by a first distance.
  • When the user touches the first identification structure 121, he or she knows that the touch panel 20 is close. To avoid an accidental touch, the user stops feeling "forward".
  • To operate the touch panel 20, the user continues feeling "forward" from the position of the first identification structure 121 until the touch panel 20 is found. Because a certain distance remains between the first identification structure 121 and the touch panel 20, accidental touches can still be effectively avoided even when the user does not stop feeling in time.
  • the first identification structure 121 can be implemented in a variety of ways.
  • One possible implementation, as shown in Figures 2 and 3, is that the first identification structure 121 is an annular first protruding structure or first groove structure disposed around the circumferential periphery of the touch panel 20.
  • the protrusions or grooves are structures relative to the surface of the outer side 12 , which can be perceived by the user through touch, thus serving as a reminder.
  • In some embodiments, multiple annular first identification structures 121 may be arranged at intervals along the radial direction of the touch panel 20. The first identification structure 121 closest to the touch panel 20 is connected to, or separated by a first distance from, the edge of the touch panel 20, and each first identification structure 121 is separated from its adjacent first identification structure 121 by a second distance.
  • In another implementation, the first identification structure 121 is a plurality of second protruding structures and/or a plurality of second groove structures arranged at intervals along the circumferential periphery of the touch panel 20.
  • the second protruding structure may be a bump, and the second groove structure may be a pit.
  • a plurality of second protruding structures or a plurality of second groove structures are arranged into one or more annular structures surrounding the touch panel 20 to serve as a reminder.
  • The first identification structure 121 can also be formed by combining a plurality of second protruding structures with a plurality of second groove structures; these can be arranged at regular or irregular intervals to create a rough tactile feel for easy prompting.
  • Another possible way is to provide a second identification structure 122 on the touch panel 20; the second identification structure 122 is a third protruding structure or a third groove structure disposed on the touch panel 20.
  • After the user has put on the device body 10, the touch panel 20 is in an unlocked state, and the user can use the touch panel 20 for control.
  • When the user touches the touch panel 20, however, he or she does not immediately know the relative position of the touch, which may affect usage. For example, referring to Figure 2, when the user touches area A of the touch panel 20 and slides downward, it is easy to slide out of the control area of the touch panel 20, so that the control command cannot be fully issued.
  • the user can search to find the second identification structure 122 before issuing the manipulation instruction, and thereby determine the position where the touch panel 20 is touched based on the second identification structure 122 .
  • The second identification structure 122 can be arranged in a variety of ways. One possible way is to set the second identification structure 122 in the center area of the touch panel 20, so that the user can find the second identification structure 122 by feel before issuing a control instruction, determine the location of the center area of the touch panel 20 from it, and then control the touch panel 20 around the center area, avoiding sliding out of the control area and improving usage efficiency.
  • In another arrangement, a plurality of second identification structures 122 are evenly distributed on the surface of the touch panel 20.
  • In this way, the user can find the second identification structures 122 by feel, and determine from their positions where the touch panel 20 is being touched, so as to control it, avoid sliding out of the control area, and improve usage efficiency.
  • This evenly distributed arrangement and the central-area arrangement of the second identification structure 122 can be implemented separately or combined with each other; the embodiments of the present application do not specifically limit this.
  • the interactive components are in an unlocked state, which can facilitate the user to perform interactive operations.
  • In some scenarios, however, the user wears the device body 10 but does not need interactive operations for a long period of time.
  • the user watches a movie through the device body 10, and the movie playback time is usually about one or two hours.
  • User interaction is generally not required during movie playback.
  • the user can choose to actively lock the interactive component.
  • One implementable active locking operation is: while the detection sensor 30 detects that the device body 10 is in the worn state, if the interactive component detects a first preset instruction, the device body 10 controls the interactive component to switch to the locked state.
  • one way to issue the first preset instruction is that during use, if the user wants to actively lock the touch panel 20 , he or she can long press with two fingers for a preset time, for example, press and hold with two fingers for 2 seconds. Thus, the touch panel 20 can be triggered to be locked, thereby preventing the touch panel 20 from being misoperated.
  • The interactive component converted to the locked state through the first preset instruction behaves as follows: the user can still interact with the interactive component, and the interactive component can still recognize the user's interactive actions; however, the interactive component does not send control instructions to the control system, and the head-mounted display device does not respond to the user's interactions.
  • an active unlocking operation that can be implemented is that when the interactive component detects the second preset command, the device body 10 controls the interactive component to switch to the unlocked state.
  • one way of issuing the second preset instruction is that when the touch panel 20 is in a locked state, the interactive component can still recognize the user's interactive action. If the user wants to actively unlock the touch panel 20 , You can press and hold with two fingers for a preset duration, for example, press and hold with two fingers for 2 seconds, thereby triggering the unlocking of the touch pad 20 so that the touch pad 20 can be interactively operated.
  • When the interactive component is a control button, it can likewise be locked or unlocked by a long press.
  • Alternatively, a predetermined voice command, such as "lock interaction" or "unlock", can be issued to lock or unlock the interactive component.
  • The "long press", "predetermined voice", and similar methods above are cases where, with the interactive component in the locked state, the user's interactive action is a predetermined action; after the interactive component recognizes it, the corresponding sensing data can still be sent to the control system.
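The two-finger long press described above (for example, held for 2 seconds) toggles between locked and unlocked while the device stays worn. A minimal sketch — the gesture representation and function name are assumptions:

```python
LONG_PRESS_SECONDS = 2.0  # preset duration from the example in the text

def update_lock_state(locked, finger_count, press_duration_s):
    """Toggle the touch panel's lock state on a two-finger long press.

    While locked, ordinary gestures are ignored, but this preset gesture is
    still recognized so the user can actively unlock again.
    """
    if finger_count == 2 and press_duration_s >= LONG_PRESS_SECONDS:
        return not locked   # lock if unlocked, unlock if locked
    return locked           # any other gesture leaves the state unchanged

state = False                                   # unlocked while worn
state = update_lock_state(state, 2, 2.3)        # two-finger long press -> locked
assert state is True
state = update_lock_state(state, 1, 3.0)        # one-finger press: ignored
assert state is True
state = update_lock_state(state, 2, 2.0)        # preset gesture again -> unlocked
assert state is False
```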
  • In one implementation, the device body 10 has a display 40.
  • In addition to displaying video images, the display 40 on the device body 10 is also used to display a corresponding prompt screen while the interactive component is being unlocked or locked.
  • The prompt screen includes but is not limited to a corresponding unlocking or locking icon, progress bar, numerical value, etc.
  • the display of the prompt screen will not affect the currently displayed video screen.
  • the prompt screen will be displayed in the form of a floating window, such as displayed at the edge of the video screen.
  • In one implementation, the device body 10 has a speaker; when the interactive component transitions to the unlocked or locked state, the speaker emits a corresponding prompt sound. In addition to playing the audio of video images, the speaker on the device body 10 is also used to play the corresponding prompt sound when the interactive component is unlocked or locked.
  • The prompt sounds include but are not limited to voice prompts, beeps, etc. Playback of the prompt sound does not affect the sound of the video being displayed and does not affect the user experience.
  • the user can also control the head-mounted display device through the terminal device.
  • the control system provided in the device body 10 can receive instructions from the terminal device and respond accordingly, thereby controlling the head-mounted display device.
  • the user can control the switching of the locked state and the unlocked state of the interactive component through the terminal device, and can also control the playback, pause, screen switching and other various functions of the device body 10 .
  • Terminal devices include but are not limited to smartphones, various types of computers, remote controls, handles, etc.
  • embodiments of the present application also provide a control method for the head-mounted display device, which method can be used to control the head-mounted display device in the above embodiments.
  • The head-mounted display device includes a touch panel 20 for detecting touch operations of a user wearing the head-mounted display device, and a detection sensor 30 for outputting sensing data. The method includes:
  • Step S101: Determine, based on the sensing data, whether the user wearing the head-mounted display device has taken off the head-mounted display device.
  • When the user takes off the head-mounted display device, the detection sensor 30 detects the corresponding state, generates corresponding sensing data, and transmits the sensing data to the control system in the device body 10. After receiving the sensing data, the control system can determine that the user has taken off the head-mounted display device, and step S102 is then executed.
  • Step S102: In response to determining that the user has taken off the head-mounted display device, generate a locking instruction for the touch panel 20.
  • When the control system determines from the sensing data that the user has taken off the head-mounted display device, that is, that the head-mounted display device is not worn, the touch panel 20 does not need to be used. To avoid misoperation of the touch panel 20, the control system generates a locking instruction for the touch panel 20 so as to lock it.
  • Step S103: Lock the touch panel 20 according to the locking instruction for the touch panel 20.
  • When the touch panel 20 is in the locked state, misoperation of the touch panel 20 can be effectively prevented, improving the user experience.
  • the touch panel 20 is in a locked state including but not limited to two ways.
  • One mode of the locked state means that the user can still interact with the touch panel 20: the touch panel 20 can still recognize the user's interactive actions, but at this time it does not send sensing data to the control system, and the head-mounted display device does not respond to the user's interactive actions. This state can also be called the touch panel 20 being in a standby state.
  • Further, in this mode, if the touch panel 20 recognizes a preset instruction, it can send sensing data corresponding to the preset instruction to the control system, so that the control system can respond accordingly to control the head-mounted display device.
  • It should be noted that when the touch panel 20 is in a locked state, if the user's interactive action is a preset action, the touch panel 20, after recognizing it, can still send corresponding sensing data to the control system. This mode will be described in detail in subsequent embodiments and will not be detailed here.
  • The other mode of the locked state is to turn off the detection and recognition function of the touch panel 20.
  • In this mode, the touch panel 20 does not recognize the user's interactive actions, so it does not send sensing data to the control system. In this way, the head-mounted display device does not respond to the user's touch operations on the touch panel.
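The two locked-state modes above can be contrasted in a short sketch. The mode names, the gesture strings, and the choice of preset gesture are illustrative assumptions, not terms from the application: in "standby" mode the panel still recognizes gestures but forwards only the preset instruction to the control system, while in "off" mode recognition is disabled and nothing is forwarded.

```python
# Hypothetical sketch of the two locked-state modes of the touch panel:
# "standby" still recognizes input but forwards only the preset
# (unlock) instruction; "off" disables recognition altogether.

PRESET_UNLOCK_GESTURE = "two_finger_long_press"  # illustrative preset action

def events_forwarded_to_control_system(lock_mode, gestures):
    """Return the gestures that actually reach the control system."""
    if lock_mode == "unlocked":
        return list(gestures)                 # everything is forwarded
    if lock_mode == "standby":
        # Panel recognizes gestures, but forwards only the preset one.
        return [g for g in gestures if g == PRESET_UNLOCK_GESTURE]
    if lock_mode == "off":
        return []                             # recognition disabled: nothing sent
    raise ValueError(f"unknown mode: {lock_mode}")

user_input = ["tap", "swipe", "two_finger_long_press"]
print(events_forwarded_to_control_system("standby", user_input))
print(events_forwarded_to_control_system("off", user_input))
```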
  • With the technical solution provided by the embodiments of the present application, the detection sensor 30 can detect whether the head-mounted display device has been taken off by the user, that is, whether it is being worn, and thereby switch the touch panel 20 to the locked state, which can effectively prevent the touch panel 20 from being misoperated and improve the user experience.
  • The control method also includes:
  • Step S201 Determine whether the user wears the head-mounted display device again according to the sensing data.
  • When the user wears the head-mounted display device again, the detection sensor 30 can detect the corresponding state, generate corresponding sensing data, and transmit the sensing data to the control system in the device body 10. After the control system receives the corresponding sensing data, it can determine that the user is wearing the head-mounted display device, and then step S202 is executed.
  • Step S202 In response to determining that the user re-wears the head-mounted display device, generate an unlocking instruction.
  • When the control system determines, based on the sensing data, that the user is wearing the head-mounted display device, that is, when the head-mounted display device is being worn, it indicates that the touch panel 20 needs to be used. The control system then directly generates an unlocking instruction to unlock the touch panel 20 so that the user can perform corresponding operations through the touch panel 20.
  • Step S203 Unlock the touch panel 20 according to the unlocking instruction.
  • After the touch panel 20 is unlocked, the user can interact with it through contact, such as sliding, clicking, etc., and then send control instructions to the control system through the touch panel 20, so that the control system responds accordingly and the head-mounted display device can be controlled.
  • When the head-mounted display device is not worn, the detection sensor 30 can detect the corresponding state and generate sensing data.
  • the control system generates a touch panel locking instruction according to the sensing data to lock the touch panel 20 , that is, the touch panel 20 is in a locked state, which can effectively prevent the touch panel 20 from being misoperated.
  • When the user picks up the head-mounted display device to put it on, the user's hand may touch the touch panel 20. At this time, the touch panel 20 is in a locked state, so no sensing data will be sent out, and the device will not make a corresponding response.
  • When the user wears the head-mounted display device on the head, the detection sensor 30 can detect that the head-mounted display device has been worn on the user's head and sends sensing data to the control system. The control system generates an unlocking instruction based on the corresponding sensing data and unlocks the touch panel 20; at this time, the touch panel 20 switches from the locked state to the unlocked state so that the user can perform interactive operations.
  • With the technical solution provided by the embodiments of the present application, the detection sensor 30 can detect whether the head-mounted display device is worn, so that the head-mounted display device controls the touch panel 20 to switch to the unlocked state or the locked state according to the detection result. This facilitates the user's use while also preventing the touch panel 20 from being misoperated, improving the user experience.
  • the detection sensor 30 can be implemented in a variety of ways.
  • One implementation of the detection sensor 30 is that the detection sensor 30 includes one or more of a distance sensor, a brightness sensor, a temperature sensor, a vision sensor, an iris sensor, and a contact sensor.
  • The detection direction of the detection sensor 30 faces the user's head when the head-mounted display device is worn.
  • The various implementations of the detection sensor 30 can be used separately or in combination with each other, and are not specifically limited in the embodiments of this application.
  • One way of arranging the detection sensor 30 is that the wearing surface 11 has a glabella area corresponding to the position between the user's eyebrows, and the detection sensor 30 is provided in the glabella area.
  • In some embodiments, the detection sensor 30 includes a distance sensor disposed between the two sets of optical components of the head-mounted display device. In this arrangement, detection by the detection sensor 30 is more convenient and the detection results are more accurate. At the same time, the position of the detection sensor 30 does not affect other components on the device body 10, such as the arrangement of the lenses in the eye area.
  • Besides this, the detection sensor 30 can also be arranged in the area around the eye area, the area above the nose area, the forehead area, etc., according to different needs.
  • the detection sensor 30 may be provided outside the device body 10 or inside the device body 10 according to different requirements.
  • The wearing surface 11 is provided with a detection hole for the detection sensor 30 to perform detection, and the detection sensor 30 transmits detection signals through the detection hole. In this way, the detection sensor 30 is protected by the device body 10, which prevents interference from other objects and the human body and makes detection more accurate; it can also effectively prevent damage and improve the service life.
  • the detection sensor 30 includes a distance sensor.
  • the detection method of the distance sensor is to detect the distance of the object relative to the distance sensor, thereby generating different sensing data to determine whether the head-mounted display device is being worn.
  • the detection sensor 30 includes a distance sensor disposed between two sets of optical components of the head-mounted display device.
  • Based on this, step S101, determining whether the user wearing the head-mounted display device takes off the head-mounted display device based on the sensing data, includes:
  • Step S1011 Determine whether the distance data output by the distance sensor is greater than or equal to a preset distance threshold.
  • Step S1012 If yes, determine that the user wearing the head-mounted display device takes off the head-mounted display device.
  • When the relative distance between an object and the distance sensor is less than the preset distance threshold, it means that the device body 10 is worn; the control system then generates an unlocking instruction, and the touch panel 20 is unlocked and switched to the unlocked state for the user to use.
  • When the relative distance between the object and the distance sensor is greater than or equal to the preset distance threshold, it means that the head-mounted display device is not worn; the control system then generates a touch panel locking instruction, and the touch panel 20 is switched to the locked state to prevent inadvertent touches on the touch panel 20.
  • the distance threshold can be set accordingly according to different requirements, and is not specifically limited in the embodiments of this application.
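Steps S1011 and S1012 amount to a single threshold comparison on the distance reading. A minimal sketch follows; the 40 mm threshold and the function name are arbitrary illustrative choices, since the application leaves the threshold to be set according to requirements.

```python
# Sketch of steps S1011-S1012: the device is judged "taken off" when the
# distance reported by the distance sensor reaches a preset threshold.
# The 40 mm value is an arbitrary illustrative threshold.

PRESET_DISTANCE_THRESHOLD_MM = 40

def user_took_off_device(distance_mm: float) -> bool:
    # Step S1011: compare the distance data against the preset threshold.
    # Step S1012: if the distance is greater than or equal to the threshold,
    # conclude that the user has taken off the head-mounted display device.
    return distance_mm >= PRESET_DISTANCE_THRESHOLD_MM

print(user_took_off_device(12.0))   # face close to the sensor: still worn
print(user_took_off_device(150.0))  # face far from the sensor: taken off
```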
  • the detection sensor 30 includes a distance sensor and a brightness sensor.
  • The detection method of the distance sensor is to detect the distance of an object relative to the distance sensor, and the detection method of the brightness sensor is to detect the brightness of light, thereby generating different sensing data to determine whether the head-mounted display device is being worn.
  • Based on this, step S101, determining whether the user wearing the head-mounted display device takes off the head-mounted display device based on the sensing data, includes:
  • Step S1013 Determine whether the distance data output by the distance sensor is greater than or equal to the preset distance threshold, and determine whether the brightness data output by the brightness sensor is greater than or equal to the preset brightness threshold;
  • Step S1014 If the distance data is greater than or equal to the preset distance threshold and the brightness data is greater than or equal to the preset brightness threshold, determine that the user wearing the head-mounted display device takes off the head-mounted display device.
  • the distance sensor can also be combined with the brightness sensor for joint detection.
  • The detection method of the brightness sensor is to detect the brightness of light. When the brightness of the light is less than the brightness threshold, it means that the head-mounted display device is worn, and the control system then controls the touch panel 20 to switch to the unlocked state for the user to use.
  • When the brightness of the light is greater than or equal to the brightness threshold, it means that the head-mounted display device is not worn, and the control system controls the touch panel 20 to switch to the locked state to prevent accidental touches on the touch panel 20.
  • the brightness threshold can be set accordingly according to different requirements, and is not specifically limited in the embodiments of this application.
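Steps S1013 and S1014 combine the two readings with a logical AND: only when both the distance and the brightness reach their thresholds is the device judged to have been taken off. The sketch below uses arbitrary illustrative threshold values; the idea is that a worn device both sits close to the face and blocks ambient light.

```python
# Sketch of steps S1013-S1014: joint detection by distance and brightness.
# Both conditions must hold before the device is judged as taken off;
# the threshold values are arbitrary illustrative choices.

DISTANCE_THRESHOLD_MM = 40
BRIGHTNESS_THRESHOLD_LUX = 5   # a worn device blocks ambient light, so it is dark

def user_took_off_device(distance_mm: float, brightness_lux: float) -> bool:
    distance_says_off = distance_mm >= DISTANCE_THRESHOLD_MM
    brightness_says_off = brightness_lux >= BRIGHTNESS_THRESHOLD_LUX
    # Step S1014: judged as taken off only if BOTH sensors agree.
    return distance_says_off and brightness_says_off

print(user_took_off_device(150.0, 200.0))  # far and bright: taken off
print(user_took_off_device(150.0, 1.0))    # far but dark: not judged taken off
```

Requiring both conditions makes the decision more robust than either sensor alone, for example when the device is set down in a dark room or when a hand briefly passes in front of the distance sensor.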
  • the implementation of the detection sensor 30 also includes one or more of a temperature sensor, a vision sensor, an iris sensor, and a contact sensor.
  • The temperature sensor performs detection by measuring temperature. When the detected temperature is greater than or equal to a preset temperature threshold, it means that the head-mounted display device is worn, and the control system controls the touch panel 20 to switch to the unlocked state for the user to use. When the detected temperature is less than the preset temperature threshold, it means that the head-mounted display device is not worn, and the control system controls the touch panel 20 to switch to the locked state to prevent accidental touches on the touch panel 20.
  • The vision sensor and the iris sensor perform detection by detecting the user's eyes. When the user's eyes are detected within a preset detection range, it means that the head-mounted display device is worn, and the control system controls the touch panel 20 to switch to the unlocked state for the user to use. When the user's eyes cannot be detected, it means that the head-mounted display device is not worn, and the control system controls the touch panel 20 to switch to the locked state.
  • The contact sensor is a pressure detection sensor. When the detected pressure is greater than or equal to a preset pressure threshold, it means that the head-mounted display device is worn, and the control system controls the touch panel 20 to switch to the unlocked state for the user to use. When the detected pressure is less than the preset pressure threshold, it means that the head-mounted display device is not worn, and the control system controls the touch panel 20 to switch to the locked state.
  • each threshold in the above content can be set accordingly according to different needs, and is not specifically limited in the embodiments of this application.
  • When the touch panel 20 is in the unlocked state, it is convenient for the user to perform interactive operations. However, in some scenarios, even when the user wears the head-mounted display device, no interaction is needed for a long period of time. For example, when the user watches a movie through the head-mounted display device, playback usually lasts an hour or two, and the user usually does not need to interact during that time. In such cases, the user can choose to actively lock the interactive component.
  • The control method of the head-mounted display device also includes:
  • Step S301 Obtain a touch panel locking instruction sent by a terminal device communicatively connected to the head-mounted display device, where the touch panel locking instruction is generated by the terminal device detecting the user's locking operation for the touch panel 20.
  • the user can also control the head-mounted display device through the terminal device.
  • the control system provided in the device body 10 can receive instructions from the terminal device and respond accordingly, thereby controlling the head-mounted display device.
  • the user can control the transition between the locked state and the unlocked state of the touch panel 20 through the terminal device, and can also control playback, pause, screen switching and other functions of the head-mounted display device.
  • Terminal devices include, but are not limited to, smartphones, various types of computers, remote controls, game controllers, etc.
  • the terminal device is communicatively connected with the control system of the head-mounted display device.
  • When the user performs an operation on the terminal device, for example, when the user inputs a locking operation for the touch panel 20, the terminal device generates a touch panel locking instruction and sends the locking instruction to the control system, so that the control system controls the touch panel 20 to switch to the locked state, which can effectively prevent misoperation of the touch panel 20.
  • When the user wants to unlock the touch panel 20 again, the user performs an unlocking operation on the terminal device. For example, when the user inputs an unlocking operation for the touch panel 20, the terminal device generates a touch panel unlocking instruction and sends it to the control system, so that the control system unlocks the touch panel 20 and switches it to the unlocked state, facilitating the user's corresponding operations.
  • The control method of the head-mounted display device also includes:
  • Step S401 Obtain a touchpad locking instruction, where the touchpad locking instruction is generated by detecting the establishment of a communication connection between the head-mounted display device and the terminal device.
  • When the user wants to use the terminal device to control the head-mounted display device, the user can perform the corresponding operations entirely through the terminal device. Therefore, when the terminal device establishes a communication connection with the head-mounted display device, the touch panel 20 can be locked. This reduces the steps required for the user to lock the touch panel 20 through the terminal device, reduces the complexity of the operation, and thereby improves user comfort.
  • Obtaining the touch panel locking instruction includes, but is not limited to, the following ways. One possible way of obtaining the touch panel locking instruction includes: obtaining the touch panel locking instruction sent by the terminal device, where the touch panel locking instruction is generated by the terminal device upon detecting the establishment of a communication connection between the head-mounted display device and the terminal device.
  • In this way, the terminal device detects that the communication connection has been established, generates a touch panel locking instruction, and sends it to the control system of the head-mounted display device, so that the control system controls the touch panel 20 to lock.
  • Another implementable way of obtaining the touch panel locking instruction includes: determining whether the head-mounted display device establishes a communication connection with the terminal device; and, in response to determining that the communication connection is established, generating the touch panel locking instruction.
  • In this way, the control system in the head-mounted display device detects that the communication connection has been established and generates a touch panel locking instruction to control the touch panel 20 to lock.
  • The above two ways can be implemented separately or combined with each other. When combined, they can verify each other: the touch panel 20 is locked only when both of the above conditions are satisfied.
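When the two generation paths are combined as described, the touch panel is locked only when both the terminal-side and the device-side connection checks agree. A hedged sketch, with illustrative names (the application does not name these checks):

```python
# Illustrative sketch of combining the two ways of generating the touch
# panel locking instruction on connection establishment: the terminal
# device and the head-mounted display each detect the communication
# connection, and the panel is locked only when both agree.

def should_lock_on_connection(terminal_detects_link: bool,
                              headset_detects_link: bool) -> bool:
    # Combined mode: the two methods verify each other, so lock only
    # when both report an established connection.
    return terminal_detects_link and headset_detects_link

print(should_lock_on_connection(True, True))   # both agree: lock
print(should_lock_on_connection(True, False))  # disagreement: do not lock
```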
  • the corresponding operations of the head-mounted display device can be realized through the touch panel 20 and the terminal device.
  • In addition to controlling the transition between the locked and unlocked states of the touch panel 20, functions such as playback, pause, and screen switching of the head-mounted display device can also be controlled.
  • the picture played by the head-mounted display device may be image data stored in the head-mounted display device, or may be image data sent from an external device. Based on this, in the embodiment of the present application, the method for controlling the head-mounted display device also includes:
  • Step S501 Control the display of the head-mounted display device to display the image data sent by the terminal device.
  • the terminal device is a mobile phone, computer, etc.
  • The user can send image data such as pictures, videos, and game screens from the terminal device to the display of the head-mounted display device for synchronous display, and the user can also perform corresponding controls on the terminal device. That is, the user can use the head-mounted display device as a display of the terminal device to increase the immersive experience.
  • Step S502 Control the display of the head-mounted display device to display the image data collected by the image acquisition device of the movable platform, where the terminal device is used to control the movable platform.
  • In some embodiments, the terminal device is a mobile phone, a computer, a remote control, a game controller, etc.
  • The movable platform includes, but is not limited to, unmanned aerial vehicles, unmanned ground vehicles, mobile robots, etc.
  • The movable platform can send the image data recorded by its camera to the display of the head-mounted display device for synchronous display, and the image data can also be displayed synchronously on the terminal device.
  • The user can also perform corresponding controls on the terminal device. For example, through the terminal device, the user can control the movement of the movable platform, operate a robotic arm, rotate the camera, take pictures, record video, broadcast audio, etc. That is, the user can use the head-mounted display device as the "eyes" of the movable platform to increase the immersive experience.
  • In addition to actively locking the touch panel 20 through the terminal device, in order to prevent the user or others from misoperating the touch panel 20, the user can also choose to perform an active locking operation through the touch panel 20 itself.
  • The control method of the head-mounted display device also includes:
  • Step S601 In response to determining that the touch panel 20 detects the user's touch shielding operation, generate a touch panel locking instruction;
  • Step S602 According to the touch panel locking instruction, shield the touch operations detected by the touch panel 20 except for the user's touch shield release operation.
  • When the touch panel 20 detects the first preset instruction (i.e., the touch shielding operation), a touch panel locking instruction is generated, which triggers the locking of the touch panel 20; the head-mounted display device thus controls the touch panel 20 to transition to the locked state, thereby preventing the touch panel 20 from being misoperated.
  • In this mode, the touch panel 20 is switched to the locked state through the first preset instruction. At this time, the user can still interact with the touch panel 20, and the touch panel 20 can still recognize the user's interactive actions.
  • However, the touch panel 20 shields touch operations other than the user's touch shield release operation. That is, if the operation input by the user is not a touch shield release operation, the touch panel 20 does not send a control instruction to the control system, and the head-mounted display device does not respond to the user's interaction. If the operation input by the user is a touch shield release operation, the touch panel 20 sends a control instruction to the control system, the head-mounted display device responds to the user's interactive action, unlocks the touch panel 20, and switches the touch panel 20 to the unlocked state.
  • The control method of the head-mounted display device also includes:
  • Step S603 In response to determining that the touch panel 20 detects the user's touch shield release operation, generate a touch shield release instruction;
  • Step S604 According to the touch shield release instruction, unblock the touch operations detected by the touch panel 20 other than the user's touch shield release operation.
  • When the touch panel 20 is in the locked state and the user wants to unlock it again, the user can input a touch shield release operation. When the touch panel 20 detects the second preset instruction (the touch shield release operation), the control system generates a touch shield release instruction and controls the touch panel 20 to transition to the unlocked state. According to the touch shield release instruction, the shielding of touch operations detected by the touch panel 20 is removed, and the user can input various corresponding operations through the touch panel 20.
  • One way of issuing the second preset instruction is as follows: when the touch panel 20 is in the locked state, it can still recognize the user's interactive actions. If the user wants to actively unlock the touch panel 20, the user can press and hold with two fingers for a preset duration, for example 2 seconds, thereby triggering the unlocking of the touch panel 20 so that the touch panel 20 can again be operated interactively.
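The example gesture (press and hold with two fingers for a preset duration, 2 seconds in the text) could be recognized as sketched below. The representation of a touch event as a finger count plus a hold duration is an assumption for illustration; a real panel would track individual touch points over time.

```python
# Hypothetical recognizer for the second preset instruction: press and
# hold with two fingers for at least a preset duration (2 s here, per
# the example in the text). The touch-event representation is assumed.

PRESET_HOLD_SECONDS = 2.0

def is_touch_shield_release(finger_count: int, hold_seconds: float) -> bool:
    """True when the gesture matches the preset unlock action."""
    return finger_count == 2 and hold_seconds >= PRESET_HOLD_SECONDS

print(is_touch_shield_release(2, 2.5))  # two fingers held long enough
print(is_touch_shield_release(1, 3.0))  # one finger: not the preset gesture
print(is_touch_shield_release(2, 0.5))  # too short: not the preset gesture
```

A deliberately unusual gesture like this is unlikely to be produced by an accidental brush of the hand, which is what makes it suitable as the only input that escapes shielding.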
  • The method for controlling the head-mounted display device also includes:
  • Step S701 Determine whether the user is wearing the head-mounted display device according to the sensing data.
  • Accordingly, generating a touch panel locking instruction includes: in response to determining that the user is wearing the head-mounted display device and that the touch panel 20 detects the user's touch shielding operation, generating the touch panel locking instruction, according to which the touch panel 20 is locked.
  • That is, when the detection sensor 30 detects that the head-mounted display device is in the worn state and the touch panel 20 detects the first preset instruction (i.e., the touch shielding operation), the head-mounted display device controls the touch panel 20 to switch to the locked state, thereby preventing the touch panel 20 from being misoperated.
  • the user can still interact with the touch panel 20 , and the touch panel 20 can still recognize the user's interactive actions.
  • However, the touch panel 20 shields the touch operations it detects except for the user's touch shield release operation. If the operation input by the user is not a touch shield release operation, the touch panel 20 does not send a control instruction to the control system, and the head-mounted display device does not respond to the user's interactive action. If the operation input by the user is a touch shield release operation, the touch panel 20 sends a control instruction to the control system, the head-mounted display device responds to the user's interactive action, unlocks the touch panel 20, and switches the touch panel 20 to the unlocked state.
  • Accordingly, generating a touch shield release instruction includes: in response to determining that the user is wearing the head-mounted display device and that the touch panel 20 detects the user's touch shield release operation, generating the touch shield release instruction, according to which the touch panel 20 is unlocked.
  • That is, when the detection sensor 30 detects that the head-mounted display device is in the worn state and the touch panel 20 is in the locked state, if the user wants to unlock the touch panel 20 again, the user can input a touch shield release operation. When the touch panel 20 detects the second preset instruction (the touch shield release operation), the control system generates a touch shield release instruction and controls the touch panel 20 to transition to the unlocked state. According to the touch shield release instruction, the shielding of the touch operations detected by the touch panel 20 is removed, and the user can input various corresponding operations through the touch panel 20.
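The logic around step S701, where shielding and shield release are gated on the wearing state, can be sketched as a small decision function. The instruction strings and function name are illustrative assumptions:

```python
# Sketch of the wearing-state-gated shielding logic around step S701:
# the touch shielding / shield-release instructions are only generated
# while the user is actually wearing the device. Names are illustrative.

def next_instruction(is_worn: bool, panel_locked: bool, gesture: str):
    """Return the instruction the control system should generate, or None."""
    if not is_worn:
        return None  # gestures are ignored when the device is not worn
    if not panel_locked and gesture == "touch_shield":
        return "LOCK_TOUCH_PANEL"          # generate the locking instruction
    if panel_locked and gesture == "touch_shield_release":
        return "RELEASE_TOUCH_SHIELD"      # generate the release instruction
    return None  # while locked, every other gesture is shielded

print(next_instruction(True, False, "touch_shield"))
print(next_instruction(True, True, "swipe"))
print(next_instruction(True, True, "touch_shield_release"))
```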
  • The control method of the head-mounted display device also includes:
  • Step S801 Control the display of the head-mounted display device to display a touch shielding notification; and/or,
  • Step S802 Control the display of the head-mounted display device to display the touch shield release notification.
  • In some embodiments, the head-mounted display device has a display 40. When the control system generates a touch shielding instruction and/or a touch shield release instruction, the display 40 is used to display a corresponding prompt screen.
  • In addition, the display 40 on the head-mounted display device can also display a corresponding prompt screen while the touch panel 20 is in the process of unlocking or locking. The prompt screen includes, but is not limited to, corresponding unlocking or locking icons, progress bars, values, etc. The display of the prompt screen does not affect the currently displayed video picture; for example, the prompt screen can be displayed in the form of a floating window at the edge of the video picture.
  • the head-mounted display device has a speaker.
  • When the touch panel 20 is switched to the unlocked state or the locked state, the speaker is used to emit a corresponding prompt sound.
  • The speaker on the head-mounted display device is also used to play a corresponding prompt sound while the touch panel 20 is being unlocked or locked. The prompt sounds include, but are not limited to, voice prompts, beeps, etc. The playback of the prompt sound does not affect the audio of the video being displayed and does not affect the user experience.
  • With the technical solution provided by the embodiments of the present application, the detection sensor 30 can detect whether the head-mounted display device is worn, so that according to the detection result the device body 10 controls the interactive component to switch to the unlocked state or the locked state. This facilitates the user's use while also preventing the interactive component from being misoperated, avoiding unnecessary operational input and improving the user experience. At the same time, a marking structure at the corresponding position of the interactive component can not only prevent users from accidentally touching it, but also help users locate the interactive component, and structural differences facilitate blind operation by users.
  • An embodiment of the present application also provides a head-mounted display device 50, which includes a touch panel 51 for detecting touch operations of a user wearing the head-mounted display device, a detection sensor 52 for outputting sensing data, and a processor 53, wherein
  • the processor is used to perform the following operations:
  • the processor is also used to:
  • the detection sensor includes one or more of a distance sensor, a brightness sensor, a temperature sensor, a vision sensor, an iris sensor, and a contact sensor.
  • the detection sensor includes a distance sensor
  • the processor is specifically used for:
  • the detection sensor includes a distance sensor and a brightness sensor
  • the processor is specifically used for:
  • the distance data is greater than or equal to the preset distance threshold and the brightness data is greater than or equal to the preset brightness threshold, it is determined that the user wearing the head-mounted display device takes off the head-mounted display device.
  • the processor is also used to:
  • the processor is also used to:
  • the touchpad locking instruction is generated by detecting the establishment of a communication connection between the head-mounted display device and the terminal device.
  • the processor is specifically configured to:
  • the processor is specifically configured to:
  • a touchpad lock instruction is generated.
  • the processor is also used to:
  • the display of the head-mounted display device is controlled to display the image data collected by the image acquisition device of the movable platform, wherein the terminal device is used to control the movable platform.
  • the processor is also used to:
  • the processor is also used to:
  • the processor is specifically used for:
  • a touch shield releasing instruction is generated.
  • the processor is also used to:
  • the detection sensor includes a distance sensor disposed between two sets of optical components of the head mounted display device.
  • the head-mounted display device 50 can perform the control method of the head-mounted display device as mentioned above.
  • the head-mounted display device 50 can have any features of the head-mounted display device as mentioned above.


Abstract

The present invention provides a head-mounted display device and a control method. The head-mounted display device includes: a device body having a wearing surface facing the user's face and a plurality of outer side surfaces; an interactive component provided on at least one of the outer side surfaces; and at least one detection sensor provided on the wearing surface. When the detection sensor detects that the device body is in a worn state, the device body controls the interactive component to switch to an unlocked state; when the detection sensor detects that the device body is in a non-worn state, the device body controls the interactive component to switch to a locked state. Through the detection sensor, the present invention can detect whether the head-mounted display device is worn, so that according to the detection result the device body controls the interactive component to switch between the unlocked state and the locked state, which facilitates the user's use while also preventing the interactive component from being misoperated, improving the user experience.

Description

Head-Mounted Display Device and Control Method
Technical Field
The present application relates to the technical field of head-mounted devices, and in particular to a head-mounted display device and a control method.
Background Art
With the continuous development of science and technology, head-mounted display devices have gradually entered people's lives. To facilitate the user's use of a head-mounted display device, components for interaction, such as control buttons, are usually provided on the head-mounted display device.
However, when the user puts on the head-mounted display device or takes it off the head, the user's hand may accidentally touch an interactive component (for example, a touch panel) of the head-mounted display device that is used for detecting the user's input operations. This produces erroneous input, which affects the user's normal use and thus reduces the user's experience.
Summary of the Application
In view of the above problems, the present application is proposed in order to provide a head-mounted display device and a control method that solve the above problems.
In one embodiment of the present application, a head-mounted display device is provided, including:
a device body having a wearing surface facing the user's face and a plurality of outer side surfaces;
an interactive component provided on at least one of the outer side surfaces;
a detection sensor, at least one of which is provided on the wearing surface;
wherein, when the detection sensor detects that the device body is in a worn state, the device body controls the interactive component to switch to an unlocked state;
and when the detection sensor detects that the device body is in a non-worn state, the device body controls the interactive component to switch to a locked state.
In another embodiment of the present application, a control method for a head-mounted display device is further provided, where the head-mounted display device includes a touch panel for detecting touch operations of a user wearing the head-mounted display device and a sensor for outputting sensing data, the method including:
determining, according to the sensing data, whether the user wearing the head-mounted display device takes off the head-mounted display device;
in response to determining that the user takes off the head-mounted display device, generating a touch panel locking instruction;
and locking the touch panel according to the touch panel locking instruction.
In another embodiment of the present application, a head-mounted display device is further provided, including a touch panel for detecting touch operations of a user wearing the head-mounted display device, a detection sensor for outputting sensing data, and a processor, wherein
the processor is configured to perform the following operations:
determining, according to the sensing data, whether the user wearing the head-mounted display device takes off the head-mounted display device;
in response to determining that the user takes off the head-mounted display device, generating a touch panel locking instruction;
and locking the touch panel according to the touch panel locking instruction.
With the technical solution provided by the embodiments of the present application, the detection sensor can detect whether the head-mounted display device is worn, so that according to the detection result the device body controls the interactive component to switch between the unlocked state and the locked state, which facilitates the user's use while also preventing the interactive component from being misoperated, improving the user experience.
Brief Description of the Drawings
In order to more clearly explain the technical solutions in the embodiments of the present application or in the prior art, the drawings required for describing the embodiments or the prior art are briefly introduced below. Obviously, the drawings in the following description are some embodiments of the present application, and a person of ordinary skill in the art can obtain other drawings based on these drawings without creative effort.
Fig. 1 is a schematic structural diagram of the wearing surface of a head-mounted display device provided by an embodiment of the present application;
Fig. 2 is a schematic structural diagram of one side surface of a head-mounted display device provided by an embodiment of the present application;
Fig. 3 is a schematic structural diagram of the front of a head-mounted display device provided by an embodiment of the present application;
Fig. 4 is a schematic flow chart of an interaction process of an interactive component provided by an embodiment of the present application;
Fig. 5 is a schematic structural diagram of the wearing surface of a head-mounted display device provided by another embodiment of the present application.
Detailed Description
In order to enable those skilled in the art to better understand the solutions of the present application, the technical solutions in the embodiments of the present application will be described clearly and completely below with reference to the drawings in the embodiments of the present application. Obviously, the described embodiments are only some of the embodiments of the present application, not all of them. Based on the embodiments in the present application, all other embodiments obtained by a person of ordinary skill in the art without creative effort fall within the protection scope of the present application.
It should be noted that in the description of the present application, the terms "first" and "second" are only used for convenience in describing different components and cannot be understood as indicating or implying an order relationship or relative importance, or as implicitly indicating the number of the indicated technical features. Thus, a feature defined with "first" or "second" may explicitly or implicitly include at least one such feature.
Unless otherwise defined, all technical and scientific terms used herein have the same meanings as commonly understood by those skilled in the technical field of the present application. The terms used herein in the specification of the present application are only for the purpose of describing specific embodiments and are not intended to limit the present application.
图1为本申请一实施例提供的头戴显示设备的佩戴面的结构示意图,图2为本申请一实施例提供的头戴显示设备的一个侧面的结构示意图,图3为本申请一实施例提供的头戴显示设备的正面的结构示意图,参见图1至图3中所示。
在本申请的一个实施例中,提供了一种头戴显示设备,包括:设备主体10、交互部件及检测传感器30。设备主体10具有朝向使用者面部的佩戴面11及多个外侧面12。至少一个外侧面12上设有交互部件,所述交互部件用于检测用户的输入操作,头戴显示设备可以根据交互部件检测到的用户的操作生成控制指令,并根据所述控制指令执行工作任务。佩戴面11上设有至少一个检测传感器30,在某些情况中,检测传感器30可以设置在其他位置。
其中,当检测传感器30检测到设备主体10处于佩戴状态时,设备主体10控制交互部件转换为解锁状态。当检测传感器30检测到设备主体10处于非佩戴状态时,设备主体10控制交互部件转换为锁定状态。
举例来说,在本申请的一些可实现的实施例中,设备主体10包括但不限于为视频播放眼镜、VR眼镜(虚拟现实技术,Virtual Reality,缩写为VR)及AR眼镜(增强现实技术,Augmented Reality,缩写为AR)。相对空间方位来说,设备主体10具有前面、后面、顶面、底面、左面及右面,其中,后面为朝向用户所在方向,即后面为佩戴面11,前面、顶面、底面、左面及右面均为设备主体10的外侧面12。在一些可实现的方式中,前面、顶面、底面、左面及右面均为独立的平面,配合后面,使得设备主体10大致呈六面体结构。在另一些可实现的方式中,后面为一个独立的平面,前面、顶面、底面、左面及右面均位于同一个弧形面上,配合后面,使得设备主体10大致呈半球体结构。当然,在一些其他可实现的实施例中,设备主体10的多个外侧面12还可呈其他结构,此处不做具体限定。
设备主体10内设有控制系统,控制系统与交互部件及检测传感器30通信连接,交互部件发出的交互可传输至控制系统,从而控制系统控制设备主体10进行相应的响应。检测传感器30将检测到的检测结果发送至控制系统,从而控制系统控制交互部件进行解锁状态及锁定状态的转换,其中,交互部件的解锁状态是指用户能够通过接触方式和/或非接触方式与交互部件进行交互,如通过滑动、点击、语音等方式与交互部件进行交互,进而经过交互部件向控制系统发送传感数据,以便控制系统进行相应的响应,从而实现对头戴显示设备的控制。
锁定状态包括但不限于以下两种方式。锁定状态的一种方式是指用户仍然能够与交互部件进行交互,交互部件仍能够识别到用户的交互动作,但是,此时交互部件并不向控制系统发送传感数据,头戴式显示装置不响应用户的交互动作,此种状态也可称之为交互部件处于待机状态。
进一步地,在此种方式下,若用户的交互动作为预设动作,即交互部件能够识别到预设指令,此时交互部件可向控制系统发送与预设指令对应的传感数据,以便控制系统进行相应的响应,从而实现对头戴显示设备的控制。需要说明的是,在本申请的一些可实现的实施例中,当交互部件处于锁定状态时,若用户的交互动作为预设的动作,交互部件识别之后,仍可向控制系统发送相应的传感数据,此种方式会在后续的实施例中详细说明,此处暂不详述。
锁定状态的另一种方式是指关闭交互部件的检测识别功能,交互部件不识别用户的交互动作,因此交互部件不向控制系统发送传感数据,这样头戴式显示装置不响应用户对触摸板的触摸操作。交互部件可根据不同的需求设置在一个外侧面12或多个外侧面12上,如,为便于用户右手操作,交互部件可设置在右面上。检测传感器30的数量可根据不同的需求进行设置,如可设置为一个或多个,并根据不同的需求设置在佩戴面11上的不同位置处,以便进行佩戴检测。
在使用时,当设备主体10未佩戴在用户的头部时,交互部件处于锁定状态。当用户拿起设备主体10准备佩戴时,用户的手部有可能触碰到交互部件,而此时交互部件处于锁定状态,所以不会发出传感信息,设备主体10不会做出相应的响应。
当用户将设备主体10佩戴在头部上时,检测传感器30检测到设备主体10已经佩戴在用户的头部,发送检测结果至设备主体10中的控制系统,控制系统控制交互部件从锁定状态转换成解锁状态,以便用户进行交互操作。
本申请实施例提供的技术方案,通过检测传感器30可检测到头戴显示设备是否被佩戴,从而根据检测结果,设备主体10控制交互部件转换解锁状态或锁定状态,从而方便用户使用的同时,还可防止交互部件被误操作,改善用户的使用体验。
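上述“检测佩戴状态后切换交互部件锁定/解锁状态”的控制流程,可用如下 Python 草图示意(仅为说明性示例,并非本申请的实际实现,类名、方法名及手势标识均为假设):

```python
class HeadMountedDevice:
    """示意:设备主体根据检测传感器的佩戴检测结果,切换交互部件的锁定/解锁状态。"""

    def __init__(self):
        # 未佩戴时交互部件默认处于锁定状态,防止佩戴/摘下过程中误触
        self.interaction_locked = True

    def on_wear_state_changed(self, is_worn: bool) -> None:
        # 检测传感器上报佩戴状态:佩戴 -> 解锁;非佩戴 -> 锁定
        self.interaction_locked = not is_worn

    def handle_touch(self, gesture: str):
        # 锁定状态下不向控制系统上报传感数据,设备不做响应
        if self.interaction_locked:
            return None
        return gesture  # 解锁状态下正常上报,由控制系统执行相应响应
```

例如,用户摘下设备后,即使手部误触触控板,`handle_touch` 也不会向控制系统上报任何传感数据。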
进一步地,为更加便于检测传感器30的检测,检测传感器30的一种设置方式是,佩戴面11上具有与使用者眉心位置对应的眉心区,眉心区处设有检测传感器30。此种设置方式下,可更加方便检测传感器30进行检测,使得检测的结果更加准确,同时,检测传感器30的位置不会影响设备主体10上其他部件,如不会影响眼睛区域的镜片的设置。检测传感器30除了可设置在眉心区之外,也可根据不同的需求,在眼睛区的周围区域、鼻子区的上方区域及额头区等位置设置检测传感器30。
进一步地,根据不同的需求,检测传感器30可设置在设备主体10的外部,也可设置在设备主体10的内部。检测传感器30设置在设备主体10的内部时,佩戴面11上设有供检测传感器30进行检测的检测孔,检测传感器30通过检测孔进行检测信号的传输。此种设置方式下,检测传感器30受到设备主体10的保护,不会受到其他物品及人体的干扰,使得检测更加准确,同时,还可有效防止损坏,提高使用寿命。
本申请实施例中,检测传感器30的实现方式包括多种,检测传感器30的一种可实现的方式是,检测传感器30包括距离传感器、亮度传感器、温度传感器、视觉传感器、虹膜传感器、接触传感器中的一种或多种。检测传感器30的检测方向朝向佩戴头戴显示设备时用户的头部方向。
距离传感器的检测方式是检测物体相对距离传感器的距离,当物体的相对距离小于预设的距离阈值时,则说明设备主体10被佩戴,那么设备主体10控制交互部件转换为解锁状态,以便用户使用。当物体的相对距离大于或等于预设的距离阈值时,则说明设备主体10未被佩戴,那么设备主体10控制交互部件转换为锁定状态,防止交互部件发生误触的情况。
亮度传感器的检测方式是检测光的亮度,当光的亮度小于亮度阈值时,则说明设备主体10被佩戴,那么设备主体10控制交互部件转换为解锁状态,以便用户使用。当光的亮度大于或等于预设的亮度阈值时,则说明设备主体10未被佩戴,那么设备主体10控制交互部件转换为锁定状态,防止交互部件发生误触的情况,避免不必要的操作输入。
温度传感器可以通过检测温度的方式进行检测,当检测到的温度大于或等于预设的温度阈值时,则说明设备主体10被佩戴,那么设备主体10控制交互部件转换为解锁状态,以便用户使用。当检测到的温度小于温度阈值时,则说明设备主体10未被佩戴,那么设备主体10控制交互部件转换为锁定状态,防止交互部件发生误触的情况。
视觉传感器及虹膜传感器可以通过检测用户眼部的方式进行检测,当检测到用户的眼部处于预设的检测范围内时,则说明设备主体10被佩戴,那么设备主体10控制交互部件转换为解锁状态,以便用户使用。当检测不到用户眼部时,则说明设备主体10未被佩戴,那么设备主体10控制交互部件转换为锁定状态。
接触传感器的一种实现方式是压力检测传感器,当检测到压力大于或等于预设的压力阈值时,则说明设备主体10被佩戴,那么设备主体10控制交互部件转换为解锁状态,以便用户使用。当检测压力小于预设的压力阈值时,则说明设备主体10未被佩戴,那么设备主体10控制交互部件转换为锁定状态。需要说明的是,检测传感器30的多种实现方式可以分别单独实现,也可相互组合在一起使用,本申请实施例并不做具体限定。
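文中各类传感器的阈值判断规则可归纳为如下草图(阈值数值与函数名均为假设;此处采用“所有可用传感器均满足佩戴条件才判定为佩戴”的组合策略以相互校正,实际也可单独使用某一种传感器):

```python
def is_worn(distance=None, brightness=None, temperature=None, pressure=None,
            distance_th=30.0, brightness_th=50.0,
            temperature_th=30.0, pressure_th=1.0):
    """示意:根据可用的传感数据判断设备是否处于佩戴状态。

    - 距离小于距离阈值     -> 佩戴
    - 亮度小于亮度阈值     -> 佩戴
    - 温度大于等于温度阈值 -> 佩戴
    - 压力大于等于压力阈值 -> 佩戴
    """
    checks = []
    if distance is not None:
        checks.append(distance < distance_th)
    if brightness is not None:
        checks.append(brightness < brightness_th)
    if temperature is not None:
        checks.append(temperature >= temperature_th)
    if pressure is not None:
        checks.append(pressure >= pressure_th)
    # 无任何传感数据时视为未佩戴;有数据时要求全部满足(相互校正)
    return bool(checks) and all(checks)
```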
本申请实施例中,交互部件的实现方式包括多种,交互部件的一种可实现的方式是,交互部件为触控板20、控制按钮及声音采集器中的至少一个。交互部件为声音采集器时,用户可通过语音发出控制指令,语音被声音采集器进行采集,从而实现与设备主体10之间的交互行为。当检测传感器30检测到设备主体10未被佩戴时,设备主体10控制声音采集器转换为锁定状态,此时,用户若对声音采集器输出语音,声音采集器不会向设备主体10发出相应的控制指令,有效防止声音采集器被误操作。
交互部件为控制按钮时,用户可通过按压、推拉等方式对控制按钮进行操控,从而发出相应的传感数据,实现与设备主体10之间的交互行为。当检测传感器30检测到设备主体10未被佩戴时,设备主体10控制控制按钮转换为锁定状态,此时,用户若对控制按钮进行按压、推拉等方式的操控,控制按钮不会向设备主体10发出相应的传感数据,有效防止控制按钮被误操作。
交互部件为触控板20时,触控板20的一种可实现方式是,触控板20包括但不限于为电容式触摸板,电容式触摸板是利用人体接近触摸板时,使传感焊盘和接地之间的电容值发生改变(一般pF级别,pF为电容值单位,中文为皮法),将检测到的电容值的改变通过微处理器的计算,滤除干扰从而最终判断出是否有人体的接近来实现按键功能。其相比传统的机械按键的优势在于没有机械损坏,可以使用玻璃、亚克力、塑料等非金属作为操作面板,使产品外观上更加美观。同时,相比之下还可以实现传统机械按键难以实现的多种滑动操作,使得人机交互做的更加符合人的直观操作。当检测传感器30检测到设备主体10未被佩戴时,设备主体10控制触控板20转换为锁定状态,此时,用户若对触控板20进行点击、按压、滑动等方式的操控,触控板20不会向设备主体10发出相应的传感数据,有效防止触控板20被误操作。
当交互部件为控制按钮时,控制按钮凸出于外侧面12的表面,可方便用户对其进行定位,从而在不使用时,避免对控制按钮进行误触;在使用时,可准确找到控制按钮,从而对其进行相应的操控。
当交互部件为触控板20时,触控板20可为凸出于外侧面12的表面,也可为与外侧面12的表面平齐,相对不易被定位。为进一步方便用户对触控板20进行定位,在本申请的一些可实现的实施例中,设置有触控板20的外侧面12上,对应于触控板20的周向外周设有第一标识结构121。用户可通过第一标识结构121实现对触控板20的定位,例如,用户佩戴上设备主体10之后,用手在设备主体10上摸索,当摸索到第一标识结构121时,根据第一标识结构121即可推断出触控板20的相对位置,从而进行避开或者对触控板20进行操控。
第一标识结构121相对于触控板20的位置的设置方式包括多种,一种可实现的位置设置方式是,第一标识结构121与触控板20的周向边缘处相接。即第一标识结构121设置在触控板20的边缘处,当用户触摸到第一标识结构121时,即可知道此时已经触摸到触控板20的边缘位置了,若不想误触则不再“向前”进行摸索,以避开触控板20,若想对触控板20进行操作,则可基于第一标识结构121的位置继续“向前”摸索,从而找到触控板20。
第一标识结构121的另一种可实现的位置设置方式是,第一标识结构121与触控板20的周向边缘处间隔第一距离。当用户触摸到第一标识结构121时,即可知道此时已经快要接近触控板20了,若不想误触则不再“向前”进行摸索,以避开触控板20,若想对触控板20进行操作,则可基于第一标识结构121的位置继续“向前”摸索,从而找到触控板20。此种方式下,第一标识结构121与触控板20之间还具有一定的距离,当用户没有及时停止摸索时,也可有效地避免对触控板20进行误触。
第一标识结构121的实现方式包括也多种,一种可实现的方式是,参见图2及图3,第一标识结构121为环设于触控板20的周向外周的环状的第一凸起结构或第一凹槽结构。凸起或凹槽为相对于外侧面12的表面的结构,用户通过触感即可感知到,从而起到提示作用。第一标识结构121可为一个也可为多个,第一标识结构121为多个时,多个环状的第一标识结构121沿着触控板20的径向方向间隔布置,最靠近触控板20的第一标识结构121与触控板20的边缘相接或间隔第一距离,各第一标识结构121与相邻的第一标识结构121间隔第二距离。
第一标识结构121的另一种可实现的方式是,第一标识结构121为沿着触控板20的周向外周间隔排列设置的多个第二凸起结构和/或多个第二凹槽结构。第二凸起结构可为凸点,第二凹槽结构可为凹坑,多个第二凸起结构或多个第二凹槽结构排列成一个或多个环状结构,围绕触控板20设置,以起到提示作用。当然,第一标识结构121也可为多个第二凸起结构和多个第二凹槽结构组合形成,第二凸起结构和第二凹槽结构可呈一定规则间隔排列,也可无规则式排列,从而形成粗糙触感,便于进行提示。
为便于用户对触控板20的相对位置进行定位,本申请实施例中,一种可实现的方式是,触控板20上设有第二标识结构122,第二标识结构122为设置在触控板20上的第三凸起结构或第三凹槽结构。当用户佩戴好设备主体10后,此时触控板20处于解锁状态,用户可以使用触控板20进行操控。而当用户触摸到触控板20时,并不会马上获知触摸到的相对位置在哪,会对使用造成影响。例如,参见图2,当用户触摸到触控板20的A区域时,若用户向下滑动时,很容易滑出触控板20的操控区域,使得操控指令并不能完全发出。为改善这种影响操作的情况,用户在发出操控指令之前,可摸索找到第二标识结构122,从而根据第二标识结构122确定触摸到触控板20的位置。
进一步地,第二标识结构122的设置方式包括多种,一种可实现的方式是,第二标识结构122设置在触控板20中心区域,用户在发出操控指令之前,可摸索找到第二标识结构122,从而根据第二标识结构122确定触控板20中心区域所在位置,从而可围绕中心区域进行操控,避免滑出操控区域的情况出现,提高使用效率。
另一种实现方式是,多个第二标识结构122均布在触控板20的表面上,用户可根据多个第二标识结构122摸索找到触控板20,并根据多个第二标识结构122的位置确定触摸到触控板20的位置,从而进行操控,避免滑出操控区域的情况出现,提高使用效率。当然,第二标识结构122的此种设置方式与第二标识结构122设置在触控板20中心区域的方式可以分别单独实现,也可相互组合在一起使用,本申请实施例并不做具体限定。
在本申请实施例中,当用户佩戴好设备主体10后,交互部件为解锁状态,可便于用户进行交互操作。而在一些使用场景中,用户佩戴好设备主体10之后,在很长一段时间之内并不需要进行交互操作,例如,用户通过设备主体10观看电影,电影播放时长通常在一两个小时左右,在电影播放期间,用户通常不需要进行相应的交互操作。此种情况下,为了防止用户或其他人对交互部件造成误操作,用户可以选择对交互部件进行主动锁定操作。
一种可实现的主动锁定操作是,检测传感器30检测到设备主体10处于佩戴状态,当交互部件检测到第一预设指令时,设备主体10控制交互部件转换为锁定状态。参见图4,一种发出第一预设指令的方式是,在使用过程中,如果用户想主动锁定触控板20,可以使用双指长按预设时长,例如可以双指长按2秒,从而可触发锁定触控板20,从而可避免触控板20被误操作。通过第一预设指令将交互部件转换为锁定状态,此时用户仍然能够与交互部件进行交互,交互部件仍能够识别到用户的交互动作,但是,此时交互部件并不向控制系统发送控制指令,头戴式显示装置不响应用户的交互动作。
而当用户想要再次解锁触控板20时,一种可实现的主动解锁操作是,当交互部件检测到第二预设指令时,设备主体10控制交互部件转换为解锁状态。继续参见图4,一种发出第二预设指令的方式是,在触控板20处于锁定状态时,此时交互部件仍能够识别到用户的交互动作,如果用户想主动解锁触控板20,可以使用双指长按预设时长,例如可以双指长按2秒,从而可触发解锁触控板20,从而可对触控板20进行交互操作。
当然,当交互部件为控制按钮时,也可通过长按的方式进行锁定或解锁。当交互部件为声音采集器时,可发出预定语音,如“锁定交互”、“解除锁定”等方式进行锁定或解锁。上述中所述的“长按”、“预定语音”等方式即为当交互部件处于锁定状态时,用户的交互动作为预定的动作,交互部件识别之后,仍可向控制系统发送相应的传感数据。
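图4中“双指长按预设时长(如2秒)触发锁定/解锁切换”的识别逻辑可示意如下(函数名、手指数及时长阈值均为假设参数,仅为说明性草图):

```python
def detect_lock_toggle(finger_count: int, press_duration: float,
                       hold_threshold: float = 2.0) -> bool:
    """示意:双指长按达到预设时长即视为第一/第二预设指令。"""
    return finger_count == 2 and press_duration >= hold_threshold


class Touchpad:
    """示意:锁定状态下仍识别交互动作,但只有预设指令能切换锁定/解锁。"""

    def __init__(self, locked: bool = False):
        self.locked = locked

    def on_gesture(self, finger_count: int, press_duration: float) -> None:
        if detect_lock_toggle(finger_count, press_duration):
            # 同一手势在解锁/锁定两种状态下分别触发锁定/解锁
            self.locked = not self.locked
```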
进一步地,为使得用户更加直观地获知交互部件处于何种状态,本申请实施例中,一种可实现的方式是,设备主体10具有显示器40,当交互部件检测到第一预设指令或第二预设指令时,显示器40用于显示相应提示画面。设备主体10上的显示器40除了用于显示视频画面之外,当交互部件处于解锁或锁定过程中时,显示器40用于显示相应的提示画面,提示画面包括但不限于为相应的解锁或锁定图标、进度条、数值等。提示画面的显示并不会影响正在显示的视频画面,提示画面会以浮窗的形式显示,如显示在视频画面的边缘位置。
有时用户会过于专注于视频画面,从而忽视了提示画面,为进一步地向用户提示交互部件处于何种状态,本申请实施例中,一种可实现的方式是,设备主体10具有扬声器,交互部件转换为解锁状态或锁定状态时,扬声器用于发出相应提示声音。设备主体10上的扬声器除了用于播放视频画面的声音之外,当交互部件完成解锁或锁定时,扬声器还用于播放相应的提示声音,提示声音包括但不限于为语音提示、蜂鸣声等。提示声音的播放并不会影响正在显示的视频画面的声音,不会影响用户的使用体验。
进一步地,为方便用户对头戴显示设备的操控,除了上述的操控方式之外,在本申请实施例中,用户还可以通过终端设备对头戴显示设备进行操控。一种可实现的方式是,设备主体10内设置的控制系统可以接收终端设备发来的指令,从而进行相应的响应,从而实现对头戴显示设备的控制。用户通过终端设备可控制交互部件的锁定状态及解锁状态的转换,还可以控制设备主体10的播放、暂停、切换画面以及其他多种功能。终端设备包括但不限于为智能手机、各种类型的电脑、遥控器、手柄等。
相应地,基于上述实施例提供的头戴显示设备,本申请实施例还提供了一种头戴显示设备的控制方法,该方法可用于上述实施例中头戴显示设备的控制。
具体地,一种头戴式显示设备的控制方法,参见图1至图3,头戴式显示设备包括用于检测佩戴头戴式显示设备的用户的触摸操作的触控板20和用于输出传感数据的检测传感器30,方法包括:
步骤S101:根据传感数据确定佩戴头戴式显示设备的用户是否脱下头戴式显示设备。当用户佩带或者脱下头戴显示设备时,检测传感器30能够检测到相应的状态,并生成相应的传感数据,并将传感数据传输到设备主体10内的控制系统,控制系统接收到相应的传感数据后,能够判断出用户脱下头戴显示设备,进而执行步骤S102。
步骤S102:响应于确定用户脱下头戴式显示设备,生成触控板20锁定指令。当控制系统根据传感数据确定用户脱下头戴式显示设备,即头戴显示设备未被佩戴时,则说明触控板20不需要被使用,为避免触控板20被误操作,因此,控制系统生成触控板20锁定指令,以便将触控板20锁定。
步骤S103:根据触控板20锁定指令锁定触控板20。触控板20处于锁定状态时,可有效防止触控板20被误操作,改善用户的使用体验。
需要说明的是,在本申请实施例中,触控板20处于锁定状态包括但不限于以下两种方式。锁定状态的一种方式是指用户仍然能够与触控板20进行交互,触控板20仍能够识别到用户的交互动作,但是,此时触控板20并不向控制系统发送传感数据,头戴式显示装置不响应用户的交互动作,此种状态也可称之为触控板20处于待机状态。
进一步地,在此种方式下,若用户的交互动作为预设动作,即触控板20能够识别到预设指令,此时触控板20可向控制系统发送与预设指令对应的传感数据,以便控制系统进行相应的响应,从而实现对头戴显示设备的控制。需要说明的是,在本申请的一些可实现的实施例中,当触控板20处于锁定状态时,若用户的交互动作为预设的动作,触控板20识别之后,仍可向控制系统发送相应的传感数据,此种方式会在后续的实施例中详细说明,此处暂不详述。
锁定状态的另一种方式是指关闭触控板20的检测识别功能,触控板20不识别用户的交互动作,因此触控板20不向控制系统发送传感数据,这样头戴式显示装置不响应用户对触摸板的触摸操作。
本申请实施例提供的技术方案,通过检测传感器30可检测到头戴显示设备是否被用户脱下,即可检测到头戴显示设备是否被佩戴,从而根据检测结果,将触控板20转换为锁定状态,可有效防止触控板20被误操作,改善用户的使用体验。
进一步地,为便于用户在佩戴头戴显示设备时,通过触控板20进行操控,控制方法还包括:
步骤S201:根据传感数据确定用户是否重新佩戴头戴式显示设备。当用户佩带头戴显示设备时,检测传感器30能够检测到相应的状态,并生成相应的传感数据,并将传感数据传输到设备主体10内的控制系统,控制系统接收到相应的传感数据后,能够判断出用户佩带头戴显示设备,进而执行步骤S202。
步骤S202:响应于确定用户重新佩戴头戴式显示设备,生成解锁指令。当控制系统根据传感数据确定用户佩戴头戴式显示设备,即驱动头戴显示设备被佩戴时,则说明触控板20需要被使用,为避免增加用户解锁的操作步骤,控制系统直接生成解锁指令,以便将触控板20解锁,以便用户能够通过触控板20进行相应的操控。
步骤S203:根据解锁指令解除对触控板20的锁定。触控板20处于解锁状态时,用户能够通过接触方式与触控板20进行交互,如通过滑动、点击等方式与触控板20进行交互,进而经过触控板20向控制系统发送控制指令,以便控制系统进行相应的响应,从而实现对头戴显示设备的控制。
举例来说,头戴显示设备在使用时,当头戴显示设备未佩戴在用户的头部时,或者用户脱下头戴显示设备时,检测传感器30可检测到相应的状态并发送传感数据至控制系统,控制系统根据传感数据生成触摸板锁定指令,以锁定触控板20,即触控板20处于锁定状态,可有效防止触控板20被误操作。例如,当用户拿起头戴显示设备准备佩戴时,用户的手部有可能触碰到触控板20,而此时触控板20处于锁定状态,所以不会发出传感数据,头戴显示设备不会做出相应的响应。
当用户将头戴显示设备佩戴在头部上时,检测传感器30可检测到头戴显示设备已经佩戴在用户的头部,发送传感数据至控制系统,控制系统根据相应传感数据生成解锁指令,解除对触控板20的锁定,此时触控板20从锁定状态转换成解锁状态,以便用户进行交互操作。
本申请实施例提供的技术方案,通过检测传感器30可检测到头戴显示设备是否被佩戴,从而根据检测结果,头戴显示设备控制触控板20转换解锁状态或锁定状态,从而方便用户使用的同时,还可防止触控板20被误操作,改善用户的使用体验。
本申请实施例中,检测传感器30的实现方式包括多种,检测传感器30的一种可实现的方式是,检测传感器30包括距离传感器、亮度传感器、温度传感器、视觉传感器、虹膜传感器、接触传感器中的一种或多种。检测传感器30的检测方向朝向佩戴头戴显示设备时用户的头部方向。检测传感器30的多种实现方式可以分别单独实现,也可相互组合在一起使用,本申请实施例并不做具体限定。
进一步地,为更加便于检测传感器30的检测,检测传感器30的一种设置方式是,佩戴面11上具有与使用者眉心位置对应的眉心区,眉心区处设有检测传感器30。例如,检测传感器30包括设置在头戴式显示设备的两组光学组件之间的距离传感器。此种设置方式下,可更加方便检测传感器30进行检测,使得检测的结果更加准确,同时,检测传感器30的位置不会影响设备主体10上其他部件,如不会影响眼睛区域的镜片的设置。检测传感器30除了可设置在眉心区之外,也可根据不同的需求,在眼睛区的周围区域、鼻子区的上方区域及额头区等位置设置检测传感器30。
进一步地,根据不同的需求,检测传感器30可设置在设备主体10的外部,也可设置在设备主体10的内部。检测传感器30设置在设备主体10的内部时,佩戴面11上设有供检测传感器30进行检测的检测孔,检测传感器30通过检测孔进行检测信号的传输。此种设置方式下,检测传感器30受到设备主体10的保护,不会受到其他物品及人体的干扰,使得检测更加准确,同时,还可有效防止损坏,提高使用寿命。
检测传感器30的一种可实现方式是,检测传感器30包括距离传感器。距离传感器的检测方式是检测物体相对距离传感器的距离,从而生成不同的传感数据,以便确定头戴显示设备是否被佩戴。进一步地,检测传感器30包括设置在头戴式显示设备的两组光学组件之间的距离传感器。
针对步骤S101,根据传感数据确定佩戴头戴式显示设备的用户是否脱下头戴式显示设备,包括:
步骤S1011:确定距离传感器输出的距离数据是否大于或等于预设的距离阈值。
步骤S1012:若是时,确定佩戴头戴式显示设备的用户脱下头戴式显示设备。
当物体与距离传感器的相对距离小于预设的距离阈值时,则说明设备主体10被佩戴,那么控制系统生成解锁指令,触控板20解除锁定,转换为解锁状态,以便用户使用。当物体与距离传感器的相对距离大于或等于预设的距离阈值时,则说明头戴显示设备未被佩戴,那么控制系统生成触摸板锁定指令,触控板20转换为锁定状态,防止触控板20发生误触的情况。距离阈值可根据不同的需求进行相应的设置,本申请实施例中不做具体限定。
为使得传感数据更加精确,检测传感器30的另一种可实现方式是,检测传感器30包括距离传感器和亮度传感器。距离传感器的检测方式是检测物体相对距离传感器的距离,亮度传感器的检测方式是检测光的亮度,从而生成不同的传感数据,以便确定头戴显示设备是否被佩戴。
针对步骤S101,根据传感数据确定佩戴头戴式显示设备的用户是否脱下头戴式显示设备,包括:
步骤S1013:确定距离传感器输出的距离数据是否大于或等于预设的距离阈值,确定亮度传感器输出的亮度数据是否大于或等于预设的亮度阈值;
步骤S1014:若距离数据大于或等于预设的距离阈值且亮度数据大于或等于预设的亮度阈值时,确定佩戴头戴式显示设备的用户脱下头戴式显示设备。
距离传感器的检测方式可参考上述记载内容,此处不再详述。为进一步提高检测精度,距离传感器还可结合亮度传感器共同检测,当两者的检测结果为一致时,则可准确地确定用户是否佩戴头戴式显示设备。亮度传感器的检测方式是检测光的亮度,当光的亮度小于亮度阈值时,则说明头戴显示设备被佩戴,那么控制系统控制触控板20转换为解锁状态,以便用户使用。当光的亮度大于或等于预设的亮度阈值时,则说明头戴显示设备未被佩戴,那么控制系统控制触控板20转换为锁定状态,防止触控板20发生误触的情况,避免不必要的操作输入。亮度阈值可根据不同的需求进行相应的设置,本申请实施例中不做具体限定。
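步骤S1011~S1014的判断逻辑可示意如下(函数名与阈值均为假设参数;仅有距离数据时按距离判断,距离与亮度数据同时可用时两者相互校正):

```python
def user_took_off(distance, distance_th, brightness=None, brightness_th=None):
    """示意:判断佩戴头戴式显示设备的用户是否脱下设备。"""
    if brightness is None or brightness_th is None:
        # 步骤S1011~S1012:距离数据大于或等于距离阈值 -> 已脱下
        return distance >= distance_th
    # 步骤S1013~S1014:距离与亮度数据均大于或等于各自阈值 -> 已脱下
    return distance >= distance_th and brightness >= brightness_th
```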
进一步地,本申请实施例中,检测传感器30的实现方式还包括温度传感器、视觉传感器、虹膜传感器、接触传感器中的一种或多种。
温度传感器可以通过检测温度的方式进行检测,当检测到的温度大于或等于预设的温度阈值时,则说明头戴显示设备被佩戴,那么控制系统控制触控板20转换为解锁状态,以便用户使用。当检测到的温度小于预设的温度阈值时,则说明头戴显示设备未被佩戴,那么控制系统控制触控板20转换为锁定状态,防止触控板20发生误触的情况。
视觉传感器及虹膜传感器可以通过检测用户眼部的方式进行检测,当检测到用户的眼部处于预设的检测范围内时,则说明头戴显示设备被佩戴,那么控制系统控制触控板20转换为解锁状态,以便用户使用。当检测不到用户眼部时,则说明头戴显示设备未被佩戴,那么控制系统控制触控板20转换为锁定状态。
接触传感器的一种实现方式是压力检测传感器,当检测到压力大于或等于预设的压力阈值时,则说明头戴显示设备被佩戴,那么控制系统控制触控板20转换为解锁状态,以便用户使用。当检测压力小于预设的压力阈值时,则说明头戴显示设备未被佩戴,那么控制系统控制触控板20转换为锁定状态。
需要说明的是,上述内容中各阈值均可根据不同的需求进行相应的设置,本申请实施例中不做具体限定。
在本申请实施例中,当用户佩戴好头戴显示设备后,触控板20处于解锁状态,可便于用户进行交互操作。而在一些使用场景中,用户佩戴好头戴显示设备之后,在很长一段时间之内并不需要进行交互操作,例如,用户通过头戴显示设备观看电影,电影播放时长通常在一两个小时左右,在电影播放期间,用户通常不需要进行相应的交互操作。此种情况下,为了防止用户或其他人对交互部件造成误操作,用户可以选择对交互部件进行主动锁定操作。
一种可实现的主动锁定操作的方式是,头戴显示设备的控制方法还包括:
步骤S301:获取与头戴式显示设备通信连接的终端设备发送的触控板锁定指令,其中,触控板锁定指令是终端设备检测用户的触控板20锁定操作而生成的。
为方便用户对头戴显示设备的操控,除了上述通过触控板20实现对头戴显示设备的操控方式之外,在本申请实施例中,用户还可以通过终端设备对头戴显示设备进行操控。一种可实现的方式是,设备主体10内设置的控制系统可以接收终端设备发来的指令,从而进行相应的响应,从而实现对头戴显示设备的控制。用户通过终端设备可控制触控板20的锁定状态及解锁状态的转换,还可以控制头戴显示设备的播放、暂停、切换画面以及其他多种功能。终端设备包括但不限于为智能手机、各种类型的电脑、遥控器、手柄等。
举例来说,终端设备与头戴显示设备的控制系统通信连接,当用户通过在终端设备进行操作,如用户输入触控板20锁定操作时,终端设备生成触控板锁定指令,并将触控板锁定指令发送至控制系统,从而控制系统控制触控板20转换为锁定状态,可有效防止触控板20的误操作。
而当用户想要再次解锁触控板20时,用户还可通过在终端设备进行解锁操作,如用户输入触控板20解锁操作时,终端设备生成触控板解锁指令,并将触控板解锁指令发送至控制系统,从而控制系统解除触控板20的锁定,触控板20转换为解锁状态,以方便用户进行相应的操作。
进一步地,为减少操作步骤,另一种可实现的主动锁定操作的方式是,头戴显示设备的控制方法还包括:
步骤S401:获取触控板锁定指令,其中,触控板锁定指令是通过检测头戴式显示设备与终端设备之间的通信连接建立而生成的。
当用户想用终端设备对头戴设备进行控制时,用户可完全经过终端设备进行相应的操作,因此,可在终端设备与头戴显示设备建立通信连接时,即将触控板20进行锁定,从而减少用户经终端设备将触控板20锁定的步骤,降低操作复杂性,从而提高了用户使用舒适度。
触控板锁定指令包括但不限于以下几种方式获取,一种可实现的方式是,获取触控板锁定指令,包括:获取终端设备发送的触控板锁定指令,其中,触控板锁定指令是终端设备检测头戴式显示设备与终端设备之间的通信连接建立而生成的。当终端设备与头戴显示设备建立通信连接时,终端设备即检测到通信连接已建立,从而终端设备生成触控板锁定指令,并发送至头戴显示设备的控制系统,从而使得控制系统控制触控板20进行锁定。
另一种可实现的方式是,获取触控板20锁定指令,包括:确定头戴式显示设备是否与终端设备建立通信连接;响应于确定建立通信连接,生成触控板20锁定指令。当终端设备与头戴显示设备建立通信连接时,头戴显示设备中的控制系统即检测到通信连接已建立,从而控制系统生成触控板锁定指令,从而控制触控板20进行锁定。
上述两种方式可分别单独实现,也可相互组合在一起实现,组合在一起实现时,可相互进行校正,当上述两种方式均满足时,触控板20进行锁定。
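上述两种“建立通信连接即锁定触控板”的获取方式,可归纳为如下判断函数(函数名与参数均为示意;`require_both=True` 对应两种方式组合在一起并相互校正的情形):

```python
def should_lock_on_connection(hmd_detected_link: bool,
                              terminal_lock_cmd: bool,
                              require_both: bool = False) -> bool:
    """示意:根据连接建立事件决定是否执行触控板锁定。

    hmd_detected_link: 头戴显示设备自身检测到通信连接已建立;
    terminal_lock_cmd: 终端设备检测到连接建立后下发的锁定指令已收到。
    """
    if require_both:
        return hmd_detected_link and terminal_lock_cmd  # 两者均满足才锁定
    return hmd_detected_link or terminal_lock_cmd       # 任一方式成立即锁定
```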
本申请实施例中,通过触控板20及终端设备均可实现头戴显示设备的相应操作,除了可控制触控板20的锁定状态及解锁状态的转换之外,还可以控制头戴显示设备的播放、暂停、切换画面以及其他多种功能。头戴显示设备播放的画面可为存储于头戴显示设备内的图像数据,也可为外部设备发送来的图像数据。基于此,本申请实施例中,头戴显示设备的控制方法还包括:
步骤S501:控制头戴式显示设备的显示器显示终端设备发送的图像数据。例如,终端设备为手机、电脑等,用户可将终端设备中的图片、视频、游戏画面等图像数据发送至头戴显示设备的显示器进行同步显示,同时用户还可在终端设备上进行相应的操控,即用户可将头戴显示设备当做终端设备的显示器进行使用,以增加沉浸式的体验。
或者,
步骤S502:控制头戴式显示设备的显示器显示可移动平台的图像采集装置采集的图像数据,其中,终端设备用于控制可移动平台。
例如,终端设备为手机、电脑、遥控器、手柄等,可移动平台包括但不限于为无人飞行器、无人车、可移动机器人等,可移动平台可将通过摄像头记录的图像数据发送至头戴显示设备的显示器进行同步显示,同时也可同步显示在终端设备上,用户还可在终端设备上进行相应的操控,如用户通过终端设备可控制可移动平台的移动、控制机械臂、旋转摄像头、拍照、录像、喊话等,即用户可将头戴显示设备当做可移动平台的“眼睛”进行使用,以增加沉浸式的体验。
进一步地,在本申请实施例中,除了可以通过终端设备对触控板20进行主动锁定外,为了防止用户或其他人对触控板20造成误操作,用户还可以选择通过触控板20进行主动锁定操作。
一种可实现的方式是,头戴显示设备的控制方法还包括:
步骤S601:响应于确定触控板20检测到用户的触摸屏蔽操作,生成触摸板锁定指令;
步骤S602:根据触摸板锁定指令屏蔽触控板20检测到的除用户的触摸屏蔽解除操作之外的触摸操作;
举例来说,当触控板20检测到第一预设指令(即触摸屏蔽操作)时,头戴显示设备控制触控板20转换为锁定状态。参见图4,一种发出第一预设指令的方式是,在使用过程中,如果用户想主动锁定触控板20,可以使用双指长按预设时长,例如可以双指长按2秒,从而生成触摸板锁定指令,可触发锁定触控板20,从而可避免触控板20被误操作。通过第一预设指令将触控板20转换为锁定状态,此时用户仍然能够与触控板20进行交互,触控板20仍能够识别到用户的交互动作,但是,此时触控板20屏蔽触控板20检测到的除用户的触摸屏蔽解除操作之外的触摸操作,即如果用户输入的操作不是触摸屏蔽解除操作,那么触控板20并不向控制系统发送控制指令,头戴式显示装置不响应用户的交互动作。如果用户输入的操作是触摸屏蔽解除操作,那么触控板20向控制系统发送控制指令,头戴式显示装置响应用户的交互动作,解除触控板20的锁定,触控板20转换为解锁状态。
另一种可实现的方式是,头戴显示设备的控制方法还包括:
步骤S603:响应于确定触控板20检测到用户的触摸屏蔽解除操作,生成触摸屏蔽解除指令;
步骤S604:根据触摸屏蔽解除指令解除对触控板20检测到的除用户的触摸操作的屏蔽。
举例来说,当触控板20处于锁定状态时,当用户想要再次解锁触控板20时,用户可输入触摸屏蔽解除操作,当触控板20检测到第二预设指令(触摸屏蔽解除操作)时,控制系统生成触摸屏蔽解除指令,控制系统控制触控板20转换为解锁状态,此时,根据触摸屏蔽解除指令解除对触控板20检测到的除用户的触摸操作的屏蔽,用户可以通过触控板20输入相应的各种操作。
继续参见图4,一种发出第二预设指令的方式是,在触控板20处于锁定状态时,此时触控板20仍能够识别到用户的交互动作,如果用户想主动解锁触控板20,可以使用双指长按预设时长,例如可以双指长按2秒,从而可触发解锁触控板20,从而可对触控板20进行交互操作。
需要说明的是,上述两种方式可分别单独实现,也可相互组合在一起实现。
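锁定状态下“屏蔽除触摸屏蔽解除操作之外的所有触摸操作”可示意如下(类名与手势标识均为假设值,仅为说明性草图):

```python
class ShieldedTouchpad:
    """示意:锁定(屏蔽)状态下仅放行触摸屏蔽解除操作。"""

    UNSHIELD = "two_finger_hold_2s"  # 假设的触摸屏蔽解除手势标识

    def __init__(self):
        self.shielded = True  # 初始处于锁定(屏蔽)状态

    def handle(self, gesture: str):
        if self.shielded:
            if gesture == self.UNSHIELD:
                self.shielded = False   # 解除屏蔽,恢复正常交互
                return "unshield_ack"
            return None                 # 其余触摸操作被屏蔽,不上报控制系统
        return gesture                  # 解锁状态下正常上报
```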
进一步地,本申请实施例中,头戴显示设备的控制方法还包括:
步骤S701:根据传感数据确定用户是否佩戴头戴式显示设备;
针对步骤S601,响应于确定触控板20检测到用户的触摸屏蔽操作,生成触摸板锁定指令,包括:
响应于确定触控板20检测到用户的触摸屏蔽操作且确定用户佩戴头戴式显示设备时,生成触摸板锁定指令。
此举可进一步确保在用户佩戴头戴显示设备时才将触控板20锁定。举例来说,检测传感器30检测到头戴显示设备处于佩戴状态,当触控板20检测到第一预设指令(即触摸屏蔽操作)时,头戴显示设备控制触控板20转换为锁定状态,从而可避免触控板20被误操作。此时用户仍然能够与触控板20进行交互,触控板20仍能够识别到用户的交互动作,但是,此时触控板20屏蔽触控板20检测到的除用户的触摸屏蔽解除操作之外的触摸操作,即如果用户输入的操作不是触摸屏蔽解除操作,那么触控板20并不向控制系统发送控制指令,头戴式显示装置不响应用户的交互动作。如果用户输入的操作是触摸屏蔽解除操作,那么触控板20向控制系统发送控制指令,头戴式显示装置响应用户的交互动作,解除触控板20的锁定,触控板20转换为解锁状态。
针对步骤S603,响应于确定触控板20检测到用户的触摸屏蔽解除操作,生成触摸屏蔽解除指令,包括:
响应于确定触控板20检测到用户的触摸屏蔽解除操作且确定用户佩戴头戴式显示设备时,生成触摸屏蔽解除指令。
此举可进一步确保在用户佩戴头戴显示设备时才将触控板20解除锁定。举例来说,检测传感器30检测到头戴显示设备处于佩戴状态,且当触控板20处于锁定状态时,当用户想要再次解锁触控板20时,用户可输入触摸屏蔽解除操作,当触控板20检测到第二预设指令(触摸屏蔽解除操作)时,控制系统生成触摸屏蔽解除指令,控制系统控制触控板20转换为解锁状态,此时,根据触摸屏蔽解除指令解除对触控板20检测到的除用户的触摸操作的屏蔽,用户可以通过触控板20输入相应的各种操作。
需要说明的是,上述两种方式可分别单独实现,也可相互组合在一起实现。
进一步地,为使得用户更加直观地获知触控板20处于何种状态,本申请实施例中,一种可实现的方式是,头戴显示设备的控制方法还包括:
步骤S801:控制头戴式显示设备的显示器显示触摸屏蔽通知;和/或,
步骤S802:控制头戴式显示设备的显示器显示触摸屏蔽解除通知。
头戴显示设备具有显示器40,当控制系统生成触摸屏蔽指令和/或触摸屏蔽解除指令时,显示器40用于显示相应提示画面。头戴显示设备上的显示器40除了用于显示视频画面之外,当触控板20处于解锁或锁定过程中时,显示器40还可显示相应的提示画面,提示画面包括但不限于为相应的解锁或锁定图标、进度条、数值等。提示画面的显示并不会影响正在显示的视频画面,提示画面会以浮窗的形式显示,如显示在视频画面的边缘位置。
有时用户会过于专注于视频画面,从而忽视了提示画面,为进一步地向用户提示触控板20处于何种状态,本申请实施例中,一种可实现的方式是,头戴显示设备具有扬声器,触控板20转换为解锁状态或锁定状态时,扬声器用于发出相应提示声音。头戴显示设备上的扬声器除了用于播放视频画面的声音之外,当触控板20完成解锁或锁定时,扬声器还用于播放相应的提示声音,提示声音包括但不限于为语音提示、蜂鸣声等。提示声音的播放并不会影响正在显示的视频画面的声音,不会影响用户的使用体验。
综上所述,本申请实施例提供的技术方案,通过检测传感器30可检测到头戴显示设备是否被佩戴,从而根据检测结果,设备主体10控制交互部件转换解锁状态或锁定状态,从而方便用户使用的同时,还可防止交互部件被误操作,避免不必要的操作输入,改善用户的使用体验。同时,在交互部件对应的位置处设有标识结构,可起到防止用户误碰作用的同时,还便于用户对交互部件进行定位,在结构上做出差异化,方便用户进行盲操。
如图5所示,本申请实施例还提供一种头戴式显示设备50,包括用于检测佩戴所述头戴式显示设备的用户的触摸操作的触控板51、用于输出传感数据的检测传感器52和处理器53,其中,
所述处理器用于执行以下操作:
根据所述传感数据确定佩戴所述头戴式显示设备的用户是否脱下所述头戴式显示设备;
响应于确定用户脱下所述头戴式显示设备,生成触控板锁定指令;
根据所述触控板锁定指令锁定所述触控板。
在某些实施例中,所述处理器还用于:
根据所述传感数据确定用户是否重新佩戴所述头戴式显示设备;
响应于确定用户重新佩戴所述头戴式显示设备,生成解锁指令;
根据所述解锁指令解除对触控板的锁定。
在某些实施例中,所述检测传感器包括距离传感器、亮度传感器、温度传感器、视觉传感器、虹膜传感器、接触传感器中的一种或多种。
在某些实施例中,所述检测传感器包括距离传感器;
所述处理器具体用于:
确定距离传感器输出的距离数据是否大于或等于预设的距离阈值;
若是时,确定佩戴所述头戴式显示设备的用户脱下所述头戴式显示设备。
在某些实施例中,所述检测传感器包括距离传感器和亮度传感器;
所述处理器具体用于:
确定距离传感器输出的距离数据是否大于或等于预设的距离阈值,确定亮度检测传感器输出的亮度数据是否大于或等于预设的亮度阈值;
若距离数据大于或等于预设的距离阈值且亮度数据大于或等于预设的亮度阈值时,确定佩戴所述头戴式显示设备的用户脱下所述头戴式显示设备。
在某些实施例中,所述处理器还用于:
    获取与所述头戴式显示设备通信连接的终端设备发送的触控板锁定指令,其中,所述触控板锁定指令是所述终端设备检测用户的触控板锁定操作而生成的。
在某些实施例中,所述处理器还用于:
获取触控板锁定指令,其中,所述触控板锁定指令是通过检测所述头戴式显示设备与终端设备之间的通信连接建立而生成的。
在某些实施例中,所述处理器具体用于:
获取所述终端设备发送的触控板锁定指令,其中,所述触控板锁定指令是终端设备检测所述头戴式显示设备与终端设备之间的通信连接建立而生成的。
在某些实施例中,所述处理器具体用于:
确定所述头戴式显示设备是否与终端设备建立通信连接;
响应于确定建立所述通信连接,生成触控板锁定指令。
在某些实施例中,所述处理器还用于:
控制头戴式显示设备的显示器显示所述终端设备发送的图像数据;或者,
控制头戴式显示设备的显示器显示可移动平台的图像采集装置采集的图像数据,其中,所述终端设备用于控制可移动平台。
在某些实施例中,所述处理器还用于:
响应于确定所述触控板检测到用户的触摸屏蔽操作,生成触摸板锁定指令;
根据所述触摸板锁定指令屏蔽所述触控板检测到的除用户的触摸屏蔽解除操作之外的触摸操作;和/或,
响应于确定所述触控板检测到用户的触摸屏蔽解除操作,生成触摸屏蔽解除指令;
根据所述触摸屏蔽解除指令解除对所述触控板检测到的除用户的所述触摸操作的屏蔽。
在某些实施例中,所述处理器还用于:
根据所述传感数据确定用户是否佩戴所述头戴式显示设备;
所述处理器具体用于:
响应于确定所述触控板检测到用户的触摸屏蔽操作且确定所述用户佩戴所述头戴式显示设备时,生成触摸板锁定指令;和/或,
响应于确定所述触控板检测到用户的触摸屏蔽解除操作且确定所述用户佩戴所述头戴式显示设备时,生成触摸屏蔽解除指令。
在某些实施例中,所述处理器还用于:
控制头戴式显示设备的显示器显示触摸屏蔽通知;和/或,
控制头戴式显示设备的显示器显示触摸屏蔽解除通知。
在某些实施例中,所述检测传感器包括设置在所述头戴式显示设备的两组光学组件之间的距离传感器。
其中,所述头戴式显示设备50可以执行如前所述头戴式显示设备的控制方法,另外,所述头戴式显示设备50可以具有如前所述的头戴式显示装置的任何特征。
最后应说明的是:以上实施例仅用以说明本申请的技术方案,而非对其限制;尽管参照前述实施例对本申请进行了详细的说明,本领域的普通技术人员应当理解:其依然可以对前述各实施例所记载的技术方案进行修改,或者对其中部分技术特征进行等同替换;而这些修改或者替换,并不使相应技术方案的本质脱离本申请各实施例技术方案的精神和范围。

Claims (38)

  1. 一种头戴显示设备,其特征在于,包括:
    设备主体,所述设备主体具有朝向使用者面部的佩戴面及多个外侧面;
    交互部件,至少一个所述外侧面上设有所述交互部件;
    检测传感器,所述佩戴面上设有至少一个所述检测传感器;
    其中,当所述检测传感器检测到所述设备主体处于佩戴状态时,所述设备主体控制所述交互部件转换为解锁状态;
    当所述检测传感器检测到所述设备主体处于非佩戴状态时,所述设备主体控制所述交互部件转换为锁定状态。
  2. 根据权利要求1所述的头戴显示设备,其特征在于,所述佩戴面上具有与所述使用者眉心位置对应的眉心区,所述眉心区处设有所述检测传感器。
  3. 根据权利要求1所述的头戴显示设备,其特征在于,所述检测传感器包括距离传感器、亮度传感器、温度传感器、视觉传感器、虹膜传感器、接触传感器中的一种或多种。
  4. 根据权利要求1至3中任一项所述的头戴显示设备,其特征在于,所述交互部件为触控板、控制按钮及声音采集器中的至少一个。
  5. 根据权利要求4所述的头戴显示设备,其特征在于,当所述交互部件为触控板时,设置有所述触控板的所述外侧面上,对应于所述触控板的周向外周设有第一标识结构;
    所述第一标识结构与所述触控板的周向边缘处相接;或者,
    所述第一标识结构与所述触控板的周向边缘处间隔第一距离。
  6. 根据权利要求5所述的头戴显示设备,其特征在于,所述第一标识结构为环设于所述触控板的周向外周的环状的第一凸起结构或第一凹槽结构;或者,
    所述第一标识结构为沿着所述触控板的周向外周间隔排列设置的多个第二凸起结构和/或多个第二凹槽结构。
  7. 根据权利要求4所述的头戴显示设备,其特征在于,所述触控板上设有第二标识结构,所述第二标识结构为设置在所述触控板上的第三凸起结构或第三凹槽结构。
  8. 根据权利要求1至3中任一项所述的头戴显示设备,其特征在于,所述检测传感器检测到所述设备主体处于佩戴状态,当所述交互部件检测到第一预设指令时,所述设备主体控制所述交互部件转换为锁定状态;
    当所述交互部件检测到第二预设指令时,所述设备主体控制所述交互部件转换为解锁状态。
  9. 根据权利要求8所述的头戴显示设备,其特征在于,所述设备主体具有显示器,当所述交互部件检测到所述第一预设指令或所述第二预设指令时,所述显示器用于显示相应提示画面。
  10. 根据权利要求1至3中任一项所述的头戴显示设备,其特征在于,所述设备主体具有扬声器,所述交互部件转换为解锁状态或锁定状态时,所述扬声器用于发出相应提示声音。
  11. 一种头戴式显示设备的控制方法,所述头戴式显示设备包括用于检测佩戴所述头戴式显示设备的用户的触摸操作的触控板和用于输出传感数据的检测传感器,其特征在于,包括:
    根据所述传感数据确定佩戴所述头戴式显示设备的用户是否脱下所述头戴式显示设备;
    响应于确定用户脱下所述头戴式显示设备,生成触控板锁定指令;
    根据所述触控板锁定指令锁定所述触控板。
  12. 根据权利要求11所述的方法,其特征在于,所述方法还包括:
    根据所述传感数据确定用户是否重新佩戴所述头戴式显示设备;
    响应于确定用户重新佩戴所述头戴式显示设备,生成解锁指令;
    根据所述解锁指令解除对触控板的锁定。
  13. 根据权利要求11所述的方法,其特征在于,所述检测传感器包括距离传感器、亮度传感器、温度传感器、视觉传感器、虹膜传感器、接触传感器中的一种或多种。
  14. 根据权利要求13所述的方法,其特征在于,所述检测传感器包括距离传感器;
    所述根据所述传感数据确定佩戴所述头戴式显示设备的用户是否脱下所述头戴式显示设备,包括:
    确定距离传感器输出的距离数据是否大于或等于预设的距离阈值;
    若是时,确定佩戴所述头戴式显示设备的用户脱下所述头戴式显示设备。
  15. 根据权利要求13所述的方法,其特征在于,所述检测传感器包括距离传感器和亮度传感器;
    所述根据所述传感数据确定佩戴所述头戴式显示设备的用户是否脱下所述头戴式显示设备,包括:
    确定距离传感器输出的距离数据是否大于或等于预设的距离阈值,确定亮度检测传感器输出的亮度数据是否大于或等于预设的亮度阈值;
    若距离数据大于或等于预设的距离阈值且亮度数据大于或等于预设的亮度阈值时,确定佩戴所述头戴式显示设备的用户脱下所述头戴式显示设备。
  16. 根据权利要求11所述的方法,其特征在于,所述方法还包括:
    获取与所述头戴式显示设备通信连接的终端设备发送的触控板锁定指令,其中,所述触控板锁定指令是所述终端设备检测用户的触控板锁定操作而生成的。
  17. 根据权利要求11所述的方法,其特征在于,所述方法还包括:
    获取触控板锁定指令,其中,所述触控板锁定指令是通过检测所述头戴式显示设备与终端设备之间的通信连接建立而生成的。
  18. 根据权利要求17所述的方法,其特征在于,所述获取触控板锁定指令,包括:
    获取所述终端设备发送的触控板锁定指令,其中,所述触控板锁定指令是终端设备检测所述头戴式显示设备与终端设备之间的通信连接建立而生成的。
  19. 根据权利要求17所述的方法,其特征在于,所述获取触控板锁定指令,包括:
    确定所述头戴式显示设备是否与终端设备建立通信连接;
    响应于确定建立所述通信连接,生成触控板锁定指令。
  20. 根据权利要求17所述的方法,其特征在于,所述方法还包括:
    控制头戴式显示设备的显示器显示所述终端设备发送的图像数据;或者,
    控制头戴式显示设备的显示器显示可移动平台的图像采集装置采集的图像数据,其中,所述终端设备用于控制可移动平台。
  21. 根据权利要求11所述的方法,其特征在于,所述方法还包括:
    响应于确定所述触控板检测到用户的触摸屏蔽操作,生成触摸板锁定指令;
    根据所述触摸板锁定指令屏蔽所述触控板检测到的除用户的触摸屏蔽解除操作之外的触摸操作;和/或,
    响应于确定所述触控板检测到用户的触摸屏蔽解除操作,生成触摸屏蔽解除指令;
    根据所述触摸屏蔽解除指令解除对所述触控板检测到的除用户的所述触摸操作的屏蔽。
  22. 根据权利要求21所述的方法,其特征在于,所述方法还包括:
    根据所述传感数据确定用户是否佩戴所述头戴式显示设备;
    所述响应于确定所述触控板检测到用户的触摸屏蔽操作,生成触摸板锁定指令,包括:
    响应于确定所述触控板检测到用户的触摸屏蔽操作且确定所述用户佩戴所述头戴式显示设备时,生成触摸板锁定指令;和/或,
    所述响应于确定所述触控板检测到用户的触摸屏蔽解除操作,生成触摸屏蔽解除指令,包括:
    响应于确定所述触控板检测到用户的触摸屏蔽解除操作且确定所述用户佩戴所述头戴式显示设备时,生成触摸屏蔽解除指令。
  23. 根据权利要求11所述的方法,其特征在于,所述方法还包括:
    控制头戴式显示设备的显示器显示触摸屏蔽通知;和/或,
    控制头戴式显示设备的显示器显示触摸屏蔽解除通知。
  24. 根据权利要求11所述的方法,其特征在于,所述检测传感器包括设置在所述头戴式显示设备的两组光学组件之间的距离传感器。
  25. 一种头戴式显示设备,其特征在于,包括用于检测佩戴所述头戴式显示设备的用户的触摸操作的触控板、用于输出传感数据的检测传感器和处理器;其中,
    所述处理器,用于执行以下操作:
    根据所述传感数据确定佩戴所述头戴式显示设备的用户是否脱下所述头戴式显示设备;
    响应于确定用户脱下所述头戴式显示设备,生成触控板锁定指令;
    根据所述触控板锁定指令锁定所述触控板。
  26. 根据权利要求25所述的头戴式显示设备,其特征在于,所述处理器还用于:
    根据所述传感数据确定用户是否重新佩戴所述头戴式显示设备;
    响应于确定用户重新佩戴所述头戴式显示设备,生成解锁指令;
    根据所述解锁指令解除对触控板的锁定。
  27. 根据权利要求25所述的头戴式显示设备,其特征在于,所述检测传感器包括距离传感器、亮度传感器、温度传感器、视觉传感器、虹膜传感器、接触传感器中的一种或多种。
  28. 根据权利要求27所述的头戴式显示设备,其特征在于,所述检测传感器包括距离传感器;
    所述处理器具体用于:
    确定距离传感器输出的距离数据是否大于或等于预设的距离阈值;
    若是时,确定佩戴所述头戴式显示设备的用户脱下所述头戴式显示设备。
  29. 根据权利要求27所述的头戴式显示设备,其特征在于,所述检测传感器包括距离传感器和亮度传感器;
    所述处理器具体用于:
    确定距离传感器输出的距离数据是否大于或等于预设的距离阈值,确定亮度检测传感器输出的亮度数据是否大于或等于预设的亮度阈值;
    若距离数据大于或等于预设的距离阈值且亮度数据大于或等于预设的亮度阈值时,确定佩戴所述头戴式显示设备的用户脱下所述头戴式显示设备。
  30. 根据权利要求25所述的头戴式显示设备,其特征在于,所述处理器还用于:
    获取与所述头戴式显示设备通信连接的终端设备发送的触控板锁定指令,其中,所述触控板锁定指令是所述终端设备检测用户的触控板锁定操作而生成的。
  31. 根据权利要求25所述的头戴式显示设备,其特征在于,所述处理器还用于:
    获取触控板锁定指令,其中,所述触控板锁定指令是通过检测所述头戴式显示设备与终端设备之间的通信连接建立而生成的。
  32. 根据权利要求31所述的头戴式显示设备,其特征在于,所述处理器具体用于:
    获取所述终端设备发送的触控板锁定指令,其中,所述触控板锁定指令是终端设备检测所述头戴式显示设备与终端设备之间的通信连接建立而生成的。
  33. 根据权利要求31所述的头戴式显示设备,其特征在于,所述处理器具体用于:
    确定所述头戴式显示设备是否与终端设备建立通信连接;
    响应于确定建立所述通信连接,生成触控板锁定指令。
  34. 根据权利要求31所述的头戴式显示设备,其特征在于,所述处理器还用于:
    控制头戴式显示设备的显示器显示所述终端设备发送的图像数据;或者,
    控制头戴式显示设备的显示器显示可移动平台的图像采集装置采集的图像数据,其中,所述终端设备用于控制可移动平台。
  35. 根据权利要求25所述的头戴式显示设备,其特征在于,所述处理器还用于:
    响应于确定所述触控板检测到用户的触摸屏蔽操作,生成触摸板锁定指令;
    根据所述触摸板锁定指令屏蔽所述触控板检测到的除用户的触摸屏蔽解除操作之外的触摸操作;和/或,
    响应于确定所述触控板检测到用户的触摸屏蔽解除操作,生成触摸屏蔽解除指令;
    根据所述触摸屏蔽解除指令解除对所述触控板检测到的除用户的所述触摸操作的屏蔽。
  36. 根据权利要求35所述的头戴式显示设备,其特征在于,所述处理器还用于:
    根据所述传感数据确定用户是否佩戴所述头戴式显示设备;
    所述处理器具体用于:
    响应于确定所述触控板检测到用户的触摸屏蔽操作且确定所述用户佩戴所述头戴式显示设备时,生成触摸板锁定指令;和/或,
    响应于确定所述触控板检测到用户的触摸屏蔽解除操作且确定所述用户佩戴所述头戴式显示设备时,生成触摸屏蔽解除指令。
  37. 根据权利要求25所述的头戴式显示设备,其特征在于,所述处理器还用于:
    控制头戴式显示设备的显示器显示触摸屏蔽通知;和/或,
    控制头戴式显示设备的显示器显示触摸屏蔽解除通知。
  38. 根据权利要求25所述的头戴式显示设备,其特征在于,所述检测传感器包括设置在所述头戴式显示设备的两组光学组件之间的距离传感器。
PCT/CN2022/083108 2022-03-25 2022-03-25 一种头戴显示设备及控制方法 WO2023178667A1 (zh)

Priority Applications (1)

Application Number Priority Date Filing Date Title
PCT/CN2022/083108 WO2023178667A1 (zh) 2022-03-25 2022-03-25 一种头戴显示设备及控制方法


Publications (1)

Publication Number Publication Date
WO2023178667A1 true WO2023178667A1 (zh) 2023-09-28

Family

ID=88099582

Family Applications (1)

Application Number Title Priority Date Filing Date
PCT/CN2022/083108 WO2023178667A1 (zh) 2022-03-25 2022-03-25 一种头戴显示设备及控制方法

Country Status (1)

Country Link
WO (1) WO2023178667A1 (zh)

Citations (6)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN103064187A (zh) * 2011-10-24 2013-04-24 索尼公司 头戴式显示器及用于控制头戴式显示器的方法
CN104704450A (zh) * 2012-10-10 2015-06-10 奥林巴斯株式会社 头戴式显示装置、锁定解除处理系统、程序、头戴式显示装置的控制方法及锁定解除处理系统的控制方法
CN105212450A (zh) * 2015-09-25 2016-01-06 联想(北京)有限公司 电子设备及调整方法
JP2016167844A (ja) * 2016-04-21 2016-09-15 ソニー株式会社 ヘッド・マウント・ディスプレイ及びヘッド・マウント・ディスプレイの制御方法、並びに表示システム及びその制御方法
US9590680B1 (en) * 2007-08-22 2017-03-07 Plantronics, Inc. Don doff controlled headset user interface
CN107995956A (zh) * 2016-09-14 2018-05-04 深圳市柔宇科技有限公司 头戴式显示装置及控制方法



Legal Events

Date Code Title Description
121 Ep: the epo has been informed by wipo that ep was designated in this application

Ref document number: 22932728

Country of ref document: EP

Kind code of ref document: A1