WO2017051721A1 - Information processing device, information processing method, and program - Google Patents

Information processing device, information processing method, and program

Info

Publication number
WO2017051721A1
Authority
WO
WIPO (PCT)
Prior art keywords
user
control unit
unit
information processing
processing apparatus
Prior art date
Application number
PCT/JP2016/076559
Other languages
English (en)
Japanese (ja)
Inventor
翼 塚原
木村 淳
Original Assignee
ソニー株式会社
Priority date
Filing date
Publication date
Application filed by ソニー株式会社
Publication of WO2017051721A1

Classifications

    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06FELECTRIC DIGITAL DATA PROCESSING
    • G06F3/00Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
    • G06F3/01Input arrangements or combined input and output arrangements for interaction between user and computer
    • G06F3/048Interaction techniques based on graphical user interfaces [GUI]
    • G06F3/0487Interaction techniques based on graphical user interfaces [GUI] using specific features provided by the input device, e.g. functions controlled by the rotation of a mouse with dual sensing arrangements, or of the nature of the input device, e.g. tap gestures based on pressure sensed by a digitiser
    • G06F3/0488Interaction techniques based on graphical user interfaces [GUI] using specific features provided by the input device, e.g. functions controlled by the rotation of a mouse with dual sensing arrangements, or of the nature of the input device, e.g. tap gestures based on pressure sensed by a digitiser using a touch-screen or digitiser, e.g. input of commands through traced gestures

Definitions

  • the present technology relates to an information processing device, an information processing method, and a program, and more particularly, to an information processing device, an information processing method, and a program that can prevent an erroneous operation on an HMD.
  • the HMD has a function of providing an image to a user who wears the HMD.
  • Patent Document 1 discloses an HMD that includes a touch sensor on the side surface of a spectacle-shaped frame.
  • This technology has been made in view of such a situation, and is intended to prevent erroneous operation on the HMD.
  • an information processing apparatus according to the present technology includes an output control unit that suppresses, based on information on the user's motion state, the reflection on output of an operation on an operation reception unit provided in a housing attached to the user's head.
  • an information processing method according to the present technology includes a step of suppressing, based on information on the user's motion state, the reflection on output of an operation on an operation reception unit provided in a housing attached to the user's head.
  • a program according to the present technology causes a computer to execute processing including a step of suppressing, based on information on the user's motion state, the reflection on output of an operation on an operation reception unit provided in a housing attached to the user's head.
  • the reflection of the operation to the operation reception unit provided in the housing mounted on the user's head is suppressed based on the information on the user's motion state.
  • FIG. 1 shows a configuration example of a display module according to the present embodiment.
  • the display module 10 and the eyeglass-type frame 20 constitute an HMD (Head-Mounted Display).
  • the display module 10 as an information processing apparatus is attached to the right temple part (right eye side) of the pair of temple parts of the eyeglass-type frame 20 attached to the user's head.
  • the display module 10 includes a control unit 11, a display unit 12, and an optical unit 13.
  • the control unit 11 is connected to the display unit 12 and the eyeglass-type frame 20. Specifically, the end of the casing of the control unit 11 in the long side direction is coupled to the display unit 12, and the side surface of the casing of the control unit 11 is connected to the temple portion of the glasses-type frame 20 via a connecting member.
  • the control unit 11 incorporates a control board for controlling the operation of the display module 10.
  • the control unit 11 includes a control board having a CPU (Central Processing Unit) or a RAM (Random Access Memory).
  • the touch surface 21a of the touch sensor 21 is provided on the upper surface of the casing of the control unit 11. As will be described later, a touch surface 21b (not shown) of the touch sensor 21 is provided on the lower surface of the housing of the control unit 11.
  • the touch surfaces 21a and 21b receive user operations.
  • the touch surfaces 21a and 21b are recessed below the surface of the casing of the control unit 11 on which they are provided, so that the user can recognize the positions of the touch surfaces 21a and 21b by touching the surface of the casing.
  • the touch surfaces 21a and 21b may be made of a material different from the surface of the housing of the control unit 11 on which the touch surfaces 21a and 21b are provided.
  • the control unit 11 is connected to the optical unit 13 via the display unit 12.
  • the display unit 12 connects the control unit 11 and the optical unit 13 and supports the optical unit 13. Specifically, the display unit 12 is coupled to the end of the control unit 11 and the end of the optical unit 13 to fix the optical unit 13.
  • the display unit 12 includes a signal line for communicating data relating to an image provided from the control unit 11 to the optical unit 13 and a display for projecting the image onto the optical unit 13.
  • the optical unit 13 projects image light to the outside of the display module 10. Specifically, the optical unit 13 projects the image light provided from the control unit 11 via the display unit 12 through the eyepiece to the outside of the display module 10, that is, to the eyes of the user wearing the display module 10.
  • FIG. 2 is a perspective view showing a configuration example of the optical unit 13
  • FIG. 3 is a cross-sectional view showing a configuration example of the optical unit 13.
  • FIG. 4 is a diagram illustrating an example of a state in which the display module 10 is mounted on the user.
  • the optical unit 13 includes a light guide unit 31, a reflection unit 32, and an eyepiece 33.
  • the projection unit 41 is a display panel that displays an image provided from the control unit 11, and emits image light toward the light guide unit 31 by displaying the image.
  • the projection unit 41 is configured by, for example, an organic EL (Electro-Luminescence) display.
  • the light guide unit 31 guides the incident image light to the reflection unit 32.
  • the light guide unit 31 is, for example, a columnar member as shown in FIG. 2. As shown in FIG. 3, the light guide unit 31 guides the image light so that the image light projected from the projection unit 41 reaches the reflection unit 32 without leaking outside the display module 10.
  • the light guide part 31 may be a cylindrical member having a hollow inside, or may be a transparent member that transmits image light.
  • the reflection unit 32 reflects the reaching image light toward the eyepiece 33.
  • the reflection unit 32 reflects the image light guided by the light guide unit 31 toward the position of the eyepiece lens 34.
  • the eyepiece 33 enlarges the image. Specifically, the eyepiece 33 refracts the image light reflected by the reflection unit 32 and enlarges the image related to the image light.
  • the optical unit 13 is formed such that, in the region from which light is emitted toward the user's pupil, the length in the shorter direction is equal to or less than the upper limit of pupil diameter variation. Specifically, the optical unit 13 is formed so that its width in the short direction is equal to or less than the average pupil diameter.
  • for example, the optical unit 13 is formed such that the length L2 of its width in the short direction is equal to or less than the average pupil diameter L1. Since the human pupil diameter L1 generally varies in the range of about 2 mm to 8 mm, L2 is set to 8 mm or less, for example, 4 mm. Further, in the optical unit 13, the shape of the region from which light is emitted may be a rectangle, a circle, an ellipse, or another polygon.
  • FIG. 5 is a cross-sectional view showing an example of the internal configuration of the control unit 11.
  • a touch sensor 21 is provided on the front side of the control unit 11 (the lens part side of the spectacles-type frame 20).
  • the touch surface 21a of the touch sensor 21 is provided on the first surface (the upper surface in FIG. 5) of the housing of the control unit 11, and the touch surface 21b is provided on the second surface (the lower surface in FIG. 5), on the opposite side of the housing from the first surface.
  • the touch sensor 21 is a capacitive touch sensor that detects a capacitance generated between the electrode and the object.
  • the touch surfaces 21a and 21b are configured by arranging a plurality of touch electrodes on each of two substrates provided on the upper surface side and the lower surface side of the housing of the control unit 11.
  • the two substrates constituting each of the touch surfaces 21a and 21b are electrically connected to each other by a flexible flat cable (FFC) 51.
  • a sensor chip may be provided for each substrate constituting one touch surface.
  • the wireless antenna 52 is provided on the rear side of the control unit 11 (on the side opposite to the lens portion of the spectacles frame 20).
  • the wireless antenna 52 performs wireless communication with an external device (for example, a mobile terminal owned by a user) using, for example, Wi-Fi (registered trademark) or Bluetooth (registered trademark).
  • the display module 10 including the control unit 11 can be connected to any one of the temple portions of the eyeglass-type frame 20.
  • the display module 10 is assumed to be attached to the right temple portion (right eye side) of the temple portions of the eyeglass-type frame 20. Further, as shown in FIG. 6, the display module 10 can be attached to the left temple portion (left eye side) of the temple portions of the eyeglass-type frame 20.
  • in this case, the touch surface 21b of the touch sensor 21 is provided on the upper surface of the housing of the control unit 11, and the touch surface 21a (not shown) of the touch sensor 21 is provided on the lower surface of the housing of the control unit 11.
  • the display module 10 can thus be attached to either the left or right temple portion of the glasses-type frame 20, and in either case the user can operate the touch sensor 21 with the same operability.
  • FIG. 7 illustrates a functional configuration example of the display module 10 according to the first embodiment of the present technology.
  • the display module 10 shown in FIG. 7 includes operation reception units 111a and 111b, a mounting state determination unit 112, a display control unit 113, and a display unit 114.
  • the operation reception units 111a and 111b correspond to the touch surfaces 21a and 21b of the touch sensor 21.
  • the operation reception units 111a and 111b receive user operations such as taps, long taps, and swipes.
  • the operation reception units 111a and 111b supply a signal corresponding to the received user operation to the display control unit 113.
  • the mounting state determination unit 112 determines the mounting state of the housing of the control unit 11. Specifically, the wearing state determination unit 112 determines which of the pair of temple portions of the eyeglass-type frame 20 the control unit 11 (housing) is connected to. The wearing state determination unit 112 supplies a signal representing the determination result to the display control unit 113.
  • the wearing state determination unit 112 is configured to include an acceleration sensor, and determines which of the temple units is connected by determining the posture of the housing based on the output of the acceleration sensor.
  • alternatively, the wearing state determination unit 112 may be configured to include a gyro sensor, and may determine which temple portion the housing is connected to by determining the attitude of the housing based on the output of the gyro sensor.
  • the wearing state determining unit 112 may determine which of the temple units is connected by performing near field communication such as NFC (Near Field Communication) with the glasses-type frame 20. Further, the mounting state determination unit 112 may determine which of the temple units is connected based on the positional relationship between the control unit 11 and the optical unit 13.
  • the wearing state determination unit 112 may also determine which temple portion is connected based on information preset by the user. For example, in a setting mode in which various settings of the display module 10 are made, the user can select which temple portion the housing is connected to, and the wearing state determination unit 112 may make the determination based on that selection.
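  • as a concrete illustration of the acceleration-sensor approach, the following minimal Python sketch infers the mounting side from the gravity direction. It is not the patent's implementation; the sensor-reading function, the axis convention, and the sign convention are all assumptions made for the example.

```python
# Hypothetical sketch: infer which temple portion the housing of the
# control unit 11 is connected to from a 3-axis acceleration sensor.
# Assumed convention: the sensor's y-axis points out of touch surface
# 21a, so gravity projects onto it with opposite signs for the two
# mounting sides.

def read_accelerometer():
    """Placeholder for the real sensor driver (assumed API)."""
    return (0.2, -9.7, 0.4)  # (ax, ay, az) in m/s^2

def determine_mounting_side():
    _, ay, _ = read_accelerometer()
    # While the frame is worn, gravity dominates the reading; its sign
    # along y indicates whether surface 21a faces up or down.
    return "right" if ay < 0 else "left"

print(determine_mounting_side())  # -> "right" for the sample reading
```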
  • the display control unit 113 is configured as an output control unit that controls output according to an operation on the operation receiving units 111a and 111b, depending on the determined mounting state of the control unit 11 (housing). Specifically, the display control unit 113 controls the display of an image on the display unit 114 based on signals from the operation receiving units 111a and 111b and the wearing state determination unit 112.
  • the display unit 114 corresponds to the display unit 12 and the optical unit 13. As shown in FIGS. 1 and 6, the display unit 114 may be provided in the vicinity of the lens unit of the spectacles-type frame 20, or may be provided in the lens unit of the spectacles-type frame 20 itself.
  • Next, display processing of the display module 10 will be described with reference to the flowchart of FIG. 8. The process of FIG. 8 is started when, for example, the display module 10 (the housing of the control unit 11) is connected to one of the temple portions of the eyeglass-type frame 20.
  • in step S11, the wearing state determination unit 112 determines whether the housing of the control unit 11 is connected to the left or right temple portion.
  • the wearing state determination unit 112 supplies a signal representing the determination result to the display control unit 113.
  • in step S12, the display control unit 113 displays a screen on the display unit 114 in an orientation corresponding to which of the left and right temple portions is connected, based on the signal from the mounting state determination unit 112.
  • note that the output of the acceleration sensor or gyro sensor constituting the mounting state determination unit 112 is also handled as inverted depending on which of the left and right temple portions the casing is connected to. Specifically, the output of the acceleration sensor or gyro sensor itself may be inverted, or the display control unit 113 may treat the output of the acceleration sensor or gyro sensor as inverted.
  • the display control unit 113 may display on the display unit 114 information indicating which of the left and right temple units the casing of the control unit 11 is connected to.
  • for example, when the housing of the control unit 11 is connected to the left temple portion as in the upper part of FIG. 9, the wording 132 of "left side mode", indicating that the housing is connected to the left temple portion, is displayed at the upper left of the screen displayed on the display unit 114, as shown in the lower part of FIG. 9.
  • the screen shown in FIG. 9 is an example of a menu screen of an application related to display provided by the display module 10.
  • icons representing the menu B, the menu C, and the menu D are displayed.
  • the icon representing the menu C is displayed larger than the other icons in the center of the menu screen and is in a selectable state.
  • the icon displayed on the menu screen is scroll-displayed in the horizontal direction (left-right direction) by a user operation on the touch sensor 21.
  • in step S13, the display control unit 113 determines whether the downward-facing one of the touch surfaces 21a and 21b of the touch sensor 21 has been operated. Specifically, when the housing of the control unit 11 is connected to the right temple portion (FIG. 1), it is determined whether the touch surface 21b has been operated. On the other hand, when the housing of the control unit 11 is connected to the left temple portion (FIG. 6), it is determined whether the touch surface 21a has been operated.
  • if it is determined in step S13 that the downward-facing touch surface has been operated, the process proceeds to step S14.
  • in step S14, the display control unit 113 performs screen display on the display unit 114 in accordance with the user's operation on the downward-facing touch surface.
  • on the other hand, if it is determined in step S13 that the downward-facing touch surface has not been operated, the display control unit 113 does nothing until that surface is operated. Therefore, even if the upward-facing touch surface is operated, the menu screen shown in the upper parts of FIGS. 10 and 11 does not scroll.
  • the display control unit 113 thus suppresses the reflection on the output of operations on the upward-facing touch surface. Specifically, the display control unit 113 invalidates operations on the upward-facing touch surface: even if that surface is operated, the display control unit 113 either does not determine that an operation occurred or does not perform display control after such a determination. As a result, the user's operation on the upward-facing touch surface is not reflected on the screen displayed on the display unit 114. Alternatively, the display control unit 113 may turn off the power supply to the electrodes corresponding to the upward-facing touch surface of the touch sensor 21, so that the user's operation on that surface is not reflected on the screen displayed on the display unit 114.
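  • the branch of steps S13 and S14 can be pictured with a short sketch, under the same assumptions as the earlier example. The event type and the display-control helper are hypothetical names for illustration, not APIs from the patent.

```python
# Sketch of steps S13/S14: only the downward-facing touch surface
# drives the display; events from the upward-facing surface are
# invalidated (silently discarded).

from dataclasses import dataclass

@dataclass
class TouchEvent:
    surface: str    # "21a" or "21b"
    delta_x: float  # finger movement for swipes

def downward_surface(side):
    # Right temple -> touch surface 21b faces down;
    # left temple  -> touch surface 21a faces down.
    return "21b" if side == "right" else "21a"

def update_screen(event):
    """Stand-in for the real display control of the display unit 114."""
    return f"scrolled by {event.delta_x}"

def handle_event(event, side):
    if event.surface != downward_surface(side):
        return None              # S13 "no": operation is not reflected
    return update_screen(event)  # S14: screen display per the operation
```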
  • in this way, whichever temple portion the housing is connected to, screen display is performed in response to operations on whichever of the two touch surfaces faces downward.
  • the user can therefore obtain the same screen display with the same operation in either case, and the operability of the HMD can be further improved.
  • in the above, the display control unit 113 invalidates operations on the operation reception unit 111a or 111b so that the user's operation is not reflected on the screen displayed on the display unit 114.
  • alternatively, instead of not reflecting the user's operation on the screen at all, the operation may simply be made harder to reflect.
  • the display control unit 113 reduces the gain of the output amount with respect to the operation input amount of the operation receiving unit 111a or 111b based on the mounting state of the housing of the control unit 11.
  • the display control unit 113 may tighten the detection conditions for the operation to the operation receiving unit 111a or 111b based on the mounting state of the housing of the control unit 11. For example, based on the mounting state of the casing of the control unit 11, the capacitance value serving as a reference for detecting a user operation (tap or swipe) on the touch surface facing upward is increased.
  • the display control unit 113 may reduce the range in which an operation to the operation receiving unit 111a or 111b can be detected based on the mounting state of the housing of the control unit 11.
  • the display control unit 113 may lower the detection sensitivity of the operation to the operation receiving unit 111a or 111b based on the mounting state of the housing of the control unit 11.
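  • the softer alternatives above (reduced gain, stricter detection conditions, lower sensitivity) amount to per-surface parameters, as in the sketch below. The numeric values are illustrative assumptions, not values from the patent.

```python
# Sketch: per-surface parameters that make operations on the
# upward-facing surface harder to reflect instead of invalid.

GAIN = {"down": 1.0, "up": 0.2}          # output amount per input amount
TAP_THRESHOLD = {"down": 10, "up": 40}   # capacitance delta for a tap

def scroll_output(facing, finger_delta):
    return GAIN[facing] * finger_delta   # reduced gain when facing up

def detect_tap(facing, capacitance_delta):
    return capacitance_delta >= TAP_THRESHOLD[facing]  # stricter when up
```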
  • the operation reception units 111a and 111b correspond to the touch surfaces 21a and 21b of the touch sensor 21, but may correspond to other configurations.
  • the operation receiving units 111a and 111b may be physical input devices such as buttons and dials.
  • the operation receiving units 111a and 111b may be sensors that detect a user's gesture or line of sight by detecting visible light, infrared light, radio waves, sound waves, or the like.
  • for example, a swipe operation in the left-right direction, that is, a planar operation on the touch surface, may be accepted. Further, the direction in which a finger moves away from the touch surface may be detected, and a three-dimensional operation on the touch surface may be accepted.
  • here, the housing of the control unit 11 is assumed to have a structure connectable to either the left or right temple portion such that the front-rear orientation of the display unit 12 and the optical unit 13 is switched and the surface on which a touch surface is provided always faces downward.
  • in this case, the direction of finger movement relative to the touch surface, as detected by the touch surface, is reversed depending on whether the housing of the control unit 11 is connected to the left temple portion or the right temple portion.
  • the display control unit 113 switches the output (screen display) in response to operations on the touch surface so as to be reversed based on the mounting state of the housing of the control unit 11. For example, the display control unit 113 reverses the display control for the signal from the touch surface depending on whether the housing of the control unit 11 is connected to the right temple portion or the left temple portion.
  • for example, suppose the casing of the control unit 11 is connected to the right temple portion and the menu screen shown in the upper part of FIG. 10 is displayed. When the direction of finger movement detected by the touch surface is from the far side toward the near side of the connecting portion with the temple portion, the menu screen scrolls to the left, and the icon representing menu C, which was displayed on the right side of the menu screen, is displayed enlarged in the center of the menu screen.
  • that is, the display control unit 113 may invert the signal from the touch surface, or may invert the screen display itself, depending on whether the housing of the control unit 11 is connected to the right temple portion or the left temple portion.
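  • in code, this inversion reduces to flipping the sign of the detected finger delta for one mounting side, as in this minimal sketch; the convention that the right temple needs no flip is an assumption for illustration.

```python
# Sketch: flip the sign of the detected finger delta for one mounting
# side so the same physical swipe always scrolls the same way.

def effective_delta(raw_delta, side):
    return raw_delta if side == "right" else -raw_delta
```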
  • FIG. 12 is an external view showing another configuration example of the display module according to the first embodiment.
  • the display module 10 shown in FIG. 12 further includes a camera 151 in addition to the configuration of the display module 10 shown in FIG.
  • the camera 151 is provided on the front surface of the housing of the control unit 11 (the lens-unit side of the eyeglass-type frame 20) so that its lens is exposed.
  • the camera 151 captures an image of the visual field direction of the eyeglass-type frame 20.
  • FIG. 13 shows a functional configuration example of the display module 10 shown in FIG.
  • the display module 10 shown in FIG. 13 further includes an imaging unit 171 and a recording unit 172 in addition to the configuration of the display module 10 shown in FIG.
  • the imaging unit 171 corresponds to the camera 151.
  • the imaging unit 171 supplies the captured image data to the display control unit 113.
  • the display control unit 113 displays an image on the display unit 114 based on the data from the imaging unit 171.
  • the imaging unit 171 supplies the captured image data to the recording unit 172. That is, the recording unit 172 records the image captured by the imaging unit 171.
  • Next, image capture processing of the display module 10 will be described with reference to the flowchart of FIG. 14. The process is started when, for example, the display module 10 (the housing of the control unit 11) is connected to one of the temple portions of the eyeglass-type frame 20.
  • in step S31, the mounting state determination unit 112 determines whether the housing (control unit 11) is connected to the left or right temple portion.
  • the wearing state determination unit 112 supplies a signal representing the determination result to the display control unit 113.
  • in step S32, based on the signal from the mounting state determination unit 112, the display control unit 113 causes the display unit 114 to display the captured image, a so-called through image (live view), in an orientation corresponding to which of the left and right temple portions is connected.
  • since the top-bottom direction, as viewed from the user, of the image light projected from the optical unit 13 depends on the mounting side, the display control unit 113 switches the top-bottom direction of the through image displayed on the display unit 114 depending on which of the left and right temple portions the casing is connected to.
  • the display control unit 113 may display on the display unit 114 information indicating which of the left and right temple units the casing of the control unit 11 is connected to.
  • in step S33, the display control unit 113 determines whether the downward-facing one of the touch surfaces 21a and 21b of the touch sensor 21 has been tapped, for example. Specifically, when the housing of the control unit 11 is connected to the right temple, it is determined whether the touch surface 21b has been operated. On the other hand, when the housing of the control unit 11 is connected to the left temple, it is determined whether the touch surface 21a has been operated.
  • if it is determined in step S33 that the downward-facing touch surface has been operated, the process proceeds to step S34.
  • in step S34, the imaging unit 171 causes the recording unit 172 to record the still image captured when the downward-facing touch surface was operated.
  • This still image is also an image oriented according to whether the housing is connected to the left or right temple part.
  • on the other hand, if it is determined in step S33 that the downward-facing touch surface has not been operated, no still image is recorded until that surface is operated. Therefore, even if the upward-facing touch surface is operated, a still image is not recorded.
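  • under the same assumptions as the earlier sketches, the capture flow of steps S33 and S34 can be summarized as follows; `camera.capture()` and `recorder.save()` are hypothetical placeholder APIs, and the rotation helper is an assumed stand-in.

```python
# Sketch of steps S33/S34: a tap on the downward-facing surface records
# a still image oriented for the mounting side; other taps are ignored.

def downward_surface(side):  # repeated from the earlier sketch
    return "21b" if side == "right" else "21a"

def rotate_180(image):
    """Stand-in for a real image rotation (assumed helper)."""
    return image[::-1]

def on_tap(surface, side, camera, recorder):
    if surface != downward_surface(side):
        return                     # S33 "no": nothing is recorded
    image = camera.capture()       # S34: capture at the moment of the tap
    if side == "left":
        image = rotate_180(image)  # match the user's top-bottom direction
    recorder.save(image)
```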
  • FIG. 15 illustrates a functional configuration example of the display module 10 according to the second embodiment of the present technology.
  • the display module 10 shown in FIG. 15 includes an operation reception unit 211, an exercise state acquisition unit 212, a display control unit 213, and a display unit 214.
  • the operation reception unit 211 corresponds to the touch surfaces 21a and 21b of the touch sensor 21.
  • the operation reception unit 211 receives user operations such as tap, long tap, and swipe.
  • the operation reception unit 211 supplies a signal corresponding to the received user operation to the display control unit 213.
  • the exercise state acquisition unit 212 acquires information regarding the exercise state of the user wearing the display module 10. Specifically, the exercise state acquisition unit 212 acquires the user's exercise amount (strength of exercise) as information on the exercise state.
  • the intensity of exercise is calculated by, for example, biological sensing (sensing of biological information such as heart rate, sweating, body temperature, blood pressure, myoelectric potential, brain wave, respiration, etc.). Further, the intensity of exercise may be calculated based on the movement speed of the user and the result of action recognition such as whether the user is stationary, walking, or running.
  • the exercise state acquisition unit 212 supplies a signal representing the acquired information on the exercise state to the display control unit 213.
  • the exercise state acquisition unit 212 is configured to include an acceleration sensor, and acquires the output of the acceleration sensor as information related to the user's exercise state. Further, the exercise state acquisition unit 212 may be configured to include a GPS (Global Positioning System) sensor, and the output of the GPS sensor may be acquired as information on the user's exercise state.
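  • one minimal way to turn raw acceleration-sensor output into an "intensity of exercise" is sketched below, assuming the variability of the acceleration magnitude is a usable proxy; the window length and threshold are illustrative assumptions, not values from the patent.

```python
# Sketch: estimate the intensity of exercise as the standard deviation
# of the acceleration magnitude over a short window of samples.

import math
import statistics

def exercise_intensity(samples):
    """samples: iterable of (ax, ay, az) tuples, e.g. the last second."""
    magnitudes = [math.sqrt(x * x + y * y + z * z) for x, y, z in samples]
    return statistics.pstdev(magnitudes)

def exceeds_level(samples, level=2.0):  # threshold in m/s^2 (assumed)
    return exercise_intensity(samples) > level
```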
  • the display control unit 213 controls display of an image on the display unit 214.
  • the display control unit 213 is configured as an output control unit that controls the output for operations on the operation reception unit 211 based on the acquired information on the user's exercise state. Specifically, the display control unit 213 controls the output for operations on the image displayed on the display unit 214 based on the signal from the exercise state acquisition unit 212.
  • the display unit 214 corresponds to the display unit 12 and the optical unit 13. As shown in FIGS. 1 and 6, the display unit 214 may be provided in the vicinity of the lens unit of the glasses-type frame 20, or may be provided in the lens unit of the glasses-type frame 20 itself.
  • Next, input/output control processing of the display module 10 will be described with reference to the flowchart of FIG. 16. The process of FIG. 16 is started in a state in which the eyeglass-type frame 20 to which the display module 10 is connected is worn by the user and a predetermined screen is displayed on the display unit 214.
  • in step S71, the exercise state acquisition unit 212 acquires information on the user's exercise state.
  • the exercise state acquisition unit 212 supplies a signal representing information related to the user's exercise state to the display control unit 213.
  • in step S72, the display control unit 213 determines whether the user's exercise state is a predetermined exercise state based on the signal from the exercise state acquisition unit 212. Specifically, the display control unit 213 determines whether the amount of exercise exceeds a predetermined amount (predetermined level) based on the user's amount of exercise (intensity of exercise). For example, the display control unit 213 can determine that the user's amount of exercise exceeds the predetermined amount when the user's movement speed exceeds a predetermined speed, or when the user is recognized as running.
  • the display control unit 213 can also determine that the user's amount of exercise exceeds the predetermined amount based on biological information obtained by biological sensing. Furthermore, the display control unit 213 may calculate the vibration of the display module 10 (control unit 11) from the output of the acceleration sensor included in the motion state acquisition unit 212, and determine that the user's amount of exercise exceeds the predetermined amount when the period of the vibration is shorter than a predetermined time or the amplitude of the vibration is larger than a predetermined amount.
  • if it is determined in step S72 that the user's exercise state is the predetermined exercise state, that is, the user's amount of exercise (intensity of exercise) exceeds the predetermined amount (predetermined level), the process proceeds to step S73.
  • in step S73, the display control unit 213 suppresses the reflection, on the screen displayed on the display unit 214, of operations on the operation reception unit 211 by controlling the output for those operations. Specifically, operations on the operation reception unit 211 are invalidated: even if the operation reception unit 211 is operated, the display control unit 213 either does not determine that an operation occurred or does not perform display control after such a determination. As a result, the user's operation is not reflected on the screen displayed on the display unit 214.
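  • the loop of steps S71 to S73 then reduces to polling and gating, as in this sketch; `poll_samples`, `disable_touch_input`, and `enable_touch_input` are hypothetical placeholders for the sensor feed and the touch-sensor driver, and `exceeds_level` is the helper from the previous sketch (passed in explicitly here).

```python
# Sketch of the loop of FIG. 16: acquire motion information (S71),
# compare it with the predetermined level (S72), and suppress touch
# input while the intensity stays above it (S73). All four callables
# are injected placeholders.

import time

def input_control_loop(poll_samples, exceeds_level,
                       disable_touch_input, enable_touch_input):
    while True:
        samples = poll_samples()     # S71: motion-state information
        if exceeds_level(samples):   # S72: predetermined-level check
            disable_touch_input()    # S73: operations are not reflected
        else:
            enable_touch_input()
        time.sleep(0.1)              # polling period (assumed)
```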
  • the display control unit 213 may display information indicating a change in the exercise state of the user on the screen displayed on the display unit 214.
  • the color of the indicator 251 displayed at the upper right of the menu screen shown in FIG. 18 may be changed according to the user's exercise state. Specifically, in a state where the intensity of the user's exercise does not exceed the predetermined level, the color of the indicator 251 is, for example, white or green as shown in the upper part of FIG. 18. When the intensity of the user's exercise exceeds the predetermined level, the color of the indicator 251 changes to, for example, black or red as shown in the lower part of FIG. 18.
  • on the other hand, when it is determined in step S72 that the user's exercise state is not the predetermined exercise state, that is, the intensity of the user's exercise does not exceed the predetermined level, the process returns to step S71 and the processes of steps S71 and S72 are repeated.
  • in the above, the display control unit 213 either does not determine that an operation occurred or does not perform display control after such a determination, so that the user's operation is no longer reflected on the screen displayed on the display unit 214.
  • the display control unit 213 may turn off the power supply to the operation reception unit 211 so that the user's operation is not reflected on the screen displayed on the display unit 214.
  • the user operation may not be reflected on the screen displayed on the display unit 214 but may be made difficult to reflect.
  • for example, the display control unit 213 reduces the gain of the output amount with respect to the operation input amount of the operation reception unit 211 based on the user's exercise state. For example, when the intensity of the user's exercise exceeds the predetermined level, the scroll amount of the menu screen relative to the finger movement amount in a swipe operation on the operation reception unit 211 is made smaller than in the normal state (when the intensity of the user's exercise does not exceed the predetermined level).
  • the display control unit 213 may tighten the detection conditions for the operation to the operation receiving unit 211 based on the user's exercise state. For example, based on the user's exercise state, the capacitance value serving as a reference for detecting a user operation (tap or swipe) in the operation reception unit 211 may be increased. Thereby, when the intensity of the user's exercise exceeds a predetermined level, the detection sensitivity of the operation reception unit 211 itself can be lowered.
  • the display control unit 213 may reduce the range in which an operation to the operation receiving unit 211 can be detected based on the user's exercise state.
  • the display control unit 213 may reduce the detection sensitivity of the operation to the operation receiving unit 211 based on the user's exercise state.
  • the display control unit 213 may cause the operation receiving unit 211 to output an operation after a predetermined operation based on the user's exercise state. For example, when the intensity of the user's exercise exceeds a predetermined level, an output for a normal tap or swipe is made after a long tap for a predetermined time or longer is given to the operation reception unit 211. As a result, the user performs an operation on the operation receiving unit 211 after suppressing the exercise, and as a result, an erroneous operation can be prevented.
  • the display control unit 213 may invalidate a steady operation on the operation reception unit 211 based on the user's exercise state. For example, when the intensity of the user's exercise exceeds the predetermined level, continuous contact with the operation reception unit 211 is ignored. Thereby, even if a decoration of a helmet worn by the user, the user's own hair, or the like is constantly touching the touch surface of the touch sensor 21, erroneous operations caused by that contact can be prevented.
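  • the last two policies (requiring a deliberate long tap while exercising, and ignoring steady contact) can be combined into one acceptance test, as sketched below; both durations are illustrative assumptions.

```python
# Sketch: accept an operation only if it is not a steady (stuck)
# contact and, while exercising, only after a deliberate long tap.

from dataclasses import dataclass

@dataclass
class Touch:
    duration: float  # seconds the contact has lasted so far

LONG_TAP_S = 1.0      # hold required while exercising (assumed)
STEADY_TOUCH_S = 5.0  # longer contact treated as hair/helmet (assumed)

def accept_operation(touch, exercising):
    if touch.duration > STEADY_TOUCH_S:
        return False                     # steady contact: invalidated
    if not exercising:
        return True                      # normal state: accept as-is
    return touch.duration >= LONG_TAP_S  # gate behind a long tap
```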
  • in the above, the output for operations on the operation reception unit 211 is controlled based on the user's amount of exercise, but the output may further be controlled based on the direction of the user's movement.
  • for example, operating the operation reception unit 211 is more difficult when the user is running with a lot of vertical movement. Therefore, even for the same amount of exercise, reflection of the operation on the output is suppressed when the amount of vertical movement exceeds a predetermined amount. This makes it possible to prevent erroneous operation by a running user.
  • the operation reception unit 211 corresponds to the touch surfaces 21a and 21b of the touch sensor 21, but may correspond to other configurations.
  • the operation reception unit 211 may be a physical input device such as a button or a dial.
  • the operation reception unit 211 may be a sensor that detects a user's gesture or line of sight by detecting visible light, infrared light, radio waves, sound waves, or the like.
  • the first embodiment described above can have the following configuration.
  • the HMD 410 including the operation receiving units 111a and 111b, the wearing state determining unit 112, and the display unit 114 and the information processing apparatus 420 including the display control unit 113 may be configured as separate units, and may communicate with each other by wired communication or by wireless communication using Wi-Fi (registered trademark), Bluetooth (registered trademark), or the like.
  • the information processing apparatus 420 is configured as a mobile terminal such as a smartphone.
  • the HMD 410 and the information processing apparatus 420 may be connected via a network 430 such as the Internet or an intranet.
  • the information processing apparatus 420 is configured as a server on a network, for example.
  • the second embodiment described above can have the following configuration.
  • an HMD 510 including the operation receiving unit 211, the exercise state acquiring unit 212, and the display unit 214 and an information processing device 520 including the display control unit 213 may be configured as separate units, and may communicate with each other by wired communication or by wireless communication using Wi-Fi (registered trademark), Bluetooth (registered trademark), or the like.
  • the information processing device 520 is configured as a mobile terminal such as a smartphone.
  • the HMD 510 and the information processing apparatus 520 may be connected via a network 530 such as the Internet or an intranet.
  • the information processing apparatus 520 is configured as a server on a network.
  • the display module 10 as the information processing apparatus of the present technology is configured to include the touch sensor and the display unit, but it may instead be configured without the display unit or, conversely, without the touch sensor.
  • in the above description, the display method of the display module 10 is a method using an eyepiece optical system in which the length in the shorter direction of the region from which light is emitted toward the user's pupil is equal to or less than the upper limit of pupil diameter variation. That is, the user's pupil is divided into a portion on which light related to an external image is incident and a portion on which light related to an electronically generated image is incident, and the external image and the generated image are optically superimposed (pupil division method).
  • the display method of the display module 10 may be a prism method, a hologram method, or other so-called see-through type or see-around type display method.
  • the present technology can be applied to a transmissive display type or video see-through type HMD.
  • the present technology may be applied to an HMD other than the glasses type, for example, a hat type HMD, or may be applied to other wearable terminals such as a wristwatch type or a wristband type.
  • the series of processes described above can be executed by hardware or software.
  • a program constituting the software is installed in the computer.
  • here, the computer includes a computer incorporated in dedicated hardware and, for example, a general-purpose personal computer capable of executing various functions when various programs are installed.
  • FIG. 23 is a block diagram showing an example of the hardware configuration of a computer that executes the above-described series of processing by a program.
  • in the computer, a CPU 901, a ROM (Read Only Memory) 902, and a RAM (Random Access Memory) 903 are connected to each other by a bus 904.
  • An input / output interface 905 is further connected to the bus 904.
  • An input unit 906, an output unit 907, a storage unit 908, a communication unit 909, and a drive 910 are connected to the input / output interface 905.
  • the input unit 906 includes a keyboard, a mouse, a microphone, and the like.
  • the output unit 907 includes a display, a speaker, and the like.
  • the storage unit 908 includes a hard disk, a nonvolatile memory, and the like.
  • the communication unit 909 includes a network interface or the like.
  • the drive 910 drives a removable medium 911 such as a magnetic disk, an optical disk, a magneto-optical disk, or a semiconductor memory.
  • in the computer configured as described above, the CPU 901 loads the program stored in the storage unit 908 into the RAM 903 via the input/output interface 905 and the bus 904 and executes it, whereby the above-described series of processing is performed.
  • the program executed by the computer (CPU 901) can be provided by being recorded on a removable medium 911 as a package medium, for example.
  • the program can be provided via a wired or wireless transmission medium such as a local area network, the Internet, or digital satellite broadcasting.
  • the program can be installed in the storage unit 908 via the input / output interface 905 by attaching the removable medium 911 to the drive 910.
  • the program can be received by the communication unit 909 via a wired or wireless transmission medium and installed in the storage unit 908.
  • the program can be installed in the ROM 902 or the storage unit 908 in advance.
  • the program executed by the computer may be a program that is processed in time series in the order described in this specification, or a program that is processed in parallel or at a necessary timing, such as when a call is made.
  • (1) An information processing apparatus comprising: an output control unit that suppresses reflection of an operation to an operation receiving unit provided in a housing mounted on the user's head, based on information related to a user's exercise state.
  • (2) The information processing apparatus, wherein the output control unit invalidates an operation to the operation receiving unit based on the information on the exercise state of the user.
  • (3) The information processing apparatus, wherein the output control unit invalidates an operation to the operation receiving unit based on the amount of exercise of the user.
  • (4) The information processing apparatus, wherein the output control unit invalidates an operation to the operation receiving unit when the amount of exercise of the user exceeds a predetermined amount.
  • (5) The information processing apparatus, wherein the output control unit invalidates an operation to the operation receiving unit when the moving speed of the user exceeds a predetermined speed.
  • (6) The information processing apparatus, wherein the output control unit invalidates an operation to the operation receiving unit based on the amount of exercise and the exercise direction of the user.
  • (7) The information processing apparatus, wherein the output control unit decreases the gain of the output amount with respect to the operation input amount to the operation receiving unit based on the information on the user's exercise state.
  • (8) The information processing apparatus, wherein the output control unit tightens a condition for detecting an operation to the operation receiving unit based on the information on the user's exercise state.
  • (9) The information processing apparatus, wherein the output control unit reduces a range in which an operation to the operation receiving unit can be detected based on the information on the user's exercise state.
  • (10) The information processing apparatus according to (1), wherein the output control unit lowers the detection sensitivity of an operation to the operation receiving unit based on the information on the user's exercise state.
  • (11) The information processing apparatus according to (1), wherein the output control unit causes the operation receiving unit to output an operation only after a predetermined operation, based on the information on the user's exercise state.
  • (12) The information processing apparatus according to (1), wherein the output control unit invalidates a steady operation on the operation receiving unit based on the information related to the user's exercise state.
  • (13) The information processing apparatus according to any one of (1) to (12), wherein the housing is connected to a glasses-type frame that is mounted on the head of the user.
  • (14) The information processing apparatus according to (13), further comprising a display unit provided in or near the lens unit of the spectacles-type frame, wherein the output control unit suppresses reflection of an operation on the operation receiving unit on display on the display unit based on the information on the exercise state of the user.
  • (15) The information processing apparatus according to (14), wherein the output control unit causes the display unit to display information indicating a change in the exercise state of the user.
  • (16) The information processing apparatus according to (1), wherein the operation reception unit includes a touch surface of a touch sensor.

Abstract

The present invention relates to an information processing device and method, and to a program, for preventing operating errors with respect to head-mounted displays (HMDs). An input/output control unit controls an output with respect to an operation input into an operation acceptance unit provided in a housing worn on a user's head, on the basis of information relating to a movement state of the user. The invention is applicable to display modules that are constituent elements of HMDs.
PCT/JP2016/076559 2015-09-24 2016-09-09 Information processing device, information processing method, and program WO2017051721A1 (fr)

Applications Claiming Priority (2)

Application Number Priority Date Filing Date Title
JP2015-187342 2015-09-24
JP2015187342 2015-09-24

Publications (1)

Publication Number Publication Date
WO2017051721A1 (fr) 2017-03-30

Family

ID=58386675

Family Applications (1)

Application Number Title Priority Date Filing Date
PCT/JP2016/076559 WO2017051721A1 (fr) 2015-09-24 2016-09-09 Information processing device, information processing method, and program

Country Status (1)

Country Link
WO (1) WO2017051721A1 (fr)

Patent Citations (3)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
JP2012212237A * (ja) 2011-03-30 2012-11-01 Image generation system, server system, program, and information storage medium
WO2014196038A1 * (fr) 2013-06-05 2014-12-11 三菱電機株式会社 Information processing device employing line-of-sight detection, and information processing method
WO2015108112A1 * (fr) 2014-01-15 2015-07-23 株式会社Juice Design Manipulation determination device, manipulation determination method, and program

Cited By (1)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
JP2020034997A (ja) * 2018-08-27 2020-03-05 Dynabook株式会社 Electronic device, wearable device, and setting method

Similar Documents

Publication Publication Date Title
US10261579B2 (en) Head-mounted display apparatus
US10635182B2 (en) Head mounted display device and control method for head mounted display device
US20150009309A1 (en) Optical Frame for Glasses and the Like with Built-In Camera and Special Actuator Feature
JP2018101019A (ja) Display device and method for controlling display device
US10976836B2 (en) Head-mounted display apparatus and method of controlling head-mounted display apparatus
CN111630477A (zh) Device for providing augmented reality service and operation method thereof
JP2017102768A (ja) Information processing device, display device, information processing method, and program
KR20180002387A (ko) Head mounted display apparatus and method for controlling the same
JP6405991B2 (ja) Electronic device, display device, and method for controlling electronic device
US11284058B2 (en) Utilizing dual cameras for continuous camera capture
KR20220148921A (ko) Eyewear determining facial expressions using muscle sensors
US11215846B1 (en) Wearable proximity sensing array for the visually impaired
JP2018124721A (ja) Head-mounted display device and method for controlling head-mounted display device
KR20180004112A (ko) Glasses-type terminal and method for controlling the same
JP2016122177A (ja) Display device and method for controlling display device
JP6638392B2 (ja) Display device, display system, method for controlling display device, and program
WO2017051721A1 (fr) Information processing device, information processing method, and program
WO2017051720A1 (fr) Information processing device, information processing method, and program
US20180260068A1 (en) Input device, input control method, and computer program
JP2017079389A (ja) Display device, method for controlling display device, and program
JP2019053644A (ja) Head-mounted display device and method for controlling head-mounted display device
JP2017120302A (ja) Display device, display system, method for controlling display device, and program
JP2016116066A (ja) Display device and method for controlling display device
JP2016034091A (ja) Display device, method for controlling display device, and program
US20220214744A1 (en) Wearable electronic device and input structure using motion sensor in the same

Legal Events

Date Code Title Description
121 Ep: the epo has been informed by wipo that ep was designated in this application

Ref document number: 16848509

Country of ref document: EP

Kind code of ref document: A1

NENP Non-entry into the national phase

Ref country code: DE

NENP Non-entry into the national phase

Ref country code: JP

122 Ep: pct application non-entry in european phase

Ref document number: 16848509

Country of ref document: EP

Kind code of ref document: A1