WO2013147147A1 - Head-mounted display and computer program - Google Patents

Head-mounted display and computer program

Info

Publication number
WO2013147147A1
Authority
WO
WIPO (PCT)
Prior art keywords
feature
feature amount
movement
identification information
unit
Prior art date
Application number
PCT/JP2013/059503
Other languages
French (fr)
Japanese (ja)
Inventor
井上 浩
邦宏 伊藤
Original Assignee
ブラザー工業株式会社
Priority date
Filing date
Publication date
Application filed by ブラザー工業株式会社 (Brother Industries, Ltd.)
Publication of WO2013147147A1

Classifications

    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06F ELECTRIC DIGITAL DATA PROCESSING
    • G06F3/00 Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
    • G06F3/01 Input arrangements or combined input and output arrangements for interaction between user and computer
    • G06F3/011 Arrangements for interaction with the human body, e.g. for user immersion in virtual reality
    • G PHYSICS
    • G02 OPTICS
    • G02B OPTICAL ELEMENTS, SYSTEMS OR APPARATUS
    • G02B27/00 Optical systems or apparatus not provided for by any of the groups G02B1/00 - G02B26/00, G02B30/00
    • G02B27/01 Head-up displays
    • G02B27/017 Head mounted
    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06F ELECTRIC DIGITAL DATA PROCESSING
    • G06F3/00 Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
    • G06F3/01 Input arrangements or combined input and output arrangements for interaction between user and computer
    • G06F3/011 Arrangements for interaction with the human body, e.g. for user immersion in virtual reality
    • G06F3/012 Head tracking input arrangements
    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06F ELECTRIC DIGITAL DATA PROCESSING
    • G06F3/00 Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
    • G06F3/01 Input arrangements or combined input and output arrangements for interaction between user and computer
    • G06F3/017 Gesture based interaction, e.g. based on a set of recognized hand gestures
    • G PHYSICS
    • G02 OPTICS
    • G02B OPTICAL ELEMENTS, SYSTEMS OR APPARATUS
    • G02B27/00 Optical systems or apparatus not provided for by any of the groups G02B1/00 - G02B26/00, G02B30/00
    • G02B27/01 Head-up displays
    • G02B27/0101 Head-up displays characterised by optical features
    • G02B2027/0138 Head-up displays characterised by optical features comprising image capture systems, e.g. camera
    • G PHYSICS
    • G02 OPTICS
    • G02B OPTICAL ELEMENTS, SYSTEMS OR APPARATUS
    • G02B27/00 Optical systems or apparatus not provided for by any of the groups G02B1/00 - G02B26/00, G02B30/00
    • G02B27/01 Head-up displays
    • G02B27/0101 Head-up displays characterised by optical features
    • G02B2027/014 Head-up displays characterised by optical features comprising information/image processing systems
    • G PHYSICS
    • G02 OPTICS
    • G02B OPTICAL ELEMENTS, SYSTEMS OR APPARATUS
    • G02B27/00 Optical systems or apparatus not provided for by any of the groups G02B1/00 - G02B26/00, G02B30/00
    • G02B27/01 Head-up displays
    • G02B27/017 Head mounted
    • G02B2027/0178 Eyeglass type
    • G PHYSICS
    • G02 OPTICS
    • G02B OPTICAL ELEMENTS, SYSTEMS OR APPARATUS
    • G02B27/00 Optical systems or apparatus not provided for by any of the groups G02B1/00 - G02B26/00, G02B30/00
    • G02B27/01 Head-up displays
    • G02B27/0179 Display position adjusting means not related to the information to be displayed
    • G02B2027/0187 Display position adjusting means not related to the information to be displayed slaved to motion of at least a part of the body of the user, e.g. head, eye

Definitions

  • The present disclosure relates to a head-mounted display and to a computer program for a head-mounted display.
  • Patent Document 1 discloses a display device that allows an operator to give operation instructions easily, including complicated operation instructions.
  • When the control unit that controls the head-mounted display detects that a part of the operator's body, such as a hand or a finger, has made a predetermined movement, it activates the process corresponding to a virtual icon in space.
  • When the control unit controls the operation of the head-mounted display according to the movement of a body part of the user of the head-mounted display, it is desirable that the control unit performs this control suitably for the movement of the user's body part.
  • However, a given predetermined movement has features that differ from user to user; for example, each user moves with his or her own habits.
  • Suppose that the instruction to advance the image displayed on the head-mounted display to the image of the next page is set to the movement of moving the right hand from the right side to the left side in the user's left-right direction.
  • For a first user, the amount of movement from the right side to the left side may be a first amount, while for a second user it may be a second amount larger than the first amount. Even the same user may move differently at different times.
  • For example, the movement may change between wearing clothing in which the body part moves easily and wearing clothing in which the body part is hard to move.
  • This disclosure is intended to provide a head-mounted display and a computer program for the head-mounted display that can obtain suitable operability.
  • One aspect of the present disclosure is a head-mounted display comprising: first acquisition means for acquiring external image data representing an external image, captured by an imaging unit, that includes a predetermined part of a target; second acquisition means for acquiring, from the external image data acquired by the first acquisition means, a first movement of the predetermined part corresponding to a first process; processing means for controlling the first process corresponding to the first movement acquired by the second acquisition means; registration means for registering, in a storage unit, identification information identifying the target having the predetermined part in association with a first feature amount representing a feature of the first movement; and reception means for receiving input of the identification information, wherein the registration means updates the first feature amount registered in association with the received identification information according to a second feature amount representing the feature of the first movement acquired by the second acquisition means after the reception means has received the identification information.
  • According to this configuration, the first feature amount representing the feature of the first movement for the first process is registered in the storage unit in association with the identification information identifying the user of the head-mounted display. The user of the head-mounted display is identified, and the first feature amount associated with the identified identification information can be updated with a second feature amount representing the feature of a newly acquired first movement. As described above, the first movement for the first process is not always performed in exactly the same way. With this head-mounted display, the first feature amount for the first movement is updated, so the operability of instructing the first process can be improved for each user.
  • This head-mounted display may be configured as follows. It may further comprise: first determination means for determining whether the second feature amount acquired by the second acquisition means corresponds to the first feature amount registered in the storage unit in association with the identified identification information; and second determination means for determining, when the first determination means determines that the second feature amount does not correspond to the first feature amount, whether the second feature amount falls within a predetermined range with respect to the first feature amount. The processing means controls the first process corresponding to the first movement when the second determination means determines that the second feature amount falls within the predetermined range with respect to the first feature amount, and does not control the first process when the second determination means determines that the second feature amount does not fall within the predetermined range.
  • This head-mounted display may also be configured as follows. The registration means updates the first feature amount associated with the identified identification information according to the second feature amount when the second determination means determines that the second feature amount falls within the predetermined range with respect to the first feature amount, and does not update the first feature amount associated with the identified identification information according to the second feature amount when the second determination means determines that the second feature amount does not fall within the predetermined range. According to this, even when the first movement is not determined to correspond to the first feature amount, the first process and the updating of the first feature amount can be controlled according to whether the first movement falls within the predetermined range with the first feature amount as a reference.
  • Further, the second acquisition means may acquire, from the external image acquired by the first acquisition means, a second movement of the predetermined part corresponding to a second process that is executed by the head-mounted display and has a fixed relationship with the first process; the processing means may control the second process corresponding to the second movement acquired by the second acquisition means; and the registration means may be configured so that, when the first movement and the second movement are newly acquired in succession within a predetermined period after the second determination means has determined that the second feature amount falls within the predetermined range with respect to the first feature amount, the first feature amount registered in the storage unit reverts to the first feature amount before it was updated with the second feature amount. According to this, the first feature amount can be suitably updated based on successive instructions.
  • The registration means may reset the first feature amount registered in the storage unit when an instruction to reset the registration in the storage unit is input to the head-mounted display. According to this, the first feature amount registered in the storage unit can be reset.
  • The head-mounted display may further comprise third determination means for determining the correspondence between the first movement acquired by the second acquisition means after the reception means has received the identification information and the first feature amount registered in the storage unit in association with the identified identification information, and the processing means may control the first process corresponding to the newly acquired first movement when the third determination means determines that the first movement corresponds to the first feature amount. According to this, the first process is controlled, and the operation corresponding to the first process can be suitably executed on the head-mounted display.
  • Another aspect of the present disclosure is a computer program readable by a control unit that controls a head-mounted display, the computer program causing the control unit to function as: first acquisition means for acquiring external image data representing an external image, captured by an imaging unit, that includes a predetermined part of a target; second acquisition means for acquiring, from the external image data acquired by the first acquisition means, a first movement of the predetermined part corresponding to a first process; processing means for controlling the first process corresponding to the first movement acquired by the second acquisition means; registration means for registering, in a storage unit, identification information identifying the target having the predetermined part in association with a first feature amount representing a feature of the first movement; and reception means for receiving input of the identification information, wherein the first feature amount registered in association with the identification information is updated with a second feature amount representing the feature of the first movement acquired by the second acquisition means after the reception means has received the identification information. A head-mounted display having the configuration described above can thereby be realized.
  • This computer program may be specified as a computer program for a head mounted display that further includes the above-described configuration. According to such a computer program, a head-mounted display that further includes the above-described configuration can be realized.
  • FIG. 1 is a diagram showing an example of the head-mounted display. Also provided are a top view of the display device and a sectional view of the display device.
  • the head mounted display is hereinafter referred to as HMD.
  • the HMD 1 includes a system box 2 and a display device 3. As shown in FIG. 1, the system box 2 and the display device 3 are connected, for example, via a transmission cable 4.
  • the system box 2 transmits an image signal and supplies power to the display device 3.
  • the display device 3 is detachably attached to the spectacle frame 5.
  • the spectacle frame 5 is mounted on the user's head.
  • the spectacle frame 5 is an example for mounting the display device 3 on the user's head.
  • the display device 3 may be mounted on the user's head by a mounting unit different from the spectacle frame 5.
  • the display device 3 includes a housing 30.
  • the housing 30 is a rectangular tubular resin member, and is formed in an L shape in plan view.
  • a half mirror 31 as a deflection member is provided at the right end of the housing 30.
  • a camera 32 is provided on the upper surface of the housing 30.
  • the camera 32 captures an image around the user.
  • the camera 32 is provided on the upper surface of the housing 30 so as to capture an external image in a direction corresponding to the direction in which the user's face faces.
  • the spectacle frame 5 includes a left frame portion 52, a right frame portion 53, a central frame portion 54, and a support portion 56.
  • the left frame portion 52 extending in the front-rear direction is hung on the user's left ear.
  • the right frame portion 53 extending in the front-rear direction is hung on the user's right ear.
  • the central frame portion 54 extending in the left-right direction connects the front end portion of the left frame portion 52 and the front end portion of the right frame portion 53 and is disposed on the user's face portion.
  • a pair of nose pads 55 are provided at the center in the longitudinal direction of the center frame 54.
  • the support portion 56 is provided on the upper left end side of the central frame portion 54.
  • the support part 56 includes a downward extension part 58.
  • the downward extension 58 extends in the up-down direction at the front left of the user's face.
  • the downward extending portion 58 is slidably engaged with a groove 57 formed in the support portion 56 and extending in the left-right direction. The position of the display device 3 in the left-right direction is adjusted by sliding the lower extension portion 58 in the left-right direction.
  • the housing 30 is provided with an attachment portion 33.
  • the attachment portion 33 is provided at a portion of the housing 30 that faces the spectacle frame 5.
  • the attachment portion 33 has a U-shaped groove along the vertical direction.
  • a downward extending portion 58 provided on the support portion 56 of the spectacle frame 5 is slidably engaged with the U-shaped groove of the attachment portion 33.
  • the position of the display device 3 in the vertical direction is adjusted by sliding the casing 30 attached to the downward extending portion 58 in the vertical direction.
  • the housing 30 includes an image light forming unit 34 and an eyepiece optical unit 35. In FIG. 2B, the housing 30 is omitted for simplification.
  • the image light Lim emitted from the image light forming unit 34 is collected by the eyepiece optical unit 35.
  • a part of the condensed image light Lim is reflected by the half mirror 31 and guided to the user's eye EB.
  • the image light forming unit 34 is provided at the left end inside the housing 30.
  • the image light forming unit 34 forms the image light Lim based on the image signal from the system box 2.
  • the image light forming unit 34 is configured by a known spatial light modulation element.
  • the spatial light modulation element is, for example, a liquid crystal display composed of a liquid crystal display element and a light source, or an organic EL (Electro-Luminescence).
  • The image light forming unit 34 may, instead of the spatial light modulation element, be a known retinal scanning display that projects an image onto the retina by mechanically scanning light from a light source, such as a laser, in two dimensions.
  • the eyepiece optical unit 35 includes a lens 36 and a lens holder 37.
  • the left end of the lens holder 37 is in contact with the right end of the image light forming unit 34.
  • a lens 36 is held inside the right side of the lens holder 37. That is, the lens 36 and the image light forming unit 34 are separated by the lens holder 37 by a distance corresponding to the display distance of the virtual image displayed to the user.
  • the lens 36 is a plurality of lenses arranged in the left-right direction.
  • the lens 36 is composed of a plurality of lenses in order to achieve desired optical characteristics.
  • the lens 36 may be composed of a single lens.
  • the eyepiece optical unit 35 condenses the image light Lim and guides it to the half mirror 31.
  • The image light Lim collected by the lens 36 is divergent light or parallel light. That is, "collection" here refers to the action, on the incident light flux, of a lens having positive power as a whole, and does not require that the outgoing light flux be convergent light.
  • the plate-shaped half mirror 31 is connected to the right end of the housing 30. Specifically, the half mirror 31 is sandwiched from above and below by a predetermined portion of the housing 30 at the right end of the housing 30.
  • The half mirror 31 is formed by vapor-depositing a metal, such as aluminum, onto the surface of a plate-like transparent member, such as glass or a light-transmitting resin, so that the transmittance is 50%.
  • The light-transmitting resin is, for example, acrylic or polyacetal. The transmittance of the half mirror 31 need not be exactly 50%.
  • "Translucent" in the present embodiment is a concept that includes a state in which part of the image light Lim is reflected and part of the external light is transmitted, so that the user can substantially visually recognize both the image (virtual image) and the external environment.
  • the system box 2 includes a CPU 20, a program ROM 21, a flash ROM 22, a RAM 23, a communication circuit 24, a video RAM 25, an image processing unit 26, and a peripheral I / F 27.
  • the CPU 20 controls various processes executed in the system box 2.
  • the processes controlled by the CPU 20 are, for example, a main process shown in FIG. 5 and a routine process shown in FIG.
  • the CPU 20 instructs the image processing unit 26 to execute image processing.
  • the program ROM 21 stores computer programs for various processes executed in the system box 2.
  • the flash ROM 22 stores various data.
  • Data stored in the flash ROM 22 is, for example, image data and a database.
  • the image data is data corresponding to the image displayed on the display device 3.
  • the image data includes data corresponding to images for a plurality of pages. In the present embodiment, image data corresponding to images for a plurality of pages will be described as an example.
  • the user visually recognizes the image of each page corresponding to the image data displayed on the display device 3.
  • the database is a storage unit in which identification information and a first feature amount are registered in association with each other.
  • the identification information is information for identifying a user who uses the HMD 1.
  • the first feature amount is information representing a motion feature of a predetermined part of the target, for example, a part of the user's body, set as an instruction for processing executed by the HMD 1.
  • the processing executed by the HMD 1 is, for example, page turning processing and page returning processing.
  • The page-turning process is a process that advances the image displayed on the display device 3 to the image of the next page.
  • The page-return process is a process that returns the displayed image to the image of the previous page.
  • The page-turning process and the page-return process are a pair of processes having a fixed relationship with each other; this relationship is used in S214 of the routine process described later.
  • the user's body part is, for example, a hand.
  • the movement for the page turning process is a movement of moving one hand from one side to the other side in the left-right direction of the user when the body part of the user is a hand. Specifically, this is a movement of moving the right hand from the right side to the left side.
  • the movement for the page return process is a movement of moving the hand on one side from the other side to the one side in the left-right direction of the user. Specifically, this is a movement of moving the right hand from the left side to the right side.
  • For each such movement, a feature amount representing the feature of the right hand, a feature amount representing the feature of the hand movement direction, and a feature amount representing the feature of the hand movement amount are registered in the database in association with the identification information.
  • the first feature quantity that is associated with the identification information and registered in the database is managed for each process.
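As an illustration only, the per-user registration just described can be pictured as the following data layout. This is a minimal sketch assuming a Python-style in-memory store; the names (GestureFeature, FeatureDatabase) and the numeric values are hypothetical and not taken from the patent.

```python
from __future__ import annotations
from dataclasses import dataclass, field
from typing import Optional

@dataclass
class GestureFeature:
    """First feature amount for one process (e.g. the page-turning process)."""
    body_part: str   # e.g. "right_hand"
    direction: str   # e.g. "right_to_left"
    amount: float    # movement amount observed in the external image

@dataclass
class FeatureDatabase:
    """Identification information -> {process name -> first feature amount}."""
    records: dict[str, dict[str, GestureFeature]] = field(default_factory=dict)

    def register(self, user_id: str, process: str, feature: GestureFeature) -> None:
        self.records.setdefault(user_id, {})[process] = feature

    def lookup(self, user_id: str, process: str) -> Optional[GestureFeature]:
        return self.records.get(user_id, {}).get(process)

# Roughly corresponding to records "No. 1" and "No. 2" in FIG. 4 (values invented):
db = FeatureDatabase()
db.register("user_a", "page_turn", GestureFeature("right_hand", "right_to_left", 120.0))
db.register("user_a", "page_return", GestureFeature("right_hand", "left_to_right", 120.0))
```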
  • the RAM 23 serves as a work area when the CPU 20 executes the computer program stored in the program ROM 21.
  • the communication circuit 24 controls communication with the display device 3 and the like.
  • the transmission cable 4 is electrically connected to the communication circuit 24.
  • the communication circuit 24 transmits an image signal to the display device 3 via the transmission cable 4.
  • the communication circuit 24 supplies power from a battery or the like to the display device 3 via the transmission cable 4.
  • the communication circuit 24 receives an external image signal transmitted from the display device 3 via the transmission cable 4.
  • the external image signal is a signal that represents external image data corresponding to the external image captured by the camera 32.
  • the video RAM 25 stores image data transmitted to the display device 3 as an image signal.
  • the video RAM 25 stores external image data based on an external image signal received by the communication circuit 24.
  • the image processing unit 26 reads image data from the flash ROM 22 to the video RAM 25, executes image processing on the image data stored in the video RAM 25, and generates an image signal.
  • the image processing unit 26 generates external field image data from the received external field image signal.
  • the image processing unit 26 performs image processing on the external image data in accordance with a command from the CPU 20.
  • the image processing unit 26 is provided to execute various image processes in order to reduce the processing load on the CPU 20.
  • the peripheral I / F 27 is an interface to which predetermined parts are electrically connected.
  • a power switch 271, a power lamp 272, and an operation unit 273 are connected to the peripheral I / F 27.
  • the power switch 271 is a switch for switching on and off the power to the HMD 1.
  • When the power switch 271 is turned on, the HMD 1 is activated.
  • the power lamp 272 is a lamp indicating that the power is on.
  • the power lamp 272 is lit when the power switch 271 is turned on.
  • the operation unit 273 is an interface for inputting a predetermined instruction to the system box 2.
  • the operation unit 273 includes a plurality of operation buttons. The predetermined instruction is input by appropriately operating the operation buttons of the operation unit 273.
  • the display device 3 includes a CPU 38, a program ROM 39, a RAM 40, a communication circuit 41, and a peripheral I / F 43 in addition to the image light forming unit 34.
  • Each unit included in the display device 3 is built in the housing 30 together with the image light forming unit 34 and the like.
  • the CPU 38 controls various processes executed on the display device 3.
  • the CPU 38 drives the image light forming unit 34 to form the image light Lim corresponding to the image signal, and controls the image to be displayed to the user.
  • the program ROM 39 stores computer programs for various processes executed by the display device 3.
  • the process executed by the display device 3 is, for example, a process related to the formation of the image light Lim by the image light forming unit 34.
  • the RAM 40 serves as a work area when the CPU 38 executes the computer program stored in the program ROM 39.
  • the communication circuit 41 controls communication with the system box 2 and the like.
  • the transmission cable 4 is electrically connected to the communication circuit 41.
  • the transmission cable 4 extends rearward from the housing 30 and is connected to the system box 2.
  • the communication circuit 41 transmits an external image signal to the system box 2 via the transmission cable 4.
  • the communication circuit 41 receives an image signal transmitted from the system box 2 via the transmission cable 4.
  • the communication circuit 41 is supplied with power from the system box 2 via the transmission cable 4.
  • the supplied power is supplied to each part of the display device 3 and the camera 32.
  • the peripheral I / F 43 is an interface to which the camera 32 is connected. An external image signal representing external image data corresponding to the external image captured by the camera 32 is transmitted from the communication circuit 41 to the system box 2 via the peripheral I / F 43.
  • the HMD 1 is controlled by the CPU 20 of the system box 2 and the CPU 38 of the display device 3. That is, various functions are realized in the system box 2 by the CPU 20 executing the computer program stored in the program ROM 21.
  • the CPU 20 can also be specified as a control unit serving as various functional units included in the HMD 1.
  • various functions are realized in the display device 3 by the CPU 38 executing the computer program stored in the program ROM 39.
  • the CPU 38 can also be specified as a control unit serving as various functional units included in the HMD 1.
  • the computer program is written in the program ROM 21 and the program ROM 39 when the HMD 1 is shipped from the factory.
  • the program ROM 39 is an example of a computer-readable storage device.
  • the storage device is a storage medium excluding a temporary storage medium.
  • the storage device may be a non-transitory storage medium.
  • a non-transitory storage medium can retain data regardless of the length of time to store the data.
  • the computer program may be stored in a storage medium such as an external server. When the computer program is stored in the server, the computer program is downloaded from an external server or the like via the connection interface and stored in the program ROM 21 and the program ROM 39 as appropriate. In this case, the computer program is transmitted to the HMD 1 from an external server or the like as a computer-readable temporary storage medium (for example, a transmission signal).
  • the main process executed by the HMD 1 will be described with reference to FIG.
  • the main process is executed by the CPU 20 of the system box 2.
  • the main process is started when the user operates the power switch 271 to turn on the power.
  • the CPU 20 uses the RAM 23 to execute a computer program for main processing stored in the program ROM 21.
  • the computer program includes a computer program module for routine processing.
  • the CPU 20 that started the main process executes personal authentication.
  • Identification information for identifying a user is used for personal authentication.
  • the user operates the operation unit 273 and inputs his / her identification information.
  • the identification information may be input by another method.
  • identification information according to the read information may be input by reading a wireless tag including information that can identify an individual with a reading unit.
  • identification information based on the imaged body part may be input by capturing an image of a body part having a characteristic amount that can identify an individual such as a hand with the camera 32.
  • the CPU 20 acquires the input identification information and stores it in the RAM 23.
  • the user using the HMD 1 is specified by the input identification information.
  • the identification information acquired in S100 and capable of identifying the user using the HMD 1 is referred to as “specified identification information”.
  • the CPU 20 performs control to display a user menu. Thereby, the display device 3 displays the user menu, and the user visually recognizes the user menu.
  • the user menu displayed in S102 includes a plurality of options. For example, the following three options are included.
  • the first option is an option for resetting the learning result of the gesture.
  • the learning result is obtained by the learning process shown in S118 and S210 in FIG.
  • the second option is an option for performing gesture learning. When gesture learning is selected and gesture learning is performed, the first feature value registered in the database in association with the identified identification information is updated.
  • the third option is an option for displaying an image on the display device 3. When an option for displaying an image is selected, the display device 3 displays an image corresponding to the image data stored in the flash ROM 22. The user operates the operation unit 273 to select any option.
  • In S104, the CPU 20 determines whether a reset of the learning result has been input in response to the display of the user menu.
  • When the reset has been input, the CPU 20 resets the first feature amount registered in association with the identified identification information in the database stored in the flash ROM 22 (S106).
  • After the reset, the first feature amount registered in association with the identified identification information is a predetermined initial value.
  • The CPU 20 then controls the user menu to be displayed again. In the user menu displayed after the reset, the reset of the learning result need not be selectable, because the first feature amount has already been reset. In the present embodiment, it is assumed that the reset of the learning result cannot be input in the user menu after S106 has been executed.
  • the CPU 20 determines whether or not gesture learning is input according to the display of the user menu (S108). When the image display is input and the gesture learning is not input (S108: No), the CPU 20 shifts the process to S124. When gesture learning is input (S108: Yes), the CPU 20 controls the camera 32 to be activated (S110). As a result, the camera 32 is activated. Thereafter, the CPU 20 controls to display a selection screen for inputting a learning target. Thereby, the display device 3 displays the selection screen, and the user visually recognizes the selection screen.
  • the selection screen includes a plurality of options for specifying each process executed by the HMD 1 or the movement of the body part.
  • the selection screen includes options related to page feed processing and options related to page return processing.
  • the user operates the operation unit 273 and inputs a desired option.
  • the user inputs options related to the page turning process.
  • the CPU 20 specifies a learning target in accordance with the input option.
  • For example, the CPU 20 specifies, as the learning target, the first feature amount representing the movement feature for the page-turning process that is associated with the identified identification information in the database.
  • In S114, the CPU 20 determines whether the user is ready to perform the movement represented by the first feature amount of the learning target specified in S112.
  • The CPU 20 makes the determination of S114 according to whether an operation key has been pressed on the operation unit 273.
  • While the key has not been pressed, the CPU 20 denies S114 (S114: No) and repeats the process of S114.
  • When the key has been pressed, the CPU 20 affirms S114 (S114: Yes) and controls the camera 32 to start capturing an external image (S116).
  • Since the learning target specified in S112 relates to the page-turning process, the user moves the right hand from the right side to the left side, for example in front of the body, turning the face toward the moving right hand.
  • The camera 32 captures an external image including the user's moving right hand.
  • The imaging of the external image ends, for example, after it has been performed for a predetermined period.
  • An external image signal representing external image data corresponding to the external image captured by the camera 32 is transmitted and received via the transmission cable 4 between the communication circuit 41 of the display device 3 and the communication circuit 24 of the system box 2.
  • the external image signal is converted into external image data by the image processing unit 26, and the generated external image data is stored in the video RAM 25.
  • the CPU 20 executes a learning process on the outside world image data stored in the video RAM 25.
  • the CPU 20 acquires the motion of the body part included in the external image from the external image data stored in the video RAM 25. For example, the CPU 20 acquires the movement of the right hand that moves from the right side to the left side.
  • When acquiring the movement of the right hand, the CPU 20 commands the image processing unit 26 to execute, on the external image data stored in the video RAM 25, image processing for identifying the movement of the body part, such as the right hand, included in the external image.
  • The CPU 20 acquires the movement of the body part, such as the right hand, from the result of the image processing by the image processing unit 26. Thereafter, the CPU 20 obtains a feature amount representing the feature of the acquired movement.
  • The image processing for obtaining the feature amount, executed by the image processing unit 26, is performed using known image processing techniques.
  • The movement feature amount is, for example, the moving distance of the body part in the external image, the moving direction, or both.
  • The feature amount representing the acquired movement that is used to update the first feature amount registered in the database, as in S118 and S210 in FIG. 6, is referred to as the second feature amount.
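As a rough illustration of how such a movement feature amount might be computed from tracked hand positions, here is a minimal sketch. It assumes that hand positions have already been extracted from successive external-image frames; the function name and the coordinate convention are assumptions, not part of the patent.

```python
import math

def movement_feature(positions: list[tuple[float, float]]) -> dict:
    """Derive a simple feature amount (direction and moving distance) from a
    sequence of (x, y) hand positions detected in successive frames."""
    if len(positions) < 2:
        raise ValueError("need at least two tracked positions")
    (x0, y0), (x1, y1) = positions[0], positions[-1]
    dx, dy = x1 - x0, y1 - y0
    distance = math.hypot(dx, dy)                      # movement amount
    if abs(dx) >= abs(dy):
        direction = "right_to_left" if dx < 0 else "left_to_right"
    else:
        direction = "top_to_bottom" if dy > 0 else "bottom_to_top"
    return {"direction": direction, "amount": distance}

# e.g. a right hand swept leftwards across the camera image:
print(movement_feature([(620.0, 240.0), (480.0, 236.0), (200.0, 230.0)]))
# {'direction': 'right_to_left', 'amount': ...}
```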
  • The CPU 20 updates, according to the second feature amount, the first feature amount of the learning target associated with the identified identification information in the database.
  • When S104 has been denied (S104: No), the first feature amount to be learned is the first feature amount that was associated with the identified identification information in the database at the start of the main process.
  • When S106 has been executed, the first feature amount to be learned is the first feature amount after the reset.
  • The update in the learning process is performed using a neural network (SOM). Since the neural network (SOM) is a known technique, a specific description of the update is omitted; a loose illustration is sketched below.
  • The CPU 20 registers in the database the first feature amount updated in S118 as the first feature amount of the learning target associated with the identified identification information.
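The patent only states that the update uses a neural network (SOM) without giving details. As a loose sketch of the general idea, a registered feature vector could be nudged toward the newly observed one in the manner of a self-organizing-map weight update; the learning rate, vector layout, and numbers below are assumptions for illustration only.

```python
def som_style_update(first_feature: list[float],
                     second_feature: list[float],
                     learning_rate: float = 0.2) -> list[float]:
    """Move the stored first feature amount toward the newly observed
    second feature amount, as in a self-organizing-map weight update:
    w <- w + alpha * (x - w)."""
    return [w + learning_rate * (x - w) for w, x in zip(first_feature, second_feature)]

# Stored movement amount/duration vs. a newly observed gesture (invented numbers):
stored = [120.0, 0.40]      # [movement amount, movement time in seconds]
observed = [150.0, 0.55]
print(som_style_update(stored, observed))   # [126.0, 0.43]
```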
  • the CPU 20 controls to display the selection screen again.
  • the user operates the operation unit 273 and inputs a desired option.
  • the selection screen includes options for instructing completion of gesture learning.
  • In S122, the CPU 20 determines, in response to the display of the selection screen, whether the input option is an instruction indicating completion of gesture learning.
  • When an option related to a process other than the page-turning process is input (S122: No), the process returns to S110, and S110 to S120 are executed for that process.
  • S122 may be denied repeatedly (S122: No), and the remaining learning targets may be processed in turn.
  • When completion of gesture learning has been input (S122: Yes), or when image display was selected in S108, the HMD 1 starts displaying, in S124, an image corresponding to the image data stored in the flash ROM 22.
  • the image processing unit 26 performs image processing on a predetermined page portion of the image data stored in the flash ROM 22, and an image signal generated by the image processing is transmitted from the communication circuit 24 to the display device 3.
  • the image light forming unit 34 forms the image light Lim based on the image signal received by the communication circuit 41, and an image corresponding to the image signal is displayed. The user visually recognizes the displayed image.
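Purely as an orientation aid, the main process described above (S100 to S124) can be summarised by the following control-flow sketch. The step mapping follows the description, but the `hmd` object and every method and variable name are placeholders invented for this sketch, not part of the patent.

```python
def main_process(hmd):
    user_id = hmd.authenticate()                       # S100: personal authentication
    while True:
        choice = hmd.show_user_menu()                  # S102: reset / learn / display
        if choice == "reset":                          # S104: Yes
            hmd.database.reset(user_id)                # S106: back to initial values
        elif choice == "learn":                        # S108: Yes
            hmd.camera.start()                         # S110
            while True:
                target = hmd.show_selection_screen()   # S112: e.g. "page_turn"
                if target == "done":                   # S122: learning completed
                    break
                hmd.wait_for_ready_key()               # S114
                frames = hmd.camera.capture_gesture()  # S116
                second = hmd.extract_feature(frames)   # part of S118
                hmd.database.update(user_id, target, second)   # S118 / S120
        else:                                          # image display selected
            hmd.start_image_display()                  # S124
            hmd.routine_process(user_id)               # routine process (FIG. 6)
            break
```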
  • the CPU 20 that has started the routine process controls the camera 32 to start capturing an external image (S200). For example, the user moves the right hand from the right side to the left side in front of the body. When moving the right hand from the right side to the left side, the user turns the face toward the moving right hand.
  • the camera 32 captures an external image including the right hand of the moving user.
  • the imaging of the outside world image is continued during the routine processing.
  • An external image signal representing external image data corresponding to the external image captured by the camera 32 is transmitted and received via the transmission cable 4 between the communication circuit 41 of the display device 3 and the communication circuit 24 of the system box 2.
  • the external image signal is converted into external image data by the image processing unit 26, and the generated external image data is stored in the video RAM 25.
  • the CPU 20 acquires the movement of the body part included in the external image from the external image data stored in the video RAM 25. For example, the CPU 20 acquires the movement of the right hand that moves from the right side to the left side. The movement of the body part is acquired in the same manner as described in S118.
  • A movement of a body part acquired in this way, which may be a movement for inputting an instruction for a predetermined process executed by the HMD 1, is hereinafter referred to as an "operation movement".
  • In S202, the CPU 20 determines the correspondence between the operation movement and the first feature amount associated with the identified identification information. Specifically, the CPU 20 determines whether the feature amount representing the feature of the operation movement corresponds to the first feature amount registered in the database in association with the identified identification information. S202 will be described taking as an example "moving the right hand from the right side to the left side", the operation for instructing the page-turning process. In S202, the CPU 20 individually determines whether the operation movement matches each of the feature amounts of "right hand", "hand movement direction", and "hand movement amount" included in the first feature amount shown in record "No. 1" in FIG. 4.
  • That is, the CPU 20 determines whether the feature amount representing the feature of the body part in the operation movement matches the feature amount representing the feature of the right hand; whether the feature amount representing the movement direction of the body part matches the feature amount representing the feature of the hand movement direction from the right side to the left side; and whether the feature amount representing the movement amount of the body part matches the feature amount representing the hand movement amount.
  • When all of these determinations match, S202 is affirmed (S202: Yes). S202 may also be affirmed (S202: Yes) when the difference between the two is within a range set in advance as an allowable error.
  • The error may be changed as appropriate by an operation via the operation unit 273. S202 is performed for all first feature amounts registered in the database in association with the identified identification information.
  • When S202 is affirmed, the CPU 20 controls the process corresponding to the first feature amount (S204). In the example of "moving the right hand from the right side to the left side" shown in record "No. 1" in FIG. 4, the CPU 20 controls the page-turning process.
  • the image processing unit 26 performs image processing on the next page portion of the image data, and transmits an image signal generated by the image processing to the display device 3 from the communication circuit 24.
  • the image light Lim is formed by the image light forming unit 34 based on the image signal received by the communication circuit 41, and the next page image corresponding to the image signal is displayed. Thereafter, the CPU 20 shifts the processing to S220.
  • When S202 is denied, the CPU 20 determines whether the difference between the feature amount representing the feature of the operation movement and the first feature amount is within a predetermined threshold (S206).
  • A predetermined range is set in advance for the first feature amount.
  • This range is wider than the error described above.
  • The threshold is set appropriately, based on a range within which the operation movement can be judged to be similar to the movement corresponding to the first feature amount.
  • When the difference is outside the threshold, the CPU 20 denies S206 (S206: No) and moves the process to S220.
  • When the operation movement is outside the threshold, it is determined that the operation movement is not a movement related to an instruction input to the HMD 1.
  • When the difference is within the threshold, the CPU 20 affirms S206 (S206: Yes).
  • In S208, the CPU 20 controls the process corresponding to the first feature amount used as the determination criterion in S202, in the same manner as S204.
  • For example, the CPU 20 controls the page-turning process, and the HMD 1 turns the page of the image displayed on the display device 3.
  • In S210, the CPU 20 executes a learning process.
  • The learning process is executed on the operation movement in the same manner as S118 of the main process. That is, the CPU 20 obtains, from the acquired operation movement, a feature amount representing its feature as the second feature amount.
  • The CPU 20 updates, according to the obtained second feature amount, the first feature amount associated with the identified identification information in the database.
  • The updated first feature amount is the feature amount for the process controlled in S208.
  • In S212, the CPU 20 registers the first feature amount updated in S210 in the database as the first feature amount associated with the identified identification information and related to the process executed in S208, as sketched below.
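The following is a sketch of the decision logic in S202 to S212: an exact or within-error match executes the process directly, a match within a wider similarity threshold executes the process and then learns the user's habit, and anything else is ignored. It assumes a feature amount reduced to a direction plus a numeric movement amount; the threshold values, learning rate, and names are invented for illustration.

```python
ERROR_RANGE = 10.0            # allowable error for a direct match (S202), assumed
SIMILARITY_THRESHOLD = 40.0   # wider range for "similar" movements (S206), assumed
LEARNING_RATE = 0.2

def handle_operation_movement(registered: dict, observed: dict) -> str:
    """Return the action taken for one observed operation movement.
    `registered` and `observed` are {'direction': str, 'amount': float}."""
    if observed["direction"] != registered["direction"]:
        return "ignore"                              # body part / direction mismatch
    diff = abs(observed["amount"] - registered["amount"])
    if diff <= ERROR_RANGE:                          # S202: Yes
        return "execute"                             # S204: run the process as-is
    if diff <= SIMILARITY_THRESHOLD:                 # S206: Yes
        # S208: run the process, then S210/S212: update and register the feature
        registered["amount"] += LEARNING_RATE * (observed["amount"] - registered["amount"])
        return "execute_and_learn"
    return "ignore"                                  # S206: No -> proceed to S220

page_turn = {"direction": "right_to_left", "amount": 120.0}
print(handle_operation_movement(page_turn, {"direction": "right_to_left", "amount": 150.0}))
# 'execute_and_learn'; page_turn["amount"] is now 126.0
```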
  • In S214, the CPU 20 determines whether an operation movement for a process having a fixed relationship with the process controlled in S208 has been acquired within a predetermined period, following the operation movement acquired for the process in S208.
  • The CPU 20 commands the image processing unit 26 to continue executing, on the external image data stored in the video RAM 25 through continuous imaging by the camera 32, image processing for identifying the movement of the body part included in the external image.
  • From the result of the image processing by the image processing unit 26, the CPU 20 acquires the operation movement for the process having the fixed relationship with the process controlled in S208.
  • The image processing executed by the image processing unit 26 is performed using known image processing techniques.
  • The process having a fixed relationship with the process controlled in S208 is, for example, the process paired with the process controlled in S208.
  • When the process controlled in S208 is the page-turning process, the paired process is the page-return process.
  • For example, after acquiring, through image processing in the image processing unit 26, the movement of moving the right hand from the right side to the left side, the CPU 20 determines whether a movement of moving the right hand from the left side to the right side is acquired in succession within the predetermined period. If the process controlled in S208 and executed as a result was not what the user intended, it is assumed that the user will often perform a movement to cancel the process of S208.
  • The predetermined period is set in consideration of how soon the user's movement for instructing the page-return process would be performed.
  • The CPU 20 determines whether the movement of moving the right hand from the left side to the right side has been acquired.
  • This determination is made according to whether the operation movement acquired through the image processing in the image processing unit 26 matches the first feature amount of the body-part movement shown in record "No. 2" in FIG. 4.
  • The CPU 20 individually determines whether the operation movement matches each of the feature amounts of "right hand", "hand movement direction", and "hand movement amount" included in the first feature amount shown in record "No. 2" in FIG. 4.
  • That is, the CPU 20 determines whether the feature amount representing the feature of the body part in the operation movement matches the feature amount representing the feature of the right hand; whether the feature amount representing the movement direction of the body part matches the feature amount representing the feature of the hand movement direction from the left side to the right side; and whether the feature amount representing the movement amount of the body part matches the feature amount representing the hand movement amount. When all of the determinations of "right hand", "hand movement direction", and "hand movement amount" match, so that the feature amount representing the operation movement is judged to be the same as the first feature amount registered in the database in association with the identified identification information, S214 is affirmed (S214: Yes).
  • S214 may also be affirmed (S214: Yes) when the difference between the two is within the range set in advance as the allowable error.
  • The error may be changed as appropriate by an operation via the operation unit 273. When at least one of the determinations of "right hand", "hand movement direction", and "hand movement amount" does not match and the difference is not within the error range,
  • S214 is denied (S214: No).
  • When S214 is denied, the CPU 20 shifts the process to S220.
  • When S214 is affirmed, the CPU 20 returns the first feature amount registered in the database in association with the identified identification information in S212 to the value it had before S212 was executed (S216).
  • The CPU 20 then controls the process corresponding to the paired movement determined in S214 to have been acquired (S218). For example, when the paired movement is the movement of moving the right hand from the left side to the right side, the CPU 20 controls the page-return process shown in record "No. 2" in FIG. 4.
  • The image processing unit 26 performs image processing on the previous-page portion of the image data, and the image signal generated by the image processing is transmitted from the communication circuit 24 to the display device 3.
  • The image light Lim is formed by the image light forming unit 34 based on the image signal received by the communication circuit 41, and the previous-page image corresponding to the image signal is displayed. Thereafter, the CPU 20 shifts the process to S220.
  • In S220, the CPU 20 determines whether the user has operated the power switch 271 to turn off the power. When the power has not been turned off (S220: No), the CPU 20 returns the process to S202. When the power has been turned off (S220: Yes), the CPU 20 ends the routine process.
  • In S104, it is determined whether a reset of the learning result has been input in the main process. When the reset of the learning result has been input (S104: Yes), the CPU 20 resets the first feature amount registered in association with the identified identification information in the database stored in the flash ROM 22 (S106). By the reset, the first feature amount registered in the database can be returned to its initial value.
  • Even when S202 is denied (S202: No), it is determined whether the correspondence between the operation movement and the first feature amount is within the threshold (S206). When it is within the threshold (S206: Yes), the CPU 20 controls, in S208, the process corresponding to the operation movement. After that, the first feature amount related to the process controlled in S208 is updated in S210, and the updated first feature amount is registered in S212. However, when an operation movement for the process paired with the process controlled in S208 is acquired within the predetermined period, following the operation movement acquired for the process in S208 (S214: Yes), the registration in S212 is cancelled, and in S216 the first feature amount is returned to the setting before S212 was executed. A sketch of this cancellation path follows below.
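To illustrate the cancellation path (S212 to S216), here is a minimal sketch: if the paired "undo" gesture arrives within the predetermined period, the feature amount registered after the threshold-based match is rolled back to its previous value. The period length, class name, and method names are assumptions made for this sketch.

```python
import copy
import time

CANCEL_PERIOD_S = 3.0     # predetermined period for the paired gesture (assumed)

class LearnedFeatureStore:
    def __init__(self, features: dict):
        self.features = features          # e.g. {"page_turn": {...}, "page_return": {...}}
        self._backup = None
        self._registered_at = None

    def register_update(self, process: str, updated: dict) -> None:
        """S212: register the updated first feature amount, remembering the old one."""
        self._backup = (process, copy.deepcopy(self.features[process]))
        self.features[process] = updated
        self._registered_at = time.monotonic()

    def maybe_rollback(self, paired_gesture_seen: bool) -> bool:
        """S214/S216: if the paired movement followed within the period, undo S212."""
        if (paired_gesture_seen and self._backup is not None
                and time.monotonic() - self._registered_at <= CANCEL_PERIOD_S):
            process, old = self._backup
            self.features[process] = old
            self._backup = None
            return True
        return False

store = LearnedFeatureStore({"page_turn": {"direction": "right_to_left", "amount": 120.0}})
store.register_update("page_turn", {"direction": "right_to_left", "amount": 126.0})
print(store.maybe_rollback(paired_gesture_seen=True))   # True: amount back to 120.0
```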
  • In the embodiment described above, the HMD 1 has the system box 2 and the display device 3 as separate bodies. Alternatively, predetermined parts of the system box 2 may be incorporated into the housing 30 to form an integrated HMD 1.
  • In that case, the power switch 271, the power lamp 272, and the operation unit 273 are connected to the peripheral I/F 43 of the display device 3. The operation unit 273 is operated when a predetermined instruction is input to the integrated HMD 1.
  • A battery may be built into the housing 30. When the battery is built into the housing 30, the communication circuit 41 may be omitted.
  • When an external battery is used, the communication circuit 41 is supplied with power from the external battery via the transmission cable 4.
  • In the integrated HMD 1, the CPU 38 of the display device 3 similarly performs, using the RAM 40, the main process otherwise executed by the CPU 20.
  • When the CPU 38 executes the main process, it likewise executes the routine process that accompanies the execution of the main process.
  • In that case, the program ROM 39 stores the computer program for the main process, including the routine process.
  • Each process executed by the image processing unit 26 may instead be executed by the CPU 20, or by the CPU 38 in the case of the integrated HMD 1.
  • A part of the RAM 40 may be allocated as a video RAM.
  • In that case, the image processing unit 26 and the video RAM 25 may be omitted.

Abstract

The purpose of the present invention is to provide a head-mounted display, and a computer program for a head-mounted display, capable of obtaining suitable operability. A first movement of a part of the body of a user, set as an instruction for a first process executed by the head-mounted display that forms the light of an image displayed to the user, is acquired in accordance with a captured image of the outside world. The first process corresponding to the acquired first movement is controlled. A first characteristic value expressing a characteristic of the first movement is registered in association with identification information for identifying the user (S120). The user is identified (S100). When a first characteristic value has been registered in association with the identified identification information identifying the identified user, that first characteristic value is updated (S118) in accordance with a second characteristic value expressing a characteristic of the newly acquired first movement.

Description

Head-mounted display and computer program
The present disclosure relates to a head-mounted display and to a computer program for a head-mounted display.
Technologies have been proposed relating to head-mounted displays that are worn on the user's head and display images to the user. Patent Document 1 discloses a display device that allows an operator to give operation instructions easily, including complicated operation instructions. When the control unit that controls the head-mounted display detects that a part of the operator's body, such as a hand or a finger, has made a predetermined movement, it activates the process corresponding to a virtual icon in space.
Patent Document 1: JP-A-8-6708
When the control unit controls the operation of the head-mounted display according to the movement of a body part of the user, it is desirable that the control unit performs this control suitably for the movement of the user's body part. However, a given predetermined movement has features that differ from user to user: each user moves with his or her own habits. Suppose, for example, that the instruction to advance the image displayed on the head-mounted display to the image of the next page is set to the movement of moving the right hand from the right side to the left side in the user's left-right direction. For a first user, the amount of movement from the right side to the left side may be a first amount, while for a second user it may be a second amount larger than the first amount. Even the same user may move differently at different times; for example, the movement may change between wearing clothing in which the body part moves easily and wearing clothing in which the body part is hard to move.
The present disclosure aims to provide a head-mounted display, and a computer program for a head-mounted display, with which suitable operability can be obtained.
One aspect of the present disclosure is a head-mounted display comprising: first acquisition means for acquiring external image data representing an external image, captured by an imaging unit, that includes a predetermined part of a target; second acquisition means for acquiring, from the external image data acquired by the first acquisition means, a first movement of the predetermined part corresponding to a first process; processing means for controlling the first process corresponding to the first movement acquired by the second acquisition means; registration means for registering, in a storage unit, identification information identifying the target having the predetermined part in association with a first feature amount representing a feature of the first movement; and reception means for receiving input of the identification information, wherein the registration means updates the first feature amount registered in association with the received identification information according to a second feature amount representing the feature of the first movement acquired by the second acquisition means after the reception means has received the identification information. According to this configuration, the first feature amount representing the feature of the first movement for the first process is registered in the storage unit in association with the identification information identifying the user of the head-mounted display. The user of the head-mounted display is identified, and the first feature amount associated with the identified identification information can be updated with a second feature amount representing the feature of a newly acquired first movement. As described above, the first movement for the first process is not always performed in exactly the same way. With this head-mounted display, the first feature amount for the first movement is updated, so the operability of instructing the first process can be improved for each user.
 This head-mounted display may be configured as follows. It may further comprise: first determination means for determining whether the second feature amount acquired by the second acquisition means corresponds to the first feature amount registered in the storage unit in association with the identified identification information; and second determination means for determining, when the first determination means determines that the second feature amount does not correspond to the first feature amount, whether the second feature amount falls within a predetermined range with respect to the first feature amount. The processing means controls the first process corresponding to the first movement when the second determination means determines that the second feature amount falls within the predetermined range with respect to the first feature amount, and does not control the first process corresponding to the first movement when the second determination means determines that the second feature amount does not fall within the predetermined range.
 This head-mounted display may also be configured as follows. The registration means updates the first feature amount associated with the identified identification information according to the second feature amount when the second determination means determines that the second feature amount falls within the predetermined range with respect to the first feature amount, and does not update the first feature amount according to the second feature amount when the second determination means determines that the second feature amount does not fall within the predetermined range. According to this, even when the first movement is not determined to correspond to the first feature amount, the first process and the updating of the first feature amount can be controlled according to whether the movement falls within the predetermined range taken with the first feature amount as a reference.
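 The decision and update rules of the aspects above can be pictured, for illustration only, as follows. This is a minimal sketch, not part of the disclosure: the function name, the scalar treatment of the feature amount, the tolerance and range values, and the weighted update rule are all assumptions introduced here for explanation.

# Minimal sketch of the correspond / within-range / outside-range rule (assumed names and values).
def handle_movement(registered_first_feature, second_feature, tolerance=0.05, allowed_range=0.15):
    # First determination: does the second feature amount correspond to the first feature amount?
    diff = abs(second_feature - registered_first_feature)
    if diff <= tolerance:
        return registered_first_feature, "execute_first_process"
    # Second determination: is the second feature amount within the predetermined range?
    if diff <= allowed_range:
        # Execute the first process and update the first feature amount according to the second.
        updated = 0.7 * registered_first_feature + 0.3 * second_feature   # stand-in update rule
        return updated, "execute_first_process"
    # Outside the range: neither the first process nor the update is performed.
    return registered_first_feature, "ignore"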
 The second acquisition means may acquire, according to the external image acquired by the first acquisition means, a second movement of the predetermined part corresponding to a second process that is executed on the head-mounted display and has a fixed relationship with the first process, and the processing means may control the second process corresponding to the second movement acquired by the second acquisition means. The registration means may be configured so that, when the first movement and the second movement are newly acquired in succession by the second acquisition means within a predetermined period after the second determination means has determined that the second feature amount falls within the predetermined range with respect to the first feature amount, the registration in the storage unit reverts to the first feature amount as it was before being updated with the second feature amount. According to this, the updating of the first feature amount can be performed suitably on the basis of successive instructions. When the second movement is acquired within the predetermined period immediately after the first movement, as an instruction for the second process having a fixed relationship with the first process that was just controlled, the first process is considered not to have been the process the user intended. With the head-mounted display described above, updating the first feature amount on the basis of a new first movement for a first process that the user did not intend can be avoided.
 The registration means may reset the first feature amount registered in the storage unit when an instruction to reset the registration in the storage unit is input to the head-mounted display. According to this, the first feature amount registered in the storage unit can be reset.
 The head-mounted display may further comprise third determination means for determining the correspondence between the first movement acquired by the second acquisition means after the reception means has received the identification information and the first feature amount registered in the storage unit in association with the identified identification information, and the processing means may control the first process corresponding to the newly acquired first movement when the third determination means determines that the first movement corresponds to the first feature amount. According to this, the first process is controlled, and the operation corresponding to the first process can be suitably executed on the head-mounted display.
 Another aspect of the present disclosure is a computer program readable by a control unit that controls a head-mounted display, the computer program causing the control unit to function as: first acquisition means for acquiring external-image data representing an external image, captured by an imaging unit, that includes a predetermined part of a subject; second acquisition means for acquiring, according to the external-image data acquired by the first acquisition means, a first movement of the predetermined part corresponding to a first process; processing means for controlling the first process corresponding to the first movement acquired by the second acquisition means; registration means for registering, in a storage unit, identification information identifying the subject having the predetermined part in association with a first feature amount representing a feature of the first movement; and reception means for receiving input of the identification information. The registration means includes a function of updating the first feature amount registered in association with the identification information received by the reception means according to a second feature amount representing a feature of the first movement acquired by the second acquisition means after the reception means has received the identification information. According to this, as described above, a head-mounted display can be realized that identifies the user and updates the first feature amount associated with the identified identification information with the newly acquired second feature amount representing the feature of the first movement. This computer program may also be specified as a computer program for a head-mounted display that further includes the configurations described above; with such a computer program, a head-mounted display further including those configurations can also be realized.
 According to the present disclosure, it is possible to obtain a head-mounted display, and a computer program for a head-mounted display, with which suitable operability can be obtained.
A diagram showing an example of the head-mounted display. A plan view of the display device. A cross-sectional view of the display device taken at the vertical center shown in FIG. 1. A block diagram showing the electrical configuration of the head-mounted display. A diagram explaining the feature amounts representing the features of movements of parts of the user's body that are set as instructions for processes executed on the head-mounted display. A flowchart of the main process. A flowchart of the routine process.
 An embodiment for carrying out the present disclosure will be described with reference to the drawings. The present disclosure is not limited to the configurations described below, and various configurations can be adopted within the same technical idea. For example, some of the configurations shown below may be omitted or replaced with other configurations, and other configurations may be included.
 <Head-mounted display>
 An outline of the head-mounted display 1 will be described with reference to FIGS. 1 and 2. The head-mounted display is hereinafter referred to as the HMD. The front-rear and left-right directions in FIGS. 1 and 2 and the up-down direction in FIG. 1 correspond to the user's front-rear, left-right, and up-down directions in a state in which the user wears the display device 3 on the head by means of the spectacle frame 5. The HMD 1 includes a system box 2 and a display device 3. As shown in FIG. 1, the system box 2 and the display device 3 are connected, for example, via a transmission cable 4. The system box 2 transmits an image signal to the display device 3 and supplies it with power.
 The display device 3 is detachably attached to the spectacle frame 5. The spectacle frame 5 is worn on the user's head and is one example of a means for mounting the display device 3 on the user's head; the display device 3 may instead be mounted on the user's head by a mounting unit different from the spectacle frame 5. The display device 3 includes a housing 30. The housing 30 is a rectangular tubular resin member formed in an L shape in plan view. A half mirror 31 serving as a deflection member is provided at the right end of the housing 30. A camera 32 is provided on the upper surface of the housing 30 and captures images of the user's surroundings. In the present embodiment, the camera 32 is provided on the upper surface of the housing 30 so as to capture an external image in a direction corresponding to the direction in which the user's face is turned.
 As shown in FIG. 1, the spectacle frame 5 includes a left frame portion 52, a right frame portion 53, a central frame portion 54, and a support portion 56. The left frame portion 52, which extends in the front-rear direction, is hooked on the user's left ear, and the right frame portion 53, which also extends in the front-rear direction, is hooked on the user's right ear. The central frame portion 54, which extends in the left-right direction, connects the front end of the left frame portion 52 and the front end of the right frame portion 53 and is positioned over the user's face. A pair of nose pads 55 is provided at the center of the central frame portion 54 in its longitudinal direction. The support portion 56 is provided on the left end side of the upper surface of the central frame portion 54 and includes a downward extension portion 58. The downward extension portion 58 extends in the up-down direction at the front left of the user's face and is slidably engaged with a groove 57, extending in the left-right direction, formed in the support portion 56. The position of the display device 3 in the left-right direction is adjusted by sliding the downward extension portion 58 in the left-right direction.
 The display device 3 will be described with reference to FIGS. 2A and 2B. As shown in FIG. 2A, the housing 30 is provided with an attachment portion 33. The attachment portion 33 is provided at the portion of the housing 30 that faces the spectacle frame 5 and has a U-shaped groove running in the up-down direction. The downward extension portion 58 provided on the support portion 56 of the spectacle frame 5 is slidably engaged with the U-shaped groove of the attachment portion 33. The position of the display device 3 in the up-down direction is adjusted by sliding the housing 30, attached to the downward extension portion 58, in the up-down direction. As shown in FIG. 2B, the housing 30 houses an image light forming unit 34 and an eyepiece optical unit 35; in FIG. 2B the housing 30 is omitted for simplicity. The image light Lim emitted from the image light forming unit 34 is condensed by the eyepiece optical unit 35, and part of the condensed image light Lim is reflected by the half mirror 31 and guided to the user's eye EB.
 The image light forming unit 34 is provided at the left end inside the housing 30 and forms the image light Lim based on the image signal from the system box 2. The image light forming unit 34 is configured by a known spatial light modulation element, for example a liquid crystal display composed of a liquid crystal display element and a light source, or an organic EL (electro-luminescence) display. Instead of a spatial light modulation element, the image light forming unit 34 may be a known retinal scanning display that projects an image onto the retina by mechanically scanning light from a light source such as a laser in two dimensions.
 The eyepiece optical unit 35 is composed of a lens 36 and a lens holder 37. The left end of the lens holder 37 is in contact with the right end of the image light forming unit 34, and the lens 36 is held inside the right side of the lens holder 37. That is, the lens 36 and the image light forming unit 34 are separated by the lens holder 37 by a distance corresponding to the display distance of the virtual image presented to the user. The lens 36 is a plurality of lenses arranged in the left-right direction; in the present embodiment the lens 36 is composed of a plurality of lenses in order to achieve the desired optical characteristics, but it may be composed of a single lens. The eyepiece optical unit 35 condenses the image light Lim and guides it to the half mirror 31. Since the user visually recognizes a virtual image through the display device 3, the image light Lim condensed by the lens 36 is diffused light or parallel light. That is, "condensing" here refers to the action that a lens having positive power as a whole exerts on the incident light flux, and is not limited to the case where the outgoing light flux is convergent.
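 As a supplementary illustration only, not stated in the disclosure, the relation between this separation and the virtual-image display distance can be approximated with the paraxial thin-lens equation, where the symbols f (combined focal length of the lens 36), d_o (separation between the image light forming unit 34 and the lens), and D (apparent distance of the virtual image) are introduced here for explanation:

\[
\frac{1}{d_o} - \frac{1}{D} = \frac{1}{f}, \qquad d_o < f \;\Rightarrow\; D = \frac{f\,d_o}{f - d_o}.
\]

 Under this approximation, letting d_o approach f makes D grow without bound and the outgoing light becomes parallel, which is consistent with the statement that the condensed image light Lim is diffused or parallel light.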
 The plate-shaped half mirror 31 is connected to the right end of the housing 30; specifically, it is held from above and below by a predetermined portion of the housing 30 at its right end. The half mirror 31 is formed, for example, by vapor-depositing a metal such as aluminum onto the surface of a plate-like transparent member of glass, light-transmitting resin, or the like so that the transmittance becomes 50%. The light-transmitting resin is, for example, acrylic or polyacetal. The transmittance of the half mirror 31 need not be 50%; that is, "semi-transparent" in the present embodiment is a concept that includes any state in which part of the image light Lim is reflected and part of the external light is transmitted, so that the user can substantially view the image (virtual image) superimposed on the outside world.
 <Electrical configuration>
 An outline of the electrical configuration of the system box 2 and the display device 3 will be described with reference to FIG. 3 and the other drawings. The system box 2 includes a CPU 20, a program ROM 21, a flash ROM 22, a RAM 23, a communication circuit 24, a video RAM 25, an image processing unit 26, and a peripheral I/F 27. The CPU 20 controls the various processes executed in the system box 2, for example the main process shown in FIG. 5 and the routine process shown in FIG. 6, and instructs the image processing unit 26 to execute image processing. The program ROM 21 stores computer programs for the various processes executed in the system box 2.
 The flash ROM 22 stores various data, for example image data and a database. The image data is data corresponding to the images displayed on the display device 3 and includes data for a plurality of pages; in the present embodiment, image data corresponding to images of a plurality of pages is described as an example. The user views the image of each page corresponding to the image data displayed on the display device 3. The database is a storage unit in which identification information and first feature amounts are registered in association with each other. The identification information is information for identifying the user who uses the HMD 1. A first feature amount is information representing a feature of a movement of a predetermined part of a subject, for example a part of the user's body, set as an instruction for a process executed on the HMD 1.
 The processes executed on the HMD 1 are, for example, a page-feed process and a page-return process. The page-feed process replaces the image displayed on the display device 3 with the image of the next page, and the page-return process replaces it with the image of the previous page. The page-feed process and the page-return process are processes having the relationship referred to in S214 of the routine process shown in FIG. 6. The part of the user's body is, for example, a hand. When the body part is a hand, the movement for the page-feed process is a movement of one hand from one side to the other side in the user's left-right direction, specifically a movement of the right hand from the right side to the left side. The movement for the page-return process is a movement of the hand on one side from the other side back to the one side, specifically a movement of the right hand from the left side to the right side. Based on these specific examples, as shown in FIG. 4, a feature amount representing the feature of the right hand, a feature amount representing the feature of the hand's movement direction, and a feature amount representing the feature of the hand's movement amount are registered in the database as a first feature amount in association with the identification information. As is clear from FIG. 4, the first feature amounts registered in the database in association with the identification information are managed for each process.
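 For illustration, the database of FIG. 4 could be pictured as the structure below. This is only a sketch: the keys, the user identifier, and the numeric movement amount and its units are assumptions introduced here, not values taken from the disclosure.

# Illustrative sketch of the FIG. 4 database: first feature amounts keyed by user and by process.
gesture_db = {
    "user_0001": {                                   # identification information of the user
        "page_feed": {                               # No. 1: page-feed process
            "body_part": "right_hand",               # feature amount representing the right hand
            "direction": "right_to_left",            # feature amount of the hand's movement direction
            "movement_amount": 0.30,                 # feature amount of the hand's movement amount (assumed units)
        },
        "page_return": {                             # No. 2: page-return process
            "body_part": "right_hand",
            "direction": "left_to_right",
            "movement_amount": 0.30,
        },
    },
}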
 The RAM 23 serves as a work area when the CPU 20 executes the computer programs stored in the program ROM 21. The communication circuit 24 controls communication with the display device 3 and the like; the transmission cable 4 is electrically connected to it. Via the transmission cable 4, the communication circuit 24 transmits image signals to the display device 3, supplies the display device 3 with power from a battery or the like, and receives the external-image signal transmitted from the display device 3. The external-image signal represents external-image data corresponding to the external image captured by the camera 32. The video RAM 25 stores the image data transmitted to the display device 3 as an image signal and also stores the external-image data obtained from the external-image signal received by the communication circuit 24. The image processing unit 26 reads image data from the flash ROM 22 into the video RAM 25, performs image processing on the image data stored in the video RAM 25, and generates the image signal. The image processing unit 26 also generates the external-image data from the received external-image signal and processes the external-image data in accordance with instructions from the CPU 20. The image processing unit 26 is provided to execute these various kinds of image processing in order to reduce the processing load on the CPU 20.
 The peripheral I/F 27 is an interface to which predetermined components are electrically connected, for example a power switch 271, a power lamp 272, and an operation unit 273. The power switch 271 switches the power of the HMD 1 on and off; when the power switch 271 is turned on, the HMD 1 starts up. More specifically, when the power switch 271 is turned on, power is supplied from the battery to the system box 2 and, via the transmission cable 4, to the display device 3. The power lamp 272 indicates that the power is on and lights at the timing when the power switch 271 is turned on. The operation unit 273 is an interface for inputting predetermined instructions to the system box 2 and includes a plurality of operation buttons; a predetermined instruction is input by operating the operation buttons of the operation unit 273 as appropriate.
 In addition to the image light forming unit 34, the display device 3 includes a CPU 38, a program ROM 39, a RAM 40, a communication circuit 41, and a peripheral I/F 43. These components are built into the housing 30 together with the image light forming unit 34 and the like. The CPU 38 controls the various processes executed on the display device 3; for example, the CPU 38 drives the image light forming unit 34 to form the image light Lim corresponding to the image signal so that the image is displayed to the user. The program ROM 39 stores computer programs for the various processes executed on the display device 3, for example processes related to the formation of the image light Lim by the image light forming unit 34. The RAM 40 serves as a work area when the CPU 38 executes the computer programs stored in the program ROM 39.
 The communication circuit 41 controls communication with the system box 2 and the like. The transmission cable 4 is electrically connected to the communication circuit 41; it extends rearward from the housing 30 and is connected to the system box 2. Via the transmission cable 4, the communication circuit 41 transmits the external-image signal to the system box 2, receives the image signal transmitted from the system box 2, and receives the supply of power from the system box 2. The supplied power is distributed to each part of the display device 3 and to the camera 32. The peripheral I/F 43 is the interface to which the camera 32 is connected. The external-image signal representing the external-image data corresponding to the external image captured by the camera 32 is transmitted from the communication circuit 41 to the system box 2 via the peripheral I/F 43.
 The HMD 1 is controlled by the CPU 20 of the system box 2 and the CPU 38 of the display device 3. That is, various functions are realized in the system box 2 by the CPU 20 executing the computer programs stored in the program ROM 21, and the CPU 20 can also be regarded as a control unit serving as the various functional means of the HMD 1. Similarly, various functions are realized in the display device 3 by the CPU 38 executing the computer programs stored in the program ROM 39, and the CPU 38 can likewise be regarded as a control unit serving as the various functional means of the HMD 1. The computer programs are written into the program ROM 21 and the program ROM 39 when the HMD 1 is shipped from the factory. The program ROM 39 is an example of a computer-readable storage device; instead of the program ROM 39, for example a ROM, an HDD, or a RAM may be used as the storage device. The storage device in this case is a storage medium other than a transitory storage medium, that is, a non-transitory storage medium capable of retaining data regardless of the length of time for which the data is stored. The computer programs may also be stored on a storage medium such as an external server; in that case, the computer programs are downloaded from the external server or the like via a connection interface and stored in the program ROM 21 and the program ROM 39 as appropriate. In this case, the computer programs are transmitted from the external server or the like to the HMD 1 as a computer-readable transitory storage medium (for example, a transmission signal).
 <Main process>
 The main process executed on the HMD 1 will be described with reference to FIG. 5. The main process is executed by the CPU 20 of the system box 2 and is started when the user operates the power switch 271 and the power is turned on. The CPU 20 executes the computer program for the main process stored in the program ROM 21, using the RAM 23. This computer program includes a computer program module for the routine process.
 In S100, having started the main process, the CPU 20 executes personal authentication. Identification information for identifying the user is used for the personal authentication. The user inputs his or her identification information by, for example, operating the operation unit 273. The identification information may be input by other methods: for example, a wireless tag containing information that can identify an individual may be read by a reading unit so that identification information according to the read information is input, or a body part having a feature amount that can identify an individual, such as a hand, may be imaged by the camera 32 so that identification information based on the imaged body part is input. The CPU 20 acquires the input identification information and stores it in the RAM 23. The user currently using the HMD 1 is identified by the input identification information. In the present embodiment, the identification information acquired in S100, which identifies the user using the HMD 1, is referred to as the "identified identification information". In S102, the CPU 20 performs control so that a user menu is displayed; the display device 3 displays the user menu and the user views it.
 The user menu displayed in S102 includes a plurality of options, for example the following three. The first option is for resetting the gesture learning results; a learning result is obtained by the learning process in S118 and in S210 of FIG. 6. The second option is for executing gesture learning; when gesture learning is selected and executed, the first feature amounts registered in the database in association with the identified identification information are updated. The third option is for displaying images on the display device 3; when this option is selected, the display device 3 displays images corresponding to the image data stored in the flash ROM 22. The user operates the operation unit 273 to select one of the options.
 In S104, the CPU 20 determines, in response to the display of the user menu, whether a reset of the learning results has been input. When a reset of the learning results has been input (S104: Yes), the CPU 20 resets the first feature amounts registered in the database stored in the flash ROM 22 in association with the identified identification information (S106); the first feature amounts registered in association with the identified identification information are returned to predetermined initial values. After executing S106, the CPU 20 performs control so that the user menu is displayed again. The user menu displayed after the reset may be placed in a state in which a reset of the learning results cannot be input, because the first feature amounts have already been reset; in the present embodiment, a reset of the learning results cannot be input on the user menu displayed after S106 has been executed.
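 The reset of S106 could be sketched as follows, using the gesture_db structure shown earlier; the initial values and names are assumptions introduced here for illustration only.

import copy

# Sketch of the S106 reset: the first feature amounts registered for the identified user
# revert to predetermined initial values (INITIAL_FIRST_FEATURES is an assumed constant).
INITIAL_FIRST_FEATURES = {
    "page_feed":   {"body_part": "right_hand", "direction": "right_to_left", "movement_amount": 0.30},
    "page_return": {"body_part": "right_hand", "direction": "left_to_right", "movement_amount": 0.30},
}

def reset_learning(gesture_db, user_id):
    gesture_db[user_id] = copy.deepcopy(INITIAL_FIRST_FEATURES)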
 When a reset of the learning results has not been input (S104: No), or after executing S106, the CPU 20 determines, in response to the display of the user menu, whether gesture learning has been input (S108). When image display has been input and gesture learning has not (S108: No), the CPU 20 moves the process to S124. When gesture learning has been input (S108: Yes), the CPU 20 performs control so that the camera 32 is activated (S110), whereupon the camera 32 starts up. The CPU 20 then performs control so that a selection screen for inputting the learning target is displayed; the display device 3 displays the selection screen and the user views it. The selection screen includes a plurality of options for specifying the processes executed on the HMD 1 or the movements of body parts, for example an option related to the page-feed process and an option related to the page-return process. The user operates the operation unit 273 to input the desired option, for example the option related to the page-feed process. In S112, the CPU 20 specifies the learning target according to the input option; when the option related to the page-feed process has been input, the CPU 20 specifies as the learning target the first feature amount, associated in the database with the identified identification information, that represents the feature of the movement for the page-feed process.
 In S114, the CPU 20 determines whether the user is ready to perform the movement represented by the first feature amount specified as the learning target in S112. When ready, the user presses the operation key of the operation unit 273 associated with the ready instruction, and the CPU 20 determines S114 according to whether this operation key has been pressed. When the operation key has not been pressed, the CPU 20 makes a negative determination (S114: No) and repeats S114; when it has been pressed, the CPU 20 makes an affirmative determination (S114: Yes) and performs control so that the camera 32 starts capturing the external image (S116). When the learning target specified in S112 relates to the page-feed process, the user, for example, moves the right hand from the right side to the left side in front of the body, turning the face toward the moving right hand. The camera 32 captures an external image including the user's moving right hand. The capture of the external image ends, for example, after it has been performed for a predetermined fixed period.
 The external-image signal representing the external-image data corresponding to the external image captured by the camera 32 is exchanged between the communication circuit 41 of the display device 3 and the communication circuit 24 of the system box 2 via the transmission cable 4. The external-image signal is converted into external-image data by the image processing unit 26, and the generated external-image data is stored in the video RAM 25. In S118, the CPU 20 executes the learning process on the external-image data stored in the video RAM 25. In the learning process of S118, the CPU 20 acquires, from the external-image data stored in the video RAM 25, the movement of the body part included in the external image, for example the movement of the right hand moving from the right side to the left side. To acquire the movement of the right hand, the CPU 20 instructs the image processing unit 26 to execute, on the external-image data stored in the video RAM 25, image processing for identifying the movement of a body part such as the right hand included in the external image, and acquires the movement of the body part from the result of that image processing. The CPU 20 then obtains a feature amount representing the feature of the acquired movement. The image processing for obtaining the feature amount, executed by the image processing unit 26, is performed according to image processing techniques that have already been developed. The feature amount of a movement is, for example, the movement distance of the body part in the external image, its movement direction, or both. In the present embodiment, a feature amount used to update a first feature amount registered in the database, such as the feature amount of the movement acquired here in S118 or in S210 of FIG. 6, is referred to as a second feature amount. The CPU 20 updates, according to the second feature amount, the first feature amount of the learning target associated with the identified identification information in the database. When S104 was answered negatively (S104: No), the first feature amount of the learning target is the first feature amount associated with the identified identification information in the database at the start of the main process; when S106 was executed, it is the first feature amount after the reset. The updating in the learning process is performed using a neural network (a self-organizing map, SOM), which is an already known technique, and a detailed description of the update by the SOM is therefore omitted.
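 Since the disclosure names a self-organizing map for this update but omits its details, the learning step of S118 and S210 can only be pictured here with a stand-in rule. The sketch below uses the gesture_db structure shown earlier; the function name, the learning rate, and the simple weighted update are assumptions and are not the SOM used in the embodiment.

# Stand-in sketch of the learning update of S118/S210 (the embodiment uses an SOM, details omitted).
def learn(gesture_db, user_id, process_id, second_feature, learning_rate=0.3):
    entry = gesture_db[user_id][process_id]          # first feature amount of the learning target
    # Pull the registered movement amount toward the newly observed second feature amount;
    # the body part and movement direction that define the gesture are left unchanged in this sketch.
    entry["movement_amount"] += learning_rate * (second_feature["movement_amount"] - entry["movement_amount"])
    return entry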
 In S120, after executing S118, the CPU 20 registers the first feature amount updated in S118 in the database as the first feature amount of the learning target associated with the identified identification information. The CPU 20 then performs control so that the selection screen is displayed again, and the user operates the operation unit 273 to input the desired option; the selection screen includes an option for instructing that gesture learning is complete. In S122, the CPU 20 determines, in response to the display of the selection screen, whether the input option instructs completion of gesture learning. When an option related to a process other than the page-feed process has been input (S122: No), the process returns to S110 and S110 to S120 are executed again. The selection screen displayed in S112 may allow multiple options to be input; in that case, if S112 to S120 have not yet been executed for all of the input options, S122 may be answered negatively (S122: No) and the unprocessed options may be handled in turn.
 When the option instructing completion of gesture learning has been input (S122: Yes), the CPU 20 moves the process to S124. When the selection screen displayed in S112 allows multiple options to be input, S122 is answered affirmatively (S122: Yes) once S112 to S120 have been executed for all of the input options. In S124, the CPU 20 executes the routine process, and the main process ends when the routine process ends.
 <Routine process>
 The routine process executed in S124 of the main process shown in FIG. 5 will be described with reference to FIG. 6. When the routine process starts, the HMD 1 begins displaying the image corresponding to the image data stored in the flash ROM 22. Specifically, the image processing unit 26 performs image processing on a predetermined page portion of the image data stored in the flash ROM 22, and the image signal generated by this image processing is transmitted from the communication circuit 24 to the display device 3. In the display device 3, the image light forming unit 34 forms the image light Lim based on the image signal received by the communication circuit 41, the image corresponding to the image signal is displayed, and the user views the displayed image.
 Having started the routine process, the CPU 20 performs control so that the camera 32 starts capturing the external image (S200). For example, the user moves the right hand from the right side to the left side in front of the body, turning the face toward the moving right hand as the hand is moved. The camera 32 captures an external image including the user's moving right hand, and the capture of the external image continues while the routine process is being executed. The external-image signal representing the external-image data corresponding to the external image captured by the camera 32 is exchanged between the communication circuit 41 of the display device 3 and the communication circuit 24 of the system box 2 via the transmission cable 4; the external-image signal is converted into external-image data by the image processing unit 26, and the generated external-image data is stored in the video RAM 25. The CPU 20 acquires, from the external-image data stored in the video RAM 25, the movement of the body part included in the external image, for example the movement of the right hand moving from the right side to the left side; the movement of the body part is acquired in the same manner as described for S118. In the following, a motion that may be a movement for inputting an instruction for a predetermined process executed on the HMD 1, such as the acquired movement of the body part, is referred to as an "operation motion".
 In S202, the CPU 20 determines the correspondence between the operation motion and the first feature amounts associated with the identified identification information. Specifically, the CPU 20 determines whether the feature amount representing the features of the operation motion corresponds to a first feature amount registered in the database in association with the identified identification information. S202 will be described taking as an example "moving the right hand from the right side to the left side", which is the motion for instructing the page-feed process. In S202, the CPU 20 individually determines whether the feature amount representing the features of the operation motion is the same as each of the feature amounts "right hand", "hand movement direction", and "hand movement amount" included in the first feature amount shown in the record "No. 1" of FIG. 4. First, the CPU 20 determines whether the feature amount representing the body part in the operation motion matches the feature amount representing the right hand. Next, it determines whether the feature amount representing the movement direction of the body part in the operation motion matches the feature amount representing a hand movement from the right side to the left side. Finally, it determines whether the feature amount representing the movement amount of the body part in the operation motion matches the feature amount representing the hand's movement amount.
 When the determinations for all of the feature amounts "right hand", "hand movement direction", and "hand movement amount" match, and the feature amount representing the features of the operation motion is therefore judged to be the same as the first feature amount registered in the database in association with the identified identification information, S202 is answered affirmatively (S202: Yes). S202 may also be answered affirmatively (S202: Yes), just as in the identical case, when the difference between the two lies within a range set in advance as an allowable error; this error may be changed as appropriate by an operation via the operation unit 273. S202 is performed for all of the first feature amounts registered in the database in association with the identified identification information. When, for every first feature amount, at least one of the determinations for "right hand", "hand movement direction", and "hand movement amount" is not the same and the difference also lies outside the predetermined range, S202 is answered negatively (S202: No). When S202 is answered affirmatively (S202: Yes), the CPU 20 controls the process corresponding to that first feature amount (S204); based on the example of "moving the right hand from the right side to the left side" shown in the record "No. 1" of FIG. 4, the CPU 20 controls the page-feed process. In response to S204, the image processing unit 26 performs image processing on the next-page portion of the image data, the image signal generated by this image processing is transmitted from the communication circuit 24 to the display device 3, and in the display device 3 the image light forming unit 34 forms the image light Lim based on the image signal received by the communication circuit 41, so that the image of the next page corresponding to the image signal is displayed. The CPU 20 then moves the process to S220.
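 A minimal sketch of the S202 correspondence check, assuming the gesture_db layout shown earlier; the matches() and find_matching_process() helpers and the error value are assumptions introduced here, not the disclosure's interfaces.

# Illustrative sketch of the S202 correspondence check against every registered first feature amount.
def matches(observed, registered, error=0.05):
    return (observed["body_part"] == registered["body_part"]
            and observed["direction"] == registered["direction"]
            and abs(observed["movement_amount"] - registered["movement_amount"]) <= error)

def find_matching_process(gesture_db, user_id, observed, error=0.05):
    # S202 is performed for all first feature amounts registered for the identified user.
    for process_id, registered in gesture_db[user_id].items():
        if matches(observed, registered, error):
            return process_id           # S202: Yes, the corresponding process is controlled (S204)
    return None                         # S202: No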
 When S202 is answered negatively (S202: No), the CPU 20 determines whether the difference between the feature amount representing the features of the operation motion and the first feature amount lies within a predetermined threshold (S206). As the threshold, a predetermined range is set in advance with respect to the first feature amount; this range is wider than the error described above and is set as appropriate based on the range within which the operation motion can be judged to be similar to the movement corresponding to the first feature amount. When the operation motion lies outside the threshold, the CPU 20 answers S206 negatively (S206: No) and moves the process to S220; in this case it is judged that the operation motion was not a movement related to the input of an instruction to the HMD 1. When the operation motion lies within the threshold, the CPU 20 answers S206 affirmatively (S206: Yes). In that case, in S208 the CPU 20 controls, as in S204, the process corresponding to the first feature amount used as the criterion in S202; for example, the CPU 20 controls the page-feed process, and on the HMD 1 the page feed of the image displayed on the display device 3 is executed. After executing S208, in S210 the CPU 20 executes the learning process. The learning process is executed on the operation motion in the same manner as S118 of the main process: the CPU 20 obtains, from the acquired operation motion, a feature amount representing its features as a second feature amount, and updates, according to the obtained second feature amount, the first feature amount associated with the identified identification information in the database. The first feature amount that is updated is the one for the process controlled in S208. In S212, the CPU 20 registers the first feature amount updated in S210 in the database as the first feature amount, associated with the identified identification information, for the process executed in S208.
 In S214, the CPU 20 determines whether, within a predetermined period, an operation motion for a process having a fixed relationship with the process controlled in S208 has been acquired in succession to the operation motion acquired for the process of S208. The CPU 20 instructs the image processing unit 26 to execute image processing for identifying the movement of the body part included in the external image on the external-image data that is generated continuously through the continued capture by the camera 32 and stored in the video RAM 25. When an operation motion for a process having a fixed relationship with the process controlled in S208 has been performed, the CPU 20 acquires that operation motion from the result of the image processing by the image processing unit 26. The image processing executed by the image processing unit 26 in S214 is performed according to image processing techniques that have already been developed.
 A process having a fixed relationship with the process controlled in S208 is, for example, a process paired with the process controlled in S208. Specifically, when the process controlled in S208 is the page-forward process, the process paired with it is the page-back process. Having acquired, through image processing in the image processing unit 26, a movement of the right hand from the right side to the left side prior to the control of S208, the CPU 20 then determines whether a movement of the right hand from the left side to the right side is acquired in succession within the predetermined period. If the process controlled and executed as a result of S208 was not what the user intended, it is assumed that the user will often perform a movement to cancel the process of S208. For example, if the displayed image has advanced to the next page against the user's intention, the user is expected to instruct the page-back process in order to display the originally displayed image again. The predetermined period is set in consideration of how long it takes the user to perform the action instructing the page-back process.
 Regarding S214, whether, for example, a movement of the right hand from the left side to the right side has been acquired is determined according to whether the operation action acquired by the image processing in the image processing unit 26 matches the first feature amount of the movement of the body part shown in the record "No. 2" of FIG. 4. Specifically, the CPU 20 individually determines whether the feature amount representing the feature of the operation action is identical to each of the feature amounts "right hand", "hand movement direction", and "hand movement amount" included in that first feature amount. First, the CPU 20 determines whether the feature amount representing the body part in the operation action matches the feature amount representing the right hand. Next, the CPU 20 determines whether the feature amount representing the movement direction of the body part in the operation action matches the feature amount representing a hand movement from the left side to the right side. Further, the CPU 20 determines whether the feature amount representing the movement amount of the body part in the operation action matches the feature amount representing the hand movement amount. When the determinations for "right hand", "hand movement direction", and "hand movement amount" all match, so that the feature amount representing the operation action is judged identical to the first feature amount registered in the database in association with the identified identification information, S214 is affirmed (S214: Yes). S214 may also be affirmed (S214: Yes), as in the identical case, when the difference between each of the "right hand", "hand movement direction", and "hand movement amount" feature amounts and the first feature amount lies within a range preset as an error. The error may be changed as appropriate by an operation via the operation unit 273. If at least one of the determinations for "right hand", "hand movement direction", and "hand movement amount" is not identical and the difference between those feature amounts and the first feature amount is outside the predetermined range, S214 is denied (S214: No).
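 A minimal sketch of the per-feature comparison described above; the record layout, the field names, and the travel tolerance are assumptions, since the embodiment only states that "right hand", "hand movement direction", and "hand movement amount" are compared individually against the registered first feature amount.

```python
# Illustrative sketch: the record layout and tolerance are assumptions; the
# embodiment only specifies that part, direction, and travel are compared
# individually against the registered first feature amount (FIG. 4, "No. 2").

RECORD_NO2 = {"part": "right_hand", "direction": "left_to_right", "travel": 0.4}

def matches_record(observed, record, travel_error=0.1):
    """S214: affirm only when every individual feature matches, allowing a
    preset error on the travel amount."""
    if observed["part"] != record["part"]:
        return False
    if observed["direction"] != record["direction"]:
        return False
    return abs(observed["travel"] - record["travel"]) <= travel_error

gesture = {"part": "right_hand", "direction": "left_to_right", "travel": 0.45}
print(matches_record(gesture, RECORD_NO2))  # True -> S214: Yes
```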
 If an operation action for a process having a fixed relationship with the process of S208 is not acquired within the predetermined period (S214: No), the CPU 20 moves the processing to S220. If such an operation action is acquired within the predetermined period (S214: Yes), the CPU 20 returns the first feature amount registered in S212 in the database in association with the identified identification information to the setting it had before S212 was executed (S216). Subsequently, the CPU 20 controls the process corresponding to the paired movement judged in S214 to have been acquired (S218). For example, when the paired movement is a movement of the right hand from the left side to the right side, the CPU 20 controls the page-back process shown in the record "No. 2" of FIG. 4. In response to the control of the page-back process, the image processing unit 26 processes the previous-page portion of the image data, and the image signal generated by the image processing is transmitted from the communication circuit 24 to the display device 3. In the display device 3, the image light forming unit 34 forms the image light Lim based on the image signal received by the communication circuit 41, and the image of the previous page corresponding to the image signal is displayed. Thereafter, the CPU 20 moves the processing to S220.
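 The sketch below ties S208 through S218 together: execute, provisionally learn, and roll the registration back when the paired gesture is detected within the window. The pairing table, the blending rule, and the helper names are assumptions used only to show the revert behaviour.

```python
# Illustrative sketch of the S208-S218 flow; the pairing table, the blending
# rule, and the helpers are assumptions, not part of the embodiment.

PAIRED = {"page_forward": "page_back", "page_back": "page_forward"}

def learn_value(registered, observed, alpha=0.3):
    # S210: blend the observed second feature amount into the first one.
    return (1 - alpha) * registered + alpha * observed

def run_with_undo(db, user_id, process_id, observed, undo_gesture_seen):
    """Execute the process (S208), register the learned value (S210/S212),
    then revert and run the paired process if the opposite gesture arrives
    within the predetermined period (S214-S218)."""
    executed = [process_id]                       # S208 (stand-in for page turning)
    backup = db[user_id][process_id]              # value before S212
    db[user_id][process_id] = learn_value(backup, observed)
    if undo_gesture_seen:                         # S214: paired gesture in the window
        db[user_id][process_id] = backup          # S216: cancel the registration
        executed.append(PAIRED[process_id])       # S218: e.g. page_back
    return executed

db = {"user_A": {"page_forward": 0.50}}
print(run_with_undo(db, "user_A", "page_forward", 0.62, undo_gesture_seen=True))
print(db)  # the first feature amount is back to 0.50
```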
 In S220, the CPU 20 determines whether the user has operated the power switch 271 and the power has been turned off. If the power has not been turned off (S220: No), the CPU 20 returns the processing to S202. If the power has been turned off (S220: Yes), the CPU 20 ends the routine process. Having ended the routine process, the CPU 20 returns to the main process and ends the main process.
 <Effects of this embodiment>
 According to this embodiment, the following effects can be obtained.
 (1) After the main process starts, personal authentication is performed in S100. When gesture learning is subsequently input (S108: Yes), the first feature amount that is associated with the identified identification information identifying the user of the HMD 1 specified by the personal authentication and that was made the learning target in S112 is updated in S118, by the learning process, with the second feature amount representing the feature of the movement of the user's body part imaged in S116, and the updated first feature amount is registered in S120 in association with the identified identification information. Therefore, the first feature amount for a specific process can be updated for each user. Because the first feature amount is updated by the learning process of S118, the operability of instructing the process corresponding to the movement represented by the first feature amount can be improved for each user. Note that the producer of the HMD 1 executes the processes S110 to S120 by themselves when producing the HMD 1. By executing S110 to S120, the first feature amount that serves as the initial value after the reset in S106 is registered in the database.
 In S104 of the main process, it is determined whether a reset of the learning result has been input. When a reset of the learning result has been input (S104: Yes), the CPU 20 resets, in the database stored in the flash ROM 22, the first feature amount registered in association with the identified identification information (S106). Because of this reset, the first feature amount registered in the database can be returned to its initial value.
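 A minimal sketch of a per-user feature database keyed by identification information, with factory defaults that the reset of S106 would restore; the data layout, the numeric values, and the blending factor are assumptions.

```python
# Illustrative sketch: the database layout, factory defaults, and blending
# factor are assumptions; the embodiment only requires that first feature
# amounts are stored per user and per process, can be updated by learning,
# and can be reset to initial values.

FACTORY_DEFAULTS = {"page_forward": 0.50, "page_back": 0.50}

def register_user(db, user_id):
    # S110-S120 at production time seed the initial first feature amounts.
    db[user_id] = dict(FACTORY_DEFAULTS)

def update_feature(db, user_id, process_id, second_feature, alpha=0.3):
    # S118/S120: learn from the captured movement of this user only.
    first = db[user_id][process_id]
    db[user_id][process_id] = (1 - alpha) * first + alpha * second_feature

def reset_user(db, user_id):
    # S106: return the registered first feature amounts to their initial values.
    db[user_id] = dict(FACTORY_DEFAULTS)

db = {}
register_user(db, "user_A")
update_feature(db, "user_A", "page_forward", 0.70)
print(db["user_A"]["page_forward"])  # approximately 0.56, tailored to user_A only
reset_user(db, "user_A")
print(db["user_A"]["page_forward"])  # back to the initial 0.50
```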
 (2) In S202 of the routine process, the correspondence between the operation action acquired by the image processing of the external-image data and the first feature amount associated with the identified identification information is determined. When the feature amount representing the feature of the operation action is identical to the first feature amount registered in the database in association with the identified identification information (S202: Yes), the CPU 20 controls the process corresponding to the operation action (S204). Therefore, the HMD 1 can execute the process corresponding to the operation action. As described above, the determination of S202 may also be affirmed (S202: Yes) when the difference between the feature amount representing the feature of the operation action and the first feature amount is within a range preset as an error.
 Even when S202 is denied (S202: No), it is determined whether the correspondence between the operation action and the first feature amount is within the threshold (S206), and if it is within the threshold (S206: Yes), the CPU 20 controls the process corresponding to the operation action in S208. Thereafter, the learning process updates in S210 the first feature amount that is associated with the identified identification information and relates to the process controlled in S208, and the updated first feature amount is registered in S212. However, when an operation action for the process paired with the process controlled in S208 is acquired within the predetermined period, in succession to the operation action acquired for the process of S208 (S214: Yes), the registration of S212 is canceled and the setting is returned in S216 to the state before S212 was executed. Because the setting is returned to the state before S212, processing can be controlled according to the situation and the first feature amount can be registered based on the controlled process. When operation actions for two paired processes are acquired consecutively within the predetermined period, the process controlled first in S208 is considered not to have been the process the user intended. Updating the first feature amount based on a movement related to a process the user did not intend can therefore be avoided.
 <Modifications>
 This embodiment may also be modified as follows.
 (1) In the routine process described above, when S202 is denied (S202: No), the processes from S206 onward are executed, and then, if the power has not been turned off (S220: No), the processing returns to S202. Alternatively, the processes from S206 onward may be omitted, so that when S202 is denied (S202: No) the processing moves to S220. With a configuration that moves the processing to S220, S206 to S218 can be omitted and the procedure of the routine process can be simplified.
 (2) In the routine process, when S202 is denied and S206 is affirmed (S202: No, S206: Yes), the registration of the updated first feature amount is canceled (S216) if S214 is affirmed (S214: Yes). Such processing may instead be performed as follows. That is, when S206 is affirmed (S206: Yes), the CPU 20 subsequently executes S214. If S214 is denied (S214: No), the CPU 20 executes S208 to S212 and then moves the processing to S220. If S214 is affirmed (S214: Yes), the CPU 20 moves the processing to S220 without executing S208 to S212. With a configuration that moves the processing to S220, S216 and S218 can be omitted and the procedure of the routine process can be simplified.
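 A minimal sketch of the reordered flow of modification (2), under the same assumptions as the earlier sketches: because the paired gesture is checked before S208 to S212 are executed, there is never a provisional registration to roll back.

```python
# Illustrative sketch of modification (2): check for the paired gesture first
# (S214) and only then execute and learn (S208-S212), so no rollback
# (S216/S218) is needed. The helpers and values are assumptions.

def routine_modified(db, user_id, process_id, observed, paired_gesture_seen):
    executed = []
    if not paired_gesture_seen:          # S214 is checked before S208
        executed.append(process_id)      # S208
        first = db[user_id][process_id]
        db[user_id][process_id] = 0.7 * first + 0.3 * observed  # S210/S212
    return executed                      # S214: Yes -> go straight to S220

db = {"user_A": {"page_forward": 0.50}}
print(routine_modified(db, "user_A", "page_forward", 0.62, paired_gesture_seen=True))
print(db)  # [] is returned and the feature amount stays 0.50; nothing to revert
```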
 (3) The HMD 1 has been described with the system box 2 and the display device 3 as separate bodies. Alternatively, predetermined units among those provided in the system box 2 may be built into the housing 30 to form an integrated HMD 1. Specifically, the flash ROM 22, the video RAM 25, and the image processing unit 26 are built into the housing 30. In addition to the camera 32, the power switch 271, the power lamp 272, and the operation unit 273 are connected to the peripheral I/F 43 of the display device 3. The operation unit 273 is operated when a predetermined instruction is input to the integrated HMD 1. A battery may also be built into the housing 30. When the battery is built into the housing 30, the communication circuit 41 may be omitted. When no battery is built in, however, the communication circuit 41 receives power from an external battery via the transmission cable 4. The CPU 38 of the display device 3 executes the main process executed by the CPU 20 in the same manner, using the RAM 40. When the CPU 38 executes the main process, it likewise executes the routine process that accompanies the execution of the main process. The program ROM 39 stores the computer program for the main process, including the routine process.
 Each process executed by the image processing unit 26 may instead be executed by the CPU 20, or by the CPU 38 in the case of the integrated HMD 1. A part of the RAM 23, or of the RAM 40 in the case of the integrated HMD 1, may be allocated as the video RAM. When such an allocation is made, the image processing unit 26 and the video RAM 25 may be omitted.
 1 HMD
 2 System box
 3 Display device
 4 Transmission cable
 5 Eyeglass frame
 20 CPU
 21 Program ROM
 22 Flash ROM
 23 RAM
 24 Communication circuit
 25 Video RAM
 26 Image processing unit
 27 Peripheral I/F
 30 Housing
 31 Half mirror
 32 Camera
 33 Mounting portion
 34 Image light forming unit
 35 Eyepiece optical unit
 36 Lens
 37 Lens holder
 38 CPU
 39 Program ROM
 40 RAM
 41 Communication circuit
 43 Peripheral I/F
 52 Left frame portion
 53 Right frame portion
 54 Central frame portion
 55 Nose pad portion
 56 Support portion
 57 Groove
 58 Downward extension portion
 271 Power switch
 272 Power lamp
 273 Operation unit
 EB Eye
 Lim Image light

Claims (7)

  1.  A head-mounted display comprising:
     first acquisition means for acquiring external-image data representing an external image, captured by an imaging unit, that includes a predetermined part of a subject;
     second acquisition means for acquiring, according to the external-image data acquired by the first acquisition means, a first movement of the predetermined part corresponding to a first process;
     processing means for controlling the first process corresponding to the first movement acquired by the second acquisition means;
     registration means for registering, in association with each other in a storage unit, identification information identifying the subject having the predetermined part and a first feature amount representing a feature of the first movement; and
     reception means for receiving input of identification information,
     wherein the registration means updates the first feature amount registered in association with the identification information received by the reception means, that is, the first feature amount associated with the identified identification information, according to a second feature amount representing a feature of the first movement acquired by the second acquisition means after the reception means has received the identification information.
  2.  The head-mounted display according to claim 1, further comprising:
     first determination means for determining whether the second feature amount acquired by the second acquisition means corresponds to the first feature amount registered in the storage unit in association with the identified identification information; and
     second determination means for determining, when the first determination means determines that the second feature amount does not correspond to the first feature amount, whether the second feature amount falls within a predetermined range with respect to the first feature amount,
     wherein the processing means
      controls the first process corresponding to the first movement when the second determination means determines that the second feature amount falls within the predetermined range with respect to the first feature amount, and
      does not control the first process corresponding to the first movement when the second determination means determines that the second feature amount does not fall within the predetermined range with respect to the first feature amount.
  3.  The head-mounted display according to claim 2, wherein the registration means
     updates the first feature amount associated with the identified identification information according to the second feature amount when the second determination means determines that the second feature amount falls within the predetermined range with respect to the first feature amount, and
     does not update the first feature amount associated with the identified identification information according to the second feature amount when the second determination means determines that the second feature amount does not fall within the predetermined range with respect to the first feature amount.
  4.  The head-mounted display according to claim 2, wherein
     the second acquisition means acquires, according to the external image acquired by the first acquisition means, a second movement of the predetermined part corresponding to a second process that is executed by the head-mounted display and has a fixed relationship with the first process,
     the processing means controls the second process corresponding to the second movement acquired by the second acquisition means, and
     the registration means, when the first movement and the second movement are newly acquired in succession by the second acquisition means within a predetermined period after the second determination means has determined that the second feature amount falls within the predetermined range with respect to the first feature amount, causes the registration in the storage unit of the first feature amount updated according to the second feature amount to become the first feature amount as it was before being updated with the second feature amount.
  5.  The head-mounted display according to claim 1, wherein the registration means resets the first feature amount registered in the storage unit when an instruction to reset the registration in the storage unit is input to the head-mounted display.
  6.  The head-mounted display according to claim 1, further comprising third determination means for determining the correspondence between the first movement acquired by the second acquisition means after the reception means has received the identification information and the first feature amount registered in the storage unit in association with the identified identification information,
     wherein the processing means controls the first process corresponding to the first movement newly acquired by the second acquisition means when the third determination means determines that the first movement corresponds to the first feature amount.
  7.  A computer program readable by a control unit that controls a head-mounted display, the computer program causing the control unit to function as:
     first acquisition means for acquiring external-image data representing an external image, captured by an imaging unit, that includes a predetermined part of a subject;
     second acquisition means for acquiring, according to the external-image data acquired by the first acquisition means, a first movement of the predetermined part corresponding to a first process;
     processing means for controlling the first process corresponding to the first movement acquired by the second acquisition means;
     registration means for registering, in association with each other in a storage unit, identification information identifying the subject having the predetermined part and a first feature amount representing a feature of the first movement; and
     reception means for receiving input of the identification information,
     wherein the registration means updates the first feature amount registered in association with the identification information received by the reception means, that is, the first feature amount associated with the identified identification information, according to a second feature amount representing a feature of the first movement acquired by the second acquisition means after the reception means has received the identification information.
PCT/JP2013/059503 2012-03-29 2013-03-29 Head-mounted display and computer program WO2013147147A1 (en)

Applications Claiming Priority (2)

Application Number Priority Date Filing Date Title
JP2012078043A JP2013206411A (en) 2012-03-29 2012-03-29 Head-mounted display and computer program
JP2012-078043 2012-03-29

Publications (1)

Publication Number Publication Date
WO2013147147A1 true WO2013147147A1 (en) 2013-10-03

Family

ID=49260389

Family Applications (1)

Application Number Title Priority Date Filing Date
PCT/JP2013/059503 WO2013147147A1 (en) 2012-03-29 2013-03-29 Head-mounted display and computer program

Country Status (2)

Country Link
JP (1) JP2013206411A (en)
WO (1) WO2013147147A1 (en)

Families Citing this family (3)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
TW201518979A (en) * 2013-11-15 2015-05-16 Utechzone Co Ltd Handheld eye-controlled ocular device, password input device and method, computer-readable recording medium and computer program product
JP2015210797A (en) * 2014-04-30 2015-11-24 シャープ株式会社 Display divice
US20160178906A1 (en) * 2014-12-19 2016-06-23 Intel Corporation Virtual wearables

Citations (5)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
JP2000148381A (en) * 1998-11-05 2000-05-26 Telecommunication Advancement Organization Of Japan Input image processing method, input image processor and recording medium on which input image processing program has been recorded
JP2002251235A (en) * 2001-02-23 2002-09-06 Fujitsu Ltd User interface system
JP2004054580A (en) * 2002-07-19 2004-02-19 Sharp Corp Information processing method, information processing device, information processing program, and recording medium
JP2004157602A (en) * 2002-11-01 2004-06-03 Toshiba Corp Apparatus and method for person recognition and passage controller
JP2005056059A (en) * 2003-08-01 2005-03-03 Canon Inc Input device and method using head mounting type display equipped with image pickup part

Family Cites Families (1)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
JP2009098676A (en) * 2007-09-28 2009-05-07 Nikon Corp Control device and head mount display

Also Published As

Publication number Publication date
JP2013206411A (en) 2013-10-07

Similar Documents

Publication Publication Date Title
WO2013146862A1 (en) Head-mounted display and computer program
US10643390B2 (en) Head mounted display, method for controlling head mounted display, and computer program
JP5423716B2 (en) Head mounted display
US9792710B2 (en) Display device, and method of controlling display device
US20170161957A1 (en) Information processing apparatus, display apparatus, information processing method, and program
JP6339887B2 (en) Image display device
JP5304329B2 (en) Head mounted display device, image control method, and image control program
JP6094305B2 (en) Head-mounted display device and method for controlling head-mounted display device
JP2016031761A (en) Spectacle type terminal
JP2013250849A (en) Guidance display system, guidance display device, guidance display method, and guidance display program
JP2010139901A (en) Head mount display
JP2016206617A (en) Display system
JP2017067876A (en) Head-mounted display, method for controlling head-mounted display, and computer program
WO2013147147A1 (en) Head-mounted display and computer program
JP6776578B2 (en) Input device, input method, computer program
JP2018084886A (en) Head mounted type display device, head mounted type display device control method, computer program
US10884498B2 (en) Display device and method for controlling display device
JP2017146726A (en) Movement support device and movement support method
JP6319220B2 (en) Transparent wearable terminal, data processing apparatus, and data processing system
JP6801329B2 (en) Image forming device, information processing device and information processing system
JP2023544107A (en) Optical stylus for optical positioning devices
JP2017182460A (en) Head-mounted type display device, method for controlling head-mounted type display device, and computer program
JP2018042004A (en) Display device, head-mounted type display device, and method for controlling display device
JP6015691B2 (en) Image display device and program
JP5979110B2 (en) Head mounted display and control program

Legal Events

Date Code Title Description
121 Ep: the epo has been informed by wipo that ep was designated in this application

Ref document number: 13769689

Country of ref document: EP

Kind code of ref document: A1

122 Ep: pct application non-entry in european phase

Ref document number: 13769689

Country of ref document: EP

Kind code of ref document: A1

NENP Non-entry into the national phase

Ref country code: DE