JP2013254251A - Head-mounted display device, control method, and program - Google Patents

Head-mounted display device, control method, and program

Info

Publication number
JP2013254251A
JP2013254251A (application number JP2012128054A)
Authority
JP
Japan
Prior art keywords
operation
motion
map
head
mounted display
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Pending
Application number
JP2012128054A
Other languages
Japanese (ja)
Inventor
Satoshi Shimomura
Original Assignee
Nec System Technologies Ltd
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Nec System Technologies Ltd
Priority to JP2012128054A
Publication of JP2013254251A
Application status: Pending

Abstract

PROBLEM TO BE SOLVED: To provide a head-mounted display device, a control method, and a program capable of executing processing in accordance with the motion characteristics of each user.SOLUTION: The head-mounted display device, worn on a user's head, includes a sensor 4, an operation execution unit 55, and a motion learning unit 54. The sensor 4 detects the movement of the user's head. The operation execution unit 55 determines whether the detected motion exists on a motion map in which preset motions and operations are associated with each other; when it determines that the detected motion exists, the operation execution unit 55 identifies the operation corresponding to the detected motion and executes processing according to the identified operation. When processing is executed by the operation execution unit 55 and a detected motion was determined, before that execution, not to exist on the map, the motion learning unit 54 updates the motion map using the motion that was the target of that non-existence determination.

Description

  The present invention relates to a head mounted display device mounted on a user's head, a control method, and a program.

  Conventionally, a head-mounted display device is operated using a controller, for example when scrolling the screen shown on the display of the head-mounted display device.

  Here, since the head-mounted display device is used while being worn by the user, the controller included in the head-mounted display device is small. Further, since the controller is small, the number of buttons or switches arranged on the controller is also limited.

  A small controller, however, invites frequent erroneous operations, and when few buttons or switches can be arranged on it, the number of possible operations is also limited. To address these problems, techniques have been proposed for operating the head-mounted display device through movements of the head of the user wearing it.

  For example, in the head-mounted display device disclosed in Patent Literature 1, a head motion detection device detects the movement of the user's head with a sensor. The change in acceleration of the user's head detected by the head motion detection device is sent to a central control device, where a motion input process collates the acceleration change pattern with patterns stored in a storage device. If the detected acceleration change pattern matches a stored pattern, or falls within a predetermined threshold range of one, it is determined that the user has moved the head, and the corresponding processing is executed.

Japanese Patent Laid-Open No. 2004-233909

  In the head-mounted display device of Patent Literature 1, however, the patterns stored in the storage device are common to all users and are not matched to each user's motion characteristics (the user's movement habits). For this reason, depending on the user, there is a problem that processing is not performed as expected even when the head is moved.

  Accordingly, an object of the present invention is to provide a head mounted display device, a control method, and a program capable of executing processing in accordance with operation characteristics for each user.

In order to achieve the above object, a head mounted display device according to one aspect of the present invention is a head mounted display device mounted on a user's head,
A sensor for detecting movement of the user's head;
an operation execution unit that determines whether or not the detected motion exists on a motion map in which preset motions and operations are associated with each other and, when it is determined that the detected motion exists, identifies the operation corresponding to the detected motion and executes processing according to the identified operation; and
a motion learning unit that, when processing has been executed by the operation execution unit and a detected motion was determined not to exist before that execution, updates the motion map using the motion that was the target of that non-existence determination.
The device is characterized by comprising the above.

In order to achieve the above object, a control method according to one aspect of the present invention is a control method for a head mounted display device mounted on a user's head,
(A) causing a sensor attached to the head mounted display to detect the movement of the user's head;
(B) determining whether or not the motion detected in the step (a) exists on a motion map in which a preset motion and operation are associated with each other;
(C) a step of, when it is determined in the step (b) that the motion detected in the step (a) exists, identifying the operation corresponding to the detected motion and executing processing according to the identified operation; and
(D) a step of, when processing has been executed in the step (c) and a detected motion was determined not to exist before that execution, updating the motion map using the motion that was the target of that non-existence determination.
The method is characterized by including the above steps.

Furthermore, in order to achieve the above object, a program according to one aspect of the present invention is a program for executing control of a head mounted display device mounted on a user's head by a processor.
(A) causing a sensor attached to the head mounted display to detect the movement of the user's head;
(B) determining whether or not the motion detected in the step (a) exists on a motion map in which a preset motion and operation are associated with each other;
(C) a step of, when it is determined in the step (b) that the motion detected in the step (a) exists, identifying the operation corresponding to the detected motion and executing processing according to the identified operation; and
(D) a step of, when processing has been executed in the step (c) and a detected motion was determined not to exist before that execution, updating the motion map using the motion that was the target of that non-existence determination
are executed by the processor.

  As described above, according to the head-mounted display device, the control method for the head-mounted display device, and the program according to the present invention, processing can be executed in accordance with the operation characteristics for each user.

FIG. 1 is a functional block diagram showing the configuration of a head-mounted display device according to an embodiment of the present invention.
FIG. 2 is a schematic view showing the appearance of the head-mounted display device according to the embodiment of the present invention.
FIG. 3 is a diagram showing an example of a motion map of the head-mounted display device according to the embodiment of the present invention.
FIG. 4 is a flowchart showing the operation procedure when the head-mounted display device according to the embodiment of the present invention executes a motion map creation or reading process.
FIG. 5 is a flowchart showing the operation of the head-mounted display device according to the embodiment of the present invention in the normal mode.
FIG. 6 is a flowchart showing the operation of the head-mounted display device according to the embodiment of the present invention in the motion characteristic registration mode.
FIG. 7 is a diagram showing an example of a motion map of the head-mounted display device according to the embodiment of the present invention after updating.

(Embodiment)
Hereinafter, a head mounted display device according to an embodiment of the present invention will be described with reference to the drawings.

[Head-mounted display device]
First, the configuration of the head mounted display device according to the present embodiment will be described with reference to FIG. FIG. 1 is a functional block diagram showing a configuration of a head mounted display device according to an embodiment of the present invention.

  As shown in FIG. 1, the head mounted display device 1 in the present embodiment includes a sensor 4, an operation execution unit 55, and an action learning unit 54. The sensor 4 is for detecting the movement of the user's head.

  The operation execution unit 55 determines whether or not the detected operation exists on the operation map in which the preset operation and the operation are associated with each other. Further, when it is determined that the detected action exists, the operation execution unit 55 specifies an operation corresponding to the detected action, and executes a process corresponding to the specified operation.

  When processing has been executed by the operation execution unit 55 and a detected motion was determined, before that execution, not to exist on the motion map, the motion learning unit 54 updates the motion map using the motion that was the target of that non-existence determination.

  In the head-mounted display device 1 configured as described above, the motion learning unit 54 updates the motion map using motions that the operation execution unit 55 determined were not present on it, so that the map comes to match the user's motion characteristics. As a result, the head-mounted display device 1 can execute processing in accordance with those characteristics.

  Here, the configuration of the head mounted display device 1 will be described more specifically with reference to the drawings. FIG. 2 is a schematic view showing the appearance of the head mounted display device according to the embodiment of the present invention.

  As shown in FIG. 2, in the present embodiment the head-mounted display device 1 includes a display 2 that outputs images to the user and a mounting unit 3 for mounting the display 2 on the user's head. The head-mounted display device 1 further includes a sensor 4 attached to the mounting unit 3 or the display 2, and a controller 5 connected to the display 2 and the sensor 4 by wire or wirelessly.

  The display 2 is disposed in front of at least one eye of the user wearing the head mounted display device 1 and provides the user with various images such as a selection screen and a menu screen.

  The mounting portion 3 is formed in a substantially U shape so as to conform to the shape of the user's head, and can be adjusted so as to expand and contract to fit the size of the user's head.

  The sensor 4 detects the movement of the head of the user wearing the head-mounted display device 1. Specifically, the sensor 4 detects through how many degrees the user's head has rotated up, down, right, or left relative to the front-facing state, together with the acceleration at that time. As the sensor 4, an acceleration sensor 4a for detecting acceleration and an inclination sensor 4b for detecting inclination can be used; a single sensor that can detect both acceleration and inclination may also be used. The acceleration sensor 4a and the inclination sensor 4b are collectively referred to as the sensor 4.

  The controller 5 includes a cross key 50a for inputting an instruction from the user, and three buttons 50b including an A button, a B button, and a C button. The cross key 50a includes an upper button, a lower button, a right button, and a left button.

  As shown in FIG. 1, the head-mounted display device 1 includes the controller 5, a display unit 51, a motion map selection unit 52, an input reception unit 53, a motion learning unit 54, an operation execution unit 55, a motion map storage unit 56, and a temporary data storage unit 57.

  The motion map storage unit 56 stores a motion map for each user and a default motion map. As shown in FIG. 3, each per-user motion map stored in the motion map storage unit 56 has a user ID identifying the user who uses the map, together with correspondence information between the user's head motions and operations. As head motions, the motion map holds a motion direction, a numerical range of acceleration, and a numerical range of inclination. In the present embodiment, the motion map further includes a button-pressing action; that is, the motion map according to the present embodiment stores correspondence information associating the combination of the user's head motion and button-pressing action with an operation.

  The motion directions are up, down, right, and left. For example, when the user rotates the head upward around the horizontal axis so as to face up, the motion direction is "up"; when the user rotates the head downward around the horizontal axis so as to face down, the motion direction is "down". Likewise, when the user rotates the head to the right around the vertical axis so as to face right, the motion direction is "right", and when the head is rotated to the left around the vertical axis so as to face left, the motion direction is "left".

  The acceleration indicates the acceleration with which the user rotates the head. The inclination indicates through how many degrees the user has rotated the head from the front-facing state.

  As the button-pressing action, the motion map holds information on whether a button is to be pressed and, if so, which button 50b. In the motion map of FIG. 3, "−" indicates that no button is pressed, and "A", "B", and "C" indicate that the corresponding button 50b is pressed.

An operation may correspond not only to a single motion but also to a sequence of two motions. For example, according to the motion map of FIG. 3, the operation "cancel" corresponds to rotating the head 40 to 90 degrees to the right at an acceleration of 2 to 5 m/s², and then rotating it 40 to 90 degrees to the left at 2 to 5 m/s². The interval between the first motion and the second motion can be, for example, within one second.
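As an illustration only, the map structure described above could be represented as follows. The patent does not specify a data format, so all field names (user_id, direction, accel, angle, button, op) are hypothetical:

```python
# Hypothetical representation of the per-user motion map of FIG. 3.
# Single motions use a direction string; a two-motion operation such
# as "cancel" is written as a list of directions.
motion_map = {
    "user_id": "user01",
    "entries": [
        # turn right, no button pressed -> scroll the screen right
        {"direction": "right", "accel": (2.0, 5.0), "angle": (40, 90),
         "button": None, "op": "scroll_right"},
        # motion combined with pressing the A button
        {"direction": "up", "accel": (2.0, 5.0), "angle": (30, 60),
         "button": "A", "op": "select"},
        # two-motion operation: right, then left, within one second
        {"direction": ["right", "left"], "accel": (2.0, 5.0),
         "angle": (40, 90), "button": None, "op": "cancel"},
    ],
}
```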

  The default action map stored in the action map storage unit 56 is an action map that is copied and used when a new user starts use, and stores default data.

  The temporary data storage unit 57 is an area for temporarily storing the acceleration and inclination acquired by the operation execution unit 55 and the acceleration and inclination acquired by the motion learning unit 54.

  The display unit 51 displays a user interface such as a selection screen or a menu screen on the display 2 or displays various types of information. When the user who uses the head mounted display device inputs an instruction via various buttons of the controller 5, the input receiving unit 53 receives the input.

  The motion map selection unit 52 selects a specific motion map from the per-user motion maps stored in the motion map storage unit 56 in accordance with an instruction from the input reception unit 53. For a new user, the motion map selection unit 52 copies the default motion map to create a new motion map.

  In the present embodiment, the operation execution unit 55 acquires the head motion detected by the sensor 4 and, based on the motion map, executes the processing for the operation corresponding to that motion. When the motion acquired from the sensor 4 does not exist on the motion map, the operation execution unit 55 stores the motion in the temporary data storage unit 57 as a failed motion.

  The motion learning unit 54 has a function of updating the motion map based on the user's motion characteristics.

[Operation of head mounted display device]
Next, the operation of the head mounted display device according to the embodiment of the present invention will be described with reference to FIGS. 4 to 7 with appropriate reference to FIGS. In the present embodiment, since the head mounted display device control method is implemented by operating the head mounted display device, the description of the control method in the present embodiment is replaced with the following description of the head mounted display device operation.

  First, an operation when the head mounted display device 1 according to the embodiment of the present invention creates or reads an operation map will be described with reference to FIG. FIG. 4 is a flowchart showing an operation procedure when the head-mounted display device according to the embodiment of the present invention executes an operation map creation process or a read process.

  As shown in FIG. 4, when the head-mounted display device 1 is activated by the user pressing the power button or the like, the display unit 51 first displays a selection screen on the display 2 of the head-mounted display device 1 (step S1). Specifically, the display unit 51 displays a selection screen on which the user chooses whether he or she is a new user or a registered user.

  Next, when the user selects either "new user" or "registered user" using the cross key 50a and the buttons 50b of the controller, the input reception unit 53 receives input of selection information indicating which was selected (step S2). The input reception unit 53 then outputs the selection information to the motion map selection unit 52.

  Next, the motion map selection unit 52 determines whether the user currently using the head mounted display device 1 is a new user based on the selection information received from the input reception unit 53 (step S3).

  When determining that the user currently using the head mounted display device 1 is a new user (Yes in Step S3), the operation map selection unit 52 newly registers an operation map for the user (Step S4). Specifically, the motion map selection unit 52 creates a new motion map by copying the default motion map stored in the motion map storage unit 56. Then, the motion map selection unit 52 registers the user ID of the user currently using the head mounted display device 1 in the motion map in order to identify the newly created motion map. Thereby, an operation map for a new user is newly registered.
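The new-user registration of step S4 amounts to copying the default map and tagging it with a user ID. A minimal sketch, assuming maps are plain dictionaries kept in an in-memory store (the storage layout and names are assumptions, not from the patent):

```python
import copy

def register_new_user(map_storage, default_map, user_id):
    """Copy the default motion map and register it under the new
    user's ID (step S4). A sketch; a real device would persist this."""
    new_map = copy.deepcopy(default_map)  # deep copy so later learning
    new_map["user_id"] = user_id          # never alters the default map
    map_storage[user_id] = new_map
    return new_map

default_map = {"user_id": None,
               "entries": [{"direction": "right", "accel": (2.0, 5.0),
                            "angle": (40, 90), "op": "scroll_right"}]}
storage = {}
user_map = register_new_user(storage, default_map, "user01")
user_map["entries"][0]["angle"] = (35, 90)  # per-user adjustment
```

The deep copy matters: per-user learning must not leak back into the default map that future new users will start from.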

  The motion map selection unit 52 then sets the motion map newly registered in step S4 as the motion map of the user currently using the head-mounted display device 1 and passes it to the motion learning unit 54 and the operation execution unit 55 (step S5).

  On the other hand, when the motion map selection unit 52 determines that the user currently using the head-mounted display device 1 is a registered user (No in step S3), the display unit 51 displays a list screen of the user IDs of registered users on the display 2 (step S6). The display unit 51 can also display a list of user names associated with the user IDs on the display 2.

  When the user selects his or her user ID from the list displayed on the display 2 using the cross key 50a and the buttons 50b of the controller, the input reception unit 53 receives input of the selected user ID (step S7) and outputs information on the received user ID to the motion map selection unit 52.

  Next, based on the information regarding the user ID received from the input reception unit 53, the operation map selection unit 52 reads the operation map corresponding to the user ID from the operation map storage unit 56 (step S8).

  Then, the motion map selection unit 52 passes the motion map read from the motion map storage unit 56 to the motion learning unit 54 and the operation execution unit 55 (step S9).

  Next, the operation of the head-mounted display device in the "normal mode" will be described. The normal mode is the mode in which the head-mounted display device executes processing based on the movement of the user's head. In addition to the "normal mode", the head-mounted display device 1 according to the present embodiment has a second mode, the "motion characteristic registration mode", described in detail later.

  FIG. 5 is a flowchart showing the operation in the normal mode of the head mounted display device according to the embodiment of the present invention. The operation of the head-mounted display device in the normal mode is performed at least after the above-described head-mounted display device executes the operation map creation process or reading process.

  As shown in FIG. 5, the display unit 51 first displays on the display 2 a menu screen on which the user can select whether to operate the head-mounted display device 1 in the "normal mode" or the "motion characteristic registration mode" (step S11).

  When the user selects the normal mode using the cross key 50a and the button 50b of the controller, the input receiving unit 53 receives input of selection information indicating that the user has selected the normal mode (step S12). Then, the input reception unit 53 outputs information indicating that the operation is performed in the normal mode to the operation learning unit 54 and the operation execution unit 55. Thereby, the operation learning unit 54 and the operation execution unit 55 operate in the normal mode. The operations of the motion learning unit 54 and the operation execution unit 55 in the normal mode will be described below.

  First, the user performs a head motion and, where applicable, a button operation. For example, when the user wants to scroll the screen displayed on the display 2 to the right, the user rotates the head to the right around the vertical axis, as if turning to the right, without pressing any button on the controller. The sensor 4 then detects that the motion direction is right, the acceleration of the motion, and the angle through which the head rotated around the vertical axis, taking the front-facing state as 0 degrees. The operation execution unit 55 thereby acquires the motion direction, acceleration, and angle as the head motion via the sensor 4 (step S13).

  The operation execution unit 55 determines whether the motion specified by the motion direction, acceleration, and angle acquired from the sensor 4 exists on the user's motion map (step S14). That is, the operation execution unit 55 determines whether there is a motion on the motion map whose motion direction matches the one acquired from the sensor 4 and whose numerical ranges contain the acquired acceleration and angle. When the user also presses a button, the operation execution unit 55 further determines whether the button-pressing action acquired from the input reception unit 53 matches the button-pressing action on the motion map.

  When it is determined that the acquired motion direction, acceleration, and angle do not exist on the motion map (No in step S14), the operation execution unit 55 stores the acquired motion direction, acceleration, and angle in the temporary data storage unit 57 as a failed motion (step S15), and does not execute any processing for the motion.

  On the other hand, when the operation execution unit 55 determines that there is a motion on the motion map whose direction matches and whose ranges contain the acquired acceleration and angle (Yes in step S14), it identifies the operation corresponding to that motion based on the motion map and executes the processing for that operation (step S16).
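Steps S14 and S16 amount to a range lookup over the map. A minimal sketch under the hypothetical map layout shown earlier (single motions only; the two-motion sequences of FIG. 3 are omitted):

```python
def find_operation(motion_map, direction, accel, angle, button=None):
    """Return the operation matching the detected motion, or None when
    the motion does not exist on the map (steps S14/S16). A sketch;
    two-motion sequences are not handled here."""
    for entry in motion_map["entries"]:
        lo_a, hi_a = entry["accel"]
        lo_g, hi_g = entry["angle"]
        if (entry["direction"] == direction
                and lo_a <= accel <= hi_a
                and lo_g <= angle <= hi_g
                and entry.get("button") == button):
            return entry["op"]
    return None  # a failed motion: stored for later learning

sample_map = {"entries": [{"direction": "right", "accel": (2.0, 5.0),
                           "angle": (40, 90), "button": None,
                           "op": "scroll_right"}]}
hit = find_operation(sample_map, "right", 3.0, 60)   # within both ranges
miss = find_operation(sample_map, "right", 1.0, 60)  # acceleration too low
```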

  After executing the processing in step S16, the operation execution unit 55 determines whether a failed motion is stored in the temporary data storage unit 57 (step S17). When the operation execution unit 55 determines that no failed motion is stored there (No in step S17), the operation ends.

  On the other hand, when the operation execution unit 55 determines that a failed motion is stored in the temporary data storage unit 57 (Yes in step S17), it outputs the data on the failed motion stored in the temporary data storage unit 57 to the motion learning unit 54 (step S18).

  The motion learning unit 54 updates the motion map using the failed motion received from the operation execution unit 55 (step S19). Specifically, the motion learning unit 54 updates the motion map so that the processing executed in step S16 described above could also have been triggered by the failed motion; for example, it widens the numerical range of acceleration or angle in the motion map so that the failed motion is included.

  The motion learning unit 54 may update the motion map using a failed motion only when predetermined conditions are satisfied. For example, the motion learning unit 54 determines whether the motion direction and button-pressing action of the failed motion match those of the corresponding motion on the motion map; if they do not match, the motion learning unit 54 does not update the motion map.

  If the motion direction and button-pressing action match, the motion learning unit 54 next extracts whichever of the failed motion's acceleration and angle is not within the corresponding numerical range of the motion-map motion. When the acceleration is extracted, the motion learning unit 54 determines whether the failed motion's acceleration is within a predetermined range of the upper or lower limit of the motion map's acceleration range. If so, the motion learning unit 54 updates the motion map by expanding the acceleration range to include the failed motion's acceleration: when the failed motion's acceleration falls below the lower limit of the range, it becomes the new lower limit, and when it exceeds the upper limit, it becomes the new upper limit.

  The motion learning unit 54 performs the same processing when the angle is extracted. That is, it determines whether the failed motion's angle is within a predetermined range of the upper or lower limit of the motion map's angle range and, if so, expands the angle range to include it: when the failed motion's angle falls below the lower limit, it becomes the new lower limit of the angle range, and when it exceeds the upper limit, it becomes the new upper limit of the angle range.
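The range-widening described in the last three paragraphs can be sketched as follows. The margin values are illustrative assumptions, since the patent says only "a predetermined range":

```python
def learn_from_failure(entry, failed_accel, failed_angle,
                       accel_margin=1.0, angle_margin=10.0):
    """Widen a motion-map entry's numeric ranges to include a failed
    motion, but only when the failed value lies within a predetermined
    margin of the existing limit (step S19). Margins are assumptions."""
    lo, hi = entry["accel"]
    if failed_accel < lo and lo - failed_accel <= accel_margin:
        entry["accel"] = (failed_accel, hi)   # new lower limit
    elif failed_accel > hi and failed_accel - hi <= accel_margin:
        entry["accel"] = (lo, failed_accel)   # new upper limit
    lo, hi = entry["angle"]
    if failed_angle < lo and lo - failed_angle <= angle_margin:
        entry["angle"] = (failed_angle, hi)
    elif failed_angle > hi and failed_angle - hi <= angle_margin:
        entry["angle"] = (lo, failed_angle)
    return entry

entry = {"accel": (2.0, 5.0), "angle": (40, 90)}
learn_from_failure(entry, 1.5, 35)     # within both margins: ranges widen
stubborn = {"accel": (2.0, 5.0), "angle": (40, 90)}
learn_from_failure(stubborn, 0.5, 15)  # outside both margins: unchanged
```

The margin check is what keeps the map from drifting toward arbitrary motions: only near-misses of the user's habitual motion widen the ranges.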

  After updating the motion map, the motion learning unit 54 deletes the data stored in the temporary data storage unit 57 (step S20). The data stored in the temporary data storage unit 57 is also deleted when the motion learning unit 54 does not update the motion map because the failed motion did not satisfy the predetermined conditions described above.

  By repeating steps S13 to S20 above, the numerical ranges of acceleration and angle in the user's motion map come to match the user's motion characteristics, improving accuracy during use. If failed motions occur twice in succession, the motion learning unit 54 may update the motion map using both failed motions, or using only the most recent one.

  Next, the operation of the head mounted display device according to the present embodiment in the “operation characteristic registration mode” will be described with reference to the drawings. The “motion characteristic registration mode” is a mode for registering the motion characteristics of the user's head for each operation.

  FIG. 6 is a flowchart showing the operation of the head-mounted display device according to the embodiment of the present invention in the motion characteristic registration mode. The operation in this mode is preferably performed at least after the head-mounted display device has executed the motion map creation or reading process described above. It may be performed before, after, or between operations in the normal mode described above.

  As shown in FIG. 6, the display unit 51 first displays on the display 2 a menu screen on which the user can select whether to operate the head-mounted display device 1 in the "motion characteristic registration mode" or the "normal mode" (step S21).

  When the user selects the motion characteristic registration mode using the cross key 50a and the buttons 50b of the controller, the input reception unit 53 receives input of selection information indicating that the user has selected the motion characteristic registration mode (step S22). The input reception unit 53 then outputs information indicating that operation is to proceed in the motion characteristic registration mode to the motion learning unit 54 and the operation execution unit 55, which accordingly operate in that mode. Their operations in the motion characteristic registration mode are described below.

  First, the display unit 51 displays a list of the operations in the motion map on the display 2 (step S23). Using the cross key 50a and the button 50b of the controller 5, the user selects, from the list displayed on the display 2, one operation for which motion characteristics are to be registered.

  The input receiving unit 53 receives input of information specifying the operation selected by the user (step S24), and outputs that information to the motion learning unit 54.

  Subsequently, when the user performs a head motion corresponding to the selected operation, the motion learning unit 54 acquires from the sensor 4 a motion direction, an acceleration, and an angle as the motion of the user's head (step S25).

  The motion learning unit 54 stores the motion direction, acceleration, and angle acquired from the sensor 4 in the temporary data storage unit 57 (step S26). Preferably, the processes in steps S25 and S26 are performed several times to accumulate a plurality of data in the temporary data storage unit 57.

  The motion learning unit 54 then determines whether any of the accelerations and angles stored in the temporary data storage unit 57 exceeds the numerical ranges of acceleration and angle defined for the motion-map motion corresponding to the operation selected by the user in step S24 (step S27).

  When it determines that no stored acceleration or angle exceeds the numerical ranges of acceleration and angle of the motion map (No in step S27), the motion learning unit 54 ends the processing without updating the motion map.

  On the other hand, when an acceleration or angle stored in the temporary data storage unit 57 exceeds the numerical range of acceleration or angle of the motion map (Yes in step S27), the motion learning unit 54 updates the motion map (step S28). Specifically, the motion learning unit 54 corrects the numerical range of acceleration or angle in the motion map so that the stored acceleration or angle falls within it. By repeating this registration of motion characteristics, the numerical ranges of the user's motion map come to match the user's motion characteristics, improving recognition accuracy in use.
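The check-and-widen update of steps S27 and S28 can be sketched as below. This is a minimal illustration under assumed names and data layout, not the patent's actual implementation: ranges are (min, max) pairs and samples are the (acceleration, angle) pairs accumulated in the temporary storage.

```python
def update_ranges(accel_range, angle_range, samples):
    """Widen (min, max) ranges so that every recorded sample falls inside.

    `samples` is a list of (acceleration, angle) pairs accumulated in the
    temporary storage (steps S25-S26).  If no sample exceeds the current
    ranges (step S27: No), the ranges come back unchanged; otherwise the
    exceeded bound is moved outward to cover the sample (step S28).
    """
    a_lo, a_hi = accel_range
    g_lo, g_hi = angle_range
    for accel, angle in samples:
        a_lo, a_hi = min(a_lo, accel), max(a_hi, accel)
        g_lo, g_hi = min(g_lo, angle), max(g_hi, angle)
    return (a_lo, a_hi), (g_lo, g_hi)
```

For example, with an angle range of 40 to 90 degrees, samples at 38 and 95 degrees widen the range to 38 to 95 degrees, while an acceleration range that already covers all samples is left untouched.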

[Concrete example]
Next, a specific example of the operation of the head-mounted display device in the normal mode will be described. In this specific example, the motion map of the user wearing the head-mounted display device is assumed to be the same as that of FIG. 3.

First, in order to scroll the screen displayed on the display 2 to the right, the user wearing the head-mounted display device 1 rotates his or her head to face right. Assume that the motion of the head at this time has a motion direction of right, an acceleration of 3 m/s², and an angle of 38 degrees.

  The sensor 4 then detects a motion direction, an acceleration, and an angle as the motion of the user's head, and the operation execution unit 55 acquires these data from the sensor 4 (step S13).

  Next, the operation execution unit 55 determines whether or not the motion specified by the acquired motion direction, acceleration, and angle exists on the motion map (step S14).

  The operation execution unit 55 determines that the motion specified by the acquired motion direction, acceleration, and angle does not exist on the motion map of FIG. 3 (No in step S14). Specifically, in the information specifying the acquired motion, the angle of 38 degrees does not fall within the numerical range of the angle of any motion on the motion map. For this reason, the operation execution unit 55 determines that the specified motion does not exist on the motion map.
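The lookup of steps S13 and S14 amounts to checking the detected (direction, acceleration, angle) triple against each map entry's numerical ranges. The sketch below mirrors the example's "right" entry (2 to 5 m/s², 40 to 90 degrees); the data structure and names are assumptions for illustration, not the patent's implementation.

```python
from dataclasses import dataclass

@dataclass
class MapEntry:
    direction: str     # e.g. "right"
    accel_range: tuple # (min, max) acceleration in m/s^2
    angle_range: tuple # (min, max) angle in degrees
    operation: str     # operation to execute on a match

# Motion map mirroring the example's third entry from FIG. 3.
MOTION_MAP = [
    MapEntry("right", (2.0, 5.0), (40.0, 90.0), "scroll right"),
]

def find_operation(direction, accel, angle):
    """Return the mapped operation, or None if the motion is not on the map."""
    for e in MOTION_MAP:
        if (e.direction == direction
                and e.accel_range[0] <= accel <= e.accel_range[1]
                and e.angle_range[0] <= angle <= e.angle_range[1]):
            return e.operation
    return None
```

With this map, 38 degrees falls outside 40 to 90 degrees and the first attempt yields no operation (step S14: No), while 42 degrees falls inside and the second attempt matches (step S14: Yes).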

  Since the acquired motion does not exist on the motion map, the operation execution unit 55 stores it in the temporary data storage unit 57 as a tentative (failed) motion (step S15).

Because the head motion did not scroll the screen displayed on the display 2 to the right, the user again rotates his or her head to face right. Assume that the motion of the head at this time has a motion direction of right, an acceleration of 3 m/s², and an angle of 42 degrees.

  The sensor 4 detects the motion direction, acceleration, and angle, and the operation execution unit 55 acquires the motion direction, acceleration, and angle detected by the sensor 4 (step S13).

  The operation execution unit 55 then determines whether or not the motion specified by the acquired motion direction, acceleration, and angle exists on the motion map (step S14). Since this motion exists on the motion map, corresponding to the third motion from the top in FIG. 3, the operation execution unit 55 determines that the acquired motion exists on the motion map (Yes in step S14).

  Next, the operation execution unit 55 identifies the operation corresponding to the motion based on the motion map of FIG. 3 and executes the processing corresponding to that operation (step S16). That is, since the operation corresponding to the above motion is "right", the operation execution unit 55 scrolls the screen of the display 2 to the right by executing the processing corresponding to the operation "right".

Subsequently, the operation execution unit 55 determines whether or not a failed motion is stored in the temporary data storage unit 57 (step S17). As a result of step S15 above, the temporary data storage unit 57 stores a failed motion whose motion direction is right, whose acceleration is 3 m/s², and whose angle is 38 degrees, so the operation execution unit 55 determines that a failed motion is stored (Yes in step S17).

  Next, the operation execution unit 55 outputs the motion direction, acceleration, and angle regarding the failed motion to the motion learning unit 54 (step S18).

  The motion learning unit 54 updates the motion map of FIG. 3 using the motion direction, acceleration, and angle of the failed motion (step S19). Specifically, the motion learning unit 54 first determines whether or not the failed motion satisfies a condition for updating the motion map, and updates the motion map when it is determined that the condition is satisfied.

First, the motion learning unit 54 compares the motion for which the processing was executed in step S16, that is, the third motion from the top of FIG. 3, with the failed motion. The motion learning unit 54 determines that the motion direction ("right") and the button state (button not pressed) match between the failed motion and the motion-map motion. Further, since the acceleration of the failed motion is 3 m/s², it determines that this too is within the numerical range of acceleration on the motion map (2 to 5 m/s²). The motion learning unit 54 therefore determines that the conditions for updating the motion map are satisfied with respect to the motion direction, the acceleration, and the button state.

  The motion learning unit 54 then determines whether or not the angle of the failed motion is within a predetermined range from the upper and lower limits of the angle's numerical range defined in the motion map. For example, if the predetermined range is 10% of the numerical range, then since the numerical range is 40 to 90 degrees, 10% of it is 5 degrees. Therefore, if the angle of the failed motion is within the range of 35 to 95 degrees, the motion learning unit 54 determines that all conditions for updating the motion map are satisfied and updates the motion map so that the failed motion is included.

  Here, since the angle of the failed motion is 38 degrees, the motion learning unit 54 determines that the predetermined condition is satisfied, and corrects the numerical range of the angle in the motion map so that the 38-degree angle of the failed motion falls within it. For example, as shown in the portion surrounded by a thick line in FIG. 7, the motion learning unit 54 changes the numerical range of the angle of the third motion from the top of the motion map from 40 to 90 degrees to 38 to 90 degrees. FIG. 7 is a diagram showing an example of the motion map of the head-mounted display device according to the embodiment of the present invention after the update.
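The angle-learning rule of steps S19 and the 10% margin in this example can be sketched as follows. The function name and the margin parameter are assumptions for illustration; only the 10%-of-range window and the widen-to-include behavior come from the example above.

```python
def learn_from_failure(angle_range, failed_angle, margin_ratio=0.10):
    """Widen the map's angle range to cover a failed motion, but only when
    the failed angle lies within a margin (here 10% of the range width)
    outside the current bounds -- the 35 to 95 degree window around the
    40 to 90 degree range in the example.
    """
    lo, hi = angle_range
    margin = (hi - lo) * margin_ratio   # 50 degrees * 0.10 = 5 degrees
    if lo - margin <= failed_angle <= hi + margin:
        return (min(lo, failed_angle), max(hi, failed_angle))
    return (lo, hi)  # too far outside: leave the map unchanged
```

A failed angle of 38 degrees falls inside the 35 to 95 degree window, so the lower bound widens to 38; an angle of 30 degrees would be rejected and the range kept as-is.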

[Program]
In the present embodiment, the controller 5 can be realized by installing, in a one-chip microcomputer, a program that causes the microcomputer to execute steps S1 to S9 shown in FIG. 4, steps S11 to S20 shown in FIG. 5, and steps S21 to S28 shown in FIG. 6. In this case, the CPU of the one-chip microcomputer functions as, and performs the processing of, the display unit 51, the motion map selection unit 52, the input receiving unit 53, the motion learning unit 54, and the operation execution unit 55.

  As described above, according to the present embodiment, the motion learning unit 54 can update the motion map in accordance with the user's motion characteristics, so the head-mounted display device becomes easier to use each time it is used.

  Part or all of the above-described embodiments can be expressed by (Appendix 1) to (Appendix 18) described below, but is not limited to the following description.

(Appendix 1)
A head-mounted display device to be mounted on a user's head, the device comprising:
a sensor that detects a motion of the user's head;
an operation execution unit that determines whether or not the detected motion exists on a motion map in which preset motions and operations are associated with each other, and that, when determining that the detected motion exists, identifies the operation corresponding to the detected motion and executes processing according to the identified operation; and
a motion learning unit that, when the processing has been executed by the operation execution unit and a detected motion was determined not to exist on the motion map before the execution of the processing, updates the motion map using the motion that was the object of that determination of non-existence.

(Appendix 2)
The head-mounted display device according to appendix 1, wherein the motion learning unit updates the motion map so that the motion that was the object of the determination of non-existence comes to exist on the motion map.

(Appendix 3)
The head-mounted display device according to appendix 1 or 2, wherein the motion learning unit updates the motion map so that the processing executed by the operation execution unit will be executed in response to the motion that was determined not to exist before the execution of the processing.

(Appendix 4)
The head-mounted display device according to any one of appendices 1 to 3, wherein:
the sensor detects, as the motion of the user's head, values of acceleration and inclination for each motion direction;
the motion map associates an operation with a numerical range set for each of the acceleration and the inclination for each motion direction; and
the motion learning unit updates the motion map so that the acceleration and inclination values that were the object of the determination of non-existence fall within the numerical ranges of acceleration and inclination set in the motion map.

(Appendix 5)
The head-mounted display device according to any one of appendices 1 to 4, wherein the motion learning unit updates the motion map so that the processing executed by the operation execution unit will be executed in response to the motion that was the object of the determination of non-existence.

(Appendix 6)
The head-mounted display device according to any one of appendices 1 to 5, further comprising a motion map storage unit that stores the motion map, wherein the motion map storage unit stores a motion map for each user.

(Appendix 7)
A method for controlling a head-mounted display device mounted on a user's head, including the steps of:
(a) causing a sensor attached to the head-mounted display device to detect a motion of the user's head;
(b) determining whether or not the motion detected in step (a) exists on a motion map in which preset motions and operations are associated with each other;
(c) when it is determined in step (b) that the motion detected in step (a) exists, identifying the operation corresponding to the detected motion and executing processing according to the identified operation; and
(d) when the processing has been executed in step (c) and a detected motion was determined not to exist before the execution of the processing in step (c), updating the motion map using the motion that was the object of that determination of non-existence.

(Appendix 8)
The control method according to appendix 7, wherein in step (d) the motion map is updated so that the motion that was the object of the determination of non-existence comes to exist on the motion map.

(Appendix 9)
The control method according to appendix 7 or 8, wherein in step (d) the motion map is updated so that the processing executed in step (c) will be executed in response to the motion that was determined not to exist before the execution of the processing.

(Appendix 10)
The control method according to any one of appendices 7 to 9, wherein:
in step (a), values of acceleration and inclination for each motion direction are detected as the motion of the user's head;
the motion map associates an operation with a numerical range set for each of the acceleration and the inclination for each motion direction; and
in step (d), the motion map is updated so that the acceleration and inclination values that were the object of the determination of non-existence fall within the numerical ranges of acceleration and inclination set in the motion map.

(Appendix 11)
The control method according to any one of appendices 7 to 10, wherein in step (d) the motion map is updated so that the processing executed in step (c) will be executed in response to the motion that was the object of the determination of non-existence.

(Appendix 12)
The control method according to any one of appendices 7 to 11, wherein the motion map is stored in a motion map storage unit, and the motion map storage unit stores a motion map for each user.

(Appendix 13)
A program for causing a processor to execute control of a head-mounted display device mounted on a user's head, the program causing the processor to execute the steps of:
(a) causing a sensor attached to the head-mounted display device to detect a motion of the user's head;
(b) determining whether or not the motion detected in step (a) exists on a motion map in which preset motions and operations are associated with each other;
(c) when it is determined in step (b) that the motion detected in step (a) exists, identifying the operation corresponding to the detected motion and executing processing according to the identified operation; and
(d) when the processing has been executed in step (c) and a detected motion was determined not to exist before the execution of the processing in step (c), updating the motion map using the motion that was the object of that determination of non-existence.

(Appendix 14)
The program according to appendix 13, wherein in step (d) the motion map is updated so that the motion that was the object of the determination of non-existence comes to exist on the motion map.

(Appendix 15)
The program according to appendix 13 or 14, wherein in step (d) the motion map is updated so that the processing executed in step (c) will be executed in response to the motion that was determined not to exist before the execution of the processing.

(Appendix 16)
The program according to any one of appendices 13 to 15, wherein:
in step (a), values of acceleration and inclination for each motion direction are detected as the motion of the user's head;
the motion map associates an operation with a numerical range set for each of the acceleration and the inclination for each motion direction; and
in step (d), the motion map is updated so that the acceleration and inclination values that were the object of the determination of non-existence fall within the numerical ranges of acceleration and inclination set in the motion map.

(Appendix 17)
The program according to any one of appendices 13 to 16, wherein in step (d) the motion map is updated so that the processing executed in step (c) will be executed in response to the motion that was the object of the determination of non-existence.

(Appendix 18)
The program according to any one of appendices 13 to 17, wherein the motion map is stored in a motion map storage unit, and the motion map storage unit stores a motion map for each user.

  According to the present embodiment, processing can be executed in accordance with the user's motion characteristics. The embodiment is therefore useful for head-mounted display devices that execute processing based on head motions that reflect individual users' motion characteristics. Further, the controller 5 of the head-mounted display device 1 in the above embodiment can also be used as a remote controller for AV equipment or as a controller for a game device.

DESCRIPTION OF SYMBOLS: 1 head-mounted display device; 2 display; 4 sensor; 5 controller; 54 motion learning unit; 55 operation execution unit

Claims (8)

  1. A head-mounted display device to be mounted on a user's head, the device comprising:
    a sensor that detects a motion of the user's head;
    an operation execution unit that determines whether or not the detected motion exists on a motion map in which preset motions and operations are associated with each other, and that, when determining that the detected motion exists, identifies the operation corresponding to the detected motion and executes processing according to the identified operation; and
    a motion learning unit that, when the processing has been executed by the operation execution unit and a detected motion was determined not to exist on the motion map before the execution of the processing, updates the motion map using the motion that was the object of that determination of non-existence.
  2.   The head-mounted display device according to claim 1, wherein the motion learning unit updates the motion map so that the motion that was the object of the determination of non-existence comes to exist on the motion map.
  3.   The head-mounted display device according to claim 1 or 2, wherein the motion learning unit updates the motion map so that the processing executed by the operation execution unit will be executed in response to the motion that was determined not to exist before the execution of the processing.
  4. The head-mounted display device according to claim 1, wherein:
    the sensor detects, as the motion of the user's head, values of acceleration and inclination for each motion direction;
    the motion map associates an operation with a numerical range set for each of the acceleration and the inclination for each motion direction; and
    the motion learning unit updates the motion map so that the acceleration and inclination values that were the object of the determination of non-existence fall within the numerical ranges of acceleration and inclination set in the motion map.
  5.   The head-mounted display device according to any one of claims 1 to 4, wherein the motion learning unit updates the motion map so that the processing executed by the operation execution unit will be executed in response to the motion that was the object of the determination of non-existence.
  6. The head-mounted display device according to claim 1, further comprising a motion map storage unit that stores the motion map, wherein the motion map storage unit stores a motion map for each user.
  7. A method for controlling a head-mounted display device mounted on a user's head, including the steps of:
    (a) causing a sensor attached to the head-mounted display device to detect a motion of the user's head;
    (b) determining whether or not the motion detected in step (a) exists on a motion map in which preset motions and operations are associated with each other;
    (c) when it is determined in step (b) that the motion detected in step (a) exists, identifying the operation corresponding to the detected motion and executing processing according to the identified operation; and
    (d) when the processing has been executed in step (c) and a detected motion was determined not to exist before the execution of the processing in step (c), updating the motion map using the motion that was the object of that determination of non-existence.
  8. A program for causing a processor to execute control of a head-mounted display device mounted on a user's head, the program causing the processor to execute the steps of:
    (a) causing a sensor attached to the head-mounted display device to detect a motion of the user's head;
    (b) determining whether or not the motion detected in step (a) exists on a motion map in which preset motions and operations are associated with each other;
    (c) when it is determined in step (b) that the motion detected in step (a) exists, identifying the operation corresponding to the detected motion and executing processing according to the identified operation; and
    (d) when the processing has been executed in step (c) and a detected motion was determined not to exist before the execution of the processing in step (c), updating the motion map using the motion that was the object of that determination of non-existence.
JP2012128054A 2012-06-05 2012-06-05 Head-mounted display device, control method, and program Pending JP2013254251A (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
JP2012128054A JP2013254251A (en) 2012-06-05 2012-06-05 Head-mounted display device, control method, and program


Publications (1)

Publication Number Publication Date
JP2013254251A true JP2013254251A (en) 2013-12-19

Family

ID=49951727

Family Applications (1)

Application Number Title Priority Date Filing Date
JP2012128054A Pending JP2013254251A (en) 2012-06-05 2012-06-05 Head-mounted display device, control method, and program

Country Status (1)

Country Link
JP (1) JP2013254251A (en)


Citations (4)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
JP2009064199A (en) * 2007-09-05 2009-03-26 Casio Comput Co Ltd Gesture recognition apparatus and gesture recognition method
JP2009123042A (en) * 2007-11-16 2009-06-04 Nikon Corp Controller and head mounted display
JP2010205214A (en) * 2009-03-06 2010-09-16 Nikon Corp Controller and head-mount display device
JP2011197992A (en) * 2010-03-19 2011-10-06 Fujitsu Ltd Device and method for determining motion, and computer program


Cited By (5)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
JP2016076061A (en) * 2014-10-06 2016-05-12 三菱電機株式会社 Operation input device
US9965830B2 (en) 2014-12-12 2018-05-08 Canon Kabushiki Kaisha Image processing apparatus, image processing method, and program
JP2018508805A (en) * 2014-12-29 2018-03-29 株式会社ソニー・インタラクティブエンタテインメント Method and system for user interaction in a virtual or augmented reality scene using a head mounted display
US10073516B2 (en) 2014-12-29 2018-09-11 Sony Interactive Entertainment Inc. Methods and systems for user interaction within virtual reality scene using head mounted display
JP2016134063A (en) * 2015-01-21 2016-07-25 セイコーエプソン株式会社 Wearable computer, the command setting method and a command setting program


Legal Events

Date Code Title Description
2013-10-10 A621 Written request for application examination (JAPANESE INTERMEDIATE CODE: A621)
2014-05-28 A977 Report on retrieval (JAPANESE INTERMEDIATE CODE: A971007)
2014-06-04 A131 Notification of reasons for refusal (JAPANESE INTERMEDIATE CODE: A131)
2014-06-09 A711 Notification of change in applicant (JAPANESE INTERMEDIATE CODE: A712)
2014-11-04 A02 Decision of refusal (JAPANESE INTERMEDIATE CODE: A02)