WO2013042530A1 - Display device, display control method, and program - Google Patents
Display device, display control method, and program
- Publication number
- WO2013042530A1 (PCT/JP2012/072347)
- Authority
- WO
- WIPO (PCT)
Classifications
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06F—ELECTRIC DIGITAL DATA PROCESSING
- G06F3/00—Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
- G06F3/01—Input arrangements or combined input and output arrangements for interaction between user and computer
- G06F3/011—Arrangements for interaction with the human body, e.g. for user immersion in virtual reality
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06F—ELECTRIC DIGITAL DATA PROCESSING
- G06F1/00—Details not covered by groups G06F3/00 - G06F13/00 and G06F21/00
- G06F1/16—Constructional details or arrangements
- G06F1/1613—Constructional details or arrangements for portable computers
- G06F1/1633—Constructional details or arrangements of portable computers not specific to the type of enclosures covered by groups G06F1/1615 - G06F1/1626
- G06F1/1684—Constructional details or arrangements related to integrated I/O peripherals not covered by groups G06F1/1635 - G06F1/1675
- G06F1/1694—Constructional details or arrangements related to integrated I/O peripherals not covered by groups G06F1/1635 - G06F1/1675 the I/O peripheral being a single or a set of motion sensors for pointer control or gesture input obtained by sensing movements of the portable computer
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06F—ELECTRIC DIGITAL DATA PROCESSING
- G06F3/00—Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
- G06F3/01—Input arrangements or combined input and output arrangements for interaction between user and computer
- G06F3/017—Gesture based interaction, e.g. based on a set of recognized hand gestures
- G—PHYSICS
- G09—EDUCATION; CRYPTOGRAPHY; DISPLAY; ADVERTISING; SEALS
- G09G—ARRANGEMENTS OR CIRCUITS FOR CONTROL OF INDICATING DEVICES USING STATIC MEANS TO PRESENT VARIABLE INFORMATION
- G09G2354/00—Aspects of interface with display user
Definitions
- the present invention relates to a technique for displaying a desired region in a two-dimensional or three-dimensional virtual space.
- a virtual space access system that displays a desired area in a two-dimensional or three-dimensional virtual space is known.
- Patent Document 1 describes a mobile terminal that accesses a virtual space in which an application screen is arranged.
- the portable terminal described in Patent Document 1 includes an acceleration sensor, a motion data calculation unit, a screen arrangement unit, a table, a screen movement unit, a display ratio calculation unit, and an input device.
- the acceleration sensor detects the movement of the mobile terminal body.
- the motion data calculation unit calculates motion data indicating the motion of the mobile terminal body based on the output signal of the acceleration sensor.
- the screen arrangement unit arranges the application screen on the virtual space.
- the display device displays an area in the virtual space.
- the table holds information in which the motion data is associated with the screen scroll direction in the virtual space.
- the screen moving unit refers to the table, determines the screen scroll direction corresponding to the motion data, and moves the area in the virtual space displayed by the display device in the determined screen scroll direction.
- the display ratio calculation unit calculates the display ratio of the application screen in the area on the virtual space displayed by the display device.
- the input device accepts an application operation.
- Patent Document 2 describes a display device that displays a virtual image and a video experience device using the display device.
- the display device described in Patent Document 2 includes a device main body including a video display device whose displayable area is wider than the actual video display region, displacement detection means that detects displacement of the device main body, and display control means that corrects displacement of the video displayed on the video display device based on the detection result of the displacement detection means.
- the display device is a head mounted display and is mounted on the head.
- the user observes the video displayed on the video display device. For example, when the head is moved, the video area on the screen of the video display device moves in conjunction with the movement. This movement of the image area is observed as image blur.
- image blurring is corrected based on the displacement detection result of the device body. Thereby, even if the head is moved, the image is displayed at the same position on the screen.
- a video experience device described in Patent Document 2 uses the above display device together with a posture changing device, and includes base posture detection means for detecting the movement of the base of the posture changing device. The video display position in the video display device is optimized based on the displacement obtained by subtracting the movement component of the base, detected by the base posture detection means, from the detection result of the displacement detection means of the display device. Thereby, even when a user wearing the display device uses the posture changing device, the video is displayed at the same position on the screen.
- an acceleration sensor attached to the portable terminal body detects the movement of the portable terminal body, and the screen is scrolled in conjunction with the detected movement.
- the acceleration sensor also detects movements unrelated to operations on the mobile terminal main body, such as the user holding the terminal walking or changing body orientation, so the screen display may be changed by an operation the user did not intend.
- when the movement amount of the user holding the mobile terminal body is large, for example when the user rides a means of transport such as a train while an area in the virtual space is displayed on the display device, the mobile terminal body moves outside the virtual space and nothing is displayed on the screen.
- An object of the present invention is to provide a display device, a display control method, and a program capable of preventing a screen display from being changed by an operation not intended by a user.
- the display device of the present invention includes: an apparatus main body provided with display means; first motion detection means for detecting movement of the apparatus main body; second motion detection means for detecting movement of a holding body that holds the apparatus main body; and control means for detecting an operation on the apparatus main body based on the detection results of the first and second motion detection means, and controlling the display of the display means in accordance with the detected operation.
- the display control method of the present invention includes: detecting the movement of an apparatus main body provided with display means and detecting the movement of a holding body that holds the apparatus main body; and detecting an operation on the apparatus main body based on the detection result of the movement of the apparatus main body and the detection result of the movement of the holding body, and controlling the display of the display means in accordance with the detected operation.
- the program of the present invention causes a computer to execute: processing for detecting an operation on an apparatus main body based on a first detection result obtained by detecting the movement of the apparatus main body, which is provided with display means, and a second detection result obtained by detecting the movement of a holding body that holds the apparatus main body; and processing for controlling the display of the display means in accordance with the detected operation.
- FIG. 1 is a block diagram showing the configuration of a display device according to the first embodiment of the present invention.
- FIG. 2 is a flowchart showing one procedure of the operation detection process performed by the display device shown in FIG. 1.
- FIG. 3 is a block diagram showing the configuration of a virtual space access device according to the second embodiment of the present invention.
- FIG. 4 is a flowchart showing one procedure of the access process performed by the virtual space access device shown in FIG. 3.
- FIG. 5 is a schematic diagram showing an example of access to the virtual space by the virtual space access device shown in FIG. 3 when not accompanied by movement such as the user's walking.
- FIG. 6 is a schematic diagram for explaining a comparative example of access to the virtual space.
- FIG. 7 is a schematic diagram showing an example of access to the virtual space by the virtual space access device shown in FIG. 3 when accompanied by movement such as the user's walking.
- FIG. 8 is a diagram for explaining the acceleration information of the motion detection means of the virtual space access device shown in FIG. 3.
- FIG. 1 is a block diagram showing a configuration of a display device according to the first embodiment of the present invention.
- the display device shown in FIG. 1 includes a device main body 10 including a display means 11, a control means 12, and a motion detection means 13, and a motion detection means 14 that detects the movement of a holding body that holds the device main body 10.
- the holding body is, for example, a person or a moving body such as a wheelchair.
- the display means 11 is a display device such as a liquid crystal display, and displays, for example, a partial region of a two-dimensional or three-dimensional virtual space.
- the motion detection means 13 detects at least one of movement and posture change as the motion of the apparatus body 10.
- the posture change includes a change in inclination and a change in orientation (orientation).
- an acceleration sensor, a magnetic sensor, or a combination thereof may be used as the motion detection means 13.
- the motion detection means 14 detects at least one of movement and posture change as the movement of the holding body.
- the posture change includes a change in inclination and a change in orientation (orientation).
- an acceleration sensor, a magnetic sensor, or a combination thereof may be used as the motion detection means 14.
- the communication between the motion detection means 14 and the control means 12 may be wireless communication or communication via a communication line.
- the detection results of the motion detector 13 and the motion detector 14 are supplied to the controller 12.
- the control means 12 detects an operation on the apparatus main body 10 based on the detection results of the motion detection means 13 and 14, and controls the display of the display means 11 according to the detected operation.
- when an operation such as moving or tilting the apparatus main body 10 in a desired direction is performed, the control means 12 receives the operation on the apparatus main body 10 via the motion detection means 13 and 14 and controls the display of the display means 11.
- since the motion detection means 13 is mounted on the apparatus main body 10, when the holding body moves while the apparatus main body 10 is being operated, the motion detection means 13 detects the movement of the apparatus main body 10 caused by the operation and, at the same time, the movement of the apparatus main body 10 accompanying the movement of the holding body. That is, the detection result of the motion detection means 13 includes a component of movement of the apparatus main body 10 due to the operation and a component of movement of the apparatus main body 10 accompanying the movement of the holding body.
- the motion detection means 14 detects the movement of the holding body.
- the detection result of the motion detection unit 14 corresponds to a component of the motion of the apparatus main body 10 accompanying the movement of the holding body in the detection result of the motion detection unit 13. Therefore, by subtracting the detection result of the motion detection means 14 from the detection result of the motion detection means 13, the motion component of the apparatus main body 10 according to the operation can be extracted.
- the control unit 12 can detect an operation on the apparatus main body 10 based on a component of movement of the apparatus main body 10 according to the operation.
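As a minimal sketch of the subtraction described above (the patent specifies no implementation, so the function name and sample values are illustrative), the holder-side reading can be removed from the device-side reading per axis:

```python
def operation_component(device_accel, holder_accel):
    """Subtract, axis by axis, the acceleration measured on the holding
    body (motion detection means 14) from the acceleration measured on
    the apparatus main body (motion detection means 13). What remains is
    the motion component attributed to the user's operation."""
    return tuple(d - h for d, h in zip(device_accel, holder_accel))

# Device reads 3.0 m/s^2 on x while the holder moves at 1.0 m/s^2:
# the 2.0 m/s^2 difference is attributed to the operation.
residual = operation_component((3.0, 0.0, 0.0), (1.0, 0.0, 0.0))
```

When device and holder move identically (holder walking, device not operated), the result is zero on every axis, which is exactly the "movement cancelled" case the text describes.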
- when the holding body moves without the apparatus main body 10 being operated, the motion detection means 13 detects only the movement of the apparatus main body 10 accompanying the movement of the holding body.
- in this case, the detection result of the motion detection means 13 includes only the motion component of the apparatus main body 10 accompanying the movement of the holding body.
- at the same time, the motion detection means 14 detects the movement of the holding body. Therefore, when the detection result of the motion detection means 14 is subtracted from the detection result of the motion detection means 13, the movement accompanying the movement of the holding body is cancelled. As a result, no movement of the apparatus main body 10 remains, and the control means 12 determines that no operation has been performed on the apparatus main body 10.
- conversely, when the apparatus main body 10 is operated while the holding body is stationary, the motion detection means 13 detects only the movement of the apparatus main body 10 caused by the operation.
- in this case, the detection result of the motion detection means 13 includes only the motion component of the apparatus main body 10 due to the operation.
- the motion detection means 14 does not detect any movement of the holding body. Therefore, the control means 12 detects the operation on the apparatus main body 10 directly from the motion component of the apparatus main body 10, corresponding to the operation, detected by the motion detection means 13.
- the above description concerns movement of the apparatus main body 10 and the holding body, but an operation can be detected by the same control for posture changes of the apparatus main body 10 and the holding body.
- FIG. 2 is a flowchart showing one procedure of processing for detecting an operation on the apparatus main body 10.
- control means 12 acquires the detection result A from the motion detection means 13 and the detection result B from the motion detection means 14 (step S10).
- the detection results A and B are acquired at regular intervals.
- the control means 12 determines whether or not the detection result A is 0 (step S11).
- if the determination result in step S11 is "No", the control means 12 determines whether or not the detection result B is 0 (step S12).
- if the determination result in step S12 is "No", the control means 12 determines whether or not the result of subtracting the detection result B from the detection result A is 0 (step S13).
- if the determination result in step S13 is "No", the control means 12 detects an operation on the apparatus main body 10 based on the result obtained by subtracting the detection result B from the detection result A (step S15). The control means 12 then controls the display of the display means 11 in accordance with the detected operation.
- if the determination result in step S11 is "Yes", or if the determination result in step S13 is "Yes", the operation detection process is terminated without an operation being detected.
- if the determination result in step S12 is "Yes", the control means 12 detects an operation on the apparatus main body 10 based on the detection result A alone (step S15), and controls the display of the display means 11 in accordance with the detected operation.
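The decision flow of FIG. 2 can be sketched for scalar readings as follows; the function name and the convention of returning `None` when no operation is detected are assumptions, and real readings would be multi-axis vectors rather than scalars:

```python
def detect_operation(a, b):
    """One pass of the operation-detection flow (steps S11-S15).
    a: detection result of the device-side sensor (motion detection means 13)
    b: detection result of the holder-side sensor (motion detection means 14)
    Returns the motion component attributed to an operation, or None."""
    if a == 0:            # S11: apparatus main body did not move -> no operation
        return None
    if b == 0:            # S12: holder is stationary -> all of A is the operation
        return a          # S15, detection based on A alone
    diff = a - b          # S13: cancel the holder's motion component
    if diff == 0:         # movement fully explained by the holder's movement
        return None
    return diff           # S15, detection based on A - B
```

For example, `detect_operation(5, 5)` yields no operation (walking only), while `detect_operation(5, 2)` attributes the excess to a deliberate movement of the device.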
- the operation on the apparatus main body 10 can be reliably detected by canceling the movement component of the apparatus main body 10 accompanying the movement of the holding body and the posture change. Therefore, an operation unintended by the user can be suppressed.
- steps S11, S12, and S15 may be omitted. In this case, the operation with respect to the apparatus main body 10 can be reliably detected by a simpler procedure.
- the movement component of the holding body detected by the motion detection means 14 is configured to substantially match the movement component of the apparatus main body 10, accompanying the movement of the holding body, detected by the motion detection means 13.
- thereby, the detection accuracy of the operation on the apparatus main body 10 can be improved.
- the motion detection means 13 detects the movement of the apparatus main body 10 accompanying the movement of the holder at the same time as it detects the movement of the apparatus main body 10 due to the operation.
- accordingly, the movement component of the holding body detected by the motion detection means 14 and the movement component of the apparatus main body 10, accompanying the movement of the holding body, detected by the motion detection means 13 should be substantially the same.
- motion detection means 14 that detects only the movement of the holding body, without picking up the movement of the hand, can be composed of, for example, a plurality of acceleration sensors. Specifically, a plurality of acceleration sensors are attached to the holding body, and the control means 12 detects the movement of the holding body based on the average value of the outputs of the respective acceleration sensors.
- a plurality of acceleration sensors capable of detecting the center of gravity of the holding body may be used as the motion detection means 14.
- specifically, a plurality of acceleration sensors are attached to the holding body so that the center of gravity of the holding body can be detected, and the control means 12 detects the movement of the center of gravity of the holding body based on the output values of the acceleration sensors.
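A hedged sketch of the averaging rule just described, assuming each sensor on the holding body reports a 3-axis tuple; the helper name is hypothetical and the combination rule is left open by the patent:

```python
def holder_motion(sensor_outputs):
    """Average the outputs of several acceleration sensors attached to
    the holding body (e.g. worn at different points on the user).
    Averaging attenuates a movement of only one part of the body, such
    as the hand holding the device, so it does not register as movement
    of the whole holder."""
    n = len(sensor_outputs)
    return tuple(sum(axis) / n for axis in zip(*sensor_outputs))
```

With four sensors, a 4.0 m/s^2 spike seen by the hand-side sensor alone averages down to 1.0 m/s^2 for the holder as a whole, while a walk that all sensors see passes through unchanged.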
- when the holding body that holds the apparatus main body 10 is a moving body such as a wheelchair, the moving body has a base portion that holds the apparatus main body 10 so as to be rotatable or movable.
- in this case, the motion detection means 14 is attached to a portion of the moving body other than the base.
- when the motion detection means 13 and 14 are configured using magnetic sensors, the motion detection means 13 detects a change in the orientation of the apparatus main body 10, and the motion detection means 14 detects a change in the orientation of the holding body.
- in this case, an operation on the apparatus main body 10 can be detected by cancelling the movement component of the apparatus main body 10 that accompanies a change in the orientation of the holding body.
- thus, even when the holder is travelling by a means of transport such as a car, the display does not change when the orientations of the means of transport and the holder change together. This is effective, for example, when an object indicating a direction is arranged in the virtual space for navigation or the like.
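For the magnetic-sensor case, the cancellation can be illustrated with headings in degrees; the modulo handles wrap-around at 360°. This is an assumed formulation for illustration, not taken from the patent text:

```python
def relative_heading(device_heading, holder_heading):
    """Orientation of the apparatus main body relative to its holder,
    in degrees [0, 360). Subtracting the holder's (e.g. the vehicle's)
    heading means that when car and user turn together the display does
    not change; only a rotation of the device relative to the holder
    counts as an operation."""
    return (device_heading - holder_heading) % 360.0
```

For instance, a car turning from 350° to 10° while the device turns with it leaves the relative heading unchanged, so no operation is detected.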
- the control means 12 may detect an operation based only on acceleration detection signals of the motion detection means 13 and 14 that are greater than a predetermined value. Thereby, an operation on the apparatus main body 10 can be detected more reliably.
- the control means 12 may also detect an operation based on a signal obtained by removing a predetermined displacement portion (an acceleration displacement portion that returns to substantially the same value within a predetermined time) from the output signals of the motion detection means 13 and 14. This also makes it possible to detect operations on the apparatus main body 10 more reliably.
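The two filters mentioned here, a minimum magnitude and removal of transient excursions that return to rest within a predetermined time, might be sketched like this (the names, the sample-count window, and the exact rejection rule are assumptions; the patent only states the criteria qualitatively):

```python
def significant_samples(samples, threshold, window):
    """Keep only acceleration samples that (a) exceed a predetermined
    magnitude and (b) are not transient excursions that fall back below
    the threshold within `window` subsequent samples. Both filters
    reject small or momentary shaking, such as hand tremor."""
    kept = []
    for i, s in enumerate(samples):
        if abs(s) < threshold:
            continue                              # below the predetermined value
        tail = samples[i + 1 : i + 1 + window]
        if any(abs(t) < threshold for t in tail):
            continue                              # returns to rest quickly: transient
        kept.append(s)
    return kept
```

A brief 5.0 m/s^2 blip followed immediately by rest is discarded, while a sustained acceleration survives and can be treated as an operation.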
- the control means 12 may be configured using a computer (CPU: Central Processing Unit) that operates according to a program.
- the program can cause a computer to execute at least the operation detection process described above.
- the program may be provided using a recording medium or may be provided via a communication network (for example, the Internet).
- FIG. 3 is a block diagram showing the configuration of the virtual space access apparatus according to the second embodiment of the present invention.
- the virtual space access device shown in FIG. 3 includes a device main body 20 and a module 30.
- the apparatus main body 20 includes display means 21, control means 22, motion detection means 23, virtual space generation means 24, and communication means 25.
- the display means 21 and the motion detection means 23 are the same as the display means 11 and the motion detection means 13 shown in FIG.
- the apparatus main body 20 may include an output device such as a speaker or an input device such as a key or a touch panel.
- the module 30 is mounted on or attached to a holding body, and has a motion detection means 31 and a communication means 32.
- the motion detection means 31 is the same as the motion detection means 14 shown in FIG.
- the module 30 may be configured as a mounting portion for earphones, headphones, glasses, a hat, accessories, and the like.
- the virtual space generation unit 24 holds data for forming a two-dimensional or three-dimensional virtual space, and forms a virtual space based on the held data in accordance with an instruction signal from the control unit 22.
- the display means 21 displays a part of the virtual space according to the display control signal from the control means 22.
- the communication means 25 and 32 perform wireless communication using radio waves, light, sound, or the like.
- short-range wireless communication such as infrared communication may also be used.
- the control means 22 acquires the detection result of the motion detection means 31 via the communication means 25 and 32.
- the control means 22 acquires the detection results of the motion detection means 23 and 31 at regular intervals, detects an operation on the apparatus main body 20 based on the acquired detection results, and controls the display of the display means 21 in accordance with the detected operation.
- FIG. 4 is a flowchart showing a procedure of access processing performed by the virtual space access device of the present embodiment.
- when the control means 22 receives an instruction to display the virtual space via the input device, it supplies an instruction signal to form the virtual space to the virtual space generation means 24, and the virtual space generation means 24 forms a virtual space in accordance with the instruction signal (step S20).
- the control means 22 supplies a display control signal for displaying a predetermined area of the virtual space to the display means 21, and the display means 21 displays the predetermined area of the virtual space in accordance with the display control signal (step S21).
- control means 22 acquires the detection results of the motion detection means 23 and 31 at regular intervals, and detects an operation on the apparatus main body 20 based on the acquired detection results (step S22).
- This operation detection process is executed in the same procedure as the operation detection process shown in FIG.
- control means 22 supplies a display control signal based on the operation detected in step S22 to the display means 21, and the display means 21 changes the display position in the virtual space according to the display control signal (step S23).
- according to this procedure, an operation unintended by the user, caused by movement of the holding body or a change in its posture, can be eliminated, so that a desired area in the virtual space can be reliably displayed on the display means 21. That is, accurate access to the virtual space can be realized.
- FIG. 5 shows an example of access to the virtual space when the user is not moving, for example not walking.
- a world map virtual space 200 is formed.
- the user can hold the apparatus main body 20 with one hand and move it to a desired position in the virtual space 200, thereby displaying an image of that position in the virtual space 200 on the display means 21.
- as a comparative example, FIG. 6 shows access to the virtual space when the module 30 is not used while the user is moving, such as walking.
- here, the module 30 is not attached to the user 100, and the control means 22 detects operations based only on the detection result of the motion detection means 23.
- in this case, when the user 100 moves, the apparatus main body 20 moves out of the virtual space, and no video is displayed on the display means 21.
- FIG. 7 shows an example of access to the virtual space when the user is moving such as walking.
- the module 30 is attached to the user 100, and the control unit 22 detects an operation based on the detection results of the motion detection units 23 and 31. In this case, even if the user 100 moves, the apparatus main body 20 does not move out of the virtual space, and an image at a desired position in the virtual space is displayed on the display means 21.
- FIG. 8 shows an example of acceleration information when the motion detectors 23 and 31 are configured using an acceleration sensor.
- the motion detection means 23 outputs acceleration information A
- the motion detection means 31 outputs acceleration information B.
- the acceleration information A includes a movement component of the apparatus main body 20 due to the operation and a movement component of the apparatus main body 20 accompanying the movement of the user 100.
- the acceleration information B includes the motion component of the user 100.
- by subtracting the acceleration information B from the acceleration information A, acceleration information C, which is the motion component of the apparatus main body 20 due to the operation, is obtained.
- therefore, the control means 22 can detect an operation on the apparatus main body 20 based on the acceleration information C obtained by subtracting the acceleration information B from the acceleration information A.
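The relationship A − B = C in FIG. 8 can be sketched over a short time series; the sample values are invented for illustration and do not come from the patent figure:

```python
def operation_series(accel_a, accel_b):
    """Sample-by-sample subtraction of acceleration information B
    (holder-side sensor 31) from acceleration information A (device-side
    sensor 23), leaving acceleration information C, the component due to
    the operation on the apparatus main body."""
    return [a - b for a, b in zip(accel_a, accel_b)]

A = [0.0, 1.5, 2.5, 1.5, 0.0]   # device: walking plus one deliberate tilt
B = [0.0, 1.5, 1.5, 1.5, 0.0]   # holder: walking only
C = operation_series(A, B)       # [0.0, 0.0, 1.0, 0.0, 0.0]
```

The walking component present in both traces cancels, and only the single 1.0 m/s^2 excursion, the deliberate tilt, remains in C.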
- the virtual space generation means 24 may be provided separately from the apparatus main body 20.
- the control means 22 may detect an operation based only on acceleration detection signals of the motion detection means 23 and 31 that are greater than a predetermined value. As a result, fine shaking of the virtual space can be prevented, and a desired space area can be accessed more reliably.
- the control means 22 may also detect an operation based on a signal obtained by removing a predetermined displacement portion (an acceleration displacement portion that returns to substantially the same value within a predetermined time) from the output signals of the motion detection means 23 and 31. This also prevents the virtual space from shaking finely and makes it possible to access a desired space area more reliably.
- control means 22 may be configured using a computer that operates according to a program.
- the program can cause a computer to execute at least the access process described above.
- the program may be provided using a computer-readable recording medium, for example an optical disc such as a CD (Compact Disc) or a DVD (Digital Video Disc), a USB (Universal Serial Bus) memory, or a memory card, or it may be provided via a communication network (for example, the Internet).
- the device of the present invention described above can also be applied to mobile devices (mobile terminals such as tablet terminals and notebook personal computers, mobile phones, etc.), head mounted displays, eyeglass-type display devices, and game machines.
Abstract
Description
The display device has: an apparatus main body provided with display means;
first motion detection means for detecting movement of the apparatus main body;
second motion detection means for detecting movement of a holding body that holds the apparatus main body; and
control means for detecting an operation on the apparatus main body based on the detection results of the first and second motion detection means, and controlling the display of the display means in accordance with the detected operation.
The display control method includes detecting the movement of an apparatus main body provided with display means and detecting the movement of a holding body that holds the apparatus main body, and
detecting an operation on the apparatus main body based on the detection result of the movement of the apparatus main body and the detection result of the movement of the holding body, and controlling the display of the display means in accordance with the detected operation.
The program causes a computer to execute: processing for detecting an operation on an apparatus main body based on a first detection result obtained by detecting the movement of the apparatus main body, which is provided with display means, and a second detection result obtained by detecting the movement of a holding body that holds the apparatus main body; and
processing for controlling the display of the display means in accordance with the detected operation.
11: display means
12: control means
13, 14: motion detection means
FIG. 1 is a block diagram showing the configuration of a display device according to the first embodiment of the present invention.
FIG. 3 is a block diagram showing the configuration of a virtual space access device according to the second embodiment of the present invention.
Claims (10)
- 1. A display device comprising: an apparatus main body provided with display means; first motion detection means for detecting movement of the apparatus main body; second motion detection means for detecting movement of a holding body that holds the apparatus main body; and control means for detecting an operation on the apparatus main body based on detection results of the first and second motion detection means, and controlling display of the display means in accordance with the detected operation.
- 2. The display device according to claim 1, wherein the first motion detection means detects at least one of movement and posture change of the apparatus main body, and the second motion detection means detects at least one of movement and posture change of the holding body.
- 3. The display device according to claim 1 or 2, wherein the control means detects the operation on the apparatus main body based on a result of subtracting the detection result of the second motion detection means from the detection result of the first motion detection means.
- 4. The display device according to any one of claims 1 to 3, further comprising virtual space forming means for forming a virtual space, wherein the control means causes the display means to display a video of a partial area of the virtual space, and changes the display area on the virtual space displayed by the display means in accordance with the operation on the apparatus main body detected based on the detection results of the first and second motion detection means.
- 5. The display device according to any one of claims 1 to 4, wherein the first and second motion detection means are each an acceleration sensor, and the control means detects the operation on the apparatus main body based on output signals of the first and second motion detection means that are equal to or greater than a predetermined value.
- 6. The display device according to any one of claims 1 to 4, wherein the first and second motion detection means are each an acceleration sensor, and the control means detects the operation on the apparatus main body based on signals obtained by removing, from the respective output signals of the first and second motion detection means, portions indicating a predetermined displacement.
- 7. The display device according to any one of claims 1 to 4, wherein the second motion detection means comprises a plurality of acceleration sensors, and the control means acquires from the second motion detection means a detection result indicating the average value of the outputs of the plurality of acceleration sensors.
- 8. The display device according to any one of claims 1 to 4, wherein the second motion detection means detects the center of gravity of the holding body, and the control means acquires from the second motion detection means a detection result indicating a change in the center of gravity of the holding body.
- 9. A display control method comprising: detecting movement of an apparatus main body provided with display means and detecting movement of a holding body that holds the apparatus main body; and detecting an operation on the apparatus main body based on the detection result of the movement of the apparatus main body and the detection result of the movement of the holding body, and controlling display of the display means in accordance with the detected operation.
- 10. A program that causes a computer to execute: processing for detecting an operation on an apparatus main body based on a first detection result obtained by detecting movement of the apparatus main body, which is provided with display means, and a second detection result obtained by detecting movement of a holding body that holds the apparatus main body; and processing for controlling display of the display means in accordance with the detected operation.
Priority Applications (1)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
US14/343,236 US20140218288A1 (en) | 2011-09-22 | 2012-09-03 | Display device, display control method, and program |
Applications Claiming Priority (2)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
JP2011207779 | 2011-09-22 | ||
JP2011-207779 | 2011-09-22 |
Publications (1)
Publication Number | Publication Date |
---|---|
WO2013042530A1 true WO2013042530A1 (ja) | 2013-03-28 |
Family
ID=47914307
Family Applications (1)
Application Number | Title | Priority Date | Filing Date |
---|---|---|---|
PCT/JP2012/072347 WO2013042530A1 (ja) | 2011-09-22 | 2012-09-03 | 表示装置、表示制御方法およびプログラム |
Country Status (3)
Country | Link |
---|---|
US (1) | US20140218288A1 (ja) |
JP (1) | JPWO2013042530A1 (ja) |
WO (1) | WO2013042530A1 (ja) |
Cited By (2)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
WO2018180331A1 (ja) * | 2017-03-28 | 2018-10-04 | Kyushu Institute of Technology | Driver state detection device |
JP2021067899A (ja) * | 2019-10-28 | 2021-04-30 | Hitachi, Ltd. | Head-mounted display device and display content control method |
Citations (7)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
JPH10254614A (ja) * | 1997-03-06 | 1998-09-25 | Hitachi Ltd | Portable electronic processing device and operation method thereof |
JPH11202257A (ja) * | 1998-01-14 | 1999-07-30 | Sony Corp | Display device and video experience device |
JP2001175883A (ja) * | 1999-12-16 | 2001-06-29 | Sony Corp | Virtual reality device |
JP2002007027A (ja) * | 2000-06-27 | 2002-01-11 | Masanori Idesawa | Image information display device |
JP2007121489A (ja) * | 2005-10-26 | 2007-05-17 | Nec Corp | Portable display device |
JP2010086192A (ja) * | 2008-09-30 | 2010-04-15 | Sony Ericsson Mobile Communications Ab | Portable device, computer program, and recording medium |
WO2011025578A1 (en) * | 2009-08-31 | 2011-03-03 | Ge Aviation Systems Llc | Method and system for a motion compensated input device |
Family Cites Families (2)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
JP2003280630A (ja) * | 2002-03-20 | 2003-10-02 | Toshiba Corp | Information processing device and display control method used in the device |
US8279242B2 (en) * | 2008-09-26 | 2012-10-02 | Microsoft Corporation | Compensating for anticipated movement of a device |
2012
- 2012-09-03 JP JP2013534655A patent/JPWO2013042530A1/ja active Pending
- 2012-09-03 US US14/343,236 patent/US20140218288A1/en not_active Abandoned
- 2012-09-03 WO PCT/JP2012/072347 patent/WO2013042530A1/ja active Application Filing
Cited By (4)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
WO2018180331A1 (ja) * | 2017-03-28 | 2018-10-04 | Kyushu Institute of Technology | Driver state detection device |
CN110476194A (zh) * | 2017-03-28 | 2019-11-19 | Kyushu Institute of Technology | Driver state detection device |
JPWO2018180331A1 (ja) * | 2017-03-28 | 2020-02-06 | Kyushu Institute of Technology | Driver state detection device |
JP2021067899A (ja) * | 2019-10-28 | 2021-04-30 | Hitachi, Ltd. | Head-mounted display device and display content control method |
Also Published As
Publication number | Publication date |
---|---|
US20140218288A1 (en) | 2014-08-07 |
JPWO2013042530A1 (ja) | 2015-03-26 |
Similar Documents
Publication | Publication Date | Title |
---|---|---|
JP5869177B1 (ja) | Virtual reality space image display method and program | |
JP6820405B2 (ja) | Manipulating virtual objects using six-degree-of-freedom controllers in augmented and/or virtual reality environments | |
US9122307B2 (en) | Advanced remote control of host application using motion and voice commands | |
US20130002534A1 (en) | Systems and Methods for Controlling a Cursor on a Display Using a Trackpad Input Device | |
JP2013516660A (ja) | Transparent electronic device | |
JPWO2012114592A1 (ja) | Display device, display control method, and program | |
US20120284671A1 (en) | Systems and methods for interface mangement | |
US10948299B1 (en) | Relative inertial measurement system with visual correction | |
US20140092040A1 (en) | Electronic apparatus and display control method | |
WO2021136495A1 (zh) | Page layout control method and apparatus for an in-vehicle display interface | |
US9557781B2 (en) | Adjusting coordinates of touch input | |
US9665232B2 (en) | Information-processing device, storage medium, information-processing method, and information-processing system for enlarging or reducing an image displayed on a display device | |
JP2017059196A (ja) | Virtual reality space image display method and program | |
WO2013042530A1 (ja) | Display device, display control method, and program | |
WO2017022031A1 (ja) | Information terminal device | |
JP6033465B2 (ja) | Display control device | |
JP2014197334A (ja) | Display control device, control method therefor, and control program | |
US20150033180A1 (en) | Display control apparatus, display control method, display control signal generating apparatus, display control signal generating method, program, and display control system | |
US20160253088A1 (en) | Display control apparatus and display control method | |
JP6014420B2 (ja) | Operation control device, operation control method, and program for operation control device | |
JP7129897B2 (ja) | Electronic apparatus | |
EP2801015A1 (en) | Adjusting coordinates of touch input | |
JP2012073583A (ja) | Information display device, control method for information display device, and control program for information display device | |
KR102289368B1 (ko) | Terminal and object control method using the same | |
KR102295245B1 (ko) | Terminal and event progress control method using the same | |
Legal Events
Date | Code | Title | Description |
---|---|---|---|
121 | Ep: the epo has been informed by wipo that ep was designated in this application |
Ref document number: 12833883 Country of ref document: EP Kind code of ref document: A1 |
|
WWE | Wipo information: entry into national phase |
Ref document number: 14343236 Country of ref document: US |
|
ENP | Entry into the national phase |
Ref document number: 2013534655 Country of ref document: JP Kind code of ref document: A |
|
NENP | Non-entry into the national phase |
Ref country code: DE |
|
122 | Ep: pct application non-entry in european phase |
Ref document number: 12833883 Country of ref document: EP Kind code of ref document: A1 |