WO2015104884A1 - Information Processing System, Information Processing Method, and Program - Google Patents
- Publication number
- WO2015104884A1 (PCT application PCT/JP2014/078117, priority JP2014078117W)
- Authority
- WO
- WIPO (PCT)
- Prior art keywords
- information processing
- processing system
- control unit
- predetermined
- detected
- Prior art date
Classifications
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06F—ELECTRIC DIGITAL DATA PROCESSING
- G06F3/00—Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
- G06F3/01—Input arrangements or combined input and output arrangements for interaction between user and computer
- G06F3/011—Arrangements for interaction with the human body, e.g. for user immersion in virtual reality
- G06F3/013—Eye tracking input arrangements
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06F—ELECTRIC DIGITAL DATA PROCESSING
- G06F3/00—Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
- G06F3/01—Input arrangements or combined input and output arrangements for interaction between user and computer
- G06F3/03—Arrangements for converting the position or the displacement of a member into a coded form
- G06F3/033—Pointing devices displaced or positioned by the user, e.g. mice, trackballs, pens or joysticks; Accessories therefor
- G06F3/0346—Pointing devices displaced or positioned by the user, e.g. mice, trackballs, pens or joysticks; Accessories therefor with detection of the device orientation or free movement in a 3D space, e.g. 3D mice, 6-DOF [six degrees of freedom] pointers using gyroscopes, accelerometers or tilt-sensors
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06F—ELECTRIC DIGITAL DATA PROCESSING
- G06F3/00—Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
- G06F3/01—Input arrangements or combined input and output arrangements for interaction between user and computer
- G06F3/048—Interaction techniques based on graphical user interfaces [GUI]
- G06F3/0481—Interaction techniques based on graphical user interfaces [GUI] based on specific properties of the displayed interaction object or a metaphor-based environment, e.g. interaction with desktop elements like windows or icons, or assisted by a cursor's changing behaviour or appearance
- G06F3/04812—Interaction techniques based on cursor appearance or behaviour, e.g. being affected by the presence of displayed objects
Definitions
- This disclosure relates to an information processing system, an information processing method, and a program.
- a technique for determining a pointing position based on a motion detected by a sensor is disclosed.
- a method for determining a pointing position based on an angular velocity detected by a gyro sensor is disclosed (for example, see Patent Document 1).
- a line-of-sight detection unit that detects a user's line of sight
- an operation detection unit that detects an operation in space by the user
- a position control unit that determines a pointing position based on the operation.
- An information processing system is provided in which the position control unit moves the pointing position based on the line of sight when a predetermined operation is detected.
- According to the present disclosure, there is provided an information processing method performed by a processor, including: detecting a user's line of sight; detecting an operation in space by the user; determining a pointing position based on the operation; and moving the pointing position based on the line of sight when a predetermined operation is detected.
- According to the present disclosure, there is provided a program for causing a computer to function as an information processing system including: a line-of-sight detection unit that detects a user's line of sight; an operation detection unit that detects an operation in space by the user; and a position control unit that determines a pointing position based on the operation, wherein the position control unit moves the pointing position based on the line of sight when a predetermined operation is detected.
- As described above, according to the present disclosure, when controlling the pointing position, the pointing position can be easily moved to a position desired by the user.
- Note that the above effect is not necessarily limiting; together with or in place of the above effect, any of the effects shown in this specification, or other effects that can be understood from this specification, may be achieved.
- FIG. 5 is a flowchart showing a flow of operations of an information processing system according to an embodiment of the present disclosure. This is followed by a diagram showing a first example of cursor display when the pointing position reaches the end of the display area, a diagram showing a first example of cursor display when the pointing position does not fit inside the display area, and a diagram showing a second example of cursor display when the pointing position reaches the end of the display area.
- a plurality of constituent elements having substantially the same functional configuration may be distinguished by adding different alphabets or numbers after the same reference numeral. However, when it is not necessary to particularly distinguish each of a plurality of constituent elements having substantially the same functional configuration, only the same reference numerals are given.
- FIG. 1 is a diagram for describing an overview of an information processing system 10 according to an embodiment of the present disclosure.
- the information processing system 10 includes a detection unit 120, an imaging unit 130, and a display unit 150.
- the display unit 150 has a display area 151 and displays a screen in the display area 151.
- the screen displayed in the display area 151 is not particularly limited.
- the user U may be present at a position where the screen displayed in the display area 151 can be viewed.
- the detection unit 120 detects predetermined detection data.
- the detection data detected by the detection unit 120 is used for the determination of the pointing position P by the user U.
- In this specification, the case where the detection unit 120 includes a gyro sensor, and the XY coordinates in the display area 151 corresponding to the initial posture of the detection unit 120 and the angular velocity detected by the gyro sensor are calculated as the pointing position P, will mainly be described.
- Details of the calculation of the pointing position P are disclosed in Patent Document 1 (International Publication No. 2009/008372).
- the determination of the pointing position P by the user U may be made in any way as will be described later.
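The gyro-based determination of the pointing position P described above can be sketched as an integration of angular velocity into display-area XY coordinates, starting from the coordinates corresponding to the initial posture. This is an illustrative assumption of how such a calculation might look (the function name, gain, and display dimensions are not from the patent; the precise method is given in Patent Document 1):

```python
def update_pointing_position(x, y, wx, wy, dt, gain=500.0,
                             width=1920, height=1080):
    """Integrate angular velocity (rad/s) into display XY coordinates.

    x, y   -- current pointing position in the display area
    wx, wy -- angular velocities reported by the gyro sensor
    dt     -- elapsed time since the previous sample, in seconds
    """
    x += wx * dt * gain  # yaw rate moves the pointer horizontally
    y += wy * dt * gain  # pitch rate moves the pointer vertically
    # Clamp to the display area (the first-example behavior; see FIG. 10/11).
    x = min(max(x, 0.0), float(width - 1))
    y = min(max(y, 0.0), float(height - 1))
    return x, y
```

A drift phenomenon, as mentioned later in this document, corresponds to small sensor bias accumulating through this integration, which is one reason the pointing position can deviate from the position desired by the user.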
- the display unit 150 may display a cursor Cu at the pointing position P.
- the cursor Cu is indicated by a white arrow with high definition, but the color, shape, and definition of the cursor Cu are not limited.
- the shape of the cursor Cu may be an ellipse or a circle as will be described later.
- the cursor Cu may have a lower definition than other objects displayed in the display area 151.
- the display unit 150 may not display the cursor Cu at the pointing position P.
- an operation corresponding to the object may be executed.
- the object displayed in the display area 151 is not particularly limited, and may be a still image, a moving image, a button, or a character string.
- The predetermined determination operation is not particularly limited, and may be an operation of pressing a button provided on the detection unit 120, or an operation of moving the detection unit 120 so that the detection unit 120 exhibits a predetermined movement.
- the operation corresponding to the object is not particularly limited, and may be a transition to a screen corresponding to the object.
- the imaging unit 130 has a function of capturing an image including part or all of the user U's body. When a part of the body of the user U is imaged, the part of the body of the user U may be the face area of the user U or the eye area of the user U.
- When the pointing position P is determined based on the detection data detected by the detection unit 120, it is desirable to provide a technique that can easily move the pointing position P to a position desired by the user. Therefore, this specification proposes a technique that can easily move the pointing position P to a position desired by the user when the pointing position P is determined based on the detection data detected by the detection unit 120.
- There may be cases where the pointing position P deviates from the position desired by the user.
- Possible reasons why the pointing position P deviates from the position desired by the user include a drift phenomenon, in which the pointing position P moves regardless of the intention of the user U, and the initial pointing position P being off from the position desired by the user.
- a technique for easily moving the pointing position P to a position desired by the user can be particularly effectively applied to such a situation.
- FIG. 2 is a diagram illustrating a functional configuration example of the information processing system 10 according to the embodiment of the present disclosure.
- the information processing system 10 includes a control unit 110, a detection unit 120, an imaging unit 130, a storage unit 140, and a display unit 150.
- the control unit 110 exhibits various functions of the control unit 110 by executing a program stored in the storage unit 140 or another storage medium. As illustrated in FIG. 2, the control unit 110 includes functional blocks such as a line-of-sight detection unit 111, an operation detection unit 112, a position control unit 113, and a display control unit 114. Details of the functions of each functional block will be described later.
- The control unit 110 may be configured by a processor such as a CPU (Central Processing Unit), a GPU (Graphics Processing Unit), a DSP (Digital Signal Processor), or a SoC (System on Chip). Alternatively, the control unit 110 may be configured by an electronic circuit for performing various arithmetic processing.
- the detection unit 120 detects predetermined detection data and outputs it to the control unit 110.
- the detection unit 120 may include other sensors.
- the detection unit 120 may include a triaxial acceleration sensor.
- the information processing system 10 may not include the detection unit 120.
- the detection unit 120 is integrated with the information processing system 10, but the detection unit 120 may be configured separately from the information processing system 10.
- the imaging unit 130 is a camera module that captures an image.
- the imaging unit 130 images an actual space using an imaging device such as a CCD (Charge Coupled Device) or a CMOS (Complementary Metal Oxide Semiconductor) to generate an image.
- An image generated by the imaging unit 130 is output to the control unit 110.
- the imaging unit 130 is integrated with the information processing system 10, but the imaging unit 130 may be configured separately from the information processing system 10.
- an imaging device connected to the information processing system 10 by wire or wireless may be handled as the imaging unit 130.
- the storage unit 140 stores a program for operating the control unit 110 using a storage medium such as a semiconductor memory or a hard disk. Further, for example, the storage unit 140 can store various data used by the program. In the example illustrated in FIG. 2, the storage unit 140 is integrated with the information processing system 10, but the storage unit 140 may be configured separately from the information processing system 10.
- the display unit 150 displays various information according to control by the control unit 110.
- For example, the display unit 150 may be realized by an LCD (Liquid Crystal Display), an organic EL (Electro-Luminescence) display, a projector, or a hologram display.
- the display unit 150 is integrated with the information processing system 10, but the display unit 150 may be configured separately from the information processing system 10.
- a display device connected to the information processing system 10 by wire or wireless may be handled as the display unit 150.
- the line-of-sight detection unit 111 detects the user's line of sight.
- The method by which the line-of-sight detection unit 111 detects the user's line of sight is not particularly limited.
- the line-of-sight detection unit 111 may detect the user's line of sight based on the eye area. More specifically, the line-of-sight detection unit 111 may detect the user's line of sight by pattern matching with respect to the eye region.
- the operation detection unit 112 detects a space operation by the user based on the detection data. Then, the position control unit 113 determines the pointing position P based on the operation detected by the operation detection unit 112.
- the position control unit 113 calculates the XY coordinates in the display area 151 according to the initial posture of the detection unit 120 and the angular velocity detected by the gyro sensor as the pointing position P will be mainly described. However, the determination of the pointing position P may be made in any way as will be described later.
- the position control unit 113 moves the pointing position P based on the user's line of sight when a predetermined operation is detected by the operation detection unit 112. Then, the pointing position P can be easily moved to a position desired by the user.
- the predetermined operation detected by the operation detection unit 112 is not particularly limited.
- the operation detection unit 112 may detect a predetermined operation based on the detection data.
- the operation detection unit 112 may detect a predetermined movement recognized from the detection data as a predetermined operation.
- a predetermined change in the XY coordinates in the display area 151 according to the initial posture of the detection unit 120 and the angular velocity detected by the gyro sensor is treated as a predetermined movement
- the predetermined movement is not particularly limited.
- a predetermined change in acceleration detected by the acceleration sensor may be treated as a predetermined movement, or a predetermined change in angular velocity detected by the gyro sensor may be treated as a predetermined movement.
- the operation detection unit 112 may detect a predetermined state recognized from the detection data as a predetermined operation.
- FIGS. 3 and 4 are diagrams for explaining examples of conditions for detecting a user operation as a predetermined operation.
- FIG. 3 shows a locus of the pointing position P, and shows a start position Ps and an end position Pe.
- the operation detection unit 112 may detect a predetermined operation when a change in movement in a predetermined direction satisfies a predetermined condition.
- FIG. 3 shows an example in which a predetermined operation is detected by the operation detection unit 112 when the horizontal inversion exceeds the upper limit number of times within a predetermined time.
- Although FIG. 3 shows an example in which the upper limit number is 3, the upper limit number is not particularly limited.
- the predetermined direction is not limited to the horizontal direction, and may be the vertical direction or another direction.
- the predetermined time is not particularly limited.
- For example, the operation detection unit 112 may detect the predetermined operation when the third inversion is detected within a predetermined time t from the first inversion.
- The predetermined time t is not particularly limited, but may be, for example, 400 ms. Further, the operation detection unit 112 may reset the elapsed time from the first reversal when the third reversal is not detected within the predetermined time t from the first reversal.
- FIG. 3 shows an example in which a predetermined operation is detected by the operation detection unit 112 when the horizontal inversion exceeds the upper limit number of times within a predetermined time, but the manner in which the predetermined operation is detected is not limited to this example.
- the predetermined operation may be detected by the operation detection unit 112 when a predetermined locus is drawn by the movement recognized from the detection data.
- the predetermined locus is not particularly limited, but may be a circle or a handwriting such as a predetermined character or a predetermined symbol.
- As described above, a predetermined operation can be detected by the operation detection unit 112, but the predetermined operation may also be erroneously detected.
- For example, the detection unit 120 may happen to vibrate in the horizontal direction, causing the horizontal inversion to exceed the upper limit number of times within the predetermined time. Therefore, even when a predetermined movement is detected, there may be cases where the predetermined operation should not be detected.
- a specific example will be described in more detail with reference to FIG.
- FIG. 5 is a diagram for explaining an example of a condition for not detecting an operation by the user as a predetermined operation. For example, even if the horizontal reversal exceeds the upper limit number of times within the predetermined time, the operation detection unit 112 need not detect the predetermined operation when the movement amount M (see FIG. 5) of the pointing position P within the predetermined time exceeds an upper limit amount. Similarly, even if the horizontal reversal exceeds the upper limit number of times within the predetermined time, the operation detection unit 112 need not detect the predetermined operation when the horizontal displacement W (see FIG. 5) of the pointing position P within the predetermined time does not exceed a lower limit amount.
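The shake-gesture detection described around FIGS. 3–5 can be sketched as a sliding time window over pointing-position samples: count horizontal direction reversals, fire when the count reaches the limit, and suppress detection when the total movement M is too large (a deliberate cursor move) or the shake width W is too small (likely noise). All class and parameter names, thresholds, and the exact suppression formulas are illustrative assumptions, not the patent's specification:

```python
from collections import deque

class ShakeDetector:
    def __init__(self, window_ms=400, reversal_limit=3,
                 max_total_movement=2000.0, min_width=40.0):
        self.window_ms = window_ms                  # predetermined time
        self.reversal_limit = reversal_limit        # upper limit number
        self.max_total_movement = max_total_movement  # upper limit on M
        self.min_width = min_width                  # lower limit on W
        self.samples = deque()                      # (timestamp_ms, x)

    def feed(self, t_ms, x):
        """Feed one pointing-position sample; return True when detected."""
        self.samples.append((t_ms, x))
        # Drop samples older than the window (implicitly resets the count).
        while self.samples and t_ms - self.samples[0][0] > self.window_ms:
            self.samples.popleft()
        xs = [s for _, s in self.samples]
        if len(xs) < 3:
            return False
        # Count horizontal direction reversals within the window.
        reversals = sum(1 for a, b, c in zip(xs, xs[1:], xs[2:])
                        if (b - a) * (c - b) < 0)
        if reversals < self.reversal_limit:
            return False
        # Suppression 1: overall movement M too large -> deliberate move.
        m = sum(abs(b - a) for a, b in zip(xs, xs[1:]))
        if m > self.max_total_movement:
            return False
        # Suppression 2: shake width W too small -> likely vibration noise.
        w = max(xs) - min(xs)
        if w < self.min_width:
            return False
        return True
```

Using a deque that discards samples older than the window naturally implements the "reset the elapsed time from the first reversal" behavior: reversals that are too far apart never coexist in the window.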
- FIG. 6 is a diagram illustrating a cursor display example before moving the position when a predetermined operation is detected.
- the display control unit 114 displays the cursor Cu at the pointing position P.
- the user U wants to move the pointing position P.
- the user U swings the detection unit 120 in the horizontal direction, and moves the line of sight to a position where the pointing position P is desired to be moved.
- a predetermined operation is detected by the operation detection unit 112, and the pointing position P is moved based on the line of sight.
- the destination of the pointing position P may be an intersection of the user U's line of sight and the display area 151. Referring to FIG. 6, such an intersection is shown as viewpoint V.
- FIG. 7 is a diagram showing an object display example when a predetermined operation is detected. As illustrated in FIG. 7, the display control unit 114 may display the object Obj when a predetermined operation is detected.
- the object Obj may be any object, but may be an object based on the pointing position P.
- FIG. 7 shows an example where the object Obj is a circle centered on the pointing position P.
- After the pointing position P is moved, control of the pointing position P may be continued. However, if the pointing position P moves while the user U is still performing the gesture that triggered the predetermined operation, the pointing position P may move against the will of the user U. Therefore, the position control unit 113 may fix the pointing position P at least until a predetermined time elapses after the object Obj is displayed.
- FIG. 7 shows a moving image in which the object Obj gradually becomes smaller, converges to the viewpoint V, and disappears when a predetermined time has elapsed since the display of the object Obj was started.
- In this manner, the display control unit 114 may display an object such that the user U can grasp the period during which the pointing position P is fixed. The user U can then grasp the timing at which movement of the pointing position P resumes.
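The gaze jump and the temporary fixation described above can be sketched as a small controller: on gesture detection the pointing position jumps to the viewpoint V (the intersection of the line of sight and the display area), and motion input is then ignored until a freeze period elapses so the tail of the gesture cannot drag the position away. The class, method names, and the 500 ms freeze duration are illustrative assumptions:

```python
class PointingController:
    FREEZE_MS = 500  # assumed period during which the position stays fixed

    def __init__(self):
        self.pos = (0.0, 0.0)
        self.frozen_until_ms = -1

    def on_motion(self, t_ms, dx, dy):
        """Normal sensor-driven movement; ignored while the position is fixed."""
        if t_ms < self.frozen_until_ms:
            return self.pos
        x, y = self.pos
        self.pos = (x + dx, y + dy)
        return self.pos

    def on_shake_detected(self, t_ms, gaze_point):
        """Jump to viewpoint V and hold the position until FREEZE_MS elapses."""
        self.pos = gaze_point
        self.frozen_until_ms = t_ms + self.FREEZE_MS
        return self.pos
```

The shrinking object Obj of FIG. 7 would be animated over the same `FREEZE_MS` interval, so the end of the animation coincides with the moment motion input takes effect again.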
- FIG. 8 is a diagram illustrating a cursor display example after the position is moved when a predetermined operation is detected.
- the display control unit 114 may display the cursor Cu at the pointing position P that has been moved based on the line of sight of the user U in this way.
- the position control unit 113 can control the pointing position P based on the detection data. In such a case, the display control unit 114 may move the cursor Cu in accordance with the movement of the pointing position P.
- the display control unit 114 may display a predetermined indicator when the line of sight is not detected.
- the indicator is not particularly limited, and may be a display of an object (for example, a message) indicating that the line of sight is not detected.
- the display control unit 114 may change the cursor Cu when the line of sight is not detected.
- The manner of changing the cursor Cu is not limited; for example, at least one of the color, shape, and sharpness of the cursor Cu may be changed.
- The display control unit 114 may perform predetermined feedback when a predetermined operation is detected while the line of sight is not detected.
- the feedback method is not particularly limited, and may be a display of an object (for example, a message) indicating that the line of sight is not detected.
- The cursor Cu may always be displayed, but may be hidden when a predetermined cursor non-display condition is satisfied. That is, the display control unit 114 may hide the cursor Cu when a predetermined cursor non-display condition is satisfied. The cursor non-display condition is not particularly limited, but the display control unit 114 may hide the cursor Cu when the amount of movement recognized from the detection data remains below a threshold for a predetermined time. In such a case, it is considered that the user U does not intend to change the pointing position P.
- The display control unit 114 may perform predetermined feedback when a predetermined operation is detected while the line of sight is not detected in a state where the cursor Cu is displayed.
- the feedback method is not particularly limited, and may be a display of an object (for example, a message) indicating that the line of sight is not detected.
- The position control unit 113 may move the pointing position P based on the line of sight when a predetermined operation is detected in a state where the cursor Cu is hidden. This has the effect of reducing the possibility that the user loses sight of the pointing position P.
- the destination of the pointing position P may be an intersection of the user U's line of sight and the display area 151.
- On the other hand, the position control unit 113 cannot move the pointing position P according to the line of sight when a predetermined redisplay operation is detected while the line of sight is not detected in the state where the cursor Cu is hidden. In such a case, the position control unit 113 may therefore set the pointing position P to the position at which the cursor Cu was when it was hidden.
- the predetermined redisplay operation is not particularly limited, and may be a transition from a state of no motion detected based on detection data to a state of motion.
- FIG. 9 is a flowchart showing a flow of operations of the information processing system 10 according to the embodiment of the present disclosure.
- the operation flow illustrated in FIG. 9 is merely an example of the operation flow of the information processing system 10. Therefore, the operation flow of the information processing system 10 is not limited to the example shown in FIG.
- the operation detection unit 112 detects an operation on the space by the user based on the detection data (S11). Subsequently, the position control unit 113 determines a pointing position based on the operation detected by the operation detection unit 112 (S12). Subsequently, the display control unit 114 displays a cursor at the pointing position (S13). Note that the cursor may not be displayed. Subsequently, the line-of-sight detection unit 111 detects the user's line of sight (S14).
- the position control unit 113 determines whether a predetermined operation has been detected by the operation detection unit 112 (S15). When a predetermined operation is not detected by the operation detection unit 112 (“No” in S15), the position control unit 113 shifts the operation to S11. On the other hand, when a predetermined operation is detected by the operation detection unit 112 (“Yes” in S15), the position control unit 113 moves the pointing position based on the line of sight (S16). If the pointing position is moved in this manner, the pointing position can be easily moved to a position desired by the user.
- the display control unit 114 displays a predetermined object (S17).
- the position control unit 113 fixes the pointing position until a predetermined time elapses after at least the object is displayed (S18). Further, the display control unit 114 moves the cursor to the pointing position (S19).
- Control unit 110 determines whether or not to continue the operation (S20), and when it is determined that the operation is to be continued (“Yes” in S20), the operation is shifted to S11. On the other hand, when it is determined that the operation is to be ended (“No” in S20), control unit 110 ends the operation.
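The S11–S20 flow of FIG. 9 can be sketched as one iteration of a polling loop. The helper callables stand in for the operation detection unit, position control unit, display control unit, and line-of-sight detection unit; their names and signatures are illustrative assumptions:

```python
def run_once(state, detect_operation, determine_position, show_cursor,
             detect_gaze, is_trigger_gesture, move_to_gaze):
    """One pass over S11-S16 of the FIG. 9 flowchart (illustrative sketch)."""
    op = detect_operation()                  # S11: detect operation in space
    state["pos"] = determine_position(op)    # S12: determine pointing position
    show_cursor(state["pos"])                # S13: display cursor (optional)
    gaze = detect_gaze()                     # S14: detect line of sight
    if is_trigger_gesture(op):               # S15: predetermined operation?
        state["pos"] = move_to_gaze(gaze)    # S16: move position to gaze
        # S17-S19 would follow here: display the object, fix the position
        # for a predetermined time, and move the cursor accordingly.
    return state["pos"]
```

The surrounding S20 check simply decides whether to call `run_once` again or end the operation.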
- the pointing position may be inside the display area 151 or outside the display area 151. That is, the position control unit 113 may determine the pointing position inside or outside the display area 151 based on an operation in space by the user. As a first example, the position control unit 113 may fix the pointing position to the end of the display area 151 when the pointing position does not fit within the display area 151. According to the first example, when the pointing position is deviated from the position desired by the user, the pointing position can be fixed to the end of the display area 151 and corrected. The first example will be described in detail with reference to FIGS. 10 and 11.
- FIG. 10 is a diagram illustrating a first example of cursor display when the pointing position P reaches the end of the display area 151.
- the position control unit 113 determines the pointing position P based on a spatial operation by the user
- the pointing position P has reached the end of the display area 151.
- The display control unit 114 displays the cursor Cu at the pointing position P. In this situation, it is assumed that an operation is performed that would move the pointing position P outside the display area 151.
- FIG. 11 is a diagram showing a first example of cursor display when the pointing position P does not fit inside the display area 151.
- As shown in FIG. 11, the display control unit 114 fixes the display position of the cursor Cu to the end of the display area 151. Further, the position control unit 113 fixes the pointing position P to the end of the display area 151 when the pointing position P does not fit inside the display area 151.
- In the first example, when the pointing position does not fit within the display area 151, the pointing position is fixed to the end of the display area 151; however, the pointing position may be controlled by other methods.
- the position control unit 113 may move the pointing position to the outside of the display area 151 when the pointing position does not fit inside the display area 151. According to the second example, even when a situation where the pointing position does not fit in the display area 151 occurs, the possibility that the pointing position shifts from a position desired by the user can be reduced. The second example will be described in detail with reference to FIGS. 12 and 13.
- FIG. 12 is a diagram showing a second example of the cursor display when the pointing position P reaches the end of the display area 151.
- When the position control unit 113 determines the pointing position P based on a spatial operation by the user, the pointing position P may reach the end of the display area 151.
- The display control unit 114 displays the cursor Cu at the pointing position P. In this situation, it is assumed that an operation is performed that would move the pointing position P outside the display area 151.
- FIG. 13 is a diagram showing a second example of the cursor display when the pointing position P does not fit inside the display area 151.
- As shown in FIG. 13, the display control unit 114 fixes the display position of the cursor Cu to the end of the display area 151.
- On the other hand, the position control unit 113 determines the pointing position P as XY coordinates on a virtual plane obtained by extending the display area 151.
- At this time, the display control unit 114 may change the cursor Cu according to the distance D between the pointing position P and the end of the display area 151.
- FIG. 13 shows an example in which the display control unit 114 increases the degree of deformation of the cursor Cu as the distance D increases.
- the change of the cursor Cu may be a change of the color of the cursor Cu.
- the display control unit 114 may increase the color change of the cursor Cu as the distance D increases.
- Alternatively, the change of the cursor Cu may be a change in the sharpness of the cursor Cu.
- the display control unit 114 may increase the change in the sharpness of the cursor Cu as the distance D increases.
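The second example can be sketched as follows: the pointing position lives on a virtual plane extending beyond the display area, the drawn cursor is clamped to the edge, and a deformation factor grows with the distance D between the pointing position and the edge. The function name, display dimensions, and the particular deformation formula (linear in D, saturating) are assumptions for illustration only:

```python
def cursor_state(px, py, width=1920, height=1080, max_stretch=3.0):
    """Return (cursor display position, deformation factor) for a
    virtual-plane pointing position (px, py)."""
    # Cursor display position: clamp the virtual-plane position to the edge.
    cx = min(max(px, 0.0), float(width - 1))
    cy = min(max(py, 0.0), float(height - 1))
    # Distance D between the pointing position and the display-area edge.
    d = ((px - cx) ** 2 + (py - cy) ** 2) ** 0.5
    # Deformation grows with D and saturates at max_stretch (assumed rule).
    stretch = 1.0 + min(d / 200.0, max_stretch - 1.0)
    return (cx, cy), stretch
```

The same `d` value could instead drive a color or sharpness change, matching the alternatives mentioned above.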
- the operation detection unit 112 detects an operation in space by the user based on the detection data.
- an image captured by the imaging unit 130 may be used instead of the detection data detected by the detection unit 120.
- FIG. 14 is a diagram illustrating a modification of the functional configuration of the information processing system 10 according to the embodiment of the present disclosure.
- the information processing system 10 may not include the detection unit 120.
- the operation detection unit 112 may recognize a predetermined recognition target from the image captured by the imaging unit 130 and detect an operation based on the movement or state of the recognition target.
- the recognition target may be an object moved by the user or the user's own body.
- the user's own body may be a part of the user's own body or the entire user's own body.
- the position control unit 113 may determine the pointing position based on the direction of the object.
- the position control unit 113 may determine the pointing position based on the movement of the object.
- Further, the operation detection unit 112 may detect a predetermined operation when the object shows a predetermined movement, or may detect a predetermined operation when the object enters a predetermined state.
- the position control unit 113 may determine the pointing position based on the direction of the user's own body.
- the direction of the user's own body may be the direction indicated by the user's finger or the direction from the elbow to the fingertip.
- the position control unit 113 may determine the pointing position based on the movement of the user's own body.
- Further, the operation detection unit 112 may detect a predetermined operation when the user's own body shows a predetermined movement, or may detect a predetermined operation when the user's own body enters a predetermined state.
- the predetermined movement may be a movement in which the user shakes his hand, or may be another movement.
- the predetermined state may be a state where the user has spread his hand, a state where the user has held his hand, or another state.
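A minimal sketch of this kind of operation detection, assuming the recognition target is the user's hand and that a per-frame openness value and lateral speed are available from the captured image; the thresholds, field names, and labels are all assumptions:

```python
from dataclasses import dataclass
from typing import Optional

@dataclass
class HandObservation:
    openness: float    # 0.0 = closed fist, 1.0 = fully spread hand
    velocity_x: float  # lateral speed of the hand, e.g. pixels per frame

def detect_predetermined_operation(obs: HandObservation,
                                   open_threshold: float = 0.8,
                                   shake_speed: float = 30.0) -> Optional[str]:
    """Detect a predetermined movement (hand shake) or a predetermined
    state (spread or closed hand) from a single observation."""
    if abs(obs.velocity_x) >= shake_speed:
        return "shake"          # predetermined movement
    if obs.openness >= open_threshold:
        return "hand_spread"    # predetermined state
    if obs.openness <= 1.0 - open_threshold:
        return "hand_closed"    # predetermined state
    return None                 # no predetermined operation detected
```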
- FIG. 15 is a diagram illustrating a hardware configuration example of the information processing system 10 according to the embodiment of the present disclosure.
- the hardware configuration example illustrated in FIG. 15 is merely an example of the hardware configuration of the information processing system 10. Therefore, the hardware configuration of the information processing system 10 is not limited to the example illustrated in FIG.
- the information processing system 10 includes a CPU (Central Processing Unit) 801, a ROM (Read Only Memory) 802, a RAM (Random Access Memory) 803, a sensor 804, an input device 808, an output device 810, a storage device 811, a drive 812, an imaging device 813, and a communication device 815.
- the CPU 801 functions as an arithmetic processing unit and a control unit, and controls the overall operation in the information processing system 10 according to various programs. Further, the CPU 801 may be a microprocessor.
- the ROM 802 stores programs used by the CPU 801, calculation parameters, and the like.
- the RAM 803 temporarily stores programs used in the execution of the CPU 801, parameters that change as appropriate during the execution, and the like. These are connected to each other by a host bus including a CPU bus.
- the sensor 804 includes various detection sensors such as a state detection sensor for detecting the state of the information processing system 10 and its peripheral circuits.
- Examples of the sensor 804 include a positioning sensor, an inclination sensor, an acceleration sensor, a gyro sensor, an orientation sensor, a temperature sensor, a humidity sensor, and an illuminance sensor.
- a detection signal from the sensor 804 is sent to the CPU 801. Thereby, the CPU 801 can know the state of the information processing system 10 (position, inclination, acceleration, angular velocity, direction, temperature, humidity, illuminance, etc.).
- the input device 808 includes an operation unit for the user to input information, such as a mouse, a keyboard, a touch panel, a button (such as a power button), a microphone, a switch, a dial, and a lever, and an input control circuit that generates an input signal based on the input by the user and outputs it to the CPU 801.
- a user of the information processing system 10 can input various data and instruct a processing operation to the information processing system 10 by operating the input device 808.
- the positions at which these operation units are provided are not particularly limited.
- the operation unit may be provided on the side surface of the housing of the information processing system 10 or may be provided on the same surface as the surface on which the display is provided.
- the output device 810 may include a display device such as an LCD (Liquid Crystal Display), an OLED (Organic Light Emitting Diode) display, or a lamp. The output device 810 may also include an audio output device such as a speaker or headphones. For example, the display device displays a captured image or a generated image, while the audio output device converts audio data into sound and outputs it.
- the storage device 811 is a data storage device configured as an example of a storage unit of the information processing system 10.
- the storage device 811 may include a storage medium, a recording device that records data on the storage medium, a reading device that reads data from the storage medium, a deletion device that deletes data recorded on the storage medium, and the like.
- the storage device 811 stores programs executed by the CPU 801 and various data.
- the drive 812 is a reader / writer for a storage medium, and is built in or externally attached to the information processing system 10.
- the drive 812 reads information recorded on a removable storage medium such as a mounted magnetic disk, optical disk, magneto-optical disk, or semiconductor memory, and outputs the information to the RAM 803.
- the drive 812 can also write information in a removable storage medium.
- the imaging device 813 is a device that images a real space and generates a captured image, using members such as an imaging element (for example, a CCD (Charge Coupled Device) or CMOS (Complementary Metal Oxide Semiconductor) sensor) and a lens for controlling the formation of a subject image on the imaging element.
- the imaging device 813 may capture a still image, or may capture a moving image.
- the communication device 815 communicates with an external device via a network (or directly).
- the communication device 815 may be an interface for wireless communication, and may include, for example, a communication antenna, an RF (Radio Frequency) circuit, a baseband processor, and the like.
- specific examples of the interface for wireless communication include communication units such as modems compatible with communication schemes such as CDMA (Code Division Multiple Access), W-CDMA (Wideband Code Division Multiple Access), LTE (Long Term Evolution), and Wi-Fi (registered trademark) (Wireless Fidelity).
- the communication device 815 may be an interface for wired communication, and may include, for example, a connection terminal, a transmission circuit, and other communication processing circuits. Further, the CPU 801 and the communication device 815 may be configured as a single chip or realized as separate devices. Although not shown in FIG. 15, the information processing system 10 may be driven by power supplied from a power source such as a rechargeable battery, and the power source may be configured to be detachable from the information processing system 10.
- the display unit 150 and the storage unit 140 can be realized by the output device 810 and the storage device 811, respectively.
- the control unit 110 can be realized by the CPU 801. Therefore, a program for causing a computer to function as the information processing system 10 having the control unit 110 is held in the storage device 811, the ROM 802, or the RAM 803, and the CPU 801 can execute the program.
- the configuration for outputting the display control information to the display unit 150 corresponds to an example of the “output unit”.
- the output unit may be realized by a device such as a signal line positioned between the CPU 801 and the bus illustrated in FIG. 15.
- the display control information can be realized by, for example, a video signal such as an RGB signal or an HDMI (High-Definition Multimedia Interface) signal for the CPU 801 to control the output device 810 (for example, a display).
- each of the detection unit 120, the display unit 150, the control unit 110, and the storage unit 140 may be provided in different information processing apparatuses connected via a network.
- the information processing apparatus provided with the control unit 110 and the storage unit 140 corresponds to, for example, a server such as a web server or a cloud server, while the detection unit 120 and the display unit 150 correspond to clients connected to the server via the network.
- a configuration in which the server having the control unit 110 transmits display control information to the client via a network (for example, a communication interface such as a modem) corresponds to an example of the "output unit".
- the content of the display control information may be changed as appropriate according to the system configuration.
- the display control information may be realized by a markup language such as HTML (HyperText Markup Language), SGML (Standard Generalized Markup Language), or XML (Extensible Markup Language).
- the display control information described above is merely an example, and the method for transmitting and receiving information between the display control unit 114 and the display unit 150, the type of transmission path, or the medium used for transmitting and receiving information (for example, a radio signal or light) may be changed as appropriate.
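For the server/client split described above, the display control information could be emitted as markup; a sketch assuming an HTML rendering of the cursor (the class name, styling, and glyph are arbitrary illustrative choices):

```python
def cursor_markup(x: int, y: int, color: str = "red") -> str:
    """Render display control information as an HTML fragment that
    places a cursor glyph at the pointing position (x, y)."""
    return (f'<div class="cursor" style="position:absolute;'
            f'left:{x}px;top:{y}px;color:{color}">&#9679;</div>')
```

A server-side control unit could send such a fragment to the client, which merely renders it, so that all pointing logic stays on the server.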
- As described above, according to the embodiment of the present disclosure, there is provided an information processing system including: a line-of-sight detection unit that detects the line of sight of the user; an operation detection unit that detects an operation in space by the user; and a position control unit that determines the pointing position based on the operation, wherein the pointing position is moved based on the line of sight when a predetermined operation is detected. According to such a configuration, the pointing position can be easily moved to a position desired by the user.
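The core behavior, a pointing position that follows the spatial operation but warps to the gaze position when the predetermined operation is detected, can be sketched as follows; the class and method names are assumptions, not the patent's API:

```python
from typing import Optional, Tuple

class PositionController:
    """Sketch of the described control: the pointing position follows
    the operation in space, and jumps to the gaze position when a
    predetermined operation is detected (all names are assumptions)."""

    def __init__(self, x: float = 0.0, y: float = 0.0) -> None:
        self.x, self.y = x, y

    def apply_operation(self, dx: float, dy: float) -> None:
        # Ordinary operation: move the pointing position relatively.
        self.x += dx
        self.y += dy

    def on_predetermined_operation(self, gaze: Optional[Tuple[float, float]]) -> None:
        # Predetermined operation detected: warp to the gaze position
        # if the line of sight is currently detected; when it is not,
        # the position is left unchanged (the system would instead
        # give predetermined feedback).
        if gaze is not None:
            self.x, self.y = gaze
```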
- the position control unit 113 may determine the pointing positions of each of the plurality of users. Further, when a predetermined operation is detected, the position control unit 113 may move the user's pointing position according to the line of sight of the user who performed the operation.
- the display control unit 114 may display a cursor at the pointing position of each of a plurality of users. At this time, the cursors of the plurality of users may be different for each user. For example, the display control unit 114 may display the cursor in a different color for each user. In such a case, the display control unit 114 may match the color of the cursor with the color of the detection unit 120 operated by the user. The color of the detection unit 120 may be recognized from the image by the display control unit 114, or may be determined according to the ID of the detection unit 120 registered in the information processing system 10.
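One way to realize per-user cursor colors, assuming each detection unit has an integer ID registered with the system; the fallback palette is an invented detail, not part of the disclosure:

```python
PALETTE = ["red", "green", "blue", "orange", "purple"]

def cursor_color_for(detector_id: int, registered: dict) -> str:
    """Match the cursor color to the color registered for the user's
    detection unit; fall back to a palette keyed by the detector ID
    so that different users still get distinct colors."""
    return registered.get(detector_id, PALETTE[detector_id % len(PALETTE)])
```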
- the operation of the information processing system 10 does not necessarily have to be performed in chronological order in the order described in the flowchart.
- the operation of the information processing system 10 may be performed in an order different from the order described in the flowchart, or at least some of the operations described in the flowchart may be performed in parallel.
- An information processing system including: a line-of-sight detection unit that detects the line of sight of the user; an operation detection unit that detects an operation in space by the user; and a position control unit that determines a pointing position based on the operation, wherein the position control unit moves the pointing position based on the line of sight when a predetermined operation is detected.
- the information processing system includes a display control unit that displays a predetermined object when the predetermined operation is detected. The information processing system according to (1).
- the position control unit fixes the pointing position until a predetermined time elapses after at least the predetermined object is displayed; The information processing system according to (2).
- the display control unit performs a predetermined feedback when the predetermined operation is detected when the line of sight is not detected.
- the display control unit displays a predetermined indicator when the line of sight is not detected.
- the display control unit changes a cursor when the line of sight is not detected.
- the operation detection unit detects the predetermined operation based on detection data.
- the operation detection unit detects a predetermined state recognized from the detection data as the predetermined operation;
- the operation detection unit detects a predetermined movement recognized from the detection data as the predetermined operation; The information processing system according to (7).
- the display control unit displays a cursor at the pointing position; The information processing system according to any one of (2) to (9).
- the display control unit performs predetermined feedback when the predetermined operation is detected when the line of sight is not detected in a state where the cursor is displayed.
- the display control unit hides the cursor when a predetermined cursor non-display condition is satisfied. The information processing system according to any one of (2) to (11).
- the position control unit moves the pointing position based on the line of sight when the predetermined operation is detected in a state where the cursor is hidden.
- when a predetermined redisplay operation is detected while the line of sight is not detected in a state where the cursor is hidden, the position control unit determines the pointing position at the position of the cursor at the time the cursor was hidden. The information processing system according to (12).
- the position control unit determines the pointing position inside or outside a display area based on the operation; The information processing system according to any one of (2) to (14).
- the display control unit fixes the position of the cursor to an end of the display area when the pointing position does not fit inside the display area.
- the display control unit changes the cursor according to a distance between the pointing position and an end of the display area when the pointing position does not fit inside the display area.
- the position control unit fixes the pointing position to an end of the display area when the pointing position does not fit inside the display area.
- A program for causing a computer to function as an information processing system including: a line-of-sight detection unit that detects the line of sight of the user; an operation detection unit that detects an operation in space by the user; and a position control unit that determines a pointing position based on the operation, wherein the position control unit moves the pointing position based on the line of sight when a predetermined operation is detected.
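Embodiments (16) to (18) above fix the cursor or the pointing position to the end of the display area and change the cursor according to the distance D; a minimal sketch of such clamping (the function and parameter names are assumptions):

```python
def clamp_to_display(px: float, py: float, width: float, height: float):
    """Fix the cursor to the end of the display area when the pointing
    position does not fit inside it, and report the distance D between
    the pointing position and that end (usable to change the cursor)."""
    cx = min(max(px, 0.0), width)
    cy = min(max(py, 0.0), height)
    distance_d = ((px - cx) ** 2 + (py - cy) ** 2) ** 0.5
    return (cx, cy), distance_d
```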
Abstract
Description
1. Overview of the information processing system
2. Example functional configuration of the information processing system
3. Functional details of the information processing system
4. Modification of the functional configuration of the information processing system
5. Example hardware configuration of the information processing system
6. Conclusion
First, an overview of the information processing system 10 according to the embodiment of the present disclosure will be described. FIG. 1 is a diagram for describing an overview of the information processing system 10 according to the embodiment of the present disclosure. Referring to FIG. 1, the information processing system 10 includes a detection unit 120, an imaging unit 130, and a display unit 150. The display unit 150 has a display area 151 and displays a screen in the display area 151. The screen displayed in the display area 151 is not particularly limited. As illustrated in FIG. 1, the user U may be present at a position where the screen displayed in the display area 151 can be viewed.
Next, an example functional configuration of the information processing system 10 according to the embodiment of the present disclosure will be described. FIG. 2 is a diagram illustrating an example functional configuration of the information processing system 10 according to the embodiment of the present disclosure. As illustrated in FIG. 2, the information processing system 10 includes a control unit 110, a detection unit 120, an imaging unit 130, a storage unit 140, and a display unit 150.
Next, functional details of the information processing system 10 according to the embodiment of the present disclosure will be described. In the information processing system 10, the line-of-sight detection unit 111 detects the line of sight of the user. The technique by which the line-of-sight detection unit 111 detects the user's line of sight is not particularly limited. For example, when an eye region appears in an image captured by the imaging unit 130, the line-of-sight detection unit 111 may detect the user's line of sight based on the eye region. More specifically, the line-of-sight detection unit 111 may detect the user's line of sight by pattern matching on the eye region.
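The eye-region pattern matching mentioned above could, in its simplest form, compare the eye region against stored templates; the sum-of-squared-differences criterion and the direction labels below are illustrative assumptions, not the disclosed method:

```python
def best_match(eye_region, templates):
    """Pick the gaze-direction label whose template is closest to the
    flattened eye-region image (smallest sum of squared differences)."""
    def sse(a, b):
        return sum((x - y) ** 2 for x, y in zip(a, b))
    return min(templates, key=lambda label: sse(eye_region, templates[label]))
```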
As described above, the operation detection unit 112 detects an operation in space by the user based on the detection data. Although the above example shows the detection data being detected by the detection unit 120, an image captured by the imaging unit 130 may be used instead of the detection data detected by the detection unit 120.
Next, an example hardware configuration of the information processing system 10 according to the embodiment of the present disclosure will be described. FIG. 15 is a diagram illustrating an example hardware configuration of the information processing system 10 according to the embodiment of the present disclosure. However, the hardware configuration example illustrated in FIG. 15 is merely one example of the hardware configuration of the information processing system 10. Therefore, the hardware configuration of the information processing system 10 is not limited to the example illustrated in FIG. 15.
Examples include communication units such as modems compatible with communication schemes such as CDMA (Code Division Multiple Access), W-CDMA (Wideband Code Division Multiple Access), LTE (Long Term Evolution), and Wi-Fi (registered trademark) (Wireless Fidelity).
As described above, according to the embodiment of the present disclosure, there is provided an information processing system including: a line-of-sight detection unit that detects the line of sight of a user; an operation detection unit that detects an operation in space by the user; and a position control unit that determines a pointing position based on the operation, wherein the position control unit moves the pointing position based on the line of sight when a predetermined operation is detected. With such a configuration, the pointing position can be easily moved to a position desired by the user.
(1)
An information processing system including:
a line-of-sight detection unit that detects the line of sight of a user;
an operation detection unit that detects an operation in space by the user; and
a position control unit that determines a pointing position based on the operation,
wherein the position control unit moves the pointing position based on the line of sight when a predetermined operation is detected.
(2)
The information processing system according to (1), wherein the information processing system includes a display control unit that displays a predetermined object when the predetermined operation is detected.
(3)
The information processing system according to (2), wherein the position control unit fixes the pointing position at least until a predetermined time elapses after the predetermined object is displayed.
(4)
The information processing system according to (2) or (3), wherein the display control unit performs predetermined feedback when the predetermined operation is detected while the line of sight is not detected.
(5)
The information processing system according to any one of (2) to (4), wherein the display control unit displays a predetermined indicator when the line of sight is not detected.
(6)
The information processing system according to any one of (2) to (4), wherein the display control unit changes a cursor when the line of sight is not detected.
(7)
The information processing system according to any one of (1) to (6), wherein the operation detection unit detects the predetermined operation based on detection data.
(8)
The information processing system according to (7), wherein the operation detection unit detects, as the predetermined operation, a predetermined state recognized from the detection data.
(9)
The information processing system according to (7), wherein the operation detection unit detects, as the predetermined operation, a predetermined movement recognized from the detection data.
(10)
The information processing system according to any one of (2) to (9), wherein the display control unit displays a cursor at the pointing position.
(11)
The information processing system according to (10), wherein the display control unit performs predetermined feedback when the predetermined operation is detected while the line of sight is not detected in a state where the cursor is displayed.
(12)
The information processing system according to any one of (2) to (11), wherein the display control unit hides the cursor when a predetermined cursor non-display condition is satisfied.
(13)
The information processing system according to (12), wherein the position control unit moves the pointing position based on the line of sight when the predetermined operation is detected in a state where the cursor is hidden.
(14)
The information processing system according to (12), wherein, when a predetermined redisplay operation is detected while the line of sight is not detected in a state where the cursor is hidden, the position control unit determines the pointing position at the position of the cursor at the time the cursor was hidden.
(15)
The information processing system according to any one of (2) to (14), wherein the position control unit determines the pointing position inside or outside a display area based on the operation.
(16)
The information processing system according to (15), wherein the display control unit fixes the position of the cursor to an end of the display area when the pointing position does not fit inside the display area.
(17)
The information processing system according to (16), wherein the display control unit changes the cursor according to a distance between the pointing position and the end of the display area when the pointing position does not fit inside the display area.
(18)
The information processing system according to any one of (1) to (14), wherein the position control unit fixes the pointing position to an end of the display area when the pointing position does not fit inside the display area.
(19)
An information processing method including:
detecting the line of sight of a user;
detecting an operation in space by the user;
determining a pointing position based on the operation; and
moving, by a processor, the pointing position based on the line of sight when a predetermined operation is detected.
(20)
A program for causing a computer to function as an information processing system including:
a line-of-sight detection unit that detects the line of sight of a user;
an operation detection unit that detects an operation in space by the user; and
a position control unit that determines a pointing position based on the operation,
wherein the position control unit moves the pointing position based on the line of sight when a predetermined operation is detected.
110 control unit
111 line-of-sight detection unit
112 operation detection unit
113 position control unit
114 display control unit
120 detection unit
130 imaging unit
140 storage unit
150 display unit
151 display area
Cu cursor
D distance
Obj object
P pointing position
U user
Claims (20)
- An information processing system including: a line-of-sight detection unit that detects the line of sight of a user; an operation detection unit that detects an operation in space by the user; and a position control unit that determines a pointing position based on the operation, wherein the position control unit moves the pointing position based on the line of sight when a predetermined operation is detected.
- The information processing system according to claim 1, wherein the information processing system includes a display control unit that displays a predetermined object when the predetermined operation is detected.
- The information processing system according to claim 2, wherein the position control unit fixes the pointing position at least until a predetermined time elapses after the predetermined object is displayed.
- The information processing system according to claim 2, wherein the display control unit performs predetermined feedback when the predetermined operation is detected while the line of sight is not detected.
- The information processing system according to claim 2, wherein the display control unit displays a predetermined indicator when the line of sight is not detected.
- The information processing system according to claim 2, wherein the display control unit changes a cursor when the line of sight is not detected.
- The information processing system according to claim 1, wherein the operation detection unit detects the predetermined operation based on detection data.
- The information processing system according to claim 7, wherein the operation detection unit detects, as the predetermined operation, a predetermined state recognized from the detection data.
- The information processing system according to claim 7, wherein the operation detection unit detects, as the predetermined operation, a predetermined movement recognized from the detection data.
- The information processing system according to claim 2, wherein the display control unit displays a cursor at the pointing position.
- The information processing system according to claim 10, wherein the display control unit performs predetermined feedback when the predetermined operation is detected while the line of sight is not detected in a state where the cursor is displayed.
- The information processing system according to claim 2, wherein the display control unit hides the cursor when a predetermined cursor non-display condition is satisfied.
- The information processing system according to claim 12, wherein the position control unit moves the pointing position based on the line of sight when the predetermined operation is detected in a state where the cursor is hidden.
- The information processing system according to claim 12, wherein, when a predetermined redisplay operation is detected while the line of sight is not detected in a state where the cursor is hidden, the position control unit determines the pointing position at the position of the cursor at the time the cursor was hidden.
- The information processing system according to claim 2, wherein the position control unit determines the pointing position inside or outside a display area based on the operation.
- The information processing system according to claim 15, wherein the display control unit fixes the position of a cursor to an end of the display area when the pointing position does not fit inside the display area.
- The information processing system according to claim 16, wherein the display control unit changes the cursor according to a distance between the pointing position and the end of the display area when the pointing position does not fit inside the display area.
- The information processing system according to claim 1, wherein the position control unit fixes the pointing position to an end of the display area when the pointing position does not fit inside the display area.
- An information processing method including: detecting the line of sight of a user; detecting an operation in space by the user; determining a pointing position based on the operation; and moving, by a processor, the pointing position based on the line of sight when a predetermined operation is detected.
- A program for causing a computer to function as an information processing system including: a line-of-sight detection unit that detects the line of sight of a user; an operation detection unit that detects an operation in space by the user; and a position control unit that determines a pointing position based on the operation, wherein the position control unit moves the pointing position based on the line of sight when a predetermined operation is detected.
Priority Applications (4)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
EP14878312.9A EP3093738A4 (en) | 2014-01-08 | 2014-10-22 | Information processing system, information processing method, and program |
CN201480072110.7A CN105874409A (zh) | 2014-01-08 | 2014-10-22 | 信息处理系统、信息处理方法及程序 |
JP2015556712A JP6504058B2 (ja) | 2014-01-08 | 2014-10-22 | 情報処理システム、情報処理方法およびプログラム |
US15/038,329 US20160291692A1 (en) | 2014-01-08 | 2014-10-22 | Information processing system, information processing method, and program |
Applications Claiming Priority (2)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
JP2014-001763 | 2014-01-08 | ||
JP2014001763 | 2014-01-08 |
Publications (1)
Publication Number | Publication Date |
---|---|
WO2015104884A1 true WO2015104884A1 (ja) | 2015-07-16 |
Family
ID=53523726
Family Applications (1)
Application Number | Title | Priority Date | Filing Date |
---|---|---|---|
PCT/JP2014/078117 WO2015104884A1 (ja) | 2014-01-08 | 2014-10-22 | 情報処理システム、情報処理方法およびプログラム |
Country Status (5)
Country | Link |
---|---|
US (1) | US20160291692A1 (ja) |
EP (1) | EP3093738A4 (ja) |
JP (1) | JP6504058B2 (ja) |
CN (1) | CN105874409A (ja) |
WO (1) | WO2015104884A1 (ja) |
Cited By (5)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
JP2017091327A (ja) * | 2015-11-12 | 2017-05-25 | 富士通株式会社 | ポインティング支援装置、ポインティング支援方法およびポインティング支援プログラム |
WO2020195201A1 (ja) * | 2019-03-25 | 2020-10-01 | 富士フイルム株式会社 | 投影制御装置、投影装置、投影制御方法、及び投影制御プログラム |
US11233941B2 (en) | 2019-10-24 | 2022-01-25 | Canon Kabushiki Kaisha | Electronic device that receives line of sight input, method of controlling electronic device, and non-transitory computer readable medium |
JP7383471B2 (ja) | 2019-12-20 | 2023-11-20 | キヤノン株式会社 | 電子機器およびその制御方法 |
JP7455764B2 (ja) | 2018-06-02 | 2024-03-26 | マーシブ テクノロジーズ,インコーポレイティド | モバイルデバイスを使用した共有ディスプレイの注釈システム |
Families Citing this family (1)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
CN107402691A (zh) * | 2017-07-13 | 2017-11-28 | 深圳Tcl新技术有限公司 | 终端光标的显示方法、显示装置及计算机可读存储介质 |
Citations (7)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
WO2009008372A1 (ja) | 2007-07-06 | 2009-01-15 | Sony Corporation | 入力装置、制御装置、制御システム、制御方法及びハンドヘルド装置 |
WO2009072504A1 (ja) * | 2007-12-07 | 2009-06-11 | Sony Corporation | 制御装置、入力装置、制御システム、制御方法及びハンドヘルド装置 |
JP2012048568A (ja) * | 2010-08-27 | 2012-03-08 | Canon Inc | 情報処理装置及び方法 |
JP2012105034A (ja) * | 2010-11-10 | 2012-05-31 | Panasonic Corp | リモートコントロールシステム |
WO2012145180A1 (en) * | 2011-04-21 | 2012-10-26 | Sony Computer Entertainment Inc. | Gaze-assisted computer interface |
JP2013120447A (ja) * | 2011-12-06 | 2013-06-17 | Nippon Soken Inc | 表示制御システム |
JP2013125985A (ja) * | 2011-12-13 | 2013-06-24 | Sharp Corp | 表示システム |
Family Cites Families (9)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
US6204828B1 (en) * | 1998-03-31 | 2001-03-20 | International Business Machines Corporation | Integrated gaze/manual cursor positioning system |
US8896527B2 (en) * | 2009-04-07 | 2014-11-25 | Samsung Electronics Co., Ltd. | Multi-resolution pointing system |
US8982160B2 (en) * | 2010-04-16 | 2015-03-17 | Qualcomm, Incorporated | Apparatus and methods for dynamically correlating virtual keyboard dimensions to user finger size |
WO2011148951A1 (ja) * | 2010-05-26 | 2011-12-01 | 日立化成工業株式会社 | 波長変換型太陽電池封止材、及び太陽電池モジュール |
US8395362B2 (en) * | 2010-10-29 | 2013-03-12 | R2 Semiconductor, Inc. | Controlling a dead time of a switching voltage regulator |
JP5387557B2 (ja) * | 2010-12-27 | 2014-01-15 | カシオ計算機株式会社 | 情報処理装置及び方法、並びにプログラム |
JP2013134717A (ja) * | 2011-12-27 | 2013-07-08 | Aisin Aw Co Ltd | 操作入力システム |
GB2504492A (en) * | 2012-07-30 | 2014-02-05 | John Haddon | Gaze detection and physical input for cursor symbol |
JP2016529635A (ja) * | 2013-08-27 | 2016-09-23 | オークランド ユニサービシズ リミテッド | 凝視制御インターフェース方法およびシステム |
- 2014-10-22 CN CN201480072110.7A patent/CN105874409A/zh active Pending
- 2014-10-22 JP JP2015556712A patent/JP6504058B2/ja active Active
- 2014-10-22 WO PCT/JP2014/078117 patent/WO2015104884A1/ja active Application Filing
- 2014-10-22 US US15/038,329 patent/US20160291692A1/en not_active Abandoned
- 2014-10-22 EP EP14878312.9A patent/EP3093738A4/en not_active Ceased
Patent Citations (7)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
WO2009008372A1 (ja) | 2007-07-06 | 2009-01-15 | Sony Corporation | 入力装置、制御装置、制御システム、制御方法及びハンドヘルド装置 |
WO2009072504A1 (ja) * | 2007-12-07 | 2009-06-11 | Sony Corporation | 制御装置、入力装置、制御システム、制御方法及びハンドヘルド装置 |
JP2012048568A (ja) * | 2010-08-27 | 2012-03-08 | Canon Inc | 情報処理装置及び方法 |
JP2012105034A (ja) * | 2010-11-10 | 2012-05-31 | Panasonic Corp | リモートコントロールシステム |
WO2012145180A1 (en) * | 2011-04-21 | 2012-10-26 | Sony Computer Entertainment Inc. | Gaze-assisted computer interface |
JP2013120447A (ja) * | 2011-12-06 | 2013-06-17 | Nippon Soken Inc | 表示制御システム |
JP2013125985A (ja) * | 2011-12-13 | 2013-06-24 | Sharp Corp | 表示システム |
Non-Patent Citations (1)
Title |
---|
See also references of EP3093738A4 * |
Cited By (6)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
JP2017091327A (ja) * | 2015-11-12 | 2017-05-25 | 富士通株式会社 | ポインティング支援装置、ポインティング支援方法およびポインティング支援プログラム |
JP7455764B2 (ja) | 2018-06-02 | 2024-03-26 | マーシブ テクノロジーズ,インコーポレイティド | モバイルデバイスを使用した共有ディスプレイの注釈システム |
WO2020195201A1 (ja) * | 2019-03-25 | 2020-10-01 | 富士フイルム株式会社 | 投影制御装置、投影装置、投影制御方法、及び投影制御プログラム |
US11587534B2 (en) | 2019-03-25 | 2023-02-21 | Fujifilm Corporation | Projection control device, projection apparatus, projection control method, and projection control program |
US11233941B2 (en) | 2019-10-24 | 2022-01-25 | Canon Kabushiki Kaisha | Electronic device that receives line of sight input, method of controlling electronic device, and non-transitory computer readable medium |
JP7383471B2 (ja) | 2019-12-20 | 2023-11-20 | キヤノン株式会社 | 電子機器およびその制御方法 |
Also Published As
Publication number | Publication date |
---|---|
EP3093738A4 (en) | 2017-06-21 |
US20160291692A1 (en) | 2016-10-06 |
JPWO2015104884A1 (ja) | 2017-03-23 |
CN105874409A (zh) | 2016-08-17 |
EP3093738A1 (en) | 2016-11-16 |
JP6504058B2 (ja) | 2019-04-24 |
Similar Documents
Publication | Publication Date | Title |
---|---|---|
US10761610B2 (en) | Vehicle systems and methods for interaction detection | |
US10318011B2 (en) | Gesture-controlled augmented reality experience using a mobile communications device | |
WO2015104884A1 (ja) | 情報処理システム、情報処理方法およびプログラム | |
US8549418B2 (en) | Projected display to enhance computer device use | |
EP2832107B1 (en) | Information processing apparatus, information processing method, and program | |
US9952667B2 (en) | Apparatus and method for calibration of gaze detection | |
US9690475B2 (en) | Information processing apparatus, information processing method, and program | |
JP4384240B2 (ja) | 画像処理装置、画像処理方法、画像処理プログラム | |
US11572653B2 (en) | Interactive augmented reality | |
JP6252409B2 (ja) | 情報処理装置、情報処理方法およびプログラム | |
US20180307309A1 (en) | Information processing apparatus, information processing method, and computer program | |
JP2016521894A (ja) | 検出されたジェスチャーに基づいてデバイス動作を実行するためのシステムおよび方法 | |
JPWO2012011263A1 (ja) | ジェスチャ入力装置およびジェスチャ入力方法 | |
JP2011081506A (ja) | 映像表示装置、および、その表示制御方法 | |
JP2010237765A (ja) | 情報処理装置、フォーカス移動制御方法及びフォーカス移動制御プログラム | |
JP5976787B2 (ja) | レーザー・ダイオード・モード | |
KR20160132811A (ko) | 주의 기반 렌더링 및 피델리티 | |
CN106201284B (zh) | 用户界面同步系统、方法 | |
JP6575518B2 (ja) | 表示制御装置、表示制御方法およびプログラム | |
JP4945617B2 (ja) | 画像処理装置、画像処理方法、画像処理プログラム | |
JP2016109726A (ja) | 情報処理装置、情報処理方法およびプログラム | |
US9727778B2 (en) | System and method for guided continuous body tracking for complex interaction | |
JP2010237766A (ja) | 情報処理装置、コマンド実行制御方法及びコマンド実行制御プログラム | |
CN108521545B (zh) | 基于增强现实的图像调整方法、装置、存储介质和电子设备 | |
US10855639B2 (en) | Information processing apparatus and information processing method for selection of a target user |
Legal Events
Date | Code | Title | Description |
---|---|---|---|
121 | Ep: the epo has been informed by wipo that ep was designated in this application |
Ref document number: 14878312 Country of ref document: EP Kind code of ref document: A1 |
WWE | Wipo information: entry into national phase |
Ref document number: 15038329 Country of ref document: US |
ENP | Entry into the national phase |
Ref document number: 2015556712 Country of ref document: JP Kind code of ref document: A |
REEP | Request for entry into the european phase |
Ref document number: 2014878312 Country of ref document: EP |
WWE | Wipo information: entry into national phase |
Ref document number: 2014878312 Country of ref document: EP |
NENP | Non-entry into the national phase |
Ref country code: DE |