WO2014155885A1 - Input device - Google Patents
Input device
- Publication number
- WO2014155885A1 (PCT/JP2013/084894; JP2013084894W)
- Authority
- WO
- WIPO (PCT)
- Prior art keywords
- input
- user
- projection
- unit
- image
- Prior art date
Classifications
-
- H—ELECTRICITY
- H04—ELECTRIC COMMUNICATION TECHNIQUE
- H04N—PICTORIAL COMMUNICATION, e.g. TELEVISION
- H04N21/00—Selective content distribution, e.g. interactive television or video on demand [VOD]
- H04N21/40—Client devices specifically adapted for the reception of or interaction with content, e.g. set-top-box [STB]; Operations thereof
- H04N21/41—Structure of client; Structure of client peripherals
- H04N21/422—Input-only peripherals, i.e. input devices connected to specially adapted client devices, e.g. global positioning system [GPS]
- H04N21/42204—User interfaces specially adapted for controlling a client device through a remote control device; Remote control devices therefor
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06F—ELECTRIC DIGITAL DATA PROCESSING
- G06F3/00—Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
- G06F3/01—Input arrangements or combined input and output arrangements for interaction between user and computer
- G06F3/03—Arrangements for converting the position or the displacement of a member into a coded form
- G06F3/041—Digitisers, e.g. for touch screens or touch pads, characterised by the transducing means
- G06F3/042—Digitisers, e.g. for touch screens or touch pads, characterised by the transducing means by opto-electronic means
- G06F3/0425—Digitisers, e.g. for touch screens or touch pads, characterised by the transducing means by opto-electronic means using a single imaging device like a video camera for tracking the absolute position of a single or a plurality of objects with respect to an imaged reference surface, e.g. video camera imaging a display or a projection screen, a table or a wall surface, on which a computer generated image is displayed or projected
- G06F3/0426—Digitisers, e.g. for touch screens or touch pads, characterised by the transducing means by opto-electronic means using a single imaging device like a video camera for tracking the absolute position of a single or a plurality of objects with respect to an imaged reference surface, e.g. video camera imaging a display or a projection screen, a table or a wall surface, on which a computer generated image is displayed or projected tracking fingers with respect to a virtual keyboard projected or printed on the surface
-
- G06F3/048—Interaction techniques based on graphical user interfaces [GUI]
- G06F3/0487—Interaction techniques based on graphical user interfaces [GUI] using specific features provided by the input device, e.g. functions controlled by the rotation of a mouse with dual sensing arrangements, or of the nature of the input device, e.g. tap gestures based on pressure sensed by a digitiser
- G06F3/0488—Interaction techniques based on graphical user interfaces [GUI] using specific features provided by the input device, e.g. functions controlled by the rotation of a mouse with dual sensing arrangements, or of the nature of the input device, e.g. tap gestures based on pressure sensed by a digitiser using a touch-screen or digitiser, e.g. input of commands through traced gestures
-
- H04N21/4104—Peripherals receiving signals from specially adapted client devices
- H04N21/4122—Peripherals receiving signals from specially adapted client devices additional display device, e.g. video projector
- H04N21/42201—Input-only peripherals, i.e. input devices connected to specially adapted client devices, e.g. global positioning system [GPS] biosensors, e.g. heat sensor for presence detection, EEG sensors or any limb activity sensors worn by the user
- H04N21/4223—Cameras
- H04N21/43—Processing of content or additional data, e.g. demultiplexing additional data from a digital video stream; Elementary client operations, e.g. monitoring of home network or synchronising decoder's clock; Client middleware
- H04N21/442—Monitoring of processes or resources, e.g. detecting the failure of a recording device, monitoring the downstream bandwidth, the number of times a movie has been viewed, the storage space available from the internal hard disk
- H04N21/44213—Monitoring of end-user related data
- H04N21/44218—Detecting physical presence or behaviour of the user, e.g. using sensors to detect if the user is leaving the room or changes his face expression during a TV program
-
- H04N5/00—Details of television systems
- H04N5/74—Projection arrangements for image reproduction, e.g. using eidophor
-
- H04N9/00—Details of colour television systems
- H04N9/12—Picture reproducers
- H04N9/31—Projection devices for colour picture display, e.g. using electronic spatial light modulators [ESLM]
- H04N9/3179—Video signal processing therefor
- H04N9/3185—Geometric adjustment, e.g. keystone or convergence
Definitions
- the present invention relates to an input device that receives an input to a device that is an operation target from a user.
- Patent Document 1 discloses a technique for detecting the position and moving direction of an operation object, such as a user's hand, in a projected image, and displaying a user interface image (input image) according to the detection result.
- Japanese Patent Application Laid-Open No. 2009-64109 (published March 26, 2009)
- Patent Document 1 describes determining the display direction of the input image at a predetermined position, but does not describe a technique for changing the position at which the input image is projected.
- the present invention has been made in view of the above problems, and an object thereof is to provide an input device capable of projecting the input image to a position desired by a user.
- an input device according to one aspect of the present invention is an input device that receives an input to a target device from a user, and includes: projection position determining means for determining, based on the user's action indicating a position or a physical change caused by the action, which position on the projection plane of a projection target object an input image for the user to perform an input operation is to be projected to; and pointing position specifying means for specifying the position indicated by the user with respect to the input image projected on the projection plane.
- the input image can be projected at a position desired by the user.
- a television (television receiver, display device) 1 which is an aspect of the input device of the present invention will be described in detail with reference to FIGS.
- the television 1 of one embodiment of the present invention is described as a television capable of Internet connection.
- the television according to the present invention is not limited to a television that can be connected to the Internet, and may be any television that can receive broadcast waves and output video and audio.
- the present invention can be applied to any device that functions in accordance with a user input operation, such as an air conditioner and a lighting device, in addition to the television.
- the dimensional relationships (length, size, width) and the shapes in the drawings are changed as appropriate for clarity and simplification of the drawings, and do not represent actual dimensions or shapes.
- FIG. 2 is a schematic diagram illustrating a configuration of the control system 100 of the television 1 according to the present embodiment.
- the television 1 of one embodiment of the present invention determines which position on the projection plane 30 of the projection target object the input image 40 for the user A to perform an input operation is to be projected to, based on a physical change (vibration) generated by the action of the user A indicating the projection position.
- a plurality of vibration sensors 10a and 10b detect vibration generated by an operation performed by the user A on the projection surface 30 (such as hitting the projection surface 30), and transmit a detection signal indicating that the vibration was detected to the television 1.
- the television 1 determines the position where the user A performed the operation on the projection plane 30 by analyzing the detection signals from the vibration sensors 10a and 10b. Thereafter, the television 1 projects the input image 40 at the position.
- the input image 40 is, for example, an image imitating a remote controller or a keyboard.
- the projection target object is a table such as a low table or a dining table, and the top plate of the table serves as the projection surface 30.
- the plurality of vibration sensors 10a and 10b are arranged at different predetermined positions on the projection surface 30.
- the vibration sensor 10a is disposed at the left end of the projection surface 30, and the vibration sensor 10b is disposed at the right end of the projection surface 30.
- when vibration is detected, the vibration sensors 10a and 10b transmit a detection signal indicating that fact to the television 1.
- the projection position specifying unit 151 (see FIG. 1) of the television 1 determines the position hit by the user on the projection plane 30, based on the time difference between the timing at which the detection signal transmitted from the vibration sensor 10a is received and the timing at which the detection signal transmitted from the vibration sensor 10b is received, and on the order in which the detection signals are received.
- the area of the projection surface 30, in other words, the area in which the input image can be projected, is sufficiently larger than the area of the input image 40. That is, by detecting a user operation of hitting the projection surface 30 performed at an arbitrary position on a projection surface 30 that is sufficiently larger than the projected input image, such as the top plate of a low table or dining table, the television 1 can set the position where the operation was performed as the position where the input image 40 is projected.
- the process determining unit 155 (see FIG. 1) of the television 1 specifies the position designated by the user A with respect to the projected input image 40. Specifically, the process determination unit 155 captures an image of the operation of the user A on the input image 40 (for example, the operation of touching the input image 40 with a finger) and analyzes the captured image to specify the position designated by the user A. The television 1 then executes the process corresponding to the specified position.
- the user A can project the input image 40 at an arbitrary position on the projection plane 30 that is sufficiently wider than the projected input image. Further, the user A can cause the television 1 to execute processing corresponding to the designated position by performing an operation of designating the position on the projected input image 40. Therefore, the user A can cause the television 1 to execute processing using the input image 40 at a desired position, as in the case of using a movable input device such as a remote controller.
- the user A can cause the television 1 to execute processing by touching the input image 40 in the same manner as pressing a key or a button of an ordinary input device such as a keyboard or a remote control.
- when a user B at a position different from that of the user A performs an operation such as hitting the projection surface 30, the television 1 similarly determines the position on the projection surface 30 where the user B performed the operation, and projects the input image 40 onto that position.
- each user can thus have the input image 40 projected at a desired position without moving from his or her current position, and can perform an input operation on the projected input image 40.
- FIG. 1 is a block diagram illustrating an example of a main configuration of the television 1 according to the present embodiment.
- the television 1 includes at least a position information receiving unit 11, an image projection unit 12, an imaging unit 13, a storage unit 14, an input control unit 15, a television control unit 16, and a display unit 17.
- the position information receiving unit 11 is a communication device that performs wired or wireless communication with the plurality of vibration sensors 10a and 10b provided externally and receives signals from them. As described above, the vibration sensors 10a and 10b are arranged on the projection surface 30, detect vibration accompanying an operation performed by the user on the projection surface 30, and transmit detection signals indicating that the vibration was detected to the position information receiving unit 11. When the position information receiving unit 11 receives the detection signals from the vibration sensors 10a and 10b, it supplies the detection signals to the projection position specifying unit 151 described later.
- the sensor that transmits a signal to the position information receiving unit 11 is not limited to the vibration sensor 10.
- for example, an acceleration sensor may be used, or sound may be detected instead of vibration.
- Examples of the sensor that detects sound include a microphone.
- when a microphone is used as the sensor, malfunction due to the sound of the television broadcast may occur. For this reason, a sensor that detects vibration is more preferable as the sensor that transmits a signal to the position information receiving unit 11, in terms of increasing the reliability of input to the television 1. Furthermore, with a sensor that detects vibration, the user can display the input image 40 with a minimal operation such as hitting the projection surface 30.
- the input control unit 15 is independent of a television control unit 16 described later.
- an input device that can operate even when the television 1 is in a standby state can be realized in the same manner as a conventional input device such as a remote controller.
- the user can start up the television 1 from the standby state from his or her viewing position, and can also put the started television 1 into the standby state.
- the input control unit 15 includes a projection position specifying unit 151, a projection control unit 152, an imaging control unit 153, an image analysis unit 154, and a process determination unit 155.
- the projection position specifying unit (projection position determining means) 151 is a block that determines which position on the projection plane 30 of the projection target object the input image 40 for the user to perform an input operation is to be projected to, based on a physical change that occurs with the user's action indicating the position.
- specifically, the projection position specifying unit 151 determines the position on the projection plane 30 hit by the user as the projection position of the input image 40, based on the time difference between the timing of receiving the detection signal (first detection signal) transmitted from the vibration sensor 10a and the timing of receiving the detection signal (second detection signal) transmitted from the vibration sensor 10b, and on the order in which the detection signals are received.
- Formulas for determining the projection position are stored in the storage unit 14 in advance.
- the projection position specifying unit 151 calculates the projection position by substituting into the mathematical expression (i) the time difference and (ii) information indicating whether the first detection signal or the second detection signal was received first.
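As an illustration, such a formula can be sketched as a one-dimensional time-difference-of-arrival estimate along the line joining the two sensors. The sensor spacing, propagation speed, and function name below are assumptions made for the sketch, not values disclosed in this document.

```python
def locate_impact(t_a, t_b, sensor_distance=1.0, wave_speed=300.0):
    """Estimate where the user hit the projection surface 30, measured
    from the vibration sensor 10a toward the vibration sensor 10b.

    t_a, t_b: arrival times (s) of the first and second detection signals.
    sensor_distance: assumed distance (m) between the two sensors; the
        document keeps such constants in the storage unit 14.
    wave_speed: assumed propagation speed (m/s) of vibration in the table.
    """
    # The magnitude of (t_b - t_a) encodes (i) the time difference, and
    # its sign encodes (ii) which detection signal arrived first.
    x = (sensor_distance - wave_speed * (t_b - t_a)) / 2.0
    # Clamp to the physical span of the projection surface.
    return min(max(x, 0.0), sensor_distance)
```

An impact equidistant from both sensors (`t_a == t_b`) resolves to the midpoint, and a larger lead for sensor 10a moves the estimate toward 10a's end of the table.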
- the projection position specifying unit 151 supplies projection position information indicating the specified projection position to the projection control unit 152 and the imaging control unit 153.
- the projection control unit 152 controls the image projection unit 12 to project the input image 40 at the position indicated by the projection position information supplied from the projection position specifying unit 151. Specifically, the projection control unit 152 reads the input image 40 from the projection image storage unit 141, and causes the image projection unit 12 to project the input image 40 onto the projection position.
- an image imitating a keyboard is projected as shown in FIG.
- in the input image 40, a button imitating an activation button for switching between the operation state and the standby state of the television 1 is drawn in addition to the keys of a normal keyboard.
- the operation state is a state in which images and sound are being output, and the standby state is a state in which power is supplied but output of images and sound is stopped. That is, when the television 1 is in the standby state, the user touches the activation button to bring the television 1 into the operation state.
- by touching the input image 40 imitating a keyboard while looking at the display screen of the television 1, the user can also perform input operations more complicated than conventional remote control operations.
- like a normal keyboard, the input image 40 may support an input operation corresponding to pressing a plurality of keys simultaneously (for example, pressing the Enter key while holding down the Ctrl key).
- as the input image 40, an image selected in advance by the user may be projected, or the image to be projected may be determined by the projection control unit 152 according to the usage state of the television 1. For example, the projection control unit 152 may project an input image 40 imitating a remote control when the user is watching a television broadcast, and an input image 40 imitating a keyboard when the user is using an Internet browser.
- the input image may be an input image 50 imitating the display screen of a so-called smartphone, as shown in FIG.
- in the input image 50, icons 51 (51a to 51d) indicating functions of the television 1 are displayed. When the user touches an icon 51, the television 1 may execute processing corresponding to the touched icon 51.
- the projection control unit 152 may be configured to display an arrow on the display unit 17 of the television 1 and have the user move the arrow with a fingertip instead of moving it with a mouse.
- in this case, the input image 40 has a predetermined area simulating a touch pad, and the arrow may be moved according to the movement of the fingertip within this area.
- the input image 40 may also have an area in which the user can perform, with a fingertip, pinch-in and pinch-out operations like those performed on the display surface of a smartphone, so that a photograph displayed on the display unit 17 of the television 1 can be enlarged or reduced. In this case, the television control unit 16, which will be described later, may change the size of a specific image displayed on the display unit 17 based on the operation.
- the projection control unit 152 may display the input image 50 or the area imitating the touch pad simultaneously with the input image 40 imitating the keyboard described above.
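The touch-pad-like area described above amounts to relative pointer motion: the on-screen arrow moves by the fingertip's displacement within the area. The function name and the gain value below are illustrative assumptions, not part of the disclosure.

```python
def move_arrow(arrow, prev_tip, cur_tip, gain=2.0):
    """Move the arrow displayed on the display unit 17 by the fingertip
    displacement inside the touch-pad area of the input image 40,
    scaled by an assumed gain factor."""
    dx = (cur_tip[0] - prev_tip[0]) * gain
    dy = (cur_tip[1] - prev_tip[1]) * gain
    return (arrow[0] + dx, arrow[1] + dy)
```

Successive fingertip positions from the image analysis would be fed in frame by frame, each call returning the arrow's new screen position.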
- the imaging control unit 153 controls the imaging direction (and the imaging range) of the imaging unit 13 so that the user's operation on the input image 40 projected at the position indicated by the projection position information supplied from the projection position specifying unit 151 can be captured.
- the imaging control unit 153 supplies the image analysis unit 154 with image data (captured image data) obtained by imaging the region including the input image 40 captured by the imaging unit 13.
- the region including the input image 40 refers to a region in which the position designated by the user with respect to the input image 40 can be specified.
- the image analysis unit (pointing position specifying means) 154 is a block that specifies the position designated by the user with respect to the input image 40 projected on the projection plane 30. Specifically, the image analysis unit 154 analyzes the captured image data supplied from the imaging control unit 153 and determines whether an operation (such as touching with a finger) has been performed on the input image 40 by the user.
- the image analysis unit 154 specifies where the user touched the input image 40 and supplies touch position information indicating the specified position to the processing determination unit 155.
- the touch position may be specified using a coordinate system set for the image of the input image 40 included in the captured image data.
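One simple way to realize such a coordinate system, assuming the camera sees the input image 40 as an axis-aligned rectangle (a real implementation would also need perspective correction), is to normalize the touched pixel into the input image's own coordinates. The function name, bounding box, and image size below are illustrative assumptions.

```python
def to_input_image_coords(pixel, top_left, bottom_right, image_size=(800, 300)):
    """Map a pixel in the captured image data to coordinates in the
    coordinate system set for the input image 40.

    top_left, bottom_right: bounding box of the projected input image 40
        within the captured frame (assumed axis-aligned).
    image_size: width and height of the input image's coordinate system.
    """
    (px, py), (x0, y0), (x1, y1) = pixel, top_left, bottom_right
    w, h = image_size
    # Normalize into [0, 1] within the bounding box, then scale.
    u = (px - x0) / (x1 - x0) * w
    v = (py - y0) / (y1 - y0) * h
    return (u, v)
```

The resulting (u, v) pair is what the touch position information supplied to the process determination unit 155 would carry.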
- the process determination unit 155 is a block that determines a process to be executed by the television 1 in accordance with the position (touch position) in the input image 40 instructed by the user.
- the storage unit 14 stores correspondence information indicating a correspondence relationship between the touch position in the projected input image 40 and the type of control signal transmitted to the television control unit 16.
- the process determining unit 155 refers to the correspondence information and identifies a control signal corresponding to the touch position indicated by the touch position information supplied from the image analysis unit 154.
- the process determination unit 155 supplies the specified control signal to the television control unit 16 described later.
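The correspondence information can be pictured as a table mapping rectangular regions of the input image 40 to control signals. The regions and signal names below are invented for illustration; the actual correspondence is whatever the storage unit 14 holds.

```python
# Hypothetical correspondence information: (x0, y0, x1, y1) region of the
# input image 40 -> control signal supplied to the television control unit 16.
CORRESPONDENCE = [
    ((0, 0, 100, 100), "POWER_TOGGLE"),
    ((100, 0, 200, 100), "CHANNEL_UP"),
    ((200, 0, 300, 100), "VOLUME_UP"),
]

def control_signal_for(touch_position):
    """Return the control signal whose region contains the touch position,
    or None when the touch falls outside every registered region."""
    x, y = touch_position
    for (x0, y0, x1, y1), signal in CORRESPONDENCE:
        if x0 <= x < x1 and y0 <= y < y1:
            return signal
    return None
```

A touch outside every region returning `None` corresponds to the case where no process is determined and the unit simply keeps waiting for the next operation.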
- the processing executed by the television 1 in response to the control signal may be determined on the television 1 side instead of by the process determination unit 155.
- the image projecting unit 12 is a projector that projects the input image 40 onto the projection position identified by the projection position identifying unit 151.
- the image projection unit 12 can change the projection direction according to the projection position under the control of the projection control unit 152. Thereby, the image projection unit 12 can project the input image 40 at the projection position.
- the imaging unit 13 is a camera for imaging a user's operation. Specifically, the imaging unit 13 images a region including the projected input image 40 and supplies the captured image data to the imaging control unit 153.
- the storage unit 14 is a storage area for storing a control program executed by the input control unit 15 and various data (setting values, tables, etc.) read when the control program is executed.
- as the storage unit 14, various conventionally known storage means can be used, for example, ROM (Read Only Memory), RAM (Random Access Memory), flash memory, EPROM (Erasable Programmable ROM), EEPROM (registered trademark) (Electrically EPROM), or HDD (Hard Disk Drive).
- the storage unit 14 of the present embodiment includes a projection image storage unit 141.
- the projection image storage unit 141 is a storage area for storing data of various input images 40. Further, as described above, the storage unit 14 stores information (not illustrated) indicating the correspondence relationship between the designated position with respect to the projected input image 40 and the process executed on the television 1.
- the television control unit 16 is a control device that controls various functions of the television 1.
- the television control unit 16 performs the process indicated by the control signal supplied from the process determining unit 155.
- for example, when the control signal is information indicating a channel change, the television control unit 16 receives the broadcast wave corresponding to the changed channel and displays the image on the display unit 17 described later.
- when the control signal is information indicating acquisition of content through the Internet connection, the television control unit 16 acquires the content from an external server (not shown) and causes the display unit 17 to display the content image.
- when the control signal is information indicating activation of the television 1 from the standby state or transition to the standby state, the television control unit 16 starts or stops the output of images and sound.
- the processing executed by the television control unit 16 is not limited to the above. That is, the television control unit 16 executes processing for realizing functions preset in the television 1. For example, changing the volume, displaying a program guide, starting an Internet browser, and the like are examples of processing.
- the display unit 17 is a display device that displays information processed by the television 1 as an image. Information processed by the television control unit 16 is displayed on the display unit 17.
- the display unit 17 is configured by a display device such as an LCD (liquid crystal display).
- FIG. 4 is a flowchart illustrating an example of the flow of input processing in the television 1.
- when the position information receiving unit 11 receives, from the plurality of vibration sensors 10a and 10b, the first and second detection signals indicating that the vibration accompanying the user's operation was detected (YES in S1), it supplies the received first and second detection signals to the projection position specifying unit 151.
- the projection position specifying unit 151 calculates the time difference between the timing at which the first detection signal is received and the timing at which the second detection signal is received (S2). Furthermore, based on the calculated time difference and the order in which the detection signals are received, the projection position specifying unit 151 specifies the position on the projection plane 30 where the vibration was generated, that is, the position where the user performed the operation (S3: projection position determination step).
- the projection position specifying unit 151 supplies projection position information indicating the specified position to the projection control unit 152 and the imaging control unit 153.
- the projection control unit 152 changes the projection direction of the image projection unit 12 according to the projection position information supplied from the projection position specifying unit 151 (S4), reads the input image 40 from the projection image storage unit 141, and causes the image projection unit 12 to project the input image 40 onto the projection position.
- the imaging control unit 153 adjusts the imaging direction of the imaging unit 13 so that the user's operation on the input image 40 displayed at the projection position indicated by the projection position information supplied from the projection position specifying unit 151 can be captured, and causes the imaging unit 13 to execute imaging (S5). Further, the imaging control unit 153 supplies captured image data indicating the image captured by the imaging unit 13 to the image analysis unit 154. Imaging by the imaging unit 13 may be performed at predetermined time intervals after the input image 40 is projected.
- when a user action indicating a position in the input image 40 is detected (YES in S6), the image analysis unit 154 further analyzes the captured image data and detects the coordinates of the position designated by the user on the input image 40 (S7: designated position specifying step).
- the image analysis unit 154 supplies touch position information indicating the coordinates to the process determination unit 155.
- the process determining unit 155 determines the processing in the television 1 by referring to the correspondence information stored in the storage unit 14 and reading the information on the process to be executed on the television 1 that is associated with the coordinates indicated by the supplied touch position information (S8). The input operation determination process thus ends.
- the process determining unit 155 supplies a control signal corresponding to the determined process to the television control unit 16, and the television control unit 16 executes a process corresponding to the supplied control signal.
- for example, when the control signal indicates a transition to the standby state, the television control unit 16 stops outputting video and audio, and as a result the television 1 transitions to the standby state.
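Putting S1 through S8 together, the flow of FIG. 4 can be sketched as a single cycle in which each step is injected as a callable. Everything below (the function and parameter names, and any stub behaviour used to exercise it) is an illustrative assumption, not the disclosed implementation.

```python
def run_input_cycle(receive_signals, locate, project, capture,
                    analyze, decide, execute):
    """One pass through the input processing of FIG. 4 (S1-S8)."""
    t_a, t_b = receive_signals()    # S1: first/second detection signals
    position = locate(t_a, t_b)     # S2-S3: time difference -> projection position
    project(position)               # S4: project the input image 40
    frame = capture(position)       # S5: image the region including the input image
    touch = analyze(frame)          # S6-S7: designated position, if any
    if touch is None:
        return None                 # NO in S6: no user action detected yet
    process = decide(touch)         # S8: consult the correspondence information
    execute(process)                # the television control unit 16 acts on it
    return process
```

Wiring the steps with trivial stubs (for instance, an `analyze` that returns a fixed touch position) exercises the whole cycle without any hardware.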
- FIG. 6 is a schematic diagram illustrating a configuration of the control system 200 of the television 110 according to the present embodiment.
- The television 110 according to the present embodiment does not require the external vibration sensor 10. Instead, the television 110 includes a human sensor 21 (see FIG. 5) that detects the position of the user A and a second imaging unit 22 (see FIG. 5) that images the operation of the user A.
- The position on the projection plane 30 at which the input image 40 for the user A to perform an input operation is to be projected is then determined.
- FIG. 5 is a block diagram illustrating an example of a main configuration of the television 110 according to the present embodiment.
- the television 110 according to the present embodiment includes a human sensor 21 and a second imaging unit 22 instead of the position information receiving unit 11 included in the television 1 of the first embodiment.
- the television 110 includes a projection position specifying unit 156 instead of the projection position specifying unit 151.
- the human sensor 21 (user position detection means) is a sensor that detects the position of the user within the detection range.
- the detection range of the human sensor 21 may be limited to the projection plane 30 and a space area in the vicinity thereof. In this case, the human sensor 21 detects the position of the user when the user exists in the vicinity of the projection plane 30.
- The human sensor 21 is not limited to an infrared sensor; it may be, for example, a temperature sensor, as long as it can detect the position of the user and can be provided on the television 1.
- the human sensor 21 is a passive sensor that receives infrared rays even when the television 1 is in a standby state, and receives infrared rays emitted from the user when the user enters the detection range. Further, when the human sensor 21 detects that the user is within the detection range by receiving infrared rays emitted from the user, the human sensor 21 supplies user position information indicating the position of the user to the projection position specifying unit 156.
- the human sensor 21 may be an infrared active sensor.
- the second imaging unit 22 is a camera for imaging a user's operation that indicates a position where the input image 40 is projected. Specifically, the second imaging unit 22 captures an area including the position of the user detected by the human sensor 21 and supplies captured image data indicating the captured image to the projection position specifying unit 156.
- the “region including the user's position” is a region in a predetermined range centered on the position indicated by the user position information.
- the second imaging unit 22 performs the above imaging at predetermined time intervals after the human sensor 21 detects the position of the user, and supplies each captured image data to the projection position specifying unit 156.
- The projection position specifying unit (projection position determining means) 156 first causes the second imaging unit 22 to perform imaging when the human sensor 21 detects the presence of the user. When the detected user position is outside the imaging range of the second imaging unit 22 at that time, the projection position specifying unit 156 controls the imaging direction of the second imaging unit 22 according to the user position information and then causes imaging to be executed.
- The projection position specifying unit 156 determines at which position on the projection plane 30 of the projection target object the input image 40 for the user to perform an input operation is projected, based on the user's operation indicating the projection position. In other words, the projection position specifying unit 156 determines which position on the projection plane 30 the user has designated as the projection position by analyzing the captured image data acquired by the second imaging unit 22. The projection position specifying unit 156 supplies projection position information indicating the specified position to the projection control unit 152 and the imaging control unit 153.
- the operation for designating the projection position is, for example, an operation of touching the surface of the projection surface 30 with the index finger.
- the projection position specifying unit 156 may specify the position touched by the index finger of the user as the projection position of the input image 40.
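The patent does not specify the image-analysis algorithm, but one minimal way to locate such a pointing action in captured image data is frame differencing against a reference frame. The sketch below (pure Python, with a hypothetical threshold) returns the centroid of changed pixels as the candidate designated position:

```python
def locate_pointing_position(background, frame, threshold=30):
    """Return the (row, col) centroid of pixels that differ from the
    reference frame by more than `threshold`, or None when nothing
    changed. Frames are 2-D lists of grayscale values."""
    changed = [(r, c)
               for r, row in enumerate(frame)
               for c, value in enumerate(row)
               if abs(value - background[r][c]) > threshold]
    if not changed:
        return None
    n = len(changed)
    return (sum(r for r, _ in changed) // n,
            sum(c for _, c in changed) // n)
```

A real implementation would also have to distinguish a touch from mere hand motion, e.g. by requiring the changed region to stay still for several frames.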
- FIG. 7 is a flowchart illustrating an example of the flow of input operation determination processing in the television 110.
- When the human sensor 21 detects the presence of a user (YES in S21), user position information is transmitted to the projection position specifying unit 156.
- the projection position specifying unit 156 controls the imaging direction of the second imaging unit 22 based on the received user position information, and then causes the second imaging unit 22 to perform imaging (S22).
- The projection position specifying unit 156 analyzes the captured image data acquired by the second imaging unit 22. When, as a result of the analysis, an operation for designating the projection position of the input image 40 is detected (YES in S23), the projection position specifying unit 156 specifies the position that the user has designated for projection of the input image 40 in the captured image (S24: projection position determination step). The projection position specifying unit 156 supplies projection position information indicating the specified position to the projection control unit 152 and the imaging control unit 153.
- The processing from step S25 to step S29 is the same as in the first embodiment; that is, it is the same as the processing from step S4 to step S8 described above.
- As described above, the television 110 uses the human sensor 21 and the second imaging unit 22 to specify the position where the input image 40 is projected. This eliminates the need to provide the vibration sensor 10 on the projection surface, so the freedom in choosing the position where the input image 40 is projected is further expanded. For example, the input image 40 can be projected using, as the projection surface 30, the floor of a living room where many unrelated vibrations may occur.
- FIG. 8 is a block diagram illustrating an example of a main configuration of the television 120 according to the present embodiment. As illustrated in FIG. 8, the television 120 according to the present embodiment does not include the second imaging unit 22, unlike the television 110 according to the second embodiment.
- the projection position specifying unit 156 causes the imaging unit 13 to execute imaging for specifying the projection position.
- Imaging is executed after the imaging direction of the imaging unit 13 is controlled according to the information on the position of the user detected by the human sensor 21.
- the captured image data acquired by the imaging unit 13 is supplied to the imaging control unit 153, and the projection position of the input image 40 is specified. Since the subsequent processing is the same as that of the second embodiment, detailed description thereof is omitted.
- FIG. 9 is a block diagram showing an example of a main configuration of the television 130 of the present embodiment.
- the television 130 according to the present embodiment does not include the imaging unit 13 therein, and controls the imaging device 20 provided outside the television 130 to perform imaging of user operations. Therefore, the imaging control unit 153 performs wired communication or wireless communication with the imaging device 20 via a communication unit (not shown).
- the imaging device 20 is a device including a camera for imaging a user's operation. Note that the number of cameras provided in the imaging device 20 is not particularly limited, and a plurality of cameras may be provided.
- The imaging control unit 153 transmits to the imaging device 20 a control signal for setting the imaging direction of the camera included in the imaging device 20 so that the user's action on the input image 40 displayed at the position indicated by the projection position information supplied from the projection position specifying unit 151 can be captured.
- the imaging control unit 153 transmits an imaging execution signal for causing the imaging device 20 to execute imaging of an area including the input image 40 and receives captured image data indicating the captured image from the imaging device 20.
- the imaging device 20 changes the imaging direction of the camera according to the received control signal. Thereafter, when receiving the imaging execution signal, the imaging device 20 executes imaging and transmits the captured image data to the television 120 (imaging control unit 153).
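The exchange above can be sketched as a two-message protocol: a control signal that sets the camera direction, followed by an imaging execution signal that triggers capture. The class and field names below are assumptions for illustration, not from the patent:

```python
from dataclasses import dataclass

@dataclass
class ControlSignal:
    """Direction command for the camera of the imaging device 20."""
    pan_deg: float
    tilt_deg: float

class ImagingDevice:
    """Toy model of the external imaging device 20: it stores the last
    direction command and captures only when an imaging execution
    signal arrives, as in the two-step exchange described above."""

    def __init__(self):
        self.direction = ControlSignal(0.0, 0.0)
        self.captured = []

    def receive_control_signal(self, signal):
        # change the imaging direction of the camera
        self.direction = signal

    def receive_imaging_execution_signal(self):
        # a real device would return captured image data; a string
        # stands in for the frame here
        frame = f"frame@pan={self.direction.pan_deg}"
        self.captured.append(frame)
        return frame
```

Separating the direction command from the capture trigger mirrors the patent's ordering: the direction is fixed first, then imaging is executed.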
- the television 120 specifies the projection position of the input image 40 by receiving a signal from the vibration sensor 10 as in the first embodiment, but the present invention is not limited to this.
- the projection position of the input image 40 may be specified using the human sensor and the imaging unit (or the second imaging unit).
- FIG. 10 is a schematic diagram showing the configuration of the television control system 300 according to the present embodiment.
- the imaging device 20 performs wired communication or wireless communication with the television 120 to acquire captured image data. Therefore, the user can freely change the installation location of the imaging device 20.
- Compared with a camera placed at position a, a camera placed at position b reduces the blind spot caused by the back of the user's hand and arm, so the user's action can be imaged more accurately.
- Since the imaging device 20 can be freely installed at a position where such blind spots are reduced, an input device that detects the user's operation with high reliability can be realized.
- the type of the input image 40 to be projected may be changed in accordance with an instruction operation performed by the user on the projected input image 40.
- In this case, the process determining unit 155 supplies information specifying the changed input image 40 and a change instruction to the projection control unit 152.
- the projection control unit 152 reads the input image 40 from the projection image storage unit 141 according to the supplied instruction and information, and causes the image projection unit 12 to project the input image 40.
- For example, each input image 40 may have an area imitating a button for changing the input image, and when that area is touched by the user, the projection control unit 152 may project an image selection image for changing the input image 40.
- Thus, without moving from the current position, the user of the television 1/110/120 can switch, for example, between an input image simulating a remote control and an input image simulating a keyboard, according to the intended use of the television 1/110/120.
- FIG. 11 is a block diagram illustrating an example of a main configuration of the input control device 2 according to the present embodiment.
- the input control device 2 is a device that receives a user input for causing a plurality of devices (target devices) such as the television 3, the air conditioner 4, and the lighting device 5 to execute processing.
- the target device is not limited to the above-described device, and any device that can receive a signal from the outside and execute processing can be used.
- In the input control device 2, any of the plurality of target devices can be selected by a user input operation on the input image 40, and a process in the selected target device can also be selected.
- The input control device 2 includes an input information determination unit 158, a transmission control unit 159, a projection control unit 160, and a transmission unit 23, which are components that the above-described televisions 1, 110, and 120 do not have.
- When the projection position information indicating the projection position of the input image 40 is supplied from the projection position specifying unit 151, the projection control unit 160 first reads the device selection image 41 shown in FIG. 12A from the projection image storage unit 141 and causes the image projection unit 12 to project the device selection image 41 at the projection position.
- the projection control unit 160 reads the input image 40 from the projection image storage unit 141 according to the information on the target device supplied from the device selection unit 157, and causes the image projection unit 12 to perform projection. For example, a TV remote control image 42 shown in FIG. 12B is projected.
- the input information determination unit 158 is a block that determines a selected device and a process to be executed by the device in accordance with an input to the input image 40 by the user.
- the input information determination unit 158 includes a process determination unit 155 and a device selection unit 157.
- Since the process determination unit 155 is the same as the process determination unit 155 of each embodiment described above, its description is omitted.
- The device selection unit 157 is a block that determines a selected device in accordance with an input to the input image 40 by the user. Specifically, the device selection unit 157 reads from the storage unit 14 the information on the target device associated with the position indicated by the touch position information supplied from the image analysis unit 154 (for example, coordinates on the device selection image 41), and determines that device as the selected target device (referred to as the specific device). In addition, the device selection unit 157 supplies information on the specific device to the projection control unit 160.
- the input information determining unit 158 supplies the transmission control unit 159 with information on the specific device and processing executed by the specific device.
- the transmission control unit 159 is a block that controls the transmission unit 23. Specifically, the transmission control unit 159 transmits a control signal corresponding to the process determined by the process determination unit 155 to the specific device determined by the device selection unit 157 by controlling the transmission unit 23.
- the transmission unit 23 (transmission means) is a communication device that transmits a control signal corresponding to a process to be executed by each target device.
- the transmission of the control signal from the transmission unit 23 to each target device is preferably wireless transmission, but may be wired transmission.
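A minimal sketch of how the transmission control unit 159 and the transmission unit 23 might cooperate, with device names and process identifiers invented for illustration:

```python
class TransmissionUnit:
    """Toy transmission unit 23: records the control signal sent to
    each target device instead of actually transmitting it."""

    def __init__(self):
        self.sent = []

    def send(self, device, signal):
        self.sent.append((device, signal))

def dispatch(transmission_unit, specific_device, process):
    """Transmission control unit 159: transmit the control signal for
    the determined process to the specific device selected by the
    device selection unit 157."""
    transmission_unit.send(specific_device, process)
```

For example, `dispatch(tx, "air_conditioner_4", "POWER_ON")` would record a single transmission addressed to the air conditioner; a real transmission unit would emit the signal wirelessly (or over a wire) instead of recording it.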
- the projection position of the input image 40 is specified by using the human sensor 21 and the second imaging unit 22 as in the second embodiment.
- The imaging unit 13 may be used instead of the second imaging unit 22 as in the third embodiment, or the projection position of the input image 40 may be specified by receiving a signal from the vibration sensor 10 as in the first embodiment.
- the user can project the input image 40 that can operate a plurality of devices at a desired position on the projection surface 30, and can operate the plurality of devices using only the input image 40. Therefore, the user does not need to install an input device such as a remote controller for operating each device. As a result, there is no possibility that the user loses the input device.
- FIG. 12 is a schematic diagram illustrating an example of an input image projected by the input control device 2 of the present embodiment.
- FIG. 12A is a schematic diagram illustrating an example of the device selection image 41 described above.
- the device selection image 41 is an image for selecting a target device as an input target.
- In FIG. 12A, areas imitating buttons for selecting the television 3, the air conditioner 4, and the lighting device 5 are respectively drawn. The drawn areas are not limited to this example and change according to the types of the target devices.
- an input image for performing input to the selected target device is projected to the specified position instead of the device selection image 41.
- a television remote control image 42 simulating a television remote control shown in FIG. 12B is projected.
- Like a normal TV remote control, the TV remote control image 42 has drawn areas simulating a power button for switching the television between the operating state and the standby state, channel buttons for switching channels, volume buttons for changing the volume, and a program guide button for displaying the program guide.
- The TV remote control image 42 is merely an example; in addition to the buttons described above, areas simulating other buttons of a TV remote control may be drawn.
- the user can display the input image 40 for executing the input to the selected target device only by selecting the target device in the device selection image 41.
- The user can view the broadcast displayed on the television 3 by operating the TV remote control image 42 in the same way as a normal TV remote control.
- The control blocks of the television 1 and the input control device 2 may be realized by a logic circuit (hardware) formed in an integrated circuit (IC chip) or the like, or may be realized by software using a CPU (Central Processing Unit).
- In the latter case, the television 1 and the input control device 2 include a CPU that executes the instructions of a program, which is software realizing each function, a ROM (Read Only Memory) or storage device (referred to as a "recording medium") on which the program and various data are recorded so as to be readable by a computer (or CPU), a RAM (Random Access Memory) into which the program is loaded, and the like. The object of the present invention is achieved when the computer (or CPU) reads the program from the recording medium and executes it.
- As the recording medium, a "non-transitory tangible medium" such as a tape, a disk, a card, a semiconductor memory, or a programmable logic circuit can be used.
- the program may be supplied to the computer via an arbitrary transmission medium (such as a communication network or a broadcast wave) that can transmit the program.
- the present invention can also be realized in the form of a data signal embedded in a carrier wave in which the program is embodied by electronic transmission.
- The input device (the television 1 and the input control device 2) according to aspect 1 of the present invention is an input device that accepts an input to a target device from a user, and includes: projection position determining means (projection position specifying units 151, 156) that determines at which position on the projection plane 30 of a projection target object the input image 40 for the user to perform an input operation is projected, based on the user's action indicating the position or a physical change that occurs with the action; and designated position specifying means (image analysis unit 154) that specifies the position designated by the user with respect to the input image projected onto the projection plane.
- An input device control method according to an aspect of the present invention is a control method of an input device that accepts an input to a target device from a user, and includes: a projection position determining step (S3, S24) of determining at which position on the projection plane of a projection target object the input image for the user to perform an input operation is projected, based on the user's action indicating the position or a physical change caused by the action; and a designated position specifying step (S7, S28) of specifying the position designated by the user with respect to the input image projected onto the projection plane.
- According to the above configuration, the position on the projection plane where the input image is projected is determined based on the user's action indicating the position or the physical change caused by the action, and the position designated by the user with respect to the projected input image is specified.
- Therefore, by performing an action indicating the position where the input image is to be projected, the user can project the input image at a desired position on the projection plane and perform input to the target device at that position.
- the projection position determination means may determine the projection position of the input image by analyzing an image obtained by capturing the operation.
- According to the above configuration, the input device can determine the position at which the input image is projected without arranging a vibration sensor on the projection plane. Therefore, even when the input image is projected onto a projection target object on which a vibration sensor cannot be used because various vibrations occur, the projection position of the input image can be specified.
- The input device according to aspect 3 of the present invention is the input device according to aspect 2, further including an imaging unit (the imaging unit 13 or the second imaging unit 22) that captures the operation and user position detection means (the human sensor 21) that detects the position of the user, and the imaging unit may operate based on the result of detection of the user by the user position detection means.
- the user's operation is imaged by the operation of the imaging unit based on the result of detecting the user's position. Then, by analyzing the captured image, it is determined at which position on the projection plane the input image is projected.
- the input device can more reliably capture the user's action indicating the projection position of the input image.
- The input device according to aspect 4 of the present invention, in any of the above aspects 1 to 3, may be operable even when the target device is in a standby state.
- the input image can be projected onto the projection plane even when the target device is in the standby state.
- the input device can provide an input image that can be used in the same manner as an input device such as a remote controller.
- The input device (input control device 2) according to aspect 5 of the present invention, in any one of aspects 1 to 4, may be configured so that, when there are a plurality of the target devices, any of the plurality of target devices can be selected and a process in the selected target device can be selected by the user's input operation on the input image, and may further include transmission means (transmission unit 23) for transmitting a signal for causing the target device selected by the input operation to execute the process selected by the input operation.
- According to the above configuration, among the plurality of target devices, a signal for executing the process selected by the input operation on the input image can be transmitted to the target device selected by that input operation.
- the user can cause a plurality of target devices to execute processing by the operation on the projected input image.
- The input device according to a further aspect of the present invention is the input device according to aspect 1, wherein the projection position determining means may determine the projection position of the input image by analyzing the signals output from a plurality of vibration sensors 10 that detect vibrations generated by the operation, the plurality of vibration sensors being disposed on the projection plane.
- the position on the projection plane where the input image is projected is determined by analyzing the signals output from the plurality of vibration sensors arranged on the projection plane.
- Therefore, the user can display the input image at a desired position with a minimal action such as tapping the projection surface, so a more convenient input device can be provided.
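The patent leaves the analysis of the vibration-sensor signals unspecified. One classical option, shown here purely as an assumption, is to estimate the tap position from the difference in arrival times at two sensors placed on the projection plane (a 1-D time-difference-of-arrival calculation):

```python
def tap_position_1d(x1, x2, dt, v):
    """Estimate the tap coordinate x between two vibration sensors at
    x1 < x < x2 from the arrival-time difference dt = t1 - t2 and the
    propagation speed v of vibrations in the surface.

    Derivation: (x - x1) - (x2 - x) = v * dt
                =>  x = (x1 + x2 + v * dt) / 2
    """
    return (x1 + x2 + v * dt) / 2
```

With sensors at 0 m and 2 m and a hypothetical propagation speed of 100 m/s, a simultaneous arrival (dt = 0) places the tap at the midpoint, and a 10 ms lag at sensor 1 shifts the estimate 0.5 m toward sensor 2.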
- The input device according to each aspect of the present invention may be realized by a computer. In this case, a control program that realizes the input device by the computer by causing the computer to operate as each unit included in the input device, and a computer-readable recording medium on which the control program is recorded, also fall within the scope of the present invention.
- the embodiment of the present invention can also be expressed as follows.
- The control system of the present invention is a control system that performs control for operating an apparatus, and includes: a system control unit that controls each part of the apparatus and controls the system itself; input video projection means for freely projecting an input video for performing input for operating the apparatus onto a predetermined area; input video projection location instruction means for instructing the projection of the input video and its projection location; and operation status input means for converting the status operated by the user on the input video into information and inputting the information to the control unit.
- the control system it is possible to freely project an input video for performing input for operating the apparatus onto a predetermined area.
- In addition, the status operated by the user on the projected input video can be converted into information and input to the control unit. Therefore, the user can project the input video at a desired position on the projection surface and perform input to the target device at that position.
- the apparatus of the present invention is an apparatus that is controlled by the control system, and preferably has means for operating the control system even in a standby state.
- the control system can be operated even when the apparatus is in a standby state. Therefore, similarly to an input device such as a remote controller, it is possible to provide an input video that can operate the device from a standby state.
- The control system of the present invention is a control system that performs control for operating one or a plurality of devices, and includes: a control information transmission system control unit that performs control by transmitting a control signal to each control unit that controls each unit of each device; input video projection means for freely projecting, for each device, an input video for performing an operation for operating that device; input video projection location instruction means for instructing the projection of the input video and its projection location; and operation status input means for converting the status operated by the user on the input video into information and inputting the information to the control unit.
- the apparatus of the present invention preferably has means for receiving a signal from the control system.
- each device can be controlled by transmitting a control signal to each control unit that controls each unit of each device based on the situation operated by the user. This eliminates the need for the user to install an input device such as a remote control for each target device. Therefore, the control system can prevent a situation in which the target device cannot execute processing due to the loss of the input device of each target device.
- Preferably, the input video projection means includes means for switching among a plurality of input videos.
- the user can project an input image of a desired format at a desired position.
- the present invention can be suitably used for an apparatus that accepts and operates an input from a remote location, such as a television, an air conditioner, and a lighting device.
Description
Hereinafter, the television (television receiver, display device) 1, which is one aspect of the input device of the present invention, will be described in detail with reference to FIGS. 1 to 4. The television 1 of this aspect is described as a television capable of connecting to the Internet; however, the television according to the present invention is not limited to an Internet-capable television, and may be any television that can receive broadcast waves and output video and audio.

FIG. 2 is a schematic diagram showing the configuration of the control system 100 of the television 1 according to the present embodiment. As shown in FIG. 2, the television 1 determines at which position on the projection plane 30 of the projection target object the input image 40 for the user A to perform an input operation is projected, based on a physical change (vibration) generated by the user A's action indicating the projection position.

Next, the main configuration of the television 1 will be described in detail. FIG. 1 is a block diagram showing an example of the main configuration of the television 1 according to the present embodiment.

The position information receiving unit 11 is a communication device that can perform wired or wireless communication with a plurality of externally provided vibration sensors 10a and 10b and receives signals from them. As described above, the vibration sensors 10a and 10b are arranged on the projection plane 30; they detect vibrations caused by an action the user performs on the projection plane 30 and transmit a detection signal indicating that a vibration has been detected to the position information receiving unit 11. Upon receiving the detection signals from the vibration sensors 10a and 10b, the position information receiving unit 11 supplies them to the projection position specifying unit 151 described later.

Next, the configuration of the input control unit 15 will be described in detail. As shown in FIG. 1, the input control unit 15 is independent of the television control unit 16 described later. This makes it possible to realize an input device that, like a conventional input device such as a remote controller, can operate even when the television 1 is in the standby state. As with a remote controller, the user can activate the television 1 from the standby state at the viewing position, and can put the running television 1 into the standby state.
The projection position specifying unit (projection position determining means) 151 is a block that determines at which position on the projection plane 30 of the projection target object the input image 40 for the user to perform an input operation is projected, based on a physical change generated by the user's action indicating the projection position.

The projection control unit 152 controls the image projection unit 12 so as to project the input image 40 at the position indicated by the projection position information supplied from the projection position specifying unit 151. Specifically, the projection control unit 152 reads the input image 40 from the projection image storage unit 141 and causes the image projection unit 12 to project it onto the projection position.

The imaging control unit 153 controls the imaging direction (and imaging range) of the imaging unit 13 so that the user's operation on the input image 40 projected at the position indicated by the projection position information supplied from the projection position specifying unit 151 can be captured, and causes the imaging unit 13 to perform imaging. The imaging control unit 153 supplies the data of the image captured by the imaging unit 13, covering an area including the input image 40 (captured image data), to the image analysis unit 154.

The image analysis unit (designated position specifying means) 154 is a block that specifies the position designated by the user with respect to the input image 40 projected on the projection plane 30. Specifically, the image analysis unit 154 analyzes the captured image data supplied from the imaging control unit 153 and determines whether the user has performed an action on the input image 40 (such as touching it with a finger).

The process determining unit 155 is a block that determines the process to be executed by the television 1 according to the position (touch position) designated by the user in the input image 40. The storage unit 14 stores correspondence information indicating the correspondence between touch positions in the projected input image 40 and the types of control signals to be transmitted to the television control unit 16. The process determining unit 155 refers to this correspondence information, specifies the control signal corresponding to the touch position indicated by the touch position information supplied from the image analysis unit 154, and supplies the specified control signal to the television control unit 16 described later.
The image projection unit 12 is a projector that projects the input image 40 onto the projection position specified by the projection position specifying unit 151. Under the control of the projection control unit 152, the image projection unit 12 can change its projection direction according to the projection position, and can thereby project the input image 40 onto that position.

The imaging unit 13 is a camera for imaging the user's operation. Specifically, the imaging unit 13 captures an area including the projected input image 40 and supplies the captured image data to the imaging control unit 153.

The storage unit 14 is a storage area that stores the control program executed by the input control unit 15 and the various data (setting values, tables, and the like) read when the control program is executed. As the storage unit 14, conventionally known storage means can be used, for example, ROM (Read Only Memory), RAM (Random Access Memory), flash memory, EPROM (Erasable Programmable ROM), EEPROM (registered trademark) (Electrically EPROM), or HDD (Hard Disk Drive). Various data handled by the input control unit 15 and data being processed are temporarily stored in the working memory of the storage unit 14.

The television control unit 16 is a control device that controls the various functions of the television 1. The television control unit 16 executes the process indicated by the control signal supplied from the process determining unit 155. For example, when the control signal indicates a channel change, the television control unit 16 receives the broadcast wave corresponding to the changed channel and causes the display unit 17 described later to display an image. When the control signal indicates acquisition of content via the Internet connection, the television control unit 16 acquires content from an external server (not shown) and causes the display unit 17 to display an image of the content. Further, when the control signal indicates activation of the television 1 from the standby state or a transition to the standby state, the television control unit 16 starts or stops the output of video and audio.
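The behavior of the television control unit described above can be sketched as a small state transition on a toy TV state; the signal names and state encoding are assumptions of this sketch, not from the patent:

```python
def television_control(tv, signal):
    """Toy television control unit 16: apply one control signal to a
    TV state dict and return the new state."""
    tv = dict(tv)  # do not mutate the caller's state
    if signal == "WAKE":
        tv["power"] = "on"           # start video/audio output
    elif signal == "STANDBY":
        tv["power"] = "standby"      # stop video/audio output
    elif tv["power"] == "on" and signal.startswith("CHANNEL:"):
        # tune to the requested channel while the TV is running
        tv["channel"] = int(signal.split(":", 1)[1])
    return tv
```

Note that channel changes are ignored in standby, while `WAKE` works from any state, matching the idea that the projected input image can wake the TV like a remote controller.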
Next, the flow of input operation determination processing in the television 1 according to the present embodiment will be described. FIG. 4 is a flowchart showing an example of the flow of input processing in the television 1.
Another embodiment of the present invention is described below with reference to FIGS. 5 to 7. For convenience of explanation, members having the same functions as those described in the above embodiment are given the same reference signs, and their description is omitted.

FIG. 5 is a block diagram showing an example of the main configuration of the television 110 according to the present embodiment. As shown in FIG. 5, the television 110 according to the present embodiment includes a human sensor 21 and a second imaging unit 22 instead of the position information receiving unit 11 included in the television 1 of the first embodiment. The television 110 also includes a projection position specifying unit 156 instead of the projection position specifying unit 151.

The human sensor 21 (user position detection means) is a sensor that detects the position of a user within its detection range. The detection range of the human sensor 21 may be limited to the projection plane 30 and the spatial region in its vicinity. In this case, the human sensor 21 detects the position of the user when the user is present near the projection plane 30.

The second imaging unit 22 is a camera for imaging the user's action indicating the position where the input image 40 is to be projected. Specifically, the second imaging unit 22 captures an area including the user position detected by the human sensor 21 and supplies captured image data representing the captured image to the projection position specifying unit 156. Here, the "area including the user's position" is an area of a predetermined range centered on the position indicated by the user position information. After the human sensor 21 detects the user's position, the second imaging unit 22 performs the above imaging at predetermined time intervals and supplies each captured image data to the projection position specifying unit 156.

When the human sensor 21 detects the presence of a user, the projection position specifying unit (projection position determining means) 156 first causes the second imaging unit 22 to perform imaging. If the detected user position is outside the imaging range of the second imaging unit 22 at that time, the projection position specifying unit 156 controls the imaging direction of the second imaging unit 22 according to the user position information and then causes imaging to be performed.
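If the second imaging unit 22 is a pan-tilt camera, aiming it at the position reported by the human sensor 21 reduces to computing a pan angle from the two positions. The planar coordinate frame below is an assumption of this sketch:

```python
import math

def pan_angle_deg(camera_xy, user_xy):
    """Pan angle in degrees (counterclockwise from the +x axis) that
    points the second imaging unit 22 at the user position reported
    by the human sensor 21, both given as (x, y) in the same plane."""
    dx = user_xy[0] - camera_xy[0]
    dy = user_xy[1] - camera_xy[1]
    return math.degrees(math.atan2(dy, dx))
```

The projection position specifying unit would compare this angle with the camera's current field of view and re-aim only when the user falls outside it, as described above.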
Next, the flow of the input operation determination process in the television 110 according to the present embodiment will be described. FIG. 7 is a flowchart showing an example of the flow of the input operation determination process in the television 110.
Yet another embodiment of the present invention will now be described with reference to FIG. 8. For convenience of explanation, members having the same functions as those described in the preceding embodiments are given the same reference numerals, and their descriptions are omitted.
Yet another embodiment of the present invention will now be described with reference to FIGS. 9 and 10. For convenience of explanation, members having the same functions as those described in the preceding embodiments are given the same reference numerals, and their descriptions are omitted.
FIG. 10 is a schematic diagram showing the configuration of a television control system 300 according to the present embodiment. The television 120 according to the present embodiment has no internal camera for capturing the user's actions; instead, it communicates with an external imaging device 20 by wired or wireless communication and acquires the captured image data from it. The user can therefore freely change where the imaging device 20 is installed.
In any of the televisions 1 to 120 described above, the type of input image 40 to be projected may be changed according to the instruction action the user performs on the projected input image 40.
Yet another embodiment of the present invention will now be described with reference to FIGS. 11 and 12. For convenience of explanation, members having the same functions as those described in the preceding embodiments are given the same reference numerals, and their descriptions are omitted.
In the present embodiment, an input control device 2, which is one aspect of the input device of the present invention, will be described. FIG. 11 is a block diagram showing an example of the main configuration of the input control device 2 of the present embodiment. The input control device 2 is a device that receives user input for causing a plurality of devices (target devices), such as a television 3, an air conditioner 4, and a lighting device 5, to execute processes. The target devices are not limited to those listed above; any device capable of receiving an external signal and executing a process may be used.
The input information determination unit 158 is a block that determines, according to the user's input to the input image 40, the selected device and the process to be executed by that device. The input information determination unit 158 includes the process determination unit 155 and a device selection unit 157.
The transmission control unit 159 is a block that controls the transmission unit 23. Specifically, by controlling the transmission unit 23, the transmission control unit 159 transmits the control signal corresponding to the process determined by the process determination unit 155 to the specific device determined by the device selection unit 157.
The transmission unit 23 (transmission means) is a communication device that transmits the control signals corresponding to the processes to be executed by the target devices. Transmission of the control signals from the transmission unit 23 to the target devices is preferably wireless, but may be wired.
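The routing performed by the transmission control unit 159 and transmission unit 23 can be sketched as below. The device and process names, the signal dictionary, and the injected `transmit` callable (standing in for the wireless or wired transmission unit 23) are all illustrative assumptions.

```python
def send_control_signal(selected_device, process, transmit):
    """Build the control signal for the process determined by the
    process determination unit and hand it to the transmission unit,
    addressed to the device chosen by the device selection unit.

    `transmit` is any callable standing in for the transmission unit 23
    (e.g. an infrared or radio transmitter driver)."""
    signal = {"device": selected_device, "process": process}
    transmit(signal)
    return signal
```

For instance, selecting the air conditioner and a power-on process would send `{"device": "air_conditioner", "process": "power_on"}` through the supplied transmitter.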
FIG. 12 is a schematic diagram showing an example of an input image projected by the input control device 2 of the present embodiment. FIG. 12(a) is a schematic diagram showing an example of the device selection image 41 described above. The device selection image 41 is an image for selecting the target device to which input is directed; in FIG. 12(a), button-like regions for selecting the television 3, the air conditioner 4, and the lighting device 5 are each drawn. The drawn regions are not limited to this example, and change according to the types of target devices.
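Selecting a target device from the device selection image 41 amounts to hit-testing the user's touch against the drawn button regions. In this sketch the button layout and device names are assumptions; the actual regions change with the types of target devices, as the text notes.

```python
# Button-like regions drawn in the device selection image 41
# (coordinates and device names are illustrative assumptions).
DEVICE_BUTTONS = {
    "television": (0, 0, 60, 40),
    "air_conditioner": (60, 0, 120, 40),
    "lighting": (120, 0, 180, 40),
}

def select_device(touch_x, touch_y):
    """Return the target device whose button region contains the touch,
    or None if the touch misses every button."""
    for device, (x0, y0, x1, y1) in DEVICE_BUTTONS.items():
        if x0 <= touch_x < x1 and y0 <= touch_y < y1:
            return device
    return None
```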
The control blocks of the television 1 and the input control device 2 (in particular, the projection position identification units 151 and 156, the projection control unit 152, the imaging control unit 153, the image analysis unit 154, the process determination unit 155, the input information determination unit 158, the transmission control unit 159, and the projection control unit 160) may be implemented by logic circuits (hardware) formed on an integrated circuit (IC chip) or the like, or may be implemented in software using a CPU (Central Processing Unit).
An input device (television 1, input control device 2) according to Aspect 1 of the present invention is an input device that receives input from a user for a target device, and includes: projection position determination means (projection position identification units 151, 156) for determining at which position on a projection surface 30 of a projection target object an input image 40, with which the user performs an input operation, is to be projected, based on a user action indicating the position or on a physical change occurring with that action; and designated position identification means (image analysis unit 154) for identifying the position designated by the user on the input image projected onto the projection surface.
2 Input control device (input device)
10 Vibration sensor
13 Imaging unit
21 Human presence sensor (user position detection means)
22 Second imaging unit (imaging unit)
23 Transmission unit (transmission means)
30 Projection surface
40 Input image
151, 156 Projection position identification unit (projection position determination means)
154 Image analysis unit (designated position identification means)
Claims (5)
- An input device that receives input from a user for a target device, the input device comprising: projection position determination means for determining at which position on a projection surface of a projection target object an input image, with which the user performs an input operation, is to be projected, based on a user action indicating the position or on a physical change occurring with that action; and designated position identification means for identifying a position designated by the user on the input image projected onto the projection surface.
- The input device according to claim 1, wherein the projection position determination means determines the projection position of the input image by analyzing an image capturing the action.
- The input device according to claim 2, further comprising: an imaging unit that captures the action; and user position detection means for detecting the position of the user, wherein the imaging unit operates based on a result of detection of the user by the user position detection means.
- The input device according to any one of claims 1 to 3, wherein the input device is operable even when the target device is in a standby state.
- The input device according to any one of claims 1 to 4, wherein, when a plurality of the target devices exist, a user input action on the input image can select one of the plurality of target devices and can select a process on the selected target device, the input device further comprising transmission means for transmitting, to the target device selected by the input action, a signal for causing the process selected by the input action to be executed.
Priority Applications (3)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
JP2015507968A JPWO2014155885A1 (ja) | 2013-03-27 | 2013-12-26 | Input device |
CN201380075096.1A CN105122186A (zh) | 2013-03-27 | 2013-12-26 | Input device |
US14/779,033 US20160054860A1 (en) | 2013-03-27 | 2013-12-26 | Input device |
Applications Claiming Priority (2)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
JP2013-067607 | 2013-03-27 | ||
JP2013067607 | 2013-03-27 |
Publications (1)
Publication Number | Publication Date |
---|---|
WO2014155885A1 true WO2014155885A1 (ja) | 2014-10-02 |
Family
ID=51622916
Family Applications (1)
Application Number | Title | Priority Date | Filing Date |
---|---|---|---|
PCT/JP2013/084894 WO2014155885A1 (ja) | 2013-03-27 | 2013-12-26 | Input device |
Country Status (4)
Country | Link |
---|---|
US (1) | US20160054860A1 (ja) |
JP (1) | JPWO2014155885A1 (ja) |
CN (1) | CN105122186A (ja) |
WO (1) | WO2014155885A1 (ja) |
Cited By (3)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
JPWO2015105044A1 (ja) * | 2014-01-10 | 2017-03-23 | NEC Corporation | Interface device, portable device, control device, module, control method, and computer program |
JPWO2018008218A1 (ja) * | 2016-07-05 | 2019-04-18 | Sony Corporation | Information processing apparatus, information processing method, and program |
JP2019087138A (ja) * | 2017-11-09 | 2019-06-06 | Bandai Namco Entertainment Inc. | Display control system and program |
Families Citing this family (2)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
CN106484100A (zh) * | 2016-09-12 | 2017-03-08 | Gree Electric Appliances Inc. of Zhuhai | Air conditioner, control method and device therefor, and wired controller for the air conditioner |
CN110012329B (zh) * | 2019-03-19 | 2021-06-04 | Hisense Visual Technology Co., Ltd. | Method for responding to touch events in a display device, and display device |
Citations (5)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
JPH06153017A (ja) * | 1992-11-02 | 1994-05-31 | Sanyo Electric Co Ltd | Remote control device for equipment |
JP2010067062A (ja) * | 2008-09-11 | 2010-03-25 | Ntt Docomo Inc | Input system and input method |
JP2010134629A (ja) * | 2008-12-03 | 2010-06-17 | Sony Corp | Information processing apparatus and information processing method |
JP2012191568A (ja) * | 2011-03-14 | 2012-10-04 | Ricoh Co Ltd | Image projection apparatus, function setting method, and function setting program |
WO2012173001A1 (ja) * | 2011-06-13 | 2012-12-20 | Citizen Holdings Co., Ltd. | Information input device |
Family Cites Families (3)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
US20050259322A1 (en) * | 2004-05-20 | 2005-11-24 | Boecker James A | Touch-enabled projection screen incorporating vibration sensors |
JP5408175B2 (ja) * | 2011-03-31 | 2014-02-05 | Casio Computer Co., Ltd. | Projection apparatus, projection method, and program |
CN104981757B (zh) * | 2013-02-14 | 2017-08-22 | Apple Inc. | Flexible room controller |
2013
- 2013-12-26 WO PCT/JP2013/084894 patent/WO2014155885A1/ja active Application Filing
- 2013-12-26 CN CN201380075096.1A patent/CN105122186A/zh active Pending
- 2013-12-26 US US14/779,033 patent/US20160054860A1/en not_active Abandoned
- 2013-12-26 JP JP2015507968A patent/JPWO2014155885A1/ja active Pending
Also Published As
Publication number | Publication date |
---|---|
CN105122186A (zh) | 2015-12-02 |
US20160054860A1 (en) | 2016-02-25 |
JPWO2014155885A1 (ja) | 2017-02-16 |
Legal Events
Date | Code | Title | Description |
---|---|---|---|
| 121 | Ep: the epo has been informed by wipo that ep was designated in this application | Ref document number: 13880130; Country of ref document: EP; Kind code of ref document: A1 |
| ENP | Entry into the national phase | Ref document number: 2015507968; Country of ref document: JP; Kind code of ref document: A |
| WWE | Wipo information: entry into national phase | Ref document number: 14779033; Country of ref document: US |
| NENP | Non-entry into the national phase | Ref country code: DE |
| 122 | Ep: pct application non-entry in european phase | Ref document number: 13880130; Country of ref document: EP; Kind code of ref document: A1 |