WO2010090106A1 - Information processing apparatus, information processing method, and program - Google Patents
Information processing apparatus, information processing method, and program
- Publication number
- WO2010090106A1 (PCT/JP2010/051020, JP2010051020W)
- Authority
- WO
- WIPO (PCT)
- Prior art keywords
- output device
- gesture
- information
- information processing
- input
- Prior art date
Classifications
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06F—ELECTRIC DIGITAL DATA PROCESSING
- G06F1/00—Details not covered by groups G06F3/00 - G06F13/00 and G06F21/00
- G06F1/16—Constructional details or arrangements
- G06F1/1613—Constructional details or arrangements for portable computers
- G06F1/1626—Constructional details or arrangements for portable computers with a single-body enclosure integrating a flat display, e.g. Personal Digital Assistants [PDAs]
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06F—ELECTRIC DIGITAL DATA PROCESSING
- G06F1/00—Details not covered by groups G06F3/00 - G06F13/00 and G06F21/00
- G06F1/16—Constructional details or arrangements
- G06F1/1613—Constructional details or arrangements for portable computers
- G06F1/1633—Constructional details or arrangements of portable computers not specific to the type of enclosures covered by groups G06F1/1615 - G06F1/1626
- G06F1/1637—Details related to the display arrangement, including those related to the mounting of the display in the housing
- G06F1/1643—Details related to the display arrangement, including those related to the mounting of the display in the housing the display being associated to a digitizer, e.g. laptops that can be used as penpads
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06F—ELECTRIC DIGITAL DATA PROCESSING
- G06F1/00—Details not covered by groups G06F3/00 - G06F13/00 and G06F21/00
- G06F1/16—Constructional details or arrangements
- G06F1/1613—Constructional details or arrangements for portable computers
- G06F1/1633—Constructional details or arrangements of portable computers not specific to the type of enclosures covered by groups G06F1/1615 - G06F1/1626
- G06F1/1684—Constructional details or arrangements related to integrated I/O peripherals not covered by groups G06F1/1635 - G06F1/1675
- G06F1/169—Constructional details or arrangements related to integrated I/O peripherals not covered by groups G06F1/1635 - G06F1/1675 the I/O peripheral being an integrated pointing device, e.g. trackball in the palm rest area, mini-joystick integrated between keyboard keys, touch pads or touch stripes
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06F—ELECTRIC DIGITAL DATA PROCESSING
- G06F1/00—Details not covered by groups G06F3/00 - G06F13/00 and G06F21/00
- G06F1/16—Constructional details or arrangements
- G06F1/1613—Constructional details or arrangements for portable computers
- G06F1/1633—Constructional details or arrangements of portable computers not specific to the type of enclosures covered by groups G06F1/1615 - G06F1/1626
- G06F1/1684—Constructional details or arrangements related to integrated I/O peripherals not covered by groups G06F1/1635 - G06F1/1675
- G06F1/169—Constructional details or arrangements related to integrated I/O peripherals not covered by groups G06F1/1635 - G06F1/1675 the I/O peripheral being an integrated pointing device, e.g. trackball in the palm rest area, mini-joystick integrated between keyboard keys, touch pads or touch stripes
- G06F1/1692—Constructional details or arrangements related to integrated I/O peripherals not covered by groups G06F1/1635 - G06F1/1675 the I/O peripheral being an integrated pointing device, e.g. trackball in the palm rest area, mini-joystick integrated between keyboard keys, touch pads or touch stripes the I/O peripheral being a secondary touch screen used as control interface, e.g. virtual buttons or sliders
Definitions
- The present invention relates to an information processing apparatus, an information processing method, and a program.
- In recent years, portable information processing devices equipped with a touch panel, a touch pad, or the like (hereinafter collectively referred to as a touch panel) have become widespread. Examples of such portable information processing apparatuses include mobile phones, PHS (Personal Handy-phone System) terminals, portable video players, portable music players, and PDAs (Personal Digital Assistants). More recently, touch panels have also been mounted on television receivers, portable game machines, remote controllers (hereinafter referred to as remote controllers), and the like.
- For example, Patent Document 1 proposes a technique for swapping the display screen of a portable information processing device to a large screen display device by intuitive gesture input on the touch panel of the portable information processing device.
- An object of the present invention is to enable large-screen display content to be operated directly from a portable information processing apparatus.
- According to an aspect of the present invention, there is provided an information processing apparatus including: an input position detection unit that is provided on the display screen side of a display device on which a predetermined object is displayed and detects an input position of a first operating body on the display screen; an object specifying unit that specifies a selected object, which is the object selected by the first operating body, based on input position information representing the input position and display position information representing the display position of the object; a gesture detection unit that is provided on the back side of the display screen of the display device and detects a gesture when a predetermined gesture is input by a second operating body; an output device selection unit that selects, based on first gesture information representing the gesture detected by the gesture detection unit, an output device, that is, an external device that outputs the content data corresponding to the selected object; and a signal generation unit that generates, based on second gesture information representing a gesture detected after the location information of the content data is transmitted to the output device, a control signal for causing the selected output device to execute a predetermined process.
- The information processing apparatus may further include a location information transfer unit that transfers the location information of the content data corresponding to the selected object to the output device selected by the output device selection unit.
- The gesture detection unit may detect an input position of the second operating body on the back surface of the display screen. In that case, the area on the display screen and the area on the back surface of the display screen are each divided into a plurality of divided areas, and the information processing apparatus may further include: an input area detection unit that detects, based on first input position information indicating the input position of the first operating body input from the input position detection unit and second input position information indicating the input position of the second operating body input from the gesture detection unit, the divided area on the display screen where the first operating body is located and the divided area on the back surface of the display screen where the second operating body is located; and a determination unit that determines, based on first divided area information representing the divided area where the first operating body is located and second divided area information representing the divided area where the second operating body is located, whether the two divided areas have a corresponding positional relationship. The location information transfer unit may then transfer the location information only when the determination unit determines that the divided area where the first operating body is located and the divided area where the second operating body is located have a corresponding positional relationship.
- The information processing apparatus may further include a layout adjustment unit that acquires, from the output device, display control information, which is information for display control of the content data corresponding to the selected object on the output device, and that generates a command signal for adjusting the layout of the content data on the display screen of the output device based on the acquired display control information and on third gesture information representing a gesture detected by the gesture detection unit after the content data is displayed on the output device.
- When layout adjustment is performed while one piece of content data is displayed, the output device may store layout information on the layout of that content data after the adjustment in association with the location information of the content data. When the output device displays other content data, the layout adjustment unit may transmit to the output device, in accordance with the gesture information input from the gesture detection unit, application condition information indicating an application condition of the layout information stored in the output device in association with the location information of the one piece of content data.
- The information processing apparatus may further include a device registration unit that performs a device registration process for making external devices and the information processing apparatus mutually accessible via a network using a common protocol, and the output device selection unit may select the output device from among the external devices that have undergone the device registration process.
- According to another aspect of the present invention, there is provided an information processing method including: an input position detection step of detecting an input position of a first operating body on a display screen of a display device on which a predetermined object is displayed; an object specifying step of specifying a selected object, which is the object selected by the first operating body, based on input position information representing the input position and display position information representing the display position of the object; a gesture detection step of detecting a predetermined gesture input by a second operating body while the selected object is selected by the first operating body; an output device selection step of selecting, based on first gesture information representing the gesture detected in the gesture detection step, an output device that is an external device for outputting the content data corresponding to the selected object; and a signal generation step of generating, based on second gesture information, a control signal for causing the selected output device to execute a predetermined process.
- According to still another aspect of the present invention, there is provided a program for causing a computer to function as an information processing apparatus including: an input position detection unit that is provided on the display screen side of a display device on which a predetermined object is displayed and detects an input position of a first operating body on the display screen; an object specifying unit that specifies a selected object, which is the object selected by the first operating body, based on input position information representing the input position and display position information representing the display position of the object; a gesture detection unit that is provided on the back side of the display screen of the display device and detects a gesture when a predetermined gesture is input by a second operating body; an output device selection unit that selects, based on first gesture information representing the gesture detected by the gesture detection unit, an output device that is an external device for outputting the content data corresponding to the selected object; and a signal generation unit that generates, based on second gesture information representing a gesture detected after the location information of the content data is transmitted to the output device, a control signal for causing the selected output device to execute a predetermined process.
- The computer program is stored in a storage unit included in the computer and is read and executed by a CPU included in the computer, thereby causing the computer to function as the information processing apparatus described above.
- A computer-readable recording medium on which the computer program is recorded can also be provided.
- The recording medium is, for example, a magnetic disk, an optical disk, a magneto-optical disk, or a flash memory.
- The above computer program may also be distributed via a network, for example, without using a recording medium.
- As described above, according to the present invention, user operations become simpler and more intuitive than before, and large-screen display content can be operated directly from a portable information processing apparatus.
- FIG. 3 is an explanatory diagram illustrating the external configuration (display screen side) and a usage example of the information processing apparatus according to the embodiment. FIG. 4 is an explanatory diagram illustrating the external configuration (back side) and a usage example of the information processing apparatus according to the embodiment. FIG. 5 is an explanatory diagram showing an outline of the information processing method according to the embodiment. FIG. 6 is an exploded perspective view showing the hardware configuration of the information processing apparatus according to the embodiment. FIG. 7 is a block diagram showing the hardware configuration of the information processing apparatus according to the embodiment.
- Description of reference numerals: 100 information processing apparatus; 101 device registration unit; 103 input position detection unit; 105 display control unit; 107 object specifying unit; 109 gesture detection unit; 111 output device selection unit; 113 content data management unit; 115 signal generation unit; 117 layout adjustment unit; 119 storage unit; 135 location information transfer unit; 136 input area detection unit; 137 determination unit; 150 touch panel; 160 touch pad; 170 first operating body; 180 second operating body
- FIG. 1 is an explanatory diagram illustrating the configuration of a video display system 1 (see, for example, Patent Document 1) as an example of an information processing system including a portable information processing device in the related art.
- The video display system 1 includes a plastic display device 2 (an example of an information processing device in the related art), a base device (base station) 3, and a large screen display device 4.
- Video signals supplied from the base device 3 can be displayed on both the plastic display device 2 and the large screen display device 4.
- In the video display system 1, instruction contents from the user are assigned in advance to coordinate changes of the indicated position on the touch panel provided on the LCD serving as the display element of the plastic display device 2. The plastic display device 2 detects a coordinate change of the indicated position on the display screen, determines the instruction content assigned to the detected coordinate change, and generates a control signal based on the determined instruction content.
- For example, an instruction to swap the display content of the plastic display device 2 to the display screen of the large screen display device 4 is assigned to a specific coordinate change of the indicated position on the display screen of the plastic display device 2 (for example, a specific drag operation).
- With this operation, the display content of the plastic display device 2 can be swapped to the display screen of the large screen display device 4.
- In the present embodiment, touch panels are provided on both surfaces of the display panel of the portable information processing apparatus. One of the touch panels (for example, the touch panel on the front surface side of the display panel) is used to select an object, and an external device (for example, a device that reproduces the content corresponding to the swapped object) outputs that content.
- FIG. 2 is an explanatory diagram showing the overall configuration of the information processing system according to the present embodiment.
- The information processing system 10 mainly includes a portable information processing apparatus 100, an output device 200, and a WEB server 300.
- The information processing apparatus 100 can communicate with the output device 200, the WEB server 300, and the like via the network 400.
- The type of the network 400 is not particularly limited; examples include the Internet and a home network using a protocol such as DLNA (Digital Living Network Alliance).
- In the information processing system 10, when a user browsing a WEB page on the WEB server 300 through a browser or the like on the information processing apparatus 100 selects specific content, that content can be output to the output device 200.
- Specifically, the information processing apparatus 100 can transmit the location information of the selected content data acquired via the network 400 (for example, the URL of the WEB page where the content data is stored) to the output device 200.
- The output device 200 that has acquired the location information of the content data outputs the content data using the application associated with the content data.
- Further, when the user operates the information processing apparatus 100, the information processing apparatus 100 generates control signals for causing the output device 200 to execute various processes and transmits them to the output device 200.
- The output device 200 that has received such a control signal executes the process corresponding to it (for example, scrolling or zooming of images, fast-forwarding or rewinding of video and music, or changing the volume).
- The information processing apparatus 100 is an electronic device that is connected to the network 400 by any means, such as FTTH (Fiber To The Home) or WiMAX (Worldwide Interoperability for Microwave Access), and that can browse WEB pages via a browser.
- Examples of such an information processing apparatus 100 include a notebook personal computer (hereinafter referred to as a PC), a mobile phone, a PHS (Personal Handy-phone System), a portable video player, a portable music player, a PDA (Personal Digital Assistant), and a portable game machine.
- The information processing apparatus 100 may also be a remote controller or the like, as long as it has a display device such as an LCD.
- As will be described in detail later, the information processing apparatus 100 has a display device, and a touch panel or touch pad is mounted on both the front and back surfaces of the display screen of the display device.
- When operating the information processing apparatus 100, the user normally performs a predetermined operation (gesture operation) by moving a finger, a stylus, or the like (hereinafter referred to as an operating body) while pressing it against the surface of the touch panel or touch pad.
- When such an operation is performed, the touch panel reads the point at which the operating body contacts the touch panel surface as coordinates.
- The method for reading the contact position between the operating body and the touch panel surface is not particularly limited, and any method such as a so-called electrostatic method, a pressure-sensitive method, or an optical method can be used. The coordinates read by the touch panel are then transmitted to arithmetic processing means, and predetermined processing is executed.
- Although FIG. 1 shows an example in which only one information processing apparatus 100 is connected to the network 400, the number of information processing apparatuses 100 is not particularly limited.
- The output device 200 is a device that outputs the content data corresponding to an object selected by the operating body from among the objects displayed on the display screen of the display device of the information processing apparatus 100.
- The output device 200 is not particularly limited as long as it can output the content data handled by the information processing apparatus 100.
- Specific examples of the output device 200 include a television receiver having a large screen display, a stationary audio device, and the like.
- Like the information processing apparatus 100, the output device 200 is connected to the network 400 by any means such as FTTH or WiMAX.
- FIG. 1 shows an example in which a television receiver 210 having a large screen display and a stationary audio device 220 are connected to the network 400 as output devices 200.
- However, the number of output devices 200 is not particularly limited.
- When the output device 200 is selected as the device that outputs the content data corresponding to the object selected by the operating body (hereinafter, the selected content data), it acquires the location information of that content data from the information processing apparatus 100. The output device 200 then acquires the content data from the WEB server 300 based on the acquired location information (for example, the URL where the content data is stored) and executes predetermined processes based on control signals from the information processing apparatus 100. Examples of such processes include focusing and zooming when the selected content data is a still image such as a photograph, and playback, pause, fast-forward, rewind, and volume adjustment when the selected content is a moving image or music.
- The WEB server 300 transmits the location information of content data (for example, the URL of the WEB page where the content data is stored) to the information processing apparatus 100. The WEB server 300 also distributes the content data corresponding to that location information in response to a request from the output device 200 that has acquired the location information from the information processing apparatus 100.
- The type of content data distributed by the WEB server 300 is not particularly limited as long as it is data to be displayed on a display unit.
- The WEB server 300 is a server that provides WEB services executable from a WEB browser or the like, for example, a photo sharing service, a video distribution service, or a music distribution service. The user can view the content distributed from the WEB server 300 on the information processing apparatus 100 or the output device 200.
- The network 400 is a communication network that connects the information processing apparatus 100, the output device 200, and the WEB server 300 so that they can communicate bidirectionally or unidirectionally.
- The network 400 may be wired or wireless and includes, for example, public networks such as the Internet, telephone networks, satellite communication networks, and broadcast channels, as well as WANs (Wide Area Networks), LANs (Local Area Networks), IP-VPNs (Internet Protocol-Virtual Private Networks), Ethernet (registered trademark), and dedicated networks such as wireless LANs.
- In the present embodiment, the information processing apparatus 100 and the output device 200 can perform data communication with each other on a home network using a protocol such as DLNA.
- FIG. 3 is an explanatory diagram illustrating an appearance configuration (display screen side) and a usage example of the information processing apparatus 100 according to the present embodiment.
- FIG. 4 is an explanatory diagram illustrating an external configuration (back side) and a usage example of the information processing apparatus 100 according to the present embodiment.
- FIG. 5 is an explanatory diagram showing an outline of the information processing method according to the present embodiment.
- As shown in FIGS. 3 and 4, the information processing apparatus 100 is a portable electronic device provided with a touch panel 150 on the front surface (display screen side) and a touch pad 160 on the back surface (the rear side of the display screen).
- The user operates the information processing apparatus 100 by gesture operations on the front touch panel 150.
- For example, the user browses content on the WEB server 300 via a WEB browser by tap or drag operations on the touch panel 150 using the first operating body 170 (for example, the user's thumb).
- The user can also select, by a tap operation or the like using the first operating body 170, the object corresponding to the content data to be output to the output device 200 from among the objects displayed on the touch panel 150.
- By gesture operations on the rear touch pad 160, the user can select the output device 200, swap content to the selected output device 200, directly operate the output device 200, and so on.
- For example, the user can select a desired output device 200 by a tap or drag operation on the touch pad 160 using the second operating body 180 (for example, the index finger of the user).
- The desired output device 200 here is the output device 200 that is to output the content data (hereinafter, selected content data) corresponding to the object selected by gesture input with the first operating body 170.
- When the output device 200 is selected, the information processing apparatus 100 acquires the location information of the selected content data from the WEB server 300 and transmits the acquired location information to the output device 200.
- Thereafter, the user can directly operate the output device 200 from the information processing apparatus 100 by performing tap or drag operations on the touch pad 160.
- Examples of such direct operations on the output device 200 include focusing and zooming when the selected content data is a still image such as a photograph, and playback, pause, fast-forward, rewind, and volume adjustment when the selected content is a moving image or music.
- In the present embodiment, the thumb of the user's right hand is used as an example of the first operating body 170, and the index finger of the user's right hand is used as an example of the second operating body 180.
- However, the operating bodies are not limited to these fingers. As the first operating body 170 and the second operating body 180, any finger that is easy for the user to use (for example, a finger of the left hand) may be used, and a stylus or the like may also be used.
- FIG. 5 illustrates an example in which a plurality of thumbnail images indicated by rectangles are displayed as objects on the application screen shown on the touch panel 150 of the information processing apparatus 100. Assume that the object 150a is selected from among the plurality of objects displayed on the touch panel 150 by the first operating body 170 (the user's thumb).
- FIG. 5 also illustrates a case where a television receiver 210 having a large screen display 211 and an audio device 220 are connected to the network 400 as output devices 200.
- Depending on the gesture input on the touch pad 160, the information processing apparatus 100 can select either the television receiver 210 or the audio device 220 as the output device 200.
- When an output device 200 is selected, the information processing apparatus 100 transmits the location information (for example, the URL) of the content data corresponding to the object 150a to the selected output device 200.
- In other words, information such as the location information of the content data corresponding to the object 150a selected by the first operating body 170 is transmitted to the output device 200 by a cooperative operation between the touch panel 150 on the front surface of the information processing apparatus and the touch pad 160 on the back surface.
- The output device 200 that has received the location information of the content data accesses the WEB server 300 based on that information and acquires the content data corresponding to the object 150a from the WEB server 300.
- When the output device 200 selected by the information processing apparatus 100 is the television receiver 210, the content corresponding to the object 150a is displayed on the display 211. Thereafter, when a gesture is input to the touch pad 160 by the second operating body 180, the processing corresponding to the gesture is executed by the television receiver 210.
- When the output device 200 selected by the information processing apparatus 100 is the audio device 220, the music content data acquired from the WEB server 300 is stored in the audio device 220. If the audio device 220 has a display unit, a player screen corresponding to the acquired music content data may be displayed on that display unit. Thereafter, when a gesture is input to the touch pad 160 by the second operating body 180, the processing corresponding to the gesture (music playback or the like) is executed by the audio device 220.
- FIG. 6 is an exploded perspective view showing a hardware configuration of the information processing apparatus 100 according to the present embodiment.
- FIG. 7 is a block diagram illustrating a hardware configuration of the information processing apparatus 100 according to the present embodiment.
- As shown in FIGS. 6 and 7, the information processing apparatus 100 includes a display device 151 provided on a substrate 191, an information input device 153 provided on the display screen 151a side of the display device 151 (the front side of the information processing apparatus 100), and a touch pad 160 provided on the back side of the display screen 151a of the display device 151 (the rear side of the information processing apparatus 100). As described below, the apparatus also includes a RAM (Random Access Memory) and a CPU (Central Processing Unit).
- The display device 151 displays the results obtained by the various processes performed by the information processing apparatus 100 as text or images.
- The display device 151 constitutes the touch panel 150 together with the information input device 153 described later.
- Specific examples of the display device 151 include devices that can visually notify the user of information, such as an LCD (Liquid Crystal Display) and an organic EL (Electroluminescence) display device.
- The information input device 153 has a panel shape and constitutes the touch panel 150 together with the display device 151.
- The information input device 153 detects the contact position of the first operating body 170 on its surface as the input position of the first operating body 170 on the display screen of the display device 151.
- The information input device 153 outputs input position information indicating the detected input position of the first operating body 170 to the CPU 197 as an information signal.
- The user of the information processing apparatus 100 can input various data to, and instruct processing operations of, the information processing apparatus 100 by operating the information input device 153.
- The touch pad 160 has a panel shape like the information input device 153.
- The touch pad 160 detects the contact position of the second operating body 180 on its surface as the input position of the second operating body 180 on the touch pad 160.
- The touch pad 160 outputs input position information representing the detected input position of the second operating body 180 to the CPU 197 as an information signal.
- The user of the information processing apparatus 100 can transmit various data to, and instruct processing operations of, the output device 200 by operating the touch pad 160.
- As shown in FIG. 7, the information processing apparatus 100 further includes a nonvolatile memory 193, a RAM 195, a CPU 197, and a network interface 199 in addition to the touch panel 150 (the display device 151 and the information input device 153) and the touch pad 160 described above.
- The nonvolatile memory (storage device) 193 is a data storage device configured as an example of the storage unit of the information processing apparatus 100.
- For example, a magnetic storage device such as an HDD (Hard Disk Drive), a semiconductor storage device, an optical storage device, or a magneto-optical storage device is used.
- The nonvolatile memory 193 stores programs executed by the CPU 197 and various data.
- The nonvolatile memory 193 also stores, for example, information on the optimal layout, zoom ratio, and the like used when displaying content data acquired from the WEB server 300 on the output device 200, in association with the URL or domain of the storage destination of the content data.
- The RAM 195 temporarily stores the programs used by the CPU 197, parameters that change as appropriate during their execution, and the like.
- The CPU (control unit) 197 functions as an arithmetic processing unit and a control unit, and controls all or part of the operations in the information processing apparatus 100 according to various programs recorded in the nonvolatile memory 193 and the RAM 195.
- The network interface 199 is an interface for transmitting and receiving various data to and from external devices such as the output device 200 and the WEB server 300 via the network 400.
- Each of the components described above may be configured using general-purpose members or using hardware specialized for the function of that component. The hardware configuration to be used can therefore be changed as appropriate according to the technical level at the time the present embodiment is carried out.
- FIG. 8 is a block diagram illustrating a functional configuration of the information processing apparatus 100 according to the present embodiment.
- As shown in FIG. 8, the information processing apparatus 100 includes an input position detection unit 103, a display control unit 105, an object specifying unit 107, a gesture detection unit 109, an output device selection unit 111, and a signal generation unit 115.
- The information processing apparatus 100 further includes a device registration unit 101, a content data management unit 113, a layout adjustment unit 117, a storage unit 119, and a communication unit 121.
- The device registration unit 101 registers the information processing apparatus 100 and the output device 200 with each other by a simple registration method such as WPS (Wi-Fi Protected Setup). By performing device registration in this manner, the information processing apparatus 100 and the output device 200 can access each other via a network using a common protocol (for example, DLNA).
- The device registration method is not limited to WPS; any method can be adopted as long as it allows the information processing apparatus 100 and the output device 200 to access each other.
- The device registration unit 101 records registered device information, which represents information about the external devices for which device registration has been performed (for example, the device name and IP address), in the storage unit 119.
- The input position detection unit 103 detects the input position of the first operating body 170 on the touch panel 150. Specifically, the input position detection unit 103 reads, as coordinates, the position (point) on the surface of the touch panel 150 where the first operating body 170 (for example, the thumb of the user of the information processing apparatus 100) is in contact.
- The method by which the input position detection unit 103 detects the contact position of the first operating body 170 is not particularly limited, and any method such as a so-called electrostatic method, a pressure-sensitive method, or an optical method can be used.
- For example, the input position detection unit 103 detects that pressure has been applied to the touch panel 150 and reads the coordinates of the point where the pressure was applied.
- The input position detection unit 103 may also have a function of detecting the presence of the first operating body 170 in the space above the touch panel 150, even when the touch panel is not directly touched, and recognizing it as a contact position. That is, the contact position referred to here may include position information for an operation performed by the first operating body 170 so as to trace over the screen of the touch panel 150 in the air.
- The input position detection unit 103 transmits the detected contact position (more specifically, the coordinates of the contact position) to the display control unit 105 and the object specifying unit 107 as input position information. For example, when one contact position is detected, the input position detection unit 103 outputs a single coordinate pair (X1, Y1) as the input position information; when two contact positions are detected, it outputs both detected coordinate pairs (X1, Y1) and (X2, Y2).
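As a rough illustration of the data flow just described, the following sketch models input position information as a list of coordinate pairs forwarded to the downstream units (all names are hypothetical; the patent does not specify an implementation):

```python
from dataclasses import dataclass

@dataclass
class InputPositionInfo:
    """Input position information: one (x, y) coordinate pair per contact."""
    points: list[tuple[float, float]]

class InputPositionDetector:
    def __init__(self, listeners):
        # Listeners stand in for the display control unit and the
        # object specifying unit that receive the information signal.
        self.listeners = listeners

    def on_contact(self, points):
        # One contact yields [(X1, Y1)]; two contacts yield [(X1, Y1), (X2, Y2)].
        info = InputPositionInfo(points=list(points))
        for listener in self.listeners:
            listener(info)
        return info

# Usage: InputPositionDetector([print]).on_contact([(12.0, 34.0)])
```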
- The display control unit 105 is a control unit that controls the content displayed on the touch panel 150.
- For example, the display control unit 105 reads object data, such as thumbnail images of arbitrary image data recorded in the storage unit 119 described later, and displays it on the touch panel 150.
- At this time, the display control unit 105 designates the display position of each object on the touch panel 150 and displays the object data at that display position. The display control unit 105 therefore holds information indicating the display positions of the objects displayed on the touch panel 150.
- The information indicating the display positions of the objects is transmitted from the display control unit 105 to the object specifying unit 107.
- The input position information is input from the input position detection unit 103 to the display control unit 105.
- For example, the display control unit 105 acquires an object such as a thumbnail of moving image content held in the information processing apparatus 100 from the storage unit 119 described later and displays it on the display screen.
- In this way, the display control unit 105 displays objects corresponding to content data on the display screen.
- The object specifying unit 107 specifies the selected object, that is, the object selected by the first operating body 170, based on the input position information and on display position information indicating the display positions of the objects and the like.
- That is, the input position information is input from the input position detection unit 103 to the object specifying unit 107.
- The display position information indicating the display positions of the objects and the like is also input from the display control unit 105 to the object specifying unit 107. The object specifying unit 107 therefore compares the input position information input from the input position detection unit 103 with the display position information input from the display control unit 105, and specifies the object selected by the first operating body 170. The object specifying unit 107 then transmits information on the selected content object and the like to the display control unit 105 and the content data management unit 113.
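As an illustration of this comparison, here is a minimal hit-test sketch (the rectangle model, object bounds, and names are assumptions made for illustration):

```python
from dataclasses import dataclass

@dataclass
class DisplayedObject:
    object_id: str
    x: float
    y: float
    width: float
    height: float  # display position information held by the display control unit

def specify_selected_object(input_pos, objects):
    """Compare the input position with each object's display position and
    return the object selected by the first operating body, if any."""
    px, py = input_pos
    for obj in objects:
        if obj.x <= px <= obj.x + obj.width and obj.y <= py <= obj.y + obj.height:
            return obj
    return None

# thumbnails = [DisplayedObject("150a", 10, 10, 120, 90)]
# specify_selected_object((50, 40), thumbnails) -> the object "150a"
```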
- The gesture detection unit 109 detects a gesture when a predetermined gesture is input to the touch pad 160.
- The basic function of the gesture detection unit 109 is similar to that of the input position detection unit 103 described above. That is, the gesture detection unit 109 detects the input position of the second operating body 180 on the touch pad 160. Specifically, the gesture detection unit 109 reads, as coordinates, the position (point) at which the second operating body 180 (for example, the index finger of the user of the information processing apparatus 100) contacts the surface of the touch pad 160.
- The method by which the gesture detection unit 109 detects the contact position of the second operating body 180 is not particularly limited, and any method such as a so-called electrostatic method, a pressure-sensitive method, or an optical method can be used.
- For example, the gesture detection unit 109 detects that pressure has been applied to the touch pad 160 and reads the coordinates of the point where the pressure was applied. The gesture detection unit 109 may also have a function of detecting the presence of the second operating body 180 in the space above the touch pad 160, even when the touch pad is not directly touched, and recognizing it as a contact position. That is, the contact position referred to here may include position information for an operation performed by the second operating body 180 so as to trace over the surface of the touch pad 160 in the air.
- The gesture detection unit 109 transmits, as gesture information, the detected contact position (more specifically, the coordinates of the contact position) and the direction and amount of change of the detected contact position over time to the output device selection unit 111, the signal generation unit 115, and the layout adjustment unit 117.
- For example, when one contact position is detected, the gesture detection unit 109 outputs a single coordinate pair (X1, Y1); when two contact positions are detected, it outputs both coordinate pairs (X1, Y1) and (X2, Y2); and when the contact position changes over time, it outputs a vector representing the change between the coordinates (X1, Y1) and (X2, Y2) detected within a predetermined time.
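To make the tap/drag distinction concrete, the following sketch turns a sequence of contact coordinates sampled within a predetermined time into gesture information (the threshold and names are assumptions; the patent leaves the classification method open):

```python
import math

def classify_gesture(samples, move_threshold=10.0):
    """Classify contact samples collected within a predetermined time window.

    samples: successive (x, y) contact coordinates of the second operating body.
    Returns gesture information: the kind of gesture plus its change vector.
    """
    if not samples:
        return {"kind": "none"}
    (x1, y1), (x2, y2) = samples[0], samples[-1]
    dx, dy = x2 - x1, y2 - y1
    distance = math.hypot(dx, dy)
    if distance < move_threshold:
        # The contact position stayed at one place: a tap operation.
        return {"kind": "tap", "position": (x1, y1)}
    # The contact position moved from (X1, Y1) to (X2, Y2): a drag operation.
    return {"kind": "drag", "vector": (dx, dy), "distance": distance}

# classify_gesture([(0, 0), (3, 4)])   -> tap
# classify_gesture([(0, 0), (30, 40)]) -> drag with vector (30, 40)
```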
- The output device selection unit 111 selects the output device 200 that is to output the content data corresponding to the selected object, based on the first gesture information representing the gesture detected by the gesture detection unit 109. That is, first gesture information representing a gesture detected while a specific object is selected by the first operating body 170 is input from the gesture detection unit 109 to the output device selection unit 111. The output device selection unit 111 also acquires information on the registered devices that can access the information processing apparatus 100 (registered device information) from the storage unit 119. Based on the first gesture information, the output device selection unit 111 then selects one output device 200 (for example, the television receiver 210) from among the registered output devices 200 (for example, the television receiver 210 and the audio device 220).
- For example, the gesture detection unit 109 detects the number of tap operations performed on the touch pad 160 within a predetermined time. When gesture information indicating that a single tap operation was performed on the touch pad 160 is input to the output device selection unit 111, the output device selection unit 111 selects the television receiver 210 as the output device 200. When gesture information indicating that two tap operations were performed on the touch pad 160 is input, the output device selection unit 111 selects the audio device 220 as the output device 200.
- However, the criterion by which the output device selection unit 111 selects the output device 200 is not limited to the number of taps on the touch pad 160.
- For example, the display control unit 105 may display the names of the registered external devices on the display screen, and the output device selection unit 111 may select the output device 200 in response to a drag operation on the touch pad 160.
- Alternatively, the output device 200 may be selected according to the number of the user's fingers simultaneously touching the touch pad 160 (for example, the television receiver 210 for one finger and the audio device 220 for two). Note that when the output device 200 is selected by the number of taps, user operability may suffer, for example when the user wants to select the television receiver 210 but accidentally taps twice. From this viewpoint, selection by a drag operation, by the number of fingers, or the like is preferable, because the desired output device 200 can be immediately reselected even after an erroneous operation.
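The selection rule described above could be sketched as follows (the registry contents are illustrative; only the one-tap/one-finger and two-tap/two-finger assignments come from the text):

```python
# Registered device information as recorded by the device registration unit
# (names and addresses are illustrative).
REGISTERED_DEVICES = {
    1: {"name": "television receiver 210", "ip": "192.168.0.10"},
    2: {"name": "audio device 220", "ip": "192.168.0.11"},
}

def select_output_device(tap_count=0, finger_count=0):
    """Select an output device by tap count or by simultaneous finger count.

    One tap (or one finger) selects the television receiver 210; two select
    the audio device 220, mirroring the examples in the text.
    """
    key = finger_count if finger_count else tap_count
    return REGISTERED_DEVICES.get(key)

# select_output_device(tap_count=1)    -> television receiver 210
# select_output_device(finger_count=2) -> audio device 220
```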
- The output device selection unit 111 transmits information such as the name and IP address of the selected output device 200 to the content data management unit 113 and the signal generation unit 115.
- The content data management unit 113 acquires content data and the like from the WEB server 300 and transfers the location information of the content data to the output device 200 selected by the output device selection unit 111.
- FIG. 9 is a block diagram showing a configuration of the content data management unit 113 according to the present embodiment.
- The content data management unit 113 includes a content data acquisition unit 131, a selected content specifying unit 132, a location information extraction unit 133, an output device specifying unit 134, and a location information transfer unit 135.
- The content data management unit 113 may also include an input area detection unit 136 and a determination unit 137 as necessary.
- The content data acquisition unit 131 acquires predetermined content data from the WEB server 300 via the communication unit 121, together with its location information (such as the URL where the content data is stored) and information on the application associated with the content data.
- The content data acquisition unit 131 may record the acquired content data and related information in the storage unit 119.
- The content data acquisition unit 131 may also transmit the acquired content data to the display control unit 105 so that it is displayed on the display screen of the information processing apparatus 100 as text, images, or the like.
- Information on the object selected by the first operating body 170 is input from the object specifying unit 107 to the selected content specifying unit 132.
- The selected content specifying unit 132 specifies the content data corresponding to the object from the input information on the selected object, and transmits information on the specified content data to the location information extraction unit 133.
- The location information extraction unit 133 extracts the location information of the content data from the information on the selected content data input from the selected content specifying unit 132.
- The location information of the selected content data to be extracted may be held by the content data management unit 113 or may be stored in the storage unit 119. The location information extraction unit 133 transmits the extracted location information of the selected content data to the location information transfer unit 135.
- The output device specifying unit 134 receives, from the output device selection unit 111, information on the device selected as the output device 200 that is to output the content data corresponding to the object selected by the first operating body 170 (hereinafter, the selected content data). Based on the input information, the output device specifying unit 134 specifies the selected output device 200 and transmits information on the specified output device 200 (its name, IP address, and the like) to the location information transfer unit 135.
- The location information transfer unit 135 transfers the location information of the selected content data input from the location information extraction unit 133 to the output device 200 specified by the output device specifying unit 134 via the communication unit 121.
- The input position information indicating the contact position of the first operating body 170 on the touch panel 150 is input from the input position detection unit 103 to the input area detection unit 136.
- Likewise, input position information representing the contact position of the second operating body 180 on the touch pad 160 is input from the gesture detection unit 109 to the input area detection unit 136.
- The input area detection unit 136 divides the area of the touch panel 150 into a plurality of divided areas and detects, based on the input position information input from the input position detection unit 103, the divided area where the first operating body 170 is located.
- Similarly, the input area detection unit 136 divides the area of the touch pad 160 into a plurality of divided areas and detects, based on the input position information input from the gesture detection unit 109, the divided area where the second operating body 180 is located. The input area detection unit 136 then transmits first divided area information indicating the detected divided area where the first operating body 170 is located and second divided area information indicating the divided area where the second operating body 180 is located to the determination unit 137.
- Based on the first and second divided area information, the determination unit 137 determines whether the divided area where the first operating body 170 is located and the divided area where the second operating body 180 is located have a corresponding positional relationship.
- Here, the "corresponding positional relationship" refers to, for example, the relationship in which the divided area where the first operating body 170 is located on the touch panel 150 and the divided area where the second operating body 180 is located on the touch pad 160 face each other. In other words, it is the relationship in which the two divided areas have the same coordinates in the XY coordinate plane. The determination unit 137 transmits the result of this determination to the location information transfer unit 135.
- Based on the determination result input from the determination unit 137, the location information transfer unit 135 determines whether to transfer the location information of the selected content. That is, when the location information transfer unit 135 receives a determination result indicating that the divided area where the first operating body 170 is located and the divided area where the second operating body 180 is located have a corresponding positional relationship, it transfers the location information of the selected content to the output device 200. Conversely, when it receives a determination result indicating that the two divided areas do not have a corresponding positional relationship, it does not transfer the location information; in this case, the operation input to the touch pad 160 by the second operating body 180 is determined to be an erroneous operation.
- In this way, because the information processing apparatus 100 includes the input area detection unit 136 and the determination unit 137, the location information of the selected content can be prevented from being transferred to the output device 200 when the second operating body 180 accidentally contacts the touch pad 160 through an erroneous operation by the user.
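A minimal sketch of this front/back correspondence check (the grid size and panel dimensions are assumptions, and the mirroring of back-surface coordinates that a real device would need is ignored):

```python
def divided_area(pos, panel_size=(480.0, 800.0), grid=(3, 4)):
    """Map a contact position to the index of its divided area on a grid."""
    x, y = pos
    w, h = panel_size
    cols, rows = grid
    return (min(int(x / w * cols), cols - 1), min(int(y / h * rows), rows - 1))

def should_transfer(front_pos, back_pos):
    """Transfer the location information only when the divided area where the
    first operating body is located (front) and the one where the second
    operating body is located (back) occupy corresponding grid cells."""
    return divided_area(front_pos) == divided_area(back_pos)

# should_transfer((100, 100), (110, 95))  -> True  (corresponding areas)
# should_transfer((100, 100), (400, 700)) -> False (treated as an erroneous operation)
```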
- The signal generation unit 115 generates a control signal for causing the selected output device 200 to execute a predetermined process, based on second gesture information representing a gesture detected by the gesture detection unit 109 after the location information of the content data has been transmitted to the output device 200. The details of this processing are as follows.
- That is, the signal generation unit 115 receives from the content data management unit 113 information on the selected content data transferred to the output device 200, the application associated with it, and so on.
- Information on the selected output device 200 is also input to the signal generation unit 115 from the output device selection unit 111.
- Further, gesture information representing the gesture input on the touch pad 160 is input from the gesture detection unit 109 to the signal generation unit 115.
- First, the signal generation unit 115 recognizes the content of the gesture corresponding to the gesture information. For example, when the gesture information indicates that the contact position of the second operating body 180 on the touch pad 160 within a predetermined time was a single point (for example, the coordinates (X1, Y1)), it recognizes that a single tap operation was performed on the touch pad 160. Similarly, when the gesture information indicates that the contact position moved from the coordinates (X1, Y1) to the coordinates (X2, Y2) within a predetermined time, it recognizes that a drag operation was performed on the touch pad 160.
- Next, based on the gesture information (the recognized gesture content) input from the gesture detection unit 109, the signal generation unit 115 generates a control signal for causing the output device 200 to execute the process assigned to that gesture.
- Suppose, for example, that a content playback process is assigned to a single tap operation performed after the location information of the selected content data has been transmitted, and that a volume adjustment process is assigned to a drag operation.
- In this case, when a single tap is recognized, the signal generation unit 115 generates a control signal for causing the output device 200 to play back the selected content data.
- When a drag operation is recognized, the signal generation unit 115 generates a control signal for causing the output device 200 to adjust the playback volume of the selected content.
- The contents of the processes assigned to the respective gestures may be stored in the storage unit 119, for example, as associations between gesture contents and process contents.
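The gesture-to-process mapping could be sketched as follows (the signal format is an assumption; the tap-to-playback and drag-to-volume assignments come from the example above):

```python
# Process contents stored in association with gesture contents
# (in the apparatus, such a table could live in the storage unit 119).
GESTURE_TO_PROCESS = {
    "tap": "play",            # single tap after transfer -> content playback
    "drag": "adjust_volume",  # drag -> volume adjustment
}

def generate_control_signal(gesture):
    """Build a control signal for the output device from recognized gesture info."""
    process = GESTURE_TO_PROCESS.get(gesture["kind"])
    if process is None:
        return None
    signal = {"process": process}
    if process == "adjust_volume":
        # Use the vertical component of the drag as the volume delta (an assumption).
        signal["delta"] = gesture["vector"][1]
    return signal

# generate_control_signal({"kind": "tap"}) -> {"process": "play"}
```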
- The layout adjustment unit 117 adjusts the layout in which the selected content is displayed on the output device 200, based on the gestures detected by the gesture detection unit 109. Specifically, the layout adjustment unit 117 acquires from the output device 200 display control information, that is, information used for display control on the output device 200, such as the size and resolution of its display screen. The layout adjustment unit 117 holds the acquired display control information or records it in the storage unit 119. Based on the display control information and the gesture information, the layout adjustment unit 117 then generates command signals, such as scroll or zoom commands, for adjusting the layout of the content data displayed on the display screen of the output device 200, and transmits them to the output device 200.
- In general, WEB content can be laid out freely. Therefore, even if the output device 200 plays back the selected content as-is based on transferred location information such as a URL, the content is not necessarily displayed at the center of the display screen or at the optimal zoom ratio. This is why the layout adjustment described above is necessary.
- the selected content data is displayed on the output device 200.
- gesture information representing the content of detecting that a gesture such as dragging, pinching in, or pinching out is input to the touch pad 160 from the gesture detection unit 109 is input to the layout adjustment unit 117.
- when a drag is input, the layout adjustment unit 117 generates a scroll command signal for the display screen of the output device 200 based on the dragged distance and direction and the display control information (display screen size and the like), and transmits it to the output device 200.
- when a pinch-out is input, the layout adjustment unit 117 generates a command signal instructing a zoom-out on the display screen of the output device 200 based on the pinched distance and the display control information (display screen size and the like), and transmits it to the output device 200.
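- One plausible way to turn a drag on the small touch pad into a scroll command sized for the output device's screen is proportional scaling by the display control information. The patent does not fix a formula, so the rule and names below are assumptions:

```python
# Hypothetical translation of a touch-pad drag into a scroll command for
# the output device 200, scaled by display control information.
def make_scroll_command(drag_dx, drag_dy, pad_size, display_info):
    """drag_dx/drag_dy: drag vector on the touch pad 160 (pad units).
    pad_size: (width, height) of the touch pad.
    display_info: display control information acquired from the output
    device 200, e.g. {"width": 1920, "height": 1080}."""
    sx = display_info["width"] / pad_size[0]
    sy = display_info["height"] / pad_size[1]
    return {"command": "scroll",
            "dx": int(drag_dx * sx),
            "dy": int(drag_dy * sy)}

print(make_scroll_command(12, -8, (320, 240), {"width": 1920, "height": 1080}))
# -> {'command': 'scroll', 'dx': 72, 'dy': -36}
```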
- the layout adjustment unit 117 may record, in the storage unit 119, layout information such as the preset position and zoom ratio concerning the display of the content data after layout adjustment.
- by recording the layout information after layout adjustment for each WEB site, domain, and so on, the output device 200 can automatically reproduce the same WEB content with the optimum layout when it reproduces that content later. That is, when a layout adjustment is performed, the layout information is recorded in the output device 200 in association with, for example, the location information of the content (for example, the URL of the main page of the WEB site where the content is stored, or the domain where the content is stored). When the output device 200 redisplays content on the same WEB site or content having the same domain, it can automatically reproduce the content with the optimum layout by using the stored layout information.
- note that the layout adjustment unit 117 may select an application condition for the layout information stored in the output device 200 according to the gesture information input from the gesture detection unit 109.
- the application conditions include, for example, “(1) do not apply the stored layout information”, “(2) apply it to content on the same WEB site”, and “(3) apply it to content on the same domain”.
- different gestures are assigned to these application conditions. For example, a gesture is assigned to each application condition such that when the number of second operating bodies 180 in contact with the touch pad 160 (the number of the user's fingers, etc.) is one, application condition (1) applies; when it is two, application condition (2) applies; and when it is three, application condition (3) applies. The layout adjustment unit 117 then transmits to the output device 200 application condition information indicating the application condition selected according to the gesture information input from the gesture detection unit 109.
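- A minimal sketch of this selection rule, assuming the three application conditions above and a simple count of touching fingers (the condition identifiers are illustrative):

```python
# Hypothetical mapping from the number of fingers on the touch pad 160
# to the layout-information application condition described above.
APPLICATION_CONDITIONS = {
    1: "do_not_apply_stored_layout",
    2: "apply_to_same_web_site",
    3: "apply_to_same_domain",
}

def select_application_condition(touch_count):
    # Returns the application condition information to be sent to the
    # output device 200; None if no condition is assigned to the gesture.
    return APPLICATION_CONDITIONS.get(touch_count)

print(select_application_condition(2))  # -> 'apply_to_same_web_site'
```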
- the following is an example of a case in which the layout adjustment unit 117 selects the application condition of the layout information. For instance, when the output device 200 receives the location information of the selected content data, a prompt asking whether to apply the layout information stored in the output device 200 may be displayed on the display screen of the output device 200. The user who sees this prompt then inputs a predetermined gesture on the touch pad 160 with the second operating body 180, and the layout adjustment unit 117 selects an application condition of the layout information based on that gesture.
- the storage unit 119 stores object data displayed on the touch panel 150.
- the object data referred to here includes, for example, arbitrary parts constituting a graphical user interface (hereinafter, GUI) such as an icon, a button, and a thumbnail.
- the storage unit 119 may store object data of content that can be reproduced by the information processing apparatus 100.
- the storage unit 119 stores attribute information in association with individual object data.
- the attribute information includes, for example, the creation date, update date, creator name, and updater name of the object data or the entity data, the type and size of the entity data, importance, priority, and the like.
- the storage unit 119 also stores entity data in association with the corresponding object data.
- entity data referred to here is data corresponding to a predetermined process executed when an object displayed on the touch panel 150 is operated.
- for example, object data corresponding to moving image content is associated with the content data of that moving image content as entity data.
- the storage unit 119 also stores an application for playing back the stored content in association with object data, content data, or attribute information.
- the object data stored in the storage unit 119 is read by the display control unit 105 and displayed on the touch panel 150.
- registered device information related to the registered devices registered by the device registration unit 101 is also stored in the storage unit 119.
- the storage unit 119 may store layout information regarding a preset position, a zoom ratio, and the like when the selected content is displayed on the output device 200.
- in addition, the storage unit 119 can appropriately store various parameters, intermediate results of processing, and various databases that need to be saved when the information processing apparatus 100 performs some processing.
- the device registration unit 101, the input position detection unit 103, the display control unit 105, the object specifying unit 107, the gesture detection unit 109, the output device selection unit 111, the content data management unit 113, the signal generation unit 115, the layout adjustment unit 117, and the like can freely read from and write to the storage unit 119.
- the communication unit 121 is connected to the network 400 such as the Internet, to a home network with the output device 200, and the like, and transmits and receives data between the information processing apparatus 100 and external devices (in this embodiment, the output device 200 and the WEB server 300).
- each component described above may be configured using a general-purpose member or circuit, or may be configured by hardware specialized for the function of each component.
- alternatively, a CPU or the like may perform all the functions of each component. The configuration used can therefore be changed as appropriate according to the technical level at the time this embodiment is carried out.
- FIGS. 10 and 11 are flowcharts showing a flow of processing of the information processing method according to the present embodiment.
- FIG. 12 is an explanatory diagram illustrating a first example to which the information processing method according to the present embodiment is applied.
- FIG. 13 is an explanatory diagram illustrating a second example to which the information processing method according to the present embodiment is applied.
- FIG. 14 is an explanatory diagram illustrating a third example to which the information processing method according to the present embodiment is applied.
- FIG. 15 is an explanatory diagram illustrating a modification of the information processing method according to the present embodiment.
- FIG. 16 is an explanatory diagram illustrating a modification of the information processing method according to the present embodiment.
- the information processing method mainly includes the following steps.
- (1) An input position detection step of detecting the input position of the first operating body 170 on the display screen of the display device 151 (touch panel 150) on which a predetermined object is displayed
- (2) An object specifying step of specifying the selected object, which is the object selected by the first operating body 170, based on input position information representing the input position of the first operating body 170 and display position information representing the display position of the object
- (3) A gesture detection step of detecting a predetermined gesture input by the second operating body 180 while the selected object is selected by the first operating body 170
- (4) An output device selection step of selecting the output device 200, an external device that outputs the content data corresponding to the selected object, based on first gesture information representing the gesture detected in the gesture detection step of (3)
- (5) A location information transmission step of transmitting the location information of the content data corresponding to the selected object (the selected content data) to the output device 200
- (6) A signal generation step of generating a control signal for causing the selected output device 200 to execute a predetermined process, based on second gesture information representing a gesture detected after the location information of the selected content data was transmitted to the output device 200
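- As an illustration of how these six steps chain together, here is a minimal, hypothetical driver; every class, function, and field name is an assumption made for illustration, not an identifier from the patent:

```python
# Hypothetical end-to-end sketch of steps (1)-(6); all names are illustrative.
from dataclasses import dataclass

@dataclass
class DisplayedObject:
    content_url: str
    rect: tuple  # (x, y, width, height) on the touch panel 150

    def contains(self, pos):
        x, y, w, h = self.rect
        return x <= pos[0] <= x + w and y <= pos[1] <= y + h

def specify_object(pos, objects):  # (2) object specification
    for obj in objects:
        if obj.contains(pos):
            return obj
    return None

def run_method(front_pos, first_tap_count, second_gesture, objects, devices, send):
    """front_pos: (1) input position detected on the touch panel 150.
    first_tap_count: taps on the touch pad 160 while the object is held, (3).
    second_gesture: gesture detected after the URL transfer.
    send(device, payload): transmission over the home network."""
    selected = specify_object(front_pos, objects)
    if selected is None:
        return
    device = devices[first_tap_count - 1]  # (4) e.g. 1 tap -> TV, 2 taps -> audio
    send(device, {"url": selected.content_url})                       # (5)
    send(device, {"command": "execute", "gesture": second_gesture})   # (6)

objects = [DisplayedObject("http://example.com/photo/1", (0, 0, 100, 100))]
run_method((10, 20), 1, "tap", objects, ["tv", "audio"], lambda d, p: print(d, p))
```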
- first, device registration is performed between the information processing apparatus 100 and desired external devices by a simple registration method such as WPS (S101).
- the information processing apparatus 100 and external devices registered as devices can access each other via a network using a common protocol.
- the information processing apparatus 100 accesses the WEB server 300 via the network 400 such as the Internet, acquires arbitrary content data, and displays the WEB content on the touch panel 150. Then, the object corresponding to the content desired to be output to a device-registered external device is tapped by the first operating body 170 (for example, the user's thumb).
- in the example of FIG. 12, the content to be output to the external device is photo content in a photo sharing service on the WEB server 300.
- a plurality of objects (for example, thumbnail images) are displayed on the touch panel 150 of the information processing apparatus 100.
- then, the object 150a is tapped by the first operating body 170 (for example, the user's thumb) (S202).
- in the example of FIG. 13, the WEB page of the service is displayed on the touch panel 150 of the information processing apparatus 100.
- then, the object 150b corresponding to the moving image content is tapped by the first operating body 170 (S302).
- in the example of FIG. 14, the WEB page of the service is likewise displayed on the touch panel 150 of the information processing apparatus 100.
- then, the object 150c corresponding to the music content is tapped by the first operating body 170 (S402).
- next, the input position detection unit 103 detects the position (input position) tapped by the first operating body 170 (S103). The object specifying unit 107 then specifies the object (selected object) selected by the first operating body 170 based on the input position detected in step S103 and the display position information acquired from the display control unit 105 (S105). For example, the object specifying unit 107 specifies the object 150a in the example of FIG. 12, the object 150b in the example of FIG. 13, and the object 150c in the example of FIG. 14 as the selected object.
- next, the gesture detection unit 109 determines whether or not a tap input on the touch pad 160 is detected while the first operating body 170 continues to tap the selected object, that is, while a predetermined object on the touch panel 150 is selected (S107). For example, in the example of FIG. 12, the gesture detection unit 109 determines whether or not a tap input by the second operating body 180 (for example, the user's index finger) is detected on the touch pad 160 while the object 150a is selected.
- in the present embodiment, the output device 200 to which the selected content is output is switched according to the number of times the touch pad 160 is tapped. For example, when the number of taps is one, the output target is the television receiver 210; when the number of taps is two, the output target is the audio device 220. Note that at this stage nothing is yet displayed on the display screen 211 of the television receiver 210 or the display screen 223 of the audio device 220 in any of the examples of FIGS. 12 to 14 (S201, S301, S401).
- if the gesture detection unit 109 does not detect a tap input on the touch pad 160 in step S107, the information processing apparatus 100 returns to step S103 and waits for detection of an input position by the first operating body 170.
- on the other hand, if a tap input on the touch pad 160 is detected in step S107, the output device selection unit 111 selects the output device 200 to which the selected content is to be output, for example according to the number of taps of the tap input (S109).
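- A minimal sketch of this selection step, assuming the tap-count convention used in the examples (one tap for the television receiver 210, two taps for the audio device 220); the registry structure and addresses are hypothetical:

```python
# Hypothetical registry of device-registered output devices, keyed by the
# tap count assigned to each (1 tap -> TV, 2 taps -> audio, as in the text).
REGISTERED_OUTPUT_DEVICES = {
    1: {"name": "television receiver 210", "address": "192.168.0.10"},
    2: {"name": "audio device 220",        "address": "192.168.0.11"},
}

def select_output_device(tap_count):
    """Output device selection unit 111: pick the output device 200
    according to the number of taps on the touch pad 160 (S109)."""
    device = REGISTERED_OUTPUT_DEVICES.get(tap_count)
    if device is None:
        raise ValueError(f"no output device assigned to {tap_count} tap(s)")
    return device

print(select_output_device(1)["name"])  # -> television receiver 210
```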
- next, the content data management unit 113 sends the location information (URL) of the selected content data corresponding to the object tapped by the first operating body 170, via the network, to the output device 200 selected in step S109.
- the network here is a network (for example, a home network) based on the protocol shared with the external devices registered in step S101, and is different from the network (for example, the Internet) used for communicating with the WEB server 300 and the like.
- in the modification shown in FIG. 15, the location information transfer unit 135 transfers the URL of the selected content data to the output device 200 only when the position of the first operating body 170 on the touch panel 150 and the position of the second operating body 180 on the touch pad 160 are in a corresponding positional relationship. That is, as shown in FIG. 15, the input area detection unit 136 divides the area on the touch panel 150 into a plurality of divided areas (four divided areas 150A, 150B, 150C, and 150D in the example of FIG. 15).
- similarly, the input area detection unit 136 divides the area on the touch pad 160 into a plurality of divided areas (four divided areas 160A, 160B, 160C, and 160D in the example of FIG. 15).
- the divided area 150A on the touch panel 150 and the divided area 160A on the touch pad 160 are in a corresponding positional relationship.
- the “corresponding positional relationship” referred to here means, for example, the relationship in which the divided area where the first operating body 170 is located on the touch panel 150 and the divided area where the second operating body 180 is located on the touch pad 160 face each other. In other words, it means the relationship in which the divided area where the first operating body 170 is located and the divided area where the second operating body 180 is located are areas having the same coordinates in the XY coordinate plane.
- the input area detection unit 136 detects the divided area where the first operating body 170 is located based on the input position information input from the input position detection unit 103.
- the input area detection unit 136 detects that the first operating body 170 is located in the divided area 150D.
- similarly, the input area detection unit 136 detects the divided area where the second operating body 180 is located based on the input position information input from the gesture detection unit 109.
- in the example of FIG. 15, the input area detection unit 136 detects that the second operating body 180 is located in the divided area 160D.
- based on the detection results input from the input area detection unit 136, the determination unit 137 determines whether or not the divided area where the first operating body 170 is located and the divided area where the second operating body 180 is located are in a corresponding positional relationship.
- in the example of FIG. 15, since the first operating body 170 is located in the divided area 150D and the second operating body 180 is located in the divided area 160D, the determination unit 137 determines that the divided areas where the first operating body 170 and the second operating body 180 are located are in a corresponding positional relationship.
- when the determination result is input from the determination unit 137, the location information transfer unit 135 decides whether or not to transfer the location information of the selected content. That is, when the location information transfer unit 135 receives a determination result indicating that the divided area where the first operating body 170 is located and the divided area where the second operating body 180 is located are in a corresponding positional relationship, it transfers the location information of the selected content to the output device 200. On the other hand, when it receives a determination result indicating that the two divided areas are not in a corresponding positional relationship, it does not transfer the location information of the selected content to the output device 200.
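- A minimal sketch of this gate, assuming both surfaces are divided into the same 2x2 grid as in FIG. 15; the grid mapping and helper names are illustrative assumptions:

```python
# Hypothetical check of the "corresponding positional relationship":
# both surfaces are divided into the same 2x2 grid, and the URL is
# transferred only when front and back touches fall in the same cell.
GRID_COLS, GRID_ROWS = 2, 2  # four divided areas, as in FIG. 15

def divided_area(pos, surface_size):
    """Map a touch position to its divided-area cell index (col, row)."""
    col = min(int(pos[0] * GRID_COLS / surface_size[0]), GRID_COLS - 1)
    row = min(int(pos[1] * GRID_ROWS / surface_size[1]), GRID_ROWS - 1)
    return col, row

def maybe_transfer(front_pos, back_pos, panel_size, pad_size, url, send):
    """Location information transfer unit 135: transfer the URL only when
    the determination unit 137 finds corresponding divided areas."""
    if divided_area(front_pos, panel_size) == divided_area(back_pos, pad_size):
        send(url)
        return True
    return False  # treated as an erroneous operation; nothing is sent

ok = maybe_transfer((300, 400), (80, 110), (480, 640), (120, 160),
                    "http://example.com/content", print)
print(ok)  # -> True (both touches fall in the lower-right divided area)
```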
- in other words, when the divided areas are not in a corresponding positional relationship, the operation input to the touch pad 160 by the second operating body 180 is determined to be an erroneous operation, and only when they are in a corresponding positional relationship does the location information transfer unit 135 transfer the location information of the selected content to the output device 200.
- in this way, since the information processing apparatus 100 includes the input area detection unit 136 and the determination unit 137, the location information of the selected content can be prevented from being transferred to the output device 200 when the second operating body 180 accidentally touches the touch pad 160 through an erroneous operation by the user.
- the output device 200 that has received the URL of the selected content data accesses the WEB site on the WEB server 300 having the URL based on the received URL, and acquires the selected content data (S113).
- in the output device 200, layout information such as the preset position and zoom ratio of objects displayed on the display screen is stored for each URL domain of the content data transferred to the output device 200.
- the output device 200 applies the stored layout information at the stage of starting connection to the WEB server 300 (S115).
- the reason why the processing of step S115 is performed is as follows.
- WEB content can be laid out freely. Therefore, even if the output device 200 reproduces the selected content as it is, based on transferred location information such as a URL, the content is not necessarily displayed at the center of the display screen or at the optimum zoom rate.
- on the other hand, content display layouts are often shared within the same WEB content. Therefore, by storing on the output device 200 side the layout information produced when the layout is adjusted from the information processing apparatus 100 side, the content can be reproduced with the optimum layout from the next time onward. This point will be described concretely using the example of FIG. 12.
- FIG. 12 shows an example in which a photo sharing service on the WEB server 300 is displayed on the touch panel 150 of the information processing apparatus 100 and the content data (photographic image data) corresponding to an object selected from the displayed objects is output to the output device 200.
- in the example of FIG. 12, through the processing of steps S113 and S115, the output device 200 displays the linked high-resolution photographic image based on the received URL (S203).
- the title bar and other related information on the site of the photo sharing service are also displayed on the display screen 211.
- when a gesture, for example a drag, is input to the touch pad 160 by the second operating body 180 (S204), the focus position of the photographic image 211a moves according to the gesture input (S205).
- likewise, the photographic image 211a is zoomed according to the gesture input (S207).
- by such gesture inputs, the photographic image 211a can be adjusted to the optimum display position and zoom ratio. For example, within the same photo sharing site, the layout of high-resolution photographs is common. Therefore, by having the output device 200 store the layout information after the layout adjustment and apply it when another photographic image is later output to the output device 200, the image can be displayed on the display screen 211 with the optimum layout.
- alternatively, by detecting a plausible rectangle (for example, a rectangle having the same size as the display screen 211) around the selected content (for example, a photographic image), the position and zoom ratio may be adjusted automatically.
- furthermore, when the output device 200 reproduces the selected content, the layout adjustment unit 117 of the information processing apparatus 100 may select the application condition for the layout information stored in the output device 200 according to the gesture information input from the gesture detection unit 109.
- the application conditions include, for example, “(1) do not apply the stored layout information”, “(2) apply it to content on the same WEB site”, and “(3) apply it to content on the same domain”.
- different gestures are assigned to these application conditions.
- for example, as shown in FIG. 16A, when the second operating body 180 in contact with the touch pad 160 is only the operating body 181 (only the user's index finger), the layout adjustment unit 117 applies application condition (1). Further, for example, as shown in FIG. 16B, when two operating bodies 181 and 182 (the user's index finger and middle finger) are in contact with the touch pad 160, the layout adjustment unit 117 applies application condition (2). Further, for example, as shown in FIG. 16C, when three operating bodies 181, 182, and 183 (the user's index finger, middle finger, and ring finger) are in contact with the touch pad 160, the layout adjustment unit 117 applies application condition (3). The layout adjustment unit 117 then transmits to the output device 200 application condition information indicating the application condition selected according to the gesture information input from the gesture detection unit 109.
- the following is an example of a case in which the layout adjustment unit 117 selects the application condition of the layout information. For instance, when the output device 200 receives the location information of the selected content data, a prompt asking whether to apply the layout information stored in the output device 200 may be displayed on the display screen of the output device 200. The user who sees this prompt then inputs a predetermined gesture on the touch pad 160 with the second operating body 180, and the layout adjustment unit 117 selects an application condition of the layout information based on that gesture.
- the output device 200 activates an application (usually a WEB browser or the like) associated with the selected content data acquired in step S113, and reproduces the selected content (S117).
- the type of the associated application is determined based on, for example, the file name of the content data to be reproduced (particularly, an extension such as “wma” or “mpg”).
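- A minimal sketch of this association, assuming a simple extension-to-application table (the application names are illustrative, and a real device would consult its registered handlers):

```python
import os

# Hypothetical extension-to-application table for step S117; the
# application names are illustrative, not from the patent.
APPLICATION_BY_EXTENSION = {
    ".wma": "music player",
    ".mpg": "movie player",
    ".html": "WEB browser",
}

def associated_application(file_name):
    """Pick the application used to reproduce the selected content,
    based on the file name's extension (falling back to a WEB browser)."""
    _, ext = os.path.splitext(file_name.lower())
    return APPLICATION_BY_EXTENSION.get(ext, "WEB browser")

print(associated_application("holiday_video.MPG"))  # -> movie player
```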
- next, the gesture detection unit 109 determines whether or not a gesture input by the second operating body 180 is detected on the touch pad 160 (S119). This determination may be made, for example, by setting a predetermined time and waiting for a gesture input by the second operating body 180 within that time. If no gesture input by the second operating body 180 is detected as a result of the determination in step S119, the processing proceeds to step S127 described later. On the other hand, if a gesture input by the second operating body 180 is detected, the signal generation unit 115 generates a control signal for causing the output device 200 to execute the process corresponding to the input gesture (S121). The signal generation unit 115 then transmits the generated control signal to the output device 200 (S123), and the output device 200 that has received the signal executes the process corresponding to the received control signal (S125).
- the gestures on the touch pad 160 are operations that can be performed with relative finger movement alone. Specifically, for example, when a drag operation in the vertical or horizontal direction is input to the touch pad 160, the gesture detection unit 109 detects that the drag operation has been input. Based on that detection result, the signal generation unit 115 generates a control signal instructing a scroll of the WEB page displayed on the display screen 211.
- operations related to the moving image content played back on the television receiver 210 are likewise performed by gesture input to the touch pad 160. Specifically, for example, when a drag operation in the up/down/left/right directions, a drag operation in the left/right direction, a simple tap operation, or the like is input to the touch pad 160, the gesture detection unit 109 detects that the gesture operation has been input. The signal generation unit 115 then generates a control signal instructing an operation related to the moving image content displayed on the display screen 211 based on the detection result.
- in this case, the operations on the touch pad 160 are mainly operations that are completed within the target content (the content output to the output device 200).
- in the example of FIG. 14, the content to be played back by the audio device 220 is music content in a music distribution service.
- in this case too, operations related to the music content played back by the audio device 220 are performed by gesture input to the touch pad 160.
- when music content or moving image content (sound only) is output to the audio device 220, the music played back on the audio device 220 can be controlled independently of the operation of the information processing apparatus 100.
- using an audio device as an external speaker of a mobile device through a technology such as Bluetooth (registered trademark) is already in widespread use.
- in the present embodiment, however, connection to the WEB server 300 and reproduction of the acquired content are performed independently on the output device 200 side, such as on the audio device 220, and the information processing apparatus 100 transmits only the URL of the content and control signals. Therefore, different music can be reproduced on the audio device and on the mobile device without providing a separate apparatus or incurring extra processing costs.
- next, it is determined whether or not layout adjustment has been performed on the content currently being played back by the output device 200 (S127). If no layout adjustment has been performed, the processing ends. On the other hand, if it is determined in step S127 that layout adjustment has been performed, layout information such as the preset position and zoom ratio after the latest layout adjustment is recorded in association with the URL domain or the like (S129), and the processing ends. The recorded layout information is applied as necessary when other content is reproduced on the output device 200.
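- A minimal sketch of this per-domain recording (S129) and its later application (S115), assuming a plain dictionary keyed by URL domain; the field names are illustrative:

```python
from urllib.parse import urlparse

# Hypothetical per-domain store of layout information (S129) and its
# application on a later visit (S115); field names are assumptions.
layout_store = {}

def record_layout(url, preset_position, zoom_ratio):
    domain = urlparse(url).netloc
    layout_store[domain] = {"position": preset_position, "zoom": zoom_ratio}

def apply_stored_layout(url, default={"position": (0, 0), "zoom": 1.0}):
    domain = urlparse(url).netloc
    return layout_store.get(domain, default)

record_layout("http://photos.example.com/album/1", (120, 80), 1.8)
print(apply_stored_layout("http://photos.example.com/album/2"))
# -> {'position': (120, 80), 'zoom': 1.8}  (same domain, layout reused)
```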
- in the processing described above, a division of labor arises that takes advantage of the characteristics of each device. Specifically, processing that requires complicated operations, such as navigation between WEB contents, is performed on the information processing apparatus 100 side. On the other hand, viewing of content that is easy to operate but is better enjoyed on a large screen, such as WEB videos or news articles, is performed on the output device 200 side.
- the information processing apparatus 100 is a mobile device in which the touch panel 150 is mounted on one surface (for example, the front surface) and the touch pad 160 is mounted on the other surface (for example, the back surface).
- with the information processing apparatus 100, operations on the screen of the information processing apparatus 100 are performed with the touch panel 150, and operations on the screen of the output device 200, which can be connected to the information processing apparatus 100 via a home network or the like, are performed with the touch pad 160.
- this makes it possible to perform complicated operations such as WEB content navigation on the information processing apparatus 100, where touch operations are easy, to output content on the output device 200, and to link the two seamlessly.
- according to the present embodiment, mainly the following effects (1) to (3) can be obtained.
- (1) Navigation within WEB content, which requires complex operations, and content reproduction, which can be achieved with simple operations but requires a large screen, good sound quality, and the like, can be performed on separate devices specializing in each (the information processing apparatus 100 and the output device 200) and controlled simultaneously.
Landscapes
- Engineering & Computer Science (AREA)
- Theoretical Computer Science (AREA)
- Computer Hardware Design (AREA)
- Human Computer Interaction (AREA)
- Physics & Mathematics (AREA)
- General Engineering & Computer Science (AREA)
- General Physics & Mathematics (AREA)
- User Interface Of Digital Computer (AREA)
- Position Input By Displaying (AREA)
Abstract
Description
101 Device registration unit
103 Input position detection unit
105 Display control unit
107 Object specifying unit
109 Gesture detection unit
111 Output device selection unit
113 Content data management unit
115 Signal generation unit
117 Layout adjustment unit
119 Storage unit
135 Location information transfer unit
136 Input area detection unit
137 Determination unit
150 Touch panel
160 Touch pad
170 First operating body
180 Second operating body
1. Example of related technology (FIG. 1)
2. Means for solving the problems in the example of related technology
3. First embodiment
3-1. Configuration of the information processing system (FIG. 2)
3-2. Overview of the information processing apparatus and information processing method (FIGS. 3 to 5)
3-3. Hardware configuration of the information processing apparatus (FIGS. 6 and 7)
3-4. Functional configuration of the information processing apparatus (FIGS. 8 and 9)
3-5. Information processing method (FIGS. 10 to 16)
4. Summary
First, before describing the information processing apparatus according to the first embodiment of the present invention, an example of a technique in the related art for swapping the display screen of a portable information processing apparatus to a large-screen display device will be described with reference to FIG. 1. FIG. 1 is an explanatory diagram showing the configuration of a video display system 1 (see, for example, Patent Literature 1) as an example of an information processing system including a portable information processing apparatus in the related art.
However, as described above, such a technique has the problems that user operations become complicated and that the portable display device 2 has no means of directly operating the content output to the large-screen display device 4.
[3-1. Configuration of the information processing system]
First, the overall configuration of an information processing system including the information processing apparatus according to the first embodiment of the present invention will be described with reference to FIG. 2. FIG. 2 is an explanatory diagram showing the overall configuration of the information processing system according to the present embodiment.
Next, an overview of the information processing apparatus 100 and the information processing method according to the present embodiment will be described with reference to FIGS. 3 to 5. FIG. 3 is an explanatory diagram showing the external configuration (display screen side) and a usage example of the information processing apparatus 100 according to the present embodiment. FIG. 4 is an explanatory diagram showing the external configuration (back side) and a usage example of the information processing apparatus 100 according to the present embodiment. FIG. 5 is an explanatory diagram showing an overview of the information processing method according to the present embodiment.
As shown in FIGS. 3 and 4, the information processing apparatus 100 is a portable electronic device provided with the touch panel 150 on its front surface (display screen side) and the touch pad 160 on its back surface (the reverse side of the display screen).
Next, an overview of the information processing method of the present embodiment using the information processing apparatus 100 described above will be given. On the display screen (touch panel 150) of the information processing apparatus 100, the screen of an application associated with the content to be reproduced, such as a general WEB browser, a video player, or a music player, is displayed. FIG. 5 shows an example in which a plurality of thumbnail images, indicated by rectangles, are displayed as objects on the application screen shown on the touch panel 150 of the information processing apparatus 100. Suppose that, among the plurality of objects displayed on the touch panel 150, the object 150a is tapped by the first operating body 170 (the user's thumb) and the object 150a is thereby selected. Then, when a tap operation is performed on the touch pad 160 by the second operating body 180 (for example, the user's index finger) while the object 150a is being tapped by the first operating body 170, the output device 200 is selected according to, for example, the number of taps. FIG. 5 illustrates a case where a television receiver 210 having a large-screen display 211 and an audio device 220 are connected to the network 400 as output devices 200. In this example, when the number of taps on the touch pad 160 is one, the information processing apparatus 100 can select the television receiver 210 as the output device 200. When the number of taps on the touch pad 160 is two, the information processing apparatus 100 can select the audio device 220 as the output device 200. The information processing apparatus 100 then transmits the location information (for example, a URL) of the content data corresponding to the object 150a to the selected output device 200.
Next, the hardware configuration of the information processing apparatus 100 according to the present embodiment will be described in detail with reference to FIGS. 6 and 7. FIG. 6 is an exploded perspective view showing the hardware configuration of the information processing apparatus 100 according to the present embodiment. FIG. 7 is a block diagram showing the hardware configuration of the information processing apparatus 100 according to the present embodiment.
Next, the functional configuration of the information processing apparatus 100 according to the present embodiment, realized by the hardware configuration described above, will be described with reference to FIG. 8. FIG. 8 is a block diagram showing the functional configuration of the information processing apparatus 100 according to the present embodiment.
Next, the flow of processing of the information processing method according to the present embodiment using the information processing apparatus 100 having the configuration described above will be described in detail with reference to FIGS. 10 to 16. FIGS. 10 and 11 are flowcharts showing the flow of processing of the information processing method according to the present embodiment. FIG. 12 is an explanatory diagram showing a first example to which the information processing method according to the present embodiment is applied. FIG. 13 is an explanatory diagram showing a second example to which the information processing method according to the present embodiment is applied. FIG. 14 is an explanatory diagram showing a third example to which the information processing method according to the present embodiment is applied. FIGS. 15 and 16 are explanatory diagrams showing modifications of the information processing method according to the present embodiment.
(1) An input position detection step of detecting the input position of the first operating body 170 on the display screen of the display device 151 (touch panel 150) on which a predetermined object is displayed
(2) An object specifying step of specifying the selected object, which is the object selected by the first operating body 170, based on input position information representing the input position of the first operating body 170 and display position information representing the display position of the object
(3) A gesture detection step of detecting a predetermined gesture input by the second operating body 180 while the selected object is selected by the first operating body 170
(4) An output device selection step of selecting the output device 200, an external device that outputs the content data corresponding to the selected object, based on first gesture information representing the gesture detected in the gesture detection step of (3)
(5) A location information transmission step of transmitting the location information of the content data corresponding to the selected object (the selected content data) to the output device 200
(6) A signal generation step of generating a control signal for causing the selected output device 200 to execute a predetermined process, based on second gesture information representing a gesture detected after the location information of the selected content data was transmitted to the output device 200
As described above, the information processing apparatus 100 according to the present embodiment is a mobile device equipped with the touch panel 150 on one surface (for example, the front surface) and the touch pad 160 on the other surface (for example, the back surface). The information processing apparatus 100 performs operations on its own screen with the touch panel 150, and performs operations on the screen of the output device 200, which can be connected to the information processing apparatus 100 via a home network or the like, with the touch pad 160. This makes it possible to operate the display screens of two apparatuses (the information processing apparatus 100 and the output device 200) simultaneously with a single apparatus (the information processing apparatus 100). Moreover, according to the present embodiment, complicated operations such as WEB content navigation can be performed on the information processing apparatus 100, where touch operations are easy, content output can be performed by the output device 200, and the two can be linked seamlessly.
(1) Navigation within WEB content, which requires complex operations, and content reproduction, which can be achieved with simple operations but requires a large screen, good sound quality, and the like, can be performed on separate apparatuses specializing in each (the information processing apparatus 100 and the output device 200) and controlled simultaneously.
(2) In the present embodiment, operations are separated at the hardware level: URL transfer of content by a gesture operation coordinating the touch panel 150 and the touch pad 160 of the information processing apparatus 100, operation of the information processing apparatus 100 on the touch panel 150, and operation of the output device 200 on the touch pad 160. This provides the user with relatively simple and intuitive operations and improves user convenience.
(3) The present embodiment can be realized with common technologies such as a browser and WPS, and with a mechanism that is simple, saving scroll and zoom positions per domain, yet very useful for WEB content that uses a common layout. Therefore, there is no cost of newly installing a proprietary system on top of a conventional system.
Claims (8)
- A display device on which a predetermined object is displayed;
an input position detection unit provided on the display screen side of the display device, which detects an input position of a first operating body on the display screen;
an object specifying unit that specifies a selected object, which is the object selected by the first operating body, based on input position information representing the input position and display position information representing the display position of the object;
a gesture detection unit provided on the back side of the display screen of the display device, which detects a predetermined gesture when the gesture is input by a second operating body;
an output device selection unit that selects an output device, which is an external device that outputs content data corresponding to the selected object, based on first gesture information representing the gesture detected by the gesture detection unit; and
a signal generation unit that generates a control signal for causing the selected output device to execute a predetermined process, based on second gesture information representing a gesture detected after location information of the content data was transmitted to the output device,
an information processing apparatus comprising the above. - The information processing apparatus according to claim 1, further comprising a location information transfer unit that transfers the location information of the content data corresponding to the selected object to the output device selected by the output device selection unit.
- The information processing apparatus according to claim 2, wherein the gesture detection unit is capable of detecting an input position of the second operating body on the back surface of the display screen,
the information processing apparatus further comprising:
an input area detection unit that divides the area on the display screen and the area on the back surface of the display screen each into a plurality of divided areas, and detects the divided area on the display screen where the first operating body is located and the divided area on the back surface of the display screen where the second operating body is located, based on first input position information representing the input position of the first operating body input from the input position detection unit and second input position information representing the input position of the second operating body input from the gesture detection unit; and
a determination unit that determines whether or not the divided area where the first operating body is located and the divided area where the second operating body is located are in a corresponding positional relationship, based on first divided area information representing the divided area where the first operating body is located and second divided area information representing the divided area where the second operating body is located,
wherein the location information transfer unit transfers the location information only when the determination unit determines that the divided area where the first operating body is located and the divided area where the second operating body is located are in a corresponding positional relationship. - The information processing apparatus according to claim 1, further comprising a layout adjustment unit that acquires, from the output device, display control information that is information for display control of the content data corresponding to the selected object on the output device, and generates a command signal for adjusting the layout of the content data on the display screen of the output device based on the acquired display control information and third gesture information representing a gesture detected by the gesture detection unit after the content data was displayed on the output device.
- The information processing apparatus according to claim 4, wherein, when layout adjustment is performed in displaying one piece of content data, the output device stores layout information regarding the layout of the one piece of content data after the layout adjustment in association with the location information of the content data, and
the layout adjustment unit transmits to the output device, in accordance with gesture information input from the gesture detection unit, application condition information representing an application condition of the layout information associated with the location information of the one piece of content data stored in the output device when the output device displays other content data. - The information processing apparatus according to claim 1, further comprising a device registration unit that performs device registration processing for making external devices mutually accessible with the information processing apparatus via a network using a common protocol,
wherein the output device selection unit selects the output device from among the external devices for which the device registration processing has been performed. - An input position detection step of detecting an input position of a first operating body on a display screen of a display device on which a predetermined object is displayed;
an object specifying step of specifying a selected object, which is the object selected by the first operating body, based on input position information representing the input position and display position information representing the display position of the object;
a gesture detection step of detecting a predetermined gesture input by a second operating body while the selected object is selected by the first operating body;
an output device selection step of selecting an output device, which is an external device that outputs content data corresponding to the selected object, based on first gesture information representing the gesture detected in the gesture detection step;
a location information transmission step of transmitting location information of the content data to the output device; and
a signal generation step of generating a control signal for causing the selected output device to execute a predetermined process, based on second gesture information representing a gesture detected after the location information was transmitted to the output device,
an information processing method including the above steps. - A program for causing a computer to function as:
an input position detection unit provided on the display screen side of a display device on which a predetermined object is displayed, which detects an input position of a first operating body on the display screen;
an object specifying unit that specifies a selected object, which is the object selected by the first operating body, based on input position information representing the input position and display position information representing the display position of the object;
a gesture detection unit provided on the back side of the display screen of the display device, which detects a predetermined gesture when the gesture is input by a second operating body;
an output device selection unit that selects an output device, which is an external device that outputs content data corresponding to the selected object, based on first gesture information representing the gesture detected by the gesture detection unit; and
a signal generation unit that generates a control signal for causing the selected output device to execute a predetermined process, based on second gesture information representing a gesture detected after location information of the content data was transmitted to the output device,
the program causing the computer to function as an information processing apparatus comprising the above units.
Priority Applications (5)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
BRPI1006971A BRPI1006971A2 (pt) | 2009-02-04 | 2010-01-27 | "dispositivo e método de processamento de informação, e, programa." |
CN201080005901.XA CN102301317B (zh) | 2009-02-04 | 2010-01-27 | 信息处理装置和信息处理方法 |
US13/146,888 US20110285658A1 (en) | 2009-02-04 | 2010-01-27 | Information processing device, information processing method, and program |
RU2011131785/08A RU2541125C2 (ru) | 2009-02-04 | 2010-01-27 | Устройство обработки информации, способ обработки информации и программа |
EP10738441.4A EP2395416A4 (en) | 2009-02-04 | 2010-01-27 | COMPUTER DEVICE, METHOD, AND PROGRAM |
Applications Claiming Priority (2)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
JP2009-024237 | 2009-02-04 | ||
JP2009024237A JP5233708B2 (ja) | 2009-02-04 | 2009-02-04 | 情報処理装置、情報処理方法およびプログラム |
Publications (1)
Publication Number | Publication Date |
---|---|
WO2010090106A1 true WO2010090106A1 (ja) | 2010-08-12 |
Family
ID=42542006
Family Applications (1)
Application Number | Title | Priority Date | Filing Date |
---|---|---|---|
PCT/JP2010/051020 WO2010090106A1 (ja) | 2009-02-04 | 2010-01-27 | 情報処理装置、情報処理方法およびプログラム |
Country Status (7)
Country | Link |
---|---|
US (1) | US20110285658A1 (ja) |
EP (1) | EP2395416A4 (ja) |
JP (1) | JP5233708B2 (ja) |
CN (1) | CN102301317B (ja) |
BR (1) | BRPI1006971A2 (ja) |
RU (1) | RU2541125C2 (ja) |
WO (1) | WO2010090106A1 (ja) |
Cited By (1)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
WO2014020765A1 (ja) * | 2012-08-03 | 2014-02-06 | Necカシオモバイルコミュニケーションズ株式会社 | タッチパネル装置、処理決定方法、プログラムおよびタッチパネルシステム |
Families Citing this family (48)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
CN102439575B (zh) * | 2010-03-23 | 2014-10-08 | 松下电器产业株式会社 | 控制用户界面显示的服务器装置、方法、程序及集成电路 |
KR101333879B1 (ko) * | 2010-08-24 | 2013-11-27 | 주식회사 팬택 | 이동 단말기 및 이동 단말기를 이용한 제어 방법 |
JP5593980B2 (ja) | 2010-09-02 | 2014-09-24 | 株式会社ニコン | 電子機器及びデータ送信方法 |
CN101951460A (zh) * | 2010-09-03 | 2011-01-19 | 深圳市同洲电子股份有限公司 | 预览照片的方法、系统、移动终端和机顶盒 |
JP6049990B2 (ja) * | 2010-09-15 | 2016-12-21 | 京セラ株式会社 | 携帯電子機器、画面制御方法および画面制御プログラム |
US20120081615A1 (en) * | 2010-09-30 | 2012-04-05 | Starr Ephraim D | Remote control |
JP5598232B2 (ja) | 2010-10-04 | 2014-10-01 | ソニー株式会社 | 情報処理装置、情報処理システムおよび情報処理方法 |
US8619116B2 (en) * | 2010-10-22 | 2013-12-31 | Litl Llc | Video integration |
US11265510B2 (en) | 2010-10-22 | 2022-03-01 | Litl Llc | Video integration |
US9092135B2 (en) * | 2010-11-01 | 2015-07-28 | Sony Computer Entertainment Inc. | Control of virtual object using device touch interface functionality |
US8875180B2 (en) * | 2010-12-10 | 2014-10-28 | Rogers Communications Inc. | Method and device for controlling a video receiver |
US9430128B2 (en) | 2011-01-06 | 2016-08-30 | Tivo, Inc. | Method and apparatus for controls based on concurrent gestures |
CN103329075B (zh) * | 2011-01-06 | 2017-12-26 | TiVo解决方案有限公司 | 用于基于手势控制的方法和装置 |
US9065876B2 (en) * | 2011-01-21 | 2015-06-23 | Qualcomm Incorporated | User input back channel from a wireless sink device to a wireless source device for multi-touch gesture wireless displays |
JP5709206B2 (ja) * | 2011-02-17 | 2015-04-30 | Necカシオモバイルコミュニケーションズ株式会社 | タッチパネル装置、処理決定方法、プログラムおよびタッチパネルシステム |
JP5816834B2 (ja) | 2011-03-22 | 2015-11-18 | パナソニックIpマネジメント株式会社 | 入力装置、および入力方法 |
JP2012247840A (ja) * | 2011-05-25 | 2012-12-13 | Sony Corp | 近隣人物特定装置、近隣人物特定方法、近隣人物特定プログラム及び近隣人物特定システム |
JP5890126B2 (ja) * | 2011-08-24 | 2016-03-22 | シャープ株式会社 | 携帯型電子機器、携帯型電子機器の制御方法、制御プログラム、およびコンピュータ読み取り可能な記録媒体 |
EP2757450A4 (en) * | 2011-09-15 | 2015-08-26 | Nec Corp | DEVICE AND METHOD FOR PROCESSING MEMORY INFORMATION OF ELECTRONIC SHORT NOTES |
KR101972924B1 (ko) | 2011-11-11 | 2019-08-23 | 삼성전자주식회사 | 휴대용 기기에서 부분 영역의 터치를 이용한 전체 영역 지정을 위한 방법 및 장치 |
JP5870661B2 (ja) | 2011-12-06 | 2016-03-01 | 株式会社リコー | 携帯端末、出力制御システム、出力制御プログラム、出力制御方法 |
CN103186333B (zh) * | 2011-12-28 | 2018-05-22 | 深圳富泰宏精密工业有限公司 | 电子设备解锁系统及方法 |
EP2613227B1 (en) * | 2012-01-06 | 2020-04-29 | Samsung Electronics Co., Ltd | Input apparatus and control method thereof |
EP2808773A4 (en) * | 2012-01-26 | 2015-12-16 | Panasonic Corp | MOBILE TERMINAL, TELEPHONE RECEIVER AND DEVICE CONNECTING METHOD |
US9081491B2 (en) * | 2012-03-30 | 2015-07-14 | Corel Corporation | Controlling and editing media files with touch gestures over a media viewing area using a touch sensitive device |
FR2989483B1 (fr) | 2012-04-11 | 2014-05-09 | Commissariat Energie Atomique | Dispositif d'interface utilisateur a electrodes transparentes |
KR102027357B1 (ko) * | 2012-04-24 | 2019-10-01 | 삼성전자주식회사 | 외부 기기의 스크린 상에 디스플레이된 정보를 탐색하는 터치 스크린을 가지는 휴대용 기기 및 그의 정보 탐색 방법 |
KR101341737B1 (ko) * | 2012-06-21 | 2013-12-16 | 주식회사 팬택 | 후면 터치를 이용한 단말 제어 장치 및 방법 |
CN104335151B (zh) | 2012-06-29 | 2018-10-26 | 日本电气株式会社 | 终端设备、显示控制方法和程序 |
EP2685329B1 (en) * | 2012-07-11 | 2015-09-23 | ABB Research Ltd. | Presenting process data of a process control object on a mobile terminal |
FR2995419B1 (fr) | 2012-09-12 | 2015-12-11 | Commissariat Energie Atomique | Systeme d'interface utilisateur sans contact |
CN102866777A (zh) * | 2012-09-12 | 2013-01-09 | 中兴通讯股份有限公司 | 一种数字媒体内容播放转移的方法及播放设备及系统 |
US9323310B2 (en) * | 2012-09-19 | 2016-04-26 | Sony Corporation | Mobile client device, operation method, and recording medium |
FR2996933B1 (fr) * | 2012-10-15 | 2016-01-01 | Isorg | Appareil portable a ecran d'affichage et dispositif d'interface utilisateur |
JP6140980B2 (ja) * | 2012-11-13 | 2017-06-07 | キヤノン株式会社 | 表示装置、画像表示システム、画像表示方法、及びコンピュータプログラム |
US9513795B2 (en) * | 2012-11-29 | 2016-12-06 | Blackberry Limited | System and method for graphic object management in a large-display area computing device |
CN103036962A (zh) * | 2012-12-06 | 2013-04-10 | 惠州Tcl移动通信有限公司 | 一种文件的共享方法及手持设备 |
AU2014205344A1 (en) * | 2013-01-10 | 2015-07-02 | Fox Sports Productions, LLC. | System, method and interface for viewer interaction relative to a 3D representation of a vehicle |
US20140267049A1 (en) * | 2013-03-15 | 2014-09-18 | Lenitra M. Durham | Layered and split keyboard for full 3d interaction on mobile devices |
JP2014204150A (ja) * | 2013-04-01 | 2014-10-27 | 株式会社東芝 | リモコン |
KR102173727B1 (ko) * | 2014-03-31 | 2020-11-03 | 삼성전자주식회사 | 음향 신호 기반 정보 공유 방법 및 그 장치 |
CN104090706B (zh) * | 2014-07-31 | 2018-06-05 | 北京智谷睿拓技术服务有限公司 | 内容获取方法、内容分享方法、及其装置 |
CN105302445B (zh) * | 2015-11-12 | 2019-07-23 | 小米科技有限责任公司 | 图形用户界面绘制方法及装置 |
CN106168879A (zh) * | 2016-06-30 | 2016-11-30 | 努比亚技术有限公司 | 一种双面屏交互的方法及终端 |
CN106502805B (zh) * | 2016-10-31 | 2020-02-21 | 宇龙计算机通信科技(深圳)有限公司 | 一种终端应用分享方法、系统以及设备终端 |
CN106873870A (zh) * | 2017-01-06 | 2017-06-20 | 珠海格力电器股份有限公司 | 一种终端交互方法及其装置、终端及电子设备 |
KR101971982B1 (ko) * | 2017-04-20 | 2019-04-24 | 주식회사 하이딥 | 터치 감지 및 터치압력 감지가 가능한 장치 및 제어방법 |
DE112018002980T5 (de) | 2017-06-12 | 2020-02-20 | Sony Corporation | Informationsverarbeitungssystem, informationsverarbeitungsverfahren und programm |
Citations (4)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
JP2000029837A (ja) * | 1998-07-08 | 2000-01-28 | Toshiba Corp | 個人認証方法、ユーザ認証装置、及び記録媒体 |
JP2003309884A (ja) * | 2002-04-18 | 2003-10-31 | Matsushita Electric Ind Co Ltd | リモートコントロール装置および記録媒体 |
JP2003330611A (ja) * | 2002-05-16 | 2003-11-21 | Sony Corp | 入力方法及び入力装置 |
JP2004336597A (ja) | 2003-05-12 | 2004-11-25 | Sony Corp | 操作入力受付装置、操作入力受付方法および遠隔操作システム |
Family Cites Families (12)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
US7800592B2 (en) * | 2005-03-04 | 2010-09-21 | Apple Inc. | Hand held electronic device with multiple touch sensing devices |
KR100474724B1 (ko) * | 2001-08-04 | 2005-03-08 | 삼성전자주식회사 | 터치스크린을 가지는 장치 및 그 장치에 외부디스플레이기기를 연결하여 사용하는 방법 |
BRPI0212375B1 (pt) * | 2001-09-07 | 2016-05-24 | Intergraph Hardware Tech Co | método para estabilizar uma imagem |
US8176432B2 (en) * | 2001-11-20 | 2012-05-08 | UEI Electronics Inc. | Hand held remote control device having an improved user interface |
JP3925297B2 (ja) * | 2002-05-13 | 2007-06-06 | ソニー株式会社 | 映像表示システム及び映像表示制御装置 |
US7456823B2 (en) * | 2002-06-14 | 2008-11-25 | Sony Corporation | User interface apparatus and portable information apparatus |
US7218313B2 (en) * | 2003-10-31 | 2007-05-15 | Zeetoo, Inc. | Human interface system |
JP4645179B2 (ja) * | 2004-12-02 | 2011-03-09 | 株式会社デンソー | 車両用ナビゲーション装置 |
JP4715535B2 (ja) * | 2005-05-23 | 2011-07-06 | ソニー株式会社 | コンテンツ表示再生システム、コンテンツ表示再生方法、コンテンツ表示再生プログラムを記録した記録媒体及び操作制御装置 |
WO2007124453A2 (en) * | 2006-04-20 | 2007-11-01 | Exceptional Innovation Llc | Touch screen for convergence and automation system |
US8564544B2 (en) * | 2006-09-06 | 2013-10-22 | Apple Inc. | Touch screen device, method, and graphical user interface for customizing display of content category icons |
US8122475B2 (en) * | 2007-02-13 | 2012-02-21 | Osann Jr Robert | Remote control for video media servers |
-
2009
- 2009-02-04 JP JP2009024237A patent/JP5233708B2/ja not_active Expired - Fee Related
-
2010
- 2010-01-27 WO PCT/JP2010/051020 patent/WO2010090106A1/ja active Application Filing
- 2010-01-27 RU RU2011131785/08A patent/RU2541125C2/ru not_active IP Right Cessation
- 2010-01-27 BR BRPI1006971A patent/BRPI1006971A2/pt not_active IP Right Cessation
- 2010-01-27 US US13/146,888 patent/US20110285658A1/en not_active Abandoned
- 2010-01-27 CN CN201080005901.XA patent/CN102301317B/zh not_active Expired - Fee Related
- 2010-01-27 EP EP10738441.4A patent/EP2395416A4/en not_active Ceased
Patent Citations (4)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
JP2000029837A (ja) * | 1998-07-08 | 2000-01-28 | Toshiba Corp | 個人認証方法、ユーザ認証装置、及び記録媒体 |
JP2003309884A (ja) * | 2002-04-18 | 2003-10-31 | Matsushita Electric Ind Co Ltd | リモートコントロール装置および記録媒体 |
JP2003330611A (ja) * | 2002-05-16 | 2003-11-21 | Sony Corp | 入力方法及び入力装置 |
JP2004336597A (ja) | 2003-05-12 | 2004-11-25 | Sony Corp | 操作入力受付装置、操作入力受付方法および遠隔操作システム |
Non-Patent Citations (1)
Title |
---|
See also references of EP2395416A4 |
Cited By (2)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
WO2014020765A1 (ja) * | 2012-08-03 | 2014-02-06 | Necカシオモバイルコミュニケーションズ株式会社 | タッチパネル装置、処理決定方法、プログラムおよびタッチパネルシステム |
US9817567B2 (en) | 2012-08-03 | 2017-11-14 | Nec Corporation | Touch panel device, process determination method, program, and touch panel system |
Also Published As
Publication number | Publication date |
---|---|
BRPI1006971A2 (pt) | 2016-04-12 |
JP5233708B2 (ja) | 2013-07-10 |
US20110285658A1 (en) | 2011-11-24 |
RU2541125C2 (ru) | 2015-02-10 |
CN102301317B (zh) | 2014-11-19 |
CN102301317A (zh) | 2011-12-28 |
EP2395416A1 (en) | 2011-12-14 |
EP2395416A4 (en) | 2015-04-29 |
RU2011131785A (ru) | 2013-02-10 |
JP2010182046A (ja) | 2010-08-19 |
Similar Documents
Publication | Publication Date | Title |
---|---|---|
JP5233708B2 (ja) | 情報処理装置、情報処理方法およびプログラム | |
US10175847B2 (en) | Method and system for controlling display device and computer-readable recording medium | |
JP5705131B2 (ja) | 異種のタッチ領域を利用した電子機器の動作制御方法及び装置 | |
CN108052265B (zh) | 用于使用图形对象控制内容的方法和设备 | |
TWI516962B (zh) | 巡覽一瀏覽器中之複數個內容項目的方法、電腦系統及電腦程式產品 | |
JP5363259B2 (ja) | 画像表示装置、画像表示方法およびプログラム | |
KR101276846B1 (ko) | 미디어 데이터의 스트리밍 제어방법 및 제어장치 | |
US10275132B2 (en) | Display apparatus, method of controlling display apparatus, and recordable medium storing program for performing method of controlling display apparatus | |
US20150082241A1 (en) | Method for screen mirroring and source device thereof | |
US20150193036A1 (en) | User terminal apparatus and control method thereof | |
US20100101872A1 (en) | Information processing apparatus, information processing method, and program | |
US20100214249A1 (en) | Information processing apparatus, display control method, and program | |
CA2826933C (en) | Method and apparatus for providing graphic user interface in mobile terminal | |
WO2007102110A2 (en) | Method of transferring data | |
KR102037415B1 (ko) | 디스플레이 디바이스 제어 방법 및 시스템과 기록 매체 | |
KR20140034100A (ko) | 휴대단말과 외부 표시장치 연결 운용 방법 및 이를 지원하는 장치 | |
KR20150144641A (ko) | 사용자 단말 장치 및 그 제어 방법 | |
KR20130119708A (ko) | 외부 기기의 스크린 상에 디스플레이된 정보를 탐색하는 터치 스크린을 가지는 휴대용 기기 및 그의 정보 탐색 방법 | |
KR20140133354A (ko) | 디스플레이 장치 및 이의 ui 제공 방법 | |
KR20130034892A (ko) | 이동 단말기 및 그를 통한 차량 제어방법 | |
JP2017117108A (ja) | 電子機器及びその制御方法 | |
WO2021219002A1 (zh) | 显示设备 | |
KR20130116976A (ko) | 이동 단말기 및 그 제어방법 | |
KR20160020724A (ko) | 디스플레이 장치 및 그 제어 방법 | |
KR102065404B1 (ko) | 이동 단말기 및 그 제어방법 |
Legal Events
Date | Code | Title | Description |
---|---|---|---|
WWE | Wipo information: entry into national phase |
Ref document number: 201080005901.X Country of ref document: CN |
|
121 | Ep: the epo has been informed by wipo that ep was designated in this application |
Ref document number: 10738441 Country of ref document: EP Kind code of ref document: A1 |
|
WWE | Wipo information: entry into national phase |
Ref document number: 2010738441 Country of ref document: EP |
|
WWE | Wipo information: entry into national phase |
Ref document number: 2011131785 Country of ref document: RU Ref document number: 13146888 Country of ref document: US Ref document number: 5780/DELNP/2011 Country of ref document: IN |
|
NENP | Non-entry into the national phase |
Ref country code: DE |
|
REG | Reference to national code |
Ref country code: BR Ref legal event code: B01A Ref document number: PI1006971 Country of ref document: BR |
|
ENP | Entry into the national phase |
Ref document number: PI1006971 Country of ref document: BR Kind code of ref document: A2 Effective date: 20110728 |