WO2013031134A1 - Information processing apparatus, information processing method, and program - Google Patents

Information processing apparatus, information processing method, and program Download PDF

Info

Publication number
WO2013031134A1
Authority
WO
WIPO (PCT)
Prior art keywords
information processing
processing apparatus
displayed
user interface
display
Prior art date
Application number
PCT/JP2012/005233
Other languages
French (fr)
Inventor
Wataru Kawamata
Original Assignee
Sony Corporation
Priority date
Filing date
Publication date
Application filed by Sony Corporation filed Critical Sony Corporation
Priority to CN201280040926.2A priority Critical patent/CN103765362B/en
Priority to US14/237,361 priority patent/US10140002B2/en
Priority to KR1020147004212A priority patent/KR20140068024A/en
Priority to BR112014004048A priority patent/BR112014004048A2/en
Priority to EP20120827276 priority patent/EP2751654A4/en
Priority to RU2014106495/08A priority patent/RU2014106495A/en
Publication of WO2013031134A1 publication Critical patent/WO2013031134A1/en

Classifications

    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06F ELECTRIC DIGITAL DATA PROCESSING
    • G06F3/00 Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
    • G06F3/01 Input arrangements or combined input and output arrangements for interaction between user and computer
    • G06F3/048 Interaction techniques based on graphical user interfaces [GUI]
    • G06F3/0484 Interaction techniques based on graphical user interfaces [GUI] for the control of specific functions or operations, e.g. selecting or manipulating an object, an image or a displayed text element, setting a parameter value or selecting a range
    • G06F3/04845 Interaction techniques based on graphical user interfaces [GUI] for the control of specific functions or operations, e.g. selecting or manipulating an object, an image or a displayed text element, setting a parameter value or selecting a range for image manipulation, e.g. dragging, rotation, expansion or change of colour
    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06F ELECTRIC DIGITAL DATA PROCESSING
    • G06F3/00 Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
    • G06F3/01 Input arrangements or combined input and output arrangements for interaction between user and computer
    • G06F3/048 Interaction techniques based on graphical user interfaces [GUI]
    • G06F3/0487 Interaction techniques based on graphical user interfaces [GUI] using specific features provided by the input device, e.g. functions controlled by the rotation of a mouse with dual sensing arrangements, or of the nature of the input device, e.g. tap gestures based on pressure sensed by a digitiser
    • G06F3/0488 Interaction techniques based on graphical user interfaces [GUI] using specific features provided by the input device, e.g. functions controlled by the rotation of a mouse with dual sensing arrangements, or of the nature of the input device, e.g. tap gestures based on pressure sensed by a digitiser using a touch-screen or digitiser, e.g. input of commands through traced gestures
    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06F ELECTRIC DIGITAL DATA PROCESSING
    • G06F3/00 Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
    • G06F3/01 Input arrangements or combined input and output arrangements for interaction between user and computer
    • G06F3/048 Interaction techniques based on graphical user interfaces [GUI]
    • G06F3/0487 Interaction techniques based on graphical user interfaces [GUI] using specific features provided by the input device, e.g. functions controlled by the rotation of a mouse with dual sensing arrangements, or of the nature of the input device, e.g. tap gestures based on pressure sensed by a digitiser
    • G06F3/0488 Interaction techniques based on graphical user interfaces [GUI] using specific features provided by the input device, e.g. functions controlled by the rotation of a mouse with dual sensing arrangements, or of the nature of the input device, e.g. tap gestures based on pressure sensed by a digitiser using a touch-screen or digitiser, e.g. input of commands through traced gestures
    • G06F3/04883 Interaction techniques based on graphical user interfaces [GUI] using specific features provided by the input device, e.g. functions controlled by the rotation of a mouse with dual sensing arrangements, or of the nature of the input device, e.g. tap gestures based on pressure sensed by a digitiser using a touch-screen or digitiser, e.g. input of commands through traced gestures for inputting data by handwriting, e.g. gesture or text

Definitions

  • the present disclosure relates to an information processing apparatus, an information processing method, and a program.
  • Patent Document 1 Japanese Patent Application Laid-Open No. 2008-70968
  • an information processing apparatus including: a display controller that controls a user interface to display a first object; and a detection unit that detects an input received at the user interface, wherein the display controller performs a predetermined operation corresponding to the displayed first object based on a first input detected at a first position at which the first object is displayed and a second input detected at a second position in which the object is not displayed.
  • an information processing method including: controlling a user interface to display a first object; detecting an input received at the user interface; and performing a predetermined operation corresponding to the displayed first object based on a first input detected at a first position at which the first object is displayed and a second input detected at a second position in which the object is not displayed.
  • a non-transitory computer-readable medium including computer program instructions, which when executed by an information processing apparatus, cause the information processing apparatus to perform a method comprising: controlling a user interface to display a first object; detecting an input received at the user interface; and performing a predetermined operation corresponding to the displayed first object based on a first input detected at a first position at which the first object is displayed and a second input detected at a second position in which the object is not displayed.
  • the operability can be improved.
  • Fig. 1 is a block diagram illustrating the functional configuration of an information processing apparatus according to a first embodiment of the present disclosure.
  • Fig. 2 is a diagram illustrating an object selecting operation recognized in the information processing apparatus according to the first embodiment.
  • Fig. 3 is a diagram illustrating a target object expanding operation in the information processing apparatus according to the first embodiment.
  • Fig. 4 is a diagram illustrating a display position of a target object displayed by the information processing apparatus according to the first embodiment.
  • Fig. 5 is a diagram illustrating movement of an expansion center of the target object displayed by the information processing apparatus according to the first embodiment.
  • Fig. 6 is a diagram illustrating a display position of the target object displayed by the information processing apparatus according to the first embodiment, when an important region is included.
  • Fig. 7 is a diagram illustrating display of an additional function in the expanding operation in the information processing apparatus according to the first embodiment.
  • Fig. 8 is a diagram illustrating an operation of expanding a selection range of a target object in the information processing apparatus according to the first embodiment.
  • Fig. 9 is a diagram illustrating an expanding operation when a plurality of target objects is selected in the information processing apparatus according to the first embodiment.
  • Fig. 10 is a diagram illustrating an operation of changing an effect of an object in the information processing apparatus according to the first embodiment.
  • Fig. 11 is a diagram illustrating an operation of rotating a target object in the information processing apparatus according to the first embodiment.
  • Fig. 12 is a block diagram illustrating the functional configuration of an information processing apparatus according to a second embodiment of the present disclosure.
  • Fig. 13 is a block diagram illustrating an example of a hardware configuration of the information processing apparatus according to the first and second embodiments of the present disclosure.
  • Fig. 1 is a block diagram illustrating the functional configuration of the information processing apparatus according to the embodiment of the disclosure.
  • An information processing apparatus 100a illustrated in Fig. 1 is an apparatus that is capable of recognizing a plurality of coordinate positions simultaneously input.
  • Examples of the information processing apparatus 100a include a portable telephone, a personal computer (PC), a video processing apparatus, a game console, a home appliance, and a music reproducing apparatus.
  • the information processing apparatus 100a includes a display unit 101, a detecting unit 103, an operation recognizing unit 105, and a display control unit 107.
  • the display unit 101 has a function of providing a display screen to a user.
  • the display unit 101 may be a display device such as a liquid crystal display (LCD) device or an organic light emitting diode (OLED) display device.
  • the detecting unit 103 has a function of detecting the position of an operation body on a display screen.
  • the detecting unit 103 can detect the positions of a plurality of fingers F on the display screen.
  • the function of the detecting unit 103 is realized using, for example, a touch sensor, an imaging apparatus, or other sensors.
  • examples of the operation body include a part of the user's body, such as a finger, an arm, or a foot, and a stylus pen.
  • the detecting unit 103 is a touch sensor superimposed on the display unit 101 and the operation body is a finger F of a user.
  • the touch sensor used here may be, for example, a contact type touch sensor that detects the position of the finger F touching the display screen.
  • the touch sensor used here may be a non-contact type touch sensor that detects the position of the finger F on the display screen in a non-contact manner.
  • the touch sensor used here may detect a pushing operation of a user on the display screen.
  • the operation recognizing unit 105 has a function of recognizing a process indicated by an operation pattern input by the user.
  • the operation pattern is determined based on the position of the operation body. Specifically, the operation pattern is determined based on various conditions such as a detection timing of the position of the operation body and a change pattern of the position of the operation body.
  • the operation recognizing unit 105 determines the operation pattern and recognizes the process associated with that pattern in advance.
  • the display control unit 107 has a function of controlling display of a display screen on the display unit 101.
  • the display control unit 107 can generate the display screen based on the process recognized by the operation recognizing unit 105 and display the display screen on the display unit 101.
  • the display control unit 107 can display a display screen on which a target object is expanded, when the operation recognizing unit 105 recognizes that an operation pattern input by the user is an object expanding operation.
  • the display control unit 107 can display an object on the display screen.
  • the display control unit 107 may display a plurality of objects on the display screen at random.
  • the display control unit 107 may display the selected object (hereinafter referred to as a target object, meaning an object to be operated) differently from the other objects.
  • the constituent elements may be configured using general units or circuits or may be configured by hardware specialized for the functions of the constituent elements. Further, the functions of the constituent elements may be performed by reading a control program, which describes a processing order in which the functions are realized by an arithmetic device such as a central processing unit (CPU), from a storage medium such as a read-only memory (ROM) or a random access memory (RAM), which stores the control program, analyzing the control program, and executing the control program. Accordingly, a configuration to be used may be appropriately modified in accordance with a technical level at which this embodiment is realized.
  • a computer program configured to realize the functions of the information processing apparatus 100a according to the above-described embodiment may be created and mounted on a personal computer or the like. Further, a computer readable recording medium that stores the computer program may be provided. Examples of the recording medium include a magnetic disk, an optical disc, a magneto-optical disc, and a flash memory. Furthermore, the computer program may be delivered via a network or the like without use of a recording medium.
  • the display control unit 107 can display a plurality of objects 5 arranged at random on the display screen, as illustrated in state 1.
  • four objects 5 are illustrated: an object 5A, an object 5B, an object 5C, and an object 5D.
  • the user selects a target object to be operated from among the objects 5 with the finger F1 (state 2).
  • the finger F1 is an example of a first operation body.
  • the operation recognizing unit 105 can recognize a first operation of selecting the target object to be operated from among the objects 5.
  • the operation recognizing unit 105 can recognize the first operation of selecting the object 5C located under the finger F1.
  • the display control unit 107 may display the selected object 5C differently from the other objects 5.
  • the display control unit 107 may perform control such that the selected object 5C is displayed at the uppermost portion.
  • the finger F2 is an example of a second operation body.
  • the finger F2 may be another finger of the same hand as that of the finger F1.
  • the finger F2 may be one of the fingers of another hand of the same user.
  • the finger F2 may be a finger of another user.
  • the operation recognizing unit 105 can recognize a second operation on the target object 5C based on a change in relative positions between a position P1 of the finger F1 and a position P2 of the finger F2.
  • the position of the finger F2 in state 3 is assumed to be P2-1.
  • the user moves the finger F2 from the position P2-1 to a position P2-2, as illustrated in state 4.
  • when the distance between the position P1 and the position P2 changes, the operation recognizing unit 105 can recognize that the second operation is an operation of changing the size of the selected object 5. More specifically, when the distance between the position P1 and the position P2-2 is greater than the distance between the position P1 and the position P2-1, the operation recognizing unit 105 can recognize that the second operation is an expanding operation.
  • conversely, when the distance between the position P1 and the position P2-2 is smaller than the distance between the position P1 and the position P2-1, the operation recognizing unit 105 can recognize that the second operation is a reducing operation.
  • the display control unit 107 expands and displays the target object 5C.
  • Patent Document 1 discloses a technique of performing an expanding or reducing process based on a relative distance between two simultaneously detected points (so-called pinch-in and pinch-out).
  • the information processing apparatus 100a is configured such that a target object is specified with high accuracy, and then the second operation can be performed on the selected object.
  • the configuration of the information processing apparatus 100a is effective when the plurality of objects 5 to be operated is displayed.
  • the display control unit 107 can trace the changed position of the finger F1 to move the target object 5C (third operation).
  • the third operation may be performed before or after the second operation or may be performed simultaneously with the second operation.
  • Fig. 4 is a diagram illustrating the display position of a target object displayed by the information processing apparatus according to this embodiment.
  • Fig. 5 is a diagram illustrating movement of an expansion center of the target object displayed by the information processing apparatus according to this embodiment.
  • Fig. 6 is a diagram illustrating a display position of the target object displayed by the information processing apparatus according to this embodiment, when an important region is included.
  • the information processing apparatus 100a selects the object 5 with the finger F1, and then performs the second operation while keeping the finger F1 on the selected object 5. The finger F1 may therefore make it difficult for the user to view the object while the second operation is performed. Accordingly, the display control unit 107 of the information processing apparatus 100a may move the display position of the target object 5 to a position at which the user can easily view the target object. As illustrated in Fig. 4, the display control unit 107 may move the display position of the selected object 5 within a range in which the position P1 of the finger F1 does not leave the selected object 5.
  • the display control unit 107 can move the display position of the selected object 5 in accordance with a positional relation between the position P1 of the finger F1 and the position P2 of the finger F2.
  • Fig. 3 the case in which the central point of the object 5C is an expansion center has been described.
  • the display control unit 107 may control display such that the object 5C is expanded in the direction of the finger F2, as illustrated in Fig. 5.
  • the expansion center is moved in the direction opposite to the finger F2 within the range in which the position P1 of the finger F1 does not leave the object 5C. Accordingly, the target object 5C is not covered by the finger F1, and the second operation can be performed while the object remains easy to view.
  • the movement of the display position may be performed simultaneously with, for example, the second operation.
  • the movement of the display position may be performed after the position P2 of the finger F2 is detected.
  • the user may sometimes perform an operation while viewing the display of the object 5. In this case, this configuration is effective.
  • a predetermined region 51 (for example, an important region) is included within the object 5 in some cases.
  • the predetermined region 51 may be a region corresponding to, for example, the face of a subject.
  • the predetermined region 51 may be, for example, a part of an object.
  • the predetermined region may be, for example, a region that is set based on an image analysis result of the object 5 which is an image object.
  • the display control unit 107 can change the display position of the target object in accordance with the predetermined region included within the target object and an operation detection position, which is a position on the display screen determined in accordance with the position of the first operation body.
  • the display control unit 107 may change the display position of the target object so that the operation detection position is not included within the predetermined region.
  • the display control unit 107 can move the display position of the object 5 so that the predetermined region 51 does not overlap the position P1 of the finger F1, when moving the display position. With such a configuration, it is possible to prevent a situation where the predetermined region 51 of the object 5, in particular, is poorly visible.
  • Fig. 7 is a diagram illustrating the display of the additional function in the expanding operation of the information processing apparatus according to this embodiment.
  • the display control unit 107 can display an additional function object 6 for the target object 5 on the display screen in accordance with the size of the selected target object 5.
  • State 6 is a state where the position P1 of the finger F1 and the position P2 of the finger F2 are detected, as in state 3, but the position of the finger F2 is different.
  • the position of the finger F2 may be a portion in which the object 5 is not displayed or may be a position on the object 5.
  • the display control unit 107 displays the additional function object 6A in accordance with the size of the expanded target object 5C.
  • the additional function objects 6 displayed here are objects used to perform an additional function of performing an adding process on the target object.
  • the operation recognizing unit 105 can recognize a reproducing operation of reproducing the target object 5C.
  • the operation recognizing unit 105 can recognize a deleting operation of deleting the target object 5C.
  • the additional function object 6C may show a photographing date of the target object 5C.
  • Fig. 8 is a diagram illustrating an expanding operation of expanding the selection range of a target object in the information processing apparatus according to this embodiment.
  • Fig. 9 is a diagram illustrating an expanding operation when a plurality of target objects is selected in the information processing apparatus according to this embodiment.
  • the display control unit 107 displays seven objects 5, that is, displays objects 5A to 5G.
  • the object 5C is selected with the finger F1.
  • a tapping operation is performed with the finger F2.
  • the selection range of the target object is expanded.
  • the operation recognizing unit 105 can recognize the tapping operation performed with the finger F2 as an operation of expanding the selection range up to the objects 5 (the objects 5A, 5B, and 5D) adjacent to the object 5C selected in state 9.
  • the expanding operation of expanding the selection range is not limited to the expansion to the objects 5 adjacent to the previously selected object 5.
  • the selection range may be expanded up to the plurality of objects 5 displayed within a predetermined distance from the position P1 of the finger F1.
  • the operation recognizing unit 105 can recognize the expanding operation as an expanding operation of expanding the plurality of objects 5 (the objects 5A to 5D) selected as the target objects.
  • the display control unit 107 can expand and display the objects 5A to 5D.
  • the center of gravity of the plurality of objects 5 can be set as the expansion center.
  • the expanding operation of expanding the selection range has been performed as the tapping operation with the finger F2, but the example of the present technology is not limited thereto.
  • the expanding operation of expanding the selection range may be performed with a finger F different from the finger F2.
  • the expanding operation of expanding the selection range may be a double-tapping operation.
  • the expanding operation of expanding the selection range may be a tapping operation performed with a plurality of fingers.
  • Fig. 10 is a diagram illustrating an operation of changing an effect of an object in the information processing apparatus according to this embodiment.
  • the operation recognizing unit 105 may recognize an operation of changing an effect of the object 5.
  • the operation recognizing unit 105 may recognize a tapping operation performed with a finger F3 different from the fingers F1 and F2 as an effect changing operation.
  • the display control unit 107 may change an effect of the object 5C, when the tapping operation performed with the finger F3 is recognized.
  • a case in which the change in effect is a change in the frame of the object 5C will be described below.
  • the object 5C is selected as the target object at the position P1 of the finger F1.
  • an object 5C2 whose frame overlaps the frame of an object 5C1 is displayed (state 13).
  • an object 5C3, which has a frame different from that of the object 5C2, is displayed overlapping the object 5C1.
  • the change in an effect may be a change in color tone of an image (monochrome, sepia, etc.), a change in a background, a change in the direction of light, a change in an overlap state among the objects 5, a change in contrast, or the like.
  • Fig. 11 is a diagram illustrating an operation of rotating a target object in the information processing apparatus according to this embodiment.
  • the case in which the second operation is the operation of changing the size of the target object has hitherto been described.
  • the second operation may be the operation of rotating the target object.
  • the display control unit 107 can rotate the display angle of the target object 5C selected with the finger F1; a sketch of the angle computation is given after this list.
  • the object 5C selected with the finger F1 may be rotated clockwise from the state of an object 5C-1 to the state of an object 5C-2 as the position P2 of the finger F2 is rotated from a position P2-1 to a position P2-2 about the position P1 of the finger F1 (state 16).
  • Fig. 12 is a block diagram illustrating the functional configuration of the information processing apparatus according to the second embodiment of the present disclosure.
  • An information processing apparatus 100b includes a detection information acquiring unit 104, an operation recognizing unit 105, and a display control unit 107 as the main units.
  • the information processing apparatus 100b is different from the information processing apparatus 100a in that the information processing apparatus 100b does not include the display unit 101 and the detecting unit 103. Instead, the information processing apparatus 100b includes the detection information acquiring unit 104 that acquires detection information from an external detecting unit 103. Only differences between the information processing apparatus 100a according to the first embodiment and the information processing apparatus 100b will be described below, and the description of the common constituent elements will not be repeated here.
  • the detection information acquiring unit 104 has a function of acquiring detection information generated when the detecting unit 103 detects the position of an operation body.
  • the detection information acquiring unit 104 can input the acquired detection information into the operation recognizing unit 105.
  • Fig. 13 is a block diagram illustrating an example of the hardware configuration of the information processing apparatus according to the first and second embodiments of the present disclosure.
  • the information processing apparatus 100 includes a GPS antenna 821, a GPS processing unit 823, a communication antenna 825, a communication processing unit 827, a geomagnetic sensor 829, an acceleration sensor 831, a gyro sensor 833, an atmospheric pressure sensor 835, an imaging unit 837, a central processing unit (CPU) 839, a read-only memory (ROM) 841, a random access memory (RAM) 843, an operation unit 847, a display unit 849, a decoder 851, a speaker 853, an encoder 855, a microphone 857, and a storage unit 859.
  • the GPS antenna 821 is an example of an antenna that receives signals from positioning satellites.
  • the GPS antenna 821 can receive GPS signals from a plurality of GPS satellites and input the received GPS signal into the GPS processing unit 823.
  • the GPS processing unit 823 is an example of a calculating unit that calculates position information based on the signals received from the positioning satellites.
  • the GPS processing unit 823 calculates the current position information based on the plurality of GPS signals input from the GPS antenna 821 and outputs the calculated position information.
  • the GPS processing unit 823 calculates the position of each GPS satellite based on trajectory data of the GPS satellite and calculates the distance between each GPS satellite and the information processing apparatus 100 based on a difference between transmission and reception times of the GPS signal. Then, the current three-dimensional position can be calculated based on the calculated position of each GPS satellite and the distance between each GPS satellite and the information processing apparatus 100.
  • the trajectory data of the GPS satellite used here may be included in, for example, the GPS signal.
  • the trajectory data of the GPS satellite may be acquired from an external server via the communication antenna 825.
  • the communication antenna 825 is an antenna that has a function of receiving a communication signal via, for example, a portable communication network or a wireless local area network (LAN) communication network.
  • the communication antenna 825 can supply the received signal to the communication processing unit 827.
  • the communication processing unit 827 has a function of performing various kinds of signal processing on the signal supplied from the communication antenna 825.
  • the communication processing unit 827 can supply a digital signal generated from the supplied analog signal to the CPU 839.
  • the geomagnetic sensor 829 is a sensor that detects geomagnetism as a voltage value.
  • the geomagnetic sensor 829 may be a triaxial geomagnetic sensor that detects each of the geomagnetisms in the X, Y, and Z axis directions.
  • the geomagnetic sensor 829 can supply the detected geomagnetism data to the CPU 839.
  • the acceleration sensor 831 is a sensor that detects acceleration as a voltage value.
  • the acceleration sensor 831 may be a triaxial acceleration sensor that detects each of the accelerations in the X, Y, and Z axis directions.
  • the acceleration sensor 831 can supply the detected acceleration data to the CPU 839.
  • the gyro sensor 833 is a measuring device that detects an angle or an angular velocity of an object.
  • the gyro sensor 833 may be a triaxial gyro sensor that detects, as a voltage value, the change rate (angular velocity) of the rotation angle around each of the X, Y, and Z axes.
  • the gyro sensor 833 can supply the detected angular velocity data to the CPU 839.
  • the atmospheric pressure sensor 835 is a sensor that detects a surrounding pressure as a voltage value.
  • the atmospheric pressure sensor 835 can detect the pressure at a predetermined sampling frequency and supply the detected pressure data to the CPU 839.
  • the imaging unit 837 has a function of photographing a still image or a moving image through a lens under the control of the CPU 839.
  • the imaging unit 837 may store the photographed image in the storage unit 859.
  • the CPU 839 functions as an arithmetic device and a control device to control all of the processes in the information processing apparatus 100 in accordance with various kinds of programs.
  • the CPU 839 may be a microprocessor.
  • the CPU 839 can realize various functions in accordance with various kinds of programs.
  • the ROM 841 can store programs, calculation parameters, or the like used by the CPU 839.
  • the RAM 843 can temporarily store programs used in execution of the CPU 839, or parameters or the like appropriately changed in the execution.
  • the operation unit 847 has a function of generating an input signal used for a user to perform a desired operation.
  • the operation unit 847 may include an input unit, such as a touch sensor, a mouse, a keyboard, a button, a microphone, a switch, or a lever, with which a user inputs information, and an input control circuit configured to generate an input signal based on the input of the user and output the input signal to the CPU 839.
  • the display unit 849 is an example of an output device and may be a display device such as a liquid crystal display (LCD) device or an organic light emitting diode (OLED) display device.
  • the display unit 849 can supply information by displaying a screen for a user.
  • the decoder 851 has a function of performing decoding, analog conversion, or the like on input data under the control of the CPU 839.
  • the decoder 851 performs the decoding, the analog conversion, and the like on audio data input via, for example, the communication antenna 825 and the communication processing unit 827 and outputs an audio signal to the speaker 853.
  • the speaker 853 can output audio based on the audio signal supplied from the decoder 851.
  • the encoder 855 has a function of performing digital conversion, encoding, or the like on input data under the control of the CPU 839.
  • the encoder 855 can perform the digital conversion, the encoding, and the like on an audio signal input from the microphone 857 and output the audio data.
  • the microphone 857 can collect audio and output the audio as an audio signal.
  • the storage unit 859 is a data storage device and may include a storage medium, a recording device that records data in a storage medium, a reading device that reads data from a storage medium, and a deleting device that deletes data recorded in a storage medium.
  • a non-volatile memory such as a flash memory, a magnetoresistive random access memory (MRAM), a ferroelectric random access memory (FeRAM), a phase change random access memory (PRAM), or an electrically erasable programmable read-only memory (EEPROM), or a magnetic recording medium such as a hard disk drive (HDD) may be used as the storage medium.
  • a user first selects a target object with the finger F1, and then performs an operation on a display screen with the finger F2, but the present technology is not limited thereto.
  • a user may touch a blank portion (a portion in which an object is not displayed) on the display screen with the finger F2, and then may select a target object with the finger F1.
  • An information processing apparatus comprising: a display controller that controls a user interface to display a first object; and a detection unit that detects an input received at the user interface, wherein the display controller performs a predetermined operation corresponding to the displayed first object based on a relative relation of a first input detected at a first position at which the first object is displayed and a second input detected at a second position in which the first object is not displayed.
  • the information processing apparatus of (1) further comprising: the user interface that is controlled by the display controller to display an object.
  • the information processing apparatus of (1) to (2), wherein the information processing apparatus is one of a portable telephone, a personal computer, a video processing apparatus, a game console, a home appliance, and a music reproducing apparatus.
  • the detection unit is a touch sensor disposed on a surface of the user interface that detects a touch input received at the user interface.
  • the display controller controls the user interface to display a plurality of objects including the first object.
  • the detection unit detects a change in position of the second input and the display controller controls the user interface to change the size of the displayed first object based on the detected change in position.
  • the detection unit detects a change in position of the second input from the second position to a third position that is a greater distance from the first position than the second position, and the display controller controls the user interface to increase the size of the displayed first object based on the detected change in position.
  • the detection unit detects, as the second input, a plurality of individual inputs, and the display controller controls the user interface to display a predetermined effect on the first object based on the plurality of individual inputs.
  • the predetermined operation is an operation of rotating the displayed first object.
  • a non-transitory computer-readable medium including computer program instructions, which when executed by an information processing apparatus, cause the information processing apparatus to perform a method comprising: controlling a user interface to display a first object; detecting an input received at the user interface; and performing a predetermined operation corresponding to the displayed first object based on a first input detected at a first position at which the first object is displayed and a second input detected at a second position in which the object is not displayed.
  • 100 Information processing apparatus; 101 Display unit; 103 Detecting unit; 104 Detection information acquiring unit; 105 Operation recognizing unit; 107 Display control unit
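As noted above for the rotating operation (states 15 and 16), the display angle applied to the target object can be derived from the angle the finger F2 sweeps about the position P1 of the finger F1. The following Python sketch is illustrative only; the function name and wrap-around convention are assumptions, not taken from the patent.

```python
import math

def rotation_angle(p1, p2_start, p2_end):
    """Angle in radians swept by F2 around the pivot P1. Positive is
    counterclockwise in mathematical coordinates; on a y-down display
    the same value reads as clockwise."""
    a0 = math.atan2(p2_start[1] - p1[1], p2_start[0] - p1[0])
    a1 = math.atan2(p2_end[1] - p1[1], p2_end[0] - p1[0])
    # Wrap into [-pi, pi) so small gestures do not jump by a full turn.
    return (a1 - a0 + math.pi) % (2 * math.pi) - math.pi

print(math.degrees(rotation_angle((0, 0), (1, 0), (0, 1))))  # prints: 90.0
```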

Landscapes

  • Engineering & Computer Science (AREA)
  • General Engineering & Computer Science (AREA)
  • Theoretical Computer Science (AREA)
  • Human Computer Interaction (AREA)
  • Physics & Mathematics (AREA)
  • General Physics & Mathematics (AREA)
  • User Interface Of Digital Computer (AREA)

Abstract

An information processing apparatus including: a display controller that controls a user interface to display a first object; and a detection unit that detects an input received at the user interface, wherein the display controller performs a predetermined operation corresponding to the displayed first object based on a first input detected at a first position at which the first object is displayed and a second input detected at a second position in which the object is not displayed.

Description

INFORMATION PROCESSING APPARATUS, INFORMATION PROCESSING METHOD, AND PROGRAM
The present disclosure relates to an information processing apparatus, an information processing method, and a program.
In recent years, input devices have been diversified, and thus many methods of operating information processing apparatuses by users have been contrived. For example, pointing devices in the related art could input only one coordinate position at a time. As disclosed in Patent Document 1, however, multi-pointing devices capable of simultaneously inputting a plurality of coordinate positions have appeared.
Patent Document 1: Japanese Patent Application Laid-Open No. 2008-70968
Summary
As input devices have been diversified, user interfaces improved in operability have been demanded. It is therefore desirable to provide a novel and improved information processing apparatus, information processing method, and program capable of improving operability.
According to the present disclosure, there is provided an information processing apparatus including: a display controller that controls a user interface to display a first object; and a detection unit that detects an input received at the user interface, wherein the display controller performs a predetermined operation corresponding to the displayed first object based on a first input detected at a first position at which the first object is displayed and a second input detected at a second position in which the object is not displayed.
According to the present disclosure, there is provided an information processing method including: controlling a user interface to display a first object;
detecting an input received at the user interface; and performing a predetermined operation corresponding to the displayed first object based on a first input detected at a first position at which the first object is displayed and a second input detected at a second position in which the object is not displayed.
According to the present disclosure, there is provided a non-transitory computer-readable medium including computer program instructions, which when executed by an information processing apparatus, cause the information processing apparatus to perform a method comprising: controlling a user interface to display a first object; detecting an input received at the user interface; and performing a predetermined operation corresponding to the displayed first object based on a first input detected at a first position at which the first object is displayed and a second input detected at a second position in which the object is not displayed.
According to the present disclosure described above, the operability can be improved.
Fig. 1 is a block diagram illustrating the functional configuration of an information processing apparatus according to a first embodiment of the present disclosure.
Fig. 2 is a diagram illustrating an object selecting operation recognized in the information processing apparatus according to the first embodiment.
Fig. 3 is a diagram illustrating a target object expanding operation in the information processing apparatus according to the first embodiment.
Fig. 4 is a diagram illustrating a display position of a target object displayed by the information processing apparatus according to the first embodiment.
Fig. 5 is a diagram illustrating movement of an expansion center of the target object displayed by the information processing apparatus according to the first embodiment.
Fig. 6 is a diagram illustrating a display position of the target object displayed by the information processing apparatus according to the first embodiment, when an important region is included.
Fig. 7 is a diagram illustrating display of an additional function in the expanding operation in the information processing apparatus according to the first embodiment.
Fig. 8 is a diagram illustrating an operation of expanding a selection range of a target object in the information processing apparatus according to the first embodiment.
Fig. 9 is a diagram illustrating an expanding operation when a plurality of target objects is selected in the information processing apparatus according to the first embodiment.
Fig. 10 is a diagram illustrating an operation of changing an effect of an object in the information processing apparatus according to the first embodiment.
Fig. 11 is a diagram illustrating an operation of rotating a target object in the information processing apparatus according to the first embodiment.
Fig. 12 is a block diagram illustrating the functional configuration of an information processing apparatus according to a second embodiment of the present disclosure.
Fig. 13 is a block diagram illustrating an example of a hardware configuration of the information processing apparatus according to the first and second embodiments of the present disclosure.
Preferred embodiments of the present disclosure will be described in detail with reference to the accompanying drawings. Throughout the specification and the drawings, the same reference numerals are given to constituent elements having substantially the same functional configuration and the description thereof will not be repeated.
In the specification and the drawings, different letters are sometimes appended to the end of the same reference numeral to distinguish a plurality of constituent elements having substantially the same functional configuration from each other. For example, a plurality of constituent elements having substantially the same functional configuration are distinguished as objects 5A and 5B, as necessary. However, when it is not necessary to distinguish the plurality of constituent elements having substantially the same functional configuration from each other, only the same reference numeral is given to the constituent elements. For example, when it is not necessary to distinguish the objects 5A and 5B from each other, the objects 5A and 5B are simply referred to as the objects 5.
In the specification and the drawings, different numbers are sometimes appended to the end of the same reference numeral, with a hyphen inserted therebetween, to distinguish a plurality of constituent elements having substantially the same functional configuration from each other. For example, a plurality of constituent elements having substantially the same functional configuration are distinguished as objects 5C-1 and 5C-2, as necessary. However, when it is not necessary to distinguish the plurality of constituent elements having substantially the same functional configuration from each other, only the same reference numeral is given to the constituent elements. For example, when it is not necessary to distinguish the objects 5C-1 and 5C-2 from each other, the objects 5C-1 and 5C-2 are simply referred to as the objects 5C.
The description will be made in the following order.
1. First Embodiment
1-1. Functional Configuration
1-2. Expanding Operation
1-3. Movement of Display Position
1-4. Display of Additional Function
1-5. Operation of Expanding Selection Range
1-6. Operation of Changing Effect
1-7. Rotating Operation
2. Second Embodiment
3. Example of Hardware Configuration
<1. First Embodiment>
(1-1. Functional Configuration)
First, the functional configuration of an information processing apparatus according to an embodiment of the present disclosure will be described with reference to Fig. 1. Fig. 1 is a block diagram illustrating the functional configuration of the information processing apparatus according to the embodiment of the disclosure.
An information processing apparatus 100a illustrated in Fig. 1 is an apparatus that is capable of recognizing a plurality of coordinate positions simultaneously input. Examples of the information processing apparatus 100a include a portable telephone, a personal computer (PC), a video processing apparatus, a game console, a home appliance, and a music reproducing apparatus.
The information processing apparatus 100a includes a display unit 101, a detecting unit 103, an operation recognizing unit 105, and a display control unit 107.
(Display Unit 101)
The display unit 101 has a function of providing a display screen to a user. The display unit 101 may be a display device such as a liquid crystal display (LCD) device or an organic light emitting diode (OLED) display device.
(Detecting Unit 103)
The detecting unit 103 has a function of detecting the position of an operation body on a display screen. The detecting unit 103 can detect the positions of a plurality of fingers F on the display screen. The function of the detecting unit 103 is realized using, for example, a touch sensor, an imaging apparatus, or other sensors. Examples of the operation body include a part of the user's body, such as a finger, an arm, or a foot, and a stylus pen. In this embodiment, it is assumed that the detecting unit 103 is a touch sensor superimposed on the display unit 101 and the operation body is a finger F of a user. The touch sensor used here may be, for example, a contact type touch sensor that detects the position of the finger F touching the display screen. Alternatively, the touch sensor used here may be a non-contact type touch sensor that detects the position of the finger F on the display screen in a non-contact manner. Alternatively, the touch sensor used here may detect a pushing operation of a user on the display screen.
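As a concrete illustration of the detection information handled here, the following Python sketch models a single detected touch point; the structure and field names are assumptions made for illustration and are not taken from the patent.

```python
import time
from dataclasses import dataclass, field

@dataclass
class TouchPoint:
    """One coordinate position reported by the detecting unit 103."""
    pointer_id: int   # distinguishes simultaneously detected fingers (F1, F2, ...)
    x: float          # horizontal position on the display screen
    y: float          # vertical position on the display screen
    timestamp: float = field(default_factory=time.monotonic)  # detection timing

# Two simultaneously detected coordinate positions (fingers F1 and F2).
frame = [TouchPoint(0, 120.0, 80.0), TouchPoint(1, 300.0, 210.0)]
```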
(Operation Recognizing Unit 105)
The operation recognizing unit 105 has a function of recognizing a process indicated by an operation pattern input by the user. The operation pattern is determined based on the position of the operation body. Specifically, the operation pattern is determined based on various conditions such as a detection timing of the position of the operation body and a change pattern of the position of the operation body. The operation recognizing unit 105 determines the operation pattern and recognizes the process associated with that pattern in advance.
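Since processes are associated with operation patterns in advance, the recognizer can be organized as a simple registry, as in the minimal sketch below; the class name, pattern names, and handler signature are illustrative assumptions.

```python
class OperationRecognizer:
    """Maps a determined operation pattern to the process registered for it."""

    def __init__(self):
        self._handlers = {}  # operation pattern name -> process callback

    def register(self, pattern, handler):
        # Associate a process with an operation pattern in advance.
        self._handlers[pattern] = handler

    def recognize(self, pattern, *args):
        # The pattern itself is determined elsewhere (from detection timing
        # and position changes); here we only dispatch to its process.
        handler = self._handlers.get(pattern)
        return handler(*args) if handler else None

recognizer = OperationRecognizer()
recognizer.register("expand", lambda factor: print(f"expand by {factor:.2f}"))
recognizer.recognize("expand", 1.5)  # prints: expand by 1.50
```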
(Display Control Unit 107)
The display control unit 107 has a function of controlling display of a display screen on the display unit 101. The display control unit 107 can generate the display screen based on the process recognized by the operation recognizing unit 105 and display the display screen on the display unit 101. For example, the display control unit 107 can display a display screen on which a target object is expanded, when the operation recognizing unit 105 recognizes that an operation pattern input by the user is an object expanding operation.
The display control unit 107 can display an object on the display screen. For example, the display control unit 107 may display a plurality of objects on the display screen at random. When one of the objects is selected, the display control unit 107 may display the selected object (hereinafter referred to as a target object, meaning an object to be operated) differently from the other objects. The details of control performed by the display control unit 107 will be described in accordance with each situation described below.
Examples of the function of the information processing apparatus 100a according to this embodiment have hitherto been described. The constituent elements may be configured using general units or circuits or may be configured by hardware specialized for the functions of the constituent elements. Further, the functions of the constituent elements may be performed by reading a control program, which describes a processing order in which the functions are realized by an arithmetic device such as a central processing unit (CPU), from a storage medium such as a read-only memory (ROM) or a random access memory (RAM), which stores the control program, analyzing the control program, and executing the control program. Accordingly, a configuration to be used may be appropriately modified in accordance with a technical level at which this embodiment is realized.
A computer program configured to realize the functions of the information processing apparatus 100a according to the above-described embodiment may be created and mounted on a personal computer or the like. Further, a computer readable recording medium that stores the computer program may be provided. Examples of the recording medium include a magnetic disk, an optical disc, a magneto-optical disc, and a flash memory. Furthermore, the computer program may be delivered via a network or the like without use of a recording medium.
(1-2. Expanding Operation)
Next, an expanding operation of the information processing apparatus according to this embodiment will be described with reference to Figs. 2 and 3. Fig. 2 is a diagram illustrating an object selecting operation recognized in the information processing apparatus according to the first embodiment. Fig. 3 is a diagram illustrating a target object expanding operation in the information processing apparatus according to the first embodiment.
First, the display control unit 107 can display a plurality of objects 5 arranged at random on the display screen, as illustrated in state 1. Here, four objects 5 are illustrated: an object 5A, an object 5B, an object 5C, and an object 5D. In this state, the user selects a target object to be operated from among the objects 5 with the finger F1 (state 2). Here, the finger F1 is an example of a first operation body. For example, when the position of the finger F1 is detected on one of the objects 5, the operation recognizing unit 105 can recognize a first operation of selecting the target object to be operated from among the objects 5. More specifically, when the position of the finger F1 is detected, the operation recognizing unit 105 can recognize the first operation of selecting the object 5C located under the finger F1. The display control unit 107 may display the selected object 5C differently from the other objects 5. When another object 5 is displayed overlapping the selected object 5C, the display control unit 107 may perform control such that the selected object 5C is displayed at the uppermost portion.
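One plausible realization of this first operation, selecting the object located under the finger F1 and bringing it to the uppermost position, is the hit-test sketch below; rectangular objects and the Obj class are assumptions made for illustration.

```python
from dataclasses import dataclass

@dataclass
class Obj:
    name: str
    x: float
    y: float
    w: float
    h: float

    def contains(self, px, py):
        return self.x <= px <= self.x + self.w and self.y <= py <= self.y + self.h

def select_target(objects, p1):
    """Return the topmost object under P1 and display it uppermost
    (here, last in the draw-order list)."""
    for obj in reversed(objects):  # last drawn is topmost
        if obj.contains(*p1):
            objects.remove(obj)
            objects.append(obj)
            return obj
    return None

objects = [Obj("5A", 0, 0, 100, 80), Obj("5C", 60, 40, 100, 80)]
print(select_target(objects, (80, 60)).name)  # prints: 5C
```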
Thus, when the target object 5C is selected, one position on the display screen is pointed with a finger F2 (state 3). The finger F2 is an example of a second operation body. The finger F2 may be another finger of the same hand as that of the finger F1. Alternatively, the finger F2 may be one of the fingers of another hand of the same user. Alternatively, the finger F2 may be a finger of another user. The operation recognizing unit 105 can recognize a second operation on the target object 5C based on a change in relative positions between a position P1 of the finger F1 and a position P2 of the finger F2.
Hereinafter, an expanding operation as an example of the second operation will be described with reference to Fig. 3. The position of the finger F2 in state 3 is assumed to be P2-1. The user moves the finger F2 from the position P2-1 to a position P2-2, as illustrated in state 4. Here, when the distance between the position P1 and the position P2-1 and the distance between the position P1 and the position P2-2 differ, the operation recognizing unit 105 can recognize that the second operation is an operation of changing the size of the selected object 5. More specifically, when the distance between the position P1 and the position P2-2 is greater than the distance between the position P1 and the position P2-1, the operation recognizing unit 105 can recognize that the second operation is an expanding operation. Conversely, when the distance between the position P1 and the position P2-2 is smaller than the distance between the position P1 and the position P2-1, the operation recognizing unit 105 can recognize that the second operation is a reducing operation. In state 4, when the operation recognizing unit 105 recognizes that the second operation is an expanding operation, the display control unit 107 expands and displays the target object 5C.
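The rule just described, comparing the distance from P1 to P2-1 with the distance from P1 to P2-2, can be stated compactly as below; the function names and the use of the distance ratio as the scale factor are assumptions for illustration.

```python
import math

def distance(a, b):
    return math.hypot(a[0] - b[0], a[1] - b[1])

def classify_second_operation(p1, p2_start, p2_end):
    """A growing P1-P2 distance reads as an expanding operation,
    a shrinking one as a reducing operation."""
    d0 = distance(p1, p2_start)  # distance from P1 to P2-1
    d1 = distance(p1, p2_end)    # distance from P1 to P2-2
    if d1 > d0:
        return "expand", d1 / d0
    if d1 < d0:
        return "reduce", d1 / d0
    return "none", 1.0

print(classify_second_operation((0, 0), (100, 0), (150, 0)))  # ('expand', 1.5)
```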
As described above, when the user desires to perform an operation (the second operation) on one of a plurality of objects on a display screen provided by the information processing apparatus 100a, the user can selectively perform the second operation on the selected object 5 by performing a selecting operation (the first operation) with the finger F1, and then performing the second operation. Patent Document 1 described above discloses a technique of performing an expanding or reducing process based on a relative distance between two simultaneously detected points (so-called pinch-in and pinch-out). However, when a target object is specified with two fingers, the specifying accuracy of the target object may deteriorate compared to a case where a target object is specified with a single finger. Accordingly, the information processing apparatus 100a is configured such that a target object is specified with high accuracy, and then the second operation can be performed on the selected object. In particular, as illustrated in Fig. 2, the configuration of the information processing apparatus 100a is effective when the plurality of objects 5 to be operated is displayed.
Although not illustrated here, when the object 5C is selected as a target object with the finger F1 and the position of the finger F1 is then changed, the display control unit 107 can trace the changed position of the finger F1 to move the target object 5C (third operation). The third operation may be performed before or after the second operation or may be performed simultaneously with the second operation.
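A minimal sketch of this third operation, under the assumption that the object's origin follows the finger F1 with a fixed grab offset so the object does not jump under the finger:

```python
def move_target(grab_offset, p1_new):
    """Return the new top-left corner of the target object;
    grab_offset is the initial touch point minus the object origin."""
    return (p1_new[0] - grab_offset[0], p1_new[1] - grab_offset[1])

print(move_target((5, 5), (40, 30)))  # prints: (35, 25)
```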
(1-3. Movement of Display Position)
Hereinafter, movement of the display position of a target object will be described with reference to Figs. 4 to 6. Fig. 4 is a diagram illustrating the display position of a target object displayed by the information processing apparatus according to this embodiment. Fig. 5 is a diagram illustrating movement of an expansion center of the target object displayed by the information processing apparatus according to this embodiment. Fig. 6 is a diagram illustrating a display position of the target object displayed by the information processing apparatus according to this embodiment when an important region is included.
As described above, in the information processing apparatus 100a, the object 5 is selected with the finger F1 and the second operation is then performed while the finger F1 remains on the selected object 5. In this situation, the finger F1 may obscure the object and make it difficult for the user to view while the second operation is performed. Accordingly, the display control unit 107 of the information processing apparatus 100a may move the display position of the target object 5 to a position at which the user can easily view it. As illustrated in Fig. 4, the display control unit 107 may move the display position of the selected object 5 within a range in which the position P1 of the finger F1 does not leave the selected object 5.
Further, the display control unit 107 can move the display position of the selected object 5 in accordance with the positional relation between the position P1 of the finger F1 and the position P2 of the finger F2. In Fig. 3, the case in which the central point of the object 5C is the expansion center has been described. However, the present technology is not limited thereto. For example, when the second operation is an expanding operation, the display control unit 107 may control display such that the object 5C is expanded toward the finger F2, as illustrated in Fig. 5. At this time, the expansion center is moved in the direction opposite to the finger F2 within the range in which the position P1 of the finger F1 does not leave the object 5C. Accordingly, the target object 5C is not covered by the finger F1, and the second operation can be performed while visibility is maintained.
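One way to realize this behavior is sketched below, continuing the Obj sketch above; the anchor-point construction and the final clamping step are simplifications of the "P1 stays within the object" range of Fig. 4, introduced here for illustration.

    import math

    def scale_toward_f2(obj, p1, p2, scale):
        # Anchor the scaling at a point shifted away from F2 so the object
        # visually grows toward the finger F2.
        cx, cy = obj.x + obj.w / 2, obj.y + obj.h / 2
        dx, dy = cx - p2[0], cy - p2[1]            # direction away from F2
        n = math.hypot(dx, dy) or 1.0
        ax, ay = cx + dx / n * obj.w / 2, cy + dy / n * obj.h / 2
        obj.x = ax - (ax - obj.x) * scale
        obj.y = ay - (ay - obj.y) * scale
        obj.w *= scale
        obj.h *= scale
        # Clamp so that P1 does not leave the object after scaling.
        obj.x = min(max(obj.x, p1[0] - obj.w), p1[0])
        obj.y = min(max(obj.y, p1[1] - obj.h), p1[1])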
The movement of the display position may be performed simultaneously with, for example, the second operation. Alternatively, the movement of the display position may be performed when the position P2 of the finger F2 is detected. This configuration is particularly effective when the object 5 is an image, since the user often performs an operation while viewing the display of the object 5.
When the object 5 is an image, for example, as illustrated in Fig. 6, a predetermined region 51 (for example, an important region) is included within the object 5 in some cases. The predetermined region 51 may be a region corresponding to, for example, the face of a subject. Alternatively, the predetermined region 51 may be, for example, a part of an object. The predetermined region may be, for example, a region that is set based on an image analysis result of the object 5, which is an image object. The display control unit 107 can change the display position of the target object in accordance with the predetermined region included within the target object and an operation detection position, which is a position on the display screen determined in accordance with the position of the first operation body. For example, when it is detected that the operation detection position is included within the predetermined region of the target object, the display control unit 107 may change the display position of the target object so that the operation detection position is no longer included within the predetermined region. When moving the display position, the display control unit 107 can move the object 5 so that the predetermined region 51 does not overlap the position P1 of the finger F1. With such a configuration, it is possible to prevent a situation in which the predetermined region 51, a region of particular importance within the object 5, is difficult to view.
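A sketch of this repositioning, reusing the Obj type above for both the object and its predetermined region (with region coordinates on the display screen and a leftward-only shift, both simplifications of mine, not the publication's):

    def avoid_predetermined_region(obj, region, p1):
        # If P1 falls inside the predetermined region 51 (e.g. a detected
        # face), slide the object left just far enough that the region
        # clears P1, while keeping P1 inside the object.
        if not region.contains(*p1):
            return
        dx = (region.x + region.w) - p1[0] + 1        # clears the region
        dx = min(dx, (obj.x + obj.w) - p1[0] - 1)     # keeps P1 in the object
        obj.x -= dx
        region.x -= dx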
(1-4. Display of Additional Function)
Next, display of an additional function in the expanding operation of the information processing apparatus according to this embodiment will be described with reference to Fig. 7. Fig. 7 is a diagram illustrating the display of the additional function in the expanding operation of the information processing apparatus according to this embodiment.
When the second operation is an operation of changing the size of the selected object 5, the display control unit 107 can display an additional function object 6 for the target object 5 on the display screen in accordance with the size of the target object 5.
For example, as illustrated in Fig. 7, the position P1 of the finger F1 and the position P2 of the finger F2 are detected on the display screen on which the plurality of objects 5 is displayed (state 6). State 6 is the same as state 3 in that the position P1 of the finger F1 and the position P2 of the finger F2 are detected, but the position of the finger F2 is different. As illustrated in this state, the finger F2 may be at a portion in which no object 5 is displayed or at a position on an object 5. When the user performs the expanding operation, as in state 7, the display control unit 107 displays an additional function object 6A in accordance with the size of the expanded target object 5C.
When the user continues the expanding operation, as in state 8, further additional function objects 6B and 6C are displayed. The additional function objects 6 displayed here are objects used to perform an additional function, that is, a process added to the target object. For example, when the detecting unit 103 detects the position of the finger F on the additional function object 6A, the operation recognizing unit 105 can recognize a reproducing operation of reproducing the target object 5C. Further, when the detecting unit 103 detects the position of the finger F on the additional function object 6B, the operation recognizing unit 105 can recognize a deleting operation of deleting the target object 5C. The additional function object 6C may show a photographing date of the target object 5C.
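The threshold behavior can be sketched as follows; the widths and function labels are illustrative assumptions, since the disclosure does not specify concrete sizes.

    ADDITIONAL_FUNCTIONS = [
        (200, "reproduce"),       # additional function object 6A
        (320, "delete"),          # additional function object 6B
        (320, "shooting date"),   # additional function object 6C
    ]

    def visible_additional_functions(obj):
        # Return the additional function objects to display for the
        # current width of the expanded target object.
        return [label for min_w, label in ADDITIONAL_FUNCTIONS if obj.w >= min_w]

Under this sketch, only "reproduce" would be displayed in state 7, and all three labels would be displayed after further expansion in state 8.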
(1-5. Operation of Expanding Selection Range)
Next, an expanding operation of expanding a selection range of a target object in the information processing apparatus according to this embodiment will be described with reference to Figs. 8 and 9. Fig. 8 is a diagram illustrating an expanding operation of expanding the selection range of a target object in the information processing apparatus according to this embodiment. Fig. 9 is a diagram illustrating an expanding operation when a plurality of target objects is selected in the information processing apparatus according to this embodiment.
In state 9, the display control unit 107 displays seven objects 5, namely objects 5A to 5G. Here, the object 5C is selected with the finger F1. In this state, a tapping operation is performed with the finger F2. Then, as illustrated in state 10, the selection range of the target object is expanded. Here, the operation recognizing unit 105 can recognize the tapping operation performed with the finger F2 as an operation of expanding the selection range to the objects 5 (the objects 5A, 5B, and 5D) adjacent to the object 5C selected in state 9. The operation of expanding the selection range is not limited to expansion to the objects 5 adjacent to the previously selected object 5. For example, the selection range may be expanded to the plurality of objects 5 displayed within a predetermined distance from the position P1 of the finger F1.
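The within-a-distance variant could be sketched as follows, continuing the Obj sketch above (the radius and names are illustrative):

    import math

    def expand_selection(objects, p1, radius=150.0):
        # Grow the selection to every object whose center lies within a
        # predetermined distance of P1 when the tap with F2 is recognized.
        def center(o):
            return (o.x + o.w / 2, o.y + o.h / 2)
        return [o for o in objects if math.dist(center(o), p1) <= radius]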
When the plurality of objects 5 is selected as the target objects through the operation of expanding the selection range and the user then performs an expanding operation by moving the position P2 of the finger F2, as illustrated in Fig. 9, the operation recognizing unit 105 can recognize it as an expanding operation on the plurality of objects 5 (the objects 5A to 5D) selected as the target objects. At this time, the display control unit 107 can expand and display the objects 5A to 5D. For example, the center of gravity of the plurality of objects 5 can be set as the expansion center.
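A sketch of this group expansion about the center of gravity (a simplification assuming a non-empty selection and the Obj fields used above):

    def expand_group(selected, scale):
        # Use the centroid of the selected objects as the common expansion
        # center and scale each object about it.
        cx = sum(o.x + o.w / 2 for o in selected) / len(selected)
        cy = sum(o.y + o.h / 2 for o in selected) / len(selected)
        for o in selected:
            o.x = cx - (cx - o.x) * scale
            o.y = cy - (cy - o.y) * scale
            o.w *= scale
            o.h *= scale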
In the above description, the operation of expanding the selection range is a tapping operation performed with the finger F2, but the present technology is not limited thereto. For example, the operation may be performed with a finger F different from the finger F2, may be a double-tapping operation, or may be a tapping operation performed with a plurality of fingers.
(1-6. Operation of Changing Effect)
Next, an operation of changing an effect of a target object in the information processing apparatus according to this embodiment will be described with reference to Fig. 10. Fig. 10 is a diagram illustrating an operation of changing an effect of an object in the information processing apparatus according to this embodiment.
When the object 5 displayed on the display screen is an image object, the operation recognizing unit 105 may recognize an operation of changing an effect of the object 5. For example, the operation recognizing unit 105 may recognize a tapping operation performed with a finger F3 different from the fingers F1 and F2 as an effect changing operation, and the display control unit 107 may change an effect of the object 5C when this tapping operation is recognized. A case in which the change in effect is a change in the frame of the object 5C will be described below. In state 12, the object 5C is selected as the target object at the position P1 of the finger F1. Here, when the position P2 of the finger F2 is detected and the user performs a tapping operation with the finger F3, an object 5C2 whose frame overlaps the object 5C1 is displayed (state 13). When the user performs a tapping operation with the finger F3 again, an object 5C3 with a frame different from that of the object 5C2 is displayed overlapping the object 5C1.
The change of the frame has been described above, but the present technology is not limited thereto. For example, the change in effect may be a change in the color tone of an image (monochrome, sepia, etc.), a change in the background, a change in the direction of light, a change in the overlap state among the objects 5, a change in contrast, or the like.
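One plausible realization, sketched under the assumption that repeated F3 taps step through a fixed effect list (the list itself and the cyclic order are invented for illustration):

    EFFECTS = ["no frame", "thin frame", "thick frame", "sepia", "monochrome"]

    class EffectState:
        def __init__(self):
            self.index = 0

        def on_f3_tap(self, obj):
            # Each recognized tapping operation with F3 advances the target
            # object to the next effect in the list.
            self.index = (self.index + 1) % len(EFFECTS)
            obj.effect = EFFECTS[self.index]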
(1-7. Rotating Operation)
Next, an operation of rotating a target object in the information processing apparatus according to this embodiment will be described with reference to Fig. 11. Fig. 11 is a diagram illustrating an operation of rotating a target object in the information processing apparatus according to this embodiment.
The case in which the second operation is an operation of changing the size of the target object has been described above. However, the second operation may be an operation of rotating the target object. When the operation recognizing unit 105 recognizes a change in the relative positions of the positions P1 and P2 in which the distance between the position P1 of the finger F1 and the position P2 of the finger F2 remains nearly constant, the display control unit 107 can change the display angle of the target object 5C selected with the finger F1, that is, rotate it.
For example, as illustrated in Fig. 11, when the user rotates the position P2 of the finger F2 from a position P2-1 to a position P2-2 about the position P1 of the finger F1, the object 5C selected with the finger F1 may be rotated clockwise from the state of an object 5C-1 to the state of an object 5C-2 (state 16).
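The rotation recognition can be sketched as follows; the distance-constancy tolerance is an assumed parameter, and wrap-around at plus or minus pi is ignored in this sketch.

    import math

    def rotation_angle(p1, p2_start, p2_end, dist_tol=0.1):
        # A rotating operation: the distance from P1 stays nearly constant
        # while the angle of P2 about P1 changes. Returns the display-angle
        # delta in radians, or None if the gesture is not a rotation.
        d0, d1 = math.dist(p1, p2_start), math.dist(p1, p2_end)
        if d0 == 0 or abs(d1 - d0) / d0 > dist_tol:
            return None                    # distance changed: size operation
        a0 = math.atan2(p2_start[1] - p1[1], p2_start[0] - p1[0])
        a1 = math.atan2(p2_end[1] - p1[1], p2_end[0] - p1[0])
        # In y-down screen coordinates, a positive delta corresponds to the
        # clockwise on-screen rotation illustrated in Fig. 11.
        return a1 - a0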
<2. Second Embodiment>
Next, the functional configuration of an information processing apparatus according to a second embodiment of the present disclosure will be described with reference to Fig. 12. Fig. 12 is a block diagram illustrating the functional configuration of the information processing apparatus according to the second embodiment of the present disclosure.
An information processing apparatus 100b according to the second embodiment of the present disclosure includes a detection information acquiring unit 104, an operation recognizing unit 105, and a display control unit 107 as its main units. The information processing apparatus 100b differs from the information processing apparatus 100a according to the first embodiment in that it does not include the display unit 101 and the detecting unit 103. Instead, the information processing apparatus 100b includes the detection information acquiring unit 104, which acquires detection information from an external detecting unit 103. Only the differences from the information processing apparatus 100a according to the first embodiment will be described below; the description of the common constituent elements will not be repeated here.
(Detection Information Acquiring Unit 104)
The detection information acquiring unit 104 has a function of acquiring detection information generated when the detecting unit 103 detects the position of an operation body. The detection information acquiring unit 104 can input the acquired detection information into the operation recognizing unit 105.
<3. Example of Hardware Configuration>
Next, an example of a hardware configuration of the information processing apparatus according to the first and second embodiments of the present disclosure will be described with reference to Fig. 13. Fig. 13 is a block diagram illustrating an example of the hardware configuration of the information processing apparatus according to the first and second embodiments of the present disclosure.
For example, the information processing apparatus 100 includes a GPS antenna 821, a GPS processing unit 823, a communication antenna 825, a communication processing unit 827, a geomagnetic sensor 829, an acceleration sensor 831, a gyro sensor 833, an atmospheric pressure sensor 835, an imaging unit 837, a central processing unit (CPU) 839, a read-only memory (ROM) 841, a random access memory (RAM) 843, an operation unit 847, a display unit 849, a decoder 851, a speaker 853, an encoder 855, a microphone 857, and a storage unit 859.
(GPS Antenna 821)
The GPS antenna 821 is an example of an antenna that receives signals from positioning satellites. The GPS antenna 821 can receive GPS signals from a plurality of GPS satellites and input the received GPS signals into the GPS processing unit 823.
(GPS Processing Unit 823)
The GPS processing unit 823 is an example of a calculating unit that calculates position information based on the signals received from the positioning satellites. The GPS processing unit 823 calculates the current position information based on the plurality of GPS signals input from the GPS antenna 821 and outputs the calculated position information. Specifically, the GPS processing unit 823 calculates the position of each GPS satellite based on the trajectory data of the GPS satellite and calculates the distance between each GPS satellite and the information processing apparatus 100 based on a difference between the transmission and reception times of the GPS signal. Then, the current three-dimensional position can be calculated based on the calculated position of each GPS satellite and the distance between each GPS satellite and the information processing apparatus 100. The trajectory data of the GPS satellites used here may be included in, for example, the GPS signals. Alternatively, the trajectory data may be acquired from an external server via the communication antenna 825.
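As a toy illustration of the distance step only (idealized: a real receiver also solves for its own clock bias as a fourth unknown):

    C = 299_792_458.0  # speed of light in m/s

    def satellite_distance(t_transmit, t_receive):
        # Distance to one GPS satellite from the signal's travel time.
        return C * (t_receive - t_transmit)

With at least four such distances and the satellite positions obtained from the trajectory data, the three-dimensional position (and the clock bias) follows by solving the resulting system of sphere equations.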
(Communication Antenna 825)
The communication antenna 825 is an antenna that has a function of receiving a communication signal via, for example, a portable communication network or a wireless local area network (LAN) communication network. The communication antenna 825 can supply the received signal to the communication processing unit 827.
(Communication Processing Unit 827)
The communication processing unit 827 has a function of performing various kinds of signal processing on the signal supplied from the communication antenna 825. The communication processing unit 827 can supply a digital signal generated from the supplied analog signal to the CPU 839.
(Geomagnetic Sensor 829)
The geomagnetic sensor 829 is a sensor that detects geomagnetism as a voltage value. The geomagnetic sensor 829 may be a triaxial geomagnetic sensor that detects each of the geomagnetisms in the X, Y, and Z axis directions. The geomagnetic sensor 829 can supply the detected geomagnetism data to the CPU 839.
(Acceleration Sensor 831)
The acceleration sensor 831 is a sensor that detects acceleration as a voltage value. The acceleration sensor 831 may be a triaxial acceleration sensor that detects each of the accelerations in the X, Y, and Z axis directions. The acceleration sensor 831 can supply the detected acceleration data to the CPU 839.
(Gyro Sensor 833)
The gyro sensor 833 is a measuring device that detects the angle or angular velocity of an object. The gyro sensor 833 may be a triaxial gyro sensor that detects changes in the rotation angle (angular velocity) around the X, Y, and Z axes as voltage values. The gyro sensor 833 can supply the detected angular velocity data to the CPU 839.
(Atmospheric Pressure Sensor 835)
The atmospheric pressure sensor 835 is a sensor that detects the surrounding pressure as a voltage value. The atmospheric pressure sensor 835 can detect the pressure at a predetermined sampling frequency and supply the detected pressure data to the CPU 839.
(Imaging Unit 837)
The imaging unit 837 has a function of photographing a still image or a moving image through a lens under the control of the CPU 839. The imaging unit 837 may store the photographed image in the storage unit 859.
(CPU 839)
The CPU 839 functions as an arithmetic device and a control device to control all of the processes in the information processing apparatus 100 in accordance with various kinds of programs. The CPU 839 may be a microprocessor. The CPU 839 can realize various functions in accordance with various kinds of programs.
(ROM 841 and RAM 843)
The ROM 841 can store programs, calculation parameters, or the like used by the CPU 839. The RAM 843 can temporarily store programs used in execution of the CPU 839, or parameters or the like appropriately changed in the execution.
(Operation Unit 847)
The operation unit 847 has a function of generating an input signal used for a user to perform a desired operation. The operation unit 847 may include an input unit, such as a touch sensor, a mouse, a keyboard, a button, a microphone, a switch, or a lever, with which a user inputs information, and an input control circuit that generates an input signal based on the user's input and outputs the input signal to the CPU 839.
(Display Unit 849)
The display unit 849 is an example of an output device and may be a display device such as a liquid crystal display (LCD) device or an organic light emitting diode (OLED) display device. The display unit 849 can supply information by displaying a screen for a user.
(Decoder 851 and Speaker 853)
The decoder 851 has a function of performing decoding, analog conversion, or the like on input data under the control of the CPU 839. The decoder 851 performs the decoding, the analog conversion, and the like on audio data input via, for example, the communication antenna 825 and the communication processing unit 827 and outputs an audio signal to the speaker 853. The speaker 853 can output audio based on the audio signal supplied from the decoder 851.
(Encoder 855 and Microphone 857)
The encoder 855 has a function of performing digital conversion, encoding, or the like on input data under the control of the CPU 839. The encoder 855 can perform the digital conversion, the encoding, and the like on an audio signal input from the microphone 857 and output the audio data. The microphone 857 can collect audio and output the audio as an audio signal.
(Storage Unit 859)
The storage unit 859 is a data storage device and may include a storage medium, a recording device that records data in a storage medium, a reading device that reads data from a storage medium, and a deleting device that deletes data recorded in a storage medium. Here, for example, a non-volatile memory such as a flash memory, a magnetoresistive random access memory (MRAM), a ferroelectric random access memory (FeRAM), a phase change random access memory (PRAM), or an electrically erasable programmable read-only memory (EEPROM), or a magnetic recording medium such as a hard disk drive (HDD) may be used as the storage medium.
The preferred embodiments of the present disclosure have been described in detail above with reference to the accompanying drawings, but the technical scope of the present disclosure is not limited thereto. It is apparent that those skilled in the art of the present disclosure can arrive at various modifications and alterations within the scope of the technical spirit described in the claims, and such modifications and alterations are, of course, construed to pertain to the technical scope of the present disclosure.
For example, in the above-described embodiments, a user first selects a target object with the finger F1, and then performs an operation on a display screen with the finger F2, but the present technology is not limited thereto. For example, a user may touch a blank portion (a portion in which an object is not displayed) on the display screen with the finger F2, and then may select a target object with the finger F1.
The configurations described below also pertain to the technical scope of the present disclosure.
(1) An information processing apparatus comprising: a display controller that controls a user interface to display a first object; and a detection unit that detects an input received at the user interface, wherein the display controller performs a predetermined operation corresponding to the displayed first object based on a relative relation of a first input detected at a first position at which the first object is displayed and a second input detected at a second position in which the first object is not displayed.
(2) The information processing apparatus of (1), further comprising: the user interface that is controlled by the display controller to display an object.
(3) The information processing apparatus of (1) to (2), wherein the information processing apparatus is one of a portable telephone, a personal computer, a video processing apparatus, a game console, a home appliance and a music reproducing apparatus.
(4) The information processing apparatus of (1) to (3), wherein the detection unit is a touch sensor disposed on a surface of the user interface that detects a touch input received at the user interface.
(5) The information processing apparatus of (1) to (4), wherein the display controller controls the user interface to display a plurality of objects including the first object.
(6) The information processing apparatus of (5), wherein the display controller controls the user interface to display the first object differently from the other plurality of objects based on the detected first input.
(7) The information processing apparatus of (1) to (6), wherein the predetermined operation is an operation of changing a size of the displayed first object.
(8) The information processing apparatus of (7), wherein the detection unit detects a change in position of the second input and the display controller controls the user interface to change the size of the displayed first object based on the detected change in position.
(9) The information processing apparatus of (7) and (8), wherein the detection unit detects a change in position of the second input from the second position to a third position that is a greater distance from the first position than the second position, and the display controller controls the user interface to increase the size of the displayed first object based on the detected change in position.
(10) The information processing apparatus of (9), wherein the display controller controls the user interface to increase the size of the displayed first object in a direction corresponding to the detected change in position.
(11) The information processing apparatus of (9) and (10), wherein the display controller controls the user interface to display additional functions corresponding to the first object when the size of the displayed first object exceeds a predetermined size.
(12) The information processing apparatus of (7) to (11), wherein the detection unit detects a change in position of the second input from the second position to a third position that is a shorter distance from the first position than the second position, and the display controller controls the user interface to decrease the size of the displayed first object based on the detected change in position.
(13) The information processing apparatus of (12), wherein the display controller controls the user interface to decrease the size of the displayed first object in a direction corresponding to the detected change in position.
(14) The information processing apparatus of (5) to (13), wherein the detection unit detects a plurality of individual inputs as the second input, and the display controller controls the user interface to display the first object and a sub-set of the plurality of objects differently from the remaining plurality of objects based on the plurality of individual inputs.
(15) The information processing apparatus of (14), wherein the detection unit detects a change in position of the second input from the second position to a third position that is a greater distance from the first position than the second position, and the display controller controls the user interface to increase the size of the displayed first object and the sub-set of the plurality of objects based on the detected change in position.
(16) The information processing apparatus of (1) to (15), wherein the detection unit detects, as the second input, a plurality of individual inputs, and the display controller controls the user interface to display a predetermined effect on the first object based on the plurality of individual inputs.
(17) The information processing apparatus of (1) to (16), wherein the predetermined operation is an operation of rotating the displayed first object.
(18) The information processing apparatus of (17), wherein the detection unit detects a change in position of the second input and the display controller controls the user interface to rotate the displayed first object in a direction corresponding to the detected change in position.
(19) An information processing method performed by an information processing apparatus, the method comprising: controlling a user interface to display a first object; detecting an input received at the user interface; and performing a predetermined operation corresponding to the displayed first object based on a first input detected at a first position at which the first object is displayed and a second input detected at a second position in which the object is not displayed.
(20) A non-transitory computer-readable medium including computer program instructions, which when executed by an information processing apparatus, cause the information processing apparatus to perform a method comprising: controlling a user interface to display a first object; detecting an input received at the user interface; and performing a predetermined operation corresponding to the displayed first object based on a first input detected at a first position at which the first object is displayed and a second input detected at a second position in which the object is not displayed.
100 Information processing apparatus
101 Display unit
103 Detecting unit
104 Detection information acquiring unit
105 Operation recognizing unit
107 Display control unit

Claims (20)

  1. An information processing apparatus comprising:
    a display controller that controls a user interface to display a first object; and
    a detection unit that detects an input received at the user interface, wherein
    the display controller performs a predetermined operation corresponding to the displayed first object based on a relative relation of a first input detected at a first position at which the first object is displayed and a second input detected at a second position in which the first object is not displayed.
  2. The information processing apparatus of claim 1, further comprising:
    the user interface that is controlled by the display controller to display an object.
  3. The information processing apparatus of claim 1, wherein
    the information processing apparatus is one of a portable telephone, a personal computer, a video processing apparatus, a game console, a home appliance and a music reproducing apparatus.
  4. The information processing apparatus of claim 1, wherein
    the detection unit is a touch sensor disposed on a surface of the user interface that detects a touch input received at the user interface.
  5. The information processing apparatus of claim 1, wherein
    the display controller controls the user interface to display a plurality of objects including the first object.
  6. The information processing apparatus of claim 5, wherein
    the display controller controls the user interface to display the first object differently from the other plurality of objects based on the detected first input.
  7. The information processing apparatus of claim 1, wherein
    the predetermined operation is an operation of changing a size of the displayed first object.
  8. The information processing apparatus of claim 7, wherein
    the detection unit detects a change in position of the second input and the display controller controls the user interface to change the size of the displayed first object based on the detected change in position.
  9. The information processing apparatus of claim 7, wherein
    the detection unit detects a change in position of the second input from the second position to a third position that is a greater distance from the first position than the second position, and the display controller controls the user interface to increase the size of the displayed first object based on the detected change in position.
  10. The information processing apparatus of claim 9, wherein
    the display controller controls the user interface to increase the size of the displayed first object in a direction corresponding to the detected change in position.
  11. The information processing apparatus of claim 9, wherein
    the display controller controls the user interface to display additional functions corresponding to the first object when the size of the displayed first object exceeds a predetermined size.
  12. The information processing apparatus of claim 7, wherein
    the detection unit detects a change in position of the second input from the second position to a third position that is a shorter distance from the first position than the second position, and the display controller controls the user interface to decrease the size of the displayed first object based on the detected change in position.
  13. The information processing apparatus of claim 12, wherein
    the display controller controls the user interface to decrease the size of the displayed first object in a direction corresponding to the detected change in position.
  14. The information processing apparatus of claim 5, wherein
    the detection unit detects a plurality of individual inputs as the second input, and the display controller controls the user interface to display the first object and a sub-set of the plurality of objects differently from the remaining plurality of objects based on the plurality of individual inputs.
  15. The information processing apparatus of claim 14, wherein
    the detection unit detects a change in position of the second input from the second position to a third position that is a greater distance from the first position than the second position, and the display controller controls the user interface to increase the size of the displayed first object and the sub-set of the plurality of objects based on the detected change in position.
  16. The information processing apparatus of claim 1, wherein
    the detection unit detects, as the second input, a plurality of individual inputs, and the display controller controls the user interface to display a predetermined effect on the first object based on the plurality of individual inputs.
  17. The information processing apparatus of claim 1, wherein
    the predetermined operation is an operation of rotating the displayed first object.
  18. The information processing apparatus of claim 17, wherein
    the detection unit detects a change in position of the second input and the display controller controls the user interface to rotate the displayed first object in a direction corresponding to the detected change in position.
  19. An information processing method performed by an information processing apparatus, the method comprising:
    controlling a user interface to display a first object;
    detecting an input received at the user interface; and
    performing a predetermined operation corresponding to the displayed first object based on a first input detected at a first position at which the first object is displayed and a second input detected at a second position in which the object is not displayed.
  20. A non-transitory computer-readable medium including computer program instructions, which when executed by an information processing apparatus, cause the information processing apparatus to perform a method comprising:
    controlling a user interface to display a first object;
    detecting an input received at the user interface; and
    performing a predetermined operation corresponding to the displayed first object based on a first input detected at a first position at which the first object is displayed and a second input detected at a second position in which the object is not displayed.





PCT/JP2012/005233 2011-09-01 2012-08-21 Information processing apparatus, information processing method, and program WO2013031134A1 (en)

Priority Applications (6)

Application Number Priority Date Filing Date Title
CN201280040926.2A CN103765362B (en) 2011-09-01 2012-08-21 Information processing equipment, information processing method and program
US14/237,361 US10140002B2 (en) 2011-09-01 2012-08-21 Information processing apparatus, information processing method, and program
KR1020147004212A KR20140068024A (en) 2011-09-01 2012-08-21 Information processing apparatus, information processing method, and program
BR112014004048A BR112014004048A2 (en) 2011-09-01 2012-08-21 apparatus and method of processing information, and non-transient computer readable media
EP20120827276 EP2751654A4 (en) 2011-09-01 2012-08-21 Information processing apparatus, information processing method, and program
RU2014106495/08A RU2014106495A (en) 2011-09-01 2012-08-21 INFORMATION PROCESSING DEVICE, INFORMATION PROCESSING METHOD AND PROGRAM

Applications Claiming Priority (2)

Application Number Priority Date Filing Date Title
JP2011191144A JP2013054470A (en) 2011-09-01 2011-09-01 Information processor, information processing method, and program
JP2011-191144 2011-09-01

Publications (1)

Publication Number Publication Date
WO2013031134A1 true WO2013031134A1 (en) 2013-03-07

Family

ID=47755665

Family Applications (1)

Application Number Title Priority Date Filing Date
PCT/JP2012/005233 WO2013031134A1 (en) 2011-09-01 2012-08-21 Information processing apparatus, information processing method, and program

Country Status (9)

Country Link
US (1) US10140002B2 (en)
EP (1) EP2751654A4 (en)
JP (1) JP2013054470A (en)
KR (1) KR20140068024A (en)
CN (1) CN103765362B (en)
BR (1) BR112014004048A2 (en)
RU (1) RU2014106495A (en)
TW (1) TW201319922A (en)
WO (1) WO2013031134A1 (en)


Families Citing this family (8)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US9792017B1 (en) 2011-07-12 2017-10-17 Domo, Inc. Automatic creation of drill paths
US9202297B1 (en) * 2011-07-12 2015-12-01 Domo, Inc. Dynamic expansion of data visualizations
JP6197559B2 (en) * 2013-10-10 2017-09-20 コニカミノルタ株式会社 Object operation system, object operation control program, and object operation control method
JP6466736B2 (en) * 2015-02-26 2019-02-06 株式会社コーエーテクモゲームス Information processing apparatus, information processing method, and program
JP6477096B2 (en) * 2015-03-20 2019-03-06 ヤマハ株式会社 Input device and sound synthesizer
US9652125B2 (en) 2015-06-18 2017-05-16 Apple Inc. Device, method, and graphical user interface for navigating media content
JP2017027422A (en) * 2015-07-24 2017-02-02 アルパイン株式会社 Display device and display processing method
US11922006B2 (en) 2018-06-03 2024-03-05 Apple Inc. Media control for screensavers on an electronic device


Family Cites Families (30)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US5481710A (en) * 1992-09-16 1996-01-02 International Business Machines Corporation Method of and system for providing application programs with an undo/redo function
US7138983B2 (en) 2000-01-31 2006-11-21 Canon Kabushiki Kaisha Method and apparatus for detecting and interpreting path of designated position
WO2002101534A1 (en) * 2001-06-12 2002-12-19 Idelix Software Inc. Graphical user interface with zoom for detail-in-context presentations
CA2393887A1 (en) * 2002-07-17 2004-01-17 Idelix Software Inc. Enhancements to user interface for detail-in-context data presentation
JP3874737B2 (en) * 2003-03-31 2007-01-31 株式会社東芝 Display device
US7743348B2 (en) 2004-06-30 2010-06-22 Microsoft Corporation Using physical objects to adjust attributes of an interactive display application
JP2008070968A (en) 2006-09-12 2008-03-27 Funai Electric Co Ltd Display processor
WO2008138046A1 (en) 2007-05-11 2008-11-20 Rpo Pty Limited Double touch inputs
US7949954B1 (en) * 2007-08-17 2011-05-24 Trading Technologies International, Inc. Dynamic functionality based on window characteristics
US8130211B2 (en) * 2007-09-24 2012-03-06 Microsoft Corporation One-touch rotation of virtual objects in virtual workspace
US8504945B2 (en) 2008-02-01 2013-08-06 Gabriel Jakobson Method and system for associating content with map zoom function
US20090207142A1 (en) * 2008-02-20 2009-08-20 Nokia Corporation Apparatus, method, computer program and user interface for enabling user input
JP2009271689A (en) * 2008-05-07 2009-11-19 Seiko Epson Corp Display device and display method for the same
US8400477B1 (en) * 2008-05-20 2013-03-19 Google Inc. Object resizing
JP5161690B2 (en) * 2008-07-31 2013-03-13 キヤノン株式会社 Information processing apparatus and control method thereof
JP5108747B2 (en) * 2008-12-26 2012-12-26 富士フイルム株式会社 Information display apparatus, method and program
JP2011022964A (en) * 2009-07-21 2011-02-03 Panasonic Corp Touch panel and input display system
US8429565B2 (en) * 2009-08-25 2013-04-23 Google Inc. Direct manipulation gestures
JP2011053770A (en) 2009-08-31 2011-03-17 Nifty Corp Information processing apparatus and input processing method
US8766928B2 (en) * 2009-09-25 2014-07-01 Apple Inc. Device, method, and graphical user interface for manipulating user interface objects
US8619100B2 (en) 2009-09-25 2013-12-31 Apple Inc. Device, method, and graphical user interface for touch-based gestural input on an electronic canvas
KR101624993B1 (en) * 2009-10-01 2016-06-07 엘지전자 주식회사 Mobile terminal and control method thereof
US8347238B2 (en) * 2009-12-16 2013-01-01 Apple Inc. Device, method, and graphical user interface for managing user interface content and user interface elements by dynamic snapping of user interface elements to alignment guides
US8209630B2 (en) * 2010-01-26 2012-06-26 Apple Inc. Device, method, and graphical user interface for resizing user interface content
US8612884B2 (en) * 2010-01-26 2013-12-17 Apple Inc. Device, method, and graphical user interface for resizing objects
US8479117B2 (en) * 2010-06-04 2013-07-02 Lenovo (Singapore) Pte. Ltd. Intelligent window sizing for graphical user interfaces
US9207096B2 (en) * 2011-06-09 2015-12-08 Blackberry Limited Map magnifier
US9146660B2 (en) * 2011-08-22 2015-09-29 Trimble Navigation Limited Multi-function affine tool for computer-aided design
US8176435B1 (en) * 2011-09-08 2012-05-08 Google Inc. Pinch to adjust
US20130117711A1 (en) * 2011-11-05 2013-05-09 International Business Machines Corporation Resize handle activation for resizable portions of a user interface

Patent Citations (5)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
JP2001290585A (en) * 2000-01-31 2001-10-19 Canon Inc Position information processor, position information processing method and program, and operation device and its method and program
JP2009525538A (en) * 2006-01-30 2009-07-09 アップル インコーポレイテッド Gesture using multi-point sensing device
JP2007279638A (en) * 2006-04-12 2007-10-25 Xanavi Informatics Corp Navigation device
JP2008209915A (en) * 2008-01-29 2008-09-11 Fujitsu Ten Ltd Display device
JP2010176330A (en) * 2009-01-28 2010-08-12 Sony Corp Information processing apparatus and display control method

Non-Patent Citations (1)

* Cited by examiner, † Cited by third party
Title
See also references of EP2751654A4 *

Cited By (2)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
EP3256935A4 (en) * 2015-02-13 2018-01-17 Samsung Electronics Co., Ltd. Apparatus and method for multi-touch input
US9965173B2 (en) 2015-02-13 2018-05-08 Samsung Electronics Co., Ltd. Apparatus and method for precise multi-touch input

Also Published As

Publication number Publication date
BR112014004048A2 (en) 2017-03-07
US10140002B2 (en) 2018-11-27
US20140189581A1 (en) 2014-07-03
CN103765362A (en) 2014-04-30
TW201319922A (en) 2013-05-16
EP2751654A4 (en) 2015-04-08
EP2751654A1 (en) 2014-07-09
JP2013054470A (en) 2013-03-21
KR20140068024A (en) 2014-06-05
CN103765362B (en) 2018-06-05
RU2014106495A (en) 2015-08-27

Similar Documents

Publication Publication Date Title
US10140002B2 (en) Information processing apparatus, information processing method, and program
US8988342B2 (en) Display apparatus, remote controlling apparatus and control method thereof
JP6421670B2 (en) Display control method, display control program, and information processing apparatus
US9274619B2 (en) Input apparatus, input method, and input program
JP5935267B2 (en) Information processing apparatus, information processing method, and program
WO2015025345A1 (en) Information display device, information display method, and information display program
WO2013011648A1 (en) Information processing apparatus, information processing method, and program
US20150186004A1 (en) Multimode gesture processing
JP2008146243A (en) Information processor, information processing method and program
CN103513895A (en) Remote control apparatus and control method thereof
JP6534011B2 (en) INFORMATION PROCESSING APPARATUS, INFORMATION PROCESSING PROGRAM, INFORMATION PROCESSING SYSTEM, AND INFORMATION PROCESSING METHOD
JP6519075B2 (en) INFORMATION PROCESSING APPARATUS, INFORMATION PROCESSING PROGRAM, INFORMATION PROCESSING SYSTEM, AND INFORMATION PROCESSING METHOD
CN204945943U (en) For providing the remote control equipment of remote control signal for external display device
US10719147B2 (en) Display apparatus and control method thereof
JP4879933B2 (en) Screen display device, screen display method and program
US12032754B2 (en) Information processing apparatus, information processing method, and non-transitory computer readable medium
JP2013025464A (en) Information processor, information processing method and program
JP6514416B2 (en) IMAGE DISPLAY DEVICE, IMAGE DISPLAY METHOD, AND IMAGE DISPLAY PROGRAM
JP2016118947A (en) Spatial handwriting input system using angle-adjustable virtual plane
JP2016110349A (en) Information processing device, control method therefor, and program

Legal Events

Date Code Title Description
121 Ep: the epo has been informed by wipo that ep was designated in this application

Ref document number: 12827276

Country of ref document: EP

Kind code of ref document: A1

WWE Wipo information: entry into national phase

Ref document number: 14237361

Country of ref document: US

ENP Entry into the national phase

Ref document number: 20147004212

Country of ref document: KR

Kind code of ref document: A

ENP Entry into the national phase

Ref document number: 2014106495

Country of ref document: RU

Kind code of ref document: A

NENP Non-entry into the national phase

Ref country code: DE

REG Reference to national code

Ref country code: BR

Ref legal event code: B01A

Ref document number: 112014004048

Country of ref document: BR

ENP Entry into the national phase

Ref document number: 112014004048

Country of ref document: BR

Kind code of ref document: A2

Effective date: 20140221