WO2014054367A1 - Information processing apparatus, information processing method, and recording medium - Google Patents
- Publication number: WO2014054367A1 (PCT/JP2013/073648)
- Authority
- WO
- WIPO (PCT)
- Prior art keywords
- operation target
- unit
- target image
- display
- input unit
- Prior art date
Classifications
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06F—ELECTRIC DIGITAL DATA PROCESSING
- G06F1/00—Details not covered by groups G06F3/00 - G06F13/00 and G06F21/00
- G06F1/16—Constructional details or arrangements
- G06F1/1613—Constructional details or arrangements for portable computers
- G06F1/1615—Constructional details or arrangements for portable computers with several enclosures having relative motions, each enclosure supporting at least one I/O or computing function
- G06F1/1616—Constructional details or arrangements for portable computers with several enclosures having relative motions, each enclosure supporting at least one I/O or computing function with folding flat displays, e.g. laptop computers or notebooks having a clamshell configuration, with body parts pivoting to an open position around an axis parallel to the plane they define in closed position
- G06F1/162—Constructional details or arrangements for portable computers with several enclosures having relative motions, each enclosure supporting at least one I/O or computing function with folding flat displays, e.g. laptop computers or notebooks having a clamshell configuration, with body parts pivoting to an open position around an axis parallel to the plane they define in closed position changing, e.g. reversing, the face orientation of the screen with a two degrees of freedom mechanism, e.g. for folding into tablet PC like position or orienting towards the direction opposite to the user to show to a second user
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06F—ELECTRIC DIGITAL DATA PROCESSING
- G06F1/00—Details not covered by groups G06F3/00 - G06F13/00 and G06F21/00
- G06F1/16—Constructional details or arrangements
- G06F1/1613—Constructional details or arrangements for portable computers
- G06F1/1626—Constructional details or arrangements for portable computers with a single-body enclosure integrating a flat display, e.g. Personal Digital Assistants [PDAs]
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06F—ELECTRIC DIGITAL DATA PROCESSING
- G06F1/00—Details not covered by groups G06F3/00 - G06F13/00 and G06F21/00
- G06F1/16—Constructional details or arrangements
- G06F1/1613—Constructional details or arrangements for portable computers
- G06F1/1633—Constructional details or arrangements of portable computers not specific to the type of enclosures covered by groups G06F1/1615 - G06F1/1626
- G06F1/1684—Constructional details or arrangements related to integrated I/O peripherals not covered by groups G06F1/1635 - G06F1/1675
- G06F1/169—Constructional details or arrangements related to integrated I/O peripherals not covered by groups G06F1/1635 - G06F1/1675 the I/O peripheral being an integrated pointing device, e.g. trackball in the palm rest area, mini-joystick integrated between keyboard keys, touch pads or touch stripes
- G06F1/1692—Constructional details or arrangements related to integrated I/O peripherals not covered by groups G06F1/1635 - G06F1/1675 the I/O peripheral being an integrated pointing device, e.g. trackball in the palm rest area, mini-joystick integrated between keyboard keys, touch pads or touch stripes the I/O peripheral being a secondary touch screen used as control interface, e.g. virtual buttons or sliders
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06F—ELECTRIC DIGITAL DATA PROCESSING
- G06F3/00—Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
- G06F3/01—Input arrangements or combined input and output arrangements for interaction between user and computer
- G06F3/048—Interaction techniques based on graphical user interfaces [GUI]
- G06F3/0484—Interaction techniques based on graphical user interfaces [GUI] for the control of specific functions or operations, e.g. selecting or manipulating an object, an image or a displayed text element, setting a parameter value or selecting a range
- G06F3/04842—Selection of displayed objects or displayed text elements
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06F—ELECTRIC DIGITAL DATA PROCESSING
- G06F3/00—Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
- G06F3/01—Input arrangements or combined input and output arrangements for interaction between user and computer
- G06F3/048—Interaction techniques based on graphical user interfaces [GUI]
- G06F3/0487—Interaction techniques based on graphical user interfaces [GUI] using specific features provided by the input device, e.g. functions controlled by the rotation of a mouse with dual sensing arrangements, or of the nature of the input device, e.g. tap gestures based on pressure sensed by a digitiser
- G06F3/0488—Interaction techniques based on graphical user interfaces [GUI] using specific features provided by the input device, e.g. functions controlled by the rotation of a mouse with dual sensing arrangements, or of the nature of the input device, e.g. tap gestures based on pressure sensed by a digitiser using a touch-screen or digitiser, e.g. input of commands through traced gestures
- G06F3/04886—Interaction techniques based on graphical user interfaces [GUI] using specific features provided by the input device, e.g. functions controlled by the rotation of a mouse with dual sensing arrangements, or of the nature of the input device, e.g. tap gestures based on pressure sensed by a digitiser using a touch-screen or digitiser, e.g. input of commands through traced gestures by partitioning the display area of the touch-screen or the surface of the digitising tablet into independently controllable areas, e.g. virtual keyboards or menus
-
- H—ELECTRICITY
- H04—ELECTRIC COMMUNICATION TECHNIQUE
- H04M—TELEPHONIC COMMUNICATION
- H04M2250/00—Details of telephonic subscriber devices
- H04M2250/22—Details of telephonic subscriber devices including a touch pad, a touch sensor or a touch detector
Definitions
- the present invention relates to an information processing apparatus, an information processing method, and a recording medium.
- Information processing devices such as mobile terminals, game machines, and tablet PCs (Personal Computers) are increasingly equipped with a touch panel as an input unit.
- the touch panel specifies the position where the user touches it. If the specified position is within the display area of an image such as an icon or button, the information processing apparatus executes a predetermined program associated with that image. Therefore, the user can perform a desired input simply by touching images such as icons and buttons displayed on the screen. That is, an information processing apparatus including a touch panel offers the user input by an easy operation.
- Patent Document 1 discloses a touch sensor device that includes a touch panel and accepts input when the user touches the screen of the touch panel.
- an operation target image including icons, buttons, a toolbar, and the like and a wallpaper are displayed on the touch panel in an overlapping manner.
- by operating (touching, tapping, etc.) these operation target images, the user inputs instructions for starting and executing applications.
- the wallpaper itself may be associated with a program or the like as an operation target image.
- Some programs are executed by this wallpaper operation, and commands, instructions, data, etc. are input.
- conventionally, to operate such an image, an extra operation was necessary, such as moving the display positions of these operation target images or hiding some of the operation target images.
- the present invention has been made in view of the background described above, and aims to provide an information processing apparatus, an information processing method, and a recording medium that allow easy operation input even in a situation in which operation target images are displayed in an overlapping manner.
- an information processing apparatus according to the present invention includes: a display unit arranged on one surface of a housing; a display control unit that controls the display unit to display operation target images arranged in a plurality of layers; a first input unit that is disposed on the display unit and detects a touch position on the display unit; a second input unit that is disposed on the facing surface opposite the one surface of the housing and detects a touch position on the display unit; a specifying unit that specifies, among the operation target images displayed at the touch position detected by the first input unit, the operation target image arranged on the uppermost layer, and specifies, among the operation target images displayed at the touch position detected by the second input unit, the operation target image arranged on the lowest layer; and an executing unit that executes an action corresponding to the operation target image specified by the specifying unit.
- An information processing method according to the present invention includes: a display control step of controlling a display unit arranged on one surface of a housing to display operation target images arranged in a plurality of layers; a specifying step of specifying, among the operation target images displayed at the touch position on the display unit detected by a first input unit arranged on the display unit, the operation target image arranged on the uppermost layer, and specifying, among the operation target images displayed at the touch position on the display unit detected by a second input unit arranged on the facing surface opposite the one surface of the housing, the operation target image arranged on the lowest layer; and an execution step of executing an action corresponding to the specified operation target image.
- a recording medium according to the present invention is a computer-readable recording medium recording a program that causes a computer, which includes a display unit arranged on one surface of a housing, a first input unit that is disposed on the display unit and detects a touch position on the display unit, and a second input unit that is disposed on the facing surface opposite the one surface of the housing and detects a touch position on the display unit, to function as: a display control unit that controls the display unit to display operation target images arranged in a plurality of layers; a specifying unit that specifies, among the operation target images displayed at the touch position detected by the first input unit, the operation target image arranged on the uppermost layer, and specifies, among the operation target images displayed at the touch position detected by the second input unit, the operation target image arranged on the lowest layer; and an execution unit that executes an action corresponding to the operation target image specified by the specifying unit.
- the information processing apparatus is a mobile terminal 10 that includes a first input unit 110A on the front surface of its housing and a second input unit 110B on the back surface.
- the surface of the casing of the mobile terminal 10 on which the display unit 100 is disposed is referred to as the front surface, and the surface facing the front surface is referred to as the back surface.
- the mobile terminal 10 includes a display unit 100, a first input unit (front input unit) 110A, a second input unit (back input unit) 110B, a control unit 120, and a storage unit 130.
- the display unit 100 includes a display screen such as an LCD (Liquid Crystal Display) or an organic EL (Electro Luminescence) display, and is disposed on the front surface of the casing of the mobile terminal 10. Under the control of the control unit 120, the display unit 100 displays images such as wallpapers, windows, icons, and buttons on the "layers" (described later) assigned to the respective images.
- the first input unit (front input unit) 110A includes a touch sensor (touch panel) that detects a touched position.
- the first input unit 110A is disposed on the display unit 100, on the front surface of the casing of the mobile terminal 10.
- the second input unit (rear input unit) 110B includes a touch sensor that detects a touched position, and is disposed on the rear surface of the casing of the mobile terminal 10 (the surface facing the first input unit 110A).
- when the touch sensor detects contact by a finger, a pen, or the like, the first input unit 110A or the second input unit 110B transmits a signal containing information on the contact position to the control unit 120.
- the control unit 120 includes a CPU (Central Processing Unit), a ROM (Read Only Memory) storing a basic operation program, a RAM (Random Access Memory) serving as a work area, and the like.
- the control unit 120 controls the operation of each unit according to a program stored in the storage unit 130, and executes various processes including an operation image specifying process described later.
- the storage unit 130 includes a storage device such as a flash memory and a hard disk, and includes an operation image specifying program 131, an image information storage unit 132, and the like.
- when contact (touch) is detected at the first input unit 110A or the second input unit 110B, the operation image specifying program 131 specifies the operation target image whose display area includes the position at which the contact occurred (the touch position).
- the operation image specifying program 131 is also a program for determining whether or not to input the predetermined instruction assigned to the operation target image. This determination is based on the combination of the "layer" (described later) assigned to the operation target image and information indicating whether the contact was detected by the first input unit 110A or the second input unit 110B.
- a layer is information indicating each level constituting the display hierarchy of operation target images (that is, information indicating the depth of an operation target image).
- FIG. 4 shows a layer assigned to the wallpaper 200 and the plurality of icons 210 shown in FIG. 3 and a display hierarchy composed of the layers.
- a predetermined instruction input is also assigned to the wallpaper 200.
- the wallpaper 200 and the icon 210 have overlapping display areas, and “layer 1” is assigned to the wallpaper 200 and “layer 2” is assigned to the icon 210.
- a layer with a larger number is an upper layer than a layer with a smaller number. Therefore, the icon 210, to which layer 2 is assigned, is displayed on the layer above the wallpaper 200, to which layer 1 is assigned.
- the display hierarchy of the image can be logically defined.
- the image information storage unit 132 shown in FIG. 2 stores an image information table 1321 shown in FIG.
- the image information table 1321 includes an operation target image ID column, a material ID column, a layer column, a display area (X-axis direction) column, a display area (Y-axis direction) column, and an action ID column.
- the operation target image ID column stores an ID for identifying the operation target image displayed on the screen.
- the material ID column stores an ID for identifying an image that is a material (icon, button, etc.) of the operation target image.
- the layer column stores a number for identifying a layer assigned to the operation target image.
- the display area (X-axis direction) column and the display area (Y-axis direction) column store information indicating an area where the operation target image is displayed on the screen. Specifically, the display area (X-axis direction) column stores coordinates indicating both ends of the display area of the operation target image on the screen in the X-axis direction. The display area (Y-axis direction) column stores coordinates indicating both ends of the display area of the operation target image on the screen in the Y-axis direction.
- the action ID column stores an ID for identifying an input of a predetermined instruction assigned to each operation target image.
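The table structure described above can be sketched as a simple list of records. The operation target image IDs, layers, and action IDs below follow the worked example later in the document (wallpaper 200 = ID 10001, layer 1, action "AA"; icon 210 = ID 10002, layer 2, action "DD"); the display-area coordinates are illustrative assumptions, not values from the source.

```python
# Minimal sketch of the image information table 1321.
# IDs, layers, and action IDs follow the worked example in the text;
# the display-area coordinates are illustrative assumptions.
IMAGE_INFO_TABLE = [
    {
        "operation_target_image_id": 10001,  # wallpaper 200
        "material_id": "wallpaper",
        "layer": 1,                          # lowest layer
        "display_area_x": (0, 480),          # coordinates of both ends, X axis
        "display_area_y": (0, 800),          # coordinates of both ends, Y axis
        "action_id": "AA",
    },
    {
        "operation_target_image_id": 10002,  # icon 210
        "material_id": "icon",
        "layer": 2,                          # upper layer
        "display_area_x": (100, 200),
        "display_area_y": (100, 200),
        "action_id": "DD",
    },
]

def images_at(table, x, y):
    """Return the rows whose display area contains the point (x, y)."""
    return [row for row in table
            if row["display_area_x"][0] <= x <= row["display_area_x"][1]
            and row["display_area_y"][0] <= y <= row["display_area_y"][1]]
```

With these assumed coordinates, a touch at (150, 150) falls inside both display areas, while a touch at (250, 350) falls only inside the wallpaper's, mirroring the worked example.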
- the control unit 120 refers to the image information table 1321 shown in FIG. 5, arranges each operation target image according to its display area and layer (1 or 2) as shown in FIG. 4, generates a display screen, and displays it on the display unit 100. In an area where the display areas of operation target images arranged on different layers overlap, such as the wallpaper 200 and the icon 210 shown in FIG. 3, the control unit 120 displays the operation target image arranged on the upper layer in front, as seen by the user (when the operation target image is not subjected to transparency processing or the like).
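The layer-ordered compositing just described can be sketched as a sort: images are drawn in ascending layer order, so the image on the upper layer is painted last and appears in front where display areas overlap. The rows below are illustrative assumptions.

```python
# Sketch of layer-ordered compositing: draw in ascending layer order so
# the upper-layer image is painted last and appears in front.
# The rows below are illustrative assumptions.
rows = [
    {"material_id": "icon", "layer": 2},
    {"material_id": "wallpaper", "layer": 1},
]

draw_order = [r["material_id"] for r in sorted(rows, key=lambda r: r["layer"])]
# the wallpaper is drawn first, then the icon on top of it
```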
- the user inputs a desired instruction by operating (touching, tapping, etc.) the displayed operation target image.
- even when operation target images overlap, the uppermost operation target image can be operated from the first input unit 110A facing the user, and the lowermost operation target image can be operated from the second input unit 110B arranged on the back surface.
- the user can thus operate operation target images from both the front side and the back side, so even operation target images that overlap each other can be operated easily.
- a mechanism for enabling such an operation will be described with reference to FIG.
- the user operates (touches, taps, etc.) the operation target image displayed on the display unit 100 from the front side or the back side. The first input unit 110A or the second input unit 110B then transmits a touch detection signal, including information identifying the transmission source, to the control unit 120.
- upon receiving the touch detection signal, the control unit 120 starts the operation image specifying process shown in FIG. 6. First, the control unit 120 reads the information identifying the transmission source from the received touch detection signal and determines whether the touch detection signal was transmitted from the first input unit 110A (step S100). When it determines that the signal was transmitted from the first input unit 110A (step S100; Yes), the control unit 120 specifies, from the touch detection signal, the coordinates of the touched position on the display screen (step S110a). The correspondence between coordinates on the first input unit 110A and coordinates on the display unit 100 is set when the mobile terminal 10 is manufactured.
- next, the control unit 120 refers to the image information table 1321 illustrated in FIG. 5 and specifies the operation target images whose display areas include the specified coordinates (step S120a). Then, from the specified operation target images, the control unit 120 acquires the operation target image ID of the operation target image arranged in the uppermost layer (the operation target image disposed closest to the user) (step S130a).
- next, the control unit 120 determines whether the operation target image ID has been acquired (step S140). If it determines that the operation target image ID has been acquired (step S140; Yes), it acquires the action ID corresponding to the operation target image ID (step S150). Then, the control unit 120 executes the action specified by the acquired action ID, for example, starting a corresponding program. If it determines that there is no operation target image whose display area includes the coordinates of the touched position (step S140; No), the control unit 120 ends the operation image specifying process.
- when it is determined in step S100 that the touch detection signal was transmitted from the second input unit 110B (step S100; No), the control unit 120 specifies, from the touch detection signal, the coordinates of the touched position on the display screen (step S110b). The correspondence between coordinates on the second input unit 110B and coordinates on the display unit 100 is set when the mobile terminal 10 is manufactured.
- next, the control unit 120 refers to the image information table 1321 and specifies the operation target images whose display areas include the specified coordinates (step S120b). Then, from the specified operation target images, the control unit 120 acquires the operation target image ID of the operation target image arranged in the lowest layer (the operation target image positioned farthest from the user) (step S130b). Next, the control unit 120 determines whether the operation target image ID has been acquired (step S140). If it determines that the operation target image ID has been acquired (step S140; Yes), it acquires the action ID corresponding to the operation target image ID (step S150). Then, the control unit 120 executes the action specified by the acquired action ID, for example, starting the program corresponding to the operated operation target image.
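The whole specifying flow (steps S100 through S150) can be sketched compactly: a front-side touch selects the operation target image on the uppermost layer at the touch position, a back-side touch selects the one on the lowest layer. The table values below are illustrative assumptions consistent with the worked example (wallpaper 200 = action "AA", icon 210 = action "DD").

```python
# Sketch of the operation image specifying process (steps S100-S150).
# A front-side touch (first input unit 110A) selects the operation target
# image on the uppermost layer; a back-side touch (second input unit 110B)
# selects the one on the lowest layer. Table values are illustrative.
TABLE = [
    {"id": 10001, "layer": 1, "x": (0, 480), "y": (0, 800), "action": "AA"},   # wallpaper 200
    {"id": 10002, "layer": 2, "x": (100, 200), "y": (100, 200), "action": "DD"},  # icon 210
]

def specify_action(source, x, y, table=TABLE):
    """Return the action ID for a touch at (x, y), or None if no image is hit."""
    # Steps S110a/S110b and S120a/S120b: find the operation target images
    # whose display area contains the touch position.
    hits = [r for r in table
            if r["x"][0] <= x <= r["x"][1] and r["y"][0] <= y <= r["y"][1]]
    if not hits:  # step S140; No
        return None
    # Steps S130a/S130b: uppermost layer for the front input unit,
    # lowest layer for the back input unit.
    if source == "front":
        target = max(hits, key=lambda r: r["layer"])
    else:
        target = min(hits, key=lambda r: r["layer"])
    return target["action"]  # steps S140-S150
```

Under these assumptions, touching (150, 150) from the front selects the icon's action "DD", while the same position touched from the back selects the wallpaper's action "AA".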
- in this way, an operation target image arranged in front can be operated by touching it from the front side.
- an operation target image that is arranged on the back side and cannot be touched from the front side can be operated by touching from the back side. As a result, the user can easily perform an operation input.
- a wallpaper 200 and an icon 210 are displayed on the display screen of the display unit 100 as operation target images, and a two-level display hierarchy is configured by the wallpaper 200 and the icon 210.
- the operation of the operation image specifying process will be described more specifically using an example.
- upon receiving the touch detection signal from the first input unit 110A, the control unit 120 determines whether the touch detection signal was transmitted from the first input unit 110A (step S100 in FIG. 6) and determines that it was (step S100; Yes).
- the control unit 120 specifies the coordinates of the touched position (step S110a). It is assumed that the coordinates specified here are coordinates (150, 150).
- the control unit 120 refers to the image information table 1321 shown in FIG. 5 and specifies an operation target image that includes the coordinates (150, 150) in the display area (step S120a).
- the icon 210 (operation target image ID “10002”) and the wallpaper 200 (operation target image ID “10001”) are specified as the operation target images.
- the control unit 120 acquires the operation target image ID "10002" associated with the icon 210, which is arranged in the upper layer (in front) (step S130a).
- next, the control unit 120 determines whether the operation target image ID has been acquired (step S140), determines that it has (step S140; Yes), and acquires the action ID "DD" corresponding to the operation target image ID "10002" (step S150). Then, the control unit 120 performs the action specified by the action ID "DD", for example, starting the program corresponding to the operated icon 210.
- next, suppose the user wants to operate (touch) the wallpaper 200 displayed on the display unit 100.
- in this case, the user touches a portion of the wallpaper 200 exposed between the icons 210 via the first input unit 110A disposed on the front side of the mobile terminal 10.
- alternatively, the user touches an arbitrary position on the wallpaper 200 from the back side via the second input unit 110B.
- the control unit 120 specifies the coordinates of the touched position (step S110a). It is assumed that the coordinates specified here are coordinates (250, 350).
- the control unit 120 refers to the image information table 1321 shown in FIG. 5 and specifies an operation target image including coordinates (250, 350) in the display area (step S120a). In this example, only the wallpaper 200 (operation target image ID “10001”) is specified as the operation target image. Therefore, the control unit 120 acquires the operation target image ID “10001” associated with the wallpaper 200 (step S130a).
- next, the control unit 120 determines whether the operation target image ID has been acquired (step S140), determines that it has (step S140; Yes), and acquires the action ID "AA" corresponding to the operation target image ID "10001" (step S150). Then, the control unit 120 performs the action specified by the acquired action ID "AA", for example, starting the program corresponding to the operated wallpaper 200.
- when the touch is made from the back side, the control unit 120 specifies the coordinates of the touched position (step S110b) and specifies the operation target images whose display areas include the specified coordinates (step S120b).
- the control unit 120 specifies the operation target image arranged in the lowest layer, that is, the wallpaper 200 with layer "1", and acquires the operation target image ID "10001" associated with the wallpaper 200 (step S130b).
- next, the control unit 120 determines whether the operation target image ID has been acquired (step S140), determines that it has (step S140; Yes), and acquires the action ID "AA" corresponding to the operation target image ID "10001" (step S150). The control unit 120 then performs the action specified by the acquired action ID "AA", for example, starting the program corresponding to the operated wallpaper 200.
- as described above, according to the present embodiment, the user can operate an operation target image displayed in front by touching it from the front side using the first input unit 110A. Further, an operation target image that lies below another operation target image and is difficult to touch from the front side can be operated by touching it from the back side using the second input unit 110B. Therefore, various inputs can be performed with a more intuitive and easier operation than before.
- the mechanism of the mobile terminal 10 according to the present invention has been described using the operation target images arranged in the two display layers as an example.
- in practice, however, the display hierarchy often has three or more levels. Therefore, in the present embodiment, the mechanism of the mobile terminal 10 according to the present invention is described using operation target images arranged in three or more display layers as an example.
- the mobile terminal 10 displays a wallpaper 200, an icon 210, and an application-specific input screen 220 as operation target images on the display screen of the display unit 100.
- these operation target images are displayed with the wallpaper 200 at the back, the icon 210 in the middle, and the application-specific input screen 220 in the foreground.
- the configuration of the mobile terminal 10 according to the present embodiment is the same as the configuration of the mobile terminal 10 according to the first embodiment.
- the control unit 120 assigns layer 1 to the wallpaper 200, layer 2 to the icon 210, and layer 3 to the application-specific input screen 220, and displays them at coordinates set in advance.
- the user inputs a desired instruction by operating the wallpaper 200, the icon 210, and the application-specific input screen 220, which are displayed operation target images.
- upon receiving the touch detection signal from the first input unit 110A, the control unit 120 determines whether the touch detection signal was transmitted from the first input unit 110A (step S100 in FIG. 6) and determines that it was (step S100; Yes).
- the control unit 120 specifies the coordinates of the touched position (step S110a), and specifies the operation target image including the specified coordinates in the display area (step S120a).
- the operation target images specified at this time are either the wallpaper 200, the icon 210, and the application-specific input screen 220, or the application-specific input screen 220 and the wallpaper 200, depending on the touch position.
- since the layer of the application-specific input screen 220 is "3" and that of the wallpaper 200 is "1", the control unit 120 acquires the operation target image ID associated with the application-specific input screen 220, which has the largest layer number (is arranged foremost) (step S130a).
- the control unit 120 determines whether the operation target image ID has been acquired (step S140). If it determines that the operation target image ID has been acquired (step S140; Yes), it acquires the action ID corresponding to the operation target image ID (step S150). Then, the control unit 120 performs the action specified by the acquired action ID, for example, starting the program corresponding to the operated application-specific input screen 220.
- when the touch is made from the back side, the control unit 120 specifies the coordinates of the touched position (step S110b) and specifies the operation target images whose display areas include the specified coordinates (step S120b).
- the operation target images specified at this time are only the wallpaper 200; the icon 210 and the wallpaper 200; or the application-specific input screen 220, the icon 210, and the wallpaper 200, depending on the touch position.
- the control unit 120 specifies the operation target image arranged in the lowest layer, that is, the wallpaper 200 whose layer is "1", and acquires its operation target image ID (step S130b).
- the control unit 120 determines whether the operation target image ID has been acquired (step S140). If it determines that the operation target image ID has been acquired (step S140; Yes), it acquires the action ID associated with the acquired operation target image ID (step S150). Then, the control unit 120 performs the action corresponding to the acquired action ID, for example, starting the program corresponding to the operated wallpaper 200.
- as described above, even when operation target images are arranged in three or more layers, the user can operate the operation target image displayed foremost by touching it from the front side using the first input unit 110A. An operation target image that lies below another operation target image and is difficult to touch from the front side can be operated by touching it from the back side using the second input unit 110B. Therefore, various inputs can be performed with a more intuitive and easier operation than before.
- the method for detecting the contact position on the touch panel is arbitrary, and for example, an electrostatic detection method, a resistance detection method, or the like may be used.
- the action executed when the operation target image is touched or instructed is not limited to the example of starting the program, and is arbitrary.
- in the above embodiments, buttons, toolbars, and wallpapers have been exemplified as operation target images.
- any image can be used as the operation target image as long as the area can be specified.
- it may be an image or text, and the image may be a still image or a moving image.
- videos of television broadcasts
- electronic program guides and data broadcast data
- subtitles
- actions suited to the characteristics of each operation target image may be assigned.
- for example, when a television broadcast video is displayed in the back (layer 1) and an electronic program guide is displayed in front of the video (layer 2), the operation target images and actions may be set so that, as the corresponding actions, the page of the electronic program guide is turned when the guide is touched, and the channel is changed when a touch on the television broadcast video is detected.
- alternatively, the television broadcast video may be displayed in the back and the data or subtitles of a data broadcast displayed in front, with actions such as enlarging or reducing the operation target image associated with a detected touch on the data or subtitles.
- the present invention is not limited to the above embodiment, and various modifications and applications are possible.
- in the above embodiment, the operation image specifying process is executed when the first input unit 110A or the second input unit 110B detects a touch position.
- however, the present invention is not limited to this; an arbitrary action different from the action corresponding to the operation target image may be executed depending on the touch method.
- for example, when one of the first input unit 110A and the second input unit 110B is detecting a touch position and the other input unit also detects a touch position, the control unit 120 determines whether there is an operation target image at the touch position detected by the one input unit.
- if that operation target image is an electronic program guide, a recording reservation for a program, which is an action different from the action corresponding to the electronic program guide, may be made.
- alternatively, if the operation target image is a wallpaper, an action different from the action corresponding to the wallpaper, for example, a wallpaper selection action when a plurality of candidate wallpapers are displayed on the display unit 100, may be executed.
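The behavior above, where a touch on the opposite side while one side is already held triggers a different action, can be sketched roughly like this. The helper names, the dictionary shapes, and the string action labels are assumptions for illustration; for brevity, the first matching image is used rather than selecting by layer.

```python
def hit(rect, pos):
    x, y = pos
    rx, ry, rw, rh = rect
    return rx <= x < rx + rw and ry <= y < ry + rh

def on_touch(holding, source, pos, images, actions, alt_actions):
    """holding maps "front"/"back" to the position that side is touching.

    If the other input unit is already detecting a touch, look up the image
    under that first touch and return its *alternative* action instead of
    the action normally assigned to it."""
    other = "back" if source == "front" else "front"
    first = holding.get(other)
    if first is not None:
        for img in images:
            if hit(img["rect"], first):
                return alt_actions.get(img["id"])
        return None
    holding[source] = pos
    for img in images:
        if hit(img["rect"], pos):
            return actions.get(img["id"])
    return None
```

For instance, touching an electronic program guide from the front would return its normal page-turn action, while a subsequent back-side touch while the front touch is held would return the alternative recording-reservation action.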
- an intuitive operation is realized by operating the operation target images displayed on the display unit 100 via the input units arranged on the front and back surfaces of the housing.
- the present invention is not limited to this; as long as the intuitive operation is not impaired, a display unit may also be arranged on the back surface, and the operation target images displayed on that display unit may be operated via the front and back input units.
- in that case, the operation target images can be displayed on both display units.
- on the display unit on the front surface, the operation target image arranged on the uppermost layer is displayed in front as viewed from the user; on the display unit on the back surface, the operation target image arranged on the lowest layer is displayed in front as viewed from the user. In this way, the operation target images may be displayed so as to give the user the impression of seeing through the housing.
- the present invention has been described by taking a mobile terminal as an example.
- the present invention is not limited to this, and can be applied to any information processing apparatus as long as touch sensors are provided on both main surfaces of the housing.
- it may be, for example, a PHS (Personal Handy-phone System), a PDA (Personal Digital Assistant or Personal Data Assistant), a smartphone, a tablet PC (Personal Computer), a game device, a portable music player, or the like.
- the present invention has been described by taking the portable terminal 10 constituted by a single rectangular parallelepiped housing as an example.
- the present invention is not limited to this, and the shape of the housing and the number of housings are arbitrary.
- for example, a first housing 140A and a second housing 140B connected by a biaxial hinge 150 may be provided, and the mobile terminal 10 may be configured so that each housing can rotate in two directions about the two axes of the biaxial hinge 150.
- in that case, the control unit 120 may start the operation image specifying process shown in FIG. 6 when detecting that either the first input unit 110A or the second input unit 110B is touched, and may operate in the same manner as in the above-described embodiment.
- the present invention is intended to broadly include embodiments of an information processing apparatus comprising: a display unit 300 arranged on one surface of the housing; a display control unit 310 that controls the display unit 300 so as to display operation target images arranged on a plurality of layers as shown in FIG. 13; a touch-type first input unit 320A that is disposed on the display unit 300 and detects a touched position; a touch-type second input unit 320B that is disposed on the surface facing the one surface of the housing and detects a touched position; a specifying unit 330 that specifies the operation target image arranged on the uppermost layer among the operation target images displayed at the touch position detected by the first input unit 320A, and specifies the operation target image arranged on the lowest layer among the operation target images displayed at the touch position detected by the second input unit 320B; and an execution unit 340 that executes an action corresponding to the operation target image specified by the specifying unit 330.
- the display unit 300 has the same configuration and function as the display unit 100 described above.
- the display control unit 310 is realized by the display-control-related functions of the control unit 120 and the storage unit 130 described above.
- the first input unit 320A and the second input unit 320B have the same configurations and functions as the first input unit 110A and the second input unit 110B described above.
- the specifying unit 330 has a function of specifying the operation target image operated via either of the input units, realized by the control unit 120 executing a program stored in the storage unit 130.
- the execution unit 340 has a function of executing the action assigned to the specified operation target image, realized by the control unit 120 executing a program stored in the storage unit 130.
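The division of roles between the specifying unit and the execution unit described above can be sketched as follows. This is a hypothetical decomposition for illustration; the class names, dictionary shapes, and callable-based action table are assumptions, not the patent's implementation.

```python
class SpecifyingUnit:
    """Specifies the uppermost-layer image for front touches,
    the lowest-layer image for back touches."""
    def __init__(self, images):
        self.images = images          # dicts: {"id", "layer", "rect"}

    def specify(self, x, y, from_front):
        def contains(rect):
            rx, ry, rw, rh = rect
            return rx <= x < rx + rw and ry <= y < ry + rh
        hits = [i for i in self.images if contains(i["rect"])]
        if not hits:
            return None
        pick = max if from_front else min
        return pick(hits, key=lambda i: i["layer"])["id"]

class ExecutionUnit:
    """Maps a specified image id to its assigned action and runs it."""
    def __init__(self, actions):
        self.actions = actions        # id -> callable

    def execute(self, image_id):
        action = self.actions.get(image_id)
        return action() if action else None
```

A controller wiring the two units together would pass each detected touch position to `SpecifyingUnit.specify` and feed the result to `ExecutionUnit.execute`.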
- the program to be executed may be stored and distributed on a computer-readable recording medium such as a flexible disk, a CD-ROM (Compact Disc Read-Only Memory), a DVD (Digital Versatile Disc), or an MO (Magneto-Optical Disc), and installed to configure a system that executes the above-described processing.
- alternatively, the program may be stored in a disk device or the like of a predetermined server device on a communication network such as the Internet and downloaded, for example, superimposed on a carrier wave.
- (Supplementary note 2) The information processing apparatus further comprising a display unit disposed on the facing surface of the housing, wherein the second input unit is disposed on the display unit disposed on the facing surface, and the display control unit controls the display unit arranged on the one surface so that the operation target image arranged on the uppermost layer is displayed in front of the screen as viewed from the user, and controls the display unit arranged on the facing surface so that the operation target image arranged on the lowest layer is displayed in front of the screen as viewed from the user.
- An information processing method characterized by comprising the above steps.
- A computer-readable recording medium storing a program that causes a computer to function as: a display unit arranged on one surface of the housing; a display control unit that controls the display unit so as to display operation target images arranged on a plurality of layers; a first input unit that is disposed on the display unit and detects a touch position on the display unit; a second input unit that is disposed on a facing surface facing the one surface of the housing and detects a touch position on the display unit; a specifying unit that specifies, among the operation target images displayed at the touch position detected by the first input unit, the operation target image arranged on the uppermost layer, and specifies, among the operation target images displayed at the touch position detected by the second input unit, the operation target image arranged on the lowest layer; and an execution unit that executes an action corresponding to the operation target image specified by the specifying unit.
Landscapes
- Engineering & Computer Science (AREA)
- Theoretical Computer Science (AREA)
- General Engineering & Computer Science (AREA)
- Physics & Mathematics (AREA)
- Computer Hardware Design (AREA)
- Human Computer Interaction (AREA)
- General Physics & Mathematics (AREA)
- Mathematical Physics (AREA)
- User Interface Of Digital Computer (AREA)
- Controls And Circuits For Display Device (AREA)
Abstract
Description
However, when many icons and the like are arranged on the wallpaper, it is difficult to touch the wallpaper without touching an icon. For this reason, it is difficult to input an instruction via the wallpaper.
Thus, to execute the program associated with the wallpaper while many icons remained displayed, extra operations were required, such as moving the display positions of these operation target images or hiding some of them.
a display unit arranged on one surface of a housing;
a display control unit that controls the display unit so as to display operation target images arranged on a plurality of layers;
a first input unit that is disposed on the display unit and detects a touch position on the display unit;
a second input unit that is disposed on a facing surface facing the one surface of the housing and detects a touch position on the display unit;
a specifying unit that specifies, among the operation target images displayed at the touch position detected by the first input unit, the operation target image arranged on the uppermost layer, and specifies, among the operation target images displayed at the touch position detected by the second input unit, the operation target image arranged on the lowest layer; and
an execution unit that executes an action corresponding to the operation target image specified by the specifying unit;
characterized by comprising the above.
a display control step of controlling a display unit arranged on one surface of a housing so as to display operation target images arranged on a plurality of layers;
a specifying step of specifying, among the operation target images displayed at the touch position on the display unit detected by a first input unit disposed on the display unit, the operation target image arranged on the uppermost layer, and specifying, among the operation target images displayed at the touch position on the display unit detected by a second input unit disposed on a facing surface facing the one surface of the housing, the operation target image arranged on the lowest layer; and
an execution step of executing an action corresponding to the specified operation target image;
characterized by comprising the above.
A computer-readable recording medium storing a program that causes a computer to function as:
a display unit arranged on one surface of a housing;
a display control unit that controls the display unit so as to display operation target images arranged on a plurality of layers;
a first input unit that is disposed on the display unit and detects a touch position on the display unit;
a second input unit that is disposed on a facing surface facing the one surface of the housing and detects a touch position on the display unit;
a specifying unit that specifies, among the operation target images displayed at the touch position detected by the first input unit, the operation target image arranged on the uppermost layer, and specifies, among the operation target images displayed at the touch position detected by the second input unit, the operation target image arranged on the lowest layer; and
an execution unit that executes an action corresponding to the operation target image specified by the specifying unit.
As shown in FIG. 1, the information processing apparatus according to this embodiment is a mobile terminal 10 including a first input unit 110A on the front surface of its housing and a second input unit 110B on the back surface. The surface of the housing of the mobile terminal 10 on which the display unit 100 is arranged is called the front surface, and the surface facing the front surface is called the back surface. When the display areas of the wallpaper and an icon partially overlap, the mobile terminal 10 executes the program associated with the icon in response to input from the first input unit 110A, and executes the program associated with the wallpaper in response to input from the second input unit 110B.
Hereinafter, the configuration of the mobile terminal 10 according to this embodiment will be described with reference to FIG. 2.
The mobile terminal 10 includes a display unit 100, a first input unit (front input unit) 110A, a second input unit (back input unit) 110B, a control unit 120, and a storage unit 130.
The second input unit (back input unit) 110B includes a touch sensor that detects a touched position, and is arranged on the back surface of the housing of the mobile terminal 10 (the surface facing the first input unit 110A).
When the touch sensors of the first input unit 110A and the second input unit 110B detect contact (touch) by a finger, a pen, or the like, the input units transmit a signal including information on the contact position to the control unit 120.
When the mobile terminal 10 is powered on, the control unit 120 refers to the image information table 1321 shown in FIG. 5, generates a display screen by arranging each operation target image in its corresponding display area and layer (1 or 2) as shown in FIG. 4, and displays it on the display unit 100 as shown in FIG. 3. At this time, in an area where the display areas of operation target images arranged on different layers overlap, such as the wallpaper 200 and the icon 210 shown in FIG. 3, the control unit 120 displays the operation target image arranged on the upper layer in front as viewed from the user (when no transparency processing or the like is applied to the operation target images).
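The paint order described above, where the upper layer appears in front wherever display areas overlap, can be sketched as follows. This is an illustrative model only; representing the screen as a grid of image ids and the dictionary shapes are assumptions.

```python
def compose(images, width, height):
    """Return a grid of image ids, painting in ascending layer order so
    that the image on the upper layer wins wherever display areas
    overlap (no transparency processing applied)."""
    screen = [[None] * width for _ in range(height)]
    for img in sorted(images, key=lambda i: i["layer"]):
        x, y, w, h = img["rect"]
        for row in range(y, min(y + h, height)):
            for col in range(x, min(x + w, width)):
                screen[row][col] = img["id"]
    return screen
```

With a wallpaper on layer 1 covering the whole screen and an icon on layer 2 over part of it, the icon's id occupies its rectangle and the wallpaper shows everywhere else, matching the overlap behavior of FIG. 3.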
In the first embodiment, the mechanism of the mobile terminal 10 according to the present invention was described taking operation target images arranged on two display layers as an example. However, when a text window or the like is displayed in addition to the wallpaper 200 and the icon 210, there are often three or more display layers. In this embodiment, therefore, the mechanism of the mobile terminal 10 according to the present invention is described taking operation target images arranged on three or more display layers as an example.
The configuration of the mobile terminal 10 according to this embodiment is the same as that of the first embodiment.
When the mobile terminal 10 is powered on, the control unit 120 assigns layer 1 to the wallpaper 200, layer 2 to the icon 210, and layer 3 to the application-specific input screen 220, as shown in FIG. 11, and displays each at preset coordinates.
In the above embodiment, the operation image specifying process is executed when the first input unit 110A or the second input unit 110B detects a touch position. However, the present invention is not limited to this, and may be configured to execute an arbitrary action different from the action corresponding to the operation target image, depending on the touch method. For example, when one of the first input unit 110A and the second input unit 110B is detecting a touch position and the other input unit detects a touch position, the control unit 120 determines whether there is an operation target image at the touch position detected by the one input unit. If the operation target image is an electronic program guide, a recording reservation for a program, which is an action different from the action corresponding to the electronic program guide, may be made. Alternatively, if the operation target image is a wallpaper, an action different from the action corresponding to the wallpaper (for example, a wallpaper selection action when a plurality of candidate wallpapers are displayed on the display unit 100) may be executed.
Part or all of the above embodiments can also be described as in the following supplementary notes, but are not limited to the following.
a display unit arranged on one surface of a housing;
a display control unit that controls the display unit so as to display operation target images arranged on a plurality of layers;
a first input unit that is disposed on the display unit and detects a touch position on the display unit;
a second input unit that is disposed on a facing surface facing the one surface of the housing and detects a touch position on the display unit;
a specifying unit that specifies, among the operation target images displayed at the touch position detected by the first input unit, the operation target image arranged on the uppermost layer, and specifies, among the operation target images displayed at the touch position detected by the second input unit, the operation target image arranged on the lowest layer; and
an execution unit that executes an action corresponding to the operation target image specified by the specifying unit;
An information processing apparatus characterized by comprising the above.
The apparatus further comprises a display unit arranged on the facing surface of the housing, wherein
the second input unit is disposed on the display unit arranged on the facing surface, and
the display control unit controls the display unit arranged on the one surface so that the operation target image arranged on the uppermost layer is displayed in front of the screen as viewed from the user, and controls the display unit arranged on the facing surface so that the operation target image arranged on the lowest layer is displayed in front of the screen as viewed from the user.
The information processing apparatus according to supplementary note 1.
When one of the first input unit and the second input unit is detecting a touch position and the other input unit detects a touch position, the specifying unit determines whether there is an operation target image at the touch position detected by the one input unit, and
when the specifying unit determines that there is an operation target image, the execution unit executes an action different from the action corresponding to that operation target image.
The information processing apparatus according to supplementary note 1 or 2.
a display control step of controlling a display unit arranged on one surface of a housing so as to display operation target images arranged on a plurality of layers;
a specifying step of specifying, among the operation target images displayed at the touch position on the display unit detected by a first input unit disposed on the display unit, the operation target image arranged on the uppermost layer, and specifying, among the operation target images displayed at the touch position on the display unit detected by a second input unit disposed on a facing surface facing the one surface of the housing, the operation target image arranged on the lowest layer; and
an execution step of executing an action corresponding to the specified operation target image;
An information processing method characterized by comprising the above.
A computer-readable recording medium storing a program that causes a computer to function as:
a display unit arranged on one surface of a housing;
a display control unit that controls the display unit so as to display operation target images arranged on a plurality of layers;
a first input unit that is disposed on the display unit and detects a touch position on the display unit;
a second input unit that is disposed on a facing surface facing the one surface of the housing and detects a touch position on the display unit;
a specifying unit that specifies, among the operation target images displayed at the touch position detected by the first input unit, the operation target image arranged on the uppermost layer, and specifies, among the operation target images displayed at the touch position detected by the second input unit, the operation target image arranged on the lowest layer; and
an execution unit that executes an action corresponding to the operation target image specified by the specifying unit.
100, 300 display unit
110A, 320A first input unit
110B, 320B second input unit
120 control unit
130 storage unit
131 operation image specifying program
132 image information storage unit
1321 image information table
140A first housing
140B second housing
150 biaxial hinge
200 wallpaper
210 icon
220 application-specific input screen
310 display control unit
330 specifying unit
340 execution unit
Claims (5)
- An information processing apparatus comprising: a display unit arranged on one surface of a housing; a display control unit that controls the display unit so as to display operation target images arranged on a plurality of layers; a first input unit that is disposed on the display unit and detects a touch position on the display unit; a second input unit that is disposed on a facing surface facing the one surface of the housing and detects a touch position on the display unit; a specifying unit that specifies, among the operation target images displayed at the touch position detected by the first input unit, the operation target image arranged on the uppermost layer, and specifies, among the operation target images displayed at the touch position detected by the second input unit, the operation target image arranged on the lowest layer; and an execution unit that executes an action corresponding to the operation target image specified by the specifying unit.
- The information processing apparatus according to claim 1, further comprising a display unit arranged on the facing surface of the housing, wherein the second input unit is disposed on the display unit arranged on the facing surface, and the display control unit controls the display unit arranged on the one surface so that the operation target image arranged on the uppermost layer is displayed in front of the screen as viewed from the user, and controls the display unit arranged on the facing surface so that the operation target image arranged on the lowest layer is displayed in front of the screen as viewed from the user.
- The information processing apparatus according to claim 1 or 2, wherein, when one of the first input unit and the second input unit is detecting a touch position and the other input unit detects a touch position, the specifying unit determines whether there is an operation target image at the touch position detected by the one input unit, and when the specifying unit determines that there is an operation target image, the execution unit executes an action different from the action corresponding to that operation target image.
- An information processing method comprising: a display control step of controlling a display unit arranged on one surface of a housing so as to display operation target images arranged on a plurality of layers; a specifying step of specifying, among the operation target images displayed at the touch position on the display unit detected by a first input unit disposed on the display unit, the operation target image arranged on the uppermost layer, and specifying, among the operation target images displayed at the touch position on the display unit detected by a second input unit disposed on a facing surface facing the one surface of the housing, the operation target image arranged on the lowest layer; and an execution step of executing an action corresponding to the specified operation target image.
- A computer-readable recording medium storing a program that causes a computer to function as: a display unit arranged on one surface of a housing; a display control unit that controls the display unit so as to display operation target images arranged on a plurality of layers; a first input unit that is disposed on the display unit and detects a touch position on the display unit; a second input unit that is disposed on a facing surface facing the one surface of the housing and detects a touch position on the display unit; a specifying unit that specifies, among the operation target images displayed at the touch position detected by the first input unit, the operation target image arranged on the uppermost layer, and specifies, among the operation target images displayed at the touch position detected by the second input unit, the operation target image arranged on the lowest layer; and an execution unit that executes an action corresponding to the operation target image specified by the specifying unit.
Priority Applications (4)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
US14/430,597 US9733667B2 (en) | 2012-10-01 | 2013-09-03 | Information processing device, information processing method and recording medium |
EP13843475.8A EP2905685A4 (en) | 2012-10-01 | 2013-09-03 | INFORMATION PROCESSING DEVICE, INFORMATION PROCESSING METHOD AND RECORDING MEDIUM |
JP2014539644A JP6225911B2 (ja) | 2012-10-01 | 2013-09-03 | 情報処理装置、情報処理方法及びプログラム |
CN201380051320.3A CN104704441B (zh) | 2012-10-01 | 2013-09-03 | 信息处理设备、信息处理方法和记录介质 |
Applications Claiming Priority (2)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
JP2012219691 | 2012-10-01 | ||
JP2012-219691 | 2012-10-01 |
Publications (1)
Publication Number | Publication Date |
---|---|
WO2014054367A1 true WO2014054367A1 (ja) | 2014-04-10 |
Family
ID=50434699
Family Applications (1)
Application Number | Title | Priority Date | Filing Date |
---|---|---|---|
PCT/JP2013/073648 WO2014054367A1 (ja) | 2012-10-01 | 2013-09-03 | 情報処理装置、情報処理方法及び記録媒体 |
Country Status (5)
Country | Link |
---|---|
US (1) | US9733667B2 (ja) |
EP (1) | EP2905685A4 (ja) |
JP (1) | JP6225911B2 (ja) |
CN (1) | CN104704441B (ja) |
WO (1) | WO2014054367A1 (ja) |
Cited By (1)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
JP2022179658A (ja) * | 2019-06-12 | 2022-12-02 | 日本電信電話株式会社 | タッチパネル型情報端末装置およびその情報入力処理方法 |
Families Citing this family (9)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
USD726219S1 (en) * | 2013-06-09 | 2015-04-07 | Apple Inc. | Display screen or portion thereof with animated graphical user interface |
KR101730256B1 (ko) * | 2015-09-21 | 2017-04-26 | 엔에이치엔엔터테인먼트 주식회사 | 오버레이 제어 방법 및 시스템 |
US10739968B2 (en) * | 2015-11-23 | 2020-08-11 | Samsung Electronics Co., Ltd. | Apparatus and method for rotating 3D objects on a mobile device screen |
KR102468120B1 (ko) * | 2016-01-27 | 2022-11-22 | 삼성전자 주식회사 | 뷰 계층(뷰 레이어)들을 이용하여 입력을 처리하는 방법 및 전자장치 |
CN106547410B (zh) * | 2016-11-24 | 2019-03-12 | 南京仁光电子科技有限公司 | 一种基于同一屏幕的多层操作的系统 |
JP6868427B2 (ja) * | 2017-03-23 | 2021-05-12 | シャープ株式会社 | 入力機能付き表示装置 |
WO2019041117A1 (zh) * | 2017-08-29 | 2019-03-07 | 深圳传音通讯有限公司 | 一种智能终端的图标显示方法及图标显示装置 |
USD937858S1 (en) | 2019-05-31 | 2021-12-07 | Apple Inc. | Display screen or portion thereof with graphical user interface |
USD937295S1 (en) | 2020-02-03 | 2021-11-30 | Apple Inc. | Display screen or portion thereof with graphical user interface |
Citations (4)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
JP2000293280A (ja) * | 1999-04-07 | 2000-10-20 | Sharp Corp | 情報入力装置 |
JP2004021522A (ja) * | 2002-06-14 | 2004-01-22 | Sony Corp | 情報処理装置および方法、並びにプログラム |
JP2010262626A (ja) | 2009-04-10 | 2010-11-18 | Nec Lcd Technologies Ltd | タッチセンサ装置及びこれを備えた電子機器 |
JP2010277089A (ja) * | 2009-05-28 | 2010-12-09 | Xerox Corp | ディスプレイシステム、多機能装置、及びマシン動作方法 |
Family Cites Families (16)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
KR100703443B1 (ko) * | 2005-01-26 | 2007-04-03 | 삼성전자주식회사 | 사용자 인터페이스 화면에서 네비게이션 방향을 사용자가정의할 수 있는 장치 및 방법 |
JP2008165451A (ja) * | 2006-12-28 | 2008-07-17 | Sharp Corp | 表示装置一体型入力装置 |
JP2009187290A (ja) * | 2008-02-06 | 2009-08-20 | Yamaha Corp | タッチパネル付制御装置およびプログラム |
KR101544364B1 (ko) * | 2009-01-23 | 2015-08-17 | 삼성전자주식회사 | 듀얼 터치 스크린을 구비한 휴대 단말기 및 그 컨텐츠 제어방법 |
US20100277420A1 (en) * | 2009-04-30 | 2010-11-04 | Motorola, Inc. | Hand Held Electronic Device and Method of Performing a Dual Sided Gesture |
US8497884B2 (en) * | 2009-07-20 | 2013-07-30 | Motorola Mobility Llc | Electronic device and method for manipulating graphic user interface elements |
EP2341419A1 (en) * | 2009-12-31 | 2011-07-06 | Sony Computer Entertainment Europe Limited | Device and method of control |
JP6049990B2 (ja) * | 2010-09-15 | 2016-12-21 | 京セラ株式会社 | 携帯電子機器、画面制御方法および画面制御プログラム |
US9092135B2 (en) * | 2010-11-01 | 2015-07-28 | Sony Computer Entertainment Inc. | Control of virtual object using device touch interface functionality |
JP2012141869A (ja) * | 2011-01-05 | 2012-07-26 | Sony Corp | 情報処理装置、情報処理方法およびコンピュータプログラム |
CN102645954A (zh) * | 2011-02-16 | 2012-08-22 | 索尼爱立信移动通讯有限公司 | 便携式终端 |
JP5709206B2 (ja) * | 2011-02-17 | 2015-04-30 | Necカシオモバイルコミュニケーションズ株式会社 | タッチパネル装置、処理決定方法、プログラムおよびタッチパネルシステム |
JP5784960B2 (ja) | 2011-04-26 | 2015-09-24 | 京セラ株式会社 | 携帯端末、タッチパネル操作プログラムおよびタッチパネル操作方法 |
JP2013117885A (ja) * | 2011-12-02 | 2013-06-13 | Nintendo Co Ltd | 情報処理プログラム、情報処理装置、情報処理システム及び情報処理方法 |
KR102164453B1 (ko) * | 2012-04-07 | 2020-10-13 | 삼성전자주식회사 | 투명 디스플레이를 포함하는 디바이스에서 오브젝트 제어 방법 및 그 디바이스와 기록 매체 |
JP6271858B2 (ja) * | 2012-07-04 | 2018-01-31 | キヤノン株式会社 | 表示装置及びその制御方法 |
-
2013
- 2013-09-03 CN CN201380051320.3A patent/CN104704441B/zh active Active
- 2013-09-03 EP EP13843475.8A patent/EP2905685A4/en not_active Withdrawn
- 2013-09-03 WO PCT/JP2013/073648 patent/WO2014054367A1/ja active Application Filing
- 2013-09-03 US US14/430,597 patent/US9733667B2/en active Active
- 2013-09-03 JP JP2014539644A patent/JP6225911B2/ja active Active
Patent Citations (4)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
JP2000293280A (ja) * | 1999-04-07 | 2000-10-20 | Sharp Corp | 情報入力装置 |
JP2004021522A (ja) * | 2002-06-14 | 2004-01-22 | Sony Corp | 情報処理装置および方法、並びにプログラム |
JP2010262626A (ja) | 2009-04-10 | 2010-11-18 | Nec Lcd Technologies Ltd | タッチセンサ装置及びこれを備えた電子機器 |
JP2010277089A (ja) * | 2009-05-28 | 2010-12-09 | Xerox Corp | ディスプレイシステム、多機能装置、及びマシン動作方法 |
Non-Patent Citations (1)
Title |
---|
See also references of EP2905685A4 |
Cited By (2)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
JP2022179658A (ja) * | 2019-06-12 | 2022-12-02 | 日本電信電話株式会社 | タッチパネル型情報端末装置およびその情報入力処理方法 |
JP7472950B2 (ja) | 2019-06-12 | 2024-04-23 | 日本電信電話株式会社 | タッチパネル型情報端末装置およびその情報入力処理方法 |
Also Published As
Publication number | Publication date |
---|---|
EP2905685A4 (en) | 2016-05-11 |
CN104704441B (zh) | 2018-10-26 |
EP2905685A1 (en) | 2015-08-12 |
CN104704441A (zh) | 2015-06-10 |
JPWO2014054367A1 (ja) | 2016-08-25 |
US20150261253A1 (en) | 2015-09-17 |
US9733667B2 (en) | 2017-08-15 |
JP6225911B2 (ja) | 2017-11-08 |
Similar Documents
Publication | Publication Date | Title |
---|---|---|
JP6225911B2 (ja) | 情報処理装置、情報処理方法及びプログラム | |
US11036384B2 (en) | Pressure-sensitive degree control method and system for touchscreen-enabled mobile terminal | |
JP5204286B2 (ja) | 電子機器および入力方法 | |
CN105224166B (zh) | 便携式终端及其显示方法 | |
US8739053B2 (en) | Electronic device capable of transferring object between two display units and controlling method thereof | |
EP2466442B1 (en) | Information processing apparatus and information processing method | |
EP3023865B1 (en) | Portable terminal having display and method for operating same | |
JP5703873B2 (ja) | 情報処理装置、情報処理方法およびプログラム | |
US10198163B2 (en) | Electronic device and controlling method and program therefor | |
EP3343341B1 (en) | Touch input method through edge screen, and electronic device | |
US20110285631A1 (en) | Information processing apparatus and method of displaying a virtual keyboard | |
US20110154248A1 (en) | Information processing apparatus and screen selection method | |
KR20170062954A (ko) | 사용자 단말장치 및 디스플레이 방법 | |
US20160349946A1 (en) | User terminal apparatus and control method thereof | |
JP2009110286A (ja) | 情報処理装置、ランチャー起動制御プログラムおよびランチャー起動制御方法 | |
US20120284668A1 (en) | Systems and methods for interface management | |
JP2009003851A (ja) | タッチパネルを備えた情報機器、それに使用されるアイコン選択方法及びプログラム | |
JP2014164718A (ja) | 情報端末 | |
US20130159934A1 (en) | Changing idle screens | |
WO2014003025A1 (ja) | 電子機器 | |
JP5198548B2 (ja) | 電子機器、表示制御方法及びプログラム | |
US20140075391A1 (en) | Display control device, display control system, storing medium, and display method | |
JP2022167480A (ja) | 情報処理装置及び制御方法 | |
JP2013200826A (ja) | 表示システムおよび表示プログラム | |
JP2013101681A (ja) | 電子機器、表示制御方法及びプログラム |
Legal Events
Date | Code | Title | Description |
---|---|---|---|
121 | Ep: the epo has been informed by wipo that ep was designated in this application |
Ref document number: 13843475 Country of ref document: EP Kind code of ref document: A1 |
|
ENP | Entry into the national phase |
Ref document number: 2014539644 Country of ref document: JP Kind code of ref document: A |
|
WWE | Wipo information: entry into national phase |
Ref document number: 2013843475 Country of ref document: EP |
|
WWE | Wipo information: entry into national phase |
Ref document number: 14430597 Country of ref document: US |
|
NENP | Non-entry into the national phase |
Ref country code: DE |