WO2013008615A1 - Input device, image display method and program - Google Patents
- Publication number
- WO2013008615A1 (PCT/JP2012/066228)
- Authority
- WO
- WIPO (PCT)
- Prior art keywords
- plane
- screen
- display
- image
- icon
- Prior art date
Classifications
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06F—ELECTRIC DIGITAL DATA PROCESSING
- G06F3/00—Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
- G06F3/01—Input arrangements or combined input and output arrangements for interaction between user and computer
- G06F3/03—Arrangements for converting the position or the displacement of a member into a coded form
- G06F3/041—Digitisers, e.g. for touch screens or touch pads, characterised by the transducing means
- G06F3/0416—Control or interface arrangements specially adapted for digitisers
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06F—ELECTRIC DIGITAL DATA PROCESSING
- G06F3/00—Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
- G06F3/01—Input arrangements or combined input and output arrangements for interaction between user and computer
- G06F3/03—Arrangements for converting the position or the displacement of a member into a coded form
- G06F3/041—Digitisers, e.g. for touch screens or touch pads, characterised by the transducing means
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06F—ELECTRIC DIGITAL DATA PROCESSING
- G06F3/00—Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
- G06F3/01—Input arrangements or combined input and output arrangements for interaction between user and computer
- G06F3/048—Interaction techniques based on graphical user interfaces [GUI]
- G06F3/0487—Interaction techniques based on graphical user interfaces [GUI] using specific features provided by the input device, e.g. functions controlled by the rotation of a mouse with dual sensing arrangements, or of the nature of the input device, e.g. tap gestures based on pressure sensed by a digitiser
- G06F3/0488—Interaction techniques based on graphical user interfaces [GUI] using specific features provided by the input device, e.g. functions controlled by the rotation of a mouse with dual sensing arrangements, or of the nature of the input device, e.g. tap gestures based on pressure sensed by a digitiser using a touch-screen or digitiser, e.g. input of commands through traced gestures
- G06F3/04883—Interaction techniques based on graphical user interfaces [GUI] using specific features provided by the input device, e.g. functions controlled by the rotation of a mouse with dual sensing arrangements, or of the nature of the input device, e.g. tap gestures based on pressure sensed by a digitiser using a touch-screen or digitiser, e.g. input of commands through traced gestures for inputting data by handwriting, e.g. gesture or text
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06F—ELECTRIC DIGITAL DATA PROCESSING
- G06F2203/00—Indexing scheme relating to G06F3/00 - G06F3/048
- G06F2203/048—Indexing scheme relating to G06F3/048
- G06F2203/04808—Several contacts: gestures triggering a specific function, e.g. scrolling, zooming, right-click, when the user establishes several contacts with the surface simultaneously; e.g. using several fingers or a combination of fingers and pen
Definitions
- The present invention relates to an input device that includes a position input unit, such as a touch pad, which receives designation of a position within a plane and outputs a position signal indicating the designated position, and that controls display according to the position signal.
- A touch panel is a combination of a display unit, such as a liquid crystal display, and a position input unit, such as a touch pad, that outputs a position signal indicating the contact position of an indicator such as a finger within the screen.
- When the user touches a part of the screen with a finger, the position input unit outputs a position signal indicating the contact position of the finger within the screen, and the control unit controls the display of the display unit according to the position signal from the position input unit.
- Figure 1 shows an example of the screen display.
- a plurality of icons 101 are displayed on the screen 100 of the display unit at regular intervals.
- a specific function such as application activation is assigned to each icon 101.
- When the user touches the position of a desired icon 101 on the screen 100 of the display unit, the position input unit outputs a position signal indicating the contact position of the finger within the screen 100, and the control unit identifies, based on the position signal, the icon 101 corresponding to that contact position. The control unit then executes the function assigned to the identified icon 101 and controls the display on the screen 100.
- However, if an icon 101 is small relative to the size of the finger and the interval between the icons 101 is small, then, as illustrated, a finger touching a desired icon may also partly cover the adjacent icon 101b, and the control unit may erroneously recognize the adjacent icon 101b as the designated icon. In this case, a function different from the function desired by the user is executed.
- Patent Document 1 discloses a touch panel that can solve the problem of erroneous designation of icons.
- In Patent Document 1, the size of an icon is changed according to the contact area of the finger: if the finger contact area is larger than the icon display area, the icon display area is enlarged. By enlarging the icon, the user can reliably designate a desired icon with a finger.
- When the enlarged icons no longer fit on one page, they are displayed over multiple pages, rearranged in a predetermined order. Because the icons are rearranged, their display positions differ before and after enlargement, and the user must check the icons one by one to find the desired one. Such confirmation work reduces operability.
- In addition, light from the sun or an external light source may impair the visibility of the icons.
- Although the visibility of an icon improves when the orientation of the portable device relative to the light is changed, such an adjustment is troublesome for the user.
- An object of the present invention is to provide an input device, an image display method, and a program excellent in operability that allow a user to reliably specify an icon.
- To achieve this object, the input device of the present invention includes: a display unit that displays images on a screen; a position input unit that receives designation of in-plane positions with respect to the screen and outputs a position signal indicating each designated in-plane position; and a control unit that, when a plurality of in-plane positions are designated, identifies each in-plane position based on the position signal, displays a plurality of images at the respective in-plane positions, and controls the display size of each image according to the interval between the in-plane positions.
- The image display method of the present invention includes: a position input unit accepting designation of in-plane positions with respect to the screen of a display unit and outputting a position signal indicating each designated in-plane position; and a control unit, when a plurality of in-plane positions are designated, identifying each in-plane position based on the position signal, displaying a plurality of images at the respective in-plane positions, and controlling the display size of each image according to the interval between the in-plane positions.
- The program of the present invention causes a computer to execute: a process of receiving, from a position input unit, a position signal indicating each of a plurality of in-plane positions designated with respect to a screen; and a process of identifying the plurality of in-plane positions based on the position signal, displaying a plurality of images at the respective in-plane positions, and controlling the display size of each image according to the interval between the in-plane positions.
- FIG. 5 is a block diagram showing a configuration of an input device according to an embodiment of the present invention.
- the input device shown in FIG. 5 includes a screen 11 that is a display unit, a touch pad 12 that is a position input unit, a storage unit 13, and a control unit 10.
- In FIG. 5, the screen 11 and the touch pad 12 are drawn side by side for convenience, but in practice they are arranged facing each other.
- the in-plane position (coordinates) of the touch pad 12 and the in-plane position (coordinates) of the screen 11 correspond one-to-one.
- The screen 11 is, for example, a liquid crystal display, and displays a plurality of images (including images such as icons).
- the touch pad 12 accepts designation of an in-plane position with respect to the screen and outputs a position signal indicating the designated in-plane position.
- a combination of the screen 11 and the touch pad 12 is a so-called touch panel.
- There are various types of touch pad 12, such as resistive, infrared, and electromagnetic induction types, and any type can be adopted in this embodiment. However, the touch pad 12 must support multi-touch: it must be able to detect multiple simultaneous contacts on the screen and output a position signal indicating each contact location.
- Here, a contact is a contact by an indicator such as a finger or a pen, and a position signal is, for example, a signal indicating the coordinates of the contact location on the screen.
- the storage unit 13 is a storage device such as a semiconductor memory, and stores data and programs necessary for operating the input device.
- the program includes a plurality of applications.
- the data includes various data such as icon image data.
- the storage unit 13 also stores an application / icon correspondence table in which applications and icon image data are stored in association with each other.
- The control unit 10 operates according to a program stored in the storage unit 13. For example, based on a position signal from the touch pad 12, the control unit 10 controls the display of images such as icons on the screen 11, or executes the function (application) assigned to a designated icon.
- the control unit 10 includes a contact position / interval calculation unit 1, a display position / size calculation unit 2, a display control unit 3, and an application execution unit 4.
- The contact position/interval calculation unit 1 identifies, based on the position signal from the touch pad 12, the in-plane position on the touch pad 12 designated by the user with the indicator (the contact position of the indicator). When a plurality of in-plane positions are designated, the contact position/interval calculation unit 1 identifies each in-plane position on the touch pad 12 based on the position signal and also calculates the interval between the in-plane positions.
- The display position/size calculation unit 2 determines the display position on the screen 11 corresponding to each in-plane position on the touch pad 12 identified by the contact position/interval calculation unit 1, and, based on the interval between the in-plane positions obtained by the contact position/interval calculation unit 1, calculates the size of each icon image so that adjacent icon images do not overlap.
- For example, let D be the distance between the center of gravity of a first contact area and that of a second contact area. Of the width of the icon image displayed in the first contact area, the width on the second-contact-area side of its center of gravity is set smaller than D/2; the icon image in the second contact area is constrained symmetrically.
- the horizontal and vertical widths of adjacent icon images may be set so as to satisfy the following conditions.
- Let the two in-plane positions on the touch pad 12 be P1 and P2.
- Let L be the length of the straight line D connecting the center of gravity of in-plane position P1 with that of in-plane position P2, and let θ be the angle between the straight line D and the horizontal.
- The horizontal interval W1 between the center of gravity of P1 and that of P2 is then W1 = L cos θ, and the vertical interval W2 is W2 = L sin θ.
- Of the horizontal width of the icon image displayed at in-plane position P1, the width from the center of gravity of P1 toward P2 is set smaller than W1/2, and of the horizontal width of the icon image displayed at P2, the width from the center of gravity of P2 toward P1 is set smaller than W1/2. Likewise, of the vertical widths of the two icon images, the width from the center of gravity of P1 toward P2 and the width from the center of gravity of P2 toward P1 are each set smaller than W2/2. Under these conditions, the two icon images cannot overlap.
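As an illustration (not part of the patent text), the non-overlap conditions above can be collected into a short routine; the function name and return convention are assumptions:

```python
import math

def max_icon_half_extents(p1, p2):
    """Given two contact centroids p1 and p2 (x, y) on the touch pad,
    return the maximum half-width and half-height that an icon centered
    on each point may extend toward the other point; actual icon extents
    must be strictly smaller than these to avoid overlap."""
    dx = p2[0] - p1[0]
    dy = p2[1] - p1[1]
    length = math.hypot(dx, dy)            # L: centroid-to-centroid distance
    theta = math.atan2(abs(dy), abs(dx))   # angle of line D to the horizontal
    w1 = length * math.cos(theta)          # horizontal separation W1 = L cos θ
    w2 = length * math.sin(theta)          # vertical separation   W2 = L sin θ
    return w1 / 2, w2 / 2

half_w, half_h = max_icon_half_extents((100, 200), (260, 320))
```

For two contacts 160 px apart horizontally and 120 px vertically, this caps the facing half-extents at 80 and 60 px respectively.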
- The display control unit 3 displays an image of a predetermined icon at the position, and with the size, determined by the display position/size calculation unit 2.
- Based on the position signal from the touch pad 12, the application execution unit 4 identifies, among the icon images displayed on the screen 11, the icon image selected by the user through a predetermined contact operation with the indicator, acquires the application corresponding to the identified icon image from the application/icon correspondence table in the storage unit 13, and executes the acquired application.
- FIG. 7 shows the procedure of the icon image display process. The display process is described below with reference to FIGS. 5 and 7.
- First, the contact position/interval calculation unit 1 determines, based on the position signal from the touch pad 12, whether or not a plurality of positions are designated (step S10).
- Here, designation of a plurality of positions means not only a state in which a plurality of fingers touch the screen simultaneously, but also a state in which a plurality of locations are touched by at least one finger within a predetermined time. For example, two points may be designated by bringing the index finger and the middle finger into contact with the screen at the same time, or by bringing the middle finger into contact with the screen within a predetermined time after the index finger.
- If the determination result in step S10 is “No”, the display process ends.
- If the determination result in step S10 is “Yes”, the contact position/interval calculation unit 1 detects each contact position based on the position signal from the touch pad 12 (step S11) and calculates the interval between the contact positions (step S12).
- Next, the display position/size calculation unit 2 determines the display position on the screen 11 corresponding to each contact position detected in step S11, and determines the icon image size based on the contact-position interval calculated in step S12 (step S13).
- Next, the display control unit 3 selects, from the icon image data stored in the storage unit 13 and in accordance with a predetermined order, the image data to be displayed at each display position (step S14).
- The predetermined order is, for example, a ranking (priority order) that favors applications with a high execution frequency.
- Then, the display control unit 3 displays the images (icon images) based on the selected image data on the screen 11 at the positions and with the sizes determined in step S13 (step S15).
- Which image is displayed at which position may be determined, for example, from the positional relationship of the contacts. Specifically, when a plurality of locations are designated with a plurality of fingers, the images (icon images) are sorted onto the display positions so that higher-ranked images appear further to the left.
- Even if the contact position/interval calculation unit 1 detects, based on the position signal from the touch pad 12, that the indicator has left the screen, the display control unit 3 maintains the display of the images (icon images) for a predetermined time from the detection timing.
- That is, when the user designates a plurality of positions in the screen with an indicator such as a finger, an image (icon image) is displayed at each designated position, and the images remain displayed for a certain period even after the indicator leaves the screen. During this display period, the user can confirm and designate an icon image.
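The display process of steps S10 to S15 can be sketched as follows; this is an illustrative reconstruction, not the patent's implementation, and the function name, data shapes, and left-to-right ordering rule are assumptions:

```python
def display_icons(contacts, ranked_icon_data):
    """Illustrative sketch of the display process of FIG. 7 (steps S10-S15).
    contacts: list of (x, y) contact centroids from the touch pad.
    ranked_icon_data: icon image data in priority order (step S14)."""
    if len(contacts) < 2:                 # S10: plural positions designated?
        return []                         # "No" -> the display process ends
    # S11-S13: each contact position becomes a display position; the icon
    # size would be derived from the intervals between the positions.
    ordered = sorted(contacts, key=lambda p: p[0])  # higher rank to the left
    return [{"pos": pos, "icon": icon}              # S15: draw each icon
            for pos, icon in zip(ordered, ranked_icon_data)]
```

With two contacts, the higher-priority icon lands at the leftmost contact, mirroring the sorting rule described above.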
- FIG. 8 shows the procedure for activating an icon.
- After the icon images are displayed by the procedure of FIG. 7 (step S20), the application execution unit 4 determines, based on the position signal from the touch pad 12, whether or not the indicator has left the screen (step S21).
- If the determination result in step S21 is “Yes”, the application execution unit 4 measures the elapsed time since the indicator left the screen (step S22) and determines whether or not the indicator touches the screen again within a predetermined time (step S23).
- If the determination result in step S23 is “No”, the activation process ends.
- If the determination result in step S23 is “Yes”, the application execution unit 4 then determines whether or not there is exactly one contact location (step S24).
- If the determination result in step S24 is “No”, the activation process ends. In this case, since a plurality of contacts have been detected, steps S11 to S15 of the icon image display process of FIG. 7 may be executed instead; the icon images to be displayed are then chosen as those with the next-highest priority after the previously displayed images.
- If the determination result in step S24 is “Yes”, the application execution unit 4 detects the designated position on the screen 11 based on the position signal from the touch pad 12 (step S25) and determines whether or not the detected position matches the display position of an icon image (step S26).
- If the determination result in step S26 is “No”, the activation process ends.
- If the determination result in step S26 is “Yes”, the application execution unit 4 acquires the application assigned to the icon image matching the designated position from the application/icon correspondence table in the storage unit 13 and executes the acquired application (step S27).
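The activation decisions above (steps S21 to S27) can be condensed into a single check; the function name, argument shapes, and return convention are illustrative assumptions, not from the patent:

```python
def try_activate(displayed, recontact, elapsed, timeout, app_table):
    """Illustrative sketch of the activation procedure of FIG. 8 (S21-S27).
    displayed: {position: icon name} currently shown on the screen.
    recontact: list of positions touched after the indicator lifted off.
    elapsed: seconds between lift-off and the re-contact.
    app_table: application/icon correspondence table (icon name -> app)."""
    if elapsed > timeout:                 # S23: no re-contact in time -> end
        return None
    if len(recontact) != 1:               # S24: must be exactly one contact
        return None
    pos = recontact[0]                    # S25: detect the designated position
    if pos not in displayed:              # S26: does it match an icon position?
        return None
    return app_table[displayed[pos]]      # S27: launch the assigned application
```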
- In this embodiment, when the user designates a plurality of positions in the screen with an indicator such as a finger, an icon image is displayed at each designated position. For example, when two positions are designated with the middle finger and the index finger, icon images 6 and 7 are displayed at the respective designated positions. Therefore, if the display positions of the icon images are designated with the middle finger and the index finger so as to avoid the fingers holding the casing of the input device, those fingers do not get in the way when designating an icon image.
- Since the finger contact position directly becomes the icon image display position, the user can recognize where the icon images are displayed from the contact positions of the fingers without looking at the screen. There is thus no need to confirm how the icon images are displayed, which improves operability.
- Moreover, even in a situation where external light impairs visibility, the user can reliably recognize the position of a displayed image from the contact position of the finger.
- Since the interval between the middle finger and the index finger touching the screen becomes the interval between the icon images 6 and 7, the user can touch the screen with an interval wide enough to avoid erroneous designation. This suppresses erroneous designation of icon images.
- the input device of this embodiment described above is an example of the present invention, and the configuration and operation thereof can be changed as appropriate.
- For example, the control unit 10 may change the display range of the icon images 6 and 7 according to an adjustment of the finger interval: if the interval between the contact positions of the middle finger and the index finger is reduced, the display range of the icon images 6 and 7 is reduced, and if the interval is increased, the display range is increased.
- the screen 11 and the touch pad 12 may be provided on the same surface side or different surface sides.
- For example, the screen 11 may be provided on a predetermined surface (front surface) of the housing, and the touch pad 12 on the opposite surface (back surface), so that the two face each other.
- the user designates a plurality of positions on the touch pad 12 on the back side. For example, as shown in FIG. 10, when two positions are specified on the touch pad 12 on the back side, icon images 6 and 7 are displayed at positions corresponding to the specified positions on the screen 11.
- the size of the touch pad 12 may be different from that of the screen 11.
- a coordinate conversion table indicating the correspondence between the in-plane position (coordinates) of the touch pad 12 and the in-plane position (coordinates) of the screen 11 is stored in the storage unit 13.
- the in-plane position (coordinates) of the touch pad 12 is converted into the in-plane position (coordinates) of the screen 11 with reference to this coordinate conversion table.
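Where the patent stores an explicit coordinate conversion table, a minimal sketch can assume the table encodes a simple proportional scaling between pad and screen coordinates (this assumption, and the function name, are not from the patent):

```python
def pad_to_screen(pad_pos, pad_size, screen_size):
    """Map a touch-pad coordinate to the corresponding screen coordinate
    when the pad and the screen differ in size, assuming the conversion
    table reduces to a proportional scaling of each axis."""
    px, py = pad_pos
    return (px * screen_size[0] / pad_size[0],
            py * screen_size[1] / pad_size[1])
```

A real device could instead look up each pad coordinate in the stored table, which also accommodates non-linear mappings.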
- In addition, a function for controlling the display of the images displayed at the other contact locations may be assigned to the image displayed at the contact position of a predetermined location.
- In this case, the contact position/interval calculation unit 1 detects the contacts of the indicator at first to third locations on the screen and also detects the order of contact. The display position/size calculation unit 2 determines the display positions from the first to third locations and determines the image sizes based on the intervals between the locations. The display control unit 3 then displays, at the display position of the location whose contact order is a predetermined number, an image to which the function for controlling the display at the other display positions is assigned, and displays icon images at the other display positions.
- In the example of FIG. 11, after two places in the screen 5 of the screen 11 are designated with the index finger and the middle finger, when another place in the screen 5 is designated with the ring finger, an image to which the control function is assigned is displayed at that place. Specifically, in the control unit 10 of FIG. 5, the display control unit 3 displays the icon images 6 and 7 at the contact positions of the middle finger and the index finger, and a control image 8 at the contact position of the ring finger. A function for changing the display of the icon images 6 and 7 is assigned to the control image 8.
- When the user slides the ring finger on the screen 5, the contact position/interval calculation unit 1 detects the slide direction and the slide amount based on the position signal from the touch pad 12.
- Here, sliding the control image 8 upward is defined as the first slide direction, and sliding it downward as the second slide direction.
- The display position/size calculation unit 2 determines the display contents of the icon images 6 and 7 according to the detected slide direction and slide amount. Specifically, when the control image 8 is slid by a predetermined amount in the first slide direction, the image data with the next-highest priority after the currently displayed image data is selected for the icon images 6 and 7, and the display control unit 3 displays the images based on the selected image data as the icon images 6 and 7.
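The priority-based paging triggered by sliding the control image can be sketched as below; the function name, paging step, and clamping behavior are illustrative assumptions:

```python
def slide_icons(ranked_data, start, direction, shown=2):
    """Illustrative sketch of changing the icon images 6 and 7 when the
    control image 8 is slid by a predetermined amount. direction is +1
    for the first slide direction (advance to lower-priority images) and
    -1 for the second (return to higher-priority images)."""
    new_start = start + direction * shown
    # Clamp so the window of displayed icons stays inside the ranking.
    new_start = max(0, min(new_start, max(0, len(ranked_data) - shown)))
    return new_start, ranked_data[new_start:new_start + shown]
```

For a four-entry ranking showing two icons at a time, one slide in the first direction advances from the top two entries to the next two.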
- In the above example, the display position of the image to which the control function is assigned is determined based on the contact order, but this is not a limitation. The display position may instead be determined from the positional relationship of the contacts; specifically, when a plurality of locations are designated with a plurality of fingers, the rightmost contact position on the screen may be used as the display position of the image to which the control function is assigned.
- In the above embodiment, an image is displayed at the display position corresponding to the contact position of the indicator, but the present invention is not limited to this; an image may be displayed at a position having a relative relationship to the contact position of the indicator.
- The control unit 10 may be a computer (CPU: Central Processing Unit) that operates according to a program.
- The program causes the computer to execute processing for controlling the screen 11 and the touch pad 12, including at least the icon image display process and the display process for the image to which the control function is assigned.
- The program may be provided on a recording medium such as an optical disc, for example a CD (Compact Disc) or DVD (Digital Versatile Disc), or a USB (Universal Serial Bus) memory, or may be provided via a communication network (for example, the Internet).
- the input device of the present invention can be applied not only to a terminal device such as a tablet terminal or a mobile phone terminal, but also to an electronic device (for example, a game machine) that operates independently.
Description
A display unit that displays images on a screen;
a position input unit that accepts designation of an in-plane position with respect to the screen and outputs a position signal indicating the designated in-plane position; and
a control unit that, when a plurality of in-plane positions are designated, identifies each in-plane position based on the position signal, displays a plurality of images at the respective in-plane positions, and controls the display size of each image according to the interval between the in-plane positions.
A position input unit accepts designation of an in-plane position with respect to the screen of a display unit and outputs a position signal indicating the designated in-plane position, and
a control unit, when a plurality of in-plane positions are designated, identifies each in-plane position based on the position signal, displays a plurality of images at the respective in-plane positions, and controls the display size of each image according to the interval between the in-plane positions.
A process of receiving, from a position input unit, position signals indicating each of a plurality of in-plane positions designated with respect to a screen; and
a process of identifying the plurality of in-plane positions based on the position signals, displaying a plurality of images at the respective in-plane positions, and controlling the display size of each image according to the intervals between the in-plane positions, are executed by a computer.
2 Display position/size calculation unit
3 Display control unit
4 Application execution unit
10 Control unit
11 Screen
12 Touch pad
13 Storage unit
Claims (7)
- 1. An input device comprising: a display unit that displays images on a screen; a position input unit that accepts designation of in-plane positions with respect to the screen and outputs a position signal indicating each designated in-plane position; and a control unit that, when a plurality of in-plane positions are designated, identifies each in-plane position based on the position signal, displays a plurality of images at the respective in-plane positions, and controls the display size of each image according to the interval between the in-plane positions.
- 2. The input device according to claim 1, wherein the control unit displays, at a specific in-plane position among the plurality of in-plane positions, an image to which a function for controlling the display of the other images displayed at the other in-plane positions is assigned, and controls the display content of the other images according to an operation on that image.
- 3. The input device according to claim 1 or 2, further comprising a storage unit storing a plurality of image data to which a display ranking has been assigned, wherein the control unit determines, from among the plurality of image data and based on the ranking, the image data of the image displayed at each in-plane position.
- 4. The input device according to any one of claims 1 to 3, wherein the position input unit is provided on the surface side on which the screen is formed.
- 5. The input device according to any one of claims 1 to 3, wherein the position input unit is provided on the surface opposite to the surface on which the screen is formed.
- 6. An image display method in which a position input unit accepts designation of in-plane positions with respect to the screen of a display unit and outputs a position signal indicating each designated in-plane position, and a control unit, when a plurality of in-plane positions are designated, identifies each in-plane position based on the position signal, displays a plurality of images at the respective in-plane positions, and controls the display size of each image according to the interval between the in-plane positions.
- 7. A program causing a computer to execute: a process of receiving, from a position input unit, position signals indicating each of a plurality of in-plane positions designated with respect to a screen; and a process of identifying the plurality of in-plane positions based on the position signals, displaying a plurality of images at the respective in-plane positions, and controlling the display size of each image according to the intervals between the in-plane positions.
Priority Applications (2)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
JP2013523877A JP5991320B2 (ja) | 2011-07-14 | 2012-06-26 | 入力装置、画像表示方法およびプログラム |
US14/232,532 US9983700B2 (en) | 2011-07-14 | 2012-06-26 | Input device, image display method, and program for reliable designation of icons |
Applications Claiming Priority (2)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
JP2011-155719 | 2011-07-14 | ||
JP2011155719 | 2011-07-14 |
Publications (1)
Publication Number | Publication Date |
---|---|
WO2013008615A1 true WO2013008615A1 (ja) | 2013-01-17 |
Family
ID=47505912
Family Applications (1)
Application Number | Title | Priority Date | Filing Date |
---|---|---|---|
PCT/JP2012/066228 WO2013008615A1 (ja) | 2011-07-14 | 2012-06-26 | 入力装置、画像表示方法およびプログラム |
Country Status (3)
Country | Link |
---|---|
US (1) | US9983700B2 (ja) |
JP (1) | JP5991320B2 (ja) |
WO (1) | WO2013008615A1 (ja) |
Cited By (1)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
JP2016157347A (ja) * | 2015-02-25 | 2016-09-01 | 京セラ株式会社 | 電子機器、制御方法、及び制御プログラム |
Families Citing this family (5)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
US8381135B2 (en) | 2004-07-30 | 2013-02-19 | Apple Inc. | Proximity detector in handheld device |
US20100231506A1 (en) * | 2004-09-07 | 2010-09-16 | Timothy Pryor | Control of appliances, kitchen and home |
KR101789332B1 (ko) * | 2011-06-03 | 2017-10-24 | 삼성전자주식회사 | 휴대단말기에서 홈 스크린을 표시하는 방법 |
US10019151B2 (en) * | 2013-02-08 | 2018-07-10 | Motorola Solutions, Inc. | Method and apparatus for managing user interface elements on a touch-screen device |
CN111027107B (zh) * | 2019-12-10 | 2023-05-23 | 维沃移动通信有限公司 | 一种对象显示控制方法及电子设备 |
Citations (3)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
JP2004013381A (ja) * | 2002-06-05 | 2004-01-15 | Kazuyoshi Kotani | 仮想キー片手入力装置 |
JP2004062867A (ja) * | 2002-06-03 | 2004-02-26 | Fuji Xerox Co Ltd | 機能制御装置およびその方法 |
JP2009301094A (ja) * | 2008-06-10 | 2009-12-24 | Sharp Corp | 入力装置及び入力装置の制御方法 |
Family Cites Families (20)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
JP3721628B2 (ja) * | 1996-02-16 | 2005-11-30 | 松下電工株式会社 | 設備管理装置 |
JPH1139093A (ja) * | 1997-07-15 | 1999-02-12 | Toshiba Corp | 情報処理装置およびポインティング装置 |
US6211856B1 (en) * | 1998-04-17 | 2001-04-03 | Sung M. Choi | Graphical user interface touch screen with an auto zoom feature |
US6765557B1 (en) * | 2000-04-10 | 2004-07-20 | Interlink Electronics, Inc. | Remote control having touch pad to screen mapping |
US7461356B2 (en) | 2002-06-03 | 2008-12-02 | Fuji Xerox Co., Ltd. | Function control unit and method thereof |
JP2004295716A (ja) * | 2003-03-28 | 2004-10-21 | I'm Co Ltd | 表示機能付電子機器 |
JP2006268313A (ja) * | 2005-03-23 | 2006-10-05 | Fuji Xerox Co Ltd | 表示制御装置およびその表示内容の配置方法 |
JP2007156866A (ja) | 2005-12-06 | 2007-06-21 | Matsushita Electric Ind Co Ltd | データ処理装置 |
JP2007256338A (ja) | 2006-03-20 | 2007-10-04 | Denso Corp | 地図表示装置 |
JP2007328421A (ja) | 2006-06-06 | 2007-12-20 | Canon Inc | タッチパネル、及び該装置制御方法 |
JP4927633B2 (ja) | 2006-09-28 | 2012-05-09 | 京セラ株式会社 | 携帯端末及びその制御方法 |
KR20080073868A (ko) * | 2007-02-07 | 2008-08-12 | 엘지전자 주식회사 | 단말기 및 메뉴표시방법 |
JP4979600B2 (ja) | 2007-09-05 | 2012-07-18 | パナソニック株式会社 | 携帯端末装置、及び表示制御方法 |
JPWO2009031214A1 (ja) | 2007-09-05 | 2010-12-09 | パナソニック株式会社 | 携帯端末装置、及び表示制御方法 |
CN102047204A (zh) * | 2008-06-02 | 2011-05-04 | 夏普株式会社 | 输入装置、输入方法、程序以及记录介质 |
US8704775B2 (en) * | 2008-11-11 | 2014-04-22 | Adobe Systems Incorporated | Biometric adjustments for touchscreens |
JP2010146506A (ja) | 2008-12-22 | 2010-07-01 | Sharp Corp | 入力装置、入力装置の制御方法、入力装置の制御プログラム、コンピュータ読取可能な記録媒体、および情報端末装置 |
US8665227B2 (en) * | 2009-11-19 | 2014-03-04 | Motorola Mobility Llc | Method and apparatus for replicating physical key function with soft keys in an electronic device |
US9122318B2 (en) * | 2010-09-15 | 2015-09-01 | Jeffrey R. Spetalnick | Methods of and systems for reducing keyboard data entry errors |
US20120144337A1 (en) * | 2010-12-01 | 2012-06-07 | Verizon Patent And Licensing Inc. | Adjustable touch screen keyboard |
- 2012-06-26 US US14/232,532 patent/US9983700B2/en not_active Expired - Fee Related
- 2012-06-26 JP JP2013523877A patent/JP5991320B2/ja not_active Expired - Fee Related
- 2012-06-26 WO PCT/JP2012/066228 patent/WO2013008615A1/ja active Application Filing
Also Published As
Publication number | Publication date |
---|---|
JPWO2013008615A1 (ja) | 2015-02-23 |
US9983700B2 (en) | 2018-05-29 |
US20140139475A1 (en) | 2014-05-22 |
JP5991320B2 (ja) | 2016-09-14 |
Similar Documents
Publication | Publication Date | Title |
---|---|---|
JP5675622B2 (ja) | 表示装置 | |
KR102255830B1 (ko) | 복수 개의 윈도우를 디스플레이하는 방법 및 장치 | |
US8614682B2 (en) | Touchscreen panel unit, scrolling control method, and recording medium | |
US10198163B2 (en) | Electronic device and controlling method and program therefor | |
JP5991320B2 (ja) | 入力装置、画像表示方法およびプログラム | |
JP2009110286A (ja) | 情報処理装置、ランチャー起動制御プログラムおよびランチャー起動制御方法 | |
JP6225911B2 (ja) | 情報処理装置、情報処理方法及びプログラム | |
JP2015005173A (ja) | タッチ・スクリーンを備える携帯式情報端末および入力方法 | |
JP5429627B2 (ja) | 携帯端末、携帯端末の操作方法、及び携帯端末の操作プログラム | |
US20130093712A1 (en) | Touch sensing method and electronic apparatus using the same | |
JP5848732B2 (ja) | 情報処理装置 | |
JP2015508547A (ja) | タッチ感応式デバイスを使用する方向制御 | |
KR20140122076A (ko) | 휴대형 전자장치의 객체 표시 방법 및 그에 관한 장치 | |
KR20140070745A (ko) | 디스플레이 장치 및 이의 구동 방법 | |
JP2013137739A (ja) | 電子機器、動作制御方法およびプログラム | |
WO2017022031A1 (ja) | 情報端末装置 | |
JP2014197164A (ja) | 表示装置、表示方法、及び表示プログラム | |
JP2014106806A (ja) | 情報処理装置 | |
WO2018082256A1 (zh) | 一种终端及其切换应用的方法 | |
WO2013047023A1 (ja) | 表示装置、表示方法およびプログラム | |
CN111142775A (zh) | 一种手势交互方法和装置 | |
JP2013073365A (ja) | 情報処理装置 | |
JP2015153083A (ja) | 表示制御プログラム、装置、及び方法 | |
JP2015022675A (ja) | 電子機器、インターフェース制御方法、および、プログラム | |
KR20140075391A (ko) | 멀티 터치와 탭핑을 결합하여 사용자 명령을 입력하는 방식의 사용자 인터페이스 방법 및 이를 적용한 전자 기기 |
Legal Events
Date | Code | Title | Description |
---|---|---|---|
121 | Ep: the epo has been informed by wipo that ep was designated in this application |
Ref document number: 12811353 Country of ref document: EP Kind code of ref document: A1 |
|
ENP | Entry into the national phase |
Ref document number: 2013523877 Country of ref document: JP Kind code of ref document: A |
|
WWE | Wipo information: entry into national phase |
Ref document number: 14232532 Country of ref document: US |
|
NENP | Non-entry into the national phase |
Ref country code: DE |
|
122 | Ep: pct application non-entry in european phase |
Ref document number: 12811353 Country of ref document: EP Kind code of ref document: A1 |