WO2014112029A1 - Information processing device, information processing method, and program - Google Patents
Information processing device, information processing method, and program
- Publication number
- WO2014112029A1 (PCT/JP2013/050508)
- Authority
- WO
- WIPO (PCT)
- Prior art keywords
- mark
- information processing
- processing apparatus
- touch
- pinch
- Prior art date
Classifications
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06F—ELECTRIC DIGITAL DATA PROCESSING
- G06F3/00—Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
- G06F3/01—Input arrangements or combined input and output arrangements for interaction between user and computer
- G06F3/03—Arrangements for converting the position or the displacement of a member into a coded form
- G06F3/041—Digitisers, e.g. for touch screens or touch pads, characterised by the transducing means
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06F—ELECTRIC DIGITAL DATA PROCESSING
- G06F3/00—Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
- G06F3/01—Input arrangements or combined input and output arrangements for interaction between user and computer
- G06F3/048—Interaction techniques based on graphical user interfaces [GUI]
- G06F3/0487—Interaction techniques based on graphical user interfaces [GUI] using specific features provided by the input device, e.g. functions controlled by the rotation of a mouse with dual sensing arrangements, or of the nature of the input device, e.g. tap gestures based on pressure sensed by a digitiser
- G06F3/0488—Interaction techniques based on graphical user interfaces [GUI] using specific features provided by the input device, e.g. functions controlled by the rotation of a mouse with dual sensing arrangements, or of the nature of the input device, e.g. tap gestures based on pressure sensed by a digitiser using a touch-screen or digitiser, e.g. input of commands through traced gestures
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06F—ELECTRIC DIGITAL DATA PROCESSING
- G06F3/00—Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
- G06F3/01—Input arrangements or combined input and output arrangements for interaction between user and computer
- G06F3/048—Interaction techniques based on graphical user interfaces [GUI]
- G06F3/0484—Interaction techniques based on graphical user interfaces [GUI] for the control of specific functions or operations, e.g. selecting or manipulating an object, an image or a displayed text element, setting a parameter value or selecting a range
- G06F3/04845—Interaction techniques based on graphical user interfaces [GUI] for the control of specific functions or operations, e.g. selecting or manipulating an object, an image or a displayed text element, setting a parameter value or selecting a range for image manipulation, e.g. dragging, rotation, expansion or change of colour
-
- G—PHYSICS
- G09—EDUCATION; CRYPTOGRAPHY; DISPLAY; ADVERTISING; SEALS
- G09G—ARRANGEMENTS OR CIRCUITS FOR CONTROL OF INDICATING DEVICES USING STATIC MEANS TO PRESENT VARIABLE INFORMATION
- G09G2340/00—Aspects of display data processing
- G09G2340/04—Changes in size, position or resolution of an image
- G09G2340/045—Zooming at least part of an image, i.e. enlarging it or shrinking it
Definitions
- the present invention relates to an information processing apparatus, an information processing method, and a program.
- Patent Document 1 JP-A-2000-163031
- Patent Document 1 discloses an electronic book including a display unit capable of displaying a map image, in which an instruction and an operation amount for enlarging or reducing the map image are input based on the history of finger touches on the display unit.
- the map image enlargement instruction and the enlargement amount are input by moving two fingers apart, and the reduction instruction and the reduction amount are input by moving the two fingers closer together.
- the operation method of display-image enlargement/reduction processing in a conventional information processing apparatus 100′, as described in Patent Document 1, will be described with reference to FIG. 6.
- two fingers 201 and 202 (here, the thumb and the index finger) touch the touch panel 118′; let their touch positions be point X and point Y, respectively.
- when the two fingers 201 and 202 are moved apart (pinch-out), the distance XY increases, and the enlargement ratio of the display image is increased continuously according to the movement.
- when they are moved together (pinch-in), the distance XY decreases, and the reduction proceeds continuously according to the movement.
- because the enlargement/reduction is performed according to the change in the distance between the two fingers touching the touch panel, the operation is intuitive, easy to understand, and easy to use.
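The conventional two-finger mapping from finger distance to zoom ratio can be sketched as follows. This is a minimal Python sketch; the function names and the linear distance-to-ratio mapping are illustrative assumptions, not taken from Patent Document 1:

```python
import math

def distance(p, q):
    """Euclidean distance between two touch points given as (x, y)."""
    return math.hypot(p[0] - q[0], p[1] - q[1])

def pinch_scale(x_start, y_start, x_now, y_now):
    """Zoom factor from the change in distance between two fingers.

    Pinch-out (distance XY grows) yields a factor > 1 (enlarge);
    pinch-in (distance XY shrinks) yields a factor < 1 (reduce).
    """
    return distance(x_now, y_now) / distance(x_start, y_start)
```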
- because the conventional information processing apparatus 100′ uses two fingers for the operation, the apparatus is held with the left hand 210 and the pinch-out/pinch-in operation is performed with the right hand, as shown in FIG. 6; both hands are thus required, making one-handed operation difficult.
- in one example, the information processing apparatus includes a display unit that displays an image, an operation input unit that inputs a user operation, and a control unit. When the control unit detects that the user's finger has touched the operation input unit, it displays a mark at the detected position. While the mark is displayed, when a touch is detected at a position different from the mark, the control unit enlarges the display image of the display unit if the touch position changes in a direction away from the mark, and reduces the display image if the touch position changes in a direction approaching the mark.
- FIG. 1 is a block diagram showing an internal configuration example of an information processing apparatus 100 according to an embodiment of the present invention.
- the information processing apparatus 100 includes a base station communication unit 101, a CPU 102, a memory 103, a storage 104, a GPS (Global Positioning System) reception unit 105, a geomagnetic sensor 106, an acceleration sensor 107, a gyro sensor 108, a wireless communication unit 109, a microphone 110, an audio processing unit 111, a speaker 112, an operation input unit 113, a display unit 114, an image processing unit 115, a video input unit 116, an input/output I/F 117, and a touch panel 118, which are connected to a bus 150.
- the base station communication unit 101 is a communication interface that performs long-distance wireless communication with a base station (not shown), such as W-CDMA (Wideband Code Division Multiple Access) or GSM (registered trademark) (Global System for Mobile communications).
- the CPU 102 performs various processes by executing programs stored in the memory 103.
- the memory 103 is a flash memory, for example, and stores programs, data, and the like.
- the programs stored in the memory 103 can be updated and extended as needed by the base station communication unit 101 communicating wirelessly with the base station and downloading from an external server (not shown).
- the information processing apparatus 100 includes a storage 104 such as a memory card, and can store data in the storage 104.
- the GPS receiver 105 receives a signal from a GPS satellite in the sky. Thereby, the current position of the information processing apparatus 100 can be detected.
- the geomagnetic sensor 106 is a sensor that detects the direction in which the information processing apparatus 100 is facing.
- the acceleration sensor 107 is a sensor that detects the acceleration of the information processing apparatus 100
- the gyro sensor 108 is a sensor that detects the angular velocity of the information processing apparatus 100.
- the wireless communication unit 109 is a communication interface that performs wireless communication using a wireless LAN such as IEEE802.11a / b / n.
- the microphone 110 is for inputting external sound, and the speaker 112 is for outputting sound to the outside.
- the input / output sound is processed by the sound processing unit 111.
- the touch panel 118 includes an operation input unit 113 and a display unit 114.
- the display unit 114 is, for example, an LCD that displays images and video, and has an operation input unit 113, such as a touch pad, on its display surface.
- the operation input unit 113 is, for example, a capacitive touch pad, and detects a touch operation (hereinafter referred to as touch) with a finger or a touch pen as an operation input.
- for example, a menu of commands (functions) is displayed on the display unit 114; when the user touches a desired command, the touch position is detected and the command displayed at that position is accepted.
- while images are displayed on the display unit 114, touch operations such as taps, flicks, and pinch-out/pinch-in are also accepted.
- the touch or touch operation received by the operation input unit 113 is input to the CPU 102 and processed.
- the video input unit 116 is, for example, a camera.
- the video displayed on the display unit 114 and the video input from the video input unit 116 are processed by the image processing unit 115.
- the input / output I / F 117 is, for example, a USB (Universal Serial Bus) or the like, and is an interface that transmits / receives data to / from an external device (not shown).
- one mode accepts the pinch-out/pinch-in operation with two fingers as in the conventional information processing apparatus 100′; switching to the processing mode of this embodiment is performed in advance by a command operation or the like.
- a tap refers to an operation of briefly striking one point on the touch panel, that is, a case where the touch start position and the touch end position are substantially the same.
- the information processing apparatus 100 is held with one hand 200 (here, the right hand) and tapped with the finger 201 (here, the thumb) of the hand 200.
- the operation input unit 113 detects the coordinates of the point A, stores it in the memory 103, and displays the mark 160 superimposed on the original image at the position of the point A (FIG. 2B).
- hereinafter, the mark 160 is referred to as the pinch mark, and the point A as the pinch position.
- as shown in FIG. 3A, the finger 201 then touches a position facing point A across the portion to be enlarged/reduced.
- the finger holding the information processing apparatus 100 from here is not shown for simplicity.
- let the start position of this touch be point B; the operation input unit 113 detects the coordinates of point B and stores them in the memory 103.
- at this time, the distance AB between the pinch position (point A) and the touch start position (point B), and the midpoint (point X) between point A and point B on the display image, are calculated and stored.
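The setup step just described, precomputing the distance AB and the midpoint X once the second touch lands, can be sketched as follows. The helper name is hypothetical; coordinates are assumed to be (x, y) tuples:

```python
import math

def pinch_setup(point_a, point_b):
    """Precompute the quantities used while the finger slides:
    the distance AB and the midpoint X between the pinch position
    (point A) and the touch start position (point B)."""
    dist_ab = math.hypot(point_b[0] - point_a[0], point_b[1] - point_a[1])
    mid_x = ((point_a[0] + point_b[0]) / 2, (point_a[1] + point_b[1]) / 2)
    return dist_ab, mid_x
```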
- sliding refers to moving the touch position while maintaining the touched state.
- the position where the finger 201 is touching is set as point C; the operation input unit 113 continuously detects the coordinates of point C and stores them in the memory 103. Since FIG. 3A shows the state at the start of the touch, the touch start position (point B) and the touch position (point C) are the same position.
- FIG. 3B shows the finger 201 slid in a direction away from the pinch position (point A). This is determined by calculating the distance AC between the pinch position (point A) and the touch position (point C) and comparing it with the distance AB: since distance AC > distance AB, it can be determined that the finger is sliding away from the pinch position (point A).
- the display image is scrolled so that the midpoint (point X) on the display image comes approximately to the center between the pinch position (point A) and the touch position (point C), and image processing is performed so that the image is enlarged centered on point X.
- the distance AC and the enlargement ratio are linked. That is, the enlargement ratio increases as the finger 201 moves away from the pinch position (point A).
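One way to link the distance AC to the zoom ratio, as described above, is to use the ratio AC/AB. The ratio form is an assumption for illustration; the text only states that the enlargement ratio grows as the finger moves away from point A:

```python
import math

def zoom_factor(point_a, point_b, point_c):
    """Zoom factor from the single-finger slide.

    distance AC > distance AB -> factor > 1 (slide away from A, enlarge)
    distance AC < distance AB -> factor < 1 (slide toward A, reduce)
    """
    ab = math.hypot(point_b[0] - point_a[0], point_b[1] - point_a[1])
    ac = math.hypot(point_c[0] - point_a[0], point_c[1] - point_a[1])
    return ac / ab
```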
- in this way, the user can easily set the display image to a desired size with a one-handed operation.
- the finger 201 may be released from the touch panel 118.
- when the finger is released, the display image is fixed at the enlargement ratio at that time, as shown in the corresponding figure.
- likewise, for a reduction, the display image is fixed at the reduction ratio at that time.
- the pinch mark 160 is erased and the process is terminated.
- the present invention is not limited to this.
- alternatively, the pinch mark 160 may remain displayed for a predetermined time (for example, 5 seconds), during which the finger 201 may touch and slide again to enable another pinch-out/pinch-in operation.
- however, the present invention is not limited thereto.
- for example, when the finger is slid toward the pinch position, the state of FIG. 3D is obtained, and when the finger is released there, the resulting display image is a reduced image compared with the initial image.
- the finger 201 may be slid in a direction away from the pinch position (point A) without releasing the finger 201.
- while the finger remains touched, the touch position (point C) is continuously detected, the distance AC is calculated and compared with the distance AB to determine enlargement or reduction, the enlargement/reduction ratio is determined, and the image is displayed at that ratio centered on the coordinates (point X) on the display image.
- the display image is scrolled so that the coordinates (point X) on the display image come approximately to the center between the pinch position (point A) and the touch position (point C).
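The scroll described above, keeping image point X at roughly the center between A and C, can be sketched as a screen-offset computation. This is illustrative and assumes a rendering model where the image is drawn at `offset + scale * image_coordinate`:

```python
def scroll_offset(point_a, point_c, image_point_x, scale):
    """Screen offset that places image coordinate X, after scaling,
    at the midpoint between pinch mark A and current touch C."""
    target_x = (point_a[0] + point_c[0]) / 2
    target_y = (point_a[1] + point_c[1]) / 2
    return (target_x - image_point_x[0] * scale,
            target_y - image_point_x[1] * scale)
```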
- the processing of this embodiment may function only for display images that can be enlarged/reduced, such as maps and photographs; for other images, the processing of this embodiment is not performed and normal tap processing (selection or the like) is performed.
- a mark indicating that enlargement / reduction display is possible may be displayed.
- the enlargement/reduction may instead be centered on the pinch position (point A); in that case, enlargement/reduction display is performed while keeping the point on the display image at the pinch position (point A) matched to point A.
- FIG. 4 is an explanatory diagram of the process of erasing the pinch mark 160. This process is used when the touch panel 118 has been tapped by mistake and the pinch mark 160 has been displayed. As shown in FIG. 4A, the pinch mark 160 is tapped. The touch start position (point B) is detected, and if its distance from the pinch position (point A) is within a predetermined value (for example, 1 mm), it is determined that the pinch position (point A) has been tapped; as shown in FIG. 4B, the display image does not change and the pinch mark 160 is erased. The above description erases the pinch mark 160 when the pinch mark 160 is tapped.
- alternatively, if no operation is performed within a predetermined time (for example, 5 seconds) while the pinch mark 160 is displayed, the pinch mark 160 may be erased automatically.
- a mechanical operation button (not shown), such as a push button or a slide button, may also be provided on the information processing apparatus 100, and the pinch mark 160 may be erased when that button is operated.
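The two erase conditions just described, a tap within the predetermined distance of the mark or the predetermined idle time elapsing, can be combined as follows. The helper name is hypothetical, and treating the 1 mm threshold directly as a coordinate distance assumes screen coordinates expressed in millimeters:

```python
import math

TAP_RADIUS_MM = 1.0   # "predetermined value" example from the description
TIMEOUT_S = 5.0       # "predetermined time" example from the description

def should_erase_mark(pinch_pos, tap_pos, idle_seconds):
    """Erase the pinch mark if it was tapped (tap start within 1 mm of
    the mark) or if no operation occurred for 5 seconds.

    tap_pos may be None when no tap has occurred."""
    tapped_on_mark = (
        tap_pos is not None
        and math.hypot(tap_pos[0] - pinch_pos[0],
                       tap_pos[1] - pinch_pos[1]) <= TAP_RADIUS_MM
    )
    timed_out = idle_seconds >= TIMEOUT_S
    return tapped_on_mark or timed_out
```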
- FIG. 5 is an explanatory diagram of a process for changing the position of the pinch mark 160.
- the pinch mark 160 is touched.
- as in the erasing process, the touch start position (point B) is detected, and if its distance from the pinch position (point A) is at most a predetermined value, it is determined that the pinch position (point A) has been touched.
- in this state, the finger 201 is slid as shown in FIG. 5B.
- the touch position (point C) at this time is detected, and the pinch mark 160 is moved to the touch position (point C).
- when the finger 201 is slid to the desired location and released from the touch panel 118, the position (point C) at release becomes the new pinch position (point A), as shown in FIG. 5C.
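The touch-and-slide relocation of the pinch mark can be sketched as a small state machine. The class and its grab radius are illustrative, with the radius standing in for the "predetermined value" of the text:

```python
import math

class PinchMark:
    """Pinch mark that can be grabbed and dragged to a new position."""

    def __init__(self, pos, grab_radius=1.0):
        self.pos = pos                  # current pinch position (point A)
        self.grab_radius = grab_radius  # max distance counting as "on the mark"
        self.grabbed = False

    def touch_start(self, p):
        # A touch starting within the radius counts as touching the mark.
        if math.hypot(p[0] - self.pos[0], p[1] - self.pos[1]) <= self.grab_radius:
            self.grabbed = True

    def touch_move(self, p):
        if self.grabbed:
            self.pos = p  # the mark follows the finger (point C)

    def touch_end(self, p):
        if self.grabbed:
            self.pos = p  # release position becomes the new pinch position
            self.grabbed = False
```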
- the pinch mark 160 can be displayed at the tapped position simply by tapping the touch panel 118.
- a pinch out / pinch in operation can be performed in conjunction with the movement of the finger.
- the display image can be easily enlarged / reduced.
- when the pinch mark 160 is tapped, it is erased; when it is touched and slid, its position can be changed.
- thus, the pinch mark 160 can be easily erased and moved with one hand.
- the information processing apparatus 100 is held with the right hand 200 and the operation is performed with the thumb 201 of the right hand 200.
- the present invention is not limited to this.
- the holding and operation may be performed with the left hand 210, or the operation may be performed with another finger (for example, an index finger).
- instead of switching between the conventional two-finger pinch-out/pinch-in mode and the mode of this embodiment, a tap with a short touch time on the touch panel 118 may be accepted as a normal tap operation, while a long press may trigger the processing of this embodiment, with the long-pressed position set as the pinch position.
- that is, there is no need to switch modes: the processing of this embodiment is performed whenever a long press operation is detected.
- the present embodiment is not limited to the long press operation.
- the processing of this embodiment may also be triggered by a double-tap operation, in which a tap operation is performed twice within a predetermined time (for example, 1 second). In this case, if the first tap position and the second tap position are separated by more than a predetermined distance (for example, 1 mm), setting of the pinch position may be rejected.
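The trigger variants above (short tap stays a normal tap; long press or a close-enough double tap sets the pinch position) can be sketched as a classifier. The threshold values reuse the examples from the text, except the long-press time, which the text leaves as "a predetermined time"; 0.5 s here is purely an assumption:

```python
LONG_PRESS_S = 0.5    # assumed; the text only says "a predetermined time"
DOUBLE_TAP_S = 1.0    # example value from the text
DOUBLE_TAP_MM = 1.0   # example value from the text

def classify(press_duration, gap_to_previous_tap, dist_to_previous_tap):
    """Decide whether a touch sets the pinch position or is a normal tap.

    gap_to_previous_tap is None when there was no recent previous tap."""
    if press_duration >= LONG_PRESS_S:
        return "set_pinch_position"  # long press at this position
    if (gap_to_previous_tap is not None
            and gap_to_previous_tap <= DOUBLE_TAP_S
            and dist_to_previous_tap <= DOUBLE_TAP_MM):
        return "set_pinch_position"  # double tap at (nearly) the same spot
    return "normal_tap"
```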
- the invention is not limited to a portable information processing apparatus; it may also be applied to a tablet, a notebook computer, a desktop personal computer, or the like.
- the present invention is not limited to the above-described embodiments; various modifications are included.
- the above-described embodiments have been described in detail for easy understanding of the present invention, and the invention is not necessarily limited to those having all the described configurations.
- Each of the above-described configurations, functions, processing units, processing means, and the like may be realized by hardware by designing a part or all of them with, for example, an integrated circuit.
- Each of the above-described configurations, functions, and the like may be realized by software by interpreting and executing a program that realizes each function by the processor.
- control lines and information lines indicate those considered necessary for the explanation; not all control lines and information lines in a product are necessarily shown. In practice, almost all components may be considered to be interconnected.
Landscapes
- Engineering & Computer Science (AREA)
- General Engineering & Computer Science (AREA)
- Theoretical Computer Science (AREA)
- Human Computer Interaction (AREA)
- Physics & Mathematics (AREA)
- General Physics & Mathematics (AREA)
- User Interface Of Digital Computer (AREA)
- Telephone Function (AREA)
Abstract
Description
The present application includes a plurality of means for solving the above problem. As one example, an information processing apparatus includes a display unit that displays an image, an operation input unit that inputs a user operation, and a control unit. When the control unit detects that a user's finger has touched the operation input unit, it displays a mark at the detected position. While the mark is displayed, when the control unit detects that the user's finger has touched a position different from the mark, it enlarges the display image of the display unit if the touch position changes in a direction away from the mark, and reduces the display image if the touch position changes in a direction approaching the mark.
The operation input unit 113 detects the coordinates of point A, stores them in the memory 103, and displays a mark 160 superimposed on the original image at the position of point A (FIG. 2(b)). In the following description, the mark 160 is referred to as the pinch mark, and point A as the pinch position.
As shown in FIG. 3(a), the finger 201 touches a position facing point A across the portion to be enlarged/reduced. The finger holding the information processing apparatus 100 is omitted from the figures hereafter for simplicity. Let the start position of this touch be point B. The operation input unit 113 detects the coordinates of point B and stores them in the memory 103. At this time, the distance AB between the pinch position (point A) and the touch start position (point B), and the midpoint (point X) between them on the display image, are calculated.
FIG. 4 is an explanatory diagram of the process of erasing the pinch mark 160. This process is used when the touch panel 118 has been tapped by mistake and the pinch mark 160 has been displayed. As shown in FIG. 4(a), the pinch mark 160 is tapped. The touch start position (point B) is detected, and if its distance from the pinch position (point A) is within a predetermined value (for example, 1 mm), it is determined that the pinch position (point A) has been tapped; as shown in FIG. 4(b), the display image does not change and the pinch mark 160 is erased. Although the above description erases the pinch mark 160 when it is tapped, the pinch mark 160 may also be erased automatically if no operation is performed within a predetermined time (for example, 5 seconds) while it is displayed. Alternatively, a mechanical operation button (not shown), such as a push button or a slide button, may be provided on the information processing apparatus 100, and the pinch mark 160 may be erased when that button is operated.
FIG. 5 is an explanatory diagram of the process of changing the position of the pinch mark 160. As shown in FIG. 5(a), the pinch mark 160 is touched. As in the erasing process, the touch start position (point B) is detected, and if its distance from the pinch position (point A) is at most the predetermined value, it is determined that the pinch position (point A) has been touched. In this state, the finger 201 is slid as shown in FIG. 5(b). The touch position (point C) at this time is detected, and the pinch mark 160 is moved to the touch position (point C). When the finger 201 is slid to the desired location and released from the touch panel 118, the position (point C) at release becomes the new pinch position (point A), as shown in FIG. 5(c).
Each of the above configurations, functions, processing units, processing means, and the like may be realized in hardware, for example by designing some or all of them as an integrated circuit. Each of the above configurations, functions, and the like may also be realized in software by a processor interpreting and executing programs that realize the respective functions. Information such as programs, tables, and files that realize each function can be placed in a memory such as a flash memory, or in storage such as a memory card.
Control lines and information lines indicate those considered necessary for the explanation; not all control lines and information lines in a product are necessarily shown. In practice, almost all components may be considered to be interconnected.
113 Operation input unit
114 Display unit
118 Touch panel
160 Pinch mark
201 Finger
Claims (10)
- An information processing apparatus comprising:
a display unit that displays an image;
an operation input unit that inputs a user operation; and
a control unit,
wherein, upon detecting that a user's finger has touched the operation input unit, the control unit displays a mark at the detected position, and,
while the mark is displayed, upon detecting that the user's finger has touched a position different from the position of the mark,
the control unit performs control to enlarge the display image of the display unit when the touch position changes in a direction away from the position of the mark, and to reduce the display image of the display unit when the touch position changes in a direction approaching the position of the mark.
- The information processing apparatus according to claim 1, wherein
the control unit performs control to change the enlargement ratio and the reduction ratio of the display image of the display unit according to the amount of change in the touch position.
- The information processing apparatus according to claim 1 or claim 2, wherein
the control unit performs control to erase the mark upon detecting that the position of the mark or its vicinity has been touched.
- The information processing apparatus according to any one of claims 1 to 3, wherein
the control unit, upon detecting that the position of the mark or its vicinity has been touched and further detecting that the touch position has changed, performs control to move the mark in accordance with the change in the touch position.
- The information processing apparatus according to claim 4, wherein
the control unit performs control to scroll the display image of the display unit according to the change in the touch position.
- The information processing apparatus according to any one of claims 1 to 5, wherein
the touch for displaying the mark is a tap operation.
- The information processing apparatus according to any one of claims 1 to 5, wherein
the touch for displaying the mark is an operation of touching substantially the same position for a predetermined time or longer.
- The information processing apparatus according to any one of claims 1 to 5, wherein
the touch for displaying the mark is an operation of touching substantially the same position twice within a predetermined time.
- An information processing method in an information processing apparatus including a display unit that displays an image and an operation input unit that inputs a user operation, the method comprising:
a step of displaying a mark at a detected position upon detecting that a user's finger has touched the operation input unit; and
a step of, while the mark is displayed, upon detecting that the user's finger has touched a position different from the position of the mark, enlarging the display image when the touch position changes in a direction away from the position of the mark, and reducing the display image when the touch position changes in a direction approaching the position of the mark.
- A program for causing an information processing apparatus to execute:
a step of displaying a mark at a detected position upon detecting that a user's finger has touched an operation input unit of the information processing apparatus; and
a step of, while the mark is displayed, upon detecting that the user's finger has touched a position different from the position of the mark, enlarging a display image when the touch position changes in a direction away from the position of the mark, and reducing the display image when the touch position changes in a direction approaching the position of the mark.
Priority Applications (4)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
PCT/JP2013/050508 WO2014112029A1 (ja) | 2013-01-15 | 2013-01-15 | 情報処理装置、情報処理方法、及び、プログラム |
US14/651,244 US20150301635A1 (en) | 2013-01-15 | 2013-01-15 | Information processing device, information processing method, and program |
JP2014557205A JPWO2014112029A1 (ja) | 2013-01-15 | 2013-01-15 | 情報処理装置、情報処理方法、及び、プログラム |
CN201380065000.3A CN104838347A (zh) | 2013-01-15 | 2013-01-15 | 信息处理装置、信息处理方法、以及程序 |
Applications Claiming Priority (1)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
PCT/JP2013/050508 WO2014112029A1 (ja) | 2013-01-15 | 2013-01-15 | 情報処理装置、情報処理方法、及び、プログラム |
Publications (1)
Publication Number | Publication Date |
---|---|
WO2014112029A1 true WO2014112029A1 (ja) | 2014-07-24 |
Family
ID=51209155
Family Applications (1)
Application Number | Title | Priority Date | Filing Date |
---|---|---|---|
PCT/JP2013/050508 WO2014112029A1 (ja) | 2013-01-15 | 2013-01-15 | 情報処理装置、情報処理方法、及び、プログラム |
Country Status (4)
Country | Link |
---|---|
US (1) | US20150301635A1 (ja) |
JP (1) | JPWO2014112029A1 (ja) |
CN (1) | CN104838347A (ja) |
WO (1) | WO2014112029A1 (ja) |
Cited By (3)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
JP2014182420A (ja) * | 2013-03-18 | 2014-09-29 | Casio Comput Co Ltd | 情報処理装置及びプログラム |
JP2016024580A (ja) * | 2014-07-18 | 2016-02-08 | 富士通株式会社 | 情報処理装置、入力制御方法、および入力制御プログラム |
JP2016224688A (ja) * | 2015-05-29 | 2016-12-28 | シャープ株式会社 | 情報処理装置、制御方法、制御プログラム、および記録媒体 |
Families Citing this family (5)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
WO2016018062A1 (en) * | 2014-07-31 | 2016-02-04 | Samsung Electronics Co., Ltd. | Method and device for providing content |
JP6978826B2 (ja) * | 2016-01-08 | 2021-12-08 | キヤノン株式会社 | 表示制御装置及びその制御方法、プログラム、並びに記憶媒体 |
JP6786833B2 (ja) | 2016-03-22 | 2020-11-18 | 富士ゼロックス株式会社 | 情報処理装置 |
JP7022899B2 (ja) * | 2016-12-27 | 2022-02-21 | パナソニックIpマネジメント株式会社 | 電子機器、入力制御方法、及びプログラム |
JP6962041B2 (ja) * | 2017-07-13 | 2021-11-05 | コニカミノルタ株式会社 | 画像処理装置、画像表示方法、およびコンピュータプログラム |
Citations (2)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
JP2010067178A (ja) * | 2008-09-12 | 2010-03-25 | Leading Edge Design:Kk | 複数点入力可能な入力装置及び複数点入力による入力方法 |
JP2011034451A (ja) * | 2009-08-04 | 2011-02-17 | Fujitsu Component Ltd | タッチパネル装置及び方法並びにプログラム及び記録媒体 |
Family Cites Families (12)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
JP4803883B2 (ja) * | 2000-01-31 | 2011-10-26 | キヤノン株式会社 | 位置情報処理装置及びその方法及びそのプログラム。 |
JP5259898B2 (ja) * | 2001-04-13 | 2013-08-07 | 富士通テン株式会社 | 表示装置、及び表示処理方法 |
JP4067374B2 (ja) * | 2002-10-01 | 2008-03-26 | 富士通テン株式会社 | 画像処理装置 |
JP5072194B2 (ja) * | 2004-05-14 | 2012-11-14 | キヤノン株式会社 | 情報処理装置および情報処理方法ならびに記憶媒体、プログラム |
DE202005021427U1 (de) * | 2004-07-30 | 2008-02-14 | Apple Inc., Cupertino | Elektronische Vorrichtung mit berührungsempfindlicher Eingabeeinrichtung |
JP5092255B2 (ja) * | 2006-03-09 | 2012-12-05 | カシオ計算機株式会社 | 表示装置 |
JP2009140368A (ja) * | 2007-12-07 | 2009-06-25 | Sony Corp | 入力装置、表示装置、入力方法、表示方法及びプログラム |
JP2009176114A (ja) * | 2008-01-25 | 2009-08-06 | Mitsubishi Electric Corp | タッチパネル装置及びユーザインタフェース装置 |
JP5185150B2 (ja) * | 2009-02-04 | 2013-04-17 | 富士フイルム株式会社 | 携帯機器および操作制御方法 |
CN102369501A (zh) * | 2009-02-23 | 2012-03-07 | 胜利电子株式会社 | 触摸屏控制方法及触摸屏装置 |
JP5812576B2 (ja) * | 2010-04-16 | 2015-11-17 | ソニー株式会社 | 情報処理装置及びそのプログラム |
JP2012185647A (ja) * | 2011-03-04 | 2012-09-27 | Sony Corp | 表示制御装置、表示制御方法、およびプログラム |
-
2013
- 2013-01-15 WO PCT/JP2013/050508 patent/WO2014112029A1/ja active Application Filing
- 2013-01-15 JP JP2014557205A patent/JPWO2014112029A1/ja active Pending
- 2013-01-15 CN CN201380065000.3A patent/CN104838347A/zh active Pending
- 2013-01-15 US US14/651,244 patent/US20150301635A1/en not_active Abandoned
Patent Citations (2)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
JP2010067178A (ja) * | 2008-09-12 | 2010-03-25 | Leading Edge Design:Kk | 複数点入力可能な入力装置及び複数点入力による入力方法 |
JP2011034451A (ja) * | 2009-08-04 | 2011-02-17 | Fujitsu Component Ltd | タッチパネル装置及び方法並びにプログラム及び記録媒体 |
Cited By (3)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
JP2014182420A (ja) * | 2013-03-18 | 2014-09-29 | Casio Comput Co Ltd | 情報処理装置及びプログラム |
JP2016024580A (ja) * | 2014-07-18 | 2016-02-08 | 富士通株式会社 | 情報処理装置、入力制御方法、および入力制御プログラム |
JP2016224688A (ja) * | 2015-05-29 | 2016-12-28 | シャープ株式会社 | 情報処理装置、制御方法、制御プログラム、および記録媒体 |
Also Published As
Publication number | Publication date |
---|---|
JPWO2014112029A1 (ja) | 2017-01-19 |
US20150301635A1 (en) | 2015-10-22 |
CN104838347A (zh) | 2015-08-12 |
Similar Documents
Publication | Publication Date | Title |
---|---|---|
WO2014112029A1 (ja) | 情報処理装置、情報処理方法、及び、プログラム | |
KR102097496B1 (ko) | 폴더블 이동 단말기 및 그 제어 방법 | |
US11816330B2 (en) | Display device, display controlling method, and computer program | |
CN108958685B (zh) | 连接移动终端和外部显示器的方法和实现该方法的装置 | |
KR101836381B1 (ko) | 터치스크린 단말기에서 화면 디스플레이 제어 방법 및 장치 | |
US9851898B2 (en) | Method for changing display range and electronic device thereof | |
EP2735960A2 (en) | Electronic device and page navigation method | |
US20150185953A1 (en) | Optimization operation method and apparatus for terminal interface | |
US20150022468A1 (en) | Method for processing input and electronic device thereof | |
US9223406B2 (en) | Screen display control method of electronic device and apparatus therefor | |
WO2017113379A1 (zh) | 一种用户界面的菜单显示方法及手持终端 | |
US9400599B2 (en) | Method for changing object position and electronic device thereof | |
WO2014024363A1 (ja) | 表示制御装置、表示制御方法及びプログラム | |
JP2015005173A (ja) | タッチ・スクリーンを備える携帯式情報端末および入力方法 | |
JP2011186734A (ja) | 表示装置及び画面表示方法 | |
JP6102474B2 (ja) | 表示装置、入力制御方法、及び入力制御プログラム | |
JP2014149819A (ja) | 電子装置のスクロール装置及びその方法 | |
WO2013161170A1 (ja) | 入力装置、入力支援方法及びプログラム | |
KR102095039B1 (ko) | 터치 인터페이스를 제공하는 장치에서 터치 입력을 수신하는 방법 및 장치 | |
WO2017022031A1 (ja) | 情報端末装置 | |
KR20140082434A (ko) | 전자장치에서 화면 표시 방법 및 장치 | |
US20130181919A1 (en) | Electronic device and method for controlling the same | |
KR20110066545A (ko) | 터치스크린을 이용하여 이미지를 표시하기 위한 방법 및 단말 | |
KR102027548B1 (ko) | 전자장치에서 화면표시 제어 방법 및 장치 | |
WO2023210352A1 (ja) | 情報処理装置、情報処理方法、及びプログラム |
Legal Events
Date | Code | Title | Description |
---|---|---|---|
121 | Ep: the epo has been informed by wipo that ep was designated in this application |
Ref document number: 13871813 Country of ref document: EP Kind code of ref document: A1 |
|
ENP | Entry into the national phase |
Ref document number: 2014557205 Country of ref document: JP Kind code of ref document: A |
|
WWE | Wipo information: entry into national phase |
Ref document number: 14651244 Country of ref document: US |
|
NENP | Non-entry into the national phase |
Ref country code: DE |
|
122 | Ep: pct application non-entry in european phase |
Ref document number: 13871813 Country of ref document: EP Kind code of ref document: A1 |