US9423950B2 - Display control apparatus and control method - Google Patents

Display control apparatus and control method

Info

Publication number
US9423950B2
US9423950B2
Authority
US
United States
Prior art keywords
touch
touch position
control apparatus
display control
acceleration
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Active, expires
Application number
US12/849,591
Other languages
English (en)
Other versions
US20110032201A1 (en)
Inventor
Yasutaka Naka
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Canon Inc
Original Assignee
Canon Inc
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Canon Inc filed Critical Canon Inc
Assigned to CANON KABUSHIKI KAISHA reassignment CANON KABUSHIKI KAISHA ASSIGNMENT OF ASSIGNORS INTEREST (SEE DOCUMENT FOR DETAILS). Assignors: NAKA, YASUTAKA
Publication of US20110032201A1 publication Critical patent/US20110032201A1/en
Application granted granted Critical
Publication of US9423950B2 publication Critical patent/US9423950B2/en

Classifications

    • G: PHYSICS
    • G06: COMPUTING; CALCULATING OR COUNTING
    • G06F: ELECTRIC DIGITAL DATA PROCESSING
    • G06F 3/00: Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
    • G06F 3/01: Input arrangements or combined input and output arrangements for interaction between user and computer
    • G06F 3/048: Interaction techniques based on graphical user interfaces [GUI]
    • G06F 3/0487: Interaction techniques based on graphical user interfaces [GUI] using specific features provided by the input device, e.g. functions controlled by the rotation of a mouse with dual sensing arrangements, or of the nature of the input device, e.g. tap gestures based on pressure sensed by a digitiser
    • G06F 3/0488: Interaction techniques based on graphical user interfaces [GUI] using specific features provided by the input device, e.g. functions controlled by the rotation of a mouse with dual sensing arrangements, or of the nature of the input device, e.g. tap gestures based on pressure sensed by a digitiser, using a touch-screen or digitiser, e.g. input of commands through traced gestures
    • G: PHYSICS
    • G06: COMPUTING; CALCULATING OR COUNTING
    • G06F: ELECTRIC DIGITAL DATA PROCESSING
    • G06F 3/00: Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
    • G06F 3/01: Input arrangements or combined input and output arrangements for interaction between user and computer
    • G06F 3/03: Arrangements for converting the position or the displacement of a member into a coded form
    • G06F 3/041: Digitisers, e.g. for touch screens or touch pads, characterised by the transducing means
    • G06F 3/0416: Control or interface arrangements specially adapted for digitisers

Definitions

  • the present invention relates to a display control apparatus that allows a user to perform a plurality of input operations via a touch panel, a control method therefor, a program, and a computer-readable storage medium.
  • Japanese Patent No. 3,181,181 discusses an operation determination method that allows a plurality of pen input operations without changing an input mode with respect to one application.
  • handwritten input is one of the characteristic input methods of a touch panel. When control of an apparatus is performed based on such a trace of touch positions, the sampling method for the touch position may be important.
  • Japanese Patent Application Laid-Open No. 2000-010721 discusses a technique for balancing system load against handwritten input operability by changing the sampling cycle to an optimal value according to the changing speed of the touch position.
  • Japanese Patent Application Laid-Open No. 63-174125 discusses a technique in which a system control device scrolls an image displayed on a display unit by tracking a movement of a finger on a touch panel, gradually decreases the scrolling speed after the finger is detached from the touch panel, and stops the scrolling.
  • in such a technique, however, the system control device cannot distinguish a plurality of input operations, and the operability of the panel may deteriorate. For example, although a flick operation, in which the operator quickly moves a finger while touching, is performed, the system control device may incorrectly determine the operation to be a tap, in which the user touches one point and quickly releases without moving. Accordingly, high-speed coordinate sampling is necessary to correctly distinguish a plurality of input operations.
  • the present invention relates to a display control apparatus and its control method, which allow a user to perform a plurality of input operations via a touch panel, and which are capable of increasing application performance and decreasing power consumption.
  • a display control apparatus includes a display unit, a touch condition detection unit, a touch position acquisition unit, and control unit.
  • the display unit is configured to include a touch panel.
  • the touch condition detection unit is configured to detect whether the touch panel is in a touched condition.
  • the touch position acquisition unit is configured to acquire a touch position on the touch panel.
  • the control unit is configured to control to acquire a touch position in a first cycle by the touch position acquisition unit until a predetermined time elapses from detection of a touch by the touch condition detection unit, and to acquire a touch position by the touch position acquisition unit in a second cycle which is longer than the first cycle, after the predetermined time elapses.
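The control unit's sampling policy above can be sketched in code. The following is an illustrative Python reconstruction, not code from the patent; the class name, method names, and the concrete values of t1, t2, and T are assumptions.

```python
class TouchSamplingController:
    """Sketch of the claimed policy: acquire touch positions in a short
    first cycle t1 until a predetermined time T elapses from touch
    detection, then in a longer second cycle t2."""

    def __init__(self, t1=0.005, t2=0.040, T=0.200):
        self.t1 = t1              # first (short, fast) cycle, in seconds
        self.t2 = t2              # second (long, slow) cycle, in seconds
        self.T = T                # predetermined time after touch detection
        self.touch_start = None   # None while untouched

    def on_touch(self, now):
        """Called when the touch condition detection unit turns ON."""
        self.touch_start = now

    def on_untouch(self):
        """Called when the touch condition detection unit turns OFF."""
        self.touch_start = None

    def sampling_cycle(self, now):
        """Cycle the touch position acquisition unit should use now."""
        if self.touch_start is None:
            return None           # untouched: no coordinate sampling
        if now - self.touch_start < self.T:
            return self.t1        # within T of touch start: sample fast
        return self.t2            # after T elapses: sample slowly
```

Because the fast cycle runs only during the short window in which a flick must be distinguished, the panel can be polled slowly the rest of the time, which is the claimed source of the load and power savings.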
  • FIG. 1 is a block diagram illustrating an example configuration of a digital camera according to an exemplary embodiment.
  • FIG. 2 illustrates a screen change according to the exemplary embodiment.
  • FIG. 3 is a timing chart of coordinate sampling processing according to the exemplary embodiment.
  • FIG. 4 is a flowchart illustrating coordinate sample processing according to the exemplary embodiment.
  • FIG. 5 is a flowchart illustrating processing corresponding to an input operation to a touch panel according to the exemplary embodiment.
  • FIG. 1 illustrates an example configuration of a digital camera 100 as one example of the display control apparatus to which the present invention is applicable.
  • a system control circuit 1 realizes processing described below by executing a program stored in a non-volatile memory 20 built-in or attachable/detachable to/from the digital camera 100 .
  • the system control circuit 1 can also execute a program located on a network via a network interface (I/F) 21, and thus the present invention can be applied to a program on a network.
  • a power source switch 2 turns the power source of the digital camera 100 ON and OFF.
  • a mode switch 3 switches between a shooting mode, in which image data is processed and stored in an image recording unit 10, and a reproduction mode, in which image data stored in the image recording unit 10 is processed and displayed on a display unit 6.
  • a release switch 4 instructs the digital camera 100 to record imaging data in the image recording unit 10.
  • An imaging unit 5 performs analog-to-digital (AD) conversion of an image signal obtained by forming an image on an image sensor, and outputs the result.
  • a display unit 6 includes a liquid crystal display (LCD), and displays image data written in a display memory 7 .
  • An image processing unit 8 performs compression/expansion processing and development processing on the image data captured by the imaging unit 5 and the image data recorded in the image recording unit 10 .
  • An image processing memory 9 is used as a work memory necessary for the image processing unit 8 to perform image processing.
  • the image recording unit 10 records captured image data.
  • a touch panel 11 is arranged superposed on the display unit 6 , and outputs touch input given by a user as an analog signal.
  • the touch panel 11 is a resistive film type touch panel in the present exemplary embodiment, but any of various other types of touch panels, such as a capacitive type or an optical type, can be used.
  • a touch condition detection unit 12 detects whether a user is touching the touch panel 11.
  • the touch condition detection unit 12 outputs “ON” in a condition in which a user is touching the touch panel 11 with a finger or a pen (hereinafter referred to as “touch”). Further, the touch condition detection unit 12 outputs “OFF” in a condition in which the user is not touching the touch panel 11 with a finger or a pen, that is, in a condition in which nothing is touching the touch panel 11 (hereinafter referred to as “untouch”).
  • An AD conversion processing unit 13 converts an analog signal output from the touch panel 11 to a digital signal.
  • a filtering processing unit 14 performs filtering processing, such as median/average processing, on the digital signal converted by the AD conversion processing unit 13.
  • a correction calculation processing unit 15 converts an output result from the filtering processing unit 14 to coordinates used by processing in the system control circuit 1 , and corrects a deviation amount of panel output due to aging.
  • a touch position acquisition unit for acquiring a touch position on the touch panel 11 is configured with the filtering processing unit 14 and correction calculation processing unit 15 .
  • a data acquisition timing control unit 16 notifies the correction calculation processing unit 15 of the coordinate acquisition timing, based on a coordinate sampling cycle (a coordinate acquisition cycle), such as t1 to t3 described below.
  • the correction calculation processing unit 15 instructs the filtering processing unit 14 to acquire coordinate data according to the data acquisition timing notified from the data acquisition timing control unit 16.
  • the filtering processing unit 14 causes the AD conversion processing unit 13 to start operating in response to the coordinate acquisition request from the correction calculation processing unit 15, to perform the number of AD conversions necessary for the filtering processing, and then to stop operating.
  • An operation distinction unit 17 distinguishes the kind of input operation performed by a user, based on the coordinate data corrected by the correction calculation processing unit 15.
  • a reproducing control unit 18 performs system control in the reproducing mode according to the output of the operation distinction unit 17 .
  • a shooting control unit 19 performs system control in the shooting mode according to the output of the operation distinction unit 17 .
  • the shooting mode of the digital camera 100 includes a plurality of modes, such as an automatic shooting mode, a manual shooting mode, and a scene shooting mode that is specialized to a specified scene. Further, many functions are provided for correcting shooting according to modes and user setups.
  • a through-display function which displays an image currently being captured by the imaging unit 5 on the display unit 6 in real time.
  • a face detection function which detects a human face from an image currently being captured by the imaging unit 5 .
  • a tracking function for tracking a specified object in an image currently being captured by the imaging unit 5 by acquiring a relationship between images.
  • a shooting setting function which performs shooting setting, such as auto focus (AF), auto exposure (AE), and auto white balance (AWB), according to a specified object in an image currently being captured by the imaging unit 5 .
  • FIG. 2A illustrates screen transitions on the display unit 6, which includes the touch panel 11, caused by a reproduction application of the digital camera 100.
  • when a user touches the screen and moves the finger over it, the reproduction application scrolls the image according to the sequentially input coordinates, as illustrated in a state 52 (hereinafter referred to as “a drag operation”).
  • a user may also touch the screen, move the finger over the screen, and then untouch.
  • in that case, the coordinates A and B are sampled (coordinate acquisition) in this order, and the reproduction application calculates an operation speed between A and B.
  • when the final speed is determined to be high, the reproduction application scrolls from the current image to the next image, slowing down after the untouch, and stops the scrolling when the next image is displayed at a predetermined position, as illustrated in a state 54 (hereinafter referred to as “a flick operation”).
  • when a user touches the screen and untouches at a position within a predetermined range from the first touched coordinates, the reproduction application enlarges the currently reproduced image centered on the touched coordinate position and displays it on the screen, as illustrated in a state 56 (hereinafter referred to as “a tap operation”).
  • when the tap operation is performed twice in quick succession, the reproduction application displays a plurality of images on the screen, as illustrated in a state 58 (hereinafter referred to as “a double tap operation”).
  • the system control circuit 1 performs the coordinate sampling processing in a coordinate sampling cycle t1 (a first cycle), which is short (fast), until a predetermined time (T) elapses from the start of a touch. After the predetermined time (T) elapses, the system control circuit 1 performs the sampling processing in a coordinate sampling cycle t2 (a second cycle), which is longer (slower) than t1. Further, even after the predetermined time (T) has elapsed, when the moving speed of the touching finger becomes high, the system control circuit 1 returns to the short coordinate sampling cycle t1.
  • in other words, the system control circuit 1 uses the high-speed coordinate sampling cycle immediately after the start of a touch, when a flick operation may be performed, or when a drag operation becomes quick, and uses the low-speed cycle in all other periods. Therefore, the system control circuit 1 can reliably distinguish operations with the minimum required coordinate sampling speed.
  • FIG. 3 is a timing chart of coordinate sampling processing.
  • the filtering processing unit 14 acquires coordinate data of the touch on the touch panel 11, AD-converted by the AD conversion processing unit 13, in a predetermined cycle (the filtering processing unit sampling timing).
  • the predetermined cycle in which the filtering processing unit 14 acquires the coordinate data is sufficiently faster than the coordinate sampling cycle (t1, t2, or t3) in which the correction calculation processing unit 15 acquires one set of coordinate data.
  • the correction calculation processing unit 15 sorts the six acquired coordinate data by size and averages the four data positioned at the center of the sorted data, so that one set of coordinate data is acquired.
  • a cycle for acquiring one coordinate data by the correction calculation processing unit 15 is t1 until the predetermined time T elapses from turning ON of the output of the touch condition detection unit 12 , and is t2, which is longer than t1, after the predetermined time (T) elapses (correction calculation processing unit calculation timing).
  • t1 and t2 are referred to as a coordinate sampling cycle. That is, the system control circuit 1 performs high-speed coordinate sampling in the coordinate sampling cycle t1 in a period of the time T after touch starting. After the time T elapses, the system control circuit 1 performs low-speed coordinate sampling in the coordinate sampling cycle t2.
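The filtering just described (sort six raw samples, average the four central ones) can be sketched as below. This is a minimal Python illustration assuming plain numeric samples per axis; the function names are ours, not the patent's.

```python
def filter_coordinate(samples):
    """One coordinate value from six raw AD-converted samples: sort by
    size and average the four central values, discarding the minimum
    and maximum as noise (the median/average filtering described above)."""
    if len(samples) != 6:
        raise ValueError("expected exactly 6 raw samples")
    ordered = sorted(samples)
    central = ordered[1:5]        # drop the smallest and largest sample
    return sum(central) / 4.0

def filter_point(x_samples, y_samples):
    """Apply the same filtering to each axis to get one (x, y) pair."""
    return filter_coordinate(x_samples), filter_coordinate(y_samples)
```

For example, a noisy burst `[99, 100, 101, 102, 103, 180]` filters to 101.5; the outlier 180 never enters the average, which is why this kind of filter suppresses spikes typical of resistive panels.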
  • FIG. 4 is a flowchart illustrating coordinate sampling processing in the correction calculation processing unit 15 of the system control circuit 1 .
  • In step S100, the system control circuit 1 initializes the number of AD-converted coordinate data to 0.
  • In steps S101 to S104, the system control circuit 1 repeats AD conversion of the coordinates until the number of coordinate data reaches a predetermined number (for example, 6) (NO in step S104), acquiring the predetermined number of coordinate data via the filtering processing unit 14.
  • the cycle for repeating the AD conversion is far shorter than the coordinate sampling cycle t1.
  • When the predetermined number is reached (YES in step S104), the processing proceeds to step S105.
  • In step S105, the system control circuit 1 sorts the six acquired coordinate data by size and averages the four data positioned at the center of the sorted data.
  • In step S106, the system control circuit 1 performs coordinate conversion processing on the pair of coordinate data acquired in step S105, and multiplies the result by a correction coefficient.
  • the system control circuit 1 thus acquires one set of coordinate data and completes the coordinate sampling processing.
  • the coordinate sampling processing described above includes all of the AD conversion result acquisition, the filtering processing, and the correction calculation processing. However, it is not necessary to include all of them.
  • the effect of the present invention can be sufficiently obtained by including only one of the filtering processing and the correction calculation processing.
  • the predetermined number of coordinate data acquired by the filtering processing unit 14 to obtain one set of coordinate data in the correction calculation processing unit 15 is determined to be “6”.
  • this number is the same, “6”, for each coordinate sampling cycle t1 to t3.
  • However, the number can be another fixed number, or a variable number according to the coordinate sampling cycle.
  • FIG. 5 is a flowchart illustrating processing corresponding to an input operation to the touch panel 11 by the system control circuit 1. The relationships among t1, t2, and t3 in the flowchart are assumed to be t1 < t2, t1 < t3, and t1 < T.
  • the system control circuit 1 starts the following processing when it detects that the output of the touch condition detection unit 12 turns ON.
  • In step S1, when the mode switch 3 is set to the shooting mode (REC in step S1), the processing proceeds to step S3.
  • In step S3, the system control circuit 1 sets t3 in the coordinate sampling cycle timer.
  • In step S4, the system control circuit 1 performs the coordinate sampling processing (refer to FIG. 4).
  • In step S5, the system control circuit 1 stores the acquired coordinates as initial coordinates, and waits until the output of the touch condition detection unit 12 turns OFF (untouch) (YES in step S6) or the coordinate sampling cycle timer t3 times out (YES in step S9).
  • In step S7, the system control circuit 1 determines whether the finally acquired coordinates are in a button area on a screen (graphical user interface (GUI)).
  • In step S8, the system control circuit 1 performs processing corresponding to the button (control processing corresponding to a tap operation). For example, when a user untouches in the area of an exposure correction value changing button 63 on a screen 60 in FIG. 2B, the system control circuit 1 shifts the screen 60 to a screen 61 for setting an exposure correction value applied to the shot image.
  • In step S10, the system control circuit 1 performs the coordinate sampling processing (refer to FIG. 4).
  • In step S13, the system control circuit 1 performs processing corresponding to the touch (control processing corresponding to the drag operation).
  • changing processing of an exposure correction value is described with reference to the screen examples in FIG. 2B.
  • a cursor indicating a correction amount moves according to the touch position.
  • thus the system control circuit 1 can provide a user with an operation means for easily changing an exposure correction amount without repeatedly touching and untouching.
  • the system control circuit 1 may execute through-display processing, which displays an image captured by the imaging unit 5 on the display unit 6 in real time. Further, the system control circuit 1 may execute face detection processing for detecting a human face in an image captured by the imaging unit 5.
  • the system control circuit 1 may execute tracking processing for tracking a specified object in an image captured by the imaging unit 5 by acquiring relative relationships between images. Further, the system control circuit 1 may execute shooting setting processing, which performs shooting settings focused on a specified object in an image captured by the imaging unit 5, according to a shooting preparation instruction.
  • the system control circuit 1 can also be configured so that the shooting settings are applied at all times, even when no shooting preparation instruction is given.
  • the system control circuit 1 executes shooting when the shooting is instructed by an operation of the release switch 4 or the touch panel 11 , and records the captured image file in the image recording unit 10 .
  • the processing load of the system control circuit 1 is very high in the shooting mode, so that if the system control circuit 1 performs coordinate sampling processing at high speed, other processing may be adversely affected. Further, as described above, a high-speed coordinate sampling cycle is not necessary here.
  • therefore, the system control circuit 1 uses the coordinate sampling cycle t3, which is slower than the coordinate sampling cycle t1 used in the reproduction mode. Thus, the processing load is restrained.
  • In step S1, when the mode switch 3 is set to the reproduction mode (PLAY in step S1), the processing proceeds to step S2.
  • In step S2, the system control circuit 1 turns OFF the double tap determination flag.
  • In steps S14 and S15, the system control circuit 1 sets the timer T and the coordinate sampling cycle timer t1.
  • In step S16, the system control circuit 1 performs the coordinate sampling processing (refer to FIG. 4).
  • In step S17, the system control circuit 1 stores the acquired coordinates as initial coordinates, and waits until the output of the touch condition detection unit 12 turns OFF (untouch) (YES in step S18) or the currently set coordinate sampling cycle timer t1 or t2 times out (YES in step S19).
  • In step S27, the system control circuit 1 stops the timer T.
  • In step S28, the system control circuit 1 performs a distance determination for determining whether the distance between the initial coordinates stored in step S17 and the final coordinates is longer than a predetermined distance.
  • In step S29, the system control circuit 1 acquires the final speed of the coordinate change (the speed of the touch position change before the untouch) for the speed determination.
  • In step S30, the system control circuit 1 determines that the operation is a flick operation, causes the image processing unit 8 to enlarge the next image, generates display data in the display memory 7, and performs image advancing processing (display processing corresponding to the flick operation), which displays the images while scrolling.
  • This operation corresponds to a change from the state 53 to the state 54 in FIG. 2A .
  • in a flick operation, the touch time on the touch panel 11 is short, so that the system control circuit 1 cannot acquire sufficient coordinate data for obtaining the final speed unless it performs coordinate sampling in a high-speed cycle.
  • In step S31, the system control circuit 1 determines whether the operation is a tap operation or a double tap operation. More specifically, the system control circuit 1 checks whether the double tap determination flag is ON or OFF.
  • In step S32, the system control circuit 1 performs multi-image reproduction processing (display processing corresponding to the double tap operation). This operation corresponds to the transition from the state 55 to the state 58 in FIG. 2A.
  • In step S33, the system control circuit 1 sets a single tap determination timer.
  • the system control circuit 1 then waits until the output of the touch condition detection unit 12 turns ON in step S34, or the single tap is confirmed in step S36.
  • When the output of the touch condition detection unit 12 turns ON in step S34 (YES in step S34), then in step S35 the system control circuit 1 sets the double tap determination flag to ON.
  • In step S37, the system control circuit 1 performs image enlargement processing (display processing corresponding to the tap operation), in which the image processing unit 8 enlarges the image centered on the acquired coordinates and displays it.
  • This operation corresponds to the transition from the state 55 to the state 56 in FIG. 2A.
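The untouch-time branching in FIG. 5 (steps S28 to S37) can be summarized as a small decision function. This is a sketch under assumed thresholds and units (pixels and pixels per second); the function names, threshold values, and return labels are illustrative, not from the patent.

```python
import math

def final_speed(coord_a, coord_b, dt):
    """Speed between the last two sampled coordinates A and B before the
    untouch, as used for the flick determination (dt is the sampling
    interval between them, in seconds)."""
    return math.hypot(coord_b[0] - coord_a[0],
                      coord_b[1] - coord_a[1]) / dt

def classify_untouch(initial, final, speed, double_tap_flag,
                     dist_threshold=10.0, speed_threshold=500.0):
    """Mirror of the flowchart's untouch branch: distance check (S28),
    then speed check (S29/S30), then tap vs. double tap via the
    determination flag (S31/S32) or a pending single tap (S33-S36)."""
    distance = math.hypot(final[0] - initial[0], final[1] - initial[1])
    if distance > dist_threshold:
        if speed > speed_threshold:
            return "flick"        # image advancing with scrolling (S30)
        return "drag_end"         # drag that simply ended
    if double_tap_flag:
        return "double_tap"       # multi-image reproduction (S32)
    return "tap_pending"          # start single tap determination timer (S33)
```

Note that `final_speed` only makes sense if the last two samples are close together in time, which is exactly why the patent keeps the fast cycle t1 active whenever a flick is still possible.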
  • In step S20, the system control circuit 1 determines whether the timer T has timed out.
  • In step S21, the system control circuit 1 sets the coordinate sampling cycle timer t1 and then, in step S25, performs the coordinate sampling processing (refer to FIG. 4).
  • In step S26, the system control circuit 1 performs image scroll processing. In this operation, as illustrated in the transition from the state 51 to the state 52 in FIG. 2A, the system control circuit 1 scrolls the image according to the movement of the coordinates the user touches.
  • for this scrolling, the system control circuit 1 needs to cause the image processing unit 8 to generate display data in which the display area of the image is shifted little by little, and to display them in sequence, so that the processing load on the system becomes high.
  • when that load is excessive, the system control circuit 1 cannot smoothly update the display.
  • however, the system control circuit 1 can sufficiently follow the user's operation with a coordinate sampling interval of several tens of milliseconds.
  • In step S22, the system control circuit 1 calculates the acceleration of the coordinate change based on the past sampled coordinates.
  • When the acceleration is high, in step S23 the system control circuit 1 sets the timer T for performing high-speed sampling, and the processing proceeds to step S21.
  • Otherwise, in step S24, the system control circuit 1 sets the coordinate sampling cycle timer t2 and then, in step S25, performs the coordinate sampling processing (refer to FIG. 4).
  • when a user accelerates an operation, the system control circuit 1 thus switches to the high-speed coordinate sampling cycle to make an accurate input determination. By this operation, even while operating in the slow coordinate sampling cycle, the system control circuit 1 can handle a change from a slow operation to a quick operation by the user.
  • the system control circuit 1 changes the period of the coordinate sampling cycle timer according to the elapsed time from the touch and the acceleration while the panel is continuously touched. By this operation, the system control circuit 1 can decrease the system load without deteriorating operability.
  • in a mode in which a high-speed input operation is not necessary, the system control circuit 1 always performs coordinate sampling with a slow cycle, which also makes it possible to decrease the system load.
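The acceleration check in steps S20 to S24 can be sketched as a helper that, given the last few sampled positions along one axis, decides whether to fall back to the short cycle t1 or stay on the long cycle t2. This is a hypothetical finite-difference estimate under assumed units and thresholds; the patent does not specify how the acceleration is computed.

```python
def next_sampling_cycle(history, t1=0.005, t2=0.040, accel_threshold=1000.0):
    """history: list of (time, position) samples along one axis,
    oldest first. Returns t1 when the estimated acceleration of the
    coordinate change exceeds the threshold (return to high-speed
    sampling, as in S22/S23), otherwise t2 (stay in the slow cycle, S24)."""
    if len(history) < 3:
        return t1                  # not enough data yet: keep sampling fast
    (ta, xa), (tb, xb), (tc, xc) = history[-3:]
    v1 = (xb - xa) / (tb - ta)     # speed over the older interval
    v2 = (xc - xb) / (tc - tb)     # speed over the newer interval
    accel = abs(v2 - v1) / (tc - ta)
    return t1 if accel > accel_threshold else t2
```

A drag at roughly constant speed keeps the slow cycle; a sudden speed-up, such as the start of a flick, switches back to t1 so the final speed can be measured accurately.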
  • the exposure correction screen in the shooting mode is used as an example.
  • in addition, the system control circuit 1 can dynamically apply the coordinate sampling cycle changing processing according to the execution state of the various functions provided in the shooting mode, and thus the system load can be decreased further.
  • the system control circuit 1 can also dynamically apply the coordinate sampling cycle changing processing according to the kinds of input operation that are acceptable at the time, and thus the system load can be decreased further. For example, in a mode in which a flick operation based on a final speed is not accepted and a double tap operation is not accepted for reasons of specification, the system load can be decreased when the system control circuit 1 uses a slow, constant coordinate sampling cycle.
  • a single hardware unit can perform the control performed by the system control circuit 1, or a plurality of hardware units can share the processing and control the entire apparatus.
  • the present invention is applied to a digital camera as an example. However, the application is not limited thereto.
  • the present invention can be applied to apparatuses using a touch panel, such as a personal computer and a personal digital assistant (PDA).
  • the present invention is applicable to a display control apparatus using a touch panel, such as a mobile telephone terminal or a mobile type image viewer, a display provided in a printer for selecting and confirming a print image, and a digital photo frame.
  • the present invention can be preferably applied to a mobile type display apparatus, such as a digital camera, a mobile telephone terminal, a mobile type image viewer, and a mobile type game machine.
  • since the mobile display control apparatuses are driven by a battery, their usable power is limited compared with non-portable display control apparatuses. Further, in the mobile display control apparatuses, most of the processors and work memories are inexpensive and small compared with those of the non-portable apparatuses, and their processing ability is generally lower.
  • the present invention has been described based on the preferable exemplary embodiment.
  • the present invention is not limited to these specific exemplary embodiments, and includes various embodiments within the spirit and scope of the present invention. Parts of the above-described embodiments can be combined.
  • aspects of the present invention can also be realized by a computer of a system or apparatus (or devices such as a CPU or MPU) that reads out and executes a program recorded on a memory device to perform the functions of the above-described embodiments, and by a method, the steps of which are performed by a computer of a system or apparatus by, for example, reading out and executing a program recorded on a memory device to perform the functions of the above-described embodiments.
  • the program is provided to the computer for example via a network or from a recording medium of various types serving as the memory device (e.g., computer-readable medium).
  • the system or apparatus, and the recording medium where the program is stored are included as being within the scope of the present invention.

Landscapes

  • Engineering & Computer Science (AREA)
  • General Engineering & Computer Science (AREA)
  • Theoretical Computer Science (AREA)
  • Human Computer Interaction (AREA)
  • Physics & Mathematics (AREA)
  • General Physics & Mathematics (AREA)
  • User Interface Of Digital Computer (AREA)
  • Position Input By Displaying (AREA)
US12/849,591 2009-08-07 2010-08-03 Display control apparatus and control method Active 2031-11-06 US9423950B2 (en)

Applications Claiming Priority (2)

Application Number Priority Date Filing Date Title
JP2009-185165 2009-08-07
JP2009185165A JP5340075B2 (ja) 2009-08-07 2009-08-07 Display control apparatus, control method thereof, and program

Publications (2)

Publication Number Publication Date
US20110032201A1 US20110032201A1 (en) 2011-02-10
US9423950B2 true US9423950B2 (en) 2016-08-23

Family

ID=43534462

Family Applications (1)

Application Number Title Priority Date Filing Date
US12/849,591 Active 2031-11-06 US9423950B2 (en) 2009-08-07 2010-08-03 Display control apparatus and control method

Country Status (2)

Country Link
US (1) US9423950B2 (ja)
JP (1) JP5340075B2 (ja)

Families Citing this family (15)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
KR101774315B1 (ko) * 2011-03-28 2017-09-04 LG Electronics Inc. Mobile terminal and control method thereof
JP5937808B2 (ja) * 2011-11-16 2016-06-22 Rohm Co., Ltd. Touch panel control circuit and control method, and touch panel input device and electronic apparatus using the same
JP5797580B2 (ja) * 2012-02-16 2015-10-21 Sharp Corp. Input control device, electronic apparatus, input control method, program, and recording medium
JP5770654B2 (ja) * 2012-02-16 2015-08-26 Sharp Corp. Screen display device, control method thereof, program, and computer-readable recording medium
JP5987380B2 (ja) * 2012-03-16 2016-09-07 Casio Computer Co., Ltd. Imaging apparatus and program
JP6080515B2 (ja) * 2012-11-26 2017-02-15 Canon Inc. Information processing apparatus, display apparatus, control method of information processing apparatus, and program
JP6216145B2 (ja) 2013-04-22 2017-10-18 Synaptics Japan GK Touch panel controller and semiconductor device
JP2014238695A (ja) * 2013-06-07 2014-12-18 Seiko Epson Corp. Electronic apparatus and tap operation detection method
JP2014238696A (ja) * 2013-06-07 2014-12-18 Seiko Epson Corp. Electronic apparatus and tap operation detection method
JP6264814B2 (ja) * 2013-09-30 2018-01-24 Brother Industries, Ltd. Operation support program, communication terminal, and processing device
US9507407B2 (en) 2014-02-21 2016-11-29 Qualcomm Incorporated Method and apparatus for improving power consumption on a touch device
JP6552156B2 (ja) * 2014-03-07 2019-07-31 Konica Minolta, Inc. Data processing apparatus, operation acceptance method, and content display program
JP6930364B2 (ja) * 2017-03-31 2021-09-01 Denso Wave Inc. Information reading device
US10466887B2 (en) * 2017-05-02 2019-11-05 Facebook, Inc. Feed ad scrolling
JP2021018777A (ja) 2019-07-24 2021-02-15 Canon Inc. Electronic apparatus

Citations (12)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
JPS63174125A (ja) 1987-01-14 1988-07-18 Fujitsu Ltd File retrieval device
JPS63163532U (ja) 1987-04-15 1988-10-25
JP2000010721A (ja) 1998-06-24 2000-01-14 Sharp Corp Coordinate input device
JP2000057094A (ja) 1998-08-10 2000-02-25 Fujitsu Ltd Remote terminal operation device
JP3181181B2 (ja) 1994-11-11 2001-07-03 Sharp Corp Document information processing apparatus
JP2001527678A (ja) 1997-05-22 2001-12-25 Ericsson Inc. Adaptive sampling of touch screen input
US20050179672A1 (en) * 2004-02-17 2005-08-18 Yen-Chang Chiu Simplified capacitive touchpad and method thereof
US20060284857A1 (en) * 2005-06-16 2006-12-21 Lg Electronics Inc. Power-saving function for touch screen device
US20100085316A1 (en) * 2008-10-07 2010-04-08 Jong Hwan Kim Mobile terminal and display controlling method therein
US20100090976A1 (en) * 2008-10-09 2010-04-15 Shih-Chuan Liao Method for Detecting Multiple Touch Positions on a Touch Panel
US20100269068A1 (en) * 2009-04-17 2010-10-21 Christopher Labrador Changing selection focus on an electronic device
US8147248B2 (en) * 2005-03-21 2012-04-03 Microsoft Corporation Gesture training

Family Cites Families (4)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
JPH06119090A (ja) * 1992-10-07 1994-04-28 Hitachi Ltd Power saving control method
JP3064123B2 (ja) * 1992-11-09 2000-07-12 Hitachi Ltd Information processing apparatus and input control device
JPH06230898A (ja) * 1993-02-05 1994-08-19 Matsushita Electric Industrial Co., Ltd. Pen input device
JPH10269021A (ja) * 1997-03-25 1998-10-09 Sharp Corp Touch panel input device

Cited By (2)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US20170090745A1 (en) * 2015-09-30 2017-03-30 Brother Kogyo Kabushiki Kaisha Information processing apparatus and storage medium
US10338808B2 (en) * 2015-09-30 2019-07-02 Brother Kogyo Kabushiki Kaisha Information processing apparatus and storage medium

Also Published As

Publication number Publication date
JP2011039709A (ja) 2011-02-24
US20110032201A1 (en) 2011-02-10
JP5340075B2 (ja) 2013-11-13

Similar Documents

Publication Publication Date Title
US9423950B2 (en) Display control apparatus and control method
US10216313B2 (en) Electronic apparatus and control method of the same
US10222903B2 (en) Display control apparatus and control method thereof
US9438789B2 (en) Display control apparatus and display control method
US11039073B2 (en) Electronic apparatus and method for controlling the same
US10630904B2 (en) Electronic device, control method for controlling the same, and storage medium for changing a display position
TW200836096A (en) Mobile equipment with display function
US20170104922A1 (en) Electronic apparatus and control method thereof
JP2013142751A (ja) Display control apparatus, control method thereof, and program
JP5563108B2 (ja) Imaging apparatus, imaging method, and program
US10712932B2 (en) Electronic device, method for controlling electronic device, and non-transitory computer readable medium
US10120496B2 (en) Display control apparatus and control method thereof
JP6198459B2 (ja) Display control apparatus, control method of display control apparatus, program, and storage medium
US10649645B2 (en) Electronic apparatus and method for controlling the same
JP2013017088A (ja) Imaging apparatus, control method thereof, control program, and recording medium
JP6055794B2 (ja) Self-photographing apparatus, self-photographing method, and program
US20150100919A1 (en) Display control apparatus and control method of display control apparatus
US9064351B2 (en) Display control apparatus and method for controlling the same
JP6393296B2 (ja) Imaging apparatus and control method thereof, imaging control apparatus, program, and storage medium
JP2021029034A (ja) Exposure setting apparatus, control method thereof, program, and storage medium
US10037136B2 (en) Display controller that controls designation of position on a display screen, method of controlling the same, and storage medium
JP2020197976A (ja) Electronic apparatus, control method of electronic apparatus, program, and storage medium
JP5863418B2 (ja) Imaging apparatus and control method thereof
US11165986B2 (en) Data transfer apparatus and control method thereof
US20210127054A1 (en) Electronic device and control method thereof

Legal Events

Date Code Title Description
AS Assignment

Owner name: CANON KABUSHIKI KAISHA, JAPAN

Free format text: ASSIGNMENT OF ASSIGNORS INTEREST;ASSIGNOR:NAKA, YASUTAKA;REEL/FRAME:025485/0351

Effective date: 20100716

STCF Information on status: patent grant

Free format text: PATENTED CASE

MAFP Maintenance fee payment

Free format text: PAYMENT OF MAINTENANCE FEE, 4TH YEAR, LARGE ENTITY (ORIGINAL EVENT CODE: M1551); ENTITY STATUS OF PATENT OWNER: LARGE ENTITY

Year of fee payment: 4

FEPP Fee payment procedure

Free format text: MAINTENANCE FEE REMINDER MAILED (ORIGINAL EVENT CODE: REM.); ENTITY STATUS OF PATENT OWNER: LARGE ENTITY