US20120218203A1 - Touch drawing display apparatus and operation method thereof, image display apparatus allowing touch-input, and controller for the display apparatus

Info

Publication number
US20120218203A1
Authority
US
United States
Prior art keywords
touch
image
page
unit
display screen
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Abandoned
Application number
US13/370,049
Other languages
English (en)
Inventor
Noriyoshi KANKI
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Sharp Corp
Original Assignee
Individual
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Priority claimed from JP2011-027237 (JP5537458B2)
Priority claimed from JP2011-027236 (JP5536690B2)
Application filed by Individual
Assigned to SHARP KABUSHIKI KAISHA, assignment of assignors interest (see document for details). Assignors: KANKI, NORIYOSHI
Published as US20120218203A1
Priority claimed by later application US14/823,681 (US10191648B2)

Classifications

    • G: PHYSICS
    • G06: COMPUTING; CALCULATING OR COUNTING
    • G06F: ELECTRIC DIGITAL DATA PROCESSING
    • G06F 3/00: Input arrangements for transferring data to be processed into a form capable of being handled by the computer; output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
    • G06F 3/01: Input arrangements or combined input and output arrangements for interaction between user and computer
    • G06F 3/03: Arrangements for converting the position or the displacement of a member into a coded form
    • G06F 3/041: Digitisers, e.g. for touch screens or touch pads, characterised by the transducing means
    • G06F 3/0416: Control or interface arrangements specially adapted for digitisers
    • G06F 3/048: Interaction techniques based on graphical user interfaces [GUI]
    • G06F 3/0484: Interaction techniques based on GUIs for the control of specific functions or operations, e.g. selecting or manipulating an object, an image or a displayed text element, setting a parameter value or selecting a range
    • G06F 3/0485: Scrolling or panning
    • G06F 3/0487: Interaction techniques based on GUIs using specific features provided by the input device, e.g. functions controlled by the rotation of a mouse with dual sensing arrangements, or of the nature of the input device, e.g. tap gestures based on pressure sensed by a digitiser
    • G06F 3/0488: Interaction techniques of G06F 3/0487 using a touch-screen or digitiser, e.g. input of commands through traced gestures
    • G06F 3/04883: Interaction techniques of G06F 3/0488 for inputting data by handwriting, e.g. gesture or text
    • G06F 2203/00: Indexing scheme relating to G06F 3/00-G06F 3/048
    • G06F 2203/048: Indexing scheme relating to G06F 3/048
    • G06F 2203/04808: Several contacts: gestures triggering a specific function, e.g. scrolling, zooming, right-click, when the user establishes several contacts with the surface simultaneously, e.g. using several fingers or a combination of fingers and pen

Definitions

  • the present invention relates to an image display apparatus having a display device and an input device integrated together, allowing touch-input. More specifically, the present invention relates to a touch drawing display apparatus that has functions of drawing operation and other operations regarding touch operations, and an operation method thereof, as well as to an image display apparatus that switches images page by page when they are touched and slid with a plurality of fingers or the like, and a controller for such a display apparatus.
  • a so-called electronic blackboard has been known, which has a display device with a large screen and which draws an image or performs other processes upon detecting the user touching the display device.
  • the electronic blackboard is useful when summarizing opinions of participants or finding a preferable solution to a problem, for example, at a meeting.
  • Electronic blackboards having various configurations have come to be practically used, and one configured as a computer system having a combination of a display device with a large display screen and an input device for detecting two-dimensional position coordinates such as a touch-panel has been used.
  • an electronic blackboard apparatus successively reads pieces of information related to position coordinates designated by a pen or a finger and pieces of information related to amount of movement, and displays a track of inputs on the display device based on the read pieces of information. Consequently, the apparatus can realize operations as an electronic blackboard such as handwriting input.
  • Japanese Patent Laying-Open No. 6-175776 discloses a projection type presentation device allowing drawing on a displayed screen image using touch operation. There is a problem that recognition of an original image becomes difficult when writing is repeated on the same screen image.
  • in that device, drawing is done while a touch operation continues in an image forming area; when the touch operation in the image forming area once ends and a touch operation in the image forming area is next detected, the former image is automatically deleted.
  • when a specific function other than drawing is allocated to a touch operation different from the simple touch operation for drawing in an electronic blackboard, it takes some time, while an image is being drawn in a drawing mode, until a touch operation is determined to be the one for executing the specific function.
  • drawing is done immediately after touching, so as not to cause any time lag between the touch operation and the drawing; any such time lag is stressful for the user.
  • Such a configuration sometimes leads to an erroneous drawing not intended by the user.
  • an operation of flicking with multi-touch is allocated to an operation of scrolling screen images.
  • This problem is not limited to the electronic blackboard, and it occurs in display devices allowing display of images drawn by touching, such as tablet-type terminals. This problem cannot be solved by the technique disclosed in '776 Reference.
  • advantages of an electronic blackboard include that it is possible to display or write (draw) images separately on a plurality of screen images.
  • Each unit of display of such images is referred to as a “page” as an analogy to a book.
  • Japanese Patent Laying-Open No. 11-102274 proposes a device in which the screen image is switched if the screen image is touched by a plurality of fingers and the like (a so-called multi-touch) and the plurality of touched positions are slid by more than a prescribed value in the same direction.
  • pages may be turned erroneously while normal input is being done.
  • a finger other than the finger used for input happens to touch the screen surface and is detected as the multi-touch slide input and, as a result, a page is turned unintentionally. Therefore, a mechanism that can prevent unintended turning of a page even when such an erroneous input is made has been desired. Further, a mechanism that allows the user to easily understand what manner of input is necessary to turn a page is also necessary.
  • the present invention provides a touch drawing display apparatus, including: a display unit that displays an image; a detecting unit that is arranged on the display unit and detects a touched position; a drawing unit that draws, on an image displayed on the display unit, a line corresponding to a track formed by movement of the detected position; and an erasing unit that erases, when an operation other than drawing is specified by the track, a line drawn from a start point to an end point of the track specifying the operation.
  • the track for specifying the operation other than drawing is a track formed by a multi-touch operation of simultaneously touching a plurality of positions of the detecting unit.
  • the present invention provides a method of operating a touch drawing display apparatus that includes a display unit that displays an image and a detecting unit that is arranged on the display unit and detects a touched position, including the steps of: drawing, on an image displayed on the display unit, a line corresponding to a track formed by movement of the detected position; determining whether or not an operation other than drawing is specified by the track while the drawing step is being executed; and erasing, if an operation other than drawing is specified by the track, a line drawn from a start point to an end point of the track specifying the operation.
  • the erroneous image drawn while a touch operation other than that for drawing is being determined is not retained, and the drawn image as intended by the user can be stored. Therefore, when the object image is displayed again, the erroneous image is not displayed and only the intended image is displayed.
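The draw-then-erase behaviour described above can be sketched as follows. This is a minimal illustration, not the patent's implementation; the class and method names are hypothetical, and gesture recognition itself is assumed to happen elsewhere.

```python
# Hypothetical sketch: line segments are drawn immediately (no lag for the
# user) but buffered per track; if the track is later recognized as a
# non-drawing gesture, everything drawn for that track is erased again.

class DrawingSurface:
    def __init__(self):
        self.committed = []      # segments kept on the page
        self.pending = []        # segments drawn for the current track

    def on_touch_move(self, p0, p1):
        # draw at once so there is no time lag between touch and drawing
        self.pending.append((p0, p1))

    def on_track_end(self, is_gesture):
        if is_gesture:
            # the track specified an operation other than drawing:
            # erase the line drawn from its start point to its end point
            self.pending.clear()
        else:
            self.committed.extend(self.pending)
            self.pending.clear()

surface = DrawingSurface()
surface.on_touch_move((0, 0), (1, 1))
surface.on_track_end(is_gesture=True)
assert surface.committed == []       # the gesture track left no drawn line
surface.on_touch_move((2, 2), (3, 3))
surface.on_track_end(is_gesture=False)
```

Only the intended track survives in `committed`, so redisplaying the page shows no trace of the gesture.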
  • the present invention provides an image display apparatus allowing touch-input, including: a touch detecting unit that has a display screen, displays page images page by page on the display screen, and detects positions and the number of touch inputs that designate positions on the display screen; a scroll unit that scrolls, when a plurality of touch inputs are detected by the touch detecting unit and their positions on the display screen move in one same direction, an image displayed on the display screen along with movement of positions of the plurality of touch inputs; and a first page switching unit that detects, after the plurality of touch inputs are detected by the touch detecting unit, decrease in the number of touch inputs from an output of the touch detecting unit and, depending on whether an amount of movement of the touch inputs is not larger than a first threshold value, selectively executes a process of returning the image on the display screen to a state before scrolling and a process of switching the image on the display screen by one page in accordance with the direction of movement of the plurality of touch inputs.
  • the scroll unit scrolls the screen image in that direction. Therefore, the user can intuitively understand that the screen image can be scrolled by a multi-touch sliding operation.
  • the scroll made by that time point is returned to the original state before scrolling. Therefore, even when the user erroneously touches the display screen surface with a plurality of fingers and slides them, the original screen image can be restored if the user becomes aware of it and moves his/her fingers away immediately.
  • the screen image is switched by one page in accordance with the direction of sliding.
  • the pages of screen image can be switched by the simple operation of multi-touch sliding.
  • an image display apparatus allowing touch-input that allows the user to intuitively understand the method of page-by-page switching of screen images using multi-touch and that can prevent any trouble of display when an erroneous operation is made on such an occasion can be provided.
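The release-time decision described above (restore on a short slide, switch one page on a longer one) can be sketched roughly as follows. The function name, the pixel units, and the "slide left shows the next page" convention are assumptions for illustration, not taken from the patent.

```python
def on_multi_touch_release(movement, direction, first_threshold, page, n_pages):
    # Hypothetical helper: called when the number of touch inputs decreases
    # after a multi-touch slide of `movement` pixels in `direction`.
    if abs(movement) <= first_threshold:
        # short slide: undo the scroll and restore the pre-scroll image
        return ("restore", page)
    # longer slide: switch by one page in the direction of the slide
    new_page = page + (1 if direction == "left" else -1)  # assumed convention
    return ("switch", max(0, min(n_pages - 1, new_page)))
```

A brief accidental touch thus leaves the displayed page unchanged, while a deliberate slide past the first threshold turns exactly one page.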
  • the present invention provides a controller for a display apparatus, used for the display apparatus that has a display screen, displays an image received from outside on the display screen and detects and outputs, to the outside, positions and the number of touch inputs that designate positions on the display screen.
  • the controller for the display apparatus includes: a scroll unit that generates, when it is detected that a plurality of touch inputs are detected by the display apparatus and their positions on the display screen move in one same direction, an image by scrolling the image displayed on the display screen along with the movement of positions of the plurality of touch inputs, and applies the generated image to the display apparatus; and a page switching unit that detects, after the plurality of touch inputs are detected by the display apparatus, decrease in the number of touch inputs from an output of the display apparatus and, depending on whether an amount of movement of the touch inputs is not larger than a first threshold value, selectively executes a process of generating page image data for returning the image on the display screen to a state before scrolling and a process of generating page image data for switching the image on the display screen by one page in accordance with the direction of movement of the plurality of touch inputs, and transmits the generated page image data to the display apparatus.
  • the present invention provides an image display apparatus allowing touch input, including: a touch detecting unit that has a display screen, displays page images page by page on the display screen, and detects positions and the number of touch inputs that designate positions on the display screen; a scroll unit that scrolls, when a plurality of touch inputs are detected by the touch detecting unit and their positions on the display screen move in one same direction, an image displayed on the display screen along with movement of positions of the plurality of touch inputs; a page switching unit that switches, in response to an amount of movement of the plurality of touch inputs exceeding a threshold value during scrolling of the image by the scroll unit, the image on the display screen by one page in accordance with the direction of movement of the plurality of touch inputs; and a returning unit that detects, from an output of the touch detecting unit, decrease of the number of the plurality of touch inputs before page switching by the page switching unit during scrolling of the image by the scroll unit, and returns scrolling of the image by the scroll unit to a state before scrolling.
  • the plurality of touched positions are detected by the touch detecting unit, and when these positions move in the same direction, the scroll unit moves the screen image along the direction of movement of the touched positions. Therefore, the user can intuitively understand that page switching is possible by the so-called multi-touch. Further, if the plurality of touched positions move in the same direction and the threshold value is exceeded, the display of screen image is switched in accordance with the direction of movement of touched positions in the course of scrolling. The screen image can be switched by the simple operation of so-called multi-touch. Further, for this purpose, what is necessary is simply to move the touched positions continuously without moving the fingers and the like away from the screen surface. Further, if the number of touch inputs decreases before page switching takes place, the screen image returns to the state before the scrolling and, therefore, the original state can easily be resumed even when an erroneous operation is made.
  • an image display apparatus allowing touch-input that allows the user to easily understand that page-by-page switching of screen images is possible by multi-touch input, and that allows such page-by-page switching to be done easily, can be provided.
  • the present invention provides a controller for a display apparatus, used for the display apparatus that has a display screen, displays an image received from outside on the display screen and detects and outputs, to the outside, positions and the number of touch inputs that designate a position on the display screen.
  • the controller includes: a scroll unit that generates, when it is detected that a plurality of touch inputs are detected by the display apparatus and their positions on the display screen move in one same direction, an image by scrolling the image displayed on the display screen along with the movement of positions of the plurality of touch inputs, and applies the generated image to the display apparatus; a page switching unit that switches, in response to an amount of movement of the plurality of touch inputs exceeding a threshold value during scrolling of the image by the scroll unit, the image on the display screen by one page in accordance with the direction of movement of the plurality of touch inputs; and a returning unit that detects, from an output of the display apparatus, decrease of the number of the plurality of touch inputs before page switching by the page switching unit during scrolling of the image by the scroll unit, and returns scrolling of the image by the scroll unit to a state before scrolling.
  • the user can intuitively understand that page-by-page switching of screen images can be done by using multi-touch, and any trouble in display can be prevented even when an erroneous operation is made on such an occasion. Further, by a simple operation of multi-touch sliding, images can be switched one page at a time. It is unnecessary to stop the multi-touch sliding operation.
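The second scheme described above switches the page as soon as the slide distance exceeds the threshold, without lifting the fingers, and restores the pre-scroll state if the fingers are lifted first. A small state-machine sketch (hypothetical names; the slide-left-equals-next-page convention is assumed):

```python
class MultiTouchPager:
    """Hypothetical sketch: page switches the moment the multi-touch slide
    exceeds the threshold; releasing earlier restores the original image."""

    def __init__(self, threshold, page=0):
        self.threshold = threshold
        self.page = page
        self.offset = 0          # accumulated slide distance since last switch
        self.switched = False

    def on_slide(self, dx):
        # the image scrolls with the fingers; crossing the threshold switches
        # one page at once, and continued sliding can switch further pages
        self.offset += dx
        if abs(self.offset) > self.threshold:
            self.page += 1 if self.offset < 0 else -1  # assumed convention
            self.offset = 0
            self.switched = True

    def on_release(self):
        # lifting the fingers before any switch returns to the pre-scroll state
        restored = not self.switched
        self.offset = 0
        self.switched = False
        return restored
```

Releasing after a 60-unit slide against a 100-unit threshold restores the page, while sliding on to 120 units turns it without any release.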
  • FIG. 1 is a block diagram showing a schematic configuration of an electronic blackboard in accordance with an embodiment of the present invention.
  • FIG. 2 shows an example of a method of detecting a touch-input.
  • FIG. 3 shows an example of a displayed screen image.
  • FIG. 4 is a flowchart representing a control structure of a program realizing erasure of an erroneous drawing in the electronic blackboard apparatus in accordance with a first embodiment of the present invention.
  • FIG. 5 shows coordinate data structure stored when a multi-touch operation is carried out in accordance with the first embodiment of the present invention.
  • FIG. 6 shows a scroll operation realized by the multi-touch operation in accordance with the first embodiment of the present invention.
  • FIG. 7 shows a scroll of a screen image in accordance with the first embodiment of the present invention.
  • FIG. 8 is a flowchart representing a control structure of a program realizing page switching by a multi-touch, in the electronic blackboard apparatus in accordance with a second embodiment of the present invention.
  • FIG. 9 shows a multi-touch sliding operation in accordance with the second embodiment of the present invention.
  • FIG. 10 shows an operation of sliding by a distance D1 with multi-touch and then moving the fingers away, in accordance with the second embodiment of the present invention.
  • FIG. 11 shows an operation of sliding by a distance D2, longer than D1, with multi-touch and then moving the fingers away, in accordance with the second embodiment of the present invention.
  • FIG. 12 shows a screen image after a page switch, displayed when an operation of sliding by a distance D2 and then moving the fingers away is done, in accordance with the second embodiment of the present invention.
  • FIG. 13 shows a screen image displayed when an operation of sliding by a distance D3 with multi-touch is done, in accordance with the second embodiment of the present invention.
  • FIG. 14 shows a screen image displayed when the sliding operation is continued even after the screen image is switched by one page from the screen image shown in FIG. 13.
  • in this description, “touch” means that a position is made detectable by an input position detecting device.
  • touch may include touching and pressing the detecting device, just touching and not pressing the detecting device, and coming very close to but not touching the detecting device.
  • a contact type device as well as a non-contact type device may be used as the input position detecting device.
  • for a non-contact device, touch means coming very close to the detecting device, that is, to a distance that allows detection of the input position.
  • an electronic blackboard apparatus 100 in accordance with the first embodiment of the present invention includes: a central processing unit (hereinafter denoted as CPU) 102 ; a read only memory (hereinafter denoted as ROM) 104 ; a random access memory (hereinafter denoted as RAM) 106 ; a storage unit 108 ; an interface unit (hereinafter denoted as IF unit) 110 ; a touch detecting unit 112 ; a display unit 114 ; a display control unit 116 ; a video memory (hereinafter denoted as VRAM) 118 ; and a bus 120 .
  • CPU 102 is for overall control of electronic blackboard apparatus 100 .
  • ROM 104 is a non-volatile storage storing programs and data necessary for controlling operations of electronic blackboard apparatus 100 .
  • RAM 106 is a volatile storage.
  • Storage unit 108 is a non-volatile storage retaining data even when power is shut off, implemented, for example, by a hard disk drive, a flash memory or the like. Storage unit 108 may be configured as a detachable unit.
  • CPU 102 reads a program from ROM 104 to RAM 106 through bus 120 and executes the program using a part of RAM 106 as a work area.
  • CPU 102 controls various units and components of electronic blackboard apparatus 100 in accordance with a program or programs stored in ROM 104 .
  • To bus 120, CPU 102, ROM 104, RAM 106, storage unit 108, IF unit 110, touch detecting unit 112, display control unit 116 and VRAM 118 are connected. Data (including control information) is exchanged among these units through bus 120.
  • IF unit 110 is for establishing connection with the outside such as a network, and transmits/receives image data to and from a computer or the like connected to the network. Image data received from the outside through IF unit 110 is recorded in storage unit 108 .
  • Display unit 114 is a display panel (such as a liquid crystal panel) for displaying images.
  • Display control unit 116 is provided with a driving unit for driving display unit 114 .
  • Display control unit 116 reads image data stored in VRAM 118 at prescribed timing, generates signals for displaying as an image on display unit 114 , and outputs the generated signals to display unit 114 .
  • the image data to be displayed is read by CPU 102 from storage unit 108 and transmitted to VRAM 118 .
  • Touch detecting unit 112 is, for example, a touch-panel, detecting a touch operation by a user.
  • if there are a plurality of positions touched by the user, touch detecting unit 112 successively outputs the coordinates of each of the positions.
  • by monitoring the outputs of touch detecting unit 112, it is possible to know the number of touched positions and their coordinates successively. Detection of a touch operation when a touch-panel is used as touch detecting unit 112 will be described with reference to FIG. 2.
  • FIG. 2 shows an infrared scanning type touch-panel (touch detecting unit 112 ).
  • the touch-panel has arrays of light emitting diodes (hereinafter denoted as LED arrays) 200 and 202 arranged in a line on adjacent two sides of a rectangular writing surface, respectively, and two arrays of photodiodes (hereinafter referred to as PD arrays) 210 and 212 arranged in a line opposite to LED arrays 200 and 202 , respectively.
  • Infrared rays are emitted from each LED of LED arrays 200 and 202 , and the infrared rays are detected by each PD of opposite PD arrays 210 and 212 .
  • infrared rays output from LEDs of LED arrays 200 and 202 are represented by arrows.
  • the touch-panel includes, for example, a micro computer (a device including a CPU, a memory and an input/output circuit), and controls emission of each LED.
  • Each PD outputs a voltage corresponding to the intensity of received light.
  • the output voltage from the PD is amplified by an amplifier. Since signals are output simultaneously from the plurality of PDs of PD arrays 210 and 212 , the output signals are once saved in a buffer and then output as serial signals in accordance with the order of arrangement of PDs, and transmitted to the micro computer.
  • the order of serial signals output from PD array 210 represents the X coordinate.
  • the order of serial signals output from PD array 212 represents the Y coordinate.
  • the micro computer detects a portion where the signal levels of the two received serial signals decreased, and thereby finds the position coordinates of the touched position.
  • the micro computer transmits the determined position coordinates to CPU 102 .
  • the process for detecting the touched position is repeated periodically at a prescribed detection interval; therefore, if one point is kept touched for a time period longer than the detection interval, the same coordinate data is output repeatedly.
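The coordinate detection described above amounts to locating the photodiode whose received-light level dropped because a pen or finger blocks the beam to it: the index of that PD in the serial signal gives the X (or Y) coordinate. A toy sketch, with invented signal levels and threshold:

```python
# Hypothetical sketch of the PD-array scan: the touched coordinate is the
# index of the photodiode whose level fell well below the normal lit level.

def blocked_index(levels, lit_level=1.0, drop=0.5):
    """Return the index of the first PD whose received-light level dropped,
    or None if nothing blocks the infrared beams (values are illustrative)."""
    for i, v in enumerate(levels):
        if v < lit_level - drop:
            return i
    return None

x = blocked_index([1.0, 1.0, 0.1, 1.0])   # serial signal from PD array 210
y = blocked_index([1.0, 0.2, 1.0, 1.0])   # serial signal from PD array 212
assert (x, y) == (2, 1)
```

A real panel would report both axes together as one (X, Y) pair per detection interval, repeating the pair while the point stays touched.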
  • the touched position can be detected in a similar manner when the user touches touch detecting unit 112 with his/her finger, without using touch pen 220.
  • a touch panel other than the infrared scanning type panel (such as a capacitive type, surface acoustic wave type or resistive type touch panel) may be used as touch detecting unit 112 .
  • when a capacitive touch panel is used, a position can be detected even when a finger or the like is not actually touching (non-contact), if it is close enough to the sensor.
  • On display unit 114, a screen image such as the one shown in FIG. 3 is displayed.
  • the display screen image of display unit 114 is divided into a drawing area 230 and a function button area 240.
  • Drawing area 230 is for the user to draw an image by touching operations.
  • XY coordinates of touched position and track of its movement are transmitted from touch detecting unit 112 to CPU 102 as described above.
  • CPU 102 writes a prescribed value in a corresponding memory address on VRAM 118 .
  • though pixel values of the image data on VRAM 118 may be changed directly, it is assumed here that VRAM 118 is provided with an area (hereinafter also referred to as an overlay area) for storing drawing data, separate from the area for storing image data.
  • Display control unit 116 superimposes and displays on display unit 114 the image data with the drawing data (the data in the overlay area). Specifically, on a point where the drawing data exists (the pixel having “1” recorded in the overlay area), the drawing data (preset color) is displayed, and on a point where the drawing data does not exist (the pixel having “0” recorded in the overlay area), the image data is displayed.
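The superimposing rule just described is a per-pixel selection: where the overlay holds 1 the preset drawing colour is shown, where it holds 0 the underlying image pixel shows through. A toy sketch with invented pixel values:

```python
# Hypothetical sketch of combining one row of the overlay area with the
# corresponding row of the page image, as display control unit 116 would.

def composite(image_row, overlay_row, draw_colour):
    # overlay value 1 -> preset drawing colour; 0 -> underlying image pixel
    return [draw_colour if o == 1 else px
            for px, o in zip(image_row, overlay_row)]

assert composite(["bg", "bg", "bg"], [0, 1, 0], "red") == ["bg", "red", "bg"]
```

Keeping the drawing in a separate overlay means the page image itself is never modified, which makes erasing a stroke a matter of clearing overlay bits.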
  • in function button area 240, function buttons 242 each having a specific function allocated thereto are displayed.
  • a page operation area 250 is displayed.
  • On this area, a NEXT button 252, a PREVIOUS button 254 and a page number indication box 256 are displayed.
  • NEXT button 252 feeds the displayed page (image data) to the right and shows the next page.
  • PREVIOUS button 254 feeds the displayed page to the left and shows the previous page.
  • Page number indication box 256 indicates the page number of the currently displayed page among the plurality of pages as the object of display.
  • the position of page operation area 250 is fixed and it does not move even during scrolling.
  • the data for displaying page operation area 250 on display unit 114 may be stored in an overlay area separate from the overlay area for drawing.
  • Functions allocated to function buttons 242 may include: a function of drawing by a touch operation (allowing selection of types of pens for drawing); a function of opening a file (image data) saved in storage unit 108 ; an erasure function of deleting a drawing in a prescribed area; a function of saving displayed image data in storage unit 108 ; and a function of printing displayed image data.
  • Each function button 242 is displayed as an icon.
  • CPU 102 determines whether or not touch detecting unit 112 is touched. As described above, CPU 102 determines whether or not coordinate data is received from touch detecting unit 112 . Position coordinates (X coordinate, Y coordinate) of the touched point are output in the order of touching, from touch detecting unit 112 . If it is determined that there is a touch, the control proceeds to step 302 . Otherwise, the control proceeds to step 306 .
  • CPU 102 stores coordinate data that have been received for a prescribed time period (coordinate data of touched points) in RAM 106 in a manner that represents the order of reception.
  • CPU 102 determines whether or not the program is to be ended. When an end button as one of the function buttons 242 is pressed, CPU 102 ends the program. Otherwise, the control returns to step 300 , and CPU 102 waits for a touch.
  • CPU 102 determines whether or not the touch detected at step 302 is a multi-touch (whether a plurality of points are touched simultaneously). Specifically, CPU 102 calculates, among the plurality of coordinate data stored at step 302 , the distance between the first received coordinate data and the coordinate data received next. If the distance is a prescribed value or larger, CPU 102 determines that it is a multi-touch, and if the distance is smaller than the prescribed value, determines that it is a single touch.
  • The touch detection interval (period) is short and the touching of each point involves a slight time difference; therefore, even in the case of a multi-touch, coordinate data are output from touch detecting unit 112 in the order of touching. By calculating the distance between the coordinate data received first and each of the coordinate data received second and thereafter, it is possible to determine whether the touch was a single touch or a multi-touch.
  • the number of points touched simultaneously in the multi-touch operation is not limited to two and may be three or more. Therefore, distances are calculated for a plurality of continuous points.
  • the distance used as a reference for determining a multi-touch may be set appropriately. What is necessary is that the distance as the reference is larger than the normal distance the user moves the touched point in the detection interval. If it is determined to be a multi-touch, the control proceeds to step 320 . If it is determined not to be a multi-touch, the control proceeds to step 310 .
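The distance test described above can be sketched as follows. This is a minimal illustration, not the patent's implementation; the function name, data shapes and threshold handling are assumptions, since the patent specifies only the logic (reports far apart within one detection interval cannot be the same finger moving):

```python
def is_multi_touch(coords, threshold):
    """coords: list of (x, y) coordinate reports in order of reception.

    Returns True if any report after the first lies at least `threshold`
    away from the first report, i.e. it cannot plausibly be the same
    finger having moved within one detection interval."""
    if len(coords) < 2:
        return False
    x0, y0 = coords[0]
    for x, y in coords[1:]:
        # Euclidean distance from the first received coordinate data
        if ((x - x0) ** 2 + (y - y0) ** 2) ** 0.5 >= threshold:
            return True
    return False
```

As the text notes, `threshold` must exceed the distance a single touched point normally moves within one detection interval.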
  • CPU 102 draws an image in accordance with the track of movement of the touched point, on the image displayed on display unit 114 .
  • CPU 102 reads the plurality of coordinate data stored at step 302 from RAM 106 , and writes prescribed data on corresponding points in the overlay area of VRAM 118 and on points on lines connecting these points in order. Consequently, the image data and the drawing data (data in the overlay area) are combined and displayed on display unit 114 , as if drawing is done on the image. Since the process at step 312 is repeated as described later, the overlay area is overwritten with existing drawing data left as it is.
  • CPU 102 temporarily stores the contents of drawing in RAM 106 . What is required is that the drawing can be reproduced and, hence, the method of storage may appropriately be selected.
  • the coordinate data stored in RAM 106 at step 302 with the order maintained may be retained as they are.
  • the data in the overlay area of VRAM 118 may be stored as two-dimensional image data in RAM 106 .
  • CPU 102 determines whether or not the touch is maintained. Specifically, CPU 102 determines whether or not coordinate data are continuously received from touch detecting unit 112 . For instance, if touch pen 220 or the user's finger moves away, reception of coordinate data from touch detecting unit 112 stops. If it is determined that the touch is maintained, control returns to step 302 , and the process following step 302 is repeated. If it is not determined that the touch is maintained, the control returns to step 300 and CPU 102 again waits for a next touch.
  • CPU 102 re-stores the coordinate data that have been stored in RAM 106 at step 302 as coordinate data corresponding to each multi-touch position, that is, as one sequence for each track.
  • The coordinate data of a first touched point is hereinafter also referred to as a “start point”. It is possible to determine to which start point's track each of the coordinate data other than the start points stored in RAM 106 belongs, based on the distance from each start point.
  • the coordinate data of the start point and the coordinate data of points on the track of movement of the start point are stored as sequence data, in the order of detection.
  • sequence data 400 shown on the left side of FIG. 5 are stored in RAM 106 .
  • (x 0 , y 0 ) and (x 1 , y 1 ) are assumed to be the coordinates of two points that were touched first.
  • The point (x 0 , y 0 ) is assumed to have been touched a little earlier than the point (x 1 , y 1 ).
  • two data sequences having coordinate data 410 of the first start point and coordinate data 420 of the second start point as heads, respectively, such as shown on the right side of FIG. 5 are stored in RAM 106 .
  • Coordinate data 410 of the start point and the following data sequence 412 represent one track.
  • Coordinate data 420 of the start point and the following data sequence 422 represent another track.
  • the manner of storage is similar if the number of multi-touch points is three or more.
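The re-storing of FIG. 5 can be sketched as follows. This is a hedged sketch with assumed names; the patent keys each report to a start point by distance, and the variant below assigns each report to the track whose most recent point is nearest, which behaves equivalently for short flicks while also tolerating longer tracks:

```python
def split_tracks(coords, num_points):
    """Split an interleaved stream of (x, y) reports into one sequence
    per start point, as in FIG. 5.

    The first `num_points` reports are the start points; every later
    report is appended to the track whose last point is nearest."""
    tracks = [[p] for p in coords[:num_points]]
    for p in coords[num_points:]:
        nearest = min(
            tracks,
            key=lambda t: (t[-1][0] - p[0]) ** 2 + (t[-1][1] - p[1]) ** 2,
        )
        nearest.append(p)
    return tracks
```

The same loop handles three or more simultaneous touch points without change.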
  • CPU 102 draws an image on the image displayed on display unit 114 , in accordance with the track of touched points as at step 310 .
  • CPU 102 draws using the coordinate data re-stored in RAM 106 as shown, for example, in FIG. 5 . All tracks of simultaneously touched points may be drawn, or a track or tracks of only one or some of the points (for example, a track of the earliest-touched point) may be drawn.
  • FIG. 6 shows an example in which drawings 500 are made and thereafter, a user 510 flicked to the right while multi-touching with two fingers. By this operation, erroneous drawing 502 is made on the image. In FIG. 6 , only a drawing corresponding to one of the two tracks is displayed. If step 312 has already been executed and drawing with single touch has been made, the contents of drawing by the multi-touch is added to the drawn contents.
  • CPU 102 temporarily stores the contents drawn at step 322 in RAM 106 .
  • the specific method is the same as that of step 312 . If the coordinate data re-stored at step 320 is to be retained as it is, different from step 312 , only the coordinate data representing the track drawn at step 322 may be retained.
  • CPU 102 determines whether the multi-touch is maintained. Specifically, CPU 102 determines whether or not a plurality of coordinate data are continuously received from touch detecting unit 112 and whether these coordinate data correspond to the tracks of a plurality of start points determined at step 320 .
  • Assume that the user first touched with two fingers and then moved one finger away. Then, the track that had been made by the removed finger is lost, and the coordinate data received from touch detecting unit 112 come to be only the coordinate data representing one track.
  • If it is determined that the multi-touch is maintained, the control proceeds to step 328. If not, the control proceeds to step 314, and CPU 102 determines whether or not a single touch is maintained, as described above.
  • CPU 102 determines whether or not the detected multi-touch operation is an operation allocated to scrolling (an operation designating a scroll). Specifically, for each sequence of coordinate data corresponding to the tracks of the plurality of touch points stored at step 320 , CPU 102 determines a vector from the coordinate data of the start point to the coordinate data of the last point, and determines whether the vector is of a prescribed length or longer and whether the vector is in the positive direction along the X axis (whether X component is positive). If the vector is in the positive direction along the X axis, the detected multi-touch operation is determined to be an operation allocated to a scroll to the right.
  • If the vector is in the negative direction along the X axis, the detected multi-touch operation is determined to be an operation allocated to a scroll to the left. If it is determined to be an operation allocated to a scroll, the control proceeds to step 330. Otherwise, the control returns to step 320, and the process following step 320 is repeated until the vector reaches the prescribed length.
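The per-track vector test of step 328 can be condensed into a small sketch. The function name and return convention are assumptions; following the text, only the X component of the start-to-end vector is examined here:

```python
def scroll_direction(tracks, min_length):
    """tracks: list of per-finger point sequences ([(x, y), ...]).

    For each track, take the X displacement from its start point to its
    last point. If every displacement is at least `min_length` long and
    all share one sign, report the scroll direction; otherwise None
    (gesture not long enough yet, or fingers disagree)."""
    signs = set()
    for t in tracks:
        dx = t[-1][0] - t[0][0]
        if abs(dx) < min_length:
            return None          # keep accumulating movement
        signs.add(dx > 0)
    if len(signs) != 1:
        return None              # fingers moving in opposite directions
    return 'right' if signs.pop() else 'left'
```

To detect scrolls in other directions, the same test would consider the Y component as well, as the text notes later.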
  • CPU 102 deletes the coordinate data representing the track of the touched points stored in RAM 106 at step 324. Specifically, CPU 102 deletes the drawing data written after the determination of multi-touch (for example, by writing data “0”), while maintaining in the overlay area the drawing data written before the determination of multi-touch (for example, drawing data drawn with a single touch). Therefore, on the screen image of display unit 114, the erroneous line drawn by the multi-touch operation is erased.
  • FIG. 7 shows a displayed screen image on display unit 114 during a scroll. An arrow 520 represents a scroll to the right. In FIG. 7 , drawing 500 formed by the user and displayed in FIG. 6 is maintained, while erroneous drawing 502 is erased. Since FIG. 7 shows a state in which scroll to the right has already been done to some extent, part of the drawing on the upper right portion is not shown.
  • CPU 102 executes the right or left scroll, in accordance with the result of determination at step 328 .
  • CPU 102 saves the contents of drawing (with the erroneous drawing erased) that have been temporarily stored in RAM 106 at steps 312 and 324 in storage unit 108 , and the control returns to step 300 . Since the contents drawn in the overlay area are saved in storage unit 108 , the image with drawing 500 intended by the user can again be displayed if a left scroll is designated later.
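The bookkeeping described above — keep the drawing made before the multi-touch determination, discard the tentative drawing once the gesture turns out to be a scroll — can be sketched as follows. The class and method names are assumptions; the patent describes the overlay as VRAM written with pen data, modeled here as a simple dictionary:

```python
class Overlay:
    """Minimal model of the overlay-area bookkeeping at steps 324/330."""

    def __init__(self):
        self.pixels = {}    # (x, y) -> pen value, the overlay contents
        self.pending = []   # points written since the multi-touch began

    def draw(self, point, tentative=False):
        self.pixels[point] = 1
        if tentative:
            # remember what to undo if the gesture becomes a scroll
            self.pending.append(point)

    def erase_tentative(self):
        """Erase only the post-multi-touch drawing (write '0' back),
        leaving earlier single-touch drawing intact."""
        for point in self.pending:
            self.pixels.pop(point, None)
        self.pending.clear()
```

After `erase_tentative()`, saving `pixels` to storage corresponds to step 330: the user's intended drawing survives, the erroneous multi-touch line does not.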
  • the background may be a uniform color image (for example, white, black or gray), provided that a function of drawing in response to a touch operation on touch detecting unit 112 is realized.
  • the function to be allocated may be any operation other than drawing and it may be an upward/downward scroll, a scroll to a diagonal direction, or a page switch without scroll. Further, the function to be allocated may be a function (operation) other than drawing, allocated to a function button. If an operation allocated to a scroll in a direction other than the left/right direction is to be detected, the scrolling direction may be determined considering not only the X component but also the Y component of the vector.
  • It is also possible for a plurality of users to touch electronic blackboard apparatus 100, each with a touch pen or a finger, and to draw a plurality of images simultaneously.
  • In that case, what is necessary is to determine whether the distance between points touched simultaneously is about the distance between the fingers of one hand (for example, at most a few centimeters).
  • The operation to which a function other than drawing is allocated is not limited to a multi-touch operation; it may be a single-touch operation, provided that the operation can be distinguished from the touch operation normally conducted for drawing.
  • the allocated operation other than drawing may be executed.
  • In the embodiment above, CPU 102 determines whether or not an operation is a multi-touch operation. Alternatively, a microcomputer in touch detecting unit 112 may determine whether or not an operation is a multi-touch operation, and may transmit the result to CPU 102.
  • An electronic blackboard apparatus in accordance with the second embodiment of the present invention has the same configuration as that shown in FIG. 1 representing the electronic blackboard apparatus in accordance with the first embodiment.
  • the method of detecting a touch input and the displayed screen images of the electronic blackboard apparatus in accordance with the second embodiment are also the same as those described with reference to FIGS. 2 and 3 regarding the electronic blackboard apparatus in accordance with the first embodiment. Therefore, in the following, the electronic blackboard apparatus in accordance with the second embodiment will be described as “electronic blackboard apparatus 100 ”, referring to FIGS. 1 to 3 where appropriate.
  • A program that, when executed by CPU 102 shown in FIG. 1 , realizes the page switch function of electronic blackboard apparatus 100 in accordance with the second embodiment using the hardware shown in FIG. 1 has the following control structure.
  • the program includes steps 600 , 602 and 604 .
  • CPU 102 monitors an output of touch detecting unit 112 , and in response to an output detecting a touch by the user from touch detecting unit 112 , the control flow proceeds to the next step.
  • CPU 102 again detects a new position of touching by a finger based on an output from touch detecting unit 112 , and the control proceeds to the next step.
  • CPU 102 determines whether or not the touch by the user is a multi-touch made with two or more fingers.
  • Touch detecting unit 112 has a function of outputting as many coordinate data as there are touched points, in response to the touching by the user. Therefore, the determination as described above can be made based on the outputs from touch detecting unit 112.
  • the program further includes a step 606 .
  • CPU 102 stores, if the determination at step 604 is positive, the touch positions detected at steps 600 and 602 in RAM 106 (see FIG. 1 ). Since a multi-touch is detected, here, coordinate values corresponding in number to the number of detected touches are stored. In the subsequent process, the coordinate data of each finger are detected for every detected touch position, and the series of data are stored as a sequence in RAM 106 .
  • the program further includes steps 608 , 610 and 630 .
  • CPU 102 detects the present positions of fingers from the outputs of touch detecting unit 112 .
  • CPU 102 determines whether any one of the multi-touch detection outputs is lost (whether or not any finger has been moved away from touch detecting unit 112 ). If the determination at step 610 is positive, at step 630 , CPU 102 compares the X coordinate of the last detected finger position with the X coordinate of the first finger position stored at step 606 , and determines whether or not an absolute value of difference between the two coordinates is larger than a threshold value D TH1 .
  • In the present embodiment, the direction of page movement is along the lateral direction of drawing area 230 ( FIG. 3 ) and, therefore, the X coordinates of finger positions are compared as described above.
  • If the determination at step 630 is positive, at step 632, a page displayed in drawing area 230 is changed in accordance with the direction of finger movement. Specifically, if the direction of finger movement is to the right in FIG. 3 , the next page of the current page is displayed on drawing area 230, and if it is to the left, the previous page of the current page is displayed on drawing area 230. If the determination at step 630 is negative, at step 634, CPU 102 cancels scrolling of the image displayed in drawing area 230. In the present embodiment, if fingers are slid to the left/right in the state of multi-touch, the image scrolls to the left/right correspondingly.
  • At step 634, the display on drawing area 230 is returned from the scrolled image to the display of the original page.
  • CPU 102 determines whether or not finger positions detected at step 608 are all in the left direction when viewed from the finger positions recorded at step 606 . If the determination is positive, at step 614 , CPU 102 calculates the amount of movement (absolute value of difference in X coordinate values), and determines whether or not the value is larger than a threshold value D TH2 .
  • the threshold value D TH2 here is larger than the threshold value D TH1 at step 630 .
  • the amount of movement is an average value of all amounts of movement of multi-touch finger positions.
  • If the determination at step 614 is positive, at step 616, CPU 102 sets the screen image to be displayed on drawing area 230 to the image of the previous page. Specifically, if the current image is of the second page, the image of the first page is displayed on drawing area 230 by the process of step 616. Thereafter, the control flow returns to step 600.
  • If the determination at step 614 is negative, at step 618, CPU 102 scrolls the screen image displayed on drawing area 230 to the left by the same length as the amount of movement of the finger positions calculated at step 614. At this time, if there is a previous page, the left end of the page is displayed at the right side of drawing area 230. Then, the control flow proceeds to step 608.
  • CPU 102 determines whether or not the finger positions detected at step 608 are in the right direction when viewed from the finger positions recorded at step 606 . If the determination is positive, at step 622 , CPU 102 calculates the amount of movement (absolute value of difference in X coordinate values), and determines whether or not the value is larger than a threshold value D TH3 .
  • the threshold value here is the same as the threshold value D TH2 at step 614 .
  • If the determination at step 622 is positive, at step 624, CPU 102 sets the screen image to be displayed on drawing area 230 to the image of the next page. Specifically, if the current image is of the second page, the image of the third page is displayed on drawing area 230. Thereafter, the control flow returns to step 600. On the other hand, if the determination at step 622 is negative, at step 626, CPU 102 scrolls the screen image displayed on drawing area 230 to the right by the same length as the amount of movement of the finger positions calculated at step 622. At this time, if there is a next page, the right end of the page is displayed at the left side of drawing area 230. Then, the control flow proceeds to step 608.
  • At step 628, CPU 102 executes a predetermined process in accordance with the values of the X and Y coordinates of the finger positions or the history thereof.
  • The process that takes place at step 628 may include a process of updating the display when a specific button is pressed with multi-touch, when a specific plurality of buttons are pressed with multi-touch, when a so-called pinch-out is done with the space between the fingers made wider, or when a so-called pinch-in is done with the space between the fingers made narrower.
  • At step 628, such a process is done. Since the contents of processing at step 628 are not related to the present invention, detailed description thereof will not be given here, for simplicity of description.
  • the control flow returns to step 608 .
  • The control structure of the routine realizing the process for a multi-touch operation is as described above.
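The multi-touch branch of FIG. 8 can be condensed into a small decision sketch. Names and the flat function shape are assumptions; D_TH1 is the on-release commit threshold of step 630, and D_TH2/D_TH3 (equal, per the text) are the larger in-gesture page-feed thresholds of steps 614/622. Sliding right feeds to the next page, sliding left to the previous one:

```python
def page_feed(start_x, xs, released, d_th1, d_th2):
    """start_x: averaged finger X at the start of the multi-touch.
    xs: averaged finger X positions sampled during the slide.
    released: True once a finger has left the screen (step 610).

    Returns ('next'|'previous'|'cancel'|'scroll', amount)."""
    dx = xs[-1] - start_x
    if released:
        if abs(dx) <= d_th1:
            return ('cancel', 0)      # step 634: undo the scroll
        return ('next' if dx > 0 else 'previous', 1)   # step 632
    if abs(dx) > d_th2:               # steps 614 / 622
        return ('next' if dx > 0 else 'previous', 1)   # steps 616 / 624
    return ('scroll', dx)             # steps 618 / 626: follow the fingers
```

Because the page-feed thresholds exceed the commit threshold, a short slide scrolls tentatively and can still be cancelled, while a long continuous slide feeds pages without the fingers ever leaving the screen.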
  • If it is determined at step 604 that the detected finger position is one and the operation is not a multi-touch operation, the process starting at step 636 is executed.
  • At step 636, CPU 102 determines whether or not the finger has moved away from drawing area 230 of touch detecting unit 112.
  • If the determination at step 636 is positive, at step 638, based on the finger position at the start of touching detected at step 600 and the finger position detected immediately before the finger moved away, CPU 102 executes a process in accordance with these positions.
  • the process when function button 242 , NEXT button 252 or PREVIOUS button 254 shown in FIG. 3 is pressed corresponds to the process that is executed here. Thereafter, the control flow returns to step 600 , and the output of touch detecting unit 112 is monitored until the next touch is detected.
  • If the determination at step 636 is negative, at step 640, CPU 102 determines whether or not the detected finger position is in drawing area 230. If the determination at step 640 is positive, at step 642, CPU 102 draws an image at the finger position in accordance with the drawing settings at that time. Thereafter, the control flow returns to step 602. If the determination at step 640 is negative, drawing is unnecessary and, therefore, the control flow directly returns to step 602.
  • Electronic blackboard apparatus 100 operates in the following manner. In the following, the operation of electronic blackboard apparatus 100 related mainly to the page feed will be described. Operations of portions not specifically related to the present invention will not be described here.
  • the third page of a document is displayed on the screen of display unit 114 .
  • a user 680 touches the screen surface of display unit 114 with two fingers, and slides the two fingers to the right as represented by an arrow 682 , with the fingers kept in touch with the surface.
  • the amount of sliding here is a distance D 1 , which is smaller than the threshold value D TH1 .
  • the control flow of the program at this time will be described.
  • At step 600, the first finger position is detected and, thereafter, the next finger position is detected at step 602.
  • The operation is a multi-touch operation and, therefore, the determination at step 604 is YES.
  • the first touched positions of two fingers are stored in RAM 106 at step 606 .
  • At step 608, the current finger positions are again detected. Assuming that the user continues sliding as shown in FIG. 9 , the determination at step 610 is negative. Therefore, at step 612, whether or not the two finger positions are moving to the left is determined. Here, the two finger positions are both moving to the right; therefore, the result of determination is negative. As a result, the control proceeds to step 620.
  • At step 620, whether or not the finger positions are moving to the right is determined. In the state shown in FIG. 9 , the result of determination is positive. Therefore, next, at step 622, whether or not the amount of movement of the fingers is larger than the threshold value D TH3 is determined. At the beginning of the repetition, the amount of movement is smaller than the threshold value and, therefore, the result of determination is negative. As a result, at step 626, the screen image is displayed scrolled to the right by an amount equal to the amount of movement of the fingers (the average of the amounts of movement of the two finger positions). Thereafter, the control returns to step 608 and the next repetition starts.
  • While the distance D 1 of finger movement is smaller than the threshold value D TH1 , the display on display unit 114 is as shown in FIG. 10 .
  • The screen image is scrolled to the right by the same distance as D 1 , and a right end portion 696 of the image of the next page is displayed.
  • At step 632, a process for feeding the screen image by one page to the right is executed. Specifically, as shown by an arrow 720 in FIG. 11 , the image is scrolled to the right by one page, and the next page is displayed on display unit 114 as shown in FIG. 12 .
  • The screen image before the movement is the third page, and the image after the movement is the fourth page.
  • The control then proceeds to step 600 , and CPU 102 executes the process of monitoring the output of touch detecting unit 112 until the user next touches the screen.
  • Assume that the user 680 further continues sliding to the right and the distance D 3 becomes larger than the threshold value D TH2 .
  • the result of determination at step 622 becomes positive, and at step 624 , the screen image is fed to the right by one page. Specifically, as shown in FIG. 14 , the screen image of the fourth page is displayed on display unit 114 . Thereafter, the control returns from step 624 to step 600 , and the next position of touching by the user is detected. If the user 680 continues sliding of his/her fingers to the right as shown by an arrow 760 , this sliding of the user is detected as a new touch at step 600 . Therefore, through the process of steps 600 , 602 and 604 , the operation described above is repeated, using the position touched by the user 680 at the time point of page switching as a head position.
  • When an upward/downward sliding, a pinch-out or a pinch-in takes place with multi-touch, the control flows through steps 610 , 612 , 620 and 628 of FIG. 8 while the sliding is being done, and the process in accordance with the user operation is executed. If any operation is done with a single touch, the control proceeds from step 604 to step 636 . Until the finger is moved away, the control proceeds from step 640 to step 642 if the operation is in drawing area 230 , and by the process at step 642 , an image is drawn at the finger position. If the operation is out of drawing area 230 , nothing happens in the present embodiment.
  • When the finger is moved away, at step 638, based on the first detected finger position and the finger position immediately before the finger is moved away, a process in accordance with these positions is executed. If the user touched any of function buttons 242 and then moved the finger away, a predetermined process corresponding to the touched function button 242 is executed at step 638 .
  • When a multi-touch sliding operation is done, the operation is as follows. During sliding, the screen image is scrolled in accordance with the amount of sliding. When the fingers are removed after sliding by the distance D 1 and the distance D 1 is equal to or smaller than the threshold value D TH1 , the scroll is cancelled and the screen image before sliding is resumed. If the distance D 1 is larger than the threshold value D TH1 , the screen image is moved by one page. In the second embodiment described above, the next page is displayed when slid to the right, and the previous page is displayed when slid to the left.
  • During multi-touch sliding, the screen image is scrolled, so the user can intuitively understand that the screen image can be scrolled by a multi-touch operation. If the touching fingers are removed in the course of multi-touch sliding and the amount of sliding is small, the scroll is canceled and the original screen image is resumed. Even if a multi-touch sliding operation is done erroneously during writing on the screen image, the screen image returns to the normal state immediately when the fingers are moved away. Therefore, even when an erroneous operation is done, the influence on the writing operation is small. On the other hand, if the fingers are moved away after sliding over a certain distance, switching to the next page (page feed) occurs.
  • Thus, the page feed can be realized by an intuitive multi-touch sliding operation. If sliding is further continued with the fingers kept in touch with the screen, page feed is automatically executed while the sliding continues, and the sliding operation can further be continued. Therefore, advantageously, the operation of continuous page feed is made simple.
  • In the embodiment above, it is determined at step 610 of FIG. 8 that sliding has ended if even one of the multi-touching fingers is moved away from the input screen.
  • the present invention is not limited to such an embodiment. It may be determined that sliding ended only when all fingers are moved away. In other words, if it is a multi-touch operation when sliding starts, sliding may be continued thereafter even when the multi-touching is lost, and similar effects as in the embodiment above can be attained.
  • In the embodiment above, the determination as to whether or not an operation is a multi-touch operation is made at step 604 of FIG. 8 , and if it is determined not to be a multi-touch operation, a process such as drawing, different from the page switching or scroll process, is executed.
  • the present invention is not limited to such an embodiment.
  • For example, the process of page switching or scroll may be executed only when the operation is a multi-touch with three or more fingers; if one or two fingers touch, a process such as drawing, other than the page switching or scroll, may be executed, even though two fingers constitute a multi-touch.
  • More generally, the process of page switching or scroll may be executed only when the operation is a multi-touch with N fingers, and a process such as drawing, other than the page switching or scroll, may be executed if the number of fingers is N−1 or smaller, with N being an integer larger than 1.
  • use of the average value of amounts of movement of a plurality of fingers in the second embodiment is not limiting.
  • the amount of movement may be calculated based on only the first detected one finger position.
  • threshold values D TH1 and D TH2 mentioned above may be changed in accordance with the speed of movement of finger positions during sliding.
  • the direction of sliding and page turning may be the upward/downward direction (Y axis direction). Sliding may be done in consideration of both X and Y axes directions. In other words, the present invention is applicable to an embodiment allowing sliding in a diagonal direction.
  • the first and second threshold values may be determined separately for the X axis direction and Y axis direction, or determined to be the same regardless of the direction.
  • The length of sliding need not be divided into components in the X axis direction and Y axis direction; whether a page is to be turned or not may be determined based on the length of the track of sliding (the distance between the start point and the end point).
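This direction-agnostic variant amounts to a single distance comparison. A minimal sketch, with an assumed function name:

```python
import math

def should_turn_page(start, end, threshold):
    """Decide a page turn from the straight-line distance between the
    start and end points of the slide, without splitting the movement
    into X and Y components."""
    return math.dist(start, end) > threshold
```

For example, a diagonal slide from (0, 0) to (30, 40) has length 50 and turns the page whenever the threshold is below that.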
  • the present invention is not limited to such an embodiment.
  • a dedicated pen may be used.
  • any method may be used provided that touching at a plurality of positions can be detected.

US13/370,049 2011-02-10 2012-02-09 Touch drawing display apparatus and operation method thereof, image display apparatus allowing touch-input, and controller for the display apparatus Abandoned US20120218203A1 (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
US14/823,681 US10191648B2 (en) 2011-02-10 2015-08-11 Touch drawing display apparatus and operation method thereof, image display apparatus allowing touch-input, and controller for the display apparatus

Applications Claiming Priority (4)

Application Number Priority Date Filing Date Title
JP2011-027237(P) 2011-02-10
JP2011027237A JP5537458B2 (ja) 2011-02-10 2011-02-10 タッチ入力可能な画像表示装置、表示装置の制御装置、及びコンピュータプログラム
JP2011-027236(P) 2011-02-10
JP2011027236A JP5536690B2 (ja) 2011-02-10 2011-02-10 タッチ描画表示装置及びその操作方法

Related Child Applications (1)

Application Number Title Priority Date Filing Date
US14/823,681 Division US10191648B2 (en) 2011-02-10 2015-08-11 Touch drawing display apparatus and operation method thereof, image display apparatus allowing touch-input, and controller for the display apparatus

Publications (1)

Publication Number Publication Date
US20120218203A1 true US20120218203A1 (en) 2012-08-30

Family

ID=46718650

Family Applications (2)

Application Number Title Priority Date Filing Date
US13/370,049 Abandoned US20120218203A1 (en) 2011-02-10 2012-02-09 Touch drawing display apparatus and operation method thereof, image display apparatus allowing touch-input, and controller for the display apparatus
US14/823,681 Active 2032-03-26 US10191648B2 (en) 2011-02-10 2015-08-11 Touch drawing display apparatus and operation method thereof, image display apparatus allowing touch-input, and controller for the display apparatus

Family Applications After (1)

Application Number Title Priority Date Filing Date
US14/823,681 Active 2032-03-26 US10191648B2 (en) 2011-02-10 2015-08-11 Touch drawing display apparatus and operation method thereof, image display apparatus allowing touch-input, and controller for the display apparatus

Country Status (2)

Country Link
US (2) US20120218203A1 (zh)
CN (2) CN102681721B (zh)

Cited By (65)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US20130127758A1 (en) * 2011-11-23 2013-05-23 Samsung Electronics Co., Ltd. Touch input apparatus and method in user terminal
US20130342486A1 (en) * 2012-06-22 2013-12-26 Smart Technologies Ulc Automatic annotation de-emphasis
US20140059626A1 (en) * 2012-08-17 2014-02-27 Flextronics Ap, Llc Systems and methods for providing video on demand in an intelligent television
US20140062914A1 (en) * 2012-09-03 2014-03-06 Acer Incorporated Electronic apparatus and control method using the same
US20140096071A1 (en) * 2012-10-03 2014-04-03 Konica Minolta, Inc. Display system, display device, and image forming device
WO2014082522A1 (zh) * 2012-11-30 2014-06-05 Xiaomi Inc. Method, apparatus, and mobile terminal for selecting an interface identifier
US20140160076A1 (en) * 2012-12-10 2014-06-12 Seiko Epson Corporation Display device, and method of controlling display device
US20140181730A1 (en) * 2012-12-21 2014-06-26 Orange Fragmented scrolling of a page
WO2014150725A1 (en) * 2013-03-15 2014-09-25 Qualcomm Incorporated Detection of a gesture performed with at least two control objects
US20140304625A1 (en) * 2013-04-03 2014-10-09 Alibaba Group Holding Limited Page returning
US20150026619A1 (en) * 2013-07-17 2015-01-22 Korea Advanced Institute Of Science And Technology User Interface Method and Apparatus Using Successive Touches
US9076085B2 (en) * 2012-02-15 2015-07-07 Canon Kabushiki Kaisha Image processing apparatus, image processing apparatus control method, and storage medium
US9124739B2 (en) 2013-03-25 2015-09-01 Konica Minolta, Inc. Image forming apparatus, page image displaying device, and display processing method
US20150268827A1 (en) * 2014-03-24 2015-09-24 Hideep Inc. Method for controlling moving direction of display object and a terminal thereof
US20150363026A1 (en) * 2014-06-16 2015-12-17 Touchplus Information Corp. Control device, operation mode altering method thereof, control method thereof and battery power warning method thereof
CN105487687A (zh) * 2015-11-23 2016-04-13 Guangzhou Shirui Electronics Co., Ltd. Handwriting display method and apparatus
US20160260410A1 (en) * 2015-03-03 2016-09-08 Seiko Epson Corporation Display apparatus and display control method
US20160313883A1 (en) * 2013-09-09 2016-10-27 Huawei Technologies Co., Ltd. Screen Capture Method, Apparatus, and Terminal Device
US20170046024A1 (en) * 2015-08-10 2017-02-16 Apple Inc. Devices, Methods, and Graphical User Interfaces for Manipulating User Interface Objects with Visual and/or Haptic Feedback
US9602729B2 (en) 2015-06-07 2017-03-21 Apple Inc. Devices and methods for capturing and interacting with enhanced digital images
US9612741B2 (en) 2012-05-09 2017-04-04 Apple Inc. Device, method, and graphical user interface for displaying additional information in response to a user contact
US9619076B2 (en) 2012-05-09 2017-04-11 Apple Inc. Device, method, and graphical user interface for transitioning between display states in response to a gesture
US9632664B2 (en) 2015-03-08 2017-04-25 Apple Inc. Devices, methods, and graphical user interfaces for manipulating user interface objects with visual and/or haptic feedback
US9639184B2 (en) 2015-03-19 2017-05-02 Apple Inc. Touch input cursor manipulation
US9645732B2 (en) 2015-03-08 2017-05-09 Apple Inc. Devices, methods, and graphical user interfaces for displaying and using menus
US9753639B2 (en) 2012-05-09 2017-09-05 Apple Inc. Device, method, and graphical user interface for displaying content associated with a corresponding affordance
US9778771B2 (en) 2012-12-29 2017-10-03 Apple Inc. Device, method, and graphical user interface for transitioning between touch input to display output relationships
US9785305B2 (en) 2015-03-19 2017-10-10 Apple Inc. Touch input cursor manipulation
US9830048B2 (en) 2015-06-07 2017-11-28 Apple Inc. Devices and methods for processing touch inputs with instructions in a web page
US9886184B2 (en) 2012-05-09 2018-02-06 Apple Inc. Device, method, and graphical user interface for providing feedback for changing activation states of a user interface object
US9891811B2 (en) 2015-06-07 2018-02-13 Apple Inc. Devices and methods for navigating between user interfaces
US9959025B2 (en) 2012-12-29 2018-05-01 Apple Inc. Device, method, and graphical user interface for navigating user interface hierarchies
US20180136812A1 (en) * 2012-07-16 2018-05-17 Samsung Electronics Co., Ltd. Touch and non-contact gesture based screen switching method and terminal
US9990121B2 (en) 2012-05-09 2018-06-05 Apple Inc. Device, method, and graphical user interface for moving a user interface object based on an intensity of a press input
US9990107B2 (en) 2015-03-08 2018-06-05 Apple Inc. Devices, methods, and graphical user interfaces for displaying and using menus
US9996231B2 (en) 2012-05-09 2018-06-12 Apple Inc. Device, method, and graphical user interface for manipulating framed graphical objects
US10037138B2 (en) 2012-12-29 2018-07-31 Apple Inc. Device, method, and graphical user interface for switching between user interfaces
US10042542B2 (en) 2012-05-09 2018-08-07 Apple Inc. Device, method, and graphical user interface for moving and dropping a user interface object
US10048757B2 (en) 2015-03-08 2018-08-14 Apple Inc. Devices and methods for controlling media presentation
US10067653B2 (en) 2015-04-01 2018-09-04 Apple Inc. Devices and methods for processing touch inputs based on their intensities
US10073615B2 (en) 2012-05-09 2018-09-11 Apple Inc. Device, method, and graphical user interface for displaying user interface objects corresponding to an application
US10078442B2 (en) 2012-12-29 2018-09-18 Apple Inc. Device, method, and graphical user interface for determining whether to scroll or select content based on an intensity threshold
US10095396B2 (en) 2015-03-08 2018-10-09 Apple Inc. Devices, methods, and graphical user interfaces for interacting with a control object while dragging another object
US10095391B2 (en) 2012-05-09 2018-10-09 Apple Inc. Device, method, and graphical user interface for selecting user interface objects
US10126930B2 (en) 2012-05-09 2018-11-13 Apple Inc. Device, method, and graphical user interface for scrolling nested regions
CN108958594A (zh) * 2018-05-23 2018-12-07 Zhengzhou Yunhai Information Technology Co., Ltd. Page jump method and device
US10162452B2 (en) 2015-08-10 2018-12-25 Apple Inc. Devices and methods for processing touch inputs based on their intensities
US10175757B2 (en) 2012-05-09 2019-01-08 Apple Inc. Device, method, and graphical user interface for providing tactile feedback for touch-based operations performed and reversed in a user interface
US10175864B2 (en) 2012-05-09 2019-01-08 Apple Inc. Device, method, and graphical user interface for selecting object within a group of objects in accordance with contact intensity
US10200598B2 (en) 2015-06-07 2019-02-05 Apple Inc. Devices and methods for capturing and interacting with enhanced digital images
US10235035B2 (en) 2015-08-10 2019-03-19 Apple Inc. Devices, methods, and graphical user interfaces for content navigation and manipulation
US10248308B2 (en) 2015-08-10 2019-04-02 Apple Inc. Devices, methods, and graphical user interfaces for manipulating user interfaces with physical gestures
US10275087B1 (en) 2011-08-05 2019-04-30 P4tents1, LLC Devices, methods, and graphical user interfaces for manipulating user interface objects with visual and/or haptic feedback
US10346030B2 (en) 2015-06-07 2019-07-09 Apple Inc. Devices and methods for navigating between user interfaces
US10416800B2 (en) 2015-08-10 2019-09-17 Apple Inc. Devices, methods, and graphical user interfaces for adjusting user interface objects
US10437333B2 (en) 2012-12-29 2019-10-08 Apple Inc. Device, method, and graphical user interface for forgoing generation of tactile output for a multi-contact gesture
CN110471640A (zh) * 2018-10-26 2019-11-19 Zhuhai Zhongdian Digital Technology Co., Ltd. Multi-screen interaction method and system
US10496260B2 (en) 2012-05-09 2019-12-03 Apple Inc. Device, method, and graphical user interface for pressure-based alteration of controls in a user interface
US10620781B2 (en) 2012-12-29 2020-04-14 Apple Inc. Device, method, and graphical user interface for moving a cursor according to a change in an appearance of a control icon with simulated three-dimensional characteristics
US10891044B1 (en) * 2016-10-25 2021-01-12 Twitter, Inc. Automatic positioning of content items in a scrolling display for optimal viewing of the items
US11119564B2 (en) * 2012-05-23 2021-09-14 Kabushiki Kaisha Square Enix Information processing apparatus, method for information processing, and game apparatus for performing different operations based on a movement of inputs
US20210311622A1 (en) * 2020-04-02 2021-10-07 Beijing Dajia Internet Information Technology Co., Ltd. Method and apparatus for obtaining content
US11169684B2 (en) * 2017-02-15 2021-11-09 Canon Kabushiki Kaisha Display control apparatuses, control methods therefor, and computer readable storage medium
US11209921B2 (en) * 2015-09-30 2021-12-28 Ricoh Company, Ltd. Electronic blackboard, storage medium, and information display method
US11368760B2 (en) 2012-08-17 2022-06-21 Flextronics Ap, Llc Applications generating statistics for user behavior

Families Citing this family (15)

Publication number Priority date Publication date Assignee Title
CN103229132B (zh) * 2012-11-23 2016-01-27 Huawei Technologies Co., Ltd. Method and apparatus for implementing remote browsing
JP5761216B2 (ja) * 2013-01-22 2015-08-12 Casio Computer Co., Ltd. Information processing apparatus, information processing method, and program
JP5984722B2 (ja) * 2013-03-22 2016-09-06 Sharp Corporation Information processing apparatus
CN104423854A (zh) * 2013-08-23 2015-03-18 Honghe Technology Co., Ltd. Touch screen information processing method and apparatus
CN104571890B (zh) * 2013-10-10 2017-12-19 Capital Microelectronics (Beijing) Technology Co., Ltd. Touch sliding display system, electronic device, and display method
JP5977768B2 (ja) * 2014-01-14 2016-08-24 Sharp Corporation Image display apparatus and operation method thereof
JP6043028B1 (ja) * 2015-01-16 2016-12-14 Olympus Corporation Ultrasound observation system
CN106168864A (zh) * 2015-05-18 2016-11-30 Canon Inc. Display control apparatus and display control method
JP6919174B2 (ja) * 2016-10-26 2021-08-18 Seiko Epson Corporation Touch panel device and touch panel control program
CN108205407B (zh) * 2016-12-20 2021-07-06 Sharp Corporation Display device, display method, and storage medium
JP6995605B2 (ja) * 2017-12-18 2022-01-14 Canon Inc. Electronic device, control method for electronic device, program, and storage medium
JP7064173B2 (ja) * 2018-05-11 2022-05-10 Fujifilm Business Innovation Corp. Information processing apparatus and program
JP2020042625A (ja) * 2018-09-12 2020-03-19 Tokai Rika Co., Ltd. Tactile presentation device and tactile presentation method
CN112346581A (zh) * 2019-08-07 2021-02-09 Nanjing ZTE New Software Co., Ltd. Method and apparatus for drawing a movement trajectory, and computer-readable storage medium
WO2021124729A1 (ja) * 2019-12-17 2021-06-24 Panasonic Intellectual Property Management Co., Ltd. Display control system, mobile body, display control method, display device, display method, and program

Citations (1)

Publication number Priority date Publication date Assignee Title
US5111316A (en) * 1990-08-09 1992-05-05 Western Publishing Company Liquid crystal writing slate

Family Cites Families (31)

Publication number Priority date Publication date Assignee Title
GB8908612D0 (en) * 1989-04-17 1989-06-01 Quantel Ltd Video graphics system
JPH06175776A (ja) 1992-11-27 1994-06-24 Wacom Co Ltd Presentation apparatus
US5761340A (en) 1993-04-28 1998-06-02 Casio Computer Co., Ltd. Data editing method and system for a pen type input device
JP3388451B2 (ja) 1993-05-21 2003-03-24 Casio Computer Co., Ltd. Handwriting input device
JPH0876926A (ja) 1994-09-02 1996-03-22 Brother Ind Ltd Image display apparatus
JPH09231004A (ja) 1996-02-23 1997-09-05 Yazaki Corp Information processing apparatus
JPH10124239A (ja) 1996-10-22 1998-05-15 Sharp Corp Tablet input device
JPH11102274A (ja) 1997-09-25 1999-04-13 Nec Corp Scrolling device
JP2000222130A (ja) 1999-02-02 2000-08-11 Toshiba Corp Input device, input method, and storage medium
JP2001117686A (ja) 1999-10-20 2001-04-27 Toshiba Corp Pen input device and pointing processing method for pen input device
JP4803883B2 (ja) 2000-01-31 2011-10-26 Canon Inc. Position information processing apparatus, method, and program
US7138983B2 (en) 2000-01-31 2006-11-21 Canon Kabushiki Kaisha Method and apparatus for detecting and interpreting path of designated position
GB0117543D0 (en) * 2001-07-18 2001-09-12 Hewlett Packard Co Document viewing device
US6690365B2 (en) * 2001-08-29 2004-02-10 Microsoft Corporation Automatic scrolling
KR100486711B1 (ko) 2002-08-12 2005-05-03 Samsung Electro-Mechanics Co., Ltd. Page turning apparatus and method for a personal information terminal
JP4157337B2 (ja) 2002-08-12 2008-10-01 Ricoh Co., Ltd. Display device with touch panel and control method for display device with touch panel
JP4215549B2 (ja) * 2003-04-02 2009-01-28 Fujitsu Ltd Information processing apparatus operating in touch panel mode and pointing device mode
TWI248576B (en) * 2004-07-05 2006-02-01 Elan Microelectronics Corp Method for controlling rolling of scroll bar on a touch panel
WO2006066456A1 (fr) * 2004-12-23 2006-06-29 Dong Li User interface device with inductive switches and associated portable terminal
JP2007316732A (ja) 2006-05-23 2007-12-06 Sharp Corp Item selection device, information processing device, and computer program for item selection
CN101226440A (zh) * 2007-01-17 2008-07-23 Hanwang Technology Co., Ltd. Touch-sensing key handwriting drawing board and implementation method
US8334847B2 (en) * 2007-10-19 2012-12-18 Qnx Software Systems Limited System having user interface using object selection and gestures
JP5239328B2 (ja) 2007-12-21 2013-07-17 Sony Corporation Information processing apparatus and touch operation recognition method
JP5098961B2 (ja) 2008-11-05 2012-12-12 NEC Corporation Image display apparatus, method, and program
CN101739190B (zh) * 2008-11-10 2012-09-05 Hanwang Technology Co., Ltd. Handwriting display device with capacitive touch keys
JP5232034B2 (ja) 2009-02-06 2013-07-10 Alps Electric Co., Ltd. Input processing device
US8493344B2 (en) * 2009-06-07 2013-07-23 Apple Inc. Devices, methods, and graphical user interfaces for accessibility using a touch-sensitive surface
FR2950169B1 (fr) * 2009-09-11 2012-03-23 Milibris Mobile terminal with a touch screen
US8749499B2 (en) * 2010-06-08 2014-06-10 Sap Ag Touch screen for bridging multi and/or single touch points to applications
CN104102422B (zh) * 2013-04-03 2018-05-01 Alibaba Group Holding Limited Method and apparatus for page return operation
KR102210045B1 (ko) * 2013-12-12 2021-02-01 Samsung Electronics Co., Ltd. Input control apparatus and method for an electronic device

Patent Citations (1)

Publication number Priority date Publication date Assignee Title
US5111316A (en) * 1990-08-09 1992-05-05 Western Publishing Company Liquid crystal writing slate

Cited By (192)

Publication number Priority date Publication date Assignee Title
US10338736B1 (en) 2011-08-05 2019-07-02 P4tents1, LLC Devices, methods, and graphical user interfaces for manipulating user interface objects with visual and/or haptic feedback
US10664097B1 (en) 2011-08-05 2020-05-26 P4tents1, LLC Devices, methods, and graphical user interfaces for manipulating user interface objects with visual and/or haptic feedback
US10656752B1 (en) 2011-08-05 2020-05-19 P4tents1, LLC Gesture-equipped touch screen system, method, and computer program product
US10649571B1 (en) 2011-08-05 2020-05-12 P4tents1, LLC Devices, methods, and graphical user interfaces for manipulating user interface objects with visual and/or haptic feedback
US10540039B1 (en) 2011-08-05 2020-01-21 P4tents1, LLC Devices and methods for navigating between user interface
US10386960B1 (en) 2011-08-05 2019-08-20 P4tents1, LLC Devices, methods, and graphical user interfaces for manipulating user interface objects with visual and/or haptic feedback
US10365758B1 (en) 2011-08-05 2019-07-30 P4tents1, LLC Devices, methods, and graphical user interfaces for manipulating user interface objects with visual and/or haptic feedback
US10345961B1 (en) 2011-08-05 2019-07-09 P4tents1, LLC Devices and methods for navigating between user interfaces
US10275087B1 (en) 2011-08-05 2019-04-30 P4tents1, LLC Devices, methods, and graphical user interfaces for manipulating user interface objects with visual and/or haptic feedback
US9158397B2 (en) * 2011-11-23 2015-10-13 Samsung Electronics Co., Ltd Touch input apparatus and method in user terminal
US20130127758A1 (en) * 2011-11-23 2013-05-23 Samsung Electronics Co., Ltd. Touch input apparatus and method in user terminal
US9076085B2 (en) * 2012-02-15 2015-07-07 Canon Kabushiki Kaisha Image processing apparatus, image processing apparatus control method, and storage medium
US11354033B2 (en) 2012-05-09 2022-06-07 Apple Inc. Device, method, and graphical user interface for managing icons in a user interface region
US10114546B2 (en) 2012-05-09 2018-10-30 Apple Inc. Device, method, and graphical user interface for displaying user interface objects corresponding to an application
US9996231B2 (en) 2012-05-09 2018-06-12 Apple Inc. Device, method, and graphical user interface for manipulating framed graphical objects
US10191627B2 (en) 2012-05-09 2019-01-29 Apple Inc. Device, method, and graphical user interface for manipulating framed graphical objects
US11947724B2 (en) 2012-05-09 2024-04-02 Apple Inc. Device, method, and graphical user interface for providing tactile feedback for operations performed in a user interface
US9990121B2 (en) 2012-05-09 2018-06-05 Apple Inc. Device, method, and graphical user interface for moving a user interface object based on an intensity of a press input
US10175864B2 (en) 2012-05-09 2019-01-08 Apple Inc. Device, method, and graphical user interface for selecting object within a group of objects in accordance with contact intensity
US10073615B2 (en) 2012-05-09 2018-09-11 Apple Inc. Device, method, and graphical user interface for displaying user interface objects corresponding to an application
US11068153B2 (en) 2012-05-09 2021-07-20 Apple Inc. Device, method, and graphical user interface for displaying user interface objects corresponding to an application
US9971499B2 (en) 2012-05-09 2018-05-15 Apple Inc. Device, method, and graphical user interface for displaying content associated with a corresponding affordance
US10481690B2 (en) 2012-05-09 2019-11-19 Apple Inc. Device, method, and graphical user interface for providing tactile feedback for media adjustment operations performed in a user interface
US11023116B2 (en) 2012-05-09 2021-06-01 Apple Inc. Device, method, and graphical user interface for moving a user interface object based on an intensity of a press input
US10175757B2 (en) 2012-05-09 2019-01-08 Apple Inc. Device, method, and graphical user interface for providing tactile feedback for touch-based operations performed and reversed in a user interface
US10126930B2 (en) 2012-05-09 2018-11-13 Apple Inc. Device, method, and graphical user interface for scrolling nested regions
US10496260B2 (en) 2012-05-09 2019-12-03 Apple Inc. Device, method, and graphical user interface for pressure-based alteration of controls in a user interface
US10095391B2 (en) 2012-05-09 2018-10-09 Apple Inc. Device, method, and graphical user interface for selecting user interface objects
US10592041B2 (en) 2012-05-09 2020-03-17 Apple Inc. Device, method, and graphical user interface for transitioning between display states in response to a gesture
US9886184B2 (en) 2012-05-09 2018-02-06 Apple Inc. Device, method, and graphical user interface for providing feedback for changing activation states of a user interface object
US10168826B2 (en) 2012-05-09 2019-01-01 Apple Inc. Device, method, and graphical user interface for transitioning between display states in response to a gesture
US10042542B2 (en) 2012-05-09 2018-08-07 Apple Inc. Device, method, and graphical user interface for moving and dropping a user interface object
US9612741B2 (en) 2012-05-09 2017-04-04 Apple Inc. Device, method, and graphical user interface for displaying additional information in response to a user contact
US10775994B2 (en) 2012-05-09 2020-09-15 Apple Inc. Device, method, and graphical user interface for moving and dropping a user interface object
US11314407B2 (en) 2012-05-09 2022-04-26 Apple Inc. Device, method, and graphical user interface for providing feedback for changing activation states of a user interface object
US10775999B2 (en) 2012-05-09 2020-09-15 Apple Inc. Device, method, and graphical user interface for displaying user interface objects corresponding to an application
US9823839B2 (en) 2012-05-09 2017-11-21 Apple Inc. Device, method, and graphical user interface for displaying additional information in response to a user contact
US10782871B2 (en) 2012-05-09 2020-09-22 Apple Inc. Device, method, and graphical user interface for providing feedback for changing activation states of a user interface object
US10884591B2 (en) 2012-05-09 2021-01-05 Apple Inc. Device, method, and graphical user interface for selecting object within a group of objects
US9753639B2 (en) 2012-05-09 2017-09-05 Apple Inc. Device, method, and graphical user interface for displaying content associated with a corresponding affordance
US10908808B2 (en) 2012-05-09 2021-02-02 Apple Inc. Device, method, and graphical user interface for displaying additional information in response to a user contact
US11221675B2 (en) 2012-05-09 2022-01-11 Apple Inc. Device, method, and graphical user interface for providing tactile feedback for operations performed in a user interface
US9619076B2 (en) 2012-05-09 2017-04-11 Apple Inc. Device, method, and graphical user interface for transitioning between display states in response to a gesture
US10942570B2 (en) 2012-05-09 2021-03-09 Apple Inc. Device, method, and graphical user interface for providing tactile feedback for operations performed in a user interface
US10969945B2 (en) 2012-05-09 2021-04-06 Apple Inc. Device, method, and graphical user interface for selecting user interface objects
US10996788B2 (en) 2012-05-09 2021-05-04 Apple Inc. Device, method, and graphical user interface for transitioning between display states in response to a gesture
US11010027B2 (en) 2012-05-09 2021-05-18 Apple Inc. Device, method, and graphical user interface for manipulating framed graphical objects
US11119564B2 (en) * 2012-05-23 2021-09-14 Kabushiki Kaisha Square Enix Information processing apparatus, method for information processing, and game apparatus for performing different operations based on a movement of inputs
US9323367B2 (en) * 2012-06-22 2016-04-26 Smart Technologies Ulc Automatic annotation de-emphasis
US20130342486A1 (en) * 2012-06-22 2013-12-26 Smart Technologies Ulc Automatic annotation de-emphasis
US20180136812A1 (en) * 2012-07-16 2018-05-17 Samsung Electronics Co., Ltd. Touch and non-contact gesture based screen switching method and terminal
US9167186B2 (en) 2012-08-17 2015-10-20 Flextronics Ap, Llc Systems and methods for managing data in an intelligent television
US9172896B2 (en) 2012-08-17 2015-10-27 Flextronics Ap, Llc Content-sensitive and context-sensitive user interface for an intelligent television
US11119579B2 (en) 2012-08-17 2021-09-14 Flextronics Ap, Llc On screen header bar for providing program information
US9432742B2 (en) 2012-08-17 2016-08-30 Flextronics Ap, Llc Intelligent channel changing
US9426527B2 (en) 2012-08-17 2016-08-23 Flextronics Ap, Llc Systems and methods for providing video on demand in an intelligent television
US9426515B2 (en) 2012-08-17 2016-08-23 Flextronics Ap, Llc Systems and methods for providing social media with an intelligent television
US9414108B2 (en) 2012-08-17 2016-08-09 Flextronics Ap, Llc Electronic program guide and preview window
US9380334B2 (en) 2012-08-17 2016-06-28 Flextronics Ap, Llc Systems and methods for providing user interfaces in an intelligent television
US9374546B2 (en) 2012-08-17 2016-06-21 Flextronics Ap, Llc Location-based context for UI components
US9369654B2 (en) 2012-08-17 2016-06-14 Flextronics Ap, Llc EPG data interface
US9363457B2 (en) 2012-08-17 2016-06-07 Flextronics Ap, Llc Systems and methods for providing social media with an intelligent television
US9686582B2 (en) 2012-08-17 2017-06-20 Flextronics Ap, Llc Systems and methods for managing data in an intelligent television
US9301003B2 (en) 2012-08-17 2016-03-29 Jamdeo Technologies Ltd. Content-sensitive user interface for an intelligent television
US9271039B2 (en) 2012-08-17 2016-02-23 Flextronics Ap, Llc Live television application setup behavior
US9264775B2 (en) 2012-08-17 2016-02-16 Flextronics Ap, Llc Systems and methods for managing data in an intelligent television
US9247174B2 (en) 2012-08-17 2016-01-26 Flextronics Ap, Llc Panel user interface for an intelligent television
US9237291B2 (en) 2012-08-17 2016-01-12 Flextronics Ap, Llc Method and system for locating programming on a television
US9232168B2 (en) 2012-08-17 2016-01-05 Flextronics Ap, Llc Systems and methods for providing user interfaces in an intelligent television
US9215393B2 (en) 2012-08-17 2015-12-15 Flextronics Ap, Llc On-demand creation of reports
US9191708B2 (en) 2012-08-17 2015-11-17 Jamdeo Technologies Ltd. Content-sensitive user interface for an intelligent television
US9191604B2 (en) 2012-08-17 2015-11-17 Flextronics Ap, Llc Systems and methods for providing user interfaces in an intelligent television
US9185324B2 (en) 2012-08-17 2015-11-10 Flextronics Ap, Llc Sourcing EPG data
US20140059626A1 (en) * 2012-08-17 2014-02-27 Flextronics Ap, Llc Systems and methods for providing video on demand in an intelligent television
US9185325B2 (en) 2012-08-17 2015-11-10 Flextronics Ap, Llc Systems and methods for providing video on demand in an intelligent television
US9185323B2 (en) 2012-08-17 2015-11-10 Flextronics Ap, Llc Systems and methods for providing social media with an intelligent television
US11150736B2 (en) 2012-08-17 2021-10-19 Flextronics Ap, Llc Systems and methods for providing user interfaces in an intelligent television
US11977686B2 (en) 2012-08-17 2024-05-07 Multimedia Technologies Pte. Ltd. Systems and methods for providing social media with an intelligent television
US10506294B2 (en) 2012-08-17 2019-12-10 Flextronics Ap, Llc Systems and methods for providing user interfaces in an intelligent television
US9167187B2 (en) 2012-08-17 2015-10-20 Flextronics Ap, Llc Systems and methods for providing video on demand in an intelligent television
US11368760B2 (en) 2012-08-17 2022-06-21 Flextronics Ap, Llc Applications generating statistics for user behavior
US9118864B2 (en) 2012-08-17 2015-08-25 Flextronics Ap, Llc Interactive channel navigation and switching
US9118967B2 (en) 2012-08-17 2015-08-25 Jamdeo Technologies Ltd. Channel changer for intelligent television
US9106866B2 (en) 2012-08-17 2015-08-11 Flextronics Ap, Llc Systems and methods for providing user interfaces in an intelligent television
US9077928B2 (en) 2012-08-17 2015-07-07 Flextronics Ap, Llc Data reporting of usage statistics
US9066040B2 (en) 2012-08-17 2015-06-23 Flextronics Ap, Llc Systems and methods for providing video on demand in an intelligent television
US11782512B2 (en) 2012-08-17 2023-10-10 Multimedia Technologies Pte, Ltd Systems and methods for providing video on demand in an intelligent television
US9055254B2 (en) 2012-08-17 2015-06-09 Flextronics Ap, Llc On screen method and system for changing television channels
US9055255B2 (en) 2012-08-17 2015-06-09 Flextronics Ap, Llc Live television application on top of live feed
US9021517B2 (en) * 2012-08-17 2015-04-28 Flextronics Ap, Llc Systems and methods for providing video on demand in an intelligent television
US11474615B2 (en) 2012-08-17 2022-10-18 Flextronics Ap, Llc Systems and methods for providing user interfaces in an intelligent television
US10051314B2 (en) 2012-08-17 2018-08-14 Jamdeo Technologies Ltd. Method and system for changing programming on a television
US9052773B2 (en) * 2012-09-03 2015-06-09 Acer Incorporated Electronic apparatus and control method using the same
US20140062914A1 (en) * 2012-09-03 2014-03-06 Acer Incorporated Electronic apparatus and control method using the same
US20140096071A1 (en) * 2012-10-03 2014-04-03 Konica Minolta, Inc. Display system, display device, and image forming device
WO2014082522A1 (zh) * 2012-11-30 2014-06-05 Xiaomi Inc. Method, apparatus, and mobile terminal for selecting an interface identifier
US9904414B2 (en) * 2012-12-10 2018-02-27 Seiko Epson Corporation Display device, and method of controlling display device
US20140160076A1 (en) * 2012-12-10 2014-06-12 Seiko Epson Corporation Display device, and method of controlling display device
US20140181730A1 (en) * 2012-12-21 2014-06-26 Orange Fragmented scrolling of a page
US9880726B2 (en) * 2012-12-21 2018-01-30 Orange Fragmented scrolling of a page
US10078442B2 (en) 2012-12-29 2018-09-18 Apple Inc. Device, method, and graphical user interface for determining whether to scroll or select content based on an intensity threshold
US9965074B2 (en) 2012-12-29 2018-05-08 Apple Inc. Device, method, and graphical user interface for transitioning between touch input to display output relationships
US10620781B2 (en) 2012-12-29 2020-04-14 Apple Inc. Device, method, and graphical user interface for moving a cursor according to a change in an appearance of a control icon with simulated three-dimensional characteristics
US9857897B2 (en) 2012-12-29 2018-01-02 Apple Inc. Device and method for assigning respective portions of an aggregate intensity to a plurality of contacts
US10101887B2 (en) 2012-12-29 2018-10-16 Apple Inc. Device, method, and graphical user interface for navigating user interface hierarchies
US9778771B2 (en) 2012-12-29 2017-10-03 Apple Inc. Device, method, and graphical user interface for transitioning between touch input to display output relationships
US10915243B2 (en) 2012-12-29 2021-02-09 Apple Inc. Device, method, and graphical user interface for adjusting content selection
US10175879B2 (en) 2012-12-29 2019-01-08 Apple Inc. Device, method, and graphical user interface for zooming a user interface while performing a drag operation
US9959025B2 (en) 2012-12-29 2018-05-01 Apple Inc. Device, method, and graphical user interface for navigating user interface hierarchies
US10185491B2 (en) 2012-12-29 2019-01-22 Apple Inc. Device, method, and graphical user interface for determining whether to scroll or enlarge content
US9996233B2 (en) 2012-12-29 2018-06-12 Apple Inc. Device, method, and graphical user interface for navigating user interface hierarchies
US10437333B2 (en) 2012-12-29 2019-10-08 Apple Inc. Device, method, and graphical user interface for forgoing generation of tactile output for a multi-contact gesture
US10037138B2 (en) 2012-12-29 2018-07-31 Apple Inc. Device, method, and graphical user interface for switching between user interfaces
WO2014150725A1 (en) * 2013-03-15 2014-09-25 Qualcomm Incorporated Detection of a gesture performed with at least two control objects
US9124739B2 (en) 2013-03-25 2015-09-01 Konica Minolta, Inc. Image forming apparatus, page image displaying device, and display processing method
WO2014165534A1 (en) * 2013-04-03 2014-10-09 Alibaba Group Holding Limited Page returning
US20140304625A1 (en) * 2013-04-03 2014-10-09 Alibaba Group Holding Limited Page returning
US9612736B2 (en) * 2013-07-17 2017-04-04 Korea Advanced Institute Of Science And Technology User interface method and apparatus using successive touches
US20150026619A1 (en) * 2013-07-17 2015-01-22 Korea Advanced Institute Of Science And Technology User Interface Method and Apparatus Using Successive Touches
US20160313883A1 (en) * 2013-09-09 2016-10-27 Huawei Technologies Co., Ltd. Screen Capture Method, Apparatus, and Terminal Device
US9983770B2 (en) * 2013-09-09 2018-05-29 Huawei Technologies Co., Ltd. Screen capture method, apparatus, and terminal device
US20150268827A1 (en) * 2014-03-24 2015-09-24 Hideep Inc. Method for controlling moving direction of display object and a terminal thereof
US20150363026A1 (en) * 2014-06-16 2015-12-17 Touchplus Information Corp. Control device, operation mode altering method thereof, control method thereof and battery power warning method thereof
US9898996B2 (en) * 2015-03-03 2018-02-20 Seiko Epson Corporation Display apparatus and display control method
US20160260410A1 (en) * 2015-03-03 2016-09-08 Seiko Epson Corporation Display apparatus and display control method
US11112957B2 (en) 2015-03-08 2021-09-07 Apple Inc. Devices, methods, and graphical user interfaces for interacting with a control object while dragging another object
US10095396B2 (en) 2015-03-08 2018-10-09 Apple Inc. Devices, methods, and graphical user interfaces for interacting with a control object while dragging another object
US9990107B2 (en) 2015-03-08 2018-06-05 Apple Inc. Devices, methods, and graphical user interfaces for displaying and using menus
US10402073B2 (en) 2015-03-08 2019-09-03 Apple Inc. Devices, methods, and graphical user interfaces for interacting with a control object while dragging another object
US9632664B2 (en) 2015-03-08 2017-04-25 Apple Inc. Devices, methods, and graphical user interfaces for manipulating user interface objects with visual and/or haptic feedback
US9645732B2 (en) 2015-03-08 2017-05-09 Apple Inc. Devices, methods, and graphical user interfaces for displaying and using menus
US10387029B2 (en) 2015-03-08 2019-08-20 Apple Inc. Devices, methods, and graphical user interfaces for displaying and using menus
US10338772B2 (en) 2015-03-08 2019-07-02 Apple Inc. Devices, methods, and graphical user interfaces for manipulating user interface objects with visual and/or haptic feedback
US11977726B2 (en) 2015-03-08 2024-05-07 Apple Inc. Devices, methods, and graphical user interfaces for interacting with a control object while dragging another object
US9645709B2 (en) 2015-03-08 2017-05-09 Apple Inc. Devices, methods, and graphical user interfaces for manipulating user interface objects with visual and/or haptic feedback
US10180772B2 (en) 2015-03-08 2019-01-15 Apple Inc. Devices, methods, and graphical user interfaces for manipulating user interface objects with visual and/or haptic feedback
US10048757B2 (en) 2015-03-08 2018-08-14 Apple Inc. Devices and methods for controlling media presentation
US10067645B2 (en) 2015-03-08 2018-09-04 Apple Inc. Devices, methods, and graphical user interfaces for manipulating user interface objects with visual and/or haptic feedback
US10860177B2 (en) 2015-03-08 2020-12-08 Apple Inc. Devices, methods, and graphical user interfaces for manipulating user interface objects with visual and/or haptic feedback
US10613634B2 (en) 2015-03-08 2020-04-07 Apple Inc. Devices and methods for controlling media presentation
US10268342B2 (en) 2015-03-08 2019-04-23 Apple Inc. Devices, methods, and graphical user interfaces for manipulating user interface objects with visual and/or haptic feedback
US10268341B2 (en) 2015-03-08 2019-04-23 Apple Inc. Devices, methods, and graphical user interfaces for manipulating user interface objects with visual and/or haptic feedback
US11550471B2 (en) 2015-03-19 2023-01-10 Apple Inc. Touch input cursor manipulation
US10599331B2 (en) 2015-03-19 2020-03-24 Apple Inc. Touch input cursor manipulation
US11054990B2 (en) 2015-03-19 2021-07-06 Apple Inc. Touch input cursor manipulation
US10222980B2 (en) 2015-03-19 2019-03-05 Apple Inc. Touch input cursor manipulation
US9639184B2 (en) 2015-03-19 2017-05-02 Apple Inc. Touch input cursor manipulation
US9785305B2 (en) 2015-03-19 2017-10-10 Apple Inc. Touch input cursor manipulation
US10152208B2 (en) 2015-04-01 2018-12-11 Apple Inc. Devices and methods for processing touch inputs based on their intensities
US10067653B2 (en) 2015-04-01 2018-09-04 Apple Inc. Devices and methods for processing touch inputs based on their intensities
US9916080B2 (en) 2015-06-07 2018-03-13 Apple Inc. Devices and methods for navigating between user interfaces
US10303354B2 (en) 2015-06-07 2019-05-28 Apple Inc. Devices and methods for navigating between user interfaces
US9891811B2 (en) 2015-06-07 2018-02-13 Apple Inc. Devices and methods for navigating between user interfaces
US11835985B2 (en) 2015-06-07 2023-12-05 Apple Inc. Devices and methods for capturing and interacting with enhanced digital images
US9830048B2 (en) 2015-06-07 2017-11-28 Apple Inc. Devices and methods for processing touch inputs with instructions in a web page
US9706127B2 (en) 2015-06-07 2017-07-11 Apple Inc. Devices and methods for capturing and interacting with enhanced digital images
US10200598B2 (en) 2015-06-07 2019-02-05 Apple Inc. Devices and methods for capturing and interacting with enhanced digital images
US10841484B2 (en) 2015-06-07 2020-11-17 Apple Inc. Devices and methods for capturing and interacting with enhanced digital images
US10455146B2 (en) 2015-06-07 2019-10-22 Apple Inc. Devices and methods for capturing and interacting with enhanced digital images
US11240424B2 (en) 2015-06-07 2022-02-01 Apple Inc. Devices and methods for capturing and interacting with enhanced digital images
US10346030B2 (en) 2015-06-07 2019-07-09 Apple Inc. Devices and methods for navigating between user interfaces
US9860451B2 (en) 2015-06-07 2018-01-02 Apple Inc. Devices and methods for capturing and interacting with enhanced digital images
US11231831B2 (en) 2015-06-07 2022-01-25 Apple Inc. Devices and methods for content preview based on touch input intensity
US10705718B2 (en) 2015-06-07 2020-07-07 Apple Inc. Devices and methods for navigating between user interfaces
US11681429B2 (en) 2015-06-07 2023-06-20 Apple Inc. Devices and methods for capturing and interacting with enhanced digital images
US9602729B2 (en) 2015-06-07 2017-03-21 Apple Inc. Devices and methods for capturing and interacting with enhanced digital images
US10963158B2 (en) 2015-08-10 2021-03-30 Apple Inc. Devices, methods, and graphical user interfaces for manipulating user interface objects with visual and/or haptic feedback
US20170046024A1 (en) * 2015-08-10 2017-02-16 Apple Inc. Devices, Methods, and Graphical User Interfaces for Manipulating User Interface Objects with Visual and/or Haptic Feedback
US10209884B2 (en) 2015-08-10 2019-02-19 Apple Inc. Devices, Methods, and Graphical User Interfaces for Manipulating User Interface Objects with Visual and/or Haptic Feedback
US10416800B2 (en) 2015-08-10 2019-09-17 Apple Inc. Devices, methods, and graphical user interfaces for adjusting user interface objects
US10203868B2 (en) 2015-08-10 2019-02-12 Apple Inc. Devices, methods, and graphical user interfaces for manipulating user interface objects with visual and/or haptic feedback
US10162452B2 (en) 2015-08-10 2018-12-25 Apple Inc. Devices and methods for processing touch inputs based on their intensities
US11182017B2 (en) 2015-08-10 2021-11-23 Apple Inc. Devices and methods for processing touch inputs based on their intensities
US10884608B2 (en) 2015-08-10 2021-01-05 Apple Inc. Devices, methods, and graphical user interfaces for content navigation and manipulation
US9880735B2 (en) * 2015-08-10 2018-01-30 Apple Inc. Devices, methods, and graphical user interfaces for manipulating user interface objects with visual and/or haptic feedback
CN107533368A (zh) * 2015-08-10 2018-01-02 Apple Inc. Devices, methods, and graphical user interfaces for manipulating user interface objects with visual and/or haptic feedback
US11740785B2 (en) 2015-08-10 2023-08-29 Apple Inc. Devices, methods, and graphical user interfaces for manipulating user interface objects with visual and/or haptic feedback
US10235035B2 (en) 2015-08-10 2019-03-19 Apple Inc. Devices, methods, and graphical user interfaces for content navigation and manipulation
US11327648B2 (en) 2015-08-10 2022-05-10 Apple Inc. Devices, methods, and graphical user interfaces for manipulating user interface objects with visual and/or haptic feedback
US10698598B2 (en) 2015-08-10 2020-06-30 Apple Inc. Devices, methods, and graphical user interfaces for manipulating user interface objects with visual and/or haptic feedback
US10248308B2 (en) 2015-08-10 2019-04-02 Apple Inc. Devices, methods, and graphical user interfaces for manipulating user interfaces with physical gestures
US10754542B2 (en) 2015-08-10 2020-08-25 Apple Inc. Devices, methods, and graphical user interfaces for manipulating user interface objects with visual and/or haptic feedback
US11209921B2 (en) * 2015-09-30 2021-12-28 Ricoh Company, Ltd. Electronic blackboard, storage medium, and information display method
CN105487687A (zh) * 2015-11-23 2016-04-13 Guangzhou Shiyuan Electronics Co., Ltd. Handwriting display method and apparatus
US11531460B2 (en) * 2016-10-25 2022-12-20 Twitter, Inc. Automatic positioning of content items in a scrolling display for optimal viewing of the items
US10891044B1 (en) * 2016-10-25 2021-01-12 Twitter, Inc. Automatic positioning of content items in a scrolling display for optimal viewing of the items
US20210096714A1 (en) * 2016-10-25 2021-04-01 Twitter, Inc. Automatic positioning of content items in a scrolling display for optimal viewing of the items
US11169684B2 (en) * 2017-02-15 2021-11-09 Canon Kabushiki Kaisha Display control apparatuses, control methods therefor, and computer readable storage medium
CN108958594A (zh) * 2018-05-23 2018-12-07 Zhengzhou Yunhai Information Technology Co., Ltd. Page jump method and device
CN110471640A (zh) * 2018-10-26 2019-11-19 Zhuhai CEC Digital Technology Co., Ltd. Multi-screen interaction method and system
US11474689B2 (en) * 2020-04-02 2022-10-18 Beijing Dajia Internet Information Technology Co., Ltd. Method and apparatus for obtaining content
US20210311622A1 (en) * 2020-04-02 2021-10-07 Beijing Dajia Internet Information Technology Co., Ltd. Method and apparatus for obtaining content

Also Published As

Publication number Publication date
CN102681721B (zh) 2015-04-01
CN104636049A (zh) 2015-05-20
CN104636049B (zh) 2018-04-27
CN102681721A (zh) 2012-09-19
US20150346945A1 (en) 2015-12-03
US10191648B2 (en) 2019-01-29

Similar Documents

Publication Publication Date Title
US10191648B2 (en) Touch drawing display apparatus and operation method thereof, image display apparatus allowing touch-input, and controller for the display apparatus
JP5537458B2 (ja) Image display device allowing touch input, control device for the display device, and computer program
US8633906B2 (en) Operation control apparatus, operation control method, and computer program
JP5536690B2 (ja) Touch drawing display device and operation method thereof
CN110647248B (zh) Image display device and operation method thereof
US20190018585A1 (en) Touch operation method based on interactive electronic white board and system thereof
US10719228B2 (en) Image processing apparatus, image processing system, and image processing method
US20170336932A1 (en) Image display apparatus allowing operation of image screen and operation method thereof
US11150749B2 (en) Control module for stylus with whiteboard-style erasure
US10747425B2 (en) Touch operation input device, touch operation input method and program
JP2009198734A (ja) Multi-display control method, control program, and multi-display apparatus
JP5905783B2 (ja) Image display system
JP2012168621A (ja) Touch drawing display device and operation method thereof
JP2009116727A (ja) Image input display device
JP2013178701A (ja) Touch drawing display device using multiple windows
US20190317617A1 (en) Terminal Device And Recording Medium
JP5782157B2 (ja) Image display device allowing touch input, control device for the display device, and computer program
KR100899035B1 (ko) Electronic blackboard system using a plurality of display panels and operating method thereof
JP5801920B2 (ja) Touch drawing display device and operation method thereof
JP6087602B2 (ja) Electronic blackboard
JP6584876B2 (ja) Information processing apparatus, information processing program, and information processing method
KR20150114332A (ko) Smart board and control method thereof
JP6068428B2 (ja) Control method and control device for image display system
JP2016186524A (ja) Display system, display device, information processing device, and control method
KR20150114329A (ko) Smart board and control method thereof

Legal Events

Date Code Title Description
AS Assignment

Owner name: SHARP KABUSHIKI KAISHA, JAPAN

Free format text: ASSIGNMENT OF ASSIGNORS INTEREST;ASSIGNOR:KANKI, NORIYOSHI;REEL/FRAME:027687/0985

Effective date: 20120106

STCB Information on status: application discontinuation

Free format text: ABANDONED -- FAILURE TO RESPOND TO AN OFFICE ACTION