JP6221622B2 - Touch panel device and image forming apparatus - Google Patents

Touch panel device and image forming apparatus

Info

Publication number
JP6221622B2
JP6221622B2 (application JP2013219781A)
Authority
JP
Japan
Prior art keywords
image
operation
display screen
touch panel
boundary
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Active
Application number
JP2013219781A
Other languages
Japanese (ja)
Other versions
JP2015082210A (en)
Inventor
村石 理恵 (Rie Muraishi)
Original Assignee
富士ゼロックス株式会社 (Fuji Xerox Co., Ltd.)
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Fuji Xerox Co., Ltd. (富士ゼロックス株式会社)
Priority to JP2013219781A
Publication of JP2015082210A
Application granted
Publication of JP6221622B2
Legal status: Active


Description

  The present invention relates to a touch panel device and an image forming apparatus.

  Touch panel devices are mounted on image forming apparatuses and on various other apparatuses.

  Here, Patent Document 1 describes storing a touch operation area schema for each application and operating the application by calling the schema required for execution control.

  Further, Patent Document 2 describes changing the detection sensitivity between the tap operation and the drag operation based on the transition of the capacitance value before and after the touch operation and on the operation direction, and changing the sensitivity depending on the operation region and the travel mode.

  Further, Patent Document 3 describes learning and updating the tap region of an object.

JP 2004-355106 A (Patent Document 1)
JP 2012-242924 A (Patent Document 2)
JP 2012-216127 A (Patent Document 3)

  An object of the present invention is to provide a touch panel device in which a user's operation intention is correctly reflected and an image forming apparatus including the touch panel device.

Claim 1
A touch panel device comprising: a touch panel including a display screen that serves to display images and a sensor responsible for detecting the contact position on the display screen; and a drawing / operation determination device having: an image drawing unit that draws an image on the display screen; a boundary setting unit that sets, so as to include the contact start point of contact with the display screen, a boundary region having a spread corresponding to the image drawn by the image drawing unit or to an area on that image; and an operation determination unit that determines a first operation mode when, after contact with the display screen is detected, separation from the display screen is detected without the contact point moving beyond the boundary region set to include the contact start point, and a second operation mode when, after contact with the display screen is detected, separation from the display screen is detected following movement of the contact point beyond that boundary region,
wherein, when the image drawn by the image drawing unit, or an area on that image, accepts operations in the second operation mode only in the horizontal direction and does not accept operations in the second operation mode in the vertical direction, the boundary setting unit sets a boundary region whose vertical height is greater than its horizontal width.

Claim 2 is an image forming apparatus characterized by comprising the touch panel device according to claim 1, and an image forming unit that executes image forming processing according to operations on the touch panel device.

According to the touch panel device of the first aspect and the image forming apparatus of the second aspect, the user's operation intention is reflected more correctly than in the case where a boundary region having a constant spread is always set.

Further, according to the touch panel device of claim 1 and the image forming apparatus of claim 2, the user's intended operations in the vertical and horizontal directions are each reflected more correctly.

FIG. 1 is an external view of a multifunction peripheral as an embodiment of the image forming apparatus of the present invention.
FIG. 2 is a block diagram showing the internal configuration of the multifunction peripheral whose appearance is shown in FIG. 1.
FIG. 3 is an explanatory diagram of problems that arise when a touch panel is operated with a finger.
FIG. 4 is a diagram showing the configuration of the touch panel and the UI controller.
FIG. 5 is a flowchart showing the processing performed when a press event is received in the UI controller.
FIG. 6 is a flowchart showing the boundary region setting process.
FIG. 7 is a diagram showing an example of an image displayed on the display screen of the touch panel.
FIG. 8 is a diagram showing the menu image with the second page displayed.
FIG. 9 is a diagram showing a copy menu image displayed in place of the menu image when the "copy" push button is tapped in the menu image of FIG. 7.
FIG. 10 is a diagram showing the second page of the "output format" submenu image.
FIG. 11 is a diagram showing an example of a scan preview image.
FIG. 12 is a schematic diagram showing the same scan preview image as FIG. 11, with the entire area of one image, based on the image data read by the scanner, superimposed.
FIG. 13 is a flowchart showing a boundary region setting process that replaces the one shown in FIG. 6.

  Embodiments of the present invention will be described below.

  FIG. 1 is an external view of a multifunction peripheral as an embodiment of the image forming apparatus of the present invention.

  The multifunction machine 10 includes a scanner 11 and a printer 12. A paper tray 13 for storing paper before printing is provided below the printer 12. The multifunction machine 10 is also provided with a user interface (hereinafter abbreviated as "UI") 14. The UI 14 includes a touch panel 141 having a display screen and an operation panel 142 on which operation buttons are arranged. The touch panel 141 includes not only a display screen but also a touch sensor; when the touch panel 141 is touched with a finger, the contact position of the finger on the touch panel 141 is detected. That is, when an image including an operation button is displayed on the touch panel 141 and a finger is brought into contact with the touch panel 141 so as to overlap the operation button in the image, the instruction corresponding to that operation button is entered into the multifunction machine 10.

  The touch panel 141 employed in the present embodiment includes a pressure-sensitive sensor. This sensor has two ITO films (transparent conductive films) facing each other; when the panel is pressed with a finger, the two films come into contact at the pressed position and conduct, and the contact location is detected by utilizing the fact that the resistance of the film is divided at that location. If the pressed location has a spread, the point at its center of gravity is detected.
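
As a rough illustration of this centroid detection, the following Python sketch computes the center of gravity of a pressed region from per-point pressure samples. The readout format (a list of (x, y, pressure) tuples) and the function name are assumptions for illustration; the patent does not specify the sensor interface.

```python
def pressed_centroid(samples):
    """Return the pressure-weighted center of gravity of a pressed region.

    `samples` is a hypothetical list of (x, y, pressure) tuples for the
    points the pressure-sensitive sensor reports as pressed; the real
    readout format of the ITO-film sensor is not specified in the patent.
    """
    total = sum(p for _, _, p in samples)
    if total == 0:
        return None  # nothing pressed
    cx = sum(x * p for x, _, p in samples) / total
    cy = sum(y * p for _, y, p in samples) / total
    return (cx, cy)

# Example: a fingertip pressing hardest at (10, 21).
print(pressed_centroid([(10, 20, 0.2), (10, 21, 0.6), (11, 21, 0.2)]))  # ~(10.2, 20.8)
```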

  FIG. 2 is a block diagram showing the internal configuration of the multi-function peripheral whose appearance is shown in FIG.

  In addition to the scanner 11, printer 12, and UI 14 described with reference to FIG. 1, the multifunction machine 10 further includes a facsimile transmission / reception unit 15, a network interface unit 16, and a control unit 17.

  The scanner 11 is an apparatus that reads an image on a document and generates image data representing the image on the document. The image data generated by the scanner 11 is transferred to any one of the printer 12, the facsimile transmission / reception unit 15, and the network interface unit 16.

  The printer 12 is a device that prints images based on image data onto paper. The image data underlying an image printed by the printer 12 may be image data generated by the scanner 11, but it is not limited to that; the printer 12 also prints images based on image data received by the facsimile transmission / reception unit 15 or by the network interface unit 16.

  The facsimile transmission / reception unit 15 is the element that is connected to a telephone line (not shown) and performs facsimile transmission and reception. It takes image data generated by the scanner 11, or image data received by the network interface unit 16 described below, and transmits it by facsimile; it also receives image data by facsimile and passes it to the printer 12 or the network interface unit 16.

  The network interface unit 16 is connected via a network to a personal computer for image editing (hereinafter abbreviated as "PC"; not shown); it transmits the image data generated by the scanner 11 to the PC, and receives image data from the PC and transfers it to the printer 12 or the facsimile transmission / reception unit 15.

  The user interface (UI) 14 includes a touch panel 141 and an operation panel 142 as described with reference to FIG. The UI 14 is responsible for exchanging information between the user and the multifunction machine 10.

  The control unit 17 is responsible for controlling all elements of the multifunction machine 10. Of the elements constituting the control unit 17, FIG. 2 shows the UI controller 171, which controls the touch panel 141. The combination of the touch panel 141 and the UI controller 171 in the present embodiment corresponds to an example of the touch panel device of the present invention.

  As described above, the multifunction machine 10 is a device having a function as a copier, a function as a facsimile machine, and a function as a scanner.

  The present embodiment is characterized by how user operations made by touching the touch panel 141 with a finger are handled, and this is described in detail below.

  FIG. 3 is an explanatory diagram of problems when the touch panel is operated with a finger.

  FIG. 3A is an explanatory diagram of erroneous detection by pressing.

  When the touch panel 141 is pushed in with the finger 2, the center of pressure moves from the fingertip toward the belly of the finger. The detected coordinates therefore move over time, and although the user intends simply to press the touch panel 141 (a tap operation), the device may misrecognize the movement as a drag operation. This pressing pattern is often seen with users who have long fingernails, and it is likely to occur when pressing a push button displayed on the touch panel.

  FIGS. 3B and 3C are explanatory diagrams of erroneous detection in a flick operation (an operation in which a finger is placed on the touch panel and released while being swept across it).

FIG. 3B shows a strong flick operation, in which the detection point is recognized as moving over time through the detection points p1, p2, and p3. FIG. 3C, on the other hand, shows a light flick operation: the contact start point p1 is detected, but the contact point p2 after the subsequent movement is not detected, because the contact pressure has weakened. Although the user intends a flick operation, the device may misrecognize it as a tap operation (an operation of pressing and releasing the touch panel). This tends to occur on images that involve frequent page turning.

  In this embodiment, the occurrence of these erroneous recognitions is suppressed.

  FIG. 4 is a diagram illustrating the configuration of the touch panel and the UI controller.

  FIG. 4 shows the touch panel 141, which constitutes the UI 14 (see FIGS. 1 and 2), and the UI controller 171, which constitutes part of the control unit 17 (see FIG. 2).

  The touch panel 141 includes, as hardware, a display screen 141a and a pressure-sensitive touch sensor 141b.

  The control unit 17 shown in FIG. 2 includes a CPU, a memory, and the like (not shown). When the CPU executes a program, the function of the UI controller 171 described below, as well as the functions that control the other elements of the multifunction machine 10, are realized.

  The UI controller 171 includes a UI control device 171a and a drawing / operation determination device 171b. As described above, the UI control device 171a and the drawing / operation determination device 171b are each realized by a combination of hardware, such as the CPU, and the program executed on it.

  The UI control device 171a instructs the drawing / operation determination device 171b which image to display on the display screen 141a of the touch panel 141. The UI control device 171a also receives, from the drawing / operation determination device 171b, information on the current operation position on the touch panel 141 (the coordinates pressed by the finger) and the type of operation (tap, drag, etc.), and, by comparison with the image currently displayed on the touch panel 141, recognizes which operation button has been pressed, or in which direction the image should be moved or the page updated. When the image displayed on the display screen 141a of the touch panel 141 needs to be changed, the UI control device 171a instructs the drawing / operation determination device 171b to change the image. When a push button requiring processing other than image display is pressed, the UI control device 171a outputs information indicating which push button was pressed; the control unit 17 shown in FIG. 2 receives that information and executes the process corresponding to the pressed button.

  The drawing / operation determination device 171 b includes an image drawing unit 71, an operation determination unit 72, and a boundary setting unit 73.

  The image drawing unit 71 draws an image corresponding to the instruction from the UI control device 171a on the display screen 141a of the touch panel 141.

  The operation determination unit 72 acquires the coordinates of the contact point on the touch panel 141 (display screen 141a), and their change over time, from the touch sensor 141b, and determines which type of operation has been performed. This determination result is reported to the UI control device 171a together with the coordinates of the contact point. Upon receiving this notification, the UI control device 171a instructs a change of the image displayed on the display screen 141a based on the notification, or instructs other elements (not shown) to execute the process corresponding to the operated push button.

  In addition, the boundary setting unit 73 sets a boundary region in the operation determination unit 72. This boundary region is set so as to include the contact start point of the finger on the touch panel 141, and the operation determination unit 72 uses it to determine, from whether the finger that touched the display screen 141a moved beyond the boundary region, whether a drag operation or a tap operation was performed. That is, when the operation determination unit 72 detects contact with the display screen 141a and then detects that the finger left the display screen 141a without the contact point moving beyond the boundary region set to include the contact start point, it determines that a tap operation was performed. When, after detecting contact with the display screen 141a, it detects movement of the finger beyond the set boundary region, it determines that a drag operation was performed. The operation determination unit 72 further determines, from the moving speed of the finger at the release following a drag operation, whether the operation was a drag release or a flick. Here, the tap operation corresponds to an example of the first operation mode according to the present invention, and the drag and flick operations correspond to examples of the second operation mode.
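
The discrimination just described can be pictured with the following minimal Python sketch. The rectangular region representation and the names are illustrative assumptions; the patent defines only the behavior (first mode = release without crossing the boundary region, second mode = release after crossing it), not a data structure.

```python
def exceeded(start, current, region):
    """True once the contact point has moved beyond the boundary region
    set around the contact start point. `region` is a hypothetical
    (width, height) pair giving the region's horizontal and vertical spread."""
    dx = abs(current[0] - start[0])
    dy = abs(current[1] - start[1])
    return dx > region[0] / 2 or dy > region[1] / 2

def classify(start, trace, region):
    """Classify a completed contact: `trace` is the sequence of contact
    points observed between touch and release. Tap corresponds to the
    first operation mode, drag/flick to the second."""
    if any(exceeded(start, p, region) for p in trace):
        return "drag_or_flick"  # boundary region crossed before release
    return "tap"                # released without crossing the region
```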

  Each boundary region is registered in the boundary setting unit 73 in association with one of the plurality of images that may be displayed on the display screen 141a, and these boundary regions differ in spread. When setting a boundary region in the operation determination unit 72, the boundary setting unit 73 sets the boundary region corresponding to the image drawn on the display screen 141a by the image drawing unit 71; in other words, the set boundary region has a spread corresponding to that image. Thus, depending on the image displayed on the display screen 141a, the operation determination unit 72 may determine that a tap operation was performed even though the finger moved somewhat, or may determine that a drag operation was performed from only a slight movement of the finger. As shown in the specific examples described later, when the image drawn on the display screen 141a by the image drawing unit 71 is one in which push buttons, for which tapping is the main operation, are arranged side by side, the boundary setting unit 73 sets a wider boundary region in the operation determination unit 72 than for an image on which drag and flick operations are the main operations. Depending on the image, the boundary setting unit 73 also sets boundary regions with different vertical and horizontal dimensions. Details are described later.

  FIG. 5 is a flowchart showing processing at the time of receiving a press event in the UI controller.

  When a press event is received, that is, information from the touch sensor 141b indicating that the touch panel 141 has been pressed with a finger, execution of the processing shown in FIG. 5 starts.

  Here, first, the coordinates on the touch panel 141 at the time of pressing and the pressing time are stored (step S01), and then the boundary region setting process is performed (step S02).

  FIG. 6 is a flowchart showing the boundary region setting process of step S02 in FIG. 5.

  Here, it is determined whether a boundary region corresponding to the image currently displayed on the touch panel 141 is registered (step S31). If a corresponding boundary region is registered, it is set as the boundary region referred to in step S04 of FIG. 5 (step S32). If not, the system's default boundary region is set (step S33).
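
Steps S31 to S33 amount to a table lookup with a system default as fallback. A sketch follows; the table keys and the region dimensions are invented for illustration and do not appear in the patent.

```python
# Hypothetical registry: displayed image -> (width, height) of its
# boundary region. The actual dimensions are not disclosed in the patent.
BOUNDARY_TABLE = {
    "menu":         (8, 24),   # cf. D1: narrow horizontally, tall vertically
    "output_fmt":   (24, 24),  # cf. D2: wide square, tap is the main operation
    "scan_preview": (4, 4),    # cf. D3: small square, drag/flick dominates
}
DEFAULT_REGION = (12, 12)      # system default (step S33)

def set_boundary_region(current_image):
    """Steps S31-S33: use the region registered for the currently
    displayed image if there is one, otherwise the system default."""
    return BOUNDARY_TABLE.get(current_image, DEFAULT_REGION)
```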

  Returning to FIG. 5, the description continues.

  When the boundary region has been set in step S02, the process then repeats two determinations: whether a release event has been received (step S03), and whether the pressed coordinates have moved beyond the boundary region in either the x direction (horizontal) or the y direction (vertical) (step S04).

  When a release event is received before the boundary region is crossed (step S03), a tap operation is detected, and the process associated with the tap is executed, or the control unit 17 (see FIG. 2) is instructed to execute it. That is, when a tap operation is performed, the various processes corresponding to the tapped location on the image displayed on the display screen 141a are executed; for example, when the push button instructing the start of copying is tapped, the copy operation starts.

  On the other hand, if no release event has been received (step S03) and there is a coordinate change exceeding the boundary region (step S04), a drag operation is detected (step S06), and the current coordinates and time are stored (step S07). In accordance with the detection of the drag operation, the image displayed on the display screen 141a is moved in the direction and by the amount corresponding to the drag operation.

  Next, the determination of whether a release event has been received (step S08) and the determination of whether the coordinates have moved (step S09) are repeated. If the coordinates are determined to have moved (step S09) without a release event being received (step S08), drag detection processing (step S10) is performed again, that is, the image on the display screen 141a is moved in accordance with the drag operation, and the current coordinates and time are stored again (step S07).

  When a release event is received (step S08), the finger's moving speed at the time of release is calculated from the list of coordinates and times stored so far (step S11), and it is determined whether this speed is equal to or higher than a threshold speed (step S12). When the moving speed at release is below the threshold speed, a drag release, meaning the end of the drag operation, is detected (step S13). Processing that accompanies the drag release is then executed according to the application, for example stopping the movement of the image on the display screen 141a where it is, or performing page change processing.

  When, in step S12, the moving speed of the finger at release is determined to be equal to or higher than the threshold speed, a flick operation is detected (step S14), and processing that accompanies the flick is executed according to the application, for example page change processing, or processing in which the image keeps moving by inertia and then stops.

  Depending on the application, the processing that accompanies a drag release or a flick operation may also differ according to the movement trajectory of the finger, for example whether the finger moved vertically or horizontally.
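
Steps S11 to S14 reduce to estimating the finger speed at release and comparing it to a threshold. The sketch below estimates the speed from the last two stored samples; the sampling format and the threshold value are assumptions, since the patent specifies neither.

```python
def release_speed(history):
    """Step S11: estimate the finger's speed at release from the stored
    (x, y, time) samples, here using only the last two of them."""
    (x0, y0, t0), (x1, y1, t1) = history[-2], history[-1]
    if t1 == t0:
        return 0.0
    return ((x1 - x0) ** 2 + (y1 - y0) ** 2) ** 0.5 / (t1 - t0)

THRESHOLD_SPEED = 300.0  # px/s -- an illustrative value, not from the patent

def on_release(history):
    """Step S12: at or above the threshold the release is a flick (S14);
    below it, a drag release (S13)."""
    return "flick" if release_speed(history) >= THRESHOLD_SPEED else "drag_release"
```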

  FIG. 7 is a diagram illustrating an example of an image displayed on the display screen of the touch panel.

  FIG. 7 shows a menu image 31. The menu image 31 has a first area 31a, in which push buttons 311 such as "copy", "fax", "scanner", and "easy copy" are arranged vertically and horizontally, and a second area 31b, in which push buttons 312 such as "language switching" and "image brightness adjustment" are arranged vertically.

  Here, the "copy", "fax", and "scanner" push buttons change the display to the submenu images for copying, faxing, and scanning, respectively. The "easy copy" push button changes the display to a submenu image for copying with simplified settings. Various other push buttons 311 are arranged in the first area 31a; their description is omitted.

  "Language switching", arranged in the second area 31b, is a button pressed to switch the language of the images displayed on the display screen 141a to Japanese, English, or another language. "Image brightness adjustment" is a button pressed to adjust the brightness of the image displayed on the display screen 141a. Still more push buttons 312 are arranged in the second area 31b, but their description is omitted.

  The first area 31a has a two-page configuration, and FIG. 7 shows the first page. When a finger is placed on the first area 31a and a large horizontal drag operation or a horizontal flick operation is performed, the display switches to the second page.

  FIG. 8 is a diagram showing a menu image in a state where the second page is displayed.

  For the first area 31a, the page is updated to display the second page, while the second area 31b is common to both pages. On the second page of the first area 31a of the menu image 31, push buttons 311 that could not be placed on the first page are arranged. A description of each push button 311 is omitted.

  Returning to FIG. 7, the description will be continued.

  In FIG. 7, the finger 2 pressing the menu image 31 is shown superimposed on it. When the display screen 141a is pressed with the finger 2, a boundary region D1 including the pressed point is set around that point (see step S02 of FIG. 5, and FIG. 6). As shown in FIG. 7, the boundary region D1 corresponding to the menu image 31 is a rectangular region that is long vertically and short horizontally, set so that the pressed point is at the center of the rectangle. The boundary region D1 is not displayed on the screen; it is held only as internal data. When the point pressed by the finger 2 moves sideways beyond the boundary region D1, a drag operation is detected, and the menu image 31 moves sideways with the finger. When the finger is then released slowly, the display either returns to the first page shown in FIG. 7 or changes to the second page shown in FIG. 8, according to the amount of movement at that time. Also, when the finger is released while still moving quickly after crossing the boundary region D1, the display changes to the second page even if the movement itself is small.

  Further, when the image is pressed with the finger 2 and the finger is released without moving beyond the boundary region D1 set around the pressing point, a tap operation is detected, and the process associated with the push button displayed at the pressing point is executed.

  To cancel the press of the push button overlapping the pressing point after pressing the display screen with the finger 2, the user moves the finger 2 vertically until the pressing point exceeds the boundary region D1 and then releases it. On the menu image 31, when a vertical drag operation is detected, no push button press is registered.

  The reason a vertically long region is set as the boundary region D1 here is to prevent the movement of the pressed point under finger pressure, described with reference to FIG. 3A, from being erroneously detected as a drag operation. In the horizontal direction, drag and flick operations for switching pages are expected, and a shift of the pressed point due to finger pressure is unlikely to occur horizontally, so the horizontal width of the boundary region D1 is set narrower than its vertical height.
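
For the menu image 31, then, crossing D1 horizontally begins a page-switching drag while crossing it vertically merely cancels the button press. A sketch of that direction-dependent handling, reusing the (width, height) region representation assumed above:

```python
def menu_drag_kind(start, current, region):
    """Direction-dependent handling on the menu image 31. `region` is
    the assumed (width, height) pair of the boundary region D1."""
    dx = current[0] - start[0]
    dy = current[1] - start[1]
    if abs(dx) > region[0] / 2:
        return "page_drag"     # horizontal crossing: second-mode page switch
    if abs(dy) > region[1] / 2:
        return "cancel_press"  # vertical crossing: only cancels the tap
    return "inside"            # still within D1: a release here is a tap
```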

  Although the description here has referred to FIG. 7, in this embodiment a boundary region of the same shape and size is also set for the menu image 31 shown in FIG. 8, after switching to the second page. The shape and size of the boundary region D1 shown in FIG. 7 are likewise used as-is for operating the push buttons arranged in the second area 31b of the menu image 31 shown in FIGS. 7 and 8.

  FIG. 9 is a diagram showing a copy menu image displayed in place of the menu image when the “copy” push button is tapped in the menu image of FIG. 7.

  The copy menu image 32 contains push buttons for the submenus "copy", "image quality adjustment", "reading method", "output format", and "job editing"; tapping one of these push buttons displays the corresponding submenu image. FIG. 9 shows, among them, the first page of the "output format" submenu image 321.

  The "output format" submenu image 321 shown in FIG. 9 also consists of two pages, and the page is switched when a large vertical drag operation, or a vertical flick operation beyond the boundary region D2, is performed.

  In this “output format” sub-menu image 321, a large number of push buttons such as “double / single side selection” push buttons are arranged vertically and horizontally.

  FIG. 10 is a diagram showing the second page of the “output format” submenu image.

  Here, push buttons that cannot be arranged on the first page shown in FIG. 9 are arranged.

  As shown in FIG. 9, when this submenu image is pressed with the finger 2, a square boundary region D2 centered on the pressing point of the finger 2 is set. Drag and flick operations are also performed on the submenu image 321, but its main operation is the tap. For this reason, in consideration of the movement of the pressing point under finger pressure shown in FIG. 3A, a boundary region D2 that is wide in the vertical direction is set, as in the menu image 31 shown in FIG. 7. Further, since the submenu image 321 shown in FIG. 9 does not switch pages in the horizontal direction, the boundary region is widened in the horizontal direction as well, to further prevent erroneous detection. On this submenu image 321, the page is switched when a large vertical drag operation, or a vertical flick operation beyond the boundary region D2, is performed. When a horizontal drag operation beyond the boundary region D2 is performed, no push button press is registered. When the finger 2 is released after movement that stays inside the boundary region D2, the process corresponding to the push button under the finger 2 is executed.

  Further, for the push buttons such as "copy", "image quality adjustment", and so on, arranged horizontally along the top of the submenu image 321, a boundary region of the same shape and size as the boundary region D2 shown in FIG. 9 is applied.

  Although the operation on the first page shown in FIG. 9 has been described here, the same applies to the operation on the second page shown in FIG. 10.

  FIG. 11 is a diagram illustrating an example of a scan preview image.

  The scan preview image 33 is used to display, on the display screen 141a, an image based on the image data obtained by the scanner 11 (see FIGS. 1 and 2) so that it can be confirmed before being transmitted to the PC.

  FIG. 12 is a schematic diagram showing the same scan preview image as in FIG. 11, with the entire area of one image based on the image data read by the scanner 11 superimposed.

  As shown in FIG. 12, the image 34 based on the image data obtained by the scanner 11 is an image having a larger area than the display screen 141a, and only a part of the image 34 is displayed on the display screen 141a.

  Therefore, the user confirms the image while displaying other regions of the image 34 on the display screen 141a by drag and flick operations. The scan preview image 33 also contains push buttons such as "stop" and "start transmission", but its main operations are the drag and the flick. Therefore, on the scan preview image 33, a small square boundary region D3 centered on the pressing point is set around the press start point of the finger 2. This avoids the erroneous detection described with reference to FIG. 3C and realizes smooth movement of the image.

  In the present embodiment, since the boundary region is set per image, a boundary region of the same shape and dimensions as the boundary region D3 shown in FIG. 11 is set even when an operation button on the scan preview image 33 shown in FIG. 11 is operated.

  Thus, in this embodiment, since the spread of the boundary region is changed according to the image displayed on the display screen 141a, the occurrence of the erroneous detections described with reference to FIG. 3 is suppressed, and a multifunction machine 10 with good operability is realized.

  This completes the description of the first embodiment; the second embodiment is described next. In the following, only the differences from the first embodiment are described.

  FIG. 13 is a flowchart showing a boundary region setting process that replaces the boundary region setting process of FIG. 6 in the first embodiment.

  In the second embodiment, the process of FIG. 5 is the same as in the first embodiment, except that the boundary region setting process of step S02 is changed from the process of FIG. 6 to that of FIG. 13.

  In the boundary region setting process shown in FIG. 13, when the image displayed on the display screen 141a is pressed with a finger, it is determined whether the coordinates of the pressed point fall within a push button area on the image (step S41). For example, in the scan preview image 33 shown in FIG. 11, the area displaying part of the image 34 of FIG. 12 is determined not to be a push button area, whereas the area where the "stop" and "start transmission" buttons are arranged is determined to be a push button area.

  In the boundary region setting process shown in FIG. 13, when the coordinates at the time of pressing are determined to be within a push button area, a relatively wide boundary region for push button areas (see the boundary region D2 shown in FIG. 9) is set around those coordinates (step S42). On the other hand, when the coordinates at the time of pressing are determined not to be within a push button area, a relatively narrow boundary region for outside the push button areas (see the boundary region D3 shown in FIG. 11) is set around those coordinates.
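
The per-area decision of FIG. 13 is essentially a hit test of the press point against the push button rectangles of the displayed image. A sketch under assumed rectangle and region representations:

```python
def in_push_button_area(point, button_rects):
    """Step S41: is the pressed point inside any push button rectangle?
    `button_rects` is a hypothetical list of (x, y, width, height) tuples."""
    px, py = point
    return any(x <= px <= x + w and y <= py <= y + h
               for x, y, w, h in button_rects)

WIDE_REGION = (24, 24)   # for push button areas (cf. D2) -- illustrative
NARROW_REGION = (4, 4)   # outside push button areas (cf. D3) -- illustrative

def set_boundary_region_by_area(point, button_rects):
    """Step S42 (and its counterpart): a wide boundary region inside a
    push button area, a narrow one outside."""
    return WIDE_REGION if in_push_button_area(point, button_rects) else NARROW_REGION
```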

  In the first embodiment described above, the boundary region is set for each image. In the second embodiment, the boundary region is set for each image region.

  The areas referred to here may be large divisions of the display screen 141a, as described with reference to FIG. 11, but are not limited to that. For example, in the menu image 31 shown in FIGS. 7 and 8, there is empty space between adjacent push buttons 311. Only the interior of each push button 311's outline may therefore be treated as push button area, and the empty space between adjacent push buttons as outside it. In that case, when a finger is placed over a push button 311, a relatively wide boundary region suited to tap operations is set, and when a finger is placed in a region away from the push buttons 311, a relatively narrow boundary region suited to drag and flick operations is set.

  The per-area boundary region setting shown in FIG. 13 and the per-image boundary region setting shown in FIG. 6 may also be combined.

  In other words, each image may be divided into push button areas and areas outside the push buttons, with the shape and dimensions of the boundary region then varied from image to image within the push button areas, or likewise varied from image to image outside them.

  For example, between the menu image 31 shown in FIG. 7 and the "output format" submenu image 321 shown in FIG. 9, the shapes and dimensions of the boundary regions D1 and D2 differ even within the push button areas.

  By combining the per-area boundary region setting shown in FIG. 13 with the per-image setting shown in FIG. 6, the shape and dimensions of the boundary region can be varied even for the same kind of area (for example, a push button area), depending on the image.
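
Put differently, the combined scheme keys the boundary region on both the displayed image and the kind of area pressed. A sketch with invented entries:

```python
# Hypothetical combined table: (image, kind of area) -> boundary region.
COMBINED_TABLE = {
    ("menu",         "button"): (8, 24),   # vertically long, cf. D1
    ("output_fmt",   "button"): (24, 24),  # wide square, cf. D2
    ("scan_preview", "button"): (4, 4),    # cf. D3 (same for buttons here)
    ("scan_preview", "other"):  (4, 4),    # small region for drag/flick
}

def set_boundary_region_combined(image, area_kind, default=(12, 12)):
    """Same kind of area, different boundary region per image -- and the
    reverse -- by looking up the (image, area kind) pair."""
    return COMBINED_TABLE.get((image, area_kind), default)
```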

  As described above, according to the various embodiments above, the occurrence of the erroneous detections described with reference to FIG. 3 is suppressed.

  An example in which the present invention is applied to the multifunction machine 10 shown in FIGS. 1 and 2 has been described here, but the image forming apparatus of the present invention need not be a multifunction machine; it may be a copier having only a copy function, a printer without a scanner function, or an apparatus having only a facsimile function. Nor is the image forming method of the present invention limited to the electrophotographic method; any method of forming an image, such as an ink jet method or a thermal method, may be used.

  Furthermore, the touch panel device of the present invention is not limited to use in an image forming apparatus; it can be applied widely to devices that need to exchange information with a user.

DESCRIPTION OF SYMBOLS: 2 finger, 10 multifunction machine, 11 scanner, 12 printer, 13 paper tray, 14 user interface (UI), 15 facsimile transmission / reception unit, 16 network interface unit, 17 control unit, 71 image drawing unit, 72 operation determination unit, 73 boundary setting unit, 141 touch panel, 141a display screen, 141b touch sensor, 171 UI controller, 171a UI control device, 171b drawing / operation determination device

Claims (2)

  1. A touch panel device comprising: a touch panel including a display screen that serves to display images and a sensor responsible for detecting the contact position on the display screen; and a drawing / operation determination device having: an image drawing unit that draws an image on the display screen; a boundary setting unit that sets, so as to include the contact start point of contact with the display screen, a boundary region having a spread corresponding to the image drawn by the image drawing unit or to an area on that image; and an operation determination unit that determines a first operation mode when, after contact with the display screen is detected, separation from the display screen is detected without the contact point moving beyond the boundary region set to include the contact start point, and a second operation mode when, after contact with the display screen is detected, separation from the display screen is detected following movement of the contact point beyond that boundary region,
    wherein, when the image drawn by the image drawing unit, or an area on that image, accepts operations in the second operation mode only in the horizontal direction and does not accept operations in the second operation mode in the vertical direction, the boundary setting unit sets a boundary region whose vertical height is greater than its horizontal width.
  2. An image forming apparatus comprising: the touch panel device according to claim 1; and an image forming unit that executes an image forming process according to an operation on the touch panel device.
JP2013219781A 2013-10-23 2013-10-23 Touch panel device and image forming apparatus Active JP6221622B2 (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
JP2013219781A JP6221622B2 (en) 2013-10-23 2013-10-23 Touch panel device and image forming apparatus

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
JP2013219781A JP6221622B2 (en) 2013-10-23 2013-10-23 Touch panel device and image forming apparatus

Publications (2)

Publication Number Publication Date
JP2015082210A (en) 2015-04-27
JP6221622B2 (en) 2017-11-01

Family

Family ID: 53012779

Family Applications (1)

Application Number Title Priority Date Filing Date
JP2013219781A Active JP6221622B2 (en) 2013-10-23 2013-10-23 Touch panel device and image forming apparatus

Country Status (1)

Country Link
JP (1) JP6221622B2 (en)

Family Cites Families (3)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
WO2009044770A1 (en) * 2007-10-02 2009-04-09 Access Co., Ltd. Terminal device, link selection method, and display program
JP5418187B2 (en) * 2009-12-02 2014-02-19 ソニー株式会社 Contact operation determination device, contact operation determination method, and program
JP5479414B2 (en) * 2010-11-24 2014-04-23 キヤノン株式会社 Information processing apparatus and control method thereof

Also Published As

Publication number Publication date
JP2015082210A (en) 2015-04-27

Similar Documents

Publication Publication Date Title
US8866761B2 (en) Operation display device and operation display method
US10209862B2 (en) Image processing apparatus for selecting printing operations using a sliding operation for scrolling a list of selection images
CN105208242B (en) Image display control device, image forming apparatus, and image display control method
JP4683126B2 (en) Input device
JP5862888B2 (en) Operation display device and program
JP6524620B2 (en) Information processing system, information processing device, information processing method, and program
US9578193B2 (en) Quick operation user interface for a multifunction printing device
JP5994412B2 (en) Image display apparatus, image control apparatus, image forming apparatus, and program
US8804148B2 (en) Image forming apparatus and non-transitory computer readable medium storing a program for controlling the same
US9076085B2 (en) Image processing apparatus, image processing apparatus control method, and storage medium
US8928692B2 (en) Image processing apparatus, method for displaying pop-up window, and computer-readable storage medium for computer program
US9146628B2 (en) Input apparatus and storage medium storing input control program
JP6382008B2 (en) Image processing apparatus, object display method, and program
US20160328106A1 (en) Thumbnail display apparatus, thumbnail display method, and computer readable medium for switching displayed images
US9172830B2 (en) Image forming apparatus interface where user selections are displayed in a hierarchical manner
US9557904B2 (en) Information processing apparatus, method for controlling display, and storage medium
US9100520B2 (en) Determining whether a scanned page is blank or contains image data
JP5874465B2 (en) Information processing apparatus, image forming apparatus, information processing apparatus control method, image forming apparatus control method, information processing apparatus control program, and image forming apparatus control program
US9325868B2 (en) Image processor displaying plural function keys in scrollable state
JP5599117B2 (en) Operating device and operating method
JP6119633B2 (en) Display control apparatus, image forming apparatus, and program
US10180771B2 (en) User interface provided with display unit for displaying screen
US9965167B2 Display apparatus for displaying images in different manners and non-transitory storage medium storing instructions executable by the display apparatus
JP2015207246A (en) Touch panel device and image forming apparatus
JP5882779B2 (en) Image processing apparatus, image processing apparatus control method, and program

Legal Events

Date Code Title Description
A621 Written request for application examination

Free format text: JAPANESE INTERMEDIATE CODE: A621

Effective date: 20160524

A131 Notification of reasons for refusal

Free format text: JAPANESE INTERMEDIATE CODE: A131

Effective date: 20170214

A977 Report on retrieval

Free format text: JAPANESE INTERMEDIATE CODE: A971007

Effective date: 20170215

A521 Written amendment

Free format text: JAPANESE INTERMEDIATE CODE: A523

Effective date: 20170411

TRDD Decision of grant or rejection written
A01 Written decision to grant a patent or to grant a registration (utility model)

Free format text: JAPANESE INTERMEDIATE CODE: A01

Effective date: 20170905

A61 First payment of annual fees (during grant procedure)

Free format text: JAPANESE INTERMEDIATE CODE: A61

Effective date: 20170918

R150 Certificate of patent or registration of utility model

Ref document number: 6221622

Country of ref document: JP

Free format text: JAPANESE INTERMEDIATE CODE: R150