US20150347070A1 - Method of providing screen for manipulating execution of application of image forming apparatus and image forming apparatus using the method
- Publication number
- US20150347070A1 (U.S. application Ser. No. 14/644,592)
- Authority
- US
- United States
- Prior art keywords
- user interface
- image forming
- forming apparatus
- user
- application
- Prior art date
- Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
- Granted
Classifications
-
- H—ELECTRICITY
- H04—ELECTRIC COMMUNICATION TECHNIQUE
- H04N—PICTORIAL COMMUNICATION, e.g. TELEVISION
- H04N1/00—Scanning, transmission or reproduction of documents or the like, e.g. facsimile transmission; Details thereof
- H04N1/0035—User-machine interface; Control console
- H04N1/00352—Input means
- H04N1/00384—Key input means, e.g. buttons or keypads
-
- H—ELECTRICITY
- H04—ELECTRIC COMMUNICATION TECHNIQUE
- H04N—PICTORIAL COMMUNICATION, e.g. TELEVISION
- H04N1/00—Scanning, transmission or reproduction of documents or the like, e.g. facsimile transmission; Details thereof
- H04N1/0035—User-machine interface; Control console
- H04N1/00405—Output means
- H04N1/00408—Display of information to the user, e.g. menus
- H04N1/00411—Display of information to the user, e.g. menus the display also being used for user input, e.g. touch screen
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06F—ELECTRIC DIGITAL DATA PROCESSING
- G06F3/00—Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
- G06F3/12—Digital output to print unit, e.g. line printer, chain printer
- G06F3/1201—Dedicated interfaces to print systems
- G06F3/1223—Dedicated interfaces to print systems specifically adapted to use a particular technique
- G06F3/1237—Print job management
- G06F3/1253—Configuration of print job parameters, e.g. using UI at the client
- G06F3/1258—Configuration of print job parameters, e.g. using UI at the client by updating job settings at the printer
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06F—ELECTRIC DIGITAL DATA PROCESSING
- G06F3/00—Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
- G06F3/12—Digital output to print unit, e.g. line printer, chain printer
- G06F3/1201—Dedicated interfaces to print systems
- G06F3/1202—Dedicated interfaces to print systems specifically adapted to achieve a particular effect
- G06F3/1203—Improving or facilitating administration, e.g. print management
-
- H—ELECTRICITY
- H04—ELECTRIC COMMUNICATION TECHNIQUE
- H04N—PICTORIAL COMMUNICATION, e.g. TELEVISION
- H04N1/00—Scanning, transmission or reproduction of documents or the like, e.g. facsimile transmission; Details thereof
- H04N1/0035—User-machine interface; Control console
- H04N1/00405—Output means
- H04N1/00408—Display of information to the user, e.g. menus
- H04N1/00413—Display of information to the user, e.g. menus using menus, i.e. presenting the user with a plurality of selectable options
-
- H—ELECTRICITY
- H04—ELECTRIC COMMUNICATION TECHNIQUE
- H04N—PICTORIAL COMMUNICATION, e.g. TELEVISION
- H04N1/00—Scanning, transmission or reproduction of documents or the like, e.g. facsimile transmission; Details thereof
- H04N1/0035—User-machine interface; Control console
- H04N1/00405—Output means
- H04N1/00408—Display of information to the user, e.g. menus
- H04N1/00472—Display of information to the user, e.g. menus using a pop-up window
-
- H—ELECTRICITY
- H04—ELECTRIC COMMUNICATION TECHNIQUE
- H04N—PICTORIAL COMMUNICATION, e.g. TELEVISION
- H04N2201/00—Indexing scheme relating to scanning, transmission or reproduction of documents or the like, and to details thereof
- H04N2201/0077—Types of the still picture apparatus
- H04N2201/0094—Multifunctional device, i.e. a device capable of all of reading, reproducing, copying, facsimile transception, file transception
Abstract
Description
- This application claims the benefit of Korean Patent Application No. 10-2014-0067796, filed on Jun. 3, 2014, in the Korean Intellectual Property Office, the disclosure of which is incorporated herein in its entirety by reference.
- 1. Field
- One or more exemplary embodiments relate to a method of providing a screen for manipulating execution of an application of an image forming apparatus, and the image forming apparatus using the method.
- 2. Description of the Related Art
- Various image forming apparatuses, including printers, copy machines, multi-functional devices, etc., have a user interface by which a user may control an operation of the image forming apparatus or may input data to the image forming apparatus. A screen that provides the user interface is displayed on a manipulation panel of the image forming apparatus. As various technologies develop, the hardware and software used in the image forming apparatus also improve, and thus the user interface of the image forming apparatus continues to be improved to increase user convenience.
- One or more exemplary embodiments include a method of providing a screen for manipulating execution of an application of an image forming apparatus, and the image forming apparatus using the method, whereby the image forming apparatus may be controlled by substituting a virtual button for a physical button.
- Additional aspects will be set forth in part in the description which follows and, in part, will be apparent from the description, or may be learned by practice of the presented exemplary embodiments.
- According to one or more exemplary embodiments, a method of providing a screen for manipulating execution of an application of an image forming apparatus includes operations of generating a first image signal indicating a first user interface for setting options to be applied to the execution of the application, and a second image signal indicating a second user interface including at least one virtual button for controlling an operation of the image forming apparatus; and displaying, based on the first image signal and the second image signal, the first user interface and the second user interface on the screen.
- According to one or more exemplary embodiments, a non-transitory computer-readable recording medium has recorded thereon a program for executing the method by using a computer.
- According to one or more exemplary embodiments, an image forming apparatus that provides a screen for manipulating execution of an application includes an image processor for generating a first image signal indicating a first user interface for setting options to be applied to the execution of the application, and a second image signal indicating a second user interface including at least one virtual button for controlling an operation of the image forming apparatus; and a display for displaying, based on the first image signal and the second image signal, the first user interface and the second user interface on the screen.
- According to one or more exemplary embodiments, an image forming apparatus that provides a screen for manipulating execution of an application includes an image forming unit, an image processor to generate a first user interface for setting options to be applied to the execution of the application and a second user interface comprising first and second virtual buttons for controlling an operation of the image forming apparatus, and a display to display the first user interface and the second user interface on the screen. At least one of a function and an appearance of the first virtual button is maintained identical regardless of the application whose options are set by the first user interface while at least one of a function and an appearance of the second virtual button is changed according to the application whose options are set by the first user interface.
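The summary above describes a second user interface in which a first virtual button keeps its function and appearance across applications while a second virtual button changes with the application. A minimal sketch of that configuration follows; the application names and button labels are assumptions taken from the copy/send embodiments described later, not a definitive implementation.

```python
# Sketch: the first virtual button is fixed regardless of the application,
# while the second virtual button varies per application (hypothetical labels).

FIXED_BUTTON = {"label": "Start", "action": "start_job"}

# Per-application configuration of the variable button (assumed mapping).
VARIABLE_BUTTONS = {
    "copy": {"label": "Interrupt", "action": "interrupt_job"},
    "send": {"label": "On Hook", "action": "on_hook"},
}

def second_user_interface(application: str) -> list:
    """Return the virtual buttons shown for the given application."""
    return [FIXED_BUTTON, VARIABLE_BUTTONS[application]]
```

Under this sketch, switching the application swaps only the second button, matching the claim that the first button is "maintained identical regardless of the application."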
- These and/or other aspects will become apparent and more readily appreciated from the following description of the exemplary embodiments, taken in conjunction with the accompanying drawings in which:
- FIG. 1 illustrates an external appearance of an image forming apparatus;
- FIG. 2 illustrates physical button keys included in a manipulation panel of an image forming apparatus according to the related art;
- FIG. 3 is a block diagram illustrating a configuration of the image forming apparatus, according to an exemplary embodiment;
- FIG. 4 illustrates a screen for manipulating execution of an application of the image forming apparatus, according to an exemplary embodiment;
- FIG. 5 illustrates the screen for manipulating execution of an application of the image forming apparatus, according to another exemplary embodiment;
- FIG. 6 illustrates changes in a second user interface which occur when a user's manipulation with respect to a first user interface is input to the screen for manipulating execution of an application of the image forming apparatus, according to an exemplary embodiment;
- FIG. 7 illustrates changes in the second user interface which occur when the user's manipulation with respect to the first user interface is input to the screen for manipulating execution of an application of the image forming apparatus, according to another exemplary embodiment;
- FIG. 8 illustrates that a position of the second user interface is moved on the screen for manipulating execution of an application of the image forming apparatus, according to an exemplary embodiment;
- FIG. 9 illustrates that a position of the second user interface is moved on the screen for manipulating execution of an application of the image forming apparatus, according to another exemplary embodiment;
- FIG. 10 is a block diagram illustrating a configuration of the image forming apparatus, according to another exemplary embodiment; and
- FIG. 11 is a flowchart of a method of providing the screen for manipulating execution of an application of the image forming apparatus, according to an exemplary embodiment.
- Reference will now be made in detail to exemplary embodiments, examples of which are illustrated in the accompanying drawings. In this regard, the present exemplary embodiments should be considered in a descriptive sense only and not for purposes of limiting the scope of the inventive concept. All differences that can be easily derived by one of ordinary skill in the art from the descriptions and the exemplary embodiments will be construed as being included in the scope of the inventive concept.
- Throughout the specification, it will be further understood that the terms “configured”, “configuring”, “formed”, and/or “forming” and “comprises”, “comprising”, “includes” and/or “including”, when used herein, specify the presence of stated components, steps, or operations, but do not preclude the presence or addition of one or more other components, steps, or operations.
- While such terms as “first”, “second”, etc., may be used to describe various components, such components should not be limited by these terms. These terms are used only to distinguish one component from another.
- One or more exemplary embodiments are related to a method of providing a screen for manipulating execution of an application of an image forming apparatus, and the image forming apparatus using the method. In the following description, functions or constructions that are well-known to one of ordinary skill in the art will not be described in detail.
- As used herein, the term “and/or” includes any and all combinations of one or more of the associated listed items.
- FIG. 1 illustrates an external appearance of an image forming apparatus 100. The image forming apparatus 100 is an apparatus, such as a printer, a copy machine, or a multi-functional device, capable of forming an image on a transfer medium such as paper.
- The image forming apparatus 100 may have a user interface device arranged externally at a front portion, a side portion, or a rear portion of the image forming apparatus 100 so as to display information for controlling an operation of the image forming apparatus 100 and to receive an input of a user's manipulation. The user interface device may indicate both the hardware and the software that connect the image forming apparatus 100 with a user, and may be installed in the image forming apparatus 100. The user interface device may separately include a display for displaying information and a user input unit for receiving an input of the user's manipulation, or may be configured as a touchscreen in which the display and the user input unit are combined.
- As illustrated in FIG. 1, the image forming apparatus 100 may provide a screen 200 at the front portion of its exterior so that an application of the image forming apparatus 100 may be manipulated. The screen 200 for manipulating the execution of the application may be displayed on a display of the image forming apparatus 100. In order to operate a function of the image forming apparatus 100, a user may execute the application that corresponds to the function in the screen 200 for manipulating the application.
- FIG. 2 illustrates physical button keys included in a manipulation panel 300 of an image forming apparatus according to the related art.
- Referring to FIG. 2, the manipulation panel 300 of the image forming apparatus according to the related art has a screen that displays information to a user so as to control the image forming apparatus, and a keypad 310 that is configured of physical button keys as a user input unit for receiving an input of the user's manipulation. The keypad 310 includes a 3×4 matrix of number buttons, a function button part, a control button part, a light-emitting diode (LED) for notifying a user about a status of the image forming apparatus, and the like. Here, the term “physical button” may refer to a button that protrudes from the manipulation panel and that has a tangible shape of its own, such as a key on a computer keyboard or a raised bubble key on a membrane-type keypad.
- However, recently, the size of the manipulation panel mounted in the image forming apparatus 100 has increased, and in particular, the ratio of the overall manipulation panel occupied by a display panel for displaying information to a user, a touch panel for receiving information from the user, or a touchscreen that combines the display panel and the touch panel has increased.
- When the size of the touchscreen, that is, the integrated module of the display panel and the touch panel, is increased, the user may manipulate in detail the various functions that are executable in the image forming apparatus 100, so that user convenience with respect to the image forming apparatus 100 may be increased. In one or more exemplary embodiments, the physical keys that were provided in the related art as physical buttons of a manipulation panel of the image forming apparatus 100 are removed and, instead, are generated as virtual buttons or icons by using software and are provided on the screen 200 for manipulating execution of an application. Accordingly, user convenience and usability of the image forming apparatus 100 may be greatly improved. Hereinafter, a method of providing the screen 200 for manipulating execution of an application of the image forming apparatus 100, and the image forming apparatus 100 using the method, are described.
- FIG. 3 is a block diagram illustrating a configuration of the image forming apparatus 100, according to an exemplary embodiment. It will be obvious to one of ordinary skill in the art that the image forming apparatus 100 may further include general-purpose elements in addition to the elements shown in FIG. 3.
- Referring to FIG. 3, the image forming apparatus 100 may include a display 110 and an image processor 130.
- The display 110 may include a display panel (not shown) and a controller (not shown) for controlling the display panel. The display panel may be embodied as any of various displays including a liquid crystal display (LCD), an organic light-emitting diode (OLED) display, an active-matrix organic light-emitting diode (AM-OLED) display, a plasma display panel (PDP), or the like. The display 110 may be combined with a touch panel (not shown) and thus may be provided as a touchscreen (not shown). For example, the touchscreen may include an integrated module formed by stacking the display panel and the touch panel.
- The image processor 130 may process an image signal for generating a screen to be displayed on the display 110. When a booting process of the image forming apparatus 100 is complete, the image processor 130 may generate a screen by processing an image signal so as to display a screen for controlling an operation of the image forming apparatus 100. In more detail, the image processor 130 may generate a screen including various objects, such as applications for executing functions of the image forming apparatus 100, user interfaces for receiving an input of a user's manipulation, and contents for providing information to a user. Examples of a user's manipulation may include touching a particular icon or virtual button with a finger or stylus. Different touches, such as touches varying with time, location, or pressure, may yield varying results. The image processor 130 may calculate attribute values, such as coordinate values, forms, sizes, or colors, by which the objects are displayed according to layouts of screens. Then, the image processor 130 may generate, based on the attribute values, the screens that have various layouts and include the objects.
- The screens generated by the image processor 130 may be provided to the display 110 and may be displayed on an entire area, or on various predetermined areas making up a portion, of the display 110.
- Hereinafter, the mutual connection between the display 110 and the image processor 130, and the operations of the display 110 and the image processor 130 shown in FIG. 3, are described.
- The image processor 130 may generate an image signal that indicates a first user interface for setting options to be applied to execution of an application, and a second user interface including at least one virtual button for controlling an operation of the image forming apparatus 100.
- The display 110 may display, based on the image signal generated by the image processor 130, the first user interface and the second user interface on the screen 200 for manipulating the execution of the application.
- When the user's manipulation with respect to the screen 200 is input, the image processor 130 may perform image processing that corresponds to the user's manipulation and may control a new screen to be displayed on the display 110. Hereinafter, a configuration and operations of the screen 200 for manipulating execution of an application are described in detail.
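The image processor's calculation of per-object attribute values (coordinates, sizes, colors) arranged according to a screen layout can be sketched as below. The grid layout, class, and field names are illustrative assumptions, not the patent's implementation.

```python
from dataclasses import dataclass

@dataclass
class ScreenObject:
    # Attribute values computed per object, as described for the image processor.
    name: str
    x: int
    y: int
    width: int
    height: int
    color: str

def layout_screen(objects, columns=2, cell=(100, 40)):
    """Place (name, color) pairs on a simple grid and return positioned objects."""
    placed = []
    for i, (name, color) in enumerate(objects):
        col, row = i % columns, i // columns
        placed.append(
            ScreenObject(name, col * cell[0], row * cell[1], cell[0], cell[1], color)
        )
    return placed
```

The positioned objects would then be rasterized into the image signal handed to the display.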
- FIG. 4 illustrates the screen 200 for manipulating execution of an application of the image forming apparatus 100, according to an exemplary embodiment. FIG. 4 illustrates the screen 200 when a user selects a copy application from among a plurality of applications that are executable by the image forming apparatus 100.
- Referring to FIG. 4, the screen 200 for manipulating the execution of the application may include a first user interface 210 for setting options to be applied to the execution of the application, and a second user interface 220 including at least one virtual button (here, virtual buttons 222 and 224) for controlling an operation of the image forming apparatus 100.
- The first user interface 210 may set the options to be applied to the execution of the application. As illustrated in FIG. 4, when the copy application is executed, the first user interface 210 may set options related to copying, e.g., the number of copied pages, image attribute adjustment, a paper source, reduction/enlargement, duplex copying, a color mode, document alignment, etc.
- The second user interface 220 may include the virtual buttons 222 and 224 for controlling the operation of the image forming apparatus 100. The virtual button 222 may be one of buttons for starting, stopping, and resetting the operation of the image forming apparatus 100. The second user interface 220 may also include the virtual button 224, whose shape and function are changed according to the type of the application. For example, referring to FIGS. 4 and 5, respectively, the button 224 is configured as an Interrupt function when the copy application is being executed or is being displayed in the first user interface 210, while the button 224 is configured as an On Hook function when the send application is being executed or is being displayed in the first user interface 210. When the second user interface 220 includes the virtual buttons 222 and 224, areas of the second user interface 220 may be separated based on each of the virtual buttons 222 and 224.
- The first user interface 210 and the second user interface 220 may occupy separate areas of the screen 200 without overlapping each other, or may be displayed while overlapping each other. In order to make good use of the area of the screen 200 for manipulating the execution of the application, the first user interface 210 and the second user interface 220 may be displayed while overlapping each other, with the second user interface 220 adaptively varying according to the user's manipulation. Hereinafter, for convenience of description, it is assumed that the first user interface 210 and the second user interface 220 overlap each other on the screen 200 for manipulating the execution of the application.
- When the first user interface 210 and the second user interface 220 overlap each other on the screen 200 for manipulating the execution of the application, the second user interface 220 may be displayed on, or overlapping, the first user interface 210. In other words, an area of the first user interface 210 that is overlapped by the second user interface 220 including the virtual buttons 222 and 224 may be obstructed by the second user interface 220. In this case, in order to minimize the obstruction due to the second user interface 220, a transparency of an entire area or a partial area of the second user interface 220 may be adjusted. For example, a degree of the transparency of the entire area or the partial area of the second user interface 220 may be greater than a degree of a transparency of the first user interface 210.
- The first user interface 210 and the second user interface 220 may be displayed differently according to application types. Hereinafter, the screen 200 for manipulating execution of a different application is described.
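The transparency relationship described above, with the second user interface rendered more transparent than the first, can be sketched with a standard source-over alpha blend. The opacity values below are assumptions for illustration, not values from the patent.

```python
def blend(base_gray, overlay_gray, overlay_opacity):
    """Source-over blend of an overlay pixel onto a base pixel (gray levels 0-255)."""
    return round(overlay_opacity * overlay_gray + (1 - overlay_opacity) * base_gray)

# The second user interface is drawn with lower opacity (greater transparency)
# than the first, so the obstructed options stay partly visible (assumed values).
FIRST_UI_OPACITY = 1.0
SECOND_UI_OPACITY = 0.4
```

With a partially transparent overlay, a white virtual button over a black option area blends to an intermediate gray rather than hiding it completely.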
- FIG. 5 illustrates the screen 200 for manipulating execution of an application of the image forming apparatus 100, according to another exemplary embodiment. FIG. 5 illustrates the screen 200 when a user selects a send application from among a plurality of applications that are executable by the image forming apparatus 100. The embodiment of FIG. 5 is described with reference to its differences from the embodiment of FIG. 4.
- Referring to FIG. 5, the screen 200 for manipulating the execution of the application may include a first user interface 210 for setting options to be applied to the execution of the application, and a second user interface 220 including the virtual buttons 222 and 224 for controlling an operation of the image forming apparatus 100.
- The first user interface 210 may set the options to be applied to the execution of the application. As illustrated in FIG. 5, when the send application is executed, the first user interface 210 may set options related to scanning and transmitting a document, e.g., options related to adjusting an attribute of an image included in the document, setting a sender, setting a receiver, a method of transmitting the scanned document, scanning both sides, skipping a blank page, a file name of the scanned document, a file format of the scanned document, a resolution of the scanned document, etc.
- The second user interface 220 may include the virtual buttons 222 and 224 for controlling the operation of the image forming apparatus 100. As in the embodiment of FIG. 4, the virtual button 222 may be one of buttons for starting, stopping, and resetting the operation of the image forming apparatus 100. However, unlike the embodiment of FIG. 4, when the send application is executed, or when attributes corresponding to the send application are displayed on the first user interface 210, the second user interface 220 may include an On Hook button as the virtual button 224 whose shape and function are changed according to the type of the application. In the embodiment of FIG. 4, when the copy application is executed, the virtual button 224 is an Interrupt button.
- Hereinafter, as in the embodiments of FIGS. 4 and 5, when the first user interface 210 and the second user interface 220 overlap each other on the screen 200 for manipulating execution of an application, the changes in the screen 200 that occur in response to an input of the user's manipulation are described.
- FIG. 6 illustrates changes in the second user interface 220 that occur when a user's manipulation with respect to the first user interface 210 is input to the screen 200 for manipulating execution of an application of the image forming apparatus 100, according to an exemplary embodiment. Referring to FIG. 6, three diagrams are shown from left to right according to the flow of time, and in particular, an area of the first user interface 210 that overlaps the second user interface 220 is mainly shown.
- Referring to the first diagram, when the first user interface 210 and the second user interface 220 overlap each other on the screen 200 for manipulating the execution of the application, the second user interface 220 may be displayed on the first user interface 210. Thus, a user has no difficulty in pressing the virtual buttons 222 and 224. However, an area of the first user interface 210 that is obstructed by the second user interface 220 is not apparent or clearly visible, and thus is unable to receive an input of the user's manipulation. Here, as shown in the first diagram, a user may input a manipulation to any area of the first user interface 210.
- Referring to the second diagram, after the user's manipulation with respect to the first user interface 210 is input, the second user interface 220 becomes transparent.
- Referring to the third diagram, the second user interface 220 completely disappears from the first user interface 210. Here, the user may set in advance and adjust the time over which the second user interface 220 becomes transparent and then completely disappears.
- As described above, when the user's manipulation with respect to the first user interface 210 is input, the second user interface 220 disappears from the screen 200 for manipulating the execution of the application for a predetermined time, so that the first user interface 210 may be in a usable state for the user.
- If the user's manipulation is not input during the predetermined time, the display 110 of the image forming apparatus 100 may display the second user interface 220 on the first user interface 210 again. Alternatively, if a user's manipulation requesting the display 110 to display the second user interface 220 is input, the display 110 of the image forming apparatus 100 may display the second user interface 220 on the first user interface 210 again. A procedure in which the disappeared second user interface 220 is displayed again on the screen 200 for manipulating the execution of the application may be in the reverse order of the diagrams shown in FIG. 6.
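The fade-out-then-reappear behavior described for FIG. 6 can be sketched as a small opacity controller. This is a hypothetical illustration; the patent leaves the fade time user-adjustable, and the timing values here are assumptions.

```python
class OverlayFader:
    """Sketch: the overlay fades out when the user touches the first UI,
    stays hidden, then reappears after an idle timeout (assumed timings)."""

    def __init__(self, fade_seconds=1.0, reappear_seconds=5.0):
        self.fade_seconds = fade_seconds
        self.reappear_seconds = reappear_seconds
        self.last_touch = None  # time of the last touch to the first UI

    def touch_first_ui(self, now):
        self.last_touch = now

    def opacity(self, now):
        if self.last_touch is None:
            return 1.0                            # never touched: fully shown
        elapsed = now - self.last_touch
        if elapsed < self.fade_seconds:
            return 1.0 - elapsed / self.fade_seconds   # fading out
        if elapsed < self.reappear_seconds:
            return 0.0                            # fully disappeared
        return 1.0                                # idle timeout: shown again
```

Reappearance in the reverse order of FIG. 6 would simply animate opacity back from 0 to 1.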
FIG. 7 illustrates changes in thesecond user interface 220 which occur when a user's manipulation with respect to thefirst user interface 210 is input to thescreen 200 for manipulating execution of an application of theimage forming apparatus 100, according to another exemplary embodiment. Referring toFIG. 7 , three diagrams are shown from left to right, according to the flow of time, and in particular, an area of thefirst user interface 210 that overlaps thesecond user interface 220 is mainly shown. - Referring to the first diagram, when the
first user interface 210 and thesecond user interface 220 overlap each other on thescreen 200 for manipulating the execution of the application, thesecond user interface 220 may be displayed over thefirst user interface 210. As described above with respect toFIG. 6 , an area of thefirst user interface 210 that is obstructed by thesecond user interface 220 is not apparent or clearly visible and thus is unable to receive an input of user's manipulation. Here, as shown in the first diagram, the user's manipulation may be input to arbitrarily input to any area of thefirst user interface 210. - Referring to the second diagram, after the user's manipulation with respect to the
first user interface 210 is input, the second user interface 220 moves to the right. In this regard, the user may set and adjust in advance a disappearance speed, a disappearance direction, or a disappearance position of the second user interface 220. In the second diagram, an area 230 of the second user interface 220 that has a dotted outline extends beyond the screen 200 for manipulating the execution of the application, and thus is not displayed on the screen 200 for manipulating the execution of the application.
- Referring to the third diagram, the second user interface 220 may be displayed in the form of a substitution icon 240 on an area of the screen 200 for manipulating the execution of the application, so as to notify the user of the existence of the second user interface 220. The substitution icon 240 may be generated by deforming an area of the second user interface 220 or may be generated in a new form. Alternatively, as illustrated in FIG. 6, the second user interface 220 may completely disappear from the first user interface 210.
- As described above, when the user's manipulation with respect to the first user interface 210 is input, the second user interface 220 is replaced with the small substitution icon 240 or completely disappears for a predetermined time from the screen 200 for manipulating the execution of the application, so that the first user interface 210 may be in a usable state for the user. In an alternative embodiment, the second user interface 220 disappears until the user's manipulation with respect to the first user interface 210 is no longer input.
- If the user's manipulation is not input during the predetermined time, the display 110 of the image forming apparatus 100 may display the second user interface 220 again on the first user interface 210. Alternatively, if the user's manipulation is input to request the display 110 to display the second user interface 220, e.g., if the user clicks the substitution icon 240, the display 110 of the image forming apparatus 100 may display the second user interface 220 again on the first user interface 210. A procedure in which the disappeared second user interface 220 is displayed again on the screen 200 for manipulating the execution of the application may be in the reverse order of the diagrams shown in FIG. 7.
- As described above with reference to FIGS. 6 and 7, in a case where the first user interface 210 and the second user interface 220 are displayed while overlapping each other and it is thus difficult to use the first user interface 210, the second user interface 220 may disappear according to the user's manipulation, so that the user may use the first user interface 210. -
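The hide-and-restore behavior described with reference to FIGS. 6 and 7 can be sketched as a small controller. The class name, method names, and timing policy below are illustrative assumptions; the description above specifies only the observable behavior, not an implementation.

```python
import time


class OverlayAutoHide:
    """Hides the second user interface while the first is being
    manipulated, and restores it after an idle period (the
    "predetermined time"). All names here are hypothetical."""

    def __init__(self, hide_seconds=3.0, clock=time.monotonic):
        self.hide_seconds = hide_seconds  # user-adjustable predetermined time
        self._clock = clock               # injectable clock, useful for testing
        self.visible = True
        self._last_input = None

    def on_first_ui_input(self):
        # Any manipulation of the first user interface hides the overlay
        self.visible = False
        self._last_input = self._clock()

    def on_substitution_icon_tap(self):
        # Clicking the substitution icon restores the overlay immediately
        self.visible = True
        self._last_input = None

    def tick(self):
        # Restore the overlay once no input has arrived for hide_seconds
        if not self.visible and self._last_input is not None:
            if self._clock() - self._last_input >= self.hide_seconds:
                self.visible = True
                self._last_input = None
```

Injecting a fake clock makes the timeout deterministic in tests; a real panel would call `tick()` from its UI loop.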
FIG. 8 illustrates that a position of the second user interface 220 is moved on the screen 200 for manipulating execution of an application of the image forming apparatus 100, according to an exemplary embodiment.
- If the position of the second user interface 220 displayed on the screen 200 for manipulating the execution of the application obstructs a portion of the first user interface 210 that is required to be viewed, it is necessary to change the position of the second user interface 220 that overlaps the first user interface 210.
- Referring to FIG. 8, the second user interface 220 that is displayed over the first user interface 210 and obscures a portion of the first user interface 210 may be moved to an arbitrary position on the first user interface 210, according to the user's manipulation.
- For example, as illustrated in FIG. 8, if a user presses a portion of the second user interface 220 for a few seconds at its original position, the second user interface 220 may enter a state in which the user can move it. Afterward, the user may move the second user interface 220 to a desired position in a drag-and-drop manner, so that the original position of the second user interface 220 is changed. An alternative method of moving the second user interface 220 is described below with reference to FIG. 9. -
FIG. 9 illustrates that a position of the second user interface 220 is moved on the screen 200 for manipulating execution of an application of the image forming apparatus 100, according to another exemplary embodiment.
- As illustrated in FIG. 9, if the user's manipulation with respect to the second user interface 220 is input, the display 110 of the image forming apparatus 100 may display, on the screen 200 for manipulating the execution of the application, positions 250 to which the second user interface 220 may be moved. For example, if a user presses a portion of the second user interface 220 for a few seconds, the display 110 may display the positions 250 so that they are distinguishable from other areas of the first user interface 210.
- The display 110 of the image forming apparatus 100 may move the second user interface 220 to a user-selected position from among the positions 250 and display it there. For example, if the user selects one of the positions 250 to which the second user interface 220 may be moved, or drags and drops the second user interface 220 to a desired position, the display 110 of the image forming apparatus 100 may move and display the second user interface 220 accordingly. -
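One way to resolve a drop among the displayed candidate positions 250 is to snap the overlay to the nearest candidate. The Euclidean-distance rule and function name below are assumptions; the description leaves the selection mechanism open.

```python
import math


def snap_to_candidate(drop_point, candidates):
    """Return the candidate position closest to where the user dropped
    the second user interface. `candidates` plays the role of the
    positions 250; the distance metric is an illustrative choice."""
    return min(candidates, key=lambda c: math.dist(drop_point, c))
```

For example, a drop near the upper-right corner would snap to the upper-right candidate even if the user released the overlay slightly off-target.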
FIG. 10 is a block diagram illustrating a configuration of the image forming apparatus 100, according to another exemplary embodiment. It will be obvious to one of ordinary skill in the art that the image forming apparatus 100 may further include general-purpose elements in addition to the elements shown in FIG. 10. Elements of the image forming apparatus 100 may be added, omitted, or integrated according to the actual specification of the image forming apparatus 100. For example, if required, at least two of the elements of the image forming apparatus 100 shown in FIG. 10 may be integrated into one element, or one element may be subdivided into at least two elements.
- Referring to FIG. 10, the image forming apparatus 100 may include, for example, the display 110, a user input unit 120, the image processor 130, a memory 140, a fax 150, a scanner 160, an image forming unit 170, a communication unit 180, and a controller 190. The elements may exchange data with each other by using a data bus 195.
- The
display 110 may display, to a user, the screen 200 for manipulating execution of an application. The screen 200 for manipulating the execution of the application may include a first user interface 210 for setting options to be applied to the execution of the application, and a second user interface 220 including virtual buttons 222 and 224 for manipulating an operation of the image forming apparatus 100.
- The image processor 130 may generate a first image signal and a second image signal that indicate, respectively, the first user interface 210 and the second user interface 220 that are to be displayed on the screen 200 for manipulating the execution of the application. The first and second image signals generated by the image processor 130 may be transmitted to the display 110 and thus displayed as images including the first user interface 210 and the second user interface 220 on the screen 200 for manipulating the execution of the application. Also, the image processor 130 may process image signals, based on the user's manipulation input to the user input unit 120, so as to newly compose the screen displayed on the display 110. The image processing and the screen composition performed by the image processor 130 according to the user's manipulation are described above with reference to FIGS. 4 through 9.
- Regarding the display 110 and the image processor 130, descriptions that are the same as the aforementioned contents are omitted here.
- The
user input unit 120 may receive, from a user, an input of the user's manipulation with respect to a screen that is displayed on the display 110. The user input unit 120 may include at least one selected from a touch panel and a pen recognizing panel.
- The touch panel may sense a touch input by a user and may output a value of a touch event that corresponds to a signal generated by the sensed touch input. When the touch panel is combined with a display panel and thus formed as a touchscreen, the touchscreen may be configured as a capacitive touchscreen or a resistive touchscreen by using various types of touch sensors. The capacitive touchscreen may calculate touch coordinates by sensing a small amount of electricity generated when a body part of the user touches the surface of the capacitive touchscreen, which is coated with a dielectric material. The resistive touchscreen may include two embedded electrode plates and may calculate touch coordinates by sensing a flow of current that occurs when the user touches the resistive touchscreen and the upper and lower plates at the touched point thereby contact each other. The touch event that occurs on the touchscreen may be mainly generated by a finger of a person but may also be generated by an object formed of a conductive material capable of changing capacitance.
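Once the touch panel reports touch coordinates, the apparatus must decide which interface the touch belongs to; since the second user interface is drawn over the first, a simple hit test against its bounds would route the event. This routing logic is a sketch under assumed names, not part of the description above.

```python
def route_touch(point, second_ui_rect):
    """Route a touch event to the overlay (second user interface) when
    the coordinates fall inside its rectangle, otherwise to the first
    user interface underneath. second_ui_rect is (x, y, width, height);
    the right and bottom edges are treated as exclusive."""
    x, y = point
    rx, ry, rw, rh = second_ui_rect
    if rx <= x < rx + rw and ry <= y < ry + rh:
        return "second_ui"
    return "first_ui"
```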
- The pen recognizing panel may sense a proximity input or a touch input made by a user with a touch pen (e.g., a stylus pen or a digitizer pen), and may output a sensed pen proximity event or a sensed pen touch event. The pen recognizing panel may sense the touch input or the proximity input according to changes in the strength of an electromagnetic field that occur when the touch pen approaches or touches the pen recognizing panel.
- The
memory 140 may store all pieces of data that are generated according to an operation of the image forming apparatus 100 and may store all programs that are used when the image forming apparatus 100 operates. For example, the memory 140 may store data such as data received from an external device, data input via the user input unit 120, and faxed, scanned, and copied data that are generated according to the operation of the image forming apparatus 100, and may store various programs that are used in controlling the image forming apparatus 100. Also, the memory 140 may temporarily or semi-permanently store a part of the content to be displayed on the screen of the display 110.
- The memory 140 may include at least one selected from an internal memory (not shown) and an external memory (not shown). The internal memory may include at least one selected from a volatile memory (e.g., a dynamic random-access memory (DRAM), a static RAM (SRAM), a synchronous dynamic RAM (SDRAM), etc.), a non-volatile memory (e.g., a one-time programmable read-only memory (OTPROM), a programmable ROM (PROM), an erasable and programmable ROM (EPROM), an electrically erasable and programmable ROM (EEPROM), a mask ROM, a flash ROM, etc.), a hard disk drive (HDD), and a solid-state drive (SSD). The external memory may include at least one selected from a compact flash (CF) memory, a secure digital (SD) memory, a micro secure digital (Micro-SD) memory, a mini secure digital (Mini-SD) memory, an extreme digital (xD) memory, and a memory stick.
- The
fax 150 transmits or receives fax data by using a modem. The fax 150 may convert image data recorded on a document into fax data suitable for transmission via the modem, or may receive fax data from an external device and deliver the fax data to the image forming unit 170 so as to control the image forming unit 170 to output the fax data to a printing medium such as printing paper.
- The scanner 160 may generate scanned data by scanning image data that is recorded on a document, and may deliver the scanned data to the communication unit 180 for access to a network, to the memory 140 for storage, to the fax 150 for fax transmission, or to the image forming unit 170 for printing. That is, the scanner 160 may perform functions such as a scan to server message block (scan to SMB) function, a scan to file transfer protocol (scan to FTP) function, a scan to web distributed authoring and versioning (scan to WebDAV) function, a scan to e-mail function, a scan to personal computer (PC) function, or a scan to box function.
- The image forming unit 170 forms an image and outputs copied and printed data to a printing medium such as printing paper. The image forming unit 170 may include hardware units that perform charging, exposing, developing, transferring, and fixing operations so as to output the copied and printed data to the printing medium, and a software module for driving the hardware units.
- The
communication unit 180 may include a network module for access to a network according to the application and functions of the image forming apparatus 100, the modem for fax transmission and reception, and a universal serial bus (USB) host module for establishing a data transfer channel with a portable storage medium. The communication unit 180 may communicate with various external devices according to various communication schemes. The communication unit 180 may include at least one selected from a WiFi chip, a Bluetooth chip, a wireless communication chip, and a near field communication (NFC) chip. The controller 190 may control the communication unit 180 to communicate with the various external devices.
- The WiFi chip and the Bluetooth chip may communicate with another device by using WiFi and Bluetooth, respectively. If the WiFi chip or the Bluetooth chip is used, it may first transmit and receive various types of connection information, including a service set identifier (SSID), a session key, or the like, may establish a communication connection by using the connection information, and then may transmit and receive various types of information. The wireless communication chip indicates a chip that communicates with another device according to various communication standards such as those of the Institute of Electrical and Electronics Engineers (IEEE), ZigBee, 3rd generation (3G), 3rd Generation Partnership Project (3GPP), Long Term Evolution (LTE), or the like. The NFC chip indicates a chip that operates in an NFC manner by using the 13.56 MHz band from among various radio frequency identification (RFID) frequency bands such as 135 kHz, 13.56 MHz, 433 MHz, 860-960 MHz, 2.45 GHz, or the like.
- The
controller 190 may generally control the functions of the image forming apparatus 100 and may be formed as a microprocessor. The controller 190 may be divided into a plurality of processor modules that are separated according to their functions, and a main processor module that collectively manages the plurality of processor modules. The controller 190 may control the display 110, the user input unit 120, and the image processor 130 to display a screen including a user interface for the user to view, and to process image signals according to an input of the user's manipulation so that a corresponding screen may be displayed. Also, the controller 190 may control various programs and data to be stored in the memory 140 or may control various programs and data stored in the memory 140 to be loaded therefrom. The controller 190 may control an operation of the fax 150 to transmit or receive a fax, or may control an operation of the scanner 160 to scan a document. The controller 190 may control data loaded from the memory 140 to be compared with data processed by the controller 190, or may control data stored in the memory 140 to be delivered to the image forming unit 170. The controller 190 may control the communication unit 180 to receive data from an external device or to transmit data to the external device.
- Names of the elements of the image forming apparatus 100 may be changed. The image forming apparatus 100 may be embodied with more or fewer elements than the aforementioned elements and may further include other elements.
-
FIG. 11 is a flowchart of a method of providing the screen 200 for manipulating execution of an application of the image forming apparatus 100, according to an exemplary embodiment. Even where descriptions are omitted here, the descriptions given above may also be applied to the flowchart of FIG. 11.
- In operation S1110, the
image processor 130 of the image forming apparatus 100 may generate a first image signal that indicates the first user interface 210 for setting options to be applied to the execution of the application, and may generate a second image signal that indicates the second user interface 220 including the virtual buttons 222 and 224 for manipulating an operation of the image forming apparatus 100. The first user interface 210 and the second user interface 220 may be displayed differently according to application types.
- The virtual button 222 may be one of a button for starting, a button for stopping, and a button for resetting the operation of the image forming apparatus 100. The second user interface 220 may include the virtual button 224 whose shape and function are changed according to the type of the application. When the second user interface 220 includes the virtual buttons 222 and 224, the second user interface 220 may be separated based on each of the virtual buttons 222 and 224.
- In operation S1120, the
display 110 of the image forming apparatus 100 may display, based on the first image signal and the second image signal, the first user interface 210 and the second user interface 220 on the screen 200 for manipulating the execution of the application. If the first user interface 210 and the second user interface 220 overlap each other on the screen 200 for manipulating the execution of the application, the display 110 may display the second user interface 220 over the first user interface 210. Here, a degree of transparency of an entire area or a partial area of the second user interface 220 may be greater than a degree of transparency of the first user interface 210. By doing so, a user may see information displayed on an area of the first user interface 210 that is obstructed by the second user interface 220, through the translucent second user interface 220. Also, the user may move the second user interface 220 to an arbitrary position on the first user interface 210.
- If the user's manipulation occurs with respect to the screen 200 for manipulating the execution of the application that is displayed on the display 110 of the image forming apparatus 100, the user's manipulation is input by using the user input unit 120. Based on the input user's manipulation, the image processor 130 may newly compose the screen 200 for manipulating the execution of the application that is to be displayed on the display 110.
- As one example, in a case where the second user interface 220 placed over the first user interface 210 is displayed on the screen 200 for manipulating the execution of the application, if the user's manipulation with respect to the first user interface 210 is input, the image forming apparatus 100 may control the second user interface 220 to disappear from the screen 200 for a predetermined time, and if the user's manipulation is not input during the predetermined time, the image forming apparatus 100 may control the second user interface 220 to be displayed again. As another example, in a case where the second user interface 220 placed over the first user interface 210 is displayed on the screen 200 for manipulating the execution of the application, if the user's manipulation with respect to the first user interface 210 is input, the image forming apparatus 100 may control the second user interface 220 to be displayed as a substitution icon on an area of the screen 200 for manipulating the execution of the application, and if the user's manipulation with respect to the substitution icon is input, the image forming apparatus 100 may control the second user interface 220 to be displayed again.
- As described above, according to one or more of the above exemplary embodiments, the image forming apparatus may be controlled in a manner in which physical buttons are displayed as virtual buttons by using software, whereby usability of the image forming apparatus may be improved.
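In operation S1120 above, the second user interface is drawn over the first with greater transparency so the obstructed area stays legible. Per-pixel alpha blending over toy grayscale buffers illustrates the idea; the function name, buffer format, and opacity value are assumptions for illustration.

```python
def blend_overlay(first_ui, second_ui, overlay_opacity=0.4):
    """Blend the second image signal over the first, rendering the
    overlay translucently. Buffers are row-major lists of 0-255
    grayscale values standing in for the real image signals."""
    out = []
    for base_row, top_row in zip(first_ui, second_ui):
        out.append([
            round(overlay_opacity * top + (1 - overlay_opacity) * base)
            for base, top in zip(base_row, top_row)
        ])
    return out
```

Because the overlay's opacity is below 1.0, pixels of the first user interface remain visible through the blended result, which is the effect the embodiment relies on.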
- The method of providing the screen for manipulating the execution of the application can be written as a computer program and can be implemented in general-purpose digital computers that execute the program by using a non-transitory computer-readable recording medium. Examples of the computer-readable recording medium include magnetic storage media (e.g., ROM, floppy disks, hard disks, etc.) and optical recording media (e.g., CD-ROMs or DVDs).
- Examples of program instructions include both machine code, such as produced by a compiler, and files containing higher level code that may be executed by the computer using an interpreter. The described hardware devices may be configured to act as one or more software modules in order to perform the operations of the above-described embodiments, or vice versa. Any one or more of the software modules described herein may be executed by a dedicated hardware-based computer or processor unique to that unit or by a hardware-based computer or processor common to one or more of the modules. The described methods may be executed on a general purpose computer or processor or may be executed on a particular machine such as the image forming apparatus described herein.
- It should be understood that the exemplary embodiments described herein should be considered in a descriptive sense only and not for purposes of limitation. Descriptions of features or aspects within each exemplary embodiment should typically be considered as available for other similar features or aspects in other exemplary embodiments.
- While one or more exemplary embodiments have been described with reference to the figures, it will be understood by those of ordinary skill in the art that various changes in form and details may be made therein without departing from the spirit and scope as defined by the following claims.
Claims (26)
Priority Applications (1)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
US15/612,611 US11188280B2 (en) | 2014-06-03 | 2017-06-02 | Method of providing screen for manipulating execution of application of image forming apparatus and image forming apparatus using the method |
Applications Claiming Priority (2)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
KR10-2014-0067796 | 2014-06-03 | ||
KR1020140067796A KR20150139337A (en) | 2014-06-03 | 2014-06-03 | Method for providing a screen for manipulating application execution of image forming apparatus and image forming apparatus using the same |
Related Child Applications (1)
Application Number | Title | Priority Date | Filing Date |
---|---|---|---|
US15/612,611 Continuation US11188280B2 (en) | 2014-06-03 | 2017-06-02 | Method of providing screen for manipulating execution of application of image forming apparatus and image forming apparatus using the method |
Publications (2)
Publication Number | Publication Date |
---|---|
US20150347070A1 true US20150347070A1 (en) | 2015-12-03 |
US9696952B2 US9696952B2 (en) | 2017-07-04 |
Family
ID=53396255
Family Applications (2)
Application Number | Title | Priority Date | Filing Date |
---|---|---|---|
US14/644,592 Active US9696952B2 (en) | 2014-06-03 | 2015-03-11 | Method of providing screen for manipulating execution of application of image forming apparatus and image forming apparatus using the method |
US15/612,611 Active 2036-11-14 US11188280B2 (en) | 2014-06-03 | 2017-06-02 | Method of providing screen for manipulating execution of application of image forming apparatus and image forming apparatus using the method |
Family Applications After (1)
Application Number | Title | Priority Date | Filing Date |
---|---|---|---|
US15/612,611 Active 2036-11-14 US11188280B2 (en) | 2014-06-03 | 2017-06-02 | Method of providing screen for manipulating execution of application of image forming apparatus and image forming apparatus using the method |
Country Status (4)
Country | Link |
---|---|
US (2) | US9696952B2 (en) |
EP (1) | EP2953335A1 (en) |
KR (1) | KR20150139337A (en) |
CN (1) | CN105282359B (en) |
Cited By (4)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
US20160094734A1 (en) * | 2014-09-30 | 2016-03-31 | Brother Kogyo Kabushiki Kaisha | Image processing apparatus and method for the same |
US20160094728A1 (en) * | 2014-09-29 | 2016-03-31 | Brother Kogyo Kabushiki Kaisha | Function execution apparatus and screen information server |
US9491328B2 (en) * | 2015-02-28 | 2016-11-08 | Xerox Corporation | System and method for setting output plex format using automatic page detection |
WO2020096576A1 (en) * | 2018-11-06 | 2020-05-14 | Hewlett-Packard Development Company, L.P. | Iconographic control panel of an image forming apparatus |
Families Citing this family (4)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
USD821488S1 (en) * | 2017-02-16 | 2018-06-26 | Xerox Corporation | User interface display module for a multifunction printing machine |
USD822100S1 (en) * | 2017-02-16 | 2018-07-03 | Xerox Corporation | Multifunction printing machine |
USD821489S1 (en) * | 2017-02-16 | 2018-06-26 | Xerox Corporation | User interface display module for a multifunction printing machine |
CN107544465A (en) * | 2017-09-13 | 2018-01-05 | 四川谊田集群科技有限公司 | A kind of system and method to control device remote debugging |
Citations (7)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
US20050099400A1 (en) * | 2003-11-06 | 2005-05-12 | Samsung Electronics Co., Ltd. | Apparatus and method for providing vitrtual graffiti and recording medium for the same |
US20080176604A1 (en) * | 2007-01-22 | 2008-07-24 | Lg Electronics Inc. | Mobile communication device and method of controlling operation of the mobile communication device |
US20110077083A1 (en) * | 2009-09-29 | 2011-03-31 | Nexon Mobile Corporation | Method for providing user interface for controlling game |
US20120242691A1 (en) * | 2011-03-24 | 2012-09-27 | Konica Minolta Business Technologies, Inc. | Image forming apparatus, display method, and non-transitory computer-readable recording medium encoded with display program |
US20130227483A1 (en) * | 2012-02-24 | 2013-08-29 | Simon Martin THORSANDER | Method and Apparatus for Providing a User Interface on a Device That Indicates Content Operators |
US20140104178A1 (en) * | 2012-10-15 | 2014-04-17 | Samsung Electronics Co., Ltd | Electronic device for performing mode coversion in performing memo function and method thereof |
US20150146226A1 (en) * | 2013-11-27 | 2015-05-28 | Konica Minolta Inc. | Image forming apparatus, display method for an operation screen, and computer program |
Family Cites Families (25)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
US5564004A (en) * | 1994-04-13 | 1996-10-08 | International Business Machines Corporation | Method and system for facilitating the selection of icons |
JP2002055750A (en) | 2000-08-10 | 2002-02-20 | Canon Inc | Information processor and function list displaying method and storage medium |
US8631969B2 (en) * | 2004-06-15 | 2014-01-21 | Teknovation, Ltd. | Article storage and retrieval apparatus, and vending machine |
JP2006121644A (en) * | 2004-09-22 | 2006-05-11 | Seiko Epson Corp | Image display device and control method thereof |
JP2008021116A (en) * | 2006-07-12 | 2008-01-31 | Hitachi Ltd | Calculator and method for integrated management and calculation of san/nas |
US20080163119A1 (en) * | 2006-12-28 | 2008-07-03 | Samsung Electronics Co., Ltd. | Method for providing menu and multimedia device using the same |
JP2008304548A (en) | 2007-06-05 | 2008-12-18 | Kyocera Mita Corp | Image forming device |
KR20090043753A (en) * | 2007-10-30 | 2009-05-07 | 엘지전자 주식회사 | Method and apparatus for controlling multitasking of terminal device with touch screen |
US7816641B2 (en) * | 2007-12-28 | 2010-10-19 | Candela Microsystems (S) Pte. Ltd. | Light guide array for an image sensor |
US8763361B2 (en) * | 2008-01-07 | 2014-07-01 | Aerojet-General Corporation | Propulsion system with movable thermal choke |
KR101392964B1 (en) * | 2008-01-11 | 2014-05-09 | 가부시키가이샤 엔티티 도코모 | Mobile communication method and wireless base station |
US8041497B2 (en) * | 2008-07-15 | 2011-10-18 | Ford Global Technologies, Llc | Fuel based engine operation control |
FR2956547B1 (en) * | 2010-02-15 | 2012-09-14 | Sagem Wireless | METHOD AND SYSTEM FOR TRANSFERRING AN IMAGE BETWEEN TWO MOBILE TELEPHONY DEVICES |
JP5081939B2 (en) * | 2010-03-23 | 2012-11-28 | シャープ株式会社 | Operating device, electronic device and image processing apparatus including the operating device, and information display method in the operating device |
KR100989942B1 (en) * | 2010-04-29 | 2010-10-26 | 태산엔지니어링 주식회사 | Eco-friendly aqueous epoxy resin composition and its uses |
US8610745B2 (en) | 2010-05-17 | 2013-12-17 | Sharp Kabushiki Kaisha | Image forming apparatus and display console displaying preview image |
JP5832077B2 (en) * | 2010-09-24 | 2015-12-16 | 任天堂株式会社 | Information processing program, information processing apparatus, information processing system, and information processing method |
KR101861593B1 (en) | 2011-03-15 | 2018-05-28 | 삼성전자주식회사 | Apparatus and method for operating in portable terminal |
CN103459020B (en) * | 2011-04-01 | 2016-08-17 | 陶氏环球技术有限责任公司 | For converting synthesis gas to the catalyst of alcohol |
JP2013134536A (en) * | 2011-12-26 | 2013-07-08 | Brother Ind Ltd | Display unit and display program |
JP5927907B2 (en) * | 2011-12-26 | 2016-06-01 | ブラザー工業株式会社 | Image forming apparatus, image forming apparatus control method, and program |
US8928593B2 (en) | 2012-03-11 | 2015-01-06 | Beijing Hefengxin Keji Co. Ltd. | Selecting and updating location of virtual keyboard in a GUI layout in response to orientation change of a portable device |
KR102029242B1 (en) * | 2013-01-03 | 2019-11-08 | 엘지전자 주식회사 | Method of controling mobile terminal |
JP6024606B2 (en) * | 2013-07-02 | 2016-11-16 | 富士ゼロックス株式会社 | Image forming apparatus, information processing apparatus, program |
JP5884815B2 (en) * | 2013-12-13 | 2016-03-15 | コニカミノルタ株式会社 | Image forming apparatus, operation screen display method, and computer program |
-
2014
- 2014-06-03 KR KR1020140067796A patent/KR20150139337A/en not_active IP Right Cessation
-
2015
- 2015-03-11 US US14/644,592 patent/US9696952B2/en active Active
- 2015-06-02 CN CN201510295565.1A patent/CN105282359B/en active Active
- 2015-06-03 EP EP15170549.8A patent/EP2953335A1/en not_active Ceased
-
2017
- 2017-06-02 US US15/612,611 patent/US11188280B2/en active Active
Patent Citations (7)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
US20050099400A1 (en) * | 2003-11-06 | 2005-05-12 | Samsung Electronics Co., Ltd. | Apparatus and method for providing vitrtual graffiti and recording medium for the same |
US20080176604A1 (en) * | 2007-01-22 | 2008-07-24 | Lg Electronics Inc. | Mobile communication device and method of controlling operation of the mobile communication device |
US20110077083A1 (en) * | 2009-09-29 | 2011-03-31 | Nexon Mobile Corporation | Method for providing user interface for controlling game |
US20120242691A1 (en) * | 2011-03-24 | 2012-09-27 | Konica Minolta Business Technologies, Inc. | Image forming apparatus, display method, and non-transitory computer-readable recording medium encoded with display program |
US20130227483A1 (en) * | 2012-02-24 | 2013-08-29 | Simon Martin THORSANDER | Method and Apparatus for Providing a User Interface on a Device That Indicates Content Operators |
US20140104178A1 (en) * | 2012-10-15 | 2014-04-17 | Samsung Electronics Co., Ltd | Electronic device for performing mode coversion in performing memo function and method thereof |
US20150146226A1 (en) * | 2013-11-27 | 2015-05-28 | Konica Minolta Inc. | Image forming apparatus, display method for an operation screen, and computer program |
Cited By (8)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
US20160094728A1 (en) * | 2014-09-29 | 2016-03-31 | Brother Kogyo Kabushiki Kaisha | Function execution apparatus and screen information server |
US9509860B2 (en) * | 2014-09-29 | 2016-11-29 | Brother Kogyo Kabushiki Kaisha | Function execution apparatus and screen information server |
US20160094734A1 (en) * | 2014-09-30 | 2016-03-31 | Brother Kogyo Kabushiki Kaisha | Image processing apparatus and method for the same |
US9479657B2 (en) * | 2014-09-30 | 2016-10-25 | Brother Kogyo Kabushiki Kaisha | Image processing apparatus and method for the same |
US9491328B2 (en) * | 2015-02-28 | 2016-11-08 | Xerox Corporation | System and method for setting output plex format using automatic page detection |
WO2020096576A1 (en) * | 2018-11-06 | 2020-05-14 | Hewlett-Packard Development Company, L.P. | Iconographic control panel of an image forming apparatus |
CN112543704A (en) * | 2018-11-06 | 2021-03-23 | 惠普发展公司,有限责任合伙企业 | Image log control panel of image forming apparatus |
US11480904B2 (en) | 2018-11-06 | 2022-10-25 | Hewlett-Packard Development Company, L.P. | Iconographic control panel of an image forming apparatus |
Also Published As
Publication number | Publication date |
---|---|
KR20150139337A (en) | 2015-12-11 |
US11188280B2 (en) | 2021-11-30 |
US20170269887A1 (en) | 2017-09-21 |
CN105282359B (en) | 2018-09-25 |
CN105282359A (en) | 2016-01-27 |
EP2953335A1 (en) | 2015-12-09 |
US9696952B2 (en) | 2017-07-04 |
Similar Documents
Publication | Title |
---|---|
US11188280B2 (en) | Method of providing screen for manipulating execution of application of image forming apparatus and image forming apparatus using the method |
JP7328182B2 (en) | Image processing device, control method and program of image processing device |
EP3035184B1 (en) | User interface apparatus, method for controlling a user interface, and computer-readable storage medium for controlling a user interface |
JP2017058887A (en) | Display input device, image forming apparatus, electronic apparatus, display control method and program |
US9232090B2 (en) | Electronic apparatus and image forming apparatus with improved displays of different levels of menu items |
US9223531B2 (en) | Image processing apparatus that generates remote screen display data, portable terminal apparatus that receives remote screen display data, and recording medium storing a program for generating or receiving remote screen display data |
KR102333135B1 (en) | Method for providing a screen for manipulating application execution of image forming apparatus and image forming apparatus using the same |
JP2014175918A (en) | Image processing system, control method and control program |
CN107544707B (en) | Display input device |
JP2024015405A (en) | Control device and control program |
US10691293B2 (en) | Display device and computer-readable non-transitory recording medium with display control program stored thereon |
KR20170063375A (en) | Information processing apparatus, control method of information processing apparatus, and storage medium |
US11523011B2 (en) | Image forming apparatus and numerical value counting method |
JP6786199B2 (en) | Print control device, control method of print control device, and printer driver program |
JP2018165939A (en) | Application program and information processing system |
JP2018165938A (en) | Application program and information processing system |
JP2015053623A (en) | Print setting device and program |
Legal Events
Code | Title | Description |
---|---|---|
AS | Assignment | Owner name: SAMSUNG ELECTRONICS CO., LTD., KOREA, REPUBLIC OF Free format text: ASSIGNMENT OF ASSIGNORS INTEREST;ASSIGNORS:HA, KWANG-SOO;KIM, SE-YOUNG;LIM, SE-RRAH;REEL/FRAME:035178/0402 Effective date: 20150311 |
AS | Assignment | Owner name: SAMSUNG ELECTRONICS CO., LTD., KOREA, REPUBLIC OF Free format text: CORRECTIVE ASSIGNMENT TO CORRECT THE ASSIGNEE ADDRESS PREVIOUSLY RECORDED AT REEL: 035178 FRAME: 0402. ASSIGNOR(S) HEREBY CONFIRMS THE ASSIGNMENT;ASSIGNORS:HA, KWANG-SOO;KIM, SE-YOUNG;LIM, SE-RRAH;REEL/FRAME:035353/0791 Effective date: 20150311 |
AS | Assignment | Owner name: S-PRINTING SOLUTION CO., LTD., KOREA, REPUBLIC OF Free format text: ASSIGNMENT OF ASSIGNORS INTEREST;ASSIGNOR:SAMSUNG ELECTRONICS CO., LTD;REEL/FRAME:041852/0125 Effective date: 20161104 |
STCF | Information on status: patent grant | Free format text: PATENTED CASE |
CC | Certificate of correction | |
AS | Assignment | Owner name: HP PRINTING KOREA CO., LTD., KOREA, REPUBLIC OF Free format text: CHANGE OF NAME;ASSIGNOR:S-PRINTING SOLUTION CO., LTD.;REEL/FRAME:047370/0405 Effective date: 20180316 |
AS | Assignment | Owner name: HP PRINTING KOREA CO., LTD., KOREA, REPUBLIC OF Free format text: CORRECTIVE ASSIGNMENT TO CORRECT THE DOCUMENTATION EVIDENCING THE CHANGE OF NAME PREVIOUSLY RECORDED ON REEL 047370 FRAME 0405. ASSIGNOR(S) HEREBY CONFIRMS THE CHANGE OF NAME;ASSIGNOR:S-PRINTING SOLUTION CO., LTD.;REEL/FRAME:047769/0001 Effective date: 20180316 |
AS | Assignment | Owner name: HP PRINTING KOREA CO., LTD., KOREA, REPUBLIC OF Free format text: CHANGE OF LEGAL ENTITY EFFECTIVE AUG. 31, 2018;ASSIGNOR:HP PRINTING KOREA CO., LTD.;REEL/FRAME:050938/0139 Effective date: 20190611 |
AS | Assignment | Owner name: HEWLETT-PACKARD DEVELOPMENT COMPANY, L.P., TEXAS Free format text: CONFIRMATORY ASSIGNMENT EFFECTIVE APRIL 15, 2019;ASSIGNOR:HP PRINTING KOREA CO., LTD.;REEL/FRAME:050743/0481 Effective date: 20190826 |
MAFP | Maintenance fee payment | Free format text: PAYMENT OF MAINTENANCE FEE, 4TH YEAR, LARGE ENTITY (ORIGINAL EVENT CODE: M1551); ENTITY STATUS OF PATENT OWNER: LARGE ENTITY Year of fee payment: 4 |