US20190146743A1 - Display apparatus and non-transitory computer readable medium storing program - Google Patents
- Publication number
- US20190146743A1 (application US16/182,622)
- Authority
- US
- United States
- Prior art keywords
- display
- displays
- stand
- processing apparatus
- image processing
- Prior art date
- Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
- Abandoned
Classifications
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06F—ELECTRIC DIGITAL DATA PROCESSING
- G06F3/00—Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
- G06F3/14—Digital output to display device ; Cooperation and interconnection of the display device with other functional units
- G06F3/1423—Digital output to display device ; Cooperation and interconnection of the display device with other functional units controlling a plurality of local displays, e.g. CRT and flat panel display
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06F—ELECTRIC DIGITAL DATA PROCESSING
- G06F3/00—Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
- G06F3/01—Input arrangements or combined input and output arrangements for interaction between user and computer
- G06F3/048—Interaction techniques based on graphical user interfaces [GUI]
- G06F3/0481—Interaction techniques based on graphical user interfaces [GUI] based on specific properties of the displayed interaction object or a metaphor-based environment, e.g. interaction with desktop elements like windows or icons, or assisted by a cursor's changing behaviour or appearance
- G06F3/0482—Interaction with lists of selectable items, e.g. menus
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06F—ELECTRIC DIGITAL DATA PROCESSING
- G06F3/00—Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
- G06F3/01—Input arrangements or combined input and output arrangements for interaction between user and computer
- G06F3/048—Interaction techniques based on graphical user interfaces [GUI]
- G06F3/0487—Interaction techniques based on graphical user interfaces [GUI] using specific features provided by the input device, e.g. functions controlled by the rotation of a mouse with dual sensing arrangements, or of the nature of the input device, e.g. tap gestures based on pressure sensed by a digitiser
- G06F3/0488—Interaction techniques based on graphical user interfaces [GUI] using specific features provided by the input device, e.g. functions controlled by the rotation of a mouse with dual sensing arrangements, or of the nature of the input device, e.g. tap gestures based on pressure sensed by a digitiser using a touch-screen or digitiser, e.g. input of commands through traced gestures
- G06F3/04883—Interaction techniques based on graphical user interfaces [GUI] using specific features provided by the input device, e.g. functions controlled by the rotation of a mouse with dual sensing arrangements, or of the nature of the input device, e.g. tap gestures based on pressure sensed by a digitiser using a touch-screen or digitiser, e.g. input of commands through traced gestures for inputting data by handwriting, e.g. gesture or text
-
- H—ELECTRICITY
- H04—ELECTRIC COMMUNICATION TECHNIQUE
- H04N—PICTORIAL COMMUNICATION, e.g. TELEVISION
- H04N9/00—Details of colour television systems
- H04N9/12—Picture reproducers
- H04N9/31—Projection devices for colour picture display, e.g. using electronic spatial light modulators [ESLM]
- H04N9/3141—Constructional details thereof
-
- H—ELECTRICITY
- H04—ELECTRIC COMMUNICATION TECHNIQUE
- H04N—PICTORIAL COMMUNICATION, e.g. TELEVISION
- H04N9/00—Details of colour television systems
- H04N9/12—Picture reproducers
- H04N9/31—Projection devices for colour picture display, e.g. using electronic spatial light modulators [ESLM]
- H04N9/3191—Testing thereof
- H04N9/3194—Testing thereof including sensor feedback
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06V—IMAGE OR VIDEO RECOGNITION OR UNDERSTANDING
- G06V30/00—Character recognition; Recognising digital ink; Document-oriented image-based pattern recognition
- G06V30/40—Document-oriented image-based pattern recognition
-
- H—ELECTRICITY
- H04—ELECTRIC COMMUNICATION TECHNIQUE
- H04N—PICTORIAL COMMUNICATION, e.g. TELEVISION
- H04N1/00—Scanning, transmission or reproduction of documents or the like, e.g. facsimile transmission; Details thereof
- H04N1/0035—User-machine interface; Control console
- H04N1/00405—Output means
- H04N1/00408—Display of information to the user, e.g. menus
- H04N1/00411—Display of information to the user, e.g. menus the display also being used for user input, e.g. touch screen
-
- H—ELECTRICITY
- H04—ELECTRIC COMMUNICATION TECHNIQUE
- H04N—PICTORIAL COMMUNICATION, e.g. TELEVISION
- H04N23/00—Cameras or camera modules comprising electronic image sensors; Control thereof
- H04N23/90—Arrangement of cameras or camera modules, e.g. multiple cameras in TV studios or sports stadiums
-
- H04N5/247—
-
- H—ELECTRICITY
- H04—ELECTRIC COMMUNICATION TECHNIQUE
- H04N—PICTORIAL COMMUNICATION, e.g. TELEVISION
- H04N9/00—Details of colour television systems
- H04N9/12—Picture reproducers
- H04N9/31—Projection devices for colour picture display, e.g. using electronic spatial light modulators [ESLM]
Definitions
- the present invention relates to a display apparatus and a non-transitory computer readable medium storing a program.
- a display apparatus including: a first display unit that displays a plurality of display elements on a first display surface, the first display surface being not touch-sensitive; and a second display unit that displays a specific display element on a second display surface, the specific display element being selected from the plurality of display elements displayed on the first display surface, by an operation performed on the second display surface.
- FIG. 1 is a perspective view of an image processing apparatus according to an exemplary embodiment of the invention
- FIG. 2 is a diagram illustrating an example of a hardware configuration of the image processing apparatus according to the exemplary embodiment of the invention
- FIG. 3 is a block diagram illustrating a functional configuration example of a control device in the exemplary embodiment of the invention.
- FIG. 4 is a view illustrating a screen display example during stand-by of the image processing apparatus
- FIG. 5 is a view illustrating a screen display example when login is completed in the image processing apparatus
- FIG. 6 is a view illustrating a screen display example when a print operation is started in the image processing apparatus
- FIG. 7 is a view illustrating a screen display example when file contents are checked in the image processing apparatus
- FIG. 8 is a view illustrating a screen display example when an output format is selected in the image processing apparatus
- FIG. 9 is a view illustrating a screen display example when a print operation is completed in the image processing apparatus.
- FIG. 10 is a view illustrating a screen display example when documents are placed by a user on an operation stand
- FIG. 11 is a view illustrating a screen display example when documents are placed by a user on an operation stand
- FIG. 12 is a view illustrating a screen display example when two-dimensional scan is completed in the image processing apparatus
- FIG. 13 is a view illustrating a screen display example when a storage operation is started in the image processing apparatus
- FIG. 14 is a view illustrating a screen display example when three-dimensional scan is completed in the image processing apparatus
- FIG. 15 is a flowchart illustrating an operation example of the control device in the exemplary embodiment of the invention.
- FIG. 16 is a flowchart illustrating an operation example of the control device when print processing is performed.
- FIG. 17 is a flowchart illustrating an operation example of the control device when two-dimensional scan processing is performed
- FIG. 18 is a view illustrating a screen display example displayed in a first mode of the public print processing
- FIG. 19 is a view illustrating a screen display example displayed in a second mode of the public print processing
- FIG. 20 is a view illustrating a screen display example displayed in a third mode of the public print processing
- FIG. 21 is a view illustrating a screen display example displayed in a fourth mode of the public print processing
- FIG. 22 is a view illustrating a screen display example displayed in a fifth mode of the public print processing
- FIG. 23 is a flowchart illustrating an operation example of the control device when the public print processing is performed in the first to third modes
- FIG. 24 is a flowchart illustrating an operation example of the control device when the public print processing is performed in the fourth mode
- FIG. 25 is a flowchart illustrating an operation example of the control device when the public print processing is performed in the fifth mode.
- FIG. 26 is a flowchart illustrating an operation example of the control device when the public print processing is performed in the sixth and seventh modes.
- FIG. 1 is a perspective view of an image processing apparatus 100 according to an exemplary embodiment of the invention.
- the image processing apparatus 100 includes a guide display 10 , an operation stand 20 , a projector 30 , an operation detector 40 , a printer 50 , and imagers 60 a to 60 d.
- the guide display 10 is a display that presents messages to a user, such as guidance for operating the image processing apparatus 100 . Unlike the later-described operation stand 20 , the guide display 10 does not detect contact made with its surface.
- a liquid crystal display may be used as the guide display 10 .
- the guide display 10 is provided as an example of a first display surface that does not detect a contact operation.
- the operation stand 20 is a substantially horizontal stand that projects toward a user so that the user can place and operate a mobile information terminal and a document.
- here, “substantially horizontal” may refer to a levelness at which a mobile information terminal or a document placed on the operation stand 20 does not slip off.
- the operation stand 20 is designed so that an image is displayed by the function of the later-described projector 30 , and contact with the surface of the operation stand 20 is detected by the function of the later-described operation detector 40 .
- the operation stand 20 itself may be configured as a display, in which case the projector 30 may be omitted.
- the operation stand 20 is provided as an example of a display surface, a second display surface, and a platen.
- the projector 30 is a projector that projects an image onto the operation stand 20 .
- the projector 30 projects an image onto the operation stand 20 in an oblique direction from above because the projector 30 is provided at a lower portion of the guide display 10 .
- the projector 30 may be provided vertically above the operation stand 20 to project an image onto the operation stand 20 in a direction from immediately above.
- alternatively, the projector 30 may be provided vertically below the operation stand 20 , or a mirror may be used in combination with the projector 30 so that an image is projected onto the operation stand 20 from immediately below.
- a liquid crystal projector may be used as the projector 30 .
- the operation detector 40 detects an operation performed by contact with the surface of the operation stand 20 . The operation may be detected by sensing that a finger of a user blocks infrared rays radiated radially across the surface of the operation stand 20 . Specifically, for instance, an infrared LED and an infrared sensor may be used as the operation detector 40 .
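The description states only that a blocked infrared ray is sensed; it does not specify how a blocked ray is converted into a touch position. As an illustrative assumption, the sketch below uses a grid of horizontal and vertical beams (a common infrared touch-frame scheme, not one claimed by the text) and takes the centre of the blocked beams as the touch point.

```python
# Hypothetical sketch of grid-style infrared touch localization.
# The beam grid and coordinate scheme are illustrative assumptions.

def locate_touch(blocked_rows, blocked_cols):
    """Return the (x, y) cell where a finger interrupts IR beams.

    blocked_rows / blocked_cols are lists of booleans, one per horizontal
    and vertical beam; a touch blocks at least one beam on each axis.
    """
    xs = [i for i, b in enumerate(blocked_cols) if b]
    ys = [j for j, b in enumerate(blocked_rows) if b]
    if not xs or not ys:
        return None  # no touch: beams on one axis are all unbroken
    # Use the centre of the blocked beams as the touch point.
    return (sum(xs) / len(xs), sum(ys) / len(ys))
```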
- the printer 50 is a printer that prints an image on paper or other media.
- an electrophotographic system that forms an image by transferring toner adhering to a photoconductor onto a recording medium, or an inkjet printer that discharges ink onto a recording medium to form an image, may be used as the printer 50 .
- the printer 50 may be a printer that creates a printed material by pressing a block, to which ink is applied, against paper or other media.
- the printer 50 is provided as an example of the printer.
- the imagers 60 a to 60 d are cameras that capture an image of a document or a mobile information terminal placed on the operation stand 20 .
- the imagers 60 a, 60 b are provided at an upper portion of the guide display 10 , and thus mainly capture an image of a document or a mobile information terminal placed on the operation stand 20 from above.
- the imagers 60 c, 60 d are provided on the near side of the guide display 10 , and thus mainly capture an image in an oblique direction from below when a three-dimensional object is placed on the operation stand 20 .
- the imagers 60 a to 60 d have different applications according to the positions provided, and hereinafter are referred to as the imager 60 when these imagers are not distinguished from each other.
- the imager 60 also serves as a scanner; thus, hereinafter, capturing an image of something may also be expressed as “scanning” it.
- the imager 60 is provided as an example of the reading device. Although four imagers 60 are illustrated in the drawings, the number of imagers 60 is not limited to four. For instance, an imager 60 for detecting a line of sight and/or motion of a user may be provided at a position which allows such detection.
- FIG. 2 is a diagram illustrating an example of a hardware configuration of the image processing apparatus 100 according to the exemplary embodiment.
- the image processing apparatus 100 includes a central processing unit (CPU) 1 , a random access memory (RAM) 2 , a read only memory (ROM) 3 , a hard disk drive (HDD) 4 , a communication interface (hereinafter referred to as a “communication I/F”) 5 , a guide display 10 , a projector 30 , an operation detector 40 , a printer 50 , and an imager 60 .
- the CPU 1 implements the later-described functions by loading various programs stored in the ROM 3 into the RAM 2 , and executing the programs.
- the RAM 2 is a memory that is used as a memory for work of the CPU 1 .
- the ROM 3 is a memory that stores various programs to be executed by the CPU 1 .
- the HDD 4 is, for instance, a magnetic disk device that stores data scanned by the imager 60 , data used for printing by the printer 50 , and other data.
- the communication I/F 5 transmits and receives various information to and from other devices via a communication line.
- FIG. 3 is a block diagram illustrating a functional configuration example of a control device 70 that controls the image processing apparatus 100 .
- the control device 70 is an example of a display device and an image reading device, and is regarded as a device which is implemented by the CPU 1 (see FIG. 2 ) of the image processing apparatus 100 in such a manner that the CPU 1 reads a program implementing the later-described functional units, for instance, from the ROM 3 (see FIG. 2 ) to the RAM 2 (see FIG. 2 ) and executes the program.
- control device 70 includes a display controller 71 , a projection controller 72 , a detection controller 73 , a print controller 74 , an imaging controller 75 , a communication controller 76 , a payment processor 77 , a document type recognizer 78 , and a scan data processor 79 .
- the display controller 71 displays various types of guidance and various screens on the guide display 10 .
- the display controller 71 is provided as an example of a first display unit that displays information on the first display surface.
- the projection controller 72 displays various screens on the operation stand 20 using the projector 30 .
- the projection controller 72 is provided as an example of a second display unit that displays information on the display surface, the second display surface, and the platen.
- the detection controller 73 determines whether or not the operation detector 40 has detected an operation performed by contact with the surface of the operation stand 20 . In addition, the detection controller 73 also determines whether or not a human sensor (not illustrated) has detected the approach of a user.
- the print controller 74 controls printing by the printer 50 .
- the imaging controller 75 controls the imager 60 to capture an image of a document or a mobile information terminal placed on the operation stand 20 , and obtains the image captured by the imager 60 .
- the imaging controller 75 controls the imager 60 such that when a predetermined time has elapsed since a document is placed on the operation stand 20 , the imager 60 scans the document.
- the imaging controller 75 is provided as an example of a reading unit that reads an image.
- the imaging controller 75 may obtain a detection result from the imager 60 that detects a line of sight and/or motion of a user.
- the imaging controller 75 is an example of a detection unit that detects motion of a user.
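The rule that the imager 60 scans once a predetermined time has elapsed since a document was placed can be sketched as a small trigger object. The delay value and the injected clock are assumptions made purely for illustration.

```python
# Sketch of the "scan after the document has sat still long enough" rule.
# PLACEMENT_DELAY_S is a hypothetical settling time, not from the patent.

PLACEMENT_DELAY_S = 2.0

class ScanTrigger:
    def __init__(self, delay=PLACEMENT_DELAY_S):
        self.delay = delay
        self.placed_at = None  # time the document was placed, or None

    def document_placed(self, now):
        self.placed_at = now

    def document_removed(self):
        self.placed_at = None

    def should_scan(self, now):
        """True once the document has remained on the stand long enough."""
        return self.placed_at is not None and (now - self.placed_at) >= self.delay
```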
- when information recorded on a card is read by a card reader (not illustrated), the communication controller 76 receives the information from the card reader. Similarly, when information stored in a mobile information terminal is received by a near field communication (NFC) reader (not illustrated), the communication controller 76 receives the information from the NFC reader. In addition, the communication controller 76 receives information stored in a mobile information terminal via Wi-Fi (registered trademark). Bluetooth (registered trademark) may be used instead of Wi-Fi; however, the description below assumes that Wi-Fi is used. In the exemplary embodiment, the communication controller 76 is provided as an example of a reading unit that reads information.
- the communication controller 76 receives a file from an external cloud system or transmits a file to an external cloud system via the communication I/F 5 .
- the communication controller 76 is provided as an example of a receiving unit that receives data from another device, and an example of a transmission unit that transmits data to another device.
- the payment processor 77 performs payment-related processing such as generation of payment information based on the information received by the communication controller 76 from the card reader and the information received by the communication controller 76 from Wi-Fi.
- the document type recognizer 78 recognizes the type of the document.
- the type of the document may be recognized, for instance, by pattern matching with image data pre-stored for each type of document.
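The description recognizes the document type by pattern matching against pre-stored image data. As a much simpler stand-in, the sketch below classifies by page dimensions alone; the size table, tolerance, and type names for receipt, A4 paper, and business card are illustrative assumptions, not the patented method.

```python
# Simplified stand-in for document type recognition: match measured page
# dimensions against a table of known sizes (all values hypothetical).

DOC_PROFILES = {
    "business card": (91, 55),   # mm, typical Japanese business card
    "A4 paper": (297, 210),      # mm
    "receipt": (150, 58),        # mm, a representative thermal-roll strip
}

def recognize_document(width_mm, height_mm, tolerance=0.25):
    """Return the best-matching document type, or None if nothing is close."""
    # Normalize orientation: compare longest side to longest side.
    w, h = sorted((width_mm, height_mm), reverse=True)
    best, best_err = None, tolerance
    for name, (pw, ph) in DOC_PROFILES.items():
        err = max(abs(w - pw) / pw, abs(h - ph) / ph)
        if err < best_err:
            best, best_err = name, err
    return best
```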
- the scan data processor 79 performs various types of processing on scan data obtained by the imaging controller 75 .
- the various types of processing include processing of scan data, and processing to integrate pieces of scan data obtained by multiple scans.
- the scan data processor 79 is provided as an example of an output unit that outputs an image obtained by integrating two images.
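Integrating pieces of scan data obtained by multiple scans can be pictured with a toy model. Real integration would register overlapping image content; here each scan is a simple list of rows and the overlap length is assumed known, purely for illustration.

```python
# Toy sketch of joining two partial scans that share a known overlap.
# Rows are opaque values; real scan data would need image registration.

def integrate_scans(first, second, overlap):
    """Join two scans (lists of rows) that share `overlap` identical rows."""
    if overlap and first[-overlap:] != second[:overlap]:
        raise ValueError("scans do not agree on the overlapping region")
    return first + second[overlap:]
```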
- final printing and scanning are performed by the image processing apparatus 100 , but a prior operation for the printing and scanning is performed by a mobile information terminal such as a smartphone.
- application software for utilizing the image processing apparatus 100 is installed in the mobile information terminal, and a user performs the prior operation using the application. Note that the application used in the exemplary embodiment is dedicated to utilizing the image processing apparatus 100 ; thus, any “application” mentioned in the present description refers to the application for utilizing the image processing apparatus 100 .
- the various information (hereinafter referred to as “registration information”) registered in the mobile information terminal includes a payment method, a print setting, and a storage destination.
- the image processing apparatus 100 is designed to be installed and utilized in a public space, and thus a payment method has to be registered.
- the payment method indicates how payment is made for printing and scanning, and includes, for instance, payment by a credit card, and payment by an electronic money IC card.
- the print setting indicates a desired print style when printing is made.
- the print setting also includes a special output style such as stapling, and putting a printed material in an envelope or a vinyl bag.
- the storage destination indicates where scan data obtained by scanning a document is stored.
- the storage destination includes an expense settlement cloud system, a document management cloud system, and a business card management cloud system. Each of these storage destinations may be registered as the location where scan data of a document is stored according to the type of the document. For instance, registration may be made such that when the type of a document is receipt, the scan data is stored in the expense settlement cloud system; when the type of a document is A4 paper, the scan data is stored in the document management cloud system; and when the type of a document is business card, the scan data is stored in the business card management cloud system.
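The per-type registration above amounts to a lookup from recognized document type to cloud destination. The destination names mirror the description; the dict-based lookup and the fallback default are illustrative assumptions.

```python
# Routing of scan data by recognized document type, as registered in the
# mobile information terminal (lookup scheme is an illustrative assumption).

STORAGE_DESTINATIONS = {
    "receipt": "expense settlement cloud system",
    "A4 paper": "document management cloud system",
    "business card": "business card management cloud system",
}

def storage_destination(doc_type, default="document management cloud system"):
    """Return the registered storage destination for a document type."""
    return STORAGE_DESTINATIONS.get(doc_type, default)
```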
- when printing a file stored in a cloud system, a user starts up the application on the mobile information terminal and obtains a list of files from the cloud system; the list is displayed on the display of the mobile information terminal.
- a user reserves printing by designating a file which is desired to be printed.
- a file for which printing is reserved is called a “print reservation file”.
- a user registers various information for the print reservation file. For instance, a user sets an output format and a payment method for the print reservation file. Alternatively, a user may leave the output format and the payment method unset.
- the application of the mobile information terminal also provides relevant information for this case. For instance, when a user designates a print reservation file and presses a search button on the mobile information terminal, the application displays a map of the user's surrounding area on the display of the mobile information terminal, and displays the installation locations of image processing apparatuses 100 that can print the print reservation file, taking into account the output format set for the designated file. Thus, the user can go to the installation location of an image processing apparatus 100 and print the desired print reservation file.
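The search described above reduces to filtering nearby apparatuses by output-format capability. The location records, capability fields, and distance cutoff in this sketch are all hypothetical.

```python
# Hypothetical filter the application might apply when searching for an
# image processing apparatus that can honor a reserved output format.

def printable_locations(apparatuses, required_format, max_km=2.0):
    """Names of in-range apparatuses supporting the requested output format."""
    return [
        a["name"]
        for a in apparatuses
        if required_format in a["formats"] and a["distance_km"] <= max_km
    ]
```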
- FIG. 4 is a view illustrating a screen display example during stand-by of the image processing apparatus 100 .
- the image processing apparatus 100 displays a stand-by screen 101 on the guide display 10 during stand-by.
- the stand-by screen 101 includes graphics as examples of display elements that express information considered necessary for users at the installation location of the image processing apparatus 100 .
- FIG. 4 illustrates information A to H as an example of such information.
- the stand-by screen 101 has various versions according to the installation location of the image processing apparatus 100 , and the information A to H varies with the version of the stand-by screen 101 .
- when the stand-by screen 101 is a station version, for example, the information A to H is information on train operation, schedules, and travel.
- the size of each graphic may be changed according to the priority level of its information; for instance, a graphic indicating information considered highly necessary for users is displayed in a large size.
- the image processing apparatus 100 proceeds to one of print processing, two-dimensional scan processing, and three-dimensional scan processing according to the object placed on the operation stand 20 .
- when authentication based on authentication information transmitted by the application started up on the mobile information terminal 90 succeeds, the image processing apparatus 100 proceeds to the print processing.
- when the document 95 is placed on the operation stand 20 by a user, the image processing apparatus 100 proceeds to the two-dimensional scan processing, and when the three-dimensional object 97 is placed on the operation stand 20 by a user, the image processing apparatus 100 proceeds to the three-dimensional scan processing.
- FIG. 5 is a view illustrating a screen display example when login is completed in the image processing apparatus 100 .
- the image processing apparatus 100 starts print processing.
- when authentication succeeds, login processing is completed; thus the image processing apparatus 100 first displays a login completion screen 102 on the guide display 10 and the operation stand 20 .
- the login completion screen 102 is a screen in which the mobile information terminal 90 , the operation stand 20 , and the guide display 10 are linked by animation.
- FIG. 6 is a view illustrating a screen display example when a print operation is started in the image processing apparatus 100 .
- the image processing apparatus 100 displays a print instruction screen 103 on the operation stand 20 .
- the print instruction screen 103 includes an image (hereinafter referred to as a “file image”) indicating the print reservation file.
- FIG. 6 illustrates file images 911 to 913 as an example of such a file image.
- a printing fee is calculated according to the attribute of the print reservation file, and the printing fee is also displayed on the print instruction screen 103 .
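The description says only that the fee follows from attributes of the print reservation file; it does not give a rate schedule. The per-page rates and surcharge below are invented purely to illustrate one plausible calculation.

```python
# Hypothetical fee calculation from print-file attributes.
# All rates are invented for illustration; the patent gives no schedule.

RATE_PER_PAGE = {"mono": 10, "color": 50}  # hypothetical yen per page
STAPLE_SURCHARGE = 20                      # hypothetical flat fee

def printing_fee(pages, color=False, staple=False):
    """Fee for one print reservation file, from its attributes."""
    fee = pages * RATE_PER_PAGE["color" if color else "mono"]
    if staple:
        fee += STAPLE_SURCHARGE
    return fee
```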
- the image processing apparatus 100 displays a guide 121 regarding editing and a guide 122 regarding printing on the guide display 10 .
- the application displays a print button on the display of the mobile information terminal 90 .
- FIG. 7 is a view illustrating a screen display example when file contents are checked in the image processing apparatus 100 .
- the image processing apparatus 100 displays a file content display screen 104 on the operation stand 20 .
- the file content display screen 104 is a screen that displays a document in actual size and allows editing of the document. For instance, when an expansion gesture is made on the file image 913 , the contents of the print reservation file represented by the file image 913 are displayed.
- the image processing apparatus 100 displays a guide 123 regarding content check and a guide 124 regarding an output format.
- the characters of a confidential document such as an in-house document are first displayed in a blurred manner, and the characters traced by a finger of a user in accordance with the guide 123 may be displayed in a recognizable manner. Alternatively, the characters traced by the palm of a user may be displayed in a more recognizable manner.
- FIG. 8 is a view illustrating a screen display example when an output format is selected in the image processing apparatus 100 .
- when a bookbinding button (not illustrated) is pressed by a user in accordance with the guide 124 of FIG. 7 , the image processing apparatus 100 displays an output format display screen 105 on the operation stand 20 .
- the output format display screen 105 includes various output formats, and a desired output format is selectable from the output formats.
- the image processing apparatus 100 returns the current screen to the original screen.
- the image processing apparatus 100 starts printing. In this process, the image processing apparatus 100 moves the file images 911 to 913 toward the near side so that they fade out of sight.
- FIG. 9 is a view illustrating a screen display example when a print operation is completed in the image processing apparatus 100 .
- the image processing apparatus 100 displays a logout guide screen 106 on the operation stand 20 .
- the logout guide screen 106 includes a faintly shining area around the mobile information terminal 90 for prompting a user to remove the mobile information terminal 90 .
- the image processing apparatus 100 displays a guide 125 regarding logout and a guide 126 regarding personal belongings on the guide display 10 .
- the application displays a check mark on the display of the mobile information terminal 90 to notify a user of completion of printing.
- the image processing apparatus 100 performs logout processing, and displays a message indicating completion of logout on the guide display 10 and the operation stand 20 . Also, the application displays a message indicating completion of logout on the display of the mobile information terminal 90 .
- FIGS. 10 and 11 are each a view illustrating a screen display example when the document 95 is placed on the operation stand 20 by a user.
- the image processing apparatus 100 displays a guide 171 regarding position adjustment of the document 95 on the guide display 10 .
- a case is considered where a receipt 951 , an A4 paper 952 , and a business card 953 are placed as the document 95 closely to each other on the operation stand 20 by a user in accordance with the guide 171 as illustrated in FIG. 10 .
- the image processing apparatus 100 recognizes the type of the document 95 as a document, and displays a document type recognition result 150 indicating the type on the operation stand 20 .
- the image processing apparatus 100 recognizes the types of the document 95 as a receipt, A4 paper, and a business card, and displays document type recognition results 151 to 153 indicating the types on the operation stand 20 .
- the image processing apparatus 100 scans the document 95 .
- FIG. 12 is a view illustrating a screen display example when the scan is completed in the image processing apparatus 100 .
- the image processing apparatus 100 displays a guide 172 regarding removal of the document 95 and a guide 173 regarding storage destination on the guide display 10 .
- the image processing apparatus 100 displays a scanned image 921 of the receipt, a scanned image 922 of the A4 paper, and a scanned image 923 of the business card on the operation stand 20 . In this process, the image processing apparatus 100 displays the scanned images 921 to 923 in an erect state.
- FIG. 13 is a view illustrating a screen display example when a storage operation is started in the image processing apparatus 100 .
- the image processing apparatus 100 displays a storage instruction screen 154 on the operation stand 20 .
- the storage instruction screen 154 includes storage destination icons 924 to 926 indicating respective storage destinations registered for the types of document in the mobile information terminal 90 .
- the storage destination icon 924 indicates the expense settlement cloud system registered as the storage destination of scan data of a receipt.
- the storage destination icon 925 indicates the document management cloud system registered as the storage destination of scan data of A4 paper.
- the storage destination icon 926 indicates the business card management cloud system registered as the storage destination of scan data of a business card.
- the application displays a storage button on the display of the mobile information terminal 90 .
- the image processing apparatus 100 stores the scan data of the receipt, A4 paper, and business card in the respective corresponding cloud systems.
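The per-type routing described above (receipt, A4 paper, and business card each stored in the cloud system registered for that type) can be sketched as a simple lookup. The destination names and data values below are illustrative placeholders, not identifiers from the disclosure:

```python
# Hypothetical mapping from recognized document type to the cloud system
# registered as its storage destination (all names are illustrative).
STORAGE_DESTINATIONS = {
    "receipt": "expense_settlement_cloud",
    "a4_paper": "document_management_cloud",
    "business_card": "business_card_management_cloud",
}

def route_scan_data(scans):
    """Group (document_type, scan_data) pairs by their registered destination."""
    routed = {}
    for doc_type, data in scans:
        destination = STORAGE_DESTINATIONS.get(doc_type)
        if destination is None:
            raise ValueError(f"no storage destination registered for {doc_type!r}")
        routed.setdefault(destination, []).append(data)
    return routed
```
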
- the image processing apparatus 100 displays the guide 125 regarding logout, and the guide 126 regarding personal belongings on the guide display 10 .
- the image processing apparatus 100 performs logout processing, and displays a message indicating completion of logout on the guide display 10 and the operation stand 20 .
- the application displays a message indicating completion of logout on the mobile information terminal 90 .
- the image processing apparatus 100 scans the three-dimensional object 97 .
- the image processing apparatus 100 displays a result of scanning the three-dimensional object 97 on the guide display 10 and the operation stand 20 .
- FIG. 14 is a view illustrating a screen display example when the scan is completed in the image processing apparatus 100 .
- the image processing apparatus 100 displays a planar image 971 of a scan result on the operation stand 20 .
- the image processing apparatus 100 displays a three-dimensional image 972 of a scan result on the guide display 10 .
- the three-dimensional image 972 is displayed while being rotated as indicated by an arrow in FIG. 14 , thereby allowing a user to check the three-dimensional shape of the scan result.
- the image processing apparatus 100 displays a guide 176 regarding confirmation of a storage destination and a guide 177 regarding determination of a payment method on the guide display 10 .
- the image processing apparatus 100 displays a storage instruction screen 155 on the operation stand 20 .
- the storage instruction screen 155 includes a storage destination icon 927 indicating a storage destination registered in the mobile information terminal 90 .
- the storage destination icon 927 indicates a cloud system which is registered as the storage destination of scan data of three-dimensional objects.
- the application displays a storage button on the display of the mobile information terminal 90 .
- the image processing apparatus 100 stores the scan data of the three-dimensional object 97 in a corresponding cloud system.
- the image processing apparatus 100 displays the guide 125 regarding logout, and the guide 126 regarding personal belongings on the guide display 10 .
- the image processing apparatus 100 performs logout processing, and displays a message indicating completion of logout on the guide display 10 and the operation stand 20 .
- the application displays a message indicating completion of logout on the mobile information terminal 90 .
- FIG. 15 is a flowchart illustrating an operation example of the control device 70 that performs such screen display.
- the display controller 71 first displays the stand-by screen 101 on the guide display 10 (step 701 ).
- the detection controller 73 determines whether or not a human sensor has detected approach of a user (step 702 ). When it is determined that the human sensor has not detected approach of a user, the detection controller 73 repeats step 702 , whereas when it is determined that the human sensor has detected approach of a user, the control device 70 performs public print processing to print information necessary for a user in a public space (step 703 ).
- the imaging controller 75 determines whether or not the imager 60 has detected anything placed on the operation stand 20 (step 704 ). When it is determined that the imager 60 has not detected anything placed on the operation stand 20 , the control device 70 continues the public print processing.
- the imaging controller 75 determines whether or not the imager 60 has detected the document 95 placed on the operation stand 20 (step 705 ). As a result, when it is determined that the imager 60 has detected the document 95 placed on the operation stand 20 , the control device 70 performs two-dimensional scan processing (step 706 ).
- the imaging controller 75 determines whether or not the imager 60 has detected the mobile information terminal 90 placed on the operation stand 20 (step 707 ).
- when it is determined that the imager 60 has detected the mobile information terminal 90 placed on the operation stand 20 , the control device 70 performs print processing (step 708 ).
- the communication controller 76 obtains authentication information registered in the mobile information terminal 90 before the print processing is performed, makes authentication and Wi-Fi connection setting based on the authentication information, and receives registration information from the mobile information terminal 90 via Wi-Fi.
- when it is determined that the imager 60 has not detected the mobile information terminal 90 placed on the operation stand 20 , the control device 70 performs three-dimensional scan processing (step 709 ).
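The branching of FIG. 15 described above can be summarized in the following sketch; the string labels are illustrative stand-ins for the detected states and the processing routines, not part of the disclosure:

```python
def dispatch(human_detected, placed_object):
    """Sketch of the FIG. 15 branching. placed_object is None (nothing placed),
    "document", "mobile_terminal", or anything else, which is treated as a
    three-dimensional object."""
    if not human_detected:
        return "stand_by"                 # steps 701-702: keep the stand-by screen
    if placed_object is None:
        return "public_print"             # step 703: nothing placed yet
    if placed_object == "document":
        return "two_dimensional_scan"     # step 706
    if placed_object == "mobile_terminal":
        return "print"                    # step 708
    return "three_dimensional_scan"       # step 709
```
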
- FIG. 16 is a flowchart illustrating an operation example of the control device 70 when the print processing in step 708 of FIG. 15 is performed.
- the control device 70 first displays the login completion screen 102 on the guide display 10 and the operation stand 20 (step 721 ). Specifically, the display controller 71 displays part of the login completion screen 102 on the guide display 10 , and the projection controller 72 displays the remaining part of the login completion screen 102 on the operation stand 20 using the projector 30 .
- the projection controller 72 performs print instruction screen display processing to display a print instruction screen 103 on the operation stand 20 using the projector 30 , the print instruction screen 103 for giving an instruction to print a print reservation file (step 722 ).
- the payment processor 77 performs payment processing by a payment method registered for the print reservation file in the registration information or a payment method selected then (step 723 ).
- the communication controller 76 determines whether or not notification that the print button has been pressed down in the mobile information terminal 90 has been received via Wi-Fi (step 724 ).
- when it is determined that notification that the print button has been pressed down in the mobile information terminal 90 has not been received via Wi-Fi, the communication controller 76 repeats step 724 , whereas when it is determined that notification that the print button has been pressed down in the mobile information terminal 90 has been received via Wi-Fi, the print controller 74 performs control so that printing is made by the printer 50 (step 725 ).
- the projection controller 72 displays the logout guide screen 106 on the operation stand 20 using the projector 30 (step 726 ).
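As a rough sketch, the sequence of steps 721 to 726 may be modeled as follows, with the Wi-Fi notification of step 724 represented by an iterator of polled results (all names and labels are illustrative assumptions):

```python
def print_processing(button_events):
    """Sketch of FIG. 16: button_events yields False until the notification
    that the print button was pressed arrives over Wi-Fi (then True)."""
    log = [
        "show login completion screen",    # step 721
        "show print instruction screen",   # step 722
        "perform payment processing",      # step 723
    ]
    while not next(button_events):         # step 724: repeat until notified
        pass
    log.append("print via printer")        # step 725
    log.append("show logout guide screen")  # step 726
    return log
```
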
- FIG. 17 is a flowchart illustrating an operation example of the control device 70 when the two-dimensional scan processing in step 706 of FIG. 15 is performed.
- the control device 70 first displays a document type recognition result on the operation stand 20 (step 741 ). Specifically, the imaging controller 75 obtains the image of the document 95 captured by the imager 60 , the document type recognizer 78 recognizes the type of the document 95 , for instance, by pattern matching, and the projection controller 72 displays a result of the recognition on the operation stand 20 using the projector 30 .
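The disclosure leaves the recognition method open ("for instance, by pattern matching"); as one hypothetical alternative, the three document types of FIG. 10 could be distinguished by their measured dimensions. The thresholds below are assumptions for illustration only:

```python
def recognize_document_type(width_mm, height_mm):
    """Illustrative stand-in for the document type recognizer: classify a
    detected sheet by its measured size in millimetres (thresholds assumed)."""
    w, h = sorted((width_mm, height_mm))
    if (w, h) == (210, 297):           # ISO 216 A4 sheet
        return "a4_paper"
    if w <= 60 and h <= 100:           # typical business card footprint
        return "business_card"
    if w <= 90 and h > 100:            # narrow, long slip of paper
        return "receipt"
    return "document"                  # fallback: generic document
```
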
- the imaging controller 75 determines whether or not the imager 60 has detected change in the position of the document 95 (step 742 ). When it is determined that the imager 60 has detected change in the position of the document 95 , the control device 70 performs step 741 again. When it is determined that the imager 60 has not detected change in the position of the document 95 , the imaging controller 75 determines whether or not a predetermined time has elapsed (step 743 ). When it is determined that a predetermined time has not elapsed, the imaging controller 75 performs step 742 again.
- when it is determined that a predetermined time has elapsed, the imaging controller 75 scans the document 95 placed on the operation stand 20 using the imager 60 (step 744 ).
- the projection controller 72 performs scan image display processing to display the scanned image 92 on the operation stand 20 using the projector 30 (step 745 ).
- the imaging controller 75 determines whether or not the imager 60 has detected the mobile information terminal 90 placed on the operation stand 20 (step 746 ). When it is determined that the imager 60 has not detected the mobile information terminal 90 placed on the operation stand 20 , the imaging controller 75 repeats step 746 , whereas when it is determined that the imager 60 has detected the mobile information terminal 90 placed on the operation stand 20 , the projection controller 72 displays a storage instruction screen on the operation stand 20 using the projector 30 , the storage instruction screen for giving an instruction to store scan data (step 747 ).
- the communication controller 76 obtains authentication information registered in the mobile information terminal 90 , makes authentication and Wi-Fi connection setting based on the authentication information, and receives registration information from the mobile information terminal 90 via Wi-Fi.
- the payment processor 77 performs payment processing by a payment method registered for the type of the document 95 in the registration information or a payment method selected then (step 748 ).
- the communication controller 76 determines whether or not notification that the storage button has been pressed down in the mobile information terminal 90 has been received via Wi-Fi (step 749 ).
- when it is determined that notification that the storage button has been pressed down in the mobile information terminal 90 has not been received via Wi-Fi, the communication controller 76 repeats step 749 , whereas when it is determined that notification that the storage button has been pressed down in the mobile information terminal 90 has been received via Wi-Fi, the projection controller 72 performs storage instruction screen erasure processing to erase the storage instruction screen 154 (step 750 ).
- the communication controller 76 transmits the scan data of the document 95 to a storage destination registered for the type of the document 95 via the communication I/F 5 , and stores the scan data (step 751 ).
- the screen displayed by the public print processing in step 703 of FIG. 15 is roughly divided into two types: one is for displaying on the operation stand 20 a graphic selected, by an operation performed on the surface of the operation stand 20 , from the graphics representing the information A to H displayed on the guide display 10 , and the other is for displaying on the operation stand 20 a graphic selected by an operation (such as a gesture) of a user from the graphics representing the information A to H displayed on the guide display 10 .
- the former includes the type in which the graphics displayed on the guide display 10 are allowed to extend into the operation stand 20 and a graphic is selected by an operation on the operation stand 20 .
- This type includes a mode (hereinafter referred to as a “first mode”) in which, of the graphics displayed on the guide display 10 , a graphic displayed on the side near the operation stand 20 is allowed to extend into the operation stand 20 ; a mode (hereinafter referred to as a “second mode”) in which, of the graphics displayed on the guide display 10 , a graphic selected by an operation in an area into which a graphic extends is allowed to extend into the operation stand 20 ; a mode (hereinafter referred to as a “third mode”) in which, of the graphics displayed on the guide display 10 , a graphic selected by an operation in an area other than the area into which a graphic extends is allowed to extend into the operation stand 20 ; and a mode (hereinafter referred to as a “fourth mode”) in which, of the graphics displayed on the guide display 10 , a graphic moved without an operation of a user is allowed to extend into the operation stand 20 .
- the former also includes a mode (hereinafter referred to as a “fifth mode”) in which one of the graphics displayed on the guide display 10 is selected by an operation on the operation stand 20 with the graphics not extending into the operation stand 20 .
- the latter includes a mode (hereinafter referred to as a “sixth mode”) in which of the graphics representing the information A to H displayed on the guide display 10 , a graphic displayed at a position indicated by an operation of a user is displayed on the operation stand 20 ; and a mode (hereinafter referred to as a “seventh mode”) in which of the graphics representing the information A to H displayed on the guide display 10 , a graphic corresponding to identification information indicated by an operation of a user is displayed on the operation stand 20 .
- FIG. 18 is a view illustrating a screen display example displayed in the first mode of the public print processing in step 703 of FIG. 15 .
- the image processing apparatus 100 displays on the guide display 10 a public print screen 201 in which the graphics representing the information A to H are aligned in a vertical row.
- the image processing apparatus 100 displays part of a graphic that represents the lowermost information A in a selection candidate display area 211 which is an example of a first area of the operation stand 20 .
- a swipe operation performed by a user on the graphic representing the desired information in the direction indicated by a solid line arrow 241 causes the image processing apparatus 100 to display the graphic in a print target display area 221 , the swipe operation being an example of a first operation, the print target display area 221 being an example of a second area of the operation stand 20 .
- FIG. 19 is a view illustrating a screen display example displayed in the second mode of the public print processing in step 703 of FIG. 15 .
- the image processing apparatus 100 displays on the guide display 10 a public print screen 202 in which the graphics representing the information A to H are aligned in a circular manner.
- the image processing apparatus 100 displays part of a graphic that represents the lowermost information A in a selection candidate display area 212 which is an example of the first area of the operation stand 20 .
- a swipe operation performed by a user on the graphic representing the desired information in the direction indicated by a solid line arrow 242 causes the image processing apparatus 100 to display the graphic in a print target display area 222 , the swipe operation being an example of the first operation, the print target display area 222 being an example of the second area of the operation stand 20 .
- the solid line arrow 242 indicates a downward direction and the dashed line arrow 232 indicates a rightward direction in the above; however, without being limited to this, the former may be a first direction and the latter may be a second direction different from the first direction in a more generalized manner.
- FIG. 20 is a view illustrating a screen display example displayed in the third mode of the public print processing in step 703 of FIG. 15 .
- the image processing apparatus 100 displays on the guide display 10 a public print screen 203 in which the graphics representing the information A to H are aligned in a vertical row.
- the image processing apparatus 100 displays part of a graphic that represents the lowermost information A in a selection candidate display area 213 which is an example of the first area of the operation stand 20 .
- a swipe operation performed by a user on the graphic representing the desired information in the direction indicated by a solid line arrow 243 causes the image processing apparatus 100 to display the graphic in the print target display area 223 , the swipe operation being an example of the first operation.
- FIG. 21 is a view illustrating a screen display example displayed in the fourth mode of the public print processing in step 703 of FIG. 15 .
- the image processing apparatus 100 displays on the guide display 10 a public print screen 204 in which the graphics representing the information A to H are aligned in a vertical row.
- the image processing apparatus 100 displays part of a graphic that represents the lowermost information A in a selection candidate display area 214 which is an example of the first area of the operation stand 20 . Subsequently, the image processing apparatus 100 moves the graphics representing the information A to H without an operation of a user.
- the graphics representing the information B to H slide down to the respective positions at which the graphics representing the information A to G used to be, and the graphic representing the information A is moved to the position at which the graphic representing the information H used to be.
- a swipe operation which is an example of the first operation, performed by a user on the graphic representing the desired information in the direction indicated by a solid line arrow 244 causes the image processing apparatus 100 to display the graphic in a print target display area 224 which is an example of the second area of the operation stand 20 .
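The cyclic movement of the fourth mode (A to the former position of H, and B to H each shifted by one) is a one-step rotation of the candidate list; a minimal sketch, with the list ordered from the position nearest the operation stand 20 upward:

```python
from collections import deque

def rotate_candidates(graphics):
    """One automatic movement of the fourth mode: the first (lowermost)
    graphic moves to the last position and the rest each shift by one."""
    d = deque(graphics)
    d.rotate(-1)   # A moves to the end; B..H shift toward the front
    return list(d)
```
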
- FIG. 22 is a view illustrating a screen display example displayed in the fifth mode of the public print processing in step 703 of FIG. 15 .
- the image processing apparatus 100 displays on the guide display 10 a public print screen 205 in which the graphics representing the information A to H are aligned in a horizontal row.
- a swipe operation which is an example of the first operation, performed by a user on the graphic representing the desired information in the direction indicated by a solid line arrow 245 causes the image processing apparatus 100 to display the graphic on the operation stand 20 .
- in the description above, graphics are moved by a user performing a swipe operation in the rightward direction on the public print screen 205 ; however, without being limited to this, graphics may be moved without an operation of a user, for instance, when a predetermined time has elapsed.
- a swipe operation performed by a user on the graphic causes the image processing apparatus 100 to display the graphic on the operation stand 20 .
- a swipe operation performed by a user on the graphic may cause the image processing apparatus 100 to display the graphic on the operation stand 20 in a more generalized manner.
- graphics may be displayed, for instance, in descending order of the degree to which they are desired by a user in the selection candidate display areas 211 to 214 of the operation stand 20 .
- the degree desired by a user may be calculated based on attributes of a user, such as sex and age, which are obtained from an image of the user captured by the imager 60 , for instance.
- the above-described sixth and seventh modes may also be considered as a method to move a graphic representing desired information to the operation stand 20 from the graphics representing the information A to H displayed on the guide display 10 .
- the sixth mode of the public print processing in step 703 of FIG. 15 will be described.
- the image processing apparatus 100 identifies the position at which a graphic representing desired information is displayed, the graphic being one of the graphics representing the information A to H displayed on the guide display 10 .
- the “operation to indicate a position” includes glancing at the position at which a graphic representing desired information is displayed, and pointing at the position at which a graphic representing desired information is displayed. Accordingly, the image processing apparatus 100 changes the display mode of the graphic displayed at the identified position. For instance, in order to attract attention, the graphic displayed at the identified position is made larger than the other graphics or separated from the other graphics. Subsequently, when a swipe operation is performed by a user on the operation stand 20 , the image processing apparatus 100 moves the graphic displayed at the identified position to the operation stand 20 .
- the seventh mode of the public print processing in step 703 of FIG. 15 will be described.
- the image processing apparatus 100 identifies the identification information associated with the graphic representing desired information, the graphic being one of the graphics representing the information A to H displayed on the guide display 10 .
- when the identification information is a number, the “operation to indicate identification information with finger” includes holding up as many fingers as the number. Accordingly, the image processing apparatus 100 changes the display mode of the graphic associated with the identified identification information. For instance, the graphic associated with the identified identification information is made larger than the other graphics or separated from the other graphics to attract attention. Subsequently, when a swipe operation is performed by a user on the operation stand 20 , the image processing apparatus 100 moves the graphic associated with the identified identification information to the operation stand 20 .
- Alignment of the graphics representing the information A to H has not been mentioned for the sixth and seventh modes because one of the graphics on the guide display 10 is directly identified by an operation of a user, and thus the order of the selection candidates among the graphics representing the information A to H does not have to be considered.
- FIG. 23 is a flowchart illustrating an operation example of the control device 70 when the public print processing in step 703 of FIG. 15 is performed in the first to third modes.
- the display controller 71 first aligns and displays graphics representing information on the guide display 10 (step 801 ). Specifically, the graphics representing information are aligned and displayed in a vertical direction as in FIGS. 18 and 20 in the first and the third modes, and the graphics are aligned and displayed in a circular manner as in FIG. 19 in the second mode.
- the detection controller 73 determines whether or not the first swipe operation on the operation stand 20 has been detected by the operation detector 40 (step 802 ).
- the first swipe operation refers to the swipe operation in the direction indicated by the dashed line arrow 231 in FIG. 18 in the first mode, the swipe operation in the direction indicated by the dashed line arrow 232 in FIG. 19 in the second mode, and the swipe operation in the direction indicated by the dashed line arrow 233 in FIG. 20 in the third mode.
- when it is determined that the first swipe operation on the operation stand 20 has not been detected by the operation detector 40 , the detection controller 73 ends the processing.
- on the other hand, when it is determined that the first swipe operation on the operation stand 20 has been detected by the operation detector 40 , the display controller 71 moves and displays the graphic on the guide display 10 , and the projection controller 72 changes and displays the graphics in the selection candidate display area on the operation stand 20 using the projector 30 (step 803 ). Specifically, the graphics are moved in a vertical direction and displayed as in FIGS. 18 and 20 in the first and the third modes, and the graphics are moved in a circular direction and displayed as in FIG. 19 in the second mode.
- the detection controller 73 determines whether or not the second swipe operation on the operation stand 20 has been detected by the operation detector 40 (step 804 ).
- the second swipe operation refers to the swipe operation in the direction indicated by the solid line arrow 241 of FIG. 18 in the first mode, the swipe operation in the direction indicated by the solid line arrow 242 of FIG. 19 in the second mode, and the swipe operation in the direction indicated by the solid line arrow 243 of FIG. 20 in the third mode.
- when it is determined that the second swipe operation on the operation stand 20 has not been detected by the operation detector 40 , the detection controller 73 ends the processing.
- on the other hand, when it is determined that the second swipe operation on the operation stand 20 has been detected by the operation detector 40 , the projection controller 72 moves and displays the graphic from the selection candidate display area to the print target display area on the operation stand 20 using the projector 30 (step 805 ).
- the detection controller 73 determines whether pressing of the print button on the operation stand 20 is detected by the operation detector 40 (step 806 ). When it is determined that pressing of the print button on the operation stand 20 has not been detected by the operation detector 40 , the detection controller 73 ends the processing, whereas when it is determined that pressing of the print button on the operation stand 20 has been detected by the operation detector 40 , the print controller 74 performs control so that printing is made by the printer 50 (step 807 ).
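Steps 801 to 807 for the first to third modes reduce to a two-swipe sequence followed by the print button; the sketch below returns the ordered actions, with string labels standing in for the actual screen operations (illustrative only):

```python
def public_print_mode(swipes, print_pressed):
    """Sketch of FIG. 23: swipes is the ordered list of detected operations;
    if either expected swipe is missing, the processing ends early."""
    actions = ["align and display graphics on guide display"]       # step 801
    if not swipes or swipes[0] != "first_swipe":                    # step 802
        return actions
    actions.append("change graphics in selection candidate area")   # step 803
    if len(swipes) < 2 or swipes[1] != "second_swipe":              # step 804
        return actions
    actions.append("move graphic to print target area")             # step 805
    if print_pressed:                                               # step 806
        actions.append("print via printer")                         # step 807
    return actions
```
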
- FIG. 24 is a flowchart illustrating an operation example of the control device 70 when the public print processing in step 703 of FIG. 15 is performed in the fourth mode.
- the display controller 71 first aligns and displays graphics representing information on the guide display 10 (step 821 ). Specifically, the graphics representing information are aligned and displayed in a vertical direction as in FIG. 21 .
- the display controller 71 determines whether or not a predetermined time has elapsed (step 822 ).
- when it is determined that a predetermined time has elapsed, the display controller 71 ends the processing. On the other hand, when it is determined that a predetermined time has not elapsed, the display controller 71 moves and displays the graphic on the guide display 10 , and the projection controller 72 changes and displays the graphics in the selection candidate display area on the operation stand 20 using the projector 30 (step 823 ). Specifically, the graphics are moved in a vertical direction and displayed as in FIG. 21 .
- the detection controller 73 determines whether or not a swipe operation on the operation stand 20 has been detected by the operation detector 40 (step 824 ).
- the swipe operation refers to the swipe operation in the direction indicated by the solid line arrow 244 in FIG. 21 .
- when it is determined that the swipe operation on the operation stand 20 has not been detected by the operation detector 40 , the detection controller 73 ends the processing.
- on the other hand, when it is determined that the swipe operation on the operation stand 20 has been detected by the operation detector 40 , the projection controller 72 moves and displays the graphic from the selection candidate display area to the print target display area on the operation stand 20 using the projector 30 (step 825 ).
- the detection controller 73 determines whether pressing of the print button on the operation stand 20 is detected by the operation detector 40 (step 826 ). When it is determined that pressing of the print button on the operation stand 20 has not been detected by the operation detector 40 , the detection controller 73 ends the processing, whereas when it is determined that pressing of the print button on the operation stand 20 has been detected by the operation detector 40 , the print controller 74 performs control so that printing is made by the printer 50 (step 827 ).
- FIG. 25 is a flowchart illustrating an operation example of the control device 70 when the public print processing in step 703 of FIG. 15 is performed in the fifth mode.
- the display controller 71 first aligns and displays graphics representing information on the guide display 10 (step 841 ). Specifically, the graphics representing information are aligned and displayed in a horizontal direction as in FIG. 22 .
- the detection controller 73 determines whether or not the first swipe operation on the operation stand 20 has been detected by the operation detector 40 (step 842 ).
- the first swipe operation refers to the swipe operation in the direction indicated by the dashed line arrow 235 in FIG. 22 .
- when it is determined that the first swipe operation on the operation stand 20 has not been detected by the operation detector 40 , the detection controller 73 ends the processing.
- on the other hand, when it is determined that the first swipe operation on the operation stand 20 has been detected by the operation detector 40 , the display controller 71 moves and displays the graphic on the guide display 10 , and the projection controller 72 changes and displays the graphics in the selection candidate display area on the operation stand 20 using the projector 30 (step 843 ). Specifically, the graphics are moved in a horizontal direction and displayed as in FIG. 22 .
- the detection controller 73 determines whether or not the second swipe operation on the operation stand 20 has been detected by the operation detector 40 (step 844 ).
- the second swipe operation refers to the swipe operation in the direction indicated by the solid line arrow 245 of FIG. 22 .
- when it is determined that the second swipe operation on the operation stand 20 has not been detected by the operation detector 40 , the detection controller 73 ends the processing.
- on the other hand, when it is determined that the second swipe operation on the operation stand 20 has been detected by the operation detector 40 , the display controller 71 and the projection controller 72 move the graphic displayed at the central position of the guide display 10 to the operation stand 20 (step 845 ). Specifically, the display controller 71 deletes the graphic displayed at the central position of the guide display 10 , and the projection controller 72 displays the deleted graphic on the operation stand 20 using the projector 30 .
- the detection controller 73 determines whether pressing of the print button on the operation stand 20 is detected by the operation detector 40 (step 846 ). When it is determined that pressing of the print button on the operation stand 20 has not been detected by the operation detector 40 , the detection controller 73 ends the processing, whereas when it is determined that pressing of the print button on the operation stand 20 has been detected by the operation detector 40 , the print controller 74 performs control so that printing is made by the printer 50 (step 847 ).
- FIG. 26 is a flowchart illustrating an operation example of the control device 70 that performs the public print processing in the sixth and seventh modes.
- the imaging controller 75 first determines whether a predetermined operation of a user has been detected by the imager 60 (step 861 ).
- the predetermined operation refers to designating a graphic on the guide display 10 by glancing or pointing at the graphic in the sixth mode, and designating a graphic on the guide display 10 by indicating the identification information of the graphic with finger in the seventh mode.
- when it is determined that a predetermined operation of a user has not been detected by the imager 60 , the imaging controller 75 ends the processing.
- on the other hand, when it is determined that a predetermined operation of a user has been detected by the imager 60 , the display controller 71 displays the designated graphic on the guide display 10 in a changed display mode (step 862 ). Specifically, the designated graphic is made larger than the other graphics or separated from the other graphics to attract attention.
- the detection controller 73 determines whether or not a swipe operation on the operation stand 20 has been detected by the operation detector 40 (step 863 ).
- the swipe operation refers to the swipe operation in a direction to the near side of the operation stand 20 , for instance.
- when it is determined that the swipe operation on the operation stand 20 has not been detected by the operation detector 40 , the detection controller 73 ends the processing.
- on the other hand, when it is determined that the swipe operation on the operation stand 20 has been detected by the operation detector 40 , the display controller 71 and the projection controller 72 move the designated graphic on the guide display 10 to the operation stand 20 (step 864 ). Specifically, the display controller 71 deletes the designated graphic on the guide display 10 , and the projection controller 72 displays the deleted graphic on the operation stand 20 using the projector 30 .
- the detection controller 73 determines whether pressing of the print button on the operation stand 20 is detected by the operation detector 40 (step 865 ). When it is determined that pressing of the print button on the operation stand 20 has not been detected by the operation detector 40 , the detection controller 73 ends the processing, whereas when it is determined that pressing of the print button on the operation stand 20 has been detected by the operation detector 40 , the print controller 74 performs control so that printing is made by the printer 50 (step 866 ).
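Steps 861 to 862 for the sixth and seventh modes both reduce to resolving a user gesture to one graphic. In the sketch below the gesture is a (kind, value) pair, where the sixth mode supplies a position index and the seventh mode an identification number; this representation, and identification numbers starting at 1, are assumptions for illustration:

```python
def designate_graphic(graphics, gesture):
    """Resolve a detected gesture to the designated graphic (steps 861-862).
    Sixth mode: ("position", index) from glancing or pointing.
    Seventh mode: ("identification", number) from fingers held up."""
    kind, value = gesture
    if kind == "position":              # sixth mode
        index = value
    elif kind == "identification":      # seventh mode: numbers assumed to start at 1
        index = value - 1
    else:
        return None
    if 0 <= index < len(graphics):
        return graphics[index]          # this graphic is then enlarged or separated
    return None
```
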
- the processing performed by the control device 70 in the exemplary embodiment is prepared, for instance, as a program such as application software.
- Any program that implements the exemplary embodiment is considered to be a program that causes a computer to implement a function of displaying multiple display elements on the first display surface, which is not touch-sensitive, and a function of displaying, on the second display surface, a specific display element selected from the multiple display elements displayed on the first display surface by an operation performed on the second display surface.
- Any program that implements the exemplary embodiment is considered to be a program that causes a computer to implement a function of detecting an operation of an operator, a function of displaying multiple display elements on the first display surface which is not touch-sensitive, and a function of displaying a specific display element on the second display surface according to an operation performed on the second display surface, the specific display element being selected from the multiple display elements displayed on the first display surface by the detected operation.
- Any program that implements the exemplary embodiment may be provided not only by a communication unit, but also by a recording medium such as a CD-ROM that stores the program.
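The pair of functions described above can be modeled roughly as follows. This is a minimal sketch, assuming hypothetical class and attribute names; the two lists stand in for the non-touch first display surface (the guide display 10) and the touch-detecting second display surface (the operation stand 20).

```python
class TwoSurfaceDisplay:
    """Minimal model of the claimed pair of functions: the first
    (non-touch) surface shows many display elements; an operation on
    the second (touch-detecting) surface pulls one selected element
    onto it."""

    def __init__(self, elements):
        self.first_surface = list(elements)  # e.g. the guide display 10
        self.second_surface = []             # e.g. the operation stand 20

    def select(self, element):
        # An operation performed on the second display surface selects
        # an element shown on the first surface and displays it on the
        # second surface.
        if element in self.first_surface:
            self.first_surface.remove(element)
            self.second_surface.append(element)
```

For instance, selecting element "B" from a first surface showing "A", "B", and "C" leaves "A" and "C" on the first surface and displays "B" on the second.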
Description
- This application is based on and claims priority under 35 USC 119 from Japanese Patent Application No. 2017-219915 filed on Nov. 15, 2017.
- The present invention relates to a display apparatus and a non-transitory computer readable medium storing a program.
- According to an aspect of the invention, there is provided a display apparatus including: a first display unit that displays a plurality of display elements on a first display surface, the first display surface being not touch-sensitive; and a second display unit that displays a specific display element on a second display surface, the specific display element being selected from the plurality of display elements displayed on the first display surface, by an operation performed on the second display surface.
- An exemplary embodiment of the present invention will be described in detail based on the following figures, wherein:
- FIG. 1 is a perspective view of an image processing apparatus according to an exemplary embodiment of the invention;
- FIG. 2 is a diagram illustrating an example of a hardware configuration of the image processing apparatus according to the exemplary embodiment of the invention;
- FIG. 3 is a block diagram illustrating a functional configuration example of a control device in the exemplary embodiment of the invention;
- FIG. 4 is a view illustrating a screen display example during stand-by of the image processing apparatus;
- FIG. 5 is a view illustrating a screen display example when login is completed in the image processing apparatus;
- FIG. 6 is a view illustrating a screen display example when a print operation is started in the image processing apparatus;
- FIG. 7 is a view illustrating a screen display example when file contents are checked in the image processing apparatus;
- FIG. 8 is a view illustrating a screen display example when an output format is selected in the image processing apparatus;
- FIG. 9 is a view illustrating a screen display example when a print operation is completed in the image processing apparatus;
- FIG. 10 is a view illustrating a screen display example when documents are placed by a user on an operation stand;
- FIG. 11 is a view illustrating a screen display example when documents are placed by a user on an operation stand;
- FIG. 12 is a view illustrating a screen display example when two-dimensional scan is completed in the image processing apparatus;
- FIG. 13 is a view illustrating a screen display example when a storage operation is started in the image processing apparatus;
- FIG. 14 is a view illustrating a screen display example when three-dimensional scan is completed in the image processing apparatus;
- FIG. 15 is a flowchart illustrating an operation example of the control device in the exemplary embodiment of the invention;
- FIG. 16 is a flowchart illustrating an operation example of the control device when print processing is performed;
- FIG. 17 is a flowchart illustrating an operation example of the control device when two-dimensional scan processing is performed;
- FIG. 18 is a view illustrating a screen display example displayed in a first mode of the public print processing;
- FIG. 19 is a view illustrating a screen display example displayed in a second mode of the public print processing;
- FIG. 20 is a view illustrating a screen display example displayed in a third mode of the public print processing;
- FIG. 21 is a view illustrating a screen display example displayed in a fourth mode of the public print processing;
- FIG. 22 is a view illustrating a screen display example displayed in a fifth mode of the public print processing;
- FIG. 23 is a flowchart illustrating an operation example of the control device when the public print processing is performed in the first to third modes;
- FIG. 24 is a flowchart illustrating an operation example of the control device when the public print processing is performed in the fourth mode;
- FIG. 25 is a flowchart illustrating an operation example of the control device when the public print processing is performed in the fifth mode; and
- FIG. 26 is a flowchart illustrating an operation example of the control device when the public print processing is performed in the sixth and seventh modes.
- Hereinafter, an exemplary embodiment of the invention will be described in detail with reference to the accompanying drawings.
-
FIG. 1 is a perspective view of an image processing apparatus 100 according to an exemplary embodiment of the invention. As illustrated, the image processing apparatus 100 includes a guide display 10, an operation stand 20, a projector 30, an operation detector 40, a printer 50, and imagers 60a to 60d. - The
guide display 10 is a display that displays a message to a user, such as guidance for an operation of the image processing apparatus 100. Unlike the later-described operation stand 20, even when contact is made with the surface of the guide display 10, the contact is not detected. Here, for instance, a liquid crystal display may be used as the guide display 10. In the exemplary embodiment, the guide display 10 is provided as an example of a first display surface that does not detect a contact operation. - The
operation stand 20 is a substantially horizontal stand that projects toward a user so that the user can place and operate a mobile information terminal and a document. In this case, "substantially horizontal" may refer to a levelness that does not cause a mobile information terminal or a document placed on the operation stand 20 to slip down. The operation stand 20 is designed so that an image is displayed on it by the function of the later-described projector 30, and contact with its surface is detected by the function of the later-described operation detector 40. However, the operation stand 20 itself may be configured as a display, in which case the projector 30 need not be provided. In the exemplary embodiment, the operation stand 20 is provided as an example of a display surface, a second display surface, and a platen. - The
projector 30 is a projector that projects an image onto the operation stand 20. The projector 30 projects an image onto the operation stand 20 in an oblique direction from above because the projector 30 is provided at a lower portion of the guide display 10. The projector 30, however, may be provided vertically above the operation stand 20 to project an image onto the operation stand 20 in a direction from immediately above. Alternatively, the projector 30 may be provided vertically below the operation stand 20, or the projector 30 may project an image onto the operation stand 20 in a direction from immediately below using a mirror. Here, for instance, a liquid crystal projector may be used as the projector 30. - The
operation detector 40 detects an operation made by contact with the surface of the operation stand 20. The operation may be detected by sensing that a finger of a user blocks infrared rays radiating radially along the surface of the operation stand 20. Specifically, for instance, an infrared LED and an infrared sensor may be used as the operation detector 40. - The
printer 50 is a printer that prints an image on paper or other media. Here, for instance, an electrophotographic system that forms an image by transferring toner adhering to a photoconductor onto a recording medium, or an inkjet printer that discharges ink onto a recording medium to form an image, may be used as the printer 50. Alternatively, the printer 50 may be a printer that creates printed material by pressing a block, to which ink is applied, against paper or other media. In the exemplary embodiment, the printer 50 is provided as an example of the printer. - The
imagers 60a to 60d are cameras that capture an image of a document or a mobile information terminal placed on the operation stand 20. Among these, some of the imagers are provided at an upper portion of the guide display 10, and thus mainly capture an image of a document or a mobile information terminal placed on the operation stand 20 from above. The other imagers are provided at a lower portion of the guide display 10, and thus mainly capture an image in an oblique direction from below when a three-dimensional object is placed on the operation stand 20. In this way, the imagers 60a to 60d have different applications according to the positions at which they are provided; hereinafter they are referred to as the imager 60 when they are not distinguished from each other. The imager 60 serves as a scanner, and thus "captures something" may hereinafter also be expressed as "scans something". In the exemplary embodiment, the imager 60 is provided as an example of the reading device. Although four imagers 60 are illustrated in the drawings, the number of imagers 60 is not limited to four. For instance, an imager 60 for detecting a line of sight and/or motion of a user may be provided at a position which allows such detection. -
FIG. 2 is a diagram illustrating an example of a hardware configuration of the image processing apparatus 100 according to the exemplary embodiment. As illustrated, the image processing apparatus 100 includes a central processing unit (CPU) 1, a random access memory (RAM) 2, a read only memory (ROM) 3, a hard disk drive (HDD) 4, a communication interface (hereinafter referred to as a "communication I/F") 5, a guide display 10, a projector 30, an operation detector 40, a printer 50, and an imager 60. - The
CPU 1 implements the later-described functions by loading various programs stored in the ROM 3 into the RAM 2 and executing them. The RAM 2 is a memory that is used as a working memory for the CPU 1. The ROM 3 is a memory that stores various programs to be executed by the CPU 1. The HDD 4 is, for instance, a magnetic disk device that stores data scanned by the imager 60, data used for printing in the printer 50, and other data. The communication I/F 5 transmits and receives various information to and from other devices via a communication line. - Since the
guide display 10, the projector 30, the operation detector 40, the printer 50, and the imager 60 have already been described with reference to FIG. 1, a description thereof is omitted here. -
FIG. 3 is a block diagram illustrating a functional configuration example of a control device 70 that controls the image processing apparatus 100. Here, the control device 70 is an example of a display device and an image reading device, and is implemented by the CPU 1 (see FIG. 2) of the image processing apparatus 100 reading a program that implements the later-described functional units, for instance, from the ROM 3 (see FIG. 2) into the RAM 2 (see FIG. 2) and executing the program. As illustrated, the control device 70 includes a display controller 71, a projection controller 72, a detection controller 73, a print controller 74, an imaging controller 75, a communication controller 76, a payment processor 77, a document type recognizer 78, and a scan data processor 79. - The
display controller 71 displays various types of guidance and various screens on the guide display 10. In the exemplary embodiment, the display controller 71 is provided as an example of a first display unit that displays information on the first display surface. - The
projection controller 72 displays various screens on the operation stand 20 using the projector 30. In the exemplary embodiment, the projection controller 72 is provided as an example of a second display unit that displays information on the display surface, the second display surface, and the platen. - The
detection controller 73 determines whether or not the operation detector 40 has detected an operation made by contact with the surface of the operation stand 20. In addition, the detection controller 73 also determines whether or not a human sensor (not illustrated) has detected the approach of a user. - The
print controller 74 controls printing by the printer 50. - The
imaging controller 75 controls the imager 60 to capture an image of a document or a mobile information terminal placed on the operation stand 20, and obtains the image captured by the imager 60. In particular, the imaging controller 75 controls the imager 60 such that when a predetermined time has elapsed since a document was placed on the operation stand 20, the imager 60 scans the document. In the exemplary embodiment, the imaging controller 75 is provided as an example of a reading unit that reads an image. Also, the imaging controller 75 may obtain a detection result from the imager 60 that detects a line of sight and/or motion of a user. In this case, the imaging controller 75 is an example of a detection unit that detects motion of a user. - When information recorded on a card is read by a card reader (not illustrated), the
communication controller 76 receives the information from the card reader. Also, when information stored in a mobile information terminal is received by a near field communication (NFC) reader (not illustrated), the communication controller 76 receives the information from the NFC reader. In addition, the communication controller 76 receives information stored in a mobile information terminal via Wi-Fi (registered trademark). Instead of Wi-Fi, Bluetooth (registered trademark) may be used; however, the description below assumes that Wi-Fi is used. In the exemplary embodiment, the communication controller 76 is provided as an example of a reading unit that reads information. - In addition, the
communication controller 76 receives a file from an external cloud system or transmits a file to an external cloud system via the communication I/F 5. In the exemplary embodiment, the communication controller 76 is provided as an example of a receiving unit that receives data from another device, and an example of a transmission unit that transmits data to another device. - The
payment processor 77 performs payment-related processing, such as generation of payment information, based on the information received by the communication controller 76 from the card reader and the information received by the communication controller 76 via Wi-Fi. - When a document is placed on the
operation stand 20, the document type recognizer 78 recognizes the type of the document. The type of the document may be recognized, for instance, by pattern matching with image data pre-stored for each type of document. - The
scan data processor 79 performs various types of processing on scan data obtained by the imaging controller 75. Here, the various types of processing include processing of scan data and processing to integrate pieces of scan data obtained by multiple scans. In the exemplary embodiment, the scan data processor 79 is provided as an example of an output unit that outputs an image obtained by integrating two images. - In the exemplary embodiment, final printing and scanning are performed by the
image processing apparatus 100, but a prior operation for the printing and scanning is performed by a mobile information terminal such as a smartphone. - Thus, before a screen display example of the
image processing apparatus 100 is described, a prior operation performed on the mobile information terminal will be described. Application software (hereinafter referred to as an "application") for utilizing the image processing apparatus 100 is installed on the mobile information terminal, and a user performs the prior operation using the application. It is to be noted that the application used in the exemplary embodiment is only for utilizing the image processing apparatus 100; thus, any "application" mentioned in the present description indicates the application for utilizing the image processing apparatus 100. - First, the operation for the first time in the mobile information terminal will be described. When subscribing to a service for utilizing the
image processing apparatus 100, a user starts up the application on the mobile information terminal, and registers authentication information and other various information for performing authentication in the mobile information terminal.
- In the exemplary embodiment, the
image processing apparatus 100 is designed to be installed and utilized in a public space, and thus a payment method has to be registered. Specifically, the payment method indicates how payment is made for printing and scanning, and includes, for instance, payment by a credit card, and payment by an electronic money IC card. - Also, the print setting indicates a desired print style when printing is made. In addition to normal print setting such as monochrome printing or color printing, and single-sided printing or double-sided printing, the print setting also includes a special output style such as stapling, and putting a printed material in an envelope or a vinyl bag.
- Also, the storage destination indicates where scan data obtained by scanning a document is stored. The storage destination includes an expense settlement cloud system, a document management cloud system, and a business card management cloud system. These storage destinations may be each registered as the location where scan data of a document is stored according to the type of the document. Registration may be made such that for instance, when the type of a document is receipt, the scan data is stored in the expense settlement cloud system, when the type of a document is A4 paper, the scan data is stored in the document management cloud system, and when the type of a document is business card, the scan data is stored in the business card cloud system.
- Next, the operation for the second time and after in the mobile information terminal will be described. For instance, when printing a file stored in a cloud system, a user starts up the application by the mobile information terminal, obtains a list of files from the cloud system, and the list is displayed on the display of the mobile information terminal. In this state, a user reserves printing by designating a file which is desired to be printed. Hereinafter, a file for which printing is reserved is called a “print reservation file”. Also, a user registers various information in the print reservation file. For instance, a user sets an output format, and a payment method to the print reservation file. Alternatively, a user may leave the output format and the payment method unset.
- Subsequently, for actually printing the file, a user has to go to an installation location of the
image processing apparatus 100 in a public space. The application of the mobile information terminal also provides relevant information for this case. For instance, when a user designates a print reservation file and presses down a search button of the mobile information terminal, the application displays a map of the surrounding area of the user on the display of the mobile information terminal, and displays the installation location of animage processing apparatus 100 that can print the print reservation file in consideration of an output format set for the designated print reservation file. Thus, it is possible for the user to go to the installation location of theimage processing apparatus 100 and to print the print reservation file which is desired to be printed. - Hereinafter, a screen display example in the
image processing apparatus 100 will be described. - (Screen Display Example during Stand-by)
-
FIG. 4 is a view illustrating a screen display example during stand-by of the image processing apparatus 100. As illustrated, the image processing apparatus 100 displays a stand-by screen 101 on the guide display 10 during stand-by. The stand-by screen 101 includes graphics as an example of display elements that express information considered to be necessary for a user in relation to the installation location of the image processing apparatus 100. FIG. 4 illustrates information A to H as an example of such information. The stand-by screen 101 has various versions according to the installation location of the image processing apparatus 100, and the information A to H vary with the version of the stand-by screen 101. For instance, when the image processing apparatus 100 is installed in a station, the stand-by screen 101 is a station version, and the information A to H is information on train operation, schedules, and travel. Although all the graphics indicating the information A to H have the same size in FIG. 4, the size may be changed according to a priority level of information; for instance, a graphic indicating information that is considered to be highly necessary for users may be displayed in a large size.
- In the state where the stand-by screen 101 of FIG. 4 is displayed, when one of a mobile information terminal 90, a document 95, and a three-dimensional object 97 is placed on the operation stand 20 by a user, the image processing apparatus 100 proceeds to one of print processing, two-dimensional scan processing, and three-dimensional scan processing according to the object placed on the operation stand 20. Specifically, when the mobile information terminal 90 is placed on the operation stand 20 by a user, successful authentication based on authentication information transmitted by the application which has been started up on the mobile information terminal 90 causes the image processing apparatus 100 to proceed to the print processing. In contrast, when the document 95 is placed on the operation stand 20 by a user, the image processing apparatus 100 proceeds to the two-dimensional scan processing, and when the three-dimensional object 97 is placed on the operation stand 20 by a user, the image processing apparatus 100 proceeds to the three-dimensional scan processing. -
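The dispatch from the stand-by state can be sketched as a simple selection on the kind of placed object. The object labels below are illustrative stand-ins, and authentication gates the print processing as described above.

```python
def processing_for(placed_object, authenticated=False):
    """Select the processing mode from the kind of object placed on
    the operation stand (object names are illustrative labels)."""
    if placed_object == "mobile_information_terminal":
        # Print processing starts only after successful authentication.
        return "print" if authenticated else "stand-by"
    if placed_object == "document":
        return "two_dimensional_scan"
    if placed_object == "three_dimensional_object":
        return "three_dimensional_scan"
    return "stand-by"
```

A terminal without successful authentication leaves the apparatus in the stand-by state, while a document or three-dimensional object immediately selects the corresponding scan processing.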
FIG. 5 is a view illustrating a screen display example when login is completed in the image processing apparatus 100. When the mobile information terminal 90 is placed on the operation stand 20 by a user and authentication is successful, the image processing apparatus 100 starts print processing. In this process, successful authentication causes login processing to be completed; thus, the image processing apparatus 100 first displays a login completion screen 102 on the guide display 10 and the operation stand 20. As illustrated, the login completion screen 102 is a screen in which the mobile information terminal 90, the operation stand 20, and the guide display 10 are linked by animation. -
FIG. 6 is a view illustrating a screen display example when a print operation is started in the image processing apparatus 100. When a print reservation file is designated by the mobile information terminal 90 placed on the operation stand 20, the image processing apparatus 100 displays a print instruction screen 103 on the operation stand 20. The print instruction screen 103 includes an image (hereinafter referred to as a "file image") indicating the print reservation file. FIG. 6 illustrates file images 911 to 913 as an example of such a file image. Furthermore, in FIG. 6, a printing fee is calculated according to the attribute of the print reservation file, and the printing fee is also displayed on the print instruction screen 103. On the other hand, the image processing apparatus 100 displays a guide 121 regarding editing and a guide 122 regarding printing on the guide display 10. At this point, as illustrated, the application displays a print button on the display of the mobile information terminal 90. -
FIG. 7 is a view illustrating a screen display example when file contents are checked in the image processing apparatus 100. When an expansion gesture is made by a user in accordance with the guide 121 of FIG. 6, the image processing apparatus 100 displays a file content display screen 104 on the operation stand 20. The file content display screen 104 is a screen that displays a document in actual size and allows editing of the document. For instance, when an expansion gesture is made on the file image 913, the contents of the print reservation file represented by the file image 913 are displayed. On the other hand, the image processing apparatus 100 displays a guide 123 regarding content check and a guide 124 regarding an output format on the guide display 10. The characters of a confidential document such as an in-house document are first displayed in a blurred manner, and the characters traced by a finger of a user in accordance with the guide 123 may be displayed in a recognizable manner. Alternatively, the characters traced by the palm of a user may be displayed in a more recognizable manner. -
FIG. 8 is a view illustrating a screen display example when an output format is selected in the image processing apparatus 100. When a bookbinding button (not illustrated) is pressed down by a user in accordance with the guide 124 of FIG. 7, the image processing apparatus 100 displays an output format display screen 105 on the operation stand 20. The output format display screen 105 includes various output formats, and a desired output format is selectable from the output formats. When one of the output formats is selected by a user, the image processing apparatus 100 returns the current screen to the original screen. - Subsequently, in the state where the
print instruction screen 103 of FIG. 6 is displayed, when a printing fee is paid and the print button is pressed down by a user, the image processing apparatus 100 starts printing. In this process, the image processing apparatus 100 moves the file images 911 to 913 toward the near side to fade out of sight. -
FIG. 9 is a view illustrating a screen display example when a print operation is completed in the image processing apparatus 100. When the print operation is completed, the image processing apparatus 100 displays a logout guide screen 106 on the operation stand 20. The logout guide screen 106 includes a faintly shining area around the mobile information terminal 90 for prompting a user to remove the mobile information terminal 90. On the other hand, the image processing apparatus 100 displays a guide 125 regarding logout and a guide 126 regarding personal belongings on the guide display 10. At this point, as illustrated, the application displays a check mark on the display of the mobile information terminal 90 to notify a user of completion of printing. - Thus, when a user removes the
mobile information terminal 90 from the operation stand 20, the image processing apparatus 100 performs logout processing, and displays a message indicating completion of logout on the guide display 10 and the operation stand 20. Also, the application displays a message indicating completion of logout on the display of the mobile information terminal 90. -
FIGS. 10 and 11 are each a view illustrating a screen display example when the document 95 is placed on the operation stand 20 by a user. When the document 95 is placed on the operation stand 20 by a user, the image processing apparatus 100 displays a guide 171 regarding position adjustment of the document 95 on the guide display 10. First, a case is considered where a receipt 951, an A4 paper 952, and a business card 953 are placed as the document 95 close to each other on the operation stand 20 by a user in accordance with the guide 171, as illustrated in FIG. 10. In this case, the image processing apparatus 100 recognizes the type of the document 95 as a document, and displays a document type recognition result 150 indicating the type on the operation stand 20. Next, a case is considered where the receipt 951, the A4 paper 952, and the business card 953 are placed as the document 95 apart from each other on the operation stand 20 by a user in accordance with the guide 171, as illustrated in FIG. 11. In this case, the image processing apparatus 100 recognizes the types of the document 95 as a receipt, A4 paper, and a business card, and displays document type recognition results 151 to 153 indicating the types on the operation stand 20. - Subsequently, when a predetermined time elapses with the
document 95 placed as illustrated in FIG. 11, the image processing apparatus 100 scans the document 95. -
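The placement behavior of FIGS. 10 and 11 and the delayed scan can be sketched as below. This is an illustrative model only: documents are reduced to intervals along one axis of the operation stand, and the gap threshold and settle time are assumed values, not taken from the text.

```python
def recognize_placed_documents(placements, min_gap=10):
    """placements: (doc_type, left_edge, right_edge) tuples in mm along
    one axis of the operation stand. Documents closer together than
    min_gap are recognized collectively as a generic 'document'
    (FIG. 10); documents placed apart are recognized individually
    (FIG. 11)."""
    spans = sorted(placements, key=lambda p: p[1])
    for (_, _, right), (_, left, _) in zip(spans, spans[1:]):
        if left - right < min_gap:
            return ["document"]
    return [doc_type for doc_type, _, _ in spans]

def should_scan(placed_at, now, settle_time=3.0):
    """Scan only once the documents have rested on the stand for a
    predetermined time (the 3-second settle_time is assumed)."""
    return placed_at is not None and (now - placed_at) >= settle_time
```

With sufficiently spaced placements the individual types are reported, matching the separate recognition results 151 to 153 of FIG. 11.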
FIG. 12 is a view illustrating a screen display example when the scan is completed in the image processing apparatus 100. When the scan is completed, the image processing apparatus 100 displays a guide 172 regarding removal of the document 95 and a guide 173 regarding the storage destination on the guide display 10. When the receipt 951, the A4 paper 952, and the business card 953 are removed from the operation stand 20 by a user in accordance with the guide 172, the image processing apparatus 100 displays a scanned image 921 of the receipt, a scanned image 922 of the A4 paper, and a scanned image 923 of the business card on the operation stand 20. In this process, the image processing apparatus 100 displays the scanned images 921 to 923 in an erect state. -
FIG. 13 is a view illustrating a screen display example when a storage operation is started in the image processing apparatus 100. When the mobile information terminal 90 is placed on the operation stand 20 by a user, the image processing apparatus 100 displays a storage instruction screen 154 on the operation stand 20. In addition to the scanned images 921 to 923, the storage instruction screen 154 includes storage destination icons 924 to 926 indicating the respective storage destinations registered for the types of document in the mobile information terminal 90. Here, the storage destination icon 924 indicates the expense settlement cloud system registered as the storage destination of scan data of receipt. Also, the storage destination icon 925 indicates the document management cloud system registered as the storage destination of scan data of A4 paper. In addition, the storage destination icon 926 indicates the business card management cloud system registered as the storage destination of scan data of business card. In this process, the application displays a storage button on the display of the mobile information terminal 90. When a storage fee is paid and the storage button is pressed down on the storage instruction screen 154 by a user, the image processing apparatus 100 stores the scan data of the receipt, A4 paper, and business card in the respective corresponding cloud systems. - Subsequently, as illustrated in
FIG. 9, the image processing apparatus 100 displays the guide 125 regarding logout and the guide 126 regarding personal belongings on the guide display 10. Thus, when a user removes the mobile information terminal 90 from the operation stand 20, the image processing apparatus 100 performs logout processing, and displays a message indicating completion of logout on the guide display 10 and the operation stand 20. Also, the application displays a message indicating completion of logout on the mobile information terminal 90. - When a predetermined time elapses with the three-dimensional object 97 placed on the
operation stand 20, the image processing apparatus 100 scans the three-dimensional object 97. When the three-dimensional object 97 is removed from the operation stand 20 by a user, the image processing apparatus 100 displays a result of scanning the three-dimensional object 97 on the guide display 10 and the operation stand 20. -
FIG. 14 is a view illustrating a screen display example when the scan is completed in the image processing apparatus 100. When the scan is completed, the image processing apparatus 100 displays a planar image 971 of a scan result on the operation stand 20. On the other hand, the image processing apparatus 100 displays a three-dimensional image 972 of a scan result on the guide display 10. The three-dimensional image 972 is displayed while being rotated as indicated by an arrow in FIG. 14 , thereby allowing a user to check the three-dimensional shape of the scan result. In addition, the image processing apparatus 100 displays a guide 176 regarding confirmation of a storage destination and a guide 177 regarding determination of a payment method on the guide display 10. When the mobile information terminal 90 is placed on the operation stand 20 by a user in this state, the image processing apparatus 100 displays a storage instruction screen 155 on the operation stand 20. In addition to the planar image 971 of the scan result, the storage instruction screen 155 includes a storage destination icon 927 indicating a storage destination registered in the mobile information terminal 90. Here, the storage destination icon 927 indicates a cloud system which is registered as the storage destination of scan data of three-dimensional objects. In this process, the application displays a storage button on the display of the mobile information terminal 90. When a storage fee is paid and the storage button is pressed down on the storage instruction screen 155 by a user, the image processing apparatus 100 stores the scan data of the three-dimensional object 97 in a corresponding cloud system. - Subsequently, as illustrated in
FIG. 9 , the image processing apparatus 100 displays the guide 125 regarding logout, and the guide 126 regarding personal belongings on the guide display 10. Thus, when the mobile information terminal 90 is removed from the operation stand 20 by a user, the image processing apparatus 100 performs logout processing, and displays a message indicating completion of logout on the guide display 10 and the operation stand 20. Also, the application displays a message indicating completion of logout on the mobile information terminal 90. -
FIG. 15 is a flowchart illustrating an operation example of the control device 70 that performs such screen display. - As illustrated, in the
control device 70, the display controller 71 first displays the stand-by screen 101 on the guide display 10 (step 701). - Next, the
detection controller 73 determines whether or not a human sensor has detected approach of a user (step 702). When it is determined that the human sensor has not detected approach of a user, the detection controller 73 repeats step 702, whereas when it is determined that the human sensor has detected approach of a user, the control device 70 performs public print processing to print information necessary for a user in a public space (step 703). - Subsequently, the
imaging controller 75 determines whether or not the imager 60 has detected anything placed on the operation stand 20 (step 704). When it is determined that the imager 60 has not detected anything placed on the operation stand 20, the control device 70 continues the public print processing. - On the other hand, when it is determined that the
imager 60 has detected anything placed on the operation stand 20, the imaging controller 75 determines whether or not the imager 60 has detected the document 95 placed on the operation stand 20 (step 705). As a result, when it is determined that the imager 60 has detected the document 95 placed on the operation stand 20, the control device 70 performs two-dimensional scan processing (step 706). - Also, when it is determined that the
imager 60 has not detected the document 95 placed on the operation stand 20, the imaging controller 75 determines whether or not the imager 60 has detected the mobile information terminal 90 placed on the operation stand 20 (step 707). As a result, when it is determined that the imager 60 has detected the mobile information terminal 90 placed on the operation stand 20, the control device 70 performs print processing (step 708). At this point, in the control device 70, it is assumed that the communication controller 76 obtains authentication information registered in the mobile information terminal 90 before the print processing is performed, makes authentication and Wi-Fi connection setting based on the authentication information, and receives registration information from the mobile information terminal 90 via Wi-Fi. On the other hand, when it is determined that the imager 60 has not detected the mobile information terminal 90 placed on the operation stand 20, the control device 70 performs three-dimensional scan processing (step 709). -
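The branching of steps 704 to 709 amounts to a dispatch on what the imager detects on the operation stand. The following is a hedged sketch of that dispatch; the object labels and function name are illustrative stand-ins, not the embodiment's actual interfaces.

```python
# Sketch of the FIG. 15 dispatch (steps 704-709). The labels for the
# detected object are hypothetical stand-ins for the imager's output.
from typing import Optional

def dispatch(placed_object: Optional[str]) -> str:
    """Return the processing the control device 70 would perform."""
    if placed_object is None:
        return "public_print"            # step 704: nothing placed yet
    if placed_object == "document":
        return "two_dimensional_scan"    # step 706
    if placed_object == "mobile_terminal":
        return "print_processing"        # step 708 (after authentication)
    return "three_dimensional_scan"      # step 709: any other object
```

Note that the three-dimensional scan is the fallback case: anything placed on the stand that is neither a document nor a mobile information terminal is treated as a three-dimensional object.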
FIG. 16 is a flowchart illustrating an operation example of the control device 70 when the print processing in step 708 of FIG. 15 is performed. - As illustrated, the
control device 70 first displays the login completion screen 102 on the guide display 10 and the operation stand 20 (step 721). Specifically, the display controller 71 displays part of the login completion screen 102 on the guide display 10, and the projection controller 72 displays the remaining part of the login completion screen 102 on the operation stand 20 using the projector 30. - Next, the
projection controller 72 performs print instruction screen display processing to display a print instruction screen 103 on the operation stand 20 using the projector 30, the print instruction screen 103 for giving an instruction to print a print reservation file (step 722). - Subsequently, the
payment processor 77 performs payment processing by a payment method registered for the print reservation file in the registration information or a payment method selected at that time (step 723). The communication controller 76 then determines whether or not notification that the print button has been pressed down in the mobile information terminal 90 has been received via Wi-Fi (step 724). When it is determined that such notification has not been received via Wi-Fi, the communication controller 76 repeats step 724, whereas when it is determined that the notification has been received via Wi-Fi, the print controller 74 performs control so that printing is made by the printer 50 (step 725). - Subsequently, when printing by the
printer 50 is completed, the projection controller 72 displays the logout guide screen 106 on the operation stand 20 using the projector 30 (step 726). -
FIG. 17 is a flowchart illustrating an operation example of the control device 70 when the two-dimensional scan processing in step 706 of FIG. 15 is performed. - As illustrated, the
control device 70 first displays a document type recognition result on the operation stand 20 (step 741). Specifically, the imaging controller 75 obtains the image of the document 95 captured by the imager 60, the document type recognizer 78 recognizes the type of the document 95, for instance, by pattern matching, and the projection controller 72 displays a result of the recognition on the operation stand 20 using the projector 30. - Next, the
imaging controller 75 determines whether or not the imager 60 has detected a change in the position of the document 95 (step 742). When it is determined that the imager 60 has detected a change in the position of the document 95, the control device 70 performs step 741 again. When it is determined that the imager 60 has not detected a change in the position of the document 95, the imaging controller 75 determines whether or not a predetermined time has elapsed (step 743). When it is determined that the predetermined time has not elapsed, the imaging controller 75 performs step 742 again. - On the other hand, when it is determined that a predetermined time has elapsed, the
imaging controller 75 scans the document 95 placed on the operation stand 20 using the imager 60 (step 744). Thus, the projection controller 72 performs scan image display processing to display the scanned image 92 on the operation stand 20 using the projector 30 (step 745). - Next, the
imaging controller 75 determines whether or not the imager 60 has detected the mobile information terminal 90 placed on the operation stand 20 (step 746). When it is determined that the imager 60 has not detected the mobile information terminal 90 placed on the operation stand 20, the imaging controller 75 repeats step 746, whereas when it is determined that the imager 60 has detected the mobile information terminal 90 placed on the operation stand 20, the projection controller 72 displays a storage instruction screen on the operation stand 20 using the projector 30, the storage instruction screen for giving an instruction to store scan data (step 747). At this point, it is assumed that the communication controller 76 obtains authentication information registered in the mobile information terminal 90, makes authentication and Wi-Fi connection setting based on the authentication information, and receives registration information from the mobile information terminal 90 via Wi-Fi. - Subsequently, the
payment processor 77 performs payment processing by a payment method registered for the type of the document 95 in the registration information or a payment method selected at that time (step 748). The communication controller 76 then determines whether or not notification that the storage button has been pressed down in the mobile information terminal 90 has been received via Wi-Fi (step 749). When it is determined that such notification has not been received via Wi-Fi, the communication controller 76 repeats step 749, whereas when it is determined that the notification has been received via Wi-Fi, the projection controller 72 performs storage instruction screen erasure processing to erase the storage instruction screen 154 (step 750). The communication controller 76 then transmits the scan data of the document 95 to a storage destination registered for the type of the document 95 via the communication I/F 5, and stores the scan data (step 751). - In the exemplary embodiment, the screen displayed by the public print processing in step 703 of
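The wait loop of steps 742 to 744 is, in effect, a stability check: the document is scanned only after its detected position stops changing for a predetermined time. A minimal sketch, with repeated position samples standing in for elapsed time (the function name and threshold are illustrative):

```python
# Sketch of the wait loop of FIG. 17 (steps 742-744): the document is
# scanned only once its detected position has stopped changing for a
# predetermined number of samples (a stand-in for the elapsed time).
def position_when_settled(positions, required_repeats=3):
    """Return the position once it repeats `required_repeats` times in
    a row, or None if it never settles within the given samples."""
    last, repeats = None, 0
    for pos in positions:
        if pos == last:
            repeats += 1
            if repeats >= required_repeats:
                return pos  # step 744: position is stable, scan here
        else:
            last, repeats = pos, 0  # step 742: position changed, restart
    return None
```

This mirrors the flowchart's behavior of returning to step 741 whenever the document moves, so a user can reposition the document freely before the scan fires.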
FIG. 15 is roughly divided into two types: one displays on the operation stand 20 a graphic selected, by an operation performed on the surface of the operation stand 20, from the graphics representing the information A to H displayed on the guide display 10, and the other displays on the operation stand 20 a graphic selected, by an operation of a user, from the graphics representing the information A to H displayed on the guide display 10. - Between these, the former includes the type in which the graphics displayed on the
guide display 10 are allowed to extend into the operation stand 20 and a graphic is selected by an operation on the operation stand 20. This type includes a mode (hereinafter referred to as a “first mode”) in which, of the graphics displayed on the guide display 10, a graphic displayed on the side near the operation stand 20 is allowed to extend into the operation stand 20; a mode (hereinafter referred to as a “second mode”) in which, of the graphics displayed on the guide display 10, a graphic selected by an operation in an area into which a graphic extends is allowed to extend into the operation stand 20; a mode (hereinafter referred to as a “third mode”) in which, of the graphics displayed on the guide display 10, a graphic selected by an operation in an area other than the area into which a graphic extends is allowed to extend into the operation stand 20; and a mode (hereinafter referred to as a “fourth mode”) in which, of the graphics displayed on the guide display 10, a graphic selected without an operation of a user is allowed to extend into the operation stand 20. It is to be noted that the second and third modes may be regarded as modes in which, of the graphics displayed on the guide display 10, a graphic selected by an operation on the operation stand 20 is allowed to extend into the operation stand 20. - Also, the former includes a mode (hereinafter referred to as a “fifth mode”) in which one of the graphics displayed on the
guide display 10 is selected by an operation on the operation stand 20, with the graphics not extending into the operation stand 20. - On the other hand, the latter includes a mode (hereinafter referred to as a “sixth mode”) in which, of the graphics representing the information A to H displayed on the
guide display 10, a graphic displayed at a position indicated by an operation of a user is displayed on the operation stand 20; and a mode (hereinafter referred to as a “seventh mode”) in which, of the graphics representing the information A to H displayed on the guide display 10, a graphic corresponding to identification information indicated by an operation of a user is displayed on the operation stand 20. -
-
FIG. 18 is a view illustrating a screen display example displayed in the first mode of the public print processing in step 703 of FIG. 15 . When a user approaches the image processing apparatus 100 with the stand-by screen 101 displayed, the image processing apparatus 100 displays on the guide display 10 a public print screen 201 in which the graphics representing the information A to H are aligned in a vertical row. At this point, the image processing apparatus 100 displays part of the graphic that represents the lowermost information A in a selection candidate display area 211 which is an example of a first area of the operation stand 20. In this state, when a user touches the graphic representing the information A in the selection candidate display area 211 to perform a swipe operation in the direction indicated by a dashed line arrow 231, on the public print screen 201, the graphics representing the information B to H slide down to the respective positions at which the graphics representing the information A to G used to be, and the graphic representing the information A is moved to the position at which the graphic representing the information H used to be. While the graphics representing the information A to H are moved in this manner, when a graphic representing desired information appears in the selection candidate display area 211, a swipe operation performed by a user on the graphic representing the desired information in the direction indicated by a solid line arrow 241 causes the image processing apparatus 100 to display the graphic in a print target display area 221, the swipe operation being an example of a first operation, the print target display area 221 being an example of a second area of the operation stand 20. -
FIG. 19 is a view illustrating a screen display example displayed in the second mode of the public print processing in step 703 of FIG. 15 . When a user approaches the image processing apparatus 100 with the stand-by screen 101 of FIG. 4 displayed, the image processing apparatus 100 displays on the guide display 10 a public print screen 202 in which the graphics representing the information A to H are aligned in a circular manner. At this point, the image processing apparatus 100 displays part of the graphic that represents the lowermost information A in a selection candidate display area 212 which is an example of the first area of the operation stand 20. In this state, when a user touches the selection candidate display area 212 to perform a swipe operation in the direction indicated by a dashed line arrow 232, the circle including the graphics representing the information A to H rotates in the counterclockwise direction on the public print screen 202, the swipe operation being an example of a second operation. In other words, the graphic representing the information A is moved to the position at which the graphic representing the information H used to be, and the graphics representing the information B to H are moved to the respective positions at which the graphics representing the information A to G used to be. While the graphics representing the information A to H are moved in this manner, when a graphic representing desired information appears in the selection candidate display area 212, a swipe operation performed by a user on the graphic representing the desired information in the direction indicated by a solid line arrow 242 causes the image processing apparatus 100 to display the graphic in a print target display area 222, the swipe operation being an example of the first operation, the print target display area 222 being an example of the second area of the operation stand 20.
Although the solid line arrow 242 indicates a downward direction, and the dashed line arrow 232 indicates a rightward direction in the above, without being limited to this, the former may be a first direction, and the latter may be a second direction different from the first direction in a more generalized manner. -
FIG. 20 is a view illustrating a screen display example displayed in the third mode of the public print processing in step 703 of FIG. 15 . When a user approaches the image processing apparatus 100 with the stand-by screen 101 displayed, the image processing apparatus 100 displays on the guide display 10 a public print screen 203 in which the graphics representing the information A to H are aligned in a vertical row. At this point, the image processing apparatus 100 displays part of the graphic that represents the lowermost information A in a selection candidate display area 213 which is an example of the first area of the operation stand 20. In this state, when a user touches a print target display area 223 to perform a swipe operation in the direction indicated by a dashed line arrow 233, on the public print screen 203, the graphics representing the information B to H slide down to the respective positions at which the graphics representing the information A to G used to be, and the graphic representing the information A is moved to the position at which the graphic representing the information H used to be, the print target display area 223 being an example of the second area, the swipe operation being an example of the second operation. While the graphics representing the information A to H are moved in this manner, when a graphic representing desired information appears in the selection candidate display area 213, a swipe operation performed by a user on the graphic representing the desired information in the direction indicated by a solid line arrow 243 causes the image processing apparatus 100 to display the graphic in the print target display area 223, the swipe operation being an example of the first operation. -
FIG. 21 is a view illustrating a screen display example displayed in the fourth mode of the public print processing in step 703 of FIG. 15 . When a user approaches the image processing apparatus 100 with the stand-by screen 101 displayed, the image processing apparatus 100 displays on the guide display 10 a public print screen 204 in which the graphics representing the information A to H are aligned in a vertical row. At this point, the image processing apparatus 100 displays part of the graphic that represents the lowermost information A in a selection candidate display area 214 which is an example of the first area of the operation stand 20. Subsequently, the image processing apparatus 100 moves the graphics representing the information A to H without an operation of a user. In other words, when a predetermined time has elapsed, on the public print screen 204, the graphics representing the information B to H slide down to the respective positions at which the graphics representing the information A to G used to be, and the graphic representing the information A is moved to the position at which the graphic representing the information H used to be. During movement of the graphics representing the information A to H in this manner, when a graphic representing desired information appears in the selection candidate display area 214, a swipe operation, which is an example of the first operation, performed by a user on the graphic representing the desired information in the direction indicated by a solid line arrow 244 causes the image processing apparatus 100 to display the graphic in a print target display area 224 which is an example of the second area of the operation stand 20. -
FIG. 22 is a view illustrating a screen display example displayed in the fifth mode of the public print processing in step 703 of FIG. 15 . When a user approaches the image processing apparatus 100 with the stand-by screen 101 displayed, the image processing apparatus 100 displays on the guide display 10 a public print screen 205 in which the graphics representing the information A to H are aligned in a horizontal row. In this state, when a user touches the operation stand 20 to perform a swipe operation, which is an example of the second operation, in the direction indicated by a dashed line arrow 235, on the public print screen 205, the graphics representing the information B to H are moved to the respective positions at which the graphics representing the information A to G used to be, and the graphic representing the information A is moved to the position at which the graphic representing the information H used to be. While the graphics representing the information A to H are moved in this manner, when a graphic representing desired information is moved to the central position (indicated by enclosing with a thick line), a swipe operation, which is an example of the first operation, performed by a user on the graphic representing the desired information in the direction indicated by a solid line arrow 245 causes the image processing apparatus 100 to display the graphic on the operation stand 20. Although graphics are moved by a user performing a swipe operation in the rightward direction on the public print screen 205 in the description above, without being limited to this, for instance, when a predetermined time has elapsed, graphics may be moved without an operation of a user. Also, in the description above, when a graphic representing desired information is moved to the central position, a swipe operation performed by a user on the graphic causes the image processing apparatus 100 to display the graphic on the operation stand 20.
However, without being limited to this, when a graphic representing desired information is in a predetermined state, a swipe operation performed by a user on the graphic may cause the image processing apparatus 100 to display the graphic on the operation stand 20 in a more generalized manner. - Although the order in which the graphics representing the information A to H are aligned has not been mentioned in the first to fifth modes, graphics may be displayed, for instance, in descending order of degree desired by a user in the selection
candidate display areas 211 to 214 of the operation stand 20. Here, the degree desired by a user may be calculated based on attributes of a user, such as sex and age, which are obtained from an image of the user captured by the imager 60, for instance. - Also, in the public print processing in step 703 of
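The preference-based ordering mentioned above can be sketched as a sort keyed on a per-attribute score. The score table below is entirely hypothetical, standing in for whatever estimation the apparatus derives from the captured image.

```python
# Hypothetical sketch of ordering candidate graphics by the degree a
# user is estimated to desire them, given an attribute inferred from
# the imager's picture. The score table is invented for illustration.
PREFERENCE_SCORES = {
    ("adult", "news"): 0.9,
    ("adult", "coupons"): 0.4,
    ("child", "news"): 0.2,
    ("child", "coupons"): 0.8,
}

def order_candidates(attribute, graphics):
    """Return graphics sorted most-desired first for the attribute."""
    return sorted(graphics,
                  key=lambda g: PREFERENCE_SCORES.get((attribute, g), 0.0),
                  reverse=True)
```

Graphics with no registered score default to 0.0 and therefore sort last.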
FIG. 15 , the above-described sixth and seventh modes may also be considered as a method to move a graphic representing desired information to the operation stand 20 from the graphics representing the information A to H displayed on the guide display 10. - First, the sixth mode of the public print processing in step 703 of
FIG. 15 will be described. In the sixth mode, when a user performs an operation to indicate a position on the guide display 10, the image processing apparatus 100 identifies the position at which a graphic representing desired information is displayed, the graphic being one of the graphics representing the information A to H displayed on the guide display 10. Here, the “operation to indicate a position” includes glancing at the position at which a graphic representing desired information is displayed, and pointing at the position at which a graphic representing desired information is displayed. Accordingly, the image processing apparatus 100 changes the display mode of the graphic displayed at the identified position. For instance, in order to attract attention, the graphic displayed at the identified position is made greater than the other graphics or separated from the other graphics. Subsequently, when a swipe operation is performed by a user on the operation stand 20, the image processing apparatus 100 moves the graphic displayed at the identified position to the operation stand 20. - Next, the seventh mode of the public print processing in step 703 of
FIG. 15 will be described. In the seventh mode, when a user performs an operation to indicate identification information of a graphic on the guide display 10 with a finger, the image processing apparatus 100 identifies the identification information associated with the graphic representing desired information, the graphic being one of the graphics representing the information A to H displayed on the guide display 10. Here, when the identification information is a number, the “operation to indicate identification information with a finger” includes holding up that number of fingers. Accordingly, the image processing apparatus 100 changes the display mode of the graphic associated with the identified identification information. For instance, the graphic associated with the identified identification information is made greater than the other graphics or separated from the other graphics to attract attention. Subsequently, when a swipe operation is performed by a user on the operation stand 20, the image processing apparatus 100 moves the graphic associated with the identified identification information to the operation stand 20. - Alignment of the graphics representing the information A to H has not been mentioned in the sixth and seventh modes, and this is because one of the graphics on the
guide display 10 is directly identified by an operation of a user, and thus the order of the selection candidates among the graphics representing the information A to H does not have to be considered. -
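The seventh mode's selection reduces to mapping a gesture to one graphic. A minimal sketch, assuming identification numbers 1 to 8 are assigned to the graphics for the information A to H (an assignment the source implies but does not spell out):

```python
# Sketch of the seventh mode's selection: identification numbers 1-8
# are assumed to be assigned to the graphics for information A-H, and
# the count of raised fingers picks one (assignment is illustrative).
GRAPHICS = ["A", "B", "C", "D", "E", "F", "G", "H"]

def select_by_finger_count(finger_count: int):
    """Return the graphic whose identification number equals the
    number of fingers held up, or None when out of range."""
    if 1 <= finger_count <= len(GRAPHICS):
        return GRAPHICS[finger_count - 1]
    return None
```

Because the gesture names a graphic directly, no particular alignment of the candidates is needed, which is the point made above.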
FIG. 23 is a flowchart illustrating an operation example of the control device 70 when the public print processing in step 703 of FIG. 15 is performed in the first to third modes. - As illustrated, in the
control device 70, the display controller 71 first aligns and displays graphics representing information on the guide display 10 (step 801). Specifically, the graphics representing information are aligned and displayed in a vertical direction as in FIGS. 18 and 20 in the first and the third modes, and the graphics are aligned and displayed in a circular manner as in FIG. 19 in the second mode. - Subsequently, the
detection controller 73 determines whether or not the first swipe operation on the operation stand 20 has been detected by the operation detector 40 (step 802). Here, the first swipe operation refers to the swipe operation in the direction indicated by the dashed line arrow 231 in FIG. 18 in the first mode, the swipe operation in the direction indicated by the dashed line arrow 232 in FIG. 19 in the second mode, and the swipe operation in the direction indicated by the dashed line arrow 233 in FIG. 20 in the third mode. -
operation detector 40, thedetection controller 73 ends the processing. On the other hand, when it is determined the first swipe operation on the operation stand 20 has been detected by theoperation detector 40, thedisplay controller 71 moves and displays the graphic in theguide display 10, and theprojection controller 72 changes and displays the graphics in the selection candidate display area on the operation stand 20 using the projector 30 (step 803). Specifically, the graphics are moved in a vertical direction and displayed as inFIGS. 18 and 20 in the first and the third modes, and the graphics are moved in a circular direction and displayed as inFIG. 19 in the second mode. - Subsequently, the
detection controller 73 determines whether or not the second swipe operation on the operation stand 20 has been detected by the operation detector 40 (step 804). Here, the second swipe operation refers to the swipe operation in the direction indicated by the solid line arrow 241 of FIG. 18 in the first mode, the swipe operation in the direction indicated by the solid line arrow 242 of FIG. 19 in the second mode, and the swipe operation in the direction indicated by the solid line arrow 243 of FIG. 20 in the third mode. -
operation detector 40, thedetection controller 73 ends the processing. On the other hand, when it is determined the second swipe operation on the operation stand 20 has been detected by theoperation detector 40, theprojection controller 72 moves and displays the graphic from the selection candidate display area to the print target display area on the operation stand 20 using the projector 30 (step 805). - Subsequently, the
detection controller 73 determines whether pressing of the print button on the operation stand 20 is detected by the operation detector 40 (step 806). When it is determined that pressing of the print button on the operation stand 20 has not been detected by the operation detector 40, the detection controller 73 ends the processing, whereas when it is determined that pressing of the print button on the operation stand 20 has been detected by the operation detector 40, the print controller 74 performs control so that printing is made by the printer 50 (step 807). -
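The flow of FIG. 23 is a strict event sequence: each stage proceeds only if its expected operation is detected, and otherwise the processing ends. A hedged sketch of that sequence (the event and action labels are illustrative, not the embodiment's identifiers):

```python
# Sketch of the FIG. 23 event sequence (steps 801-807): a first swipe
# cycles the selection candidates, a second swipe moves the chosen
# graphic to the print target area, and the print button prints.
# Processing ends as soon as an expected event does not occur.
def public_print_flow(events) -> list[str]:
    actions = ["display_graphics"]                       # step 801
    expected = [("first_swipe", "cycle_candidates"),     # steps 802-803
                ("second_swipe", "move_to_print_area"),  # steps 804-805
                ("print_button", "print")]               # steps 806-807
    for (want, action), got in zip(expected, events):
        if got != want:
            return actions  # the detection controller ends the processing
        actions.append(action)
    return actions
```

The flows of FIGS. 24 and 25 differ only in the first trigger (a timer in the fourth mode, a horizontal swipe in the fifth) and would fit the same skeleton.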
FIG. 24 is a flowchart illustrating an operation example of the control device 70 when the public print processing in step 703 of FIG. 15 is performed in the fourth mode. - As illustrated, in the
control device 70, the display controller 71 first aligns and displays graphics representing information on the guide display 10 (step 821). Specifically, the graphics representing information are aligned and displayed in a vertical direction as in FIG. 21 . - Subsequently, the
display controller 71 determines whether or not a predetermined time has elapsed (step 822). - When it is determined that a predetermined time has not elapsed, the
display controller 71 ends the processing. On the other hand, when it is determined that a predetermined time has elapsed, the display controller 71 moves and displays the graphic in the guide display 10, and the projection controller 72 changes and displays the graphics in the selection candidate display area on the operation stand 20 using the projector 30 (step 823). Specifically, the graphics are moved in a vertical direction and displayed as in FIG. 21 . - Subsequently, the
detection controller 73 determines whether or not a swipe operation on the operation stand 20 has been detected by the operation detector 40 (step 824). Here, the swipe operation refers to the swipe operation in the direction indicated by the solid line arrow 244 in FIG. 21 . -
operation detector 40, thedetection controller 73 ends the processing. On the other hand, when it is determined the swipe operation on the operation stand 20 has been detected by theoperation detector 40, theprojection controller 72 moves and displays the graphic from the selection candidate display area to the print target display area on the operation stand 20 using the projector 30 (step 825). - Subsequently, the
detection controller 73 determines whether pressing of the print button on the operation stand 20 has been detected by the operation detector 40 (step 826). When it is determined that pressing of the print button on the operation stand 20 has not been detected by the operation detector 40, the detection controller 73 ends the processing, whereas when it is determined that pressing of the print button on the operation stand 20 has been detected by the operation detector 40, the print controller 74 performs control so that printing is performed by the printer 50 (step 827). -
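The fourth-mode flow of FIG. 24 above (timed scrolling of candidates, a swipe to move a graphic into the print target area, then printing) can be sketched as a small event-driven controller. All class, method, and graphic names here are hypothetical illustrations, not APIs from the patent; the candidate list stands in for the selection candidate display area on the operation stand 20.

```python
# Hypothetical sketch of the fourth-mode flow (FIG. 24): graphics scroll
# automatically when a predetermined time elapses, a swipe moves the front
# candidate into the print-target area, and the print button prints it.

class FourthModeController:
    def __init__(self):
        self.selection_candidates = ["photo_a", "photo_b", "photo_c"]
        self.print_targets = []   # graphics moved to the print target area
        self.printed = []         # graphics already sent to the printer

    def on_timer_elapsed(self):
        # Step 823: rotate the candidate list so a different graphic shows.
        self.selection_candidates.append(self.selection_candidates.pop(0))

    def on_swipe(self):
        # Step 825: move the front candidate to the print-target area.
        if self.selection_candidates:
            self.print_targets.append(self.selection_candidates.pop(0))

    def on_print_button(self):
        # Step 827: print everything queued in the print-target area.
        self.printed.extend(self.print_targets)
        self.print_targets.clear()
```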
FIG. 25 is a flowchart illustrating an operation example of the control device 70 when the public print processing in step 703 of FIG. 15 is performed in the fifth mode. - As illustrated, in the
control device 70, the display controller 71 first aligns and displays graphics representing information on the guide display 10 (step 841). Specifically, the graphics representing information are aligned and displayed in a horizontal direction as in FIG. 22. - Subsequently, the
detection controller 73 determines whether or not the first swipe operation on the operation stand 20 has been detected by the operation detector 40 (step 842). Here, the first swipe operation refers to the swipe operation in the direction indicated by the dashed line arrow 235 in FIG. 22. - When it is determined that the first swipe operation on the operation stand 20 has not been detected by the
operation detector 40, the detection controller 73 ends the processing. On the other hand, when it is determined that the first swipe operation on the operation stand 20 has been detected by the operation detector 40, the display controller 71 moves and displays the graphic in the guide display 10, and the projection controller 72 changes and displays the graphics in the selection candidate display area on the operation stand 20 using the projector 30 (step 843). Specifically, the graphics are moved in a horizontal direction and displayed as in FIG. 22. - Subsequently, the
detection controller 73 determines whether or not the second swipe operation on the operation stand 20 has been detected by the operation detector 40 (step 844). Here, the second swipe operation refers to the swipe operation in the direction indicated by the solid line arrow 245 of FIG. 22. - When it is determined that the second swipe operation on the operation stand 20 has not been detected by the
operation detector 40, the detection controller 73 ends the processing. On the other hand, when it is determined that the second swipe operation on the operation stand 20 has been detected by the operation detector 40, the display controller 71 and the projection controller 72 move the graphic displayed at the central position of the guide display 10 to the operation stand 20 (step 845). Specifically, the display controller 71 deletes the graphic displayed at the central position of the guide display 10, and the projection controller 72 displays the deleted graphic on the operation stand 20 using the projector 30. - Subsequently, the
detection controller 73 determines whether pressing of the print button on the operation stand 20 has been detected by the operation detector 40 (step 846). When it is determined that pressing of the print button on the operation stand 20 has not been detected by the operation detector 40, the detection controller 73 ends the processing, whereas when it is determined that pressing of the print button on the operation stand 20 has been detected by the operation detector 40, the print controller 74 performs control so that printing is performed by the printer 50 (step 847). -
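The fifth-mode flow of FIG. 25 above (a first swipe scrolling the guide display horizontally, a second swipe pulling the central graphic down to the operation stand, then printing) can be sketched similarly. The class, its methods, and the graphic names are invented for illustration; the patent defines no such API.

```python
# Hypothetical sketch of the fifth-mode flow (FIG. 25): a first swipe
# scrolls the graphics on the guide display; a second swipe deletes the
# graphic at the central position of the guide display and projects it on
# the operation stand; the print button prints the moved graphics.

class FifthModeController:
    def __init__(self, graphics):
        self.guide_graphics = list(graphics)        # on the guide display
        self.center = len(self.guide_graphics) // 2 # central position index
        self.stand_graphics = []                    # on the operation stand
        self.printed = []

    def on_first_swipe(self):
        # Step 843: scroll horizontally (rotate left by one position).
        self.guide_graphics.append(self.guide_graphics.pop(0))

    def on_second_swipe(self):
        # Step 845: remove the central graphic from the guide display and
        # display it on the operation stand instead.
        self.stand_graphics.append(self.guide_graphics.pop(self.center))

    def on_print_button(self):
        # Step 847: print the graphics moved to the operation stand.
        self.printed.extend(self.stand_graphics)
        self.stand_graphics.clear()
```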
FIG. 26 is a flowchart illustrating an operation example of the control device 70 that performs the public print processing in the sixth and seventh modes. - As illustrated, in the
control device 70, the imaging controller 75 first determines whether a predetermined operation of a user has been detected by the imager 60 (step 861). Here, the predetermined operation refers to designating a graphic on the guide display 10 by glancing or pointing at the graphic in the sixth mode, and designating a graphic on the guide display 10 by indicating the identification information of the graphic with a finger in the seventh mode. - When it is determined that a predetermined operation of a user has not been detected by the
imager 60, the imaging controller 75 ends the processing. On the other hand, when it is determined that a predetermined operation of a user has been detected by the imager 60, the display controller 71 displays the designated graphic on the guide display 10 in a changed display mode (step 862). Specifically, the designated graphic is made greater than the other graphics or separated from the other graphics to attract attention. - Subsequently, the
detection controller 73 determines whether or not a swipe operation on the operation stand 20 has been detected by the operation detector 40 (step 863). Here, the swipe operation refers to the swipe operation in a direction toward the near side of the operation stand 20, for instance. - When it is determined that the swipe operation on the operation stand 20 has not been detected by the
operation detector 40, the detection controller 73 ends the processing. On the other hand, when it is determined that the swipe operation on the operation stand 20 has been detected by the operation detector 40, the display controller 71 and the projection controller 72 move the designated graphic on the guide display 10 to the operation stand 20 (step 864). Specifically, the display controller 71 deletes the designated graphic on the guide display 10, and the projection controller 72 displays the deleted graphic on the operation stand 20 using the projector 30. - Subsequently, the
detection controller 73 determines whether pressing of the print button on the operation stand 20 has been detected by the operation detector 40 (step 865). When it is determined that pressing of the print button on the operation stand 20 has not been detected by the operation detector 40, the detection controller 73 ends the processing, whereas when it is determined that pressing of the print button on the operation stand 20 has been detected by the operation detector 40, the print controller 74 performs control so that printing is performed by the printer 50 (step 866). - The processing performed by the
control device 70 in the exemplary embodiment is provided, for instance, as a program such as application software. - Specifically, any program that implements the exemplary embodiment is considered to be a program that causes a computer to implement a function of displaying multiple display elements on the first display surface which is not touch-sensitive, and a function of displaying a specific display element selected from the multiple display elements displayed on the first display surface by an operation performed on the second display surface.
- Also, any program that implements the exemplary embodiment is considered to be a program that causes a computer to implement a function of detecting an operation of an operator, a function of displaying multiple display elements on the first display surface which is not touch-sensitive, and a function of displaying a specific display element on the second display surface according to an operation performed on the second display surface, the specific display element being selected from the multiple display elements displayed on the first display surface by the detected operation.
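The two program functions described in the paragraphs above can be illustrated with a minimal sketch. The function names and the dictionary-based display model are invented for illustration only; the patent does not define this interface.

```python
# Minimal illustration of the described program functions: displaying
# multiple display elements on a non-touch-sensitive first display surface,
# and displaying an element selected by an operation performed on the
# second display surface. All names here are hypothetical.

def display_elements_on_first_surface(first_display, elements):
    # The first display surface is not touch-sensitive; it only presents
    # the candidate display elements (e.g. graphics on the guide display).
    first_display["elements"] = list(elements)

def display_selected_on_second_surface(first_display, second_display, index):
    # An operation performed on the second display surface (e.g. a swipe on
    # the operation stand) selects one element shown on the first surface
    # and displays it on the second surface.
    selected = first_display["elements"].pop(index)
    second_display["elements"].append(selected)
    return selected
```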
- It is to be noted that any program that implements the exemplary embodiment may be provided not only by a communication unit, but also by a recording medium such as a CD-ROM that stores the program.
- The foregoing description of the exemplary embodiment of the present invention has been provided for the purposes of illustration and description. It is not intended to be exhaustive or to limit the invention to the precise forms disclosed. Obviously, many modifications and variations will be apparent to practitioners skilled in the art. The embodiment was chosen and described in order to best explain the principles of the invention and its practical applications, thereby enabling others skilled in the art to understand the invention for various embodiments and with the various modifications as are suited to the particular use contemplated. It is intended that the scope of the invention be defined by the following claims and their equivalents.
Claims (16)
Applications Claiming Priority (2)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
JP2017-219915 | 2017-11-15 | ||
JP2017219915A JP7143580B2 (en) | 2017-11-15 | 2017-11-15 | Display device and program |
Publications (1)
Publication Number | Publication Date |
---|---|
US20190146743A1 true US20190146743A1 (en) | 2019-05-16 |
Family
ID=66432069
Family Applications (1)
Application Number | Title | Priority Date | Filing Date |
---|---|---|---|
US16/182,622 Abandoned US20190146743A1 (en) | 2017-11-15 | 2018-11-07 | Display apparatus and non-transitory computer readable medium storing program |
Country Status (2)
Country | Link |
---|---|
US (1) | US20190146743A1 (en) |
JP (1) | JP7143580B2 (en) |
Citations (9)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
US20040021681A1 (en) * | 2002-07-30 | 2004-02-05 | Liao Chin-Hua Arthur | Dual-touch-screen mobile computer |
US20060013462A1 (en) * | 2004-07-15 | 2006-01-19 | Navid Sadikali | Image display system and method |
US20060034043A1 (en) * | 2004-08-10 | 2006-02-16 | Katsumi Hisano | Electronic device, control method, and control program |
US20130057487A1 (en) * | 2011-09-01 | 2013-03-07 | Sony Computer Entertainment Inc. | Information processing device adapted to receiving an input for user control using a touch pad and information processing method thereof |
US20140075377A1 (en) * | 2012-09-10 | 2014-03-13 | Samsung Electronics Co. Ltd. | Method for connecting mobile terminal and external display and apparatus implementing the same |
US20150212647A1 (en) * | 2012-10-10 | 2015-07-30 | Samsung Electronics Co., Ltd. | Head mounted display apparatus and method for displaying a content |
US20190109937A1 (en) * | 2011-11-04 | 2019-04-11 | Remote TelePointer, LLC | Method and system for user interface for interactive devices using a mobile device |
US20190115097A1 (en) * | 2011-11-23 | 2019-04-18 | Remedev, Inc. | Remotely-executed medical diagnosis and therapy including emergency automation |
US20190325847A1 (en) * | 2017-01-03 | 2019-10-24 | Samsung Electronics Co., Ltd. | Electronic device and displaying method thereof |
Family Cites Families (8)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
KR100747421B1 (en) * | 1999-10-20 | 2007-08-09 | 코닌클리케 필립스 일렉트로닉스 엔.브이. | Device and Method of browsing an image collection |
US8982070B2 (en) * | 2009-10-28 | 2015-03-17 | Nec Corporation | Portable information terminal |
JP5636678B2 (en) * | 2010-01-19 | 2014-12-10 | ソニー株式会社 | Display control apparatus, display control method, and display control program |
JP5621407B2 (en) * | 2010-08-20 | 2014-11-12 | 日本電気株式会社 | Operation input device, program and method |
JP5652432B2 (en) * | 2011-06-29 | 2015-01-14 | トヨタ自動車株式会社 | Vehicle control device |
JP5924518B2 (en) * | 2011-10-14 | 2016-05-25 | コニカミノルタ株式会社 | control panel |
JP2016122234A (en) * | 2014-12-24 | 2016-07-07 | 株式会社Nttドコモ | Wearable terminal and display control program |
JP2017058972A (en) * | 2015-09-16 | 2017-03-23 | レノボ・シンガポール・プライベート・リミテッド | Information processor, display method thereof, and program executable by computer |
- 2017-11-15 JP JP2017219915A patent/JP7143580B2/en active Active
- 2018-11-07 US US16/182,622 patent/US20190146743A1/en not_active Abandoned
Also Published As
Publication number | Publication date |
---|---|
JP2019091276A (en) | 2019-06-13 |
JP7143580B2 (en) | 2022-09-29 |
Similar Documents
Publication | Publication Date | Title |
---|---|---|
EP2535837B1 (en) | Connection control device establishing connection between portable type mobile terminal and information processing device by wireless communication | |
JP5974976B2 (en) | Information processing apparatus and information processing program | |
US9176683B2 (en) | Image information processing method, image information processing apparatus and computer-readable recording medium storing image information processing program | |
US11323582B2 (en) | Image reading apparatus capable of reading and displaying image of document placed on platen | |
US9420144B2 (en) | Image forming device to provide preview image for editing manuscript image, display apparatus to display and edit the preview image, and methods thereof | |
US9807258B1 (en) | Print data processing method of mobile device and the mobile device | |
US11233909B2 (en) | Display apparatus capable of displaying guidance information and non-transitory computer readable medium storing program | |
US11050892B2 (en) | Display apparatus and non-transitory computer readable medium storing program | |
JP6187063B2 (en) | Information processing apparatus, information processing system, information processing method, and program | |
US20140104639A1 (en) | Information processing apparatus and control method therefor, and print apparatus and control method therefor | |
JP2008009572A (en) | Document processing system, document processing method, and program | |
JP2013182624A (en) | Information processing apparatus and schedule displaying program | |
US9560241B2 (en) | Information processing apparatus, image processing method, and non-transitory computer readable medium | |
US11233911B2 (en) | Image processing apparatus and non-transitory computer readable medium for image processing | |
US20190146743A1 (en) | Display apparatus and non-transitory computer readable medium storing program | |
US10334125B2 (en) | Image forming apparatus with projector to display an image to be printed and related method | |
US20160044197A1 (en) | Method of scanning document and image forming apparatus for performing the same | |
JP6819132B2 (en) | Image processing equipment and image processing program | |
US9900455B2 (en) | Method of scanning document and image forming apparatus for performing the same | |
JP2014030080A (en) | Image processing apparatus and image processing method | |
US20240064249A1 (en) | Information processing apparatus, information processing method, and non-transitory computer readable medium | |
JP2019018528A (en) | Image processing device and image processing program | |
JP6705988B2 (en) | Information processing system, control method thereof, and program | |
JP2017011399A (en) | Image processing apparatus, image forming apparatus, recording medium, and program | |
JP2017011627A (en) | Image processor and program |
Legal Events

Date | Code | Title | Description
---|---|---|---
| AS | Assignment | Owner name: FUJI XEROX CO., LTD., JAPAN; Free format text: ASSIGNMENT OF ASSIGNORS INTEREST; ASSIGNORS: IKEDA, ERIKO; ZHANG, XIAOJING; SEKI, HIROO; REEL/FRAME: 047492/0136; Effective date: 20180713
| STCT | Information on status: administrative procedure adjustment | Free format text: PROSECUTION SUSPENDED
| STPP | Information on status: patent application and granting procedure in general | Free format text: DOCKETED NEW CASE - READY FOR EXAMINATION
| STPP | Information on status: patent application and granting procedure in general | Free format text: NON FINAL ACTION MAILED
| STPP | Information on status: patent application and granting procedure in general | Free format text: RESPONSE TO NON-FINAL OFFICE ACTION ENTERED AND FORWARDED TO EXAMINER
| AS | Assignment | Owner name: FUJIFILM BUSINESS INNOVATION CORP., JAPAN; Free format text: CHANGE OF NAME; ASSIGNOR: FUJI XEROX CO., LTD.; REEL/FRAME: 056294/0305; Effective date: 20210401
| STPP | Information on status: patent application and granting procedure in general | Free format text: FINAL REJECTION MAILED
| STPP | Information on status: patent application and granting procedure in general | Free format text: DOCKETED NEW CASE - READY FOR EXAMINATION
| STPP | Information on status: patent application and granting procedure in general | Free format text: NON FINAL ACTION MAILED
| STPP | Information on status: patent application and granting procedure in general | Free format text: RESPONSE TO NON-FINAL OFFICE ACTION ENTERED AND FORWARDED TO EXAMINER
| STPP | Information on status: patent application and granting procedure in general | Free format text: FINAL REJECTION MAILED
| STPP | Information on status: patent application and granting procedure in general | Free format text: DOCKETED NEW CASE - READY FOR EXAMINATION
| STPP | Information on status: patent application and granting procedure in general | Free format text: NON FINAL ACTION MAILED
| STPP | Information on status: patent application and granting procedure in general | Free format text: RESPONSE TO NON-FINAL OFFICE ACTION ENTERED AND FORWARDED TO EXAMINER
| STPP | Information on status: patent application and granting procedure in general | Free format text: FINAL REJECTION MAILED
| STPP | Information on status: patent application and granting procedure in general | Free format text: FINAL REJECTION MAILED
| STPP | Information on status: patent application and granting procedure in general | Free format text: ADVISORY ACTION MAILED
| STCB | Information on status: application discontinuation | Free format text: ABANDONED -- FAILURE TO RESPOND TO AN OFFICE ACTION