US20080291283A1 - Image processing apparatus and control method thereof

Image processing apparatus and control method thereof

Info

Publication number
US20080291283A1
US20080291283A1 (application US 11/864,385)
Authority
US
United States
Prior art keywords
display
unit
mobile information
information terminal
menu
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Abandoned
Application number
US11/864,385
Other languages
English (en)
Inventor
Ken Achiwa
Hideyasu Kinbara
Shigeki Hasui
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Canon Inc
Original Assignee
Canon Inc
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Canon Inc filed Critical Canon Inc
Assigned to CANON KABUSHIKI KAISHA (see document for details). Assignors: ACHIWA, KEN; HASUI, SHIGEKI; KINBARA, HIDEYASU
Publication of US20080291283A1
Priority to US 14/844,933 (US10318076B2)
Legal status: Abandoned

Classifications

    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06F ELECTRIC DIGITAL DATA PROCESSING
    • G06F3/00 Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
    • G06F3/01 Input arrangements or combined input and output arrangements for interaction between user and computer
    • G06F3/03 Arrangements for converting the position or the displacement of a member into a coded form
    • G06F3/041 Digitisers, e.g. for touch screens or touch pads, characterised by the transducing means
    • G06F3/0416 Control or interface arrangements specially adapted for digitisers
    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06F ELECTRIC DIGITAL DATA PROCESSING
    • G06F3/00 Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
    • G06F3/01 Input arrangements or combined input and output arrangements for interaction between user and computer
    • G06F3/017 Gesture based interaction, e.g. based on a set of recognized hand gestures
    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06F ELECTRIC DIGITAL DATA PROCESSING
    • G06F3/00 Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
    • G06F3/01 Input arrangements or combined input and output arrangements for interaction between user and computer
    • G06F3/03 Arrangements for converting the position or the displacement of a member into a coded form
    • G06F3/041 Digitisers, e.g. for touch screens or touch pads, characterised by the transducing means
    • G06F3/0412 Digitisers structurally integrated in a display
    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06F ELECTRIC DIGITAL DATA PROCESSING
    • G06F3/00 Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
    • G06F3/01 Input arrangements or combined input and output arrangements for interaction between user and computer
    • G06F3/03 Arrangements for converting the position or the displacement of a member into a coded form
    • G06F3/041 Digitisers, e.g. for touch screens or touch pads, characterised by the transducing means
    • G06F3/042 Digitisers, e.g. for touch screens or touch pads, characterised by the transducing means by opto-electronic means
    • G06F3/0425 Digitisers, e.g. for touch screens or touch pads, characterised by the transducing means by opto-electronic means using a single imaging device like a video camera for tracking the absolute position of a single or a plurality of objects with respect to an imaged reference surface, e.g. video camera imaging a display or a projection screen, a table or a wall surface, on which a computer generated image is displayed or projected
    • H ELECTRICITY
    • H04 ELECTRIC COMMUNICATION TECHNIQUE
    • H04N PICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N1/00 Scanning, transmission or reproduction of documents or the like, e.g. facsimile transmission; Details thereof
    • H04N1/00127 Connection or combination of a still picture apparatus with another apparatus, e.g. for storage, processing or transmission of still picture signals or of information associated with a still picture
    • H04N1/00204 Connection or combination of a still picture apparatus with another apparatus, e.g. for storage, processing or transmission of still picture signals or of information associated with a still picture with a digital computer or a digital computer system, e.g. an internet server
    • H ELECTRICITY
    • H04 ELECTRIC COMMUNICATION TECHNIQUE
    • H04N PICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N1/00 Scanning, transmission or reproduction of documents or the like, e.g. facsimile transmission; Details thereof
    • H04N1/0035 User-machine interface; Control console
    • H ELECTRICITY
    • H04 ELECTRIC COMMUNICATION TECHNIQUE
    • H04N PICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N1/00 Scanning, transmission or reproduction of documents or the like, e.g. facsimile transmission; Details thereof
    • H04N1/0035 User-machine interface; Control console
    • H04N1/00405 Output means
    • H04N1/00408 Display of information to the user, e.g. menus
    • H04N1/00411 Display of information to the user, e.g. menus, the display also being used for user input, e.g. touch screen
    • H ELECTRICITY
    • H04 ELECTRIC COMMUNICATION TECHNIQUE
    • H04N PICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N2101/00 Still video cameras
    • H ELECTRICITY
    • H04 ELECTRIC COMMUNICATION TECHNIQUE
    • H04N PICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N2201/00 Indexing scheme relating to scanning, transmission or reproduction of documents or the like, and to details thereof
    • H04N2201/0008 Connection or combination of a still picture apparatus with another apparatus
    • H04N2201/0034 Details of the connection, e.g. connector, interface
    • H04N2201/0037 Topological details of the connection
    • H04N2201/0041 Point to point
    • H ELECTRICITY
    • H04 ELECTRIC COMMUNICATION TECHNIQUE
    • H04N PICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N2201/00 Indexing scheme relating to scanning, transmission or reproduction of documents or the like, and to details thereof
    • H04N2201/0008 Connection or combination of a still picture apparatus with another apparatus
    • H04N2201/0034 Details of the connection, e.g. connector, interface
    • H04N2201/0048 Type of connection
    • H04N2201/0055 By radio
    • H ELECTRICITY
    • H04 ELECTRIC COMMUNICATION TECHNIQUE
    • H04N PICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N2201/00 Indexing scheme relating to scanning, transmission or reproduction of documents or the like, and to details thereof
    • H04N2201/0008 Connection or combination of a still picture apparatus with another apparatus
    • H04N2201/0034 Details of the connection, e.g. connector, interface
    • H04N2201/0048 Type of connection
    • H04N2201/0058 Docking-station, cradle or the like

Definitions

  • The present invention relates to an image processing apparatus and a control method thereof and, more particularly, to an image processing apparatus which has a display input function of generating an input signal upon contact with a display screen, and a communication function with a mobile information terminal, and a control method thereof.
  • In a wired connection, the connection state between a mobile information terminal and an image processing apparatus is clear.
  • In wireless communications, however, such a connection state between devices often becomes unclear.
  • In wireless communications by means of Bluetooth, since an effective distance of about 10 m is normally allowed, a plurality of apparatuses having an identical wireless communication function may exist within a communication range in some cases. In such cases, it is difficult to determine with which communication apparatus the mobile information terminal can communicate in practice.
  • Likewise, a plurality of mobile information terminals having an identical wireless communication function may exist within a communication range in some cases. In such cases, it is difficult for the image processing apparatus side to automatically determine with which mobile information terminal it is to communicate.
  • One solution is proposed in Japanese Patent Laid-Open No. 2005-252564. That is, with this method, when a plurality of mobile information terminals which can communicate with a communication apparatus exist, a list of mobile information terminals which allow wireless communications from the communication apparatus is displayed on a display unit of the communication apparatus, and the user manually selects a destination from that list. According to this method, the communication apparatus stores device information of mobile information terminals with which it established communications previously, and displays information of these terminals on the display unit.
  • However, since each mobile information terminal is specified by a simple number or the like, even when a list of mobile information terminals that allow communications is displayed, it is not easy for the user to identify a desired terminal.
  • In another method, the communication apparatus side automatically detects a mobile information terminal placed on a mobile information terminal putting space, and limits the communication partner to that terminal.
  • In one such arrangement, the wireless antenna of the communication apparatus is configured to have a small size so as not to communicate with mobile information terminals other than the desired terminal.
  • Alternatively, a method of controlling the transmission power or reception sensitivity upon wireless communication so as not to communicate with mobile information terminals other than the desired terminal is available (Japanese Patent Laid-Open No. 2001-128246). More specifically, a part of the communication apparatus around the wireless antenna is shielded so as to control the communication apparatus to receive a radio wave in only a specific direction.
  • With the list display method described above, the communication apparatus does not have any function of automatically determining a communication partner; instead, it has an assistant function of displaying, on the display unit, mobile information terminals that can make wireless communications with the communication apparatus, and prompting the user to select a desired terminal. That is, since the user himself or herself confirms the mobile information terminals displayed on the display unit and manually selects a desired one, the communication apparatus must comprise a user interface such as a display unit.
  • In general, when a novel function is added to an apparatus, a dedicated device for the novel function is attached to the apparatus in an initial stage.
  • Later, an expanded device tends to be produced by incorporating the novel function into an existing device of the apparatus.
  • For example, a liquid crystal touch panel, prepared by integrating a liquid crystal display (display device) and a contact sensor (manipulation device), has prevailed.
  • Such integration may provide the following two merits. First, since a new dedicated device need not be attached to the existing apparatus, the physical installation space of the apparatus can be reduced. Second, since the user need not alternately handle a plurality of dedicated devices attached to a single apparatus, this contributes to a reduction of the operation load on the user and a shorter operation time.
  • Similarly, a communication device used as the mobile information terminal putting space on the communication apparatus may be integrated with a display device such as a liquid crystal touch panel. That is, by implementing a high-speed wireless communication function as an interface with a mobile information terminal and a large-screen liquid crystal touch panel display function as a user interface at the same time, the installation space of the apparatus can be reduced, and the user can make more intuitive operations.
  • With such integration, however, a mobile information terminal may be placed on function selection menu buttons displayed on the touch panel UI.
  • In that case, the communication apparatus side may misinterpret "placement of a mobile information terminal" as "pressing of a function selection menu button". With this misinterpretation, switching of the menu display is not always done appropriately.
  • Furthermore, when a mobile information terminal with a relatively large size, such as a mobile phone or PDA which has a large screen and full keyboard, or a notebook type personal computer, is placed on the touch panel UI, it physically covers the touch panel UI surface. As a result, the display area narrows down, resulting in poor operability.
  • The present invention enables realization of an image processing apparatus which generates an input signal when a mobile information terminal contacts its display surface, and makes a wireless communication with the mobile information terminal that made the contact, wherein the apparatus checks whether the object with which it is in contact is a mobile information terminal, and makes an appropriate display according to the checking result.
  • One aspect of the present invention provides an image processing apparatus, which comprises a display input unit adapted to generate an input signal when an object contacts a display surface that displays an image, and a wireless communication unit adapted to make a wireless communication with a mobile information terminal which contacts the display surface of the display input unit, and executes processing of image data input from the mobile information terminal, the apparatus comprising: a discrimination unit adapted to discriminate, when the display input unit generates an input signal, whether or not the object which contacts the display surface is the mobile information terminal; and a display control unit adapted to control display contents on the display input unit in accordance with a discrimination result of the discrimination unit.
  • Another aspect of the present invention provides a method of controlling an image processing apparatus, which comprises a display input unit adapted to generate an input signal when an object contacts a display surface that displays an image, and a wireless communication unit adapted to make a wireless communication with a mobile information terminal which contacts the display surface of the display input unit, and executes processing of image data input from the mobile information terminal, the method comprising the steps of: discriminating, when the display input unit generates an input signal, whether or not the object which contacts the display surface is the mobile information terminal; and controlling display contents on the display input unit in accordance with a discrimination result in the discriminating step.
  • Still another aspect of the present invention provides a computer program for making a computer execute a method of controlling an image processing apparatus, which comprises a display input unit adapted to generate an input signal when an object contacts a display surface that displays an image, and a wireless communication unit adapted to make a wireless communication with a mobile information terminal which contacts the display surface of the display input unit, and executes processing of image data input from the mobile information terminal, the program comprising the steps of: discriminating, when the display input unit generates an input signal, whether or not the object which contacts the display surface is the mobile information terminal; and controlling display contents on the display input unit in accordance with a discrimination result in the discriminating step.
  • Yet another aspect of the present invention provides a computer-readable recording medium recording the above computer program.
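The discrimination and display-control flow of the aspects above can be sketched in a few lines. This is only an illustrative sketch: the function names, the contact-area threshold, and the use of a wireless probe as the discrimination criterion are assumptions, not details taken from this application.

```python
# Sketch of the claimed control flow: when the display input unit
# generates an input signal, discriminate whether the contacting
# object is a mobile information terminal, then switch the display
# contents accordingly. All names and thresholds are illustrative.

FINGER_MENU = "general function menu"
TERMINAL_MENU = "processing menu for terminal data"

def is_mobile_terminal(contact_area_cm2, responds_to_wireless_probe):
    """Discrimination unit (hypothetical criterion): a fingertip has a
    small contact area and gives no wireless response; a placed
    terminal covers a larger area and answers a wireless probe."""
    return contact_area_cm2 > 4.0 and responds_to_wireless_probe

def on_input_signal(contact_area_cm2, responds_to_wireless_probe):
    """Display control unit: choose the display contents according to
    the discrimination result."""
    if is_mobile_terminal(contact_area_cm2, responds_to_wireless_probe):
        return TERMINAL_MENU
    return FINGER_MENU
```

A small contact (a finger press) leaves the general menu in place, while a large contact that also answers a wireless probe is treated as a placed terminal and switches the display to the terminal processing menu.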
  • FIG. 1 is a view showing an overview of a conventional image processing apparatus.
  • FIG. 2 is a view showing an overview of an image processing apparatus according to one embodiment of the present invention.
  • FIG. 3 is a block diagram showing the hardware arrangement of the image processing apparatus according to the first embodiment.
  • FIG. 4 shows a piconet configuration example of Bluetooth according to the first embodiment.
  • FIG. 5 is a schematic block diagram showing the arrangement of a wireless communication unit according to the first embodiment.
  • FIG. 6 is a partial block diagram showing the arrangement of an RF unit according to the first embodiment.
  • FIG. 7 is a flowchart showing the operation of the image processing apparatus according to the first embodiment.
  • FIG. 8 is a flowchart showing an operation example in a communication mode according to the first embodiment.
  • FIG. 9 is a flowchart showing an operation example in a manipulation mode according to the first embodiment.
  • FIG. 10 is a view showing an overview of operations in respective operation modes according to the first embodiment.
  • FIG. 11 is a flowchart showing the operation of an image processing apparatus according to the second embodiment.
  • FIG. 12 is a view showing an overview of operations in respective operation modes according to the second embodiment.
  • FIG. 13 is a flowchart showing the operation of an image processing apparatus according to the third embodiment.
  • FIG. 14 is a view showing an overview of operations in respective operation modes according to the third embodiment.
  • FIG. 15 is a block diagram showing the hardware arrangement of an image processing apparatus according to the fourth embodiment.
  • FIG. 16 is a flowchart showing the display processing in a communication mode according to the fourth embodiment.
  • FIG. 17 shows a default general menu display example according to the fourth embodiment.
  • FIG. 18 shows a display transition example upon placing a mobile information terminal according to the fourth embodiment.
  • FIG. 19 shows a display transition example upon placing a plurality of mobile information terminals according to the fourth embodiment.
  • FIG. 20 shows a manipulation menu display example upon placing a relatively large mobile information terminal in the first to fourth embodiments.
  • FIG. 21 is a block diagram showing the hardware arrangement of an image processing apparatus according to the fifth embodiment.
  • FIG. 22 is a flowchart showing the display processing in a communication mode according to the fifth embodiment.
  • FIG. 23 shows a manipulation menu display transition example upon placing a mobile information terminal according to the fifth embodiment.
  • FIG. 24 shows another manipulation menu display transition example upon placing a mobile information terminal according to the fifth embodiment.
  • An image processing apparatus according to the embodiments has a function of making wireless communications with a mobile information terminal by, e.g., Bluetooth, and limits the mobile information terminal serving as a communication partner by providing a putting space for the terminal.
  • The image processing apparatus is characterized in that the display (a liquid crystal touch panel) and the mobile information terminal putting space, which are configured separately in a conventional image processing apparatus having such functions, are integrated.
  • FIG. 1 is a view showing an overview of the conventional image processing apparatus.
  • An image processing apparatus 100 shown in FIG. 1 has a display 101 and mobile information terminal putting space 102 as independent components.
  • the display 101 has a touch panel function, and is used to display image print menus, images to be printed, and the like, and to input user's manipulation inputs.
  • the mobile information terminal putting space 102 is a space where a mobile information terminal 200 is placed. When the mobile information terminal 200 is placed on this putting space 102 , it is detected by a sensor (not shown).
  • the display 101 displays processing menus for data to be input from the mobile information terminal 200 . Then, the user selects a desired process from the menus on the display 101 , and presses it with a user's finger 300 , thus determining the process.
  • FIG. 2 shows an overview of an image processing apparatus according to this embodiment.
  • An expanded interface 104 shown in FIG. 2 corresponds to an integration of the display 101 and mobile information terminal putting space 102 shown in FIG. 1; in other respects, the image processing apparatus 100 has the same arrangement as that shown in FIG. 1.
  • the expanded interface 104 has a touch panel function, and is used to display image print menus, images to be printed, and the like, and to input user's manipulation inputs.
  • the expanded interface 104 also serves as a mobile information terminal putting space. When the mobile information terminal 200 , the user's finger 300 , or the like is placed on the expanded interface 104 , it is detected by a sensor (not shown).
  • As the detection method of this sensor, a plurality of modes are available. For example, a method of detecting reflected light by a reflective optical sensor, a method of detecting an interception of light by a transmissive optical sensor, a method of detecting a weight or pressure acting on the putting space using a weight sensor, a method of detecting a reflected wave of a transmitted ultrasonic wave by an ultrasonic sensor, and the like may be used.
  • In addition, a method of discriminating a video picture captured by a camera (visible sensor) by applying image processing to it, a method of detecting a change in eddy current generated by electromagnetic induction using an induction type proximity sensor, a method of detecting a change in electric capacitance using a capacitance type proximity sensor, and the like can be adopted.
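As one illustrative way to combine two of the sensing modes listed above, a reflective optical reading and a weight reading could be fused to decide whether something is placed on the putting space. The thresholds and function names below are hypothetical, not values from this application.

```python
def object_present(reflected_light, weight_grams,
                   light_threshold=0.5, weight_threshold=20.0):
    """Fuse two of the listed sensing modes: normalized reflected-light
    intensity (reflective optical sensor) and weight (weight sensor).
    An object is reported present if either reading exceeds its
    threshold. Thresholds are illustrative assumptions."""
    return reflected_light > light_threshold or weight_grams > weight_threshold
```

Combining independent sensing modes this way reduces false negatives, since a dark object that reflects little light can still be detected by its weight, and vice versa.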
  • the expanded interface 104 displays a processing menu for data to be input from the mobile information terminal 200 .
  • When the user selects a desired process from the menu and presses it with the finger 300, that process is determined.
  • FIG. 3 is a block diagram showing the hardware arrangement of the image processing apparatus 100 according to this embodiment.
  • As the image processing apparatus 100, a multi-functional peripheral (MFP) which comprises a copy function, printer function, FAX function, and scanner function is assumed.
  • the image processing apparatus 100 comprises a controller unit 201 which controls the overall apparatus 100 .
  • the image processing apparatus 100 includes a card reader/writer unit 202 , a display unit 203 and manipulation unit 204 , a mobile information terminal detection unit 205 , and a wireless communication unit 206 .
  • the image processing apparatus 100 has a printer unit 207 , scanner unit 208 , image communication unit 209 , image processing unit 210 , and memory unit 211 .
  • the card reader/writer unit 202 allows a communication with a non-contact IC such as RFID or the like.
  • When a non-contact ID card that records in advance the login information required to log in to the image processing apparatus 100 is located close to the card reader/writer unit 202, the card reader/writer unit 202 receives the login information from the non-contact IC inside the ID card. Also, the user can manually send information required to log in to the apparatus to the non-contact ID card, and can write it in the non-contact ID card.
  • the display unit 203 is a block which displays operation instructions, a print preview of an image to be printed, and the like to the user, and comprises, e.g., a liquid crystal panel or the like as a practical example.
  • the manipulation unit 204 is a block that makes the user select an operation by means of a key operation, and provides a user interface used to manipulate the image processing apparatus 100 .
  • a liquid crystal touch panel is implemented by integrating the display unit 203 and manipulation unit 204 , and the display 101 shown in FIG. 1 is an example obtained by integrating these units.
  • the mobile information terminal detection unit 205 is a block which detects if the mobile information terminal 200 is placed at a predetermined position, i.e., on the expanded interface 104 in the arrangement shown in FIG. 2 , and is implemented as the aforementioned sensor by various methods.
  • the wireless communication unit 206 is a block which makes data communications with a wireless communication device such as the mobile information terminal 200 or the like by a wireless communication system such as Bluetooth, a wireless LAN, or the like, and comprises an antenna unit, RF unit, and baseband unit.
  • the wireless communication unit 206 makes a communication required to detect the mobile information terminal 200 . More specifically, the wireless communication unit 206 acquires information required to identify the mobile information terminal 200 using a wireless communication method to be described later. After detection of the mobile information terminal 200 , the wireless communication unit 206 makes data communications with the mobile information terminal 200 according to a user's operation so as to execute predetermined processing.
  • the wireless communication unit 206 makes communications to receive print data sent from the mobile information terminal 200 , and to send data stored in the memory unit 211 to the mobile information terminal 200 .
  • The kind of processing to be executed is determined by the contents that the user instructs, after the detection, from a processing menu window displayed on the display unit of the image processing apparatus or from a menu window displayed on the screen of the mobile information terminal.
  • the display unit 203 , manipulation unit 204 , mobile information terminal detection unit 205 , and wireless communication unit 206 are integrated to implement a wireless communication port, which is mounted back to back to the liquid crystal touch panel. That is, the expanded interface 104 shown in FIG. 2 is prepared by integrating the display unit 203 , manipulation unit 204 , mobile information terminal detection unit 205 , and wireless communication unit 206 . Therefore, the expanded interface 104 detects if the mobile information terminal 200 or the user's finger 300 is placed on the liquid crystal touch panel. Especially, upon detection of the mobile information terminal 200 , the expanded interface 104 can make communications with it.
  • the printer unit 207 is a block which prints an electrical image signal on a print sheet as a visible image, and comprises a laser beam printer or ink-jet printer.
  • the scanner unit 208 is a block which optically scans an original image and converts it into an electrical image signal, and comprises a contact image sensor, scan drive unit, scan illumination control unit, and the like.
  • the scan illumination control unit executes illumination control of an LED inside the contact image sensor.
  • a photosensor in the contact image sensor optically scans an original image, and converts it into an electrical image signal.
  • the image communication unit 209 is a block which exchanges data with an external device.
  • The image communication unit 209 connects to the Internet or a LAN, makes a FAX communication by connecting to a public telephone line, or connects to a personal computer (PC) via a USB interface.
  • the image processing unit 210 is a block which executes scan image processing, communication image processing, and print image processing.
  • The scan image processing applies shading correction and the like to image data received from the scanner unit 208, and executes gamma processing, binarization processing, halftone processing, and color conversion processing such as RGB → CMYK, thus converting the image data into high-resolution image data.
  • the print image processing applies resolution conversion to image data in correspondence with a print resolution.
  • the print image processing applies various kinds of image processing such as variable scaling, smoothing, density correction, and the like of an image, thus converting image data into high-resolution image data, and outputting the converted data to a laser beam printer or the like.
  • the communication image processing applies resolution conversion, color conversion, and the like according to the communication performance to the scanned image, and applies resolution conversion or the like according to the print performance to an image received via a communication.
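The RGB → CMYK color conversion mentioned in the scan image processing can be illustrated with the common gray-component-replacement formula. This is a naive sketch, not the apparatus's actual conversion; a real MFP pipeline would use calibrated device color profiles.

```python
def rgb_to_cmyk(r, g, b):
    """Naive RGB (components in 0..1) to CMYK conversion using
    gray-component replacement: extract the black (K) channel first,
    then scale the remaining color. Illustrative only; production
    pipelines use device color profiles."""
    k = 1.0 - max(r, g, b)
    if k >= 1.0:                       # pure black input
        return 0.0, 0.0, 0.0, 1.0
    c = (1.0 - r - k) / (1.0 - k)
    m = (1.0 - g - k) / (1.0 - k)
    y = (1.0 - b - k) / (1.0 - k)
    return c, m, y, k
```

For example, white maps to no ink at all, while pure red maps to full magenta and yellow with no cyan or black.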
  • the memory unit 211 is a memory device such as a DDR-SDRAM, HDD, or the like, and not only temporarily stores image data but also stores control programs, data, and the like used by the controller unit 201 so as to implement the functions of the image processing apparatus.
  • the controller unit 201 controls the overall image processing apparatus 100 , is electrically connected to respective blocks such as the printer unit 207 , scanner unit 208 , and the like, and executes control to implement advanced functions. More specifically, the controller unit 201 serves as discrimination means, display control means, terminal type identification means, and terminal position specification means in addition to control of the aforementioned blocks by controlling the connected blocks. For example, the controller unit 201 discriminates whether or not an object placed on the expanded interface 104 is a mobile information terminal. Also, the controller unit 201 controls the display contents on the expanded interface 104 according to the discrimination result. The controller unit 201 identifies the type of the mobile information terminal placed on the expanded interface 104 , and controls to display a processing menu window according to the identified type.
  • the controller unit 201 specifies the contact position of the mobile information terminal placed on the expanded interface 104 .
  • the controller unit 201 controls the scanner unit 208 to scan original image data so as to implement the scan function, and also controls the printer unit 207 to output image data onto a print sheet so as to implement the copy function.
  • the controller unit 201 provides a scanner function that sends image data scanned by the scanner unit 208 onto a network via the image communication unit 209 .
  • the controller unit 201 provides a printer function that converts code data received from a network or the like via the image communication unit 209 into image data, and outputs the image data to the printer unit 207 .
  • The controller unit 201 automatically logs the user in to the image processing apparatus 100 using user ID information received from the non-contact ID card via the card reader/writer unit 202.
  • the controller unit 201 controls the display contents to display, on the display unit 203 , a list of mobile information terminals 200 which allow wireless communications by the wireless communication unit 206 .
  • the controller unit 201 controls the communication performance of the wireless communication unit 206 according to various states.
  • Bluetooth is a short-distance wireless communication scheme aimed at wireless connection of electronic devices, and uses the 2.4-GHz ISM frequency band, which is assigned for license-free use in industry, science and technology, and medical services. Since Bluetooth devices are compact and lightweight, low-priced, and consume little power, Bluetooth is popularly adopted as a standard for mobile information terminals.
  • a basic network arrangement of Bluetooth is a point-to-multipoint connection between one master device and a maximum of seven slave devices, and is called a “piconet”. Furthermore, when a given slave serves as a master of another piconet, or a master serves as a slave of still another piconet, a large-scale network called a “scatternet”, which interconnects piconets, can be configured. Such a wireless network can easily be built or released automatically or semi-automatically. For this reason, this network can be used for ad hoc communication.
  • FIG. 4 shows a configuration example of a piconet of Bluetooth.
  • a master 301 is an image processing apparatus which serves as a master of the piconet.
  • Slaves 303 and 304 are mobile information terminals which are not placed on the mobile information terminal putting space 202 , and are located within a communication zone 307 for normal communication quality, within which communications are disabled at times of lower communication quality (to be described later) but are enabled at times of normal communication quality.
  • a slave 302 is a mobile information terminal which is placed on the mobile information terminal putting space 202 , and is located within a communication zone 306 for lower communication quality, within which communications are enabled at the time of not only normal communication quality but also lower communication quality.
  • a slave 305 is an image processing apparatus which is different from the master 301 , and can execute one job as a slave of the master 301 in this piconet by, e.g., collaboration of the two image processing apparatuses.
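  • The piconet arrangement above can be sketched as follows; this minimal Python model (the class and device names are hypothetical, not part of the apparatus) captures one master with at most seven active slaves addressed 1 to 7:

```python
class Piconet:
    """Minimal sketch of a Bluetooth piconet: one master, up to seven
    active slaves, each identified by an active member address (1..7)."""

    MAX_ACTIVE_SLAVES = 7

    def __init__(self, master):
        self.master = master
        self.slaves = {}  # active member address -> device name

    def join(self, device):
        """Assign the lowest free active member address to a joining slave."""
        if len(self.slaves) >= self.MAX_ACTIVE_SLAVES:
            raise RuntimeError("piconet full: at most seven active slaves")
        addr = min(set(range(1, 8)) - set(self.slaves))
        self.slaves[addr] = device
        return addr

# Example mirroring FIG. 4: the apparatus as master, a terminal as slave
net = Piconet("image processing apparatus 301")
net.join("mobile information terminal 302")
```

A scatternet would then be modeled by letting one device appear as a slave in one `Piconet` instance and as the master of another.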
  • FIG. 5 shows a schematic arrangement of the wireless communication unit 206 .
  • the wireless communication unit 206 is roughly classified into an antenna unit 401 , RF unit 402 , and baseband unit 403 , and respective units will be briefly described below.
  • the baseband unit 403 exchanges data with the controller unit 201 , and establishes a communication link required to exchange data via wireless communications. At the same time, the baseband unit 403 provides, e.g., an operation management function of packet re-transmission, error correction, and frequency hopping executed by the RF unit 402 , and transmits a wireless quality control signal to the RF unit 402 in response to an instruction from the controller unit 201 .
  • the RF unit 402 comprises a transmission processing unit 431 , reception processing unit 421 , frequency synthesizer unit 413 , transmission and reception switching unit 412 , and filter unit 411 .
  • the filter unit 411 filters radio waves within a frequency band used in Bluetooth from those of various frequencies.
  • the transmission and reception switching unit 412 is a switch that switches between the transmission radio wave and the reception radio wave. Communication between the master and a slave in Bluetooth is basically attained by repeating transmission and reception. Since transmission and reception are therefore never made at the same time, the antenna can be shared, and switching using such a switch suffices.
  • the transmission processing unit 431 processes transmission packet data received from the baseband unit 403 , superposes it on a transmission wave generated by the frequency synthesizer unit 413 , and transmits that wave from the antenna unit 401 . Also, the transmission processing unit 431 controls transmission power according to a transmission quality control signal received from the baseband unit 403 .
  • the transmission frequency is specified by the oscillation frequency of the frequency synthesizer unit 413 .
  • the frequency synthesizer unit 413 receives designation of the oscillation frequency from the baseband unit 403 for every frequency hopping, and begins to oscillate at a predetermined channel frequency.
  • the reception processing unit 421 receives a signal of the 2.4-GHz band via the filter unit 411 from radio waves of various frequencies that have reached the antenna unit 401 . Radio waves received via the filter unit 411 are input to the reception processing unit 421 via the transmission and reception switching unit 412 .
  • the 2.4-GHz band includes many radio waves from other Bluetooth devices and those other than Bluetooth, and a radio wave of a desired channel frequency is selected by a superheterodyne scheme that frequency-converts a desired channel frequency into a specific intermediate frequency (IF). In this way, data superposed on the reception wave is extracted.
  • the IF in the reception processing unit 421 is 2 MHz. Note that, as described above, transmission and reception are not made at the same time, so the frequency synthesizer unit 413 can be shared between them.
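  • The superheterodyne conversion described above can be checked numerically; the hop-channel value below is an arbitrary example, and only the 2-MHz IF comes from the description:

```python
# Superheterodyne down-conversion sketch for the reception processing unit.
channel_mhz = 2441.0           # example Bluetooth hop channel in the 2.4-GHz band
if_mhz = 2.0                   # intermediate frequency (IF) used by the receiver
lo_mhz = channel_mhz - if_mhz  # local oscillator from the frequency synthesizer unit 413

# The mixer outputs sum and difference products; the IF filter keeps only
# the difference, so the desired channel lands exactly on the IF.
sum_product = channel_mhz + lo_mhz
diff_product = channel_mhz - lo_mhz
assert diff_product == if_mhz
```

Because the local oscillator hops together with the channel, every hop channel is converted to the same 2-MHz IF, and a single fixed IF filter can select it.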
  • when the reception processing unit 421 extracts the data from the received radio wave, it decodes the data into a digital signal, and passes the digital signal as reception packet data to the baseband unit 403 . Also, the reception processing unit 421 controls the amplitude of the reception signal according to a reception quality control signal received from the baseband unit 403 .
  • FIG. 6 shows the detailed arrangements of the transmission processing unit 431 , reception processing unit 421 , and frequency synthesizer unit 413 , and these arrangements will be described below.
  • the arrangement of the transmission processing unit 431 will be described first.
  • the arrangement of the transmission processing unit 431 roughly includes a waveform shaper 511 and power amplifier 512 .
  • the waveform shaper 511 removes RF components of a digital waveform from a transmission data signal (1 Mbps, digital) received from the baseband unit 403 , thus outputting an analog signal including components of 1 MHz or less.
  • the frequency synthesizer unit 413 includes a phase comparator 522 , loop filter 523 , VCO (Voltage Controlled Oscillator) 524 , and first frequency divider 525 .
  • the phase comparator 522 operates at 1 MHz.
  • the frequency division ratio of the first frequency divider 525 assumes a designated value of the baseband unit 403 for every frequency hopping.
  • the first frequency divider 525 frequency-divides a signal from the VCO 524 to 1 MHz, and feeds it back to the phase comparator 522 , thus obtaining an oscillation output of a required frequency.
  • the transmission data signal which has passed through the waveform shaper 511 of the transmission processing unit 431 is input to a frequency modulation terminal of the VCO 524 to directly change the capacitance of an LC resonance circuit.
  • the transmission signal superposed with data is amplified by the power amplifier 512 .
  • the amplified transmission signal is transmitted from the antenna unit 401 via the transmission and reception switching unit 412 and filter unit 411 .
  • the power amplifier 512 can execute power control upon transmission according to a transmission quality control signal received from the baseband unit 403 . In this manner, the wireless transmission performance can be switched while assuring certain communication quality according to the state of the apparatus.
  • the baseband unit 403 outputs the transmission quality control signal.
  • the controller unit 201 may directly output this signal.
  • the arrangement of the reception processing unit 421 will be described below.
  • the arrangement of the reception processing unit 421 roughly includes a low-noise amplifier 501 , mixer 502 , IF filter 503 , tuner 504 , IF amplifier 505 , demodulator 506 , and data slicer 507 .
  • the low-noise amplifier 501 amplifies the power of the reception wave 25-fold.
  • the signal/noise ratio of a reception system is largely determined by the noise factor and gain of the initial-stage amplifier, so suppression of the noise factor and optimization of the gain in the 2.4-GHz band are required.
  • the mixer 502 multiplies the amplified reception wave and the signal from the frequency synthesizer unit 413 to convert a carrier frequency from the 2.4-GHz band to the IF of 2.0 MHz.
  • the output from the mixer 502 includes signals of 2±1 MHz, 2±2 MHz, and so forth due to the presence of neighboring channels, in addition to the target IF frequency (2 MHz).
  • the tuner 504 fixes a central frequency so as not to be influenced by variations in the manufacture of elements which form a filter of the IF filter 503 or a change in external temperature.
  • the IF signal that has passed through the IF filter 503 is amplified by 40 dB by the IF amplifier 505 , and is demodulated by the demodulator 506 .
  • data of 1 Mbps is superposed as an analog waveform on a DC voltage component.
  • the data slicer 507 extracts the DC voltage component of the demodulated waveform, and converts an analog signal part whose voltage value is higher than the DC component to High of CMOS level and a lower signal part to Low, thus converting the reception data into a digital signal.
  • the data slicer 507 must have a function of detecting the DC component of the demodulated signal within a short period of time to cope with Bluetooth frequency hopping, and detects an appropriate DC voltage level from the demodulated signal waveform within several μsec after the start of transmission every time a partner device starts data transmission.
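  • As an illustration of the slicing step, the following hypothetical sketch detects the DC component (approximated here by the mean of the samples) and converts the demodulated waveform into CMOS-level High/Low values:

```python
def data_slice(samples):
    """Sketch of the data slicer 507: detect the DC voltage component of
    the demodulated waveform and convert each sample above it to High (1)
    and each sample below it to Low (0) at CMOS level."""
    dc = sum(samples) / len(samples)  # detected DC voltage level (mean approximation)
    return [1 if s > dc else 0 for s in samples]

# Demodulated 1-Mbps data riding on a DC offset (illustrative sample values)
bits = data_slice([0.9, 0.1, 0.8, 0.2])  # → [1, 0, 1, 0]
```

A real slicer would track the DC level with a fast-settling filter rather than a block mean, since the level must be acquired within several μsec of each transmission.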
  • the low-noise amplifier 501 can execute reception sensitivity control upon reception in accordance with a reception quality control signal received from the baseband unit 403 . With this control, the wireless reception performance can be switched while assuring certain communication quality in accordance with the state of the apparatus.
  • the low-noise amplifier 501 executes the reception sensitivity control.
  • the reception sensitivity control may be done by changing an attenuator of the filter unit 411 .
  • the baseband unit 403 outputs the reception quality control signal.
  • the controller unit 201 may directly output this signal.
  • the layer structure of Bluetooth includes an RF layer, baseband layer, link manager layer, L2CAP (Logical Link Control and Adaptation Protocol Specification) layer, and application layer in turn from the lowermost layer.
  • the RF layer makes an actual wireless communication with another Bluetooth device.
  • the baseband layer executes various kinds of processing to attain wireless control, communication processing for each link, and a communication link.
  • the link manager layer attains establishment of a connection, control, and assurance of security upon establishment of an asynchronous (ACL) link using an LMP (Link Manager Protocol).
  • the L2CAP layer performs segmentation and reassembly of data between the upper and lower layers upon establishment of an asynchronous link, so as to achieve efficient asynchronous packets.
  • the application layer recognizes and confirms effective services of devices upon establishment of a Bluetooth wireless connection.
  • a Bluetooth packet includes an access code field, header field, and payload field.
  • the access code field is configured by 72 bits to define three different types of codes, i.e., a channel access code (CAC), device access code (DAC), and inquiry access code (IAC), which are used for establishment of synchronization, offset correction, and piconet designation.
  • the CAC is generated based on the Bluetooth address of a master for each piconet, and is used in a normal connection in a piconet.
  • the DAC is generated from information of each Bluetooth terminal, and is used in paging or a response to paging.
  • the IAC is generated commonly to Bluetooth or for each specific group such as a device type or the like, and is used to find an arbitrary Bluetooth device or a specific device type by issuing an inquiry. That is, if a common IAC is used, devices can generate an inquiry even if they do not know each other's Bluetooth addresses.
  • the header field is configured by 54 bits, obtained by applying an error correction code of code rate 1/3 to a total of 18 bits: 3 active member address bits, 4 type code bits, 1 flow control bit, 1 ARQN bit, 1 sequence number bit, and 8 checksum bits.
  • the active member address indicates the destination of a packet: 0 designates the broadcast mode, and 1 to 7 designate slave numbers in a piconet.
  • the type code indicates a packet type, and is used to classify five types for link control, four types for data transmission in case of a synchronous connection, and seven types for data transmission in case of an asynchronous connection.
  • the flow control bit indicates whether transmission in the reverse direction is allowed, and is used to prevent packet collision and to avoid an idle packet output in case of an asynchronous connection.
  • the ARQN bit is used for reception confirmation of a transmitted packet when error correction (Automatic Repeat Request) is valid; normal reception is determined if this bit is 1, or abnormal reception if it is 0.
  • the sequence number is used to distinguish if a packet is new or is re-sent.
  • the check sum is used for error detection of a header.
  • the payload field is used to store data, and the number of bits of the payload field changes depending on the packet type.
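  • The 18 information bits of the header field (3 + 4 + 1 + 1 + 1 + 8) can be packed and unpacked as follows; the bit ordering below is an assumption for illustration, and the rate-1/3 FEC that expands the 18 bits into 54 is noted only as a comment:

```python
def pack_header(am_addr, type_code, flow, arqn, seqn, checksum):
    """Pack the 18 information bits of the Bluetooth packet header.
    The field order used here (am_addr high, checksum low) is illustrative."""
    assert 0 <= am_addr < 8 and 0 <= type_code < 16 and 0 <= checksum < 256
    h = am_addr
    h = (h << 4) | type_code   # 4-bit packet type code
    h = (h << 1) | flow        # flow control bit
    h = (h << 1) | arqn        # reception confirmation bit (ARQ)
    h = (h << 1) | seqn        # sequence number bit (new vs. re-sent)
    h = (h << 8) | checksum    # 8-bit header checksum
    return h  # rate-1/3 FEC then repeats each bit 3 times: 18 * 3 = 54 bits

def unpack_header(h):
    """Inverse of pack_header for the same illustrative bit layout."""
    checksum = h & 0xFF
    seqn = (h >> 8) & 1
    arqn = (h >> 9) & 1
    flow = (h >> 10) & 1
    type_code = (h >> 11) & 0xF
    am_addr = (h >> 15) & 0x7
    return am_addr, type_code, flow, arqn, seqn, checksum
```

For example, a header addressed to slave 5 survives a pack/unpack round trip with all fields intact.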
  • the packet communication scheme of Bluetooth includes an SCO (Synchronous Connection Oriented) and ACL (Asynchronous Connectionless).
  • the SCO is a symmetric communication in which the length of upstream and downstream packets is one slot, and transmission and reception intervals are constant, i.e., synchronous.
  • HV1 can transmit real data for 80 bits as an information size by applying an error correction code (FEC) of code rate 1/3 to the payload length of 240 bits per packet.
  • HV2 can transmit real data for 160 bits by applying an FEC of code rate 2/3.
  • HV3 can transmit real data for 240 bits without any FEC
  • DV can transmit 80 bits for audio data + 150 bits for data as an information size.
  • HV1 allows two-way communications at a maximum of 64 kbps
  • HV3 allows those at a maximum of 192 kbps, so that these packet types are mainly used in information transmission with high realtimeness such as audio transmission or the like.
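  • The quoted HV rates follow from the information sizes above under the assumption that one packet is exchanged per transmit/receive slot pair of 1.25 ms:

```python
# SCO throughput arithmetic: one HV packet per 2-slot pair (2 x 0.625 ms).
slot_pair_ms = 1.25
info_bits = {"HV1": 80, "HV2": 160, "HV3": 240}  # information bits per packet

# bits per ms == kbps, so dividing directly gives the data rate in kbps
rate_kbps = {pkt: bits / slot_pair_ms for pkt, bits in info_bits.items()}
# HV1 -> 64.0 kbps, HV2 -> 128.0 kbps, HV3 -> 192.0 kbps
```

This reproduces the 64-kbps figure for HV1 and the 192-kbps figure for HV3 stated above; the difference is purely the FEC overhead (rate 1/3 vs. none) on the same 240-bit payload.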
  • the ACL is an asymmetric communication in which the packet length can be selected from 1, 3, and 5 slots for each of the upstream and downstream, and either SCO or ACL can be designated.
  • ACL packets are classified into DM (Data Medium rate) packets, whose error correction code is a shortened Hamming code, and DH (Data High rate) packets.
  • Since the DM packet is further classified into three classes, i.e., DM1, DM3, and DM5, and the DH packet is further classified into three classes, i.e., DH1, DH3, and DH5, a total of seven packet types including the AUX are provided. Except for the AUX, a 16-bit cyclic redundancy checking (CRC) code is appended to detect an information error of the payload field, and when completion of reception cannot be confirmed, a packet is re-sent (ARQ: Automatic Repeat Request). Numerals 1, 3, and 5 of the respective classes of the DM and DH packets indicate the numbers of time slots occupied by one packet, and a longer payload length is assured with increasing value, since packets can be output continuously.
  • the DM1 and DH1 can output 240 bits; the DM3 and DH3, about 1500 bits; and the DM5 and DH5, about 2740 bits.
  • the AUX has a payload length of 240 bits, and has neither the CRC nor FEC.
  • two-way communications at a maximum of 433.9 kbps (DH5) in case of symmetric types and at a maximum of 723.2 kbps vs. 57.6 kbps (DH5) in case of asymmetric types can be made. Therefore, these packets are used for connections which require a large one-way communication capacity, such as browsing of WWW data, file downloading, and the like.
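  • The quoted DH5 rates can be reconstructed arithmetically; the payload information sizes used below (2712 bits for DH5, 216 bits for DH1) are assumed values consistent with the quoted rates, where the text above gives “about 2740 bits” for the DH5 class:

```python
# ACL throughput reconstruction under assumed payload information sizes.
slot_ms = 0.625
dh5_bits, dh1_bits = 2712, 216  # assumed information bits per DH5 / DH1 packet

# Asymmetric: a 5-slot DH5 forward plus a 1-slot DH1 return = 6-slot cycle
asym_forward = dh5_bits / (6 * slot_ms)   # 723.2 kbps
asym_reverse = dh1_bits / (6 * slot_ms)   # 57.6 kbps

# Symmetric: DH5 in both directions = 10-slot cycle per direction
symmetric = dh5_bits / (10 * slot_ms)     # 433.92 kbps each way
```

Rounding the symmetric figure to one decimal gives the 433.9 kbps quoted above, confirming the slot accounting.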
  • the master and slave states upon Bluetooth connection will be described below.
  • Each of the master and slave states is one of the following seven states: Standby, Inquiry, Page, Active, Hold, Sniff, and Park.
  • the Standby state is an idle state in which the device switches to reception only at intervals of 1.28 sec to attain low power consumption.
  • the Inquiry state is set when the master recognizes a slave. At this time, no address is assigned to the slave, and the master continuously broadcasts an IQ packet to all slaves. Upon reception of the IQ packet, each slave stands by for a RAND period defined by a random number, and waits until an identical IQ packet arrives again. Upon recognizing arrival of the identical IQ packet, the slave returns an FHS packet to the master as a response. In this way, the master recognizes that slave, and acquires information such as the device address, clock, and the like of the slave.
  • the Page state is a state in which an ID (identification) packet is output to the designated slave.
  • When the master transmits an ID packet (paging), the slave also returns an ID packet as a response.
  • Upon reception of the FHS packet, the slave returns an ID packet as a reception response.
  • the Active state is a state in which the master assigns a slave number to a slave using the active member address to make an actual communication. As a result, a piconet is formed in this state.
  • the Hold state is one of low-power consumption states, and the master sets a hold period for a slave.
  • the slave does not receive any packet from the master within the hold period, and transits to a standby state after the hold period elapses. Note that this slave can communicate with devices other than the master during the hold period.
  • the Sniff state is another low-power consumption state, and the slave side makes reception within only a predetermined period at predetermined time intervals.
  • the master sets a Sniff cycle and standby slot for the slave first.
  • the slave makes reception within only the standby slot period for each set Sniff cycle. Since the slave ignores all packets during periods other than the receivable period, it executes fewer processes in the standby state than in the Hold state, thus suppressing power consumption.
  • This state is a standby state in which the slave joins a piconet but does not have any influence on the traffic.
  • the Park state is still another low-power consumption state.
  • the master sets a Park cycle for a slave, which receives a BC (beacon channel) from the master at this cycle. Since the slave maintains synchronization with the master by means of this BC, and can receive a broadcast packet or Park change/cancel packet from the master, it is easy for the slave to join a piconet again.
  • the Park cycle is set to be longer than the cycles of the above two low-power consumption states, thus suppressing power consumption accordingly.
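  • The seven connection states and the transitions described above can be sketched as a simple state machine; the transition table is a simplified reading of the description, not an exhaustive reproduction of the specification:

```python
from enum import Enum, auto

class BtState(Enum):
    STANDBY = auto()
    INQUIRY = auto()
    PAGE = auto()
    ACTIVE = auto()
    HOLD = auto()
    SNIFF = auto()
    PARK = auto()

# Simplified transition table based on the description above (assumed):
# discovery runs Standby -> Inquiry -> Page -> Active; the three
# low-power states are entered from Active and return to it.
TRANSITIONS = {
    BtState.STANDBY: {BtState.INQUIRY, BtState.PAGE},
    BtState.INQUIRY: {BtState.PAGE, BtState.STANDBY},
    BtState.PAGE: {BtState.ACTIVE, BtState.STANDBY},
    BtState.ACTIVE: {BtState.HOLD, BtState.SNIFF, BtState.PARK, BtState.STANDBY},
    BtState.HOLD: {BtState.ACTIVE, BtState.STANDBY},
    BtState.SNIFF: {BtState.ACTIVE},
    BtState.PARK: {BtState.ACTIVE},
}

def step(state, target):
    """Advance the connection state, rejecting transitions not in the table."""
    if target not in TRANSITIONS[state]:
        raise ValueError(f"illegal transition {state} -> {target}")
    return target
```

For example, a parked slave rejoining its piconet via the beacon channel corresponds to `step(BtState.PARK, BtState.ACTIVE)`.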
  • Bluetooth is used as a wireless scheme that various devices and the MFP support.
  • the same power control mechanism and reception sensitivity adjustment mechanism as in this embodiment can be provided for other wireless schemes.
  • a wireless LAN (WiFi) scheme may also be used; for example, IEEE802.11n assumes 400 Mbps as a transmission rate.
  • the processing in the image processing apparatus 100 of this embodiment will be described below.
  • the controller unit 201 systematically controls the following processing.
  • FIG. 7 is a flowchart showing the operation of the image processing apparatus 100 of this embodiment.
  • the image processing apparatus 100 is set in the standby state (S 100 ) and then checks using a sensor (not shown) if an input signal is received from the expanded interface 104 as the touch panel (S 101 ).
  • the input signal from the expanded interface 104 is generated when either the mobile information terminal 200 or the user's finger 300 is placed on the expanded interface 104 . If no input signal is detected, i.e., nothing is placed on the interface 104 , the process returns to step S 100 .
  • if an input signal is detected, the controller unit 201 controls the wireless communication unit 206 to detect the presence/absence of a communication response from the object which generated that input signal (S 102 ).
  • this communication response is generated only by the mobile information terminal 200 placed on the expanded interface 104 .
  • if a communication response is detected, the controller unit 201 determines that the object placed on the expanded interface 104 is the mobile information terminal 200 , and enters a communication mode (S 103 ).
  • if no communication response is detected, the controller unit 201 determines that the object placed on the expanded interface 104 is the user's finger 300 , and enters a manipulation mode (S 104 ).
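  • The discrimination flow of FIG. 7 (S 100 to S 104 ) can be sketched as follows; both parameters are hypothetical abstractions of the touch-panel sensor and the wireless probe of S 102 :

```python
def discriminate(input_detected, communication_response):
    """Sketch of the FIG. 7 flow (S100-S104).

    input_detected        -- S101: whether the touch panel reports an object
    communication_response-- S102: whether the wireless probe was answered
    """
    if not input_detected:           # nothing placed: stay in standby (S100)
        return "standby"
    if communication_response:       # a mobile information terminal responded
        return "communication"       # S103
    return "manipulation"            # S104: the user's finger

assert discriminate(True, True) == "communication"
assert discriminate(True, False) == "manipulation"
```

The key design point is that a single touch event is disambiguated by an out-of-band signal (the wireless response), not by the touch data itself.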
  • FIG. 8 is a flowchart showing an operation example in the communication mode in step S 103 .
  • the controller unit 201 forms a piconet with the mobile information terminal 200 placed on the expanded interface 104 (S 400 ). As a result, the piconet allows communications between the mobile information terminal 200 and image processing apparatus 100 so as to execute desired image processing.
  • the controller unit 201 detects using a sensor (not shown) if the mobile information terminal 200 is removed from the expanded interface 104 (S 401 ). If the mobile information terminal 200 is not removed, the process returns to step S 401 ; otherwise, the piconet is released (S 402 ).
  • FIG. 9 is a flowchart showing an operation example in the manipulation mode in step S 104 .
  • the controller unit 201 detects using a sensor (not shown) if the user's finger 300 is removed from the expanded interface 104 (S 500 ). If the finger is not removed, the process returns to step S 500 ; otherwise, a next menu window associated with a screen area touched with the finger 300 is displayed on the expanded interface 104 (S 501 ).
  • FIG. 10 shows the concept upon transition to the communication mode (S 103 ) and manipulation mode (S 104 ).
  • one of the mobile information terminal 200 and the user's finger 300 is placed on the expanded interface 104 which displays a default manipulation menu (window) 500 .
  • when the mobile information terminal 200 is placed, a processing menu (window) 600 is displayed on the expanded interface 104 to allow the user to select various processing modes (“process 1 ”, “process 2 ”, and “process 3 ” shown in FIG. 10 ) premised on communications between the mobile information terminal 200 and image processing apparatus 100 .
  • when the user's finger 300 is placed on the “COPY” display part on the manipulation menu 500 , since no communication response is returned in step S 102 , the controller unit 201 determines that the user inputs a copy processing instruction, and enters the manipulation mode in step S 104 . Then, a copy manipulation menu 700 for various operation settings required for copy processing is displayed on the expanded interface 104 .
  • since the operation mode is discriminated based on the presence/absence of a communication response, an appropriate display can be made on the expanded interface 104 . Even when the mobile information terminal 200 is placed on a function selection menu button displayed on the expanded interface 104 , “placement of the mobile information terminal” and “pressing of the function selection menu button” can be correctly recognized in real time. Then, the menu display can be appropriately switched according to the recognition result.
  • The second embodiment is characterized in that the expanded interface 104 of the image processing apparatus 100 described in the first embodiment comprises a mode key 1201 , a hardware key which serves as a mode selection unit and is used to input a switching instruction of the operation mode. That is, the user sets ON/OFF of the communication mode using this mode key 1201 : if the mode key 1201 is ON, the communication mode is designated; otherwise, it is canceled.
  • FIG. 11 is a flowchart showing the operation of the image processing apparatus 100 of the second embodiment.
  • the image processing apparatus 100 is set in the standby state (S 200 ) and then checks using a sensor (not shown) if an input signal is received from the expanded interface 104 as the touch panel (S 201 ).
  • the input signal from the expanded interface 104 is generated when either the mobile information terminal 200 or the user's finger 300 is placed on the expanded interface 104 . If no input signal is detected, i.e., nothing is placed on the interface 104 , the process returns to step S 200 .
  • the controller unit 201 detects the setting value of the mode key 1201 (S 202 ). If the communication mode setting is ON, the controller unit 201 enters the communication mode (S 203 ); otherwise, it enters the manipulation mode (S 204 ).
  • FIG. 12 shows the concept upon transition to the communication mode (S 203 ) and manipulation mode (S 204 ) in the second embodiment.
  • One of the mobile information terminal 200 and the user's finger 300 is placed on the expanded interface 104 which displays the default manipulation menu 500 .
  • the controller unit 201 determines in step S 202 that the communication mode is ON without confirming a communication response from the mobile information terminal 200 , and enters the communication mode in step S 203 .
  • the processing menu 600 is displayed on the expanded interface 104 to allow the user to select various processing modes (“process 1 ”, “process 2 ”, and “process 3 ” shown in FIG. 12 ) premised on communications between the mobile information terminal 200 and image processing apparatus 100 .
  • the controller unit 201 determines in step S 202 that the communication mode is OFF, and enters the manipulation mode in step S 204 . Then, the copy manipulation menu 700 for various operation settings required for copy processing is displayed on the expanded interface 104 .
  • when the mode key 1201 is ON, the control unconditionally enters the communication mode, and displays the processing menu 600 . However, even while the mode key is ON, the user's finger may often be placed on the expanded interface 104 . If the processing menu 600 is displayed under only the condition that the mode key 1201 is ON, an insignificant window may therefore be displayed.
  • To avoid this, the process may advance to a communication determination mode in place of unconditionally entering the communication mode. More specifically, if the controller unit 201 determines in step S 202 that the mode key is ON, the process advances to step S 102 in FIG. 7 . Then, the controller unit 201 detects the presence/absence of a communication response from the object which generated the input signal. Subsequent processes are the same as those described with reference to FIGS. 7 to 9 .
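  • The second-embodiment flow with the communication-determination fallback described above can be sketched as follows (parameter names are hypothetical abstractions of the mode key 1201 and the wireless probe of S 102 ):

```python
def discriminate_v2(input_detected, mode_key_on, communication_response):
    """Second-embodiment sketch (FIG. 11) with the fallback described above:
    even when the mode key 1201 is ON, the wireless probe of S102 is still
    consulted instead of entering the communication mode unconditionally."""
    if not input_detected:                     # S201: nothing on the interface
        return "standby"                       # back to S200
    if mode_key_on and communication_response: # S202 + S102 fallback check
        return "communication"                 # S203
    return "manipulation"                      # S204
```

With this variant, a finger placed on the interface while the mode key is ON no longer produces the insignificant processing-menu window.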
  • an operation mode to be started upon detection of an input from the expanded interface 104 is discriminated based on the setting of the mode key 1201 , thus allowing an appropriate display on the expanded interface 104 .
  • the third embodiment is characterized in that the expanded interface 104 of the image processing apparatus 100 described in the first embodiment comprises a communication area 901 on its surface, as shown in FIG. 14 . More specifically, if an input signal from the expanded interface 104 is generated within the communication area 901 , the controller unit 201 determines that the mobile information terminal 200 is placed at a predetermined position, and enters the communication mode; otherwise, it enters a normal manipulation mode.
  • FIG. 13 is a flowchart showing the operation of the image processing apparatus 100 of the third embodiment.
  • the image processing apparatus 100 is set in the standby state (S 300 ) and then checks using a sensor (not shown) if an input signal is received from the expanded interface 104 as the touch panel (S 301 ).
  • the input signal from the expanded interface 104 is generated when either the mobile information terminal 200 or the user's finger 300 is placed on the expanded interface 104 . If no input signal is detected, i.e., nothing is placed on the interface 104 , the process returns to step S 300 .
  • the controller unit 201 determines if the position of an object which inputs the input signal is located within the predetermined communication area 901 (S 302 ).
  • the controller unit 201 determines that the mobile information terminal 200 is placed on the expanded interface 104 , and enters the communication mode (S 303 ). On the other hand, if the position of the object is not located within the communication area 901 , the controller unit 201 determines that the user's finger 300 is placed on the expanded interface 104 , and enters the manipulation mode (S 304 ).
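  • The position check of S 302 amounts to a point-in-rectangle test; the rectangle of the communication area 901 and the coordinate units below are hypothetical:

```python
def in_communication_area(x, y, area=(0, 0, 200, 120)):
    """S302 sketch: test whether the touch position lies inside the
    communication area 901. The rectangle (left, top, right, bottom)
    and the coordinate units are assumed values for illustration."""
    left, top, right, bottom = area
    return left <= x <= right and top <= y <= bottom

def discriminate_v3(pos):
    """Third-embodiment sketch (FIG. 13): mode follows the touch position."""
    return "communication" if in_communication_area(*pos) else "manipulation"
```

Compared with the first embodiment, the object type is inferred purely from where it touches, so no wireless probe is needed for the decision.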
  • FIG. 14 shows the concept upon transition to the communication mode (S 303 ) and manipulation mode (S 304 ) in the third embodiment.
  • One of the mobile information terminal 200 and the user's finger 300 is placed on the expanded interface 104 which displays the default manipulation menu 500 .
  • if the position of the object is located within the communication area 901 , the controller unit 201 determines that the mobile information terminal 200 is placed, and enters the communication mode in step S 303 .
  • the processing menu 600 is displayed on the expanded interface 104 to allow the user to select various processing modes (“process 1 ”, “process 2 ”, and “process 3 ” shown in FIG. 14 ) premised on communications between the mobile information terminal 200 and image processing apparatus 100 .
  • if the position of the object is not located within the communication area 901 , the controller unit 201 determines that the user's finger 300 is placed, and enters the manipulation mode in step S 304 . Then, the copy manipulation menu 700 for various operation settings required for copy processing is displayed on the expanded interface 104 .
  • when the position of the object which inputs the input signal is located within the communication area 901 , the control unconditionally enters the communication mode irrespective of the type of the object, and the processing menu 600 is displayed.
  • However, the user's finger may often be placed within the communication area 901 . If the processing menu 600 is displayed under only the condition that the position of the object (the finger in this case) that inputs the input signal is located within the communication area, an insignificant window may be displayed.
  • the control may advance to a communication determination mode, in place of unconditionally entering the communication mode.
  • if the controller unit 201 determines in step S 302 that the position of the object which inputs the input signal is located within the predetermined communication area 901 , the process advances to step S 102 in FIG. 7 .
  • the controller unit 201 detects the presence/absence of a communication response from an object which inputs the input signal. Subsequent processes are the same as those described with reference to FIGS. 7 to 9 .
  • the display unit 203 , manipulation unit 204 , and mobile information terminal detection unit 205 are integrated by the expanded interface 104 as a touch panel UI. That is, the putting space of the mobile information terminal 200 is assured on the touch panel UI.
  • the processing menu for data input from the mobile information terminal 200 placed on the touch panel UI must be displayed on the touch panel UI. If an area used to display this processing menu on the touch panel UI is fixed at a predetermined position, the display control of the processing menu becomes easy. However, if the display area of the processing menu is fixed, an area available for the putting space of the mobile information terminal 200 on the touch panel UI is limited, resulting in a low degree of freedom when the user places the mobile information terminal 200 .
  • the processing menu display processing on the expanded interface 104 executed in the communication mode in the first to third embodiments, i.e., when the mobile information terminal 200 is placed on the expanded interface 104, will be described.
  • FIG. 15 is a block diagram showing the hardware arrangement of the image processing apparatus 100 according to the fourth embodiment.
  • FIG. 15 is characterized in that the arrangement shown in FIG. 3 of the first embodiment further comprises a general menu area specification unit 212 .
  • the controller unit 201 according to this embodiment has a function as terminal identification means that identifies the type of mobile information terminal in addition to the functions of the controller unit 201 according to the aforementioned embodiments.
  • the controller unit 201 according to this embodiment has a function as terminal position specification means that specifies the contact position of the mobile information terminal on the expanded interface 104 .
  • the controller unit 201 according to this embodiment has a function as other menu area specification means that specifies an other menu area which can display other menus (general menus) other than the processing menu for the mobile information terminal.
  • the general menu area specification unit 212 specifies an area that can display a predetermined general menu such as a manipulation menu and the like in a remaining area upon displaying a processing menu for the mobile information terminal 200 on the expanded interface 104 . More specifically, the general menu area specification unit 212 specifies the remaining area that can display the general menu on the expanded interface 104 based on the layout position of the mobile information terminal 200 and the distance (predetermined value) between the processing menu displayed according to the terminal type and the general menu. Note that the controller unit 201 may include the function of the general menu area specification unit 212 .
  • FIG. 16 is a flowchart showing the display processing in the communication mode in the image processing apparatus 100 of the fourth embodiment.
  • the mobile information terminal detection unit 205 specifies the type (to be referred to as a terminal type hereinafter) of at least one mobile information terminal 200 placed on the expanded interface 104 (S 601 ). This specification is made based on a communication response from the mobile information terminal 200 .
  • the mobile information terminal detection unit 205 further specifies a position (to be referred to as a layout position hereinafter) where the mobile information terminal 200 is placed on the expanded interface 104 (S 602 ).
  • the controller unit 201 displays a processing menu for the terminal type specified in step S 601 near the layout position specified in step S 602 (S 603 ). Note that the distance from the layout position to the processing menu display position is set in advance. However, the user may arbitrarily set this distance.
  • the general menu area specification unit 212 specifies an area that can display a general menu based on the layout position of the terminal, and the distance between the processing menu displayed according to the terminal type and a general menu display position designated in advance (S 604 ).
  • the controller unit 201 checks if the area (size) of the general menu display area is larger than a threshold which is set in advance by the user (S 605 ). Only when the size is larger than the threshold, the controller unit 201 displays a general menu (S 606 ). That is, if the size of the general menu display area is smaller than the threshold, the controller unit 201 does not display any general menu.
  • the size of the general menu to be displayed may be changed according to the size of the general menu display area.
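The flow of steps S601 to S606 can be sketched as follows. The sketch uses a one-dimensional layout model, and every name, offset, and width in it is an illustrative assumption rather than a value from the patent.

```python
def plan_menu_display(terminals, panel_width, menu_width,
                      general_menu_width, area_threshold):
    """Plan processing-menu and general-menu placement on the interface.

    ``terminals`` is a list of (terminal_type, x_position) pairs as obtained
    in S601/S602; the geometry is one-dimensional for brevity.
    """
    plan = []
    used = 0
    for terminal_type, x in terminals:
        # S603: display the processing menu for this terminal type near the
        # terminal's layout position (the offset stands in for the preset
        # distance between terminal and menu).
        plan.append((f"processing_menu[{terminal_type}]", x + 2))
        used += menu_width
    remaining = panel_width - used      # S604: remaining displayable area
    if remaining > area_threshold:      # S605: compare with the threshold
        # S606: display the general menu, reduced to fit if necessary.
        plan.append(("general_menu", min(general_menu_width, remaining)))
    return plan
```

With a single terminal the general menu still fits and is displayed (possibly reduced); with several terminals the remaining area may fall below the threshold, in which case no general menu entry is produced.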
  • FIG. 17 shows a default display example on the expanded interface 104 before the mobile information terminal 200 is placed, i.e., a default display example of the general menu 500 .
  • the display contents change, as shown in FIG. 18 .
  • the processing menu 600 for the mobile information terminal 200 is displayed near that terminal, and a reduced general menu 510 obtained by reducing the general menu 500 is displayed.
  • processing menus (menus 601 and 602 in this case) according to the terminal types are displayed near these terminals, as shown in FIG. 19.
  • the general menu is not displayed.
  • the general menu is also displayed.
  • processing menus for respective mobile information terminals are displayed near these terminals placed on the expanded interface 104 . Therefore, even when a plurality of mobile information terminals of the same type are placed, respective processing menus can be displayed so that correspondence with the respective terminals can be recognized.
  • the display positions of the processing menu for the mobile information terminal and general menu are controlled based on the terminal type and layout position of the mobile information terminal placed on the expanded interface 104 .
  • Such display position control can also be independently executed. That is, control may be made so that the processing menu according to the terminal type of the mobile information terminal 200 is displayed for each terminal, but the general menu is not displayed.
  • the display position of the general menu may be controlled not to be occluded according to the layout position of the mobile information terminal 200 without displaying any processing menu for the mobile information terminal 200 .
  • the position of the mobile information terminal 200 on the expanded interface 104 is not limited. Even when a plurality of mobile information terminals are placed, processing menus corresponding to these terminals can be clearly displayed.
  • the aforementioned fourth embodiment has exemplified the processing menu display processing on the expanded interface 104 when the mobile information terminal 200 is placed on the expanded interface 104 which is implemented as a touch panel UI.
  • a corresponding processing menu can be displayed near the mobile information terminal 200 placed on the expanded interface 104 without limiting its position.
  • although the position of the mobile information terminal to be placed is not limited, if a mobile information terminal with a relatively large size is placed on the expanded interface 104, it physically covers the touch panel UI surface. As a result, the display area is reduced, resulting in poor operability for the user.
  • when a mobile information terminal larger than the touch panel UI display area is placed on the touch panel UI, no manipulation menu is displayed, and the user cannot make any manipulation.
  • a large mobile information terminal herein means a mobile phone with a large screen and full keyboard, a PDA, a notebook personal computer, and the like.
  • FIG. 20 shows a processing menu display example when a mobile information terminal 800 which is of notebook personal computer type and has a display unit 801 is placed on the expanded interface 104 . Since a relatively large mobile information terminal 800 is placed on the touch panel UI display area, the manipulation menu 500 is displayed in a reduced scale, thus impairing operability.
  • the processing menu display processing in the communication mode in the first to third embodiments, i.e., when the mobile information terminal is placed on the expanded interface 104 and the touch panel UI display area is reduced, will be described below.
  • FIG. 21 is a block diagram showing the hardware arrangement of the image processing apparatus 100 according to the fifth embodiment. As shown in FIG. 21 , the arrangement shown in FIG. 3 in the first embodiment further comprises a display unit information acquisition unit 213 , and a manipulation menu area specification unit 214 .
  • the display unit information acquisition unit 213 acquires display unit information of the mobile information terminal 800 . More specifically, the unit 213 acquires the display unit resolution of the display unit 801 with reference to a device driver setting file of the display unit 801 of the mobile information terminal 800 . As another display unit information acquisition method, the unit 213 may acquire the display unit resolution using a display resolution acquisition command provided by an operating system installed in the mobile information terminal 800 . Note that the mobile information terminal detection unit 205 or controller unit 201 may include the function of the display unit information acquisition unit 213 .
  • the manipulation menu area specification unit 214 specifies a display area required to display a manipulation menu on the expanded interface 104 . More specifically, the unit 214 specifies a displayable area of the manipulation menu on the expanded interface 104 based on the layout position of the mobile information terminal 200 , the size of the default manipulation menu, and the size and distance (predetermined value) of a menu displayed according to the terminal type. Note that the controller unit 201 may include the function of the manipulation menu area specification unit 214 .
  • FIG. 22 is a flowchart showing the display processing in the communication mode in the image processing apparatus 100 of the fifth embodiment.
  • the mobile information terminal detection unit 205 specifies the type (to be referred to as a terminal type hereinafter) of the mobile information terminal 800 placed on the expanded interface 104 (S 701 ). This specification is made based on a communication response from the mobile information terminal 800 .
  • the mobile information terminal detection unit 205 further specifies position information (to be referred to as layout position information hereinafter) where the mobile information terminal 800 is placed on the expanded interface 104 (S 702 ).
  • the manipulation menu area specification unit 214 specifies an area of a manipulation menu that can be displayed on the expanded interface 104 based on the layout position information acquired in step S 702 and the display unit area information of the expanded interface 104 , which is pre-stored in the image processing apparatus 100 (S 703 ).
  • the controller unit 201 then checks if the size of the specified manipulation menu displayable area is smaller than a predetermined threshold (S 704 ). If the size is smaller than the threshold, the display unit information acquisition unit 213 acquires the display unit information of the mobile information terminal 800 (S 705 ), and specifies an area (size) of a displayable area of the mobile information terminal 800 (S 706 ). On the other hand, if the controller unit 201 determines that the size is larger than the threshold, it displays a manipulation menu on the expanded interface.
  • the controller unit 201 checks if the size of the displayable area of the mobile information terminal 800 specified in step S 706 is larger than an initial size of the manipulation menu 500 (S 707 ). Only when the size of the displayable area is larger than the initial size, the controller unit 201 transfers the manipulation menu to the mobile information terminal 800 via the wireless communication unit 206 , and displays the manipulation menu on the display unit 801 (S 711 ).
  • if the controller unit 201 determines in step S707 that the size of the displayable area is smaller than the initial size, it further checks if the displayable area is larger than an area which indicates a minimum menu displayable area and is designated in advance (S 708 ). If the controller unit 201 determines that the displayable area is larger than that area, it divides the manipulation menu into a part to be displayed on the expanded interface 104 and a part to be displayed on the display unit 801 of the mobile information terminal 800 (S 709 ). Furthermore, the controller unit 201 makes a menu display on the menu displayable area on the expanded interface 104 specified in step S703 (S 710 ), and one on the display unit 801 (S 711 ).
  • if the controller unit 201 determines that the displayable area is smaller than that area, it displays, as error processing, an error message indicating that the manipulation menu cannot be displayed, on the display unit 801 or a display unit (not shown) of the image processing apparatus 100 (S 712 ).
  • the menu to be displayed may be divided into a processing menu and general manipulation menu. Furthermore, the size of the manipulation menu may be changed according to the areas of the menu displayable areas of the expanded interface 104 and display unit 801 .
  • for example, the processing menu for the mobile information terminal may be displayed on the display unit 801 , and the general manipulation menu may be displayed on the expanded interface 104 .
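The decision tree of steps S704 to S712 can be summarized as follows. This is a sketch under stated assumptions: the parameter names, the string destinations, and the idea of comparing plain scalar areas are all illustrative, not taken from the patent.

```python
def choose_menu_destination(panel_free_area, panel_threshold,
                            terminal_display_area, initial_menu_size,
                            minimum_menu_area):
    """Return where the manipulation menu should be displayed."""
    if panel_free_area >= panel_threshold:
        # S704: enough room remains on the expanded interface.
        return "panel"
    if terminal_display_area > initial_menu_size:
        # S707: the terminal's own display can hold the whole menu (S711).
        return "terminal"
    if terminal_display_area > minimum_menu_area:
        # S708/S709: divide the menu between the expanded interface (S710)
        # and the terminal's display unit (S711).
        return "split"
    # S712: the menu cannot be displayed anywhere; report an error.
    return "error"
```

The ordering of the checks matters: the panel is preferred, the terminal display is the fallback, and the split display is only used when neither alone suffices.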
  • FIG. 23 shows the concept of transition of a menu display on the expanded interface 104 and that on the display unit 801 of the mobile information terminal 800 in this embodiment.
  • the display unit 801 has a displayable area larger than a display area of the default manipulation menu 500 .
  • a manipulation menu 850 is displayed on the display unit 801 .
  • no manipulation menu is displayed on the expanded interface 104 .
  • a menu manipulation is attained using a user interface of the mobile information terminal 800 . More specifically, the user selects the manipulation menu 850 by manipulating a pointer cursor 802 displayed on the display unit 801 using the mobile information terminal 800 .
  • the selection result of the manipulation menu is transferred to the controller unit 201 via the wireless communication unit 206 , thus executing processing according to the selection result.
  • FIG. 24 shows another mode of a menu display on the expanded interface 104 and that on the display unit 801 of the mobile information terminal 800 in this embodiment.
  • the displayable area of the display unit 801 is smaller than a display area of the default manipulation menu 500 , but is larger than a minimum menu displayable area which is designated in advance.
  • a processing menu for the mobile information terminal of the default manipulation menu 500 is displayed on the display unit 801 as the manipulation menu 850 .
  • a general manipulation menu of the default manipulation menu 500 is displayed on the displayable area on the expanded interface. Note that the user manipulates the processing menu using the user interface of the mobile information terminal 800 , and the general manipulation menu via a touch panel UI manipulation on the expanded interface.
  • the manipulation menu can be displayed without being occluded by the mobile information terminal. In this manner, the user's operability can be prevented from being impaired by a situation in which no manipulation menu can be displayed.
  • the present invention can adopt embodiments in the forms of a system, apparatus, method, program, storage medium (recording medium), and the like. More specifically, the present invention can be applied to either a system constituted by a plurality of devices (e.g., a host computer, interface device, image sensing device, Web application, and the like), or an apparatus consisting of a single device.
  • the wireless communication unit 206 of the image processing apparatus makes both a communication required to detect the mobile information terminal 200 and that required to transmit print data and the like after detection.
  • the image processing apparatus may comprise a plurality of types of wireless communication units, and may selectively use these wireless communication units to attain a communication required to detect the mobile information terminal and that required to transmit print data and the like.
  • the image processing apparatus may use a proximity wireless communication such as FeliCa or the like in the communication required to detect the mobile information terminal, and may use a wireless communication such as Bluetooth, wireless LAN, or the like in the communication required to transmit print data and the like later.
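The selective use of two wireless units described above can be sketched as a small dispatch table. The class and label names are illustrative assumptions; the patent only specifies that one unit serves detection and another serves data transfer.

```python
class DualWirelessLink:
    """Pick a wireless unit according to the purpose of the communication."""

    def __init__(self):
        self.units = {
            "detect": "proximity",    # short-range link (e.g. FeliCa-style)
            "transfer": "broadband",  # e.g. Bluetooth or wireless LAN
        }

    def unit_for(self, purpose):
        try:
            return self.units[purpose]
        except KeyError:
            raise ValueError(f"unknown purpose: {purpose}") from None
```

Keeping detection on a short-range link means mere placement of a terminal can be sensed cheaply, while bulk print data still travels over the higher-bandwidth link.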
  • the present invention can also be achieved by directly or remotely supplying a program of software that implements the functions of the aforementioned embodiments to a system or apparatus, and reading out and executing the supplied program code by a computer of that system or apparatus.
  • the program in this case is that corresponding to each illustrated flowchart in the embodiments.
  • the program code itself installed in a computer to implement the functional processing of the present invention using the computer implements the present invention.
  • the present invention includes the computer program itself for implementing the functional processing of the present invention.
  • the form of program is not particularly limited, and an object code, a program to be executed by an interpreter, script data to be supplied to an OS, and the like may be used as long as they have the functions of the program.
  • as a recording medium for supplying the program, the following media can be used.
  • a Floppy® disk, hard disk, optical disk, magneto-optical disk, MO, CD-ROM, CD-R, CD-RW, magnetic tape, nonvolatile memory card, ROM, DVD (DVD-ROM, DVD-R), and the like can be used.
  • the following method may be used.
  • the user establishes a connection to a home page on the Internet using a browser on a client computer, and downloads the computer program itself of the present invention (or a compressed file including an automatic installation function) from the home page onto a recording medium such as a hard disk or the like.
  • the program code that forms the program of the present invention may be segmented into a plurality of files, which may be downloaded from different home pages.
  • the present invention includes a WWW server which makes a plurality of users download a program file required to implement the functional processing of the present invention by the computer.
  • a storage medium such as a CD-ROM or the like, which stores the encrypted program of the present invention, may be delivered to the user, and the user who meets a predetermined condition may be allowed to download key information used to decrypt the encrypted program from a home page via the Internet. That is, the user can execute the encrypted program using the downloaded key information to install the program on a computer.
  • the functions of the aforementioned embodiments can be implemented when the computer executes the readout program. Furthermore, the functions of the aforementioned embodiments can be implemented when an OS or the like running on the computer executes some or all of actual processing operations on the basis of an instruction of that program.
  • the functions of the aforementioned embodiments can be implemented when the program read out from the recording medium is written in a memory equipped on a function expansion board or a function expansion unit, which is inserted in or connected to the computer, and is then executed. Therefore, a CPU equipped on the function expansion board or function expansion unit can execute some or all of actual processing operations based on the instruction of the program.
  • in an image processing apparatus which generates an input signal when an object contacts a display surface and makes a wireless communication with a contacting mobile information terminal, whether or not the contacting object is a mobile information terminal is discriminated, and an appropriate display can be made according to the discrimination result.

Landscapes

  • Engineering & Computer Science (AREA)
  • General Engineering & Computer Science (AREA)
  • Theoretical Computer Science (AREA)
  • Human Computer Interaction (AREA)
  • Multimedia (AREA)
  • Physics & Mathematics (AREA)
  • General Physics & Mathematics (AREA)
  • Signal Processing (AREA)
  • Computing Systems (AREA)
  • Mobile Radio Communication Systems (AREA)
  • Position Input By Displaying (AREA)
  • User Interface Of Digital Computer (AREA)
  • Accessory Devices And Overall Control Thereof (AREA)
US11/864,385 2006-10-16 2007-09-28 Image processing apparatus and control method thereof Abandoned US20080291283A1 (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
US14/844,933 US10318076B2 (en) 2006-10-16 2015-09-03 Image displaying apparatus with changed menu based on detection of mobile information terminal placed thereon

Applications Claiming Priority (4)

Application Number Priority Date Filing Date Title
JP2006-281738 2006-10-16
JP2006281738 2006-10-16
JP2007-044531 2007-02-23
JP2007044531A JP4933304B2 (ja) Image processing apparatus, control method thereof, and program

Related Child Applications (1)

Application Number Title Priority Date Filing Date
US14/844,933 Continuation US10318076B2 (en) 2006-10-16 2015-09-03 Image displaying apparatus with changed menu based on detection of mobile information terminal placed thereon

Publications (1)

Publication Number Publication Date
US20080291283A1 true US20080291283A1 (en) 2008-11-27

Family

ID=39508128

Family Applications (2)

Application Number Title Priority Date Filing Date
US11/864,385 Abandoned US20080291283A1 (en) 2006-10-16 2007-09-28 Image processing apparatus and control method thereof
US14/844,933 Active 2028-10-01 US10318076B2 (en) 2006-10-16 2015-09-03 Image displaying apparatus with changed menu based on detection of mobile information terminal placed thereon

Family Applications After (1)

Application Number Title Priority Date Filing Date
US14/844,933 Active 2028-10-01 US10318076B2 (en) 2006-10-16 2015-09-03 Image displaying apparatus with changed menu based on detection of mobile information terminal placed thereon

Country Status (2)

Country Link
US (2) US20080291283A1 (fr)
JP (1) JP4933304B2 (fr)

Cited By (35)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US20070088640A1 (en) * 2002-04-05 2007-04-19 Shogo Hyakutake System, computer program product and method for managing documents
US20100192101A1 (en) * 2009-01-29 2010-07-29 International Business Machines Corporation Displaying radial menus in a graphics container
US20100289951A1 (en) * 2009-05-12 2010-11-18 Ryu Jae-Kyung Synchronization method
US20110148754A1 (en) * 2009-12-22 2011-06-23 Canon Kabushiki Kaisha Projection apparatus, display apparatus, information processing apparatus, projection system and display system
US20120094716A1 (en) * 2010-10-15 2012-04-19 Reeves Paul E Mirrored remote peripheral interface
US20120242589A1 (en) * 2011-03-24 2012-09-27 University Of Lancaster Computer Interface Method
EP2237139A3 (fr) * 2009-04-01 2013-05-15 Samsung Electronics Co., Ltd. Procédé pour la fourniture d'une interface utilisateur graphique (GUI) et dispositif multimédia l'utilisant
US8683496B2 (en) 2010-10-01 2014-03-25 Z124 Cross-environment redirection
US8726294B2 (en) 2010-10-01 2014-05-13 Z124 Cross-environment communication using application space API
US8819705B2 (en) 2010-10-01 2014-08-26 Z124 User interaction support across cross-environment applications
US8842080B2 (en) 2010-10-01 2014-09-23 Z124 User interface with screen spanning icon morphing
US8868135B2 (en) 2011-09-27 2014-10-21 Z124 Orientation arbitration
US8898443B2 (en) 2010-10-01 2014-11-25 Z124 Multi-operating system
US8933949B2 (en) 2010-10-01 2015-01-13 Z124 User interaction across cross-environment applications through an extended graphics context
US8966379B2 (en) 2010-10-01 2015-02-24 Z124 Dynamic cross-environment application configuration/orientation in an active user environment
US9047102B2 (en) 2010-10-01 2015-06-02 Z124 Instant remote rendering
US20150199941A1 (en) * 2014-01-15 2015-07-16 Nokia Corporation 3d touch sensor reader
US9116490B2 (en) 2012-11-09 2015-08-25 Brother Kogyo Kabushiki Kaisha Image forming apparatus having communication board for near field communication
US20160170638A1 (en) * 2014-12-12 2016-06-16 Konica Minolta, Inc. Image processing apparatus, method for controlling the same, and storage medium
CN105829998A (zh) * 2013-12-12 2016-08-03 微软技术许可有限责任公司 将装置绑定到计算设备
EP2472373A4 (fr) * 2009-08-27 2016-08-17 Sony Corp Dispositif de traitement d'informations, procédé de traitement d'informations et programme
CN106063241A (zh) * 2014-03-10 2016-10-26 京瓷办公信息系统株式会社 图像形成装置
US9509873B2 (en) 2012-04-25 2016-11-29 Brother Kogyo Kabushiki Kaisha Image forming apparatus with communication device
EP3119073A1 (fr) * 2015-07-14 2017-01-18 Canon Kabushiki Kaisha Appareil de formation d'image
US20170070641A1 (en) * 2015-09-03 2017-03-09 Konica Minolta, Inc. Document processing device and communication control method therefor
US20170243094A1 (en) * 2016-02-18 2017-08-24 Seiko Epson Corporation Printing apparatus and control method for printing apparatus
US9768490B2 (en) 2014-03-12 2017-09-19 Brother Kogyo Kabushiki Kaisha Image forming apparatus having wireless communication device
JP2018028954A (ja) * 2017-11-28 2018-02-22 シャープ株式会社 画像表示方法
US9924050B2 (en) 2014-09-18 2018-03-20 Konica Minolta, Inc. Operation display apparatus, portable terminal, programs therefor, and operation display system
US10178250B2 (en) 2016-10-21 2019-01-08 Konica Minolta, Inc. Cooperation system, information processing apparatus, cooperation method and non-transitory computer-readable recording medium encoded with cooperation program
US10268935B2 (en) 2016-02-18 2019-04-23 Seiko Epson Corporation Tape printing apparatus and control method for tape printing apparatus
US10291793B2 (en) * 2011-11-22 2019-05-14 Sharp Kabushiki Kaisha Server apparatus providing portable information terminal and image forming apparatus with cloud image processing service
US20200021697A1 (en) * 2018-07-10 2020-01-16 Canon Kabushiki Kaisha Information processing apparatus, control method, and storage medium thereof
US11570502B2 (en) * 2018-05-01 2023-01-31 Telefonaktiebolaget Lm Ericsson (Publ) Providing personalized messages in adaptive streaming
US11811989B2 (en) 2012-04-25 2023-11-07 Brother Kogyo Kabushiki Kaisha Image forming apparatus including antenna in cover

Families Citing this family (27)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
JP4724191B2 (ja) 2008-02-12 2011-07-13 シャープ株式会社 原稿読取装置及び画像形成装置
JP5129669B2 (ja) * 2008-06-30 2013-01-30 キヤノン株式会社 画像形成装置及びその制御方法、画像供給装置及びその制御方法
JP2010136102A (ja) * 2008-12-04 2010-06-17 Canon Inc 無線通信装置およびその制御方法
JP5012933B2 (ja) 2010-02-26 2012-08-29 カシオ計算機株式会社 携帯端末及びプログラム
JP5570304B2 (ja) * 2010-06-02 2014-08-13 キヤノン株式会社 表示装置及びシステム
US8681362B2 (en) 2011-04-14 2014-03-25 Kabushiki Kaisha Toshiba Position detecting apparatus, position detecting method, and image forming apparatus
JP2013005120A (ja) * 2011-06-14 2013-01-07 Canon Inc 通信装置、通信装置の制御方法、及びプログラム
JP2013211670A (ja) * 2012-03-30 2013-10-10 Azbil Corp 光電スイッチ
JP5338940B2 (ja) * 2012-04-13 2013-11-13 カシオ計算機株式会社 表示制御装置及びプログラム
US9507979B2 (en) * 2012-04-27 2016-11-29 Cirque Corporation Saving power in a battery powered system having a touch sensor and an RFID tag reader
JP6067322B2 (ja) * 2012-10-24 2017-01-25 株式会社アイ・オー・データ機器 携帯端末、通信システムおよび携帯端末プログラム
JP5930448B2 (ja) * 2013-03-08 2016-06-08 株式会社ソニー・インタラクティブエンタテインメント 近傍無線通信用のrfidタグのリーダおよび近傍無線通信システム
JP5819892B2 (ja) * 2013-08-21 2015-11-24 レノボ・シンガポール・プライベート・リミテッド 電子ペンで入力する座標入力装置を備える電子機器、制御方法およびコンピュータ・プログラム
JP6188497B2 (ja) * 2013-09-03 2017-08-30 キヤノン株式会社 通信装置、通信装置の制御方法、及びコンピュータプログラム
JP6330279B2 (ja) 2013-09-18 2018-05-30 ソニー株式会社 情報処理装置、情報処理システム、情報処理方法、及びプログラム
JP5790821B2 (ja) * 2014-04-07 2015-10-07 カシオ計算機株式会社 情報読取装置及びプログラム
JP6369243B2 (ja) * 2014-09-08 2018-08-08 コニカミノルタ株式会社 操作表示装置、プログラム、操作表示システム
JP6435890B2 (ja) * 2015-01-30 2018-12-12 コニカミノルタ株式会社 操作表示システム、操作表示装置、プログラム
JP2016224628A (ja) * 2015-05-28 2016-12-28 トヨタ自動車株式会社 表示装置
WO2018066160A1 (fr) * 2016-10-05 2018-04-12 シャープ株式会社 Terminal de communication et procédé de commande de terminal de communication
JP6769243B2 (ja) 2016-11-02 2020-10-14 コニカミノルタ株式会社 連携システム、情報処理装置、連携方法および連携プログラム
JP2018114622A (ja) * 2017-01-16 2018-07-26 コニカミノルタ株式会社 情報処理装置、操作位置表示方法および操作位置表示プログラム
JP6377828B2 (ja) * 2017-11-02 2018-08-22 シャープ株式会社 画像表示装置、表示制御プログラムおよび表示制御方法
JP6495519B2 (ja) * 2018-07-25 2019-04-03 シャープ株式会社 画像表示装置、表示制御プログラムおよび表示制御方法
JP6654722B2 (ja) * 2019-03-08 2020-02-26 シャープ株式会社 画像表示装置および画像表示方法
JP7298224B2 (ja) * 2019-03-19 2023-06-27 株式会社リコー 表示装置、及び表示方法
JP6915178B2 (ja) * 2019-09-12 2021-08-04 キヤノン株式会社 プログラム及び通信端末

Citations (12)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US6127941A (en) * 1998-02-03 2000-10-03 Sony Corporation Remote control device with a graphical user interface
US20040117389A1 (en) * 2002-09-05 2004-06-17 Takashi Enami Image forming system that can output documents stored in remote apparatus
US20040248617A1 (en) * 2001-08-28 2004-12-09 Haruo Oba Information processing apparatus and method, and recording medium
US20050192048A1 (en) * 2000-10-20 2005-09-01 Raj Bridgelall Dual mode wireless data communications
US20060001645A1 (en) * 2004-06-30 2006-01-05 Microsoft Corporation Using a physical object to control an attribute of an interactive display application
US20060212936A1 (en) * 2005-03-16 2006-09-21 Audrius Berzanskis Method of integrating QKD with IPSec
US20060230192A1 (en) * 2005-03-29 2006-10-12 Travis Parry Display of a user interface
US20070008300A1 (en) * 2005-07-08 2007-01-11 Samsung Electronics Co., Ltd. Method and medium for variably arranging content menu and display device using the same
US7184391B2 (en) * 2001-05-24 2007-02-27 Samsung Electronics Co., Ltd. Optical recording medium on which multi-modulated header signals are recorded, apparatus and method of recording header signals and apparatus and method of reproducing header signals
US20070189737A1 (en) * 2005-10-11 2007-08-16 Apple Computer, Inc. Multimedia control center
US20070300307A1 (en) * 2006-06-23 2007-12-27 Microsoft Corporation Security Using Physical Objects
US7552402B2 (en) * 2006-06-22 2009-06-23 Microsoft Corporation Interface orientation using shadows

Family Cites Families (11)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
JPH11353432A (ja) * 1998-06-04 1999-12-24 Nippon Electric Ind Co Ltd Icカードリーダライタ
JP2001128246A (ja) 1999-10-29 2001-05-11 Toshiba Corp 通信システム、この通信システムで用いられる通信装置、及び通信方法
JP4332964B2 (ja) * 1999-12-21 2009-09-16 ソニー株式会社 情報入出力システム及び情報入出力方法
JP2003280815A (ja) * 2002-03-26 2003-10-02 Smkr & D Kk アンテナ付タッチパネル
US20040021698A1 (en) * 2002-08-05 2004-02-05 Baldwin Amanda K. Intuitive touchscreen interface for a multifunction device and method therefor
JP4546023B2 (ja) 2002-12-04 2010-09-15 キヤノン株式会社 情報処理装置及びその制御方法、プログラム並びに記憶媒体
JP2005252564A (ja) * 2004-03-03 2005-09-15 Fuji Xerox Co Ltd 画像形成装置、携帯端末載置ユニット、無線通信制御方法、およびプログラム
US7394459B2 (en) * 2004-04-29 2008-07-01 Microsoft Corporation Interaction between objects and a virtual environment display
JP4664665B2 (ja) * 2004-12-22 2011-04-06 オリンパスイメージング株式会社 デジタルプラットフォーム装置
JP2006195925A (ja) * 2005-01-17 2006-07-27 Nippon Signal Co Ltd:The タッチパネル装置
JP4550636B2 (ja) 2005-03-18 2010-09-22 富士通株式会社 電子機器、その登録方法及び登録プログラム


Cited By (72)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US20070094103A1 (en) * 2002-04-05 2007-04-26 Shogo Hyakutake System, managing computer program product and method for managing documents
US20070088640A1 (en) * 2002-04-05 2007-04-19 Shogo Hyakutake System, computer program product and method for managing documents
US8229811B2 (en) * 2002-04-05 2012-07-24 Ricoh Company, Ltd. System, computer program product and method for managing documents
US8239297B2 (en) * 2002-04-05 2012-08-07 Ricoh Americas Corporation System, managing computer program product and method for managing documents
US20100192101A1 (en) * 2009-01-29 2010-07-29 International Business Machines Corporation Displaying radial menus in a graphics container
EP2237139A3 (fr) * 2009-04-01 2013-05-15 Samsung Electronics Co., Ltd. Method for providing a graphical user interface (GUI) and multimedia device using the same
US8723970B2 (en) * 2009-05-12 2014-05-13 Samsung Electronics Co., Ltd. Synchronization method
US20100289951A1 (en) * 2009-05-12 2010-11-18 Ryu Jae-Kyung Synchronization method
EP2472373A4 (fr) * 2009-08-27 2016-08-17 Sony Corp Information processing device, information processing method, and program
US20110148754A1 (en) * 2009-12-22 2011-06-23 Canon Kabushiki Kaisha Projection apparatus, display apparatus, information processing apparatus, projection system and display system
US9049213B2 (en) 2010-10-01 2015-06-02 Z124 Cross-environment user interface mirroring using remote rendering
US8966379B2 (en) 2010-10-01 2015-02-24 Z124 Dynamic cross-environment application configuration/orientation in an active user environment
US8683496B2 (en) 2010-10-01 2014-03-25 Z124 Cross-environment redirection
US9152582B2 (en) 2010-10-01 2015-10-06 Z124 Auto-configuration of a docked system in a multi-OS environment
US8819705B2 (en) 2010-10-01 2014-08-26 Z124 User interaction support across cross-environment applications
US8842080B2 (en) 2010-10-01 2014-09-23 Z124 User interface with screen spanning icon morphing
US9727205B2 (en) 2010-10-01 2017-08-08 Z124 User interface with screen spanning icon morphing
US8898443B2 (en) 2010-10-01 2014-11-25 Z124 Multi-operating system
US8933949B2 (en) 2010-10-01 2015-01-13 Z124 User interaction across cross-environment applications through an extended graphics context
US8957905B2 (en) 2010-10-01 2015-02-17 Z124 Cross-environment user interface mirroring
US8963939B2 (en) 2010-10-01 2015-02-24 Z124 Extended graphics context with divided compositing
US8726294B2 (en) 2010-10-01 2014-05-13 Z124 Cross-environment communication using application space API
US9160796B2 (en) 2010-10-01 2015-10-13 Z124 Cross-environment application compatibility for single mobile computing device
US9026709B2 (en) 2010-10-01 2015-05-05 Z124 Auto-waking of a suspended OS in a dockable system
US9098437B2 (en) 2010-10-01 2015-08-04 Z124 Cross-environment communication framework
US9047102B2 (en) 2010-10-01 2015-06-02 Z124 Instant remote rendering
US9060006B2 (en) 2010-10-01 2015-06-16 Z124 Application mirroring using multiple graphics contexts
US9063798B2 (en) 2010-10-01 2015-06-23 Z124 Cross-environment communication using application space API
US9071625B2 (en) 2010-10-01 2015-06-30 Z124 Cross-environment event notification
US9077731B2 (en) 2010-10-01 2015-07-07 Z124 Extended graphics context with common compositing
US9405444B2 (en) 2010-10-01 2016-08-02 Z124 User interface with independent drawer control
US20120094716A1 (en) * 2010-10-15 2012-04-19 Reeves Paul E Mirrored remote peripheral interface
US8761831B2 (en) * 2010-10-15 2014-06-24 Z124 Mirrored remote peripheral interface
US20120242589A1 (en) * 2011-03-24 2012-09-27 University Of Lancaster Computer Interface Method
US9128660B2 (en) 2011-09-27 2015-09-08 Z124 Dual display pinyin touch input
US9152179B2 (en) 2011-09-27 2015-10-06 Z124 Portrait dual display and landscape dual display
US9104366B2 (en) 2011-09-27 2015-08-11 Z124 Separation of screen usage for complex language input
US9128659B2 (en) 2011-09-27 2015-09-08 Z124 Dual display cursive touch input
US8868135B2 (en) 2011-09-27 2014-10-21 Z124 Orientation arbitration
US8996073B2 (en) 2011-09-27 2015-03-31 Z124 Orientation arbitration
US10291793B2 (en) * 2011-11-22 2019-05-14 Sharp Kabushiki Kaisha Server apparatus providing portable information terminal and image forming apparatus with cloud image processing service
US9509873B2 (en) 2012-04-25 2016-11-29 Brother Kogyo Kabushiki Kaisha Image forming apparatus with communication device
US9900453B2 (en) 2012-04-25 2018-02-20 Brother Kogyo Kabushiki Kaisha Image forming apparatus
US11811989B2 (en) 2012-04-25 2023-11-07 Brother Kogyo Kabushiki Kaisha Image forming apparatus including antenna in cover
US10778857B2 (en) 2012-04-25 2020-09-15 Brother Kogyo Kabushiki Kaisha Image forming apparatus
US9360823B2 (en) 2012-11-09 2016-06-07 Brother Kogyo Kabushiki Kaisha Image forming apparatus having communication board for near field communication
US9116490B2 (en) 2012-11-09 2015-08-25 Brother Kogyo Kabushiki Kaisha Image forming apparatus having communication board for near field communication
US9757938B2 (en) 2012-11-09 2017-09-12 Brother Kogyo Kabushiki Kaisha Image forming apparatus
CN105829998A (zh) * 2013-12-12 2016-08-03 Microsoft Technology Licensing LLC Binding a device to a computing device
US20150199941A1 (en) * 2014-01-15 2015-07-16 Nokia Corporation 3d touch sensor reader
CN106063241A (zh) * 2014-03-10 2016-10-26 Kyocera Document Solutions Inc Image forming apparatus
US20170019549A1 (en) * 2014-03-10 2017-01-19 Kyocera Document Solutions Inc. Image forming device
US9924057B2 (en) * 2014-03-10 2018-03-20 Kyocera Document Solutions Inc. Image forming device that can be operated from a terminal device
US9768490B2 (en) 2014-03-12 2017-09-19 Brother Kogyo Kabushiki Kaisha Image forming apparatus having wireless communication device
US10177436B2 (en) 2014-03-12 2019-01-08 Brother Kogyo Kabushiki Kaisha Image forming apparatus having wireless communication device
US9924050B2 (en) 2014-09-18 2018-03-20 Konica Minolta, Inc. Operation display apparatus, portable terminal, programs therefor, and operation display system
US10628035B2 (en) * 2014-12-12 2020-04-21 Konica Minolta, Inc. Image processing apparatus, method for controlling the same, and storage medium
US20160170638A1 (en) * 2014-12-12 2016-06-16 Konica Minolta, Inc. Image processing apparatus, method for controlling the same, and storage medium
EP3119073A1 (fr) * 2015-07-14 2017-01-18 Canon Kabushiki Kaisha Image forming apparatus
US10678174B2 (en) 2015-07-14 2020-06-09 Canon Kabushiki Kaisha Image forming apparatus
US9817351B2 (en) 2015-07-14 2017-11-14 Canon Kabushiki Kaisha Image forming apparatus
US20170070641A1 (en) * 2015-09-03 2017-03-09 Konica Minolta, Inc. Document processing device and communication control method therefor
US10003717B2 (en) * 2015-09-03 2018-06-19 Konica Minolta, Inc. Document processing device and communication control method considering operation information
CN107089063A (zh) * 2016-02-18 2017-08-25 Seiko Epson Corp Printing apparatus and control method of printing apparatus
US10268935B2 (en) 2016-02-18 2019-04-23 Seiko Epson Corporation Tape printing apparatus and control method for tape printing apparatus
US10185905B2 (en) * 2016-02-18 2019-01-22 Seiko Epson Corporation Printing apparatus and control method for printing apparatus
US20170243094A1 (en) * 2016-02-18 2017-08-24 Seiko Epson Corporation Printing apparatus and control method for printing apparatus
US10178250B2 (en) 2016-10-21 2019-01-08 Konica Minolta, Inc. Cooperation system, information processing apparatus, cooperation method and non-transitory computer-readable recording medium encoded with cooperation program
JP2018028954A (ja) * 2017-11-28 2018-02-22 Sharp Corp Image display method
US11570502B2 (en) * 2018-05-01 2023-01-31 Telefonaktiebolaget Lm Ericsson (Publ) Providing personalized messages in adaptive streaming
US20200021697A1 (en) * 2018-07-10 2020-01-16 Canon Kabushiki Kaisha Information processing apparatus, control method, and storage medium thereof
US11039023B2 (en) * 2018-07-10 2021-06-15 Canon Kabushiki Kaisha Information processing apparatus, control method, and storage medium thereof

Also Published As

Publication number Publication date
US10318076B2 (en) 2019-06-11
US20150378516A1 (en) 2015-12-31
JP2008123476A (ja) 2008-05-29
JP4933304B2 (ja) 2012-05-16

Similar Documents

Publication Publication Date Title
US10318076B2 (en) Image displaying apparatus with changed menu based on detection of mobile information terminal placed thereon
KR101865048B1 (ko) Communication apparatus, control method, and storage medium
US9344587B2 (en) Image processing system, information processing apparatus, image processing apparatus, control method therefor, and computer program
US10142510B2 (en) Print control apparatus and control method thereof
JP6650004B2 (ja) Communication system, program, and communication method
CN106060304B (zh) Image processing apparatus and control method of image processing apparatus
JP5962240B2 (ja) Image processing apparatus, screen information providing method, and program
US8547574B2 (en) Information processing apparatus and method for wireless communication with other information processing apparatuses
KR20190017672A (ko) Communication device and printer
EP2637092A2 (fr) Information processing apparatus, method of controlling information processing apparatus, and program therefor
JP2017108341A (ja) Multi-function peripheral device, control method of multi-function peripheral device, mobile terminal, control method of mobile terminal, and program
JP7192058B2 (ja) Image processing apparatus, control method of image processing apparatus, and program
KR20170131252A (ko) Communication apparatus and control method thereof
JP2007310865A (ja) Information processing apparatus, information processing method, control program for causing a computer to implement the information processing method, and computer-readable recording medium on which the control program is recorded
JP2007520942A (ja) Proximity detection for short-range communication
JP2023112032A (ja) Communication device, control method, and program
US20190306335A1 (en) Non-transitory computer-readable recording medium storing computer-readable instructions for communication device and communication device
JP2006173946A (ja) Wireless communication system
JP2006173949A (ja) Document processing system
JP2021013169A (ja) Information processing apparatus, control method of information processing apparatus, and program
KR20080054664A (ko) Image forming apparatus, wireless communication system, and received message notification method thereof
JPH08116376A (ja) Data communication device

Legal Events

Date Code Title Description
AS Assignment

Owner name: CANON KABUSHIKI KAISHA, JAPAN

Free format text: ASSIGNMENT OF ASSIGNORS INTEREST;ASSIGNORS:ACHIWA, KEN;KINBARA, HIDEYASU;HASUI, SHIGEKI;REEL/FRAME:019997/0319

Effective date: 20070927

STCB Information on status: application discontinuation

Free format text: ABANDONED -- FAILURE TO RESPOND TO AN OFFICE ACTION