US9060085B2 - Image forming apparatus, electronic mail delivery server, and information processing apparatus - Google Patents

Image forming apparatus, electronic mail delivery server, and information processing apparatus

Info

Publication number
US9060085B2
Authority
US
United States
Prior art keywords
data
data element
information
display unit
image
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Expired - Fee Related, expires
Application number
US13/470,575
Other versions
US20120224232A1
Inventor
Fabrice Matulic
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Ricoh Co Ltd
Original Assignee
Ricoh Co Ltd
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Ricoh Co Ltd
Priority to US13/470,575
Publication of US20120224232A1
Application granted
Publication of US9060085B2
Expired - Fee Related
Adjusted expiration

Classifications

    • H ELECTRICITY
    • H04 ELECTRIC COMMUNICATION TECHNIQUE
    • H04N PICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N1/00 Scanning, transmission or reproduction of documents or the like, e.g. facsimile transmission; Details thereof
    • H04N1/0035 User-machine interface; Control console
    • H04N1/00405 Output means
    • H04N1/00408 Display of information to the user, e.g. menus
    • H04N1/0044 Display of information to the user, e.g. menus for image preview or review, e.g. to help the user position a sheet
    • G06K9/00456
    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06V IMAGE OR VIDEO RECOGNITION OR UNDERSTANDING
    • G06V30/00 Character recognition; Recognising digital ink; Document-oriented image-based pattern recognition
    • G06V30/40 Document-oriented image-based pattern recognition
    • G06V30/41 Analysis of document content
    • G06V30/413 Classification of content, e.g. text, photographs or tables
    • H ELECTRICITY
    • H04 ELECTRIC COMMUNICATION TECHNIQUE
    • H04N PICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N2201/00 Indexing scheme relating to scanning, transmission or reproduction of documents or the like, and to details thereof
    • H04N2201/0077 Types of the still picture apparatus
    • H04N2201/0094 Multifunctional device, i.e. a device capable of all of reading, reproducing, copying, facsimile transception, file transception

Definitions

  • the present invention relates to an image forming apparatus, an electronic mail delivery server, and an information processing apparatus.
  • Japanese Patent Application Laid-Open No. 2005-244411 discloses a technology for automatically transmitting a document, which is read by a scanner, to a destination which is obtained by optical character recognition (OCR) process.
  • text information and image information are distinguished in a read document to acquire a destination and body information from the text information, and the image information is transmitted by electronic mail (hereinafter, “email”) to the destination as an attached file.
  • the conventional technology, however, requires a plurality of processes: a document read by the scanner is subjected to a character recognition process and then to a separate process for email transmission. This makes the operator's work more complicated. In other words, it would be convenient for users if each document element, i.e., an element that forms the input document information, were extracted from the document information and every newly extracted document element could always be displayed on an information terminal.
  • an image forming apparatus includes an analysis unit that analyzes data to extract a data element and, upon identifying the attribute of the data element as text, performs character recognition on the data element; a display unit that displays the data element and receives input; and a communication unit that transmits the data element selected by the input to an information processing apparatus through a network.
  • an electronic mail delivery server includes a receiving unit that receives an electronic mail containing image information; an analysis unit that analyzes the electronic mail to extract a data element and, upon identifying the attribute of the data element as text, performs character recognition on the data element; and a transmitting unit that transmits the data element to a destination of the electronic mail through a network.
  • an information processing apparatus includes a communication unit that receives, from an image forming apparatus connected through a network, a data element that constitutes data; a display unit that displays the data element as a symbol at a predetermined position on a screen and receives a command to select and place the symbol on a window of an application; and an information processor that enters the data element corresponding to the symbol into the application in response to the command.
  • FIG. 1 is a functional block diagram of an information delivery system according to a first embodiment of the present invention
  • FIG. 2 is a functional block diagram of an analysis unit shown in FIG. 1 ;
  • FIG. 3 is a functional block diagram of an area recognition unit shown in FIG. 2 ;
  • FIG. 4 is a functional block diagram of a character analyzer shown in FIG. 3 ;
  • FIGS. 5 and 6 are examples of a display screen of a first operation display unit shown in FIG. 1 ;
  • FIG. 7 is one example of a display screen of a second operation display unit shown in FIG. 1 ;
  • FIG. 8 is a flowchart of the operation of the information delivery system shown in FIG. 1 ;
  • FIG. 9 is a functional block diagram of an information delivery system according to a second embodiment of the present invention.
  • FIG. 10 is a flowchart of the operation of the information delivery system shown in FIG. 9 ;
  • FIG. 11 is a functional block diagram of an information delivery system according to a third embodiment of the present invention.
  • FIG. 12 is one example of an editing mode screen displayed in an information processing apparatus shown in FIG. 11 ;
  • FIG. 13 is a flowchart of the operation of the information delivery system shown in FIG. 11 ;
  • FIG. 14A is a functional block diagram of an information delivery system according to a fourth embodiment of the present invention.
  • FIG. 14B is one example of a display screen of a second operation display unit shown in FIG. 14A ;
  • FIG. 15 is a flowchart of the operation of the information delivery system shown in FIG. 14A ;
  • FIG. 16 is a flowchart of the operation of an information delivery system according to a modification of the fourth embodiment
  • FIG. 17 is a functional block diagram of an information delivery system according to a fifth embodiment of the present invention.
  • FIG. 18 is a block diagram of a hardware configuration of the image forming apparatus according to the embodiments.
  • FIG. 19 is a block diagram of a hardware configuration of the information processing apparatus according to the embodiments.
  • FIG. 1 is a functional block diagram of an information delivery system according to the first embodiment of the present invention.
  • the information delivery system includes an image forming apparatus 110 and an information processing apparatus 200 connected to each other through a network 3 .
  • the image forming apparatus 110 analyzes input data to extract therefrom a data element, i.e., an element that constitutes the data, and displays a preview of the extracted data element.
  • the image forming apparatus 110 receives a command input through the preview, and transmits information on the selected data element to the information processing apparatus 200 through the network 3 .
  • the information processing apparatus 200 receives the data element information from the image forming apparatus 110 through the network 3 , and displays the data element information in a contents bar (widget) arranged in a rectangular shape at a predetermined position on its screen using an icon as a symbol.
  • the information processing apparatus 200 displays the window of an active (open) application. When the displayed icon is selected and a command is input to bring (place) the selected icon on the application window, the information processing apparatus 200 enters the data element information indicated by the icon into the application running thereon according to the command, and displays the data element information.
  • the image forming apparatus 110 extracts each data element from input data and displays extracted data elements.
  • the selected data element is transmitted to the information processing apparatus 200 and displayed on the screen.
  • the data element can be used in another application active on the screen. This makes it possible to provide a highly usable information delivery system in which the data element extracted by the image forming apparatus 110 from the input data can be displayed in the information processing apparatus 200 and used in another application.
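  • As a concrete illustration of what might travel over the network in this exchange, the following Python sketch models a data element and a delivery message as simple records. The structure is an assumption made for illustration only; the patent does not specify a wire format, and the names DataElement, DeliveryMessage, bbox, and thumbnail do not appear in it.

```python
from dataclasses import dataclass, field
from enum import Enum


class Attribute(Enum):
    # Attribute categories named in the description (text, photographic,
    # graphic, other); the enum itself is an illustrative assumption.
    TEXT = "text"
    PHOTO = "photographic_image"
    GRAPHIC = "graphic_image"
    OTHER = "other"


@dataclass
class DataElement:
    """One element extracted from the input data (hypothetical structure)."""
    attribute: Attribute            # result of the attribute analysis
    payload: bytes                  # image bytes, or UTF-8 text after character recognition
    bbox: tuple = (0, 0, 0, 0)      # position in the source page (x, y, width, height)
    thumbnail: bytes = b""          # reduced preview usable as an icon image


@dataclass
class DeliveryMessage:
    """What the communication unit might transmit to the information processing apparatus."""
    source_device: str                            # e.g. an identifier of the image forming apparatus
    elements: list = field(default_factory=list)  # selected DataElement instances
```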
  • the image forming apparatus 110 includes a scanner 1 , an input processor 2 , an information delivery apparatus 100 , an output processor 4 , and an output unit 5 .
  • the information delivery apparatus 100 extracts a data element from an image read by the scanner 1 and transmits the extracted data element to the information processing apparatus 200 .
  • the information delivery apparatus 100 includes a first operation display unit 11 , an analysis unit 12 , a storage unit 13 , a first display controller 14 , and a first communication unit 15 .
  • the scanner 1 reads data.
  • the input processor 2 converts the analog data read by the scanner 1 to digital data, and transmits the converted image data to the information delivery apparatus 100 .
  • the first communication unit 15 can also receive data through the network 3 .
  • the information delivery apparatus 100 can also input thereto data stored in any one of various recording media through an input unit (not shown).
  • the storage unit 13 stores therein input image data and various types of digital data subjected to processing.
  • the output processor 4 subjects the image data for the read image to output processing.
  • the output unit 5 outputs an image of the image data subjected to the output processing.
  • the image output is a function of the image forming apparatus.
  • the image forming apparatus inputs thereto electronic data through the scanner 1 or the network 3 .
  • the analysis unit 12 analyzes an attribute of input data and extracts a data element that constitutes data, from the data.
  • the first display controller 14 displays the extracted data element on the first operation display unit 11 .
  • the first operation display unit 11 receives selection and editing inputs from an operator, performed by touching the data elements displayed thereon.
  • the first operation display unit 11 can be formed as a so-called touch panel having a liquid-crystal display function.
  • the first communication unit 15 transmits the received data element to the information processing apparatus 200 through the network 3 .
  • the analysis unit 12 analyzes which attribute, such as text, halftone image, graphic image, or any other image type, each portion of the input data corresponds to, and divides the data into areas by attribute. Each of the areas divided for each attribute is a data element.
  • FIG. 2 is a functional block diagram of the analysis unit 12 .
  • the analysis unit 12 includes an area recognition unit 121 and an attribute analyzer 122 .
  • the area recognition unit 121 recognizes each area of data. For example, the area recognition unit 121 recognizes a text attribute for each paragraph, and recognizes a photographic image attribute and a graphic image attribute for each photographic image and each graphic image, respectively.
  • the area recognition unit 121 divides the data into areas based on the result of recognition.
  • the area recognition unit 121 also recognizes each area of the data using well-known techniques such as detecting changes in density, detecting edge portions, and detecting halftone areas.
  • the area recognition unit 121 detects that portions determined to be characters continue, and thereby recognizes these portions as a text area.
  • the area recognition unit 121 detects that halftone pixels continue, and can thereby recognize these portions as a photographic image area.
  • the area recognition unit 121 detects edge portions and a sharp contrast between dark and light colors, and can thereby recognize these portions as a graphic image area.
  • the area recognition unit 121 treats any remaining portion as being neither text, photographic image, nor graphic image, and subjects these portions to the area recognition process.
  • the attribute analyzer 122 analyzes which attribute, such as text, photographic image, halftone image, graphic image, or any other image type, each area recognized by the area recognition unit 121 corresponds to. As a result, the data is divided into areas by attribute, and each of these areas is generated as a data element.
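  • As a rough illustration of the kind of rule the area recognition unit 121 and the attribute analyzer 122 could apply, the sketch below classifies a grayscale block by its halftone ratio, edge strength, and dark-pixel density. The thresholds and the helper name classify_region are assumptions for illustration and are not taken from the patent.

```python
import numpy as np


def classify_region(block: np.ndarray,
                    halftone_thresh: float = 0.4,
                    edge_thresh: float = 0.15) -> str:
    """Guess the attribute of a grayscale block (pixel values 0..255).

    Heuristic only: many continuous mid-tone pixels suggest a photograph,
    strong edges with high contrast suggest a graphic, and sparse dark
    marks on a light background suggest text.
    """
    norm = block.astype(float) / 255.0
    # Fraction of mid-tone ("halftone") pixels, typical of continuous-tone photos.
    halftone_ratio = np.mean((norm > 0.2) & (norm < 0.8))
    # Crude edge measure: mean absolute gradient in both directions.
    edges = (np.mean(np.abs(np.diff(norm, axis=0))) +
             np.mean(np.abs(np.diff(norm, axis=1))))
    dark_fraction = np.mean(norm < 0.5)

    if halftone_ratio > halftone_thresh:
        return "photographic_image"
    if edges > edge_thresh and norm.std() > 0.3:
        return "graphic_image"
    if dark_fraction < 0.3:
        return "text"
    return "other"
```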
  • FIG. 3 is a functional block diagram of the area recognition unit 121 .
  • the area recognition unit 121 includes a layout analyzer 1211 and a character analyzer 1212 .
  • the layout analyzer 1211 subjects the data element extracted by the area recognition unit 121 and the attribute analyzer 122 to a process for analyzing a layout structure of the data.
  • the layout analysis performed by the layout analyzer 1211 applies detection information, such as edge detection, halftone detection, and pixel position detection, to a well-known algorithm in order to analyze the layout of the divided areas. Alternatively, because the data elements have already been extracted, the extracted data elements themselves can be applied to the known algorithm to perform the layout analysis.
  • the character analyzer 1212 subjects the data element, which is analyzed as the text attribute, to character recognition process.
  • FIG. 4 is a functional block diagram of the character analyzer 1212 .
  • the character analyzer 1212 converts a character element to a machine-processable format.
  • the character analyzer 1212 includes an optical character recognition (OCR) unit 12121 , a spline-to-vector converter 12122 , and a character extracting unit 12123 .
  • the character extracting unit 12123 extracts a character by contrast detection.
  • the spline-to-vector converter 12122 subjects the extracted character to spline shaping.
  • the OCR unit 12121 optically reads a character read as an image to perform character recognition. The character analysis is possible by applying a known algorithm thereto.
  • the character analyzer 1212 converts a character element to a machine-processable format, and this enables the editing to be accurate and quick.
  • the OCR unit 12121 extracts a character image as text information, and so, efficient and highly user-friendly display and editing become possible.
  • the character can be more smoothly displayed by the spline-to-vector converter 12122 .
  • the character analyzer 1212 includes the character extracting unit 12123 , which makes it possible to accurately clip characters and perform character analysis.
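  • A minimal sketch of how the three sub-units of the character analyzer 1212 could be chained is shown below. It assumes pytesseract as the OCR backend (the patent only requires an OCR unit, not any particular engine), and the spline-to-vector step is merely stubbed because the patent does not define its algorithm.

```python
from dataclasses import dataclass

from PIL import Image
import pytesseract  # assumed OCR backend; any OCR engine could play the role of unit 12121


@dataclass
class CharacterElement:
    text: str    # machine-processable text produced by the OCR step
    box: tuple   # (left, top, width, height) clipped by the character extracting step


def extract_characters(region: Image.Image, min_confidence: int = 60) -> list:
    """Clip character areas and recognize them (character extracting unit + OCR unit)."""
    data = pytesseract.image_to_data(region, output_type=pytesseract.Output.DICT)
    elements = []
    for txt, conf, left, top, width, height in zip(
            data["text"], data["conf"], data["left"],
            data["top"], data["width"], data["height"]):
        if txt.strip() and int(float(conf)) >= min_confidence:
            elements.append(CharacterElement(text=txt, box=(left, top, width, height)))
    return elements


def to_vector_outline(glyph: Image.Image):
    """Placeholder for the spline-to-vector converter 12122; the shaping algorithm is not specified."""
    raise NotImplementedError("spline shaping is left unspecified in the description")
```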
  • the analysis unit 12 can also determine the attribute of a divided area based on an attribute set by the operator and received through the first operation display unit 11 . For example, the operator may determine from the displayed screen that an area analyzed and displayed by the analysis unit 12 as a text image should actually be a photographic image, and may provide input to display it as a photographic image. In this case, the analysis unit 12 accepts the attribute determined as a photographic image. With this feature, the attribute of a data element, i.e., a divided area, can be changed or determined through operator input.
  • the first operation display unit 11 receives an input as a touch thereon with a finger, a stylus pen, or any other touching tool.
  • the first operation display unit 11 detects a touch at each point on a display screen and accepts the touch as input, using a known technology such as a resistive system that detects a change in resistance caused by the pressure of the finger or pen, or a capacitive system (analog capacitive coupling scheme).
  • an example of input by touching the first operation display unit 11 is explained below, but the input type is not limited to touch input; various input systems, including a mouse, a keyboard, and a press switch, can be used.
  • FIG. 5 is one example of a display screen of the first operation display unit 11 .
  • An operation display screen 500 of the first operation display unit 11 includes an operation unit 510 and a display unit 530 .
  • Numerals, edit items, and Start for image output can be input through the operation unit 510 .
  • the display unit 530 displays thereon a preview image 532 and an element image 534 .
  • the preview image 532 represents input data
  • the element image 534 is an element image selected from data elements extracted by the analysis unit 12 .
  • the first display controller 14 displays the input image data and the data element extracted by the analysis unit 12 on the first operation display unit 11 in the form as shown in FIG. 5 .
  • the data element is selected from the preview image 532 through a copying operation by touch input using the display unit 530 as an input interface, and displayed in element image mode as the element image 534 .
  • the selection is possible by touch input using a drag-and-drop operation.
  • the selection is also possible by touch input using a copy-and-paste operation.
  • FIG. 6 is another example of a display screen of the first operation display unit 11 .
  • a screen 600 is displayed by being divided into a preview screen 620 and an element image display 650 .
  • the preview screen 620 is divided into data elements and displayed.
  • the element image display is also divided based on data elements.
  • Data elements 622 to 625 and 631 to 634 are now selected by the operator, and are set in the element image display 650 provided on the right side in the element image mode using the drag-and-drop operation.
  • the data elements 622 to 625 are selected from a data element 621 and the data elements 622 to 625 which are text elements in the preview screen 620 , and are copied to data elements 642 to 645 to be set in the element image display 650 on the right side.
  • the data elements 631 to 634 which are photographic images are selected by the operator, and are copied to data elements 651 to 654 which are the photographic images to be set in the element image display 650 on the right side using the drag-and-drop operation.
  • the first operation display unit 11 can be configured to receive input for various settings, including edit settings and print settings, through the two-mode display.
  • the first display controller 14 can also edit a data element into appropriate element information. For example, image information can be reduced according to the communication function.
  • for a photographic image, a thumbnail image is created and can be used as the icon image for that photographic image. When the element is actually transmitted, the original element image information, not the thumbnail image, is transmitted.
  • when a data element extracted as characters is inserted into the element image mode on the right side by touch input using the drag-and-drop operation, it is inserted, for each element, not as character image information but as text information that has already been subjected to character analysis.
  • the first display controller 14 associates the display of the data elements divided by the analysis unit 12 in the preview image with their display in the element image mode, and stores the association in the storage unit 13 . With this association, an editing input made in the element image mode can be applied to the corresponding original data element.
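  • One way to picture the association kept by the first display controller 14 in the storage unit 13 is a mapping from each entry shown in the element image mode back to its original data element, so that edits made in the element image mode reach the source. The registry below is a sketch under that assumption; its class and method names are hypothetical.

```python
class ElementAssociation:
    """Hypothetical registry linking element-image entries to their original data elements."""

    def __init__(self):
        self._by_display_id = {}   # id of the entry in the element image mode -> original element

    def register(self, display_id, original_element):
        self._by_display_id[display_id] = original_element

    def apply_edit(self, display_id, edit):
        # An editing input received in the element image mode is forwarded
        # to the corresponding original data element.
        edit(self._by_display_id[display_id])

    def remove(self, display_id):
        # e.g. the deletion of a data element from the element image display
        self._by_display_id.pop(display_id, None)
```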
  • the first operation display unit 11 also receives inputs from the operator such as a destination, an attribute of a data element, and deletion or addition of a data element in the element image mode.
  • the operation input through the first operation display unit 11 can be performed by having the operator select an item, through touch input, from a pull-down menu displayed on the first operation display unit 11 .
  • the pull-down menu can include, for example, Add, Shift, Delete, Enlarge/Reduce, and Input of date (not shown). Alternatively, these menu items can be made to appear with a right-click operation.
  • the first operation display unit 11 receives an input to delete the data element 652 in the element image mode, and the first display controller 14 newly displays a status (not shown) without the data element 652 in the element image display 650 of the first operation display unit 11 .
  • the first operation display unit 11 can also accept a drag operation on a displayed data element using a stylus pen or the like, in addition to touch input with a finger, because the stylus pen allows a finer operation. Further, in addition to the drag operation, a data element can be inserted into the element image display and displayed by displaying a predetermined menu on the first operation display unit 11 and selecting an item from the menu by touch input.
  • the first communication unit 15 transmits information for the data element displayed in the element image display 650 of the first operation display unit 11 , to the information processing apparatus 200 through the network 3 .
  • with the image forming apparatus, the operator can select a data element while viewing a preview image on the first operation display unit 11 , and the selected data element is automatically transmitted to the information processing apparatus.
  • the image forming apparatus can also transmit display information in either the preview image mode or the element image mode or in both modes displayed by the first operation display unit 11 , to the output processor 4 , and can output an image through the output unit 5 .
  • Output of the data edited in the information delivery apparatus can be implemented by some other means according to various functions of the image forming apparatus such as an output using a facsimile transmission function.
  • the information processing apparatus 200 receives, through the network, the information for the data element extracted and selected in the image forming apparatus, displays the information on its screen, and enters it into another application running on that screen, which allows the information to be used. In this manner, the data element extracted from the input data by the image forming apparatus is displayed in the information processing apparatus. Therefore, by placing the displayed data element on the window of an application running on the information processing apparatus, the operator can use the data element in that application.
  • the information processing apparatus 200 includes a second communication unit 61 , an information processor 62 , an editing unit 63 , a second display controller 64 , and a second operation display unit 65 ( FIG. 1 ).
  • the second communication unit 61 receives the data element information transmitted by the information delivery apparatus 100 through the network 3 .
  • the second display controller 64 displays the received data element information in a contents bar using an icon on the second operation display unit 65 .
  • the second operation display unit 65 displays, when an application such as Word (TM) has started, the screen or window of the application and receives an instruction input to select the displayed icon to be placed on the application screen or window.
  • the information processor 62 enters the data element information indicated by the icon into the application according to the instruction input received through the second operation display unit 65 .
  • when an editing input is received, the editing unit 63 applies the received editing to the data element information selected through the second operation display unit 65 .
  • the second display controller 64 displays the data element information subjected to the editing on the second operation display unit 65 .
  • FIG. 7 is one example of a display screen of the second operation display unit 65 .
  • the mechanical configuration of the second operation display unit 65 is basically the same as the first operation display unit 11 .
  • the second operation display unit 65 receives touch input in the same manner as explained for the first operation display unit 11 .
  • Another application screen 710 and a contents bar 750 that displays data elements in the form of being set therein are displayed on a screen 700 of the second operation display unit 65 .
  • Displayed in the contents bar 750 are, for example, a text image 751 and an image 752 , and the operator inserts the image 752 in an editing screen of a word processor displayed on the left side by touch input using the drag-and-drop operation indicated by arrow 760 .
  • An image 712 is a resulting image after the insertion. Not only the image 752 but the text image 751 as text information can be inserted therein.
  • the information processor 62 converts the image 752 selected by the drag-and-drop operation to a data format so that the data can be inserted in the word processor.
  • the converted image is inserted into a location specified by touch input to place the image thereon.
  • the operator selects an editing item on the second operation display unit 65 and inputs it. The editing items can be displayed in a pull-down menu or made to appear with a right-click operation.
  • the editing unit 63 subjects the data element information to the editing according to the editing item which is selected and input.
  • the second display controller 64 again displays the data element information subjected to the editing in the contents bar 750 .
  • the data element image can be updated with the edited data element information and the updated data element image can be displayed, or the image before the editing can be displayed in such a manner that the images overlap each other. It is also possible to display another contents bar (not shown) in which the data element image after being edited is set.
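  • The conversion performed by the information processor 62 before insertion can be thought of as a dispatch on the element attribute and on the formats the target application accepts. The sketch below illustrates that idea only; the format names and the attribute strings are assumptions, not values defined by the patent.

```python
def convert_for_insertion(element, target_formats: set):
    """Pick a representation of the dropped element that the target application accepts.

    `element` is assumed to expose .attribute ("text" or an image type) and .payload (bytes).
    """
    if element.attribute == "text":
        if "plain_text" in target_formats:
            # Text elements are handed over as already-recognized text, not as an image.
            return ("plain_text", element.payload.decode("utf-8"))
        raise ValueError("application does not accept text input")
    # Image-like attributes: hand over the original element image, not the thumbnail icon.
    for fmt in ("png", "jpeg", "bmp"):
        if fmt in target_formats:
            return (fmt, element.payload)
    raise ValueError("no image format in common with the application")
```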
  • FIG. 8 is a flowchart of the operation of the information delivery system according to the first embodiment.
  • the scanner 1 reads data
  • the input processor 2 converts the read analog data to digital data
  • the information delivery apparatus 100 receives the digital data.
  • the analysis unit 12 analyzes the read image data.
  • the analysis unit 12 analyzes an attribute of data based on recognition of an area performed by the area recognition unit 121 and based on analysis of the attribute by the attribute analyzer 122 , to extract a data element.
  • the layout analyzer 1211 analyzes layout and the character analyzer 1212 analyzes text information.
  • the character analyzer 1212 analyzes a character by using an OCR process and a character extracting process.
  • the analysis unit 12 analyzes the attribute information and extracts a data element (step S 101 ).
  • the first display controller 14 displays the preview screen in the preview image mode (see 620 in FIG. 6 ), in which the input data elements appear, and the element image display (see 650 in FIG. 6 ) in the element image mode, in which selected data elements are displayed.
  • the first display controller 14 generates display data for each analyzed data element and displays it in the preview image mode (step S 102 ).
  • the element image display initially represents a newly created screen.
  • the first operation display unit 11 displays the preview screen 620 and the element image display 650 in parallel for each data element generated by the first display controller 14 .
  • the first operation display unit 11 determines whether a selection input is received for the data element in the preview screen 620 (step S 103 ). When the operator selects the data element and sends an input to set the data element in the element image display 650 by using the drag-and-drop operation (YES at step S 103 ), the first operation display unit 11 receives the input.
  • the first operation display unit 11 selects a data element in the preview screen 620 for each data element generated by the first display controller 14 to set the selected data element in the element image display 650 by the drag-and-drop operation ( FIG. 6 ).
  • a new screen can also be configured to set all the data elements displayed in the preview screen 620 , in the element image display. In this case, during the selection operation by the operator, any unnecessary portion is deleted as required.
  • the data elements in both modes are associated with each other by the first display controller 14 .
  • the first communication unit 15 transmits information for the selected data element to the information processing apparatus 200 through the network 3 (step S 104 ).
  • the second communication unit 61 detects whether the data element information has been received (step S 105 ).
  • the second display controller 64 displays an icon of the data element being set in the contents bar (widget) 750 ( FIG. 7 ) in the second operation display unit 65 (step S 106 ).
  • the second operation display unit 65 detects whether the icon has been selected through the display screen (step S 107 ). When the icon has been selected (YES at step S 107 ), the second operation display unit 65 further detects whether an instruction input to place the selected icon on the screen or window of an active application has been received (step S 108 ). In other words, it is detected whether an input indicating a drag-and-drop operation, such that the data element displayed in the contents bar 750 is dragged and dropped onto a screen of another application running thereon, has been received.
  • the information processor 62 enters the data element information corresponding to the icon into the application according to the instruction input (step S 109 ).
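  • Read as pseudocode, steps S 101 to S 109 of FIG. 8 amount to the following single pass through the system. The function and method names are placeholders standing in for the units described above and do not appear in the patent.

```python
def run_delivery_cycle(scanner, analysis_unit, display_unit, comm_unit, info_apparatus):
    """Hypothetical single pass through steps S101-S109 of FIG. 8."""
    page = scanner.read()                                    # data input
    elements = analysis_unit.extract_elements(page)          # S101: analyze attributes, extract elements
    display_unit.show_preview_and_element_modes(elements)    # S102: preview mode and element image mode

    selected = display_unit.wait_for_selection()             # S103: drag-and-drop selection
    if not selected:
        return
    comm_unit.send(selected, destination=info_apparatus)     # S104: transmit over the network

    if info_apparatus.receive(selected):                     # S105: reception detected
        info_apparatus.show_in_contents_bar(selected)        # S106: icons in the contents bar
        icon = info_apparatus.wait_for_icon_selection()      # S107: icon selected
        target = info_apparatus.wait_for_drop_target(icon)   # S108: dropped onto an application window
        info_apparatus.enter_into_application(icon, target)  # S109: element entered into the application
```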
  • the image forming apparatus extracts each data element from input data and displays extracted data elements.
  • the image forming apparatus transmits the selected data element to the information processing apparatus.
  • the information processing apparatus displays the data element on the display screen so that it can be placed on the screen of another application running thereon and used there. Therefore, it is possible to obtain a highly user-friendly information delivery system in which the data element extracted from the input data by the image forming apparatus is displayed in the information processing apparatus and can be used in the application without the operator being aware of any data conversion operation.
  • the second operation display unit of the information processing apparatus is desirably configured to display the contents bar in the form of a display bar and to perform an instruction input by at least either one of the drag-and-drop operation and the copy-and-paste operation. This is because data elements are displayed in an easy-to-view form, and thus, the displayed data elements can be used in the application on the screen with a simple operation.
  • it is desirable that the analysis unit 12 generate data element information in a data format according to the attribute of the analyzed data element, and that the information processor 62 enter the data element information displayed on the second operation display unit 65 into the application in a data format based on the application and the attribute of the data element information.
  • when the data element to be analyzed is text information, the data element information is generated as a text file.
  • when the data element is image information, the data element information is generated as image information, so the file size upon transmission can be kept small.
  • because the text information or the image information is entered into another application through the screen of the information processing apparatus in the form of a text file or an image file, the information can also be displayed more accurately.
  • the first operation display unit 11 receives an input to determine an attribute of the data element to be displayed and the analysis unit 12 determines the attribute of the data element according to the received input. This is because the attribute can accurately be determined based on the operator's determination.
  • the first operation display unit 11 receives an input to specify an area of data to be displayed and the analysis unit 12 determines the attribute of the area specified by the received input to extract a data element. This is because the data element information can be extracted only from the area determined by the operator as being necessary and this enables efficient extraction of the data element.
  • the first operation display unit 11 receives an input to specify a shape of an area of data to be displayed as a block. This is because by specifying the area with a block, the data element information can be extracted only from the required area with a simple operation, to enable efficient operation.
  • the storage unit 13 can store the data element information whose selection input is received through the first operation display unit 11 , and the first communication unit 15 can read the data element information from the storage unit 13 and transmit it. This is because the data element information stored in the storage unit 13 can then be transmitted collectively at an appropriate time, which enhances communication efficiency.
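  • Collecting selected elements in the storage unit 13 and transmitting them together, as suggested above, might look like the following sketch. The batching policy (batch size and wait time) is an assumption made purely for illustration, as is the send_many call.

```python
import time


class BatchedSender:
    """Hypothetical helper that buffers selected data elements and flushes them collectively."""

    def __init__(self, comm_unit, max_batch: int = 16, max_wait_s: float = 5.0):
        self.comm_unit = comm_unit
        self.max_batch = max_batch
        self.max_wait_s = max_wait_s
        self._pending = []
        self._last_flush = time.monotonic()

    def enqueue(self, element):
        self._pending.append(element)
        if len(self._pending) >= self.max_batch:
            self.flush()

    def maybe_flush(self):
        # Called periodically; flushes once enough time has passed.
        if self._pending and time.monotonic() - self._last_flush >= self.max_wait_s:
            self.flush()

    def flush(self):
        # One network call for the whole batch, to enhance communication efficiency.
        self.comm_unit.send_many(self._pending)
        self._pending.clear()
        self._last_flush = time.monotonic()
```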
  • the analysis unit 12 can analyze data elements in parallel with the execution of another function of the image forming apparatus. This is because the data element information can then be extracted in the background even while the image forming apparatus executes another function, which enables extraction of highly usable data element information.
  • An information delivery system differs from that of the first embodiment in that the information processing apparatus transmits identification information to the image forming apparatus and the image forming apparatus transmits data element information to a destination corresponding to the received identification information. This enables the image forming apparatus to automatically determine a destination and transmit data element information to the destination.
  • FIG. 9 is a functional block diagram of the information delivery system according to the second embodiment.
  • An information processing apparatus 220 receives identification information from the second operation display unit 65 .
  • a first communication unit 25 of an information delivery apparatus 120 provided in an image forming apparatus 123 receives the identification information and transmits data element information to a corresponding destination according to a correspondence table (not shown) between identification information and destination stored in the storage unit 13 .
  • the information processing apparatus 220 further includes an identification unit 71 that identifies received identification information. It is therefore desirable that the identification information be transmitted to the information delivery apparatus 120 only when the received identification information is legitimate. This makes it possible to determine whether an operator who operates the information processing apparatus 220 has an access right, and to receive and display data element information from the information delivery apparatus 120 only when the operator has that right.
  • the information delivery apparatus 120 includes a verification unit 26 that verifies whether received identification information is legitimate. This enables the data element information to be transmitted only to a destination whose verification is successful.
  • the second communication unit 61 of the information processing apparatus 220 desirably transmits a network address, including its Internet Protocol (IP) address, as the identification information.
  • FIG. 10 is a flowchart of the operation of the information delivery system according to the second embodiment.
  • the information processing apparatus 220 detects whether the second operation display unit 65 receives an input of identification information (step S 201 ).
  • the identification unit 71 determines whether the received identification information is legitimate (step S 202 ).
  • the second communication unit 61 transmits the identification information to the image forming apparatus (step S 203 ).
  • the second operation display unit 65 displays an error and finishes the operation (step S 204 ).
  • the first communication unit 15 transmits the selected data element to a corresponding destination by referring to the identification information and an association table (step S 205 ).
  • the image forming apparatus receives the identification information from the information processing apparatus and transmits the data element information only to a destination of the corresponding identification information.
  • the image forming apparatus automatically determines the legitimate destination to enable transmission of the data element information thereto.
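  • The correspondence table between identification information and destination, together with the verification unit 26, can be sketched as a guarded lookup. Treating the identification information as an IP address, as suggested above, is the only check performed here; the table contents, class name, and method names are illustrative assumptions.

```python
import ipaddress


class DestinationResolver:
    """Hypothetical combination of the verification unit 26 and the correspondence table."""

    def __init__(self, table):
        # Illustrative mapping only, e.g. {"192.0.2.10": "queue/operator-a"}.
        self.table = table

    def verify(self, identification: str) -> bool:
        try:
            ipaddress.ip_address(identification)   # identification is assumed to be an IP address
        except ValueError:
            return False
        return identification in self.table        # only known operators are treated as legitimate

    def resolve(self, identification: str) -> str:
        if not self.verify(identification):
            raise PermissionError("identification information is not legitimate")
        return self.table[identification]
```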
  • An information delivery system differs from that of the first embodiment in that the second operation display unit displays an editing mode screen on which the data element information is displayed and through which an editing input is received.
  • the editing unit edits the data element information according to the editing input received through the editing mode screen, and the second operation display unit 65 displays again the edited data element information.
  • FIG. 11 is a functional block diagram of the information delivery system according to the third embodiment.
  • a second display controller 84 of an information processing apparatus 230 causes the second operation display unit 65 to display the editing mode screen on which data element information is displayed and through which an editing input is received.
  • An editing unit 83 edits the data element information according to the editing input received through the editing mode screen in the second operation display unit 65 .
  • the second display controller 84 causes the second operation display unit 65 to display the edited data element information.
  • the data element displayed in the information processing apparatus is subjected to editing to enable the result after the editing to be displayed.
  • FIG. 12 is one example of an editing mode screen displayed in an information processing apparatus according to the third embodiment.
  • Data elements 1210 , 1220 , 1230 , and 1235 are displayed on an editing mode screen 1200 .
  • the location of the data element (header image) 1210 , represented by an image, is modified by an input operation from the operator, and the modified header (not shown) can be displayed.
  • the operator performs an input operation to order the data elements 1230 and 1235 which are text elements, and the ordering can be indicated by arrows.
  • FIG. 13 is a flowchart of the operation of the information delivery system according to the third embodiment.
  • the second display controller 84 causes the second operation display unit 65 to display a screen in an editing mode on which the data element information is displayed and through which an editing input is received (step S 301 ).
  • the second operation display unit 65 detects whether an editing input has been received from the data element information displayed in the editing mode (step S 302 ).
  • the editing unit 83 subjects the data element information to editing according to the received editing input (step S 303 ).
  • the second operation display unit 65 again displays the edited data element information (step S 304 ).
  • the information processing apparatus 230 subjects the displayed data element to editing, and can display the result after the editing.
  • the second operation display unit 65 updates the data element information with the edited data element information and displays it; in addition, it can display the updated element image in parallel with the element image before the update. Alternatively, the display before the update can be lightly shaded. In either case, the pieces of element information before and after editing can be compared with each other.
  • An information delivery system differs from that of the first embodiment in that the data element displayed in the contents bar is stored as shared data in another information processing apparatus.
  • the information processing apparatus includes a search unit.
  • the second operation display unit receives a search request input to search for a data element to be displayed in a contents bar, and the search unit generates search request information used to search for the data element stored as shared data in another information processing apparatus based on the received search request input, to transmit the search request information through the network.
  • FIG. 14A is a functional block diagram of the information delivery system according to the fourth embodiment.
  • in the other information processing apparatus 240 a , a data element is stored in its storage unit as shared data 241 .
  • a search unit 96 is included in an information processing apparatus 240 of the information delivery system according to the fourth embodiment.
  • the second operation display unit 65 receives a search request input to search for a data element as the shared data in the other information processing apparatus 240 a from a search field 1402 shown in FIG. 14B .
  • the search unit 96 generates search request information used to search, through the network, the shared data 241 stored in the other information processing apparatus for a data element to be displayed in the contents bar of the second operation display unit, and transmits the generated search request information.
  • FIG. 15 is a flowchart of the operation of the information delivery system according to the fourth embodiment.
  • the second operation display unit 65 of the information processing apparatus 240 detects whether the search request information for the data element has been received (step S 401 ).
  • when the search request information has been received (YES at step S 401 ), the search unit 96 generates shared-data-search-request information used to search for the data element stored as the shared data 241 in the other information processing apparatus, to be transmitted (step S 402 ).
  • the second communication unit 61 transmits the shared-data-search-request information to another information processing apparatus (step S 403 ).
  • the data element is searched for and acquired from the other information processing apparatus 240 a , the data element is transmitted, and the second communication unit 61 receives the data element.
  • the other information processing apparatus 240 a can be configured to display the data element as the shared data 241 in the contents bar of the other information processing apparatus 240 a.
  • the search unit 96 can also be configured to search for data element information stored as the shared data in the storage unit 13 of the image forming apparatus through the network.
  • FIG. 16 is a flowchart of the operation of the information delivery system according to a modification of the fourth embodiment.
  • the second operation display unit 65 of the information processing apparatus 240 detects whether search request information for the data element stored as the shared data in the storage unit 13 of the image forming apparatus has been received (step S 501 ).
  • when the search request information has been received (YES at step S 501 ), the search unit 96 generates storage-search-request information used to search for the data element stored as the shared data in the storage unit 13 of the image forming apparatus, to be transmitted through the network (step S 502 ).
  • the second communication unit 61 transmits the storage-search-request information to the image forming apparatus (step S 503 ).
  • the data element is searched for and acquired from the storage unit 13 , the data element is transmitted, and the second communication unit 61 receives the data element.
  • the information processing apparatus 240 acquires the data element information stored as the shared data in the storage unit 13 of the image forming apparatus, and can display the data element in its own contents bar.
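  • The shared-data search of the fourth embodiment and its modification amount to sending a query to the other apparatuses (and, in the modification, to the storage unit 13 of the image forming apparatus) and merging the replies into the contents bar. The sketch below assumes hypothetical handle_search, search, and add_icon methods on those peers; none of these names come from the patent.

```python
def search_shared_elements(query: str, peers, local_storage):
    """Hypothetical search unit 96: query shared data held elsewhere for matching data elements."""
    request = {"type": "shared-data-search", "query": query}
    results = []
    for peer in peers:                            # other information processing apparatuses (steps S402-S403)
        results.extend(peer.handle_search(request))
    if local_storage is not None:                 # modification: storage unit 13 of the image forming apparatus
        results.extend(local_storage.search(query))
    return results


def show_results_in_contents_bar(contents_bar, results):
    # Found data elements are displayed as icons in the contents bar, as in FIG. 7.
    for element in results:
        contents_bar.add_icon(element)
```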
  • FIG. 17 is a functional block diagram of an information delivery system according to a fifth embodiment of the present invention.
  • the information delivery system according to the fifth embodiment differs from that of the first embodiment in that it does not need to include the scanner 1 , the input processor 2 , the output processor 4 , and the output unit 5 of the image forming apparatuses explained in the first to the fourth embodiments.
  • the information delivery system according to the fifth embodiment includes an email delivery server 150 (data delivery server) instead of the image forming apparatus.
  • a first communication unit 55 of the email delivery server 150 temporarily receives the email, and an analysis unit 52 analyzes the received email containing the image to extract data elements.
  • the analysis of the email and the extraction of the data elements are performed in the same manner as that of the first to the fourth embodiments.
  • the email delivery server 150 extracts the data element from the email and transmits the data element to the information processing apparatus 200 used by the user as the specified destination, so the data element can be displayed in the information processing apparatus 200 . Therefore, a data element can be extracted simply by transmitting, by email, an image captured by the mobile phone 1701 or the digital camera 1702 , or an image from the PC 1703 . Thus, the user does not have to perform a plurality of processes such as the OCR process and the email transmission, which improves usability for the user.
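  • A compact sketch of the fifth-embodiment flow is given below. It assumes the incoming message is a standard MIME email parsed with Python's email package; the analysis and transmission calls stand in for the analysis unit 52 and the forwarding to the destination, and those method names are placeholders.

```python
from email import message_from_bytes
from email.message import Message


def handle_incoming_mail(raw_mail: bytes, analysis_unit, transmitter):
    """Hypothetical email delivery server 150: extract image attachments, analyze them
    into data elements, and forward the elements toward the mail's destination."""
    msg: Message = message_from_bytes(raw_mail)
    destination = msg["To"]                        # the specified destination of the email
    for part in msg.walk():
        if part.get_content_maintype() == "image":
            image_bytes = part.get_payload(decode=True)
            elements = analysis_unit.extract_elements(image_bytes)  # same analysis as in the earlier embodiments
            transmitter.send(elements, destination)                 # elements are delivered for display at the destination
```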
  • FIG. 18 is a block diagram of a hardware configuration of the image forming apparatus according to the embodiments.
  • the image forming apparatus is configured as, for example, a multifunction product (MFP) that has multiple functions such as a facsimile function and a scanning function.
  • the MFP includes a controller 2210 and an engine 2260 which are connected to each other through a Peripheral Component Interconnect (PCI) bus.
  • the controller 2210 controls the MFP as a whole and handles inputs from a Facsimile Control Unit interface (FCU I/F) 2230 and the first operation display unit 11 , performing image display control, various other types of control, image processing control, and information delivery control.
  • the engine 2260 is an image processing engine or the like that can be connected to the PCI bus, and includes an image processing portion that performs, for example, error diffusion and gamma conversion on acquired image data.
  • the controller 2210 includes a Central Processing Unit (CPU) 2211 , a northbridge (NB) 2213 , a system memory (MEM-P) 2212 , a southbridge (SB) 2214 , a local memory (MEM-C) 2217 , an Application Specific Integrated Circuit (ASIC) 2216 , and a Hard Disk Drive (HDD) 13 as the storage unit.
  • the NB 2213 and the ASIC 2216 are connected to each other through an Accelerated Graphics Port (AGP) bus 2215 .
  • the MEM-P 2212 includes a Read Only Memory (ROM) 2212 a and a Random Access Memory (RAM) 2212 b.
  • the CPU 2211 controls the MFP, and has a chipset including the NB 2213 , the MEM-P 2212 , and the SB 2214 .
  • the CPU 2211 is connected to other devices through the chipset.
  • the NB 2213 is a bridge that connects the CPU 2211 to the MEM-P 2212 , the SB 2214 , and the AGP 2215 , and includes a memory controller, a PCI master, and an AGP target.
  • the memory controller controls read/write from/to the MEM-P 2212 .
  • the MEM-P 2212 is a system memory that stores computer programs and data and is used to load computer programs and data.
  • the MEM-P 2212 includes the ROM 2212 a and the RAM 2212 b .
  • the ROM 2212 a stores therein the computer programs and data
  • the RAM 2212 b is a rewritable and readable memory into which the computer programs and data are loaded and is used for image drawing upon image processing.
  • the SB 2214 is a bridge that connects the NB 2213 to a PCI device and peripheral devices.
  • the SB 2214 is connected to the NB 2213 through the PCI bus, which is also connected with the FCU I/F 2230 or the like.
  • the ASIC 2216 is an Integrated Circuit (IC), for multimedia information processing, which includes a hardware element for multimedia information processing.
  • the ASIC 2216 serves as a bridge to connect the AGP 2215 , the PCI bus, the HDD 13 , and the MEM-C 2217 to each other.
  • the ASIC 2216 includes a PCI target, an AGP master, an arbiter (ARB) that forms the core of the ASIC 2216 , a memory controller that controls the MEM-C 2217 , and a plurality of Direct Memory Access Controllers (DMACs) that rotate image data by hardware logic, and is connected, through the PCI bus, to a Universal Serial Bus (USB) 2240 , an IEEE 1394 (the Institute of Electrical and Electronics Engineers 1394) I/F 2250 , and the engine 2260 .
  • the MEM-C 2217 is a local memory used as an image buffer for transmission and a code buffer.
  • the HDD 13 stores therein image data, computer programs, font data, and forms.
  • the AGP 2215 is a bus interface for a graphics accelerator card, proposed to increase the speed of graphics processing; it allows the graphics accelerator card to be sped up through high-throughput direct access to the MEM-P 2212 .
  • the first operation display unit 11 connected to the ASIC 2216 receives input indicating operation by the operator and transmits received input information to the ASIC 2216 .
  • a computer program (hereinafter, “information delivery program”) executed in the MFP according to the embodiments is provided as previously stored in the ROM and the like.
  • the information delivery program can be provided as recorded on a computer-readable recording medium such as a Compact Disk Read Only Memory (CD-ROM), a Flexible Disk (FD), Compact Disk Recordable (CD-R), and a Digital Versatile Disk (DVD) in a file of an installable format or of an executable format.
  • the information delivery program can be provided as stored in a computer connected to a network such as the Internet and downloaded via the network.
  • the information delivery program can also be provided or distributed via a network such as the Internet.
  • the information delivery program includes modules that implement the aforementioned components (analysis unit 12 , storage unit 13 , first display controller 14 , and first communication unit 15 , etc.).
  • the CPU (processor) executes the information delivery program, whereby the analysis unit 12 , the storage unit 13 , the first display controller 14 , the first communication unit 15 , and the like are loaded onto the main storage unit and implemented thereon.
  • FIG. 19 is a block diagram of a hardware configuration of the information processing apparatus according to the embodiments.
  • the information processing apparatus includes a control unit such as a CPU 2301 , a storage unit such as a ROM 2302 and a RAM 2303 , an external storage unit 2304 such as an HDD and a CD drive, a display device 65 a in the second operation display unit 65 , an input device 65 b such as a touch input mechanism, a keyboard, and a mouse, a communication I/F 2307 , and a bus 2308 connecting these components to one another; this hardware configuration uses an ordinary computer.
  • a computer program (hereinafter, “information processing program”) executed by the information processing apparatus according to the embodiments is provided as recorded on a computer-readable recording medium such as a CD-ROM, a Flexible Disk (FD), CD-R, and a DVD in a file of an installable form or an executable form.
  • the information processing program can be provided as stored in a computer connected to a network such as the Internet and downloaded via the network.
  • the information processing program can also be provided or distributed via a network such as the Internet.
  • the information processing program can be provided as previously stored in the ROM and the like.
  • the information processing program includes modules that implement the aforementioned components (second communication unit, information processor, editing unit, and second display controller, etc.).
  • the CPU (processor) executes the information processing program, whereby the second communication unit, the information processor, the editing unit, the second display controller, and the like are loaded onto the main storage unit and implemented thereon.
  • an image forming apparatus analyzes data to extract a data element. Upon identifying the attribute of the data element as text, the image forming apparatus performs character recognition on the data element and displays the data element. When receiving an input to select the data element, the image forming apparatus transmits the data element to an information processing apparatus through a network.
  • the data element extracted from the input data can be delivered with fewer processes, which improves convenience for users.
  • an email delivery server receives an email containing image information and analyzes the email to extract a data element. Upon identifying the attribute of the data element as text, the email delivery server performs character recognition on the data element and transmits the data element to the destination of the email through a network.
  • a data element can be extracted without processes such as the OCR process when a user simply transmits email containing an image. This improves convenience for the user.
  • the information processing apparatus displays the data element as a symbol at a predetermined position on its screen. Upon receiving input to select the symbol and place it on the window of an application, the information processing apparatus enters the data element indicated by the symbol into the application.
  • the data element extracted from the input data by the image forming apparatus is displayed and can thereby be used in another application, which also improves convenience for users.

Abstract

An image forming apparatus includes an information delivery apparatus including an analysis unit and a first display controller. The analysis unit analyzes data to extract data elements. The first display controller displays a preview of the data elements on a first display unit. A data element selected from the preview is transmitted to an information processing apparatus via a network. The information processing apparatus displays the data element as a symbol at a predetermined position on a second display unit. The second display unit displays an application window. When the second display unit receives input to select the symbol and place it on the application window, the data element is entered into a corresponding application and the symbol is displayed on the application window.

Description

CROSS-REFERENCE TO RELATED APPLICATIONS
The present application is a continuation of U.S. application Ser. No. 11/716,574, filed on Mar. 12, 2007, the disclosure of which is incorporated herein by reference thereto, and incorporates by reference the entire contents of Japanese priority documents 2006-077165, filed in Japan on Mar. 20, 2006, and 2007-054528, filed in Japan on Mar. 5, 2007.
BACKGROUND OF THE INVENTION
1. Field of the Invention
The present invention relates to an image forming apparatus, an electronic mail delivery server, and an information processing apparatus.
2. Description of the Related Art
In recent years, network communication systems including a network multifunction product capable of communicating data with another terminal unit have been proposed. For example, Japanese Patent Application Laid-Open No. 2005-244411 discloses a technology for automatically transmitting a document, which is read by a scanner, to a destination which is obtained by optical character recognition (OCR) process.
In the conventional technology, text information and image information are distinguished in a read document to acquire a destination and body information from the text information, and the image information is transmitted by electronic mail (hereinafter, “email”) to the destination as an attached file.
The conventional technology, however, requires a plurality of processes: a document read by the scanner is subjected to a character recognition process and further to a process for email transmission. This makes the operator's work more complicated. In other words, it would be convenient for users if each document element, i.e., an element that forms the input document information, were extracted from the document information and a newly extracted document element could always be displayed on an information terminal.
SUMMARY OF THE INVENTION
It is an object of the present invention to at least partially solve the problems in the conventional technology.
According to an aspect of the present invention, an image forming apparatus includes an analysis unit that analyzes data to extract a data element and, upon identifying the attribute of the data element as text, performs character recognition on the data element, a display unit that displays the data element and receives input, and a communication unit that transmits the data element selected by the input to an information processing apparatus through a network.
According to another aspect of the present invention, an electronic mail delivery server includes a receiving unit that receives an electronic mail containing image information, an analysis unit that analyzes the electronic mail to extract a data element and, upon identifying the attribute of the data element as text, performs character recognition on the data element, and a transmitting unit that transmits the data element to a destination of the electronic mail through a network.
According to still another aspect of the present invention, an information processing apparatus includes a communication unit that receives a data element that constitutes data from an image forming apparatus connected through a network, a display unit that displays the data element as a symbol at a predetermined position on a screen and receives a command to select and place the symbol on a window of an application, and an information processor that enters the data element corresponding to the symbol into the application in response to the command.
The above and other objects, features, advantages and technical and industrial significance of this invention will be better understood by reading the following detailed description of presently preferred embodiments of the invention, when considered in connection with the accompanying drawings.
BRIEF DESCRIPTION OF THE DRAWINGS
FIG. 1 is a functional block diagram of an information delivery system according to a first embodiment of the present invention;
FIG. 2 is a functional block diagram of an analysis unit shown in FIG. 1;
FIG. 3 is a functional block diagram of an area recognition unit shown in FIG. 2;
FIG. 4 is a functional block diagram of a character analyzer shown in FIG. 3;
FIGS. 5 and 6 are examples of a display screen of a first operation display unit shown in FIG. 1;
FIG. 7 is one example of a display screen of a second operation display unit shown in FIG. 1;
FIG. 8 is a flowchart of the operation of the information delivery system shown in FIG. 1;
FIG. 9 is a functional block diagram of an information delivery system according to a second embodiment of the present invention;
FIG. 10 is a flowchart of the operation of the information delivery system shown in FIG. 9;
FIG. 11 is a functional block diagram of an information delivery system according to a third embodiment of the present invention;
FIG. 12 is one example of an editing mode screen displayed in an information processing apparatus shown in FIG. 11;
FIG. 13 is a flowchart of the operation of the information delivery system shown in FIG. 11;
FIG. 14A is a functional block diagram of an information delivery system according to a fourth embodiment of the present invention;
FIG. 14B is one example of a display screen of a second operation display unit shown in FIG. 14A;
FIG. 15 is a flowchart of the operation of the information delivery system shown in FIG. 14A;
FIG. 16 is a flowchart of the operation of an information delivery system according to a modification of the fourth embodiment;
FIG. 17 is a functional block diagram of an information delivery system according to a fifth embodiment of the present invention;
FIG. 18 is a block diagram of a hardware configuration of the image forming apparatus according to the embodiments; and
FIG. 19 is a block diagram of a hardware configuration of the information processing apparatus according to the embodiments.
DETAILED DESCRIPTION OF THE PREFERRED EMBODIMENTS
Exemplary embodiments of the present invention are explained in detail below with reference to the accompanying drawings.
FIG. 1 is a functional block diagram of an information delivery system according to the first embodiment of the present invention. The information delivery system includes an image forming apparatus 110 and an information processing apparatus 200 connected to each other through a network 3.
The image forming apparatus 110 analyzes input data to extract therefrom a data element, i.e., an element that constitutes the data, and displays a preview of the extracted data element. The image forming apparatus 110 receives a command input through the preview, and transmits information on the selected data element to the information processing apparatus 200 through the network 3. The information processing apparatus 200 receives the data element information from the image forming apparatus 110 through the network 3, and displays the data element information, using an icon as a symbol, in a contents bar (widget) arranged in a rectangular shape at a predetermined position on its screen. The information processing apparatus 200 also displays the window of an active (open) application. When the displayed icon is selected and a command is input to bring (place) the selected icon onto the application window, the information processing apparatus 200 enters the data element information indicated by the icon into the application running thereon according to the command, and displays the data element information.
With this configuration, the image forming apparatus 110 extracts each data element from input data and displays the extracted data elements. When any one of the displayed data elements is selected, the selected data element is transmitted to the information processing apparatus 200 and displayed on its screen. Thus, the data element can be used in another application active on the screen. This provides an information delivery system with high usability, in which the data element extracted by the image forming apparatus 110 from the input data can be displayed in the information processing apparatus 200 and used in another application.
The image forming apparatus 110 includes a scanner 1, an input processor 2, an information delivery apparatus 100, an output processor 4, and an output unit 5. The information delivery apparatus 100 extracts a data element from an image read by the scanner 1 and transmits the extracted data element to the information processing apparatus 200. The information delivery apparatus 100 includes a first operation display unit 11, an analysis unit 12, a storage unit 13, a first display controller 14, and a first communication unit 15.
The scanner 1 reads data. The input processor 2 converts the analog data read by the scanner 1 to digital data, and transmits the converted image data to the information delivery apparatus 100. As for data input, the first communication unit 15 can also receive data through the network 3. Data stored in any one of various recording media can also be input to the information delivery apparatus 100 through an input unit (not shown).
The storage unit 13 stores therein input image data and various types of digital data subjected to processing. The output processor 4 subjects the image data for the read image to output processing. The output unit 5 outputs an image of the image data subjected to the output processing. The image output is a function of the image forming apparatus.
Electronic data is input to the image forming apparatus through the scanner 1 or the network 3. In the information delivery apparatus 100, the analysis unit 12 analyzes an attribute of the input data and extracts, from the data, a data element that constitutes the data.
The first display controller 14 displays the extracted data element on the first operation display unit 11. The first operation display unit 11 receives a selection input and editing inputs, from an operator, performed by touching the data element displayed thereon.
The first operation display unit 11 can be formed as a so-called touch panel having a liquid-crystal display function. The first communication unit 15 transmits the received data element to the information processing apparatus 200 through the network 3.
The analysis unit 12 analyzes which of the attributes, such as text, halftone image, graphic image, photographic image, or any other image type, each portion of the input data corresponds to, and divides the data into areas by attribute. Each of the areas divided for each attribute is a data element.
FIG. 2 is a functional block diagram of the analysis unit 12. The analysis unit 12 includes an area recognition unit 121 and an attribute analyzer 122. The area recognition unit 121 recognizes each area of data. For example, the area recognition unit 121 recognizes a text attribute for each paragraph, and recognizes a photographic image attribute and a graphic image attribute for each photographic image and each graphic image, respectively. The area recognition unit 121 divides the data into areas based on the result of recognition. The area recognition unit 121 also recognizes each area of the data based on a well-known technology such as change in density of data, detection of an edge portion, and detection of a halftone area.
The area recognition unit 121 recognizes a continuous run of portions determined to be characters as a text area. It recognizes a continuous run of halftone pixels as a photographic image area. It recognizes portions where it detects an edge and a sharp contrast between dark and light colors as a graphic image area. Any other portion is determined to be neither text, photograph, nor graphic image, and is subjected to a separate area recognition process. These technologies are well known, and detailed explanation is therefore omitted.
The attribute analyzer 122 analyzes which of the attributes, such as text, photographic image, halftone image, graphic image, or any other image type, the information in each area recognized by the area recognition unit 121 corresponds to. As a result, the area is divided for each attribute, and each element, i.e., an area divided for its attribute, is generated as a data element.
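As a rough illustration of this kind of attribute analysis, the following Python sketch classifies an already-cropped grayscale area as text, photograph, or graphic using simple density, gray-level, and contrast heuristics. The thresholds and the three-way split are illustrative assumptions, not the algorithm used by the analysis unit 12.

```python
# A minimal sketch of attribute analysis over a cropped area, assuming the
# area is available as a grayscale NumPy array (0 = black, 255 = white).
import numpy as np

def classify_area(region: np.ndarray) -> str:
    """Guess whether a cropped area is 'text', 'photo', or 'graphic'."""
    ink_ratio = np.mean(region < 128)             # coverage of dark pixels
    levels = len(np.unique(region // 16))         # coarse count of gray levels
    gy, gx = np.gradient(region.astype(float))
    edge_strength = np.mean(np.hypot(gx, gy))     # average local contrast

    if levels > 10 and edge_strength < 20:
        return "photo"      # many smooth gray levels -> halftone/photograph
    if ink_ratio < 0.3 and edge_strength >= 20:
        return "text"       # sparse, high-contrast strokes -> characters
    return "graphic"        # otherwise treat as line art / graphic image


if __name__ == "__main__":
    # Synthetic "text-like" patch: white background with thin dark strokes.
    patch = np.full((64, 64), 255, dtype=np.uint8)
    patch[::8, :] = 0
    print(classify_area(patch))   # -> text
```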
FIG. 3 is a functional block diagram of the area recognition unit 121. The area recognition unit 121 includes a layout analyzer 1211 and a character analyzer 1212. The layout analyzer 1211 subjects the data elements extracted by the area recognition unit 121 and the attribute analyzer 122 to a process that analyzes the layout structure of the data. In the layout analysis, the layout analyzer 1211 applies detection information such as edge detection, halftone detection, and pixel position detection to a well-known algorithm to analyze the layout of each divided area. Alternatively, because the data elements are already extracted, the extracted data elements themselves can be applied to the known algorithm to perform the layout analysis. The character analyzer 1212 subjects the data elements analyzed as having the text attribute to a character recognition process.
FIG. 4 is a functional block diagram of the character analyzer 1212. The character analyzer 1212 converts a character element to a machine-processable format. The character analyzer 1212 includes an optical character recognition (OCR) unit 12121, a spline-to-vector converter 12122, and a character extracting unit 12123.
The character extracting unit 12123 extracts a character by contrast detection. The spline-to-vector converter 12122 subjects the extracted character to spline shaping. The OCR unit 12121 optically reads a character captured as an image and performs character recognition. The character analysis can be performed by applying a known algorithm.
The character analyzer 1212 converts a character element to a machine-processable format, which enables accurate and quick editing. The OCR unit 12121 extracts a character image as text information, so efficient and highly user-friendly display and editing become possible. The character can be displayed more smoothly by the spline-to-vector converter 12122. Furthermore, the character analyzer 1212 includes the character extracting unit 12123, which makes it possible to accurately clip a character and perform character analysis.
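As a rough illustration of the character analysis, the following Python sketch clips and binarizes a text element and passes it to an off-the-shelf OCR engine. Pillow and pytesseract are assumed to be available and merely stand in for the character extracting unit 12123 and the OCR unit 12121; the spline-to-vector step is omitted, and the patent does not tie these units to any particular library.

```python
# A minimal sketch of turning a character-image element into text.
from PIL import Image, ImageOps
import pytesseract

def analyze_text_element(element_image: Image.Image) -> str:
    """Convert a character-image element into machine-processable text."""
    gray = ImageOps.grayscale(element_image)                 # grayscale for contrast-based extraction
    binary = gray.point(lambda p: 0 if p < 128 else 255)     # simple binarization
    return pytesseract.image_to_string(binary)               # optical character recognition

# Usage, assuming "text_element.png" is a cropped text area from the scanned data:
# print(analyze_text_element(Image.open("text_element.png")))
```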
The analysis unit 12 can determine the attribute of a divided area based on an attribute setting input by the operator and received by the first operation display unit 11. For example, the operator may determine from the displayed screen that an image analyzed and displayed by the analysis unit 12 as text should actually be a photographic image, and may provide input to display it as a photographic image. In this case, the analysis unit 12 accepts the attribute determined as a photographic image. With this feature, the attribute of a data element, i.e., a divided area, can be changed or determined through input by the operator.
The first operation display unit 11 receives an input as a touch thereon with a finger, a stylus pen, or any other touching tool. The first operation display unit 11 detects a touch at each point on the display screen and accepts the touch as input using a known technology such as a resistive system, which detects a change in resistance caused by the pressure applied from the finger or the pen, or a capacitive system (analog capacitive coupling scheme). In the following, touch input performed by touching the first operation display unit 11 is explained as an example, but the input type is not limited to touch input; various input systems, including a mouse, a keyboard, and a press switch, can be used.
FIG. 5 is one example of a display screen of the first operation display unit 11. An operation display screen 500 of the first operation display unit 11 includes an operation unit 510 and a display unit 530. Numerals, edit items, and Start for image output can be input through the operation unit 510.
The display unit 530 displays thereon a preview image 532 and an element image 534. The preview image 532 represents input data, and the element image 534 is an element image selected from data elements extracted by the analysis unit 12. The first display controller 14 displays the input image data and the data element extracted by the analysis unit 12 on the first operation display unit 11 in the form as shown in FIG. 5.
The data element is selected from the preview image 532 through a copying operation by touch input using the display unit 530 as an input interface, and displayed in element image mode as the element image 534. The selection is possible by touch input using a drag-and-drop operation. Moreover, the selection is also possible by touch input using a copy-and-paste operation.
FIG. 6 is another example of a display screen of the first operation display unit 11. In this example, a screen 600 displayed on the first operation display unit 11 is divided into a preview screen 620 and an element image display 650. The preview screen 620 is displayed divided into data elements, and the element image display is likewise divided based on data elements.
Data elements 622 to 625 and 631 to 634 have been selected by the operator and set, by the drag-and-drop operation, in the element image display 650 provided on the right side in the element image mode. Of the text elements 621 to 625 in the preview screen 620, the data elements 622 to 625 are selected and copied as data elements 642 to 645 set in the element image display 650 on the right side.
Likewise, the data elements 631 to 634, which are photographic images, are selected by the operator and copied, by the drag-and-drop operation, as data elements 651 to 654 set in the element image display 650 on the right side.
Various edit settings and parameters can be input through the first operation display unit 11 using this screen as an interface. The first display controller 14 edits the data element information in response to an editing input, and displays the data elements again in the element image mode. For example, the image of the data element 651 can be subjected to filtering so that the background behind a person is made brighter. The first operation display unit 11 can also be configured to receive input for various settings, including the edit settings and print settings, through the two-part display.
The first display controller 14 can also edit a data element into a form appropriate as element information. For example, image information can be reduced in size according to the communication function. To display the data element (photographic image) 634, a thumbnail image thereof is created, and the thumbnail image can be used as an icon image for the photographic image. When the element is actually transmitted, not the thumbnail image but the original element image information is transmitted.
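A minimal sketch of this thumbnail handling, assuming Pillow: the reduced image serves only as the on-screen icon, while the original element image information is what is actually transmitted later. File names and the icon size are illustrative.

```python
# Create a small icon for a photographic data element (sketch, assuming Pillow).
from PIL import Image

def make_icon(original_path: str, icon_path: str, size=(96, 96)) -> None:
    """Create a reduced icon image for a photographic data element."""
    with Image.open(original_path) as img:
        img.thumbnail(size)      # shrink in place, preserving the aspect ratio
        img.save(icon_path)

# make_icon("element_634.png", "element_634_icon.png")
```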
If a data element extracted as characters is inserted into the element image mode on the right side by touch input using the drag-and-drop operation, the data element is inserted, for each element, not as character image information but as text information that has already been subjected to character analysis.
The first display controller 14 associates the display of the data elements divided by the analysis unit 12 in the preview image with their display in the element image mode, and stores the association in the storage unit 13. With this association, an editing input made in the element image mode can be applied to the corresponding original data element.
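The association can be pictured as a simple mapping, as in the following sketch; the string IDs are hypothetical and only illustrate how an editing input on an element-image-mode copy is routed back to its original preview element.

```python
# A minimal sketch of the association kept by the first display controller.
class ElementAssociation:
    """Maps element-image-mode copies back to their original preview elements."""

    def __init__(self):
        self._copy_to_original = {}

    def associate(self, copy_id: str, original_id: str) -> None:
        self._copy_to_original[copy_id] = original_id

    def original_of(self, copy_id: str) -> str:
        # An editing input received on a copy is applied to this original element.
        return self._copy_to_original[copy_id]


assoc = ElementAssociation()
assoc.associate("copy-645", "element-625")   # hypothetical IDs for FIG. 6 elements
print(assoc.original_of("copy-645"))         # -> element-625
```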
The first operation display unit 11 also receives inputs from the operator such as a destination, an attribute of a data element, and deletion or addition of a data element in the element image mode.
The operation input through the first operation display unit 11 can be performed by the operator selecting an item, by touch input, from a pull-down menu pulled down on the first operation display unit 11. The pull-down menu can include, for example, Add, Shift, Delete, Enlarge/Reduce, and Input of date (not shown). Alternatively, these menu items can be made to appear with a right-click operation. These technologies are well known, and detailed explanation thereof is therefore omitted.
For example, in a Delete operation, the first operation display unit 11 receives an input to delete the data element 652 in the element image mode, and the first display controller 14 newly displays a status (not shown) without the data element 652 in the element image display 650 of the first operation display unit 11.
The first operation display unit 11 can also accept a drag operation for the data element to be displayed using a stylus pen or the like, in addition to touch input with a finger, because using the stylus pen allows a finer operation. Further, in addition to the drag operation, by displaying a predetermined menu on the first operation display unit 11 and by selecting an item from the menu by touch input, a data element can be inserted in email module display and displayed.
The first communication unit 15 transmits information for the data element displayed in the element image display 650 of the first operation display unit 11 to the information processing apparatus 200 through the network 3. With this feature, the operator can select a data element while viewing the preview image on the first operation display unit 11, and the image forming apparatus automatically transmits the selected data element to the information processing apparatus.
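A minimal sketch of the transmission step, assuming purely for illustration that the information processing apparatus 200 accepts data elements as JSON over HTTP at an /elements endpoint; the patent does not specify a transport protocol, so the URL, field names, and encoding here are assumptions.

```python
# Send one selected data element over the network (sketch, standard library only).
import base64
import json
import urllib.request

def send_data_element(host: str, attribute: str, payload: bytes) -> None:
    """Send one selected data element to the information processing apparatus."""
    body = {
        "attribute": attribute,
        "data": (payload.decode("utf-8") if attribute == "text"
                 else base64.b64encode(payload).decode("ascii")),
    }
    request = urllib.request.Request(
        f"http://{host}/elements",                      # assumed endpoint
        data=json.dumps(body).encode("utf-8"),
        headers={"Content-Type": "application/json"},
    )
    urllib.request.urlopen(request)

# send_data_element("192.168.0.10:8080", "text", "Extracted paragraph".encode("utf-8"))
```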
The image forming apparatus can also transmit the display information of the preview image mode, the element image mode, or both, as displayed by the first operation display unit 11, to the output processor 4, and can output an image through the output unit 5. The data edited in the information delivery apparatus can also be output by other means according to the various functions of the image forming apparatus, such as output using a facsimile transmission function.
The information processing apparatus 200 receives, through the network, the information for the data element extracted and selected in the image forming apparatus, displays the information on its screen, and enters it into another application running on the screen so that the information can be used there. In this manner, the data element extracted from the input data by the image forming apparatus is displayed in the information processing apparatus. Therefore, by placing the displayed data element on the window of an application running on the information processing apparatus, the operator can use the data element in that application.
The information processing apparatus 200 includes a second communication unit 61, an information processor 62, an editing unit 63, a second display controller 64, and a second operation display unit 65 (FIG. 1).
The second communication unit 61 receives the data element information transmitted by the information delivery apparatus 100 through the network 3.
The second display controller 64 displays the received data element information in a contents bar using an icon on the second operation display unit 65.
The second operation display unit 65 displays, when an application such as Word (TM) has started, the screen or window of the application and receives an instruction input to select the displayed icon to be placed on the application screen or window.
The information processor 62 enters the data element information indicated by the icon into the application according to the instruction input received through the second operation display unit 65.
If an editing input is received, the editing unit 63 applies the editing to the data element information selected through the second operation display unit 65. The second display controller 64 displays the edited data element information on the second operation display unit 65.
FIG. 7 is one example of a display screen of the second operation display unit 65. The mechanical configuration of the second operation display unit 65 is basically the same as the first operation display unit 11. For example, the second operation display unit 65 receives touch input in the same manner as explained for the first operation display unit 11.
A screen 710 of another application and a contents bar 750, in which data elements are set and displayed, are displayed on a screen 700 of the second operation display unit 65. Displayed in the contents bar 750 are, for example, a text image 751 and an image 752, and the operator inserts the image 752 into the editing screen of a word processor displayed on the left side by touch input using the drag-and-drop operation indicated by arrow 760. An image 712 is the resulting image after the insertion. Not only the image 752 but also the text image 751, as text information, can be inserted in this way.
The information processor 62 converts the image 752 selected by the drag-and-drop operation to a data format that can be inserted into the word processor. In the active word processor, the converted image is inserted at the location specified by the touch input that places the image there.
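A minimal sketch of how the information processor 62 might dispatch a dropped element by attribute; insert_text and insert_image stand for whatever interface the target application actually exposes and are assumptions, since the patent does not name a specific application API.

```python
# Enter a dropped contents-bar element into the active application (sketch).
import base64

def enter_element(app, element: dict, position) -> None:
    """Dispatch a dropped contents-bar element to the application by attribute."""
    if element["attribute"] == "text":
        # Text elements have already been through character analysis, so plain
        # text is inserted rather than a character image.
        app.insert_text(position, element["data"])
    else:
        # Image elements are inserted from the original bytes, not the icon.
        app.insert_image(position, base64.b64decode(element["data"]))
```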
If the data element set in the contents bar 750 is to be edited before its insertion, the operator selects and inputs an editing item on the second operation display unit 65. The editing item can be made to appear with a right click, or a pull-down menu can be displayed. The editing unit 63 edits the data element information according to the editing item that is selected and input.
The second display controller 64 again displays the data element information subjected to the editing in the contents bar 750. In this case, the data element image can be updated with the edited data element information and the updated data element image can be displayed, or the image before the editing can be displayed in such a manner that the images overlap each other. It is also possible to display another contents bar (not shown) in which the data element image after being edited is set.
FIG. 8 is a flowchart of the operation of the information delivery system according to the first embodiment. The scanner 1 reads data, the input processor 2 converts the read analog data to digital data, and the information delivery apparatus 100 receives the digital data. The analysis unit 12 analyzes the read image data. The analysis unit 12 analyzes an attribute of data based on recognition of an area performed by the area recognition unit 121 and based on analysis of the attribute by the attribute analyzer 122, to extract a data element. The layout analyzer 1211 analyzes layout and the character analyzer 1212 analyzes text information. The character analyzer 1212 analyzes a character by using an OCR process and a character extracting process. The analysis unit 12 analyzes the attribute information and extracts a data element (step S101).
The first display controller 14 displays the preview screen in a preview image mode (see 620 in FIG. 6) in which the data element is input and the element image display (see 650 in FIG. 6) in the element image mode in which the selected data element is displayed. The first display controller 14 generates display data to display the display data for each data element analyzed in the preview image mode (step S102). At this step, because a selection input is not yet received from the operator, the element image display represents a state of a new creation screen. The first operation display unit 11 displays the preview screen 620 and the element image display 650 in parallel for each data element generated by the first display controller 14.
The first operation display unit 11 determines whether a selection input is received for the data element in the preview screen 620 (step S103). When the operator selects the data element and sends an input to set the data element in the element image display 650 by using the drag-and-drop operation (YES at step S103), the first operation display unit 11 receives the input.
The first operation display unit 11 selects a data element in the preview screen 620 for each data element generated by the first display controller 14 to set the selected data element in the element image display 650 by the drag-and-drop operation (FIG. 6). A new screen can also be configured to set all the data elements displayed in the preview screen 620, in the element image display. In this case, during the selection operation by the operator, any unnecessary portion is deleted as required. The data elements in both modes are associated with each other by the first display controller 14.
The first communication unit 15 transmits information for the selected data element to the information processing apparatus 200 through the network 3 (step S104). The second communication unit 61 detects whether the data element information has been received (step S105). When the data element information has been received, the second display controller 64 displays an icon of the data element being set in the contents bar (widget) 750 (FIG. 7) in the second operation display unit 65 (step S106).
The second operation display unit 65 detects whether the icon has been selected through the display screen (step S107). When the icon has been selected (YES at step S107), the second operation display unit 65 further detects whether an instruction input to place the selected icon on the screen or window of an active application has been received (step S108). In other words, it is detected whether an input indicating a drag-and-drop operation, such that the data element displayed in the contents bar 750 is dragged and dropped onto a screen of another application running thereon, has been received.
When the input indicating the drag-and-drop operation is detected (YES at step S108), the information processor 62 enters the data element information corresponding to the icon into the application according to the instruction input (step S109).
With this configuration, the image forming apparatus extracts each data element from input data and displays the extracted data elements. When receiving input from the operator to select any one of the displayed data elements, the image forming apparatus transmits the selected data element to the information processing apparatus. The information processing apparatus displays the data element on its display screen so that it can be placed on the screen of another application running thereon and thus used there. Therefore, a highly user-friendly information delivery system is obtained in which the data element extracted from the input data by the image forming apparatus is displayed in the information processing apparatus and can be used in the application without the operator being aware of any data conversion operation.
The second operation display unit of the information processing apparatus is desirably configured to display the contents bar in the form of a display bar and to perform an instruction input by at least either one of the drag-and-drop operation and the copy-and-paste operation. This is because data elements are displayed in an easy-to-view form, and thus, the displayed data elements can be used in the application on the screen with a simple operation.
It is desirable that the analysis unit 12 generates data element information in a data format according to the attribute of the data element to be analyzed, and that the information processor 62 enters the data element information displayed on the second operation display unit 65 into the application in a data format based on the application and the attribute of the data element information. For example, if the data element to be analyzed is text information, the data element information is generated as a text file; if the data element is image information, the data element information is generated as image information, which keeps the file size at transmission small. When the text information or the image information is entered into another application through the screen of the information processing apparatus as a text file or an image file, the information can also be displayed more accurately.
It is desirable that the first operation display unit 11 receives an input to determine an attribute of the data element to be displayed and the analysis unit 12 determines the attribute of the data element according to the received input. This is because the attribute can accurately be determined based on the operator's determination.
It is further desirable that the first operation display unit 11 receives an input to specify an area of data to be displayed and the analysis unit 12 determines the attribute of the area specified by the received input to extract a data element. This is because the data element information can be extracted only from the area determined by the operator as being necessary and this enables efficient extraction of the data element.
It is further desirable that the first operation display unit 11 receives an input to specify a shape of an area of data to be displayed as a block. This is because by specifying the area with a block, the data element information can be extracted only from the required area with a simple operation, to enable efficient operation.
It is also desirable that the storage unit 13 stores therein the data element information for which a selection input is received through the first operation display unit 11, and that the first communication unit 15 reads the data element information from the storage unit 13 for transmission. This is because the data element information stored in the storage unit 13 can be transmitted collectively at an appropriate time, which enhances communication efficiency.
It is also desirable that even when the data is input and the image forming apparatus executes another function, the analysis unit 12 analyzes a data element in parallel to the execution. This is because the data element information can be extracted in the background even when the image forming apparatus executes another function, to enable extraction of highly usable data element information.
It is also possible to automatically deliver all the data elements extracted by the analysis unit 12. This enables input data to be automatically analyzed and all the data element information included in the data to be automatically delivered.
An information delivery system according to a second embodiment differs from that of the first embodiment in that the information processing apparatus transmits identification information to the image forming apparatus and the image forming apparatus transmits data element information to a destination corresponding to the received identification information. This enables the image forming apparatus to automatically determine a destination and transmit data element information to the destination.
FIG. 9 is a functional block diagram of the information delivery system according to the second embodiment. An information processing apparatus 220 receives identification information from the second operation display unit 65. A first communication unit 25 of an information delivery apparatus 120 provided in an image forming apparatus 123 receives the identification information and transmits data element information to a corresponding destination according to a correspondence table (not shown) between identification information and destination stored in the storage unit 13.
The information processing apparatus 220 further includes an identification unit 71 that checks received identification information. It is therefore desirable that the identification information be transmitted to the information delivery apparatus 120 only when it is legitimate. This makes it possible to determine whether an operator who operates the information processing apparatus 220 has an access right, and to receive and display data element information from the information delivery apparatus 120 only when the operator has the access right.
It is further desirable that the information delivery apparatus 120 includes a verification unit 26 that verifies whether the received identification information is legitimate. This enables the data element information to be transmitted only to a destination whose verification succeeds.
The second communication unit 61 of the information processing apparatus 220 desirably transmits a network address containing its Internet Protocol (IP) address as the identification information. With this configuration, the address of the apparatus is transmitted automatically, or manually by the operator, so that the data element information can be transmitted from the image forming apparatus.
FIG. 10 is a flowchart of the operation of the information delivery system according to the second embodiment. The information processing apparatus 220 detects whether the second operation display unit 65 receives an input of identification information (step S201). When the second operation display unit 65 receives the identification information (YES at step S201), the identification unit 71 determines whether the received identification information is legitimate (step S202). When the identification information is legitimate (YES at step S202), the second communication unit 61 transmits the identification information to the image forming apparatus (step S203).
On the other hand, when the identification information is not legitimate (NO at step S202), the second operation display unit 65 displays an error and finishes the operation (step S204).
In the image forming apparatus that receives the identification information, when the first operation display unit 11 receives a selection input for the data element (YES at step S103), the first communication unit 15 transmits the selected data element to a corresponding destination by referring to the identification information and an association table (step S205).
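A minimal sketch of this destination lookup, assuming the correspondence table in the storage unit 13 is a simple in-memory dictionary keyed by the identification information (here an IP address, as suggested above); the table contents and the URL form are illustrative.

```python
# Resolve a delivery destination from identification information (sketch).
from typing import Optional

# Correspondence table between identification information and destination,
# standing in for the table held in the storage unit 13 (example contents).
DESTINATION_TABLE = {
    "192.168.0.10": "http://192.168.0.10:8080/elements",
}

def resolve_destination(identification: str) -> Optional[str]:
    """Return the delivery destination only if the identification is known."""
    return DESTINATION_TABLE.get(identification)   # None means verification failed

destination = resolve_destination("192.168.0.10")
if destination is None:
    print("verification failed; the data element is not transmitted")
else:
    print(f"transmit the selected data element to {destination}")
```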
As explained above, in the information delivery system according to the second embodiment, the image forming apparatus receives the identification information from the information processing apparatus and transmits the data element information only to a destination of the corresponding identification information. Thus, the image forming apparatus automatically determines the legitimate destination to enable transmission of the data element information thereto.
An information delivery system according to a third embodiment differs from that of the first embodiment in that the second operation display unit displays an editing mode screen on which the data element information is displayed and through which an editing input is received. The editing unit edits the data element information according to the editing input received through the editing mode screen, and the second operation display unit 65 displays again the edited data element information.
FIG. 11 is a functional block diagram of the information delivery system according to the third embodiment. A second display controller 84 of an information processing apparatus 230 causes the second operation display unit 65 to display the editing mode screen on which data element information is displayed and through which an editing input is received. An editing unit 83 edits the data element information according to the editing input received through the editing mode screen in the second operation display unit 65. The second display controller 84 causes the second operation display unit 65 to display the edited data element information.
With this configuration, the data element displayed in the information processing apparatus can be edited, and the result of the editing can be displayed.
FIG. 12 is one example of an editing mode screen displayed in the information processing apparatus according to the third embodiment. Data elements 1210, 1220, 1230, and 1235 are displayed on an editing mode screen 1200. For example, the location of the data element (header image) 1210, represented by an image, is modified by an input operation from the operator, and the header after modification (not shown) can be displayed. Moreover, the operator performs an input operation to order the data elements 1230 and 1235, which are text elements, and the ordering can be indicated by arrows.
FIG. 13 is a flowchart of the operation of the information delivery system according to the third embodiment. The second display controller 84 causes the second operation display unit 65 to display a screen in an editing mode on which the data element information is displayed and through which an editing input is received (step S301). The second operation display unit 65 detects whether an editing input has been received from the data element information displayed in the editing mode (step S302). When it is detected that the editing input has been received (YES at step S302), the editing unit 83 subjects the data element information to editing according to the received editing input (step S303). The second operation display unit 65 again displays the edited data element information (step S304).
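A minimal sketch of steps S302 to S304 for an image data element, assuming Pillow and taking a brightness adjustment as the example editing item; the actual editing items are whatever the editing mode screen offers, and the file name is hypothetical.

```python
# Apply an editing input to an image data element and return it for redisplay (sketch).
from PIL import Image, ImageEnhance

def apply_brightness_edit(image_path: str, factor: float) -> Image.Image:
    """Apply a brightness editing input to an image data element."""
    with Image.open(image_path) as img:
        edited = ImageEnhance.Brightness(img).enhance(factor)   # factor > 1 brightens
    return edited   # redisplayed by the second display controller 84 (step S304)

# edited = apply_brightness_edit("element_1210.png", 1.3)
# edited.show()   # stand-in for displaying the edited element again
```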
As explained above, the information processing apparatus 230 subjects the displayed data element to editing, and can display the result after the editing.
The second operation display unit 65 updates the data element information with the edited data element information and displays it; in addition, it can display the updated element image in parallel with the element image before the update. Alternatively, the display before the update can be lightly shaded. In this way, the pieces of element information before and after editing can be compared with each other.
An information delivery system according to a fourth embodiment differs from that of the first embodiment in that the data element displayed in the contents bar is stored as shared data in another information processing apparatus. In the fourth embodiment, the information processing apparatus includes a search unit. The second operation display unit receives a search request input to search for a data element to be displayed in a contents bar, and the search unit generates search request information used to search for the data element stored as shared data in another information processing apparatus based on the received search request input, to transmit the search request information through the network.
FIG. 14A is a functional block diagram of the information delivery system according to the fourth embodiment. In another information processing apparatus 240 a, a data element is stored in its storage unit as shared data 241. A search unit 96 is included in an information processing apparatus 240 of the information delivery system according to the fourth embodiment. The second operation display unit 65 receives, from a search field 1402 shown in FIG. 14B, a search request input to search for a data element held as the shared data in the other information processing apparatus 240 a. The search unit 96 generates search request information used to search, through the network, the shared data 241 stored in the other information processing apparatus for the data element to be displayed in the contents bar of the second operation display unit, and transmits the generated search request information.
FIG. 15 is a flowchart of the operation of the information delivery system according to the fourth embodiment. The second operation display unit 65 of the information processing apparatus 240 detects whether the search request information for the data element has been received (step S401). When the search request information has been received (YES at step S401), the search unit 96 generates shared-data-search-request information used to search for the data element stored as the shared data 241 in the other information processing apparatus (step S402). The second communication unit 61 transmits the shared-data-search-request information to the other information processing apparatus (step S403). When the data element is found in the other information processing apparatus 240 a, it is transmitted, and the second communication unit 61 receives the data element.
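A minimal sketch of this shared-data search, assuming (as an illustration only) that the other information processing apparatus answers keyword searches over HTTP at a /shared/search endpoint; the patent does not fix the search protocol, so the URL and response format are assumptions.

```python
# Ask another apparatus for shared data elements matching a keyword (sketch).
import json
import urllib.parse
import urllib.request

def search_shared_elements(peer_host: str, keyword: str) -> list:
    """Request shared data elements that match a keyword from another apparatus."""
    query = urllib.parse.urlencode({"q": keyword})
    url = f"http://{peer_host}/shared/search?{query}"         # assumed endpoint
    with urllib.request.urlopen(url) as response:
        return json.loads(response.read())                    # elements for the contents bar

# for element in search_shared_elements("192.168.0.20:8080", "invoice"):
#     print(element["attribute"])
```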
With this feature, even a data element that the information processing apparatus 240 does not retain can be displayed in the contents bar of the information processing apparatus 240, by searching for the data element retained as the shared data in the other information processing apparatus 240 a and having it transmitted and acquired.
It is noted that the other information processing apparatus 240 a can be configured to display the data element as the shared data 241 in the contents bar of the other information processing apparatus 240 a.
The search unit 96 can also be configured to search for data element information stored as the shared data in the storage unit 13 of the image forming apparatus through the network.
FIG. 16 is a flowchart of the operation of the information delivery system according to a modification of the fourth embodiment. The second operation display unit 65 of the information processing apparatus 240 detects whether search request information for the data element stored as the shared data in the storage unit 13 of the image forming apparatus has been received (step S501). When the search request information has been received (YES at step S501), the search unit 96 generates storage-search-request information used to search, through the network, for the data element stored as the shared data in the storage unit 13 of the image forming apparatus (step S502). The second communication unit 61 transmits the storage-search-request information to the image forming apparatus (step S503). When the data element is found in the storage unit 13 and acquired, it is transmitted, and the second communication unit 61 receives the data element.
With this feature, the information processing apparatus 240 acquires the data element information stored as the shared data in the storage unit 13 of the image forming apparatus, and can display the data element in its own contents bar.
FIG. 17 is a functional block diagram of an information delivery system according to a fifth embodiment of the present invention. The information delivery system according to the fifth embodiment differs from that of the first embodiment in that the scanner 1, the input processor 2, the output processor 4, and the output unit 5 of the image forming apparatuses explained in the first to the fourth embodiments are not needed. In other words, the information delivery system according to the fifth embodiment includes an email delivery server 150 (data delivery server) instead of the image forming apparatus. For example, when an email containing image information is transmitted from a mobile phone 1701, a digital camera 1702, or a personal computer (PC) 1703 to a destination, namely the user of the information processing apparatus 200, a first communication unit 55 of the email delivery server 150 temporarily receives the email before delivering it to the specified destination, and an analysis unit 52 analyzes the received email containing the image to extract data elements. The analysis of the email and the extraction of the data elements are performed in the same manner as in the first to the fourth embodiments.
With this feature, the email delivery server 150 extracts the data element from the email and transmits the data element to the information processing apparatus 200 used by the user at the specified destination, so that the data element can be displayed in the information processing apparatus 200. Therefore, the data element can be extracted simply by transmitting, by email, an image captured with the mobile phone 1701 or the digital camera 1702, or an image from the PC 1703. Thus, the user does not have to perform a plurality of processes such as the OCR process and the email transmission, which improves the user's usability.
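A minimal sketch of the server-side extraction step, assuming the received email is available as raw RFC 822 bytes; it pulls out the image parts that would then be handed to the analysis unit 52.

```python
# Extract image attachments from a received email (sketch, standard library only).
import email
from email.message import Message

def extract_image_parts(raw_mail: bytes) -> list:
    """Return (filename, bytes) pairs for every image part of a received email."""
    message: Message = email.message_from_bytes(raw_mail)
    images = []
    for part in message.walk():
        if part.get_content_maintype() == "image":
            images.append((part.get_filename(), part.get_payload(decode=True)))
    return images

# with open("incoming.eml", "rb") as mail_file:
#     for name, data in extract_image_parts(mail_file.read()):
#         print(name, len(data))        # each image part goes to the analysis unit 52
```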
FIG. 18 is a block diagram of a hardware configuration of the image forming apparatus according to the embodiments. The image forming apparatus is configured as, for example, a multifunction product (MFP) that has multiple functions such as a facsimile function and a scanning function. As shown in FIG. 18, the MFP includes a controller 2210 and an engine 2260, which are connected to each other through a Peripheral Component Interconnect (PCI) bus. The controller 2210 controls the MFP and handles inputs from a Facsimile Control Unit interface (FCU I/F) 2230 and the first operation display unit 11, performing image display control, various other controls, image processing control, and information delivery control. The engine 2260 is an image processing engine or the like that can be connected to the PCI bus, and includes an image processing portion that performs, for example, error diffusion and gamma conversion on acquired image data.
The controller 2210 includes a Central Processing Unit (CPU) 2211, a northbridge (NB) 2213, a system memory (MEM-P) 2212, a southbridge (SB) 2214, a local memory (MEM-C) 2217, an Application Specific Integrated Circuit (ASIC) 2216, and a Hard Disk Drive (HDD) 13 as the storage unit. The NB 2213 and the ASIC 2216 are connected to each other through an Accelerated Graphics Port (AGP) bus 2215. The MEM-P 2212 includes a Read Only Memory (ROM) 2212 a and a Random Access Memory (RAM) 2212 b.
The CPU 2211 controls the MFP, and has a chipset including the NB 2213, the MEM-P 2212, and the SB 2214. The CPU 2211 is connected to other devices through the chipset.
The NB 2213 is a bridge that connects the CPU 2211 to the MEM-P 2212, the SB 2214, and the AGP 2215, and includes a memory controller, a PCI master, and an AGP target. The memory controller controls read/write from/to the MEM-P 2212.
The MEM-P 2212 is a system memory that stores computer programs and data and into which computer programs and data are loaded. The MEM-P 2212 includes the ROM 2212 a and the RAM 2212 b. The ROM 2212 a stores the computer programs and data, and the RAM 2212 b is a rewritable and readable memory into which the computer programs and data are loaded and which is used for image drawing during image processing.
The SB 2214 is a bridge that connects the NB 2213 to a PCI device and peripheral devices. The SB 2214 is connected to the NB 2213 through the PCI bus, which is also connected with the FCU I/F 2230 or the like.
The ASIC 2216 is an Integrated Circuit (IC), for multimedia information processing, which includes a hardware element for multimedia information processing. The ASIC 2216 serves as a bridge to connect the AGP 2215, the PCI bus, the HDD 13, and the MEM-C 2217 to each other.
The ASIC 2216 is connected, through the PCI bus, with a Universal Serial Bus (USB) 2240 and an IEEE 1394 (the Institute of Electrical and Electronics Engineers 1394) I/F 2250, and includes a PCI target and an AGP master, an arbiter (ARB) that forms the core of the ASIC 2216, a memory controller that controls the MEM-C 2217, and a plurality of Direct Memory Access Controllers (DMACs) that rotate image data by hardware logic; it is also connected to the engine 2260.
The MEM-C 2217 is a local memory used as an image buffer for transmission and a code buffer. The HDD 13 stores therein image data, computer programs, font data, and forms.
The AGP 2215 is a bus interface for a graphics accelerator card proposed to speed up graphics processing; by directly accessing the MEM-P 2212 with high throughput, it allows the graphics accelerator card to operate faster.
The first operation display unit 11 connected to the ASIC 2216 receives input indicating operation by the operator and transmits received input information to the ASIC 2216.
A computer program (hereinafter, “information delivery program”) executed in the MFP according to the embodiments is provided as previously stored in the ROM and the like.
The information delivery program can be provided as recorded on a computer-readable recording medium such as a Compact Disk Read Only Memory (CD-ROM), a Flexible Disk (FD), Compact Disk Recordable (CD-R), and a Digital Versatile Disk (DVD) in a file of an installable format or of an executable format.
The information delivery program can be provided as stored in a computer connected to a network such as the Internet and downloaded via the network. The information delivery program can also be provided or distributed via a network such as the Internet.
The information delivery program includes modules that implement the aforementioned components (analysis unit 12, storage unit 13, first display controller 14, and first communication unit 15, etc.). As actual hardware, the CPU (processor) loads the information delivery program from the ROM into a main storage unit to execute it. Thus, the analysis unit 12, the storage unit 13, the first display controller 14, and the first communication unit 15 or the like are implemented on the main storage unit.
FIG. 19 is a block diagram of a hardware configuration of the information processing apparatus according to the embodiments. The information processing apparatus includes a control unit such as a CPU 2301, a storage unit such as ROM 2302 and RAM 2303, an external storage unit 2304 such as a HDD and a CD drive, a display device 65 a in the second operation display unit 65, an input device 65 b such as a touch input mechanism, a keyboard, and a mouse, a communication I/F 2307, and a bus 2308 connecting these components to one another, which realizes the hardware configuration using an ordinary computer.
A computer program (hereinafter, “information processing program”) executed by the information processing apparatus according to the embodiments is provided as recorded on a computer-readable recording medium such as a CD-ROM, a Flexible Disk (FD), CD-R, and a DVD in a file of an installable form or an executable form.
The information processing program can be provided as stored in a computer connected to a network such as the Internet and downloaded via the network. The information processing program can also be provided or distributed via a network such as the Internet. The information processing program can be provided as previously stored in the ROM and the like.
The information processing program includes modules that implement the aforementioned components (second communication unit, information processor, editing unit, and second display controller, etc.). As actual hardware, the CPU (processor) loads the information processing program from the storage medium into a main storage unit to execute it. Thus, the second communication unit, the information processor, the editing unit, the second display controller, and the like are implemented on the main storage unit.
As set forth hereinabove, according to an embodiment of the present invention, an image forming apparatus analyzes data to extract a data element. Upon identifying attribute of the data element as text, the image forming apparatus performs character recognition on the data element, and displays the data element. When receiving input to select the data element, the image forming apparatus transmits the data element to an information processing apparatus through a network. Thus, the data element extracted from the input data can be delivered with less process, which improves convenience for users.
Moreover, an email delivery server receives email containing image information and analyzes the email to extract a data element. Upon identifying the attribute of the data element as text, the email delivery server performs character recognition on the data element and transmits it to a destination of the email through the network. Thus, a data element can be extracted without the user performing processes such as OCR; the user simply transmits an email containing an image, which improves convenience for the user.
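An equally rough sketch of the email delivery server's role follows; the mail-object fields and helper names are invented for illustration only.

    # Hypothetical outline of the email delivery server flow.
    def handle_incoming_mail(mail, analysis_unit, mailer):
        extracted = []
        for image in mail.image_attachments:  # image information in the received email
            for element in analysis_unit.extract_elements(image):
                if element.attribute == "text":
                    element.content = analysis_unit.recognize_characters(element)  # OCR on text elements
                extracted.append(element)
        mailer.send(to=mail.destination, data_elements=extracted)  # forward elements to the mail's destination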
Furthermore, having received the data element from the image forming apparatus, the information processing apparatus displays the data element as a symbol at a predetermined position on its screen. Upon receiving an input that selects the symbol and places it on the window of an application, the information processing apparatus enters the data element represented by the symbol into the application. Thus, a data element extracted from the input data by the image forming apparatus is displayed on the information processing apparatus and can be used in another application, which also improves convenience for users.
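The drag-and-drop entry of a symbol into an application might be sketched as below; the symbol-to-element lookup and the insert methods are assumptions made for illustration.

    # Hypothetical handler for dropping a symbol onto an application window.
    def on_symbol_dropped(symbol, target_window, received_elements):
        element = received_elements[symbol.element_id]   # the data element the symbol represents
        if element.attribute == "text":
            target_window.insert_text(element.content)   # e.g., paste recognized text into a document
        else:
            target_window.insert_image(element.content)  # e.g., place an extracted picture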
Although the invention has been described with respect to a specific embodiment for a complete and clear disclosure, the appended claims are not to be thus limited but are to be construed as embodying all modifications and alternative constructions that may occur to one skilled in the art that fairly fall within the basic teaching herein set forth.

Claims (4)

What is claimed is:
1. An image forming apparatus comprising:
an input unit configured to receive data of a document image read from a document;
an analysis unit configured to analyze attributes of data elements constituting the data and extract, from the data, the data elements divided for each attribute;
a first operation display unit configured to display the extracted data elements and receive a selection of at least one data element from the extracted data elements; and
a first communication unit configured to transmit the at least one data element for display on a second operation display unit, the second operation display unit configured to display the at least one data element and receive a selection, from the at least one data element, of at least one data element to be imported into an application running on an information processing apparatus; and
wherein the first operation display unit is further configured to receive area-specifying input to specify an area corresponding to data elements of the data to be displayed, and wherein the data elements divided for each attribute are divided based on the area-specifying input.
2. An image forming apparatus comprising:
an input unit configured to receive data of a document image read from a document;
an analysis unit configured to analyze attributes of data elements constituting the data and extract, from the data, the data elements divided for each attribute;
a first operation display unit configured to display the extracted data elements and receive a selection of at least one data element from the extracted data elements; and
a first communication unit configured to transmit the at least one data element for display on a second operation display unit, the second operation display unit configured to display the at least one data element and receive a selection, from the at least one data element, of at least one data element to be imported into an application running on an information processing apparatus; and
wherein the first operation display unit is further configured to receive attribute-specifying input to specify the attribute of the data element to be displayed, and wherein the data elements divided for each attribute are divided based on the attribute-specifying input.
3. An image forming apparatus comprising:
an input unit configured to receive data of a document image read from a document;
an analysis unit configured to analyze attributes of data elements constituting the data and extract, from the data, the data elements divided for each attribute;
a first operation display unit configured to display the extracted data elements and receive a selection of at least one data element from the extracted data elements; and
a first communication unit configured to transmit the at least one data element for display on a second operation display unit, the second operation display unit configured to display the at least one data element and receive a selection, from the at least one data element, of at least one data element to be imported into an application running on an information processing apparatus; and
wherein the information processing apparatus further comprises a search unit configured to search for at least one data element stored as shared data over a network in response to a request received through the second operation display unit.
4. The image forming apparatus of claim 3, wherein the search unit searches for the at least one data element stored as shared data in the image forming apparatus through the network.
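As a non-authoritative illustration of the search unit recited in claims 3 and 4, a search over data elements stored as shared data on networked devices (including the image forming apparatus) might be sketched as follows; the repository interface and the substring match are assumptions, not claim limitations.

    # Illustrative only: searching shared data elements over the network.
    def search_shared_elements(query, shared_repositories):
        # shared_repositories: objects exposing list_elements(), e.g. the shared
        # storage of the image forming apparatus reached through the network
        hits = []
        for repository in shared_repositories:
            for element in repository.list_elements():
                if query.lower() in element.name.lower():
                    hits.append(element)
        return hits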
US13/470,575 2006-03-20 2012-05-14 Image forming apparatus, electronic mail delivery server, and information processing apparatus Expired - Fee Related US9060085B2 (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
US13/470,575 US9060085B2 (en) 2006-03-20 2012-05-14 Image forming apparatus, electronic mail delivery server, and information processing apparatus

Applications Claiming Priority (6)

Application Number Priority Date Filing Date Title
JP2006077165 2006-03-20
JP2006-077165 2006-03-20
JP2007054528A JP4897520B2 (en) 2006-03-20 2007-03-05 Information distribution system
JP2007-054528 2007-03-05
US11/716,574 US8201072B2 (en) 2006-03-20 2007-03-12 Image forming apparatus, electronic mail delivery server, and information processing apparatus
US13/470,575 US9060085B2 (en) 2006-03-20 2012-05-14 Image forming apparatus, electronic mail delivery server, and information processing apparatus

Related Parent Applications (1)

Application Number Title Priority Date Filing Date
US11/716,574 Continuation US8201072B2 (en) 2006-03-20 2007-03-12 Image forming apparatus, electronic mail delivery server, and information processing apparatus

Publications (2)

Publication Number Publication Date
US20120224232A1 US20120224232A1 (en) 2012-09-06
US9060085B2 (en) 2015-06-16

Family

ID=38558981

Family Applications (2)

Application Number Title Priority Date Filing Date
US11/716,574 Expired - Fee Related US8201072B2 (en) 2006-03-20 2007-03-12 Image forming apparatus, electronic mail delivery server, and information processing apparatus
US13/470,575 Expired - Fee Related US9060085B2 (en) 2006-03-20 2012-05-14 Image forming apparatus, electronic mail delivery server, and information processing apparatus

Family Applications Before (1)

Application Number Title Priority Date Filing Date
US11/716,574 Expired - Fee Related US8201072B2 (en) 2006-03-20 2007-03-12 Image forming apparatus, electronic mail delivery server, and information processing apparatus

Country Status (2)

Country Link
US (2) US8201072B2 (en)
JP (1) JP4897520B2 (en)

Families Citing this family (34)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US8566946B1 (en) 2006-04-20 2013-10-22 Fireeye, Inc. Malware containment on connection
US8726178B2 (en) * 2006-11-10 2014-05-13 Ricoh Company, Ltd. Device, method, and computer program product for information retrieval
JP5003469B2 (en) * 2007-12-27 2012-08-15 日本電気株式会社 Information processing device
JP5262121B2 (en) * 2008-01-11 2013-08-14 株式会社リコー Information processing apparatus, symbol display method, and symbol display program
JP2009169536A (en) * 2008-01-11 2009-07-30 Ricoh Co Ltd Information processor, image forming apparatus, document creating method, and document creating program
JP4581008B2 (en) * 2008-07-11 2010-11-17 シャープ株式会社 Image transmission device
JP5323421B2 (en) * 2008-08-21 2013-10-23 京セラドキュメントソリューションズ株式会社 Image forming apparatus, image forming system, and image creating method
JP5073734B2 (en) * 2009-12-08 2012-11-14 シャープ株式会社 Network system, air cleaner for image forming apparatus, and control method of air cleaner in network system
JP5495802B2 (en) * 2010-01-06 2014-05-21 キヤノン株式会社 Image processing apparatus, image processing apparatus control method, and program
JP5879731B2 (en) * 2011-04-14 2016-03-08 株式会社リコー Document management program, document management method, and document management apparatus
JP6008564B2 (en) * 2011-06-20 2016-10-19 三菱電機株式会社 Air conditioner
KR101872255B1 (en) * 2011-11-09 2018-08-02 삼성전자 주식회사 Providing Method For Touch Functions and System, Touch Pen supporting the same
KR20140055133A (en) * 2012-10-30 2014-05-09 삼성전자주식회사 User terminal apparatus and control method thereof
US8577719B2 (en) * 2012-01-13 2013-11-05 Darlene Danece Bainbridge Strategic quality support system
USD711895S1 (en) * 2012-06-06 2014-08-26 Apple Inc. Display screen or portion thereof with graphical user interface
JP5700014B2 (en) * 2012-09-26 2015-04-15 コニカミノルタ株式会社 Printing system, printing control program, and recording medium
US9330223B2 (en) 2012-09-28 2016-05-03 International Business Machines Corporation Optical rule checking for detecting at risk structures for overlay issues
KR20150043109A (en) * 2013-10-14 2015-04-22 삼성전자주식회사 Electronic device and method for controlling object display
US10067921B2 (en) * 2014-03-20 2018-09-04 Nec Corporation Information processing apparatus, information processing method, and information processing program
US9591015B1 (en) 2014-03-28 2017-03-07 Fireeye, Inc. System and method for offloading packet processing and static analysis operations
US9690933B1 (en) 2014-12-22 2017-06-27 Fireeye, Inc. Framework for classifying an object as malicious with machine learning for deploying updated predictive models
US10075455B2 (en) 2014-12-26 2018-09-11 Fireeye, Inc. Zero-day rotating guest image profile
US9838417B1 (en) 2014-12-30 2017-12-05 Fireeye, Inc. Intelligent context aware user interaction for malware detection
US10148693B2 (en) 2015-03-25 2018-12-04 Fireeye, Inc. Exploit detection system
US9438613B1 (en) 2015-03-30 2016-09-06 Fireeye, Inc. Dynamic content activation for automated analysis of embedded objects
US9483644B1 (en) 2015-03-31 2016-11-01 Fireeye, Inc. Methods for detecting file altering malware in VM based analysis
US10642941B2 (en) * 2015-04-09 2020-05-05 International Business Machines Corporation System and method for pipeline management of artifacts
US9594904B1 (en) 2015-04-23 2017-03-14 Fireeye, Inc. Detecting malware based on reflection
USD769315S1 (en) * 2015-07-09 2016-10-18 Monthly Gift Inc. Display screen or portion thereof with graphical user interface
USD768709S1 (en) * 2015-07-31 2016-10-11 Gen-Probe Incorporated Display screen or portion thereof with animated graphical user interface
JP6657771B2 (en) * 2015-10-23 2020-03-04 富士通株式会社 Choice information presentation system, method, and program
JP2019009710A (en) * 2017-06-28 2019-01-17 京セラドキュメントソリューションズ株式会社 Image reading device and image forming apparatus
USD879832S1 (en) * 2017-09-12 2020-03-31 Compagnie Generale Des Etablissments Michelin Display screen or portion thereof with icon
JP7379876B2 (en) * 2019-06-17 2023-11-15 株式会社リコー Character recognition device, document file generation method, document file generation program

Family Cites Families (3)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
JPH09297750A (en) * 1996-03-08 1997-11-18 Nikon Corp Source file editing device
US6636259B1 (en) * 2000-07-26 2003-10-21 Ipac Acquisition Subsidiary I, Llc Automatically configuring a web-enabled digital camera to access the internet
JP3620716B2 (en) * 2000-07-26 2005-02-16 日本電気株式会社 Remote operation system, remote operation method thereof, and recording medium recording remote operation program

Patent Citations (47)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US5812862A (en) 1993-05-10 1998-09-22 Apple Computer, Inc. Computer-human interface system for compound documents
US20070168426A1 (en) 1993-10-01 2007-07-19 Collaboration Properties, Inc. Storing and Accessing Media Files
US20080216111A1 (en) 1994-05-20 2008-09-04 United Video Properties, Inc. Electronic television program guide schedule system and method
US6088709A (en) 1994-09-20 2000-07-11 Fuji Xerox Co., Ltd. Document processing unit for enabling collective seeing of comments added separately by different persons to document copies containing the same text part
JPH096901A (en) 1995-06-22 1997-01-10 Oki Electric Ind Co Ltd Document reader
US7027809B1 (en) 1996-03-14 2006-04-11 Bellsouth Intellectual Property Corporation Systems and methods for providing visibility to SS7 integrated on-line networks
US6157738A (en) 1996-06-17 2000-12-05 Canon Kabushiki Kaisha System for extracting attached text
US5892843A (en) 1997-01-21 1999-04-06 Matsushita Electric Industrial Co., Ltd. Title, caption and photo extraction from scanned document images
JPH10285325A (en) 1997-04-08 1998-10-23 Oki Electric Ind Co Ltd Facsimile reception transfer system
US6335742B1 (en) 1997-07-24 2002-01-01 Ricoh Company, Ltd. Apparatus for file management and manipulation using graphical displays and textual descriptions
US6613100B2 (en) 1997-11-26 2003-09-02 Intel Corporation Method and apparatus for displaying miniaturized graphical representations of documents for alternative viewing selection
US20030184598A1 (en) 1997-12-22 2003-10-02 Ricoh Company, Ltd. Television-based visualization and navigation interface
US20090183140A9 (en) 1998-02-17 2009-07-16 Peck Joseph E Automatic generation of application domain specific graphical programs
US6173287B1 (en) 1998-03-11 2001-01-09 Digital Equipment Corporation Technique for ranking multimedia annotations of interest
US6931600B1 (en) 1999-05-07 2005-08-16 Autodesk, Inc. Integrating into an application objects that are provided over a network
CN1245935A (en) 1999-06-21 2000-03-01 李虹 Full-automatic recognizing, logging-in and retrieval system of calling cards
US6732102B1 (en) 1999-11-18 2004-05-04 Instaknow.Com Inc. Automated data extraction and reformatting
US7171456B2 (en) 2000-05-31 2007-01-30 Fuji Xerox Co., Ltd. Document data transmission device, document data transmission reception system, and document data transmission method
US20020029259A1 (en) 2000-07-26 2002-03-07 Nec Corporation Remote operation system and remote operation method thereof
US7392484B1 (en) 2000-10-13 2008-06-24 Hewlett-Packard Development Company, L.P. Method and system for capturing, storing, sharing, and managing notes taken during a computer based meeting
US20020069254A1 (en) 2000-12-01 2002-06-06 Matsushita Graphic Communication Systems, Inc. Server apparatus and method for electronic mail transmission control
US20030033221A1 (en) 2000-12-28 2003-02-13 Tetsuji Fuwa Electronic shop management system
JP2002318535A (en) 2001-04-23 2002-10-31 Canon Inc Information processor, information processing method, information processing system and information processing program
US7280475B2 (en) 2001-05-28 2007-10-09 Matsushita Electric Industrial Co., Ltd. Data transfer device
US20030018727A1 (en) 2001-06-15 2003-01-23 The International Business Machines Corporation System and method for effective mail transmission
JP2003111050A (en) 2001-09-27 2003-04-11 Olympus Optical Co Ltd Video distribution server and video reception client system
US20040243575A1 (en) 2002-01-16 2004-12-02 Fujitsu Limited Information searching system
US7596750B2 (en) 2002-02-13 2009-09-29 Canon Kabushiki Kaisha Data processing method, program, and information processor
US7047238B2 (en) 2002-02-21 2006-05-16 Hitachi, Ltd. Document retrieval method and document retrieval system
US20030160813A1 (en) 2002-02-25 2003-08-28 Raju Narayan D. Method and apparatus for a dynamically-controlled remote presentation system
US20070156925A1 (en) 2002-06-21 2007-07-05 Daisuke Shigeta Information Processing Apparatus, Information Equipment, Information Processing System, Automatic Backup Method, Data Transmission and Reception Method, Automatic Backup Program, Data Transmission and Reception Program, and Record Medium having the Programs Recorded thereon
US20040017375A1 (en) 2002-07-29 2004-01-29 Microsoft Corporation In-situ digital inking for applications
JP2004096198A (en) 2002-08-29 2004-03-25 Murata Mach Ltd Communication terminal device and facsimile machine
JP2005182735A (en) 2003-05-14 2005-07-07 Sharp Corp Document data display device, cellular phone device, document data display method, document data display program, computer-readable recording medium with it recorded, document data printer, document data output device, document data output method, document data output program, and computer-readable recording medium with it recorded
US20070055931A1 (en) 2003-05-14 2007-03-08 Hiroaki Zaima Document data output device capable of appropriately outputting document data containing a text and layout information
US20050135677A1 (en) 2003-12-22 2005-06-23 Seevers Timothy S. Method and apparatus for acquiring image data from a scanned document
JP2005244411A (en) 2004-02-25 2005-09-08 Ricoh Co Ltd Network combined machine
US20050259297A1 (en) 2004-04-28 2005-11-24 Oki Data Corporation Image forming apparatus and verifier
US20060072162A1 (en) 2004-08-31 2006-04-06 Tetsuya Nakamura Mail data processing method, mail server, program for mail server, terminal device and program for terminal device
US20060117108A1 (en) 2004-12-01 2006-06-01 Richard Salisbury Touch screen control
US20060174020A1 (en) 2005-02-01 2006-08-03 Walls Jeffrey J Systems and methods for providing reliable multicast messaging in a multi-node graphics system
US20060224940A1 (en) 2005-04-04 2006-10-05 Sam Lee Icon bar display for video editing system
US8046673B2 (en) 2005-11-07 2011-10-25 Business Objects Software Ltd. Apparatus and method for facilitating trusted business intelligence through data context
US7689933B1 (en) 2005-11-14 2010-03-30 Adobe Systems Inc. Methods and apparatus to preview content
US20070220425A1 (en) 2006-03-14 2007-09-20 Fabrice Matulic Electronic mail editing device, image forming apparatus, and electronic mail editing method
US20080043942A1 (en) 2006-08-03 2008-02-21 Omar Cardona E-Mail Delivery Options Usability Tool
US20090167783A1 (en) 2007-12-28 2009-07-02 Canon Kabushiki Kaisha Image display method, image display apparatus, image recording apparatus, and image pickup apparatus

Non-Patent Citations (4)

* Cited by examiner, † Cited by third party
Title
Gaber et al., Cost-Efficient Mining Techniques for Data Streams, ACM 2004, pp. 109-114.
Kurki et al., Agents in Delivering Personalized Content Based on Semantic Metadata, Google 1999, pp. 84-93.
Podlazov, Nonblockability of Switches with the Cayley Graph Structure for Serial Transfer of Data Blocks. Generalized Hypercubes and Multidimensional Grids, Google 2003, pp. 153-166.
Shneiderman et al., Direct Annotation: A Drag-and-Drop Strategy for Labeling Photos, IEEE 2000, pp. 88-95.

Also Published As

Publication number Publication date
US20120224232A1 (en) 2012-09-06
US8201072B2 (en) 2012-06-12
JP4897520B2 (en) 2012-03-14
US20070230778A1 (en) 2007-10-04
JP2007288769A (en) 2007-11-01

Similar Documents

Publication Publication Date Title
US9060085B2 (en) Image forming apparatus, electronic mail delivery server, and information processing apparatus
JP5223284B2 (en) Information retrieval apparatus, method and program
JP4909576B2 (en) Document editing apparatus, image forming apparatus, and program
US10108584B2 (en) Host apparatus and screen capture control method thereof
US8726178B2 (en) Device, method, and computer program product for information retrieval
US7610274B2 (en) Method, apparatus, and program for retrieving data
US20070220425A1 (en) Electronic mail editing device, image forming apparatus, and electronic mail editing method
US7899246B2 (en) Image display device, image display method, and computer product
JP2007286864A (en) Image processor, image processing method, program, and recording medium
JP2007034847A (en) Retrieval apparatus and retrieval method
JP2007150858A5 (en)
US11303769B2 (en) Image processing system that computerizes documents with notification of labeled items, control method thereof, and storage medium
JP2010072842A (en) Image processing apparatus and image processing method
EP3716598B1 (en) Image processing apparatus, method for controlling the same, and program
CN108881665B (en) Information processing apparatus, information processing method, and computer program
JPH11316821A (en) Method and device for information processing
JP2008052496A (en) Image display device, method, program and recording medium
US9531901B2 (en) Image processing apparatus, image processing method, and non-transitory computer readable medium
US8463804B2 (en) File processing apparatus operating a file based on previous execution history of the file
CN100588218C (en) Image forming apparatus and electronic mail delivery server,
JP7404943B2 (en) Information processing device and information processing program
JP2011095889A (en) Image reader
JP2011086996A (en) Image processing multifunction machine
US11245802B2 (en) Information processing apparatus and non-transitory computer readable medium
US10445375B2 (en) Retrieval information generation device, image processing device, and non-transitory computer readable medium

Legal Events

Date Code Title Description
STCF Information on status: patent grant

Free format text: PATENTED CASE

FEPP Fee payment procedure

Free format text: PAYOR NUMBER ASSIGNED (ORIGINAL EVENT CODE: ASPN); ENTITY STATUS OF PATENT OWNER: LARGE ENTITY

FEPP Fee payment procedure

Free format text: MAINTENANCE FEE REMINDER MAILED (ORIGINAL EVENT CODE: REM.); ENTITY STATUS OF PATENT OWNER: LARGE ENTITY

LAPS Lapse for failure to pay maintenance fees

Free format text: PATENT EXPIRED FOR FAILURE TO PAY MAINTENANCE FEES (ORIGINAL EVENT CODE: EXP.); ENTITY STATUS OF PATENT OWNER: LARGE ENTITY

STCH Information on status: patent discontinuation

Free format text: PATENT EXPIRED DUE TO NONPAYMENT OF MAINTENANCE FEES UNDER 37 CFR 1.362

FP Lapsed due to failure to pay maintenance fee

Effective date: 20190616