US20030179301A1 - Tagging for transferring image data to destination - Google Patents
Tagging for transferring image data to destination
- Publication number
- US20030179301A1 (application US 10/414,191)
- Authority
- US
- United States
- Prior art keywords
- data
- tag
- host
- image
- capturing device
- Prior art date
- Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
- Abandoned
Classifications
-
- H—ELECTRICITY
- H04—ELECTRIC COMMUNICATION TECHNIQUE
- H04N—PICTORIAL COMMUNICATION, e.g. TELEVISION
- H04N1/00—Scanning, transmission or reproduction of documents or the like, e.g. facsimile transmission; Details thereof
- H04N1/32—Circuits or arrangements for control or supervision between transmitter and receiver or between image input and image output device, e.g. between a still-image camera and its memory or between a still-image camera and a printer device
- H04N1/32101—Display, printing, storage or transmission of additional information, e.g. ID code, date and time or title
- H04N1/32128—Display, printing, storage or transmission of additional information, e.g. ID code, date and time or title attached to the image data, e.g. file header, transmitted message header, information on the same page or in the same computer file as the image
-
- G—PHYSICS
- G11—INFORMATION STORAGE
- G11B—INFORMATION STORAGE BASED ON RELATIVE MOVEMENT BETWEEN RECORD CARRIER AND TRANSDUCER
- G11B27/00—Editing; Indexing; Addressing; Timing or synchronising; Monitoring; Measuring tape travel
- G11B27/10—Indexing; Addressing; Timing or synchronising; Measuring tape travel
- G11B27/102—Programmed access in sequence to addressed parts of tracks of operating record carriers
- G11B27/105—Programmed access in sequence to addressed parts of tracks of operating record carriers of operating discs
-
- H—ELECTRICITY
- H04—ELECTRIC COMMUNICATION TECHNIQUE
- H04N—PICTORIAL COMMUNICATION, e.g. TELEVISION
- H04N1/00—Scanning, transmission or reproduction of documents or the like, e.g. facsimile transmission; Details thereof
- H04N1/00127—Connection or combination of a still picture apparatus with another apparatus, e.g. for storage, processing or transmission of still picture signals or of information associated with a still picture
-
- H—ELECTRICITY
- H04—ELECTRIC COMMUNICATION TECHNIQUE
- H04N—PICTORIAL COMMUNICATION, e.g. TELEVISION
- H04N5/00—Details of television systems
- H04N5/76—Television signal recording
- H04N5/765—Interface circuits between an apparatus for recording and another apparatus
- H04N5/77—Interface circuits between an apparatus for recording and another apparatus between a recording apparatus and a television camera
- H04N5/772—Interface circuits between an apparatus for recording and another apparatus between a recording apparatus and a television camera the recording apparatus and the television camera being placed in the same enclosure
-
- H—ELECTRICITY
- H04—ELECTRIC COMMUNICATION TECHNIQUE
- H04N—PICTORIAL COMMUNICATION, e.g. TELEVISION
- H04N2201/00—Indexing scheme relating to scanning, transmission or reproduction of documents or the like, and to details thereof
- H04N2201/0077—Types of the still picture apparatus
- H04N2201/0084—Digital still camera
-
- H—ELECTRICITY
- H04—ELECTRIC COMMUNICATION TECHNIQUE
- H04N—PICTORIAL COMMUNICATION, e.g. TELEVISION
- H04N2201/00—Indexing scheme relating to scanning, transmission or reproduction of documents or the like, and to details thereof
- H04N2201/32—Circuits or arrangements for control or supervision between transmitter and receiver or between image input and image output device, e.g. between a still-image camera and its memory or between a still-image camera and a printer device
- H04N2201/3201—Display, printing, storage or transmission of additional information, e.g. ID code, date and time or title
- H04N2201/3212—Display, printing, storage or transmission of additional information, e.g. ID code, date and time or title of data relating to a job, e.g. communication, capture or filing of an image
- H04N2201/3222—Display, printing, storage or transmission of additional information, e.g. ID code, date and time or title of data relating to a job, e.g. communication, capture or filing of an image of processing required or performed, e.g. forwarding, urgent or confidential handling
-
- H—ELECTRICITY
- H04—ELECTRIC COMMUNICATION TECHNIQUE
- H04N—PICTORIAL COMMUNICATION, e.g. TELEVISION
- H04N2201/00—Indexing scheme relating to scanning, transmission or reproduction of documents or the like, and to details thereof
- H04N2201/32—Circuits or arrangements for control or supervision between transmitter and receiver or between image input and image output device, e.g. between a still-image camera and its memory or between a still-image camera and a printer device
- H04N2201/3201—Display, printing, storage or transmission of additional information, e.g. ID code, date and time or title
- H04N2201/3225—Display, printing, storage or transmission of additional information, e.g. ID code, date and time or title of data relating to an image, a page or a document
- H04N2201/3242—Display, printing, storage or transmission of additional information, e.g. ID code, date and time or title of data relating to an image, a page or a document of processing required or performed, e.g. for reproduction or before recording
-
- H—ELECTRICITY
- H04—ELECTRIC COMMUNICATION TECHNIQUE
- H04N—PICTORIAL COMMUNICATION, e.g. TELEVISION
- H04N2201/00—Indexing scheme relating to scanning, transmission or reproduction of documents or the like, and to details thereof
- H04N2201/32—Circuits or arrangements for control or supervision between transmitter and receiver or between image input and image output device, e.g. between a still-image camera and its memory or between a still-image camera and a printer device
- H04N2201/3201—Display, printing, storage or transmission of additional information, e.g. ID code, date and time or title
- H04N2201/3274—Storage or retrieval of prestored additional information
- H04N2201/3277—The additional information being stored in the same storage device as the image data
-
- H—ELECTRICITY
- H04—ELECTRIC COMMUNICATION TECHNIQUE
- H04N—PICTORIAL COMMUNICATION, e.g. TELEVISION
- H04N5/00—Details of television systems
- H04N5/76—Television signal recording
- H04N5/765—Interface circuits between an apparatus for recording and another apparatus
-
- H—ELECTRICITY
- H04—ELECTRIC COMMUNICATION TECHNIQUE
- H04N—PICTORIAL COMMUNICATION, e.g. TELEVISION
- H04N5/00—Details of television systems
- H04N5/76—Television signal recording
- H04N5/907—Television signal recording using static stores, e.g. storage tubes or semiconductor memories
Definitions
- the present invention relates to digital camera technology. More specifically, the present invention relates to a method and apparatus for tagging images and videos to facilitate transferring them to a specified destination for easy playback, etc.
- One recently adopted digital camera standard, DPOF Version 1.0, available on “http:www.panasonic.co.jp/avc/video/dpof/dpof_110/white_e.htm,” discloses some functions that may be performed in certain digital cameras.
- DPOF Version 1.0 allows the following functions to be specified on the camera: (1) multi-image print, (2) specifying the size of printed images, (3) auto-transfer via Internet and fax, and (4) auto play for slide show.
- the multi-image-print feature enables one to specify the number of images to be printed on one sheet.
- the specifying-the-size-of-printed-images feature enables one to specify the size of the printed images, so that one could use the prints for a variety of applications, such as displays and certificate materials.
- the auto-transfer-via-Internet-and-fax feature enables one to attach a message to image data and send the resulting data via email.
- the auto-play-for-slide-show feature enables one to specify the images to be played back on liquid crystal displays of digital cameras, video projectors, or PCs for slide show.
- the present invention provides a method, and corresponding apparatus, for attaching a tag to data generated by an image capturing device for post processing in a remote location.
- the data generated by an image capturing device can include, amongst others, still image data, video data, and/or audio data.
- a tag is affixed to the data (which can be still image data, video data, or audio data) within the image capturing device.
- the tag can be attached within the header of the data file.
- the tag could be in the data portion of a still image file, or within a stream of data in a video file.
- the tag is an alias for predetermined instructions according to which the image data is to be processed.
- the tag can be a resolution tag, a cropping tag, a red-eye removal tag, and a quick-send tag.
- the tag comprises an alias for a destination address to which the image data is to be sent.
- the tag can be “Mom” while the destination address to which this alias corresponds can be mom's email address.
- the image capturing device itself may contain only a listing of aliases, one or more of which can be selected by the user. In a remote location, these aliases can then be matched up with the actual destination addresses. This remote location can be a personal computer, a cell phone, a PDA (Personal Digital Assistant), a remote server, etc.
- the tag includes identifying information about the content of the image (e.g., names of the subjects, location, event, etc.). Further, in one embodiment, the tag contains data indicating the status of the tag information within the system, and the status of actions taken within the system to process the tagged image/video data. It will be understood by one skilled in the art that this concept of tags being matched up with other information at a remote location need not be limited to tags comprising an alias for a destination address, but rather can be applied to various kinds of tags as well.
- FIG. 1A is a block diagram of a digital camera according to one embodiment of the invention.
- FIG. 1B depicts a block diagram of an image data file according to one embodiment of the invention.
- FIG. 2 depicts a block diagram of a computer system according to one embodiment of the invention.
- FIG. 3 depicts a simplified flow chart of a method of image tagging for post processing according to one embodiment of the invention.
- FIG. 4 depicts a simplified flow chart of attaching an image tag according to the method of FIG. 3.
- FIG. 5 depicts a simplified flow chart of processing image data according to the method of FIG. 3.
- FIG. 6 depicts a block diagram of a computer connected to a communication network according to one embodiment of the invention.
- FIG. 7A is a block diagram of a system in accordance with one embodiment of the present invention.
- FIG. 7B is a flowchart illustrating the working of a system in accordance with an embodiment of the present invention.
- a digital camera 50 includes an imaging device 100 and a processing system 150 .
- the imaging device includes a lens 102 having an iris, a filter 104 , an image sensor 106 , a timing generator 108 , an analog signal processor (ASP) 110 , an analog-to-digital (A/D) converter 112 , a digital signal processor (DSP) 114 , and one or more motors 116 .
- imaging device 100 captures an image of object 101 via reflected light impacting image sensor 106 along an optical path 118 .
- Image sensor 106 generates a set of raw image data representing the captured image.
- the raw image data is then routed through ASP 110 , A/D converter 112 and DSP 114 .
- DSP 114 has outputs coupled to timing generator 108 , ASP 110 , and motors 116 to control these components.
- DSP 114 also has its output coupled to processing system 150 via a bus 151 .
- the raw image data are transmitted to system 150 and processed therein.
- processing system 150 includes a bus interface 152 , a processor 154 , a read-only memory (ROM) 156 , an input device 158 , a random access memory (RAM) 160 , an I/O interface 162 , a flash memory 164 , a non-volatile memory 166 , and an internal bus 168 .
- Bus interface 152 is a bi-directional first-in, first-out interface for receiving the raw image data and control signals passed between system 150 and DSP 114 .
- Processor 154 executes programming instructions stored in ROM 156 and RAM 160 to perform various operations.
- ROM 156 generally stores a set of computer readable program instructions which control how processor 154 accesses, transforms and outputs the image data.
- ROM 156 also stores a start-up program or file that enables a user to access the images stored in the flash memory using any computer, whether or not companion driver software is installed.
- Input device 158 generally includes one or more control buttons (not shown) which are used to input operating signals that are translated by processor 154 into an image capture request, an operating mode selection request, and various control signals for imaging device 100 .
- I/O Interface 162 is coupled to internal bus 168 and has an external port connector (not shown) that can be used to couple digital camera 50 to a computer 200 for viewing and editing the image data stored in flash memory 164 .
- the camera and computer may be coupled to each other via a communication link 163 .
- I/O interface 162 is a universal serial bus (USB) port.
- Flash memory 164 stores the image data processed by the processor as image data files (see FIG. 1B).
- flash memory 164 is a removable flash card or disk (e.g., SmartMedia™, CompactFlash™, SecureDigital (SD) card, etc.) so that a user may replace a full flash card with a new flash card to store additional image data.
- types of non-volatile memory other than flash cards may be used.
- FIG. 1B illustrates a schematic block diagram of an image data file 180 including a header 182 , a compressed image data 184 , and a tag field 186 .
- Header 182 includes information identifying corresponding image data file 180 .
- Image data 184 represents an image captured with camera 50 .
- the image data is generally in a compressed form, e.g., in JPEG format, to conserve memory space of flash card 164 .
- Tag field 186 includes tags, e.g., a resolution tag 188 , a cropping tag 190 , a red-eye removal tag 192 , and a quick-send tag 194 , that provide instructions to computer 200 for post processing, as well as other types of tags.
- non-volatile memory 166 stores an image counter whose current value becomes an identifier for each new set of image data captured by camera 50 .
- the counter is preferably incremented each time a new image is captured.
- computer 200 includes an I/O interface 202 which can be used to couple computer 200 to camera 50 .
- the computer also includes a bus 204 for communicating data, a central processing unit (CPU) 206 coupled to bus 204 to process data, a memory 206 coupled to bus 204 to store data and instructions to be executed by CPU 206 , and a communication interface 208 coupled to a network via a communication link 209 .
- the communication interface may be an integrated services digital network (ISDN) card, modem, local area network (LAN) card, or the like.
- Computer 200 is coupled to a display device 210 , e.g., a monitor, via bus 204 to display information and an input device 212 , e.g., a keyboard, to input data to the computer.
- computer 200 serves as a host device for viewing, editing, and otherwise processing image data files received from camera 50 via I/O interface 202 , as explained in more detail later in connection with FIG. 5.
- another electronic device e.g., a cellular phone or portable digital assistant, may be used as the host device in place of the computer.
- the system may consist of an interface (possibly wireless) in the camera itself communicating with a router through which the camera can send data directly to a server etc.
- the image data files can be transmitted to the host device via an intermediary electronic device, such as a flash card reader (not shown).
- a process 300 depicts a method of image tagging for post processing according to one embodiment of the present invention.
- a user takes a picture using camera 50 , from which raw image data is generated by image sensor 106 .
- Processing unit 154 processes the raw image data, where the processing includes compressing the data to a more manageable size (step 304 ).
- the image data is compressed into a Joint Photographic Expert Group (JPEG) format.
- the user views the image corresponding to the saved image data and selects one or more tags to be attached to the image data (step 306 ) via the user interface of the input device 158 .
- computer 200 can process the image data files automatically, without specific user initiatives, upon receiving them from camera 50 according to the instructions specified in the tag.
- camera 50 does not require a powerful processor since heavy data processing functions may be allocated to be performed in computer 200 .
- the inconvenience to the user of editing or otherwise processing the image data on computer 200 is reduced.
- tags are stored in tag field 186 of the image data file.
- the tags are stored in the tag field in the stream of a video file.
- the tags are interleaved or encoded into the still image or video data itself.
- the image data file is transmitted to computer 200 either by linking camera 50 to the computer, or by removing the flash card and inserting it into a flash card reader that is coupled to the computer (step 308 ).
- Computer 200 processes the image data file according to the tags in the tag field (step 310 ).
- the tags are extracted on the host PC, and each tag is then looked up in a database on the PC.
- each tag in the database has one or more destination addresses associated with it.
- the PC sends the image data, along with these associated destination addresses, to a server.
- the image data may be automatically modified on the PC for optimized delivery through the server to a recipient, based upon various factors (e.g., file type, connection, internet congestion, recipient's platform and conduit, etc.).
- the server then delivers the image data to each of the specified destination addresses. Another example is as follows. If the image data has a tag instructing the computer to increase the pixel resolution of the image data from one megapixel to three megapixels, the computer performs an appropriate algorithm to increase the resolution size of the image.
- a method 400 depicts a method of attaching tags to the image data according to one implementation of the present invention.
- the user views the image data stored in RAM 160 or flash card 164 .
- generally, digital cameras, such as camera 50 , include a liquid crystal display (not shown) for viewing images.
- a tag with appropriate instructions is attached to the image data (step 406 ). It should be noted that a tag may simply be an integer which is interpreted on the host to indicate an action, or set of data, or both.
- the user is prompted as to whether he or she is finished with the tagging (step 408 ). If so, method 400 ends and process 300 continues on to step 308 . If not, steps 404 to 408 are repeated.
- camera 50 enables the user to attach one or more of the following tags to the image data: (1) resolution tag 188 , (2) cropping tag 190 , (3) red-eye removal tag 192 , (4) quick-send tag 194 (see FIG. 1B), and various other types of tags.
- the resolution tag instructs a host device, e.g., computer 200 , to automatically convert the image data from one resolution size to another resolution size.
- camera 50 is configured to save images at a resolution of one megapixel.
- the user may view the captured image, and if he or she likes the picture and wishes to enlarge it, the user may select to have the image data converted to a greater resolution, e.g., three megapixels.
- a method of converting an image from one resolution size to another resolution size is described in U.S. Pat. No. 6,058,248, which is incorporated by reference herein for all purposes.
- the user may select from a plurality of resolution sizes or manually input the desired resolution size.
- the user may elect to have the image converted to a lower resolution size, particularly when the user wishes to email the image data to another person, to minimize use of the communication bandwidth.
- the camera may be programmed to automatically attach a resolution tag without specific user initiative. For example, the user may set the default resolution size as two megapixels and require the camera to automatically attach a resolution tag to image data generated, where the resolution tag instructs a host device to convert the image data from two megapixels to one megapixel.
- the cropping tag instructs computer 200 to automatically remove undesired portions of a picture.
- the user may view the captured image and decide which portion of the image to retain and which to delete.
- a method of cropping an image data is described in U.S. Pat. No. 5,978,519, which is incorporated by reference herein for all purposes.
- the red-eye removal tag instructs computer 200 to automatically edit the image to remove the red-eye effects on pictures taken in poorly lighted environments.
- Pictures taken in poorly lighted environments may cause the pupils of people or animals to take on a red tint.
- the user may review the picture taken and, if necessary, attach a tag instructing the computer to automatically remove the red-eye effects on the picture.
- the camera may be provided with a light sensor (not shown) and programmed to attach a red-eye removal tag automatically whenever a picture is taken in a poorly lighted environment.
- the red-eye removal tags may be automatically attached to the images captured whenever a flash light (not shown) of the camera goes off.
- a method of removing the red-eye effects is described in U.S. Pat. No. 6,134,339, which is incorporated by reference herein for all purposes.
- the quick-send tag instructs computer 200 to automatically send the image data to another person or entity via a communication network.
- Camera 50 may include a plurality of communication addresses, e.g., email addresses, in ROM 156 .
- the quick-send tag may comprise an alias, rather than the actual address of the recipient. The use of such aliases is discussed in greater detail below with reference to FIGS. 7A and 7B.
- tags can be of various other types.
- the tag includes identifying information about the content of the image (e.g., names of the subjects, location, event, etc.).
- the tag contains data indicating the status of the tag information within the system, and the status of actions taken within the system to process the tagged image/video data.
- the user may attach types of tags other than those listed above, e.g., a stitching tag that instructs computer 200 to stitch two or more pictures together.
- a method 500 depicts a method of processing image data file 330 in computer 200 .
- computer 200 receives the image data file via I/O interface 202 .
- I/O interface 202 of computer 200 is coupled to I/O interface 162 of camera 50 to enable the computer to receive the image data file.
- flash card 164 is removed from camera 50 and inserted into a flash card reader, which is coupled to I/O interface 202 of the computer, to transmit the image data file to computer 200 .
- the camera and flash card reader may be coupled to the computer via a physical connection or wireless connection.
- computer 200 checks tag field 186 of the received image data file to determine whether corresponding image data 184 needs to be processed in a particular manner according to tags in tag field 186 (step 504 ).
- the received image data file is first stored in memory 208 before the tag field is checked by CPU 206 . If CPU 206 determines that the tag field does not contain any tag, then image data 184 is processed according to a default setting, i.e., the image data is decompressed and displayed on display device 210 (step 510 ). Thereafter, the user may edit, print, send, or perform other functions on the image data 184 using input device 212 .
- at step 506 , if there are one or more tags (e.g., resolution tag 188 and quick-send tag 194 ) in the image tag field, CPU 206 retrieves one of the tags to be processed (step 508 ).
- the tags may be retrieved in any order, or in the order in which they were attached in method 400 .
- the resolution tag is first retrieved, where the resolution tag instructs the computer to convert the image data from the resolution size of one mega-pixel to the resolution size of three megapixels.
- the computer processes the image data by performing an appropriate algorithm to increase the resolution size (step 510 ).
- a resulting image data file 180 ′ with new image data 184 ′ of three megapixels is saved in memory 208 of the computer. Thereafter, the image with the increased resolution size is displayed on display device 210 to the user for viewing, editing, or performing other functions on the image data.
- the computer checks the tag field to determine if there are any other tags in the tag field. Since another tag exists in tag field 186 in this exemplary implementation, steps 508 to 510 are repeated. The remaining tag, quick-send tag 194 , is retrieved (step 508 ). In one implementation, these subsequent steps may be performed prior to displaying the new image data 184 ′ on the display device.
- the tag instructs the computer to transmit the image data file to one or more recipients, e.g., Jay Feldis and Bryed Billerbeck. In one embodiment, the tag may include the email addresses of these recipients. In another embodiment, the tag is simply an alias or reference (e.g., an integer) to an entry in a database on the host.
- the entry in the database matches up each tag with one or more destination addresses, as explained below in more detail with reference to FIGS. 7A&B.
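- By way of illustration only (not part of the patent disclosure), the host-side loop of FIG. 5 — check the tag field, retrieve tags one at a time, and process the image data accordingly — could be sketched as follows; the tag names, handler functions, and tag-field representation are assumptions of this sketch.
```python
# Illustrative sketch of the FIG. 5 host-side loop (steps 504-510).
# Tag names and handlers are hypothetical; real tags could equally be integers.

def handle_resolution(image_bytes, params):
    # Placeholder: convert the image to the resolution named in params.
    return image_bytes

def handle_red_eye(image_bytes, params):
    # Placeholder: run a red-eye removal algorithm on the image.
    return image_bytes

def handle_quick_send(image_bytes, params):
    # Placeholder: resolve the alias in params to destination addresses
    # via a host-side look-up table and queue the image for sending.
    return image_bytes

HANDLERS = {
    "resolution": handle_resolution,
    "red_eye": handle_red_eye,
    "quick_send": handle_quick_send,
}

def process_image_file(image_bytes, tag_field):
    """tag_field is a list of (tag_name, params) pairs read from tag field 186."""
    if not tag_field:
        return image_bytes  # no tags: fall through to the default display path
    for tag_name, params in tag_field:
        handler = HANDLERS.get(tag_name)
        if handler is not None:
            image_bytes = handler(image_bytes, params)
    return image_bytes
```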
- the computer connects to a communication network via link 209 .
- computer 200 is coupled to a plurality of remote computers 230 to 234 via a communication network 240 , e.g., the Internet.
- the computer initiates an Internet connection, and once connected to the Internet, the image data file is sent to the email addresses of Jay Feldis and Bryed Billerbeck.
- the data is sent via a local email client.
- the data is sent through a server, which then sends the data to the recipients (e.g., via SMTP mail software component on the server).
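- A minimal sketch of the server-side delivery path mentioned above, using Python's standard smtplib and email modules; the SMTP host, sender address, and subject line are placeholders rather than details from the patent.
```python
import smtplib
from email.message import EmailMessage

def relay_image(jpeg_bytes, filename, recipients,
                smtp_host="smtp.example.com", sender="noreply@example.com"):
    """Send one image file to a list of recipient email addresses via SMTP."""
    msg = EmailMessage()
    msg["From"] = sender
    msg["To"] = ", ".join(recipients)
    msg["Subject"] = "Image forwarded from a tagged camera upload"
    msg.set_content("The attached image was forwarded automatically.")
    msg.add_attachment(jpeg_bytes, maintype="image", subtype="jpeg",
                       filename=filename)
    with smtplib.SMTP(smtp_host) as server:
        server.send_message(msg)
```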
- Jay and Bryed having access to remote computers 230 and 232 , respectively, may retrieve the image data file transmitted by computer 200 .
- the transmitted image data file is the original image data file 180 transmitted by camera 50 , where the image data is formatted to have a resolution size of one mega-pixel.
- the remote computers may automatically increase the resolution size of image data 184 to three megapixels according to the instructions provided in resolution tag 188 before displaying the image on their respective display devices.
- the transmitted image data file may be the new image data file 180 ′ with the new image data 184 ′ of three megapixels, thereby eliminating the need to process the resolution tag before displaying the image.
- One advantage of transmitting the original image data file 184 is bandwidth conservation during transmission of the data file.
- FIG. 7A is a block diagram of a system in accordance with one embodiment of the present invention.
- FIG. 7A comprises an image capturing device 710 , a host 720 , a network 730 , and a destination 740 .
- Data from the image capturing device 710 can be downloaded to the host 720 , and then be transferred to the destination 740 via the network 730 .
- the image capturing device 710 comprises a data storage module 712 , an alias table 714 , and a display module 716 .
- the image capturing device 710 is a digital camera.
- the data storage module 712 comprises only internal memory, such as NAND Flash, etc.
- the data storage module 712 comprises only external (or removable) memory, such as Compact Flash, Smart Media Card, SD, memory sticks, etc.
- the data storage module 712 comprises both internal and external memory.
- the alias table 714 is a listing of various aliases set up by the user.
- the user sets up the aliases on the image capturing device 710 itself.
- the user sets up the aliases on a host 720 , and these can then be downloaded to the image capturing device 710 , either directly (e.g., if the host 720 is a personal computer), or via the network 730 .
- the aliases can be stored as a table, a list, or in any other format. Table 1 below provides an example of an alias table.
TABLE 1 (Alias): Mom, Family, Jay, Bryed, Friends, Work
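- The key design point is that the camera holds only the alias names of Table 1, never the destination addresses. A small illustrative sketch (the function and variable names are invented):
```python
# The camera-side alias table 714 holds only names (Table 1); the mapping to
# real destination addresses lives on the host in look-up table 724.
CAMERA_ALIASES = ["Mom", "Family", "Jay", "Bryed", "Friends", "Work"]

def selected_aliases(chosen_indexes):
    """Return the alias strings the user highlighted on the display module."""
    return [CAMERA_ALIASES[i] for i in chosen_indexes]

# e.g., the user highlights entries 0 and 4 on the camera's LCD:
# selected_aliases([0, 4]) -> ["Mom", "Friends"]
```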
- the display module 716 can display several things, including but not limited to displaying a preview of data about to be captured by the user, displaying previously captured data for review by the user, and displaying a choice of aliases from which the user can select one or more aliases.
- the display module 716 comprises a Liquid Crystal Display (LCD) or a Liquid Crystal Monitor (LCM). Further, the display module 716 can also be used to receive user selections/instructions, such as which images are to be sent to what destinations, etc. In one embodiment, this can be done by displaying a user interface on the display module 716 .
- the host 720 is a remote server.
- the image capturing device 710 communicates with the remote server via the network 730 .
- a digital camera may not need to download pictures to a local personal computer, but instead, may directly communicate with a remote server over the network.
- the host 720 is a personal computer, a cell-phone, a networked storage device (a media library), a Personal Digital Assistant (PDA), etc.
- the image capturing device 710 communicates directly with the host 720 , without having to go through the network 730 . Such an embodiment is explained in further detail with reference to FIG. 7B below.
- the host 720 includes a receiving module 722 and a look-up table 724 .
- the receiving module 722 receives the image data from the image capturing device 710 .
- the look-up table 724 can be created, in one embodiment, on the host 720 .
- the tags, as well as the instructions associated with each tag are entered on the host 720 .
- the tags are then downloaded on to the image capturing device 710 .
- the look-up table 724 is harvested from other applications on the host 720 , or from elsewhere on the network 730 .
- the look-up table 724 comprises aliases mapped to destination addresses. Thus, if some data is associated with a specific alias, the look-up table 724 serves to translate the alias to a destination address. It is to be noted that a single alias could refer to a single destination address, or to multiple destination addresses (i.e., to a group of destination addresses).
- the destination addresses can be any type of address, such as email addresses, Instant Messenger (IM) addresses, cell phone addresses, etc.
- this table is simply illustrative. The information can be stored as a table, a list, or in any other format.
- Table 2 provides an example of a look-up table 724 . It is to be noted that in other embodiments, table 724 also includes other information such as IM buddy names, the address of a storage location such as a data storage location, media library, or website, etc.
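- Since the body of Table 2 is not reproduced in this text, the look-up table sketched below is hypothetical; only the “Mom” → jane@yahoo.com pairing is taken from the example given later, and the remaining entries and address types are invented for illustration.
```python
# Hypothetical host-side look-up table 724: each alias maps to one or more
# typed destination addresses (email, IM, cell phone, storage location).
LOOKUP_TABLE = {
    "Mom":     [("email", "jane@yahoo.com")],
    "Family":  [("email", "family-list@example.com"), ("cell", "+1-555-0100")],
    "Friends": [("im", "friends_buddy_group"), ("email", "friend@example.com")],
    "Work":    [("storage", "https://medialibrary.example.com/inbox")],
}

def resolve_alias(alias):
    """Translate an alias tag into its destination address(es)."""
    return LOOKUP_TABLE.get(alias, [])
```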
- the network 730 is any type of network such as a wide area network (WAN) or a local area network (LAN).
- the wide area network may include the Internet, Internet2, and the like.
- the local area network may include an Intranet, which may be a network based on, for example, TCP/IP, belonging to an organization accessible only by the organization's members, employees, or others with authorization.
- the local area network may also be a network such as, for example, Netware™ from Novell Corporation (Provo, Utah) or Windows NT from Microsoft Corporation (Redmond, Wash.).
- the destination 740 may be specified by an email address, an instant messenger address, an e-frame address, a cell phone number, and so on. It will be obvious to a person of ordinary skill in the art that the destination 740 is any destination where the data can be sent. For instance, it is possible for the destination to be the postal address of a person, where a DVD, video cassette, and/or hard copies of photos could be delivered (via a store etc. where the DVDs etc. are prepared).
- FIG. 7B is a block diagram of another embodiment in accordance with the present invention.
- the components of the system include an image capturing device 710 , a host 720 , networks 730a and 730b, a server 735 , and a destination 740 .
- the image capturing device 710 is as described above with reference to FIG. 7A. However, unlike in FIG. 7A, the image capturing device 710 does not connect directly to the network 730 . Instead, the image capturing device connects to host 720 , via which it connects to the network 730a.
- the host 720 is a personal computer (PC). In other embodiments, the host 720 is a cell-phone, a networked storage device (a media library), a Personal Digital Assistant (PDA), etc.
- the host 720 includes a receiving module 722 and a look-up table 724 , which have been described with reference to FIG. 7A.
- the host 720 also has an image modification module 725 .
- the image modification module 725 modifies (e.g., compresses) images/videos to allow quick download by recipients, and to keep costs down. In one embodiment, such compression and image modification is configurable, is specified by the server 735 , and happens dynamically when the host 720 connects to the server 735 .
- the host 720 then connects to a server 735 through a network 730a.
- the server 735 connects to the destination 740 through a network 730b.
- networks 730a and 730b could be the same network, or could be distinct networks.
- the server 735 serves various functions. For instance, in one embodiment, the server 735 could create thumbnails of the images/videos to send to the destination 740 . In one embodiment, the server 735 creates Hyper Text Markup Language (HTML) emails to send to the destination 740 . In one embodiment, the server 735 creates email text in the appropriate language.
- HTML Hyper Text Markup Language
- the server 735 reformats data to optimize delivery to the recipient based on the data type, connection, network traffic, etc. In still another embodiment, the server 735 selects appropriate communication channels (e.g., email, instant messenger, cell phone, etc.) based on factors such as address type, file size, etc.
- FIG. 7C is a flowchart illustrating how the systems in FIGS. 7A & 7B would work in accordance with one embodiment of the present invention.
- the data, such as still image data, video data, or audio data, is selected by the user; the image capturing device 710 thus receives selection of the data 750 .
- the data from the data storage 712 is displayed on the display module 716 .
- This data may be displayed as individual data items (e.g., individual still images) or as a collection of these (e.g., thumbnails of several still images), etc.
- the user can then select desired data via the display module 716 , and the image capturing device 710 receives selection of the data 750 .
- the user also selects one or more tags to be associated with the data.
- the image capturing device 710 thus receives the selection of a tag(s) to be associated with the selected image data 752 .
- the tag can refer to destination(s) to which the selected data is to be sent.
- the tags can refer to the subject matter of the picture being taken. (E.g., the picture being taken is of “mom” or of “Christmas” etc.).
- when the data being selected is video data, the tag can be a video editing tag.
- Such a video editing tag may include instructions to trim one or both ends of the video file, mark one or more significant frames in the video file for special processing (e.g., display on host), indicate which frame to represent as a thumbnail, etc.
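- One illustrative way to represent such a video editing tag is sketched below; the field names and units are assumptions of this sketch.
```python
from dataclasses import dataclass, field
from typing import List, Optional

@dataclass
class VideoEditTag:
    """Illustrative video-editing tag carried in a video file's tag field."""
    trim_start_ms: int = 0                 # trim this much from the head of the clip
    trim_end_ms: int = 0                   # trim this much from the tail of the clip
    marked_frames: List[int] = field(default_factory=list)  # frames for special handling on the host
    thumbnail_frame: Optional[int] = None  # frame to represent as the thumbnail

# e.g., trim two seconds from each end and use frame 120 as the thumbnail:
clip_tag = VideoEditTag(trim_start_ms=2000, trim_end_ms=2000, thumbnail_frame=120)
```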
- the selection of the tags is performed by the user by interacting with the display module 716 .
- the display module 716 displays a list of possible tags. An example of such a list comprising aliases is provided in Table 1 above. The user can thus select one of the tags displayed. For instance, the user may choose to send a particular video clip to “Mom.”
- the user inputs (e.g., using a stylus, or by using a “keyboard” that appears on the display module 716 ) the tag using the display module 716 , rather than selecting it from a list.
- the user manipulates certain input devices (e.g., buttons on the image capturing device 710 ) to make the selections (e.g., of the data and the tags).
- the image capturing device 710 receives 752 the selection of one or more tags to be associated with the selected data.
- the selected tag(s) is then attached 754 to the selected data.
- the tag is inserted into the header of the data.
- the tag is encrypted and embedded into the data itself. This can be done by interleaving the tag with pixels of image data. In one embodiment, this interleaving is done in a way that is not visible to the user when the data is displayed. In another embodiment, this interleaved tag is extracted before the data is displayed to the user.
- existing technologies, e.g., the JPEG-EXIF metadata standard, can be used to insert tags into still images and/or videos.
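- As one concrete, purely illustrative realization of the EXIF route, the sketch below uses the Pillow library and stores the alias in the standard ImageDescription EXIF field (0x010E); the choice of field and the "dest-alias:" prefix are assumptions of this sketch, not something the patent specifies.
```python
from PIL import Image  # Pillow; assumed available on the device or host

IMAGE_DESCRIPTION = 0x010E  # standard EXIF/TIFF ImageDescription field

def write_alias_tag(src_path, dst_path, alias):
    """Store an alias tag (e.g. "Mom") in a JPEG's EXIF metadata."""
    img = Image.open(src_path)
    exif = img.getexif()
    exif[IMAGE_DESCRIPTION] = "dest-alias:" + alias
    img.save(dst_path, exif=exif)

def read_alias_tag(path):
    """Return the alias stored by write_alias_tag, or None if absent."""
    value = str(Image.open(path).getexif().get(IMAGE_DESCRIPTION, ""))
    return value.split("dest-alias:", 1)[1] if "dest-alias:" in value else None
```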
- the modified data (i.e., the data including the destination alias) is downloaded 756 to a host 720 .
- the table look-up is then performed 758 to identify the instructions corresponding to the tags. For example, a look-up may identify which destination or destinations are specified by each alias tag.
- An example of a look-up table is provided in Table 2 above. It can be seen from Table 2 that, for example, if “Mom” were the alias tag inserted 754 into the data, then the data would be emailed to jane@yahoo.com.
- the present invention is not limited to sending data to email addresses alone. Rather, the present invention can be used to send data to instant messenger addresses, web-sites specified by URLs, and any other such destinations which can be specified using a destination address. Further, it should be noted that, as mentioned above, the tags could identify things other than destination addresses, such as the subject matter of the photograph, etc.
- the data is intelligently formatted 759 .
- intelligent formatting 759 includes sending each recipient a single message (e.g., email), even if, for example, multiple still images are to be sent to him.
- FIG. 7D illustrates some details.
- each piece of data (e.g., a still image, a video file, etc.) is uploaded to the host 720 (and/or the server 735 ) only once.
- the tags associated with each piece of data are then looked up in a look-up table as described above.
- this look-up identifies the instructions associated with each tag, e.g., the specific destination address(es) corresponding to each tag, to which the still image and/or video data is to be sent.
- the data for transfer is then formatted intelligently such that each destination address is followed by a list of still images and/or video data to be sent to that address. In this way, each recipient only receives a single message, regardless of how many pieces of data are received by him.
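- The grouping just described amounts to inverting the per-image alias tags into a per-destination outbox, so that each recipient is sent exactly one message. A sketch, assuming a resolve_alias helper like the hypothetical one shown earlier:
```python
from collections import defaultdict

def group_by_destination(tagged_items, resolve_alias):
    """tagged_items: list of (filename, [alias, ...]) pairs, each uploaded once.
    resolve_alias: maps an alias to a list of (address_type, address) pairs.
    Returns {address: [filename, ...]}, i.e. one outgoing message per recipient."""
    outbox = defaultdict(list)
    for filename, aliases in tagged_items:
        for alias in aliases:
            for _address_type, address in resolve_alias(alias):
                if filename not in outbox[address]:
                    outbox[address].append(filename)
    return dict(outbox)

# e.g. two stills tagged "Mom" and one video tagged "Mom" and "Friends" yield
# one message per distinct address, each listing everything meant for that recipient.
```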
- intelligent formatting 759 includes optimizing data transfer. Such optimization may include generation of thumbnails of still image/video data, generating any text accompanying the data, and so on.
- data is optimized based on the data type, recipient's network connection, etc. In one embodiment, such information is obtained dynamically from the network. In another embodiment, such information is included in the look-up table.
- the data is processed 760 in accordance with the instructions associated with the tag.
- the tag is a destination tag
- the data is sent to the destination specified by the destination address(es) corresponding to the alias tag(s).
- the data can be sent through various communication channels (e.g., email, instant messaging, cell phone, etc.).
- the recipient receives the message containing still image/video data on only one of these addresses. In another embodiment, the recipient receives the message on all of these addresses.
- the tag is a subject matter tag
- the data is sorted by it (e.g., stored in a folder named by the subject matter corresponding to the subject matter tag).
- the table look-up step 758 , the intelligent formatting of the data step 759 , and the processing of data step 760 may be performed on the same host, or on different hosts.
- the table look-up is performed on the host 720
- the intelligent formatting of the data 759 and the processing of data 760 are performed on the server 735 .
- each of these three steps is performed on the host 720 .
- each of these three steps is performed on the server 735 . It will be obvious to one of ordinary skill in the art that any combination of these steps being performed on the host 720 and the server 735 is possible. It is also possible to have several hosts and/or servers for performing these various steps.
- the present invention may be embodied in other specific forms without departing from the essential characteristics thereof.
- the captured data could be audio data associated with an image, or separate from an image.
- the tag could be embedded in a field of the audio data, in the data itself, or in a header. Accordingly, the foregoing description is intended to be illustrative, but not limiting, of the scope of the invention which is set forth in the following claims.
Landscapes
- Engineering & Computer Science (AREA)
- Multimedia (AREA)
- Signal Processing (AREA)
- General Engineering & Computer Science (AREA)
- Studio Devices (AREA)
- Television Signal Processing For Recording (AREA)
- Editing Of Facsimile Originals (AREA)
Abstract
A method and system for sending image data on an image capturing apparatus to specified destination(s). The image capturing apparatus includes a list of aliases for the various destinations which can be specified. The data to be sent, and the alias specifying the destination to which it is to be sent, are selected. The alias is then inserted into the data, and the modified data is downloaded to a host. The host contains a look-up table specifying the destination address(es) corresponding to the alias. The specified data is sent to this destination address.
Description
- This application is a continuation-in-part of U.S. application Ser. No. 09/898,476, filed Jul. 3, 2001, entitled “Image Tagging for Post-Processing” invented by John Feldis, which is incorporated herein by reference in its entirety.
- The present invention relates to digital camera technology. More specifically, the present invention relates to a method and apparatus for tagging images and videos to facilitate transferring them to a specified destination for easy playback, etc.
- Digital cameras have been gaining wide acceptance among consumers recently. At the same time, wide usage of email and the Internet has led to increased electronic communication between consumers. A natural result of these two factors is increased desire among users to share still image and video files over the Internet. Still images and videos can currently be sent over the Internet by downloading the image data to a host, and then associating certain image data with a destination address on the host.
- One type of digital camera is disclosed in U.S. Pat. No. 6,167,469, assigned to Agilent Technologies, Inc. The digital camera enables digital images to be transmitted to a selected destination. The application discloses that the address of the destination is input into the camera. A voice message may be attached to the digital images and sent to the selected destination.
- One recently adopted digital camera standard, DPOF Version 1.0, available on “http:www.panasonic.co.jp/avc/video/dpof/dpof_110/white_e.htm,” discloses some functions that may be performed in certain digital cameras. DPOF Version 1.0 allows the following functions to be specified on the camera: (1) multi-image print, (2) specifying the size of printed images, (3) auto-transfer via Internet and fax, and (4) auto play for slide show. The multi-image-print feature enables one to specify the number of images to be printed on one sheet. The specifying-the-size-of-printed-images feature enables one to specify the size of the printed images, so that one could use the prints for a variety of applications, such as displays and certificate materials. The auto-transfer-via-Internet-and-fax feature enables one to attach a message to image data and send the resulting data via email. The auto-play-for-slide-show feature enables one to specify the images to be played back on liquid crystal displays of digital cameras, video projectors, or PCs for slide show.
- Another digital camera standard, Exif Reader, available on “http://www.takenet.or.jp/˜ryuui/minisoft/exifread/english/,” provides numerous TIFF/EP tags that may be attached to the image data generated by digital cameras.
- None of the above prior art, however, appears to address the need to provide users with a digital camera with data that is easily transferable to a destination, without inputting destination addresses into the camera itself. In addition, none of the above prior art appears to address the need to include image tags in both still images as well as in video/audio data.
- Thus there exists a need for a digital camera where still image data as well as video/audio data can be easily transferred to a destination without inputting destination addresses into the camera.
- The present invention provides a method, and corresponding apparatus, for attaching a tag to data generated by an image capturing device for post processing in a remote location. It is to be noted that the data generated by an image capturing device can include, amongst others, still image data, video data, and/or audio data.
- In one embodiment, a tag is affixed to the data (which can be still image data, video data, or audio data) within the image capturing device. In one embodiment of the present invention, the tag can be attached within the header of the data file. In another embodiment, the tag could be in the data portion of a still image file, or within a stream of data in a video file.
- In one embodiment, the tag is an alias for predetermined instructions according to which the image data is to be processed. Amongst other things, the tag can be a resolution tag, a cropping tag, a red-eye removal tag, and a quick-send tag. In one embodiment, the tag comprises an alias for a destination address to which the image data is to be sent. For instance, the tag can be “Mom” while the destination address to which this alias corresponds can be mom's email address. The image capturing device itself may contain only a listing of aliases, one or more of which can be selected by the user. In a remote location, these aliases can then be matched up with the actual destination addresses. This remote location can be a personal computer, a cell phone, a PDA (Personal Digital Assistant), a remote server, etc.
- In one embodiment, the tag includes identifying information about the content of the image (e.g., names of the subjects, location, event, etc.). Further, in one embodiment, the tag contains data indicating the status of the tag information within the system, and the status of actions taken within the system to process the tagged image/video data. It will be understood by one skilled in the art that this concept of tags being matched up with other information at a remote location need not be limited to tags comprising an alias for a destination address, but rather can be applied to various kinds of tags as well.
- For a further understanding of the nature and advantages of the invention, reference should be made to the following description taken in conjunction with the accompanying drawings.
- FIG. 1A is a block diagram of a digital camera according to one embodiment of the invention.
- FIG. 1B depicts a block diagram of an image data file according to one embodiment of the invention.
- FIG. 2 depicts a block diagram of a computer system according to one embodiment of the invention.
- FIG. 3 depicts a simplified flow chart of a method of image tagging for post processing according to one embodiment of the invention.
- FIG. 4 depicts a simplified flow chart of attaching an image tag according to the method of FIG. 3.
- FIG. 5 depicts a simplified flow chart of processing image data according to the method of FIG. 3.
- FIG. 6 depicts a block diagram of a computer connected to a communication network according to one embodiment of the invention.
- FIG. 7A is a block diagram of a system in accordance with one embodiment of the present invention.
- FIG. 7B is a flowchart illustrating the working of a system in accordance with an embodiment of the present invention.
- The figures (or drawings) depict a preferred embodiment of the present invention for purposes of illustration only. It is noted that similar or like reference numbers in the figures may indicate similar or like functionality. One of skill in the art will readily recognize from the following discussion that alternative embodiments of the structures and methods disclosed herein may be employed without departing from the principles of the invention(s) herein. It is to be noted that the present invention relates to any type of data that can be captured by a digital camera, such as, but not limited to, still image, video, or audio data. For convenience, in some places “image” or other similar terms may be used in this application. Where applicable, these are to be construed as including any such data capturable by a digital camera.
- Referring to FIG. 1A, a digital camera 50 includes an imaging device 100 and a processing system 150. The imaging device includes a lens 102 having an iris, a filter 104, an image sensor 106, a timing generator 108, an analog signal processor (ASP) 110, an analog-to-digital (A/D) converter 112, a digital signal processor (DSP) 114, and one or more motors 116.
- In operation, imaging device 100 captures an image of object 101 via reflected light impacting image sensor 106 along an optical path 118. Image sensor 106 generates a set of raw image data representing the captured image. The raw image data is then routed through ASP 110, A/D converter 112 and DSP 114. DSP 114 has outputs coupled to timing generator 108, ASP 110, and motors 116 to control these components. DSP 114 also has its output coupled to processing system 150 via a bus 151. The raw image data are transmitted to system 150 and processed therein.
- In one embodiment, processing system 150 includes a bus interface 152, a processor 154, a read-only memory (ROM) 156, an input device 158, a random access memory (RAM) 160, an I/O interface 162, a flash memory 164, a non-volatile memory 166, and an internal bus 168.
- Bus interface 152 is a bi-directional first-in, first-out interface for receiving the raw image data and control signals passed between system 150 and DSP 114. Processor 154 executes programming instructions stored in ROM 156 and RAM 160 to perform various operations. ROM 156 generally stores a set of computer readable program instructions which control how processor 154 accesses, transforms and outputs the image data. In one implementation, ROM 156 also stores a start-up program or file that enables a user to access the images stored in the flash memory using any computer, whether or not companion driver software is installed.
- Input device 158 generally includes one or more control buttons (not shown) which are used to input operating signals that are translated by processor 154 into an image capture request, an operating mode selection request, and various control signals for imaging device 100. I/O interface 162 is coupled to internal bus 168 and has an external port connector (not shown) that can be used to couple digital camera 50 to a computer 200 for viewing and editing the image data stored in flash memory 164. The camera and computer may be coupled to each other via a communication link 163. In one implementation, I/O interface 162 is a universal serial bus (USB) port.
- Flash memory 164 stores the image data processed by the processor as image data files (see FIG. 1B). In one implementation, flash memory 164 is a removable flash card or disk (e.g., SmartMedia™, CompactFlash™, SecureDigital (SD) card, etc.) so that a user may replace a full flash card with a new flash card to store additional image data. In other implementations, types of non-volatile memory other than flash cards may be used.
- FIG. 1B illustrates a schematic block diagram of an image data file 180 including a header 182, compressed image data 184, and a tag field 186. Header 182 includes information identifying corresponding image data file 180. Image data 184 represents an image captured with camera 50. The image data is generally in a compressed form, e.g., in JPEG format, to conserve memory space of flash card 164. Tag field 186 includes tags, e.g., a resolution tag 188, a cropping tag 190, a red-eye removal tag 192, and a quick-send tag 194, that provide instructions to computer 200 for post processing, as well as other types of tags.
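- To make the layout of FIG. 1B concrete, the following is an illustrative in-memory representation of image data file 180; the field names and the dictionary encoding of tags are assumptions of this sketch rather than the patent's actual file format.
```python
from dataclasses import dataclass, field
from typing import Dict, List

@dataclass
class ImageDataFile:
    """Illustrative in-memory view of image data file 180 (FIG. 1B)."""
    header: Dict[str, str]                    # header 182: identifying information
    image_data: bytes                         # compressed image data 184 (e.g., JPEG)
    tag_field: List[dict] = field(default_factory=list)  # tag field 186

# Example: a capture carrying a resolution tag 188 and a quick-send tag 194.
photo = ImageDataFile(
    header={"id": "IMG_0042", "captured": "2003-03-20T10:15:00"},
    image_data=b"...jpeg bytes...",
    tag_field=[
        {"type": "resolution", "target_megapixels": 3},
        {"type": "quick_send", "alias": "Mom"},
    ],
)
```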
non-volatile memory 166 stores an image counter whose current value becomes an identifier for each new set of image data captured bycamera 50. The counter is preferably incremented each time a new image is captured. - Referring to FIG. 2,
computer 200 includes an I/O interface 202 which can be used to couplecomputer 200 tocamera 50. The computer also includes abus 204 for communicating data, a central process unit (CPU) 206 coupled tobus 204 to process data, amemory 206 coupled tobus 204 to store data and instructions to be executed byCPU 206, and acommunication interface 208 coupled to a network via acommunication link 209. The communication interface may be an integrated services digital network (ISDN) card, modem, local area network (LAN) card, or the like.Computer 200 is coupled to adisplay device 210, e.g., a monitor, viabus 204 to display information and aninput device 212, e.g., a keyboard, to input data to the computer. In operation,computer 200 serves as a host device for viewing, editing, and otherwise processing image data files received fromcamera 50 via I/O interface 202, as explained in more detail later in connection with FIG. 5. Alternatively, another electronic device, e.g., a cellular phone or portable digital assistant, may be used as the host device in place of the computer. In another embodiment, the system may consist of an interface (possibly wireless) in the camera itself communicating with a router through which the camera can send data directly to a server etc. Yet in other implementations, the image data files can be transmitted to the host device via an intermediary electronic device, such as a flash card reader (not shown). - Referring to FIG. 3, a
process 300 depicts a method of image tagging for post processing according to one embodiment of the present invention. Atstep 302, a user takes apicture using camera 50, from which raw image data is generated byimage sensor 106.Processing unit 154 processes the raw image data, where the processing includes compressing the data to a more manageable size (step 304). In one implementation, the image data is compressed into a Joint Photographic Expert Group (JPEG) format. The user views the image corresponding to the saved image data and selects one or more tags to be attached to the image data (step 306) via the user interface of theinput device 158. Thereafter, using the tags,computer 200 can process the image data files automatically, without specific user initiatives, upon receiving them fromcamera 50 according to the instructions specified in the tag. As a result,camera 50 does not require a powerful processor since heavy data processing functions may be allocated to be performed incomputer 200. At the same time, the inconvenience to the user of editing or otherwise processing the image data oncomputer 200 is reduced. - Once the user decides on the tags to be attached, they are attached to the image data and stored in flash memory or
flash card 164. That is, one or more tags are stored in tag field 186 of the image data file. In other embodiments, the tags are stored in the tag field in the stream of a video file. In yet other embodiments, the tags are interleaved or encoded into the still image or video data itself. The image data file is transmitted to computer 200 either by linking camera 50 to the computer, or by removing the flash card and inserting it into a flash card reader that is coupled to the computer (step 308). Computer 200 processes the image data file according to the tags in the tag field (step 310). In one embodiment, the tags are extracted on the host PC, and each tag is then looked up in a database on the PC. As explained in further detail below, in one embodiment, each tag in the database has one or more destination addresses associated with it. In one embodiment, the PC sends the image data, along with these associated destination addresses, to a server. In one embodiment, the image data may be automatically modified on the PC for optimized delivery through the server to a recipient, based upon various factors (e.g., file type, connection, Internet congestion, recipient's platform and conduit, etc.). The server then delivers the image data to each of the specified destination addresses. As another example, if the image data has a tag instructing the computer to increase the pixel resolution of the image data from one megapixel to three megapixels, the computer performs an appropriate algorithm to increase the resolution of the image.
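- As one hedged illustration of this host-side flow, the sketch below extracts the tags from a received file, resolves each tag against a local database, and hands the image and its destination addresses to a server. The names TAG_DATABASE and send_to_server, and the sample addresses, are hypothetical; the sketch reuses the ImageDataFile container shown earlier.

    # Hypothetical host-side database: tag id -> destination addresses.
    TAG_DATABASE = {
        4: ["jay@feldis.com", "bryed@yahoo.com"],   # e.g. a quick-send tag
    }

    def process_on_host(image_file, send_to_server):
        """Extract each tag, look it up, and forward the image with its destinations."""
        for tag in image_file.tag_field.tags:
            destinations = TAG_DATABASE.get(tag)
            if destinations:
                # The server performs the actual delivery to each address.
                send_to_server(image_file.image_data, destinations)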
- Referring back to step 306, a method 400 (FIG. 4) depicts a method of attaching tags to the image data according to one implementation of the present invention. At step 402, the user views the image data stored in RAM 160 or on flash card 164. Generally, digital cameras, such as camera 50, include a liquid crystal display (not shown) for viewing images. While viewing images on the liquid crystal display, the user may select an action to be performed subsequently by a host device (step 404). A tag with appropriate instructions is attached to the image data (step 406). It should be noted that a tag may simply be an integer which is interpreted on the host to indicate an action, a set of data, or both. The user is prompted to indicate whether he or she is finished with the tagging (step 408). If so, method 400 ends and process 300 continues on to step 308. If not, steps 404 to 408 are repeated.
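- A minimal sketch of this review-and-tag loop on the camera, assuming hypothetical helpers (display_image, prompt_for_action, user_is_done) that stand in for the camera's user interface, might look like the following.

    def tag_images_on_camera(image_file, display_image, prompt_for_action, user_is_done):
        """Method 400 sketch: let the user attach tags until finished (steps 402-408)."""
        display_image(image_file.image_data)               # step 402: review the image
        while True:
            action_tag = prompt_for_action()               # step 404: choose a host action
            image_file.tag_field.tags.append(action_tag)   # step 406: attach the tag
            if user_is_done():                             # step 408: finished tagging?
                break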
- In one implementation, camera 50 enables the user to attach one or more of the following tags to the image data: (1) resolution tag 188, (2) cropping tag 190, (3) red-eye removal tag 192, (4) quick-send tag 194 (see FIG. 1B), and various other types of tags. The resolution tag instructs a host device, e.g., computer 200, to automatically convert the image data from one resolution to another. For example, camera 50 is configured to save images at a resolution of one megapixel. The user may view the captured image and, if he or she likes the picture and wishes to enlarge it, may select to have the image data converted to a greater resolution, e.g., three megapixels. A method of converting an image from one resolution to another is described in U.S. Pat. No. 6,058,248, which is incorporated by reference herein for all purposes. - In one implementation, the user may select from a plurality of resolutions or manually input the desired resolution. In another implementation, the user may elect to have the image converted to a lower resolution, particularly when the user wishes to email the image data to another person, to minimize use of the communication bandwidth. In yet another implementation, the camera may be programmed to attach a resolution tag automatically, without specific user initiative. For example, the user may set the default resolution as two megapixels and require the camera to automatically attach a resolution tag to generated image data, where the resolution tag instructs a host device to convert the image data from two megapixels to one megapixel.
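- As an illustration of how a host might act on a resolution tag, the sketch below rescales a decoded JPEG toward a requested pixel count. It assumes the third-party Pillow library and a hypothetical target_pixels value carried by the tag; it is not the conversion method of U.S. Pat. No. 6,058,248.

    import io
    import math
    from PIL import Image   # Pillow, assumed available on the host

    def apply_resolution_tag(jpeg_bytes, target_pixels):
        """Scale the image so its total pixel count approximates target_pixels."""
        img = Image.open(io.BytesIO(jpeg_bytes))
        scale = math.sqrt(target_pixels / (img.width * img.height))
        new_size = (max(1, round(img.width * scale)), max(1, round(img.height * scale)))
        out = io.BytesIO()
        img.resize(new_size).save(out, format="JPEG")
        return out.getvalue()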
- The cropping tag instructs
computer 200 to automatically remove undesired portions of a picture. The user may view the captured image and decide which portions of the image to retain and which to delete. A method of cropping image data is described in U.S. Pat. No. 5,978,519, which is incorporated by reference herein for all purposes. - The red-eye removal tag instructs
computer 200 to automatically edit the image to remove red-eye effects from pictures taken in poorly lighted environments. Pictures taken in poorly lighted environments may cause the pupils of people or animals to take on a red tint. The user may review the picture taken and, if necessary, attach a tag instructing the computer to automatically remove the red-eye effects from the picture. In one implementation, the camera may be provided with a light sensor (not shown) and programmed to attach a red-eye removal tag automatically whenever a picture is taken in a poorly lighted environment. For example, red-eye removal tags may be attached automatically to captured images whenever a flash (not shown) of the camera fires. A method of removing red-eye effects is described in U.S. Pat. No. 6,134,339, which is incorporated by reference herein for all purposes. - The quick-send tag instructs
computer 200 to automatically send the image data to another person or entity via a communication network. Camera 50 may include a plurality of communication addresses, e.g., email addresses, in ROM 156. For each picture taken, the user may select one or more recipients to whom the picture should be sent. In one embodiment of the present invention, the quick-send tag may comprise an alias, rather than the actual address of the recipient. The use of such aliases is discussed in greater detail below with reference to FIGS. 7A and 7B. When the image data files corresponding to the pictures are received by computer 200, they are automatically sent to the selected addresses, as explained in more detail below. - As mentioned above, tags can be of various other types. For example, in one embodiment, the tag includes identifying information about the content of the image (e.g., names of the subjects, location, event, etc.). Further, in one embodiment, the tag contains data indicating the status of the tag information within the system, and the status of actions taken within the system to process the tagged image/video data. For example, a status tag could include information such as status=delivery attempted <date>; result=failed; retry pending. In other implementations, the user may attach types of tags other than those listed above, e.g., a stitching tag that instructs
computer 200 to stitch two or more pictures together. - Referring back to step 310, a method 500 (FIG. 5) depicts a method of processing image data file 330 in
computer 200. At step 502, computer 200 receives the image data file via I/O interface 202. In one implementation, I/O interface 202 of computer 200 is coupled to I/O interface 162 of camera 50 to enable the computer to receive the image data file. In another implementation, flash card 164 is removed from camera 50 and inserted into a flash card reader, which is coupled to I/O interface 202 of the computer, to transmit the image data file to computer 200. The camera and flash card reader may be coupled to the computer via a physical connection or wireless connection. - Using
CPU 206, computer 200 checks tag field 186 of the received image data file to determine whether corresponding image data 184 needs to be processed in a particular manner according to tags in tag field 186 (step 504). In one implementation, the received image data file is first stored in memory 208 before the tag field is checked by CPU 206. If CPU 206 determines that the tag field does not contain any tag, then image data 184 is processed according to a default setting, i.e., the image data is decompressed and displayed on display device 210 (step 510). Thereafter, the user may edit, print, send, or perform other functions on the image data 184 using input device 212. - On the other hand, at
step 506, if there are one or more tags (e.g., resolution tag 188 and quick-send tag 194) in the image tag field, CPU 206 retrieves one of the tags to be processed (step 508). The tags may be retrieved in any order or in the order in which they were attached in method 400. In this exemplary implementation, the resolution tag is retrieved first, where the resolution tag instructs the computer to convert the image data from a resolution of one megapixel to a resolution of three megapixels. The computer processes the image data by performing an appropriate algorithm to increase the resolution (step 510). In one implementation, a resulting image data file 180′ with new image data 184′ of three megapixels is saved in memory 208 of the computer. Thereafter, the image with the increased resolution is displayed on display device 210 to the user for viewing, editing, or performing other functions on the image data. - At
step 512, the computer checks the tag field to determine whether there are any other tags in the tag field. Since another tag exists in tag field 186 in this exemplary implementation, steps 508 to 510 are repeated. The remaining tag, quick-send tag 194, is retrieved (step 508). In one implementation, these subsequent steps may be performed prior to displaying the new image data 184′ on the display device. The tag instructs the computer to transmit the image data file to one or more recipients, e.g., Jay Feldis and Bryed Billerbeck. In one embodiment, the tag may include the email addresses of these recipients. In another embodiment, the tag is simply an alias or reference (e.g., an integer) to an entry in a database on the host. The entry in the database matches each tag with one or more destination addresses, as explained below in more detail with reference to FIGS. 7A and 7B. The computer connects to a communication network via link 209. As shown in FIG. 6, computer 200 is coupled to a plurality of remote computers 230 to 234 via a communication network 240, e.g., the Internet. - The computer initiates an Internet connection, and once connected to the Internet, the image data file is sent to the email addresses of Jay Feldis and Bryed Billerbeck. In one embodiment, the data is sent via a local email client. In one embodiment, the data is sent through a server, which then sends the data to the recipients (e.g., via an SMTP mail software component on the server).
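- For illustration only, a host or server could deliver a quick-send image over SMTP roughly as follows, using Python's standard smtplib and email modules. The relay address smtp.example.com, the sender address, and the subject line are placeholders, not values taken from the specification.

    import smtplib
    from email.message import EmailMessage

    def quick_send(jpeg_bytes, recipients, sender="camera-host@example.com"):
        """Email the tagged image to each destination address (quick-send tag)."""
        msg = EmailMessage()
        msg["Subject"] = "Photo from camera 50"
        msg["From"] = sender
        msg["To"] = ", ".join(recipients)
        msg.set_content("An image tagged for quick-send is attached.")
        msg.add_attachment(jpeg_bytes, maintype="image", subtype="jpeg",
                           filename="image.jpg")
        with smtplib.SMTP("smtp.example.com") as server:   # placeholder SMTP relay
            server.send_message(msg)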
Jay and Bryed, having access to the remote computers, may then view the image data file sent by computer 200. In one implementation, the transmitted image data file is the original image data file 180 transmitted by camera 50, where the image data is formatted to have a resolution of one megapixel. Therefore, upon receipt of image data file 180, the remote computers may automatically increase the resolution of image data 184 to three megapixels according to the instructions provided in resolution tag 188 before displaying the image on their respective display devices. Alternatively, the transmitted image data file may be the new image data file 180′ with the new image data 184′ of three megapixels, thereby eliminating the need to process the resolution tag before displaying the image. One advantage of transmitting the original image data file 180 is bandwidth conservation during transmission of the data file. - FIG. 7A is a block diagram of a system in accordance with one embodiment of the present invention. FIG. 7A comprises an
image capturing device 710, a host 720, a network 730, and a destination 740. Data from the image capturing device 710 can be downloaded to the host 720, and then be transferred to the destination 740 via the network 730. - The
image capturing device 710 comprises a data storage module 712, an alias table 714, and a display module 716. In one embodiment, the image capturing device 710 is a digital camera. In one embodiment, the data storage module 712 comprises only internal memory, such as NAND Flash, etc. In another embodiment, the data storage module 712 comprises only external (or removable) memory, such as Compact Flash, Smart Media Card, SD, memory sticks, etc. In yet another embodiment, the data storage module 712 comprises both internal and external memory. - In one embodiment, the alias table 714 is a listing of various aliases set up by the user. In one embodiment of the present invention, the user sets up the aliases on the
image capturing device 710 itself. In another embodiment of the present invention, the user sets up the aliases on a host 720, and these can then be downloaded to the image capturing device 710, either directly (e.g., if the host 720 is a personal computer), or via the network 730. It will be obvious to one of skill in the art that the use of the word "table" is simply illustrative. The aliases can be stored as a table, a list, or in any other format. Table 1 below provides an example of an alias table.
TABLE 1
Alias
Mom
Family
Jay
Bryed
Friends
Work
- The
display module 716 can display several things, including but not limited to a preview of data about to be captured by the user, previously captured data for review by the user, and a choice of aliases from which the user can select one or more aliases. In one embodiment, the display module 716 comprises a Liquid Crystal Display (LCD) or a Liquid Crystal Monitor (LCM). Further, the display module 716 can also be used to receive user selections/instructions, such as which images are to be sent to which destinations, etc. In one embodiment, this can be done by displaying a user interface on the display module 716. - In one embodiment of the present invention, the
host 720 is a remote server. In one embodiment, the image capturing device 710 communicates with the remote server via the network 730. As an example, a digital camera may not need to download pictures to a local personal computer, but instead, may directly communicate with a remote server over the network. In other embodiments, the host 720 is a personal computer, a cell-phone, a networked storage device (a media library), a Personal Digital Assistant (PDA), etc. In such embodiments, the image capturing device 710 communicates directly with the host 720, without having to go through the network 730. Such an embodiment is explained in further detail with reference to FIG. 7B below. - The
host 720 includes a receiving module 722 and a look-up table 724. The receiving module 722 receives the image data from the image capturing device 710. The look-up table 724 can be created, in one embodiment, on the host 720. The tags, as well as the instructions associated with each tag, are entered on the host 720. The tags are then downloaded onto the image capturing device 710. In another embodiment, the look-up table 724 is harvested from other applications on the host 720, or from elsewhere on the network 730. - In one embodiment, the look-up table 724 comprises aliases mapped to destination addresses. Thus, if some data is associated with a specific alias, the look-up table 724 serves to translate the alias to a destination address. It is to be noted that a single alias could refer to a single destination address, or to multiple destination addresses (i.e., to a group of destination addresses). The destination addresses can be any type of address, such as email addresses, Instant Messenger (IM) addresses, cell phone addresses, etc. Moreover, it will be obvious to one of skill in the art that the use of the word "table" is simply illustrative. The information can be stored as a table, a list, or in any other format. Table 2 provides an example of a look-up table 724. It is to be noted that in other embodiments, table 724 also includes other information such as IM buddy names, the address of a storage location such as a data storage location, media library, or website, etc.
TABLE 2
Alias | Address(es)
---|---
Mom | jane@yahoo.com
Family | john@hotmail.com, jane@yahoo.com, mary@abc123.com
Jay | jay@feldis.com
Bryed | bryed@yahoo.com
Friends | susan@hotmail.com, tim@yahoo.com, joanne@xyz.com, peter@rst.com
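- Purely as an illustration, look-up table 724 and the alias-to-address translation could be represented on the host as follows; the dictionary mirrors Table 2 and the helper name resolve_alias is hypothetical.

    # Look-up table 724 sketched as a dictionary: alias -> destination addresses.
    LOOKUP_TABLE = {
        "Mom": ["jane@yahoo.com"],
        "Family": ["john@hotmail.com", "jane@yahoo.com", "mary@abc123.com"],
        "Jay": ["jay@feldis.com"],
        "Bryed": ["bryed@yahoo.com"],
        "Friends": ["susan@hotmail.com", "tim@yahoo.com",
                    "joanne@xyz.com", "peter@rst.com"],
    }

    def resolve_alias(alias):
        """Translate an alias tag into one or more destination addresses."""
        return LOOKUP_TABLE.get(alias, [])

A single alias such as "Family" resolves to a group of addresses, so one tag selected on the camera can fan out to several destinations on the host.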
- The network 730 is any type of network such as a wide area network (WAN) or a local area network (LAN). The wide area network may include the Internet, the Internet 2, and the like. The local area network may include an Intranet, which may be a network based on, for example, TCP/IP, belonging to an organization and accessible only by the organization's members, employees, or others with authorization. The local area network may also be a network such as, for example, Netware™ from Novell Corporation (Provo, Utah) or Windows NT from Microsoft Corporation (Redmond, Wash.). - The
destination 740 may be specified by an email address, an instant messenger address, an e-frame address, a cell phone number, and so on. It will be obvious to a person of ordinary skill in the art that the destination 740 is any destination to which the data can be sent. For instance, it is possible for the destination to be the postal address of a person, where a DVD, video cassette, and/or hard copies of photos could be delivered (via a store or service where the DVDs etc. are prepared). - FIG. 7B is a block diagram of another embodiment in accordance with the present invention. The components of the system include an
image capturing device 710, a host 720, networks 730a and 730b, a server 735, and a destination 740. - The
image capturing device 710 is as described above with reference to FIG. 7A. However, unlike in FIG. 7A, the image capturing device 710 does not connect directly to the network 730. Instead, the image capturing device connects to the host 720, via which it connects to the network 730a. - In one embodiment of the present invention, the
host 720 is a personal computer (PC). In other embodiments, the host 720 is a cell-phone, a networked storage device (a media library), a Personal Digital Assistant (PDA), etc. The host 720 includes a receiving module 722 and a look-up table 724, which have been described with reference to FIG. 7A. In addition, in one embodiment, the host 720 also has an image modification module 725. Amongst other things, the image modification module 725 modifies (e.g., compresses) images/videos to satisfy quick download by recipients, and to keep costs down. In one embodiment, such compression and image modification is configurable, and is specified by the server 735, and happens dynamically when the host 720 connects to the server 735. - The
host 720 then connects to a server 735 through a network 730a. The server 735 connects to the destination 740 through a network 730b. It is to be noted that networks 730a and 730b may be the same network or different networks. The server 735 serves various functions. For instance, in one embodiment, the server 735 could create thumbnails of the images/videos to send to the destination 740. In one embodiment, the server 735 creates Hyper Text Markup Language (HTML) emails to send to the destination 740. In one embodiment, the server 735 creates email text in the appropriate language. In yet another embodiment, the server 735 reformats data to optimize delivery to the recipient based on the data type, connection, network traffic, etc. In still another embodiment, the server 735 selects appropriate communication channels (e.g., email, instant messenger, cell phone, etc.) based on factors such as address type, file size, etc.
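- As one hedged illustration of the thumbnail function, a server could generate a reduced preview with the third-party Pillow library as sketched below; the (160, 120) bound is an arbitrary example, not a value from the specification.

    import io
    from PIL import Image   # Pillow, assumed available on the server

    def make_thumbnail(jpeg_bytes, max_size=(160, 120)):
        """Return a small JPEG preview suitable for embedding in an HTML email."""
        img = Image.open(io.BytesIO(jpeg_bytes))
        img.thumbnail(max_size)            # shrinks in place, preserving aspect ratio
        out = io.BytesIO()
        img.save(out, format="JPEG")
        return out.getvalue()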
- FIG. 7C is a flowchart illustrating how the systems in FIGS. 7A and 7B would work in accordance with one embodiment of the present invention. Initially, the data (such as still image data, video data, or audio data) to be processed (e.g., to be sent to a destination) is selected by a user on the image capturing device 710. The image capturing device 710 thus receives selection of the data 750. In one embodiment, the data from the data storage 712 is displayed on the display module 716. This data may be displayed as individual data items (e.g., individual still images) or as a collection of these (e.g., thumbnails of several still images), etc. The user can then select desired data via the display module 716, and the image capturing device 710 receives selection of the data 750. - The user also selects one or more tags to be associated with the data. The
image capturing device 710 thus receives the selection of a tag(s) to be associated with the selected image data 752. In one embodiment, the tag can refer to the destination(s) to which the selected data is to be sent. In another embodiment, the tags can refer to the subject matter of the picture being taken (e.g., the picture being taken is of "mom" or of "Christmas," etc.). In yet another embodiment, when the data being selected is video data, the tag can be a video editing tag. Such a video editing tag may include instructions to trim one or both ends of the video file, mark one or more significant frames in the video file for special processing (e.g., display on the host), indicate which frame to represent as a thumbnail, etc. - In one embodiment, the selection of the tags is performed by the user by interacting with the
display module 716. It will be obvious to one skilled in the art that the user may select the tags in one of several ways. For instance, in one embodiment, the display module 716 displays a list of possible tags. An example of such a list comprising aliases is provided in Table 1 above. The user can thus select one of the tags displayed. For instance, the user may choose to send a particular video clip to "Mom." In another embodiment, the user inputs the tag using the display module 716 (e.g., using a stylus, or by using a "keyboard" that appears on the display module 716), rather than selecting it from a list. In another embodiment, the user manipulates certain input devices (e.g., buttons on the image capturing device 710) to make the selections (e.g., of the data and the tags). In any case, the image capturing device 710 receives 752 the selection of one or more tags to be associated with the selected data. - The selected tag(s) is then attached 754 to the selected data. In one embodiment, the tag is inserted into the header of the data. In another embodiment, the tag is encrypted and embedded into the data itself. This can be done by interleaving the tag with pixels of image data. In one embodiment, this interleaving is done in a way that is not visible to the user when the data is displayed. In another embodiment, this interleaved tag is extracted before the data is displayed to the user. In one embodiment, existing technologies (e.g., the JPEG-EXIF metadata standard) can be used to insert tags into still images and/or videos.
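- Purely as an illustration of the interleaving variant, a small integer tag could be hidden in the least-significant bits of the first few bytes of uncompressed pixel data, which changes the displayed image imperceptibly. This is a generic least-significant-bit sketch, not the encoding defined by the specification or by the JPEG-EXIF standard, and it assumes the tag is embedded before any lossy compression.

    def interleave_tag(pixel_bytes, tag, bits=16):
        """Hide a 16-bit tag in the lowest bit of each of the first 16 pixel bytes."""
        data = bytearray(pixel_bytes)
        for i in range(bits):
            data[i] = (data[i] & 0xFE) | ((tag >> i) & 1)   # overwrite only the lowest bit
        return bytes(data)

    def extract_tag(pixel_bytes, bits=16):
        """Recover the interleaved tag before the data is displayed to the user."""
        return sum((pixel_bytes[i] & 1) << i for i in range(bits))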
- The modified data (i.e., the data including the destination alias) is downloaded 756 to a
host 720. The table look-up is then performed 758 to identify the instructions corresponding to the tags. For example, a look-up may identify which destination or destinations are specified by each alias tag. An example of a look-up table is provided in Table 2 above. It can be seen from Table 2 that, for example, if "Mom" were the alias tag inserted 754 into the data, then the data would be emailed to jane@yahoo.com. If "Family" were the alias tag inserted 754 into the data, then the data would be emailed to john@hotmail.com, jane@yahoo.com, and mary@abc123.com. It will be obvious to one skilled in the art that the present invention is not limited to sending data to email addresses alone. Rather, the present invention can be used to send data to instant messenger addresses, websites specified by URLs, and any other such destinations which can be specified using a destination address. Further, it should be noted that, as mentioned above, the tags could identify things other than destination addresses, such as the subject matter of the photograph, etc. - In one embodiment, the data is intelligently formatted 759. In one embodiment, when the tag specifies a destination to which the data is to be sent,
intelligent formatting 759 includes sending each recipient a single message (e.g., email), even if, for example, multiple still images are to be sent to him. FIG. 7D illustrates some details. - Referring to FIG. 7D, it can be seen that the tags are associated with the data on the
image capturing device 710. In one embodiment, each piece of data (e.g., a still image, a video file, etc.) is uploaded to the host 720 (and/or the server 735) only once. The tags associated with each piece of data are then looked up in a look-up table as described above. The instructions associated with each tag are thus identified (e.g., the specific destination address(es) corresponding to each tag, to which the still image and/or video data is to be sent). The data for transfer is then formatted intelligently such that each destination address is followed by a list of the still images and/or video data to be sent to that address. In this way, each recipient only receives a single message, regardless of how many pieces of data are received by him.
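- This grouping can be pictured as inverting the item-to-alias mapping so that each destination address ends up with the full list of items addressed to it. The sketch below is illustrative and reuses the hypothetical resolve_alias helper from the look-up table example.

    from collections import defaultdict

    def group_by_destination(tagged_items, resolve_alias):
        """Build one outgoing message per address, however many items it carries."""
        outbox = defaultdict(list)
        for data, aliases in tagged_items:        # e.g. (jpeg_bytes, ["Mom", "Friends"])
            for alias in aliases:
                for address in resolve_alias(alias):
                    outbox[address].append(data)
        return outbox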
- In one embodiment, when the tag specifies a destination to which the data is to be sent, intelligent formatting 759 includes optimizing data transfer. Such optimization may include generation of thumbnails of still image/video data, generating any text accompanying the data, and so on. In one embodiment, the data is optimized based on the data type, the recipient's network connection, etc. In one embodiment, such information is obtained dynamically from the network. In another embodiment, such information is included in the look-up table. - Referring again to FIG. 7C, it can be seen that the data is processed 760 in accordance with the instructions associated with the tag. For example, if the tag is a destination tag, the data is sent to the destination specified by the destination address(es) corresponding to the alias tag(s). In one embodiment, the data can be sent through various communication channels (e.g., email, instant messaging, cell phone, etc.). In one embodiment, if a single recipient is identified by multiple different addresses (from the same or different communication channels), the recipient receives the message containing the still image/video data on only one of these addresses. In another embodiment, the recipient receives the message on all of these addresses.
- As another example, if the tag is a subject matter tag, the data is sorted by that tag (e.g., stored in a folder named for the subject matter corresponding to the subject matter tag).
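- For example, a host could sort files carrying subject matter tags into folders roughly as follows, using only Python's standard pathlib and shutil modules; the photo_library folder layout is an assumption made for illustration.

    import shutil
    from pathlib import Path

    def sort_by_subject(file_path, subject_tag, library_root="photo_library"):
        """Move a tagged file into a folder named after its subject matter tag."""
        folder = Path(library_root) / subject_tag       # e.g. photo_library/Christmas
        folder.mkdir(parents=True, exist_ok=True)
        return shutil.move(str(file_path), str(folder / Path(file_path).name))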
- It is to be noted that the table look-up
step 758, the intelligent formatting of the data step 759, and the processing of data step 760 may be performed on the same host, or on different hosts. For example, in one embodiment, the table look-up is performed on the host 720, while the intelligent formatting of the data 759 and the processing of data 760 are performed on the server 735. In another embodiment, each of these three steps is performed on the host 720. In yet another embodiment, each of these three steps is performed on the server 735. It will be obvious to one of ordinary skill in the art that any combination of these steps being performed on the host 720 and the server 735 is possible. It is also possible to have several hosts and/or servers for performing these various steps. - As will be understood by those skilled in the art, the present invention may be embodied in other specific forms without departing from the essential characteristics thereof. For example, the captured data could be audio data associated with an image, or separate from an image. The tag could be embedded in a field of the audio data, in the data itself, or in a header. Accordingly, the foregoing description is intended to be illustrative, but not limiting, of the scope of the invention, which is set forth in the following claims.
Claims (29)
1. An image capturing device for capturing data, the device comprising:
a recording module for converting sensed information into recorded data;
a processor coupled to the recording module to process the recorded data;
a memory coupled to the processor for storing the recorded data;
a user interface to allow a user to select recorded data stored in memory and to select a tag to associate with the selected recorded data; and
a computer readable media storing a program configured to cause the processor to attach the selected tag to the selected recorded data by modifying the recorded data.
2. The image capturing device of claim 1 , wherein the recorded data captured is one of still image data and video data.
3. The image capturing device of claim 1 , wherein the recorded data captured is audio data.
4. The image capturing device of claim 1 , wherein the user interface includes an electronic display.
5. The image capturing device of claim 1 , wherein the user interface comprises buttons.
6. The image capturing device of claim 1 , wherein the tag comprises an alias for a destination to which the selected image data is to be sent.
7. The image capturing device of claim 1 , wherein the tag comprises a resolution tag.
8. The image capturing device of claim 1 , wherein the tag comprises a red-eye removal tag.
9. The image capturing device of claim 1 , wherein the tag comprises a cropping tag.
10. The image capturing device of claim 1 , wherein the tag comprises a subject matter tag.
11. A method for processing media data, comprising:
receiving selection of media data;
receiving selection of an alias tag specifying a destination to which the selected media data is to be sent; and
modifying the selected media data by inserting the selected alias tag into the selected media data.
12. The method of claim 11 , wherein the modification of the selected image data is performed by inserting the selected tag alias into a header portion of the selected media data.
13. The method of claim 11 wherein the media data is image data, wherein the modification of the selected image data is performed by interleaving the selected tag alias into pixels of the image data itself.
14. The method of claim 11 , further comprising:
downloading the modified media data onto a host; and
looking-up a destination address specified by the selected alias.
15. The method of claim 14 wherein the looking-up is performed on the host.
16. The method of claim 14 wherein the looking-up is performed on a second host.
17. The method of claim 14 , further comprising:
sending the selected media data to the destination address.
18. The method of claim 17 , further comprising:
intelligently formatting the data to be sent.
19. The method of claim 17 , wherein the sending is performed on the host.
20. The method of claim 17 , wherein the sending is performed on a second host.
21. The method of claim 17 wherein the destination address is an email address.
22. The method of claim 17 wherein the destination address is an instant messenger address.
23. The method of claim 17 wherein the destination address is a cell-phone number.
24. The method of claim 14 wherein the host is a remote server.
25. The method of claim 14 wherein the host is a personal computer.
26. The method of claim 14 wherein the host is a cell-phone.
27. A host for receiving media data from an image capturing device, the host comprising:
a receiving module to receive from the image capturing device, media data including a tag; and
an association module communicatively coupled to the receiving module, which associates the tag with predesignated instructions stored on the host.
28. The host of claim 27 , wherein the association module comprises a look-up table, listing a plurality of tags, and instructions associated with each of the plurality of tags.
29. An image capturing device, comprising:
a data storage module for storing a plurality of captured data;
a display module communicatively coupled to the data storage module for selecting one of the plurality of captured data; and
an alias storage module for storing a plurality of aliases which can be attached to the selected one of the plurality of captured data, wherein the plurality of aliases refer to a plurality of destination addresses to which the plurality of captured data can be sent.
Priority Applications (1)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
US10/414,191 US20030179301A1 (en) | 2001-07-03 | 2003-04-14 | Tagging for transferring image data to destination |
Applications Claiming Priority (2)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
US09/898,476 US7110026B2 (en) | 2001-07-03 | 2001-07-03 | Image tagging for post processing |
US10/414,191 US20030179301A1 (en) | 2001-07-03 | 2003-04-14 | Tagging for transferring image data to destination |
Related Parent Applications (1)
Application Number | Title | Priority Date | Filing Date |
---|---|---|---|
US09/898,476 Continuation-In-Part US7110026B2 (en) | 2001-07-03 | 2001-07-03 | Image tagging for post processing |
Publications (1)
Publication Number | Publication Date |
---|---|
US20030179301A1 true US20030179301A1 (en) | 2003-09-25 |
Family
ID=25409512
Family Applications (2)
Application Number | Title | Priority Date | Filing Date |
---|---|---|---|
US09/898,476 Expired - Lifetime US7110026B2 (en) | 2001-07-03 | 2001-07-03 | Image tagging for post processing |
US10/414,191 Abandoned US20030179301A1 (en) | 2001-07-03 | 2003-04-14 | Tagging for transferring image data to destination |
Family Applications Before (1)
Application Number | Title | Priority Date | Filing Date |
---|---|---|---|
US09/898,476 Expired - Lifetime US7110026B2 (en) | 2001-07-03 | 2001-07-03 | Image tagging for post processing |
Country Status (3)
Country | Link |
---|---|
US (2) | US7110026B2 (en) |
CN (1) | CN1229963C (en) |
DE (1) | DE10229093A1 (en) |
Cited By (23)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
US20040064455A1 (en) * | 2002-09-26 | 2004-04-01 | Eastman Kodak Company | Software-floating palette for annotation of images that are viewable in a variety of organizational structures |
US20050220349A1 (en) * | 2003-07-11 | 2005-10-06 | Shinji Furuya | Image display apparatus and short film generation apparatus |
US20050278643A1 (en) * | 2004-06-15 | 2005-12-15 | Hitachi, Ltd. | Display control apparatus, information display apparatus, display system, display control program and display control method |
US20050278379A1 (en) * | 2004-06-10 | 2005-12-15 | Canon Kabushiki Kaisha | Image retrieval device and image retrieval method |
US20070098284A1 (en) * | 2004-04-09 | 2007-05-03 | Hiroshi Sasaki | Method for preparing compressed image data file, image data compression device, and photographic device |
US20070244892A1 (en) * | 2006-04-17 | 2007-10-18 | Narancic Perry J | Organizational data analysis and management |
WO2008087627A2 (en) * | 2007-01-16 | 2008-07-24 | D-Blur Technologies | Passing embedded data through a digital image processor |
US20090019176A1 (en) * | 2007-07-13 | 2009-01-15 | Jeff Debrosse | Live Video Collection And Distribution System and Method |
US20090109324A1 (en) * | 2007-10-26 | 2009-04-30 | Jonathan Kaplan | Charging and use scheme for a hand-held electronics device |
US20100171833A1 (en) * | 2007-02-07 | 2010-07-08 | Hamish Chalmers | Video archival system |
US20100254368A1 (en) * | 2008-10-14 | 2010-10-07 | Sony Corporation | Information receiving apparatus and information transmitting apparatus |
US20100310245A1 (en) * | 2003-12-09 | 2010-12-09 | Panasonic Corporation | Lens driving apparatus, imaging apparatus, and lens barrel and camera main body used for this |
US20110001890A1 (en) * | 2009-06-05 | 2011-01-06 | T-Data Systejms (S) Pte. Ltd. | Portable image projector and projection method |
US20120036132A1 (en) * | 2010-08-08 | 2012-02-09 | Doyle Thomas F | Apparatus and methods for managing content |
US20120069047A1 (en) * | 2010-09-17 | 2012-03-22 | Panasonic Corporation | Image display apparatus, image editing apparatus, image display program, and image editing program |
US20120242845A1 (en) * | 2009-12-01 | 2012-09-27 | T-Data Systems (S) Pte Ltd | Memory card and method for storage and wireless transceiving of data |
US8341219B1 (en) * | 2006-03-07 | 2012-12-25 | Adobe Systems Incorporated | Sharing data based on tagging |
US8346016B1 (en) * | 2006-05-19 | 2013-01-01 | Google Inc. | Large-scale image processing using mass parallelization techniques |
US20140043495A1 (en) * | 2012-08-10 | 2014-02-13 | Logitech Europe S.A. | Wireless video camera and connection methods including multiple video or audio streams |
US8762493B1 (en) | 2006-06-22 | 2014-06-24 | Google Inc. | Hierarchical spatial data structure and 3D index data versioning for generating packet data |
US9224145B1 (en) | 2006-08-30 | 2015-12-29 | Qurio Holdings, Inc. | Venue based digital rights using capture device with digital watermarking capability |
CN106791957A (en) * | 2016-12-07 | 2017-05-31 | 北京华夏电通科技有限公司 | Net cast processing method and processing device |
US20190020813A1 (en) * | 2017-07-14 | 2019-01-17 | Casio Computer Co., Ltd. | Image Recording Apparatus, Image Recording Method, and Computer-Readable Storage Medium |
Families Citing this family (96)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
JP2002036682A (en) * | 2000-07-24 | 2002-02-06 | Canon Inc | Image recorder, image input unit, data processor, image recording method, and recording medium |
DE60144300D1 (en) * | 2000-10-20 | 2011-05-05 | Fujifilm Corp | An image processing device that associates information with an identified subject of the image |
US7133070B2 (en) * | 2001-09-20 | 2006-11-07 | Eastman Kodak Company | System and method for deciding when to correct image-specific defects based on camera, scene, display and demographic data |
JP4343468B2 (en) * | 2001-11-29 | 2009-10-14 | 富士フイルム株式会社 | Image processing system, imaging apparatus, image processing apparatus, image processing method, and program |
US7151920B2 (en) * | 2001-12-18 | 2006-12-19 | Nokia Corporation | System for protecting pictures used in wireless communication messages |
JP3790965B2 (en) * | 2002-03-20 | 2006-06-28 | 富士写真フイルム株式会社 | Digital camera and image processing apparatus |
US7170557B2 (en) * | 2002-03-26 | 2007-01-30 | Eastman Kodak Company | Modular digital imaging system |
JP4357151B2 (en) * | 2002-03-28 | 2009-11-04 | 富士フイルム株式会社 | Digital camera and image data processing system |
TWI228370B (en) * | 2002-05-29 | 2005-02-21 | Abocom Sys Inc | Means for displaying having pluralities of memory card slots |
US6888569B2 (en) * | 2002-10-02 | 2005-05-03 | C3 Development, Llc | Method and apparatus for transmitting a digital picture with textual material |
JP3972871B2 (en) * | 2002-10-17 | 2007-09-05 | 村田機械株式会社 | Color image communication apparatus and color image communication method |
JP4151387B2 (en) * | 2002-11-15 | 2008-09-17 | セイコーエプソン株式会社 | Automatic image quality adjustment according to subject brightness |
US7612803B2 (en) * | 2003-06-10 | 2009-11-03 | Zoran Corporation | Digital camera with reduced image buffer memory and minimal processing for recycling through a service center |
US7221811B2 (en) * | 2003-10-01 | 2007-05-22 | Hewlett-Packard Development Company, L.P. | Method and apparatus for conveying image attributes |
TW200514423A (en) * | 2003-10-06 | 2005-04-16 | Sunplus Technology Co Ltd | Image-capturing equipment serving mobile storage device |
US20060031090A1 (en) * | 2004-03-31 | 2006-02-09 | Tarr Christopher A | System and method for providing custom stock images |
US8657606B2 (en) * | 2004-07-02 | 2014-02-25 | Paul Fisher | Asynchronous art jurying system |
US7502063B2 (en) * | 2004-08-09 | 2009-03-10 | Aptina Imaging Corporation | Camera with scalable resolution |
KR100647953B1 (en) * | 2004-11-04 | 2006-11-23 | 엘지전자 주식회사 | Mobile phone offering image meta information |
US20060173803A1 (en) * | 2005-01-28 | 2006-08-03 | Morris Robert P | Method and system for associating specific files with different applications |
US9124729B2 (en) * | 2005-01-31 | 2015-09-01 | The Invention Science Fund I, Llc | Shared image device synchronization or designation |
US7920169B2 (en) * | 2005-01-31 | 2011-04-05 | Invention Science Fund I, Llc | Proximity of shared image devices |
US20070236505A1 (en) * | 2005-01-31 | 2007-10-11 | Searete Llc, A Limited Liability Corporation Of The State Of Delaware | Resampling of transformed shared image techniques |
US20060187228A1 (en) * | 2005-01-31 | 2006-08-24 | Searete Llc, A Limited Liability Corporation Of The State Of Delaware | Sharing including peripheral shared image device |
US20060285150A1 (en) * | 2005-01-31 | 2006-12-21 | Searete Llc, A Limited Liability Corporation Of The State Of Delaware | Regional proximity for shared image device(s) |
US20060174203A1 (en) * | 2005-01-31 | 2006-08-03 | Searete Llc, A Limited Liability Corporation Of The State Of Delaware | Viewfinder for shared image device |
US20060221197A1 (en) * | 2005-03-30 | 2006-10-05 | Jung Edward K | Image transformation estimator of an imaging device |
US8902320B2 (en) * | 2005-01-31 | 2014-12-02 | The Invention Science Fund I, Llc | Shared image device synchronization or designation |
US20060187230A1 (en) * | 2005-01-31 | 2006-08-24 | Searete Llc | Peripheral shared image device sharing |
US9910341B2 (en) * | 2005-01-31 | 2018-03-06 | The Invention Science Fund I, Llc | Shared image device designation |
US8606383B2 (en) | 2005-01-31 | 2013-12-10 | The Invention Science Fund I, Llc | Audio sharing |
US20060190968A1 (en) * | 2005-01-31 | 2006-08-24 | Searete Llc, A Limited Corporation Of The State Of The State Of Delaware | Sharing between shared audio devices |
US7876357B2 (en) * | 2005-01-31 | 2011-01-25 | The Invention Science Fund I, Llc | Estimating shared image device operational capabilities or resources |
US9082456B2 (en) * | 2005-01-31 | 2015-07-14 | The Invention Science Fund I Llc | Shared image device designation |
US20060170956A1 (en) * | 2005-01-31 | 2006-08-03 | Jung Edward K | Shared image devices |
US20060187227A1 (en) * | 2005-01-31 | 2006-08-24 | Jung Edward K | Storage aspects for imaging device |
US9325781B2 (en) | 2005-01-31 | 2016-04-26 | Invention Science Fund I, Llc | Audio sharing |
US20060173972A1 (en) * | 2005-01-31 | 2006-08-03 | Searete Llc, A Limited Liability Corporation Of The State Of Delaware | Audio sharing |
US20060171603A1 (en) * | 2005-01-31 | 2006-08-03 | Searete Llc, A Limited Liability Corporation Of The State Of Delaware | Resampling of transformed shared image techniques |
US9489717B2 (en) * | 2005-01-31 | 2016-11-08 | Invention Science Fund I, Llc | Shared image device |
US20070139529A1 (en) * | 2005-06-02 | 2007-06-21 | Searete Llc, A Limited Liability Corporation Of The State Of Delaware | Dual mode image capture technique |
US7872675B2 (en) * | 2005-06-02 | 2011-01-18 | The Invention Science Fund I, Llc | Saved-image management |
US8233042B2 (en) * | 2005-10-31 | 2012-07-31 | The Invention Science Fund I, Llc | Preservation and/or degradation of a video/audio data stream |
US9967424B2 (en) * | 2005-06-02 | 2018-05-08 | Invention Science Fund I, Llc | Data storage usage protocol |
US10003762B2 (en) | 2005-04-26 | 2018-06-19 | Invention Science Fund I, Llc | Shared image devices |
US20070098348A1 (en) * | 2005-10-31 | 2007-05-03 | Searete Llc, A Limited Liability Corporation Of The State Of Delaware | Degradation/preservation management of captured data |
US9451200B2 (en) * | 2005-06-02 | 2016-09-20 | Invention Science Fund I, Llc | Storage access technique for captured data |
US20070008326A1 (en) * | 2005-06-02 | 2007-01-11 | Searete Llc, A Limited Liability Corporation Of The State Of Delaware | Dual mode image capture technique |
US20070109411A1 (en) * | 2005-06-02 | 2007-05-17 | Searete Llc, A Limited Liability Corporation Of The State Of Delaware | Composite image selectivity |
US20070222865A1 (en) * | 2006-03-15 | 2007-09-27 | Searete Llc, A Limited Liability Corporation Of The State Of Delaware | Enhanced video/still image correlation |
US9093121B2 (en) | 2006-02-28 | 2015-07-28 | The Invention Science Fund I, Llc | Data management of an audio data stream |
US8253821B2 (en) * | 2005-10-31 | 2012-08-28 | The Invention Science Fund I, Llc | Degradation/preservation management of captured data |
US8681225B2 (en) * | 2005-06-02 | 2014-03-25 | Royce A. Levien | Storage access technique for captured data |
US9076208B2 (en) * | 2006-02-28 | 2015-07-07 | The Invention Science Fund I, Llc | Imagery processing |
US7782365B2 (en) | 2005-06-02 | 2010-08-24 | Searete Llc | Enhanced video/still image correlation |
US9191611B2 (en) * | 2005-06-02 | 2015-11-17 | Invention Science Fund I, Llc | Conditional alteration of a saved image |
US8072501B2 (en) * | 2005-10-31 | 2011-12-06 | The Invention Science Fund I, Llc | Preservation and/or degradation of a video/audio data stream |
US8964054B2 (en) * | 2006-08-18 | 2015-02-24 | The Invention Science Fund I, Llc | Capturing selected image objects |
US9819490B2 (en) * | 2005-05-04 | 2017-11-14 | Invention Science Fund I, Llc | Regional proximity for shared image device(s) |
US9001215B2 (en) * | 2005-06-02 | 2015-04-07 | The Invention Science Fund I, Llc | Estimating shared image device operational capabilities or resources |
US20090144391A1 (en) * | 2007-11-30 | 2009-06-04 | Searete Llc, A Limited Liability Corporation Of The State Of Delaware | Audio sharing |
US9942511B2 (en) * | 2005-10-31 | 2018-04-10 | Invention Science Fund I, Llc | Preservation/degradation of video/audio aspects of a data stream |
US9167195B2 (en) * | 2005-10-31 | 2015-10-20 | Invention Science Fund I, Llc | Preservation/degradation of video/audio aspects of a data stream |
US9621749B2 (en) * | 2005-06-02 | 2017-04-11 | Invention Science Fund I, Llc | Capturing selected image objects |
JP2006311154A (en) * | 2005-04-27 | 2006-11-09 | Sony Corp | Imaging apparatus, processing method therefor, and program for executing the method by computer |
US20060274153A1 (en) * | 2005-06-02 | 2006-12-07 | Searete Llc, A Limited Liability Corporation Of The State Of Delaware | Third party storage of captured data |
KR100763180B1 (en) * | 2005-06-09 | 2007-10-04 | 삼성전자주식회사 | Browsing method using meta-data and apparatus using the same |
US20070203595A1 (en) * | 2006-02-28 | 2007-08-30 | Searete Llc, A Limited Liability Corporation | Data management of an audio data stream |
US20070120980A1 (en) | 2005-10-31 | 2007-05-31 | Searete Llc, A Limited Liability Corporation Of The State Of Delaware | Preservation/degradation of video/audio aspects of a data stream |
KR101314827B1 (en) * | 2006-12-05 | 2013-10-04 | 삼성전자주식회사 | Method for controlling digital photographing apparatus, and digital photographing apparatus adopting the method |
JP4981438B2 (en) * | 2006-12-26 | 2012-07-18 | キヤノン株式会社 | Image processing apparatus, control method thereof, and program |
JP4849020B2 (en) * | 2007-06-28 | 2011-12-28 | ソニー株式会社 | Imaging apparatus, imaging method, and program |
US20090109491A1 (en) * | 2007-10-30 | 2009-04-30 | Microsoft Corporation | Raw-quality processing of non-raw images |
US8867779B2 (en) * | 2008-08-28 | 2014-10-21 | Microsoft Corporation | Image tagging user interface |
US8396246B2 (en) * | 2008-08-28 | 2013-03-12 | Microsoft Corporation | Tagging images with labels |
US20110169987A1 (en) * | 2008-09-15 | 2011-07-14 | Gann Robert G | Minimally Processing Displayed Images Captured From A Burst |
US8179452B2 (en) * | 2008-12-31 | 2012-05-15 | Lg Electronics Inc. | Method and apparatus for generating compressed file, and terminal comprising the apparatus |
US8794506B2 (en) | 2009-02-23 | 2014-08-05 | Digitaqq | System for automatic image association |
US8587703B2 (en) * | 2009-12-01 | 2013-11-19 | Aptina Imaging Corporation | Systems and methods for image restoration |
US9013602B2 (en) * | 2011-01-03 | 2015-04-21 | Intellectual Ventures Fund 83 Llc | Digital camera system having a retail mode |
CN103873835A (en) * | 2012-12-17 | 2014-06-18 | 联想(北京)有限公司 | Image collecting method, image processing method and electronic device |
US9575995B2 (en) | 2013-05-01 | 2017-02-21 | Cloudsight, Inc. | Image processing methods |
US10223454B2 (en) | 2013-05-01 | 2019-03-05 | Cloudsight, Inc. | Image directed search |
US9665595B2 (en) | 2013-05-01 | 2017-05-30 | Cloudsight, Inc. | Image processing client |
US10140631B2 (en) | 2013-05-01 | 2018-11-27 | Cloudsignt, Inc. | Image processing server |
US9569465B2 (en) | 2013-05-01 | 2017-02-14 | Cloudsight, Inc. | Image processing |
US9639867B2 (en) | 2013-05-01 | 2017-05-02 | Cloudsight, Inc. | Image processing system including image priority |
US9830522B2 (en) | 2013-05-01 | 2017-11-28 | Cloudsight, Inc. | Image processing including object selection |
CA3014245A1 (en) * | 2014-04-04 | 2015-10-04 | Cloudsight, Inc. | Image processing methods |
NZ731529A (en) * | 2014-10-24 | 2022-12-23 | Beezbutt Pty Ltd | Camera application |
WO2016095361A1 (en) | 2014-12-14 | 2016-06-23 | SZ DJI Technology Co., Ltd. | Methods and systems of video processing |
US9965635B2 (en) * | 2015-04-24 | 2018-05-08 | Panasonic Intellectual Property Corporation Of America | Image tagging device |
US20170016623A1 (en) * | 2015-07-14 | 2017-01-19 | Tovala | Automated cooking device and method |
JP2017059888A (en) * | 2015-09-14 | 2017-03-23 | オリンパス株式会社 | Information recording apparatus, information recording method, and information recording program |
US20190293249A1 (en) * | 2018-03-21 | 2019-09-26 | Allister B. Hamilton | Method and apparatus for dynamically projecting and displaying customized decorative images on a building or home |
CN113794834B (en) * | 2021-08-25 | 2023-08-08 | 维沃移动通信(杭州)有限公司 | Image processing method and device and electronic equipment |
Citations (18)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
US5477264A (en) * | 1994-03-29 | 1995-12-19 | Eastman Kodak Company | Electronic imaging system using a removable software-enhanced storage device |
US5696850A (en) * | 1995-12-21 | 1997-12-09 | Eastman Kodak Company | Automatic image sharpening in an electronic imaging system |
US5978519A (en) * | 1996-08-06 | 1999-11-02 | Xerox Corporation | Automatic image cropping |
US6014143A (en) * | 1997-05-30 | 2000-01-11 | Hewlett-Packard Company | Ray transform method for a fast perspective view volume rendering |
US6016354A (en) * | 1997-10-23 | 2000-01-18 | Hewlett-Packard Company | Apparatus and a method for reducing red-eye in a digital image |
US6020920A (en) * | 1997-06-10 | 2000-02-01 | Flashpoint Technology, Inc. | Method and system for speculative decompression of compressed image data in an image capture unit |
US6058248A (en) * | 1997-04-21 | 2000-05-02 | Hewlett-Packard Company | Computerized method for improving data resolution |
US6075542A (en) * | 1996-07-29 | 2000-06-13 | Eastman Kodak Company | Method of combining two digital images |
US6128038A (en) * | 1997-08-01 | 2000-10-03 | Fuji Photo Film Co., Ltd. | Image information recording medium and image processing system generating the recording medium |
US6134339A (en) * | 1998-09-17 | 2000-10-17 | Eastman Kodak Company | Method and apparatus for determining the position of eyes and for correcting eye-defects in a captured frame |
US6167469A (en) * | 1998-05-18 | 2000-12-26 | Agilent Technologies, Inc. | Digital camera having display device for displaying graphical representation of user input and method for transporting the selected digital images thereof |
US6177956B1 (en) * | 1996-10-23 | 2001-01-23 | Flashpoint Technology, Inc. | System and method for correlating processing data and image data within a digital camera device |
US6185000B1 (en) * | 1996-12-24 | 2001-02-06 | Fuji Photo Film Co., Ltd. | Method and apparatus for instructing printing of recorded information on picture image prints, and program recorded medium used therefor |
US6198526B1 (en) * | 1997-09-11 | 2001-03-06 | Fuji Photo Film Co., Ltd. | Method and apparatus for recording order information |
US6573927B2 (en) * | 1997-02-20 | 2003-06-03 | Eastman Kodak Company | Electronic still camera for capturing digital image and creating a print order |
US20030103144A1 (en) * | 2001-12-04 | 2003-06-05 | Robert Sesek | Digital camera having image transfer method and system |
US20030189643A1 (en) * | 2002-04-04 | 2003-10-09 | Angelica Quintana | Digital camera capable of sending files via online messenger |
US6956832B1 (en) * | 1998-06-15 | 2005-10-18 | Nokia Networks Oy | Method for delivering messages in a wireless communications system using the same protocol for all types of messages |
Family Cites Families (14)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
US5943093A (en) * | 1996-09-26 | 1999-08-24 | Flashpoint Technology, Inc. | Software driver digital camera system with image storage tags |
JP3791635B2 (en) * | 1996-10-22 | 2006-06-28 | 富士写真フイルム株式会社 | Image reproduction method, image reproduction apparatus, image processing method, and image processing apparatus |
US6784924B2 (en) * | 1997-02-20 | 2004-08-31 | Eastman Kodak Company | Network configuration file for automatically transmitting images from an electronic still camera |
JPH10334212A (en) * | 1997-04-01 | 1998-12-18 | Fuji Photo Film Co Ltd | System for printing image from image file with additional information |
JP3786242B2 (en) * | 1997-07-14 | 2006-06-14 | 富士写真フイルム株式会社 | Image processing method and apparatus, image reproduction method and apparatus, and image confirmation apparatus used in the method |
US6535243B1 (en) * | 1998-01-06 | 2003-03-18 | Hewlett- Packard Company | Wireless hand-held digital camera |
US6041143A (en) * | 1998-04-14 | 2000-03-21 | Teralogic Incorporated | Multiresolution compressed image management system and method |
US6762791B1 (en) * | 1999-02-16 | 2004-07-13 | Robert W. Schuetzle | Method for processing digital images |
US6564282B1 (en) * | 1999-05-05 | 2003-05-13 | Flashpoint Technology, Inc. | Method and system for increasing storage capacity in a digital image capture device |
EP1181809B1 (en) | 1999-06-02 | 2004-03-24 | Eastman Kodak Company | Customizing digital image transfer |
US6812962B1 (en) * | 2000-05-11 | 2004-11-02 | Eastman Kodak Company | System and apparatus for automatically forwarding digital images to a service provider |
US6636259B1 (en) * | 2000-07-26 | 2003-10-21 | Ipac Acquisition Subsidiary I, Llc | Automatically configuring a web-enabled digital camera to access the internet |
TW564372B (en) * | 2000-09-22 | 2003-12-01 | Seiko Epson Corp | Image processing method |
US6895112B2 (en) * | 2001-02-13 | 2005-05-17 | Microsoft Corporation | Red-eye detection based on red region detection with eye confirmation |
-
2001
- 2001-07-03 US US09/898,476 patent/US7110026B2/en not_active Expired - Lifetime
-
2002
- 2002-06-28 DE DE10229093A patent/DE10229093A1/en not_active Ceased
- 2002-07-03 CN CN02140187.XA patent/CN1229963C/en not_active Expired - Fee Related
-
2003
- 2003-04-14 US US10/414,191 patent/US20030179301A1/en not_active Abandoned
Patent Citations (18)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
US5477264A (en) * | 1994-03-29 | 1995-12-19 | Eastman Kodak Company | Electronic imaging system using a removable software-enhanced storage device |
US5696850A (en) * | 1995-12-21 | 1997-12-09 | Eastman Kodak Company | Automatic image sharpening in an electronic imaging system |
US6075542A (en) * | 1996-07-29 | 2000-06-13 | Eastman Kodak Company | Method of combining two digital images |
US5978519A (en) * | 1996-08-06 | 1999-11-02 | Xerox Corporation | Automatic image cropping |
US6177956B1 (en) * | 1996-10-23 | 2001-01-23 | Flashpoint Technology, Inc. | System and method for correlating processing data and image data within a digital camera device |
US6185000B1 (en) * | 1996-12-24 | 2001-02-06 | Fuji Photo Film Co., Ltd. | Method and apparatus for instructing printing of recorded information on picture image prints, and program recorded medium used therefor |
US6573927B2 (en) * | 1997-02-20 | 2003-06-03 | Eastman Kodak Company | Electronic still camera for capturing digital image and creating a print order |
US6058248A (en) * | 1997-04-21 | 2000-05-02 | Hewlett-Packard Company | Computerized method for improving data resolution |
US6014143A (en) * | 1997-05-30 | 2000-01-11 | Hewlett-Packard Company | Ray transform method for a fast perspective view volume rendering |
US6020920A (en) * | 1997-06-10 | 2000-02-01 | Flashpoint Technology, Inc. | Method and system for speculative decompression of compressed image data in an image capture unit |
US6128038A (en) * | 1997-08-01 | 2000-10-03 | Fuji Photo Film Co., Ltd. | Image information recording medium and image processing system generating the recording medium |
US6198526B1 (en) * | 1997-09-11 | 2001-03-06 | Fuji Photo Film Co., Ltd. | Method and apparatus for recording order information |
US6016354A (en) * | 1997-10-23 | 2000-01-18 | Hewlett-Packard Company | Apparatus and a method for reducing red-eye in a digital image |
US6167469A (en) * | 1998-05-18 | 2000-12-26 | Agilent Technologies, Inc. | Digital camera having display device for displaying graphical representation of user input and method for transporting the selected digital images thereof |
US6956832B1 (en) * | 1998-06-15 | 2005-10-18 | Nokia Networks Oy | Method for delivering messages in a wireless communications system using the same protocol for all types of messages |
US6134339A (en) * | 1998-09-17 | 2000-10-17 | Eastman Kodak Company | Method and apparatus for determining the position of eyes and for correcting eye-defects in a captured frame |
US20030103144A1 (en) * | 2001-12-04 | 2003-06-05 | Robert Sesek | Digital camera having image transfer method and system |
US20030189643A1 (en) * | 2002-04-04 | 2003-10-09 | Angelica Quintana | Digital camera capable of sending files via online messenger |
Cited By (39)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
US20040064455A1 (en) * | 2002-09-26 | 2004-04-01 | Eastman Kodak Company | Software-floating palette for annotation of images that are viewable in a variety of organizational structures |
US20050220349A1 (en) * | 2003-07-11 | 2005-10-06 | Shinji Furuya | Image display apparatus and short film generation apparatus |
US7469064B2 (en) * | 2003-07-11 | 2008-12-23 | Panasonic Corporation | Image display apparatus |
US7885529B2 (en) | 2003-12-09 | 2011-02-08 | Panasonic Corporation | Lens driving apparatus, imaging apparatus, and lens barrel and camera main body used for this |
US20100310245A1 (en) * | 2003-12-09 | 2010-12-09 | Panasonic Corporation | Lens driving apparatus, imaging apparatus, and lens barrel and camera main body used for this |
US20070098284A1 (en) * | 2004-04-09 | 2007-05-03 | Hiroshi Sasaki | Method for preparing compressed image data file, image data compression device, and photographic device |
US7477796B2 (en) * | 2004-04-09 | 2009-01-13 | Nokia Corporation | Method for preparing compressed image data file, image data compression device, and photographic device |
US20050278379A1 (en) * | 2004-06-10 | 2005-12-15 | Canon Kabushiki Kaisha | Image retrieval device and image retrieval method |
US20050278643A1 (en) * | 2004-06-15 | 2005-12-15 | Hitachi, Ltd. | Display control apparatus, information display apparatus, display system, display control program and display control method |
US8341219B1 (en) * | 2006-03-07 | 2012-12-25 | Adobe Systems Incorporated | Sharing data based on tagging |
US7747605B2 (en) * | 2006-04-17 | 2010-06-29 | Perry J. Narancic | Organizational data analysis and management |
US20070244892A1 (en) * | 2006-04-17 | 2007-10-18 | Narancic Perry J | Organizational data analysis and management |
US8660386B1 (en) | 2006-05-19 | 2014-02-25 | Google Inc. | Large-scale image processing using mass parallelization techniques |
US8346016B1 (en) * | 2006-05-19 | 2013-01-01 | Google Inc. | Large-scale image processing using mass parallelization techniques |
US8762493B1 (en) | 2006-06-22 | 2014-06-24 | Google Inc. | Hierarchical spatial data structure and 3D index data versioning for generating packet data |
US9224145B1 (en) | 2006-08-30 | 2015-12-29 | Qurio Holdings, Inc. | Venue based digital rights using capture device with digital watermarking capability |
WO2008087627A3 (en) * | 2007-01-16 | 2010-01-07 | D-Blur Technologies | Passing embedded data through a digital image processor |
WO2008087627A2 (en) * | 2007-01-16 | 2008-07-24 | D-Blur Technologies | Passing embedded data through a digital image processor |
US20100039524A1 (en) * | 2007-01-16 | 2010-02-18 | Uri Kinrot | Passing Embedded Data Through A Digital Image Processor |
US20100171833A1 (en) * | 2007-02-07 | 2010-07-08 | Hamish Chalmers | Video archival system |
US9030563B2 (en) * | 2007-02-07 | 2015-05-12 | Hamish Chalmers | Video archival system |
US20090019176A1 (en) * | 2007-07-13 | 2009-01-15 | Jeff Debrosse | Live Video Collection And Distribution System and Method |
US20090109324A1 (en) * | 2007-10-26 | 2009-04-30 | Jonathan Kaplan | Charging and use scheme for a hand-held electronics device |
US8223262B2 (en) * | 2007-10-26 | 2012-07-17 | Cisco Technology, Inc. | Charging and use scheme for a hand-held electronics device |
US8467332B2 (en) * | 2008-10-14 | 2013-06-18 | Sony Corporation | Information receiving apparatus and information transmitting apparatus |
US20100254368A1 (en) * | 2008-10-14 | 2010-10-07 | Sony Corporation | Information receiving apparatus and information transmitting apparatus |
US20110001890A1 (en) * | 2009-06-05 | 2011-01-06 | T-Data Systems (S) Pte. Ltd. | Portable image projector and projection method |
US20120242845A1 (en) * | 2009-12-01 | 2012-09-27 | T-Data Systems (S) Pte Ltd | Memory card and method for storage and wireless transceiving of data |
US9247083B2 (en) * | 2009-12-01 | 2016-01-26 | T-Data Systems (S) Pte Ltd | Memory card and method for storage and wireless transceiving of data |
US20120036132A1 (en) * | 2010-08-08 | 2012-02-09 | Doyle Thomas F | Apparatus and methods for managing content |
US9223783B2 (en) * | 2010-08-08 | 2015-12-29 | Qualcomm Incorporated | Apparatus and methods for managing content |
US20120069047A1 (en) * | 2010-09-17 | 2012-03-22 | Panasonic Corporation | Image display apparatus, image editing apparatus, image display program, and image editing program |
US20140043495A1 (en) * | 2012-08-10 | 2014-02-13 | Logitech Europe S.A. | Wireless video camera and connection methods including multiple video or audio streams |
US9838651B2 (en) * | 2012-08-10 | 2017-12-05 | Logitech Europe S.A. | Wireless video camera and connection methods including multiple video or audio streams |
US10110855B2 (en) | 2012-08-10 | 2018-10-23 | Logitech Europe S.A. | Wireless video camera and connection methods including a USB emulation |
US10205914B2 (en) | 2012-08-10 | 2019-02-12 | Logitech Europe S.A. | Wireless video camera and connection methods including multiple video or audio streams |
CN106791957A (en) * | 2016-12-07 | 2017-05-31 | 北京华夏电通科技有限公司 | Webcast processing method and device |
US20190020813A1 (en) * | 2017-07-14 | 2019-01-17 | Casio Computer Co., Ltd. | Image Recording Apparatus, Image Recording Method, and Computer-Readable Storage Medium |
US10616479B2 (en) * | 2017-07-14 | 2020-04-07 | Casio Computer Co., Ltd. | Image recording apparatus, image recording method, and computer-readable storage medium |
Also Published As
Publication number | Publication date |
---|---|
US7110026B2 (en) | 2006-09-19 |
US20030007078A1 (en) | 2003-01-09 |
CN1229963C (en) | 2005-11-30 |
DE10229093A1 (en) | 2003-02-20 |
CN1396758A (en) | 2003-02-12 |
Similar Documents
Publication | Publication Date | Title |
---|---|---|
US20030179301A1 (en) | Tagging for transferring image data to destination | |
US7414746B2 (en) | Image data communication method | |
US8868778B2 (en) | Transmission bandwidth and memory requirements reduction in a portable image capture device | |
US7027171B1 (en) | Digital camera and document photographing and transmitting method of the same | |
JP4140048B2 (en) | Image management apparatus, image management program, and image management method | |
TW522721B (en) | Image information obtaining method, image information transmitting apparatus and image information transmitting system | |
US7107607B2 (en) | Image data communication system and method thereof, and image pickup apparatus and image data processing method | |
US10650090B2 (en) | Content management apparatus, web server, network system, content management method, content information management method, and program | |
US20020021359A1 (en) | Image data transmitting device and method | |
US20040201735A1 (en) | Image storage queue | |
US7324139B2 (en) | Digital camera, a method of shooting and transferring text | |
US20060050321A1 (en) | Record/replay apparatus and method | |
JP2004201325A (en) | System and method for sharing images | |
US20040075746A1 (en) | Portable terminal, printing apparatus, image-printing system and thumbnail-creation apparatus | |
JP2007201578A (en) | Image recording system | |
KR20000054449A (en) | The method for online image processing and the system thereof | |
JP2002232761A (en) | Picture recording method, picture transmission method and picture recording apparatus | |
JP4330327B2 (en) | Digital camera | |
JP2002051241A (en) | Electronic camera | |
JP2007201579A (en) | Electronic camera | |
JP4344956B2 (en) | Image recording method and image recording apparatus | |
JPH09200668A (en) | Image pickup device | |
JP2001333366A (en) | Electronic camera | |
JP4334163B2 (en) | Image printing system, image display system, image communication system, and imaging device, printing device, display device, and communication device used therefor | |
JP2006165794A (en) | Information communication device |
Legal Events
Date | Code | Title | Description |
---|---|---|---|
| AS | Assignment | Owner name: LOGITECH EUROPE S.A., SWITZERLAND Free format text: ASSIGNMENT OF ASSIGNORS INTEREST;ASSIGNORS:FELDIS, JOHN J.;BATEMAN, JOHN;REEL/FRAME:013977/0543 Effective date: 20030410 |
| STCB | Information on status: application discontinuation | Free format text: ABANDONED -- FAILURE TO RESPOND TO AN OFFICE ACTION |