US20140368717A1 - Photographing apparatus capable of communication with external apparatus and method of controlling the same - Google Patents
- Publication number
- US20140368717A1 (application No. US14/471,227)
- Authority
- US
- United States
- Prior art keywords
- data
- image
- mobile phone
- metadata
- photographing
- Prior art date
- Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
- Abandoned
Classifications
- H04N5/765—Interface circuits between an apparatus for recording and another apparatus
- H04N23/00—Cameras or camera modules comprising electronic image sensors; Control thereof
- H04M1/2747—Scrolling on a display
- H04M1/27475—Methods of retrieving data using interactive graphical means or pictorial representations
- H04M1/72412—User interfaces for mobile telephones interfacing with external accessories using two-way short-range wireless interfaces
- H04M1/72439—User interfaces for mobile telephones with interactive means for internal management of messages, for image or video messaging
- H04N1/00307—Connection or combination of a still picture apparatus with a mobile telephone apparatus
- H04N1/0044—Display of information to the user, e.g. menus for image preview or review
- H04N5/772—Interface circuits between a recording apparatus and a television camera placed in the same enclosure
- H04W4/14—Short messaging services, e.g. short message services [SMS] or unstructured supplementary service data [USSD]
- H04M2250/02—Details of telephonic subscriber devices including a Bluetooth interface
- H04N2101/00—Still video cameras
- H04N2201/0055—Connection of a still picture apparatus with another apparatus by radio
- H04N2201/0084—Digital still camera
- H04N2201/3204—Additional information of data relating to a user, sender, addressee, machine or electronic recording medium
- H04N2201/3205—Additional information of identification information, e.g. name or ID code
- H04N2201/3209—Additional information of a telephone number
- H04N2201/3266—Additional information of text or character information, e.g. text accompanying an image
- H04N2201/3273—Additional information: Display
- H04N2201/3274—Storage or retrieval of prestored additional information
- H04N2201/3278—Additional information: Transmission
- H04N9/804—Transformation of the television signal for recording involving pulse code modulation of the colour picture signal components
- H04N9/8205—Transformation of the television signal for recording involving the multiplexing of an additional signal and the colour video signal
Definitions
- The present general inventive concept relates to a photographing apparatus and a method of controlling the same, and more particularly to a photographing apparatus capable of communication with an external apparatus and a method of controlling the same.
- Photographing apparatuses photograph subjects to generate images that may then be stored.
- Representative examples of photographing apparatuses include digital cameras and digital camcorders.
- Photographing apparatuses may also display the images that they have photographed and stored.
- The displayed images may, for example, be displayed on televisions (TVs) connected to the photographing apparatuses.
- Photographing apparatuses may be used with separate communication devices.
- Since photographing apparatuses may not have both photographing functions and communication functions, users may have to temporarily stop photographing in order to communicate with external apparatuses, which increases user inconvenience.
- A user may wish to collect comments from other users while sharing a photographed image.
- If the photographed images are stored in a file on a personal computer (PC), other users' comments may be recorded as text.
- However, text cannot convey the feeling of movement at the time the photographing operation is performed.
- Moreover, the user may need to perform several operations other than photographing images, which may burden the user.
- The present general inventive concept provides a photographing apparatus and a method of controlling the same, which enable the photographing apparatus to communicate with an external apparatus while photographing an image.
- The present general inventive concept also provides a photographing apparatus and a method of controlling the same, in which data received from an external apparatus is stored together with a photographed image to enhance user convenience.
- A photographing apparatus includes an image photographing unit to photograph an image, a storage unit to store the photographed image, a display unit to display the photographed image, a communication interface to communicate with an external apparatus, and a controller to generate a notification message indicating that data has been received and to display it on the display unit together with the photographed image, if the communication interface receives data from the external apparatus.
- The external apparatus may include a mobile phone, and the communication interface may perform short-range communication with the mobile phone.
- The mobile phone may perform long-range communication with another mobile phone, and may transmit an image received from the communication interface to the other mobile phone or transmit data received from the other mobile phone to the communication interface.
- The received data may include at least one of audio data and text data.
- The controller may store the received data in the storage unit in which the photographed image is stored, if a storage command to store the received data is input.
- The photographing apparatus may further include a compressor to compress audio data. If the received data is in the form of audio data, the compressed audio data output from the compressor may be stored in the storage unit.
- The controller may generate metadata including information regarding the received data and the image, and may store the generated metadata in the storage unit.
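The claimed controller behavior, receiving data and overlaying a notification on the displayed image, can be sketched roughly as follows. The class and method names (`Controller`, `on_data_received`, and so on) are illustrative assumptions and do not come from the patent text.

```python
# Hypothetical sketch: when the communication interface hands over received
# data, a notification message is generated and shown together with the
# photographed image. All names here are illustrative, not from the patent.

class Controller:
    def __init__(self):
        self.display_lines = []   # what the display unit currently shows
        self.received = []        # data handed over by the interface

    def show_image(self, image_id):
        self.display_lines = [f"[image {image_id}]"]

    def on_data_received(self, sender, payload):
        """Called by the communication interface when data arrives."""
        self.received.append((sender, payload))
        # Overlay a notification message together with the photographed image.
        self.display_lines.append(f"Message received from {sender}")

ctrl = Controller()
ctrl.show_image("IMG_0001")
ctrl.on_data_received("mobile phone 140", "Nice shot!")
print(ctrl.display_lines)
```

The key point is that the notification is appended to, rather than replacing, the displayed image, matching the claim's "together with the photographed image".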
- The foregoing and/or other aspects and utilities of the present general inventive concept may also be achieved by providing a method of controlling a photographing apparatus, including photographing an image, displaying the photographed image, and performing communication with an external apparatus. If data is received from the external apparatus, the displaying of the photographed image includes displaying a notification message, indicating that the data has been received, together with the photographed image.
- The external apparatus may include a mobile phone, and the performing may include performing short-range communication with the mobile phone.
- The mobile phone may perform long-range communication with another mobile phone, and may transmit an image received from the photographing apparatus to the other mobile phone or transmit data received from the other mobile phone to the photographing apparatus.
- The received data may include at least one of audio data and text data.
- The method may further include storing the received data in a storage unit in which the photographed image is stored, if a storage command to store the received data is input.
- The method may further include compressing audio data if the received data is in the form of audio data.
- The method may further include generating metadata including information regarding the received data and the image.
- The storing may include storing the metadata and the image.
- A photographing apparatus usable with an external apparatus includes an image photographing unit to photograph an image, a communication interface to communicate with the external apparatus, and a controller to determine whether data has been received from the external apparatus through the communication interface and, if so, to display, together with the photographed image of the image photographing unit, a notification message that the data has been received.
- A photographing apparatus usable with an external device includes a photographing mode in which an image is photographed and stored without communication with the external device, a communication mode in which communication with the external device is enabled while the image is being photographed and the image is stored with the data communicated by the external device, and a selection button to select one of the photographing mode and the communication mode.
- A photographing system includes a photographing apparatus to photograph an image, the photographing apparatus having a photographing mode and a communication mode, a first mobile phone to communicate with the photographing apparatus, and a second mobile phone to communicate with the first mobile phone, wherein the photographing apparatus, when in the communication mode, receives data from the second mobile phone through the first mobile phone while photographing the image.
- The foregoing and/or other aspects and utilities of the general inventive concept may also be achieved by providing a method of operating a photographing apparatus, the method including photographing an image, communicating with an external apparatus, and determining whether data has been received from the external apparatus and, if so, displaying, together with the photographed image, a notification message that the data has been received.
- A method of operating a photographing apparatus includes determining whether a photographing mode or a communication mode has been selected; if the photographing mode has been selected, preventing communication with an external device during the photographing of an image; and if the communication mode has been selected, allowing communication with the external device during the photographing of the image.
- A computer-readable recording medium has embodied thereon a computer program to execute a method, wherein the method includes determining whether a photographing mode or a communication mode has been selected; if the photographing mode has been selected, preventing communication with an external device during the photographing of an image; and if the communication mode has been selected, allowing communication with the external device during the photographing of the image.
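The mode gating described in the two methods above can be sketched as a small dispatch. The constant and function names below are assumptions for illustration, not the patent's implementation.

```python
# Illustrative sketch of mode selection: the photographing mode blocks
# external communication during photographing, while the communication
# mode allows it. Names are assumptions, not from the patent.

PHOTOGRAPHING_MODE = "photographing"
COMMUNICATION_MODE = "communication"

def communication_allowed(mode):
    """Whether the apparatus may communicate with an external device
    while an image is being photographed."""
    if mode == PHOTOGRAPHING_MODE:
        return False   # photograph and store only; no external communication
    if mode == COMMUNICATION_MODE:
        return True    # photograph, store, and exchange data with the phone
    raise ValueError(f"unknown mode: {mode}")

print(communication_allowed(COMMUNICATION_MODE))   # True
print(communication_allowed(PHOTOGRAPHING_MODE))   # False
```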
- FIG. 1 is a view illustrating a photographing apparatus and a plurality of mobile phones which are connected via a network, according to an exemplary embodiment of the present general inventive concept;
- FIG. 2 is a block diagram illustrating a digital camcorder according to an exemplary embodiment of the present general inventive concept;
- FIG. 3 is a flowchart illustrating a method of converting a specific image of a photographed moving image into a still image and transmitting the converted image, or of processing data received from an external apparatus, according to an exemplary embodiment of the present general inventive concept;
- FIGS. 4A to 4E are views illustrating in detail a method of selecting a specific image from a photographed moving image and transmitting the specific image to an external apparatus;
- FIGS. 5A and 5B are views illustrating a data structure of an image to be transmitted, according to an exemplary embodiment of the present general inventive concept;
- FIGS. 6A to 6C are views illustrating a video output unit when data is received from a second mobile phone;
- FIGS. 7A to 7D are views illustrating a data structure of metadata, according to an exemplary embodiment of the present general inventive concept;
- FIGS. 8A to 8D are views illustrating data which is received from a third mobile phone;
- FIG. 9 is a view illustrating a data structure of third metadata which is generated using data received from a fourth mobile phone; and
- FIG. 10 is a view illustrating a data structure of image management data.
- FIG. 1 is a view illustrating a photographing apparatus and a plurality of mobile phones that are connected via a network, according to an exemplary embodiment of the present general inventive concept.
- A digital video camcorder (DVC) 110 can be connected to the plurality of mobile phones via the network.
- The DVC 110 can photograph an image and transmit the photographed image to an external apparatus, for example, the first mobile phone 120, the second mobile phone 140, and the third mobile phone 150.
- The image photographed by the DVC 110 can be transmitted to the first mobile phone 120 using short-range wireless communication, such as Bluetooth.
- The first mobile phone 120 can transmit the image to the second mobile phone 140 and the third mobile phone 150 using long-range wireless communication, in particular via a base station 130.
- The second to fourth mobile phones 140 to 160 can transmit data to the DVC 110 through the first mobile phone 120.
- The DVC 110 can thus perform data communication with the second to fourth mobile phones 140 to 160 through the first mobile phone 120.
- If a user of the DVC 110 desires, while photographing an image, to share a specific image with a third party (for example, a user of the second mobile phone 140 or a user of the third mobile phone 150) who is spatially separated from the user of the DVC 110, the specific image can be transmitted to the first mobile phone 120 using short-range wireless communication, and the first mobile phone 120 can transmit the received image to the second mobile phone 140 and the third mobile phone 150 using long-range wireless communication.
- A short message service (SMS) can be used when transmitting the image from the first mobile phone 120 to the second mobile phone 140 and the third mobile phone 150. Accordingly, the user of the DVC 110, the user of the second mobile phone 140, and the user of the third mobile phone 150 can share the same image.
- The DVC 110 and the mobile phones 140 and 150 can perform data communication, and thus the user of the second mobile phone 140 and the user of the third mobile phone 150 can leave comments regarding the image being photographed in real time, and the comments can be stored together with the image.
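The relay arrangement above, in which the first mobile phone 120 bridges the DVC's short-range link and the base-station long-range link, might be sketched like this. All class and method names are hypothetical.

```python
# Rough sketch of the FIG. 1 topology: the DVC 110 reaches the remote
# phones only through the first mobile phone 120, which forwards images
# outward (long-range leg) and comments back (short-range leg). All names
# here are illustrative assumptions.

class Phone:
    def __init__(self, name):
        self.name = name
        self.inbox = []

class RelayPhone(Phone):
    """Stand-in for the first mobile phone 120."""
    def __init__(self, name, remote_phones):
        super().__init__(name)
        self.remote_phones = remote_phones

    def forward_image(self, image):
        for phone in self.remote_phones:          # long-range leg (base station)
            phone.inbox.append(("image", image))

    def forward_comment(self, sender, comment, dvc_inbox):
        dvc_inbox.append((sender.name, comment))  # short-range leg back to DVC

phone_140, phone_150 = Phone("phone 140"), Phone("phone 150")
relay = RelayPhone("phone 120", [phone_140, phone_150])
dvc_inbox = []

relay.forward_image("still_0001.jpg")             # DVC -> 120 -> 140/150
relay.forward_comment(phone_140, "Great view!", dvc_inbox)

print(phone_150.inbox)   # [('image', 'still_0001.jpg')]
print(dvc_inbox)         # [('phone 140', 'Great view!')]
```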
- FIG. 2 is a block diagram illustrating the DVC 110 according to an exemplary embodiment of the present general inventive concept.
- The DVC 110 includes an image photographing unit 210, a coder/decoder (CODEC) 220, a temporary storage unit 230, a communication interface 240, a storage unit 250, a display information combiner 260, an image output unit 270, a manipulator 280, and a controller 290.
- The image photographing unit 210 can include a lens 212, a charge-coupled device (CCD) 214, a microphone 216, and a signal processor 218.
- The CCD 214 can capture an optical image of a subject incident through the lens 212 and generate a video signal corresponding to the captured optical image.
- The photographing apparatus according to the exemplary embodiment of the present general inventive concept can alternatively be implemented with a complementary metal oxide semiconductor (CMOS) sensor instead of the CCD 214.
- The signal processor 218 can remove noise, adjust a level of the video signal, convert an analog signal into a digital signal, and perform digital signal processing (DSP) on the video signal output from the CCD 214.
- The signal processor 218 can also amplify an audio signal output from the microphone 216 and convert the audio signal into a digital signal.
- The CODEC 220 can compress the video and audio signals (hereinafter referred to as "images") processed by the signal processor 218 in a predetermined format and transfer the compressed images to the storage unit 250.
- The CODEC 220 can decode an image output from the storage unit 250 or the temporary storage unit 230 into a re-playable original signal and transfer the re-playable original signal to the image output unit 270.
- The temporary storage unit 230 can temporarily store a specific image compressed by the CODEC 220 before the image is transmitted to the communication interface 240, and can also temporarily store data received from the communication interface 240.
- The communication interface 240 can modulate the image output from the temporary storage unit 230 so as to wirelessly transmit the image, and can transfer the data received from the external apparatus to the temporary storage unit 230.
- The communication interface 240 may be a Bluetooth module or a wireless local area network (LAN) module to perform short-range wireless communication.
- The storage unit 250 can store the image compressed by the CODEC 220. Additionally, the storage unit 250 can store metadata output from the controller 290, together with the image.
- The metadata includes the data received from the external apparatus and various information required to connect the data received from the external apparatus to the image.
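As a loose illustration of such metadata, a record might carry the received comment plus the fields needed to tie it back to the stored image. The field names below are assumptions, since the patent does not fix a concrete format.

```python
# Minimal sketch of metadata tying a received comment to the stored image.
# Field names are assumptions; the patent only says the metadata holds the
# received data and the information needed to connect that data to the image.

def build_metadata(image_file, sender, data_type, data, timestamp):
    return {
        "image_file": image_file,  # which stored image the comment belongs to
        "sender": sender,          # e.g. the commenting user's phone number
        "data_type": data_type,    # "text" or "audio"
        "data": data,              # the comment itself (or a file reference)
        "timestamp": timestamp,    # position within the moving image, seconds
    }

meta = build_metadata("MOV_0001.mp4", "010-1234-5678", "text",
                      "Looks great!", 12.5)
print(meta["image_file"], meta["data_type"])
```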
- The display information combiner 260 can combine the video signal output from the signal processor 218 with display information such as characters, symbols, diagrams, graphics, or other information.
- The display information combiner 260 can combine the display information using an on-screen display (OSD) method under the control of the controller 290.
- The image output unit 270 can include a video output unit 272 to display the video signal, and an audio output unit 274 to output the audio signal.
- The video output unit 272 can be implemented as a liquid crystal display (LCD) or the like, and the audio output unit 274 can be implemented as a speaker or the like.
- The video output unit 272 can also display functions of the manipulator 280, or can be used as a part of the manipulator 280.
- The manipulator 280 includes a selection button through which a mode of the DVC 110 can be selected.
- A user may execute a photographing command or transmit the photographed image to the external apparatus, and may also receive text data or audio data from the external apparatus, using the manipulator 280 including the selection button.
- Modes of the DVC 110 include a "photographing mode" and a "communication mode".
- The "photographing mode", as a general function of the DVC 110, refers to a mode in which an image is photographed and the photographed image is stored.
- The "communication mode" refers to a mode in which a photographed image is stored and communication with an external apparatus is performed while the image is being photographed.
- the controller 290 can control the entire operation of the DVC 110 . For example, if the user inputs a command to change the mode of the DVC 110 to the “communication mode” through the manipulator 280 , the controller 290 can store the image compressed by the CODEC 220 in the storage unit 250 , and also transmit the compressed image to the external apparatus using the communication interface 240 . Additionally, if the compressed image is transmitted, the controller 290 can generate information regarding the transmitted image in the form of a data structure and store the information in the temporary storage unit 230 .
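The communication-mode behavior of the controller 290 described above can be sketched as follows. This is a minimal illustration, not the disclosed implementation; the class and method names are hypothetical stand-ins for the controller, storage unit 250 , communication interface 240 , and temporary storage unit 230 .

```python
from dataclasses import dataclass, field

@dataclass
class TransmitRecord:
    """Hypothetical record of a transmitted image (cf. FIG. 5A)."""
    moving_image_id: str
    sending_time: str
    receivers: list = field(default_factory=list)

class CommunicationModeController:
    """Illustrative sketch of the controller's communication-mode path."""
    def __init__(self, storage, comm_interface, temp_storage):
        self.storage = storage   # persistent storage unit (250)
        self.comm = comm_interface   # communication interface (240)
        self.temp = temp_storage     # temporary storage unit (230)

    def handle_compressed_image(self, image, record: TransmitRecord):
        # Store the compressed image, as in photographing mode...
        self.storage.append(image)
        # ...and additionally transmit it to the external apparatus.
        self.comm.append(image)
        # Keep a data structure describing the transmitted image.
        self.temp.append(record)
```

A plain list stands in for each storage and transmission path here; the point is only that, in communication mode, the same compressed image is both stored and transmitted, while a data structure describing the transmission is kept in temporary storage.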
- the controller 290 can generate a notification message to indicate that data has been received and can transfer the generated notification message to the display information combiner 260 .
- the controller 290 can also transfer the received data to the image output unit 270 or store the data in the storage unit 250 .
- the controller 290 can generate metadata to connect the data to the image and can store the metadata together with the image in the storage unit 250 .
- the controller 290 can control the image output unit 270 to output the image processed by the signal processor 218 , and the video output unit 272 to display information regarding a manipulation command input by the user using the manipulator 280 .
- FIG. 3 is a flowchart illustrating a method of converting a specific image of a photographed moving image into a still image and transmitting the converted image, or of processing data received from an external apparatus, according to an exemplary embodiment of the present general inventive concept.
- the image photographing unit 210 can photograph an image in operation S 310 .
- the CCD 214 can capture an optical image of a subject incident through the lens 212 and transfer the image to the signal processor 218 .
- the microphone 216 can transfer an audio signal to the signal processor 218 .
- the signal processor 218 can remove noise, adjust a level of the video signal, convert an analog signal into a digital signal, and perform digital signal processing (DSP) for the video signal output from the CCD 214 .
- the signal processor 218 can also amplify the audio signal output from the microphone 216 and convert the audio signal into a digital signal.
- the CODEC 220 can compress the photographed image in a predetermined format in operation S 315 .
- the controller 290 can determine whether the mode of the DVC 110 is the communication mode in operation S 320 . Accordingly, the controller 290 can determine whether a user inputs a command to change the mode of the DVC 110 to the communication mode through the manipulator 280 . If the mode of the DVC 110 is set to be in the communication mode, the DVC 110 can receive data from and transmit data to the external apparatus while photographing an image.
- the controller 290 can determine whether the communication interface 240 receives data in operation S 330 . Specifically, the controller 290 can determine whether the first mobile phone 120 receives data, such as text data or audio data, from the second to fourth mobile phones 140 to 160 and transmits the data to the DVC 110 .
- the controller 290 can determine whether a specific image selection command is input in operation S 335 . Specifically, if a specific image which a user desires to share with a third party is photographed using the DVC 110 while photographing an image, a command to select the photographed image can be input. Accordingly, the specific photographed image can be compressed in a still image format by the CODEC 220 and stored in the temporary storage unit 230 .
- the controller 290 can also determine whether a contact address for the external apparatus is input in operation S 340 .
- the contact address for the external apparatus refers to numbers for the second mobile phone 140 and third mobile phone 150 .
- the user can input the contact address for the external apparatus using a user interface provided by the DVC 110 or using a contact address for an external apparatus previously stored in the DVC 110 . Additionally, since the DVC 110 can communicate with the first mobile phone 120 , contact addresses stored in the first mobile phone 120 can be used.
- the controller 290 can generate a data structure including information regarding the selected specific image and store the generated data structure in the temporary storage unit 230 in operation S 345 .
- a method of generating a data structure will be described later.
- the communication interface 240 can transmit the data structure of the specific image and the specific image to the external apparatus in operation S 350 . Specifically, if the communication interface 240 of the DVC 110 transmits the data structure of the specific image and the specific image to the first mobile phone 120 using short-range wireless communication, the first mobile phone 120 can transmit the data structure of the specific image and the specific image to the external apparatus using long-range wireless communication.
- the controller 290 can control the display information combiner 260 and the video output unit 272 so that a notification message to indicate data reception can be generated and the generated notification message can be displayed on the video output unit 272 in operation S 355 .
- the data received by the communication interface 240 can be stored in the temporary storage unit 230 , and the controller 290 can read the data being stored in the temporary storage unit 230 and generate a notification message.
- the controller 290 can generate various notification messages according to the types of data. Accordingly, if the received data is text data or audio data, different types of notification messages can be generated to be transferred to the display information combiner 260 , and the display information combiner 260 can combine the notification message with a region of the photographed image and the combined image can be transferred to the video output unit 272 . Accordingly, various notification messages can be generated according to the types of data, and thus enable the user to easily identify the type of data received.
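The type-dependent notification described above can be sketched with a small helper; the function name and message wording are illustrative assumptions — the disclosure only requires that notification messages differ by data type so the user can identify what was received.

```python
def make_notification(data_type: str, sender: str) -> str:
    """Return a notification message that varies with the received
    data type (hypothetical wording, cf. operation S 355)."""
    if data_type == "text":
        return f"Text message received from {sender}"
    if data_type == "audio":
        return f"Audio message received from {sender}"
    # Fallback for any other data type.
    return f"Data received from {sender}"
```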
- the controller 290 can determine whether an output command to output the received data is input in operation S 360 .
- the user can check that the notification message is displayed on the video output unit 272 while photographing a moving image, and can input the output command.
- the controller 290 can output the received data being stored in the temporary storage unit 230 in operation S 365 .
- the controller 290 can read the text data being stored in the temporary storage unit 230 and convert the data into original text data, followed by transferring the original data to the display information combiner 260 .
- the display information combiner 260 can combine the text data with the image output from the signal processor 218 and transfer the combined image to the video output unit 272 .
- the video output unit 272 can output the image combined with the text data.
- the controller 290 can read the audio data being stored in the temporary storage unit 230 and transfer the data to the CODEC 220 .
- the CODEC 220 can then decode the audio data into a re-playable original signal and transfer the re-playable original signal to the audio output unit 274 , and the audio output unit 274 can output the audio data.
- the controller 290 can determine whether a storage command to store the data output from the image output unit 270 is input in operation S 370 . Specifically, if the user desires to store the output data together with the photographed image, the user can input the storage command, and the controller 290 can determine that the storage command is input.
- the controller 290 can generate metadata to connect the output data to the image in operation S 375 .
- the metadata can include information regarding the output data along with information regarding the image. If the storage command is input, the controller 290 can generate metadata based on whether the data is associated with the data structure transmitted at operation S 350 . A method of generating metadata will be described later.
- the controller 290 can store the generated metadata in the storage unit 250 in which the image is currently stored in operation S 380 .
- if it is determined that the DVC 110 is in the photographing mode, the compressed image can be stored in the storage unit 250 in operation S 385 .
- the controller 290 can determine whether data is received at operation S 330 and can then determine whether the specific image selection command is input at operation S 335 , but the present general inventive concept is not limited thereto. Accordingly, if the DVC 110 is in the communication mode, the controller 290 can determine whether data is received or whether the image is transmitted.
- FIGS. 4A to 4E are views illustrating in detail a method of selecting a specific image from a photographed moving image to be transmitted to an external apparatus. As illustrated in FIG. 4A , if a user desires to share the specific image with a third party while photographing the moving image, the user can select an arrow, that is, a user interface, displayed on a region of the video output unit 272 .
- a window to input the contact address for the external apparatus can be displayed on a region of the video output unit 272 . Since the contact address for the external apparatus can generally include numbers, a number input window is illustrated in FIG. 4B . For convenience of description, it is assumed that the user of the DVC 110 inputs the contact address for the second mobile phone 140 using the number input window, so the numbers illustrated in FIG. 4B may be the contact address for the second mobile phone 140 .
- the present general inventive concept is not limited to this embodiment, so if the user selects a down arrow displayed on the number input window, a pre-registered contact address window for external apparatuses can be displayed, as illustrated in FIG. 4C . Accordingly, the user can input the contact address for the external apparatus using the pre-registered contact address window for external apparatuses.
- “Han-soo CHO” selected by the user may be the contact address for the third mobile phone 150 , and accordingly the user can select the contact address for the external apparatus using the number input window of FIG. 4B and the pre-registered contact address window for external apparatuses of FIG. 4C , and can input a transmission command. If icons corresponding to the contact addresses for the external apparatuses are previously registered, the user can select the icons instead of the contact address for the external apparatus, as illustrated in FIG. 4D .
- the specific image can be transmitted to the external apparatus while a transmission-completion notification message is displayed at the bottom of the video output unit 272 .
- the method of transmitting the specific image to the external apparatus was described above.
- FIGS. 5A and 5B are views regarding a data structure of an image to be transmitted, according to an exemplary embodiment of the present general inventive concept.
- the data structure can include a moving image ID 501 indicating information regarding a moving image containing an image to be transmitted, the size 502 of an image to be transmitted, the type 503 of an image to be transmitted, a title 504 of an image to be transmitted, a profile type 505 for data transmission between the DVC 110 and first mobile phone 120 , the image sending time 506 in the DVC 110 (specifically, the communication interface 240 ), a receiver list 507 , and information 508 regarding whether the receiver is registered.
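The fields 501 to 508 enumerated above can be sketched as a record type; the field names and types below are illustrative assumptions, since the disclosure specifies only what information each field carries, not its representation.

```python
from dataclasses import dataclass, field
from typing import List

@dataclass
class ImageTransmitStructure:
    """Sketch of the FIG. 5A data structure of an image to be transmitted."""
    moving_image_id: str   # 501: moving image containing the sent image
    size: int              # 502: size of the image to be transmitted
    image_type: str        # 503: type of the image to be transmitted
    title: str             # 504: title of the image
    profile_type: str      # 505: DVC-to-mobile-phone transfer profile
    sending_time: str      # 506: time the image was sent
    receivers: List[str] = field(default_factory=list)  # 507: receiver list
    receivers_registered: bool = False                  # 508: registration flag
```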
- FIG. 5B illustrates a data structure of a specific image prepared based on the data structure of FIG. 5A . Accordingly, as illustrated in the data structure of FIG. 5B , it is determined that the user of the DVC 110 transmits the same image to the second mobile phone 140 and third mobile phone 150 through the first mobile phone 120 .
- FIGS. 6A to 6C are views regarding the video output unit 272 ( FIG. 2 ) when data is received from the second mobile phone 140 ( FIG. 1 ). As illustrated in FIG. 6A , if the data is received from the second mobile phone 140 , an image being photographed can be displayed together with a notification message on the video output unit 272 of the DVC 110 .
- the video output unit 272 of the DVC 110 can display text data “What a beautiful spot!! Where is it?”, which is received from the second mobile phone 140 , together with an icon corresponding to the second mobile phone 140 , as illustrated in FIG. 6B .
- the words indicating whether to store the data can be displayed, as illustrated in FIG. 6C . Accordingly, the user can input the storage command through the user interface.
- the controller 290 can generate metadata to connect the data to the image.
- a data structure of metadata according to the exemplary embodiment of the present general inventive concept is now described with reference to FIGS. 7A to 7D .
- FIGS. 7A to 7D are views regarding a data structure of metadata, according to an exemplary embodiment of the present general inventive concept.
- the data structure of the metadata can include the type 701 of received data, the size 702 of the received data, the playback time 703 of the received data, the reception time 704 of the received data, the data transmitter 705 , the data content 706 , the moving image ID 707 associated with the received data, information 708 regarding whether metadata is generated, a data transmitter icon 709 , the next metadata address 710 , the previous metadata address 711 , information 712 regarding whether an icon is registered, an icon display position 713 , and an icon display method 714 .
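The fourteen metadata fields 701 to 714 can likewise be sketched as a record type. Again, the names, types, and defaults are illustrative assumptions; only the field semantics come from the disclosure.

```python
from dataclasses import dataclass
from typing import Optional, Tuple

@dataclass
class Metadata:
    """Sketch of the FIG. 7A metadata structure for received data."""
    data_type: str                   # 701: type of received data
    size: int                        # 702: size of received data
    playback_time: Optional[str]     # 703: when to play back with the image
    reception_time: str              # 704: when the data was received
    transmitter: str                 # 705: who sent the data
    content: bytes                   # 706: the data itself
    moving_image_id: Optional[str] = None    # 707: associated moving image
    generated: bool = False                  # 708: metadata generated flag
    transmitter_icon: Optional[str] = None   # 709: data transmitter icon
    next_metadata: Optional[int] = None      # 710: next metadata address
    prev_metadata: Optional[int] = None      # 711: previous metadata address
    icon_registered: bool = False            # 712: icon registration flag
    icon_position: Optional[Tuple[int, int]] = None  # 713: display position
    icon_display_method: Optional[str] = None        # 714: display method
```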
- FIG. 7B illustrates a data structure of data received from the second mobile phone 140 ( FIG. 1 ) prior to input of the output command at operation S 365 ( FIG. 3 ).
- the data structure of the data received from the second mobile phone 140 includes only the type 701 of the received data, the size 702 of the received data, the reception time 704 of the received data, the data transmitter 705 and the data content 706 .
- if the output command is input, the controller 290 ( FIG. 2 ) can generate a data structure illustrated in FIG. 7C . Accordingly, the text data can be transferred to the video output unit 272 and the icon can also be transferred to the video output unit 272 according to the situation. As a result, information regarding the data transmitter icon 709 , the information 712 regarding whether an icon is registered, the icon display position 713 , and the icon display method 714 can be added to the data structure. Additionally, the data and icons can be output to the image output unit 270 according to the generated data structure.
- the controller 290 can generate metadata to connect the data to be stored to the image, and in particular, can generate metadata based on whether the data to be stored includes a comment regarding the image. It is assumed that the data received from the second mobile phone 140 is the data associated with the image being photographed, so the data structure illustrated in FIG. 5B is currently stored in the temporary storage unit 230 . Accordingly, the controller 290 can generate metadata using the data structures illustrated in FIGS. 5B and 7C .
- FIG. 7D illustrates a data structure of metadata (hereinafter, referred to as “first metadata”) of the data received from the second mobile phone 140 .
- the controller 290 can check whether the data transmitter 705 exists in the receiver list 507 .
- the image sending time 506 recorded in the data structure of FIG. 5B can be copied into the playback time 703 of the first metadata, and the moving image ID 501 recorded in the data structure of FIG. 5B can be copied into the moving image ID 707 , and accordingly, the first metadata can be generated.
- the first metadata of the data received from the second mobile phone 140 is illustrated in FIG. 7D , and can be stored in the storage unit 250 in which the image associated with the first metadata is being stored.
- since the user of the second mobile phone 140 transmits the comment regarding the specific image transmitted by the user of the DVC 110 in the form of text data, if the user desires to play back the image again after storing the image and metadata, the comment can be played back together with the portion of the image with which it is most closely associated, to thus convey the feeling of movement at the time of photographing.
- a method of outputting data received from the third mobile phone 150 and a method of generating metadata of the data received from the third mobile phone 150 can be performed similarly to the method of outputting data received from the second mobile phone 140 and the method of generating metadata of the data received from the second mobile phone 140 .
- FIGS. 8A to 8D are views illustrating data that is received from the third mobile phone 150 .
- the DVC 110 can receive the audio data from the third mobile phone 150 , and thus a notification message different from the notification message of the text data can be displayed on the video output unit 272 , as illustrated in FIG. 8A . Additionally, since the audio data can be output from the audio output unit 274 , only an icon corresponding to an audio data transmitter can be displayed on the video output unit 272 while outputting the audio data, as illustrated in FIG. 8B .
- FIG. 8D illustrates a data structure of generated metadata (hereinafter, referred to as “second metadata”) when a storage command to store the audio data received from the third mobile phone 150 is input.
- the data structure of the second metadata is the same as that of the first metadata, but the information contained in the two data structures is different.
- since the playback time 703 - 2 of the second metadata is equal to the playback time 703 - 1 of the first metadata, the audio data and text data can have the same playback time when playing back the image.
- however, because the text data is output through the video output unit 272 and the audio data through the audio output unit 274 , the two types of data can be output simultaneously without conflict.
- the data may be displayed on the video output unit 272 in order of the reception time 704 at a predetermined time interval.
- the metadata may be stored according to the order of the reception time 704 .
- the address of the second metadata may be recorded in the next-metadata address 710 - 1 of the first metadata, and the address of the first metadata may be recorded in the previous-metadata address 711 - 2 of the second metadata.
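The chaining described above — metadata records linked in reception order via the next-metadata address 710 and previous-metadata address 711 — amounts to a doubly linked list. A sketch, with list indices standing in for storage addresses (an illustrative simplification):

```python
def link_metadata(records: list) -> list:
    """Chain metadata records in order of reception time by filling in
    'next' and 'prev' address fields (cf. fields 710 and 711).
    Returns the indices in reception order."""
    ordered = sorted(range(len(records)),
                     key=lambda i: records[i]["reception_time"])
    for pos, idx in enumerate(ordered):
        records[idx]["next"] = ordered[pos + 1] if pos + 1 < len(ordered) else None
        records[idx]["prev"] = ordered[pos - 1] if pos > 0 else None
    return ordered
```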
- FIG. 9 is a view illustrating a data structure of third metadata that is generated using data received from the fourth mobile phone 160 . Since the data received from the fourth mobile phone 160 is not associated with the image being photographed, the data described in FIGS. 5A and 5B is not stored in the temporary storage unit 230 . Accordingly, if the storage command is input, the reception time can be recorded in a data playback time 703 - 3 because there is no image sending time, and remaining information can be recorded in the same manner as the method by which the controller 290 generates the first metadata.
- FIG. 10 is a view regarding a data structure of image management data.
- the storage unit 250 may store the image management data, in addition to the image and metadata. If an image file is completely stored in the storage unit 250 , the controller 290 can generate image management data using the image and metadata.
- the data structure of the image management data can include a moving image ID 1001 , an address 1002 of metadata which is initially connected to the moving image, an address 1003 of metadata which is finally connected to the moving image, a total number of metadata 1004 , a total number of metadata 1005 containing text data, and a total number of metadata 1006 containing audio data.
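The image management data of FIG. 10 can be derived from the stored metadata once the image file is complete. The helper below is an illustrative sketch, again using list indices as stand-in addresses for fields 1002 and 1003 .

```python
def build_management_data(moving_image_id: str, metadata_list: list) -> dict:
    """Sketch of generating the FIG. 10 image management data."""
    return {
        "moving_image_id": moving_image_id,                              # 1001
        "first_metadata": 0 if metadata_list else None,                  # 1002
        "last_metadata": len(metadata_list) - 1 if metadata_list else None,  # 1003
        "total": len(metadata_list),                                     # 1004
        "text_count": sum(m["type"] == "text" for m in metadata_list),   # 1005
        "audio_count": sum(m["type"] == "audio" for m in metadata_list), # 1006
    }
```

During playback, a non-empty first-metadata address is what tells the controller that metadata exists for the moving image.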
- the controller 290 can control an entire operation of the DVC 110 so that the image stored in the storage unit 250 can be played back, and also determine whether metadata exists using the image management data.
- the image management data contains the address of the first metadata that is an address of metadata that is initially connected to the image, and accordingly, the controller 290 can determine that metadata exists. Additionally, when the moving image is played back while measuring a period of time, if the measured period of time corresponds to the playback time 703 - 1 of the first metadata, the controller 290 can read the first metadata being stored in the storage unit 250 .
- the controller 290 can check the type of data, and if the data is in the form of text data, the controller 290 can restore the data into re-playable data and transfer the re-playable data to the display information combiner 260 .
- the display information combiner 260 can combine the text data contained in the first metadata with the currently played back image and transfer the combined image to the video output unit 272 .
- the playback time 703 - 2 of the second metadata is equal to the playback time 703 - 1 of the first metadata, and thus the audio data of the second metadata can also be played back when the text data of the first metadata is played back. Accordingly, the audio data contained in the second metadata can be transferred to the CODEC 220 . Additionally, the CODEC 220 can decode the compressed audio data into re-playable original audio data and transfer the re-playable original data to the audio output unit 274 . When the text data and audio data are played back, the icons corresponding to the text data and audio data can also be displayed on the video output unit 272 .
- the address of the third metadata can be recorded in the next-metadata address 710 - 2 of the second metadata, and accordingly, the controller 290 can determine the address of the third metadata.
- the controller 290 can measure the period of time again. Additionally, if the measured period of time corresponds to the playback time 703 - 3 of the third metadata, the content 706 - 3 of the third metadata can be read out. Therefore, using the above-described method, the image can be played back together with the data received from the external apparatus.
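The playback dispatch described above — comparing the measured elapsed time against each metadata record's playback time 703 and routing text to the display information combiner and audio to the CODEC — can be sketched as follows. The routing labels are illustrative names for the destinations described in the text, not API calls of the apparatus.

```python
def playback_due(metadata_list: list, elapsed: str) -> list:
    """Return (destination, content) pairs for metadata whose playback
    time matches the measured elapsed time (cf. fields 703 and 706)."""
    actions = []
    for m in metadata_list:
        if m["playback_time"] != elapsed:
            continue
        if m["type"] == "text":
            # Text is overlaid on the video via the display information combiner.
            actions.append(("display_combiner", m["content"]))
        elif m["type"] == "audio":
            # Audio is decoded by the CODEC and sent to the audio output unit.
            actions.append(("codec_decode", m["content"]))
    return actions
```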
- since the text data and audio data can be stored in the storage unit in which the image is stored, the image can also be played back on other playback apparatuses, in addition to the digital camcorder according to the exemplary embodiment of the present general inventive concept.
- the notification message can be generated and displayed on the video output unit 272 , but the present general inventive concept is not limited thereto.
- the controller 290 can generate an alarm sound instead of the notification message and transfer the alarm sound to the audio output unit 274 , so that the audio output unit 274 can output an audible notification.
- the icon corresponding to the data can be displayed, but the present general inventive concept is not limited thereto. All icons associated with data being stored in the file when playing back images for each file unit can be displayed regardless of the playback time of the data. Additionally, if only the photographed image is played back and a user inputs a separate manipulation command, for example, if the user selects a specific icon through the manipulator 280 , data corresponding to the selected icon can be played back.
- the present general inventive concept can also be embodied as computer-readable codes on a computer-readable medium.
- the computer-readable medium can include a computer-readable recording medium and a computer-readable transmission medium.
- the computer-readable recording medium is any data storage device that can store data that can be thereafter read by a computer system. Examples of the computer-readable recording medium include read-only memory (ROM), random-access memory (RAM), CD-ROMs, magnetic tapes, floppy disks, and optical data storage devices.
- the computer-readable recording medium can also be distributed over network coupled computer systems so that the computer-readable code is stored and executed in a distributed fashion.
- the computer-readable transmission medium can transmit carrier waves or signals (e.g., wired or wireless data transmission through the Internet). Also, functional programs, codes, and code segments to accomplish the present general inventive concept can be easily construed by programmers skilled in the art to which the present general inventive concept pertains.
- the exemplary embodiment of the present general inventive concept provides a DVC 110 capable of performing data communication, through a mobile phone, with other mobile phones that are spatially separated from the DVC 110 , but is not limited thereto.
- the DVC 110 can directly perform short-range communication with another DVC capable of performing short-range communication.
- the exemplary embodiment of the present general inventive concept can also be applied to other photographing apparatuses that photograph an image. Accordingly, a digital camera, a camera for a mobile phone, or other devices can be used in the exemplary embodiment of the present general inventive concept, in addition to the digital camcorder of the exemplary embodiment of the present general inventive concept.
- the photographing apparatus can communicate with the external apparatus while photographing images. Additionally, the photographing apparatus can generate the data received from the external apparatus in the form of metadata and store the generated metadata together with the image, thus enhancing user convenience.
Landscapes
- Engineering & Computer Science (AREA)
- Signal Processing (AREA)
- Multimedia (AREA)
- Human Computer Interaction (AREA)
- Computer Networks & Wireless Communication (AREA)
- Business, Economics & Management (AREA)
- General Business, Economics & Management (AREA)
- Studio Devices (AREA)
- Telephonic Communication Services (AREA)
- Television Signal Processing For Recording (AREA)
Abstract
A photographing apparatus capable of performing communication with an external apparatus and a method of controlling the photographing apparatus. The photographing apparatus can communicate with an external apparatus and store data received from the external apparatus together with a photographed image, even while photographing the image. Additionally, the data received from the external apparatus can be stored in a form of metadata to connect data to the image, and thus it is possible to play back the data received from the external apparatus in a more lifelike manner.
Description
- This application is a Continuation Application of prior application Ser. No. 11/781,996, filed on Jul. 24, 2007 in the United States Patent and Trademark Office, which claims priority under 35 U.S.C. §119 (a) from Korean Patent Application No. 10-2007-0012228, filed on Feb. 6, 2007, in the Korean Intellectual Property Office, the disclosure of which is incorporated herein by reference in its entirety.
- 1. Field of the Invention
- The present general inventive concept relates to a photographing apparatus and a method of controlling the same. More particularly, the present general inventive concept relates to a photographing apparatus capable of communication with an external apparatus and a method of controlling the same.
- 2. Description of the Related Art
- Photographing apparatuses photograph subjects to generate images that may then be stored. Representative examples of photographing apparatuses include digital cameras and digital camcorders. Photographing apparatuses may also display the images that have been photographed using photographing apparatuses and stored. The displayed images may, for example, be displayed on televisions (TVs) connected to the photographing apparatuses.
- If users desire to communicate with external apparatuses while photographing images, photographing apparatuses may be used with separate communication devices. Alternatively, if photographing apparatuses have both photographing functions and communication functions, users may temporarily stop photographing and communicate with external apparatuses, thus increasing user inconvenience.
- Additionally, a user may collect comments from other users while sharing the photographed images. When the photographed images are stored in a file on a personal computer (PC), other users' comments may be recorded as text. However, text alone cannot represent the feeling of movement at the time the photographing operation is performed.
- In addition, in order to enable other users to leave comments, the user may need to perform several operations other than photographing images, which may burden the user.
- The present general inventive concept provides a photographing apparatus and a method of controlling the same, which enables the photographing apparatus to communicate with an external apparatus while photographing an image.
- The present general inventive concept provides a photographing apparatus and a method of controlling the same, in which data received from an external apparatus is stored together with a photographed image to enhance user convenience.
- Additional aspects and utilities of the present general inventive concept will be set forth in part in the description which follows and, in part, will be obvious from the description, or may be learned by practice of the general inventive concept.
- The foregoing and/or other aspects and utilities of the present general inventive concept may be achieved by providing a photographing apparatus including an image photographing unit to photograph an image, a storage unit to store the photographed image, a display unit to display the photographed image, a communication interface to communicate with an external apparatus and a controller to control such that a notification message to indicate that data has been received is generated and displayed on the display unit together with the photographed image, if the communication interface receives data from the external apparatus.
- The external apparatus may include a mobile phone, and the communication interface may perform short-range communication with the mobile phone.
- The mobile phone may perform long-range communication with another mobile phone and may transmit an image received from the communication interface to the other mobile phone or transmit data received from the other mobile phone to the communication interface.
- The received data may include at least one of audio data and text data.
- The controller may store the received data in the storage unit in which the photographed image is stored, if a storage command to store the received data is input.
- The photographing apparatus may further include a compressor to compress audio data. If the received data is in a form of audio data, the compressed audio data output from the compressor may be stored in the storage unit.
- When the received data is stored in the storage unit, the controller may generate metadata including information regarding the received data and the image and may store the generated metadata in the storage unit.
- The foregoing and/or other aspects and utilities of the present general inventive concept may also be achieved by providing a method of controlling a photographing apparatus, including photographing an image, displaying the photographed image and performing communication with an external apparatus. If the data is received from the external apparatus, the displaying of the photographed image includes displaying a notification message to indicate that data has been received together with the photographed image.
- The external apparatus may include a mobile phone, and the performing may include performing short-range communication with the mobile phone.
- The mobile phone may perform long-range communication with another mobile phone and may transmit an image received from the photographing apparatus to the other mobile phone or transmit data received from the other mobile phone to the photographing apparatus.
- The received data may include at least one of audio data and text data.
- The method may further include storing the received data in a storage unit in which the photographed image is stored, if a storage command to store the received data is input.
- The method may further include compressing audio data if the received data is in a form of audio data.
- The method may further include generating metadata including information regarding the received data and the image. The storing may include storing the metadata and the image.
- The foregoing and/or other aspects and utilities of the general inventive concept may also be achieved by providing a photographing apparatus usable with an external apparatus, the apparatus including an image photographing unit to photograph an image, a communication interface to communicate with the external apparatus, and a controller to determine whether data has been received from the external apparatus through the communication interface and, if so, to display, together with the photographed image of the image photographing unit, a notification message that the data has been received.
- The foregoing and/or other aspects and utilities of the general inventive concept may also be achieved by providing a photographing apparatus usable with an external device, the apparatus including a photographic mode in which an image is photographed and stored without communication with the external device, a communication mode in which communication with the external device is enabled while the image is being photographed and the image is stored with the data communicated by the external device, and a selection button for selecting one of the photographic mode and the communication mode.
- The foregoing and/or other aspects and utilities of the general inventive concept may also be achieved by providing a photographing system, including a photographing apparatus to photograph an image, the photographing apparatus having a photographic mode and a communication mode, a first mobile phone to communicate with the photographing apparatus and a second mobile phone to communicate with the first mobile phone, wherein the photographing apparatus receives data from the second mobile phone through the first mobile phone while photographing the image, when in the communication mode.
- The foregoing and/or other aspects and utilities of the general inventive concept may also be achieved by providing a method of operating a photographing apparatus, the method including photographing an image, communicating with an external apparatus, and determining whether data has been received from the external apparatus and, if so, displaying a notification message that the data has been received together with the photographed image.
- The foregoing and/or other aspects and utilities of the general inventive concept may also be achieved by providing a method of operating a photographing apparatus, the method including determining whether a photographic mode or a communication mode has been selected, if a photographic mode has been selected, preventing communication with an external device during the photographing of an image, and if a communication mode has been selected, allowing communication with the external device during the photographing of the image.
- The foregoing and/or other aspects and utilities of the general inventive concept may also be achieved by providing a computer-readable recording medium having embodied thereon a computer program to execute a method, wherein the method includes determining whether a photographic mode or a communication mode has been selected, if a photographic mode has been selected, preventing communication with an external device during the photographing of an image, and if a communication mode has been selected, allowing communication with the external device during the photographing of the image.
- These and/or other aspects and utilities of the present general inventive concept will become apparent and more readily appreciated from the following description of the embodiments, taken in conjunction with the accompanying drawings of which:
-
FIG. 1 is a view illustrating a photographing apparatus and a plurality of mobile phones which are connected via a network, according to an exemplary embodiment of the present general inventive concept; -
FIG. 2 is a block diagram illustrating a digital camcorder according to an exemplary embodiment of the present general inventive concept; -
FIG. 3 is a flowchart illustrating a method of converting a specific image of a photographed moving image into a still image and transmitting the converted image, or of processing data received from an external apparatus, according to an exemplary embodiment of the present general inventive concept; -
FIGS. 4A to 4E are views illustrating in detail a method of selecting a specific image from a photographed moving image and transmitting the specific image to an external apparatus; -
FIGS. 5A and 5B are views illustrating a data structure of an image to be transmitted, according to an exemplary embodiment of the present general inventive concept; -
FIGS. 6A to 6C are views illustrating a video output unit when data is received from a second mobile phone; -
FIGS. 7A to 7D are views illustrating a data structure of metadata, according to an exemplary embodiment of the present general inventive concept; -
FIGS. 8A to 8D are views illustrating data which is received from a third mobile phone; -
FIG. 9 is a view illustrating a data structure of third metadata which is generated using data received from a fourth mobile phone; and -
FIG. 10 is a view illustrating a data structure of image management data.
- Reference will now be made in detail to the embodiments of the present general inventive concept, examples of which are illustrated in the accompanying drawings, wherein like reference numerals refer to like elements throughout. The embodiments are described below in order to explain the present general inventive concept by referring to the figures.
-
FIG. 1 is a view illustrating a photographing apparatus and a plurality of mobile phones that are connected via a network, according to an exemplary embodiment of the present general inventive concept. In FIG. 1, a digital video camcorder (DVC) 110, a first mobile phone 120, a second mobile phone 140, a third mobile phone 150 and a fourth mobile phone 160 can be connected via the network. - The
DVC 110 can photograph an image and transmit the photographed image to an external apparatus, that is, for example, the first mobile phone 120, the second mobile phone 140 and the third mobile phone 150. Specifically, the image photographed by the DVC 110 can be transmitted to the first mobile phone 120 using short-range wireless communication, such as Bluetooth. Additionally, the first mobile phone 120 can transmit the image to the second mobile phone 140 and the third mobile phone 150 using long-range wireless communication, in particular via a base station 130. Furthermore, the second to fourth mobile phones 140 to 160 can transmit data to the DVC 110 through the first mobile phone 120. Accordingly, the DVC 110 can perform data communication with the second to fourth mobile phones 140 to 160 through the first mobile phone 120. - If a user of the
DVC 110 desires to share a specific image with a third party (for example, a user of the second mobile phone 140 or a user of the third mobile phone 150) who is spatially separated from the user of the DVC 110 while photographing an image, the specific image can be transmitted to the first mobile phone 120 using short-range wireless communication, and the first mobile phone 120 can transmit the received image to the second mobile phone 140 and the third mobile phone 150 using long-range wireless communication. A short message service (SMS) can be used when transmitting the image from the first mobile phone 120 to the second mobile phone 140 and the third mobile phone 150. Accordingly, the user of the DVC 110, the user of the second mobile phone 140 and the user of the third mobile phone 150 can share the same image. - Additionally, the
DVC 110 and the mobile phones can exchange text data or audio data, so the user of the second mobile phone 140 and the user of the third mobile phone 150 can leave comments regarding the image being photographed in real time, and the comments can be stored together with the image. - Hereinafter, description will be given in detail of a method in which several users can share a single image and data received from an external apparatus can be stored together with the image.
- The
DVC 110 to store the data received from the external apparatuses together with the image will now be described with reference to FIG. 2. FIG. 2 is a block diagram illustrating the DVC 110 according to an exemplary embodiment of the present general inventive concept. In FIG. 2, the DVC 110 includes an image photographing unit 210, a coder/decoder (CODEC) 220, a temporary storage unit 230, a communication interface 240, a storage unit 250, a display information combiner 260, an image output unit 270, a manipulator 280, and a controller 290. - The
image photographing unit 210 can include a lens 212, a charge-coupled device (CCD) 214, a microphone 216, and a signal processor 218. - The
CCD 214 can capture an optical image of a subject incident through the lens 212 and generate a video signal corresponding to the photographed optical image. The photographing apparatus according to the exemplary embodiment of the present general inventive concept can be implemented with a complementary metal oxide semiconductor (CMOS) sensor instead of the CCD 214. - The
signal processor 218 can remove noise, adjust a level of the video signal, convert an analog signal into a digital signal, and perform digital signal processing (DSP) for the video signal output from the CCD 214. The signal processor 218 can also amplify an audio signal output from the microphone 216 and convert the audio signal into a digital signal. - The
CODEC 220 can compress the video and audio signals (hereinafter, referred to as “images”) processed by the signal processor 218 in a predetermined format and transfer the compressed images to the storage unit 250. Alternatively, the CODEC 220 can decode an image output from the storage unit 250 or the temporary storage unit 230 into a re-playable original signal and transfer the re-playable original signal to the image output unit 270. - The
temporary storage unit 230 can temporarily store the specific image compressed by the CODEC 220 before transmitting the image to the communication interface 240, and can also temporarily store data received from the communication interface 240. - The
communication interface 240 can modulate the image output from the temporary storage unit 230 to wirelessly transmit the image, and transfer the data received from the external apparatus to the temporary storage unit 230. The communication interface 240 may be a Bluetooth module or a wireless local area network (LAN) module to perform short-range wireless communication. - The
storage unit 250 can store the image compressed by the CODEC 220. Additionally, the storage unit 250 can store metadata output from the controller 290, together with the image. The metadata includes the data received from the external apparatus and various information required to connect the data received from the external apparatus to the image. - The
display information combiner 260 can combine the video signal output from the signal processor 218 with display information such as characters, symbols, diagrams, graphics or other information. The display information combiner 260 can combine the display information using an on-screen display (OSD) method under the control of the controller 290. - The
image output unit 270 can include a video output unit 272 to display the video signal, and an audio output unit 274 to output the audio signal. The video output unit 272 can be implemented as a liquid crystal display (LCD) or the like, and the audio output unit 274 can be implemented as a speaker or the like. The video output unit 272 can also display functions of the manipulator 280, or can be used as a part of the manipulator 280. - The
manipulator 280 includes a selection button through which a mode of the DVC 110 can be selected. A user may execute a photographing command or transmit the photographed image to the external apparatus, and may also receive text data or audio data from the external apparatus, using the manipulator 280 including the selection button. Modes of the DVC 110 according to an exemplary embodiment of the present general inventive concept include a “photographing mode” and a “communication mode”. The “photographing mode”, as a general function of the DVC 110, refers to a mode in which an image is photographed and the photographed image is stored. The “communication mode” refers to a mode in which a photographed image is stored and communication with an external apparatus is performed while photographing an image. - The
controller 290 can control the entire operation of the DVC 110. For example, if the user inputs a command to change the mode of the DVC 110 to the “communication mode” through the manipulator 280, the controller 290 can store the image compressed by the CODEC 220 in the storage unit 250, and also transmit the compressed image to the external apparatus using the communication interface 240. Additionally, if the compressed image is transmitted, the controller 290 can generate information regarding the transmitted image in the form of a data structure and store the information in the temporary storage unit 230. - If the
communication interface 240 receives data from the external apparatus and the received data is stored in the temporary storage unit 230, the controller 290 can generate a notification message to indicate that data has been received and can transfer the generated notification message to the display information combiner 260. The controller 290 can also transfer the received data to the image output unit 270 or store the data in the storage unit 250. When storing the received data in the storage unit 250, the controller 290 can generate metadata to connect the data to the image and can store the metadata together with the image in the storage unit 250. - Additionally, the
controller 290 can control the image output unit 270 to output the image processed by the signal processor 218, and the video output unit 272 to display information regarding a manipulation command input by the user using the manipulator 280. - Hereinafter, a method by which the
DVC 110 of FIG. 2 transmits an image to an external apparatus and a method by which the DVC 110 processes the data received from the external apparatus will be described in detail with reference to FIG. 3. FIG. 3 is a flowchart illustrating a method of converting a specific image of a photographed moving image into a still image and transmitting the converted image, or of processing data received from an external apparatus, according to an exemplary embodiment of the present general inventive concept. - In
FIG. 3, the image photographing unit 210 can photograph an image in operation S310. Specifically, the CCD 214 can capture an optical image of a subject incident through the lens 212 and transfer the image to the signal processor 218. The microphone 216 can transfer an audio signal to the signal processor 218. The signal processor 218 can remove noise, adjust a level of the video signal, convert an analog signal into a digital signal, and perform digital signal processing (DSP) for the video signal output from the CCD 214. The signal processor 218 can also amplify the audio signal output from the microphone 216 and convert the audio signal into a digital signal. The CODEC 220 can compress the photographed image in a predetermined format in operation S315. - The
controller 290 can determine whether the mode of the DVC 110 is the communication mode in operation S320. Accordingly, the controller 290 can determine whether a user inputs a command to change the mode of the DVC 110 to the communication mode through the manipulator 280. If the mode of the DVC 110 is set to be in the communication mode, the DVC 110 can receive data from and transmit data to the external apparatus while photographing an image. - If it is determined that the mode of the
DVC 110 is in the communication mode in operation S320-Y, the controller 290 can determine whether the communication interface 240 receives data in operation S330. Specifically, the controller 290 can determine whether the first mobile phone 120 receives data, such as text data or audio data, from the second to fourth mobile phones 140 to 160 and transmits the data to the DVC 110. - If it is determined that the data is not received in operation S330-N, the
controller 290 can determine whether a specific image selection command is input in operation S335. Specifically, if a specific image which a user desires to share with a third party is photographed using the DVC 110 while photographing an image, a command to select the photographed image can be input. Accordingly, the specific photographed image can be compressed in a still image format by the CODEC 220 and stored in the temporary storage unit 230. - The
controller 290 can also determine whether a contact address for the external apparatus is input in operation S340. The contact address for the external apparatus refers to numbers for the second mobile phone 140 and the third mobile phone 150. The user can input the contact address for the external apparatus using a user interface provided by the DVC 110 or using a contact address for an external apparatus previously stored in the DVC 110. Additionally, since the DVC 110 can communicate with the first mobile phone 120, contact addresses stored in the first mobile phone 120 can be used. - If the contact address for the external apparatus is input in operation S340-Y, the
controller 290 can generate a data structure including information regarding the selected specific image and store the generated data structure in the temporary storage unit 230 in operation S345. A method of generating a data structure will be described later. - The
communication interface 240 can transmit the data structure of the specific image and the specific image to the external apparatus in operation S350. Specifically, if the communication interface 240 of the DVC 110 transmits the data structure of the specific image and the specific image to the first mobile phone 120 using short-range wireless communication, the first mobile phone 120 can transmit the data structure of the specific image and the specific image to the external apparatus using long-range wireless communication. - If it is determined that the data is received in operation S330-Y, the
controller 290 can control the display information combiner 260 and the video output unit 272 so that a notification message to indicate data reception can be generated and the generated notification message can be displayed on the video output unit 272 in operation S355. More particularly, the data received by the communication interface 240 can be stored in the temporary storage unit 230, and the controller 290 can read the data being stored in the temporary storage unit 230 and generate a notification message. - The
controller 290 can generate various notification messages according to the types of data. Accordingly, if the received data is text data or audio data, different types of notification messages can be generated and transferred to the display information combiner 260, and the display information combiner 260 can combine the notification message with a region of the photographed image and transfer the combined image to the video output unit 272. Since various notification messages can be generated according to the types of data, the user can easily identify the type of data received. - The
controller 290 can determine whether an output command to output the received data is input in operation S360. The user can check that the notification message is displayed on the video output unit 272 while photographing a moving image, and can input the output command. - If the output command is input in operation S360-Y, the
controller 290 can output the received data being stored in the temporary storage unit 230 in operation S365. Specifically, if the received data is text data, the controller 290 can read the text data being stored in the temporary storage unit 230 and convert the data into original text data, followed by transferring the original data to the display information combiner 260. The display information combiner 260 can combine the text data with the image output from the signal processor 218 and transfer the combined image to the video output unit 272. The video output unit 272 can output the image combined with the text data. Alternatively, if the received data is audio data, the controller 290 can read the audio data being stored in the temporary storage unit 230 and transfer the data to the CODEC 220. The CODEC 220 can then decode the audio data into a re-playable original signal and transfer the re-playable original signal to the audio output unit 274, and the audio output unit 274 can output the audio data. - The
controller 290 can determine whether a storage command to store the data output from the image output unit 270 is input in operation S370. Specifically, if the user desires to store the output data together with the photographed image, the user can input the storage command, and the controller 290 can determine that the storage command is input. - If it is determined that the storage command is input in operation S370-Y, the
controller 290 can generate metadata to connect the output data to the image in operation S375. The metadata can include information regarding the output data along with information regarding the image. If the storage command is input, the controller 290 can generate metadata based on whether the data is associated with the data structure transmitted at operation S350. A method of generating metadata will be described later. - Subsequently, the
controller 290 can store the generated metadata in the storage unit 250 in which the image is currently stored in operation S380. - If it is determined that the mode of the
DVC 110 is not the communication mode in operation S320-N, the compressed image can be stored in the storage unit 250 in operation S385, because it is determined that the DVC 110 is in the photographing mode. - For convenience of description, it has been described that if the
DVC 110 is in the communication mode, the controller 290 can determine whether data is received at operation S330 and can then determine whether the specific image selection command is input at operation S335, but the present general inventive concept is not limited thereto. Accordingly, if the DVC 110 is in the communication mode, the controller 290 can determine whether data is received or whether the image is transmitted. -
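The branching of operations S310 through S385 described above can be restated as ordinary control flow. The sketch below is illustrative only, not part of the disclosed apparatus; the function name and the string labels are placeholders keyed to the operation numbers of FIG. 3.

```python
# Hypothetical restatement of the FIG. 3 flowchart (operations S310-S385).
# Hardware actions (photographing, compressing, transmitting) are reduced
# to string labels so the branching can be followed in isolation.

def run_once(mode, data_received, image_selected, address_entered,
             output_cmd, storage_cmd):
    """Return the list of operations performed for one pass of FIG. 3."""
    ops = ["S310 photograph", "S315 compress"]          # always performed
    if mode != "communication":                          # S320-N
        ops.append("S385 store compressed image")        # photographing mode
        return ops
    if data_received:                                    # S330-Y
        ops.append("S355 display notification")
        if output_cmd:                                   # S360-Y
            ops.append("S365 output received data")
            if storage_cmd:                              # S370-Y
                ops += ["S375 generate metadata", "S380 store metadata"]
    elif image_selected and address_entered:             # S335-Y and S340-Y
        ops += ["S345 build data structure", "S350 transmit image"]
    return ops
```

For example, in the photographing mode the pass reduces to S310, S315 and S385, while receiving a comment in the communication mode with both the output and storage commands runs through S355, S365, S375 and S380.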
FIGS. 4A to 4E are views illustrating in detail a method of selecting a specific image from a photographed moving image to be transmitted to an external apparatus. As illustrated in FIG. 4A, if a user desires to share the specific image with a third party while photographing the moving image, the user can select an arrow, that is, a user interface, displayed on a region of the video output unit 272. - As illustrated in
FIG. 4B, a window to input the contact address for the external apparatus can be displayed on a region of the video output unit 272. Since the contact address for the external apparatus can generally include numbers, a number input window is illustrated in FIG. 4B. For convenience of description, if the user of the DVC 110 inputs the contact address for the second mobile phone 140 using the number input window, the numbers illustrated in FIG. 4B may be the contact address for the second mobile phone 140. - However, the present general inventive concept is not limited to this embodiment, so if the user selects a down arrow displayed on the number input window, a pre-registered contact address window for external apparatuses can be displayed, as illustrated in
FIG. 4C. Accordingly, the user can input the contact address for the external apparatus using the pre-registered contact address window for external apparatuses. As illustrated in FIG. 4C, “Han-soo CHO” selected by the user may be the contact address for the third mobile phone 150, and accordingly the user can select the contact address for the external apparatus using the number input window of FIG. 4B or the pre-registered contact address window for external apparatuses of FIG. 4C, and can input a transmission command. If icons corresponding to the contact addresses for the external apparatuses are previously registered, the user can select the icons instead of the contact address for the external apparatus, as illustrated in FIG. 4D. - As illustrated in
FIG. 4E, the specific image can be transmitted to the external apparatus while displaying a completion notification message at the bottom of the video output unit 272. The method of transmitting the specific image to the external apparatus was described above. - The method by which the
controller 290 generates a data structure of an image to be transmitted at operation S345 will be described in detail with reference to FIGS. 5A and 5B. FIGS. 5A and 5B are views regarding a data structure of an image to be transmitted, according to an exemplary embodiment of the present general inventive concept. The data structure can include a moving image ID 501 indicating information regarding a moving image containing an image to be transmitted, the size 502 of an image to be transmitted, the type 503 of an image to be transmitted, a title 504 of an image to be transmitted, a profile type 505 for data transmission between the DVC 110 and the first mobile phone 120, the image sending time 506 in the DVC 110 (specifically, the communication interface 240), a receiver list 507, and information 508 regarding whether the receiver is registered. -
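For illustration only, the FIG. 5A structure can be sketched as a record with one field per reference numeral 501 to 508. The field names, types, and example values below are assumptions; the disclosure names the fields but does not specify their encoding.

```python
# A sketch of the FIG. 5A data structure for an image to be transmitted.
# Field names follow reference numerals 501-508; example values imitate
# the kind of entries FIG. 5B would hold and are invented for illustration.

from dataclasses import dataclass, field

@dataclass
class TransmitImageInfo:
    moving_image_id: str       # 501: moving image containing the still image
    size: int                  # 502: size of the image to be transmitted
    image_type: str            # 503: type of the image to be transmitted
    title: str                 # 504: title of the image
    profile_type: str          # 505: profile for DVC 110 <-> first phone 120
    sending_time: float        # 506: time the communication interface sent it
    receiver_list: list = field(default_factory=list)   # 507
    receivers_registered: bool = False                  # 508

# Hypothetical entry: one image sent to the second and third mobile phones.
info = TransmitImageInfo("MOV_0001", 204800, "JPEG", "Beach", "bluetooth",
                         125.4, ["second phone 140", "third phone 150"], True)
```

Keeping the receiver list 507 inside the record is what later allows a received comment to be matched back to the image it refers to.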
FIG. 5B illustrates a data structure of a specific image prepared based on the data structure of FIG. 5A. Accordingly, as illustrated in the data structure of FIG. 5B, it is determined that the user of the DVC 110 transmits the same image to the second mobile phone 140 and the third mobile phone 150 through the first mobile phone 120. - A method of processing the data received from the second to fourth
mobile phones 140 to 160 will now be described in detail using the drawings. For convenience of description, it is assumed that the user of the second mobile phone 140 transmits a comment regarding a specific image in the form of text data, the user of the third mobile phone 150 transmits a comment regarding a specific image in the form of audio data, and the user of the fourth mobile phone 160 transmits text data which is unrelated to the specific image. -
FIGS. 6A to 6C are views regarding the video output unit 272 (FIG. 2) when data is received from the second mobile phone 140 (FIG. 1). As illustrated in FIG. 6A, if the data is received from the second mobile phone 140, an image being photographed can be displayed together with a notification message on the video output unit 272 of the DVC 110. - If the output command is input at operation S360 (
FIG. 3), the video output unit 272 of the DVC 110 can display text data “What a beautiful spot!! Where is it?”, which is received from the second mobile phone 140, together with an icon corresponding to the second mobile phone 140, as illustrated in FIG. 6B. - If a predetermined period of time has elapsed after the text data is displayed, the words indicating whether to store the data can be displayed, as illustrated in
FIG. 6C. Accordingly, the user can input the storage command through the user interface. - If the storage command is input, the
controller 290 can generate metadata to connect the data to the image. A data structure of metadata according to the exemplary embodiment of the present general inventive concept is now described with reference to FIGS. 7A to 7D. -
FIGS. 7A to 7D are views regarding a data structure of metadata, according to an exemplary embodiment of the present general inventive concept. As illustrated in FIG. 7A, the data structure of the metadata can include the type 701 of received data, the size 702 of the received data, the playback time 703 of the received data, the reception time 704 of the received data, the data transmitter 705, the data content 706, the moving image ID 707 associated with the received data, information 708 regarding whether metadata is generated, a data transmitter icon 709, the next metadata address 710, the previous metadata address 711, information 712 regarding whether an icon is registered, an icon display position 713, and an icon display method 714. -
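As an illustrative sketch, the FIG. 7A metadata structure can be modeled as a record with one entry per reference numeral 701 to 714. The key names and default values below are assumptions, since the disclosure lists the fields but not their encoding.

```python
# A sketch of the FIG. 7A metadata structure. Each key is annotated with
# the reference numeral it corresponds to; all names are illustrative.

def empty_metadata() -> dict:
    """Return a metadata record with all FIG. 7A fields unset."""
    return {
        "type": None,                 # 701: text or audio
        "size": None,                 # 702: size of the received data
        "playback_time": None,        # 703: playback time of the data
        "reception_time": None,       # 704: when the data was received
        "transmitter": None,          # 705: who sent the data
        "content": None,              # 706: the data itself
        "moving_image_id": None,      # 707: associated moving image
        "metadata_generated": False,  # 708: whether metadata is generated
        "transmitter_icon": None,     # 709: data transmitter icon
        "next_metadata_addr": None,   # 710: next metadata address
        "prev_metadata_addr": None,   # 711: previous metadata address
        "icon_registered": False,     # 712: whether an icon is registered
        "icon_position": None,        # 713: icon display position
        "icon_display_method": None,  # 714: icon display method
    }
```

Under this reading, the partial structure of FIG. 7B corresponds to a record in which only fields 701, 702, 704, 705 and 706 have been filled in, with the remainder added at the later output and storage steps.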
FIG. 7B illustrates a data structure of data received from the second mobile phone 140 (FIG. 1) prior to input of the output command at operation S365 (FIG. 3). As illustrated in FIG. 7B, the data structure of the data received from the second mobile phone 140 includes only the type 701 of the received data, the size 702 of the received data, the reception time 704 of the received data, the data transmitter 705 and the data content 706. The controller 290 (FIG. 2) can generate the data structure of FIG. 7B using only information regarding the data being stored in the temporary storage unit 230 (FIG. 2). - If the output command at operation S365 is input, other information regarding the data can be added to the data structure of
FIG. 7B, and thus the controller 290 can generate the data structure illustrated in FIG. 7C. Accordingly, if the output command is input, the text data can be transferred to the video output unit 272 and the icon can also be transferred to the video output unit 272 according to the situation. Accordingly, information regarding the data transmitter icon 709, the information 712 regarding whether an icon is registered, the icon display position 713, and the icon display method 714 can be added to the data structure. Additionally, the data and icons can be output to the image output unit 270 according to the generated data structure. - If the storage command is input, the
controller 290 can generate metadata to connect the data to be stored to the image, and in particular, can generate metadata based on whether the data to be stored includes a comment regarding the image. It is assumed that the data received from the second mobile phone 140 is associated with the image being photographed, so the data structure illustrated in FIG. 5B is currently stored in the temporary storage unit 230. Accordingly, the controller 290 can generate metadata using the data structures illustrated in FIGS. 5B and 7C. -
FIG. 7D illustrates a data structure of metadata (hereinafter, referred to as “first metadata”) of the data received from the second mobile phone 140. When generating the first metadata, the controller 290 can check whether the data transmitter 705 exists in the receiver list 507. - If the
data transmitter 705 exists in the receiver list 507, the image sending time 506 recorded in the data structure of FIG. 5B can be copied into the playback time 703 of the first metadata, and the moving image ID 501 recorded in the data structure of FIG. 5B can be copied into the moving image ID 707; accordingly, the first metadata can be generated. The first metadata of the data received from the second mobile phone 140 is illustrated in FIG. 7D, and can be stored in the storage unit 250 in which the image associated with the first metadata is being stored. - This is because the user of the second
mobile phone 140 transmits the comment regarding the specific image transmitted by the user of the DVC 110 in the form of text data; if the user desires to play back the image again after storing the image and metadata, the comment can be played back together with the image with which it is most closely associated, thus recreating the feeling of the moment.

A method of outputting data received from the third
mobile phone 150 and a method of generating metadata of the data received from the third mobile phone 150 can be performed similarly to the corresponding methods for the data received from the second mobile phone 140.
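The metadata-generation step described above (copying the image sending time 506 into the playback time 703, and the moving image ID 501 into field 707, when the data transmitter appears in the receiver list) can be sketched as follows. The dictionary keys and function name are illustrative assumptions, not names from the patent.

```python
from typing import Optional

def generate_first_metadata(data: dict, image_info: dict) -> Optional[dict]:
    """Sketch of first-metadata generation (FIG. 7D), with hypothetical
    dictionary keys standing in for the figure's reference numerals."""
    # Only generate metadata if the data transmitter (705) appears in the
    # receiver list (507) of the image-association structure (FIG. 5B).
    if data["transmitter"] not in image_info["receiver_list"]:
        return None
    return {
        "playback_time": image_info["image_sending_time"],  # 506 -> 703
        "reception_time": data["reception_time"],           # 704
        "transmitter": data["transmitter"],                 # 705
        "content": data["content"],                         # 706
        "moving_image_id": image_info["moving_image_id"],   # 501 -> 707
        "data_type": data["data_type"],                     # text or audio
    }
```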
FIGS. 8A to 8D are views illustrating data that is received from the third mobile phone 150. The DVC 110 can receive the audio data from the third mobile phone 150, and thus a notification message different from the notification message of the text data can be displayed on the video output unit 272, as illustrated in FIG. 8A. Additionally, since the audio data can be output from the audio output unit 274, only an icon corresponding to an audio data transmitter can be displayed on the video output unit 272 while the audio data is output, as illustrated in FIG. 8B.

Moreover, if the user of the
DVC 110 enters into conversation with the user of the third mobile phone 150 using the audio data, an icon corresponding to the user of the DVC 110 can be displayed on the video output unit 272 together with an icon corresponding to the user of the third mobile phone 150, as illustrated in FIG. 8C. In addition, FIG. 8D illustrates a data structure of metadata (hereinafter referred to as "second metadata") generated when a storage command to store the audio data received from the third mobile phone 150 is input. The data structure of the second metadata is the same as that of the first metadata, but the information contained in the two data structures is different.

If the playback time 703-2 of the second metadata is equal to the playback time 703-1 of the first metadata, the audio data and the text data can have the same playback time when the image is played back. As described above, since the first metadata contains the text data and the second metadata contains the audio data, the data can be output using different methods without causing any problems. However, if all metadata contain text data, that is, if the data is output using the same method, the data may be displayed on the
video output unit 272 in order of the reception time 704 at a predetermined time interval. Additionally, the metadata may be stored according to the order of the reception time 704. Accordingly, the second metadata address may be recorded in an address 710-1 of metadata next to the first metadata, and the first metadata address may be stored in an address 710-2 of metadata preceding the second metadata.
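The ordering scheme just described, where each stored metadata record carries the address of the next record and of the preceding record, amounts to a doubly linked list sorted by reception time. A minimal sketch, using list indices as stand-in storage addresses (the field names are illustrative, not from the patent):

```python
def link_by_reception_time(records: list) -> list:
    """Order metadata records by reception time (704) and fill in
    hypothetical 'next_addr' / 'prev_addr' fields, using list indices
    as stand-ins for storage addresses. Returns the sorted order."""
    order = sorted(range(len(records)),
                   key=lambda i: records[i]["reception_time"])
    for pos, idx in enumerate(order):
        records[idx]["prev_addr"] = order[pos - 1] if pos > 0 else None
        records[idx]["next_addr"] = order[pos + 1] if pos + 1 < len(order) else None
    return order
```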
FIG. 9 is a view illustrating a data structure of third metadata that is generated using data received from the fourth mobile phone 160. Since the data received from the fourth mobile phone 160 is not associated with the image being photographed, the data described in FIGS. 5A and 5B is not stored in the temporary storage unit 230. Accordingly, if the storage command is input, the reception time can be recorded in the data playback time 703-3 because there is no image sending time, and the remaining information can be recorded in the same manner as the method by which the controller 290 generates the first metadata.

Hereinafter, a method of playing back an image together with the data received from the external apparatus will be described.
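The fallback just described for the third metadata, recording the reception time as the playback time whenever the data has no associated image sending time, can be sketched in one small helper (names are illustrative assumptions):

```python
def choose_playback_time(image_sending_time, reception_time):
    """If the data is not associated with the image being photographed,
    there is no image sending time, so record the reception time
    instead as the playback time."""
    if image_sending_time is None:
        return reception_time
    return image_sending_time
```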
FIG. 10 is a view regarding a data structure of image management data. The storage unit 250 may store the image management data, in addition to the image and metadata. If an image file is completely stored in the storage unit 250, the controller 290 can generate image management data using the image and metadata. The data structure of the image management data can include a moving image ID 1001, an address 1002 of the metadata which is initially connected to the moving image, an address 1003 of the metadata which is finally connected to the moving image, a total number of metadata 1004, a total number of metadata 1005 containing text data, and a total number of metadata 1006 containing audio data.

If a playback command is input, the
controller 290 can control an entire operation of the DVC 110 so that the image stored in the storage unit 250 can be played back, and can also determine whether metadata exists using the image management data. The image management data contains the address of the first metadata, that is, the address of the metadata initially connected to the image, and accordingly, the controller 290 can determine that metadata exists. Additionally, a period of time is measured while the moving image is played back, and if the measured period of time corresponds to the playback time 703-1 of the first metadata, the controller 290 can read the first metadata being stored in the storage unit 250. Next, the controller 290 can check the type of the data, and if the data is in the form of text data, the controller 290 can restore the data into re-playable data and transfer the re-playable data to the display information combiner 260. The display information combiner 260 can combine the text data contained in the first metadata with the currently played back image and transfer the combined image to the video output unit 272.

The playback time 703-2 of the second metadata is equal to the playback time 703-1 of the first metadata, and thus the audio data of the second metadata can also be played back when the text data of the first metadata is played back. Accordingly, the audio data contained in the second metadata can be transferred to the
CODEC 220. Additionally, the CODEC 220 can decode the compressed audio data into re-playable original audio data and transfer the re-playable original data to the audio output unit 274. When the text data and audio data are played back, the icons corresponding to the text data and audio data can also be displayed on the video output unit 272.

An address of the third metadata, which is the address 710-2 of next metadata, can be recorded in the data structure of the second metadata, and accordingly, the
controller 290 can determine the address of the third metadata. The controller 290 can measure the period of time again. Additionally, if the measured period of time corresponds to the playback time 703-3 of the third metadata, the content 706-3 of the third metadata can be read out. Therefore, using the above-described method, the image can be played back together with the data received from the external apparatus.

As described above, since the text data and audio data can be stored in the storage unit in which the image is stored, the image can also be played back in other playback apparatuses, in addition to the digital camcorder according to the exemplary embodiment of the present general inventive concept.
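The playback flow of the preceding paragraphs, starting from the initially connected metadata address in the image management data (FIG. 10) and following each record's next-metadata address as the measured time reaches each playback time, can be sketched as follows. The structure names, the callback, and the integer "addresses" are illustrative assumptions, not the patent's own implementation.

```python
from dataclasses import dataclass
from typing import Callable, Optional

@dataclass
class ImageManagementData:
    """Sketch of the FIG. 10 structure (field names are hypothetical)."""
    moving_image_id: str                 # moving image ID (1001)
    first_metadata_addr: Optional[int]   # address of initial metadata (1002)
    last_metadata_addr: Optional[int]    # address of final metadata (1003)
    total_metadata: int                  # total number of metadata (1004)
    total_text_metadata: int             # metadata containing text (1005)
    total_audio_metadata: int            # metadata containing audio (1006)

def play_back(duration: int,
              mgmt: ImageManagementData,
              store: dict,
              render: Callable[[dict], None]) -> None:
    """Walk elapsed time 0..duration-1; whenever the measured time matches
    the next metadata record's playback time, render it (text would go to
    the video output, audio to the CODEC) and follow its next address."""
    addr = mgmt.first_metadata_addr
    for elapsed in range(duration):
        # Several records may share one playback time (e.g. the first and
        # second metadata above), so keep consuming while times match.
        while addr is not None and store[addr]["playback_time"] == elapsed:
            render(store[addr])
            addr = store[addr]["next_addr"]
```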
- According to the exemplary embodiment of the present general inventive concept, if data is received from the external apparatus, the notification message can be generated and displayed on the
video output unit 272, but the present general inventive concept is not limited thereto. The controller 290 can generate an alarm sound instead of the notification message and transfer the alarm sound to the audio output unit 274, so that the audio output unit 274 outputs an audible notification.

In the exemplary embodiment of the present general inventive concept, if data received using wireless communication is output, the icon corresponding to the data can be displayed, but the present general inventive concept is not limited thereto. When images are played back on a per-file basis, all icons associated with the data stored in the file can be displayed, regardless of the playback time of the data. Additionally, if only the photographed image is played back and a user inputs a separate manipulation command, for example, if the user selects a specific icon through the
manipulator 280, data corresponding to the selected icon can be played back.

The present general inventive concept can also be embodied as computer-readable codes on a computer-readable medium. The computer-readable medium can include a computer-readable recording medium and a computer-readable transmission medium. The computer-readable recording medium is any data storage device that can store data that can thereafter be read by a computer system. Examples of the computer-readable recording medium include read-only memory (ROM), random-access memory (RAM), CD-ROMs, magnetic tapes, floppy disks, and optical data storage devices. The computer-readable recording medium can also be distributed over network-coupled computer systems so that the computer-readable code is stored and executed in a distributed fashion. The computer-readable transmission medium can transmit carrier waves or signals (e.g., wired or wireless data transmission through the Internet). Also, functional programs, codes, and code segments to accomplish the present general inventive concept can be easily construed by programmers skilled in the art to which the present general inventive concept pertains.
- The exemplary embodiment of the present general inventive concept provides a
DVC 110 capable of performing data communication, through another mobile phone, with a mobile phone that is spatially separated from the DVC 110, but the present general inventive concept is not limited thereto. The DVC 110 can directly perform short-range communication with another DVC capable of performing short-range communication.

The exemplary embodiment of the present general inventive concept can be applied to any photographing apparatus that photographs an image. Accordingly, a digital camera, a camera for a mobile phone, or other devices can be used in the exemplary embodiment of the present general inventive concept, in addition to the digital camcorder of the exemplary embodiment.
- As described above, the photographing apparatus according to the exemplary embodiment of the present general inventive concept can communicate with the external apparatus while photographing images. Additionally, the photographing apparatus can generate the data received from the external apparatus in the form of metadata and store the generated metadata together with the image, thus enhancing user convenience.
- Although various embodiments of the present general inventive concept have been illustrated and described, it will be appreciated by those skilled in the art that changes may be made in these embodiments without departing from the principles and spirit of the general inventive concept, the scope of which is defined in the appended claims and their equivalents.
Claims (6)
1. An electronic device comprising:
an image pickup unit to acquire a moving image of a subject;
an image processing unit to process signals output from the image pickup unit and to generate video image data;
a display unit to display the video image data;
a memory unit to store the video image data; and
a controller to control the display unit to display the video image data, and to control the display unit to display a notification message in response to receiving an SMS text message sent by an external device while photographing the moving image of the subject,
wherein the notification message indicates that the SMS text message has been received, wherein the notification message includes information on contents of the SMS text message, wherein the controller controls the display unit to display the contents of the SMS text message in response to receiving a user input, and wherein the SMS text message has been sent through a base station by the external device.
2. The device as claimed in claim 1, wherein the notification message comprises a phone number of a sender.
3. The device as claimed in claim 1, wherein the notification message comprises a name of a sender.
4. The device as claimed in claim 3, wherein a pre-stored contact address window is displayed in response to receiving a user input.
5. The device as claimed in claim 1, wherein a contact address of the external device is input and stored, and wherein the notification message is displayed with the contact address.
6. The device as claimed in claim 1, wherein the controller generates an alarm sound in response to receiving the message.
Priority Applications (1)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
US14/471,227 US20140368717A1 (en) | 2007-02-06 | 2014-08-28 | Photographing apparatus capable of communication with external apparatus and method of controlling the same |
Applications Claiming Priority (4)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
KR10-2007-0012228 | 2007-02-06 | ||
KR1020070012228A KR101371413B1 (en) | 2007-02-06 | 2007-02-06 | Photographing apparatus of performing communication and method thereof |
US11/781,996 US8848085B2 (en) | 2007-02-06 | 2007-07-24 | Photographing apparatus capable of communication with external apparatus and method of controlling the same |
US14/471,227 US20140368717A1 (en) | 2007-02-06 | 2014-08-28 | Photographing apparatus capable of communication with external apparatus and method of controlling the same |
Related Parent Applications (1)
Application Number | Title | Priority Date | Filing Date |
---|---|---|---|
US11/781,996 Continuation US8848085B2 (en) | 2007-02-06 | 2007-07-24 | Photographing apparatus capable of communication with external apparatus and method of controlling the same |
Publications (1)
Publication Number | Publication Date |
---|---|
US20140368717A1 true US20140368717A1 (en) | 2014-12-18 |
Family
ID=39675804
Family Applications (2)
Application Number | Title | Priority Date | Filing Date |
---|---|---|---|
US11/781,996 Active 2027-10-24 US8848085B2 (en) | 2007-02-06 | 2007-07-24 | Photographing apparatus capable of communication with external apparatus and method of controlling the same |
US14/471,227 Abandoned US20140368717A1 (en) | 2007-02-06 | 2014-08-28 | Photographing apparatus capable of communication with external apparatus and method of controlling the same |
Family Applications Before (1)
Application Number | Title | Priority Date | Filing Date |
---|---|---|---|
US11/781,996 Active 2027-10-24 US8848085B2 (en) | 2007-02-06 | 2007-07-24 | Photographing apparatus capable of communication with external apparatus and method of controlling the same |
Country Status (3)
Country | Link |
---|---|
US (2) | US8848085B2 (en) |
KR (1) | KR101371413B1 (en) |
CN (1) | CN101242509B (en) |
Families Citing this family (16)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
US8208947B2 (en) | 2007-08-31 | 2012-06-26 | At&T Intellectual Property I, Lp | Apparatus and method for multimedia communication |
US8787993B2 (en) * | 2008-12-11 | 2014-07-22 | Blackberry Limited | System and method for saving data in response to open and close events in a mobile communication device |
JP5388611B2 (en) | 2009-02-03 | 2014-01-15 | キヤノン株式会社 | Imaging apparatus, control method therefor, and program |
KR101722616B1 (en) * | 2009-12-24 | 2017-04-19 | 삼성전자주식회사 | Method and apparatus for operating application of a touch device having touch-based input interface |
US8896633B2 (en) * | 2010-08-17 | 2014-11-25 | Apple Inc. | Adjusting a display size of text |
KR101719653B1 (en) * | 2010-09-16 | 2017-03-24 | 엘지전자 주식회사 | Mobile Terminal, Electronic System And Method Of Transferring And receiving Data Using The Same |
US20130083997A1 (en) * | 2011-10-04 | 2013-04-04 | Alcatel-Lucent Usa Inc. | Temporally structured light |
KR101891101B1 (en) * | 2011-11-04 | 2018-08-24 | 삼성전자주식회사 | Electronic apparatus and method for controlling playback speed of an animation message |
US20140270128A1 (en) * | 2011-11-08 | 2014-09-18 | Nec Corporation | Content display terminal selection system |
US20130332854A1 (en) * | 2012-06-10 | 2013-12-12 | Apple Inc. | Creating image streams and sharing the image streams across different devices |
EP2986007A4 (en) | 2013-04-11 | 2017-03-08 | Nec Corporation | Information processing apparatus, data processing method thereof, and program |
US9804760B2 (en) | 2013-08-22 | 2017-10-31 | Apple Inc. | Scrollable in-line camera for capturing and sharing content |
KR102256642B1 (en) * | 2014-12-04 | 2021-05-27 | 삼성전자주식회사 | Apparatus for transmiting and receiving message and method for transmiting and receiving message |
US9615023B2 (en) * | 2015-03-13 | 2017-04-04 | Center For Integrated Smart Sensors Foundation | Front-end event detector and low-power camera system using thereof |
WO2017119747A1 (en) * | 2016-01-05 | 2017-07-13 | Samsung Electronics Co., Ltd. | Electronic device and method for controlling the electronic device |
CN105847683A (en) * | 2016-03-31 | 2016-08-10 | 成都西可科技有限公司 | Motion camera one-key sharing system and method |
Citations (7)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
US20040218738A1 (en) * | 1998-05-01 | 2004-11-04 | Hideyuki Arai | Recording/playback apparatus with telephone and its control method, video camera with telephone and its control method, image communication apparatus, and storage medium |
JP2005191899A (en) * | 2003-12-25 | 2005-07-14 | Mitsubishi Electric Corp | Mobile communication device with camera |
US20070010292A1 (en) * | 2005-07-11 | 2007-01-11 | Nokia Corporation | Alternative notifier for multimedia use |
US20070237491A1 (en) * | 2006-03-29 | 2007-10-11 | Clifford Kraft | Portable personal entertainment video viewing system |
US20070298839A1 (en) * | 2001-07-31 | 2007-12-27 | Matsushita Electric Industrial Co., Ltd. | Camera-equipped cellular telephone |
US20080227490A1 (en) * | 1998-11-19 | 2008-09-18 | Nikon Corporation | Camera capable of communicating with other communication device |
US20080295017A1 (en) * | 2006-09-05 | 2008-11-27 | Tseng Tina L | User interface for a wireless device |
Family Cites Families (18)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
JP4431216B2 (en) * | 1999-07-09 | 2010-03-10 | 富士フイルム株式会社 | Data communication system |
JP2001045351A (en) * | 1999-08-02 | 2001-02-16 | Minolta Co Ltd | Camera system using portable terminal |
JP2001111977A (en) | 1999-10-05 | 2001-04-20 | Canon Inc | Image pickup device, terminal, method and system for mobile communication and storage medium |
JP3907384B2 (en) * | 2000-05-18 | 2007-04-18 | キヤノン株式会社 | Imaging system, imaging device, communication device, control method thereof, and storage medium |
US6982747B2 (en) * | 2000-05-18 | 2006-01-03 | Canon Kabushiki Kaisha | Notification of operating status in image sensing system |
KR20030028326A (en) * | 2001-09-29 | 2003-04-08 | 삼성전자주식회사 | Method for processing incoming call in telephone with e-mail function |
JP2003153058A (en) * | 2001-11-13 | 2003-05-23 | Sony Corp | Imaging apparatus |
US7545415B2 (en) | 2001-11-27 | 2009-06-09 | Panasonic Corporation | Information-added image pickup method, image pickup apparatus and information delivery apparatus used for the method, and information-added image pickup system |
US7382405B2 (en) * | 2001-12-03 | 2008-06-03 | Nikon Corporation | Electronic apparatus having a user identification function and user identification method |
US7392039B2 (en) * | 2002-03-13 | 2008-06-24 | Novatel Wireless, Inc. | Complete message delivery to multi-mode communication device |
US7843495B2 (en) * | 2002-07-10 | 2010-11-30 | Hewlett-Packard Development Company, L.P. | Face recognition in a digital imaging system accessing a database of people |
KR100511227B1 (en) * | 2003-06-27 | 2005-08-31 | 박상래 | Portable surveillance camera and personal surveillance system using the same |
KR100606766B1 (en) | 2003-12-05 | 2006-07-31 | 엘지전자 주식회사 | Method for Managing a short message |
GB2412804A (en) * | 2004-03-30 | 2005-10-05 | Nokia Corp | Recording images with associated context information |
JP4189349B2 (en) * | 2004-04-27 | 2008-12-03 | Hoya株式会社 | Camera phone and incoming call notification method |
JP2005348028A (en) * | 2004-06-02 | 2005-12-15 | Fuji Photo Film Co Ltd | Imaging apparatus |
US7403225B2 (en) * | 2004-07-12 | 2008-07-22 | Scenera Technologies, Llc | System and method for automatically annotating images in an image-capture device |
KR100828357B1 (en) * | 2005-05-16 | 2008-05-08 | 삼성전자주식회사 | Method and apparatus for storing data obtained by image filming apparatus, and navigation apparatus using location information included in image data |
- 2007
  - 2007-02-06 KR KR1020070012228A patent/KR101371413B1/en active IP Right Grant
  - 2007-07-24 US US11/781,996 patent/US8848085B2/en active Active
  - 2007-12-05 CN CN2007101971262A patent/CN101242509B/en not_active Expired - Fee Related
- 2014
  - 2014-08-28 US US14/471,227 patent/US20140368717A1/en not_active Abandoned
Also Published As
Publication number | Publication date |
---|---|
CN101242509B (en) | 2013-10-09 |
KR101371413B1 (en) | 2014-03-10 |
US20080186385A1 (en) | 2008-08-07 |
US8848085B2 (en) | 2014-09-30 |
KR20080073520A (en) | 2008-08-11 |
CN101242509A (en) | 2008-08-13 |
Similar Documents
Publication | Publication Date | Title |
---|---|---|
US8848085B2 (en) | Photographing apparatus capable of communication with external apparatus and method of controlling the same | |
US10313730B2 (en) | Device and method for outputting data of a wireless terminal to an external device | |
JP4405512B2 (en) | Remote control device with wireless telephone interface | |
US20070291107A1 (en) | Apparatus and method for sending/receiving text message during video call in mobile terminal | |
JP6385429B2 (en) | Method and apparatus for reproducing stream media data | |
US20060229015A1 (en) | Device and method for displaying files during bluetooth communication in a wireless terminal | |
JP2017501598A5 (en) | ||
US20050140802A1 (en) | Method for synthesizing photographed image with background scene in appratus having a camera | |
US7804516B2 (en) | Network capturing apparatus, displaying method, computer-readable recording medium, and network system | |
KR101314565B1 (en) | Photographing apparatus of providing location related information and method thereof | |
US20080012951A1 (en) | Photographing apparatus wirelessly transmitting image and method of controlling the same | |
JP2009141388A (en) | Information processor, and information processing method | |
US20070207779A1 (en) | Method for confirming message in mobile terminal | |
KR100682725B1 (en) | Method for processing video signal of electronic apparatus and electronic apparatus thereof | |
US20050113152A1 (en) | Method and apparatus for controlling power-off operation of mobile terminal | |
KR100614746B1 (en) | Method for storing multiple motion pictures at the same time | |
JP2005295114A (en) | Photographing apparatus | |
JP2011188209A (en) | Portable apparatus and photographing device | |
JP2004208125A (en) | Communication terminal device, communication system, and communication program | |
KR20070095487A (en) | Method for providing of quick view service in portable communication terminal | |
JP2002344649A (en) | Method, system and program for photographing, storing and restoring moving picture utilizing personal digital assistant | |
JP2005252843A (en) | Mobile terminal device, program recording system, and program recording method | |
JP2003333198A (en) | Intercom system | |
JP2005151256A (en) | Communication terminal device, its control method, its control program, and recording medium with the program recorded | |
JP2016051917A (en) | Voice processing device, imaging device, and voice processing device control method |
Legal Events
Date | Code | Title | Description |
---|---|---|---|
AS | Assignment |
Owner name: SAMSUNG ELECTRONICS CO., LTD., KOREA, REPUBLIC OF Free format text: ASSIGNMENT OF ASSIGNORS INTEREST;ASSIGNOR:O, SEUNG-HUN;REEL/FRAME:033628/0815 Effective date: 20140828 |
|
STCB | Information on status: application discontinuation |
Free format text: ABANDONED -- FAILURE TO RESPOND TO AN OFFICE ACTION |