JP2007072210A - Imaging apparatus, its control method, and program - Google Patents

Imaging apparatus, its control method, and program

Info

Publication number
JP2007072210A
Authority
JP
Japan
Prior art keywords
shooting, imaging, means, step, camera
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Pending
Application number
JP2005259776A
Other languages
Japanese (ja)
Other versions
JP2007072210A5 (en)
Inventor
Koichi Nakagawa
浩一 中川
Original Assignee
Canon Inc
キヤノン株式会社
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Canon Inc (キヤノン株式会社)
Priority to JP2005259776A
Publication of JP2007072210A5
Publication of JP2007072210A
Application status is Pending


Abstract

PROBLEM TO BE SOLVED: To provide an imaging apparatus, and a control method and program therefor, capable of performing effective stereo shooting even of a dynamic subject, reducing the photographer's work burden before and after shooting, and making stereo shooting highly convenient.
SOLUTION: When a half-press of the release button of the own camera is detected, a mode is set in which the own camera acts as the leading camera and the connection destination camera acts as the subordinate camera, and the shooting parameters set in the own camera are notified to the connected camera (FIG. 3(a)). Next, when a full-press of the release button of the own camera is detected, the fact that the full-press has been detected is notified to the connected camera via the IEEE 1394 interface as a shooting command (FIG. 3(b)). This makes it possible to perform the shooting operation of the connection destination camera automatically and almost simultaneously with the shooting operation of the own camera.
[Selection] Figure 3

Description

  The present invention relates to an imaging apparatus that photoelectrically converts, with an image sensor, an optical image of a subject captured through a photographing lens and generates digital image data, and to a control method and program therefor. More particularly, the present invention relates to an imaging apparatus capable of capturing a stereo image by connecting to another imaging apparatus, and to a control method and program therefor.

  Conventionally, one known method of capturing a stereo image obtains a plurality of images with parallax by using a single imaging apparatus, moving its position for each shot, and shooting a plurality of times (see, for example, Patent Document 1).

There is also a method of obtaining a plurality of images with parallax by photographing from different positions using a plurality of imaging devices.
JP 2002-341474 A

  However, of the above conventional stereo image shooting methods, the former, that is, the method of shooting a plurality of times with a single imaging apparatus, introduces a time gap between shots, so it was difficult to obtain an effective stereo image of a dynamic subject.

  Of the above conventional stereo image shooting methods, the latter, that is, the method using a plurality of imaging apparatuses, makes it difficult to perform the shooting operations on the respective apparatuses at the same timing; as in the single-apparatus case, it was therefore difficult to obtain an effective stereo image of a dynamic subject. Furthermore, when each imaging apparatus automatically determines shooting parameters such as exposure, the parameter values may differ among the apparatuses, with the result that each apparatus may capture an image with a different impression. To prevent this, such shooting parameters must be unified among the apparatuses and set manually, and preparing to shoot therefore costs the photographer time and effort. In addition, since the captured images remain dispersed across the respective apparatuses after shooting, using them as a stereo image requires collecting the captured images from each apparatus. Furthermore, at the time of shooting, every imaging apparatus must be loaded with a recording medium on which the shot image can be recorded, and it is also necessary to confirm in advance that a recordable medium is indeed installed in each apparatus.

  The present invention has been made in view of the above problems, and it is an object of the present invention to provide an imaging apparatus, and a control method and program therefor, capable of effective stereo shooting even of a dynamic subject, reducing the photographer's work burden before and after shooting, and making stereo shooting highly convenient.

  In order to achieve the above object, an imaging apparatus according to a first aspect of the present invention comprises: imaging means for shooting a subject and generating captured image data; a first operation switch with which a user instructs shooting preparation; a second operation switch with which the user instructs shooting; communication means for transmitting and receiving information to and from another imaging apparatus; and control means for controlling so that, when the user operates the first operation switch, the set shooting parameter values are transmitted to the other imaging apparatus via the communication means, and, when the user operates the second operation switch, a shooting command is transmitted to the other imaging apparatus via the communication means and the subject is shot by the imaging means.
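  As an informal illustration of the leading-camera control just described, the following minimal Python sketch shows one way the two operation switches could drive the control means; the imager/link objects, their method names, and the parameter fields are hypothetical and are not specified by the present description.

```python
# Hypothetical sketch only: the imager/link objects and their methods are
# placeholders standing in for the imaging means and communication means.
from dataclasses import dataclass, asdict


@dataclass
class ShootingParams:
    exposure: float        # determined automatically on shooting preparation
    white_balance: str
    focus_position: int
    image_size: str        # set in advance by the photographer
    compression: str


class LeadingCamera:
    def __init__(self, imager, link):
        self.imager = imager   # imaging means (shoots and measures)
        self.link = link       # communication means to the other camera
        self.params = None

    def on_first_switch(self):
        # First operation switch (shooting preparation): determine the
        # shooting parameters and transmit them to the other imaging apparatus.
        self.params = self.imager.measure()
        self.link.send("PARAMS", asdict(self.params))

    def on_second_switch(self):
        # Second operation switch (shooting): transmit a shooting command,
        # then shoot the subject with the imaging means.
        self.link.send("SHOOT", {})
        return self.imager.capture(self.params)
```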

  In order to achieve the above object, an imaging apparatus according to claim 4 comprises: imaging means for shooting a subject and generating captured image data; communication means for transmitting and receiving information to and from another imaging apparatus; receiving means for receiving, via the communication means, the shooting parameter values set in the other imaging apparatus, a shooting command, and captured image data transmitted from the other imaging apparatus; and control means for controlling the imaging means so that, when the shooting command is received by the receiving means, the subject is shot based on the shooting parameter values of the other imaging apparatus already received by the receiving means.
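  Under the same hypothetical interfaces, the receiving side described above could be sketched as follows; again, this is illustrative only and not the patent's own code.

```python
# Hypothetical sketch of the claim 4 behavior: apply the received parameter
# values, then shoot when the shooting command arrives.
class SubordinateCamera:
    def __init__(self, imager, link):
        self.imager = imager
        self.link = link
        self.remote_params = None   # parameter values set on the other camera

    def on_message(self, kind, payload):
        if kind == "PARAMS":
            # Shooting parameter values transmitted from the other apparatus.
            self.remote_params = payload
            self.imager.apply(payload)
        elif kind == "SHOOT":
            # Shoot based on the already-received parameter values and send
            # the captured image data back over the communication means.
            image = self.imager.capture(self.remote_params)
            self.link.send("IMAGE", image)
```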

  In order to achieve the above object, a control method for an imaging apparatus according to claim 10 comprises: a shooting step of shooting a subject via imaging means and generating captured image data; a communication step of transmitting and receiving information to and from another imaging apparatus via communication means; and a control step of controlling so that, when the user operates a first operation switch for instructing shooting preparation, the set shooting parameter values are transmitted to the other imaging apparatus in the communication step, and, when the user operates a second operation switch for instructing shooting, a shooting command is transmitted to the other imaging apparatus in the communication step and the subject is shot in the shooting step.

  In order to achieve the above object, a control method for an imaging apparatus according to claim 11 comprises: a shooting step of shooting a subject via imaging means and generating captured image data; a communication step of transmitting and receiving information to and from another imaging apparatus via communication means; a receiving step of receiving, via the communication means, the shooting parameter values set in the other imaging apparatus, a shooting command, and captured image data transmitted from the other imaging apparatus; and a control step of controlling the shooting step so that, when the shooting command is received in the receiving step, the subject is shot based on the shooting parameter values of the other imaging apparatus already received in the receiving step.

  According to the present invention, when the apparatus is connected to another imaging apparatus via the communication means and the user operates the first operation switch, the parameter values set in the own apparatus are transferred to the other imaging apparatus. Since common shooting parameter values are thus set on the other imaging apparatus automatically, a stereo image with a sense of unity between the respective imaging apparatuses can be obtained.

  In addition, when the user operates the second operation switch, a shooting command is transmitted to the other imaging apparatus and shooting is performed at the same timing on the respective imaging apparatuses, so an effective stereo image can be obtained even of a dynamic subject.

  Furthermore, since the captured images are transferred between the imaging apparatuses and recorded in each imaging apparatus as a stereo image set, the burden on the user when the images are later transferred to a PC (personal computer) or the like can be reduced.

  Furthermore, even when the storage means is in a non-recordable state, the captured image is transferred to the other imaging apparatus and recorded there, so it is not necessary to confirm in advance whether the storage means is recordable, and the burden on the user before shooting can be reduced.

  Hereinafter, embodiments of the present invention will be described in detail with reference to the drawings.

(First embodiment)
FIG. 1 is a block diagram showing a schematic configuration of an imaging apparatus (hereinafter referred to as “camera”) according to a first embodiment of the present invention.

  As shown in the figure, the camera of the present embodiment mainly comprises: a memory card MC1 capable of recording image data; an IEEE 1394 interface IF1 for connecting to an external device (for example, another imaging apparatus) via IEEE 1394; an imaging unit C1 that converts a subject into a video signal; a system controller B1 that controls each unit; a camera video signal processing circuit B2 that processes the video signal from the imaging unit C1; a recording/reproduction data processing circuit B3 that compresses the video signal processed by the camera video signal processing circuit B2, records it on the memory card MC1, and reads out and reproduces image data recorded on the memory card MC1; an SD (Secure Digital)/MMC (MultiMediaCard) card controller B4 for controlling the memory card MC1; an IEEE 1394 controller B5 for controlling the IEEE 1394 interface IF1; a character signal generation circuit (CG) B6 for generating a character signal for on-screen display; an image data storage RAM (Random Access Memory) M1 for temporarily storing compressed image data; a liquid crystal panel V1 for displaying captured and reproduced images; a half-press detection switch SW1 for detecting a half-press of the release button, which instructs a shooting preparation operation; and a full-press detection switch SW2 for detecting a full press of the release button, which instructs a shooting operation.

  In the present embodiment, the most basic operation when two cameras having similar functions are connected by the IEEE 1394 interface IF1 and a stereo image is taken will be described. It is assumed that the physical and electrical connections between the two cameras have been made in advance.

  Hereinafter, control processing executed by the camera configured as described above will be described.

  FIG. 2 is a flowchart showing a procedure of control processing executed by the camera of the present embodiment, particularly the system controller B1.

  In the figure, in step S1 it is detected whether or not the release button of the own camera has been half-pressed. When a half-press operation is detected, the camera enters a mode in which the own camera acts as the leading camera and the camera connected to it (hereinafter referred to as the "connection destination camera") acts as the subordinate camera, and the process proceeds to step S2. On the other hand, if a half-press operation has not been detected, the process proceeds to step S12.

  In step S2, the camera of the connection destination is notified via the IEEE 1394 interface IF1 that the release button has been half-pressed, and the process proceeds to step S3.

  In step S3, the video signal from the imaging unit C1 is processed by the camera video signal processing circuit B2, the optimal exposure, white balance, and so on are determined from the result, and focus adjustment is performed by the autofocus function. These results are set as the shooting parameters, and the process proceeds to step S4.

  In step S4, the set shooting parameters are notified to the connection destination camera (see FIG. 3A). The shooting parameters notified at this time include not only the shooting parameters automatically adjusted by the camera in step S3 but also shooting parameters set in advance by the photographer, such as the image size, the image compression rate, and the angle of view. As a result, both the own camera and the connection destination camera can shoot based on the same shooting parameters. After the notification, the process proceeds to step S5.
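  For concreteness, the parameter set notified in step S4 might be serialized as a small record like the one below; the field names, values, and the JSON encoding are illustrative assumptions, since the embodiment does not specify a wire format for the IEEE 1394 notification.

```python
import json

# Hypothetical step S4 payload: values adjusted automatically in step S3 plus
# values the photographer set in advance (image size, compression rate, angle
# of view).
shooting_params = {
    "exposure_time": "1/250",     # from automatic exposure in step S3
    "aperture": 5.6,
    "white_balance": "daylight",
    "focus_position": 1820,
    "image_size": "2048x1536",    # preset by the photographer
    "compression": "fine",
    "angle_of_view": "35mm",
}
payload = json.dumps(shooting_params).encode("utf-8")
# 'payload' would then be notified to the connection destination camera
# through the IEEE 1394 interface IF1.
```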

  In step S5, it is detected whether or not the release button of the own camera has been fully pressed. If a full press operation is detected, the process proceeds to step S6. On the other hand, if the full press operation is not detected, the process stays at step S5 and waits for the release button full press operation of the own camera.

  In step S6, the connection destination camera is notified via the IEEE 1394 interface IF1, as a shooting command, that the full-press operation of the release button has been detected (see FIG. 3B). This makes it possible to perform the shooting operation of the connection destination camera automatically and almost simultaneously with the shooting operation of the own camera. After the notification, the process proceeds to step S7.

  In step S7, shooting processing is performed based on the set shooting parameters. Specifically, the video signal input from the imaging unit C1 is processed to the optimum image quality by the camera video signal processing circuit B2 and then sent to the recording/reproduction data processing circuit B3. The recording/reproduction data processing circuit B3 compresses the input video signal into the JPEG (Joint Photographic Experts Group) format, stores it in the image data storage RAM M1, and sends it to the SD/MMC card controller B4 as a JPEG file.
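  Reduced to hypothetical function calls, the recording path of step S7 could be pictured as the following sketch; the function names are placeholders for the circuit blocks named above.

```python
# Hypothetical sketch of the step S7 data path:
# imaging unit C1 -> camera video signal processing circuit B2 ->
# recording/reproduction data processing circuit B3 (JPEG compression) ->
# image data storage RAM M1 -> SD/MMC card controller B4.
def shoot(imaging_unit, process_b2, compress_b3, ram_m1, card_controller_b4):
    raw = imaging_unit.read_frame()     # video signal from C1
    processed = process_b2(raw)         # optimum image quality in B2
    jpeg = compress_b3(processed)       # JPEG compression in B3
    ram_m1.store(jpeg)                  # temporarily held in RAM M1
    card_controller_b4.receive(jpeg)    # handed to B4 as a JPEG file
    return jpeg
```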

  In step S8, the SD/MMC card controller B4 writes the input file to the memory card MC1. At this time, the SD/MMC card controller B4 adds the character string "STA" to the head of the file name of the file to be written. Here, "ST" indicates that the image was obtained by stereo shooting, and "A" indicates that the image was captured by the camera that performed the leading operation during shooting. Such a character string is added to the file name in order to associate the file with the image file shot by the connection destination camera.

  In subsequent step S9, the image file taken by the connection destination camera is received via the IEEE 1394 interface IF1.

  In step S10, the received image file is sent to the SD/MMC card controller B4. The SD/MMC card controller B4 writes this image file to the memory card MC1. At this time, the SD/MMC card controller B4 adds the character string "STB" to the head of the file name of the image file to be written. Here, "B" indicates that the image file was captured by the camera that performed the subordinate operation during shooting. This indicates that the image file recorded in step S8 and the image file recorded in step S10 form a pair of data obtained by stereo shooting.
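  A minimal sketch of the file-naming convention used in steps S8 and S10 is shown below; the base file name "0001.JPG" is an assumed example, as the embodiment specifies only the "STA" and "STB" prefixes.

```python
# Hypothetical helper for the stereo naming convention: "ST" marks a stereo
# shot, "A" the leading camera's image, "B" the subordinate camera's image.
def stereo_filename(base_name: str, leading: bool) -> str:
    return ("STA" if leading else "STB") + base_name

# The leading camera writes its own shot as STA... (step S8) and the image
# received from the subordinate camera as STB... (step S10), so the two files
# can be identified as one pair obtained by stereo shooting.
print(stereo_filename("0001.JPG", leading=True))   # STA0001.JPG
print(stereo_filename("0001.JPG", leading=False))  # STB0001.JPG
```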

  In step S11, the image file recorded in step S8 is transmitted to the connection destination camera via the IEEE 1394 interface IF1 (see FIG. 3C). This completes the mode in which the own camera performs the leading operation.

  In step S12, it is detected whether or not a notification of a half-press operation of the release button has been received from the connection destination camera. When the notification is received, the camera enters a mode in which shooting is performed with the connection destination camera as the leading camera and the own camera as the subordinate camera, and the process proceeds to step S13. On the other hand, if there is no notification, the process returns to step S1.

  In step S13, it is detected whether or not the shooting parameters have been notified from the connection destination camera via the IEEE 1394 interface IF1. If there is a notification, the process proceeds to step S14. On the other hand, if there is no notification, the process stays at step S13 and waits for the notification from the connection destination camera.

  In step S14, the shooting parameters of the own camera are set in accordance with the shooting parameters notified in step S13.

  In step S15, the set parameters are displayed on the liquid crystal panel V1 (see FIG. 4). This display may use a character string or graphic information such as icons. Through this display, the user looking at the liquid crystal panel V1 of the subordinate camera can also see under what conditions shooting will be performed.
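  As a small illustration of step S15, the received parameters could be rendered for the panel as plain text in the manner sketched below; the layout and labels are assumptions, and the embodiment equally allows an icon-based display.

```python
# Hypothetical text rendering of the notified parameters for display on the
# liquid crystal panel V1 of the subordinate camera.
def format_params_for_display(params: dict) -> str:
    return "\n".join(f"{name}: {value}" for name, value in params.items())

print(format_params_for_display(
    {"Exposure": "1/250", "WB": "daylight", "Size": "2048x1536"}))
```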

  In a succeeding step S16, it is detected whether or not a release button full press operation is notified from the connected camera. If a notification is received, the process proceeds to step S17. On the other hand, when there is no notification, it stays at step S16 and waits for a notification from a connection destination camera.

  In step S17, shooting processing is performed based on the set shooting parameters, as in step S7. In step S18, the image file converted into the JPEG file format is recorded on the memory card MC1 in the same manner as in step S8. However, when recording the image file, unlike step S8, the character string "STB" is added to the head of the file name.

  In the subsequent step S19, the image file recorded in step S18 is transmitted to the connection destination camera via the IEEE 1394 interface IF1. After the transmission, the process proceeds to step S20.

  In step S20, the image file taken by the connection destination camera is received via the IEEE 1394 interface IF1.

  In the subsequent step S21, the received image file is transmitted to the SD / MMC card controller B4. The SD / MMC card controller B4 records this image file on the memory card MC1. At this time, the SD / MMC card controller B4 adds the character string “STA” to the head of the file name of the file to be recorded.

  In step S22, the shooting parameters are returned to the values they had before the subordinate operation. This prevents shooting with unintended shooting parameters when normal shooting is performed after stereo shooting ends. This completes the mode in which the own camera performs the subordinate operation.
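  Steps S14 and S22 together amount to saving the own camera's parameters, applying the leading camera's values, and restoring the saved values afterward. A sketch of that pattern, with a hypothetical camera interface, is:

```python
from contextlib import contextmanager

# Hypothetical sketch of steps S14/S22: follow the leading camera's parameters
# for the duration of the subordinate operation, then restore the old values
# so that later normal shooting does not use unintended parameters.
@contextmanager
def subordinate_parameters(camera, remote_params):
    saved = camera.get_params()        # values before the subordinate operation
    camera.set_params(remote_params)   # step S14: adopt the notified values
    try:
        yield
    finally:
        camera.set_params(saved)       # step S22: restore the previous values
```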

  In this way, a stereo image can be easily taken by two cameras by operating only one camera. In addition, since the captured image is automatically written in the recording media of both cameras, it is possible to save the trouble of exchanging images even when the owners of the cameras are different. Furthermore, even when the owner is the same, when images are captured into a PC (personal computer) or the like, it is possible to capture images captured by both cameras from one camera.

  In the present embodiment, the IEEE 1394 interface has been described as an example of the connection interface between the two cameras, but another wired interface may be used, or a wireless interface such as Bluetooth may be used instead.

(Second Embodiment)
The camera according to the present embodiment differs from the camera according to the first embodiment only in its control processing. The hardware of the camera according to the first embodiment, that is, the hardware shown in FIG. 1 described above, is therefore used as it is.

  In this embodiment, the case where two cameras having similar functions are connected by an IEEE 1394 interface and a stereo image is taken is again considered, and the description focuses on how the recording method changes depending on the state of each camera's recording medium.

  FIGS. 5 and 6 are flowcharts showing the procedure of the control processing executed by the camera of the present embodiment, particularly by the system controller B1.

  In FIG. 5, in step S31, it is detected whether or not the release button of the own camera has been half-pressed. If a half-press operation is detected, the camera enters a mode in which shooting is performed with the own camera as the leading camera and the connection destination camera as the subordinate camera, and the process proceeds to step S32. On the other hand, if a half-press operation has not been detected, the process proceeds to step S44 in FIG. 6.

  In step S32, it is confirmed whether or not the recording medium (memory card MC1) of the own camera is in a recordable state. If no recording medium is attached or if the recording capacity of the recording medium is insufficient, it is determined that recording on the recording medium is impossible, and the process proceeds to step S39. On the other hand, if it is determined that recording on the recording medium is possible, the process proceeds to step S33.
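  The recordability check of steps S32 and S45 could be sketched as follows; the free-space threshold and the card interface are assumptions made only for illustration.

```python
# Hypothetical check: the recording medium is treated as non-recordable when
# no card is mounted or when its remaining capacity is insufficient.
MIN_FREE_BYTES = 2 * 1024 * 1024   # assumed size of one compressed image

def medium_recordable(card) -> bool:
    if card is None or not card.is_mounted():
        return False                      # no recording medium attached
    return card.free_bytes() >= MIN_FREE_BYTES
```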

  In step S33, it is detected whether or not the release button of the own camera has been fully pressed. If a full press operation is detected, the process proceeds to step S34. On the other hand, if the full press operation is not detected, the process stays at step S33 and waits for the full press operation of the release button of the own camera.

  In step S34, a photographing process is performed, and in step S35, a recording process to a recording medium is performed. Specifically, the video signal input from the imaging unit C1 is processed to the optimum image quality by the camera video processing circuit B2, and then transmitted to the recording / reproduction data processing circuit B3. The recording / playback data processing circuit B3 compresses the input video signal into a JPEG format or the like, stores it in the image data storage RAM M1, and transmits it to the SD / MMC card controller B4 as a JPEG file format. The SD / MMC card controller B4 writes the input file to the memory card MC1.

  In the subsequent step S36, the image file taken by the connection destination camera is received via the IEEE 1394 interface IF1.

  In step S37, the received image file is transmitted to the SD / MMC card controller B4. The SD / MMC card controller B4 writes this image file into the memory card MC1.

  In the subsequent step S38, the image file recorded in step S35 is transmitted to the connection destination camera via the IEEE 1394 interface IF1. This completes the process for the case where the own camera performs the leading operation and its recording medium is recordable.

  In step S39, it is detected whether or not there is a notification from the connection destination camera indicating that the recording medium attached to the connection destination camera cannot be recorded. If there is such notification, the process proceeds to step S40. On the other hand, if there is no notification, the process proceeds to step S41.

  In step S40, since the recording media of both cameras are in a recording-impossible state, it is determined that photographing is impossible, and a warning display indicating that recording is impossible is performed on the liquid crystal panel V1. Here, the process when the recording media of both cameras are not recordable ends.

  In step S41, it is detected whether or not the release button of the own camera has been fully pressed. If a full press operation is detected, the process proceeds to step S42. On the other hand, if the full press operation is not detected, the process stays at step S41 and waits for the full press operation of the release button of the own camera.

  In step S42, a photographing process is performed. However, since the recording medium is in a non-recordable state, the image file is not written to the recording medium. Therefore, only the processing in which the video signal is compressed into the JPEG format and stored in the image data storage RAM M1 is performed.

  In the subsequent step S43, the JPEG image data stored in the image data storage RAM M1 in step S42 is transmitted to the connection destination camera via the IEEE 1394 interface IF1. As a result, even if the recording medium of the own camera cannot be recorded to, stereo shooting can still be performed as long as the recording medium of the connection destination camera is recordable, and the images of both cameras can be recorded on the connection destination camera. This completes the process for the case where the own camera performs the leading operation and its recording medium is not recordable.
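  The fallback path of steps S42 and S43 (shoot to RAM only, then push the JPEG to the other camera) might be sketched as below, reusing the hypothetical interfaces from the earlier sketches.

```python
# Hypothetical sketch of steps S42-S43: the own recording medium cannot be
# written to, so the compressed image stays in the image data storage RAM M1
# and is transmitted to the connection destination camera instead.
def shoot_without_local_recording(imager, ram_m1, link):
    jpeg = imager.capture_jpeg()   # JPEG data, not written to any card locally
    ram_m1.store(jpeg)             # held temporarily in RAM M1
    link.send("IMAGE", jpeg)       # sent over the IEEE 1394 interface IF1
    return jpeg
```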

  In step S44 of FIG. 6, it is detected whether or not a notification of a half-press operation of the release button has been received from the connection destination camera. If there is a notification, the camera enters a mode in which shooting is performed with the connection destination camera as the leading camera and the own camera as the subordinate camera, and the process proceeds to step S45. On the other hand, if there is no notification, the process returns to step S31 in FIG. 5.

  In step S45, it is confirmed whether or not the recording medium of the own camera is in a recordable state. If the recording medium is not loaded or the recording capacity of the recording medium is insufficient, it is determined that recording on the recording medium is impossible, and the process proceeds to step S52. On the other hand, if it is determined that recording on the recording medium is possible, the process proceeds to step S46.

  In step S46, it is detected whether a release button full-press operation notification is received from the connected camera. If there is a notification, the process proceeds to step S47. On the other hand, if there is no notification, the process stays at step S46 and waits for a notification from the connection destination camera.

  In step S47, photographing processing is performed, and in step S48, recording processing on a recording medium is performed. Since the processes in steps S47 and S48 are the same as the processes in steps S34 and S35, the description thereof is omitted.

  In the subsequent step S49, the image file recorded in step S48 is transmitted to the connection destination camera via the IEEE 1394 interface IF1.

  In step S50, the image file captured by the connection destination camera is received via the IEEE 1394 interface IF1.

  In step S51, the received image file is sent to the SD/MMC card controller B4. The SD/MMC card controller B4 writes this image file to the memory card MC1. This completes the process for the case where the own camera performs the subordinate operation and its recording medium is recordable.

  In step S52, the camera of the connection destination is notified that the recording medium of the own camera is in a state where recording is impossible.

  In a succeeding step S53, it is detected whether or not a release button full press operation is notified from the connected camera. If there is a notification, the process proceeds to step S54. On the other hand, when there is no notification, it stays at step S53 and waits for a notification from a connection destination camera.

  In step S54, a photographing process is performed. However, since the recording medium is in a non-recordable state, the image file is not written to the recording medium. Therefore, only the processing in which the video signal is compressed into the JPEG format and stored in the image data storage RAM M1 is performed.

  In the subsequent step S55, the JPEG image data stored in the image data storage RAM M1 in step S54 is transmitted to the connection destination camera via the IEEE 1394 interface IF1. As a result, even if the recording medium of the own camera cannot be recorded to, stereo shooting can still be performed as long as the recording medium of the connection destination camera is recordable, and the images of both cameras can be recorded on the connection destination camera. This completes the process for the case where the own camera performs the subordinate operation and its recording medium is not recordable.

  In this way, a stereo image can easily be taken by two cameras while operating only one of them. Even if the recording medium of one camera is not recordable, the captured image is sent from the RAM of the camera that cannot record to the camera that can, so the captured images of both cameras can still be left on a recording medium.

  In the present embodiment, the transmission of shooting parameters upon half-pressing the release button and the transmission of the shooting command upon fully pressing it have been omitted from the description, but in practice it is assumed that the same operations as in the first embodiment are performed.

  It goes without saying that the object of the present invention can also be achieved by supplying a system or apparatus with a storage medium storing software program code that realizes the functions of the above-described embodiments, and by having the computer (or CPU or MPU) of the system or apparatus read and execute the program code stored in the storage medium.

  In this case, the program code itself read from the storage medium realizes the novel function of the present invention, and the program code and the storage medium storing the program code constitute the present invention.

  As the storage medium for supplying the program code, for example, a flexible disk, hard disk, magneto-optical disk, CD-ROM, CD-R, CD-RW, DVD-ROM, DVD-RAM, DVD-RW, DVD+RW, magnetic tape, non-volatile memory card, ROM, or the like can be used. Further, the program code may be supplied from a server computer via a communication network.

  Needless to say, the present invention also includes the case where not only are the functions of the above-described embodiments realized by the computer executing the read program code, but also the OS running on the computer performs part or all of the actual processing based on the instructions of the program code, and the functions of the above-described embodiments are realized by that processing.

  It also goes without saying that the present invention includes the case where, after the program code read from the storage medium is written into a memory provided in a function expansion board inserted into the computer or a function expansion unit connected to the computer, the CPU or the like provided in the function expansion board or function expansion unit performs part or all of the actual processing based on the instructions of the program code, and the functions of the above-described embodiments are realized by that processing.

FIG. 1 is a block diagram showing a schematic configuration of an imaging apparatus according to a first embodiment of the present invention.
FIG. 2 is a flowchart showing the procedure of control processing executed by the imaging apparatus of FIG. 1, particularly the system controller.
FIG. 3 is a diagram showing the operation states of the imaging apparatus of FIG. 1 together with the flow of data to another imaging apparatus.
FIG. 4 is a diagram showing an example of the shooting parameters displayed on the liquid crystal panel included in the imaging apparatus of FIG. 1.
FIG. 5 is a flowchart showing the procedure of control processing executed by an imaging apparatus according to a second embodiment of the present invention.
FIG. 6 is a flowchart showing the procedure of control processing executed by the imaging apparatus according to the second embodiment of the present invention.

Explanation of symbols

C1 Imaging unit
B1 System controller
B2 Camera video signal processing circuit
B3 Recording/reproduction data processing circuit
B4 SD/MMC card controller
B5 IEEE 1394 controller
B6 Character signal generation circuit
MC1 Memory card
M1 Image data storage RAM
V1 Liquid crystal panel
IF1 IEEE 1394 interface
SW1 Release button half-press detection switch
SW2 Release button full-press detection switch

Claims (13)

  1. An imaging apparatus comprising: imaging means for shooting a subject and generating captured image data;
    a first operation switch with which a user instructs shooting preparation;
    a second operation switch with which the user instructs shooting;
    communication means for transmitting and receiving information to and from another imaging apparatus; and
    control means for controlling so that, when the user operates the first operation switch, the set shooting parameter values are transmitted to the other imaging apparatus via the communication means, and, when the user operates the second operation switch, a shooting command is transmitted to the other imaging apparatus via the communication means and the subject is shot by the imaging means.
  2. The imaging apparatus according to claim 1, further comprising: storage means for storing the captured image data generated by the imaging means; and
    transmission means for transmitting the captured image data stored in the storage means to the other imaging apparatus via the communication means.
  3. The imaging apparatus according to claim 2, wherein, when the storage means is in a non-recordable state, the captured image data obtained by the imaging means is not stored in the storage means but is transmitted to the other imaging apparatus by the transmission means.
  4. An imaging apparatus comprising: imaging means for shooting a subject and generating captured image data;
    communication means for transmitting and receiving information to and from another imaging apparatus;
    receiving means for receiving, via the communication means, the shooting parameter values set in the other imaging apparatus, a shooting command, and captured image data transmitted from the other imaging apparatus; and
    control means for controlling the imaging means so that, when the shooting command is received by the receiving means, the subject is shot based on the shooting parameter values of the other imaging apparatus already received by the receiving means.
  5. The imaging apparatus according to claim 4, further comprising: storage means for storing the captured image data received by the receiving means and the captured image data generated by the imaging means; and
    transmission means for transmitting the captured image data generated by the imaging means and stored in the storage means to the other imaging apparatus via the communication means.
  6. The imaging apparatus according to claim 5, wherein the storage means stores the captured image data received by the receiving means and the captured image data generated by the imaging means in association with each other.
  7. The imaging apparatus according to claim 5, wherein, when the storage means is in a non-recordable state, the captured image data received by the receiving means is not stored in the storage means, and the captured image data generated by the imaging means is transmitted to the other imaging apparatus without being stored in the storage means.
  8. The imaging apparatus according to claim 4, further comprising display means for displaying the shooting parameter values received by the receiving means.
  9. The imaging apparatus according to claim 1, wherein the first and second operation switches are constituted by the same switch, and the instruction for shooting preparation and the instruction for shooting are switched according to the manner in which the user operates the switch.
  10. A control method for an imaging apparatus, comprising: a shooting step of shooting a subject via imaging means and generating captured image data;
    a communication step of transmitting and receiving information to and from another imaging apparatus via communication means; and
    a control step of controlling so that, when the user operates a first operation switch for instructing shooting preparation, the set shooting parameter values are transmitted to the other imaging apparatus in the communication step, and, when the user operates a second operation switch for instructing shooting, a shooting command is transmitted to the other imaging apparatus in the communication step and the subject is shot in the shooting step.
  11. A control method for an imaging apparatus, comprising: a shooting step of shooting a subject via imaging means and generating captured image data;
    a communication step of transmitting and receiving information to and from another imaging apparatus via communication means;
    a receiving step of receiving, via the communication means, the shooting parameter values set in the other imaging apparatus, a shooting command, and captured image data transmitted from the other imaging apparatus; and
    a control step of controlling the shooting step so that, when the shooting command is received in the receiving step, the subject is shot based on the shooting parameter values of the other imaging apparatus already received in the receiving step.
  12. A program for causing a computer to execute the control method for an imaging apparatus according to claim 10 or 11.
  13.   A computer-readable storage medium storing the program according to claim 12.
JP2005259776A 2005-09-07 2005-09-07 Imaging apparatus, its control method, and program Pending JP2007072210A (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
JP2005259776A JP2007072210A (en) 2005-09-07 2005-09-07 Imaging apparatus, its control method, and program

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
JP2005259776A JP2007072210A (en) 2005-09-07 2005-09-07 Imaging apparatus, its control method, and program

Publications (2)

Publication Number Publication Date
JP2007072210A5 JP2007072210A5 (en) 2007-03-22
JP2007072210A true JP2007072210A (en) 2007-03-22

Family

ID=37933698

Family Applications (1)

Application Number Title Priority Date Filing Date
JP2005259776A Pending JP2007072210A (en) 2005-09-07 2005-09-07 Imaging apparatus, its control method, and program

Country Status (1)

Country Link
JP (1) JP2007072210A (en)


Patent Citations (8)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
JPH03179438A (en) * 1989-12-08 1991-08-05 Minolta Camera Co Ltd Camera capable of stereophotography
JPH10341400A (en) * 1997-06-06 1998-12-22 Nikon Corp Electronic camera having communication function
JP2001230955A (en) * 2000-02-15 2001-08-24 Casio Comput Co Ltd Device and system for picking up image
JP2002171434A (en) * 2000-12-04 2002-06-14 Nikon Corp Electronic camera
JP2003110997A (en) * 2001-07-24 2003-04-11 Olympus Optical Co Ltd Camera and camera system
JP2003228125A (en) * 2002-02-04 2003-08-15 Fuji Photo Film Co Ltd Camera
JP2004297414A (en) * 2003-03-26 2004-10-21 Kyocera Corp Camera having synchronous photographing function and image recorder
JP2005148090A (en) * 2003-11-11 2005-06-09 Canon Inc Imaging device, imaging system, its control method and storage medium

Cited By (2)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
JP2011239379A (en) * 2010-04-30 2011-11-24 Sony Europe Ltd Image capturing system, image capturing apparatus, and image capturing method
US9967548B2 (en) 2011-06-03 2018-05-08 Sony Corporation Stereoscopic video imaging system and synchronous control method

Similar Documents

Publication Publication Date Title
JP4990321B2 (en) Imaging device
US8284270B2 (en) Imaging device, edition device, image processing method, and program
JP4720785B2 (en) Imaging apparatus, image reproducing apparatus, imaging method, and program
EP1575277A1 (en) Camcorder
US6542192B2 (en) Image display method and digital still camera providing rapid image display by displaying low resolution image followed by high resolution image
JP4126640B2 (en) Electronic camera
KR101797041B1 (en) Digital imaging processing apparatus and controlling method thereof
US7417667B2 (en) Imaging device with function to image still picture during moving picture imaging
JP5235798B2 (en) Imaging apparatus and control method thereof
US7417668B2 (en) Digital camera
JP4649980B2 (en) Image editing apparatus, image editing method, and program
KR101355014B1 (en) Information processing apparatus and method, and computer readable recording medium
JP4507392B2 (en) Electronic camera
JP4277837B2 (en) Imaging device
EP1770988B1 (en) Image display apparatus and method
JPH09322106A (en) Image pickup recorder
CN1607824A (en) Image processing system, image processing method, electronic camera, and image processing apparatus
CN1281049C (en) Apparatus and method for generating easy-to edit-and-restor image data
JPH11191873A (en) Electronic camera
JP2006246173A (en) Image processing method and image processor
JP2002101329A (en) Digital camera, image reproduction device and method
JP3768135B2 (en) Digital camera
JP4441882B2 (en) Imaging device, display control method, program
JP4831017B2 (en) Image processing apparatus, developing apparatus, image processing method, developing method, image processing program, and developing program
US7535495B2 (en) Digital camera, control method thereof and portable terminal

Legal Events

Date Code Title Description
RD05 Notification of revocation of power of attorney

Free format text: JAPANESE INTERMEDIATE CODE: A7425

Effective date: 20070626

A521 Written amendment

Free format text: JAPANESE INTERMEDIATE CODE: A523

Effective date: 20080908

A621 Written request for application examination

Free format text: JAPANESE INTERMEDIATE CODE: A621

Effective date: 20080908

A977 Report on retrieval

Free format text: JAPANESE INTERMEDIATE CODE: A971007

Effective date: 20110420

A131 Notification of reasons for refusal

Free format text: JAPANESE INTERMEDIATE CODE: A131

Effective date: 20110426

A521 Written amendment

Free format text: JAPANESE INTERMEDIATE CODE: A523

Effective date: 20110627

A131 Notification of reasons for refusal

Free format text: JAPANESE INTERMEDIATE CODE: A131

Effective date: 20111004

A02 Decision of refusal

Free format text: JAPANESE INTERMEDIATE CODE: A02

Effective date: 20120214