WO2006101112A1 - Imaging device, information processing device, information processing method, program, and program recording medium - Google Patents
- Publication number: WO2006101112A1 (PCT/JP2006/305675)
- Authority: WIPO (PCT)
- Prior art keywords: image, parameter, reproduction parameter, playback, reproduction
Classifications
- H04N5/77: Interface circuits between a recording apparatus and a television camera
- H04N1/387: Composing, repositioning or otherwise geometrically modifying originals
- H04N23/00: Cameras or camera modules comprising electronic image sensors; control thereof
- H04N23/45: Generating image signals from two or more image sensors of different type or operating in different modes, e.g. a CMOS sensor for moving images combined with a CCD for still images
- H04N23/63: Control of cameras or camera modules by using electronic viewfinders
- H04N23/69: Control of means for changing angle of the field of view, e.g. optical zoom objectives or electronic zooming
- H04N5/91: Television signal processing for recording
- H04N9/8205: Multiplexing of an additional signal and the colour video signal for recording
Definitions
- Imaging device, information processing device, information processing method, program, and program recording medium
- The present invention relates to a photographing device, an information processing device, an information processing method, a program, and a program recording medium, and in particular to a photographing device, an information processing device, an information processing method, a program, and a program recording medium capable of providing an optimal image for a user.
- An image (image data) obtained by imaging is encoded by, for example, the DV (Digital Video) system or the MPEG (Moving Picture Experts Group) system, and is recorded on a recording medium such as a disc.
- DV: Digital Video
- MPEG: Moving Picture Experts Group
- A video camera is generally provided with a zoom function.
- With the zoom function, it is possible to change the angle of view (magnification) of the image obtained by imaging, that is, the angle of view of the range shown in the image. For example, if the user performs a zoom-in operation and captures an image, a close-up (enlarged) image with a narrow angle of view is obtained; if the user performs a zoom-out operation, a wide-angle image covering a wide range is obtained.
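The relationship between focal length and angle of view that the zoom operation exploits can be sketched numerically. The formula below (angle of view = 2·atan(sensor width / (2·focal length))) is standard lens geometry, not something stated in the patent; the sensor and focal-length values are illustrative.

```python
import math

def angle_of_view(sensor_width_mm, focal_length_mm):
    """Horizontal angle of view, in degrees, of a simple rectilinear lens.
    Zooming in (a longer focal length) narrows the angle of view."""
    return math.degrees(2 * math.atan(sensor_width_mm / (2 * focal_length_mm)))

# Zoomed out: wide angle of view; zoomed in: narrow (close-up) angle of view.
wide_fov = angle_of_view(36, 24)
tele_fov = angle_of_view(36, 200)
```

With a 36 mm sensor, a 24 mm focal length gives roughly a 74° angle of view while 200 mm gives roughly 10°, matching the wide-range versus close-up behaviour described above.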
- Patent Document 1 JP 2004-282296 A
- When shooting, the user confirms the image displayed on the finder or the like of the video camera and shoots the desired subject. Therefore, other scenes that occur while shooting a certain subject, that is, everything that happens outside the range shown in the image displayed on the finder or the like, is missed.
- the present invention has been made in view of such a situation, and is intended to provide an optimal image for a user.
- The imaging device of the present invention includes a first imaging unit that outputs a first image, and a second imaging unit that outputs a second image that is a wider-angle, higher-resolution image than the first image and in which at least the entire range shown in the first image is reflected.
- The first information processing method of the present invention handles a second image that is a wider-angle, higher-resolution image than the first image and in which at least the entire range shown in the first image is reflected.
- It includes an output control step for outputting the second image together with a reproduction parameter.
- The first program of the present invention likewise handles a second image that is a wider-angle, higher-resolution image than the first image and in which at least the entire range shown in the first image is reflected.
- The program recorded on the first program recording medium of the present invention likewise handles a second image that is a wider-angle, higher-resolution image than the first image and in which at least the entire range shown in the first image is reflected.
- The information processing apparatus of the present invention handles a second image that is a wider-angle, higher-resolution image than the first image and in which at least the entire range shown in the first image is reflected.
- The second image and the reproduction parameter are reproduced from a data recording medium on which the second image is recorded together with a reproduction parameter including at least information indicating the area in which the range shown in the first image is reflected.
- The apparatus includes extraction means for extracting at least a partial image from the second image based on the reproduction parameter and outputting it as an extracted image, reproduction parameter generation means for generating a new reproduction parameter in accordance with the user's operation, and recording control means for recording the new reproduction parameter.
- When a new reproduction parameter is generated, the extraction means extracts the extracted image based on the new reproduction parameter.
- The second information processing method of the present invention reproduces the second image and the reproduction parameter from a data recording medium on which are recorded a second image that is a wider-angle, higher-resolution image than the first image and in which at least the entire range shown in the first image is reflected, together with a reproduction parameter including at least information indicating the area of the second image in which the range shown in the first image is reflected.
- The method includes an extraction step of extracting at least a partial image from the second image based on the reproduction parameter and outputting it as an extracted image, a reproduction parameter generation step for generating a new reproduction parameter in accordance with the user's operation, and a recording control step for recording the new reproduction parameter. In the extraction step, when a new reproduction parameter is generated, the extracted image is extracted based on the new reproduction parameter.
- The second program of the present invention causes the second image and the reproduction parameter to be reproduced from a data recording medium on which are recorded a second image that is a wider-angle, higher-resolution image than the first image and in which at least the entire range shown in the first image is reflected, together with a reproduction parameter including at least information indicating the area of the second image in which the range shown in the first image is reflected.
- The program performs a reproduction parameter generation step for generating a new reproduction parameter in accordance with the user's operation and a recording control step for recording the new reproduction parameter; the extracted image is extracted based on the new reproduction parameter.
- The program recorded on the second program recording medium of the present invention operates in the same way: in response to the user's operation, a reproduction parameter generation step generates a new reproduction parameter and a recording control step records it, and when a new reproduction parameter is generated, an extracted image is extracted based on the new reproduction parameter.
- According to the photographing apparatus, the first information processing method, the first program, and the program recorded on the first program recording medium of the present invention, an area in which the range shown in the first image is reflected is detected from the second image, which is a wider-angle, higher-resolution image than the first image and in which at least the entire range shown in the first image is reflected; a reproduction parameter including at least information representing that area is generated, and the second image and the reproduction parameter are output.
- According to the second information processing method, the second program, and the program recorded on the second program recording medium of the present invention, the second image, which is a wider-angle, higher-resolution image than the first image and in which at least the entire range shown in the first image is reflected, and the reproduction parameter, which includes at least information indicating the area of the second image in which the range shown in the first image is reflected, are reproduced from the data recording medium. Based on the reproduction parameter, an image of at least a partial area of the second image is extracted and output as an extracted image.
- In response to a user operation, a new reproduction parameter is generated and recorded, and when a new reproduction parameter is generated, an extracted image is extracted based on the new reproduction parameter.
- FIG. 1 is a block diagram showing a configuration example of an embodiment of an image processing system to which the present invention is applied.
- FIG. 2 is a block diagram illustrating a configuration example of the photographing apparatus 1.
- FIG. 3 is a flowchart for explaining the operation of the photographing apparatus 1.
- FIG. 4 is a block diagram illustrating a configuration example of the playback device 2.
- FIG. 5 is a diagram for explaining processing of the image extraction unit 22.
- FIG. 6 is a flowchart for explaining the operation of the playback apparatus 2.
- FIG. 7 is a diagram for explaining an editing operation.
- FIG. 8 is a block diagram showing a configuration example of an embodiment of a computer to which the present invention is applied.
- Fig. 1 shows a configuration example of an embodiment of an image processing system to which the present invention is applied (a system being a logical collection of a plurality of devices, regardless of whether each constituent device is in the same casing).
- The imaging device 1 is, for example, a so-called video camera, and records, on the recording medium 4, an image (image data) obtained as a result of imaging according to a user's operation, together with playback parameters used when reproducing the image.
- the reproduction device 2 reproduces the image and the reproduction parameter recorded on the recording medium 4, and extracts a predetermined area (image) from the image reproduced from the recording medium 4 based on the reproduction parameter.
- the display device 3 is configured by, for example, a CRT (Cathode Ray Tube) or an LCD (Liquid Crystal Display), and displays an image from the playback device 2.
- CRT: Cathode Ray Tube
- LCD: Liquid Crystal Display
- the recording medium 4 is, for example, a magnetic tape, a magnetic disk, an optical disk, a magneto-optical disk, a semiconductor memory, or the like, and is detachable from the photographing apparatus 1 and the reproducing apparatus 2.
- The recording medium 4 can be built into the photographing apparatus 1; in this case, the image recorded on the recording medium 4 can be reproduced in the photographing apparatus 1 and transmitted to the reproducing apparatus 2.
- The recording medium 4 can also be built into the playback device 2; in this case, an image or the like obtained by the photographing device 1 is transmitted from the photographing device 1 to the playback device 2 and can be recorded on the recording medium 4 by the playback device 2.
- FIG. 2 shows a configuration example of the photographing apparatus 1 in FIG.
- The normal camera unit 11 incorporates an optical system of the kind found in a general video camera and a photoelectric conversion element (e.g., a CCD (Charge Coupled Device) or CMOS (Complementary Metal Oxide Semiconductor) imager). Under the control of the controller 17, it photoelectrically converts the light incident on it and outputs an image (data).
- CCD: Charge Coupled Device; CMOS: Complementary Metal Oxide Semiconductor
- The normal camera unit 11 has, for example, an optical zoom mechanism; when a zoom-in or zoom-out operation is performed on the operation unit 15, the controller 17 controls the zoom mechanism of the normal camera unit 11 according to that operation.
- The normal camera unit 11 drives its built-in optical system according to the control from the controller 17, thereby imaging a narrow or wide angle of view and outputting an image of that angle of view.
- The image that the normal camera unit 11 captures and outputs is hereinafter referred to as a normal image.
- the normal image output from the normal camera unit 11 is supplied to the normal signal processing unit 12 and the controller 17.
- the normal signal processing unit 12 performs signal processing according to the control from the controller 17 on the normal image from the normal camera unit 11 and supplies the processed image to the display unit 13.
- The signal processing performed by the normal signal processing unit 12 includes, for example, noise reduction and camera shake correction.
- The signal processing performed in the normal signal processing unit 12 can be set, for example, by operating the operation unit 15, and the controller 17 controls the normal signal processing unit 12 according to the setting.
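The patent does not specify the noise-reduction algorithm; as one illustrative possibility, a 3x3 mean filter over a grayscale image (represented here as a list of pixel rows, an assumption of this sketch) smooths out isolated noisy pixels:

```python
def box_filter(img):
    """3x3 mean filter with edge clamping: each output pixel is the
    average of the pixel and its in-bounds neighbours."""
    h, w = len(img), len(img[0])
    out = []
    for y in range(h):
        row = []
        for x in range(w):
            vals = [img[j][i]
                    for j in range(max(0, y - 1), min(h, y + 2))
                    for i in range(max(0, x - 1), min(w, x + 2))]
            row.append(sum(vals) / len(vals))
        out.append(row)
    return out
```

A real camera pipeline would use something more edge-preserving, but the structure (neighbourhood gather, then combine) is the same.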
- The display unit 13 is a monitor (or finder) for the user to check the image (normal image) captured by the normal camera unit 11, and displays the normal image from the normal signal processing unit 12, that is, the normal image after signal processing by the normal signal processing unit 12.
- The wide-angle camera unit 14, under the control of the controller 17, photoelectrically converts the light incident on it and outputs an image (data) to the controller 17 and the recording unit 18.
- The wide-angle camera unit 14 captures an image that has a wider angle and higher resolution than the normal image captured by the normal camera unit 11 and in which at least the entire range shown in the normal image is reflected (hereinafter referred to as a large-capacity image), and incorporates an optical system and a photoelectric conversion element capable of capturing such a large-capacity image.
- Methods for obtaining an image having a wider angle than a normal image include, for example, using a wide-angle lens such as a fisheye lens as the optical system of the wide-angle camera unit 14, or adopting an omnidirectional camera.
- It is also possible to prepare multiple camera units similar to the normal camera unit 11, arrange them so that their imaging directions differ slightly, and stitch together the images obtained by each camera unit to obtain a wide-angle image.
- To obtain a higher resolution, a photoelectric conversion element having smaller pixels and a larger number of pixels than that of the normal camera unit 11 can be employed as the photoelectric conversion element of the wide-angle camera unit 14.
- The normal camera unit 11 and the wide-angle camera unit 14 are disposed in the photographing apparatus 1 so that the wide-angle camera unit 14 can capture at least the entire range captured by the normal camera unit 11.
- When the wide-angle camera unit 14 obtains a wide-angle large-capacity image using a fisheye lens, the captured image is annular (doughnut-shaped); the wide-angle camera unit 14 therefore converts the annular image into a panoramic image and outputs it.
- The large-capacity image is thus a wide-angle, high-resolution image compared with a normal image.
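The annular-to-panoramic conversion mentioned above can be sketched as a polar resampling: each output column corresponds to an angle around the ring, each output row to a radius between the outer and inner edges of the annulus. The nearest-neighbour sampling and the list-of-rows image representation are simplifying assumptions of this sketch.

```python
import math

def unwarp_annular(img, cx, cy, r_in, r_out, out_w, out_h):
    """Map an annular (doughnut-shaped) fisheye image to a rectangular
    panoramic image by sampling along radial lines (nearest neighbour)."""
    pano = []
    for j in range(out_h):
        # j = 0 samples the outer ring, j = out_h - 1 the inner ring
        r = r_out - (r_out - r_in) * j / max(out_h - 1, 1)
        row = []
        for i in range(out_w):
            theta = 2 * math.pi * i / out_w
            x = int(round(cx + r * math.cos(theta)))
            y = int(round(cy + r * math.sin(theta)))
            row.append(img[y][x])
        pano.append(row)
    return pano
```

A production implementation would interpolate between source pixels and correct for the lens's radial distortion profile rather than assume a linear radius mapping.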
- The operation unit 15 is operated by the user, for example, to start or stop recording an image on the recording medium 4, or to command zoom-in or zoom-out in the normal camera unit 11, and supplies an operation signal corresponding to the operation to the controller 17.
- The operations of the operation unit 15 that command zoom-in and zoom-out are hereinafter called the zoom-in operation and the zoom-out operation, respectively.
- The zoom-in operation and the zoom-out operation are collectively referred to as a zoom operation as appropriate.
- The sensor 16 is, for example, an acceleration sensor; it senses movement of the photographing apparatus 1 and supplies the sensed information to the controller 17.
- The controller 17 includes a CPU (Central Processing Unit) 17A, a RAM (Random Access Memory) 17B, an EEPROM (Electrically Erasable Programmable Read Only Memory) 17C, and the like; the CPU 17A executes a program stored in the EEPROM 17C, thereby controlling each part of the photographing apparatus 1.
- CPU: Central Processing Unit
- RAM: Random Access Memory
- EEPROM: Electrically Erasable Programmable Read Only Memory
- the CPU 17A loads the program stored in the EEPROM 17C into the RAM 17B and executes it, thereby performing various processes.
- the RAM 17B temporarily stores programs executed by the CPU 17A and data necessary for the operation of the CPU 17A.
- The EEPROM 17C stores programs to be executed by the CPU 17A and data that must be retained even after the imaging device 1 is turned off.
- The program to be executed by the CPU 17A can be installed in the EEPROM 17C in advance, or can be stored (recorded) temporarily or permanently on the removable recording medium 111 described later and installed in the EEPROM 17C from the removable recording medium 111.
- The program can also be transferred to the photographing device 1 wirelessly from a download site via an artificial satellite for digital satellite broadcasting, or by wire via a network such as a LAN (Local Area Network) or the Internet; the photographing apparatus 1 can receive the program transferred in this way through a communication I/F (Interface) (not shown) and install it in the EEPROM 17C.
- the controller 17 controls the normal camera unit 11, the wide-angle camera unit 14, the recording unit 18, and the like.
- The controller 17 detects, from the large-capacity image from the wide-angle camera unit 14, the area in which the range shown in the normal image from the normal camera unit 11 is reflected (hereinafter referred to as the area corresponding to the normal image, as appropriate), and generates playback parameters that include at least information representing that area.
- Specifically, the controller 17 performs matching between the normal image and the large-capacity image while changing the size of the normal image and the position on the large-capacity image against which it is matched. The controller 17 then finds the size of the normal image and the position on the large-capacity image at which the two images match most closely, for example the size and position that minimize the sum of the absolute differences between the pixel values of each pixel of the normal image and the pixel of the large-capacity image at the same position, and detects the area of the large-capacity image specified by that size and position as the area corresponding to the normal image.
- Alternatively, the controller 17 can obtain the approximate position of the area corresponding to the normal image in the large-capacity image from the positional relationship between the normal camera unit 11 and the wide-angle camera unit 14, obtain the approximate size of that area from the state of the zoom mechanism of the normal camera unit 11, and then perform the matching while changing the position and size of the normal image around that approximate position and size.
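The matching step can be sketched as an exhaustive template search minimizing the sum of absolute differences (SAD). For brevity this sketch searches only over position, omitting the size (scale) search the patent also describes, and assumes grayscale images represented as lists of pixel rows.

```python
def sad(normal, wide, top, left):
    """Sum of absolute differences between the normal image and the
    equally sized window of the wide image anchored at (top, left)."""
    total = 0
    for y, row in enumerate(normal):
        for x, p in enumerate(row):
            total += abs(p - wide[top + y][left + x])
    return total

def find_region(normal, wide):
    """Exhaustively try every placement of the normal image inside the
    wide image; return (top, left) of the best (lowest-SAD) match."""
    h, w = len(normal), len(normal[0])
    best, best_pos = None, (0, 0)
    for top in range(len(wide) - h + 1):
        for left in range(len(wide[0]) - w + 1):
            s = sad(normal, wide, top, left)
            if best is None or s < best:
                best, best_pos = s, (top, left)
    return best_pos
```

Seeding the search from the approximate position and size, as the alternative above describes, simply narrows the two `range` loops around that estimate.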
- The controller 17 also recognizes, based on the information from the sensor 16, that the photographing apparatus 1 has been panned or tilted, and the reproduction parameter generated by the controller 17 can include information that the photographing apparatus 1 has been panned or tilted, as well as information indicating the state of the zoom mechanism of the normal camera unit 11.
- Under the control of the controller 17, the recording unit 18 outputs the large-capacity image from the wide-angle camera unit 14 and the reproduction parameter from the controller 17 to the recording medium 4 for recording.
- The normal camera unit 11 and the wide-angle camera unit 14 start imaging, whereby the normal camera unit 11 starts outputting a normal image obtained by imaging and the wide-angle camera unit 14 starts outputting a large-capacity image obtained by imaging.
- the normal image output from the normal camera unit 11 is supplied to the normal signal processing unit 12 and the controller 17, and the large-capacity image output from the wide-angle camera unit 14 is supplied to the controller 17 and the recording unit 18. .
- the normal signal processing unit 12 performs signal processing on the normal image from the normal camera unit 11 and supplies it to the display unit 13.
- the display unit 13 displays the normal image as a so-called through image. For example, the user specifies a target to be photographed by looking at a normal image displayed on the display unit 13.
- In step S1, the controller 17 controls the recording unit 18 to start recording, on the recording medium 4, the large-capacity image output from the wide-angle camera unit 14, and the process proceeds to step S2.
- In step S2, the controller 17 determines whether the operation unit 15 has been operated by the user so as to start recording (a recording operation). If it is determined in step S2 that the recording operation has not been performed, the process proceeds to step S3, and the controller 17 determines whether the operation unit 15 has been operated by the user so as to turn off the power (a power-off operation).
- If it is determined in step S3 that the power-off operation has not been performed, the process returns to step S2, and the same processing is repeated; if it is determined that the power-off operation has been performed, the process proceeds to step S9.
- If it is determined in step S2 that the recording operation has been performed, that is, if the user, viewing the normal image displayed on the display unit 13, has performed a recording operation to start recording the scene shown in the normal image, the process proceeds to step S4, where the controller 17 starts generating playback parameters using the normal image output by the normal camera unit 11 and the large-capacity image output by the wide-angle camera unit 14.
- That is, the controller 17 detects the area corresponding to the normal image from the normal camera unit 11 within the large-capacity image from the wide-angle camera unit 14, and generates playback parameters including information indicating that area and information indicating the content of the signal processing applied to the normal image by the normal signal processing unit 12.
- In step S5, the controller 17 starts recording, on the recording medium 4, the reproduction parameters whose generation was started in step S4. That is, the controller 17 controls the recording unit 18 to start recording the reproduction parameters generated by the controller 17 on the recording medium 4.
- Here, it is assumed that the normal image output from the normal camera unit 11 and the large-capacity image output from the wide-angle camera unit 14 have the same frame rate, and that frames (image data) of the normal image and the large-capacity image are output at the same timing (phase). In this case, a playback parameter is generated for each frame, and the playback parameter for each frame is recorded on the recording medium 4.
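A per-frame playback parameter of the kind described above might be modelled as a small record. The field names and the JSON-lines serialization are illustrative assumptions; the patent requires only that, for each frame, at least the information identifying the region corresponding to the normal image (optionally with signal-processing, pan/tilt, and zoom-state information) be recorded.

```python
import json
from dataclasses import dataclass, asdict

@dataclass
class PlaybackParameter:
    frame: int        # frame index shared by the normal and large-capacity images
    left: int         # upper-left corner of the region corresponding to
    top: int          #   the normal image, inside the large-capacity image
    width: int        # horizontal length of that region
    height: int       # vertical length of that region
    processing: str   # signal processing applied to the normal image

def serialize(params):
    """Encode a sequence of per-frame playback parameters as JSON lines."""
    return "\n".join(json.dumps(asdict(p)) for p in params)
```

Whether such records are multiplexed into the image file or kept as a separate file is left open by the description below; a line-per-frame stream works for either.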
- When recording of playback parameters is started in step S5, the process proceeds to step S6, and the controller 17 determines whether the operation unit 15 has been operated by the user so as to stop recording (a stop operation). If it is determined in step S6 that the stop operation has not been performed, the process proceeds to step S7, and the controller 17 determines whether the operation unit 15 has been operated so as to turn off the power.
- If it is determined in step S7 that the power-off operation has not been performed, the process returns to step S6, and the same processing is repeated; if it is determined that the power-off operation has been performed, the process proceeds to step S9. If it is determined in step S6 that the stop operation has been performed, the process proceeds to step S8, and the controller 17 stops the generation of playback parameters started in step S4 and the recording of playback parameters started in step S5. The process then returns from step S8 to step S2, and the same processing is repeated.
- In step S9, the controller 17 controls the recording unit 18 to stop the recording of the large-capacity image started in step S1, further stops the imaging by the normal camera unit 11 and the wide-angle camera unit 14, and ends the processing.
- A large-capacity image thus continues to be recorded on the recording medium 4 while the power is on, whereas the reproduction parameters are recorded only from the recording operation until the stop operation is performed.
- As described above, the large-capacity image is a wide-angle image in which at least the entire range shown in the normal image is reflected, and therefore also shows what lies outside the range shown in the normal image. Furthermore, the large-capacity image is recorded even before the recording operation is performed, and continues to be recorded even after the stop operation is performed.
- Thus, in the photographing apparatus, a large-capacity image showing not only the scene the user was aware of recording (the scene shown in the normal image) but also other scenes is recorded on the recording medium 4 together with the reproduction parameters.
- The normal image is subjected to signal processing such as camera shake correction and noise reduction in the normal signal processing unit 12, but such signal processing is not applied to the large-capacity image. Therefore, a large-capacity image to which such signal processing has not been applied is recorded on the recording medium 4.
- The large-capacity image and the playback parameters recorded on the recording medium 4 may be multiplexed and recorded as one file, or may be recorded as separate files.
- FIG. 4 shows a configuration example of the playback device 2 of FIG.
- In response to an operation of the operation unit 26 by the user, for example, the recording/playback unit 21 plays back the large-capacity image and the playback parameters from the recording medium 4, supplies the large-capacity image to the image extraction unit 22, and supplies the playback parameters to the playback parameter processing unit 24.
- The recording/playback unit 21 also records reproduction parameters supplied from the reproduction parameter processing unit 24 on the recording medium 4.
- Based on the playback parameter supplied from the playback parameter processing unit 24, the image extraction unit 22 extracts an image of at least a partial area from the large-capacity image supplied from the recording/playback unit 21, and outputs it (hereinafter referred to as an extracted image, as appropriate) to the signal processing unit 23.
- Here, the playback parameter includes information representing a rectangular area on the large-capacity image, for example the horizontal and vertical lengths of the rectangular area and the coordinates of its upper-left point on the large-capacity image.
- Based on the playback parameter, the image extraction unit 22 identifies the rectangular area (hereinafter referred to as the extraction window as appropriate) represented by the information included in that parameter. The image extraction unit 22 then extracts the image in the extraction window from the large-capacity image as an extracted image, and supplies the extracted image to the signal processing unit 23.
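- As an illustrative sketch, not part of the original disclosure, the window extraction described above can be written in Python with NumPy; the dictionary keys (`left`, `top`, `width`, `height`) are a hypothetical encoding of the information the playback parameter carries:

```python
import numpy as np

def extract_window(large_image: np.ndarray, param: dict) -> np.ndarray:
    """Extract the rectangular extraction window described by a playback parameter.

    `param` holds the upper-left coordinates and the horizontal/vertical
    lengths of the extraction window on the large-capacity image.
    """
    x, y = param["left"], param["top"]
    w, h = param["width"], param["height"]
    return large_image[y:y + h, x:x + w]

large = np.arange(100).reshape(10, 10)  # stand-in for a large-capacity image
param = {"left": 2, "top": 3, "width": 4, "height": 2}
window = extract_window(large, param)
print(window.shape)  # (2, 4)
```

In practice the extracted image would then be converted to the display pixel count, as described below.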
- As described above, the playback parameter generated by the controller 17 and recorded on the recording medium 4 includes information indicating the area corresponding to the normal image in the large-capacity image. For this playback parameter, therefore, the area represented by that information, that is, the area corresponding to the normal image in the large-capacity image, is identified as the extraction window.
- Based on the playback parameter supplied from the playback parameter processing unit 24, the signal processing unit 23 performs signal processing on the extracted image supplied from the image extraction unit 22 and supplies the result to the display device 3. As a result, the display device 3 displays the extracted image that has been subjected to the signal processing by the signal processing unit 23.
- The playback parameter processing unit 24 supplies the playback parameters supplied from the recording/playback unit 21 to the image extraction unit 22 and the signal processing unit 23. It also generates playback parameters in accordance with operations of the operation unit 26 by the user and supplies them to the image extraction unit 22 and the signal processing unit 23. Further, the playback parameter processing unit 24 supplies the playback parameters supplied from the recording/playback unit 21, or the playback parameters generated in accordance with operations of the operation unit 26, to the playback parameter storage unit 25 for storage. The playback parameter processing unit 24 also reads the playback parameters stored in the playback parameter storage unit 25 and supplies them to the recording/playback unit 21 to be recorded on the recording medium 4.
- the reproduction parameter storage unit 25 temporarily stores the reproduction parameters supplied from the reproduction parameter processing unit 24.
- The operation unit 26 is operated by the user to instruct playback or editing of an image (large-capacity image) recorded on the recording medium 4, and supplies an operation signal corresponding to the operation to the recording/playback unit 21 or the playback parameter processing unit 24.
- As described above, the photographing apparatus 1 records on the recording medium 4, together with the playback parameters, a large-capacity image showing not only the range of the scene that the user recognized as being photographed (recorded) (the scene shown in the normal image) but also scenes in other ranges. Hereinafter, the playback parameters recorded on the recording medium 4 by the imaging device 1 are referred to as the playback parameters at the time of shooting.
- As described above, information indicating the area corresponding to the normal image in the large-capacity image is included in the playback parameters at the time of shooting. Therefore, by using the area represented by that information as the extraction window and extracting and displaying the image in the extraction window from the large-capacity image as the extracted image, an image showing the same scene as the normal image can be displayed, as shown in FIG. 5.
- When the photographing device 1 shoots toward the wide-angle side, the angle of view of the normal image obtained by the shooting is wide. Accordingly, the area corresponding to the normal image in the large-capacity image is large, as indicated by R in FIG. 5, and the extraction window is also large in size. Conversely, when the photographing device 1 shoots toward the telephoto side, the angle of view of the normal image is narrow, so the area corresponding to the normal image is small and the extraction window is also small in size.
- Therefore, the number of pixels of the extracted image extracted from the large-capacity image using such an extraction window is not necessarily constant. Accordingly, the image extraction unit 22 needs to convert the extracted image extracted from the large-capacity image into an image with a predetermined number of pixels that can be displayed on the display device 3.
- This conversion can be performed, for example, by class classification adaptive processing, which converts a first image into a second image. In the class classification adaptive processing, each pixel of the second image is taken in turn as the pixel of interest, and the pixel of interest (its pixel value) is obtained as follows. Pixels of the first image are selected as a prediction tap to be used for predicting the pixel of interest, and the pixel value (predicted value) of the pixel of interest is obtained by a predetermined prediction computation using the prediction tap and tap coefficients, which are coefficients obtained for each predetermined class by learning.
- When, for example, a linear first-order prediction computation is adopted as the predetermined prediction computation, the pixel value y of a pixel of the second image is obtained by the following linear first-order expression:

  y = Σ[n=1..N] w_n x_n   ... (1)

- In Equation (1), x_n represents the pixel value of the nth pixel of the first image constituting the prediction tap for the pixel (pixel value) y of the second image, and w_n represents the nth tap coefficient multiplied by that pixel value. Equation (1) assumes that the prediction tap consists of N pixels (pixel values) x_1, x_2, ..., x_N of the first image.
- Note that the pixel (pixel value) y of the second image can also be obtained not by the linear first-order expression shown in Equation (1) but by a higher-order expression of second order or higher.
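- A minimal numeric sketch of the prediction computation of Equation (1); the tap values and coefficients below are illustrative only (an averaging kernel chosen for the example, not taken from the disclosure):

```python
import numpy as np

# Prediction tap: N pixel values x_1..x_N taken from the first image
# around the position of the pixel of interest (illustrative values).
x = np.array([96.0, 100.0, 104.0, 100.0])

# Tap coefficients w_1..w_N for the class of the pixel of interest,
# obtained beforehand by learning (here a simple averaging kernel).
w = np.array([0.25, 0.25, 0.25, 0.25])

# Equation (1): y = sum over n of w_n * x_n
y = float(np.dot(w, x))
print(y)  # 100.0
```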
- Now, when the true value of the pixel value of the kth pixel (sample) of the second image is denoted y_k, and the predicted value of y_k obtained by Equation (1) is denoted y_k', the prediction error e_k is expressed as

  e_k = y_k - y_k'   ... (2)

- Since the predicted value y_k' is obtained according to Equation (1), substituting Equation (1) into y_k' of Equation (2) yields

  e_k = y_k - (Σ[n=1..N] w_n x_{n,k})   ... (3)

- In Equation (3), x_{n,k} represents the nth pixel of the first image constituting the prediction tap for the kth pixel of the second image. A tap coefficient w_n that makes the prediction error e_k of Equation (3) (or Equation (2)) zero is optimal for predicting the pixels of the second image, but it is generally difficult to find such a tap coefficient w_n for every pixel of the second image. Therefore, if, for example, the least squares method is adopted as the criterion by which the tap coefficient w_n is judged optimal, the optimal tap coefficient w_n can be obtained by minimizing the sum E of square errors expressed by the following equation:

  E = Σ[k=1..K] e_k^2   ... (4)

- In Equation (4), K represents the number of samples (learning samples) of sets of a pixel y_k of the second image and the pixels x_{1,k}, x_{2,k}, ..., x_{N,k} of the first image constituting the prediction tap for that pixel.
- Setting to zero the partial derivatives of the sum E in Equation (4) with respect to each tap coefficient w_n (Equations (5) to (7)) leads to the normal equation shown in Equation (8).
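- The intermediate Equations (5) to (7) are not reproduced in this text; assuming the standard least-squares derivation, they can be reconstructed as follows:

```latex
% Minimising E of Equation (4): the partial derivative with respect to
% each tap coefficient w_n must vanish.
\frac{\partial E}{\partial w_n}
  = \sum_{k=1}^{K} 2\,\frac{\partial e_k}{\partial w_n}\, e_k
  = -2 \sum_{k=1}^{K} x_{n,k}\, e_k = 0
  \qquad (n = 1, 2, \ldots, N)
% Substituting e_k = y_k - \sum_{n'=1}^{N} w_{n'} x_{n',k} and
% rearranging gives the normal equation of Equation (8):
\sum_{k=1}^{K} x_{n,k} \Bigl( \sum_{n'=1}^{N} x_{n',k}\, w_{n'} \Bigr)
  = \sum_{k=1}^{K} x_{n,k}\, y_k
```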
- For the normal equation of Equation (8), a large number of sets of a pixel y_k of the second image and the pixels x_{1,k}, x_{2,k}, ..., x_{N,k} of the first image constituting the prediction tap for that pixel are prepared as learning samples, and the normal equation of Equation (8) is constructed and solved for each class using the learning samples. The learning process of finding, for each class, the tap coefficient w_n that minimizes the sum E of square errors is thus performed in advance, and the first image is converted into the second image by performing the prediction computation of Equation (1) using those tap coefficients.
- That is, in the learning process, each pixel y_k of the second image is taken in turn as the pixel of interest, and the pixel of interest y_k is classified into one of several classes. This class classification is performed based on, for example, the distribution of the pixel values of several pixels of the first image that are spatially or temporally close to the pixel of interest y_k.
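- The disclosure does not fix a particular classification method here; one common choice in such processing is a 1-bit ADRC-style coding of the class tap, sketched below purely as an illustrative assumption:

```python
import numpy as np

def class_code(class_tap: np.ndarray) -> int:
    """Map the pixel values of a class tap to a class number.

    Illustrative 1-bit coding: each tap pixel is compared with the
    mid-level between the tap's minimum and maximum, and the resulting
    bits are packed into an integer, giving 2**len(class_tap) classes.
    """
    lo, hi = class_tap.min(), class_tap.max()
    mid = (lo + hi) / 2.0
    code = 0
    for value in class_tap:
        code = (code << 1) | int(value >= mid)
    return code

tap = np.array([10, 200, 30, 180])
print(class_code(tap))  # bits 0,1,0,1 -> 5
```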
- Furthermore, in the learning process, the normal equation of Equation (8) is constructed for each class using, as learning samples, the pixels of interest (pixel values) y_k belonging to that class and the pixels of the first image constituting their prediction taps, over all of the learning samples. The tap coefficients w_n for each class can then be obtained by solving the normal equation for each class.
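- The per-class solution of the normal equation can be sketched as an ordinary least-squares solve; the synthetic samples below (with an exactly linear relation) are illustrative only:

```python
import numpy as np

def learn_tap_coefficients(X: np.ndarray, y: np.ndarray) -> np.ndarray:
    """Solve the normal equation of Equation (8) for one class.

    X has one row per learning sample (the N prediction-tap pixel
    values x_{1,k}..x_{N,k}); y holds the corresponding second-image
    pixels y_k. Minimising E = sum_k (y_k - sum_n w_n x_{n,k})**2
    leads to (X^T X) w = X^T y, solved here directly.
    """
    A = X.T @ X
    b = X.T @ y
    return np.linalg.solve(A, b)

# Synthetic learning samples where the true relation is y = 0.5*x1 + 0.5*x2
rng = np.random.default_rng(0)
X = rng.uniform(0, 255, size=(100, 2))
y = 0.5 * X[:, 0] + 0.5 * X[:, 1]
w = learn_tap_coefficients(X, y)
print(np.round(w, 6))  # approximately [0.5 0.5]
```

In a full implementation this solve would be repeated once per class, with the samples routed to classes by the classification step above.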
- In the class classification adaptive processing, the pixel (pixel value) of the second image is then obtained as follows. Each pixel of the second image to be obtained is taken in turn as the pixel of interest, and the pixel of interest is classified in the same manner as in the learning process, thereby obtaining the class to which the pixel of interest belongs. Then, using the tap coefficients w_n of that class and the pixels x_1, x_2, ..., x_N of the first image constituting the prediction tap for the pixel of interest, the prediction computation of Equation (1) is performed to obtain the pixel value (predicted value) of the pixel of interest.
- Here, if, for example, an image obtained by thinning out the pixels of the second image is used as the first image in the learning process, tap coefficients w_n can be obtained that, in the class classification adaptive processing, convert an image into an image having a larger number of pixels than that image.
- Conversely, if the second image used in the learning process is an image obtained by thinning out the pixels of the first image, tap coefficients w_n can be obtained that, in the class classification adaptive processing, convert an image into an image having fewer pixels than that image.
- Also, if, for example, a noise-free image is used as the second image and an image obtained by superimposing noise on that image is used as the first image in the learning process, tap coefficients w_n can be obtained that, in the class classification adaptive processing, convert an image into an image from which the noise has been removed.
- As described above, in the class classification adaptive processing, depending on the tap coefficients w_n obtained by the learning process, signal processing such as noise removal (noise reduction) can be performed in addition to conversion of the number of pixels of an image.
- Note that the class classification adaptive processing can also be used in the photographing apparatus 1 to convert the above-described annular image into a perspective-projection image.
- The image extraction unit 22 performs the image conversion processing for converting an extracted image extracted from the large-capacity image into an extracted image having a predetermined number of pixels that can be displayed on the display device 3, for example by the class classification adaptive processing described above, and supplies the resulting extracted image to the signal processing unit 23.
- First, in step S21, the playback parameter processing unit 24 determines whether or not the operation mode of the playback device 2 is the playback mode.
- the operation modes of the playback device 2 include, for example, a playback mode for playing back images recorded on the recording medium 4 and an editing mode for editing images.
- the operation mode of the playback device 2 can be selected (set) by the user operating the operation unit 26, for example.
- If it is determined in step S21 that the operation mode of the playback device 2 is the playback mode, the process proceeds to step S22, and playback mode processing is performed thereafter.
- In step S22, the playback parameter processing unit 24 selects a parameter set used to play back an image recorded on the recording medium 4.
- the parameter set is a set of a series of playback parameters.
- That is, when a large-capacity image captured by the photographing apparatus 1 is recorded on the recording medium 4, at least the parameter set that is the series of playback parameters (the playback parameters at the time of shooting) generated by the controller 17 (FIG. 2) of the photographing device 1 when the large-capacity image was captured is recorded on the recording medium 4.
- One or more parameter sets can be recorded on the recording medium 4.
- a parameter set used for image reproduction is selected from the one or more parameter sets.
- That is, the playback parameter processing unit 24 controls the recording/playback unit 21 to recognize the file names of the parameter sets recorded on the recording medium 4. Further, the playback parameter processing unit 24 controls the signal processing unit 23 to display on the display device 3 a list of the file names of the parameter sets recorded on the recording medium 4, together with a message prompting the user to select one of them.
- When the user operates the operation unit 26 to specify one of the file names in the list, the playback parameter processing unit 24 selects the parameter set identified by that file name as the parameter set used for image playback (hereinafter referred to as the playback set as appropriate).
- After the playback set is selected in step S22, the process proceeds to step S23, where the playback parameter processing unit 24 controls the recording/playback unit 21 to start playback of the large-capacity image recorded on the recording medium 4 and of the playback set.
- As a result, the large-capacity image and the playback set are played back (read) from the recording medium 4; the large-capacity image is supplied to the image extraction unit 22, and the playback parameters of the playback set are supplied to the playback parameter processing unit 24.
- Note that a playback parameter may exist for each frame of the large-capacity image. Unless otherwise specified, playback of the large-capacity image starts from the temporally earliest frame for which a playback parameter exists.
- When playback of the playback parameters of the playback set is started in step S23 and the supply of playback parameters from the recording/playback unit 21 thereby begins, the playback parameter processing unit 24 starts supplying those playback parameters to the image extraction unit 22 and the signal processing unit 23, and the process proceeds to step S24.
- In step S24, the image extraction unit 22 starts extracting extracted images from the large-capacity image supplied from the recording/playback unit 21, based on the playback parameters supplied from the playback parameter processing unit 24.
- That is, the large-capacity image is supplied from the recording/playback unit 21 to the image extraction unit 22, for example in units of frames, and the image extraction unit 22 takes each frame of the large-capacity image supplied from the recording/playback unit 21 in turn as the frame of interest, extracts an extracted image from the frame of interest, and supplies it to the signal processing unit 23. Specifically, the image extraction unit 22 extracts from the frame of interest, as an extracted image, the image in the extraction window specified by the information included in the playback parameter corresponding to the frame of interest (hereinafter referred to as the corresponding parameter), performs the image conversion processing for converting the number of pixels as described above if necessary, and supplies the result to the signal processing unit 23.
- If there is no corresponding parameter for the frame of interest, the image extraction unit 22 extracts the extracted image from the frame of interest based on the corresponding parameter of the frame that has a corresponding parameter and is temporally closest to the frame of interest among the preceding frames.
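- The fallback to the nearest preceding frame that has a parameter can be sketched as follows; the dictionary encoding (frame number to parameter) is a hypothetical stand-in for how the parameters might be keyed:

```python
def corresponding_parameter(params: dict, frame: int):
    """Return the playback parameter to use for `frame`.

    `params` maps frame numbers to playback parameters; not every frame
    needs an entry. When the frame of interest has no corresponding
    parameter, the parameter of the closest preceding frame that has
    one is used, as described above.
    """
    if frame in params:
        return params[frame]
    earlier = [f for f in params if f < frame]
    if not earlier:
        raise KeyError("no parameter at or before this frame")
    return params[max(earlier)]

params = {0: "P0", 10: "P10", 25: "P25"}
print(corresponding_parameter(params, 10))  # P10
print(corresponding_parameter(params, 17))  # P10 (nearest preceding frame)
```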
- After step S24, the process proceeds to step S25, where the signal processing unit 23 starts signal processing, based on the playback parameter (corresponding parameter) supplied from the playback parameter processing unit 24, on the extracted images (frames) supplied from the image extraction unit 22.
- That is, the signal processing unit 23 performs, on the extracted image supplied from the image extraction unit 22, signal processing of the content represented by the information included in the playback parameter supplied from the playback parameter processing unit 24, and supplies the result to the display device 3.
- As described above, an extracted image is extracted from the large-capacity image based on the playback parameters of the playback set, and the extracted image is further subjected to signal processing based on those playback parameters, so that image content (an image stream) corresponding to the playback parameters of the playback set is displayed on the display device 3.
- Therefore, when, for example, the parameter set of the playback parameters at the time of shooting is selected as the playback set in step S22, an image similar to the normal image, that is, an image showing the same scene as the normal image obtained when the user shot (recorded) with the photographing device 1 (the same image that was displayed on the display unit 13 of the photographing device 1 during shooting), is displayed on the display device 3.
- After the processing of step S25, the process proceeds to step S26, where the playback parameter processing unit 24 determines whether or not the user has operated the operation unit 26 to stop playback (a stop operation). If it is determined in step S26 that the stop operation has not been performed, the process returns to step S26, and the same determination is repeated.
- If it is determined in step S26 that the stop operation has been performed, the process proceeds to step S27, where the playback parameter processing unit 24 controls the recording/playback unit 21, the image extraction unit 22, and the signal processing unit 23 to stop all of the playback of the large-capacity image and the playback parameters started in step S23, the extraction of extracted images started in step S24, and the signal processing on the extracted images started in step S25. Then, the process returns to step S21, and the same processing is repeated thereafter.
- On the other hand, if it is determined in step S21 that the operation mode of the playback device 2 is not the playback mode, the process proceeds to step S28, where the playback parameter processing unit 24 determines whether or not the operation mode of the playback device 2 is the edit mode.
- If it is determined in step S28 that the operation mode of the playback device 2 is not the edit mode, the process returns to step S21, and the same processing is repeated thereafter.
- If it is determined in step S28 that the operation mode of the playback device 2 is the edit mode, the process proceeds to step S29, and edit mode processing is performed thereafter.
- In step S29, the playback parameter processing unit 24 selects, in the same manner as in step S22, a parameter set (playback set) used to play back the image content serving as the basis for editing, and the process proceeds to step S30.
- In step S30, the playback parameter processing unit 24 controls the recording/playback unit 21 to start playback of the large-capacity image recorded on the recording medium 4 and of the playback set, as in step S23. As a result, the large-capacity image and the playback set are played back (read) from the recording medium 4; the large-capacity image is supplied to the image extraction unit 22, and the playback parameters of the playback set are supplied to the playback parameter processing unit 24.
- When playback of the playback parameters of the playback set is started in step S30 and the supply of playback parameters from the recording/playback unit 21 thereby begins, the playback parameter processing unit 24 starts supplying those playback parameters to the image extraction unit 22, the signal processing unit 23, and the playback parameter storage unit 25, and the process proceeds to step S31.
- In step S31, the playback parameter storage unit 25 starts storing the playback parameters supplied from the playback parameter processing unit 24, and the process proceeds to step S32.
- In step S32, the image extraction unit 22 starts extracting extracted images from the large-capacity image supplied from the recording/playback unit 21, based on the playback parameters supplied from the playback parameter processing unit 24, as in step S24, and the process proceeds to step S33.
- That is, the image extraction unit 22 extracts from each frame of the large-capacity image supplied from the recording/playback unit 21, as an extracted image, the image in the extraction window specified by the information included in the playback parameter supplied from the playback parameter processing unit 24, performs the image conversion processing for converting the number of pixels if necessary, and supplies the result to the signal processing unit 23.
- In step S33, the signal processing unit 23 starts signal processing, as in step S25, on the extracted images supplied from the image extraction unit 22, based on the playback parameters supplied from the playback parameter processing unit 24. That is, the signal processing unit 23 performs, on the extracted image supplied from the image extraction unit 22, signal processing of the content represented by the information included in the playback parameter supplied from the playback parameter processing unit 24, and supplies the result to the display device 3.
- As described above, in the edit mode, processing similar to that in the playback mode is performed except that the playback parameters are stored in the playback parameter storage unit 25 (step S31), whereby the image content corresponding to the playback set is displayed on the display device 3.
- Thereafter, the process proceeds to step S34, where the playback parameter processing unit 24 determines whether or not the user has operated the operation unit 26 so as to command editing (an editing operation).
- If it is determined in step S34 that no editing operation has been performed, that is, if the operation unit 26 has not been operated by the user so as to command editing, step S35 is skipped and the process proceeds to step S36.
- If it is determined in step S34 that an editing operation has been performed, that is, if the operation unit 26 has been operated by the user so as to command editing, the process proceeds to step S35. There, in accordance with the editing operation, the playback parameter processing unit 24 generates a new playback parameter (hereinafter referred to as an editing parameter as appropriate) for the frame of the large-capacity image that was the frame of interest when the editing operation was performed, and supplies it to the image extraction unit 22, the signal processing unit 23, and the playback parameter storage unit 25 in place of the playback parameter supplied from the recording/playback unit 21.
- In this case, the image extraction unit 22 extracts from the large-capacity image, as an extracted image, the image in the extraction window specified by the information included in the editing parameter from the playback parameter processing unit 24. In the signal processing unit 23, the extracted image supplied from the image extraction unit 22 is subjected to signal processing of the content represented by the information included in the editing parameter supplied from the playback parameter processing unit 24, and is supplied to the display device 3.
- the display device 3 displays an image edited according to the editing operation.
- the playback parameter storage unit 25 stores the editing parameters supplied from the playback parameter processing unit 24.
- Here, the editing operation will be described with reference to FIG. 7.
- When, for example, the parameter set of the playback parameters at the time of shooting is selected as the playback set in step S29 of FIG. 6, an extracted image is extracted from the large-capacity image based on the playback parameters at the time of shooting, and the extracted image is further subjected to signal processing based on those parameters, so that an image similar to the normal image is displayed on the display device 3, as described above.
- However, the large-capacity image is a wider-angle image than the normal image, containing the range of the scene shown in the normal image, and therefore scenes outside the range shown in the normal image are also shown in it. That is, the large-capacity image shows not only the scene shown in the normal image but also the surrounding scenery that the user could have seen if the scene shown in the normal image had actually been viewed on site.
- Editing operations for panning or tilting the photographing device 1 in a pseudo manner are hereinafter referred to as a pseudo pan operation and a pseudo tilt operation, respectively. Suppose now that a pseudo pan operation is performed as an editing operation.
- In this case, the playback parameter processing unit 24 generates, as an editing parameter, a playback parameter including information representing the extraction window obtained by moving the extraction window specified by the corresponding parameter in the horizontal direction in accordance with the pseudo pan operation, as shown first from the top in FIG. 7.
- Similarly, when a pseudo tilt operation is performed, the playback parameter processing unit 24 generates, as an editing parameter, a playback parameter including information representing the extraction window obtained by moving the extraction window specified by the corresponding parameter in the vertical direction in accordance with the pseudo tilt operation, as shown first from the top in FIG. 7.
- Since the image extraction unit 22 extracts the image in the extraction window from the large-capacity image as the extracted image based on such editing parameters, the extracted image shows the scenery that would have appeared in the normal image if the photographing device 1 had been panned or tilted when the normal image was shot.
- In the playback device 2, operations such as zooming out or zooming in of the photographing apparatus 1 can also be performed in a pseudo manner. Editing operations for zooming out or zooming in the photographing apparatus 1 in a pseudo manner are hereinafter referred to as a pseudo zoom-out operation and a pseudo zoom-in operation, respectively.
- When a pseudo zoom-out operation is performed, the playback parameter processing unit 24 generates, as an editing parameter, a playback parameter including information representing the extraction window obtained by enlarging the size of the extraction window specified by the corresponding parameter in accordance with the pseudo zoom-out operation, as shown second from the top in FIG. 7.
- Similarly, when a pseudo zoom-in operation is performed, the playback parameter processing unit 24 generates, as an editing parameter, a playback parameter including information representing the extraction window obtained by reducing the size of the extraction window specified by the corresponding parameter in accordance with the pseudo zoom-in operation, as shown third from the top (first from the bottom) in FIG. 7.
- the image extraction unit 22 extracts an image in the extraction window from the large-capacity image as an extraction image based on such editing parameters, the extracted image is obtained by capturing a normal image. If the camera 1 is zoomed out or zoomed in at the time of shooting, the scene that would normally appear in the image is reflected.
- That is, since the extraction window is enlarged by the pseudo zoom-out operation, the image extraction unit 22 extracts a wide range of the large-capacity image as the extracted image through that extraction window, and a wide-angle image (extracted image), as if shot by zooming out the photographing device 1, is displayed on the display device 3. Conversely, since the extraction window is reduced by the pseudo zoom-in operation, the image extraction unit 22 extracts a narrow range of the large-capacity image as the extracted image through that extraction window, and an image with a narrow angle of view (extracted image), that is, an image in which the subject appears close up as if the photographing device 1 had been zoomed in, is displayed on the display device 3.
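- The way the pseudo pan, tilt, and zoom operations could update the extraction-window information can be sketched as follows; the tuple encoding (left, top, width, height) and the zoom-about-centre choice are illustrative assumptions, not taken from the disclosure:

```python
def pan(win, dx):
    # Pseudo pan: shift the extraction window horizontally.
    x, y, w, h = win
    return (x + dx, y, w, h)

def tilt(win, dy):
    # Pseudo tilt: shift the extraction window vertically.
    x, y, w, h = win
    return (x, y + dy, w, h)

def zoom(win, factor):
    # Pseudo zoom about the window centre: factor > 1 zooms out
    # (window grows), factor < 1 zooms in (window shrinks).
    x, y, w, h = win
    cx, cy = x + w / 2, y + h / 2
    nw, nh = w * factor, h * factor
    return (cx - nw / 2, cy - nh / 2, nw, nh)

win = (100, 50, 640, 360)  # (left, top, width, height), illustrative
print(pan(win, 40))        # (140, 50, 640, 360)
print(zoom(win, 2.0))      # (-220.0, -130.0, 1280.0, 720.0)
```

A full implementation would also clip the resulting window to the bounds of the large-capacity image.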
- Note that when the extraction window is reduced, the resolution of the extracted image displayed on the display device 3 deteriorates.
- However, the large-capacity image has a higher resolution than the normal image, so even if the extraction window is reduced to some extent, the resolution of the extracted image extracted through the extraction window can be maintained, for example, at a level equal to or higher than that of the normal image.
- As described above, by performing pseudo pan operations and the like, the display device 3 displays images showing scenes in ranges not shown in the normal image. Therefore, even if there is a scene that the user who shot with the photographing device 1 missed during the shooting, the user can see it afterwards as long as it appears in the large-capacity image.
- the large-capacity image is recorded on the recording medium 4 without being subjected to signal processing that the normal signal processing unit 12 applies to the normal image.
- In the playback device 2, when the parameter set of the playback parameters at the time of shooting is selected as the playback set, an extracted image is extracted from the large-capacity image based on those parameters, and the extracted image is further subjected to signal processing based on them. Accordingly, the extracted image is subjected to the same signal processing as that performed by the normal signal processing unit 12 on the normal image, and as a result an image similar to the normal image is displayed on the display device 3.
- On the other hand, when an editing operation concerning signal processing is performed, the playback parameter processing unit 24 generates, as an editing parameter, a playback parameter including information representing the content of the signal processing corresponding to the editing operation, and supplies it to the signal processing unit 23. In this case, the signal processing unit 23 performs signal processing of the content represented by the information included in the editing parameter.
- Here, the signal processing performed on the normal image by the normal signal processing unit 12 includes, for example, various types of signal processing such as noise reduction and camera shake correction (including encoding processing such as MPEG encoding).
- Further, for example, there are various methods (algorithms) of signal processing for noise reduction. Therefore, the type of signal processing performed by the normal signal processing unit 12 is not necessarily appropriate (optimal) for the user, and even if the type of signal processing is appropriate, its method is not always appropriate.
- In contrast, the playback device 2 can perform signal processing by effectively using all the signal components of the large-capacity image captured by the wide-angle camera unit of the imaging device 1.
- In step S36, the playback parameter processing unit 24 determines whether or not the user has operated the operation unit 26 to stop playback (a stop operation). If it is determined in step S36 that the stop operation has not been performed, the process returns to step S34, and the same processing is repeated thereafter.
- If it is determined in step S36 that the stop operation has been performed, the process proceeds to step S37, where the playback parameter processing unit 24 controls the recording/playback unit 21, the image extraction unit 22, and the signal processing unit 23 to stop all of the playback of the large-capacity image and the playback parameters started in step S30, the extraction of extracted images started in step S32, and the signal processing on the extracted images started in step S33.
- Further, the storage of playback parameters in the playback parameter storage unit 25 started in step S31 is also stopped.
- Thereafter, the process proceeds from step S37 to step S38, where the playback parameter processing unit 24 determines whether or not to newly record (save) on the recording medium 4 the parameter set, which is the set of playback parameters stored in the playback parameter storage unit 25.
- That is, the playback parameter processing unit 24 controls the signal processing unit 23 to display on the display device 3 an inquiry message asking whether to save the new parameter set, waits for the user to operate the operation unit 26 in response to the inquiry message, and then determines whether or not to record the new parameter set stored in the playback parameter storage unit 25 on the recording medium 4.
- If it is determined in step S38 that the new parameter set stored in the playback parameter storage unit 25 is to be recorded on the recording medium 4, that is, if the user has operated the operation unit 26 in response to the inquiry message so as to record the new parameter set, the process proceeds to step S39, where the playback parameter processing unit 24 reads out the new parameter set stored in the playback parameter storage unit 25 and controls the recording/playback unit 21 to record the new parameter set on the recording medium 4, and the process proceeds to step S40.
- On the other hand, if it is determined in step S38 that the new parameter set stored in the playback parameter storage unit 25 is not to be recorded on the recording medium 4, that is, if the user has operated the operation unit 26 in response to the inquiry message so as not to record the new parameter set, step S39 is skipped and the process proceeds to step S40.
- In step S40, the playback parameter processing unit 24 deletes (erases) the new parameter set stored in the reproduction parameter storage unit 25, after which the process returns to step S21, and the same processing is repeated thereafter.
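The save-or-discard flow of steps S37 to S40 can be sketched as follows. This is a minimal illustration only: the class and member names (`EditSession`, `ask_user`, and so on) are assumptions for the sketch and do not appear in the specification, and the stopping of playback, extraction, and signal processing (step S37) is elided.

```python
# Hypothetical sketch of steps S37-S40 of the edit-mode processing.
# All names here are illustrative assumptions, not taken from the patent.

class EditSession:
    def __init__(self, recording_medium, ask_user):
        self.param_store = []           # playback parameter storage unit 25
        self.medium = recording_medium  # recording medium 4
        self.ask_user = ask_user        # inquiry via display 3 / operation unit 26

    def add_parameter(self, playback_parameter):
        # Reproduction parameters generated by editing operations
        # are accumulated during the session (started in step S31).
        self.param_store.append(playback_parameter)

    def stop(self):
        # (S37: playback, extraction, and signal processing are stopped;
        #  omitted here.)  The stored parameters form the new parameter set.
        new_set = list(self.param_store)
        # S38: ask the user whether to save the new parameter set.
        if new_set and self.ask_user("Save new parameter set?"):
            # S39: record the new parameter set on the recording medium.
            self.medium.append(new_set)
        # S40: delete the stored parameters either way.
        self.param_store.clear()
        return new_set
```

A session that saves one parameter leaves the set on the medium and an empty storage unit, mirroring the branch through S39 to S40.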
- The new parameter set recorded on the recording medium 4 in the edit mode processing can subsequently be selected as the playback set, and the image content corresponding to the reproduction parameters of that new parameter set can accordingly be displayed on the display device 3.
- As described above, editing parameters (new reproduction parameters) are generated according to the user's editing operations, and the extraction of the extracted image and the signal processing are performed on the basis of those editing parameters. It is therefore possible to provide an image that is optimal for the user.
- Furthermore, since the recording medium 4 records a large-capacity image that has not been subjected to the signal processing applied to a normal image, the user can obtain image content optimal for himself or herself by performing editing operations on the playback device 2 by trial and error.
- a series of processing performed by the playback device 2 can be performed by dedicated hardware or can be performed by software.
- When the series of processing is performed by software, the program constituting that software is installed in a general-purpose computer or the like.
- FIG. 8 shows a configuration example of an embodiment of a computer in which a program for executing the above-described series of processing is installed.
- the program can be recorded in advance in a hard disk 105 or ROM 103 as a recording medium built in the computer.
- Alternatively, the program can be stored (recorded) temporarily or permanently on a removable recording medium 111 such as a flexible disk, a CD-ROM (Compact Disc Read Only Memory), an MO (Magneto Optical) disc, a DVD (Digital Versatile Disc), a magnetic disk, or a semiconductor memory.
- Such a removable recording medium 111 can be provided as so-called packaged software.
- Besides being installed in the computer from the removable recording medium 111 as described above, the program can be transferred to the computer from a download site via an artificial satellite for digital satellite broadcasting, or via a network such as a LAN (Local Area Network) or the Internet; the computer can receive the program transferred in this way with the communication unit 108 and install it in the built-in hard disk 105.
- the computer includes a CPU (Central Processing Unit) 102.
- An input/output interface 110 is connected to the CPU 102 via a bus 101. When the user inputs a command via the input/output interface 110 by operating an input unit 107 including a keyboard, a mouse, a microphone, and the like, the CPU 102 executes the program stored in the ROM (Read Only Memory) 103 accordingly.
- Alternatively, the CPU 102 loads into a RAM (Random Access Memory) 104 and executes a program stored in the hard disk 105, a program transferred from a satellite or a network, received by the communication unit 108, and installed in the hard disk 105, or a program read from a removable recording medium 111 mounted in the drive 109 and installed in the hard disk 105.
- The CPU 102 thereby performs the processing according to the flowchart of FIG. 6, or the processing performed by the configuration of the block diagram of FIG. Then, as necessary, the CPU 102 outputs the processing result from an output unit 106 configured with an LCD (Liquid Crystal Display), a speaker, and the like via the input/output interface 110, transmits it from the communication unit 108, or records it on the hard disk 105, for example.
- In this specification, the processing steps describing the program for causing the computer to perform various kinds of processing do not necessarily have to be executed in time series in the order described in the flowchart; they also include processing executed in parallel or individually (for example, parallel processing or object-based processing).
- The program may be processed by a single computer, or may be processed in a distributed manner by a plurality of computers. Furthermore, the program may be transferred to a remote computer and executed there.
- In the embodiment described above, the conversion of the wide-angle image captured by the wide-angle camera unit 14 into a perspective-projection image is performed by the imaging device 1, but this conversion can also be performed by the playback device 2.
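The conversion of a wide-angle image into a perspective-projection image can be illustrated as a per-pixel radial remapping. The equidistant fisheye model (`r = f·θ`) and the focal length `f` are assumptions made for this sketch; the specification does not state the wide-angle lens model used.

```python
import math

def fisheye_to_perspective(x, y, f):
    """Map a point from an equidistant fisheye image (radius r = f * theta)
    to a perspective-projection image (radius r = f * tan(theta)).
    Coordinates are measured from the optical centre; the lens model
    and focal length f are illustrative assumptions."""
    r = math.hypot(x, y)
    if r == 0.0:
        return 0.0, 0.0
    theta = r / f                  # incidence angle under the equidistant model
    r_persp = f * math.tan(theta)  # radius under perspective projection
    scale = r_persp / r
    return x * scale, y * scale
```

Points near the optical axis move little, while points near the edge of the wide-angle image are pushed outward, which is the usual behavior when undoing fisheye compression.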
- the photographing apparatus 1 can be configured integrally with the reproduction apparatus 2.
- In addition, the recording of the large-capacity image on the recording medium 4 is performed both before the recording operation and after the stop operation for the recording operation. The image therefore also includes scenes at times when the user did not recognize that recording was taking place, and editing operations that capture such scenes are also possible.
- In the embodiment described above, the frame rate of the normal image and that of the large-capacity image are the same, but the frame rate of the large-capacity image can also be made higher than the frame rate of the normal image.
- Furthermore, in the embodiment described above, the normal camera unit 11 and the wide-angle camera unit 14 are both provided so as to capture both the normal image and the large-capacity image, but it is also possible to omit the normal camera unit 11 and to capture only the large-capacity image with the wide-angle camera unit 14.
- In this case, the imaging device 1 (controller 17) sets the extraction window according to the user's pan, tilt, and zoom operations on the imaging device 1.
- The extracted image is then subjected to signal processing by the normal signal processing unit 12 and supplied to the display unit 13, whereby the display unit 13 can display an image (extracted image) similar to the normal image.
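Setting an extraction window inside the large-capacity image from pan, tilt, and zoom operations can be sketched as follows. The pixel-based coordinate convention, the clamping behavior at the image border, and the function name are assumptions for illustration, not details taken from the specification.

```python
# Illustrative sketch: deriving an extraction window inside the wide-angle
# (large-capacity) image from pan/tilt/zoom operations.
# Conventions here are assumptions, not taken from the patent.

def extraction_window(img_w, img_h, pan, tilt, zoom):
    """pan/tilt shift the window centre in pixels from the image centre;
    zoom >= 1 shrinks the window, emulating optical zoom-in.
    Returns (left, top, width, height) of the extraction window."""
    win_w = img_w / zoom
    win_h = img_h / zoom
    cx = img_w / 2 + pan
    cy = img_h / 2 + tilt
    # Clamp so the window stays entirely inside the large-capacity image.
    left = min(max(cx - win_w / 2, 0), img_w - win_w)
    top = min(max(cy - win_h / 2, 0), img_h - win_h)
    return int(left), int(top), int(win_w), int(win_h)
```

With no pan or tilt and a 2x zoom, the window is the centred quarter of the image; extreme pan values are clamped so the window never leaves the captured area.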
- In the embodiment described above, the imaging device 1 records the large-capacity image and the reproduction parameters on the recording medium 4, but it is also possible to broadcast the large-capacity image and the reproduction parameters from a broadcasting station as a television broadcast program, receive them with the playback device 2, and record them on the recording medium 4.
Landscapes
- Engineering & Computer Science (AREA)
- Multimedia (AREA)
- Signal Processing (AREA)
- Human Computer Interaction (AREA)
- Studio Devices (AREA)
- Television Signal Processing For Recording (AREA)
- Editing Of Facsimile Originals (AREA)
Priority Applications (4)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
CN2006800090425A CN101147391B (zh) | 2005-03-22 | 2006-03-22 | 成像设备、信息处理设备、信息处理方法 |
EP06729642A EP1865712A4 (en) | 2005-03-22 | 2006-03-22 | IMAGING DEVICE, INFORMATION PROCESSING DEVICE AND METHOD, PROGRAM, AND PROGRAM RECORDING MEDIUM |
US11/909,587 US8159572B2 (en) | 2005-03-22 | 2006-03-22 | Image pickup apparatus, apparatus and method for processing information, program, and program recording medium |
KR1020077020886A KR101249322B1 (ko) | 2005-03-22 | 2006-03-22 | 촬영 장치, 정보 처리 장치, 정보 처리 방법 및 프로그램 기록 매체 |
Applications Claiming Priority (2)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
JP2005-081648 | 2005-03-22 | ||
JP2005081648A JP3968665B2 (ja) | 2005-03-22 | 2005-03-22 | 撮影装置、情報処理装置、情報処理方法、プログラム、およびプログラム記録媒体 |
Publications (1)
Publication Number | Publication Date |
---|---|
WO2006101112A1 true WO2006101112A1 (ja) | 2006-09-28 |
Family
ID=37023772
Family Applications (1)
Application Number | Title | Priority Date | Filing Date |
---|---|---|---|
PCT/JP2006/305675 WO2006101112A1 (ja) | 2005-03-22 | 2006-03-22 | 撮影装置、情報処理装置、情報処理方法、プログラム、およびプログラム記録媒体 |
Country Status (6)
Country | Link |
---|---|
US (1) | US8159572B2 (ja) |
EP (1) | EP1865712A4 (ja) |
JP (1) | JP3968665B2 (ja) |
KR (1) | KR101249322B1 (ja) |
CN (1) | CN101147391B (ja) |
WO (1) | WO2006101112A1 (ja) |
Cited By (2)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
WO2008021660A1 (en) * | 2006-08-15 | 2008-02-21 | 3M Innovative Properties Company | Display simulator |
CN100440930C (zh) * | 2006-12-30 | 2008-12-03 | 北京中星微电子有限公司 | 一种摄像头系统和在摄像头视频流中获取静态图像的方法 |
Families Citing this family (9)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
JP4957960B2 (ja) | 2007-01-29 | 2012-06-20 | ソニー株式会社 | 画像処理装置、画像処理方法、及びプログラム |
JP5217250B2 (ja) | 2007-05-28 | 2013-06-19 | ソニー株式会社 | 学習装置および学習方法、情報加工装置および情報加工方法、並びにプログラム |
JP4973935B2 (ja) | 2007-06-13 | 2012-07-11 | ソニー株式会社 | 情報処理装置、情報処理方法、プログラム、および記録媒体 |
JP4930297B2 (ja) * | 2007-09-10 | 2012-05-16 | 株式会社ニコン | 撮像装置 |
JPWO2011039947A1 (ja) * | 2009-10-01 | 2013-02-21 | 日本電気株式会社 | 撮像装置、その制御方法、撮像システム、及びプログラム |
JP2014007653A (ja) * | 2012-06-26 | 2014-01-16 | Jvc Kenwood Corp | 撮像装置、撮像方法、撮像システム及びプログラム |
CN107851425B (zh) * | 2015-08-05 | 2022-01-18 | 索尼公司 | 信息处理设备、信息处理方法和程序 |
US10194078B2 (en) * | 2017-06-09 | 2019-01-29 | Immersion Corporation | Haptic enabled device with multi-image capturing abilities |
US11653047B2 (en) * | 2021-07-29 | 2023-05-16 | International Business Machines Corporation | Context based adaptive resolution modulation countering network latency fluctuation |
Citations (3)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
JP2002369066A (ja) * | 2001-06-11 | 2002-12-20 | Ricoh Co Ltd | 映像記録表示装置及び映像記録表示方法 |
JP2004007284A (ja) * | 2002-05-31 | 2004-01-08 | Ricoh Co Ltd | 映像記録システム、プログラム及び記録媒体 |
JP2005012423A (ja) * | 2003-06-18 | 2005-01-13 | Fuji Photo Film Co Ltd | 撮像装置及び信号処理装置 |
Family Cites Families (18)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
JPH066669A (ja) | 1992-06-22 | 1994-01-14 | Hitachi Ltd | カメラ一体型記録再生装置 |
JPH06189176A (ja) | 1992-12-22 | 1994-07-08 | Fuji Photo Film Co Ltd | 手ぶれ補正システムおよび方式 |
JP3054002B2 (ja) * | 1993-09-01 | 2000-06-19 | キヤノン株式会社 | 複眼撮像装置 |
US6639626B1 (en) * | 1998-06-18 | 2003-10-28 | Minolta Co., Ltd. | Photographing apparatus with two image sensors of different size |
JP2000270297A (ja) | 1999-03-12 | 2000-09-29 | Toshiba Video Products Japan Kk | デジタル映像記録再生機能を有する監視カメラシステム |
ES2286875T3 (es) * | 1999-04-22 | 2007-12-01 | Leo Vision | Procedimiento y dispositivo de tratamiento y de restitucion de imagenes, con nuevo muestreo. |
JP3698397B2 (ja) | 1999-06-07 | 2005-09-21 | 株式会社日立国際電気 | テレビジョンカメラ |
JP4501239B2 (ja) * | 2000-07-13 | 2010-07-14 | ソニー株式会社 | カメラ・キャリブレーション装置及び方法、並びに、記憶媒体 |
JP2002094862A (ja) | 2000-09-12 | 2002-03-29 | Chinon Ind Inc | 撮像装置 |
US7940299B2 (en) * | 2001-08-09 | 2011-05-10 | Technest Holdings, Inc. | Method and apparatus for an omni-directional video surveillance system |
JP2003179785A (ja) | 2001-12-11 | 2003-06-27 | Pentax Corp | 画像撮影装置 |
JP4068869B2 (ja) * | 2002-03-29 | 2008-03-26 | 富士フイルム株式会社 | デジタルカメラ |
JP3870124B2 (ja) * | 2002-06-14 | 2007-01-17 | キヤノン株式会社 | 画像処理装置及びその方法、並びにコンピュータプログラム及びコンピュータ可読記憶媒体 |
JP2004282296A (ja) | 2003-03-14 | 2004-10-07 | Sony Corp | 撮像装置 |
JP3950085B2 (ja) | 2003-06-10 | 2007-07-25 | 株式会社つくばマルチメディア | 地図誘導全方位映像システム |
JP2005039777A (ja) | 2003-06-30 | 2005-02-10 | Casio Comput Co Ltd | 画像撮影装置及び画像撮影システム |
US20060187322A1 (en) * | 2005-02-18 | 2006-08-24 | Janson Wilbert F Jr | Digital camera using multiple fixed focal length lenses and multiple image sensors to provide an extended zoom range |
JP2007180730A (ja) * | 2005-12-27 | 2007-07-12 | Eastman Kodak Co | デジタルカメラおよびデータ管理方法 |
-
2005
- 2005-03-22 JP JP2005081648A patent/JP3968665B2/ja not_active Expired - Fee Related
-
2006
- 2006-03-22 US US11/909,587 patent/US8159572B2/en not_active Expired - Fee Related
- 2006-03-22 KR KR1020077020886A patent/KR101249322B1/ko not_active IP Right Cessation
- 2006-03-22 WO PCT/JP2006/305675 patent/WO2006101112A1/ja active Application Filing
- 2006-03-22 EP EP06729642A patent/EP1865712A4/en not_active Withdrawn
- 2006-03-22 CN CN2006800090425A patent/CN101147391B/zh not_active Expired - Fee Related
Patent Citations (3)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
JP2002369066A (ja) * | 2001-06-11 | 2002-12-20 | Ricoh Co Ltd | 映像記録表示装置及び映像記録表示方法 |
JP2004007284A (ja) * | 2002-05-31 | 2004-01-08 | Ricoh Co Ltd | 映像記録システム、プログラム及び記録媒体 |
JP2005012423A (ja) * | 2003-06-18 | 2005-01-13 | Fuji Photo Film Co Ltd | 撮像装置及び信号処理装置 |
Non-Patent Citations (1)
Title |
---|
See also references of EP1865712A4 * |
Cited By (3)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
WO2008021660A1 (en) * | 2006-08-15 | 2008-02-21 | 3M Innovative Properties Company | Display simulator |
US7593017B2 (en) | 2006-08-15 | 2009-09-22 | 3M Innovative Properties Company | Display simulator |
CN100440930C (zh) * | 2006-12-30 | 2008-12-03 | 北京中星微电子有限公司 | 一种摄像头系统和在摄像头视频流中获取静态图像的方法 |
Also Published As
Publication number | Publication date |
---|---|
EP1865712A1 (en) | 2007-12-12 |
JP2006270187A (ja) | 2006-10-05 |
KR101249322B1 (ko) | 2013-04-01 |
US20090201378A1 (en) | 2009-08-13 |
CN101147391B (zh) | 2011-06-01 |
KR20070120951A (ko) | 2007-12-26 |
CN101147391A (zh) | 2008-03-19 |
EP1865712A4 (en) | 2009-12-23 |
US8159572B2 (en) | 2012-04-17 |
JP3968665B2 (ja) | 2007-08-29 |
Similar Documents
Publication | Publication Date | Title |
---|---|---|
JP3968665B2 (ja) | 撮影装置、情報処理装置、情報処理方法、プログラム、およびプログラム記録媒体 | |
JP4525561B2 (ja) | 撮像装置、画像処理方法、並びにプログラム | |
US8264573B2 (en) | Imaging apparatus and reproducing apparatus which changes frame rate based on zoom operation | |
US8000558B2 (en) | Thumbnail generating apparatus and image shooting apparatus | |
US20070110321A1 (en) | Image processing apparatus, image processing method, program for image processing method, and recording medium which records program for image processing method | |
JP2002101329A (ja) | デジタルカメラ、画像再生装置及び方法 | |
JP2009033450A (ja) | 撮影装置、被写体追尾ズーミング方法及び被写体追尾ズーミングプログラム | |
US20080317291A1 (en) | Image processing apparatus, image processing method and program | |
JP4556195B2 (ja) | 撮像装置、動画再生装置及びそのプログラム | |
JP2007266659A (ja) | 撮像再生装置 | |
JP2006203334A (ja) | 画像記録装置及びその制御方法、並びにプログラム | |
JP2007281966A (ja) | 再生装置、再生画像の選択方法、プログラム、記録媒体 | |
JP2008005427A (ja) | 撮像装置および撮像方法、並びにプログラム | |
JP4893364B2 (ja) | 情報処理装置、情報処理方法、プログラム、およびプログラム記録媒体 | |
JP4172352B2 (ja) | 撮像装置及び方法、撮像システム、プログラム | |
JP4758240B2 (ja) | カメラ | |
JP4547877B2 (ja) | 撮像装置 | |
JP2007228119A (ja) | 撮像装置、画像処理方法、およびプログラム | |
KR101480407B1 (ko) | 디지털 영상 처리 장치, 이의 제어 방법 및 상기 제어방법을 기록한 기록 매체 | |
JP3543978B2 (ja) | デジタル画像信号の伝送装置、受信装置およびデジタル画像信号の記録装置、再生装置 | |
JP2004241834A (ja) | 動画像生成装置及び方法、動画像送信システム、プログラム並びに記録媒体 | |
JP2006238041A (ja) | ビデオカメラ | |
JP2010193106A (ja) | 撮像装置 | |
JPH10276359A (ja) | 追尾装置および追尾方法 | |
JPH0746533A (ja) | デジタル画像信号の記録装置 |
Legal Events
Date | Code | Title | Description |
---|---|---|---|
WWE | Wipo information: entry into national phase |
Ref document number: 200680009042.5 Country of ref document: CN |
|
121 | Ep: the epo has been informed by wipo that ep was designated in this application | ||
WWE | Wipo information: entry into national phase |
Ref document number: 2006729642 Country of ref document: EP Ref document number: 1020077020886 Country of ref document: KR |
|
NENP | Non-entry into the national phase |
Ref country code: DE |
|
NENP | Non-entry into the national phase |
Ref country code: RU |
|
WWP | Wipo information: published in national office |
Ref document number: 2006729642 Country of ref document: EP |
|
WWE | Wipo information: entry into national phase |
Ref document number: 11909587 Country of ref document: US |