WO2022264691A1 - Imaging method and imaging apparatus - Google Patents
Imaging method and imaging apparatus
- Publication number
- WO2022264691A1 (PCT/JP2022/018701)
- Authority
- WO
- WIPO (PCT)
- Prior art keywords
- imaging
- image
- display
- subject image
- determination
- Prior art date
Links
- 238000000034 method Methods 0.000 title claims abstract description 67
- 230000002123 temporal effect Effects 0.000 claims abstract description 39
- 230000007246 mechanism Effects 0.000 claims abstract description 26
- 230000002194 synthesizing effect Effects 0.000 claims abstract description 22
- 230000008859 change Effects 0.000 claims abstract description 21
- 238000003384 imaging method Methods 0.000 claims description 215
- 230000008569 process Effects 0.000 claims description 61
- 230000015572 biosynthetic process Effects 0.000 claims description 46
- 238000003786 synthesis reaction Methods 0.000 claims description 46
- 238000005516 engineering process Methods 0.000 description 26
- 238000001514 detection method Methods 0.000 description 14
- 230000003287 optical effect Effects 0.000 description 12
- 238000010586 diagram Methods 0.000 description 5
- 230000006870 function Effects 0.000 description 5
- 238000002955 isolation Methods 0.000 description 5
- 230000004048 modification Effects 0.000 description 3
- 238000012986 modification Methods 0.000 description 3
- 230000001133 acceleration Effects 0.000 description 2
- 230000009471 action Effects 0.000 description 2
- 238000006073 displacement reaction Methods 0.000 description 2
- 230000000694 effects Effects 0.000 description 2
- 239000004065 semiconductor Substances 0.000 description 2
- 230000035945 sensitivity Effects 0.000 description 2
- 230000006641 stabilisation Effects 0.000 description 2
- 238000011105 stabilization Methods 0.000 description 2
- 238000004364 calculation method Methods 0.000 description 1
- 238000006243 chemical reaction Methods 0.000 description 1
- 230000000295 complement effect Effects 0.000 description 1
- 239000002131 composite material Substances 0.000 description 1
- 238000004519 manufacturing process Methods 0.000 description 1
- 229910044991 metal oxide Inorganic materials 0.000 description 1
- 150000004706 metal oxides Chemical class 0.000 description 1
- 239000000203 mixture Substances 0.000 description 1
- 230000002250 progressing effect Effects 0.000 description 1
- 230000004044 response Effects 0.000 description 1
- 239000000725 suspension Substances 0.000 description 1
Images
Classifications
-
- H—ELECTRICITY
- H04—ELECTRIC COMMUNICATION TECHNIQUE
- H04N—PICTORIAL COMMUNICATION, e.g. TELEVISION
- H04N23/00—Cameras or camera modules comprising electronic image sensors; Control thereof
- H04N23/95—Computational photography systems, e.g. light-field imaging systems
- H04N23/951—Computational photography systems, e.g. light-field imaging systems by using two or more images to influence resolution, frame rate or aspect ratio
-
- H—ELECTRICITY
- H04—ELECTRIC COMMUNICATION TECHNIQUE
- H04N—PICTORIAL COMMUNICATION, e.g. TELEVISION
- H04N23/00—Cameras or camera modules comprising electronic image sensors; Control thereof
- H04N23/60—Control of cameras or camera modules
- H04N23/63—Control of cameras or camera modules by using electronic viewfinders
- H04N23/633—Control of cameras or camera modules by using electronic viewfinders for displaying additional information relating to control or operation of the camera
- H04N23/634—Warning indications
-
- G—PHYSICS
- G03—PHOTOGRAPHY; CINEMATOGRAPHY; ANALOGOUS TECHNIQUES USING WAVES OTHER THAN OPTICAL WAVES; ELECTROGRAPHY; HOLOGRAPHY
- G03B—APPARATUS OR ARRANGEMENTS FOR TAKING PHOTOGRAPHS OR FOR PROJECTING OR VIEWING THEM; APPARATUS OR ARRANGEMENTS EMPLOYING ANALOGOUS TECHNIQUES USING WAVES OTHER THAN OPTICAL WAVES; ACCESSORIES THEREFOR
- G03B15/00—Special procedures for taking photographs; Apparatus therefor
-
- H—ELECTRICITY
- H04—ELECTRIC COMMUNICATION TECHNIQUE
- H04N—PICTORIAL COMMUNICATION, e.g. TELEVISION
- H04N23/00—Cameras or camera modules comprising electronic image sensors; Control thereof
- H04N23/60—Control of cameras or camera modules
- H04N23/64—Computer-aided capture of images, e.g. transfer from script file into camera, check of taken image quality, advice or proposal for image composition or decision on when to take image
-
- H—ELECTRICITY
- H04—ELECTRIC COMMUNICATION TECHNIQUE
- H04N—PICTORIAL COMMUNICATION, e.g. TELEVISION
- H04N23/00—Cameras or camera modules comprising electronic image sensors; Control thereof
- H04N23/60—Control of cameras or camera modules
- H04N23/667—Camera operation mode switching, e.g. between still and video, sport and normal or high- and low-resolution modes
-
- H—ELECTRICITY
- H04—ELECTRIC COMMUNICATION TECHNIQUE
- H04N—PICTORIAL COMMUNICATION, e.g. TELEVISION
- H04N23/00—Cameras or camera modules comprising electronic image sensors; Control thereof
- H04N23/60—Control of cameras or camera modules
- H04N23/68—Control of cameras or camera modules for stable pick-up of the scene, e.g. compensating for camera body vibrations
- H04N23/681—Motion detection
- H04N23/6812—Motion detection based on additional sensors, e.g. acceleration sensors
-
- H—ELECTRICITY
- H04—ELECTRIC COMMUNICATION TECHNIQUE
- H04N—PICTORIAL COMMUNICATION, e.g. TELEVISION
- H04N23/00—Cameras or camera modules comprising electronic image sensors; Control thereof
- H04N23/60—Control of cameras or camera modules
- H04N23/68—Control of cameras or camera modules for stable pick-up of the scene, e.g. compensating for camera body vibrations
- H04N23/682—Vibration or motion blur correction
- H04N23/685—Vibration or motion blur correction performed by mechanical compensation
-
- H—ELECTRICITY
- H04—ELECTRIC COMMUNICATION TECHNIQUE
- H04N—PICTORIAL COMMUNICATION, e.g. TELEVISION
- H04N23/00—Cameras or camera modules comprising electronic image sensors; Control thereof
- H04N23/60—Control of cameras or camera modules
- H04N23/695—Control of camera direction for changing a field of view, e.g. pan, tilt or based on tracking of objects
Definitions
- the technology of the present disclosure relates to an imaging method and an imaging device.
- Patent Document 1 discloses a digital camera that generates high-quality images through super-resolution processing.
- the digital camera described in Patent Document 1 acquires four captured images while moving the relative position of the image sensor with respect to the subject image, and performs super-resolution processing.
- Each time an image is captured for the second and subsequent frames, the digital camera detects whether or not there is a change in the subject image between the first captured image and the newly captured image.
- If a change is detected, super-resolution photography is restarted from the beginning.
- Patent Document 2 discloses an image processing device capable of obtaining high-quality composite image data.
- the obtaining unit obtains a plurality of image data having RGB color elements.
- the division unit divides each image data into a plurality of areas.
- the calculation unit calculates the amount of deviation for each region of each image data.
- the color interpolation unit performs color interpolation for at least one of the RGB color components based on the shift amount for each region of each image data.
- Patent Document 3 describes pixel shifting means for moving the relative position between a light beam incident on an imaging device and the imaging device, photographing means for photographing when the relative position is at a specific position and at at least one position moved from the specific position by a series of operations of the pixel shifting means, image generating means for generating a new high-resolution image from a plurality of image data photographed by the photographing means in the series of operations, and display control means for displaying, until the new image is generated, an image photographed at an early stage among the plurality of photographed images.
- Patent Document 1: JP 2016-171511 A; Patent Document 2: JP 2019-161564 A; Patent Document 3: JP 2003-283887 A
- An embodiment according to the technology of the present disclosure provides an imaging method and an imaging device that enable the user to recognize the processing status.
- An imaging method of the present disclosure is an imaging method used in an imaging apparatus that includes an imaging element that captures a subject image and a moving mechanism that can change the relative position between the subject image and the imaging element. The method includes a changing step of changing the relative position a plurality of times; an imaging step of capturing the subject image with the imaging element at the plurality of relative positions to obtain a plurality of first images; a synthesizing step of generating a second image by synthesizing the plurality of first images; and a display step of performing a temporal display relating to the imaging step or the synthesizing step.
- It is preferable that the temporal display be a display that allows recognition of the execution time of the imaging step or the synthesizing step.
- In the changing step, it is preferable to change the relative position by moving the imaging element to a predetermined position using the moving mechanism.
- An imaging apparatus of the present disclosure includes an imaging element that captures a subject image, a moving mechanism that can change the relative position between the subject image and the imaging element, and a processor.
- the processor performs change processing that changes the relative position multiple times.
- It is preferable that the processor can selectively execute a first mode, in which the relative position is changed by using shaking applied to the imaging apparatus, and a second mode, in which the relative position is changed by moving the imaging element to a predetermined position using the moving mechanism; that the processor perform determination processing for determining whether the synthesis processing can be executed based on whether at least one of the plurality of first images satisfies a first condition; and that, when the determination is negative in the determination processing, the temporal display in the display processing be changed to contents that differ between the first mode and the second mode.
- FIG. 3 is a diagram illustrating an example of imaging processing and synthesis processing in the multi-shot synthesis mode.
- FIG. 4 is a diagram showing an example of the temporal display performed in the multi-shot synthesis mode.
- FIG. 5 is a diagram showing an example of the notification performed in the multi-shot synthesis mode.
- FIG. 6 is a flow chart showing an example of a series of operations in the multi-shot synthesis mode.
- FIG. 7 is a diagram showing an example of imaging processing in the pixel-shifted multi-shot synthesis mode.
- FIG. 8 is a flow chart showing an example of a series of operations in the pixel-shifted multi-shot synthesis mode.
- FIG. 9 is a diagram showing an example of updating the temporal display when the number of times the subject image is captured is increased.
- FIG. 10 is a diagram showing an example in which the imaging device is connected to an external device.
- IC is an abbreviation for “Integrated Circuit”.
- CPU is an abbreviation for "Central Processing Unit”.
- ROM is an abbreviation for “Read Only Memory”.
- RAM is an abbreviation for “Random Access Memory”.
- CMOS is an abbreviation for "Complementary Metal Oxide Semiconductor.”
- FPGA is an abbreviation for "Field Programmable Gate Array”.
- PLD is an abbreviation for "Programmable Logic Device”.
- ASIC is an abbreviation for "Application Specific Integrated Circuit”.
- OVF is an abbreviation for "Optical View Finder".
- EVF is an abbreviation for "Electronic View Finder".
- JPEG is an abbreviation for "Joint Photographic Experts Group”.
- DSP is an abbreviation for "Digital Signal Processor”.
- the technology of the present disclosure will be described by taking a lens-interchangeable digital camera as an example.
- the technique of the present disclosure is not limited to interchangeable-lens type digital cameras, and can be applied to lens-integrated digital cameras.
- FIG. 1 shows an example of the front side of the imaging device 10.
- the imaging device 10 is a lens-interchangeable digital camera.
- the imaging device 10 is composed of a body 11 and an imaging lens 12 replaceably attached to the body 11 .
- the imaging lens 12 is attached to the front side of the main body 11 via a camera side mount 11A and a lens side mount 12A (see FIG. 2).
- the imaging lens 12 is an example of a lens according to the technology of the present disclosure.
- a dial 13 and a release button 14 are provided on the upper surface of the main body 11.
- the dial 13 is operated when setting an operation mode or the like.
- the operation modes of the imaging device 10 include, for example, a still image imaging mode, a moving image imaging mode, and an image display mode.
- the release button 14 is operated by the user when starting the execution of still image capturing or moving image capturing.
- the still image capturing mode includes a "multi-shot synthesis mode" for obtaining super-resolution images.
- The multi-shot synthesis mode of the present embodiment is a mode in which a plurality of images are acquired while the relative position between the subject image and the imaging sensor 20 (see FIG. 2) is changed by using the shaking applied to the imaging device 10 due to the user's hand shake or the like, and the acquired images are synthesized. By synthesizing a plurality of images in which the relative positions of the subject image and the imaging sensor 20 differ, a super-resolution image exceeding the resolution of a single image can be obtained.
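- As a rough illustration of this synthesis idea (not the patent's actual algorithm), the following sketch merges several frames onto a finer grid using known per-frame shifts; the function name `shift_and_add`, the NumPy-based implementation, and the nearest-neighbor sample placement are assumptions made purely for illustration.

```python
import numpy as np

def shift_and_add(frames, shifts, scale=2):
    """Naive shift-and-add merge of several frames onto a finer grid.
    frames: list of HxW arrays; shifts: per-frame (dy, dx) subject-image offsets
    in input pixels (sub-pixel values allowed); scale: output grid factor."""
    h, w = frames[0].shape
    acc = np.zeros((h * scale, w * scale))
    cnt = np.zeros_like(acc)
    ys, xs = np.mgrid[0:h, 0:w]
    for frame, (dy, dx) in zip(frames, shifts):
        # place each sample at its estimated scene position on the fine grid
        yy = np.clip(np.round((ys + dy) * scale).astype(int), 0, h * scale - 1)
        xx = np.clip(np.round((xs + dx) * scale).astype(int), 0, w * scale - 1)
        np.add.at(acc, (yy, xx), frame)
        np.add.at(cnt, (yy, xx), 1)
    cnt[cnt == 0] = 1  # avoid division by zero where no sample landed
    return acc / cnt
```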
- the main body 11 is provided with a finder 17 .
- the finder 17 is a hybrid finder (registered trademark).
- a hybrid viewfinder is, for example, a viewfinder that selectively uses an optical viewfinder (hereinafter referred to as "OVF”) and an electronic viewfinder (hereinafter referred to as "EVF").
- a finder eyepiece 18 is provided on the back side of the main body 11 .
- the viewfinder eyepiece 18 selectively displays an optical image that can be viewed with the OVF and a live view image that is an electronic image that can be viewed with the EVF.
- a user can observe an optical image or a live view image of a subject through the viewfinder eyepiece 18 .
- a display 15 (see FIG. 2) is provided on the back side of the main body 11.
- the display 15 displays an image based on an image signal obtained by imaging, various menu screens, and the like.
- The Z-axis AZ shown in FIG. 1 corresponds to the optical axis of the imaging lens 12.
- The X-axis AX and the Y-axis AY are orthogonal to each other and to the Z-axis AZ.
- The X-axis AX and the Y-axis AY correspond to the pitch axis and the yaw axis according to the technology of the present disclosure.
- The direction of rotation about the Z-axis AZ is called the roll direction.
- The direction of rotation about the X-axis AX is called the pitch direction.
- The direction of rotation about the Y-axis AY is called the yaw direction.
- The X-axis AX direction is called the X direction, and the Y-axis AY direction is called the Y direction.
- FIG. 2 shows an example of the internal configuration of the imaging device 10. As shown in FIG. 2, the main body 11 and the imaging lens 12 are electrically connected by contact between an electrical contact 11B provided on the camera-side mount 11A and an electrical contact 12B provided on the lens-side mount 12A.
- the imaging lens 12 includes an objective lens 30 , a focus lens 31 , a rear end lens 32 and an aperture 33 .
- Each member is arranged along the optical axis of the imaging lens 12 (that is, the Z-axis AZ) in the order of the objective lens 30, the diaphragm 33, the focus lens 31, and the rear end lens 32 from the objective side.
- the objective lens 30, focus lens 31, and rear end lens 32 constitute an imaging optical system.
- the type, number, and order of arrangement of the lenses that make up the imaging optical system are not limited to the example shown in FIG.
- the imaging lens 12 has a lens driving control section 34 and a memory.
- the lens drive control unit 34 is composed of, for example, a CPU, a RAM, a ROM, and the like.
- the lens drive control section 34 is electrically connected to the processor 40 in the main body 11 via the electrical contacts 12B and 11B.
- the lens drive control unit 34 drives the focus lens 31 and the diaphragm 33 based on control signals sent from the processor 40 .
- the lens drive control unit 34 performs drive control of the focus lens 31 based on a control signal for focus control transmitted from the processor 40 in order to adjust the focus position of the imaging lens 12 .
- the diaphragm 33 has an aperture whose aperture diameter is variable around the optical axis.
- the lens drive control unit 34 performs drive control of the diaphragm 33 based on the control signal for diaphragm adjustment transmitted from the processor 40.
- the main body 11 is provided with an imaging sensor 20 , a processor 40 , an image processing section 41 , an operation section 42 , a mechanical anti-vibration mechanism 43 , a shake detection sensor 44 , a memory 45 and a display 15 .
- the processor 40 controls the operations of the imaging sensor 20 , the image processing unit 41 , the operation unit 42 , the mechanical vibration isolation mechanism 43 , the shake detection sensor 44 , the memory 45 , and the display 15 .
- The processor 40 is composed of, for example, a CPU, RAM, ROM, and the like. In this case, the processor 40 executes various processes based on the operating program 45A stored in the memory 45.
- Note that the processor 40 may be configured by an assembly of a plurality of IC chips.
- the imaging sensor 20 is, for example, a CMOS image sensor.
- the imaging sensor 20 is arranged such that the Z-axis AZ as an optical axis is orthogonal to the light-receiving surface 20A and the Z-axis AZ is positioned at the center of the light-receiving surface 20A.
- Light (subject image) that has passed through the imaging lens 12 is incident on the light receiving surface 20A.
- a plurality of pixels that generate image signals by performing photoelectric conversion are formed on the light receiving surface 20A.
- the imaging sensor 20 photoelectrically converts light incident on each pixel to generate and output an image signal.
- the imaging sensor 20 is an example of an “imaging element” according to the technology of the present disclosure.
- A color filter array of the Bayer arrangement is arranged on the light-receiving surface of the imaging sensor 20, and one of R (red), G (green), and B (blue) color filters is arranged opposite each pixel. Therefore, each pixel of one image before color interpolation processing includes only R, G, or B color information. Note that the arrangement of the color filter array is not limited to the Bayer arrangement and can be changed as appropriate.
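- For illustration only, a color filter array of this kind could be modeled as a small array of channel labels, as in the sketch below; the RGGB layout is an assumption, since the actual filter layout of the sensor may differ.

```python
import numpy as np

def bayer_cfa(height, width):
    """Illustrative RGGB Bayer pattern as an array of 'R'/'G'/'B' labels."""
    cfa = np.empty((height, width), dtype='<U1')
    cfa[0::2, 0::2] = 'R'
    cfa[0::2, 1::2] = 'G'
    cfa[1::2, 0::2] = 'G'
    cfa[1::2, 1::2] = 'B'
    return cfa
```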
- the imaging sensor 20 is held by a mechanical anti-vibration mechanism 43 .
- The mechanical vibration isolation mechanism 43 holds the imaging sensor 20 so that it can translate in the X-axis AX and Y-axis AY directions and rotate in the roll direction.
- the configuration of the mechanical vibration isolation mechanism 43 is known, for example, from Japanese Patent Application Laid-Open No. 2016-171511.
- the mechanical anti-vibration mechanism 43 is an example of a “moving mechanism capable of changing the relative position between the subject image and the imaging device” according to the technology of the present disclosure.
- the mechanical image stabilization mechanism 43 may be a mechanism that changes the relative position between the subject image and the imaging sensor 20 by driving a part of the lenses that constitute the imaging optical system of the imaging lens 12 .
- the shake detection sensor 44 detects shake applied to the main body 11 housing the imaging sensor 20 .
- the shake detection sensor 44 is, for example, a five-axis shake detection sensor that detects shake in the roll direction, yaw direction, pitch direction, X direction, and Y direction.
- roll-direction blur is referred to as rotational blur.
- Shaking in the yaw and pitch directions is called angular shake.
- a blur in the X and Y directions is called a translational blur.
- the shake detection sensor 44 is composed of, for example, a gyro sensor and an acceleration sensor.
- a gyro sensor detects rotational shake and angular shake.
- the acceleration sensor detects translational shake.
- the processor 40 drives and controls the mechanical anti-vibration mechanism 43 based on the shake of the imaging device 10 (shaking applied to the imaging device 10 ) detected by the shake detection sensor 44 . Specifically, the processor 40 changes the relative positions of the subject image and the imaging sensor 20 so as to offset the displacement of the subject image due to the user's hand shake.
- a position detection sensor for detecting the position of the imaging sensor 20 may be provided in the mechanical vibration isolation mechanism 43 .
- This position detection sensor is, for example, a Hall sensor.
- the processor 40 drives and controls the mechanical anti-vibration mechanism 43 based on the shake information detected by the shake detection sensor 44 and the position information of the imaging sensor 20 detected by the position detection sensor. Note that the processor 40 may stop driving control of the mechanical anti-vibration mechanism 43 in the multi-shot synthesis mode described above.
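- The following is a minimal sketch of how shake information might be turned into a compensating sensor offset; the simple gyro integration, small-angle conversion, and proportional servo step are assumptions made for illustration, not the drive control actually used by the processor 40.

```python
def target_sensor_offset(gyro_rate_rad_s, dt_s, focal_length_mm, prev_angle_rad):
    """Integrate one angular-rate reading and convert the resulting camera rotation
    into the sensor offset that would cancel the subject-image displacement
    (small-angle approximation)."""
    angle = prev_angle_rad + gyro_rate_rad_s * dt_s
    image_shift_mm = focal_length_mm * angle   # image displacement on the sensor
    return -image_shift_mm, angle              # drive the sensor the opposite way

def servo_step(current_mm, target_mm, gain=0.5):
    """Move the sensor a fraction of the remaining error each control cycle."""
    return current_mm + gain * (target_mm - current_mm)
```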
- the processor 40 causes the imaging sensor 20 to perform a predetermined plurality of imaging operations (for example, four times) in response to the operation of the release button 14 by the user. That is, in the multi-shot synthesis mode, the imaging sensor 20 performs imaging processing to obtain a plurality of images with different relative positions between the subject image and the imaging sensor 20 .
- the image acquired by the imaging sensor 20 is an example of the "first image" according to the technology of the present disclosure.
- the image processing unit 41 is configured by a DSP, for example.
- the image processing unit 41 generates image data in a predetermined file format (for example, JPEG format) by performing various image processing such as color interpolation processing on the image signal.
- the image processing unit 41 performs synthesis processing for generating a super-resolution image by synthesizing a plurality of images acquired by imaging processing.
- the super-resolution image is an example of the "second image" according to the technology of the present disclosure.
- The image processing unit 41 derives the shift amount for each region of each image based on the blur information detected by the blur detection sensor 44 and the position information of the imaging sensor 20 detected by the position detection sensor, and synthesizes the plurality of images based on the derived shift amounts. It should be noted that the image processing unit 41 can also calculate the shift amount for each region of each image by using a block matching technique instead of the blur information and the position information. The synthesis processing used in the multi-shot synthesis mode is known, for example, from Japanese Patent Application Laid-Open No. 2019-161564.
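- A minimal block matching sketch is shown below to illustrate how a per-region shift amount could be estimated from the images themselves; the sum-of-squared-differences criterion and exhaustive search are assumptions for illustration, not the specific method of the patent or of JP 2019-161564 A.

```python
import numpy as np

def block_shift(ref_block, search_area):
    """Return the (dy, dx) position inside search_area that best matches ref_block,
    using a sum-of-squared-differences criterion."""
    bh, bw = ref_block.shape
    best_cost, best_pos = np.inf, (0, 0)
    for dy in range(search_area.shape[0] - bh + 1):
        for dx in range(search_area.shape[1] - bw + 1):
            candidate = search_area[dy:dy + bh, dx:dx + bw].astype(float)
            cost = np.sum((candidate - ref_block) ** 2)
            if cost < best_cost:
                best_cost, best_pos = cost, (dy, dx)
    return best_pos
```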
- the display 15 displays images based on the image data generated by the image processing unit 41 .
- Images include still images, moving images, and live view images.
- a live view image is an image that is displayed in real time on the display 15 by sequentially outputting image data generated by the image processing unit 41 to the display 15 .
- the image data generated by the image processing unit 41 can be stored in an internal memory (not shown) built into the main body 11 or a storage medium (for example, a memory card) removable from the main body 11.
- the operation unit 42 includes the aforementioned dial 13, release button 14, and instruction keys (not shown).
- the instruction keys are provided on the back side of the main body 11, for example.
- the processor 40 controls each part in the main body 11 and the lens driving control part 34 in the imaging lens 12 according to the operation of the operation part 42 .
- The processor 40 performs determination processing for determining whether or not the compositing process by the image processing unit 41 can be executed, based on whether at least one of the multiple images obtained by the imaging process satisfies a predetermined quality condition.
- a quality condition is a condition determined based on the brightness, blur, or degree of blurring of an image.
- the quality condition may be a condition based on the brightness, blur, or degree of blur in one image, or a condition based on differences in brightness, blur, or degree of blur between a plurality of images, or differences in subject position.
- the quality condition may be a comprehensive condition using a plurality of indices such as brightness, blur, degree of blurring, and the like.
- the quality conditions may be different conditions for each of the multiple images acquired by the imaging process. Note that the quality condition is an example of the "first condition" according to the technology of the present disclosure.
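- Purely as an illustration of such a quality condition, the sketch below gates on brightness consistency across frames and on a crude per-frame sharpness score; the thresholds and the variance-of-second-differences blur proxy are assumptions, not the determination actually performed by the processor 40.

```python
import numpy as np

def passes_quality(images, max_brightness_diff=0.05, min_sharpness=1.0):
    """Illustrative quality gate: frame brightness must stay consistent and each
    frame must exceed a simple sharpness score (a rough proxy for blur)."""
    means = [float(img.mean()) for img in images]
    if max(means) - min(means) > max_brightness_diff * max(means):
        return False
    for img in images:
        sharpness = np.diff(img, n=2, axis=0).var() + np.diff(img, n=2, axis=1).var()
        if sharpness < min_sharpness:
            return False
    return True
```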
- the processor 40 changes the processing content in the imaging process when the determination is negative in the determination process.
- For example, first stop processing is performed to stop the capturing of the subject image by the imaging sensor 20.
- The changed processing content of the imaging process includes, for example, a mode of stopping the imaging process, a mode of re-executing a part of the imaging process, and a mode of changing the imaging condition settings (such as the shutter speed, the aperture amount of the imaging lens, the sensitivity of the imaging element, and the sensitivity of the moving mechanism).
- the processor 40 performs display processing for temporal display regarding imaging processing or synthesis processing.
- the processor 40 controls the display 15 to provide temporal displays regarding imaging processing or compositing processing on the display 15 .
- the temporal display is a user-recognizable display of the execution time of the imaging process or the synthesizing process.
- the temporal display is, for example, a remaining time display that indicates the time until the end of the imaging process and the compositing process.
- the temporal display may display the remaining time of the imaging process and the remaining time of the synthesis process separately, or display the remaining time until the super-resolution processing combining the imaging process and the synthesis process is completed.
- the temporal display is not limited to a display mode in which the remaining time is directly displayed, and may be a display mode in which the number of remaining images or the like is displayed.
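- A remaining-time or percentage value of the kind described above could be derived as in the following sketch; the per-frame and synthesis time estimates passed in are assumptions made for illustration.

```python
def remaining_time_s(frames_done, frames_total, per_frame_s, synth_s, synth_elapsed_s=0.0):
    """Rough remaining-time estimate that could back a temporal display."""
    capture_left = max(frames_total - frames_done, 0) * per_frame_s
    synth_left = max(synth_s - synth_elapsed_s, 0.0)
    return capture_left + synth_left

def progress_percent(elapsed_s, total_s):
    """Percentage-style progress value for the same display."""
    return min(100, int(round(100.0 * elapsed_s / total_s))) if total_s > 0 else 100
```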
- The processor 40 performs a first stop process of stopping the imaging of the subject image by the imaging sensor 20 when the determination is negative in the determination process, and, when the imaging of the subject image is stopped, also performs a second stop process of stopping the temporal display.
- the processor 40 performs notification processing for notifying the user of the reason for stopping the imaging of the subject image or the reason for stopping the temporal display when the determination is negative in the determination processing.
- the processor 40 notifies the user of the cancellation reason by controlling the display 15 and displaying the cancellation reason on the display 15 .
- FIG. 3 illustrates an example of imaging processing and synthesis processing in the multi-shot synthesis mode.
- N images P are acquired by the imaging sensor 20 in the imaging process.
- the imaging apparatus 10 shakes due to hand shake of the user or the like, and thus the positions of the subject images SI appearing in the respective images P are shifted.
- each area forming the subject image SI includes a plurality of pieces of color information.
- the area PA in the subject image SI contains all the R, G, and B color information.
- Since each area forming the subject image SI contains a plurality of pieces of color information, a high-quality super-resolution image PS can be obtained by synthesizing the N images P.
- the super-resolution image PS is displayed on the display 15 under the control of the processor 40 .
- FIG. 4 shows an example of temporal display performed in the multi-shot synthesis mode.
- The display 15 displays, as a percentage, the ratio of the elapsed processing time to the total processing time for each of the imaging process and the compositing process. In addition to the percentage display, the remaining time of each process may be displayed.
- FIG. 5 shows an example of notification performed in the multi-shot synthesis mode.
- the processor 40 displays on the display 15 the fact that the imaging process has been stopped and the reason for stopping the imaging when the imaging is stopped during the imaging process.
- the reason for cancellation is, for example, a change in brightness during imaging processing. For example, if the brightness of the subject image changes due to changes in lighting caused by flickering of fluorescent lights, etc., there is a possibility that the compositing process cannot be performed due to differences in brightness between the multiple images to be composited.
- Other reasons for canceling the imaging process or the compositing process include the case where an unacceptably large shake is applied to the imaging device 10 and the case where the subject has moved significantly.
- FIG. 6 is a flow chart showing an example of a series of operations in the multi-shot synthesis mode.
- the processor 40 determines whether or not the user has issued an imaging instruction by operating the release button 14 (step S10).
- When the processor 40 determines that an imaging instruction has been given (step S10: YES), it starts the above-described temporal display (step S11) and causes the imaging sensor 20 to capture the subject image (step S12).
- The processor 40 determines whether the image acquired by the imaging sensor 20 satisfies the above-described quality condition (step S13). If the quality condition is satisfied (step S13: YES), the processor 40 determines whether or not the imaging sensor 20 has finished imaging a predetermined number of times (here, N times) (step S14). If the N times of imaging have not been completed (step S14: NO), the processor 40 returns the process to step S12 and causes the imaging sensor 20 to capture the subject image again.
- When the N times of imaging are completed (step S14: YES), the processor 40 causes the image processing unit 41 to perform the above-described synthesis processing (step S15), and when the synthesis processing by the image processing unit 41 is completed, terminates the temporal display (step S16).
- the processor 40 stores the super-resolution image generated by the image processing unit 41 in the memory 45 and displays it on the display 15 (step S17).
- When the processor 40 determines in step S13 that the quality condition is not satisfied (step S13: NO), it causes the imaging sensor 20 to stop capturing the subject image (step S18). In addition, with the imaging of the subject image stopped, the processor 40 stops the temporal display (step S19). Then, the processor 40 notifies the user of the reason for the cancellation by displaying the reason on the display 15 (step S20).
- Step S12 is an example of the "imaging step" according to the technology of the present disclosure.
- Step S15 is an example of the "synthesis step” according to the technology of the present disclosure.
- Step S11 is an example of a “display step” according to the technology of the present disclosure.
- Step S14 is an example of a “change step” according to the technology of the present disclosure.
- Step S13 is an example of a "determining step” according to the technology of the present disclosure.
- Step S18 is an example of the "first stop step” according to the technology of the present disclosure.
- Step S19 is an example of the “second stop step” according to the technology of the present disclosure.
- Step S20 is an example of a "notification step” according to the technology of the present disclosure.
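- The flow of FIG. 6 can be summarized, purely for illustration, by the sketch below; the `camera` and `display` objects and all of their methods are hypothetical stand-ins, not an API of the imaging device 10.

```python
def multi_shot_sequence(camera, display, n_frames):
    """Illustrative control flow mirroring steps S11 to S20 of FIG. 6."""
    display.start_temporal_display()                       # S11
    frames = []
    while len(frames) < n_frames:                          # S14 loop over N captures
        frame = camera.capture()                           # S12
        if not camera.meets_quality(frame):                # S13
            camera.stop_capture()                          # S18
            display.stop_temporal_display()                # S19
            display.show_reason("brightness changed during capture")  # S20
            return None
        frames.append(frame)
    result = camera.synthesize(frames)                     # S15
    display.stop_temporal_display()                        # S16
    display.show_image(result)                             # S17 (also stored in memory)
    return result
```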
- the imaging device 10 of the present embodiment performs temporal display during execution of super-resolution processing in the multi-shot synthesis mode. Since super-resolution processing includes imaging processing and synthesis processing, the processing takes a long time, so the user may have doubts about whether the processing is progressing normally.
- the image capturing apparatus 10 of the present embodiment can make the user recognize the processing status by performing the temporal display, thereby alleviating the user's doubts.
- The multi-shot synthesis mode described above is a mode in which the change processing changes the relative position between the subject image and the imaging sensor 20 by using the shaking applied to the imaging device 10 due to the user's camera shake or the like (hereinafter referred to as the "shake-utilizing multi-shot synthesis mode").
- In contrast, the multi-shot synthesis mode of this modified example is a mode in which the change processing changes the relative position between the subject image and the imaging sensor 20 a plurality of times by actively moving the imaging sensor 20 with the mechanical image stabilization mechanism 43 (hereinafter referred to as the "pixel-shifted multi-shot synthesis mode").
- the processor 40 causes the image sensor 20 to perform image capturing multiple times while finely moving the image sensor 20 in a direction orthogonal to the optical axis.
- the image processing unit 41 generates a super-resolution image by synthesizing a plurality of images acquired by the imaging sensor 20 .
- the pixel-shifted multi-shot synthesis mode is known from Japanese Patent Application Laid-Open No. 2016-171511, Japanese Patent Application Laid-Open No. 2019-161564, and the like.
- In the pixel-shifted multi-shot synthesis mode, it is undesirable for the imaging device 10 to be shaken by the user's hand shake or the like, so it is preferable that the imaging device 10 be used while fixed on a tripod or the like.
- FIG. 7 shows an example of imaging processing in the pixel-shifted multi-shot synthesis mode.
- the example shown in FIG. 7 is an example in which a total of four images P are acquired while shifting the imaging sensor 20 one pixel at a time in the X direction or the Y direction.
- By synthesizing the four images P acquired in this way, each area constituting the subject image SI comes to include all of the R, G, and B color information, and a super-resolution image PS is obtained. Note that the number of images to be acquired is not limited to four and can be changed as appropriate, for example to nine.
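- For illustration, merging four pixel-shifted raw frames so that every site gains R, G, and B information could look like the sketch below; the `merge_pixel_shift` function, the shift sign convention, and the `cfa` argument (for example, the Bayer pattern sketched earlier) are assumptions, not the synthesis processing of the patent.

```python
import numpy as np

def merge_pixel_shift(frames, offsets, cfa):
    """Illustrative merge of pixel-shifted raw frames.
    frames: four HxW raw frames; offsets: their (dy, dx) one-pixel sensor shifts;
    cfa: HxW array of 'R'/'G'/'B' giving the filter color at each sensor site."""
    h, w = frames[0].shape
    rgb = np.zeros((h, w, 3))
    count = np.zeros((h, w, 3))
    channels = {'R': 0, 'G': 1, 'B': 2}
    for frame, (dy, dx) in zip(frames, offsets):
        # roll the samples (and the filter colors that produced them) back onto
        # a common scene-aligned grid; the sign convention is assumed
        aligned = np.roll(frame, shift=(-dy, -dx), axis=(0, 1))
        colors = np.roll(cfa, shift=(-dy, -dx), axis=(0, 1))
        for name, c in channels.items():
            mask = colors == name
            rgb[..., c][mask] += aligned[mask]
            count[..., c][mask] += 1
    count[count == 0] = 1
    return rgb / count
```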
- FIG. 8 is a flowchart showing an example of a series of operations in the pixel-shifted multi-shot synthesis mode. Steps S30 to S37 shown in FIG. 8 are the same as steps S10 to S17 shown in FIG. 6. However, in this modified example, the processor 40 moves the imaging sensor 20 by one pixel at a time as shown in FIG. 7. Also, in step S34, the processor 40 determines whether or not imaging has been completed at the four positions shown in FIG. 7.
- When the processor 40 determines in step S33 that the quality condition is not satisfied (step S33: NO), it causes the imaging sensor 20 to perform imaging again at the same position (step S38). As a result of this re-imaging, the number of times of imaging becomes larger than the originally planned number of times of imaging (four times), and the time required for the imaging process becomes longer. Therefore, the processor 40 updates the temporal display regarding the imaging process (step S39). Then, the processor 40 notifies the user of the update reason by causing the display 15 to display the reason for updating the temporal display (step S40). After that, the processor 40 returns the process to step S33 and determines whether or not the image acquired by the re-imaging satisfies the quality condition.
- Step S39 is an example of the "update step" according to the technology of the present disclosure.
- As described above, in this modified example, the relative position between the subject image and the imaging sensor 20 is changed by moving the imaging sensor 20 to a predetermined position using the mechanical vibration isolation mechanism 43.
- When the determination is negative in the determination process, the processor 40 increases the number of times the subject image is captured in the imaging process compared to when the determination is affirmative.
- The processor 40 updates the temporal display in the display process when the number of times the subject image is captured is increased, and performs notification processing to notify the user of the reason for increasing the number of times of imaging in the imaging process or the reason for updating the temporal display.
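- The modified capture loop with re-imaging and an updated temporal display might be sketched as follows; the `camera`/`display` API and the simple remaining-time estimate are hypothetical and only mirror steps S38 to S40 conceptually.

```python
def capture_with_retry(camera, display, positions, per_frame_s):
    """Illustrative pixel-shift capture loop: on a failed quality check the frame
    is re-captured at the same position (S38), the temporal display is extended
    (S39), and the reason is shown (S40)."""
    frames = []
    for pos in positions:
        camera.move_sensor(pos)                            # change step
        while True:
            frame = camera.capture()                       # imaging step
            if camera.meets_quality(frame):                # determination step
                frames.append(frame)
                break
            # one extra capture is now needed, so the remaining time grows
            remaining = (len(positions) - len(frames) + 1) * per_frame_s
            display.update_temporal_display(remaining)     # update step
            display.show_reason("brightness changed; re-capturing this frame")
    return frames
```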
- FIG. 9 shows an example of updating the temporal display when the number of times the subject image is captured is increased.
- When the number of times of imaging is increased, the remaining time of the imaging process becomes longer. Therefore, as shown in FIG. 9, the processor 40 changes the percentage display regarding the imaging process. Also, in the example shown in FIG. 9, the processor 40 displays on the display 15 that the brightness has changed as the reason for updating the temporal display.
- the processor 40 causes the imaging sensor 20 to perform re-imaging when it is determined in step S33 that the quality condition is not satisfied, but re-imaging may not be performed.
- the processor 40 may cause the image processing section 41 to perform synthesizing processing using only images that satisfy the quality condition.
- The user may be able to select the shake-utilizing multi-shot synthesis mode (hereinafter referred to as the first mode) and the pixel-shifted multi-shot synthesis mode (hereinafter referred to as the second mode) using the operation unit 42. That is, the processor 40 may selectively execute the first mode and the second mode. In this case, the processor 40 changes the temporal display in the display process to different contents between the first mode and the second mode when the determination is negative in the determination process. For example, the processor 40 stops the temporal display in the first mode and updates the temporal display in the second mode.
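- A mode-dependent reaction of this kind could be expressed as in the sketch below; the mode labels and the display methods are hypothetical and only illustrate the branching described above.

```python
def on_negative_determination(mode, display, remaining_s):
    """Illustrative mode-dependent handling of a negative determination: stop the
    temporal display in the first (shake-utilizing) mode, update it in the
    second (pixel-shifted) mode."""
    if mode == "first":
        display.stop_temporal_display()
        display.show_reason("imaging stopped")
    elif mode == "second":
        display.update_temporal_display(remaining_s)
        display.show_reason("re-capturing; remaining time extended")
```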
- In the above embodiment, each time the imaging sensor 20 captures the subject image, it is determined whether or not the captured image satisfies the quality condition.
- Alternatively, the imaging sensor 20 may first perform the plurality of imaging operations to acquire the predetermined number of images, and it may then be determined whether or not each image satisfies the quality condition.
- In the above embodiment, the processor 40 provides the temporal display and the display of the reason on the display 15, but the temporal display and the display of the reason may also be provided on a display device other than the display 15.
- the imaging device 10 may be capable of tethered imaging in which imaging is performed while connected to the personal computer 50 by wire or wirelessly. In this case, the combining process described above may be performed within the personal computer 50 . Also, the display 52 of the personal computer 50 may display the above-described time and reason. Note that the imaging device 10 may be connectable to an external device other than the personal computer 50 .
- As the hardware structure of the control unit, for which the processor 40 is one example, the following various processors can be used.
- The various processors include a CPU, which is a general-purpose processor that functions by executing software (a program); a programmable logic device (PLD), such as an FPGA, whose circuit configuration can be changed after manufacture; and a dedicated electric circuit, such as an ASIC, which is a processor having a circuit configuration designed exclusively to execute specific processing.
- The control unit may be configured by one of these various processors, or by a combination of two or more processors of the same type or different types (for example, a combination of a plurality of FPGAs or a combination of a CPU and an FPGA). A plurality of control units may also be configured by one processor.
- There are several possible examples of configuring a plurality of control units with a single processor.
- In a first example, as typified by computers such as clients and servers, one or more CPUs and software are combined to form one processor, and this processor functions as a plurality of control units.
- In a second example, as typified by a System on Chip (SoC), a processor that implements the functions of the entire system including the plurality of control units with a single IC chip is used.
- As the hardware structure of these various processors, an electric circuit combining circuit elements such as semiconductor elements can be used.
- 10 Imaging device, 11 Body, 11A Camera-side mount, 11B Electrical contact, 12 Imaging lens, 12A Lens-side mount, 12B Electrical contact, 13 Dial, 14 Release button, 15 Display, 17 Viewfinder, 18 Viewfinder eyepiece, 20 Imaging sensor, 20A Light-receiving surface, 30 Objective lens, 31 Focus lens, 32 Rear end lens, 34 Lens drive control unit, 40 Processor, 41 Image processing unit, 42 Operation unit, 43 Mechanical anti-vibration mechanism, 44 Shake detection sensor, 45 Memory, 45A Operating program, 50 Personal computer, 52 Display, AX X-axis, AY Y-axis, AZ Z-axis, P Image, PA Region, PS Super-resolution image, SI Subject image
Landscapes
- Engineering & Computer Science (AREA)
- Multimedia (AREA)
- Signal Processing (AREA)
- Physics & Mathematics (AREA)
- General Physics & Mathematics (AREA)
- Computing Systems (AREA)
- Theoretical Computer Science (AREA)
- Studio Devices (AREA)
Abstract
Description
Claims (13)
- 1. An imaging method used in an imaging apparatus comprising an imaging element that captures a subject image and a moving mechanism capable of changing a relative position between the subject image and the imaging element, the imaging method comprising: a changing step of changing the relative position a plurality of times; an imaging step of acquiring a plurality of first images by capturing the subject image with the imaging element at the plurality of relative positions; a synthesizing step of generating a second image by synthesizing the plurality of first images; and a display step of performing a temporal display relating to the imaging step or the synthesizing step.
- 2. The imaging method according to claim 1, wherein the temporal display is a display from which an execution time of the imaging step or the synthesizing step can be recognized.
- 3. The imaging method according to claim 1 or 2, further comprising a determination step of determining whether the synthesizing step can be executed based on whether at least one of the plurality of first images satisfies a first condition, wherein the synthesizing step is executed when the determination in the determination step is affirmative, and processing content of the imaging step is changed when the determination is negative.
- 4. The imaging method according to claim 3, further comprising a first stopping step of stopping the capturing of the subject image in the imaging step when the determination in the determination step is negative.
- 5. The imaging method according to claim 4, further comprising a second stopping step of stopping the temporal display in the display step when the capturing of the subject image is stopped.
- 6. The imaging method according to claim 5, further comprising a notification step of notifying a user of a reason for stopping the capturing of the subject image or a reason for stopping the temporal display.
- 7. The imaging method according to any one of claims 1 to 6, wherein, in the changing step, the relative position is changed by using shaking applied to the imaging apparatus.
- 8. The imaging method according to claim 3, wherein, when the determination in the determination step is negative, the number of times the subject image is captured in the imaging step is made larger than when the determination is affirmative.
- 9. The imaging method according to claim 8, further comprising an updating step of updating the temporal display in the display step when the number of times the subject image is captured in the imaging step is increased.
- 10. The imaging method according to claim 9, further comprising a notification step of notifying a user of a reason for increasing the number of times of imaging in the imaging step or a reason for updating the temporal display.
- 11. The imaging method according to any one of claims 8 to 10, wherein, in the changing step, the relative position is changed by moving the imaging element to a predetermined position using the moving mechanism.
- 12. An imaging apparatus comprising: an imaging element that captures a subject image; a moving mechanism capable of changing a relative position between the subject image and the imaging element; and a processor, wherein the processor executes: change processing of changing the relative position a plurality of times; imaging processing of acquiring a plurality of first images by capturing the subject image with the imaging element at the plurality of relative positions; synthesis processing of generating a second image by synthesizing the plurality of first images; and display processing of performing a temporal display relating to the imaging processing or the synthesis processing.
- 13. The imaging apparatus according to claim 12, wherein the processor is capable of selectively executing a first mode and a second mode, the relative position is changed in the change processing by using shaking applied to the imaging apparatus in the first mode, the relative position is changed in the change processing by moving the imaging element to a predetermined position using the moving mechanism in the second mode, the processor executes determination processing of determining whether the synthesis processing can be executed based on whether at least one of the plurality of first images satisfies a first condition, and, when the determination in the determination processing is negative, the temporal display in the display processing is changed to contents that differ between the first mode and the second mode.
Priority Applications (3)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
JP2023529655A JPWO2022264691A1 (ja) | 2021-06-17 | 2022-04-25 | |
CN202280041593.9A CN117501707A (zh) | 2021-06-17 | 2022-04-25 | 摄像方法及摄像装置 |
US18/516,911 US20240089588A1 (en) | 2021-06-17 | 2023-11-21 | Imaging method and imaging apparatus |
Applications Claiming Priority (2)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
JP2021-100823 | 2021-06-17 | ||
JP2021100823 | 2021-06-17 |
Related Child Applications (1)
Application Number | Title | Priority Date | Filing Date |
---|---|---|---|
US18/516,911 Continuation US20240089588A1 (en) | 2021-06-17 | 2023-11-21 | Imaging method and imaging apparatus |
Publications (1)
Publication Number | Publication Date |
---|---|
WO2022264691A1 (ja) | 2022-12-22 |
Family
ID=84527330
Family Applications (1)
Application Number | Title | Priority Date | Filing Date |
---|---|---|---|
PCT/JP2022/018701 WO2022264691A1 (ja) | 2021-06-17 | 2022-04-25 | 撮像方法及び撮像装置 |
Country Status (4)
Country | Link |
---|---|
US (1) | US20240089588A1 (ja) |
JP (1) | JPWO2022264691A1 (ja) |
CN (1) | CN117501707A (ja) |
WO (1) | WO2022264691A1 (ja) |
-
2022
- 2022-04-25 CN CN202280041593.9A patent/CN117501707A/zh active Pending
- 2022-04-25 WO PCT/JP2022/018701 patent/WO2022264691A1/ja active Application Filing
- 2022-04-25 JP JP2023529655A patent/JPWO2022264691A1/ja active Pending
-
2023
- 2023-11-21 US US18/516,911 patent/US20240089588A1/en active Pending
Patent Citations (5)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
JP2007135133A (ja) * | 2005-11-14 | 2007-05-31 | Nikon Corp | 撮像装置 |
WO2010032649A1 (ja) * | 2008-09-16 | 2010-03-25 | 三洋電機株式会社 | 画像表示装置及び撮像装置 |
JP2014123881A (ja) * | 2012-12-21 | 2014-07-03 | Canon Inc | 情報処理装置、情報処理方法、コンピュータプログラム |
WO2016147957A1 (ja) * | 2015-03-13 | 2016-09-22 | リコーイメージング株式会社 | 撮像装置および撮像方法 |
JP2019191258A (ja) * | 2018-04-19 | 2019-10-31 | オリンパス株式会社 | 撮像装置、撮像プログラム、撮像方法 |
Also Published As
Publication number | Publication date |
---|---|
CN117501707A (zh) | 2024-02-02 |
JPWO2022264691A1 (ja) | 2022-12-22 |
US20240089588A1 (en) | 2024-03-14 |
Legal Events
Date | Code | Title | Description |
---|---|---|---|
121 | Ep: the epo has been informed by wipo that ep was designated in this application |
Ref document number: 22824686 Country of ref document: EP Kind code of ref document: A1 |
|
WWE | Wipo information: entry into national phase |
Ref document number: 2023529655 Country of ref document: JP |
|
WWE | Wipo information: entry into national phase |
Ref document number: 202280041593.9 Country of ref document: CN |
|
NENP | Non-entry into the national phase |
Ref country code: DE |
|
122 | Ep: pct application non-entry in european phase |
Ref document number: 22824686 Country of ref document: EP Kind code of ref document: A1 |