WO2020202684A1 - Image processing device and method, program, and imaging device

Image processing device and method, program, and imaging device

Info

Publication number
WO2020202684A1
Authority
WO
WIPO (PCT)
Prior art keywords
image
unit
sub
main
imaging
Prior art date
Application number
PCT/JP2020/000146
Other languages
English (en)
Japanese (ja)
Inventor
仁史 八木澤 (Hitoshi Yagisawa)
Original Assignee
Sony Corporation (ソニー株式会社)
Priority date
Filing date
Publication date
Application filed by Sony Corporation
Priority to US17/441,133 (US11743576B2)
Priority to CN202080023014.9A (CN113632449A)
Publication of WO2020202684A1

Classifications

    • H ELECTRICITY
    • H04 ELECTRIC COMMUNICATION TECHNIQUE
    • H04N PICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N23/00 Cameras or camera modules comprising electronic image sensors; Control thereof
    • H04N23/45 Cameras or camera modules comprising electronic image sensors; Control thereof for generating image signals from two or more image sensors being of different type or operating in different modes, e.g. with a CMOS sensor for moving images in combination with a charge-coupled device [CCD] for still images
    • H ELECTRICITY
    • H04 ELECTRIC COMMUNICATION TECHNIQUE
    • H04N PICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N23/00 Cameras or camera modules comprising electronic image sensors; Control thereof
    • H04N23/60 Control of cameras or camera modules
    • H04N23/63 Control of cameras or camera modules by using electronic viewfinders
    • H04N23/631 Graphical user interfaces [GUI] specially adapted for controlling image capture or setting capture parameters
    • H04N23/632 Graphical user interfaces [GUI] specially adapted for controlling image capture or setting capture parameters for displaying or modifying preview images prior to image capturing, e.g. variety of image resolutions or capturing parameters
    • G PHYSICS
    • G03 PHOTOGRAPHY; CINEMATOGRAPHY; ANALOGOUS TECHNIQUES USING WAVES OTHER THAN OPTICAL WAVES; ELECTROGRAPHY; HOLOGRAPHY
    • G03B APPARATUS OR ARRANGEMENTS FOR TAKING PHOTOGRAPHS OR FOR PROJECTING OR VIEWING THEM; APPARATUS OR ARRANGEMENTS EMPLOYING ANALOGOUS TECHNIQUES USING WAVES OTHER THAN OPTICAL WAVES; ACCESSORIES THEREFOR
    • G03B15/00 Special procedures for taking photographs; Apparatus therefor
    • G PHYSICS
    • G03 PHOTOGRAPHY; CINEMATOGRAPHY; ANALOGOUS TECHNIQUES USING WAVES OTHER THAN OPTICAL WAVES; ELECTROGRAPHY; HOLOGRAPHY
    • G03B APPARATUS OR ARRANGEMENTS FOR TAKING PHOTOGRAPHS OR FOR PROJECTING OR VIEWING THEM; APPARATUS OR ARRANGEMENTS EMPLOYING ANALOGOUS TECHNIQUES USING WAVES OTHER THAN OPTICAL WAVES; ACCESSORIES THEREFOR
    • G03B17/00 Details of cameras or camera bodies; Accessories therefor
    • G03B17/02 Bodies
    • G03B17/04 Bodies collapsible, foldable or extensible, e.g. book type
    • G PHYSICS
    • G03 PHOTOGRAPHY; CINEMATOGRAPHY; ANALOGOUS TECHNIQUES USING WAVES OTHER THAN OPTICAL WAVES; ELECTROGRAPHY; HOLOGRAPHY
    • G03B APPARATUS OR ARRANGEMENTS FOR TAKING PHOTOGRAPHS OR FOR PROJECTING OR VIEWING THEM; APPARATUS OR ARRANGEMENTS EMPLOYING ANALOGOUS TECHNIQUES USING WAVES OTHER THAN OPTICAL WAVES; ACCESSORIES THEREFOR
    • G03B19/00 Cameras
    • G03B19/02 Still-picture cameras
    • G03B19/04 Roll-film cameras
    • G03B19/06 Roll-film cameras adapted to be loaded with more than one film, e.g. with exposure of one or the other at will
    • G PHYSICS
    • G03 PHOTOGRAPHY; CINEMATOGRAPHY; ANALOGOUS TECHNIQUES USING WAVES OTHER THAN OPTICAL WAVES; ELECTROGRAPHY; HOLOGRAPHY
    • G03B APPARATUS OR ARRANGEMENTS FOR TAKING PHOTOGRAPHS OR FOR PROJECTING OR VIEWING THEM; APPARATUS OR ARRANGEMENTS EMPLOYING ANALOGOUS TECHNIQUES USING WAVES OTHER THAN OPTICAL WAVES; ACCESSORIES THEREFOR
    • G03B37/00 Panoramic or wide-screen photography; Photographing extended surfaces, e.g. for surveying; Photographing internal surfaces, e.g. of pipe
    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06T IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T5/00 Image enhancement or restoration
    • G06T5/50 Image enhancement or restoration using two or more images, e.g. averaging or subtraction
    • H ELECTRICITY
    • H04 ELECTRIC COMMUNICATION TECHNIQUE
    • H04N PICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N23/00 Cameras or camera modules comprising electronic image sensors; Control thereof
    • H04N23/50 Constructional details
    • H04N23/53 Constructional details of electronic viewfinders, e.g. rotatable or detachable
    • H04N23/531 Constructional details of electronic viewfinders, e.g. rotatable or detachable being rotatable or detachable
    • H ELECTRICITY
    • H04 ELECTRIC COMMUNICATION TECHNIQUE
    • H04N PICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N23/00 Cameras or camera modules comprising electronic image sensors; Control thereof
    • H04N23/60 Control of cameras or camera modules
    • H04N23/69 Control of means for changing angle of the field of view, e.g. optical zoom objectives or electronic zooming
    • H ELECTRICITY
    • H04 ELECTRIC COMMUNICATION TECHNIQUE
    • H04N PICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N23/00 Cameras or camera modules comprising electronic image sensors; Control thereof
    • H04N23/90 Arrangement of cameras or camera modules, e.g. multiple cameras in TV studios or sports stadiums
    • H ELECTRICITY
    • H04 ELECTRIC COMMUNICATION TECHNIQUE
    • H04N PICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N23/00 Cameras or camera modules comprising electronic image sensors; Control thereof
    • H04N23/95 Computational photography systems, e.g. light-field imaging systems
    • H04N23/951 Computational photography systems, e.g. light-field imaging systems by using two or more images to influence resolution, frame rate or aspect ratio
    • H ELECTRICITY
    • H04 ELECTRIC COMMUNICATION TECHNIQUE
    • H04N PICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N5/00 Details of television systems
    • H04N5/222 Studio circuitry; Studio devices; Studio equipment
    • H04N5/262 Studio circuits, e.g. for mixing, switching-over, change of character of image, other special effects; Cameras specially adapted for the electronic generation of special effects
    • H04N5/272 Means for inserting a foreground image in a background image, i.e. inlay, outlay
    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06T IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T2200/00 Indexing scheme for image data processing or generation, in general
    • G06T2200/24 Indexing scheme for image data processing or generation, in general involving graphical user interfaces [GUIs]
    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06T IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T2207/00 Indexing scheme for image analysis or image enhancement
    • G06T2207/20 Special algorithmic details
    • G06T2207/20212 Image combination
    • G06T2207/20221 Image fusion; Image merging

Definitions

  • This technology relates to an image processing device, an image processing method, a program, and an imaging device that make it easy to set the composition and check the imaging state.
  • In Patent Document 1, a first image generated by the camera body using the main body lens, and a second image generated by an attachment that is attached to the camera body and uses an attachment lens with an angle of view different from that of the main body lens, are used; the shooting-range frame of the image covering the narrower shooting range is superimposed on the image covering the wider shooting range.
  • In Patent Document 1, although the imaging range of the camera body is made clear, the image generated by the camera body itself cannot be confirmed.
  • It is therefore an object of this technology to provide an image processing device, an image processing method, a program, and an imaging device with which the user can not only easily determine the composition at the time of shooting but also easily confirm the imaging state.
  • The first aspect of this technology is an image processing device including: an image composition unit that, according to the result of an angle-of-view comparison between a main captured image generated by a main imaging unit and a sub-captured image generated by a sub-imaging unit, performs image composition processing using the main captured image and the sub-captured image to generate a display image; and a control unit that causes the image composition unit to perform the image composition processing in response to detection of an image viewing operation on the display image generated by the image composition unit.
  • the angle of view comparison between the main captured image and the sub-captured image is performed by the control unit.
  • the image compositing unit performs image compositing processing using the main captured image and the sub-captured image according to the angle of view comparison result, and generates a display image.
  • a display image is generated by superimposing the main captured image on the sub-captured image or superimposing the reduced sub-captured image on the main captured image.
  • the main captured image is used as the display image.
  • the control unit controls the image composition unit to perform the image composition process in response to detecting the image viewing operation of the display image generated by the image composition unit.
  • The angle of view of the main imaging unit that generates the main captured image, or of the sub-imaging unit that generates the sub-captured image, can be changed.
  • When the angle of view is changed, the angle-of-view comparison result between the main captured image and the sub-captured image is output to the image composition unit.
  • The superimposition size or superimposition position of the main captured image is changed according to the user operation received by the user interface unit.
  • The second aspect of this technology is an image processing method including: generating a display image with an image composition unit by performing image composition processing using a main captured image generated by a main imaging unit and a sub-captured image generated by a sub-imaging unit, according to the result of an angle-of-view comparison between them; and having a control unit cause the image composition unit to perform the image composition processing in response to detection of an image viewing operation on the display image generated by the image composition unit.
  • The third aspect of this technology is a program that causes a computer to generate display images, the program causing the computer to execute: a procedure of performing image composition processing using a main captured image generated by a main imaging unit and a sub-captured image generated by a sub-imaging unit, according to the result of an angle-of-view comparison between them, to generate a display image; and a procedure of performing the image composition processing in response to detection of an image viewing operation on the display image.
  • The program of the present technology can be provided, in a computer-readable format, to a general-purpose computer capable of executing various program codes by a storage medium such as an optical disk, a magnetic disk, or a semiconductor memory, or by a communication medium such as a network. By providing the program in a computer-readable format, processing according to the program is realized on the computer.
  • The fourth aspect of this technology is an imaging device including: a main imaging unit that generates a main captured image; a sub-imaging unit that generates a sub-captured image in an imaging direction corresponding to the imaging direction of the main imaging unit; a control unit that compares the angles of view of the main captured image and the sub-captured image; an image composition unit that generates a display image by performing image composition processing using the main captured image and the sub-captured image according to the result of the angle-of-view comparison performed by the control unit; a display unit that displays the display image generated by the image composition unit; and a detection unit that detects an image viewing operation, wherein the control unit causes the image composition unit to perform the image composition processing in response to the detection unit detecting an image viewing operation on the display image generated by the image composition unit.
  • the main imaging unit generates the main image
  • the sub-imaging unit generates the sub-image in the imaging direction corresponding to the imaging direction of the main imaging unit.
  • the control unit compares the angles of view of the main captured image and the sub-captured image
  • The image composition unit performs image composition processing using the main captured image and the sub-captured image according to the result of the angle-of-view comparison performed by the control unit; for example, one of the main captured image and the sub-captured image is superimposed on the other to generate a display image, which is shown on the display unit.
  • control unit causes the image composition unit to perform image composition processing in response to the detection unit detecting the image viewing operation of the display image generated by the image composition unit.
  • a plurality of display units are provided, and the display unit for displaying the displayed image is switched according to the image confirmation operation of the user.
  • one display unit is a viewfinder, and when the user visually recognizes the viewfinder image, the displayed image is displayed on the viewfinder.
  • A touch panel provided on the screen of the display unit is used as a user interface unit that accepts user operations, and the control unit controls the image composition unit according to the user operation received by the user interface unit to change the superimposition size or superimposition position in the image composition processing.
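  • The superimposition adjustments described above amount to simple geometry updates on the superimposed image. The sketch below is purely illustrative and not taken from this publication: `OverlayRect`, `apply_drag`, and `apply_pinch` are hypothetical names for how a touch-panel drag or pinch received by the user interface unit might be translated into a new superimposition position or size.

```python
from dataclasses import dataclass


@dataclass
class OverlayRect:
    """Position and size of the superimposed (picture-in-picture) image."""
    x: int
    y: int
    w: int
    h: int


def apply_drag(rect: OverlayRect, dx: int, dy: int) -> OverlayRect:
    """Move the superimposed image in response to a touch-panel drag."""
    return OverlayRect(rect.x + dx, rect.y + dy, rect.w, rect.h)


def apply_pinch(rect: OverlayRect, scale: float) -> OverlayRect:
    """Resize the superimposed image in response to a pinch gesture,
    keeping the top-left corner fixed."""
    return OverlayRect(rect.x, rect.y, int(rect.w * scale), int(rect.h * scale))
```

  • For example, dragging by (5, -3) moves a 320x180 overlay at (10, 10) to (15, 7), and a pinch with scale 1.5 grows it to 480x270.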
  • the sub-imaging unit may be fixedly provided to the main body portion provided with the main imaging unit, or may be provided detachably from the main body portion provided with the main imaging unit. Further, the sub-imaging unit is electrically connected to the main body by a connection terminal, a connection cable, or wireless communication. Further, a plurality of sub-imaging units may be provided, and the imaging direction of the plurality of sub-imaging units may be outward with respect to the imaging direction of the main imaging unit.
  • An image pickup device using the image processing device of the present technology has a main image pickup unit, a sub image pickup unit, an image composition unit, a display unit, and a control unit.
  • the main imaging unit generates the main captured image
  • the sub-imaging unit generates the sub-image.
  • the image compositing unit performs image compositing processing of the main captured image and the sub-captured image according to the angle-of-view comparison result of the main captured image and the sub-captured image in the control unit to generate a display image.
  • the display unit displays the display image generated by the image composition unit.
  • FIG. 1 shows a first structural example of the imaging device.
  • FIG. 1A shows a front view
  • FIG. 1B shows a top view.
  • the imaging device 10-1 is provided with a main imaging unit 21 in the main body 11, and a sub-imaging unit 22 for setting a composition or the like is provided in the center of the upper part of the main body 11, for example.
  • the optical axis direction (imaging direction) of the main imaging unit 21 and the sub imaging unit 22 is set to be parallel.
  • the angle of view ⁇ of the sub-imaging unit 22 is wider than the angle of view ⁇ of the main imaging unit 21.
  • the angle of view ⁇ of the sub-imaging unit 22 is set so that the angle of view adjusting range includes a range having a narrower angle of view than the sub-imaging unit 22.
  • a shutter button 271 is provided on the upper surface of the main body 11.
  • FIG. 2 shows a second structural example of the imaging device.
  • FIG. 2A shows a front view
  • FIG. 2B shows a top view.
  • the image pickup apparatus 10-2 is provided with a main imaging unit 21 in the main body 11, and an attachment 12 is provided in the center of the upper surface of the main body 11, for example.
  • A sub-imaging unit 22 can be attached to the attachment 12; when the sub-imaging unit 22 is attached to the attachment 12, the sub-imaging unit 22 is electrically connected to the main body 11, power is supplied from the main body 11 to the sub-imaging unit 22, and an image signal is supplied from the sub-imaging unit 22 to the main body 11.
  • the optical axis direction (imaging direction) of the main imaging unit 21 and the sub-imaging unit 22 mounted on the attachment 12 is set to be parallel.
  • the angle of view ⁇ of the sub-imaging unit 22 is wider than the angle of view ⁇ of the main imaging unit 21.
  • the angle of view ⁇ of the sub-imaging unit 22 is set so that the angle of view adjusting range includes a range having a narrower angle of view than the sub-imaging unit 22.
  • a shutter button 271 is provided on the upper surface of the main body 11.
  • FIG. 3 shows a third structural example of the imaging device.
  • FIG. 3A shows a front view
  • FIG. 3B shows a top view.
  • the imaging device 10-3 is provided with a main imaging unit 21 in the main body 11, and a mounting space SP for the sub-imaging unit 22 is provided in, for example, the upper part of the front surface of the main body 11.
  • The sub-imaging unit 22 is detachable from the mounting space SP, and the sub-imaging unit 22 may be connected to the main body 11 using a connection cable 14 or via a wireless transmission line.
  • The optical axis directions (imaging directions) of the main imaging unit 21 and of the sub-imaging unit 22 provided in the mounting space SP are set to be parallel.
  • the angle of view ⁇ of the sub-imaging unit 22 is wider than the angle of view ⁇ of the main imaging unit 21.
  • the angle of view ⁇ of the sub-imaging unit 22 is set so that the angle of view adjusting range includes a range having a narrower angle of view than the sub-imaging unit 22.
  • a shutter button 271 is provided on the upper surface of the main body 11.
  • FIG. 4 shows a fourth structural example of the imaging device.
  • FIG. 4A shows a front view
  • FIG. 4B shows a top view.
  • the imaging device 10-4 is provided with a main imaging unit 21 in the main body 11, and a sub-imaging unit 22 and a sub-imaging unit 23 are provided, for example, on the front surface of the main body 11.
  • The optical axis directions (imaging directions) of the sub-imaging units 22 and 23 may be set to be parallel to that of the main imaging unit 21, or, as shown in FIG. 4B, may be set to face outward with respect to the optical axis direction of the main imaging unit 21.
  • By providing the sub-imaging units 22 and 23 facing outward in this way, it is possible to prevent the images captured by the sub-imaging units 22 and 23 from being vignetted by the lens or the like of the main imaging unit 21. Further, if the sub-captured images acquired by the sub-imaging units 22 and 23 are combined to generate a sub-captured image Ps, the sub-captured image Ps has an angle of view φ that is wider than the angle of view θ1 of the sub-imaging unit 22 and the angle of view θ2 of the sub-imaging unit 23.
  • The angle of view φ is set to be wider than the angle of view θ of the main imaging unit 21.
  • When the angle of view of the main imaging unit 21 is adjustable, the angles of view θ1 and θ2 and the imaging directions of the sub-imaging units 22 and 23 are set so that the adjustment range includes angles of view narrower than the angle of view φ.
  • a shutter button 271 is provided on the upper surface of the main body 11.
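  • The widened angle of view φ obtained by combining the two outward-facing sub-imaging units can be checked with a small calculation. The sketch below is illustrative only, under the assumption that the two units are tilted symmetrically outward and their fields of view still overlap in the middle; `combined_fov` and its parameters are hypothetical names, not taken from this publication.

```python
def combined_fov(theta1_deg: float, theta2_deg: float,
                 tilt1_deg: float, tilt2_deg: float) -> float:
    """Horizontal span phi covered by two sub-imaging units whose optical
    axes are tilted outward from the main optical axis by tilt1/tilt2
    degrees. Assumes the two fields of view still overlap in the middle
    (each tilt is smaller than half that unit's angle of view)."""
    left_edge = tilt1_deg + theta1_deg / 2    # outermost ray of one unit
    right_edge = tilt2_deg + theta2_deg / 2   # outermost ray of the other
    return left_edge + right_edge


# Two 60-degree units each tilted 15 degrees outward cover 90 degrees,
# wider than either theta1 or theta2 alone.
print(combined_fov(60, 60, 15, 15))  # 90.0
```

  • With zero tilt the combined span collapses to the single-unit angle of view, which is why the outward orientation is what widens φ.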
  • FIG. 5 illustrates the functional configuration of an image pickup apparatus using the image processing apparatus of the present technology.
  • The image pickup device 10 includes a main imaging unit 21, a sub-imaging unit 22, an image composition unit 24, a display unit 25, a viewfinder (VF) 26, a user interface (user I/F) unit 27, and a control unit 35. The imaging device 10 may also have a recording unit 28 and an output unit 29.
  • the main imaging unit 21 includes, for example, an imaging optical system block, an image sensor, a camera signal processing unit, and the like.
  • the imaging optical system block is configured by using a focus lens, a zoom lens, an iris mechanism, or the like, and forms an optical image of a subject on an imaging surface of an image sensor in a desired size.
  • The image sensor is, for example, a CMOS (Complementary Metal Oxide Semiconductor) or CCD (Charge Coupled Device) sensor.
  • The camera signal processing unit performs noise removal processing, gain adjustment processing, analog/digital conversion processing, defective pixel correction, development processing, and the like on the pixel signal generated by the image sensor to generate the main captured image Pm, and outputs it to the image composition unit 24, the recording unit 28, and the output unit 29.
  • the sub-imaging unit 22 has an imaging optical system block, an image sensor, a camera signal processing unit, and the like, similarly to the main imaging unit 21. Further, the imaging optical system block of the sub-imaging unit 22 has a wider angle of view than, for example, the imaging optical system block of the main imaging unit 21.
  • The sub-imaging unit 22 generates the sub-captured image Ps and outputs it to the image composition unit 24.
  • When the angle of view of the main imaging unit 21 is adjustable, the angle of view of the imaging optical system block of the sub-imaging unit 22 is set so that the adjustment range includes angles of view narrower than that of the sub-imaging unit 22.
  • Based on the control signal from the control unit 35, the image composition unit 24 performs image composition processing on the main captured image Pm generated by the main imaging unit 21 and the sub-captured image Ps generated by the sub-imaging unit 22 to generate the display image Pd. The image composition unit 24 outputs the image signal of the generated display image Pd to the display unit 25, the viewfinder 26, or both. The image composition unit 24 may also output the image signal of the display image Pd to only one of the display unit 25 and the viewfinder 26 based on the control signal from the control unit 35.
  • the display unit 25 is configured by using a liquid crystal display element, an organic EL display element, or the like.
  • the display unit 25 displays the display image Pd supplied from the image composition unit 24. Further, the display unit 25 displays the menu of the image pickup device 10 and displays the GUI related to the operation of the user (photographer or the like) based on the control signal from the control unit 35.
  • the viewfinder 26 is configured by using a liquid crystal display element, an organic EL display element, or the like.
  • the viewfinder 26 displays the captured image based on the display image signal supplied from the image synthesizing unit 24.
  • the user interface unit 27 is composed of a shutter button 271, an operation switch, an operation button, and the like.
  • the user interface unit 27 generates an operation signal according to the user operation and outputs it to the control unit 35.
  • The user interface unit 27 may have a detection unit that detects viewing of the viewfinder 26 display, for example an eyepiece detection unit that detects whether the user is looking into the viewfinder 26, and may output a signal indicating the detection result to the control unit 35 as an operation signal.
  • the recording unit 28 is configured by using a recording medium fixed to the image pickup apparatus 10 or a detachable recording medium.
  • the recording unit 28 records the captured image signal generated by the main imaging unit 21 on the recording medium based on the control signal from the control unit 35.
  • the control unit 35 has a CPU (Central Processing Unit), a ROM (Read Only Memory), a RAM (Random Access Memory), and the like.
  • the ROM (Read Only Memory) stores various programs executed by the CPU (Central Processing Unit).
  • RAM (Random Access Memory) stores information such as various parameters.
  • The CPU executes the programs stored in the ROM and controls each unit based on the operation signal from the user interface unit 27 so that the image pickup apparatus 10 operates according to the user operation. Further, the control unit 35 compares the angle of view of the main imaging unit 21 (main captured image Pm) with the angle of view of the sub-imaging unit 22 (sub-captured image Ps), and outputs a control signal indicating the comparison result to the image composition unit 24.
  • When the angle of view changes, the control unit 35 outputs a control signal indicating the new angle-of-view comparison result to the image composition unit 24. Further, the control unit 35 may control the image composition unit 24 to perform the image composition processing when viewing of the viewfinder 26 display is detected based on the operation signal from the user interface unit 27.
  • FIG. 6 is a flowchart illustrating the captured image display operation of the imaging device.
  • In step ST1, the control unit acquires the angles of view of the imaging units.
  • the control unit 35 acquires the angle of view ⁇ of the main imaging unit 21 (main captured image Pm) and the angle of view ⁇ of the sub-imaging unit 22 (sub-image Ps).
  • The angle-of-view information indicating the angle of view θ of the main imaging unit 21 and the angle of view ω of the sub-imaging unit 22 is stored in advance in the control unit 35, for example in the manufacturing process of the imaging device.
  • the angle of view ⁇ may be acquired from the image pickup lens, or the angle of view ⁇ corresponding to the zoom operation may be calculated by the control unit 35. Good.
  • the control unit 35 acquires the angles of view ⁇ and ⁇ and proceeds to step ST2.
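  • When the angle of view θ has to be calculated from the zoom state rather than read from stored information, the standard thin-lens relation θ = 2·atan(d / 2f) can be used, where d is the sensor width and f the current focal length. The function below is an illustrative sketch; this publication does not specify how the control unit 35 computes θ, and the sensor and focal-length values are example figures.

```python
import math


def angle_of_view_deg(sensor_width_mm: float, focal_length_mm: float) -> float:
    """Horizontal angle of view theta from sensor width d and focal
    length f: theta = 2 * atan(d / (2 * f))."""
    return math.degrees(2 * math.atan(sensor_width_mm / (2 * focal_length_mm)))


# Example: a 36 mm wide sensor behind a 50 mm lens gives roughly 39.6
# degrees; zooming out to 18 mm widens the angle of view to 90 degrees.
print(round(angle_of_view_deg(36.0, 50.0), 1))  # 39.6
print(round(angle_of_view_deg(36.0, 18.0), 1))  # 90.0
```

  • Recomputing θ this way on each zoom change is what lets the comparison of step ST2 stay current as the main imaging unit zooms.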
  • In step ST2, the control unit determines whether the angle of view θ is narrower than the angle of view ω.
  • the control unit 35 proceeds to step ST3 when the angle of view ⁇ is narrower than the angle of view ⁇ , and proceeds to step ST4 when the angle of view ⁇ is equal to or greater than the angle of view ⁇ .
  • In step ST3, the control unit sets the image after the superimposition processing as the display image.
  • The control unit 35 controls the image composition unit 24 so that the main captured image Pm generated by the main imaging unit 21 is superimposed on the sub-captured image Ps generated by the sub-imaging unit 22, or the sub-captured image Ps generated by the sub-imaging unit 22 is reduced and superimposed on the main captured image Pm generated by the main imaging unit 21; the image after the superimposition processing is set as the display image Pd, and the process proceeds to step ST5.
  • In step ST4, the control unit sets the main captured image Pm as the display image.
  • the control unit 35 controls the image synthesizing unit 24, sets the main image Pm generated by the main image pickup unit 21 as the display image Pd, and proceeds to step ST5.
  • In step ST5, the control unit performs display processing.
  • The control unit 35 displays the display image Pd set in step ST3 or step ST4 on, for example, the viewfinder 26, and returns to step ST1.
  • In this way, a display image is generated using the main captured image Pm and the sub-captured image Ps, so that the user can use the sub-captured image Ps while checking the main captured image Pm, making it easy to shoot with the optimum composition.
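  • The decision made in steps ST2 through ST4 can be sketched as follows. This is an illustrative rendering of the FIG. 6 flow, not code from this publication; `compose` is a hypothetical stand-in for whatever superimposition processing the image composition unit 24 performs.

```python
def select_display_image(main_image, sub_image, theta, omega, compose):
    """Display-image selection following the FIG. 6 flow: if the main
    angle of view theta is narrower than the sub angle of view omega
    (step ST2), superimpose the two images (step ST3); otherwise use
    the main captured image as-is (step ST4)."""
    if theta < omega:
        return compose(main_image, sub_image)  # superimposition processing
    return main_image
```

  • The point of the branch is that superimposition only helps when the sub-captured image actually shows a wider field than the main captured image; otherwise the main captured image alone is the most informative display.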
  • FIG. 7 illustrates other functional configurations of the imaging apparatus using the present technology.
  • the image pickup device 10 includes a main image pickup unit 21, a sub image pickup unit 22, an image composition unit 24, a display unit 25, a viewfinder 26, a user interface unit 27, an eyepiece detection unit 30, and a control unit 35. Further, the imaging device 10 may have a recording unit 28, an output unit 29, and the like.
  • the main image pickup unit 21 is configured by using an image pickup lens, an image sensor, a camera signal processing unit, and the like, and generates a main image pickup image Pm and outputs it to the image synthesis unit 24 and the recording unit 28. Further, the main imaging unit 21 may output the generated main captured image Pm to the outside.
  • The sub-imaging unit 22 is configured using an imaging lens having a wider angle of view than that of the main imaging unit 21, an image sensor, a camera signal processing unit, and the like, and generates the sub-captured image Ps and outputs it to the image composition unit 24.
  • When the angle of view of the main imaging unit 21 is adjustable, the angle of view of the imaging lens of the sub-imaging unit 22 is set so that the adjustment range includes angles of view narrower than that of the sub-imaging unit 22.
  • Based on the control signal from the control unit 35, the image composition unit 24 performs image composition processing on the main captured image Pm generated by the main imaging unit 21 and the sub-captured image Ps generated by the sub-imaging unit 22 to generate the display image Pd. The image composition unit 24 outputs the image signal of the generated display image Pd to the display unit 25, the viewfinder 26, or both. The image composition unit 24 may also output the image signal of the display image Pd to only one of the display unit 25 and the viewfinder 26 based on the control signal from the control unit 35.
  • the display unit 25 is configured by using a liquid crystal display element, an organic EL display element, or the like.
  • the display unit 25 displays the display image Pd supplied from the image composition unit 24.
  • the display unit 25 displays the menu of the image pickup device 10 and displays the GUI related to the user's operation based on the control signal from the control unit 35.
  • the viewfinder 26 is configured by using a liquid crystal display element, an organic EL display element, or the like.
  • the viewfinder 26 displays the captured image based on the display image signal supplied from the image synthesizing unit 24.
  • In the user interface unit 27, a shutter button 271, operation switches, operation buttons, a GUI configured by providing a touch panel on the screen of the display unit 25, and the like are used.
  • the user interface unit 27 generates an operation signal according to the user operation and outputs it to the control unit 35.
  • the eyepiece detection unit 30 detects the user's image viewing operation, detects whether the user is looking into the viewfinder 26, for example, and outputs the eyepiece detection result to the control unit 35.
  • the recording unit 28 is configured by using a recording medium fixed to the image pickup apparatus 10 or a detachable recording medium.
  • the recording unit 28 records the captured image signal generated by the main imaging unit 21 on the recording medium based on the control signal from the control unit 35.
  • the control unit 35 has a CPU (Central Processing Unit), a ROM (Read Only Memory), a RAM (Random Access Memory), and the like.
  • the ROM (Read Only Memory) stores various programs executed by the CPU (Central Processing Unit).
  • RAM (Random Access Memory) stores information such as various parameters.
  • the CPU executes the various programs stored in the ROM and controls each unit based on operation signals from the user interface unit 27 so that the image pickup apparatus 10 performs operations according to user operations. Further, the control unit 35 compares the angle of view of the main imaging unit 21 (main captured image Pm) with the angle of view of the sub imaging unit 22 (sub-image Ps), and outputs a control signal indicating the comparison result to the image composition unit 24.
  • when the angle of view changes, for example by a zoom operation, the control unit 35 outputs a control signal indicating the new angle-of-view comparison result to the image composition unit 24. Further, based on the detection result of the eyepiece detection unit 30, the control unit 35 causes the image composition unit 24 to perform the image composition process in response to detecting the image viewing operation of the display image generated by the image composition unit 24. Further, the control unit 35 controls the display operation of the display image Pd based on the detection result of the eyepiece detection unit 30.
  • FIG. 8 is a flowchart illustrating other captured image display operations of the imaging device.
  • in step ST11, the control unit 35 acquires the eyepiece detection result from the eyepiece detection unit 30, and the process proceeds to step ST12.
  • in step ST12, the control unit 35 selects the display medium.
  • when the control unit 35 determines, based on the detection result acquired in step ST11, that the user is looking into the viewfinder 26, it selects the viewfinder 26 as the display medium; when it determines that the user is not looking into the viewfinder 26, it selects the display unit 25 as the display medium. The process then proceeds to step ST13.
  • in step ST13, the control unit acquires the angle of view.
  • the control unit 35 acquires the angle of view α of the main imaging unit 21 (main captured image Pm) and the angle of view β of the sub imaging unit 22 (sub-image Ps).
  • the angle-of-view information indicating the angle of view α of the main imaging unit 21 and the angle of view β of the sub imaging unit 22 is stored in the control unit 35 in advance, for example in the manufacturing process of the imaging device.
  • the angle of view α may be acquired from the image pickup lens, or the angle of view α corresponding to a zoom operation may be calculated by the control unit 35.
  • the control unit 35 acquires the angles of view α and β, and the process proceeds to step ST14.
  • in step ST14, the control unit determines whether the angle of view α is narrower than the angle of view β.
  • the control unit 35 proceeds to step ST15 when it determines that the angle of view α is narrower than the angle of view β, and proceeds to step ST18 when the angle of view α is equal to or greater than the angle of view β.
  • in step ST15, the control unit sets the image after the superimposition processing as the display image.
  • the control unit 35 controls the image synthesizing unit 24 to superimpose the main captured image Pm generated by the main imaging unit 21 on the sub-image Ps generated by the sub imaging unit 22, or to reduce the sub-image Ps generated by the sub imaging unit 22 and superimpose it on the main captured image Pm generated by the main imaging unit 21. The image after the superimposition processing is set as the display image Pd, and the process proceeds to step ST16.
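The superimposition in step ST15 can be sketched as pasting one image into another. The function below is an illustrative sketch, not from the patent: it assumes H x W x C uint8 arrays, uses nearest-neighbour sampling as a stand-in for a real image reducer, and assumes the (possibly reduced) overlay fits inside the base image.

```python
import numpy as np

def superimpose(base: np.ndarray, overlay: np.ndarray,
                top: int, left: int, scale: float = 1.0) -> np.ndarray:
    """Paste a (possibly reduced) overlay image onto a copy of the base image.

    `scale` < 1.0 shrinks the overlay by nearest-neighbour sampling
    (a stand-in for a proper resampler such as cv2.resize).
    """
    if scale != 1.0:
        h = max(1, int(overlay.shape[0] * scale))
        w = max(1, int(overlay.shape[1] * scale))
        rows = np.arange(h) * overlay.shape[0] // h   # source row indices
        cols = np.arange(w) * overlay.shape[1] // w   # source column indices
        overlay = overlay[rows][:, cols]
    out = base.copy()                                 # leave the base untouched
    h, w = overlay.shape[:2]
    out[top:top + h, left:left + w] = overlay         # paste the inset
    return out
```

Depending on which angle of view is narrower, `base` would be the sub-image Ps with the main captured image Pm pasted at full size, or the main captured image Pm with a reduced sub-image Ps pasted as an inset.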
  • in step ST16, the control unit determines whether an update operation has been performed.
  • the control unit 35 proceeds to step ST17 when it determines that an update operation of the image composition process has been performed on the user interface unit 27, and proceeds to step ST19 when no update operation is determined to have been performed.
  • in step ST17, the control unit performs image composition update processing. Based on the operation signal from the user interface unit 27, the control unit 35 changes the superimposition size and superimposition position of the main captured image Pm superimposed on the sub-image Ps, or of the sub-image Ps superimposed on the main captured image Pm, generates the display image Pd, and proceeds to step ST19.
  • FIG. 9 illustrates a case where the superimposition position of the main captured image Pm is changed according to a swipe operation.
  • FIG. 9A shows a display image displayed by the viewfinder 26.
  • FIG. 9B shows a swipe operation performed using the touch panel of the user interface unit 27 provided on the screen of the display unit 25.
  • when the swipe operation is performed, the superimposed position of the main captured image Pm moves upward to the position Qv indicated by the broken line.
  • FIG. 10 illustrates a case where the main captured image Pm is enlarged according to the pinch-out operation.
  • FIG. 10A shows a display image Pd displayed by the viewfinder 26.
  • FIG. 10B shows a pinch-out operation performed using the touch panel of the user interface unit 27 provided on the screen of the display unit 25.
  • the main captured image Pm is enlarged in the vertical direction and the horizontal direction.
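The gesture-driven update of step ST17 amounts to translating the inset by the swipe delta and scaling it by the pinch factor. The sketch below is illustrative only; the state fields, function names, and the choice of scaling about the top-left corner are assumptions, not taken from the patent.

```python
from dataclasses import dataclass

@dataclass
class OverlayState:
    """Superimposition position (top-left corner) and size of the inset image."""
    x: int
    y: int
    width: int
    height: int

def apply_swipe(s: OverlayState, dx: int, dy: int) -> OverlayState:
    # A swipe translates the superimposed image by the drag delta.
    return OverlayState(s.x + dx, s.y + dy, s.width, s.height)

def apply_pinch(s: OverlayState, factor: float) -> OverlayState:
    # A pinch-out (factor > 1) enlarges the inset; a pinch-in shrinks it.
    return OverlayState(s.x, s.y, int(s.width * factor), int(s.height * factor))
```

An actual implementation would also clamp the result so the inset stays within the display image.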
  • in step ST18, the control unit sets the main captured image Pm as the display image.
  • the control unit 35 controls the image synthesizing unit 24, sets the main image Pm generated by the main image pickup unit 21 as the display image Pd, and proceeds to step ST19.
  • in step ST19, the control unit performs display processing.
  • the control unit 35 causes the display image Pd to be displayed using the display medium selected in step ST12, and returns to step ST11.
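The flow of steps ST11 to ST19 can be summarized as a single decision function. This is a simplified sketch of the flowchart logic only; the string return values and parameter names are illustrative assumptions (alpha is the main imaging unit's angle of view, beta the sub imaging unit's).

```python
def select_display_image(eye_on_finder: bool, alpha: float, beta: float,
                         update_requested: bool):
    """Mirror steps ST11-ST19: pick the display medium from the eyepiece
    detection result, then choose between a composited image and the main
    captured image by comparing angles of view."""
    medium = "viewfinder" if eye_on_finder else "display"   # ST12
    if alpha < beta:                # ST14: main angle narrower than sub angle
        image = "composited"        # ST15: superimposition result
        if update_requested:        # ST16
            image = "composited(updated)"   # ST17: size/position changed
    else:
        image = "main"              # ST18: main captured image as-is
    return medium, image            # ST19: display on the chosen medium
```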
  • in this way, the main captured image Pm generated by the main imaging unit is superimposed on the sub-image Ps generated by the sub imaging unit, so it becomes possible to shoot with an optimum composition using the sub-image Ps while checking the main captured image Pm.
  • the display image can be displayed on the display medium the user is using. Further, when the main captured image Pm is superimposed on the sub-image Ps, the display size and superimposition position of the main captured image Pm can be changed, so operations using the displayed image become easy.
  • the image synthesizing unit 24 synthesizes the captured images generated by the plurality of sub imaging units to generate a wide-angle sub-image Ps.
  • the angle of view can be widened as compared with the case where one sub-imaging unit generates sub-images Ps.
  • if the optical axes of the plurality of sub imaging units are tilted outward, the angle of view of the sub-image Ps generated by synthesizing the captured images can be widened further.
  • even when an eclipse (vignetting) occurs in one sub imaging unit, the subject area in which the eclipse occurs can be imaged by another sub imaging unit, so it is possible to generate a sub-image Ps in which no eclipse has occurred.
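Synthesizing a wide-angle sub-image Ps from two sub imaging units can be sketched as joining two captured images that share an overlapping region. This is a naive illustration under stated assumptions (equal-height arrays, a known overlap width, no geometric alignment or blending); a real stitcher would register and blend the overlap.

```python
import numpy as np

def stitch_sub_images(left: np.ndarray, right: np.ndarray,
                      overlap: int) -> np.ndarray:
    """Join two sub-images that share `overlap` columns into one wide image.

    Keeps all of `left` and appends the non-overlapping part of `right`.
    """
    assert left.shape[0] == right.shape[0], "sub-images must share height"
    return np.concatenate([left, right[:, overlap:]], axis=1)
```

With outward-tilted optical axes, the overlap shrinks and the combined angle of view grows, matching the effect described above.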
  • the series of processes described in the specification can be executed by hardware, software, or a composite configuration of both.
  • when the processing is executed by software, the program recording the processing sequence is installed in the memory of a computer embedded in dedicated hardware and executed.
  • the program can be pre-recorded on a hard disk, an SSD (Solid State Drive), or a ROM (Read Only Memory) as a recording medium.
  • the program can be temporarily or permanently stored (recorded) on a removable recording medium such as a flexible disk, a CD-ROM (Compact Disc Read Only Memory), an MO (Magneto Optical) disk, a DVD (Digital Versatile Disc), a BD (Blu-ray Disc (registered trademark)), a magnetic disk, or a semiconductor memory card.
  • such a removable recording medium can be provided as so-called package software.
  • the program may be transferred from the download site to the computer wirelessly or by wire via a network such as LAN (Local Area Network) or the Internet.
  • the computer can receive the program transferred in this way and install it on a recording medium such as a built-in hard disk.
  • the image processing apparatus of the present technology can have the following configurations.
  • (1) An image processing device including: an image compositing unit that performs image compositing processing using at least one of the main captured image and the sub captured image according to the angle-of-view comparison result between the main captured image generated by the main imaging unit and the sub captured image generated by the sub imaging unit, and generates a display image; and a control unit that causes the image compositing unit to perform the image compositing process in response to detecting an image viewing operation of the display image generated by the image compositing unit.
  • (4) When a zoom operation is performed by the main imaging unit or the sub imaging unit, the control unit outputs the angle-of-view comparison result between the main captured image and the sub captured image to the image compositing unit.
  • (5) The image processing apparatus according to any one of (1) to (4), wherein the image compositing unit performs an image compositing process using the main captured image and the sub captured image to generate the display image.
  • (6) The image processing apparatus according to (5), wherein the image synthesizing unit superimposes the reduced sub captured image on the main captured image to generate the display image.
  • the image processing apparatus changes the superimposition size or superimposition position according to the user operation received by the user interface unit.
  • (9) The image processing apparatus according to any one of (1) to (8), wherein the image synthesizing unit sets the main captured image as the display image.
  • (10) The image processing apparatus according to any one of (1) to (9), wherein the sub captured image is an image generated by the sub imaging unit whose imaging direction corresponds to the imaging direction of the main imaging unit.
  • 10 ... Imaging device, 11 ... Main body, 12 ... Attachment, 21 ... Main imaging unit, 22, 23 ... Sub imaging unit, 24 ... Image composition unit, 25 ... Display unit, 26 ... Viewfinder, 27 ... User interface unit, 28 ... Recording unit, 29 ... Output unit, 30 ... Eyepiece detection unit, 35 ... Control unit


Abstract

A control unit (35) compares the angle of view between a main captured image generated by a main imaging unit (21) and a sub-image generated by a sub imaging unit (22). When the angle-of-view comparison result of the control unit (35) indicates that the angle of view of the main captured image is narrower than the angle of view of the sub-image, an image composition unit (24) superimposes the main captured image on the sub-image, or superimposes a reduced version of the sub-image on the main captured image, and generates a display image. When the angle of view of the main captured image is not narrower than the angle of view of the sub-image, the main captured image is used as the display image. A composition can easily be determined using the sub-image at the time of shooting. Moreover, the imaging state can be confirmed from the main captured image.
PCT/JP2020/000146 2019-03-29 2020-01-07 Dispositif et procédé de traitement d'image, programme et dispositif d'imagerie WO2020202684A1 (fr)

Priority Applications (2)

Application Number Priority Date Filing Date Title
US17/441,133 US11743576B2 (en) 2019-03-29 2020-01-07 Image processing apparatus, image processing method, program, and imaging apparatus
CN202080023014.9A CN113632449A (zh) 2019-03-29 2020-01-07 图像处理装置、图像处理方法、程序和成像装置

Applications Claiming Priority (2)

Application Number Priority Date Filing Date Title
JP2019-066235 2019-03-29
JP2019066235A JP2020167518A (ja) 2019-03-29 2019-03-29 画像処理装置と画像処理方法およびプログラムと撮像装置

Publications (1)

Publication Number Publication Date
WO2020202684A1 true WO2020202684A1 (fr) 2020-10-08

Family

ID=72668011

Family Applications (1)

Application Number Title Priority Date Filing Date
PCT/JP2020/000146 WO2020202684A1 (fr) 2019-03-29 2020-01-07 Dispositif et procédé de traitement d'image, programme et dispositif d'imagerie

Country Status (4)

Country Link
US (1) US11743576B2 (fr)
JP (1) JP2020167518A (fr)
CN (1) CN113632449A (fr)
WO (1) WO2020202684A1 (fr)

Citations (5)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
JP2012029245A (ja) * 2010-07-27 2012-02-09 Sanyo Electric Co Ltd 撮像装置
JP2012042805A (ja) * 2010-08-20 2012-03-01 Olympus Imaging Corp 撮像装置
WO2012120952A1 (fr) * 2011-03-04 2012-09-13 富士フイルム株式会社 Dispositif de photographie et procédé de commande d'affichage
JP2015136096A (ja) * 2013-12-20 2015-07-27 パナソニックIpマネジメント株式会社 撮像装置
WO2017200049A1 (fr) * 2016-05-20 2017-11-23 日立マクセル株式会社 Appareil de capture d'image et fenêtre de réglage associée

Family Cites Families (10)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US6930718B2 (en) * 2001-07-17 2005-08-16 Eastman Kodak Company Revised recapture camera and method
JP4517664B2 (ja) * 2004-02-12 2010-08-04 ソニー株式会社 画像処理装置および方法、プログラム、並びに記録媒体
JP4986189B2 (ja) * 2010-03-31 2012-07-25 カシオ計算機株式会社 撮像装置、及びプログラム
JP2013235195A (ja) 2012-05-10 2013-11-21 Casio Comput Co Ltd 撮像装置、及び撮影範囲表示方法、プログラム
JP6103526B2 (ja) * 2013-03-15 2017-03-29 オリンパス株式会社 撮影機器,画像表示機器,及び画像表示機器の表示制御方法
JP6573211B2 (ja) * 2015-03-04 2019-09-11 カシオ計算機株式会社 表示装置、画像表示方法及びプログラム
WO2016181804A1 (fr) * 2015-05-08 2016-11-17 ソニー株式会社 Dispositif et procédé de traitement d'image
JP2019054461A (ja) * 2017-09-15 2019-04-04 オリンパス株式会社 撮像装置および撮像方法
CN107948519B (zh) * 2017-11-30 2020-03-27 Oppo广东移动通信有限公司 图像处理方法、装置及设备
CN108154514B (zh) * 2017-12-06 2021-08-13 Oppo广东移动通信有限公司 图像处理方法、装置及设备

Patent Citations (5)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
JP2012029245A (ja) * 2010-07-27 2012-02-09 Sanyo Electric Co Ltd 撮像装置
JP2012042805A (ja) * 2010-08-20 2012-03-01 Olympus Imaging Corp 撮像装置
WO2012120952A1 (fr) * 2011-03-04 2012-09-13 富士フイルム株式会社 Dispositif de photographie et procédé de commande d'affichage
JP2015136096A (ja) * 2013-12-20 2015-07-27 パナソニックIpマネジメント株式会社 撮像装置
WO2017200049A1 (fr) * 2016-05-20 2017-11-23 日立マクセル株式会社 Appareil de capture d'image et fenêtre de réglage associée

Also Published As

Publication number Publication date
JP2020167518A (ja) 2020-10-08
US20220159192A1 (en) 2022-05-19
CN113632449A (zh) 2021-11-09
US11743576B2 (en) 2023-08-29

Similar Documents

Publication Publication Date Title
CN107026973B (zh) 图像处理装置、图像处理方法与摄影辅助器材
US8836754B2 (en) Image photographing device and control method thereof
US8780200B2 (en) Imaging apparatus and image capturing method which combine a first image with a second image having a wider view
KR101339193B1 (ko) 카메라 플랫폼 시스템
WO2016002228A1 (fr) Dispositif de capture d'image
KR102280000B1 (ko) 표시 제어 장치, 표시 제어 방법 및 저장 매체
GB2572718A (en) Display control apparatus and control method of the same
JP2007028536A (ja) デジタルカメラ
US20160134805A1 (en) Imaging apparatus, imaging method thereof, and computer readable recording medium
JP5861395B2 (ja) 携帯機器
JP2006245793A (ja) 撮像システム
JP3788714B2 (ja) 撮像装置及びデジタルカメラ
JP6598028B2 (ja) 撮像装置
JP2011142419A5 (fr)
JP6721084B2 (ja) ズーム制御装置、ズーム制御方法およびプログラム
US20180278856A1 (en) Imaging apparatus, control method thereof and program
JP5492651B2 (ja) 撮影装置、パノラマ撮影方法
JP2011035752A (ja) 撮像装置
JP2006174128A (ja) 撮像装置および撮像システム
WO2020202684A1 (fr) Dispositif et procédé de traitement d'image, programme et dispositif d'imagerie
JP2018006995A (ja) 撮像装置、表示装置、画像処理プログラム
JP4172352B2 (ja) 撮像装置及び方法、撮像システム、プログラム
JP2009044403A (ja) 撮像装置
JP4923674B2 (ja) デジタルカメラ及びフォーカス位置特定方法、プログラム
WO2010131724A1 (fr) Appareil photo numérique

Legal Events

Date Code Title Description
121 Ep: the epo has been informed by wipo that ep was designated in this application

Ref document number: 20783708

Country of ref document: EP

Kind code of ref document: A1

NENP Non-entry into the national phase

Ref country code: DE

122 Ep: pct application non-entry in european phase

Ref document number: 20783708

Country of ref document: EP

Kind code of ref document: A1