US20210073942A1 - Image processing device, imaging system provided therewith, and calibration method
- Publication number
- US20210073942A1 (Application US16/952,827)
- Authority
- US
- United States
- Prior art keywords
- image
- touch operation
- camera
- adjustment
- camera image
- Prior art date
- Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
- Abandoned
Classifications
- G06T3/4038—Image mosaicing, e.g. composing plane images from plane sub-images
- G03B15/00—Special procedures for taking photographs; Apparatus therefor
- G03B37/04—Panoramic or wide-screen photography with cameras or projectors providing touching or overlapping fields of view
- G06F3/017—Gesture based interaction, e.g. based on a set of recognized hand gestures
- G06F3/0412—Digitisers structurally integrated in a display
- G06F3/0487—Interaction techniques based on graphical user interfaces [GUI] using specific features provided by the input device
- G06F3/04883—Interaction techniques using a touch-screen or digitiser for inputting data by handwriting, e.g. gesture or text
- G06K9/60
- G06T7/11—Region-based segmentation
- G06V10/16—Image acquisition using multiple overlapping images; Image stitching
- G06V10/235—Image preprocessing by selection of a specific region containing or referencing a pattern, based on user input or interaction
- G06V10/24—Aligning, centring, orientation detection or correction of the image
- H04N13/128—Adjusting depth or disparity
- H04N13/246—Calibration of cameras
- H04N23/45—Generating image signals from two or more image sensors being of different type or operating in different modes
- H04N23/60—Control of cameras or camera modules
- H04N23/62—Control of parameters via user interfaces
- H04N23/63—Control of cameras or camera modules by using electronic viewfinders
- H04N23/631—Graphical user interfaces [GUI] specially adapted for controlling image capture or setting capture parameters
- H04N23/632—Graphical user interfaces [GUI] for displaying or modifying preview images prior to image capturing
- H04N23/667—Camera operation mode switching, e.g. between still and video, sport and normal or high- and low-resolution modes
- H04N23/69—Control of means for changing angle of the field of view, e.g. optical zoom objectives or electronic zooming
- H04N23/695—Control of camera direction for changing a field of view, e.g. pan, tilt or based on tracking of objects
- H04N23/698—Control of cameras or camera modules for achieving an enlarged field of view, e.g. panoramic image capture
- H04N5/232; H04N5/23216; H04N5/232935
- H04N5/2625—Studio circuits for obtaining an image which is composed of images from a temporal image sequence, e.g. for a stroboscopic effect
- H04N5/2628—Alteration of picture size, shape, position or orientation, e.g. zooming, rotation, rolling, perspective, translation
- H04N7/181—Closed-circuit television [CCTV] systems for receiving images from a plurality of remote sources
- H04N2013/0081—Depth or disparity estimation from stereoscopic image signals
Definitions
- According to the present disclosure, there is also provided an imaging system that includes the image processing device, the plurality of cameras, and the display input device.
- According to the present disclosure, there is provided a calibration method of securing consistency between camera image regions in a composite image generated by performing an image composition process on an image for each of a plurality of cameras, the method including: generating a plurality of composition source images from each of captured images for each of the cameras based on a preset processing condition and performing the image composition process on the plurality of composition source images to generate a composite image for preview; generating a preview display screen on which the composite image for preview is displayed and outputting the preview display screen to a display input device; determining the camera image region to be adjusted in the composite image for preview based on a detection result of a touch operation which a user performs on the preview display screen displayed on the display input device and determining an adjustment item according to an operation mode of the touch operation; setting a temporary processing condition related to the camera image region to be adjusted based on a determination result; and generating the composition source image from the captured image of the camera corresponding to the camera image region to be adjusted based on the temporary processing condition and updating the composite image for preview by performing the image composition process on the composition source image.
- According to the present disclosure, the user can execute calibration that ensures consistency between the camera image regions in the composite image simply by performing touch operations on the preview display screen displayed on the display input device. Accordingly, the user can conveniently perform the adjustment work for the calibration through an intuitive operation.
- FIG. 1 is an overall configuration diagram of an imaging system according to the present embodiment.
- FIG. 2 is an explanatory diagram illustrating a composite image generated by image processing device 3 and displayed on display input device 4 .
- FIG. 3 is a functional block diagram illustrating a schematic configuration of image processing device 3 .
- FIG. 4 is an explanatory diagram illustrating an outline of a stitching process performed by image processing device 3 .
- FIG. 5 is an explanatory diagram schematically illustrating states of an image in a case where disparity correction is not executed and a case where the disparity correction is executed.
- FIGS. 6A-6D illustrate the method of adjusting camera image regions 12.
- FIG. 7 is an explanatory diagram illustrating a list of adjustment items according to an operation mode of a touch operation.
- FIGS. 8A-8D are explanatory diagrams illustrating the touch operation in a case of individually adjusting each of camera image regions 12 of horizontal cameras 1 .
- FIG. 9 is an explanatory diagram illustrating the touch operation in a case of individually adjusting camera image regions 12 of upward cameras 1 .
- FIG. 10 is an explanatory diagram illustrating the touch operation in a case of integrally adjusting all of camera image regions 12 .
- FIGS. 11A-11B are explanatory diagrams illustrating the touch operation in a case of integrally adjusting all of camera image regions 12 .
- FIG. 12 is a flowchart illustrating a flow of a process executed by image processing device 3 according to the touch operation of a user.
- FIG. 13 is a flowchart illustrating a flow of a process executed by image processing device 3 according to the touch operation of the user.
- FIG. 14 is a flowchart illustrating a flow of a process executed by image processing device 3 according to the touch operation of the user.
- FIG. 15 is an explanatory diagram illustrating an image process of improving visibility of camera image region 12 to be adjusted.
- FIG. 16 is an explanatory diagram illustrating control of enabling the user to recognize a just fit state of camera image region 12 to be adjusted.
- FIG. 17 is an explanatory diagram illustrating the touch operation when a tone and brightness of camera image region 12 are adjusted.
- FIG. 1 is an overall configuration diagram of an imaging system according to the present embodiment.
- FIG. 2 is an explanatory diagram illustrating a composite image generated by an image processing device 3 and displayed on a display input device 4 .
- The present imaging system includes camera unit 2 having a plurality of cameras 1, image processing device 3, and display input device 4.
- Seven cameras 1 are mounted on camera unit 2: one upward camera 1 disposed so that its optical axis points upward in an approximately vertical direction, and six horizontal cameras 1 arranged at equal intervals in the circumferential direction so that their optical axes extend radially in an approximately horizontal direction.
- Each of cameras 1 has a wide angle of view (for example, 120°) and is arranged so that the imaging areas of two adjacent cameras 1 partially overlap with each other.
- Image processing device 3 performs a stitching process of compositing the captured images captured by each of cameras 1 to generate one composite image 11.
- Composite image 11 is in a state in which camera image regions 12, each based on the captured image of one of cameras 1, are combined.
- The stitching process is performed on each frame, the composite image of each frame is output to display input device 4, and the composite images are displayed on display input device 4 as a video.
- In the present embodiment, image processing device 3 generates the composite image in real time and outputs it to display input device 4 so that the user can see the composite image in real time; however, the composite image generated by image processing device 3 may also be stored in an information storage unit included in image processing device 3 or in an information storage device connected to image processing device 3.
- Display input device 4 is a so-called touch panel display, which combines a display panel (display device) and a touch panel (position input device). Display input device 4 displays the screen of the composite image output from image processing device 3, and the user can variously adjust each of camera image regions 12 in the composite image by performing touch operations on the screen.
- In camera unit 2, when the optical system constituting camera 1 is assembled, or when camera 1 itself is mounted on the base, an error occurs in the mechanical disposition of positions, angles, and the like.
- This error in the mechanical disposition causes a state in which camera image regions 12 are not appropriately consistent in the composite image obtained by the image composition process, and degrades the quality of the composite image.
- Therefore, the user adjusts the position, angle, and size of camera image region 12 so that the stitching portions between adjacent camera image regions 12 in composite image 11 approximately match.
- In addition, the tone and brightness may differ between portions of different camera image regions 12 that should originally match.
- In this case, the user adjusts the tone and brightness of camera image regions 12.
- In the present embodiment, the adjustment work for calibration that ensures consistency between camera image regions 12 in the composite image is performed by touch operations on display input device 4, and the calibration is executed by performing a predetermined touch operation on the defective camera image region 12.
- FIG. 3 is a functional block diagram illustrating the schematic configuration of image processing device 3 .
- FIG. 4 is an explanatory diagram illustrating an outline of the stitching process performed by image processing device 3 .
- Camera 1 includes lens unit 21 , image sensor unit 22 , signal processing unit 23 , and control unit 24 .
- Image sensor unit 22 images a subject via lens unit 21 and outputs an image signal.
- Signal processing unit 23 executes a necessary signal process on the image signal output from image sensor unit 22 and outputs a captured image.
- Control unit 24 controls operations of image sensor unit 22 and signal processing unit 23 .
- Image processing device 3 includes stitching processing unit (image processing unit) 31 , screen generation unit 32 , touch operation determination unit 33 , processing condition setting unit 34 , and image adjustment unit 35 .
- Stitching processing unit 31 includes disparity correction amount calculator 41 , panoramic image generation unit 42 , preview output switching unit 43 , disparity correction unit 44 , image composition unit 45 , and just fit detection unit 46 .
- Disparity correction amount calculator 41 is provided for each combination of two adjacent cameras 1, while panoramic image generation unit 42, preview output switching unit 43, and disparity correction unit 44 are provided for each of cameras 1.
- Disparity correction amount calculator 41 calculates the disparity correction amount which defines a deformation degree of the image during disparity correction for each of the frames. Specifically, as illustrated in FIG. 4 , each of processes of collimation (projection to cylinder), processing region clipping, and disparity calculation is performed. In the disparity calculation, disparity (deviation amount) is calculated by block matching between the two captured images. That is, while shifting the two captured images little by little, a difference between the two captured images is calculated and the disparity is calculated from a positional relationship at which the difference is the smallest.
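The disparity calculation described above can be pictured with a short sketch. The following is a minimal block-matching example in Python, assuming two grayscale NumPy strips of equal size taken from the overlapping regions; the function and parameter names are illustrative, not from the patent.

```python
import numpy as np

def estimate_disparity(left: np.ndarray, right: np.ndarray, max_shift: int = 64) -> int:
    """Shift one overlapping strip against the other little by little and
    return the shift whose matching cost (mean absolute difference) is the
    smallest, as in the disparity calculation described above."""
    best_shift, best_cost = 0, float("inf")
    for shift in range(max_shift):
        a = left[:, shift:].astype(np.int32)              # shifted view of one strip
        b = right[:, : right.shape[1] - shift].astype(np.int32)
        cost = float(np.mean(np.abs(a - b)))              # cost at this candidate shift
        if cost < best_cost:
            best_cost, best_shift = cost, shift
    return best_shift
```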
- Panoramic image generation unit 42 converts the captured image output from each of cameras 1 into a panoramic image (projection to sphere) to generate the panoramic image (composition source image). First, panoramic image generation unit 42 generates the panoramic image based on the preset control parameter (processing condition). When processing condition setting unit 34 sets a temporary control parameter according to the touch operation of the user, panoramic image generation unit 42 generates the panoramic image based on the temporary control parameter.
- Preview output switching unit 43 normally outputs the panoramic image generated by panoramic image generation unit 42 to disparity correction unit 44 and outputs the panoramic image generated by panoramic image generation unit 42 to image composition unit 45 when displaying a preview display screen. Accordingly, when displaying the preview display screen, the image composition process is performed without a disparity correction process.
- Disparity correction unit 44 performs the disparity correction on the panoramic image generated by panoramic image generation unit 42 based on the disparity correction amount output from disparity correction amount calculator 41 to generate a disparity correction image.
- Image composition unit 45 normally performs the image composition process on a plurality of disparity correction images generated by disparity correction unit 44 to generate the composite image. In addition, when displaying the preview display screen, image composition unit 45 performs the image composition process on a plurality of panoramic images generated by panoramic image generation unit 42 to generate the composite image for preview.
- Note that the composite image for preview is normally lower in resolution than the regular composite image.
- Just fit detection unit 46 detects a just fit state in which consistency of the stitching portion between camera image region 12 to be adjusted and adjacent camera image region 12 in the composite image is high.
- Specifically, the absolute value of the difference between pixel values is obtained for each pair of pixels positioned at the stitching portion of adjacent camera image regions 12, that is, pixels adjacent to each other across the boundary line between camera image regions 12, and the absolute values of the differences are summed. Then, the degree (error) of positional deviation between adjacent camera image regions 12 is determined based on the magnitude of this sum, and the just fit state is determined when the sum reaches a local minimum.
- When just fit detection unit 46 detects the just fit state in this way, the composite image obtained at that timing is continuously output within a predetermined range. Accordingly, on the preview display screen, camera image region 12 to be adjusted is displayed in a stationary state within the predetermined range, without interlocking with the touch operation. A rough sketch of this detection follows.
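As an illustration of the just fit detection, the sketch below sums absolute pixel differences across the boundary between two regions and flags the moment that sum passes a local minimum. It is a simplified reading of the description above, with illustrative names.

```python
import numpy as np

def seam_error(region_a: np.ndarray, region_b: np.ndarray) -> float:
    """Sum of absolute pixel differences along the stitching boundary:
    the last column of one region against the first column of its neighbor."""
    edge_a = region_a[:, -1].astype(np.int32)
    edge_b = region_b[:, 0].astype(np.int32)
    return float(np.sum(np.abs(edge_a - edge_b)))

def is_just_fit(error_history: list[float]) -> bool:
    """Declare a just fit state when the seam error has just passed a
    local minimum, i.e. it stopped decreasing and started rising again."""
    if len(error_history) < 3:
        return False
    prev, mid, cur = error_history[-3:]
    return mid <= prev and mid < cur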
- Screen generation unit 32 normally generates a regular display screen on which the composite image generated by image composition unit 45 performing the image composition process on the disparity correction image is displayed and outputs the regular display screen to display input device 4 .
- In addition, screen generation unit 32 generates the preview display screen on which the composite image for preview, generated by image composition unit 45 performing the image composition process on the panoramic images, is displayed, and outputs the preview display screen to display input device 4.
- Touch operation determination unit 33 obtains a detection result of the touch operation which the user performs on the preview display screen displayed on display input device 4 , from display input device 4 . Based on the detection result of the touch operation, touch operation determination unit 33 determines camera image region 12 to be adjusted in composite image 11 and determines an adjustment item (pan adjustment, tilt adjustment, rolling adjustment, and zoom adjustment) according to an operation mode of the touch operation.
- When the touch operation is performed on a single camera image region 12, that camera image region 12 is set as the adjustment target and an individual adjustment of adjusting camera image regions 12 one by one is performed.
- Alternatively, all of camera image regions 12 are set as the adjustment targets and an overall adjustment of integrally adjusting all of camera image regions 12 is performed.
- When the operation mode of the touch operation is a touch operation on different camera image regions 12 with two fingers, the two camera image regions 12 on which the touch operation is performed are set as the adjustment targets, and it is determined that an image adjustment is to be performed between the two camera image regions 12.
- Based on the determination result of touch operation determination unit 33, processing condition setting unit 34 sets a temporary processing condition related to camera image region 12 to be adjusted.
- Specifically, the control parameter related to generation of the panoramic image (composition source image) by panoramic image generation unit 42 is set.
- When processing condition setting unit 34 sets the temporary processing condition, stitching processing unit 31 generates the composition source image from the captured image of camera 1 corresponding to camera image region 12 to be adjusted based on that condition, and updates the composite image for preview by performing the image composition process on the composition source image.
- Image adjustment unit 35 performs the image adjustment between two camera image regions 12.
- Specifically, the image adjustment is performed so that at least either the tones or the brightness of the touch regions in the two camera image regions 12 are matched.
- The image adjustment is executed when the operation mode of the touch operation is a touch operation on different camera image regions 12 with two fingers.
- In the present embodiment, the imaging condition of camera 1, that is, the control parameter related to tone and brightness, is set so that the tones and brightness of the two camera image regions 12 are matched.
- The control parameter is output from image adjustment unit 35 to control unit 24 of camera 1, and control unit 24 controls image sensor unit 22 and signal processing unit 23 based on the control parameter.
- Image processing device 3 illustrated in FIG. 3 is implemented on a general-purpose information processing device. Each unit of image processing device 3 can be realized by causing a processor to execute an image processing program stored in a memory. In addition, at least a part of image processing device 3 can also be realized by dedicated image processing hardware (an image processing circuit).
- FIG. 5 is an explanatory diagram schematically illustrating states of an image in a case where the disparity correction is not executed and a case where the disparity correction is executed.
- Here, an example of processing the images of two adjacent cameras 1 will be described.
- FIG. 5 illustrates the imaging situation of two adjacent cameras 1.
- In this example, a person and a mountain in the background are simultaneously imaged by the two cameras 1.
- FIG. 5 also illustrates the images captured by the two cameras 1.
- The image of distant view representing the mountain and the image of close-range view representing the person appear in the image regions of the boundary portions corresponding to the overlapping portion of the imaging areas of the two cameras 1.
- The mountain of the distant view and the person of the close-range view are at different distances from cameras 1, so that the positional relationship between the image of distant view and the image of close-range view deviates between the two captured images.
- (C-1) and (C-2) of FIG. 5 illustrate composite images obtained by simply compositing the images captured by the two cameras 1 based on the image of distant view.
- Since the positional relationship between the image of distant view and the image of close-range view deviates between the two captured images, when the two captured images are simply composited based on the image of distant view, a defect occurs in the composite image, as illustrated in (C-1) and (C-2) of FIG. 5: the image of close-range view appears in duplicate, or a part of it is lost.
- This deviation of the image of close-range view, taking the image of distant view as a standard, represents the disparity between the captured images of the two cameras 1.
- Here, the positional relationship between the images of distant view respectively captured by the two cameras 1 can be obtained in advance in a state in which no image of close-range view exists, and the images are composited based on this information.
- In this case, when only the distant view appears, no disparity exists between the captured images of the two cameras 1; when an image of close-range view exists, disparity arises between the captured images of the two cameras 1.
- In the first correction method, the image is deformed so that the image region in which the image of close-range view mainly appears is horizontally shifted, and the captured images of the two cameras 1 are then composited so that the positional relationship of the image of distant view and the image of close-range view is consistent between the captured images of the two cameras 1.
- In the second correction method, a curved stitching boundary is set so as to avoid the image region in which the image of close-range view appears; trimming is performed to clip the captured images of the two cameras 1 along the stitching boundary, and then the images are composited.
- In the present embodiment, the first correction method is adopted; that is, the disparity correction of deforming the image to horizontally shift the image region in which the image of close-range view mainly appears is performed on the captured images of the two cameras 1 so that the positional relationship of the image of distant view and the image of close-range view becomes consistent between them.
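The first correction method can be pictured as a horizontal shift of only the region where the close-range view mainly appears. The example below is a deliberately simplified sketch (a rectangular row band and np.roll wraparound instead of a proper image warp); the row-slice interface is an assumption for illustration.

```python
import numpy as np

def shift_close_range(image: np.ndarray, rows: slice, disparity: int) -> np.ndarray:
    """Sketch of the first correction method: horizontally shift only the
    image rows where the close-range subject mainly appears, by the measured
    disparity, so that distant and close-range views line up across cameras.
    Wraparound at the edges is ignored here for simplicity."""
    out = image.copy()
    out[rows] = np.roll(image[rows], disparity, axis=1)
    return out
```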
- FIGS. 6A-6D are explanatory diagrams illustrating the method of adjusting camera image regions 12.
- As described above, when displaying the preview display screen, panoramic image generation unit 42 projects the captured images obtained from cameras 1 onto a sphere to generate the panoramic images (composition source images). The images are then composited by clipping the necessary region from each panoramic image. At this time, the position and size of each camera image region 12 on composite image 11 are defined by the projection position (the central position of the projection region) and the projection size (the size of the projection region) used when generating the panoramic image.
- Blank region 53 is formed outside used region 52 corresponding to camera image region 12 finally displayed in the composite image, and it is possible to adjust the projection position and the projection size within the range of blank region 53 when generating the panoramic image.
- In the pan adjustment (position adjustment in the horizontal direction), camera image region 12 can be moved in the horizontal direction; at this time, a process of adjusting the projection position of the image in the horizontal direction is performed.
- In the tilt adjustment (position adjustment in the vertical direction), camera image region 12 can be moved in the vertical direction; at this time, a process of adjusting the projection position of the image in the vertical direction is performed.
- In the rolling adjustment (position adjustment in the rotation direction), camera image region 12 can be rotated; at this time, a process of adjusting the projection position of the image in the rotation direction is performed.
- In the zoom adjustment (extension/contraction adjustment), camera image region 12 can be zoomed in or out; at this time, a process of adjusting the size of the image is performed.
- Processing condition setting unit 34 sets setting values related to the projection position and the projection size as the control parameters related to generation of the panoramic image (composition source image) according to the touch operation. Based on these control parameters, panoramic image generation unit 42 adjusts the projection position and the projection size when generating the panoramic image.
- A control parameter is preset for each of the adjustment items of the pan adjustment, the tilt adjustment, the rolling adjustment, and the zoom adjustment, and only the control parameter related to the adjustment item corresponding to camera image region 12 to be adjusted is updated. One way to picture this is sketched below.
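A minimal sketch of these per-item control parameters: a small record of projection position and size per camera, where a touch operation updates only the field of the determined adjustment item. The field names are assumptions for illustration, not the patent's actual data layout.

```python
from dataclasses import dataclass, replace

@dataclass
class ProjectionParams:
    """Projection position and size used when generating the panoramic
    (composition source) image for one camera; field names are illustrative."""
    pan_deg: float = 0.0    # projection position, horizontal direction
    tilt_deg: float = 0.0   # projection position, vertical direction
    roll_deg: float = 0.0   # projection position, rotation direction
    zoom: float = 1.0       # projection size (extension/contraction)

def apply_touch_delta(params: ProjectionParams, item: str, delta: float) -> ProjectionParams:
    """Update only the control parameter of the adjustment item determined
    from the touch operation; the other parameters keep their preset values."""
    if item == "pan":
        return replace(params, pan_deg=params.pan_deg + delta)
    if item == "tilt":
        return replace(params, tilt_deg=params.tilt_deg + delta)
    if item == "rolling":
        return replace(params, roll_deg=params.roll_deg + delta)
    if item == "zoom":
        return replace(params, zoom=params.zoom * (1.0 + delta))
    return params
```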
- FIG. 7 is an explanatory diagram illustrating a list of the adjustment items according to the operation mode of the touch operation.
- As shown in FIG. 7, when the touch operation is performed on a single camera image region 12, that camera image region 12 is set as the adjustment target and each of camera image regions 12 can be adjusted individually.
- Alternatively, all of camera image regions 12 can be set as the adjustment targets and integrally adjusted in an overall adjustment.
- As the adjustment items, the pan adjustment (position adjustment in the horizontal direction) of horizontally moving camera image region 12 to be adjusted, the tilt adjustment (position adjustment in the vertical direction) of vertically moving it, the rolling adjustment (position adjustment in the rotation direction) of rotating it, and the zoom adjustment (extension/contraction adjustment) of zooming it in or out can be performed.
- FIGS. 8A-8D are explanatory diagrams illustrating the touch operation in a case of individually adjusting each of camera image regions 12 of horizontal cameras 1.
- FIG. 9 is an explanatory diagram illustrating the touch operation in a case of individually adjusting camera image regions 12 of upward cameras 1 .
- As illustrated in FIGS. 8A-8D, one camera image region 12 on which the touch operation is performed is set as the adjustment target, and each of camera image regions 12 can be adjusted individually.
- In addition, it is possible to perform the zoom adjustment (extension/contraction adjustment) of zooming camera image region 12 to be operated in or out by a pinch operation, that is, by moving two fingers touching the screen so as to widen or narrow the gap between them.
- For camera image region 12 of upward camera 1, a horizontal movement on composite image 11 corresponds to rotating projection region 62 in the latitude direction on projection sphere 61, as illustrated in (A-2) of FIG. 9.
- Therefore, in this case, the rolling adjustment (position adjustment in the rotation direction) of adjusting the projection position of the image in the rotation direction is performed.
- On the other hand, a vertical movement on composite image 11 corresponds to moving projection region 62 in the longitude direction on projection sphere 61, as illustrated in (B-2) of FIG. 9.
- Therefore, in this case, the tilt adjustment (position adjustment in the radial direction) is performed. The resulting gesture-to-item mapping is tabulated below.
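The mapping that emerges from FIGS. 8 and 9 can be written out as a table: the same swipe selects a different adjustment item depending on whether the target region belongs to a horizontal camera or the upward camera. The enum and dictionary names are illustrative.

```python
from enum import Enum, auto

class Gesture(Enum):
    SWIPE_HORIZONTAL = auto()
    SWIPE_VERTICAL = auto()
    TWO_FINGER_ROTATE = auto()
    PINCH = auto()

# For a horizontal camera's region, swipes move the projection position
# directly; for the upward camera, the same swipes act on the projection
# sphere (latitude rotation / longitude movement), as described above.
HORIZONTAL_CAMERA_ITEMS = {
    Gesture.SWIPE_HORIZONTAL: "pan",
    Gesture.SWIPE_VERTICAL: "tilt",
    Gesture.TWO_FINGER_ROTATE: "rolling",
    Gesture.PINCH: "zoom",
}
UPWARD_CAMERA_ITEMS = {
    Gesture.SWIPE_HORIZONTAL: "rolling",  # latitude-direction rotation
    Gesture.SWIPE_VERTICAL: "tilt",       # longitude-direction movement
    Gesture.TWO_FINGER_ROTATE: "rolling",
    Gesture.PINCH: "zoom",
}
```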
- FIGS. 10 and 11 are explanatory diagrams illustrating the touch operation in the case of integrally adjusting all of camera image regions 12 .
- As illustrated in FIGS. 10 and 11A-11B, all of camera image regions 12 are set as the adjustment targets and can be adjusted integrally.
- In the overall adjustment, camera image regions 12 related to horizontal cameras 1 are displayed so as to circulate horizontally.
- Accordingly, camera image region 12 positioned at the left or right end can be brought to the center of the screen, as illustrated in (A-2) of FIG. 10.
- Camera image regions 12 are also displayed so as to circulate vertically. Accordingly, for example, from the state illustrated in (B-1) of FIG. 10, camera image region 12 related to upward camera 1 positioned at the upper end can be brought to the center of the screen, as illustrated in (B-2) of FIG. 10. In this case, camera image region 12 related to upward camera 1 is displayed with the other camera image regions 12 divided into two so as to sandwich it.
- FIGS. 12, 13 , and 14 are flowcharts illustrating the flow of the process executed by image processing device 3 according to the touch operation of the user.
- First, touch operation determination unit 33 determines whether or not a touch operation is detected (ST101).
- When the touch operation is detected, the disparity correction is released (ST102).
- When the touch operation is an operation with one finger (Yes in ST103), it is determined whether or not the operation target is camera image region 12 related to upward camera 1 (ST104).
- Depending on the subsequent determinations, processing condition setting unit 34 executes the pan adjustment (position adjustment in the horizontal direction) according to the finger movement and updates the pan control parameter (ST106), executes the tilt adjustment (position adjustment in the vertical direction) according to the finger movement and updates the tilt control parameter (ST108), or executes the rolling adjustment (position adjustment in the rotation direction) according to the finger movement and updates the rolling control parameter (ST110).
- For camera image region 12 related to upward camera 1, processing condition setting unit 34 executes the tilt adjustment according to the finger movement and updates the tilt control parameter (ST112), or executes the rolling adjustment according to the finger movement and updates the rolling control parameter (ST115).
- When the touch operation is a pinch operation, processing condition setting unit 34 executes the zoom adjustment (extension/contraction adjustment) according to the finger movement and updates the zoom control parameter (ST117).
- In the overall adjustment, processing condition setting unit 34 similarly executes the pan adjustment (ST119), the tilt adjustment (ST121), the rolling adjustment (ST123), or the zoom adjustment (ST125) according to the finger movement and updates the corresponding control parameter.
- Finally, panoramic image generation unit 42 of stitching processing unit 31 generates the panoramic images based on the updated control parameters, image composition unit 45 composites the panoramic images, and the generated composite image is displayed on the preview display screen (ST126). A condensed sketch of these branches follows.
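The following sketch condenses the ST103-ST125 branches: finger count, movement direction, and target region select the adjustment item whose control parameter is then updated before the preview is regenerated (ST126). The branch conditions are a plausible reading of the flowcharts, not a verbatim transcription.

```python
def choose_adjustment(fingers: int, direction: str, pinch: bool,
                      target_is_upward: bool) -> str:
    """Pick the adjustment item from the touch operation, mirroring the
    one-finger branches (ST104 onward) and the pinch branch for zoom."""
    if fingers == 1:
        if target_is_upward:
            # Upward camera: a horizontal swipe rotates on the projection sphere.
            return "rolling" if direction == "horizontal" else "tilt"
        return "pan" if direction == "horizontal" else "tilt"
    if pinch:
        return "zoom"
    return "rolling"  # two-finger rotation
```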
- FIG. 15 is an explanatory diagram illustrating the image process of improving the visibility of camera image region 12 to be adjusted.
- In composite image 11, image composition unit 45 performs an image process of changing the tone of camera image regions 12 not to be adjusted from the initial tone, so as to improve the visibility of camera image region 12 to be adjusted.
- Specifically, an image process of at least changing the luminance is performed on camera image regions 12 not to be adjusted.
- In the present embodiment, an image process of lowering the luminance is performed on camera image regions 12 not to be adjusted, so that they are grayed out and displayed darkly.
- Accordingly, camera image region 12 to be adjusted becomes conspicuous, and the user can intuitively grasp which region is the adjustment target. A minimal sketch of this dimming step follows.
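A minimal sketch of the dimming step, assuming the composite image is an RGB NumPy array and a boolean mask marks the pixels of the region being adjusted; the dimming factor is an arbitrary illustrative value.

```python
import numpy as np

def dim_unselected(composite: np.ndarray, mask_selected: np.ndarray,
                   factor: float = 0.4) -> np.ndarray:
    """Lower the luminance of every camera image region that is not the
    adjustment target so the selected region stands out (grayed out,
    displayed darkly)."""
    out = composite.astype(np.float32)
    out[~mask_selected] *= factor          # dim pixels outside the target region
    return out.clip(0, 255).astype(np.uint8)
```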
- FIG. 16 is an explanatory diagram illustrating the control of causing the user to recognize the just fit state of camera image region 12 to be adjusted.
- While the touch operation continues, the processing condition of stitching processing unit 31 (the control parameter related to generation of the panoramic image) is regularly updated according to the touch operation, the composition source images are updated accordingly, and the composite image composited from the composition source images is regularly output from stitching processing unit 31. Accordingly, the composite image is displayed on the preview display screen as a video, and an animation in which camera image region 12 to be operated changes in conjunction with the finger movement is executed.
- During this animation, just fit detection unit 46 detects the just fit state in which the consistency of the stitching portion between camera image region 12 to be operated and adjacent camera image region 12 is high. When just fit detection unit 46 detects the just fit state, the animation of camera image region 12 to be operated, that is, the movement of camera image region 12 in conjunction with the finger movement, is stopped.
- Specifically, image composition unit 45 continuously outputs the composite image within a predetermined range based on the composition source image (panoramic image) obtained at the timing at which the just fit state is detected. Accordingly, on the preview display screen, camera image region 12 to be operated is displayed in a stationary state without interlocking with the finger movement.
- The timing of releasing the stop of the animation can be defined by a distance: when the distance from the just fit position to the touch position exceeds a predetermined distance, the stop of the animation is released.
- Alternatively, the timing of releasing the stop of the animation can be defined by a time: when a predetermined time (for example, 1 second) elapses after the just fit state is detected, the stop of the animation is released. Both conditions are sketched together below.
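Both release conditions can be combined in a small latch, sketched below with illustrative threshold values (a screen-pixel distance and a timeout in seconds); the class and method names are assumptions.

```python
import time

class JustFitLatch:
    """Holds the operated region stationary once a just fit state is detected,
    and releases the hold when the finger moves farther than a distance
    threshold or a timeout elapses."""
    def __init__(self, release_distance: float = 40.0, release_seconds: float = 1.0):
        self.release_distance = release_distance
        self.release_seconds = release_seconds
        self.latched_at = None   # (x, y, timestamp) of the just-fit touch position

    def latch(self, x: float, y: float) -> None:
        self.latched_at = (x, y, time.monotonic())

    def is_held(self, x: float, y: float) -> bool:
        if self.latched_at is None:
            return False
        lx, ly, t0 = self.latched_at
        moved = ((x - lx) ** 2 + (y - ly) ** 2) ** 0.5
        if moved > self.release_distance or time.monotonic() - t0 > self.release_seconds:
            self.latched_at = None   # release the stop of the animation
            return False
        return True
```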
- In the example illustrated in FIG. 16, the swipe operation of sliding one finger touching the screen in the vertical direction is performed.
- As illustrated in (A) of FIG. 16, when the finger moves, camera image region 12 to be operated moves in the vertical direction according to the finger movement.
- When the just fit state is detected as illustrated in (B) of FIG. 16, camera image region 12 to be operated remains stopped even if the finger moves further, as illustrated in (C) of FIG. 16.
- Then, as illustrated in (D) of FIG. 16, when the finger moves still further, camera image region 12 to be operated again moves in the vertical direction according to the finger movement.
- In this way, the animation of camera image region 12 is stopped in the just fit state, so the user can intuitively grasp the just fit state and quickly perform the adjustment operation.
- In the present embodiment, the user is made to recognize the just fit state by stopping the animation of camera image region 12 to be operated.
- However, the method of causing the user to recognize the just fit state is not limited thereto.
- For example, camera image region 12 may be flashed (blinked).
- Alternatively, a frame line surrounding camera image region 12 may be displayed.
- FIG. 17 is an explanatory diagram illustrating the touch operation when the tone and the brightness of camera image region 12 are adjusted.
- As illustrated in FIG. 17, when the touch operation is performed on two camera image regions 12, the two camera image regions 12 are set as the adjustment targets and image adjustment unit 35 performs the image adjustment between the two camera image regions 12.
- Here, the image adjustment is performed so that the tones and brightness of touch regions 71 (a predetermined range around each touch position) in the two camera image regions 12 on which the touch operation is performed are matched.
- For example, the touch operation is performed on the sky region in each of the two camera image regions 12 so that the tones and brightness of the sky in the two camera image regions 12 are matched.
- In the image adjustment, information on the tones and brightness of the touch regions in the two camera image regions 12 to be adjusted is obtained, and cameras 1 that are the imaging sources of the two camera image regions 12 are controlled based on this information so that the tones and brightness of the touch regions in the two camera image regions 12 are matched.
- In the present embodiment, the imaging condition of camera 1, that is, the control parameter related to tone and brightness, is adjusted so that the tones and brightness of the two camera image regions 12 are matched.
- Specifically, the white balance gain value in signal processing unit 23 is adjusted as the control parameter related to tone (white balance).
- In addition, the shutter value (shutter speed) and the sensor gain value in image sensor unit 22 and the digital gain value in signal processing unit 23 are adjusted as the control parameters related to brightness (exposure). A sketch of deriving such corrections follows.
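As a rough sketch, the corrections could be derived from per-region statistics as ratio gains: per-channel ratios feed the white balance gain, and a luma ratio feeds the exposure chain (shutter value, sensor gain, digital gain). The dictionary keys and statistics are assumptions for illustration, not the camera's actual control API.

```python
def match_regions(stats_a: dict, stats_b: dict) -> dict:
    """Derive correction ratios that bring the second region's tone and
    brightness toward the first region's, from per-region channel averages."""
    return {
        # Tone: per-channel white balance gains from channel-average ratios.
        "wb_gain_r": stats_a["mean_r"] / max(stats_b["mean_r"], 1e-6),
        "wb_gain_b": stats_a["mean_b"] / max(stats_b["mean_b"], 1e-6),
        # Brightness: a single exposure ratio, to be distributed over
        # shutter value, sensor gain, and digital gain.
        "exposure_ratio": stats_a["mean_luma"] / max(stats_b["mean_luma"], 1e-6),
    }
```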
- In the present embodiment, the touch operation is performed simultaneously on two camera image regions 12 with two fingers, so that the image adjustment of the two camera image regions 12 is performed.
- When the touch operation is performed simultaneously on three or more camera image regions 12 with three or more fingers, the image adjustment of the three or more camera image regions 12 can be performed.
- In addition, in the present embodiment, the control is performed so that both the tones and the brightness are matched, but the control may be performed so that only one of them, for example, only the tones, is matched.
- As described above, the embodiment has been described as an example of the technique disclosed in the present application.
- However, the technique in the present disclosure is not limited thereto.
- The technique can also be applied to embodiments in which changes, replacements, additions, and omissions are made.
- In the embodiment described above, the image processing device (including a PC) provided separately from the camera unit having the plurality of cameras performs the image composition process and the like and outputs the composite image; however, the camera unit (imaging device) may instead be configured to incorporate the image processing device.
- The embodiment described above is configured to include the upward camera and the horizontal cameras, but a downward camera may be included instead of the upward camera. Further, both the upward camera and the downward camera may be included.
- In the embodiment described above, when the preview display screen is displayed, the preview output switching unit outputs the input composition source image (panoramic image) to the image composition unit without inputting it to the disparity correction unit; alternatively, the composition source image may be input to the disparity correction unit with the disparity correction amount set to 0 so that the disparity correction process is effectively not performed.
- In the embodiment described above, the image processing device that generates the normal composite image also executes the calibration for securing consistency between the camera image regions in the composite image.
- However, a dedicated calibration device may be provided, or a dedicated calibration program may be installed in an information processing device to execute the calibration.
- An image processing device, an imaging system that includes the image processing device, and a calibration method according to the present disclosure are advantageous in that a user can conveniently perform adjustment work for calibration of securing consistency between camera image regions in a composite image, and are useful as an image processing device that outputs a composite image by performing an image composition process on an image for each of a plurality of cameras, an imaging system that includes the image processing device, a calibration method of securing consistency between camera image regions in the composite image, and the like.
Abstract
An image processing device includes a stitching processing unit that composites composition source images generated from captured images under a preset processing condition to generate a composite image, a screen generation unit that generates a screen and outputs the composite image to a display input device, a touch operation determination unit that determines a camera image region to be adjusted based on a detection result of a touch operation on the screen and determines an adjustment item according to an operation mode of the touch operation, and a processing condition setting unit that sets a temporary processing condition related to the camera image region to be adjusted. The stitching processing unit generates the composition source image from the captured image of the camera corresponding to the camera image region to be adjusted under the temporary processing condition and updates the composite image by compositing the composition source images.
Description
- This is a continuation of U.S. patent application Ser. No. 16/088,697, filed Sep. 26, 2018, which is a National Stage Entry of International Patent Appl. No. PCT/JP2017/014327, filed Apr. 6, 2017, which claims priority to Japanese Appl. No. 2016-087443, filed Apr. 25, 2016. The disclosure of each of the above-mentioned documents, including the specification, drawings, and claims, is incorporated herein by reference in its entirety.
- The present disclosure relates to an image processing device that outputs a composite image by performing an image composition process on an image for each of a plurality of cameras, an imaging system provided therewith, and a calibration method of securing consistency between camera image regions in the composite image.
- By performing a so-called stitching process of generating one composite image by compositing images captured by a plurality of cameras, a wide-angle image that cannot be obtained by one camera can be generated. In the stitching process, two adjacent cameras are arranged so that their imaging areas partially overlap, and the images are composited by superimposing the image regions of the boundary portions corresponding to the overlapping imaging areas, or by appropriately trimming those image regions.
- On the other hand, when subjects at different distances from the cameras exist in the overlapping portion of the imaging areas of the two cameras, that is, when both a distant-view subject and a close-range subject are present, the positional relationship between the image of the distant-view subject and the image of the close-range subject deviates between the captured images of the two cameras; a so-called disparity occurs. As a result, the image of the close-range subject appears doubled, or a part of it is lost, in the composite. Therefore, in the stitching, disparity correction for suppressing the image defect caused by the disparity is performed.
- Regarding the disparity correction, there is known a technique of obtaining the positional relationship between the subject images that appear in the respective captured images of two cameras by block matching based on edges and feature amounts, and of performing disparity correction that deforms the images based on that information (see PTL 1). Specifically, in this technique, a stitching point which defines the deformation degree of the image during the disparity correction is changed for each frame so that an appropriate composite image is generated for each frame.
- Meanwhile, in a camera unit on which the plurality of cameras are mounted, an error occurs in the mechanical disposition, such as positions and angles, when the optical system constituting each camera is assembled. This error in the mechanical disposition leaves the image regions of the respective cameras inconsistent in the composite image obtained by the image composition process and degrades the quality of the composite image.
- On the other hand, it is difficult for a user to adjust the mechanical disposition itself after product shipment. In addition, the error in the mechanical disposition varies with the installation environment and also changes over time. Therefore, calibration that secures consistency between the camera image regions in the composite image is performed by changing a processing condition of the image process, and in particular, a technique that allows the user to conveniently perform the adjustment work for this calibration at an appropriate timing, such as immediately before photographing, is desired.
- The present disclosure enables the user to conveniently perform the adjustment work for the calibration that ensures the consistency between the camera image regions in the composite image.
- PTL 1: Japanese Patent Unexamined Publication No. 2010-50842
- According to the present disclosure, there is provided an image processing device that outputs a composite image by performing an image composition process on an image for each of a plurality of cameras, the device including: an image processing unit that generates a plurality of composition source images from each of captured images for each of the cameras based on a preset processing condition and performs the image composition process on the plurality of composition source images to generate a composite image for preview; a screen generation unit that generates a preview display screen on which the composite image for preview is displayed and outputs the preview display screen to a display input device; a touch operation determination unit that determines a camera image region to be adjusted in the composite image for preview based on a detection result of a touch operation which a user performs on the preview display screen displayed on the display input device and determines an adjustment item according to an operation mode of the touch operation; and a processing condition setting unit that sets a temporary processing condition related to the camera image region to be adjusted based on a determination result of the touch operation determination unit, in which the image processing unit generates the composition source image from the captured image of the camera corresponding to the camera image region to be adjusted based on the temporary processing condition and updates the composite image for preview by performing the image composition process on the composition source image.
- In addition, an imaging system according to the present disclosure includes the image processing device, the plurality of cameras, and the display input device.
- In addition, according to the present disclosure, there is provided a calibration method of securing consistency between camera image regions in a composite image generated by performing an image composition process on an image for each of a plurality of cameras, the method including: generating a plurality of composition source images from each of captured images for each of the cameras based on a preset processing condition and performing the image composition process on the plurality of composition source images to generate a composite image for preview; generating a preview display screen on which the composite image for preview is displayed and outputting the preview display screen to a display input device; determining the camera image region to be adjusted in the composite image for preview based on a detection result of a touch operation which a user performs on the preview display screen displayed on the display input device and determining an adjustment item according to an operation mode of the touch operation; setting a temporary processing condition related to the camera image region to be adjusted based on a determination result; and generating the composition source image from the captured image of the camera corresponding to the image region to be adjusted based on the temporary processing condition and updating the composite image for preview by performing the image composition process on the composition source image.
- According to the present disclosure, it is possible to execute calibration of ensuring the consistency between the camera image regions in the composite image by the user performing the touch operation on the preview display screen displayed on the display input device. Accordingly, it is possible for the user to conveniently perform adjustment work for the calibration by an intuitive operation.
- FIG. 1 is an overall configuration diagram of an imaging system according to the present embodiment.
- FIG. 2 is an explanatory diagram illustrating a composite image generated by image processing device 3 and displayed on display input device 4.
- FIG. 3 is a functional block diagram illustrating a schematic configuration of image processing device 3.
- FIG. 4 is an explanatory diagram illustrating an outline of a stitching process performed by image processing device 3.
- FIG. 5 is an explanatory diagram schematically illustrating states of an image in a case where disparity correction is not executed and a case where the disparity correction is executed.
- FIGS. 6A-6D are explanatory diagrams illustrating a method of adjusting camera image regions 12.
- FIG. 7 is an explanatory diagram illustrating a list of adjustment items according to an operation mode of a touch operation.
- FIGS. 8A-8D are explanatory diagrams illustrating the touch operation in a case of individually adjusting each of camera image regions 12 of horizontal cameras 1.
- FIG. 9 is an explanatory diagram illustrating the touch operation in a case of individually adjusting camera image region 12 of upward camera 1.
- FIG. 10 is an explanatory diagram illustrating the touch operation in a case of integrally adjusting all of camera image regions 12.
- FIGS. 11A-11B are explanatory diagrams illustrating the touch operation in a case of integrally adjusting all of camera image regions 12.
- FIG. 12 is a flowchart illustrating a flow of a process executed by image processing device 3 according to the touch operation of a user.
- FIG. 13 is a flowchart illustrating a flow of a process executed by image processing device 3 according to the touch operation of the user.
- FIG. 14 is a flowchart illustrating a flow of a process executed by image processing device 3 according to the touch operation of the user.
- FIG. 15 is an explanatory diagram illustrating an image process of improving visibility of camera image region 12 to be adjusted.
- FIG. 16 is an explanatory diagram illustrating control of enabling the user to recognize a just fit state of camera image region 12 to be adjusted.
- FIG. 17 is an explanatory diagram illustrating the touch operation when a tone and brightness of camera image region 12 are adjusted.
- Hereinafter, embodiments of the present disclosure will be described with reference to the drawings.
- FIG. 1 is an overall configuration diagram of an imaging system according to the present embodiment. FIG. 2 is an explanatory diagram illustrating a composite image generated by image processing device 3 and displayed on display input device 4.
- As illustrated in FIG. 1, the present imaging system includes camera unit 2 having a plurality of cameras 1, image processing device 3, and display input device 4.
- Seven cameras 1 are mounted on camera unit 2: one upward camera 1 disposed so that its optical axis points upward in an approximately vertical direction, and six horizontal cameras 1 arranged at equal intervals in a circumferential direction so that their optical axes point radially in an approximately horizontal direction. Each of cameras 1 has a wide angle of view (for example, 120°) and is arranged so that the imaging areas of two adjacent cameras partially overlap with each other.
- As illustrated in FIG. 2, image processing device 3 performs a stitching process of compositing the images captured by each of cameras 1 to generate one composite image 11. Composite image 11 is in a state in which camera image regions 12, each based on the captured image of one of cameras 1, are combined. In addition, the stitching process is performed on each frame, the composite image of each frame is output to display input device 4, and the composite images are displayed on display input device 4 as a video.
- In the present embodiment, image processing device 3 generates the composite image in real time and outputs it to display input device 4 so that a user can see the composite image in real time; however, the composite image generated by image processing device 3 may instead be stored in an information storage unit included in image processing device 3 or in an information storage device connected to image processing device 3.
- Display input device 4 is a so-called touch panel display which combines a display panel (display device) and a touch panel (position input device). Display input device 4 displays a screen of the composite image output from image processing device 3, and the user performs touch operations on this screen, so that each of camera image regions 12 in the composite image can be adjusted in various ways.
- Here, in camera unit 2, when the optical system constituting each camera 1 is assembled, or when camera 1 itself is assembled to the base, an error occurs in the mechanical disposition, such as positions and angles. This error in the mechanical disposition leaves camera image regions 12 inconsistent in the composite image obtained by the image composition process and degrades the quality of the composite image.
- In the present embodiment, as calibration for securing consistency between camera image regions 12 in the composite image, the user adjusts the position, angle, and size of camera image region 12 by changing a processing condition of the image process performed by image processing device 3 so that the stitching portions between adjacent camera image regions 12 in composite image 11 are approximately matched.
composite image 11, in some cases, tone and brightness may be different at portions where the tones and the brightness should be originally matched in differentcamera image regions 12. Here, in the present embodiment, by changing an imaging condition ofcamera 1 as calibration for securing consistency betweencamera image regions 12 in the composite image, the user adjusts the tone and the brightness ofcamera image regions 12. - In addition, adjustment work for calibration of ensuring the consistency between
camera image regions 12 in the composite image is performed by the touch operation ofdisplay input device 4 and the calibration is executed by performing a predetermined touch operation oncamera image region 12 which is a defect. - Next, a schematic configuration of
image processing device 3 will be described.FIG. 3 is a functional block diagram illustrating the schematic configuration ofimage processing device 3.FIG. 4 is an explanatory diagram illustrating an outline of the stitching process performed byimage processing device 3. -
Camera 1 includeslens unit 21,image sensor unit 22,signal processing unit 23, andcontrol unit 24. -
Image sensor unit 22 images a subject vialens unit 21 and outputs an image signal.Signal processing unit 23 executes a necessary signal process on the image signal output fromimage sensor unit 22 and outputs a captured image.Control unit 24 controls operations ofimage sensor unit 22 andsignal processing unit 23. -
Image processing device 3 includes stitching processing unit (image processing unit) 31,screen generation unit 32, touchoperation determination unit 33, processingcondition setting unit 34, andimage adjustment unit 35.Stitching processing unit 31 includes disparitycorrection amount calculator 41, panoramicimage generation unit 42, previewoutput switching unit 43,disparity correction unit 44,image composition unit 45, and justfit detection unit 46. - Here, only a portion of processing the captured images of two
cameras 1 is described. As illustrated inFIG. 1 , in the present embodiment, the sevencameras 1 are provided. Disparitycorrection amount calculator 41 is provided for each of combinations of the twoadjacent cameras 1 and panoramicimage generation unit 42, previewoutput switching unit 43, anddisparity correction unit 44 are provided for each ofcameras 1. - Disparity
correction amount calculator 41 calculates the disparity correction amount which defines a deformation degree of the image during disparity correction for each of the frames. Specifically, as illustrated inFIG. 4 , each of processes of collimation (projection to cylinder), processing region clipping, and disparity calculation is performed. In the disparity calculation, disparity (deviation amount) is calculated by block matching between the two captured images. That is, while shifting the two captured images little by little, a difference between the two captured images is calculated and the disparity is calculated from a positional relationship at which the difference is the smallest. - Panoramic
image generation unit 42 makes the captured images respectively output fromcameras 1 to be panoramic images (projection to sphere) and generates the panoramic image (composition source image). First, panoramicimage generation unit 42 generates the panoramic image based on the preset control parameter (processing condition). When processingcondition setting unit 34 sets a temporary control parameter according to the touch operation of the user, panoramicimage generation unit 42 generates the panoramic image based on the temporary control parameter. - Preview
output switching unit 43 normally outputs the panoramic image generated by panoramicimage generation unit 42 todisparity correction unit 44 and outputs the panoramic image generated by panoramicimage generation unit 42 to imagecomposition unit 45 when displaying a preview display screen. Accordingly, when displaying the preview display screen, the image composition process is performed without a disparity correction process. -
Disparity correction unit 44 performs the disparity correction on the panoramic image generated by panoramicimage generation unit 42 based on the disparity correction amount output from disparitycorrection amount calculator 41 to generate a disparity correction image. -
Image composition unit 45 normally performs the image composition process on a plurality of disparity correction images generated bydisparity correction unit 44 to generate the composite image. In addition, when displaying the preview display screen,image composition unit 45 performs the image composition process on a plurality of panoramic images generated by panoramicimage generation unit 42 to generate the composite image for preview. The composite image for preview is normally lower in resolution than the composite image. - When displaying the preview display screen, since only
camera image region 12 to be adjusted is changed according to the touch operation, it is possible to generate a new composite image by superimposingcamera image region 12 to be adjusted on the original composite image. - Just fit
detection unit 46 detects a just fit state in which consistency of the stitching portion betweencamera image region 12 to be adjusted and adjacentcamera image region 12 in the composite image is high. - Here, in order to detect the just fit state, an absolute value of a difference between pixel values is obtained between pixels positioned at the stitching portion of adjacent
camera image regions 12, that is, the pixels adjacent to each other across a boundary line ofcamera image region 12, the absolute values of the differences are added, and a sum of the absolute values of the differences are calculated. Then, based on a magnitude of the sum of the absolute values of the differences, a degree (error) of positional deviation of adjacentcamera image regions 12 is determined and when the sum of the absolute values of the differences becomes a minimum value in the vicinity, the just fit state is determined. - When just
fit detection unit 46 detects the just fit state in this way, the composite images at those timings are continuously output within a predetermined range. Accordingly, on the preview display screen,camera image region 12 to be adjusted is displayed in a stationary state within the predetermined range without interlocking with the touch operation. -
Screen generation unit 32 normally generates a regular display screen on which the composite image generated byimage composition unit 45 performing the image composition process on the disparity correction image is displayed and outputs the regular display screen to displayinput device 4. In addition,screen generation unit 32 generates the preview display screen on which the composite image for preview generated byimage composition unit 45 performing the image composition process on the panoramic image is displayed and outputs the preview display screen to displayinput device 4. - Touch
operation determination unit 33 obtains a detection result of the touch operation which the user performs on the preview display screen displayed ondisplay input device 4, fromdisplay input device 4. Based on the detection result of the touch operation, touchoperation determination unit 33 determinescamera image region 12 to be adjusted incomposite image 11 and determines an adjustment item (pan adjustment, tilt adjustment, rolling adjustment, and zoom adjustment) according to an operation mode of the touch operation. - In the present embodiment, in a case where the touch operation is performed with one or two fingers, one
camera image region 12 on which the touch operation is performed is set as an adjustment target and an individual adjustment of individually adjusting each ofcamera image regions 12 is performed. In addition, in a case where the touch operation is performed with three or more fingers, all ofcamera image regions 12 are set as the adjustment targets and an overall adjustment of integrally adjusting all ofcamera image regions 12 is performed. - Further, in a case where the operation mode of the touch operation is a touch operation for different
camera image regions 12 with two fingers, twocamera image regions 12 on which the touch operation is performed are set as the adjustment targets and it is determined that an image adjustment is performed between twocamera image regions 12. - Based on a determination result of touch
operation determination unit 33, processingcondition setting unit 34 sets a temporary processing condition related tocamera image region 12 to be adjusted. In the present embodiment, as the processing condition, the control parameter related to generation of the panoramic image (composition source image) by panoramicimage generation unit 42 is set. - When processing
condition setting unit 34 sets the temporary processing condition, based on the temporary processing condition,stitching processing unit 31 generates the composition source image from the captured image ofcamera 1 corresponding tocamera image region 12 to be adjusted and updates the composite image for preview by performing the image composition process on the composition source image. -
Image adjustment unit 35 performs the image adjustment between twocamera image regions 12. In the present embodiment, the image adjustment is performed so that at least either tones or brightness in touch regions in twocamera image regions 12 is matched. The image adjustment is executed in a case where the operation mode of the touch operation is a touch operation for differentcamera image regions 12 with two fingers. - In addition, in the present embodiment, the imaging condition of
camera 1, that is, the control parameter related to tones and brightness is set so that the tones and the brightness in twocamera image regions 12 are matched. The control parameter is output fromimage adjustment unit 35 to controlunit 24 ofcamera 1 andcontrol unit 24 controlsimage sensor unit 22 andsignal processing unit 23 based on the control parameter. -
Image processing device 3 illustrated inFIG. 3 is configured by a general-purpose information processing device. Each of units ofimage processing device 3 can be realized by causing a processor to execute a program for the image process stored in a memory. In addition, at least a part ofimage processing device 3 also can be realized by an exclusive hardware (screen processing circuit) for screen processing. - Next, the disparity correction performed by
disparity correction unit 44 will be described.FIG. 5 is an explanatory diagram schematically illustrating states of an image in a case where the disparity correction is not executed and a case where the disparity correction is executed. Here, for convenience of explanation, an example of processing the images of twoadjacent cameras 1 will be described. - (A) of
FIG. 5 illustrates imaging status by twoadjacent cameras 1. In an example illustrated in (A) ofFIG. 5 , a person and a mountain which is a background are simultaneously imaged by twocameras 1. - (B-1) and (B-2) in
FIG. 5 illustrate images captured by twocameras 1. As illustrated in (B-1) and (B-2) ofFIG. 5 , in the captured image by twocameras 1, the image of distant view representing the mountain and the image of close-range view representing the person appear in image regions of boundary portions corresponding to an overlapping portion of the imaging areas of twocameras 1. Here, the mountain of distant view and the person of close-range view have different distances fromcamera 1, so that in the two captured images, a positional relationship between the image of distant view and the image of close-range view deviates. - (C-1) and (C-2) in
FIG. 5 illustrate composite images obtained by performing simple composite on the images captured by twocameras 1 based on the image of distant view. As described above, since the positional relationship between the image of distant view and the image of close-range view deviates in the two captured images, when the two captured images are simply composited based on the image of distant view, as illustrated in (C-1) and (C-2) ofFIG. 5 , a defect of duplicately appearing the image of close-range view or of losing a part of the image of close-range view occurs in the composite image. - Here, a deviation of the image of close-range view when setting the image of distant view as a standard means a disparity between the captured images of two
cameras 1. The positional relationship between the images of distant view respectively captured by twocameras 1 can be obtained in advance in a state in which the image of close-range view does not exist and the images are composited based on the information. On the other hand, when the image of close-range view does not exist, the disparity does not exist in the captured images of twocameras 1 and when the image of close-range view exists, the disparity between the captured images of twocameras 1 exists. - Therefore, in a state in which the image of distant view and the image of close-range view appear in the image regions of the boundary portions in the captured images of two
cameras 1, the disparity generated between the two captured images causes a defect like the image of close-range view duplicately appears in the composite image. Then, the disparity correction for removing the defect is necessary. There are representatively two types of correction methods for the disparity correction. (C-3) inFIG. 5 illustrates a first correction method and (C-4) inFIG. 5 illustrates a second correction method. - As illustrated in (C-3) of
FIG. 5 , in the first correction method, the image is deformed so that the image region in which the image of close-range view mainly appears is horizontally shifted, and then, the captured images of twocameras 1 are composited so as to consist the positional relationship of the image of distant view and the image of close-range view between the captured images of twocameras 1. - As illustrated in (C-4) of
FIG. 5 , in the second correction method, a curved stitching boundary is set so as to avoid the image region in which the image of close-range view appears. After trimming is performed to clip the captured images of twocameras 1 along the stitching boundary, the images are composited. - In this way, by performing the disparity correction, it is possible to generate the appropriate image. In the present embodiment, the first correction method, that is, the disparity correction of deforming the image to horizontally shift the image region in which the image of close-range view mainly appears is performed on the captured images of two
cameras 1 so that the positional relationship of the image of distant view and the image of close-range view between the captured images of twocameras 1 consists. - Next, an art of adjusting
camera image regions 12 will be described.FIG. 6 is an explanatory diagram illustrating the art of adjustingcamera image regions 12. - In the present embodiment, when displaying the preview display screen, panoramic
image generation unit 42 makes the captured images obtained fromcameras 1 to be the panoramic image (projection to sphere) and generates the panoramic image (composition source image). Then, the images are composited by clipping a necessary region from the panoramic image. At this time, a position and a size of each ofcamera image regions 12 oncomposite image 11 are defined according to a projection position (central position of projection region) and a projection size (size of projection region) during making the panoramic image. - On the other hand, by each of
cameras 1 imaging the subject with a wide angle, in capturedimage 51,blank region 53 is formed outside usedregion 52 corresponding tocamera image region 12 finally displayed on the composite image. It is possible to adjust the projection position and the projection size during making the panoramic image within a range ofblank region 53. - Here, in the present embodiment, it is possible to perform the pan adjustment, tilt adjustment, rolling adjustment, and zoom adjustment by the touch operation.
- As illustrated in (A) of
FIG. 6 , in the pan adjustment (position adjustment in horizontal direction),camera image region 12 can be moved in the horizontal direction and at this time, a process of adjusting the projection position of the image in the horizontal direction is performed. As illustrated in (B) ofFIG. 6 , in the tilt adjustment (position adjustment in vertical direction),camera image region 12 can be moved in the vertical direction and at this time, a process of adjusting the projection position of the image in the vertical direction is performed. As illustrated in (C) ofFIG. 6 , in the rolling adjustment (position adjustment in rotation direction),camera image region 12 can be rotated and at this time, a process of adjusting the projection position of the image in the rotation direction is performed. As illustrated in (D) ofFIG. 6 , in the zoom adjustment (extension/contraction adjustment),camera image region 12 can be zoomed out or zoomed in and at this time, a process of adjusting the size of the image is performed. - In addition, in the present embodiment, processing
condition setting unit 34 sets a setting value related to the projection position and the projection size as the control parameter related to generation of the panoramic image (composition source image) according to the touch operation. Based on the control parameter, panoramicimage generation unit 42 adjusts the projection position and the projection size during making the panoramic image. - Here, regarding each of
camera image regions 12, the control parameter related to each of the adjustment items of the pan adjustment, the tilt adjustment, the rolling adjustment, and the zoom adjustment is preset. When the touch operation related to one of the pan adjustment, the tilt adjustment, the rolling adjustment, and the zoom adjustment is performed, the control parameter related to the adjustment item corresponding tocamera image region 12 to be adjusted is updated. - Next, the adjustment item according to the operation mode of the touch operation will be described.
FIG. 7 is an explanatory diagram illustrating a list of the adjustment items according to the operation mode of the touch operation. - In the present embodiment, in a case where the touch operation is performed with one or two fingers, one
camera image region 12 on which the touch operation is performed is set as an adjustment target and an individual adjustment of individually adjusting each ofcamera image regions 12 can be performed. In addition, in a case where the touch operation is performed with three or more fingers, all ofcamera image regions 12 are set as the adjustment targets and an overall adjustment of integrally adjusting all ofcamera image regions 12 can be performed. - In addition, in the present embodiment, the pan adjustment (position adjustment in horizontal direction) of horizontally moving
camera image region 12 to be adjusted, the tilt adjustment (position adjustment in vertical direction) of vertically movingcamera image region 12 to be adjusted, the rolling adjustment (position adjustment in rotation direction) of rotatingcamera image region 12 to be adjusted, and the zoom adjustment (extension/contraction adjustment) of zooming in or zooming outcamera image region 12 to be adjusted can be performed. - Hereinafter, the adjustment item according to the operation mode of the touch operation will be described in detail.
- First, a case where each of
camera image regions 12 is individually adjusted will be described.FIG. 8 is an explanatory diagram illustrating the touch operation in a case of individually adjusting each ofcamera image regions 12 ofhorizontal cameras 1.FIG. 9 is an explanatory diagram illustrating the touch operation in a case of individually adjustingcamera image regions 12 ofupward cameras 1. - In the present embodiment, in a case where the touch operation is performed with one or two fingers, one
camera image region 12 on which the touch operation is performed is set as an adjustment target and it is possible to individually adjust each ofcamera image regions 12. - First, in a case of
camera image region 12 related tohorizontal camera 1, as illustrated in (A) ofFIG. 8 , it is possible to perform the pan adjustment (position adjustment in horizontal direction) of horizontally movingcamera image region 12 to be operated by a swipe operation of sliding one finger touched on the screen in the horizontal direction (left and right direction of screen). - In addition, as illustrated in (B) of
FIG. 8 , it is possible to perform the tilt adjustment (position adjustment in vertical direction) of vertically movingcamera image region 12 to be operated by a swipe operation of sliding one finger touched on the screen in the vertical direction (up and down direction of screen). - In addition, as illustrated in (C) of
FIG. 8 , it is possible to perform the rolling adjustment (position adjustment in rotation direction) of rotatingcamera image region 12 to be operated by a rotation operation of moving two fingers touched on the screen to draw a circle. - In addition, as illustrated in (D) of
FIG. 8 , it is possible to perform the zoom adjustment (extension/contraction adjustment) of zooming in or zooming outcamera image region 12 to be operated by a pinch operation of moving two fingers touched on the screen to widen or narrow a gap between the two fingers. - On the other hand, in a case of
camera image region 12 related toupward camera 1, as illustrated in (A-1) ofFIG. 9 , it is possible to move the image which appears incamera image region 12 in the horizontal direction by a swipe operation of sliding one finger touched on the screen in the horizontal direction (left and right direction of image). - Here, in
camera image region 12 related toupward camera 1, a horizontal movement oncomposite image 11 corresponds to rotating ofprojection region 62 in a latitude line direction onprojection sphere 61 as illustrated in (A-2) ofFIG. 9 . For this reason, when the horizontal swipe operation is performed, the rolling adjustment (position adjustment in rotation direction) of adjusting the projection position of the image in the rotation direction is performed. - In addition, as illustrated in (B-1) of
FIG. 9 , it is possible to move the image which appears incamera image region 12 in the vertical direction by a swipe operation of sliding one finger touched on the screen in the vertical direction (up and down direction of image). - Here, in
camera image region 12 related toupward camera 1, a vertical movement oncomposite image 11 corresponds to moving ofprojection region 62 in a longitude line direction onprojection sphere 61 as illustrated in (B-2) ofFIG. 9 . For this reason, when the vertical swipe operation is performed, the tilt adjustment (position adjustment in diameter direction) of adjusting the projection position of the image in the vertical direction is performed. - Next, a case of integrally adjusting all of
camera image regions 12 will be described.FIGS. 10 and 11 are explanatory diagrams illustrating the touch operation in the case of integrally adjusting all ofcamera image regions 12. - In the present embodiment, in a case where the touch operation is performed with three or more fingers, all of
camera image regions 12 are set as the adjustment targets and it is possible to integrally adjust all ofcamera image regions 12. - First, as illustrated in (A-1) of
FIG. 10 , it is possible to perform the pan adjustment (position adjustment in horizontal direction) of horizontally moving all ofcamera image regions 12 by a swipe operation of sliding three or more fingers touched on the screen in the horizontal direction (left and right direction of screen). At this time, according to the movement ofcamera image region 12 related tohorizontal camera 1, the image which appears incamera image region 12 related toupward camera 1 moves in the horizontal direction. - In this case, by performing the swipe operation greatly,
camera image region 12 related tohorizontal camera 1 is displayed so as to be circulated. - Accordingly, for example, in the state illustrated in (A-1) of
FIG. 10 ,camera image region 12 positioned at a left or right end can be arranged at a center of the screen as illustrated in (A-2) ofFIG. 10 . - In addition, as illustrated in (B-1) of
FIG. 10 , it is possible to perform the tilt adjustment (position adjustment in vertical direction) of vertically moving all ofcamera image regions 12 by a swipe operation of sliding three or more fingers touched on the screen in the vertical direction (up and down direction of screen). - In this case, by performing the swipe operation greatly,
camera image region 12 is displayed so as to be vertically circulated. Accordingly, for example, in the state illustrated in (B-1) ofFIG. 10 ,camera image region 12 related toupward camera 1 positioned at an upper end can be arranged at a center of the screen as illustrated in (B-2) ofFIG. 10 . In this case, by sandwichingcamera image region 12 related toupward camera 1,camera image region 12 related toupward camera 1 is displayed in a state in whichcamera image region 12 is divided into two. - In addition, as illustrated in (A) of
FIG. 11 , it is possible to perform the rolling adjustment (position adjustment in rotation direction) of rotating all ofcamera image regions 12 by a rotation operation of moving three or more fingers touched on the screen to draw a circle. - In addition, as illustrated in (B) of
FIG. 11 , it is possible to perform the zoom adjustment (extension/contraction adjustment) of zooming in or zooming out all ofcamera image regions 12 by the pinch operation of moving three or more fingers touched on the screen to widen or narrow a gap between three or more fingers. - Next, a flow of a process executed by
image processing device 3 according to the touch operation of the user will be described.FIGS. 12, 13 , and 14 are flowcharts illustrating the flow of the process executed byimage processing device 3 according to the touch operation of the user. - First, touch
operation determination unit 33 determines whether or not the touch operation is detected (ST 101). Here, in a case where the touch operation is detected (Yes in ST 101), the disparity correction is released (ST 102). Then, it is determined whether or not the touch operation is an operation with one finger (ST 103). - Here, in a case where the touch operation is the operation with one finger (Yes in ST 103), next, it is determined whether or not the operation target is
camera image region 12 related to upward camera 1 (ST 104). - Here, in a case where the operation target is not
camera image region 12 related toupward camera 1, that is, the operation target iscamera image region 12 related to horizontal camera 1 (No in ST 104), next, it is determined whether or not the touch operation is the swipe operation in the horizontal direction (ST 105). Here, in the case of the swipe operation in the horizontal direction (Yes in ST 105), processingcondition setting unit 34 executes the pan adjustment (position adjustment in horizontal direction) according to a finger movement and updates the control parameter of pan (ST 106). - Here, in a case where the touch operation is not the swipe operation in the horizontal direction (No in ST 105), next, it is determined whether or not the touch operation is the swipe operation in the vertical direction (ST 107). Here, in the case of the swipe operation in the vertical direction (Yes in ST 107), processing
condition setting unit 34 executes the tilt adjustment (position adjustment in vertical direction) according to the finger movement and updates the control parameter of tilt (ST 108). - On the other hand, in a case where the operation target is
camera image region 12 related to upward camera 1 (Yes in ST 104), next, it is determined whether or not the touch operation is the swipe operation in the horizontal direction (ST 109). Here, in the case of the swipe operation in the horizontal direction (Yes in ST 109), processingcondition setting unit 34 executes the rolling adjustment (position adjustment in rotation direction) according to the finger movement and updates the control parameter of rolling (ST 110). - Here, in a case where the touch operation is not the swipe operation in the horizontal direction (No in ST 109), next, it is determined whether or not the touch operation is the swipe operation in the vertical direction (ST 111). Here, in the case of the swipe operation in the vertical direction (Yes in ST 111), processing
condition setting unit 34 executes the tilt adjustment (position adjustment in vertical direction) according to the finger movement and updates the control parameter of tilt (ST 112). - On the other hand, in a case where the touch operation is not the operation with one finger (No in ST 103), next, it is determined whether or not the touch operation is the operation with two fingers (ST 113).
- Here, in a case where the touch operation is the operation with two fingers (Yes in ST 113), next, it is determined whether or not the touch operation is the rotation operation (ST 114). Here, in the case of the rotation operation (Yes in ST 114), processing
condition setting unit 34 executes the rolling adjustment (position adjustment in rotation direction) according to the finger movement and updates the control parameter of rolling (ST 115). - In addition, in a case where the touch operation is not the rotation operation (No in ST 114), next, it is determined whether or not the touch operation is the pinch operation (ST 116). Here, in the case of the pinch operation (Yes in ST 116), processing
condition setting unit 34 executes the zoom adjustment (extension/contraction adjustment) according to the finger movement and updates the control parameter of zoom (ST 117). - On the other hand, in a case where the touch operation is not the operation with two fingers (No in ST 113), it is determined that the touch operation is the operation with three or more fingers and next, it is determined whether or not the touch operation is the swipe operation in the horizontal direction (ST 118). Here, in the case of the swipe operation in the horizontal direction (Yes in ST 118), processing
condition setting unit 34 executes the pan adjustment (position adjustment in horizontal direction) according to a finger movement and updates the control parameter of pan (ST 119). - Here, in a case where the touch operation is not the swipe operation in the horizontal direction (No in ST 118), next, it is determined whether or not the touch operation is the swipe operation in the vertical direction (ST 120). Here, in the case of the swipe operation in the vertical direction (Yes in ST 120), processing
condition setting unit 34 executes the tilt adjustment (position adjustment in vertical direction) according to the finger movement and updates the control parameter of tilt (ST 121). - On the other hand, in a case where the touch operation is not the swipe operation in the vertical direction (No in ST 120), next, it is determined whether or not the touch operation is the rotation operation (ST 122). Here, in the case of the rotation operation (Yes in ST 122), processing
condition setting unit 34 executes the rolling adjustment (position adjustment in rotation direction) according to the finger movement and updates the control parameter of rolling (ST 123). - In addition, in a case where the touch operation is not the rotation operation (No in ST 122), next, it is determined whether or not the touch operation is the pinch operation (ST 124). Here, in the case of the pinch operation (Yes in ST 124), processing
condition setting unit 34 executes the zoom adjustment (extension/contraction adjustment) according to the finger movement and updates the control parameter of zoom (ST 125). - As described above, when the control parameters of the pan, the tilt, the rolling, and the zoom are updated by the touch operation of the user, panoramic
image generation unit 42 ofstitching processing unit 31 performs the process of generating the panoramic image based on the updated control parameter and next,image composition unit 45 performs the process of compositing the panoramic images and the generated composite image is displayed on the preview display screen (ST 126). - Next, an image process of improving visibility of
camera image region 12 to be adjusted will be described.FIG. 15 is an explanatory diagram illustrating the image process of improving the visibility ofcamera image region 12 to be adjusted. - In the present embodiment,
composite image 11 changes a tone ofcamera image region 12 not to be adjusted from the initial tone, so that the image process is performed onimage composition unit 45 to excellently improve visibility ofcamera image region 12 to be adjusted. Specifically, in the present embodiment, at least an image process of changing luminance is performed oncamera image region 12 not to be adjusted. In the example illustrated inFIG. 15 , the image process of lowering the luminance is performed oncamera image region 12 not to be adjusted andcamera image region 12 not to be adjusted is grayed out and darkly displayed. - Accordingly, in the preview display screen on which
composite image 11 is displayed,camera image region 12 to be adjusted becomes conspicuous and the user can intuitively graspcamera image region 12 to be adjusted. - Next, control of causing the user to recognize the just fit state of
camera image region 12 to be adjusted will be described.FIG. 16 is an explanatory diagram illustrating the control of causing the user to recognize the just fit state ofcamera image region 12 to be adjusted. - In the present embodiment, when a touch operation of adjusting
camera image region 12 is performed on the preview display screen, the processing condition (control parameter related to generation of panoramic image) ofstitching processing unit 31 is regularly updated according to the touch operation, the composition source image is updated according to this, and the composite image composited from the composition source images is regularly output fromstitching processing unit 31. Accordingly, the composite image is displayed on the preview display screen as a video, at this time, an animation in whichcamera image region 12 to be operated changes in conjunction with the finger movement is executed. - In addition, in the present embodiment, just
fit detection unit 46 detects the just fit state in which consistency of the stitching portion betweencamera image region 12 to be operated and adjacentcamera image region 12 is high. Then, when justfit detection unit 46 detects the just fit state, the animation ofcamera image region 12 to be operated, that is, the movement ofcamera image region 12 in conjunction with the finger movement is stopped. - At this time,
image composition unit 45 continuously outputs the composite image within a predetermined range based on the composition source image (panoramic image) obtained at a timing at which the just fit state is detected. Accordingly, on the preview display screen,camera image region 12 to be operated is displayed in a stationary state without interlocking with the finger movement. - In addition, the animation is stopped within a predetermined range and in a case of an outside of the predetermined range, the stop of animation is released. In this case, a timing of releasing the stop of the animation can be defined by a distance. That is, when the distance from a position of the just fit state to a touch position becomes larger than a predetermined distance, the stop of the animation is released. In addition, the timing of releasing the stop of the animation can be defined by a time. That is, when a predetermined time (for example, 1 second) elapses after the just fit state is detected, the stop of the animation is released.
- In the example illustrated in
FIG. 16 , the swipe operation of sliding one finger touched on the screen in the vertical direction (up and down direction of image) is performed. As illustrated in (A) ofFIG. 16 , when the finger moves,camera image region 12 to be operated is moved in the vertical direction according to the finger movement. Then, at the timing illustrated in (B) ofFIG. 16 , when the just fit state is detected, as illustrated in (C) ofFIG. 16 , even if the finger further moves,camera image region 12 to be operated is stopped as it is. As illustrated in (D) ofFIG. 16 , when the finger further moves,camera image region 12 to be operated is moved in the vertical direction according to the finger movement. - In the present embodiment, when the just fit state of
camera image region 12 to be adjusted is detected, the animation ofcamera image region 12 is stopped. Accordingly, the user can intuitively grasp the just fit state and it is possible to quickly perform the adjustment operation. - In the present embodiment, by stopping the animation of
camera image region 12 to be operated, the user recognizes the just fit state. A method of causing the user to recognize the just fit state is not limited thereto. For example, at the timing of the just fit,camera image region 12 may be flashed (blinked). In addition, at the timing of the just fit, a frame line surroundingcamera image region 12 may be displayed. - Next, a touch operation when a tone and brightness of
camera image region 12 are adjusted will be described.FIG. 17 is an explanatory diagram illustrating the touch operation when the tone and the brightness ofcamera image region 12 are adjusted. - In the present embodiment, in a case where the touch operation is simultaneously performed on two
camera image regions 12 with two fingers, twocamera image regions 12 on which the touch operation is performed are set as the adjustment targets andimage adjustment unit 35 performs the image adjustment between twocamera image regions 12. - Specifically, in the present embodiment, the image adjustment is performed so that the tones and the brightness of touch regions 71 (predetermined range around touch position) in two
camera image regions 12 on which the touch operation is performed are matched. In the example illustrated inFIG. 17 , the touch operation is performed in a region of a sky in twocamera image regions 12 so that the tones and the brightness of the sky in twocamera image regions 12 are matched. - Accordingly, in a case where a difference in the tones and the brightness occurs in two
camera image regions 12 in the region in which the tones and the brightness should be originally matched, by performing the touch operation on the region, it is possible to easily match the tones and the brightness of twocamera image regions 12. - In the image adjustment, information on the tones and brightness of the touch region in two
camera image regions 12 to be adjusted is obtained andcamera 1 of an imaging source of twocamera image regions 12 is controlled based on the information so that the tones and the brightness of the touch region in twocamera image regions 12 are matched. - At this time, the imaging condition of
camera 1, that is, the control parameter related to tones and brightness is adjusted so that the tones and the brightness in twocamera image regions 12 are matched. Specifically, a white balance gain value insignal processing unit 23 is adjusted as a control parameter related to the tone (white balance). In addition, a shutter value (shutter speed) and a sensor gain value inimage sensor unit 22, and a digital gain value insignal processing unit 23 are adjusted as control parameters related to brightness (exposure). - In the present embodiment, the touch operation is simultaneously performed on two
camera image regions 12 with two fingers, so that the image adjustment of twocamera image regions 12 is performed. When the touch operation is simultaneously performed on three or morecamera image regions 12 with three or more fingers, the image adjustment of three or morecamera image regions 12 can be performed. In addition, in the present embodiment, the control is performed so that both of the tones and the brightness are matched, but one of the tones and the brightness, for example, the control may be performed so that only the tones are matched. - As described above, the embodiment is described as an example of a technique disclosed in the present application. However, the technique in the present disclosure is not limited thereto. The technique also can be applied to the embodiment being changed, exchanged, added, and omitted. In addition, it is also possible to form a new embodiment by combining components in the embodiment described above.
- For example, in the embodiment described above, the image processing device (including a PC) provided separately from the camera unit having the plurality of cameras performs the image composition process and the like and outputs the composite image, but the embodiment can also be configured as a camera unit (imaging device) on which the image processing device is mounted.
- In addition, the embodiment described above is configured to include the upward camera and the horizontal camera, but the embodiment may be configured to include a downward camera instead of the upward camera. Further, the embodiment may be configured to include both of the upward camera and the downward camera.
- In addition, in the embodiment described above, when the preview display screen is displayed, the preview output switching unit outputs the input composition source image (panoramic image) to the image composition unit without inputting it to the disparity correction unit; alternatively, the composition source image may be input to the disparity correction unit and the disparity correction amount may be set to 0, so that the disparity correction process is not actually performed.
- Further, in the embodiment described above, the image processing device which generates the normal composite image executes the calibration for securing consistency between the camera image regions in the composite image, but a dedicated device for the calibration may be provided, or a dedicated program for the calibration may be installed in the information processing device so that the calibration is executed.
- An image processing device, an imaging system that includes the image processing device, and a calibration method according to the present disclosure are advantageous in that a user can conveniently perform adjustment work for calibration of securing consistency between camera image regions in a composite image, and are useful as an image processing device that outputs a composite image by performing an image composition process on an image for each of a plurality of cameras, an imaging system that includes the image processing device, a calibration method of securing consistency between camera image regions in the composite image, and the like.
- 1 CAMERA
- 2 CAMERA UNIT
- 3 IMAGE PROCESSING DEVICE
- 4 DISPLAY INPUT DEVICE
- 11 COMPOSITE IMAGE
- 12 CAMERA IMAGE REGION
- 31 STITCHING PROCESSING UNIT (IMAGE PROCESSING UNIT)
- 32 SCREEN GENERATION UNIT
- 33 TOUCH OPERATION DETERMINATION UNIT
- 34 PROCESSING CONDITION SETTING UNIT
- 35 IMAGE ADJUSTMENT UNIT
- 41 DISPARITY CORRECTION AMOUNT CALCULATOR
- 42 PANORAMIC IMAGE GENERATION UNIT
- 43 PREVIEW OUTPUT SWITCHING UNIT
- 44 DISPARITY CORRECTION UNIT
- 45 IMAGE COMPOSITION UNIT
- 46 JUST FIT DETECTION UNIT
Claims (14)
1. An image processing device that outputs a composite image by performing an image composition process on an image for each of a plurality of cameras, the image processing device comprising:
a processor; and
a memory including instructions that, when executed by the processor, cause the processor to perform operations, the operations including:
generating a plurality of composition source images from the image for each of the plurality of cameras based on a preset processing condition;
performing the image composition process on the plurality of composition source images to generate a composite image for preview, the composite image including a plurality of camera image regions, each of the plurality of camera image regions corresponding to one of the plurality of composition source images and one of the plurality of cameras;
generating a preview display screen on which the composite image is displayed and outputting the preview display screen to a display;
detecting a first touch operation that a user performs on the preview display screen displayed on the display and determining a first adjustment item according to the first touch operation, the first touch operation being performed with one or two fingers and in relation to a first camera image region of the plurality of camera image regions, the first camera image region corresponding to a first composition source image of the plurality of composition source images and a first camera of the plurality of cameras;
first adjusting at least one of the first composition source image or the first camera according to the first adjustment item and updating the composite image;
detecting a second touch operation that the user performs on the preview display screen displayed on the display and determining a second adjustment item, the second touch operation being performed with at least three fingers; and
second adjusting at least one of all composition source images of the plurality of composition source images or all cameras of the plurality of cameras according to the second adjustment item and updating the composite image.
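By way of a non-limiting illustration, the generate-then-compose portion of the operations recited in claim 1 might look as follows in Python. All names (`generate_sources`, `compose`) and the crop-width processing condition are hypothetical; side-by-side stitching stands in for the full image composition process.

```python
import numpy as np

def generate_sources(camera_frames: list, condition: dict) -> list:
    # Generate composition source images under a preset processing
    # condition; here the condition is just a crop width (an assumption).
    w = condition["crop_width"]
    return [frame[:, :w] for frame in camera_frames]

def compose(sources: list) -> np.ndarray:
    # Trivial stand-in for the image composition process: stitch the
    # composition source images side by side into one composite.
    return np.hstack(sources)

frames = [np.zeros((4, 8, 3), dtype=np.uint8) for _ in range(3)]  # three cameras
composite = compose(generate_sources(frames, {"crop_width": 6}))
print(composite.shape)  # (4, 18, 3): three 6-pixel-wide camera image regions
```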
2. The image processing device of claim 1 ,
wherein the operations further include:
detecting a third touch operation that the user performs on the preview display screen displayed on the display and determining a third adjustment item, the third touch operation being performed with two fingers and in relation to two camera image regions; and
in a case where the third touch operation is performed, performing an image adjustment between the two camera image regions according to the third adjustment item.
3. The image processing device of claim 1 ,
wherein the operations further include:
in a case where the first touch operation is performed, setting the first camera image region on which the first touch operation is performed as an adjustment target and determining the first adjustment item according to the first touch operation.
4. The image processing device of claim 3 ,
wherein the first adjusting includes:
performing a pan adjustment in a case where the first touch operation is a swipe operation in a horizontal direction with one finger, performing a tilt adjustment in a case where the first touch operation is a swipe operation in a vertical direction with one finger, performing a rolling adjustment in a case where the first touch operation is a rotation operation with two fingers, and performing a zoom adjustment in a case where the first touch operation is a pinch operation with two fingers.
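The one- and two-finger mapping of claim 4 amounts to a small gesture classifier. The following sketch is one plausible reading; the thresholds and the function name are assumptions, not part of the claim.

```python
def classify_gesture(fingers: int, dx: float, dy: float,
                     rotation_deg: float, pinch_scale: float) -> str:
    # Map a detected touch gesture to an adjustment item. dx/dy are the
    # swipe displacement, rotation_deg the relative finger rotation, and
    # pinch_scale the change in finger spacing (1.0 = unchanged).
    if fingers == 1:
        # One-finger swipe: the dominant axis selects pan or tilt.
        return "pan" if abs(dx) >= abs(dy) else "tilt"
    if fingers == 2:
        if abs(rotation_deg) > 5.0:        # rotation operation -> rolling
            return "rolling"
        if abs(pinch_scale - 1.0) > 0.05:  # pinch operation -> zoom
            return "zoom"
    return "none"

print(classify_gesture(1, dx=40.0, dy=3.0, rotation_deg=0.0, pinch_scale=1.0))  # pan
```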
5. The image processing device of claim 1 ,
wherein the operations further include:
performing image processing of changing a tone of at least one of the plurality of camera image regions not to be adjusted in the composite image from an initial tone.
6. The image processing device of claim 5 ,
wherein the operations further include:
performing image processing of changing at least luminance of the at least one of the plurality of camera image regions not to be adjusted.
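As one possible realization of claims 5 and 6, the regions that are not adjustment targets can be visually de-emphasized by lowering their luminance. A sketch under that assumption (scaling RGB is a crude proxy for a proper luma-space adjustment):

```python
import numpy as np

def dim_non_targets(regions: dict, target_id: int, factor: float = 0.5) -> dict:
    # Scale down the brightness of every camera image region except the
    # one being adjusted, so the adjustment target stands out on the
    # preview display screen.
    return {
        rid: img if rid == target_id
        else np.clip(img.astype(np.float32) * factor, 0, 255).astype(np.uint8)
        for rid, img in regions.items()
    }
```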
7. The image processing device of claim 1 ,
wherein the operations further include:
in a case where the second touch operation is performed with at least three fingers, setting all of the plurality of camera image regions on which the second touch operation is performed as adjustment targets and determining the second adjustment item according to the second touch operation.
8. The image processing device of claim 1 ,
wherein the second adjusting includes:
performing a pan adjustment in a case where the second touch operation is a swipe operation in a horizontal direction with at least three fingers, performing a tilt adjustment in a case where the second touch operation is a swipe operation in a vertical direction with at least three fingers, performing a rolling adjustment in a case where the second touch operation is a rotation operation with at least three fingers, and performing a zoom adjustment in a case where the second touch operation is a pinch operation with at least three fingers.
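Taken together, claims 3-4 and 7-8 say that the finger count selects the scope of the adjustment while the gesture kind selects the adjustment item. A one-line dispatch captures the scope rule; the region identifiers here are hypothetical.

```python
def adjustment_targets(fingers: int, touched_region: int, all_regions: list) -> list:
    # One or two fingers adjust only the touched camera image region;
    # three or more fingers adjust every region (and hence every camera).
    return list(all_regions) if fingers >= 3 else [touched_region]

print(adjustment_targets(2, touched_region=1, all_regions=[0, 1, 2]))  # [1]
print(adjustment_targets(3, touched_region=1, all_regions=[0, 1, 2]))  # [0, 1, 2]
```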
9. The image processing device of claim 1 ,
wherein the operations further include:
detecting a just fit state in which consistency of a stitching portion between the first camera image region to be adjusted and a second camera image region adjacent to the first camera image region to be adjusted in the composite image is high, and
in a case where the just fit state is detected, outputting the composite image within a predetermined range at a timing at which the just fit state is detected.
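The claim leaves the consistency measure open. One simple choice is the mean absolute pixel difference across the overlap of the stitching portion, sketched below; the metric, the overlap handling, and the threshold are all assumptions.

```python
import numpy as np

def seam_consistency(left: np.ndarray, right: np.ndarray, overlap: int) -> float:
    # Compare the right edge of one camera image region with the left
    # edge of its neighbor over `overlap` columns; 1.0 means identical.
    a = left[:, -overlap:].astype(np.float32)
    b = right[:, :overlap].astype(np.float32)
    return 1.0 - float(np.mean(np.abs(a - b))) / 255.0

def just_fit(left: np.ndarray, right: np.ndarray,
             overlap: int, threshold: float = 0.97) -> bool:
    # The "just fit" state: seam consistency exceeds a chosen threshold,
    # at which point the composite image can be captured and output.
    return seam_consistency(left, right, overlap) >= threshold
```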
10. The image processing device of claim 1 ,
wherein the operations further include:
detecting a third touch operation that the user performs on the preview display screen displayed on the display and determining a third adjustment item, the third touch operation being performed with two fingers simultaneously and in relation to two camera image regions; and
in a case where the third touch operation is performed, performing an image adjustment so that at least one of tone or brightness in the two camera image regions is matched.
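One plausible implementation of the matching in claim 10 is to gain-match each of the two simultaneously touched regions toward their common mean luminance. A sketch under that assumption:

```python
import numpy as np

def match_brightness(region_a: np.ndarray, region_b: np.ndarray):
    # Scale both regions toward the average of their mean intensities so
    # that their brightness is matched after the adjustment.
    mean_a = float(region_a.mean()) or 1.0  # guard against all-black input
    mean_b = float(region_b.mean()) or 1.0
    target = (mean_a + mean_b) / 2.0

    def rescale(img: np.ndarray, mean: float) -> np.ndarray:
        out = img.astype(np.float32) * (target / mean)
        return np.clip(out, 0, 255).astype(np.uint8)

    return rescale(region_a, mean_a), rescale(region_b, mean_b)
```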
11. The image processing device of claim 1 ,
wherein the operations further include:
performing a disparity correction process on the plurality of composition source images before the image composition process; and
refraining from performing the disparity correction process on the plurality of composition source images after the preview display screen is displayed.
12. An imaging system comprising:
the image processing device according to claim 1 ;
the plurality of cameras; and
the display.
13. The image processing device of claim 1 , wherein, in the second adjusting, all cameras of the plurality of cameras are adjusted according to the second adjustment item.
14. A calibration method of securing consistency between camera image regions in a composite image generated by performing an image composition process on an image for each of a plurality of cameras, the calibration method comprising:
generating a plurality of composition source images from the image for each of the plurality of cameras based on a preset processing condition and performing the image composition process on the plurality of composition source images to generate a composite image for preview, the composite image including a plurality of camera image regions, each of the plurality of camera image regions corresponding to one of the plurality of composition source images and one of the plurality of cameras;
generating a preview display screen on which the composite image is displayed and outputting the preview display screen to a display;
detecting a first touch operation that the user performs on the preview display screen displayed on the display and determining a first adjustment item according to the first touch operation, the first touch operation being performed with one or two fingers and in relation to a first camera image region of the plurality of camera image regions, the first camera image region corresponding to a first composition source image of the plurality of composition source images and a first camera of the plurality of cameras;
adjusting at least one of the first composition source image or the first camera according to the first adjustment item and updating the composite image;
detecting a second touch operation that the user performs on the preview display screen displayed on the display and determining a second adjustment item, the second touch operation being performed with at least three fingers; and
adjusting at least one of all composition source images of the plurality of composition source images or all cameras of the plurality of cameras according to the second adjustment item and updating the composite image.
Priority Applications (1)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
US16/952,827 US20210073942A1 (en) | 2016-04-25 | 2020-11-19 | Image processing device, imaging system provided therewith, and calibration method |
Applications Claiming Priority (5)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
JP2016-087443 | 2016-04-25 | ||
JP2016087443A JP6323729B2 (en) | 2016-04-25 | 2016-04-25 | Image processing apparatus, imaging system including the same, and calibration method |
PCT/JP2017/014327 WO2017187923A1 (en) | 2016-04-25 | 2017-04-06 | Image processing device, imaging system provided therewith, and calibration method |
US201816088697A | 2018-09-26 | 2018-09-26 | |
US16/952,827 US20210073942A1 (en) | 2016-04-25 | 2020-11-19 | Image processing device, imaging system provided therewith, and calibration method |
Related Parent Applications (2)
Application Number | Title | Priority Date | Filing Date |
---|---|---|---|
US16/088,697 Continuation US10872395B2 (en) | 2016-04-25 | 2017-04-06 | Image processing device, imaging system provided therewith, and calibration method |
PCT/JP2017/014327 Continuation WO2017187923A1 (en) | 2016-04-25 | 2017-04-06 | Image processing device, imaging system provided therewith, and calibration method |
Publications (1)
Publication Number | Publication Date |
---|---|
US20210073942A1 (en) | 2021-03-11 |
Family
ID=60161597
Family Applications (2)
Application Number | Title | Priority Date | Filing Date |
---|---|---|---|
US16/088,697 Active 2037-08-05 US10872395B2 (en) | 2016-04-25 | 2017-04-06 | Image processing device, imaging system provided therewith, and calibration method |
US16/952,827 Abandoned US20210073942A1 (en) | 2016-04-25 | 2020-11-19 | Image processing device, imaging system provided therewith, and calibration method |
Family Applications Before (1)
Application Number | Title | Priority Date | Filing Date |
---|---|---|---|
US16/088,697 Active 2037-08-05 US10872395B2 (en) | 2016-04-25 | 2017-04-06 | Image processing device, imaging system provided therewith, and calibration method |
Country Status (3)
Country | Link |
---|---|
US (2) | US10872395B2 (en) |
JP (1) | JP6323729B2 (en) |
WO (1) | WO2017187923A1 (en) |
Families Citing this family (24)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
AR106558A1 (en) | 2015-11-03 | 2018-01-24 | Spraying Systems Co | APPARATUS AND SPRAY DRYING METHOD |
US9854156B1 (en) | 2016-06-12 | 2017-12-26 | Apple Inc. | User interface for camera effects |
DK180859B1 (en) | 2017-06-04 | 2022-05-23 | Apple Inc | USER INTERFACE CAMERA EFFECTS |
US11112964B2 (en) | 2018-02-09 | 2021-09-07 | Apple Inc. | Media capture lock affordance for graphical user interface |
DE102018202707A1 (en) | 2018-02-22 | 2019-08-22 | Fraunhofer-Gesellschaft zur Förderung der angewandten Forschung e.V. | Generation of panoramic pictures |
US11722764B2 (en) | 2018-05-07 | 2023-08-08 | Apple Inc. | Creative camera |
US10375313B1 (en) | 2018-05-07 | 2019-08-06 | Apple Inc. | Creative camera |
JPWO2020039605A1 (en) * | 2018-08-20 | 2021-08-26 | コニカミノルタ株式会社 | Gas detectors, information processing devices and programs |
DK201870623A1 (en) | 2018-09-11 | 2020-04-15 | Apple Inc. | User interfaces for simulated depth effects |
US10645294B1 (en) | 2019-05-06 | 2020-05-05 | Apple Inc. | User interfaces for capturing and managing visual media |
US11770601B2 (en) | 2019-05-06 | 2023-09-26 | Apple Inc. | User interfaces for capturing and managing visual media |
US11128792B2 (en) | 2018-09-28 | 2021-09-21 | Apple Inc. | Capturing and displaying images with multiple focal planes |
US11321857B2 (en) | 2018-09-28 | 2022-05-03 | Apple Inc. | Displaying and editing images with depth information |
JP6790038B2 (en) * | 2018-10-03 | 2020-11-25 | キヤノン株式会社 | Image processing device, imaging device, control method and program of image processing device |
US11706521B2 (en) | 2019-05-06 | 2023-07-18 | Apple Inc. | User interfaces for capturing and managing visual media |
US11991450B2 (en) | 2019-05-27 | 2024-05-21 | Sony Group Corporation | Composition control device, composition control method, and program |
US11039074B1 (en) | 2020-06-01 | 2021-06-15 | Apple Inc. | User interfaces for managing media |
CN113965743A (en) * | 2020-07-21 | 2022-01-21 | 珠海格力电器股份有限公司 | Image shooting method and device and electronic equipment |
US11212449B1 (en) * | 2020-09-25 | 2021-12-28 | Apple Inc. | User interfaces for media capture and management |
US11394851B1 (en) * | 2021-03-05 | 2022-07-19 | Toshiba Tec Kabushiki Kaisha | Information processing apparatus and display method |
CN113032217B (en) * | 2021-03-26 | 2023-03-10 | 山东英信计算机技术有限公司 | Cluster monitoring method and related device |
US11539876B2 (en) | 2021-04-30 | 2022-12-27 | Apple Inc. | User interfaces for altering visual media |
US11778339B2 (en) | 2021-04-30 | 2023-10-03 | Apple Inc. | User interfaces for altering visual media |
US12112024B2 (en) | 2021-06-01 | 2024-10-08 | Apple Inc. | User interfaces for managing media styles |
Family Cites Families (2)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
JP2008070968A (en) * | 2006-09-12 | 2008-03-27 | Funai Electric Co Ltd | Display processor |
JP2010050842A (en) | 2008-08-22 | 2010-03-04 | Sony Taiwan Ltd | High dynamic stitching method for multi-lens camera system |
- 2016-04-25: JP application JP2016087443A granted as patent JP6323729B2 (active)
- 2017-04-06: US application US16/088,697 granted as patent US10872395B2 (active)
- 2017-04-06: PCT application PCT/JP2017/014327 published as WO2017187923A1 (application filing)
- 2020-11-19: US application US16/952,827 published as US20210073942A1 (abandoned)
Patent Citations (14)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
US6324545B1 (en) * | 1997-10-15 | 2001-11-27 | Colordesk Ltd. | Personalized photo album |
US20130093727A1 (en) * | 2002-11-04 | 2013-04-18 | Neonode, Inc. | Light-based finger gesture user interface |
US20040252884A1 (en) * | 2003-06-12 | 2004-12-16 | Fuji Xerox Co., Ltd. | Methods for multisource color normalization |
US20060097991A1 (en) * | 2004-05-06 | 2006-05-11 | Apple Computer, Inc. | Multipoint touchscreen |
US20100328306A1 (en) * | 2008-02-19 | 2010-12-30 | The Board Of Trustees Of The University Of Illinois | Large format high resolution interactive display |
US8677282B2 (en) * | 2009-05-13 | 2014-03-18 | International Business Machines Corporation | Multi-finger touch adaptations for medical imaging systems |
US20110085016A1 (en) * | 2009-10-14 | 2011-04-14 | Tandberg Telecom As | Device, computer program product and method for providing touch control of a video conference |
US20110195781A1 (en) * | 2010-02-05 | 2011-08-11 | Microsoft Corporation | Multi-touch mouse in gaming applications |
US20110197147A1 (en) * | 2010-02-11 | 2011-08-11 | Apple Inc. | Projected display shared workspaces |
US20120282974A1 (en) * | 2011-05-03 | 2012-11-08 | Green Robert M | Mobile device controller application for any security system |
US20140082491A1 (en) * | 2011-05-26 | 2014-03-20 | Panasonic Corporation | Electronic device and editing method for synthetic image |
US20140215365A1 (en) * | 2013-01-25 | 2014-07-31 | Morpho, Inc | Image display apparatus, image displaying method and program |
US20150356735A1 (en) * | 2013-01-30 | 2015-12-10 | Fujitsu Ten Limited | Image processing apparatus and image processing method |
US20150277705A1 (en) * | 2014-03-25 | 2015-10-01 | Youlapse Oy | Graphical user interface user input technique for choosing and combining digital images as video |
Non-Patent Citations (3)
Title |
---|
Edelmann et al. ("The DabR - A multitouch system for intuitive 3D scene navigation," 3DTV Conference: The True Vision - Capture, Transmission and Display of 3D Video; Date of Conference: 04-06 May 2009) (Year: 2009) * |
Li et al. ("Analyzing Algorithm of Multi-camera Multi-touch System for Educational Application," Second International Conference on Education Technology and Training; Date of Conference: 13-14 December 2009) (Year: 2009) * |
Schoeffmann et al. ("Video navigation on tablets with multi-touch gestures," IEEE International Conference on Multimedia and Expo Workshops (ICMEW), 14-18 July 2014) (Year: 2014) * |
Also Published As
Publication number | Publication date |
---|---|
JP6323729B2 (en) | 2018-05-16 |
WO2017187923A1 (en) | 2017-11-02 |
US10872395B2 (en) | 2020-12-22 |
JP2017199982A (en) | 2017-11-02 |
US20190114740A1 (en) | 2019-04-18 |
Similar Documents
Publication | Publication Date | Title |
---|---|---|
US20210073942A1 (en) | Image processing device, imaging system provided therewith, and calibration method | |
JP4699040B2 (en) | Automatic tracking control device, automatic tracking control method, program, and automatic tracking system | |
CN105196917B (en) | Panoramic view monitoring device for image and its method of work | |
US9179069B2 (en) | Photographing device, portable information processing terminal, monitor display method for photographing device, and program | |
JP2017199982A5 (en) | ||
JP2004260785A (en) | Projector with distortion correction function | |
US11190747B2 (en) | Display control apparatus, display control method, and storage medium | |
JP6589294B2 (en) | Image display device | |
JP2003219324A (en) | Image correction data calculation method, image correction data calculation apparatus, and multi- projection system | |
JP5066497B2 (en) | Face detection apparatus and method | |
JP2009031334A (en) | Projector and projection method for projector | |
JP2011039454A5 (en) | Focus adjustment device, focus adjustment method and program | |
JP2013236215A (en) | Display video forming apparatus and display video forming method | |
JPWO2019026746A1 (en) | Image processing apparatus and method, imaging apparatus, and program | |
JP2005318652A (en) | Projector with distortion correcting function | |
US10955235B2 (en) | Distance measurement apparatus and distance measurement method | |
US10861188B2 (en) | Image processing apparatus, medium, and method | |
US11416978B2 (en) | Image processing apparatus, control method and non-transitory computer-readable recording medium therefor | |
CN114822331A (en) | Display method and display system | |
EP3547663B1 (en) | Panaoramic vision system with parallax mitigation | |
US11830177B2 (en) | Image processing apparatus, control method and non-transitory computer-readable recording medium therefor | |
US10359616B2 (en) | Microscope system, method and computer-readable storage device storing instructions for generating joined images | |
RU2672966C1 (en) | Method of visual observation and control of mechanical processing | |
JP2010130309A (en) | Imaging device | |
JP2018137786A (en) | Picture processing device and imaging system comprising the same, and calibration method |
Legal Events
Date | Code | Title | Description |
---|---|---|---|
| STPP | Information on status: patent application and granting procedure in general | Free format text: APPLICATION DISPATCHED FROM PREEXAM, NOT YET DOCKETED |
| STPP | Information on status: patent application and granting procedure in general | Free format text: DOCKETED NEW CASE - READY FOR EXAMINATION |
| STPP | Information on status: patent application and granting procedure in general | Free format text: NON FINAL ACTION MAILED |
| STCB | Information on status: application discontinuation | Free format text: ABANDONED -- FAILURE TO RESPOND TO AN OFFICE ACTION |