WO2017077650A1 - Image processing device, image processing method, and image processing program - Google Patents
Image processing device, image processing method, and image processing program
- Publication number
- WO2017077650A1 (PCT/JP2015/081358)
- Authority
- WO
- WIPO (PCT)
- Prior art keywords
- image
- dimensional object
- imaging device
- overhead
- area
- Prior art date
Classifications
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06T—IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
- G06T3/00—Geometric image transformation in the plane of the image
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06T—IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
- G06T3/00—Geometric image transformation in the plane of the image
- G06T3/40—Scaling the whole image or part thereof
- G06T3/4038—Scaling the whole image or part thereof for image mosaicing, i.e. plane images composed of plane sub-images
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06T—IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
- G06T7/00—Image analysis
- G06T7/10—Segmentation; Edge detection
- G06T7/174—Segmentation; Edge detection involving the use of two or more images
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06T—IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
- G06T7/00—Image analysis
- G06T7/70—Determining position or orientation of objects or cameras
-
- H—ELECTRICITY
- H04—ELECTRIC COMMUNICATION TECHNIQUE
- H04N—PICTORIAL COMMUNICATION, e.g. TELEVISION
- H04N23/00—Cameras or camera modules comprising electronic image sensors; Control thereof
- H04N23/60—Control of cameras or camera modules
-
- H—ELECTRICITY
- H04—ELECTRIC COMMUNICATION TECHNIQUE
- H04N—PICTORIAL COMMUNICATION, e.g. TELEVISION
- H04N25/00—Circuitry of solid-state image sensors [SSIS]; Control thereof
- H04N25/60—Noise processing, e.g. detecting, correcting, reducing or removing noise
- H04N25/61—Noise processing, e.g. detecting, correcting, reducing or removing noise the noise originating only from the lens unit, e.g. flare, shading, vignetting or "cos4"
-
- H—ELECTRICITY
- H04—ELECTRIC COMMUNICATION TECHNIQUE
- H04N—PICTORIAL COMMUNICATION, e.g. TELEVISION
- H04N7/00—Television systems
- H04N7/18—Closed-circuit television [CCTV] systems, i.e. systems in which the video signal is not broadcast
-
- H—ELECTRICITY
- H04—ELECTRIC COMMUNICATION TECHNIQUE
- H04N—PICTORIAL COMMUNICATION, e.g. TELEVISION
- H04N7/00—Television systems
- H04N7/18—Closed-circuit television [CCTV] systems, i.e. systems in which the video signal is not broadcast
- H04N7/181—Closed-circuit television [CCTV] systems, i.e. systems in which the video signal is not broadcast for receiving images from a plurality of remote sources
-
- B—PERFORMING OPERATIONS; TRANSPORTING
- B60—VEHICLES IN GENERAL
- B60R—VEHICLES, VEHICLE FITTINGS, OR VEHICLE PARTS, NOT OTHERWISE PROVIDED FOR
- B60R2300/00—Details of viewing arrangements using cameras and displays, specially adapted for use in a vehicle
- B60R2300/30—Details of viewing arrangements using cameras and displays, specially adapted for use in a vehicle characterised by the type of image processing
- B60R2300/303—Details of viewing arrangements using cameras and displays, specially adapted for use in a vehicle characterised by the type of image processing using joined images, e.g. multiple camera images
-
- B—PERFORMING OPERATIONS; TRANSPORTING
- B60—VEHICLES IN GENERAL
- B60R—VEHICLES, VEHICLE FITTINGS, OR VEHICLE PARTS, NOT OTHERWISE PROVIDED FOR
- B60R2300/00—Details of viewing arrangements using cameras and displays, specially adapted for use in a vehicle
- B60R2300/60—Details of viewing arrangements using cameras and displays, specially adapted for use in a vehicle characterised by monitoring and displaying vehicle exterior scenes from a transformed perspective
- B60R2300/607—Details of viewing arrangements using cameras and displays, specially adapted for use in a vehicle characterised by monitoring and displaying vehicle exterior scenes from a transformed perspective from a bird's eye viewpoint
Definitions
- the present invention relates to an image processing device, an image processing method, and an image processing program for synthesizing a plurality of overhead images obtained by viewpoint conversion.
- There is a known technique that applies viewpoint conversion to a plurality of camera images to generate bird's-eye view images looking down from above, and synthesizes the generated bird's-eye view images into a composite bird's-eye view image.
- In a bird's-eye view image generated by viewpoint conversion, all subjects in the image are displayed as if they existed on the same plane.
- Because the overhead image is normally generated with reference to the ground, an object located above the ground is stretched so as to fall down onto the ground, and is therefore displayed with distortion.
- The cameras are usually installed so that the imaging ranges of adjacent camera images partially overlap. An image of an object located in the overlapping imaging range may disappear or appear double due to camera parallax.
- Patent Document 1 discloses a technique for preferentially displaying, as the image of the overlapping portion, the overhead image in which a three-dimensional object appears larger, out of two overhead images generated from two camera images whose imaging ranges partially overlap.
- In Patent Document 1, since only one of the two overhead images is displayed, the object neither disappears nor appears double; however, because the overhead image in which the three-dimensional object appears larger is adopted, there is a problem that an image with large distortion of the three-dimensional object may be displayed.
- An object of the present invention is to obtain an image with a small distortion of a three-dimensional object as an image of an overlapping portion between two overhead images generated from two camera images.
- An image processing apparatus according to the present invention uses first device information, including position information of a first imaging device that images a first region containing a common region in which a three-dimensional object is located, and second device information, including position information of a second imaging device that images a second region containing the common region.
- The apparatus includes a boundary calculation unit that calculates, using the first device information and the second device information, a boundary position serving as a reference for dividing the common region into the first imaging device side and the second imaging device side.
- It further includes a selection unit that, based on the boundary position and the position of the three-dimensional object, selects as the selection image the overhead image with the smaller distortion of the three-dimensional object image, out of a first overhead image, obtained from an image of the first region captured by the first imaging device, and a second overhead image, obtained from an image of the second region captured by the second imaging device, in each of which the image of the three-dimensional object is distorted.
- It further includes an image generation unit that generates an image of the entire region composed of the first region and the second region, from the image of the first region other than the common region in the first overhead image, the image of the second region other than the common region in the second overhead image, and the image of the common region included in the selection image.
- The image processing apparatus includes a boundary calculation unit that calculates a boundary position serving as a reference for dividing the common region into the first imaging device side and the second imaging device side, and a selection unit that, based on the boundary position and the position of the three-dimensional object, selects as the selection image the overhead image with the smaller distortion of the three-dimensional object image from the first and second overhead images, in each of which the image of the three-dimensional object is distorted. This has the effect that an image with small distortion of the three-dimensional object can be obtained as the overhead image.
- FIG. 1 is a diagram showing a configuration of an image processing apparatus 100 according to Embodiment 1.
- FIG. 2 is a diagram showing the positional relationship between the first imaging device 210 and the second imaging device 220 according to Embodiment 1.
- FIG. 3 is a flowchart showing image processing S100 of the image processing method 510 and the image processing program 520 according to the first embodiment.
- FIG. 4 is a flowchart showing the boundary calculation process S110 according to the first embodiment.
- FIG. 5 is a diagram illustrating the method of calculating the boundary position xc in the imaging space 800 according to the first embodiment.
- FIG. 6 is a flowchart showing the overhead view generation process S120 according to the first embodiment.
- FIG. 7 is a flowchart showing the selection process S130 according to the first embodiment.
- FIG. 8 is a flowchart showing the image generation process S140 according to the first embodiment.
- FIG. 9 is a diagram showing a method for generating the region image 340 according to the first embodiment.
- FIG. 10 is a diagram showing a method for generating the region image 340 according to the first embodiment.
- FIG. 11 is a diagram showing the configuration of the image processing apparatus 100 according to a modification of the first embodiment.
- FIG. 12 is a diagram illustrating an application example of the image processing apparatus 100 according to the first embodiment.
- FIG. 13 is a diagram illustrating the configuration of the image processing apparatus 100a according to the second embodiment.
- FIG. 14 is a flowchart showing image processing S100a of the image processing apparatus 100a according to the second embodiment.
- FIG. 15 is a flowchart showing the region image generation process S150a according to the second embodiment.
- Embodiment 1. *** Explanation of configuration *** The configuration of the image processing apparatus 100 according to the present embodiment will be described with reference to FIG. 1.
- the image processing apparatus 100 is a computer.
- the image processing apparatus 100 includes hardware such as a processor 910, a storage device 920, an input interface 930, and an output interface 940. Further, the image processing apparatus 100 includes a boundary calculation unit 110, an overhead view generation unit 120, a selection unit 130, an image generation unit 140, and a storage unit 150 as functional configurations.
- the selection unit 130 includes a position detection unit 131 and an image selection unit 132.
- The functions of the boundary calculation unit 110, the overhead view generation unit 120, the position detection unit 131, the image selection unit 132, and the image generation unit 140 in the image processing apparatus 100 are referred to as the functions of the "units" of the image processing apparatus 100.
- the function of “unit” of the image processing apparatus 100 is realized by software.
- the storage unit 150 is realized by the storage device 920.
- the storage unit 150 stores imaging device information 160, subject information 170, a boundary position 180, and a comparative image 190.
- the imaging device information 160 includes first device information 161 and second device information 162.
- the subject information 170 includes height information 171.
- the processor 910 is connected to other hardware via a signal line, and controls these other hardware.
- the processor 910 is an integrated circuit (IC) that performs processing.
- the processor 910 is specifically a CPU (Central Processing Unit).
- the storage device 920 includes an auxiliary storage device 921 and a memory 922.
- The auxiliary storage device 921 is a ROM (Read Only Memory), a flash memory, or an HDD (Hard Disk Drive).
- the memory 922 is specifically a RAM (Random Access Memory).
- the storage unit 150 is realized by the auxiliary storage device 921. Note that the storage unit 150 may be realized by the auxiliary storage device 921 and the memory 922.
- the input interface 930 is a port connected to the first imaging device 210 and the second imaging device 220.
- the input interface 930 is an image input interface that takes the first image 310 captured by the first imaging device 210 and the second image 320 captured by the second imaging device 220 into the image processing apparatus 100.
- the first image 310 and the second image 320 captured by the input interface 930 are stored in the memory 922.
- the input interface 930 may be a port connected to an input device such as a mouse, a keyboard, or a touch panel.
- Specifically, the input interface 930 is a USB (Universal Serial Bus) terminal.
- The input interface 930 may also be a port connected to a LAN (Local Area Network).
- Alternatively, the input interface 930 may be a capture board that can take images into the image processing apparatus 100 via SDI (Serial Digital Interface), HDMI (registered trademark) (High-Definition Multimedia Interface), VGA (Video Graphics Array), DVI (Digital Visual Interface), or the like.
- the output interface 940 is a port to which a cable of a display device such as a display is connected.
- The output interface 940 is, for example, a USB terminal or an HDMI (registered trademark) (High-Definition Multimedia Interface) terminal.
- Specifically, the display is an LCD (Liquid Crystal Display).
- the auxiliary storage device 921 stores a program that realizes the function of “unit”. This program is loaded into the memory 922, read into the processor 910, and executed by the processor 910.
- the auxiliary storage device 921 also stores an OS (Operating System). At least a part of the OS is loaded into the memory 922, and the processor 910 executes a program that realizes the function of “unit” while executing the OS.
- the image processing apparatus 100 may include only one processor 910, or may include a plurality of processors 910.
- a plurality of processors 910 may execute a program for realizing the function of “unit” in cooperation with each other.
- Information, data, signal values, and variable values indicating the results of processing by the function of “unit” are stored in the auxiliary storage device 921, the memory 922, or a register or cache memory in the processor 910.
- an arrow connecting each unit and the storage unit 150 indicates that each unit stores the processing result in the storage unit 150 or that each unit reads information from the storage unit 150.
- arrows connecting the respective parts represent the flow of control.
- In FIG. 1, the arrows between each unit and the memory 922, which would indicate that information such as the first overhead image 311, the second overhead image 321, and the selection image 330 is exchanged between the units via the memory 922, are omitted.
- The program that realizes the function of each "unit" may be stored in a portable recording medium such as a magnetic disk, a flexible disk, an optical disc, a compact disc, a Blu-ray (registered trademark) disc, or a DVD (Digital Versatile Disc).
- a program that realizes the function of “unit” is also referred to as an image processing program.
- the image processing program is a program that realizes the function described as “part”.
- What is called an image processing program product is a storage medium or storage device on which the image processing program is recorded; it holds a computer-readable program regardless of its outward format.
- FIG. 2 schematically shows the positional relationship between the first imaging device 210 and the second imaging device 220 in the imaging space 800.
- the imaging space 800 is a space to be imaged by the first imaging device 210 and the second imaging device 220.
- The first imaging device 210 and the second imaging device 220 are camera devices each including a lens and an imaging element such as a CCD (Charge-Coupled Device) or CMOS (Complementary Metal-Oxide-Semiconductor) sensor.
- the first imaging device 210 and the second imaging device 220 include the same type of imaging element and the same type of lens.
- Each of the first imaging device 210 and the second imaging device 220 is specifically a fixed camera such as a surveillance camera.
- the center of the imaging element of the first imaging device 210 is defined as an imaging element center 3a.
- the center of the image sensor of the second imaging device 220 is defined as an image sensor center 3b.
- the center of the image sensor is also called the lens focus.
- FIG. 2 shows an XZ plan view, in which the imaging space 800 is represented by the two-dimensional XZ plane formed by the X axis and the Z axis, and an XY plan view, in which the imaging space 800 is represented by the two-dimensional XY plane formed by the X axis and the Y axis.
- the X axis is on the ground surface 500 and represents the direction from the first imaging device 210 toward the second imaging device 220. That is, the imaging element center 3a of the first imaging device 210 and the imaging element center 3b of the second imaging device 220 are included in the XZ plane.
- the Y axis is on the ground surface 500 and represents a direction toward the rear surface of FIG. 2 in the XZ plan view.
- The Z axis represents the height direction from the ground surface 500 and passes through the imaging element center 3a of the first imaging device 210. The intersection of the X axis with the perpendicular dropped from the imaging element center 3a of the first imaging device 210 to the X axis is the origin of the XYZ space representing the imaging space 800.
- the first imaging device 210 and the second imaging device 220 are installed in the imaging space 800 under the following installation conditions shown in FIG.
- The installation conditions of the first imaging device 210 are: the height of the imaging element center 3a from the ground surface 500 is Ha, the depression angle, that is, the angle looking down from the horizontal plane containing the imaging element center 3a, is θa, and the lens angle of view is φa.
- The installation conditions of the second imaging device 220 are: the height of the imaging element center 3b from the ground surface 500 is Hb, the depression angle, that is, the angle looking down from the horizontal plane containing the imaging element center 3b, is θb, and the lens angle of view is φb.
- the depression angle is schematically represented.
- the inter-device distance between the imaging element center 3a of the first imaging device 210 and the imaging element center 3b of the second imaging device 220 is Xab.
- Xab is the X-axis component of the shortest distance between the image sensor center 3a of the first image pickup device 210 and the image sensor center 3b of the second image pickup device 220.
- the first region Ra is an imaging range for the ground surface 500 by the first imaging device 210. That is, it is the range of the ground surface 500 imaged by the first imaging device 210.
- the second region Rb is an imaging range with respect to the ground surface 500 by the second imaging device 220. That is, it is the range of the ground surface 500 imaged by the second imaging device 220.
- the common region Rab is a range where the imaging regions of the first imaging device 210 and the second imaging device 220 overlap. The first imaging device 210 and the second imaging device 220 are installed so that some imaging regions overlap.
- a three-dimensional object 600 is located in the common region Rab. Specifically, the three-dimensional object 600 is a subject having a height such as a person, a car, or a building.
- the entire region R is a region composed of the first region Ra and the second region Rb. That is, the entire region R is the entire imaging region imaged by the first imaging device 210 and the second imaging device 220.
- It is desirable that the installation conditions be set so that the imaging scale of the image of the common area Rab captured by the first imaging device 210 equals the imaging scale of the image of the common area Rab captured by the second imaging device 220. The imaging scale refers to the image size and the image resolution.
- the virtual viewpoint P represents the viewpoint of the overhead image generated in the image processing apparatus 100.
- the term “overhead image” means an image viewed from the virtual viewpoint P.
- the imaging device information 160 is information relating to installation conditions of the imaging device that is set in the storage unit 150 in advance.
- the information related to the installation conditions of the imaging device includes information such as position information related to the position of the imaging device, posture information related to the orientation of the imaging device, and resolution of the imaging device.
- the position information includes coordinates in the imaging space 800 of the imaging device.
- the posture information includes the yaw, roll, and pitch of the imaging device.
- the imaging device information 160 includes first device information 161 and second device information 162.
- the first device information 161 is information including position information, posture information, and resolution of the first imaging device 210 that images the first region Ra including the common region Rab.
- In the first device information 161, position information, posture information, and resolution of the first imaging device 210 are set so as to satisfy the installation conditions, such as the height Ha from the ground surface 500, the depression angle θa, and the lens angle of view φa.
- the second device information 162 is information including position information, posture information, and resolution of the second imaging device 220 that images the second region Rb including the common region Rab.
- the first region Ra, the second region Rb, the common region Rab, and the inter-device distance Xab are information determined in advance from the first device information 161 and the second device information 162.
- the first region Ra, the second region Rb, the common region Rab, and the inter-device distance Xab may be calculated in advance and stored in the imaging device information 160.
- the first region Ra, the second region Rb, the common region Rab, and the inter-device distance Xab may be calculated each time image processing is executed by the image processing apparatus 100 and stored in the memory 922.
- the subject information 170 is information related to a subject that is a main imaging target in the imaging region.
- the subject is a three-dimensional object.
- the type of the three-dimensional object that is the subject is predetermined as an imaging target in the imaging region and is stored as one of the subject information 170.
- the type of the three-dimensional object is a type predetermined as a main imaging target in the imaging region.
- the subject information 170 includes height information 171.
- the height information 171 is information indicating the height from the ground surface 500 determined based on the type of the three-dimensional object.
- the height information 171 is stored in the storage unit 150.
- the height information 171 will be further described.
- When the imaging device is a camera for monitoring people, the type of the three-dimensional object is a person.
- As the type of the three-dimensional object to be imaged, information related to a person is set.
- In the height information 171, information indicating the height from the ground surface determined based on the type of the three-dimensional object, that is, a person, is set. Specifically, information indicating the height from the ground surface determined by a preset method based on the average height of a person is set.
- When the imaging device is a camera for monitoring cars, the type of the three-dimensional object is a car.
- As the type of the three-dimensional object to be imaged, information related to a car is set.
- In the height information 171, information representing the height from the ground surface determined based on a car is set. Specifically, information representing the height from the ground surface determined by a preset method based on the average vehicle height is set.
- In the height information 171, a value is set that is slightly greater than the height from the ground surface of the three-dimensional object to be imaged, and that falls within the lens angle of view φa of the first imaging device 210 and the lens angle of view φb of the second imaging device 220. Specifically, the average height from the ground surface of the type of three-dimensional object that is the subject is calculated, and the height information 171 is set to 1.2 times that average. In this case, the installation conditions of the imaging devices must be set so that 1.2 times the average height of the three-dimensional object falls within the lens angles of view φa and φb. Note that the factor 1.2 is merely an example; any method may be used as long as the height information 171 is set appropriately.
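The setting method described above can be sketched in a few lines. This is an illustrative sketch, not the patent's implementation; the function name and the default margin of 1.2 are assumptions taken from the example in the text:

```python
def height_information(heights, margin=1.2):
    """Compute the height information T for a subject type.

    heights: sample heights from the ground surface (e.g. of people
             or cars) of the kind of three-dimensional object that is
             the main subject in the imaging region.
    margin:  factor making T slightly higher than the average height;
             1.2 is the example factor given in the text.
    """
    average = sum(heights) / len(heights)
    return margin * average
```

For person monitoring with sample heights of 1.6 m and 2.0 m, T would be 1.2 × 1.8 m = 2.16 m; the installation conditions must then keep 2.16 m within both lens angles of view.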
- the image processing S100 includes a boundary calculation processing S110, an overhead view generation processing S120, and a region image generation processing S150.
- In the boundary calculation process S110, the boundary calculation unit 110 uses the first device information 161 of the first imaging device 210, the second device information 162 of the second imaging device 220, and the height information 171 to calculate a boundary position 180 serving as a reference for dividing the common region Rab into the first imaging device 210 side and the second imaging device 220 side.
- the boundary calculation unit 110 stores the calculated boundary position 180 in the storage unit 150.
- In the overhead view generation process S120, the overhead view generation unit 120 reads from the memory 922 the first image 310, obtained by the first imaging device 210 imaging the first region Ra, and the second image 320, obtained by the second imaging device 220 imaging the second region Rb.
- the first image 310 and the second image 320 are stored in the memory 922 by the input interface 930.
- the overhead view generation unit 120 converts the viewpoint of each of the first image 310 and the second image 320 to generate a first overhead image 311 and a second overhead image 321.
- In the first overhead image 311, the image of the three-dimensional object 600 in the common region Rab is distorted by the viewpoint conversion.
- Likewise, in the second overhead image 321, the image of the three-dimensional object 600 in the common region Rab is distorted by the viewpoint conversion.
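This fall-down distortion can be quantified with simple pinhole geometry. As an illustrative sketch (the function name is an assumption, not from the patent): a point at height T and horizontal offset x from a camera whose imaging element center is at height H projects, in the ground-plane overhead image, to x·H/(H − T), so the point is pushed away from the camera by x·T/(H − T):

```python
def fall_down_position(x, T, H):
    """Ground position where a point at height T and horizontal offset x
    from the camera appears after overhead (ground-plane) projection.
    The ray from the camera center (0, H) through (x, T) meets the
    ground at x * H / (H - T); the excess over x is the fall-down
    distortion of the point."""
    assert 0 <= T < H, "point must lie below the imaging element center"
    return x * H / (H - T)
```

A ground point (T = 0) stays at x, while a point at T = 1.8 m and x = 3 m under a camera at H = 5 m is pushed out to 4.6875 m, a stretch of 1.6875 m; this stretch is what the first and second distortion degrees defined later measure.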
- In the region image generation process S150, the image processing apparatus 100 generates a region image 340, which is an overhead image of the entire region R viewed from the virtual viewpoint P.
- The region image generation process S150 includes a selection process S130, executed by the selection unit 130, and an image generation process S140, executed by the image generation unit 140.
- In the selection process S130, based on the boundary position 180 and the position of the three-dimensional object 600 located in the common region Rab, the selection unit 130 selects, from the first overhead image 311 and the second overhead image 321, the overhead image in which the image of the three-dimensional object 600 is less distorted as the selection image 330.
- The boundary position 180 is used to determine which of the first overhead image 311 and the second overhead image 321 is adopted as the overhead image of the common region Rab when the overhead image of the entire region R is generated.
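Given the boundary position xc, the selection reduces to a single comparison. The sketch below is illustrative (names are assumptions) and relies on the property, established by the distortion-degree formulas later in the text, that the first image's distortion grows with the object's X coordinate while the second image's distortion shrinks with it:

```python
def select_image(first_bev, second_bev, obj_x, xc):
    """Select the overhead image with the smaller distortion of the
    three-dimensional object image. An object at X coordinate obj_x on
    the first device's side of the boundary (obj_x <= xc) is stretched
    less in the first overhead image, and vice versa."""
    return first_bev if obj_x <= xc else second_bev
```

At obj_x == xc the two distortion degrees are equal, so either image may be returned; this sketch arbitrarily keeps the first.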
- In the image generation process S140, the image generation unit 140 generates, as the region image 340, an image of the entire region R composed of the first region Ra and the second region Rb, from the image of the first region Ra other than the common region Rab in the first overhead image 311, the image of the second region Rb other than the common region Rab in the second overhead image 321, and the image of the common region Rab included in the selection image 330.
- the image generation unit 140 outputs the region image 340 to a display device such as a display via the output interface 940.
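Assuming the three overhead images are already rendered on one shared, aligned pixel grid in which the common region occupies a contiguous band of columns (an assumption for illustration; the patent fixes no particular layout), the composition of the region image can be sketched as a column-wise concatenation:

```python
import numpy as np

def generate_region_image(first_bev, second_bev, selected, ra_end, rab_end):
    """Compose the region image from three aligned overhead images:
    - columns [0, ra_end):        region Ra outside the common area,
                                  taken from the first overhead image
    - columns [ra_end, rab_end):  common area Rab, taken from the
                                  selection image
    - columns [rab_end, ...):     region Rb outside the common area,
                                  taken from the second overhead image
    """
    left = first_bev[:, :ra_end]
    common = selected[:, ra_end:rab_end]
    right = second_bev[:, rab_end:]
    return np.hstack([left, common, right])
```

In practice the seams would typically also be blended, but the selection of a single source per column band is exactly what prevents the double-image and disappearance artifacts discussed above.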
- FIG. 4 is a flowchart showing the boundary calculation process S110 according to the present embodiment.
- FIG. 5 is a diagram for explaining a calculation method of the boundary position 180 in the imaging space 800 described with reference to FIG.
- the boundary calculation process S110 will be described in detail with reference to FIGS.
- the boundary calculation unit 110 reads the first device information 161, the second device information 162, and the height information 171 from the storage unit 150.
- the first device information 161 includes the height Ha of the first imaging device 210.
- the second device information 162 includes the height Hb of the second imaging device 220.
- Let T be the value set in the height information 171.
- The boundary calculation unit 110 performs a geometric calculation using the first device information 161, the second device information 162, and the height information 171 to calculate the boundary position 180. Specifically, the boundary calculation unit 110 calculates, as the boundary position 180, a position xc on the X axis, using a first distortion degree representing the degree of distortion in the first overhead image 311 and a second distortion degree representing the degree of distortion in the second overhead image 321. Hereinafter, the value calculated as the boundary position 180 is sometimes referred to as the boundary position xc.
- the first degree of distortion is obtained by using the first device information 161 and the height information 171.
- the second degree of distortion is obtained using the second device information 162 and the height information 171.
- the boundary calculation unit 110 calculates, as the boundary position xc, the coordinate on the X axis of the common region Rab that minimizes the absolute value of the difference between the first distortion degree and the second distortion degree.
- Expression 1 below is an expression for obtaining the first distortion degree Ea.
- Expression 2 below is an expression for obtaining the second distortion degree Eb.
- the coordinate on the X axis be x.
- Formula 1: Ea = x × T / (Ha − T)
- Formula 2: Eb = (Xab − x) × T / (Hb − T)
- the boundary calculation unit 110 calculates, as the boundary position xc, the coordinate x on the X axis that minimizes the absolute value of the difference between Expression 1 and Expression 2.
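The calculation above can be sketched in Python. This is a minimal illustration, not the embodiment's implementation: setting Expression 1 equal to Expression 2 and solving for x gives a closed form for the boundary position xc; the function name and the use of a closed form instead of a numerical minimization are assumptions.

```python
def boundary_position(Ha, Hb, T, Xab):
    """Return xc, the X coordinate where the first distortion degree
    Ea = x * T / (Ha - T) equals the second distortion degree
    Eb = (Xab - x) * T / (Hb - T).

    Ea increases and Eb decreases monotonically in x over the common
    region [0, Xab], so the x where they are equal is also the x that
    minimizes |Ea - Eb|.
    """
    assert Ha > T and Hb > T, "imaging devices must be above the subject height T"
    return Xab * (Ha - T) / ((Ha - T) + (Hb - T))
```

With equal camera heights the boundary falls at the middle of the common region, as symmetry suggests.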
- the boundary calculation unit 110 stores the calculated boundary position 180 in the storage unit 150.
- FIG. 5 is an XZ plan view showing the positional relationship among the first imaging device 210, the second imaging device 220, and the three-dimensional object 600 when the three-dimensional object 600 is located in the common region Rab under the imaging device installation conditions described above.
- An intersection point between the line a and the line b set to intersect at the height T of the common region Rab is defined as an intersection point T1.
- Line a is a straight line drawn from the imaging element center 3a onto the common area Rab
- line b is a straight line drawn from the imaging element center 3b onto the common area Rab.
- a boundary position xc is defined as an intersection point between the perpendicular line drawn from the intersection point T1 on the common region Rab and the ground surface 500, that is, the X axis.
- the distance in the X-axis direction between the intersection of the line a and the ground surface 500 and the boundary position xc is the first degree of distortion Ea.
- the distance in the X-axis direction between the intersection of the line b and the ground surface 500 and the boundary position xc is the second distortion degree Eb.
- the both end portions in the X-axis direction of the common region Rab are defined as an end portion ra and an end portion rb.
- Each of the first distortion degree Ea and the second distortion degree Eb represents the degree of distortion of the projected image of the three-dimensional object 600 on the ground surface 500 by the corresponding imaging device. That is, the first distortion degree Ea represents the degree of distortion in the first overhead image 311, and the second distortion degree Eb represents the degree of distortion in the second overhead image 321. By geometric calculation based on FIG. 5, the first distortion degree Ea and the second distortion degree Eb are expressed by Expression 1 and Expression 2, respectively. When the difference between the first distortion degree Ea and the second distortion degree Eb is denoted Δe, the boundary position xc is obtained by determining x so that Δe is minimized.
- the boundary position xc is a position where the magnitudes of the first distortion degree Ea and the second distortion degree Eb are reversed. That is, the boundary position xc is a position where the first distortion degree Ea and the second distortion degree Eb are equal.
- In step S121, the overhead view generation unit 120 reads the first image 310 and the second image 320 from the memory 922.
- In step S122, the bird's-eye view generation unit 120 executes lens distortion correction processing for correcting image distortion due to the influence of the lens of the imaging device for each of the first image 310 and the second image 320.
- the bird's-eye view generation unit 120 executes the lens distortion correction process for each of the first image 310 and the second image 320, using calculation parameters for correcting the distortions of the lenses of the first imaging device 210 and the second imaging device 220.
- the calculation parameters are collected in advance for each of the first imaging device 210 and the second imaging device 220 and stored in the imaging device information 160 of the storage unit 150.
- the bird's-eye view generation unit 120 reads out the calculation parameters from the storage unit 150, and executes lens distortion correction processing for each of the first image 310 and the second image 320 using the read calculation parameters.
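As an illustration of what such a correction computes, the sketch below inverts a one-coefficient radial distortion model for a single point. The model, the coefficient k1, and the fixed-point iteration are assumptions standing in for the per-camera calculation parameters collected in advance; they are not the embodiment's stored parameters.

```python
def undistort_point(ud, vd, cx, cy, k1, iters=5):
    """Invert the radial model  distorted = center + offset * (1 + k1 * r^2)
    by fixed-point iteration, returning the corrected (undistorted) point."""
    x, y = ud - cx, vd - cy          # distorted offset from the image center
    xu, yu = x, y                    # initial guess: no distortion
    for _ in range(iters):
        r2 = xu * xu + yu * yu
        s = 1.0 + k1 * r2            # current radial distortion factor
        xu, yu = x / s, y / s
    return cx + xu, cy + yu
```

A round trip (distort a point with the forward model, then undistort it) recovers the original point to within a small tolerance.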
- In step S123, the overhead view generation unit 120 converts the first image 310 and the second image 320, in which the distortion due to the influence of the lens was corrected in step S122, into a first overhead image 311 and a second overhead image 321.
- the first bird's-eye view image 311 and the second bird's-eye view image 321 are bird's-eye view images looking down on the ground surface 500 from the virtual viewpoint P.
- the bird's-eye view generation unit 120 executes the viewpoint conversion process for each of the first image 310 and the second image 320 using a transformation determinant containing calculation parameters appropriate for generating a bird's-eye view image looking down on the ground surface 500 from the virtual viewpoint P, and thereby generates the first overhead image 311 and the second overhead image 321.
- This transformation determinant is stored in advance in the imaging device information 160 of the storage unit 150.
- the bird's-eye view generation unit 120 reads the conversion determinant from the storage unit 150, and executes the viewpoint conversion process for each of the first image 310 and the second image 320 using the read conversion determinant.
- the three-dimensional object 600 in the common region Rab is distorted due to the influence of the viewpoint conversion process.
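A viewpoint conversion of this kind is, in the usual formulation, a planar projective transform: each ground-plane pixel is mapped through a 3×3 matrix in homogeneous coordinates. The sketch below shows only that per-point mapping; the matrix values are hypothetical and do not represent the stored transformation determinant.

```python
import numpy as np

def warp_point(H, u, v):
    """Map pixel (u, v) through a 3x3 homography H in homogeneous
    coordinates and return the de-homogenized result."""
    p = H @ np.array([u, v, 1.0])
    return p[0] / p[2], p[1] / p[2]
```

An identity matrix leaves points unchanged; a real bird's-eye transform would be built from the camera's pose and intrinsics.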
- In step S131, the position detection unit 131 reads the comparison image 190 from the storage unit 150 and, based on the first bird's-eye view image 311, the second bird's-eye view image 321, and the comparison image 190, calculates the position of the three-dimensional object 600 located in the common region Rab as the three-dimensional object position xp.
- the position detection unit 131 detects the solid object position xp based on the comparison image 190, the overhead image of the common area Rab included in the first overhead image 311, and the overhead image of the common area Rab included in the second overhead image 321.
- the comparison image 190 is a bird's-eye view of the common area Rab, which is a range where the first bird's-eye view image 311 and the second bird's-eye view image 321 overlap, as viewed from the virtual viewpoint P. That is, the comparison image 190 is a bird's-eye view image of a state in which no three-dimensional object exists in the common region Rab, that is, a state in which the three-dimensional object is deleted as viewed from the virtual viewpoint P.
- the comparison image 190 is generated in advance and stored in the storage unit 150.
- the position detection unit 131 acquires a common overhead image 30a, which is the overhead image of the common area Rab included in the first overhead image 311, and a common overhead image 30b, which is the overhead image of the common area Rab included in the second overhead image 321.
- FIG. 8A is the common overhead image 30a of the first overhead image 311.
- An image of the three-dimensional object 600 represented in the common overhead image 30a is defined as a three-dimensional object image 60a.
- the three-dimensional object image 60a is an image of the three-dimensional object 600 obtained by changing the viewpoint of the first image 310, and is an image in which the three-dimensional object 600 falls down on the right side, that is, in a direction away from the first imaging device 210.
- FIG. 8B is the common overhead image 30b of the second overhead image 321. An image of the three-dimensional object 600 represented in the common overhead image 30b is defined as a three-dimensional object image 60b.
- the three-dimensional object image 60b is an image of the three-dimensional object 600 obtained by changing the viewpoint of the second image 320, and is an image in which the three-dimensional object 600 falls down on the left side, that is, in a direction away from the second imaging device 220.
- FIG. 8C shows a comparison image 190 stored in the storage unit 150.
- the comparison image 190 is a bird's-eye view image of the common area Rab without the three-dimensional object viewed from the virtual viewpoint P.
- the position detection unit 131 calculates the difference image 31a from which the three-dimensional object image 60a is extracted by taking the difference between the common overhead image 30a and the comparison image 190.
- FIG. 8D shows the difference image 31a.
- the position detection unit 131 calculates the difference image 31b from which the three-dimensional object image 60b is extracted by taking the difference between the common overhead image 30b and the comparison image 190.
- FIG. 8E shows the difference image 31b.
- the position detection unit 131 performs blending, that is, a semi-transparent image overlay process, on the difference image 31a and the difference image 31b, and generates a composite image 33 in which the difference image 31a and the difference image 31b are combined.
- the position detection unit 131 detects, as the three-dimensional object position xp, the X coordinate of the center position of the three-dimensional object overlapping portion 60ab, which is the portion where the difference image 31a and the difference image 31b overlap in the composite image 33.
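A minimal sketch of this difference-based detection, assuming grayscale numpy arrays and a hypothetical threshold; the blending of the two difference images is approximated here by a logical AND of the two difference masks.

```python
import numpy as np

def detect_object_position(common_a, common_b, comparison, thresh=10):
    """Subtract the object-free comparison image from each common
    overhead image, overlap the two difference masks, and return the
    X coordinate of the overlap's center as the object position xp."""
    mask_a = np.abs(common_a.astype(int) - comparison.astype(int)) > thresh
    mask_b = np.abs(common_b.astype(int) - comparison.astype(int)) > thresh
    overlap = mask_a & mask_b                  # three-dimensional object overlapping portion
    xs = np.nonzero(overlap.any(axis=0))[0]    # X columns containing overlap
    if xs.size == 0:
        return None                            # no three-dimensional object in Rab
    return float((xs.min() + xs.max()) / 2)    # center X as xp
```

The two viewpoint-converted object images fall in opposite directions, so only the region near the object's base survives the AND, which is why its center approximates the ground position.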
- In step S132, the image selection unit 132 reads the boundary position xc from the storage unit 150.
- In step S133, the image selection unit 132 selects, from the first overhead image 311 and the second overhead image 321, the overhead image in which the image of the three-dimensional object 600 is less distorted as the selection image 330, based on the read boundary position xc and the three-dimensional object position xp of the three-dimensional object 600 located in the common region Rab.
- the image selection unit 132 determines whether the three-dimensional object position xp is on the first imaging device 210 side or the second imaging device 220 side based on the boundary position xc.
- the image selection unit 132 selects the first overhead image 311 as the selection image 330 when the three-dimensional object position xp is closer to the first imaging device 210 than the boundary position xc. Further, the image selection unit 132 selects the second overhead image 321 as the selection image 330 when the three-dimensional object position xp is closer to the second imaging device 220 than the boundary position xc.
- that the three-dimensional object position xp is closer to the first imaging device 210 than the boundary position xc means that the three-dimensional object position xp, which is the X coordinate of the three-dimensional object 600, is on the first imaging device 210 side of the boundary position xc. That is, it means that the three-dimensional object 600 is located closer to the first imaging device 210 than the straight line on the common area Rab having the boundary position xc as its X coordinate.
- similarly, that the three-dimensional object position xp is closer to the second imaging device 220 than the boundary position xc means that the three-dimensional object position xp, which is the X coordinate of the three-dimensional object 600, is on the second imaging device 220 side of the boundary position xc. That is, it means that the three-dimensional object 600 is located closer to the second imaging device 220 than the straight line on the common region Rab having the boundary position xc as its X coordinate.
- If the three-dimensional object position xp is greater than or equal to the boundary position xc in step S133, the process proceeds to step S134. That the three-dimensional object position xp is greater than or equal to the boundary position xc means that the three-dimensional object position xp is closer to the second imaging device 220 than the boundary position xc. Therefore, in step S134, the image selection unit 132 selects the second overhead image 321 as the selection image 330. If the three-dimensional object position xp is smaller than the boundary position xc in step S133, the process proceeds to step S135.
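The branch in steps S133 to S135 reduces to a one-line comparison; this sketch uses hypothetical argument names.

```python
def select_image(xp, xc, first_overhead, second_overhead):
    """Return the overhead image in which the object at X position xp
    is less distorted: xp >= xc means the object is on the second
    imaging device's side (step S134), otherwise it is on the first
    imaging device's side (step S135)."""
    return second_overhead if xp >= xc else first_overhead
```

The camera nearer to the object projects it with the smaller fall-down distortion, which is why proximity to the boundary position decides the selection.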
- In step S141, the image generation unit 140 generates, as the area image 340, an image of the entire area R composed of the first area Ra and the second area Rb, using the image of the first region Ra other than the common region Rab in the first bird's-eye view image 311, the image of the second region Rb other than the common region Rab in the second bird's-eye view image 321, and the selected image 330. That is, the image generation unit 140 synthesizes the selected image 330 with the portion outside the common region Rab of the overhead image that was not selected in the selection process S130, and generates the region image 340 in which the entire region R is viewed from the virtual viewpoint P.
- In step S142, the image generation unit 140 outputs the region image 340 to a display device such as a display via the output interface 940.
- a method for generating the region image 340 will be described with reference to FIGS. 10 and 11.
- In the example of FIG. 10, since the three-dimensional object position xp is closer to the first imaging device 210 than the boundary position xc, the image selection unit 132 selects the first overhead image 311 as the selection image 330. Therefore, the first bird's-eye view image 311 and the image of the second region Rb excluding the common region Rab in the second bird's-eye view image 321 are combined to generate the region image 340 of the entire region R.
- in this case, the area image 340 includes the image other than the common area Rab in the first overhead image 311, the image other than the common area Rab in the second overhead image 321, and the image of the common area Rab included in the first overhead image 311, which is the selection image 330.
- In the example of FIG. 11, since the three-dimensional object position xp is closer to the second imaging device 220 than the boundary position xc, the image selection unit 132 selects the second overhead image 321 as the selection image 330. Therefore, the second bird's-eye view image 321 and the image of the first region Ra excluding the common region Rab in the first bird's-eye view image 311 are combined to generate the region image 340 of the entire region R.
- in this case, the area image 340 includes the image other than the common area Rab in the first overhead image 311, the image other than the common area Rab in the second overhead image 321, and the image of the common area Rab included in the second overhead image 321, which is the selection image 330.
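The composition of step S141 can be sketched as a column-wise paste, assuming aligned grayscale arrays whose columns correspond to X positions and a common band [x0, x1); the array layout is an assumption, not the embodiment's data format.

```python
import numpy as np

def compose_region_image(first_bev, second_bev, selected, x0, x1):
    """Build the region image of the entire region R: the first area Ra
    outside Rab from the first overhead image, the common area Rab from
    the selected image, and the second area Rb outside Rab from the
    second overhead image."""
    region = np.empty_like(first_bev)
    region[:, :x0] = first_bev[:, :x0]      # first area Ra outside Rab
    region[:, x0:x1] = selected[:, x0:x1]   # common area Rab from the selection
    region[:, x1:] = second_bev[:, x1:]     # second area Rb outside Rab
    return region
```

Because the common band comes from exactly one source, the three-dimensional object appears once, not twice.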
- the image processing apparatus 100 is configured to acquire the first image 310 and the second image 320 via the input interface 930 and output the region image 340 via the output interface 940.
- the image processing apparatus 100 may include a communication device and receive the first image 310 and the second image 320 via the communication device. Further, the image processing apparatus 100 may transmit the region image 340 via a communication device.
- the communication device includes a receiver and a transmitter.
- the communication device is a communication chip or a NIC (Network Interface Card).
- the communication device functions as a communication unit that communicates data.
- the receiver functions as a receiving unit that receives data
- the transmitter functions as a transmitting unit that transmits data.
- in the present embodiment, the functions of the "units" of the image processing apparatus 100 are realized by software.
- as a modification, the functions of the "units" of the image processing apparatus 100 may be realized by hardware.
- the configuration of the image processing apparatus 100 according to a modification of the present embodiment will be described with reference to FIG. As illustrated in FIG. 12, the image processing apparatus 100 includes hardware such as a processing circuit 909, an input interface 930, and an output interface 940.
- the processing circuit 909 is a dedicated electronic circuit that realizes the functions of the "units" and the storage unit 150 described above. Specifically, the processing circuit 909 is a single circuit, a composite circuit, a programmed processor, a parallel-programmed processor, a logic IC, a GA (Gate Array), an ASIC (Application Specific Integrated Circuit), or an FPGA (Field Programmable Gate Array).
- the functions of the "units" may be realized by one processing circuit 909, or may be distributed across a plurality of processing circuits 909.
- the function of the image processing apparatus 100 may be realized by a combination of software and hardware. That is, some functions of the image processing apparatus 100 may be realized by dedicated hardware, and the remaining functions may be realized by software.
- the processor 910, the storage device 920, and the processing circuit 909 are collectively referred to as a "processing circuit". That is, regardless of which of the configurations of the image processing apparatus 100 shown in FIGS. 1 and 12 is used, the functions of the "units" and the storage unit 150 are realized by a processing circuit.
- "Unit" may be read as "step", "procedure", or "process". Further, the functions of the "units" may be realized by firmware.
- the boundary calculation process S110 of FIG. 3 only needs to be executed when the first area image is generated. Since the boundary position 180 calculated by the boundary calculation process S110 is stored in the storage unit 150, when generating the second and subsequent area images, the boundary position 180 stored in the storage unit 150 is read and the overhead view generation process S120 is performed. The image generation process S140 may be executed. Further, when the storage unit 150 is realized by the auxiliary storage device 921, information read from the storage unit 150 in the process of generating the first area image is stored in the memory 922. This eliminates the need to read from the auxiliary storage device 921 in the process of generating the second and subsequent area images.
- As shown in FIG. 13, there may be a plurality of regions in which the moving direction of the three-dimensional object that is the subject is constant within the entire region R of the imaging region; a typical case is one in which the subjects are cars passing along a road. In such a case, the entire region R may be divided into regions in each of which the moving direction of the three-dimensional object is constant, and the above-described image processing S100 may be performed on each of the divided regions.
- specifically, as shown in FIG. 13, the entire region R includes a region R1, which is a lane in which the three-dimensional object 600b moves in the B direction, and a region R2, which is a lane in which the three-dimensional object 600a moves in the A direction.
- the image processing apparatus stores in advance in the storage unit imaging device information defined by dividing the entire region R into regions R1 and R2. Then, the image processing apparatus executes image processing S100 in each of the region R1 and the region R2, and generates an overhead image of the region R1 and an overhead image of the region R2. Then, the image processing apparatus generates a region image of the entire region R by synthesizing the overhead image of the region R1 and the overhead image of the region R2.
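The per-lane processing above can be sketched as running the pipeline once per lane and stacking the results; `process_lane` is a hypothetical stand-in for image processing S100, and horizontal lanes stacked along the row axis are an assumption.

```python
import numpy as np

def region_image_for_all_lanes(lanes, process_lane):
    """Run the image-processing pipeline once per lane (regions R1,
    R2, ...) and stack the resulting overhead strips into the region
    image of the entire region R."""
    strips = [process_lane(lane) for lane in lanes]
    return np.vstack(strips)
```

Each lane has a constant moving direction, so the selection and deformation steps can use per-lane geometry before the strips are synthesized.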
- as described above, in the image processing apparatus according to the present embodiment, the position where the difference in the degree of distortion of the three-dimensional object is smallest is used as the boundary, and it is determined on which side of this boundary position the three-dimensional object is located. Then, the overhead image with the smaller degree of distortion of the three-dimensional object is selected by performing the selection process for the overhead image to be displayed as the image of the common area based on the determination result. Therefore, according to the image processing apparatus of the present embodiment, an unnatural image in which the three-dimensional object is displayed twice is not produced, and a combined overhead image with small distortion of the three-dimensional object is obtained.
- in the image processing apparatus according to the present embodiment, the imaging device information regarding the installation conditions of the imaging devices and the height information determined from the type of the subject are stored in the storage unit in advance. That is, in the image processing apparatus according to the present embodiment, optimal imaging device information and height information can be set in accordance with the usage situation. Therefore, since the selection image displayed as the overhead image of the common area can be selected from appropriate imaging device information and height information, a composite overhead image with high image quality can be provided.
- in the image processing apparatus according to the present embodiment, the position of the three-dimensional object can be detected based on the comparison image and the generated overhead images. Therefore, it is not necessary to use a device such as a sensor for detecting the position of the three-dimensional object, and the cost can be reduced.
- Embodiment 2: in the present embodiment, differences from the first embodiment will be mainly described.
- in the first embodiment, the image processing performed when one solid object is present in the common region Rab has been described.
- in that case, the region image is generated while the image of the three-dimensional object subjected to the viewpoint conversion remains distorted.
- in the present embodiment, an image processing apparatus 100a will be described that, when n three-dimensional objects (n is a natural number of 1 or more) are present in the common region Rab, compresses and deforms the viewpoint-converted image of each three-dimensional object and thereby displays each three-dimensional object with reduced distortion.
- the image processing apparatus 100a includes a selection unit 130a and an image generation unit 140a as functional configurations instead of the selection unit 130 and the image generation unit 140 in the first embodiment. That is, the image processing apparatus 100a includes a position detection unit 131a and an image selection unit 132a as functional configurations instead of the position detection unit 131 and the image selection unit 132.
- the functions of the "units" of the image processing apparatus 100a are the functions of the boundary calculation unit 110, the overhead view generation unit 120, the position detection unit 131a, the image selection unit 132a, and the image generation unit 140a.
- region image generation processing S150a is different from region image generation processing S150 of the first embodiment. That is, the selection process S130a and the image generation process S140a are different from the selection process S130 and the image generation process S140 of the first embodiment.
- the image processing apparatus 100a corrects the distortion of each of the viewpoint-converted three-dimensional object images 601, 602, ..., 60n of the n three-dimensional objects in the common area Rab by compressing and deforming them.
- In step S151, the position detection unit 131a reads the comparison image 190 from the storage unit 150. Based on the first bird's-eye view image 311, the second bird's-eye view image 321, and the comparison image 190, the position detection unit 131a calculates the three-dimensional object positions xpi (where i is a natural number of n or less) of the n three-dimensional objects located in the common region Rab. That is, the position detection unit 131a detects the three-dimensional object positions xp1, xp2, ..., xpn.
- the method for detecting each three-dimensional object position xpi in step S151 is the same as that in step S131 described in the first embodiment.
- In step S152, the image selection unit 132a reads the boundary position xc from the storage unit 150.
- the processing in step S152 is the same as that in step S132 described in the first embodiment.
- In step S153, the image selection unit 132a selects, from the first overhead image 311 and the second overhead image 321, the overhead image in which the three-dimensional object image 60i is less distorted as the selection image 330, based on the boundary position xc and the three-dimensional object position xpi located in the common region Rab.
- the three-dimensional object image 60i is a three-dimensional object image to be processed.
- the image selection unit 132a selects the first overhead image 311 as the selection image 330 when the three-dimensional object position xpi is closer to the first imaging device 210 than the boundary position xc (step S155).
- the image selection unit 132a selects the second overhead image 321 as the selection image 330 when the three-dimensional object position xpi is closer to the second imaging device 220 than the boundary position xc (Step S154).
- the processing from step S153 to step S155 is the same as that from step S133 to step S135 described in the first embodiment.
- FIG. 17 is a diagram for explaining the region image generation processing S150a according to the present embodiment.
- three three-dimensional objects are located in the common region Rab.
- the three-dimensional object positions xp1, xp2, and xp3 of the three three-dimensional objects are detected.
- the three-dimensional object positions xp1 and xp2 are closer to the first imaging device 210 than the boundary position xc.
- the three-dimensional object position xp3 is closer to the second imaging device 220 than the boundary position xc.
- therefore, the first bird's-eye view image 311 is selected as the bird's-eye view image of each of the three-dimensional objects at the three-dimensional object positions xp1 and xp2.
- the bird's-eye view images of the three-dimensional objects at the three-dimensional object positions xp1 and xp2 are defined as the three-dimensional object images 601 and 602, respectively.
- for the three-dimensional object at the three-dimensional object position xp3, the second bird's-eye view image 321 is selected.
- the bird's-eye view image of the three-dimensional object at the three-dimensional object position xp3 is defined as the three-dimensional object image 603.
- 35a in FIG. 17A represents the position where the coordinate position of the imaging element center 3a of the first imaging device 210 is mapped on the XY plane after the overhead image conversion.
- 35b in FIG. 17A represents the position where the coordinate position of the imaging element center 3b of the second imaging device 220 is mapped on the XY plane after the overhead image conversion.
- In step S156, the image generation unit 140a determines an image deformation coefficient Qi that determines the degree of deformation when compressing and deforming the three-dimensional object image 60i, based on the three-dimensional object position xpi.
- the image deformation coefficient is also called a deformation rate.
- the image generation unit 140a determines the image deformation coefficient Qi based on the length of the distance between the center of the imaging element of the imaging device and the three-dimensional object position xpi.
- the image deformation coefficient Qi increases as the distance between the imaging element center of the imaging apparatus and the three-dimensional object position xpi increases. In the example of FIG. 17, the distance L2 between 35a and the three-dimensional object position xp2 is longer than the distance L1 between 35a, which is the mapped center of the image sensor, and the three-dimensional object position xp1. Therefore, the image deformation coefficient Q2 of the three-dimensional object image 602 is determined to be larger than the image deformation coefficient Q1 of the three-dimensional object image 601.
- for example, an arithmetic expression using the three-dimensional object position xp as a variable is stored in advance in the storage unit 150, and the image generation unit 140a calculates the image deformation coefficient Qi using the arithmetic expression stored in the storage unit 150 and the three-dimensional object position xpi.
- alternatively, an image deformation coefficient table using the three-dimensional object position xp as a parameter may be stored in the storage unit 150 in advance, and the image generation unit 140a may read the value corresponding to xpi from the image deformation coefficient table stored in the storage unit 150 as the image deformation coefficient Qi.
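Both variants can be sketched together: a linear arithmetic expression in the distance, with a table lookup fallback. The linear form and its constants are assumptions, not the embodiment's stored parameters.

```python
def deformation_coefficient(sensor_center_x, xp, q_min=1.0, gain=0.1, table=None):
    """Return the image deformation coefficient Qi for an object at X
    position xp: read it from the table if one is given, otherwise
    evaluate a linear expression in the distance to the mapped
    image-sensor center (35a or 35b)."""
    if table is not None:
        return table[xp]                     # table lookup variant
    distance = abs(xp - sensor_center_x)     # farther objects distort more
    return q_min + gain * distance
```

The monotonic growth with distance encodes the observation that objects farther from the camera fall down more in the overhead image and therefore need stronger compression.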
- In step S157, the image generation unit 140a extracts the three-dimensional object image 60i from the selected image 330. Specifically, the image generation unit 140a extracts the three-dimensional object image 60i using the common region Rab portion of the selection image 330 and the comparison image 190.
- (c) and (d) in FIG. 17 show the three-dimensional object images 601, 602, and 603 extracted by the image generation unit 140a.
- the three-dimensional object images 601 and 602 are extracted from the first overhead image 311 that is the selection image 330. Further, the three-dimensional object image 603 is extracted from the second overhead image 321 that is the selection image 330.
- In step S158, the image generation unit 140a performs the compression deformation process on the extracted three-dimensional object image 60i using the image deformation coefficient Qi.
- the compression deformation process is a process for correcting the three-dimensional object image 60i by reducing the distortion of the three-dimensional object image 60i by performing a compression deformation using the image deformation coefficient Qi.
- the three-dimensional object corrected images 601p and 602p in (e) of FIG. 17 represent the results of performing compression deformation processing on each of the three-dimensional object images 601 and 602 in (c) of FIG.
- the image deformation coefficient Q2 for the three-dimensional object image 602 is determined to be larger than the image deformation coefficient Q1 for the three-dimensional object image 601.
- the three-dimensional object corrected image 603p in (f) of FIG. 17 represents the result of performing the compression deformation process on the three-dimensional object image 603 in (d) of FIG.
- it is desirable that the compression deformation for the three-dimensional object image is performed in a linear direction connecting the center of the imaging element and the center of the three-dimensional object image in the overhead image.
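A one-dimensional sketch of the compression deformation, assuming compression along the X axis only and nearest-neighbor sampling; the embodiment's compression along the line to the image-sensor center is simplified here to a single axis.

```python
import numpy as np

def compress_toward(image, base_x, Q):
    """Shrink the extracted three-dimensional object image toward its
    base position base_x by the deformation factor Q (Q >= 1): each
    output column x is sampled from source column base_x + (x - base_x) * Q,
    so the object's footprint contracts by 1/Q toward base_x."""
    h, w = image.shape
    out = np.zeros_like(image)
    for x in range(w):
        src = base_x + (x - base_x) * Q      # inverse mapping into the source
        if 0 <= src < w:
            out[:, x] = image[:, int(src)]
    return out
```

Pulling the fallen-over object image back toward its base position is what reduces the apparent fall-down distortion before the corrected image is pasted at xpi.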
- In step S159, the image generation unit 140a determines whether or not a solid object that has not yet been subjected to the compression deformation process remains among the n solid objects. That is, the image generation unit 140a determines whether the correction by compression deformation has been completed for all three-dimensional objects in the common region Rab. If the correction has been completed for all three-dimensional objects in step S159, the process proceeds to step S160. If a three-dimensional object that has not been corrected remains in step S159, the process returns to step S153, and the next three-dimensional object image is processed as the three-dimensional object image 60i.
- In step S160, the image generation unit 140a generates the corrected common image 36 by pasting each of the three-dimensional object corrected images 601p, 602p, ..., 60np subjected to the compression deformation process at the corresponding three-dimensional object positions xp1, xp2, ..., xpn.
- FIG. 17G shows the corrected common image 36 in a state in which the three-dimensional object corrected images 601p, 602p, and 603p are pasted on the three-dimensional object positions xp1, xp2, and xp3 of the comparison image 190, respectively.
- in step S161, the image generation unit 140a generates the region image 340 of the entire region R using the image other than the common region Rab in the first bird's-eye view image 311, the image other than the common region Rab in the second bird's-eye view image 321, and the corrected common image 36.
- in step S162, the image generation unit 140 outputs the region image 340 to a display device such as a display via the output interface 940.
- according to the image processing apparatus of the present embodiment, for a plurality of three-dimensional objects in the common area, the three-dimensional object image to be compressed and deformed is selected based on the three-dimensional object position and the boundary position, and the selected three-dimensional object image can be compressed and deformed. Therefore, according to the image processing device according to the present embodiment, it is possible to further reduce the distortion of the three-dimensional object images in the common region, and to improve the image quality of the composite overhead image.
- according to the image processing apparatus, it is possible to reduce the distortion of each three-dimensional object individually by extracting the three-dimensional object image and compressing and deforming only the extracted image. Therefore, according to the image processing apparatus according to the present embodiment, since distortion due to correction does not occur in areas that do not need to be compressed, the quality of the composite overhead image can be further improved.
- although Embodiments 1 and 2 of the present invention have been described, either one of the two embodiments may be employed.
- the image processing apparatus may be configured from any combination of these functional blocks, or with an arbitrary block configuration.
- the image processing apparatus may be an image processing system composed of a plurality of apparatuses instead of a single apparatus.
- although Embodiments 1 and 2 have been described, the two embodiments may be partially combined. Alternatively, one of the two embodiments may be partially implemented. The two embodiments may also be implemented in any combination, in whole or in part. The above embodiments are essentially preferable examples and are not intended to restrict the scope of the present invention, its applications, or its uses.
- 3a, 3b imaging element center, 100, 100a image processing device, 110 boundary calculation unit, 120 overhead view generation unit, 130, 130a selection unit, 131, 131a position detection unit, 132, 132a image selection unit, 140, 140a image generation unit, 150 storage unit, 160 imaging device information, 161 first device information, 162 second device information, 170 subject information, 171 height information, 180 boundary position, 190 comparative image, 210 first imaging device, 220 second imaging device, 310 first image, 320 second image, 311 first bird's-eye view image, 321 second bird's-eye view image, 330 selected image, 340 region image, 30a, 30b common overhead image, 31a, 31b difference image, 33 composite image, 36 corrected common image, 500 ground surface, 510 image processing method, 520 image processing program, 600, 600a, 600b three-dimensional object, 60a, 60b, 601, 602, 603, 60i three-dimensional object image, 60ab three-dimensional object superimposed portion, 601p
Abstract
Description
A boundary calculation unit that calculates, using first device information including position information of a first imaging device that images a first region including a common region in which a three-dimensional object is located, and second device information including position information of a second imaging device that images a second region including the common region, a boundary position serving as a reference for dividing the common region into a side of the first imaging device and a side of the second imaging device;
a selection unit that selects, as a selected image, based on the boundary position and the position of the three-dimensional object, the bird's-eye view image in which the image of the three-dimensional object is less distorted, out of a first bird's-eye view image, which is obtained by viewpoint conversion of the image of the first region captured by the first imaging device and in which the image of the three-dimensional object is distorted, and a second bird's-eye view image, which is obtained by viewpoint conversion of the image of the second region captured by the second imaging device and in which the image of the three-dimensional object is distorted; and
an image generation unit that generates, using the image of the first region other than the common region in the first bird's-eye view image, the image of the second region other than the common region in the second bird's-eye view image, and the image of the common region included in the selected image, an image of a region consisting of the first region and the second region.
*** Description of the Configuration ***
The configuration of the image processing apparatus 100 according to the present embodiment will be described with reference to FIG. 1.
In the present embodiment, the image processing apparatus 100 is a computer. The image processing apparatus 100 includes hardware such as a processor 910, a storage device 920, an input interface 930, and an output interface 940.
As its functional configuration, the image processing apparatus 100 includes a boundary calculation unit 110, an overhead view generation unit 120, a selection unit 130, an image generation unit 140, and a storage unit 150. The selection unit 130 includes a position detection unit 131 and an image selection unit 132. In the following description, the functions of the boundary calculation unit 110, the overhead view generation unit 120, the position detection unit 131, the image selection unit 132, and the image generation unit 140 in the image processing apparatus 100 are referred to as the functions of the "units" of the image processing apparatus 100. The functions of the "units" of the image processing apparatus 100 are implemented by software.
The storage unit 150 is implemented by the storage device 920. The storage unit 150 stores imaging device information 160, subject information 170, a boundary position 180, and a comparative image 190. The imaging device information 160 includes first device information 161 and second device information 162. The subject information 170 includes height information 171.
The processor 910 is an IC (Integrated Circuit) that performs processing. Specifically, the processor 910 is a CPU (Central Processing Unit).
A program that implements the functions of the "units" is also referred to as an image processing program. The image processing program is a program that implements the functions described as the "units". What is called an image processing program product is a storage medium or storage device on which the image processing program is recorded, and it loads a computer-readable program regardless of its apparent form.
The operation of the image processing apparatus 100 according to the present embodiment will be described.
First, the positional relationship between the first imaging device 210 and the second imaging device 220 will be described with reference to FIG. 2. In the following description, both or one of the first imaging device 210 and the second imaging device 220 may be referred to simply as the imaging device.
FIG. 2 schematically shows the positional relationship between the first imaging device 210 and the second imaging device 220 in an imaging space 800. The imaging space 800 is the space to be imaged by the first imaging device 210 and the second imaging device 220.
Each of the first imaging device 210 and the second imaging device 220 is, specifically, a fixed camera such as a surveillance camera.
The center of the imaging element of the first imaging device 210 is denoted as imaging element center 3a. The center of the imaging element of the second imaging device 220 is denoted as imaging element center 3b. The imaging element center is also referred to as the lens focal point.
The X axis lies on the ground surface 500 and represents the direction from the first imaging device 210 toward the second imaging device 220. That is, the imaging element center 3a of the first imaging device 210 and the imaging element center 3b of the second imaging device 220 are contained in the XZ plane.
The Y axis lies on the ground surface 500 and represents the direction toward the back of the page in the XZ plan view of FIG. 2.
The Z axis represents the height direction from the ground surface 500. The Z axis also contains the imaging element center 3a of the first imaging device 210. The intersection of the X axis with a perpendicular dropped from the imaging element center 3a of the first imaging device 210 onto the X axis is the origin of the XYZ space representing the imaging space 800.
The installation conditions of the first imaging device 210 are as follows: the height of the imaging element center 3a from the ground surface 500 is Ha, the depression angle, which is the angle looking down from the horizontal plane containing the imaging element center 3a, is θa, and the lens angle of view is ψa. The installation conditions of the second imaging device 220 are as follows: the height of the imaging element center 3b from the ground surface 500 is Hb, the depression angle, which is the angle looking down from the horizontal plane containing the imaging element center 3b, is θb, and the lens angle of view is ψb. Note that the depression angles are shown schematically in FIG. 2.
The inter-device distance between the imaging element center 3a of the first imaging device 210 and the imaging element center 3b of the second imaging device 220 is Xab. Note that Xab is the X-axis component of the shortest distance between the imaging element center 3a of the first imaging device 210 and the imaging element center 3b of the second imaging device 220.
A three-dimensional object 600 is located in the common region Rab. The three-dimensional object 600 is, specifically, a subject having height, such as a person, a car, or a building.
The entire region R consists of the first region Ra and the second region Rb. That is, the entire region R is the whole of the imaging regions imaged by the first imaging device 210 and the second imaging device 220. It is desirable that the installation conditions be set so that the imaging scale of the image of the common region Rab captured by the first imaging device 210 and the imaging scale of the image of the common region Rab captured by the second imaging device 220 are equal. The imaging scale refers to the size and resolution of an image.
The virtual viewpoint P represents the viewpoint of the bird's-eye view image generated by the image processing apparatus 100. In the present embodiment, a bird's-eye view image means an image viewed from the virtual viewpoint P.
The imaging device information 160 is information about the installation conditions of the imaging devices, set in the storage unit 150 in advance. The information about the installation conditions of an imaging device includes position information about the position of the imaging device, orientation information about the orientation of the imaging device, and the resolution of the imaging device. The position information includes the coordinates of the imaging device in the imaging space 800. The orientation information includes the yaw, roll, and pitch of the imaging device.
The first device information 161 is information including the position information, orientation information, and resolution of the first imaging device 210 that images the first region Ra including the common region Rab. In the first device information 161, the position information, orientation information, and resolution of the first imaging device 210 satisfying the installation conditions, such as the height Ha from the ground surface 500, the depression angle θa, and the lens angle of view ψa, are set.
The second device information 162 is information including the position information, orientation information, and resolution of the second imaging device 220 that images the second region Rb including the common region Rab. In the second device information 162, the position information, orientation information, and resolution of the second imaging device 220 satisfying the installation conditions, such as the height Hb from the ground surface 500, the depression angle θb, and the lens angle of view ψb, are set.
Note that the first region Ra, the second region Rb, the common region Rab, and the inter-device distance Xab are information determined in advance from the first device information 161 and the second device information 162. The first region Ra, the second region Rb, the common region Rab, and the inter-device distance Xab may be calculated in advance and stored in the imaging device information 160. Alternatively, they may be calculated each time in the image processing executed by the image processing apparatus 100 and stored in the memory 922.
The subject information 170 includes height information 171. The height information 171 is information representing a height from the ground surface 500 determined based on the type of three-dimensional object. The height information 171 is stored in the storage unit 150.
As a specific example, when the imaging device is a camera for monitoring people, the type of three-dimensional object is a person. Information about the type of three-dimensional object to be imaged, that is, about people, is set in the subject information 170. In the height information 171, information representing a height from the ground surface determined based on the type of three-dimensional object, that is, a person, is set. Specifically, information representing a height from the ground surface determined by a preset method based on the average height of a person is set.
As another specific example, when the imaging device is a camera for monitoring cars, the type of three-dimensional object is a car. Information about cars is set in the subject information 170. In the height information 171, information representing a height from the ground surface determined based on cars is set. Specifically, information representing a height from the ground surface determined by a preset method based on the average height of a car is set.
The image processing S100 includes a boundary calculation process S110, an overhead view generation process S120, and a region image generation process S150.
First, in the boundary calculation process S110, the boundary calculation unit 110 calculates, using the first device information 161 of the first imaging device 210, the second device information 162 of the second imaging device 220, and the height information 171, a boundary position 180 serving as a reference for dividing the common region Rab into the first imaging device 210 side and the second imaging device 220 side. The boundary calculation unit 110 stores the calculated boundary position 180 in the storage unit 150.
The region image generation process S150 consists of a selection process S130 and an image generation process S140.
In the selection process S130, the selection unit 130 selects, based on the boundary position 180 and the position of the three-dimensional object 600 located in the common region Rab, the bird's-eye view image in which the image of the three-dimensional object 600 is less distorted, out of the first bird's-eye view image 311 and the second bird's-eye view image 321, as the selected image 330. That is, the boundary position 180 is used to determine which of the first bird's-eye view image 311 and the second bird's-eye view image 321 is to be adopted as the bird's-eye view image of the common region Rab when generating the bird's-eye view image of the entire region R.
In the image generation process S140, the image generation unit 140 generates, as a region image 340, an image of the entire region R consisting of the first region Ra and the second region Rb, using the image of the first region Ra other than the common region Rab in the first bird's-eye view image 311, the image of the second region Rb other than the common region Rab in the second bird's-eye view image 321, and the image of the common region Rab included in the selected image 330. The image generation unit 140 outputs the region image 340 to a display device such as a display via the output interface 940.
FIG. 4 is a flowchart showing the boundary calculation process S110 according to the present embodiment. FIG. 5 is a diagram for explaining a method of calculating the boundary position 180 in the imaging space 800 described with FIG. 2.
The boundary calculation process S110 will be described in detail with reference to FIGS. 4 and 5.
In the following, the value calculated as the boundary position 180 may be written as the boundary position xc.
Equation 1: Ea = x · T / (Ha - T)
Equation 2: Eb = (Xab - x) · T / (Hb - T)
The boundary calculation unit 110 calculates, as the boundary position xc, the coordinate x on the X axis at which the absolute value of the difference between Equation 1 and Equation 2 is minimized. The boundary calculation unit 110 stores the calculated boundary position 180 in the storage unit 150.
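As a sketch of how the boundary calculation unit 110 could evaluate this minimization numerically (the patent does not prescribe an implementation; the grid scan and step count here are assumptions for illustration):

```python
def boundary_position(Ha, Hb, Xab, T, steps=10000):
    """Scan the X axis for the coordinate xc minimizing |Ea - Eb|.

    Ha, Hb: imaging element center heights; Xab: inter-device distance;
    T: object height taken from the height information 171.
    """
    best_x, best_diff = 0.0, float("inf")
    for i in range(steps + 1):
        x = Xab * i / steps
        Ea = x * T / (Ha - T)            # Equation 1: first distortion degree
        Eb = (Xab - x) * T / (Hb - T)    # Equation 2: second distortion degree
        if abs(Ea - Eb) < best_diff:
            best_x, best_diff = x, abs(Ea - Eb)
    return best_x
```

Setting Equation 1 equal to Equation 2 also gives the closed form xc = Xab · (Ha - T) / ((Ha - T) + (Hb - T)); with equal camera heights the boundary falls at the midpoint Xab / 2, as expected.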
FIG. 5 is an XZ plan view showing the positional relationship among the first imaging device 210, the second imaging device 220, and the three-dimensional object 600 when the three-dimensional object 600 is located in the common region Rab, under the imaging device installation conditions described with FIG. 2.
Let the intersection of line a and line b, which are set so as to intersect at the height T of the common region Rab, be intersection T1. Line a is a straight line drawn from the imaging element center 3a onto the common region Rab, and line b is a straight line drawn from the imaging element center 3b onto the common region Rab. Let the intersection of the ground surface 500, that is, the X axis, with a perpendicular dropped from the intersection T1 onto the common region Rab be the boundary position xc.
The distance in the X-axis direction between the boundary position xc and the intersection of line a with the ground surface 500 is the first distortion degree Ea.
The distance in the X-axis direction between the boundary position xc and the intersection of line b with the ground surface 500 is the second distortion degree Eb.
The overhead view generation process S120 according to the present embodiment will be described in detail with reference to FIG. 6.
In step S122, the overhead view generation unit 120 executes, for each of the first image 310 and the second image 320, lens distortion correction processing that corrects image distortion caused by the lens of the imaging device. Specifically, the overhead view generation unit 120 executes the lens distortion correction processing for each of the first image 310 and the second image 320 using calculation parameters for correcting the distortion of the respective lenses of the first imaging device 210 and the second imaging device 220. These calculation parameters are collected in advance for each of the first imaging device 210 and the second imaging device 220 and stored in the imaging device information 160 of the storage unit 150. The overhead view generation unit 120 reads the calculation parameters from the storage unit 150 and executes the lens distortion correction processing for each of the first image 310 and the second image 320 using the read calculation parameters.
Note that in each of the first bird's-eye view image 311 and the second bird's-eye view image 321, the three-dimensional object 600 in the common region Rab is distorted due to the effect of the viewpoint conversion processing.
The selection process S130 according to the present embodiment will be described in detail with reference to FIG. 7.
The comparative image 190 is a bird's-eye view image, viewed from the virtual viewpoint P, of the common region Rab, which is the range where the first bird's-eye view image 311 and the second bird's-eye view image 321 overlap, in a state without any three-dimensional object. That is, the comparative image 190 is a bird's-eye view image, viewed from the virtual viewpoint P, of the common region Rab in a state in which no three-dimensional object exists, that is, in which the three-dimensional objects have been removed. The comparative image 190 is generated in advance and stored in the storage unit 150.
The position detection unit 131 obtains a common overhead image 30a, which is the bird's-eye view image of the common region Rab included in the first bird's-eye view image 311, and a common overhead image 30b, which is the bird's-eye view image of the common region Rab included in the second bird's-eye view image 321.
(a) of FIG. 8 shows the common overhead image 30a of the first bird's-eye view image 311. The image of the three-dimensional object 600 represented in the common overhead image 30a is referred to as a three-dimensional object image 60a. The three-dimensional object image 60a is an image of the three-dimensional object 600 obtained by viewpoint conversion of the first image 310, and it is an image in which the three-dimensional object 600 appears to fall over to the right, that is, in the direction away from the first imaging device 210.
(b) of FIG. 8 shows the common overhead image 30b of the second bird's-eye view image 321. The image of the three-dimensional object 600 represented in the common overhead image 30b is referred to as a three-dimensional object image 60b. The three-dimensional object image 60b is an image of the three-dimensional object 600 obtained by viewpoint conversion of the second image 320, and it is an image in which the three-dimensional object 600 appears to fall over to the left, that is, in the direction away from the second imaging device 220.
(c) of FIG. 8 shows the comparative image 190 stored in the storage unit 150. The comparative image 190 is a bird's-eye view image of the common region Rab without any three-dimensional object, viewed from the virtual viewpoint P.
The position detection unit 131 executes blending, that is, superimposition of semi-transparent images, on the difference image 31a and the difference image 31b, and generates a composite image 33 in which the difference image 31a and the difference image 31b are combined. (f) of FIG. 8 shows the composite image 33 in which the three-dimensional object image 60a and the three-dimensional object image 60b are superimposed. In the composite image 33, the three-dimensional object superimposed portion 60ab, where the three-dimensional object image 60a and the three-dimensional object image 60b overlap, indicates the region where the three-dimensional object 600 touches the ground surface 500. The position detection unit 131 detects the X coordinate of the center position of the three-dimensional object superimposed portion 60ab as the three-dimensional object position xp, which is the position of the three-dimensional object 600.
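The ground-contact detection described above, difference against the object-free comparative image from each camera, then the overlap of the two difference images, can be sketched with plain Python lists standing in for grayscale images. The threshold value and the centroid rule below are illustrative assumptions, not specified by the patent:

```python
def detect_object_x(common_a, common_b, reference, threshold=10):
    """Estimate the ground-contact X coordinate of a solid object.

    common_a / common_b: the common-region bird's-eye views from the two
    cameras; reference: the object-free comparative image. All three are
    2D lists of grayscale values (stand-ins for real images).
    """
    height, width = len(reference), len(reference[0])
    xs = []
    for y in range(height):
        for x in range(width):
            in_diff_a = abs(common_a[y][x] - reference[y][x]) > threshold
            in_diff_b = abs(common_b[y][x] - reference[y][x]) > threshold
            if in_diff_a and in_diff_b:   # overlap of the two difference images
                xs.append(x)
    if not xs:
        return None                        # no object in the common region
    return sum(xs) / len(xs)               # center of the overlapping portion
```

The two falling-over directions point away from each camera, so their overlap clusters around the object's footprint; the mean column index approximates the X coordinate of the superimposed portion 60ab.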
In step S133, the image selection unit 132 selects, based on the read boundary position xc and the three-dimensional object position xp of the three-dimensional object 600 located in the common region Rab, the bird's-eye view image in which the image of the three-dimensional object 600 is less distorted, out of the first bird's-eye view image 311 and the second bird's-eye view image 321, as the selected image 330.
Specifically, the image selection unit 132 determines, based on the boundary position xc, whether the three-dimensional object position xp is on the first imaging device 210 side or the second imaging device 220 side. The image selection unit 132 selects the first bird's-eye view image 311 as the selected image 330 when the three-dimensional object position xp is on the first imaging device 210 side of the boundary position xc. The image selection unit 132 selects the second bird's-eye view image 321 as the selected image 330 when the three-dimensional object position xp is on the second imaging device 220 side of the boundary position xc.
That the three-dimensional object position xp is on the second imaging device 220 side of the boundary position xc means that the three-dimensional object position xp, which is the X coordinate of the three-dimensional object 600, is on the second imaging device 220 side of the boundary position xc. That is, it means that the three-dimensional object 600 is on the second imaging device 220 side of the straight line on the common region Rab whose X coordinate is the boundary position xc.
In step S133, if the three-dimensional object position xp is smaller than the boundary position xc, the process proceeds to step S135. That the three-dimensional object position xp is smaller than the boundary position xc means that the three-dimensional object position xp is on the first imaging device 210 side of the boundary position xc. Accordingly, in step S135, the image selection unit 132 selects the first bird's-eye view image 311 as the selected image 330.
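The selection rule reduces to a single comparison of the object position against the boundary position; a minimal sketch:

```python
def select_overhead_image(xp, xc, first_overhead, second_overhead):
    """Return the bird's-eye view with the smaller object distortion.

    xp < xc means the object lies on the first imaging device 210 side,
    so the first bird's-eye view image 311 is selected (step S135);
    otherwise the second bird's-eye view image 321 is selected.
    """
    return first_overhead if xp < xc else second_overhead
```

The heavy lifting is in computing xc once per camera pair; the per-object decision itself is constant time.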
The image generation process S140 according to the present embodiment will be described in detail with reference to FIG. 9.
In step S142, the image generation unit 140 outputs the region image 340 to a display device such as a display via the output interface 940.
In FIG. 10, since the three-dimensional object position xp is on the first imaging device 210 side of the boundary position xc, the image selection unit 132 selects the first bird's-eye view image 311 as the selected image 330. The image selection unit 132 therefore combines the first bird's-eye view image 311 with the image of the second region Rb excluding the common region Rab in the second bird's-eye view image 321 to generate the region image 340 of the entire region R. That is, the region image 340 is generated from the image other than the common region Rab in the first bird's-eye view image 311, the image other than the common region Rab in the second bird's-eye view image 321, and the image of the common region Rab included in the first bird's-eye view image 311, which is the selected image 330.
In FIG. 11, since the three-dimensional object position xp is on the second imaging device 220 side of the boundary position xc, the image selection unit 132 selects the second bird's-eye view image 321 as the selected image 330. The image selection unit 132 therefore combines the second bird's-eye view image 321 with the image of the first region Ra excluding the common region Rab in the first bird's-eye view image 311 to generate the region image 340 of the entire region R. That is, the region image 340 is generated from the image other than the common region Rab in the first bird's-eye view image 311, the image other than the common region Rab in the second bird's-eye view image 321, and the image of the common region Rab included in the second bird's-eye view image 321, which is the selected image 330.
In the present embodiment, the image processing apparatus 100 is configured to obtain the first image 310 and the second image 320 via the input interface 930 and to output the region image 340 via the output interface 940. However, the image processing apparatus 100 may include a communication device and receive the first image 310 and the second image 320 via the communication device. The image processing apparatus 100 may also transmit the region image 340 via the communication device. In this case, the communication device includes a receiver and a transmitter. Specifically, the communication device is a communication chip or an NIC (Network Interface Card). The communication device functions as a communication unit that communicates data. The receiver functions as a receiving unit that receives data, and the transmitter functions as a transmitting unit that transmits data.
The configuration of the image processing apparatus 100 according to a modification of the present embodiment will be described with reference to FIG. 12.
As shown in FIG. 12, the image processing apparatus 100 includes hardware such as a processing circuit 909, an input interface 930, and an output interface 940.
When the storage unit 150 is implemented by the auxiliary storage device 921, the information read from the storage unit 150 during the processing for generating the first region image is stored in the memory 922. This eliminates the need to read from the auxiliary storage device 921 in the processing for generating the second and subsequent region images.
In such a case, the entire region R may be divided into regions in each of which the moving direction of three-dimensional objects is constant, and the image processing S100 described above may be executed for each of the divided regions. Specifically, as shown in FIG. 13, the entire region R contains a region R1, which is a lane in which a three-dimensional object 600b moves in direction B, and a region R2, which is a lane in which a three-dimensional object 600a moves in direction A. The image processing apparatus stores in advance, in the storage unit, imaging device information defined by dividing the entire region R into the region R1 and the region R2. The image processing apparatus then executes the image processing S100 in each of the regions R1 and R2 and generates a bird's-eye view image of the region R1 and a bird's-eye view image of the region R2. The image processing apparatus then combines the bird's-eye view image of the region R1 and the bird's-eye view image of the region R2 to generate a region image of the entire region R.
As described above, the image processing apparatus according to the present embodiment determines on which side of the boundary position the three-dimensional object position lies, using as the boundary the position at which the difference between the distortion degrees of the three-dimensional object is smallest. Then, by performing the selection processing of the bird's-eye view image to be displayed as the image of the common region based on this determination result, the bird's-eye view image with the smaller distortion of the three-dimensional object is selected. Therefore, according to the image processing apparatus of the present embodiment, an unnatural image in which the three-dimensional object image is displayed twice does not occur, and a composite bird's-eye view image with small distortion of the three-dimensional object is obtained.
In the present embodiment, the differences from Embodiment 1 will mainly be described.
In Embodiment 1, the image processing for the case where there is one three-dimensional object in the common region Rab has been described. Also, in Embodiment 1, the region image was generated with the viewpoint-converted image of the three-dimensional object left distorted. In the present embodiment, an image processing apparatus 100a will be described that, when there are n (n is a natural number of 1 or more) three-dimensional objects in the common region Rab, reduces the distortion of the images of the three-dimensional objects for display by compressing and deforming the viewpoint-converted image of each three-dimensional object.
The configuration of the image processing apparatus 100a according to the present embodiment will be described with reference to FIG. 14.
In the present embodiment, configurations similar to those described in Embodiment 1 are denoted by the same reference signs, and their description is omitted.
The image processing S100a of the image processing apparatus 100a according to the present embodiment will be described with reference to FIG. 15. In the present embodiment, the region image generation process S150a differs from the region image generation process S150 of Embodiment 1. That is, the selection process S130a and the image generation process S140a differ from the selection process S130 and the image generation process S140 of Embodiment 1.
In the region image generation process S150a, the image processing apparatus 100a corrects, by compression deformation, the distortion of each of the viewpoint-converted three-dimensional object images 601, 602, ..., 60n of the n three-dimensional objects in the common region Rab.
In step S151, the position detection unit 131a reads the comparative image 190 from the storage unit 150. The position detection unit 131a calculates, based on the first bird's-eye view image 311, the second bird's-eye view image 321, and the comparative image 190, the three-dimensional object position xpi (i is a natural number not greater than n) of each of the n three-dimensional objects located in the common region Rab. That is, the position detection unit 131a detects the three-dimensional object positions xp1, xp2, ..., xpn. The method of detecting each three-dimensional object position xpi in step S151 is the same as in step S131 described in Embodiment 1.
In step S152, the image selection unit 132a reads the boundary position xc from the storage unit 150. The processing of step S152 is the same as in step S132 described in Embodiment 1.
In step S153, the image selection unit 132a selects, based on the boundary position xc and the three-dimensional object position xpi in the common region Rab, the bird's-eye view image in which the three-dimensional object image 60i is less distorted, out of the first bird's-eye view image 311 and the second bird's-eye view image 321, as the selected image 330. The three-dimensional object image 60i is the three-dimensional object image to be processed. The image selection unit 132a selects the first bird's-eye view image 311 as the selected image 330 when the three-dimensional object position xpi is on the first imaging device 210 side of the boundary position xc (step S155). The image selection unit 132a selects the second bird's-eye view image 321 as the selected image 330 when the three-dimensional object position xpi is on the second imaging device 220 side of the boundary position xc (step S154). The processing of steps S153 to S155 is the same as in steps S133 to S135 described in Embodiment 1.
As a specific example, assume that three three-dimensional objects are located in the common region Rab, and that the three-dimensional object positions xp1, xp2, and xp3 of the three objects have been detected. As shown in (a) of FIG. 17, the three-dimensional object positions xp1 and xp2 are on the first imaging device 210 side of the boundary position xc. As shown in (b) of FIG. 17, the three-dimensional object position xp3 is on the second imaging device 220 side of the boundary position xc.
Accordingly, the first bird's-eye view image 311 is selected as the bird's-eye view image for each of the three-dimensional objects at the positions xp1 and xp2. The bird's-eye view images of the three-dimensional objects at the positions xp1 and xp2 are referred to as three-dimensional object images 601 and 602. The second bird's-eye view image 321 is selected as the bird's-eye view image of the three-dimensional object at the position xp3. The bird's-eye view image of the three-dimensional object at the position xp3 is referred to as a three-dimensional object image 603.
In the examples of (a) and (b) of FIG. 17, the distance L2 between the imaging element center 35a and the three-dimensional object position xp2 is longer than the distance L1 between 35a and the three-dimensional object position xp1. Therefore, the image deformation coefficient Q2 of the three-dimensional object image 602 is determined to be a larger value than the image deformation coefficient Q1 of the three-dimensional object image 601. Specifically, an arithmetic expression with the three-dimensional object position xp as a variable is stored in advance in the storage unit 150, and the image generation unit 140a calculates the image deformation coefficient Qi using the arithmetic expression stored in the storage unit 150 and the three-dimensional object position xpi. Alternatively, an image deformation coefficient table with the three-dimensional object position xp as a parameter may be stored in advance in the storage unit 150, and the image generation unit 140a may read the value corresponding to xpi from the image deformation coefficient table stored in the storage unit 150 as the image deformation coefficient Qi.
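Either variant (arithmetic expression or lookup table) maps the object position to a coefficient that grows with distance from the imaging element center. A hypothetical linear expression as a sketch; the patent leaves the actual formula open, so the function shape, the bounds `q_min`/`q_max`, and `max_dist` are all illustrative assumptions:

```python
def deformation_coefficient(xp, sensor_x, q_min=1.0, q_max=2.0, max_dist=10.0):
    """Hypothetical rule: objects farther from the imaging element center
    fall over more in the bird's-eye view, so they receive a larger
    compression coefficient Q, clamped to the range [q_min, q_max]."""
    d = min(abs(xp - sensor_x), max_dist)
    return q_min + (q_max - q_min) * d / max_dist
```

With the sensor center at x = 0, an object at xp2 = 6 gets a larger coefficient than one at xp1 = 3, matching the relation Q2 > Q1 in the example above; a table-driven variant would replace the expression with a precomputed lookup keyed by xp.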
The three-dimensional object corrected image 603p in (f) of FIG. 17 represents the result of performing the compression deformation processing on the three-dimensional object image 603 in (d) of FIG. 17.
Note that it is desirable that the compression deformation of a three-dimensional object image be performed along the straight line connecting the imaging element center and the center of the three-dimensional object image in the bird's-eye view image.
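The recommended compression axis is the unit vector from the imaging element center toward the object image center in the bird's-eye view, since that is the direction in which the object appears to fall over. A short sketch of computing that direction (the tuple-based point representation is an assumption for illustration):

```python
import math

def compression_direction(sensor_center, object_center):
    """Unit vector along the line connecting the imaging element center
    and the center of the three-dimensional object image; the compression
    deformation would be applied along this direction."""
    dx = object_center[0] - sensor_center[0]
    dy = object_center[1] - sensor_center[1]
    norm = math.hypot(dx, dy)
    return (dx / norm, dy / norm)
```

Scaling the object image by 1/Q along this vector, while leaving the perpendicular direction unchanged, shortens the fallen-over silhouette toward its ground-contact point.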
If correction has been completed for all three-dimensional objects in step S159, the process proceeds to step S160.
If a three-dimensional object for which correction has not been completed remains in step S159, the process returns to step S153, and the three-dimensional object image following the three-dimensional object image 60i is processed.
In step S161, the image generation unit 140a generates the region image 340 of the entire region R using the image other than the common region Rab in the first bird's-eye view image 311, the image other than the common region Rab in the second bird's-eye view image 321, and the corrected common image 36.
In step S162, the image generation unit 140 outputs the region image 340 to a display device such as a display via the output interface 940.
As described above, according to the image processing apparatus of the present embodiment, for a plurality of three-dimensional objects in the common region, the three-dimensional object image to be compressed and deformed is selected based on the three-dimensional object position and the boundary position, and the selected three-dimensional object image can be compressed and deformed. Therefore, according to the image processing apparatus of the present embodiment, the distortion of the three-dimensional object images in the common region can be further reduced, and the image quality of the composite bird's-eye view image can be improved.
Note that the above embodiments are essentially preferable examples and are not intended to restrict the scope of the present invention, its applications, or its uses; various modifications are possible as needed.
Claims (9)
- An image processing apparatus comprising: a boundary calculation unit to calculate, using first device information including position information of a first imaging device to image a first region including a common region in which a three-dimensional object is located, and second device information including position information of a second imaging device to image a second region including the common region, a boundary position serving as a reference for dividing the common region into a side of the first imaging device and a side of the second imaging device;
a selection unit to select, as a selected image, based on the boundary position and a position of the three-dimensional object, the bird's-eye view image in which an image of the three-dimensional object is less distorted, out of a first bird's-eye view image, which is obtained by viewpoint conversion of an image of the first region captured by the first imaging device and in which the image of the three-dimensional object is distorted, and a second bird's-eye view image, which is obtained by viewpoint conversion of an image of the second region captured by the second imaging device and in which the image of the three-dimensional object is distorted; and
an image generation unit to generate, based on an image of the first region other than the common region in the first bird's-eye view image, an image of the second region other than the common region in the second bird's-eye view image, and an image of the common region included in the selected image, a region image of a region consisting of the first region and the second region.
- The image processing apparatus according to claim 1, wherein the selection unit
selects the first bird's-eye view image as the selected image when the position of the three-dimensional object is on the first imaging device side of the boundary position, and selects the second bird's-eye view image as the selected image when the three-dimensional object is on the second imaging device side of the boundary position. - The image processing apparatus according to claim 1 or 2, wherein the boundary calculation unit
calculates the boundary position using a first distortion degree, which represents a degree of distortion in the first bird's-eye view image and is obtained using the first device information and height information representing a height from the ground surface determined based on a type of the three-dimensional object, and a second distortion degree, which represents a degree of distortion in the second bird's-eye view image and is obtained using the second device information and the height information. - The image processing apparatus according to claim 3, wherein the boundary calculation unit
calculates, as the boundary position, the position in the common region at which the absolute value of the difference between the first distortion degree and the second distortion degree is minimized. - The image processing apparatus according to any one of claims 1 to 4, wherein the selection unit
detects the position of the three-dimensional object based on a comparative image of the common region in which no three-dimensional object exists, the image of the common region included in the first bird's-eye view image, and the image of the common region included in the second bird's-eye view image. - The image processing apparatus according to claim 5, wherein the image generation unit
corrects the distortion of the image of the three-dimensional object in the image of the common region included in the selected image, combines the distortion-corrected image of the three-dimensional object with the comparative image, and generates the region image using the resulting composite image, the image of the first region other than the common region in the first bird's-eye view image, and the image of the second region other than the common region in the second bird's-eye view image. - The image processing apparatus according to claim 6, wherein the image generation unit
determines an image deformation coefficient based on the position of the three-dimensional object detected by the selection unit, and corrects the distortion of the image of the three-dimensional object using the image deformation coefficient. - An image processing method in which: a boundary calculation unit calculates, using first device information including position information of a first imaging device to image a first region including a common region in which a three-dimensional object is located, and second device information including position information of a second imaging device to image a second region including the common region, a boundary position serving as a reference for dividing the common region into a side of the first imaging device and a side of the second imaging device;
a selection unit selects, as a selected image, based on the boundary position and a position of the three-dimensional object, the bird's-eye view image in which an image of the three-dimensional object is less distorted, out of a first bird's-eye view image, which is obtained by viewpoint conversion of an image of the first region captured by the first imaging device and in which the image of the three-dimensional object is distorted, and a second bird's-eye view image, which is obtained by viewpoint conversion of an image of the second region captured by the second imaging device and in which the image of the three-dimensional object is distorted; and
an image generation unit generates, based on an image of the first region other than the common region in the first bird's-eye view image, an image of the second region other than the common region in the second bird's-eye view image, and an image of the common region included in the selected image, a region image of a region consisting of the first region and the second region. - An image processing program for causing a computer to execute: a boundary calculation process of calculating, using first device information including position information of a first imaging device to image a first region including a common region in which a three-dimensional object is located, and second device information including position information of a second imaging device to image a second region including the common region, a boundary position serving as a reference for dividing the common region into a side of the first imaging device and a side of the second imaging device;
a selection process of selecting, as a selected image, based on the boundary position and a position of the three-dimensional object, the bird's-eye view image in which an image of the three-dimensional object is less distorted, out of a first bird's-eye view image, which is obtained by viewpoint conversion of an image of the first region captured by the first imaging device and in which the image of the three-dimensional object is distorted, and a second bird's-eye view image, which is obtained by viewpoint conversion of an image of the second region captured by the second imaging device and in which the image of the three-dimensional object is distorted; and
an image generation process of generating, based on an image of the first region other than the common region in the first bird's-eye view image, an image of the second region other than the common region in the second bird's-eye view image, and an image of the common region included in the selected image, a region image of a region consisting of the first region and the second region.
Priority Applications (4)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
JP2017543843A JP6239205B2 (ja) | 2015-11-06 | 2015-11-06 | 画像処理装置、画像処理方法及び画像処理プログラム |
PCT/JP2015/081358 WO2017077650A1 (ja) | 2015-11-06 | 2015-11-06 | 画像処理装置、画像処理方法及び画像処理プログラム |
US15/756,455 US10417743B2 (en) | 2015-11-06 | 2015-11-06 | Image processing device, image processing method and computer readable medium |
GB1803427.2A GB2556797B (en) | 2015-11-06 | 2015-11-06 | Image processing apparatus, image processing method, and image processing program |
Applications Claiming Priority (1)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
PCT/JP2015/081358 WO2017077650A1 (ja) | 2015-11-06 | 2015-11-06 | 画像処理装置、画像処理方法及び画像処理プログラム |
Publications (1)
Publication Number | Publication Date |
---|---|
WO2017077650A1 true WO2017077650A1 (ja) | 2017-05-11 |
Family
ID=58661799
Family Applications (1)
Application Number | Title | Priority Date | Filing Date |
---|---|---|---|
PCT/JP2015/081358 WO2017077650A1 (ja) | 2015-11-06 | 2015-11-06 | 画像処理装置、画像処理方法及び画像処理プログラム |
Country Status (4)
Country | Link |
---|---|
US (1) | US10417743B2 (ja) |
JP (1) | JP6239205B2 (ja) |
GB (1) | GB2556797B (ja) |
WO (1) | WO2017077650A1 (ja) |
Families Citing this family (5)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
US10957047B2 (en) * | 2017-02-15 | 2021-03-23 | Panasonic Intellectual Property Management Co., Ltd. | Image processing device and image processing method |
US10579067B2 (en) * | 2017-07-20 | 2020-03-03 | Huawei Technologies Co., Ltd. | Method and system for vehicle localization |
GB2586712B (en) | 2018-03-28 | 2021-12-22 | Mitsubishi Electric Corp | Image processing device, image processing method, and image processing program |
US11115623B2 (en) * | 2018-05-07 | 2021-09-07 | Maxim Integrated Products, Inc. | Systems and methods for asymmetric image splitter with line mark memory |
US10380440B1 (en) * | 2018-10-23 | 2019-08-13 | Capital One Services, Llc | Method for determining correct scanning distance using augmented reality and machine learning models |
Citations (5)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
JP2005333565A (ja) * | 2004-05-21 | 2005-12-02 | Auto Network Gijutsu Kenkyusho:Kk | 監視装置 |
JP2008048317A (ja) * | 2006-08-21 | 2008-02-28 | Sanyo Electric Co Ltd | 画像処理装置並びに視界支援装置及び方法 |
WO2010070920A1 (ja) * | 2008-12-19 | 2010-06-24 | パナソニック株式会社 | 車両周囲画像生成装置 |
WO2010137265A1 (ja) * | 2009-05-25 | 2010-12-02 | パナソニック株式会社 | 車両周囲監視装置 |
WO2012096058A1 (ja) * | 2011-01-11 | 2012-07-19 | アイシン精機株式会社 | 画像生成装置 |
Family Cites Families (18)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
US20050031169A1 (en) * | 2003-08-09 | 2005-02-10 | Alan Shulman | Birds eye view virtual imaging for real time composited wide field of view |
JP4830380B2 (ja) | 2005-07-13 | 2011-12-07 | 日産自動車株式会社 | 車両周辺監視装置及び車両周辺監視方法 |
JP4674900B2 (ja) | 2005-10-14 | 2011-04-20 | アルパイン株式会社 | 車載画像表示装置 |
JP4934308B2 (ja) | 2005-10-17 | 2012-05-16 | 三洋電機株式会社 | 運転支援システム |
JP4248570B2 (ja) | 2006-08-21 | 2009-04-02 | 三洋電機株式会社 | 画像処理装置並びに視界支援装置及び方法 |
US7728879B2 (en) | 2006-08-21 | 2010-06-01 | Sanyo Electric Co., Ltd. | Image processor and visual field support device |
JP5053043B2 (ja) | 2007-11-09 | 2012-10-17 | アルパイン株式会社 | 車両周辺画像生成装置および車両周辺画像の歪み補正方法 |
JP5222597B2 (ja) * | 2008-03-19 | 2013-06-26 | 三洋電機株式会社 | 画像処理装置及び方法、運転支援システム、車両 |
JP4214291B1 (ja) | 2008-03-26 | 2009-01-28 | 株式会社ザイナス | 接地点推定装置、接地点推定方法、動線表示システムおよびサーバ |
JP5067632B2 (ja) | 2008-11-28 | 2012-11-07 | アイシン精機株式会社 | 鳥瞰画像生成装置 |
JP2010147523A (ja) | 2008-12-16 | 2010-07-01 | Panasonic Corp | 車両周囲の俯瞰画像生成装置 |
JP5091882B2 (ja) | 2009-01-21 | 2012-12-05 | 三洋電機株式会社 | 画像処理装置並びに視界支援装置及び方法 |
JP5035284B2 (ja) * | 2009-03-25 | 2012-09-26 | 株式会社日本自動車部品総合研究所 | 車両周辺表示装置 |
EP2539197B1 (en) * | 2010-02-26 | 2020-12-16 | Gentex Corporation | Automatic vehicle equipment monitoring, warning, and control system |
JP5699679B2 (ja) | 2011-02-24 | 2015-04-15 | 富士通セミコンダクター株式会社 | 画像処理装置、画像処理システム、及び画像処理方法 |
JP5483120B2 (ja) * | 2011-07-26 | 2014-05-07 | アイシン精機株式会社 | 車両周辺監視システム |
WO2013046593A1 (ja) * | 2011-09-30 | 2013-04-04 | パナソニック株式会社 | 俯瞰画像生成装置、俯瞰画像生成方法、および俯瞰画像生成プログラム |
JP5906696B2 (ja) | 2011-11-30 | 2016-04-20 | アイシン精機株式会社 | 車両周辺撮影装置および車両周辺画像の処理方法 |
-
2015
- 2015-11-06 GB GB1803427.2A patent/GB2556797B/en active Active
- 2015-11-06 JP JP2017543843A patent/JP6239205B2/ja active Active
- 2015-11-06 US US15/756,455 patent/US10417743B2/en active Active
- 2015-11-06 WO PCT/JP2015/081358 patent/WO2017077650A1/ja active Application Filing
Patent Citations (5)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
JP2005333565A (ja) * | 2004-05-21 | 2005-12-02 | Auto Network Gijutsu Kenkyusho:Kk | 監視装置 |
JP2008048317A (ja) * | 2006-08-21 | 2008-02-28 | Sanyo Electric Co Ltd | 画像処理装置並びに視界支援装置及び方法 |
WO2010070920A1 (ja) * | 2008-12-19 | 2010-06-24 | パナソニック株式会社 | 車両周囲画像生成装置 |
WO2010137265A1 (ja) * | 2009-05-25 | 2010-12-02 | パナソニック株式会社 | 車両周囲監視装置 |
WO2012096058A1 (ja) * | 2011-01-11 | 2012-07-19 | アイシン精機株式会社 | 画像生成装置 |
Also Published As
Publication number | Publication date |
---|---|
JP6239205B2 (ja) | 2017-11-29 |
GB2556797A (en) | 2018-06-06 |
GB2556797B (en) | 2018-10-24 |
JPWO2017077650A1 (ja) | 2017-12-07 |
US10417743B2 (en) | 2019-09-17 |
US20180253823A1 (en) | 2018-09-06 |
GB201803427D0 (en) | 2018-04-18 |
Similar Documents
Publication | Publication Date | Title |
---|---|---|
JP6239205B2 (ja) | 画像処理装置、画像処理方法及び画像処理プログラム | |
JP5954668B2 (ja) | 画像処理装置、撮像装置および画像処理方法 | |
US9013559B2 (en) | System, method and program for capturing images from a virtual viewpoint | |
KR102253553B1 (ko) | 사발형 이미징 시스템에서의 물체 가시화 | |
KR101697512B1 (ko) | 영상 정합 장치 및 방법 | |
US11589023B2 (en) | Image processing apparatus, image processing method, and storage medium | |
JP5694300B2 (ja) | 画像処理装置、画像処理方法およびプログラム | |
JP6570296B2 (ja) | 画像処理装置、画像処理方法およびプログラム | |
JP6882868B2 (ja) | 画像処理装置、画像処理方法、システム | |
US20100103264A1 (en) | Vehicle-surroundings displaying method and system | |
CN102714695A (zh) | 图像处理装置、图像处理方法及程序 | |
US11488354B2 (en) | Information processing apparatus and information processing method | |
JP7159384B2 (ja) | 画像処理装置、画像処理方法、及びプログラム | |
US10535193B2 (en) | Image processing apparatus, image synthesizing apparatus, image processing system, image processing method, and storage medium | |
US20180367709A1 (en) | Image processing apparatus, object shape estimation method, and storage medium | |
US20170142384A1 (en) | Image processing apparatus, image processing method, image projection system, and storage medium | |
JP2019504414A (ja) | 車両の周囲の正面の光景を表示する方法及びデバイス並びにそれぞれの車両 | |
JP2006302195A (ja) | 画像処理方法及び画像処理装置 | |
JP5955003B2 (ja) | 画像処理装置および画像処理方法、プログラム | |
KR20110025083A (ko) | 입체 영상 시스템에서 입체 영상 디스플레이 장치 및 방법 | |
JPWO2020012556A1 (ja) | 撮像装置、画像補正方法および画像補正プログラム | |
JP5891751B2 (ja) | 画像間差分装置および画像間差分方法 | |
JP6320165B2 (ja) | 画像処理装置及びその制御方法、並びにプログラム | |
JP6762779B2 (ja) | 画像処理装置、撮像装置、画像処理方法、及びプログラム | |
US20230325969A1 (en) | Image processing apparatus, image processing method, and non-transitory computer readable medium |
Legal Events
Date | Code | Title | Description |
---|---|---|---|
121 | Ep: the epo has been informed by wipo that ep was designated in this application |
Ref document number: 15907836 Country of ref document: EP Kind code of ref document: A1 |
|
ENP | Entry into the national phase |
Ref document number: 2017543843 Country of ref document: JP Kind code of ref document: A |
|
WWE | Wipo information: entry into national phase |
Ref document number: 15756455 Country of ref document: US |
|
ENP | Entry into the national phase |
Ref document number: 201803427 Country of ref document: GB Kind code of ref document: A Free format text: PCT FILING DATE = 20151106 |
|
NENP | Non-entry into the national phase |
Ref country code: DE |
|
122 | Ep: pct application non-entry in european phase |
Ref document number: 15907836 Country of ref document: EP Kind code of ref document: A1 |