WO2016067470A1 - Image processing apparatus and image processing method - Google Patents
Image processing apparatus and image processing method
- Publication number
- WO2016067470A1 (application PCT/JP2014/079130)
- Authority
- WO
- WIPO (PCT)
- Prior art keywords
- image
- overhead
- unit
- subject
- projection plane
- Prior art date
Classifications
-
- H—ELECTRICITY
- H04—ELECTRIC COMMUNICATION TECHNIQUE
- H04N—PICTORIAL COMMUNICATION, e.g. TELEVISION
- H04N5/00—Details of television systems
- H04N5/222—Studio circuitry; Studio devices; Studio equipment
- H04N5/262—Studio circuits, e.g. for mixing, switching-over, change of character of image, other special effects ; Cameras specially adapted for the electronic generation of special effects
- H04N5/2628—Alteration of picture size, shape, position or orientation, e.g. zooming, rotation, rolling, perspective, translation
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06T—IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
- G06T3/00—Geometric image transformations in the plane of the image
- G06T3/04—Context-preserving transformations, e.g. by using an importance map
- G06T3/053—Detail-in-context presentations
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06T—IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
- G06T5/00—Image enhancement or restoration
- G06T5/50—Image enhancement or restoration using two or more images, e.g. averaging or subtraction
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06T—IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
- G06T7/00—Image analysis
- G06T7/30—Determination of transform parameters for the alignment of images, i.e. image registration
- G06T7/33—Determination of transform parameters for the alignment of images, i.e. image registration using feature-based methods
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06T—IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
- G06T7/00—Image analysis
- G06T7/70—Determining position or orientation of objects or cameras
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06V—IMAGE OR VIDEO RECOGNITION OR UNDERSTANDING
- G06V20/00—Scenes; Scene-specific elements
- G06V20/50—Context or environment of the image
- G06V20/56—Context or environment of the image exterior to a vehicle by using sensors mounted on the vehicle
-
- G—PHYSICS
- G08—SIGNALLING
- G08G—TRAFFIC CONTROL SYSTEMS
- G08G1/00—Traffic control systems for road vehicles
- G08G1/09—Arrangements for giving variable traffic instructions
- G08G1/0962—Arrangements for giving variable traffic instructions having an indicator mounted inside the vehicle, e.g. giving voice messages
-
- H—ELECTRICITY
- H04—ELECTRIC COMMUNICATION TECHNIQUE
- H04N—PICTORIAL COMMUNICATION, e.g. TELEVISION
- H04N5/00—Details of television systems
- H04N5/222—Studio circuitry; Studio devices; Studio equipment
- H04N5/262—Studio circuits, e.g. for mixing, switching-over, change of character of image, other special effects ; Cameras specially adapted for the electronic generation of special effects
- H04N5/265—Mixing
-
- H—ELECTRICITY
- H04—ELECTRIC COMMUNICATION TECHNIQUE
- H04N—PICTORIAL COMMUNICATION, e.g. TELEVISION
- H04N7/00—Television systems
- H04N7/18—Closed-circuit television [CCTV] systems, i.e. systems in which the video signal is not broadcast
- H04N7/181—Closed-circuit television [CCTV] systems, i.e. systems in which the video signal is not broadcast for receiving images from a plurality of remote sources
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06T—IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
- G06T2207/00—Indexing scheme for image analysis or image enhancement
- G06T2207/30—Subject of image; Context of image processing
- G06T2207/30244—Camera pose
Definitions
- the present invention relates to an image processing apparatus and an image processing method for creating a bird's-eye view image from a camera image photographed by a photographing apparatus.
- in this prior art, the entire region located farther than the obstacle is a compression target; that is, the entire region is compressed regardless of whether another obstacle exists still farther away.
- because the presence of another, more distant obstacle is not considered, distortion is not suppressed individually for each obstacle.
- moreover, the compression is applied even to the portions of the far region that contain no obstacle, that is, portions that do not need to be compressed, and compressing such portions distorts them.
- the present invention has been made to solve the above-described problems, and an object thereof is to obtain an image processing apparatus and an image processing method capable of suppressing distortion for each object individually.
- the image processing apparatus according to the present invention includes: an image acquisition unit that acquires a captured image captured by an imaging apparatus; an overhead image creation unit that creates an overhead image from the captured image based on imaging apparatus information regarding the imaging apparatus and spatial information regarding the space captured by the imaging apparatus; an object detection unit that detects the position of an object for which no corresponding spatial information exists; a projection plane calculation unit that calculates a projection plane at the object position; an object image creation unit that sets the captured image on the projection plane and creates an object image of the object as viewed from the viewpoint position of the overhead image; and a display image creation unit that combines the object image into the overhead image.
- an object image in which distortion is individually suppressed can be created in the overhead view image.
- FIG. 1 shows the configuration of an image processing apparatus 100 according to Embodiment 1 of the present invention.
- FIG. 1 also shows the photographing devices 1a to 1c and the display device 2.
- the image processing apparatus 100 includes an image acquisition unit 101, an imaging device information storage unit 102, a facility information storage unit 103, a viewpoint position storage unit 104, an overhead image creation unit 105, a difference calculation image storage unit 106, A subject detection unit 107, a projection plane calculation unit 108, a subject image creation unit 109, and a display image creation unit 110 are provided.
- the image acquisition unit 101 acquires camera images (captured images) captured by the image capturing apparatuses 1a to 1c.
- the photographing device information storage unit 102 stores information regarding the photographing devices 1a to 1c as photographing device information.
- the imaging device information includes, for example, the installation positions and orientations of the imaging devices 1a to 1c, the focal length of the lens, and the like.
- the facility information storage unit 103 stores, as facility information (spatial information), information related to a space that is imaged by the imaging devices 1a to 1c and is to be monitored.
- the facility information includes, for example, the size of the space to be monitored, the position and height of the wall, the size and position of a structure that is always installed, and the like.
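As an illustration, the facility information described above might be held in a structure like the following; the field and class names are ours, and the patent does not prescribe any particular format:

```python
from dataclasses import dataclass, field
from typing import List, Tuple

@dataclass
class Wall:
    start: Tuple[float, float]           # footprint endpoints on the ground
    end: Tuple[float, float]
    height: float

@dataclass
class Structure:
    position: Tuple[float, float, float] # placement of a permanently installed structure
    size: Tuple[float, float, float]     # width, depth, height

@dataclass
class FacilityInfo:
    space_size: Tuple[float, float]      # extent of the monitored space
    walls: List[Wall] = field(default_factory=list)
    structures: List[Structure] = field(default_factory=list)
```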
- the viewpoint position storage unit 104 stores the viewpoint position when creating the bird's-eye view image.
- the overhead image creation unit 105 creates an overhead image from the camera image acquired by the image acquisition unit 101 based on the imaging device information and the facility information.
- the difference calculation image storage unit 106 stores camera images captured in advance by the imaging devices 1a to 1c as difference calculation images. Note that the difference calculation image may be periodically acquired and updated.
- the subject detection unit 107 detects the subject position (the position of the object) based on the overhead view image created by the overhead view image creation unit 105 and the difference calculation image.
- the subject in the present invention refers to an object other than an object stored as facility information or an object other than an object shown in a difference calculation image.
- the subject refers to an object in which the image processing apparatus 100 cannot know in advance that it exists in the space monitored by the photographing apparatuses 1a to 1c.
- a person who temporarily passes through the space monitored by the photographing apparatuses 1a to 1c corresponds to the subject.
- the projection plane calculation unit 108 calculates a projection plane that is newly defined at the subject position detected by the subject detection unit 107.
- the subject image creation unit 109 creates a subject image (object image) by setting to project a camera image on the projection plane defined by the projection plane calculation unit 108.
- the display image creation unit 110 creates and outputs an image obtained by combining the overhead view image created by the overhead view image creation unit 105 and the subject image created by the subject image creation unit 109.
- the image acquisition unit 101 includes an interface (for example, a USB port) that acquires camera images from the photographing apparatuses 1a to 1c.
- the imaging device information storage unit 102, the facility information storage unit 103, the viewpoint position storage unit 104, and the difference calculation image storage unit 106 are configured by various storage devices such as a hard disk.
- the overhead image creation unit 105, the subject detection unit 107, the projection plane calculation unit 108, the subject image creation unit 109, and the display image creation unit 110 are configured by, for example, a semiconductor integrated circuit mounted with a CPU (Central Processing Unit).
- the imaging device information storage unit 102, the facility information storage unit 103, the viewpoint position storage unit 104, and the difference calculation image storage unit 106 may also be placed outside the image processing device 100 as storage devices separate from it. In this case, the storage devices and the image processing apparatus 100 are electrically connected.
- the photographing apparatus information storage unit 102, the facility information storage unit 103, the viewpoint position storage unit 104, and the difference calculation image storage unit 106 are stored in an internal memory or an external memory of the computer.
- the photographing devices 1a to 1c are composed of cameras. Note that the number is not limited to the three illustrated.
- the display device 2 displays the image output by the display image creation unit 110.
- the display device 2 is, for example, a liquid crystal display.
- the imaging device information is stored in the imaging device information storage unit 102 (step ST101).
- the viewpoint position is stored in the viewpoint position storage unit 104 (step ST102).
- the facility information is stored in the facility information storage unit 103 (step ST103).
- the difference calculation image is stored in the difference calculation image storage unit 106 (step ST104).
- Steps ST101 to ST104 are processes performed in advance by a user or the like when performing a process of creating a bird's-eye view image from the camera images of the photographing apparatuses 1a to 1c.
- the image acquisition unit 101 acquires camera images captured by the imaging devices 1a to 1c (step ST105).
- the camera image is composed of pixel data and the like.
- the overhead image creation unit 105 creates, from the camera images acquired by the image acquisition unit 101, an overhead image as seen when looking down from the viewpoint position stored in the viewpoint position storage unit 104, based on the imaging device information stored in the imaging device information storage unit 102 and the facility information stored in the facility information storage unit 103 (step ST106).
- first, the bird's-eye view image creation unit 105 converts walls in the space, permanently installed structures, and the like into polygons based on the facility information. As a result, they are defined using the three-dimensional positions of the vertices of the created polygons. Subsequently, based on the intersection of the imaging element plane of the imaging device 1a with the straight line connecting the three-dimensional position of each polygon vertex and the optical center of the imaging device 1a, the overhead image creation unit 105 determines the three-dimensional position D = (dist_x, dist_y, dist_z)^T of the pixel data corresponding to each vertex.
- the three-dimensional position D is a vector.
- the superscript T represents transposition.
- the imaging element plane refers to a plane in which light receiving elements constituting the imaging element are two-dimensionally arranged.
- the three-dimensional position of the pixel data means the three-dimensional position of the light receiving element corresponding to the pixel data.
- u represents a unit vector in the x-axis direction of the imaging element plane
- v represents a unit vector in the y-axis direction of the imaging element plane perpendicular to the x-axis direction.
- I in the formula (2) is a vector representing the center coordinates of the imaging element plane.
- the unit vectors u and v and the center coordinate I are shown in bold in the formula (2).
- the unit vectors u and v, the center coordinate I, and the optical center C in the formula (1) are obtained based on the photographing apparatus information, or may be included in the photographing apparatus information from the start.
- image_x = a ⋅ camera_px_width + camera_px_width / 2 (Expression (4))
- image_y = b ⋅ camera_px_height + camera_px_height / 2 (Expression (5))
- camera_px_width represents the number of light receiving elements arranged in the x-axis direction on the imaging element plane (the number of pixels in the x-axis direction on the camera image).
- camera_px_height in Expression (5) represents the number of light receiving elements arranged in the y-axis direction on the imaging element plane (the number of pixels in the y-axis direction on the camera image).
- by mapping the pixel data of the camera image into the three-dimensional space in this way, a bird's-eye view image can be created by looking down from the viewpoint position stored in the viewpoint position storage unit 104. The flow for creating a bird's-eye view image from the camera images photographed by the photographing devices 1b and 1c is the same as for the photographing device 1a.
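The vertex-to-pixel mapping above (ray/plane intersection, then Expressions (4)-(5)) can be sketched as follows. The function name and argument layout are ours, and we assume the offsets a and b are fractions of the imaging element plane's physical extent, which the excerpt does not state explicitly:

```python
import numpy as np

def pixel_for_vertex(D, C, I, u, v, sensor_w, sensor_h, px_w, px_h):
    """Map a 3D polygon vertex D to camera pixel coordinates.

    C: optical center, I: center of the imaging element plane,
    u, v: unit vectors spanning the plane, sensor_w/sensor_h: physical
    extent of the plane, px_w/px_h: camera_px_width / camera_px_height.
    Geometric arguments are 3-vectors; names are illustrative.
    """
    n = np.cross(u, v)                    # normal of the imaging element plane
    d = D - C                             # ray direction from the optical center
    t = np.dot(I - C, n) / np.dot(d, n)   # ray/plane intersection parameter
    P = C + t * d                         # intersection point on the plane
    a = np.dot(P - I, u) / sensor_w       # fractional offset along x
    b = np.dot(P - I, v) / sensor_h       # fractional offset along y
    image_x = a * px_w + px_w / 2         # Expression (4)
    image_y = b * px_h + px_h / 2         # Expression (5)
    return image_x, image_y
```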
- the case where walls in the space, permanently installed structures, and the like are defined as polygons based on the facility information has been described above.
- step ST106 has been described in detail above.
- the subject detection unit 107 obtains the area where two or more of the three bird's-eye images, created by the bird's-eye view image creation unit 105 from the camera images of the photographing devices 1a to 1c, overlap one another (hereinafter referred to as the superimposed region) (step ST107).
- the overlapping area can be obtained using various known image processing techniques.
- the subject detection unit 107 obtains a position where the subject exists (subject position) in the overlapping region obtained in step ST107 (step ST108).
- a method for obtaining the subject position will be described with reference to FIG.
- the case where the photographing apparatuses 1a and 1b and the subject 3 are in the positional relationship shown in FIG. 3A will be described as an example.
- although the actual subject is a person or the like, for convenience of explanation the subject 3 is described here as being cylindrical.
- the camera images photographed by the photographing apparatuses 1a and 1b based on the positional relationship shown in FIG. 3A are converted into overhead images by the processes in steps ST105 and ST106 described above. Then, the superimposed area of the overhead image is obtained by the process of step ST107 described above.
- the subject detection unit 107 creates an overhead image (difference calculation overhead image) for each of the difference calculation images captured by the imaging devices 1a and 1b among the difference calculation images stored in the difference calculation image storage unit 106.
- the method for creating the overhead image from the difference calculation image is the same as the process in step ST106. Similar to the camera image, the difference calculation image is composed of pixel data and the like.
- the overhead image creation unit 105 may create an overhead image using the difference calculation image. In any case, the subject detection unit 107 or the overhead image creation unit 105 that creates the difference calculation overhead image from the difference calculation image functions as the difference calculation overhead image creation unit.
- the subject detection unit 107 takes, for each photographing apparatus, the difference between the overhead image created from the difference calculation image and the overhead image created by the process of step ST106, and specifies the area where the subject 3 exists in the overhead image (hereinafter referred to as a subject area, which can also be referred to as an object area).
- FIG. 3B shows the subject area 41a specified in the overhead image 4a corresponding to the photographing device 1a, and FIG. 3C shows the subject area 41b specified in the overhead image 4b corresponding to the photographing device 1b.
- FIGS. 3B and 3C also show the overlapping area 42 of the overhead images 4a and 4b obtained by the process of step ST107.
- in the subject areas 41a and 41b, the subject 3 appears stretched, as if it had fallen over onto the ground.
- the subject detection unit 107 obtains the portion where the subject area 41a and the subject area 41b overlap within the overlapping region 42, and specifies it as the subject position 43.
- the subject position 43 substantially corresponds to the contact surface between the subject 3 and the ground.
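The difference-and-intersection procedure of steps ST107-ST108 can be sketched with boolean masks. This is a simplified sketch on grayscale arrays; the function name and the fixed threshold are illustrative, not from the patent:

```python
import numpy as np

def subject_position(bev_a, ref_a, bev_b, ref_b, overlap, thresh=30):
    """Locate the subject position as the intersection of the per-camera
    subject areas inside the superimposed region.

    bev_a/bev_b: overhead images built from the live camera images;
    ref_a/ref_b: overhead images built from the difference calculation
    images; overlap: boolean mask of the superimposed region.
    """
    # Subject area per camera: where the live overhead image differs
    # from the difference-calculation overhead image (areas 41a, 41b).
    area_a = np.abs(bev_a.astype(int) - ref_a.astype(int)) > thresh
    area_b = np.abs(bev_b.astype(int) - ref_b.astype(int)) > thresh
    # Subject position 43: where both areas overlap inside region 42.
    return area_a & area_b & overlap
```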
- the projection plane calculation unit 108 newly defines a virtual projection plane at the subject position obtained by the subject detection unit 107 (step ST109).
- the projection plane is a plane that can be seen from the viewpoint position. As shown in FIG. 4A, the projection plane is defined as a columnar projection plane 5 having a size that covers the subject position 43, for example.
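The columnar projection plane 5 can, for example, be represented as a point group sampled on a vertical cylinder covering the subject position. The sampling densities and parameter names below are illustrative, not the patent's definition:

```python
import numpy as np

def columnar_projection_plane(center_xy, radius, height, n_sides=16, n_rows=8):
    """Sample a point group on a vertical cylinder centred on the
    subject position, sketching the columnar projection plane 5."""
    angles = np.linspace(0.0, 2.0 * np.pi, n_sides, endpoint=False)
    heights = np.linspace(0.0, height, n_rows)
    cx, cy = center_xy
    # One ring of points per height level, stacked into an (N, 3) array.
    return np.array([(cx + radius * np.cos(t), cy + radius * np.sin(t), z)
                     for z in heights for t in angles])
```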
- the subject image creation unit 109 projects a camera image on the projection plane defined by the projection plane calculation unit 108 (step ST110).
- the projection plane defined by the projection plane calculation unit 108 can be defined as a polygon or a point group in the same manner as a wall, structure, etc. in the space included in the facility information. Therefore, as in step ST106, the correspondence between each point constituting the projection plane and the pixel position on the camera image, that is, the correspondence between each point and the pixel data can be obtained.
- the subject image creation unit 109 obtains the correspondence between each point constituting the projection plane and the pixel data, and maps the corresponding pixel data to each point on the projection plane based on that correspondence, so that the camera image is projected onto the projection plane.
- by masking out the pixel data other than that corresponding to the subject area in the bird's-eye view image and excluding it from the mapping onto the projection plane, images other than the subject can be prevented from being included on the projection plane.
- a plurality of camera images are projected on the projection surface.
- the camera images of the photographing apparatuses 1a and 1b are each projected onto the projection plane 5, which is defined as shown in FIG. 4A with respect to the subject position 43 obtained in the overlapping region 42.
- the subject image creation unit 109 creates a subject image based on the camera image projected on the projection plane.
- the subject image is created by converting the camera image projected on the projection plane into an image when viewed from the viewpoint position when creating the overhead view image.
- the display image creation unit 110 outputs to the display device 2 an image obtained by combining the subject image created by the subject image creation unit 109 with the overhead view image created by the overhead view image creation unit 105 in step ST106 (step ST111).
- at this time, an overhead image from which the pixel data in the subject area has been removed may be combined with the subject image. In this way, the risk of the subject being displayed twice can be reduced.
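The compositing in step ST111, including the optional removal of the subject-area pixels from the overhead image, might look like this. A minimal sketch: the helper name and the grayscale/boolean-mask assumption are ours:

```python
import numpy as np

def composite_display_image(bev, subject_img, subject_mask):
    """Combine the overhead image and the subject image.

    Pixels inside subject_mask are taken from the subject image;
    overwriting the subject-area pixels of the overhead image avoids
    displaying the subject twice.
    """
    out = bev.copy()                               # keep the input intact
    out[subject_mask] = subject_img[subject_mask]  # paste subject image
    return out
```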
- if the projection plane does not reflect the actual shape of the subject, the projected camera image may appear blurred when displayed.
- FIG. 4A will be described as an example.
- based on the vectors connecting the optical centers of the photographing apparatuses 1a and 1b to the respective points on the projection plane 5, and the vectors connecting the viewpoint position of the overhead image to those points, the subject image creation unit 109 may project only the camera images, among those of the photographing apparatuses 1a and 1b, that are visible from the viewpoint position of the overhead image.
- the projection plane calculation unit 108 may correct the shape of the projection plane 5 based on the shape of the subject area.
- the projection plane calculation unit 108 may define a plane that faces the photographing apparatus that outputs the camera image as the projection plane.
- FIG. 4B shows a planar projection surface 51 that is defined when only the camera image of the photographing apparatus 1b is projected.
- the projection surface 51 is defined so as to cover, for example, the subject position 43 when projected onto the ground.
- the subject detection unit 107 takes the difference between the overhead image created from the difference calculation image photographed by the photographing device 1a and the overhead image created by the processing in step ST106 from the camera image photographed by the photographing device 1a, and obtains the subject area in the overhead image.
- FIG. 5B shows the subject area 41a of the subject 3 specified in the overhead image 4a corresponding to the photographing apparatus 1a. The subject detection unit 107 then sets the position closest to the position of the photographing apparatus 1a within the subject area 41a as the subject position 44. Alternatively, when the subject area 41a is divided into a plurality of portions, the portion closest to the position of the photographing apparatus 1a may be set as the subject position 44.
- the projection plane defined by the projection plane calculation unit 108 at the subject position 44 is a planar projection plane that faces the photographing apparatus 1a.
- as described above, an image is obtained by combining the overhead image with a subject image created by projecting a camera image onto a projection plane newly defined at the subject position. That is, a subject image in which distortion is individually suppressed can be created in the overhead view image.
- the image processing apparatus 100 that detects a subject as described above does not need to prepare sensors such as an ultrasonic sensor in order to detect the subject, and the configuration can be simplified.
- alternatively, sensors for detecting the subject may be provided and used to detect the subject position; the above effect, namely that a subject image in which distortion is suppressed can be created, is still obtained.
- for example, an infrared camera may be provided, and the subject detection unit 107 may detect the subject position using a camera image taken by the infrared camera.
- processes other than taking the difference may be used as appropriate.
- various methods of detecting the subject position are conceivable, and the subject position may be detected by appropriately using a well-known technique other than those described above.
- the subject detection unit 107 takes, for each photographing apparatus, the difference between the overhead view image created from the camera image and the overhead view image created from the difference calculation image, and detects the overlapping portion of the subject areas as the subject position. In this way, an appropriate subject position can be calculated within the overlapping area of the overhead images created by the overhead image creation unit 105.
- the projection plane calculation unit 108 calculates the projection plane based on the viewpoint position of the overhead view image. By using such a projection plane, it is possible to efficiently create a subject image displaying a subject appropriately.
- Embodiment 2. In Embodiment 1, when one cycle of processing from step ST105 to step ST111 is completed and another cycle is performed, the same viewpoint position is used for creating the overhead image; that is, the viewpoint position of the overhead image is fixed while the image processing apparatus 100 is operating. Embodiment 2 describes an image processing apparatus 200 in which the viewpoint position of the overhead image can be changed.
- FIG. 6 shows the configuration of the image processing apparatus 200.
- the image processing apparatus 200 includes an image acquisition unit 101, an imaging device information storage unit 102, a facility information storage unit 103, a viewpoint position storage unit 104, an overhead image creation unit 105, a difference calculation image storage unit 106, In addition to the subject detection unit 107, the projection plane calculation unit 108, the subject image creation unit 109, and the display image creation unit 110, a viewpoint position change unit 211 is provided.
- the viewpoint position changing unit 211 receives a user operation via an input device (not shown), and changes the viewpoint position stored in the viewpoint position storage unit 104 to a position desired by the user. At this time, for example, the viewpoint position changing unit 211 presents a plurality of viewpoint position candidates so that the user can select one of the candidates. Alternatively, instead of presenting candidates, the user may be able to specify an arbitrary viewpoint position.
- the viewpoint position changing unit 211 is configured by, for example, a semiconductor integrated circuit mounted with a CPU.
- a program describing the processing content of the viewpoint position changing unit 211 is stored in the memory of the computer.
- Components other than the viewpoint position changing unit 211 are denoted by the same reference numerals as those in FIG. 1 and the description thereof is omitted or simplified.
- Steps ST101 to ST104 are the same as those described in the first embodiment.
- the viewpoint position changing unit 211 overwrites and stores the viewpoint position selected or designated by the user in the viewpoint position storage unit 104 (step ST212).
- the processes after step ST212 are performed.
- Steps ST105 to ST111 following step ST212 are the same as those described in the first embodiment.
- when one cycle of processing from step ST105 to step ST111 is completed, the viewpoint position can be changed by the viewpoint position changing unit 211 before the next cycle begins. That is, a viewpoint position desired by the user and input at an arbitrary timing during one cycle is used in the processing of the next cycle. If no change is input, the same viewpoint position is used in the next cycle. In this way, an overhead image can be created from a different viewpoint position for each cycle.
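The per-cycle viewpoint handling of Embodiment 2 can be sketched as follows; the class and method names are illustrative, not the patent's:

```python
class ViewpointStore:
    """A change requested at any time during a cycle takes effect at
    the start of the next cycle (step ST212); otherwise the previous
    viewpoint position is kept."""

    def __init__(self, initial):
        self.position = initial
        self.pending = None

    def request_change(self, position):
        # User input may arrive at an arbitrary timing (viewpoint
        # position changing unit 211 receiving an operation).
        self.pending = position

    def begin_cycle(self):
        # Step ST212: overwrite the stored viewpoint if a change is
        # pending; otherwise keep the previous viewpoint.
        if self.pending is not None:
            self.position = self.pending
            self.pending = None
        return self.position
```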
- in the first execution of step ST212 after steps ST101 to ST104 are completed, that is, in the first cycle, if no user operation to change the viewpoint position has yet been performed, step ST212 is substantially skipped and the process proceeds to step ST105.
- as described above, according to Embodiment 2, providing the viewpoint position changing unit 211 makes it possible, in addition to obtaining the effects shown in Embodiment 1, to perform processing in which the viewpoint position desired by the user is reflected as needed.
- the image processing apparatus and the image processing method according to the present invention can create a subject image in which distortion is individually suppressed in an overhead image. For this reason, for example, it is suitable for use in a monitoring system that has a photographing device and a display device and monitors a facility.
- 1a to 1c photographing device, 2 display device, 3 subject (object), 4a, 4b overhead view image, 5 projection plane, 41a, 41b subject region (object region), 42 superimposed region, 43, 44 subject position (object position) , 51 projection plane, 100 image processing device, 101 image acquisition unit, 102 imaging device information storage unit, 103 facility information storage unit, 104 viewpoint position storage unit, 105 overhead view image creation unit, 106 difference calculation image storage unit, 107 subject Detection unit (object detection unit), 108 projection plane calculation unit, 109 subject image creation unit (object image creation unit), 110 display image creation unit, 200 image processing device, 211 viewpoint position change unit.
Abstract
Description
For example, Patent Document 1 describes a technique for creating a vehicle periphery image (overhead image) in which, of the camera image captured by a camera photographing the direction in which an obstacle was detected, the width of the image of the region located farther from the vehicle than the obstacle is compressed toward the center of the vehicle periphery image. This suppresses the distorted display of three-dimensional obstacles in the overhead image. The obstacle is detected using a sensor such as an ultrasonic sensor.
Embodiment 1.
Fig. 1 shows the configuration of an image processing device 100 according to Embodiment 1 of the present invention. Fig. 1 also shows imaging devices 1a to 1c and a display device 2.
The image processing device 100 includes an image acquisition unit 101, an imaging device information storage unit 102, a facility information storage unit 103, a viewpoint position storage unit 104, an overhead image creation unit 105, a difference calculation image storage unit 106, a subject detection unit 107, a projection plane calculation unit 108, a subject image creation unit 109, and a display image creation unit 110.
The imaging device information storage unit 102 stores information about the imaging devices 1a to 1c as imaging device information. The imaging device information includes, for example, the installation positions and orientations of the imaging devices 1a to 1c and the focal lengths of their lenses.
The facility information storage unit 103 stores, as facility information (spatial information), information about the space that is captured by the imaging devices 1a to 1c and is to be monitored. The facility information includes, for example, the size of the monitored space, the positions and heights of walls, and the sizes and positions of permanently installed structures.
The viewpoint position storage unit 104 stores the viewpoint position used when creating an overhead image.
The difference calculation image storage unit 106 stores camera images captured in advance by the imaging devices 1a to 1c as difference calculation images. The difference calculation images may be acquired and updated periodically.
The subject detection unit 107 detects the subject position (object position) based on the overhead image created by the overhead image creation unit 105 and the difference calculation images. In the present invention, a subject refers to an object other than the objects stored as facility information and other than the objects appearing in the difference calculation images. In other words, a subject is something whose presence in the space monitored by the imaging devices 1a to 1c the image processing device 100 cannot know in advance; for example, a person temporarily passing through the monitored space corresponds to a subject.
The subject image creation unit 109 creates a subject image (object image) by setting the camera image so that it is projected onto the projection plane defined by the projection plane calculation unit 108.
The display image creation unit 110 creates and outputs an image obtained by combining the overhead image created by the overhead image creation unit 105 with the subject image created by the subject image creation unit 109.
The imaging device information storage unit 102, the facility information storage unit 103, the viewpoint position storage unit 104, and the difference calculation image storage unit 106 are implemented by various storage devices, such as hard disks.
The overhead image creation unit 105, the subject detection unit 107, the projection plane calculation unit 108, the subject image creation unit 109, and the display image creation unit 110 are implemented by, for example, a semiconductor integrated circuit on which a CPU (Central Processing Unit) is mounted.
The imaging device information storage unit 102, the facility information storage unit 103, the viewpoint position storage unit 104, and the difference calculation image storage unit 106 may be placed outside the image processing device 100 as storage devices separate from it. In that case, those storage devices and the image processing device 100 are electrically connected.
The display device 2 displays the image output by the display image creation unit 110. The display device 2 is, for example, a liquid crystal display.
Imaging device information is stored in the imaging device information storage unit 102 (step ST101).
Next, the viewpoint position is stored in the viewpoint position storage unit 104 (step ST102).
Next, facility information is stored in the facility information storage unit 103 (step ST103).
Next, difference calculation images are stored in the difference calculation image storage unit 106 (step ST104).
Steps ST101 to ST104 are performed in advance by the user or others before the processing that creates overhead images from the camera images of the imaging devices 1a to 1c.
Next, based on the imaging device information stored in the imaging device information storage unit 102 and the facility information stored in the facility information storage unit 103, the overhead image creation unit 105 creates, from the camera images acquired by the image acquisition unit 101, an overhead image as seen looking down from the viewpoint position stored in the viewpoint position storage unit 104 (step ST106).
First, based on the facility information, the overhead image creation unit 105 converts the walls, permanently installed structures, and other elements in the space into polygons. These elements are thus defined using the three-dimensional positions of the vertices of the created polygons.
Next, for each polygon vertex, the overhead image creation unit 105 obtains the three-dimensional position D = (dist_x, dist_y, dist_z)^T of the corresponding pixel data, based on the intersection of the image sensor plane of the imaging device 1a with the straight line connecting the vertex's three-dimensional position and the optical center of the imaging device 1a. The three-dimensional position D is a vector, and the superscript T denotes transposition. The image sensor plane is the plane in which the light-receiving elements constituting the image sensor are arranged two-dimensionally, and the three-dimensional position of pixel data means the three-dimensional position of the light-receiving element corresponding to that pixel data.
The vertex P, the optical center C, the point Q, and the point Q' are vectors, shown in bold in Expressions (1) and (2).
In Expression (2), u is the unit vector along the x-axis of the image sensor plane, and v is the unit vector along the y-axis of the image sensor plane, perpendicular to that x-axis. I in Expression (2) is the vector representing the center coordinates of the image sensor plane. The unit vectors u and v and the center coordinates I are shown in bold in Expression (2). The unit vectors u and v, the center coordinates I, and the optical center C in Expression (1) are obtained from the imaging device information, or may be included in the imaging device information from the start.
Since the values of a, b, and t are obtained from Expression (3), the three-dimensional position D can be determined.
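As an illustration only (not part of the disclosure), solving for a, b, and t amounts to a 3x3 linear system formed by the line through the optical center C and the vertex P, and the sensor plane spanned by u and v around I. The function name and the use of NumPy are assumptions:

```python
import numpy as np

def sensor_plane_intersection(P, C, I, u, v):
    """Solve C + t*(P - C) = I + a*u + b*v for (a, b, t).

    P: polygon vertex, C: optical center, I: sensor-plane center,
    u, v: sensor-plane basis vectors (all 3-vectors).
    Rearranged as the linear system  a*u + b*v - t*(P - C) = C - I.
    """
    A = np.column_stack([u, v, -(P - C)])
    a, b, t = np.linalg.solve(A, C - I)
    return a, b, t
```

When the line of sight is parallel to the sensor plane, the system is singular and `np.linalg.solve` raises `LinAlgError`; a real implementation would have to handle that case.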
image_x=a×camera_px_width+camera_px_width/2 ・・・(4)
image_y=b×camera_px_height+camera_px_height/2 ・・・(5)
camera_px_width in Expression (4) is the number of light-receiving elements arranged along the x-axis of the image sensor plane (the number of pixels in the x-axis direction of the camera image). Similarly, camera_px_height in Expression (5) is the number of light-receiving elements arranged along the y-axis of the image sensor plane (the number of pixels in the y-axis direction of the camera image).
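Expressions (4) and (5) can be transcribed directly into code; the following sketch is illustrative only, with the function name being an assumption:

```python
def sensor_to_pixel(a, b, camera_px_width, camera_px_height):
    # Expressions (4) and (5): shift the origin from the sensor-plane
    # center to the top-left pixel of the camera image
    image_x = a * camera_px_width + camera_px_width / 2
    image_y = b * camera_px_height + camera_px_height / 2
    return image_x, image_y
```

With a = b = 0 (a ray through the sensor-plane center) this yields the image center, as expected.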
Although the above description defines the walls, permanently installed structures, and other elements in the space as polygons based on the facility information, their surfaces may instead be defined as point clouds. In that case as well, the correspondence between each point and a pixel position on the camera image can be obtained using the method above.
The processing of step ST106 has been described in detail above.
Next, the subject detection unit 107 obtains the position where a subject exists (the subject position) within the superimposed region obtained in step ST107 (step ST108).
The camera images captured by the imaging devices 1a and 1b under the positional relationship shown in Fig. 3(a) are each converted into overhead images by the processing of steps ST105 and ST106 described above. The superimposed region of the overhead images is then obtained by the processing of step ST107 described above.
The processing of step ST108 has been described in detail above.
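The difference-based detection of step ST108 can be sketched as follows. This is illustrative only and not part of the disclosure; the threshold value, the per-channel maximum, and the NumPy representation are assumptions:

```python
import numpy as np

def subject_region(overhead, background, threshold=30):
    # Absolute per-pixel difference between the current overhead image and
    # the overhead image created from the pre-captured difference
    # calculation image; int16 avoids uint8 wrap-around
    diff = np.abs(overhead.astype(np.int16) - background.astype(np.int16))
    return diff.max(axis=-1) > threshold  # boolean mask of the subject region

def subject_position(region_a, region_b):
    # The superimposed portion of the two per-camera subject regions
    # marks where the subject actually stands on the ground
    return region_a & region_b
```

Because the elongated "shadows" of a standing subject point away from each camera, only their overlap (the superimposed portion) is common to both overhead images, which is why the intersection localizes the subject.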
The subject image creation unit 109 obtains the correspondence between each point constituting the projection plane and the pixel data and, based on that correspondence, maps the pixel data corresponding to each point of the projection plane so that the camera image is projected onto the projection plane. At this time, the pixel data other than the pixel data corresponding to the subject region in the overhead image is masked and excluded from the mapping onto the projection plane, which prevents images other than the subject from being included on the projection plane.
The subject image creation unit 109 creates the subject image based on the camera image projected onto the projection plane. The subject image is created by converting the camera image projected onto the projection plane into an image as seen from the viewpoint position used when creating the overhead image.
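The masking step described above, which keeps only the pixel data corresponding to the subject region, can be illustrated as follows. This is a sketch only; representing the mask as an RGBA alpha channel is an assumption, not something stated in the disclosure:

```python
import numpy as np

def masked_subject_layer(camera_image, subject_mask):
    # Copy the camera image into an RGBA layer; pixels outside the
    # subject region become fully transparent (alpha = 0) so only the
    # subject is carried onto the projection plane
    h, w, _ = camera_image.shape
    layer = np.zeros((h, w, 4), dtype=np.uint8)
    layer[..., :3] = camera_image
    layer[..., 3] = np.where(subject_mask, 255, 0)
    return layer
```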
If only one camera image is to be projected onto the projection plane, the projection plane calculation unit 108 may define, as the projection plane, a plane directly facing the imaging device that outputs that camera image. Fig. 4(b) shows a planar projection plane 51 defined when only the camera image of the imaging device 1b is projected. The projection plane 51 is defined so as to cover, for example, the subject position 43 when projected onto the ground.
The subject detection unit 107 then takes, as the subject position 44, the position in the subject region 41a closest to the position of the imaging device 1a. Alternatively, the subject region 41a may be divided into multiple parts, and the part closest to the position of the imaging device 1a may be taken as the subject position 44.
The projection plane that the projection plane calculation unit 108 defines at this subject position 44 is a planar projection plane directly facing the imaging device 1a.
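Selecting, from the subject region, the point closest to the imaging device can be sketched as follows (illustrative only; the grid-cell ground coordinates and the function name are assumptions):

```python
import numpy as np

def nearest_subject_position(region_mask, camera_xy):
    # Ground-plane coordinates of every cell in the subject region
    ys, xs = np.nonzero(region_mask)
    pts = np.stack([xs, ys], axis=1).astype(float)
    # Pick the region point with the smallest distance to the camera
    d = np.linalg.norm(pts - np.asarray(camera_xy, dtype=float), axis=1)
    x, y = pts[np.argmin(d)]
    return int(x), int(y)
```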
However, if there are no particular constraints on providing sensors, sensors for detecting the subject may be provided and used to detect the subject position. In this case as well, the effect described above, namely that a subject image with individually suppressed distortion can be created within the overhead image, can be obtained.
For example, an infrared camera may be provided, and the subject detection unit 107 may detect the subject position using the camera image captured by the infrared camera. In this case, among the processes of the subject detection unit 107 described in step ST108 and elsewhere, the processes other than taking differences may be used as appropriate.
Various ways of detecting the subject position are thus conceivable, and well-known techniques other than those described may be used as appropriate to detect the subject position.
In Embodiment 1, when one cycle of processing from step ST105 to step ST111 ends and the next cycle from step ST105 to step ST111 is performed, the same viewpoint position is used for creating the overhead image. That is, while the image processing device 100 is performing image processing, the viewpoint position of the overhead image is fixed. Embodiment 2 describes an image processing device 200 in which the viewpoint position of the overhead image can be changed.
For the configuration other than the viewpoint position change unit 211, parts identical or equivalent to those in Fig. 1 are given the same reference signs, and their description is omitted or simplified.
Steps ST101 to ST104 are the same as the processing described in Embodiment 1.
Following step ST104, the viewpoint position change unit 211 overwrites the viewpoint position storage unit 104 with the viewpoint position selected or specified by the user (step ST212). The processing from step ST212 onward is performed using the viewpoint position stored in the viewpoint position storage unit 104 at this time.
Steps ST105 to ST111 following step ST212 are the same as the processing described in Embodiment 1.
When the process moves to step ST212 for the first time after steps ST101 to ST104 are completed, that is, in the first cycle, if no user operation to change the viewpoint position has been performed up to that point, step ST212 is substantially skipped and the process proceeds to step ST105.
Claims (4)
- An image processing device comprising:
an image acquisition unit that acquires a captured image captured by an imaging device;
an overhead image creation unit that calculates a correspondence between the pixel data constituting the captured image and the space captured by the imaging device, based on imaging device information about the imaging device and spatial information about that space, and creates an overhead image from the captured image;
an object detection unit that detects the position of an object for which no corresponding spatial information exists;
a projection plane calculation unit that calculates a projection plane at the position of the object;
an object image creation unit that sets the captured image on the projection plane and creates an object image of the object as seen from the viewpoint position of the overhead image; and
a display image creation unit that creates an image by combining the object image with the overhead image. - The image processing device according to claim 1, comprising a difference calculation overhead image creation unit that calculates a correspondence between the pixel data constituting a difference calculation image captured by the imaging device and the space, based on the imaging device information and the spatial information, and creates a difference calculation overhead image from the difference calculation image,
wherein the overhead image creation unit creates an overhead image from each of the captured images captured by a plurality of the imaging devices,
the difference calculation overhead image creation unit creates a difference calculation overhead image from each of the difference calculation images captured by the plurality of imaging devices, and
the object detection unit obtains an object region from the difference between the overhead image and the difference calculation overhead image for each same imaging device, and detects a superimposed portion of the object regions as the position of the object. - The image processing device according to claim 1, wherein the projection plane calculation unit calculates the projection plane based on the viewpoint position of the overhead image.
- An image processing method comprising:
an image acquisition step in which an image acquisition unit acquires a captured image captured by an imaging device;
an overhead image creation step in which an overhead image creation unit calculates a correspondence between the pixel data constituting the captured image and the space captured by the imaging device, based on imaging device information about the imaging device and spatial information about that space, and creates an overhead image from the captured image;
an object detection step in which an object detection unit detects the position of an object for which no corresponding spatial information exists;
a projection plane calculation step in which a projection plane calculation unit calculates a projection plane at the position of the object;
an object image creation step in which an object image creation unit sets the captured image on the projection plane and creates an object image of the object as seen from the viewpoint position of the overhead image; and
a display image creation step in which a display image creation unit creates an image by combining the object image with the overhead image.
Priority Applications (3)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
PCT/JP2014/079130 WO2016067470A1 (ja) | 2014-10-31 | 2014-10-31 | 画像処理装置及び画像処理方法 |
US15/502,879 US10097772B2 (en) | 2014-10-31 | 2014-10-31 | Image processing device and image processing method |
JP2016556171A JP6257798B2 (ja) | 2014-10-31 | 2014-10-31 | 画像処理装置及び画像処理方法 |
Publications (1)
Publication Number | Publication Date |
---|---|
WO2016067470A1 true WO2016067470A1 (ja) | 2016-05-06 |
Family
ID=55856848
Citations (3)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
JP2002112421A (ja) * | 2000-07-25 | 2002-04-12 | Furukawa Electric Co Ltd:The | 架空線弛度監視方法 |
JP2008177856A (ja) * | 2007-01-18 | 2008-07-31 | Sanyo Electric Co Ltd | 俯瞰画像提供装置、車両、および俯瞰画像提供方法 |
WO2013018173A1 (ja) * | 2011-07-29 | 2013-02-07 | 富士通株式会社 | 画像処理装置、画像処理方法および画像処理プログラム |
Family Cites Families (6)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
JP4907883B2 (ja) | 2005-03-09 | 2012-04-04 | 株式会社東芝 | 車両周辺画像表示装置および車両周辺画像表示方法 |
JP4969269B2 (ja) * | 2007-02-21 | 2012-07-04 | アルパイン株式会社 | 画像処理装置 |
JP5053043B2 (ja) | 2007-11-09 | 2012-10-17 | アルパイン株式会社 | 車両周辺画像生成装置および車両周辺画像の歪み補正方法 |
JP2012147149A (ja) * | 2011-01-11 | 2012-08-02 | Aisin Seiki Co Ltd | 画像生成装置 |
EP2720458A4 (en) * | 2011-06-09 | 2015-02-25 | Aisin Seiki | IMAGING DEVICE |
JP5483120B2 (ja) * | 2011-07-26 | 2014-05-07 | アイシン精機株式会社 | 車両周辺監視システム |
Also Published As
Publication number | Publication date |
---|---|
US10097772B2 (en) | 2018-10-09 |
JP6257798B2 (ja) | 2018-01-10 |
JPWO2016067470A1 (ja) | 2017-04-27 |
US20170237909A1 (en) | 2017-08-17 |
Legal Events
- 121 | Ep: the epo has been informed by wipo that ep was designated in this application | Ref document number: 14904733; Country of ref document: EP; Kind code of ref document: A1
- ENP | Entry into the national phase | Ref document number: 2016556171; Country of ref document: JP; Kind code of ref document: A
- WWE | Wipo information: entry into national phase | Ref document number: 15502879; Country of ref document: US
- NENP | Non-entry into the national phase | Ref country code: DE
- 122 | Ep: pct application non-entry in european phase | Ref document number: 14904733; Country of ref document: EP; Kind code of ref document: A1