WO2016038805A1 - Imaging device, imaging system, image processing method, and program - Google Patents

Imaging device, imaging system, image processing method, and program

Info

Publication number
WO2016038805A1
Authority
WO
WIPO (PCT)
Prior art keywords
unit
imaging
image
real space
imaging unit
Prior art date
Application number
PCT/JP2015/004169
Other languages
English (en)
Japanese (ja)
Inventor
翔 池村
山中 睦裕
Original Assignee
Panasonic IP Management Co., Ltd. (パナソニックIPマネジメント株式会社)
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Panasonic IP Management Co., Ltd.
Publication of WO2016038805A1 publication Critical patent/WO2016038805A1/fr

Classifications

    • HELECTRICITY
    • H04ELECTRIC COMMUNICATION TECHNIQUE
    • H04NPICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N23/00Cameras or camera modules comprising electronic image sensors; Control thereof
    • H04N23/60Control of cameras or camera modules
    • GPHYSICS
    • G03PHOTOGRAPHY; CINEMATOGRAPHY; ANALOGOUS TECHNIQUES USING WAVES OTHER THAN OPTICAL WAVES; ELECTROGRAPHY; HOLOGRAPHY
    • G03BAPPARATUS OR ARRANGEMENTS FOR TAKING PHOTOGRAPHS OR FOR PROJECTING OR VIEWING THEM; APPARATUS OR ARRANGEMENTS EMPLOYING ANALOGOUS TECHNIQUES USING WAVES OTHER THAN OPTICAL WAVES; ACCESSORIES THEREFOR
    • G03B15/00Special procedures for taking photographs; Apparatus therefor

Definitions

  • The present invention generally relates to an imaging apparatus, an imaging system, an image processing method, and a program, and more specifically to an imaging apparatus, an imaging system, an image processing method, and a program that perform appropriate image processing on an image generated by an imaging unit.
  • An imaging apparatus described in Document 1 (Japanese Patent Application Publication No. 2013-214947) includes an inclination detection unit that detects an inclination with respect to a vertical direction, conversion data that converts plane coordinate values into spherical coordinate values, and correction means for correcting the conversion data.
  • This imaging apparatus detects the vertical direction and corrects the conversion table used for image processing according to the vertical direction. Thereby, the imaging device can generate an omnidirectional (all-around) image that is correct with respect to the vertical direction, in response to an arbitrary inclination of the device relative to the vertical direction.
  • This imaging device is basically used oriented in the vertical direction, but when it is tilted with respect to the vertical direction, the zenith portion of the captured image may be displaced and the horizon may be distorted. Even in such a case, the imaging device described in Document 1 corrects the conversion data according to the inclination with respect to the vertical direction, so that an image in which distortion of the horizon is suppressed, as if the device were installed without inclination, can be obtained.
  • In Document 1, the orientation of the converted image is uniquely determined with respect to the vertical direction so that the zenith portion of the converted image matches the zenith portion of the real space. Therefore, if the imaging device has a horizontal angle of view of 360 degrees, for example, the same image with a horizontal angle of view of 360 degrees is obtained regardless of whether the imaging device is attached to the ceiling or to a wall. When the imaging device is attached to the wall, only the wall appears in about half of the image, and the image may not be suitable for the user's browsing.
  • The present invention has been made in view of the above, and an object thereof is to provide an imaging apparatus, an imaging system, an image processing method, and a program capable of obtaining an image suitable for a user's browsing according to the direction in which the imaging unit is installed.
  • An imaging apparatus includes: an imaging unit that captures an image; a detection unit that detects an inclination of the imaging unit with respect to a vertical direction of real space; and an image processing unit that performs image processing on the image captured by the imaging unit. The image processing unit includes a selection unit that selects, based on the inclination detected by the detection unit, a projection plane from a plurality of types of projection planes that are virtual planes set in the real space and have different shapes, and a conversion unit that converts the image captured by the imaging unit so that the image is mapped onto the projection plane selected by the selection unit.
  • An imaging system includes: an imaging unit that captures an image; a detection unit that detects an inclination of the imaging unit with respect to a vertical direction of real space; and an image processing unit that acquires data from the imaging unit and the detection unit and performs image processing on the image captured by the imaging unit. The image processing unit includes a selection unit that selects, based on the inclination detected by the detection unit, a projection plane from a plurality of types of projection planes that are virtual planes set in the real space and have different shapes, and a conversion unit that converts the image captured by the imaging unit so that the image is mapped onto the projection plane selected by the selection unit.
  • An image processing method includes selecting, based on an inclination of an imaging unit with respect to a vertical direction of real space, a projection plane from a plurality of types of projection planes that are virtual planes set in the real space and have different shapes, and converting an image captured by the imaging unit so that the image is mapped onto the selected projection plane.
  • A program according to an aspect of the present invention causes a computer connected to an imaging unit that captures an image to function as: a selection unit that selects, based on an inclination of the imaging unit with respect to a vertical direction of real space, a projection plane from a plurality of types of projection planes that are virtual planes set in the real space and have different shapes; and a conversion unit that converts the image captured by the imaging unit so that the image is mapped onto the projection plane selected by the selection unit.
  • FIG. 1A is a perspective view of a state where the imaging apparatus according to Embodiment 1 is attached to a ceiling
  • FIG. 1B is a perspective view of a state where the imaging apparatus according to Embodiment 1 is attached to a wall
  • FIG. 2 is a block diagram illustrating a configuration of the imaging apparatus according to Embodiment 1.
  • FIG. 3A is a side view of the imaging apparatus according to Embodiment 1 installed in a downward direction
  • FIG. 3B is a side view of the imaging apparatus according to Embodiment 1 installed in an upward direction
  • FIG. 3C is related to Embodiment 1.
  • FIG. 4A is a schematic diagram of a converted image in the comparative example
  • FIG. 4B is a schematic diagram of a converted image obtained by the imaging apparatus according to the first embodiment.
  • FIG. 5 is a perspective view showing an imaging apparatus according to Embodiment 2.
  • FIG. 6 is a flowchart illustrating an operation of the imaging apparatus according to the second embodiment.
  • FIG. 7 is a perspective view showing the imaging apparatus according to Embodiment 2.
  • FIG. 8A is a perspective view of a state in which the imaging device according to the modification is installed upward
  • FIG. 8B is a perspective view of a state in which the imaging device according to the modification is installed obliquely upward
  • FIG. 8C is a perspective view of a state in which the imaging device according to the modification is installed obliquely downward.
  • FIG. 9 is a block diagram showing the configuration of an imaging system according to an embodiment.
  • As shown in FIG. 2, the imaging apparatus 1 according to the present embodiment includes an imaging unit 2 that captures an image, and a detection unit 3 that detects the inclination of the imaging unit 2 with respect to the vertical direction V of real space (see FIG. 1A).
  • the image processing unit 4 performs image processing on the image captured by the image capturing unit 2.
  • the image processing unit 4 includes a selection unit 41 and a conversion unit 42 that converts an image captured by the imaging unit 2.
  • The selection unit 41 selects a projection plane, based on the inclination detected by the detection unit 3, from a plurality of types of projection planes 101 and 102, which are virtual planes set in the real space and have different shapes.
  • the conversion unit 42 converts the image (captured by the imaging unit 2) so that the image captured by the imaging unit 2 is mapped to the projection plane selected by the selection unit 41.
  • The imaging apparatus 1 selects which projection plane an image is to be mapped onto, from among a plurality of types of projection planes that are virtual planes set in real space and have different shapes, based on the inclination of the imaging unit 2 with respect to the vertical direction V of the real space. In other words, the imaging device 1 uses the image processing unit 4 to perform image processing on the image captured by the imaging unit 2, converting the image so that it is mapped onto one of the projection planes.
  • the projection plane onto which the image captured by the imaging unit 2 is mapped is automatically selected based on the inclination of the imaging unit 2 with respect to the vertical direction V of the real space. Therefore, the imaging apparatus 1 of the present embodiment has an advantage that an image suitable for the user's browsing can be obtained according to the direction in which the imaging unit 2 is installed.
  • the imaging apparatus 1 of the present embodiment will be described in detail.
  • the imaging device 1 described below is merely an example of the present invention.
  • The present invention is not limited to this embodiment (Embodiment 1) or to Embodiment 2 described later; various changes can be made according to the design and the like as long as they do not depart from the technical idea of the present invention.
  • An example will be described in which the imaging device 1 is attached to the ceiling or wall of a building (for example, a detached house, an apartment house, an office, a store, or a factory) as a monitoring camera.
  • the imaging apparatus 1 is assumed to be a camera that captures a moving image here, but may be a camera (still camera) that captures a still image.
  • the imaging apparatus 1 includes the imaging unit 2, the detection unit 3, and the image processing unit 4 described above in one housing 5.
  • the imaging unit 2 has one imaging element 21 and one optical system 22.
  • the imaging element 21 is a two-dimensional image sensor such as a CCD (Charge-Coupled Device) image sensor or a CMOS (Complementary Metal-Oxide Semiconductor) image sensor.
  • Light from the imaging target is imaged onto the imaging surface (light-receiving surface) of the imaging element 21 by the optical system 22; the imaging element 21 converts it into an electrical signal and outputs it.
  • the optical system 22 includes a fisheye lens (super wide angle lens) having an angle of view of 180 degrees or more.
  • an image having an angle of view of 180 degrees is projected through the optical system 22 onto the imaging surface of the imaging device 21.
  • The image projected on the imaging surface of the imaging element 21 is an orthographic image obtained by projecting an image on the hemisphere, which corresponds to the surface of the fisheye lens, perpendicularly onto the plane forming the bottom surface of the hemisphere. That is, an image picked up by the imaging unit 2 (hereinafter referred to as the "original image") is an image projected onto a hemispherical virtual surface set around the imaging unit 2 (imaging element 21) in real space.
  • A virtual surface in real space onto which a captured image is regarded as being projected is hereinafter referred to as a projection surface; in particular, the projection surface corresponding to the original image (the hemispherical virtual surface described above) is called the original projection plane 100 (see FIG. 1A).
  • An XYZ orthogonal coordinate system defined by the X axis and the Y axis included in the imaging surface of the imaging element 21 and the Z axis corresponding to the optical axis of the imaging unit 2, with its origin at the center of the original image, is taken as a camera coordinate system.
  • the camera coordinate system here is a coordinate system set on the real space with reference to the orientation of the imaging unit 2.
  • the camera coordinate system (XYZ orthogonal coordinate system) is converted to the UV coordinate system in the original image as follows.
  • the U axis is parallel to the X axis
  • the V axis is parallel to the Y axis
  • The camera coordinate system, which is an orthogonal coordinate system, is converted into a polar coordinate system, so that the position of a point P on the original projection plane 100, represented by (x, y, z) in the camera coordinate system, is represented by (θ, φ, R). Here, θ is the angle formed by the optical axis (Z axis) of the imaging unit 2 and the straight line connecting the center of the original image and the point P on the original projection plane 100; φ is the angle formed by the X axis and the straight line connecting the center of the original image and the point obtained by projecting the point P perpendicularly onto the plane forming the bottom of the original projection plane 100; and R is the radius of the original projection plane 100.
  • The projection method of the fisheye lens is not limited to the orthographic projection described above, and may be, for example, an equisolid-angle projection, a stereographic projection, or an equidistant projection.
  • The imaging unit 2 compresses the captured image (original image) by an image compression method such as JPEG, H.264, or H.265, and outputs it to the image processing unit 4 as image data.
  • the optical system 22 is not limited to a fisheye lens, and may be configured using another wide-angle lens or a curved mirror such as an omnidirectional mirror.
  • The detection unit 3 measures at least the magnitude (tilt angle) of the inclination of the imaging unit 2 with respect to the vertical direction V of the real space.
  • An acceleration sensor is used for the detection unit 3; in the present embodiment, a triaxial acceleration sensor is used as an example.
  • The detection unit 3 measures the angle formed by the optical axis (Z axis) of the imaging unit 2 and the vertical direction V of the real space as the first angle θ1, and outputs a value corresponding to the first angle θ1 to the image processing unit 4.
  • That is, the detection unit 3 is configured to include, in the above inclination, the angle formed by the optical axis of the imaging unit 2 and the vertical direction V of the real space, and to detect it.
  • When the optical axis (Z axis) of the imaging unit 2 coincides with the vertical direction V and faces downward (the direction in which gravity acts), the first angle θ1 measured by the detection unit 3 is 0 degrees.
  • When the optical axis coincides with the vertical direction V and faces upward, the first angle θ1 measured by the detection unit 3 is 180 degrees; when the optical axis coincides with the horizontal direction, the first angle θ1 measured by the detection unit 3 is 90 degrees.
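  • With a triaxial acceleration sensor, the first angle θ1 can be derived from the gravity vector measured in camera coordinates. A minimal sketch, assuming a static camera (so the accelerometer reading is pure gravity) and the sign convention above, where gravity along +Z means the optical axis points straight down; the function name is illustrative:

```python
import math

def first_angle_deg(ax: float, ay: float, az: float) -> float:
    """Angle theta1 (degrees) between the optical axis (Z axis) and the
    vertical direction V, from a static accelerometer reading
    (ax, ay, az) in camera coordinates.

    Convention assumed: optical axis straight down -> gravity along +Z
    -> theta1 = 0; straight up -> 180; horizontal -> 90.
    """
    g = math.sqrt(ax * ax + ay * ay + az * az)  # magnitude of gravity
    return math.degrees(math.acos(az / g))
```

In practice the raw reading would be low-pass filtered first, since any motion of the camera adds non-gravitational acceleration.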
  • the image processing unit 4 further includes a determination unit 43, a storage unit 44, and an output unit 45 in addition to the selection unit 41 and the conversion unit 42 described above, as shown in FIG.
  • the image processing unit 4 mainly includes a computer having a CPU (Central Processing Unit) and a memory, such as a microcomputer.
  • the image processing unit 4 causes the computer to function as the selection unit 41 and the conversion unit 42 by executing a program stored in the memory by the CPU.
  • the program may be written in the memory in advance, or may be provided by being recorded through a telecommunication line or in a recording medium such as a memory card.
  • the determination unit 43 determines the orientation of the imaging unit 2 in real space based on the output value of the detection unit 3. In the present embodiment, the determination unit 43 determines whether the imaging unit 2 is “vertical” or “horizontal”.
  • “vertical” is a state in which the optical axis (Z-axis) of the imaging unit 2 is along the vertical direction (upward or downward) V in real space
  • “lateral” is the optical axis (Z-axis) of the imaging unit 2. Is a state along the horizontal direction in real space.
  • The determination unit 43 compares the first angle θ1 measured by the detection unit 3 with a predetermined threshold, and determines "vertical" or "horizontal" based on the comparison result.
  • The threshold value compared with the first angle θ1 is not limited to one value, and may be, for example, two or more values. In the present embodiment, two values, 45 degrees (first threshold) and 135 degrees (second threshold), are set as thresholds. The determination unit 43 determines that the imaging unit 2 is "vertical" when the first angle θ1 is smaller than the first threshold (0° ≤ θ1 < 45°) or larger than the second threshold (135° < θ1 ≤ 180°). On the other hand, when the first angle θ1 is greater than or equal to the first threshold and less than or equal to the second threshold (45° ≤ θ1 ≤ 135°), the determination unit 43 determines that the imaging unit 2 is "horizontal".
  • determination unit 43 is not limited to the two-stage determination of “vertical” and “horizontal”, and may be configured to perform determination of three or more stages.
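  • The two-threshold determination described above can be sketched as follows, using the 45°/135° values given for this embodiment; the function name is illustrative, and a determination of three or more stages would simply add further thresholds:

```python
def determine_orientation(theta1_deg: float,
                          lower: float = 45.0,
                          upper: float = 135.0) -> str:
    """Determination unit 43: classify the mounting orientation from
    the first angle theta1 (degrees) using the two thresholds of this
    embodiment. Angles outside [lower, upper] mean the optical axis is
    near-vertical ("vertical"); angles inside mean near-horizontal."""
    if theta1_deg < lower or theta1_deg > upper:
        return "vertical"    # -> first projection plane 101
    return "horizontal"      # -> second projection plane 102
```

Because a whole band of angles maps to one result, a slight drift of the mounting angle over time does not change the selected projection plane.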
  • the selection unit 41 selects a projection plane based on the determination result of the determination unit 43 from a plurality of types of projection planes having different shapes. That is, the selection unit 41 receives the determination result of the determination unit 43 and selects different projection planes depending on whether the imaging unit 2 is “vertical” or “horizontal”. In the present embodiment, the selection unit 41 selects a projection plane from two types of projection planes, which are a first projection plane 101 (see FIG. 1A) and a second projection plane 102 (see FIG. 1B), which will be described later.
  • The detection unit 3 measures the angle (first angle θ1) formed by the optical axis of the imaging unit 2 and the vertical direction V of the real space.
  • The image processing unit 4 further includes a determination unit 43 that determines the orientation of the imaging unit 2 in the real space based on the first angle θ1 measured by the detection unit 3.
  • the selection unit 41 is configured to select a projection surface from a plurality of types of projection surfaces based on the determination result of the determination unit 43. In other words, the selection unit 41 selects the projection plane through the determination by the determination unit 43 based on the inclination detected by the detection unit 3.
  • The selection unit 41 only needs to select a projection plane based on the inclination detected by the detection unit 3, and is not limited to a configuration that selects the projection plane directly from the detected inclination; the projection plane may be selected indirectly from the inclination detected by the detection unit 3.
  • the selection unit 41 is configured to select, from a plurality of types of projection planes, a projection plane having a shape along the vertical direction V of the real space and curved in the horizontal plane of the real space. Therefore, the selection unit 41 selects the first projection plane 101 if the imaging unit 2 is “vertical”, and selects the second projection plane 102 if the imaging unit 2 is “horizontal”. Specific shapes of the first projection plane 101 and the second projection plane 102 will be described later.
  • the conversion unit 42 converts the original image so that the original image captured by the imaging unit 2 is mapped to the projection plane selected by the selection unit 41. Actually, the conversion unit 42 performs coordinate conversion of the original image using a conversion formula corresponding to the projection plane selected by the selection unit 41. At this time, the conversion unit 42 performs distortion correction and interpolation processing together with conversion of the original image so as to suppress distortion of the converted image (hereinafter referred to as “converted image”).
  • the storage unit 44 stores information corresponding to a plurality of types of projection planes having different shapes in association with the determination result of the determination unit 43 in advance.
  • the storage unit 44 stores the conversion formula used by the conversion unit 42 in association with the determination result of the determination unit 43.
  • the selection unit 41 receives the determination result of the determination unit 43 and reads the conversion formula corresponding to the determination result of the determination unit 43 from the storage unit 44, thereby performing projection after conversion based on the determination result of the determination unit 43. A face can be selected.
  • the output unit 45 is configured to be able to communicate with an external device.
  • the output unit 45 transmits the image after the image processing, that is, the image converted by the conversion unit 42 (converted image) to an external device via a network including a local area network (LAN), for example.
  • the external device is a storage device that stores the converted image, a monitor device that displays the converted image, or the like.
  • the output unit 45 is not limited to the configuration for transmitting the converted image to the external device, but may be configured to write the converted image to a recording medium such as a built-in memory or a memory card.
  • the imaging device 1 configured as described above generates a converted image by performing image processing on the original image obtained by the imaging unit 2 by the image processing unit 4.
  • The image processing unit 4 switches the virtual plane in real space onto which the image is regarded as being projected, that is, the projection plane to which the image is mapped, from the original projection plane 100 to the first projection plane 101 or the second projection plane 102.
  • the image processing unit 4 selects a projection plane based on the inclination of the imaging unit 2 detected by the detection unit 3 from a plurality of types of projection planes having different shapes.
  • the image processing unit 4 selects the first projection plane 101 (see FIG. 1A) based on the inclination of the imaging unit 2 detected by the detection unit 3 if the imaging unit 2 is “vertical”. In the case of “landscape”, the second projection plane 102 (see FIG. 1B) is selected.
  • FIG. 1A illustrates a state in which the imaging device 1 is attached to the ceiling of a building, and the imaging device 1 illustrated in FIG. 1A is in a state in which the optical axis of the imaging unit 2 is directed downward. Accordingly, the first projection plane 101 is set around the central axis in the vertical direction V passing through the imaging unit 2 below the imaging device 1 in real space.
  • the first projection plane 101 is a projection plane having a shape along the vertical direction V of the real space and curved in the horizontal plane of the real space.
  • the imaging apparatus 1 converts the original image by the conversion unit 42 so that the original image captured by the imaging unit 2 is mapped to the first projection plane 101. That is, the conversion unit 42 converts the original image using a conversion formula corresponding to the first projection plane 101 among the conversion formulas stored in the storage unit 44.
  • the converted image output from the output unit 45 is a two-dimensional image obtained by expanding the image projected on the cylindrical first projection plane 101 into a planar shape. Thereby, the imaging apparatus 1 can output a panoramic image having a horizontal angle of view of 360 degrees in the real space from the output unit 45 as a converted image.
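  • One way to realize the conversion to the cylindrical first projection plane 101 is an inverse mapping: for each pixel of the panoramic output, compute the corresponding position in the fisheye original. The sketch below is illustrative only and assumes an ideal orthographic fisheye (image radius r = R·sin θ) centered in the original image, with the camera mounted on the ceiling facing down:

```python
import math

def panorama_to_fisheye(col, row, pano_w, pano_h, cx, cy, R,
                        theta_max=math.pi / 2):
    """Inverse mapping for unwrapping an orthographic fisheye image
    onto a 360-degree cylindrical panorama (optical axis vertical).

    (col, row): pixel in the panorama (row 0 = nearest the optical axis)
    (cx, cy):   fisheye image center; R: fisheye image radius in pixels
    Returns the (u, v) source position in the fisheye original.
    """
    phi = 2.0 * math.pi * col / pano_w   # azimuth around the axis
    theta = theta_max * row / pano_h     # angle from the optical axis
    r = R * math.sin(theta)              # orthographic fisheye model
    u = cx + r * math.cos(phi)
    v = cy + r * math.sin(phi)
    return u, v
```

An actual conversion unit would iterate over all output pixels and sample the original with interpolation, corresponding to the distortion correction and interpolation processing mentioned above.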
  • FIG. 1B shows a state where the imaging device 1 is attached to the wall of the building; the imaging device 1 shown in FIG. 1B is in a state in which the optical axis of the imaging unit 2 is directed in the horizontal direction.
  • the second projection plane 102 is set around the central axis in the vertical direction V passing through the imaging unit 2 in front of the imaging device 1 in real space.
  • the second projection plane 102 is a projection plane having a shape along the vertical direction V of the real space and curved in the horizontal plane of the real space.
  • the imaging apparatus 1 converts the original image by the conversion unit 42 so that the original image captured by the imaging unit 2 is mapped to the second projection plane 102. That is, the conversion unit 42 converts the original image using a conversion formula corresponding to the second projection plane 102 among the conversion formulas stored in the storage unit 44.
  • the converted image output from the output unit 45 is a two-dimensional image obtained by expanding the image projected on the curved second projection plane 102 into a planar shape.
  • the imaging device 1 can output a panoramic image having a horizontal angle of view of about 180 degrees in the real space from the output unit 45 as a converted image.
  • The detection unit 3 may be configured, for example, as a simple switch whose contact opens and closes depending on whether the orientation is "vertical" or "horizontal". In that case, since the output of the detection unit 3 itself switches between "vertical" and "horizontal", the determination unit 43 can be omitted from the image processing unit 4.
  • The detection unit 3 may also be configured to detect the inclination with respect to the vertical direction V by image processing instead of by a sensor.
  • the detection unit 3 extracts a horizontal plane such as a floor from the original image, for example, and specifies the vertical direction V from the normal vector of the horizontal plane.
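  • When the vertical direction V is estimated by image processing in this way, the tilt follows directly from the extracted floor-plane normal. A hedged sketch under the same θ1 convention as above (the plane extraction itself is outside the scope here; the function name is illustrative):

```python
import math

def tilt_from_floor_normal(nx: float, ny: float, nz: float) -> float:
    """Given the normal vector of a horizontal plane (e.g. the floor)
    extracted from the original image, expressed in camera coordinates,
    return the first angle theta1 (degrees) between the optical axis
    (Z axis) and the vertical direction V.

    The floor normal points up (opposite to gravity), so theta1 is
    180 degrees minus the angle between the normal and +Z.
    """
    n = math.sqrt(nx * nx + ny * ny + nz * nz)
    angle_to_z = math.degrees(math.acos(nz / n))
    return 180.0 - angle_to_z
```

For a ceiling-mounted camera facing straight down, the floor normal points back toward the camera (along −Z), giving θ1 = 0, consistent with the sensor-based convention above.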
  • the imaging apparatus 1 may include an input unit that receives an operation input by a user, and may be configured to select a projection plane according to an input from the input unit. In this case, the user can manually select a projection plane different from the projection plane automatically selected by the selection unit 41.
  • As described above, the imaging apparatus 1 selects which projection plane an image is to be mapped onto, from among a plurality of types of projection planes that are virtual planes set in real space and have different shapes, based on the inclination of the imaging unit 2 with respect to the vertical direction V of the real space. In other words, the imaging device 1 uses the image processing unit 4 to perform image processing on the image captured by the imaging unit 2, converting the image so that it is mapped onto one of the projection planes.
  • The projection plane onto which the image captured by the imaging unit 2 is mapped is automatically selected, based on the inclination of the imaging unit 2 with respect to the vertical direction V of the real space, from the first projection plane 101 and the second projection plane 102 having different shapes.
  • the image pickup apparatus 1 performs different image processing (conversion) on the original image based on the inclination of the image pickup unit 2 with respect to the vertical direction V of the real space.
  • the imaging apparatus 1 can obtain an image suitable for the user's browsing according to the direction in which the imaging unit 2 is installed.
  • As in the present embodiment, the selection unit 41 is preferably configured to select, from among the plurality of types of projection surfaces, a projection surface having a shape along the vertical direction V of the real space and curved in the horizontal plane of the real space. According to this configuration, the imaging apparatus 1 can generate a converted image (an image converted by the conversion unit 42) in which the distortion of straight lines along the vertical direction V of the real space is reduced. To explain this point, consider the case where the imaging device 1 captures an image of an object 200 standing upright in the vertical direction V in real space.
  • If the selection unit 41 were to select a projection surface along the horizontal direction of the real space and curved in a vertical plane of the real space, the distortion of the object 200 in the converted image Im1 would be relatively large, as shown in FIG. 4A. In particular, the distortion of the object 200 increases at positions near both ends in the horizontal direction of the converted image Im1.
  • On the other hand, when the selection unit 41 selects a projection surface having a shape along the vertical direction V of the real space and curved in the horizontal plane of the real space, as in the present embodiment, the distortion of the object 200 in the converted image Im2 is reduced (see FIG. 4B). Reducing the distortion of an object 200 standing upright in the vertical direction V in this way is particularly useful for an imaging apparatus 1 whose typical subject is an upright person, such as a surveillance camera installed for crime prevention or the like.
  • The detection unit 3 is preferably configured to include, in the inclination it detects (the inclination of the imaging unit 2 with respect to the vertical direction V of the real space), the angle formed by the optical axis of the imaging unit 2 and the vertical direction V of the real space. According to this configuration, different image processing (conversion) is performed on the original image depending on whether the imaging device 1 is attached to the ceiling or to the wall. Therefore, for example, when the imaging device 1 is attached to a wall, a situation in which only the wall appears in about half of the image can be avoided, and an image suitable for the user's browsing can be obtained.
  • The selection unit 41 is preferably configured to select a projection plane from the plurality of types of projection planes based on a comparison result between the magnitude of the inclination detected by the detection unit 3 and a predetermined threshold value. According to this configuration, even when the magnitudes of the inclination detected by the detection unit 3 differ, the selection unit 41 selects the same projection plane as long as the comparison result between the magnitude of the inclination and the threshold value is the same. Therefore, even when the magnitude of the inclination of the imaging unit 2 with respect to the vertical direction V changes slightly over time, the same image processing continues to be applied to the original image, and frequent switching of the processing can be suppressed.
  • The detection unit 3 detects the angle formed between a reference direction, which is set for the imaging unit 2 and is orthogonal to the optical axis (Z-axis) of the imaging unit 2, and the vertical direction V of the real space, as part of the inclination of the imaging unit 2 with respect to the vertical direction V of the real space.
  • the Y-axis direction of the camera coordinate system is set as the reference direction.
  • The detection unit 3 measures the angle between the Y-axis direction serving as the reference direction and the vertical direction V of the real space as a second angle θ2, and outputs a value corresponding to the second angle θ2 to the image processing unit 4.
  • When the optical axis (Z-axis) of the imaging unit 2 coincides with the horizontal direction (when the first angle θ1 is 90 degrees) and the Y-axis direction of the imaging unit 2 also coincides with the horizontal direction, the second angle θ2 measured by the detection unit 3 is 90 degrees.
  • When the optical axis (Z-axis) of the imaging unit 2 coincides with the horizontal direction and the Y-axis of the imaging unit 2 coincides with the vertical direction V and points downward (in the direction in which gravity acts), the second angle θ2 measured by the detection unit 3 is 0 degrees. Conversely, when the Y-axis of the imaging unit 2 coincides with the vertical direction V and points upward, the second angle θ2 is 180 degrees.
  • The determination unit 43 determines whether the imaging unit 2 is in a "with rotation" state or a "without rotation" state around the optical axis, based on the angle formed between the reference direction detected by the detection unit 3 and the vertical direction V of the real space.
  • Here, the "with rotation" state is a state in which the reference direction is along the horizontal direction of the real space, and the "without rotation" state is a state in which the reference direction is along the vertical direction V of the real space.
  • Specifically, the determination unit 43 compares the second angle θ2 measured by the detection unit 3 with a predetermined threshold value, and determines whether there is rotation based on the comparison result.
  • The threshold value compared with the second angle θ2 is not limited to a single value and may be, for example, two or more values. In the present embodiment, two values, 45 degrees (third threshold value) and 135 degrees (fourth threshold value), are set as thresholds. The determination unit 43 determines "with rotation" when the second angle θ2 is smaller than the third threshold (0° ≤ θ2 < 45°) and when the second angle θ2 is larger than the fourth threshold (135° < θ2 ≤ 180°). On the other hand, when the second angle θ2 is not less than the third threshold and not more than the fourth threshold (45° ≤ θ2 ≤ 135°), the determination unit 43 determines "without rotation".
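The threshold comparison performed by the determination unit 43 can be sketched as a small function. The function name and the degree-valued input are assumptions for illustration; the 45-degree and 135-degree thresholds are those of this embodiment.

```python
THIRD_THRESHOLD_DEG = 45.0    # third threshold value
FOURTH_THRESHOLD_DEG = 135.0  # fourth threshold value

def determine_rotation(theta2_deg: float) -> str:
    """Determination of the rotation state from the second angle theta2.

    "with rotation" when theta2 < 45 or theta2 > 135;
    "without rotation" when 45 <= theta2 <= 135.
    """
    if theta2_deg < THIRD_THRESHOLD_DEG or theta2_deg > FOURTH_THRESHOLD_DEG:
        return "with rotation"
    return "without rotation"
```

Because the decision depends only on which side of the thresholds θ2 falls, small fluctuations of θ2 within one band do not change the result, matching the switching-suppression behavior described earlier.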
  • The selection unit 41 selects a projection plane from the plurality of types of projection planes having mutually different shapes, based on the determination result of the determination unit 43. That is, the selection unit 41 receives the determination result of the determination unit 43 and selects different projection planes depending on whether the imaging unit 2 is "vertical" or "horizontal" and, when "horizontal", whether there is "rotation" or "no rotation" around the optical axis.
  • In the present embodiment, the selection unit 41 selects a projection plane from three types of projection planes: a first projection plane 101 (see FIG. 1A), a second projection plane 102 (see FIG. 1B), and a third projection plane 103 (see FIG. 7) described later.
  • As described above, the detection unit 3 measures the angle (second angle θ2) formed between the reference direction, which is set for the imaging unit 2 and is one direction orthogonal to the optical axis of the imaging unit 2, and the vertical direction V of the real space.
  • The image processing unit 4 further includes a determination unit 43 that determines the orientation of the imaging unit 2 in the real space (whether there is rotation around the optical axis) based on the second angle θ2 measured by the detection unit 3.
  • The selection unit 41 is thus configured to select a projection plane from the plurality of types of projection planes based on the determination result of the determination unit 43. In other words, the selection unit 41 selects the projection plane based on the inclination detected by the detection unit 3, by way of the determination made by the determination unit 43.
  • The selection unit 41 only needs to select a projection plane based on the inclination detected by the detection unit 3; it is not limited to a configuration that selects the projection plane directly from the detected inclination, and may select the projection plane indirectly from the inclination detected by the detection unit 3.
  • In any of these cases, the selection unit 41 is configured to select, from the plurality of types of projection planes, a projection plane having a shape that extends along the vertical direction V of the real space and is curved in the horizontal plane of the real space. The selection unit 41 selects the first projection plane 101 if the imaging unit 2 is "vertical", and selects the second projection plane 102 if the imaging unit 2 is "horizontal" (and "without rotation"). The specific shape of the third projection plane 103 will be described later.
  • the imaging device 1 configured as described above generates a converted image by performing image processing on the original image obtained by the imaging unit 2 by the image processing unit 4.
  • In this image processing, the image processing unit 4 switches the virtual plane in the real space onto which the image is projected, that is, the projection plane onto which the image is mapped, from the original projection plane 100 to the first projection plane 101, the second projection plane 102, or the third projection plane 103.
  • the image processing unit 4 selects a projection plane based on the inclination of the imaging unit 2 detected by the detection unit 3 from a plurality of types of projection planes having different shapes.
  • Specifically, the image processing unit 4 first determines, based on the inclination of the imaging unit 2 detected by the detection unit 3, whether the imaging unit 2 is "vertical" (S1). If the imaging unit 2 is "vertical" (S1: Yes), the first projection plane 101 is selected (S3).
  • If the imaging unit 2 is "horizontal" (S1: No), the image processing unit 4 determines, based on the inclination of the imaging unit 2 detected by the detection unit 3, whether there is "rotation" or "no rotation" around the optical axis (S2). If there is "rotation" (S2: Yes), the third projection plane 103 is selected (S4); if there is "no rotation" (S2: No), the second projection plane 102 is selected (S5).
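The selection flow S1 to S5 can be summarized in a short sketch. The boolean inputs stand in for the determinations made from the detected inclination, and the string return values are placeholders for the projection planes 101 to 103 (both are assumptions for illustration).

```python
def select_projection_plane(is_vertical: bool, with_rotation: bool) -> str:
    # S1: is the imaging unit 2 "vertical"?
    if is_vertical:
        return "first projection plane 101"   # S3
    # S2: "horizontal" -- is there rotation around the optical axis?
    if with_rotation:
        return "third projection plane 103"   # S4
    return "second projection plane 102"      # S5
```

Note that the rotation determination is only consulted in the "horizontal" branch, exactly as in steps S1 and S2 above.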
  • In the imaging device 1, when the imaging unit 2 is "horizontal" and there is "rotation" around the optical axis, the selection unit 41 selects, as shown in FIG. 7, a projection plane (third projection plane) 103 that intersects the optical axis (Z-axis) of the imaging unit 2 and whose cross section perpendicular to the vertical direction V of the real space is an arc.
  • The third projection surface 103 is a curved surface that faces the XY plane including the imaging surface of the imaging unit 2 and is curved so as to approach the XY plane as the distance from the X-axis increases in the Y-axis direction.
  • FIG. 7 illustrates a state in which the imaging device 1 is attached to the wall of a building.
  • the third projection plane 103 is set around the central axis in the vertical direction V passing through the imaging unit 2 in front of the imaging device 1 in real space.
  • the third projection plane 103 is a projection plane having a shape along the vertical direction V of the real space and curved in the horizontal plane of the real space.
  • In this case, the imaging apparatus 1 converts the original image in the conversion unit 42 so that the original image captured by the imaging unit 2 is mapped onto the third projection plane 103. That is, the conversion unit 42 converts the original image using the conversion formula corresponding to the third projection plane 103 among the conversion formulas stored in the storage unit 44.
  • the converted image output from the output unit 45 is a two-dimensional image obtained by expanding the image projected on the curved third projection plane 103 into a planar shape.
  • the imaging device 1 can output a panoramic image having a horizontal angle of view of about 180 degrees in the real space from the output unit 45 as a converted image.
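The development of the curved third projection plane 103 into a flat panoramic image can be sketched as an inverse mapping: each pixel of the converted image corresponds to a horizontal angle phi and an elevation angle psi on the projection plane and is sampled from the original image. The equidistant fisheye model (image radius r = f·theta) used below is an assumption for illustration only; the actual mapping is given by the conversion formulas stored in the storage unit 44.

```python
import math

def panorama_to_source(phi_deg: float, psi_deg: float, f: float) -> tuple:
    """Map a panorama direction (phi: horizontal angle from the optical axis,
    psi: elevation angle) back to coordinates in the original image, assuming
    an equidistant fisheye projection (r = f * theta)."""
    phi = math.radians(phi_deg)
    psi = math.radians(psi_deg)
    # Unit direction vector in camera coordinates (Z along the optical axis).
    x = math.sin(phi) * math.cos(psi)
    y = math.sin(psi)
    z = math.cos(phi) * math.cos(psi)
    theta = math.acos(max(-1.0, min(1.0, z)))  # angle from the optical axis
    r = f * theta                              # equidistant model (assumed)
    alpha = math.atan2(y, x)                   # azimuth in the image plane
    return (r * math.cos(alpha), r * math.sin(alpha))
```

Sweeping phi from -90 to +90 degrees across the flat output grid yields the roughly 180-degree horizontal panoramic view described above.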
  • As described above, in the imaging device 1 of the present embodiment, the detection unit 3 detects the angle formed between the reference direction, which is set for the imaging unit 2 and is one direction orthogonal to the optical axis (Z-axis) of the imaging unit 2, and the vertical direction V of the real space, as part of the inclination of the imaging unit 2 with respect to the vertical direction V of the real space.
  • Thereby, the image processing unit 4 can apply different image processing (conversion) to the original image depending on whether the imaging unit 2 is "vertical" or "horizontal", and whether there is "rotation" or "no rotation" around the optical axis. Therefore, the imaging device 1 of the present embodiment can obtain an image that is still better suited to viewing by the user.
  • The shape of the projection plane selected by the selection unit 41 based on the inclination detected by the detection unit 3 is not limited to the above examples and can be changed as appropriate.
  • For example, in the imaging device 1, the selection unit 41 may select a cylindrical projection surface (fourth projection surface) 104 whose central axis is the optical axis (Z-axis) of the imaging unit 2.
  • the fourth projection plane 104 is set around the central axis in the vertical direction V passing through the imaging unit 2 above the imaging device 1 in real space.
  • Alternatively, in the imaging device 1, the selection unit 41 may select a projection plane (fifth projection plane) 105 that intersects the optical axis (Z-axis) of the imaging unit 2 and whose cross section perpendicular to the X-axis direction (horizontal direction) is an arc.
  • the fifth projection surface 105 is a curved surface that faces the XY plane including the imaging surface of the imaging unit 2 and is curved so as to approach the XY plane as the distance from the X axis increases in the Y axis direction.
  • Similarly, in the imaging device 1, the selection unit 41 may select a projection plane (sixth projection plane) 106 that intersects the optical axis (Z-axis) of the imaging unit 2 and whose cross section perpendicular to the X-axis direction (horizontal direction) is an arc.
  • the sixth projection surface 106 is a curved surface that faces the XY plane including the imaging surface of the imaging unit 2 and is curved so as to approach the XY plane as the distance from the X axis increases in the Y axis direction.
  • This modification can be combined with any of the configurations of the first and second embodiments.
  • Imaging system: Functions similar to those of the imaging apparatus 1 described above can also be realized by an imaging system 10 in which a plurality of units are combined, as shown in FIG. 9.
  • the imaging system 10 shown in FIG. 9 includes an imaging unit 2 that captures an image, a detection unit 3 that detects the inclination of the imaging unit 2 with respect to the vertical direction of the real space, and an image processing unit 40.
  • the image processing unit 40 is configured to acquire data from the imaging unit 2 and the detection unit 3 and to perform image processing on the image captured by the imaging unit 2.
  • the image processing unit 40 includes a selection unit 41 and a conversion unit 42 that converts an image captured by the imaging unit 2.
  • the selection unit 41 selects a projection plane based on the inclination detected by the detection unit 3 from a plurality of types of projection planes that are virtual planes set in the real space and have different shapes.
  • the conversion unit 42 converts the image (captured by the imaging unit 2) so that the image captured by the imaging unit 2 is mapped to the projection plane selected by the selection unit 41.
  • the same components as those of the imaging device 1 are denoted by the same reference numerals and description thereof is omitted as appropriate.
  • the imaging system 10 includes an imaging unit 20 in which the imaging unit 2 and the detection unit 3 are integrated.
  • the housing 51 of the imaging unit 20 is separate from the housing 52 of the image processing unit 40. Therefore, the imaging unit 20 is configured to be able to communicate with the image processing unit 40.
  • the communication method between the imaging unit 20 and the image processing unit 40 may be a wired method or a wireless method. Further, the imaging unit 20 may be connected to the image processing unit 40 via a network such as the Internet.
  • the image processing unit 40 includes an acquisition unit 46, and the acquisition unit 46 acquires the image data of the original image from the imaging unit 2 and the output value of the detection unit 3.
  • With the imaging system 10 described above, the projection plane onto which the image is mapped can be selected, from the plurality of types of projection planes that are virtual planes set in the real space and have mutually different shapes, based on the inclination of the imaging unit 2 with respect to the vertical direction V of the real space. Therefore, the imaging system 10 has the advantage that an image suitable for viewing by the user can be obtained according to the orientation in which the imaging unit 2 is installed.
  • An image processing method applicable to the imaging apparatus 1 and the imaging system 10 described above includes a first step of selecting a projection plane, from a plurality of types of projection planes that are virtual planes set in the real space and have mutually different shapes, based on the inclination of the imaging unit 2 with respect to the vertical direction V of the real space.
  • The image processing method further includes a second step of converting the image captured by the imaging unit 2 so that the image is mapped onto the projection plane selected in the first step.
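The two steps of the image processing method can be sketched as a simple pipeline. The selection rule and the conversion function are placeholders supplied by the caller in this sketch; the concrete rule and conversion formulas are those of the embodiments above.

```python
def image_processing_method(original_image, inclination_deg, select, convert):
    # First step: select a projection plane based on the inclination of the
    # imaging unit with respect to the vertical direction of the real space.
    plane = select(inclination_deg)
    # Second step: convert the image so that it is mapped onto the selected
    # projection plane.
    return convert(original_image, plane)
```

Separating the two steps this way mirrors the division of labor between the selection unit 41 and the conversion unit 42.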
  • In other words, in the first step, the projection plane onto which the image is mapped is selected, from the plurality of types of projection planes that are set in the real space and have mutually different shapes, based on the inclination of the imaging unit 2 with respect to the vertical direction V of the real space.
  • A program applicable to the imaging apparatus 1 and the imaging system 10 causes a computer connected to the imaging unit 2 that captures an image to function as the selection unit 41 and as the conversion unit 42 that converts the image captured by the imaging unit 2.
  • the selection unit 41 selects a projection plane based on the inclination of the imaging unit 2 with respect to the vertical direction of the real space from a plurality of types of projection planes that are virtual planes set in the real space and have different shapes.
  • the conversion unit 42 converts the image (captured by the imaging unit 2) so that the image captured by the imaging unit 2 is mapped to the projection plane selected by the selection unit 41.
  • The program may be written in advance in a memory of the computer, but may also be provided through a telecommunication line or provided by being recorded on a recording medium such as a memory card.
  • The computer here is realized by, for example, the image processing unit 4 of the imaging apparatus 1 or the image processing unit 40 of the imaging system 10 described above.
  • The imaging system 10, the image processing method, and the program can be applied in combination with any of the configurations described in the first embodiment, the second embodiment, and the modification.

Landscapes

  • Engineering & Computer Science (AREA)
  • Multimedia (AREA)
  • Signal Processing (AREA)
  • Physics & Mathematics (AREA)
  • General Physics & Mathematics (AREA)
  • Studio Devices (AREA)

Abstract

Provided are an imaging device, an imaging system, an image processing method, and a program with which an image suitable for viewing by a user can be obtained according to the orientation in which an imaging unit is installed. The imaging device (1) comprises: an imaging unit for capturing an image; a detection unit for detecting the inclination of the imaging unit with respect to the vertical direction (V) in real space; and an image processing unit for applying image processing to the image captured by the imaging unit. The image processing unit has a selection part and a conversion part. The selection part selects a projection plane from among multiple types of projection planes (101, 102) according to the inclination detected by the detection unit, said projection planes being virtual planes set in real space and having mutually different shapes. The conversion part converts the image captured by the imaging unit so that the captured image is mapped onto the projection plane selected by the selection part.
PCT/JP2015/004169 2014-09-09 2015-08-20 Dispositif d'imagerie, système d'imagerie, procédé de traitement d'image, et programme WO2016038805A1 (fr)

Applications Claiming Priority (2)

Application Number Priority Date Filing Date Title
JP2014-183703 2014-09-09
JP2014183703A JP2016058886A (ja) 2014-09-09 2014-09-09 撮像装置、撮像システム、画像処理方法、およびプログラム

Publications (1)

Publication Number Publication Date
WO2016038805A1 true WO2016038805A1 (fr) 2016-03-17

Family

ID=55458580

Family Applications (1)

Application Number Title Priority Date Filing Date
PCT/JP2015/004169 WO2016038805A1 (fr) 2014-09-09 2015-08-20 Dispositif d'imagerie, système d'imagerie, procédé de traitement d'image, et programme

Country Status (2)

Country Link
JP (1) JP2016058886A (fr)
WO (1) WO2016038805A1 (fr)

Cited By (1)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
FR3088754A1 (fr) * 2018-11-15 2020-05-22 Renault S.A. Methode de creation d’une vue a partir d’une image capturee par une camera grand angle inclinee

Citations (2)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
JP2009015313A (ja) * 2007-06-05 2009-01-22 Yamaha Corp カメラシステム
JP2013214947A (ja) * 2012-03-09 2013-10-17 Ricoh Co Ltd 撮像装置、撮像システム、画像処理方法、情報処理装置、及びプログラム

Patent Citations (2)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
JP2009015313A (ja) * 2007-06-05 2009-01-22 Yamaha Corp カメラシステム
JP2013214947A (ja) * 2012-03-09 2013-10-17 Ricoh Co Ltd 撮像装置、撮像システム、画像処理方法、情報処理装置、及びプログラム

Cited By (1)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
FR3088754A1 (fr) * 2018-11-15 2020-05-22 Renault S.A. Methode de creation d’une vue a partir d’une image capturee par une camera grand angle inclinee

Also Published As

Publication number Publication date
JP2016058886A (ja) 2016-04-21

Similar Documents

Publication Publication Date Title
US10805531B2 (en) Image processing system, image generation apparatus, and image generation method
KR101657039B1 (ko) 화상 처리 장치, 화상 처리 방법, 및 촬상 시스템
JP6927382B2 (ja) 撮像システム、方法、プログラム、動画表示装置および画像処理装置。
JP6919334B2 (ja) 画像処理装置、画像処理方法、プログラム
JP5136060B2 (ja) 画像処理装置、画像処理方法、そのプログラム及びそのプログラムを記録した記録媒体と撮像装置
JP4098808B2 (ja) 遠隔映像表示方法、映像取得装置及びその方法とそのプログラム
US20160269632A1 (en) Image processing system and image processing method
JP6677098B2 (ja) 全天球動画の撮影システム、及びプログラム
JP2017208619A (ja) 画像処理装置、画像処理方法、プログラム及び撮像システム
JP6003135B2 (ja) 画像処理装置、画像処理方法及び撮像装置
US10897573B2 (en) Image capturing system, terminal and computer readable medium which correct images
JP5857269B2 (ja) 画像生成装置、画像生成方法及びプログラム
KR20180129667A (ko) 표시 제어 장치, 표시 제어 방법 및 저장 매체
JP2015046044A (ja) 画像処理装置、画像処理方法、プログラムおよび撮像システム
KR20200087816A (ko) 화상 처리 장치, 화상 처리 시스템, 화상 처리 방법 및 기록 매체
JP6724659B2 (ja) 撮影装置、方法およびプログラム
WO2016038805A1 (fr) Dispositif d'imagerie, système d'imagerie, procédé de traitement d'image, et programme
JP6152991B2 (ja) 画像生成装置、カメラ装置、画像表示装置及び画像生成方法
JP6256513B2 (ja) 撮像システム、撮像装置、方法およびプログラム
JP5955114B2 (ja) 撮像装置、その制御方法およびプログラム
WO2017057426A1 (fr) Dispositif de projection, dispositif de détermination de contenu, procédé de projection, et programme
JP2013012930A (ja) 全方位撮像装置、及びその制御方法
JP6769357B2 (ja) 画像処理装置、画像処理方法および撮像装置
JP6213640B2 (ja) 画像処理装置、画像処理方法、撮像装置、及びシステム
KR20140030585A (ko) 초광각 카메라를 이용한 카메라 시스템

Legal Events

Date Code Title Description
121 Ep: the epo has been informed by wipo that ep was designated in this application

Ref document number: 15840924

Country of ref document: EP

Kind code of ref document: A1

NENP Non-entry into the national phase

Ref country code: DE

122 Ep: pct application non-entry in european phase

Ref document number: 15840924

Country of ref document: EP

Kind code of ref document: A1