CN113330275A - Camera information calculation device, camera information calculation system, camera information calculation method, and program - Google Patents

Camera information calculation device, camera information calculation system, camera information calculation method, and program

Info

Publication number
CN113330275A
Authority
CN
China
Prior art keywords
camera
information
coordinate system
unit
marker
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Granted
Application number
CN201980089997.3A
Other languages
Chinese (zh)
Other versions
CN113330275B (en)
Inventor
田上祐也
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Sothink Corp
Original Assignee
Sothink Corp
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Sothink Corp filed Critical Sothink Corp
Publication of CN113330275A publication Critical patent/CN113330275A/en
Application granted granted Critical
Publication of CN113330275B publication Critical patent/CN113330275B/en
Active legal-status Critical Current
Anticipated expiration legal-status Critical

Classifications

    • G PHYSICS
    • G01 MEASURING; TESTING
    • G01B MEASURING LENGTH, THICKNESS OR SIMILAR LINEAR DIMENSIONS; MEASURING ANGLES; MEASURING AREAS; MEASURING IRREGULARITIES OF SURFACES OR CONTOURS
    • G01B 11/00 Measuring arrangements characterised by the use of optical techniques
    • G01B 11/02 Measuring arrangements characterised by the use of optical techniques for measuring length, width or thickness
    • G01B 11/03 Measuring arrangements characterised by the use of optical techniques for measuring length, width or thickness by measuring coordinates of points
    • G PHYSICS
    • G03 PHOTOGRAPHY; CINEMATOGRAPHY; ANALOGOUS TECHNIQUES USING WAVES OTHER THAN OPTICAL WAVES; ELECTROGRAPHY; HOLOGRAPHY
    • G03B APPARATUS OR ARRANGEMENTS FOR TAKING PHOTOGRAPHS OR FOR PROJECTING OR VIEWING THEM; APPARATUS OR ARRANGEMENTS EMPLOYING ANALOGOUS TECHNIQUES USING WAVES OTHER THAN OPTICAL WAVES; ACCESSORIES THEREFOR
    • G03B 37/00 Panoramic or wide-screen photography; Photographing extended surfaces, e.g. for surveying; Photographing internal surfaces, e.g. of pipe
    • G03B 37/04 Panoramic or wide-screen photography; Photographing extended surfaces, e.g. for surveying; Photographing internal surfaces, e.g. of pipe with cameras or projectors providing touching or overlapping fields of view
    • G PHYSICS
    • G03 PHOTOGRAPHY; CINEMATOGRAPHY; ANALOGOUS TECHNIQUES USING WAVES OTHER THAN OPTICAL WAVES; ELECTROGRAPHY; HOLOGRAPHY
    • G03B APPARATUS OR ARRANGEMENTS FOR TAKING PHOTOGRAPHS OR FOR PROJECTING OR VIEWING THEM; APPARATUS OR ARRANGEMENTS EMPLOYING ANALOGOUS TECHNIQUES USING WAVES OTHER THAN OPTICAL WAVES; ACCESSORIES THEREFOR
    • G03B 15/00 Special procedures for taking photographs; Apparatus therefor
    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06T IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T 7/00 Image analysis
    • G06T 7/70 Determining position or orientation of objects or cameras
    • G06T 7/73 Determining position or orientation of objects or cameras using feature-based methods
    • G06T 7/74 Determining position or orientation of objects or cameras using feature-based methods involving reference images or patches
    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06T IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T 7/00 Image analysis
    • G06T 7/80 Analysis of captured images to determine intrinsic or extrinsic camera parameters, i.e. camera calibration
    • H ELECTRICITY
    • H04 ELECTRIC COMMUNICATION TECHNIQUE
    • H04N PICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N 23/00 Cameras or camera modules comprising electronic image sensors; Control thereof
    • H04N 23/60 Control of cameras or camera modules
    • H ELECTRICITY
    • H04 ELECTRIC COMMUNICATION TECHNIQUE
    • H04N PICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N 23/00 Cameras or camera modules comprising electronic image sensors; Control thereof
    • H04N 23/60 Control of cameras or camera modules
    • H04N 23/66 Remote control of cameras or camera parts, e.g. by remote control devices
    • H04N 23/661 Transmitting camera control signals through networks, e.g. control via the Internet
    • H ELECTRICITY
    • H04 ELECTRIC COMMUNICATION TECHNIQUE
    • H04N PICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N 23/00 Cameras or camera modules comprising electronic image sensors; Control thereof
    • H04N 23/90 Arrangement of cameras or camera modules, e.g. multiple cameras in TV studios or sports stadiums
    • H ELECTRICITY
    • H04 ELECTRIC COMMUNICATION TECHNIQUE
    • H04N PICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N 25/00 Circuitry of solid-state image sensors [SSIS]; Control thereof
    • H04N 25/60 Noise processing, e.g. detecting, correcting, reducing or removing noise
    • H04N 25/61 Noise processing, e.g. detecting, correcting, reducing or removing noise the noise originating only from the lens unit, e.g. flare, shading, vignetting or "cos4"
    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06T IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T 2207/00 Indexing scheme for image analysis or image enhancement
    • G06T 2207/30 Subject of image; Context of image processing
    • G06T 2207/30204 Marker
    • G06T 2207/30208 Marker matrix
    • H ELECTRICITY
    • H04 ELECTRIC COMMUNICATION TECHNIQUE
    • H04N PICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N 2201/00 Indexing scheme relating to scanning, transmission or reproduction of documents or the like, and to details thereof
    • H04N 2201/32 Circuits or arrangements for control or supervision between transmitter and receiver or between image input and image output device, e.g. between a still-image camera and its memory or between a still-image camera and a printer device
    • H04N 2201/3201 Display, printing, storage or transmission of additional information, e.g. ID code, date and time or title
    • H04N 2201/3204 Display, printing, storage or transmission of additional information, e.g. ID code, date and time or title of data relating to a user, sender, addressee, machine or electronic recording medium
    • H04N 2201/3205 Display, printing, storage or transmission of additional information, e.g. ID code, date and time or title of data relating to a user, sender, addressee, machine or electronic recording medium of identification information, e.g. name or ID code
    • H ELECTRICITY
    • H04 ELECTRIC COMMUNICATION TECHNIQUE
    • H04N PICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N 2201/00 Indexing scheme relating to scanning, transmission or reproduction of documents or the like, and to details thereof
    • H04N 2201/32 Circuits or arrangements for control or supervision between transmitter and receiver or between image input and image output device, e.g. between a still-image camera and its memory or between a still-image camera and a printer device
    • H04N 2201/3201 Display, printing, storage or transmission of additional information, e.g. ID code, date and time or title
    • H04N 2201/3225 Display, printing, storage or transmission of additional information, e.g. ID code, date and time or title of data relating to an image, a page or a document
    • H04N 2201/3253 Position information, e.g. geographical position at time of capture, GPS data

Landscapes

  • Engineering & Computer Science (AREA)
  • Physics & Mathematics (AREA)
  • General Physics & Mathematics (AREA)
  • Multimedia (AREA)
  • Signal Processing (AREA)
  • Computer Vision & Pattern Recognition (AREA)
  • Theoretical Computer Science (AREA)
  • Studio Devices (AREA)
  • Image Analysis (AREA)
  • Length Measuring Devices By Optical Means (AREA)

Abstract

In one embodiment, a camera information calculation device includes a first calculation unit and a second calculation unit. The first calculation unit calculates first camera position information indicating a position of a first camera based on a first image including a first subject captured by the first camera and a first moving image including the first subject captured by a third camera. The second calculation unit calculates second camera position information indicating a position of a second camera, which is disposed separately from the first camera, based on a second image including a second subject captured by the second camera and a second moving image including the second subject captured by the third camera.

Description

Camera information calculation device, camera information calculation system, camera information calculation method, and program
Technical Field
The invention relates to a camera information calculation device, a camera information calculation system, a camera information calculation method, and a camera information calculation program.
Background
Conventionally, devices are known that, when performing image analysis using a camera, calculate camera information capable of specifying the position of the camera by using a marker (see, for example, Patent Documents 1 and 2).
Patent Document 1: Japanese Patent Laid-Open Publication No. 2004-279049
Patent Document 2: Japanese Patent Laid-Open Publication No. 2013-127783
Disclosure of Invention
However, when a plurality of cameras are installed and the positional relationship between the cameras is required, a common marker conventionally has to be included in the image captured by each camera. For example, when the distance between the cameras is large and their imaging areas do not overlap, a common marker cannot be captured in the images of the respective cameras, so it is difficult to obtain the positional relationship of the cameras.
In one aspect, an object of the present invention is to provide a camera information calculation device, a camera information calculation system, a camera information calculation method, and a program capable of determining the positional relationship of a plurality of separately installed cameras.
In one aspect, a camera information calculation device disclosed in the present application includes: a first calculation unit that calculates first camera position information indicating a position of a first camera based on a first image including a first subject captured by the first camera and a first moving image including the first subject captured by a third camera; and a second calculation unit that calculates second camera position information indicating a position of a second camera based on a second image including a second subject captured by the second camera, which is disposed separately from the first camera, and a second moving image including the second subject captured by the third camera.
According to one embodiment of the camera information calculation device disclosed in the present application, the positional relationship of a plurality of cameras provided separately can be obtained.
Drawings
Fig. 1 is a diagram showing an example of a system configuration according to an embodiment.
Fig. 2 is a plan view of a store to which the system of the embodiment is applied.
Fig. 3 is a plan view of a store to which the system of the embodiment is applied.
Fig. 4 is a diagram showing an example of a basic hardware configuration of the first camera, the second camera, and the mobile camera according to the embodiment.
Fig. 5 is a diagram showing an example of functions of the first camera according to the embodiment.
Fig. 6 is a diagram showing an example of functions of the second camera according to the embodiment.
Fig. 7 is a diagram showing an example of functions of the mobile camera according to the embodiment.
Fig. 8 is a diagram showing a detailed example of functions of the mobile camera according to the embodiment.
Fig. 9 is a diagram showing an example of functions of the calculation unit according to the embodiment.
Fig. 10 is a diagram showing an example of functions of the marker information conversion unit according to the embodiment.
Fig. 11 is a diagram showing an example of functions of the spatial feature information conversion unit according to the embodiment.
Fig. 12 is a diagram showing an example of functions of the marker information/spatial feature information conversion unit according to the embodiment.
Fig. 13 is a flowchart showing an operation example of the mobile camera according to the embodiment.
Fig. 14 is a diagram showing an example of the configuration of a system according to a modification.
Fig. 15 is a diagram showing an example of a hardware configuration of a server according to a modification.
Fig. 16 is a diagram showing an example of functions of a server according to a modification.
Detailed Description
Embodiments of a camera information calculation device, a camera information calculation system, a camera information calculation method, and a camera information calculation program disclosed in the present application will be described below in detail with reference to the accompanying drawings.
Fig. 1 is a diagram showing an example of the configuration of a system 1 according to the present embodiment. As shown in fig. 1, the system 1 of the present embodiment includes a data server 10, a first camera 20, a second camera 30, and a mobile camera 40. The first camera 20 and the second camera 30 are cameras that can be set in any position and orientation. The data server 10 is connected to be able to communicate with each of the first camera 20, the second camera 30, and the mobile camera 40 via a network 50 such as the internet, for example.
Fig. 2 is a plan view of a place, for example, a store, to which the system 1 of the present embodiment is applied. In the example of fig. 2, a first mark 70 and a second mark 80 having mutually different patterns are provided on the floor (ground) 60 of the store. In this example, the first mark 70 and the second mark 80 are each a flat plate-like member having a pattern of an 8 × 8 rectangular matrix. In the following description, the first mark 70 and the second mark 80 may be simply referred to as "marks" when they are not distinguished from each other.
The first mark 70 is an example of a "first subject" included in the image captured by the first camera 20, and has a first pattern corresponding to the first identification information. For example, the first identification information may be an ID indicating "1" and the first pattern may be a pattern indicating ID "1".
The first mark 70 is provided in the shooting area of the first camera 20 (first shooting area), and therefore the first mark 70 is included in the image shot by the first camera 20. In the following description, an image captured by the first camera 20 is sometimes referred to as a "first image". That is, the first camera 20 captures a first image containing the first marker 70.
The second mark 80 is an example of a "second subject" included in the image captured by the second camera 30, and has a second pattern corresponding to the second identification information. For example, the second identification information may be an ID indicating "2" and the second pattern may be a pattern indicating ID "2".
Since the second marker 80 is disposed in the imaging region (second imaging region) of the second camera 30, the second marker 80 is included in the image captured by the second camera 30. In the following description, an image captured by the second camera 30 is sometimes referred to as a "second image". That is, the second camera 30 captures a second image containing the second marker 80.
Here, the first camera 20 and the second camera 30 are disposed separately from each other, at positions where a common marker cannot be captured in their respective images. In this example, the first camera 20 and the second camera 30 are separated by at least a certain distance and are disposed so that their imaging areas do not overlap. However, this is not limiting; for example, the first camera 20 and the second camera 30 may be disposed so that, although their imaging areas partially overlap, a common marker cannot be captured, or the pattern of a common marker cannot be recognized, in their respective images.
In the example of fig. 2, the moving camera 40, which performs imaging while moving on the floor 60, is located at a position where at least a part of the first imaging area can be imaged, i.e., a position where the first mark 70 can be photographed. The moving camera 40 can perform imaging while moving within the range 90 shown in fig. 2, for example, and can also move to a position where at least a part of the second imaging area can be imaged, as shown in fig. 3, i.e., a position where the second mark 80 can be photographed. That is, the moving camera 40 is a camera that can move between a position where at least a part of the first imaging area can be imaged and a position where at least a part of the second imaging area can be imaged. The moving camera 40 can capture continuously, or capture a plurality of times so that the resulting imaging areas overlap. The moving camera 40 is an example of a "third camera" and captures a first moving image containing the first marker 70 and a second moving image containing the second marker 80. However, the "third camera" is not limited to the moving camera 40. In this example, at least a part of the imaging area of the first moving image overlaps at least a part of the imaging area of the second moving image, but the present invention is not limited thereto.
The mobile camera 40 may be, for example, a portable device with a camera (e.g., a smartphone, a tablet, a drone, etc.). In the present embodiment, the case where the mobile camera 40 is a camera-equipped terminal held by a user who moves on the floor 60 of the store is described as an example, but the present invention is not limited thereto.
Fig. 4 is a diagram showing an example of a basic hardware configuration of the first camera 20, the second camera 30, and the mobile camera 40. In the example of fig. 4, the hardware elements that are minimally required are illustrated, but the present invention is not limited to this, and the first camera 20, the second camera 30, and the moving camera 40 may be configured to include other hardware elements (for example, an input device, a display device, and the like).
As shown in fig. 4, the first camera 20, the second camera 30, and the moving camera 40 each include: an optical system 101 such as a lens, an imaging element 102, a CPU (Central Processing Unit) 103, a storage device 104, and a communication I/F unit 105. The optical system 101, the imaging element 102, the CPU103, the storage device 104, and the communication I/F unit 105 are connected to each other via a bus 106.
The imaging element 102 is an element (an element that performs imaging) that converts an image of a subject formed by the optical system 101, such as a lens, into an electric signal. The CPU103 corresponds to an example of a hardware processor. The CPU103 controls the operation of the apparatus (any one of the first camera 20, the second camera 30, and the moving camera 40). The CPU103 executes a program stored in the storage device 104 to realize various functions of the apparatus. The various functions of the first camera 20, the second camera 30, and the mobile camera 40 will be described later.
The storage device 104 stores various data such as programs. For example, the storage device 104 includes a ROM (Read Only Memory), which is a nonvolatile memory storing programs, and a RAM (Random Access Memory), which is a volatile memory providing a working area for the CPU103. The communication I/F unit 105 is an interface for connecting to the network 50.
Fig. 5 is a diagram showing an example of functions of the first camera 20. In the example of fig. 5, the functions related to the present embodiment are mainly illustrated, but the functions of the first camera 20 are not limited to this. As shown in fig. 5, the first camera 20 includes an image acquisition unit 201, a point/line feature detection unit 202, an internal parameter calculation unit 203, a first mark detection unit 204, an external parameter calculation unit 205, a spatial feature detection unit 206, a lens distortion information calculation unit 250, a line/arc feature detection unit 251, a mark detection unit 252, and a mark information storage unit 253.
The image acquisition section 201 acquires a first image captured by the first camera 20. Specifically, the image acquisition unit 201 acquires a first image captured by the imaging element 102.
The lens distortion information calculation unit 250 analyzes the first image acquired by the image acquisition unit 201, and the line segment/circular arc feature detection unit 251 detects line segment data (straight lines or circular arcs) from it. The lens distortion information calculation unit 250 calculates lens distortion information of the first camera 20 based on the line segment data detected by the line segment/circular arc feature detection unit 251. Alternatively, the mark detection unit 252 may detect a marker based on the line segment data, and the lens distortion information calculation unit 250 may calculate the lens distortion information using the shape and size information of the marker detected by the mark detection unit 252. The lens distortion model is expressed by a higher-order formula using distortion coefficients in the radial direction and in the circumferential direction of the lens, and the lens distortion information is represented by these coefficients. The lens distortion information calculation unit 250 transmits the calculated lens distortion information to the internal parameter calculation unit 203, and sends the lens distortion information and an image from which the distortion of the first image has been removed to the point/line segment feature detection unit 202. When the lens distortion information calculation unit 250 cannot generate an image from which the distortion has been removed, it sends the lens distortion information and the first image as-is to the point/line segment feature detection unit 202. In this example, the lens distortion information is calculated using the first image captured by the first camera 20, but the present invention is not limited to this, and known lens distortion information may be provided in advance.
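For reference, one standard distortion model that is consistent with the description above, with radial coefficients $k_1, k_2, k_3$ and circumferential (tangential) coefficients $p_1, p_2$ (the symbols here are generic and are not taken from the patent), maps an undistorted normalized image point $(x, y)$ to a distorted point $(x_d, y_d)$ as follows:

$$
\begin{aligned}
x_d &= x\,(1 + k_1 r^2 + k_2 r^4 + k_3 r^6) + 2 p_1 x y + p_2 (r^2 + 2x^2) \\
y_d &= y\,(1 + k_1 r^2 + k_2 r^4 + k_3 r^6) + p_1 (r^2 + 2y^2) + 2 p_2 x y
\end{aligned}
$$

where $r^2 = x^2 + y^2$. In such a model, the lens distortion information corresponds to the coefficient set $(k_1, k_2, k_3, p_1, p_2)$.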
The point/line segment feature detection unit 202 analyzes the first image, or the image from which the distortion of the first image has been removed, using the lens distortion information calculated by the lens distortion information calculation unit 250. More specifically, the point/line segment feature detection unit 202 extracts feature data such as features of the first mark 70, information on the periphery of the first mark 70, and spatial features of the entire image (features of straight lines or planes corresponding to the floor, walls, and the like). Instead of analyzing the image from which the lens distortion has been removed, the result of correcting the line segment data detected by the line segment/circular arc feature detection unit 251 using the lens distortion information may be extracted as the feature data.
The first marker detection unit 204 detects the first marker 70 in the first image based on the feature data detected by the point/line segment feature detection unit 202 and the marker identification information stored in the marker information storage unit 253, and transmits first marker information indicating the position and shape of the detected first marker 70 to the internal parameter calculation unit 203, the external parameter calculation unit 205, and the data server 10. In this example, the first mark detection unit 204 recognizes first identification information corresponding to the first pattern of the detected first mark 70, and transmits the recognized first identification information to the data server 10 together with the first mark information. The first mark information includes, for example, information indicating the size of one side of the first mark 70, the size of one side of a rectangle constituting the first pattern, the position of the first mark 70 in the first image, and the like.
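As a rough, non-authoritative illustration of detecting a planar matrix marker and reading the identification information encoded in its pattern, the sketch below uses OpenCV's ArUco module as a stand-in for the patent's 8 × 8 matrix marker; the dictionary choice, the function name, and the availability of the ArucoDetector API (opencv-contrib-python 4.7 or later) are assumptions, not part of the patent.

```python
# Illustrative sketch only: detect planar matrix markers and read their IDs.
# The ArUco dictionary stands in for the patent's 8x8 matrix pattern.
import cv2

def detect_markers(image_bgr):
    gray = cv2.cvtColor(image_bgr, cv2.COLOR_BGR2GRAY)
    dictionary = cv2.aruco.getPredefinedDictionary(cv2.aruco.DICT_6X6_250)
    detector = cv2.aruco.ArucoDetector(dictionary, cv2.aruco.DetectorParameters())
    corners, ids, _rejected = detector.detectMarkers(gray)
    if ids is None:
        return []
    # corners[i] holds the four corner pixel positions of marker i (marker information);
    # ids[i] holds the identification information encoded in its pattern.
    return [(int(marker_id), c.reshape(4, 2)) for marker_id, c in zip(ids.flatten(), corners)]
```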
The internal parameter calculation unit 203 calculates a first internal parameter indicating the internal parameters of the first camera 20 based on the lens distortion information received from the lens distortion information calculation unit 250, the known lens focal length of the first camera 20, the first marker information received from the first marker detection unit 204, and the like. The first internal parameter is represented by a matrix that determines the correspondence relationship between coordinate points of the three-dimensional camera coordinate system of the first camera 20 (e.g., a three-dimensional coordinate system with the lens center of the camera as the origin) and coordinate points of the coordinate system (two-dimensional coordinate system) of the first image captured by the first camera 20. The internal parameter calculation unit 203 transmits the calculated first internal parameter and the lens distortion information received from the lens distortion information calculation unit 250 to the external parameter calculation unit 205. The lens distortion information can be regarded as one of the internal parameters, and hereinafter the term "first internal parameter" includes the lens distortion information. As a method of calculating the first internal parameter, various known techniques can be used. In this example, the first internal parameter is calculated using the first image captured by the first camera 20, but the present invention is not limited to this, and a known first internal parameter may be provided in advance.
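In the usual pinhole camera model (generic notation, not notation from the patent), such an internal parameter matrix $K$ relates a point $(X_c, Y_c, Z_c)$ in the camera's three-dimensional coordinate system to a pixel $(u, v)$ in the image:

$$
s \begin{pmatrix} u \\ v \\ 1 \end{pmatrix} = K \begin{pmatrix} X_c \\ Y_c \\ Z_c \end{pmatrix}, \qquad
K = \begin{pmatrix} f_x & 0 & c_x \\ 0 & f_y & c_y \\ 0 & 0 & 1 \end{pmatrix}
$$

where $s$ is a scale factor, $f_x$ and $f_y$ follow from the lens focal length and the pixel pitch, and $(c_x, c_y)$ is the principal point; together with the lens distortion coefficients, these values make up an internal parameter of the kind described above.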
The external parameter calculation unit 205 calculates a first external parameter indicating an external parameter of the first camera 20 based on the first internal parameter received from the internal parameter calculation unit 203 and the first flag information received from the first flag detection unit 204. In this example, the external parameter calculation unit 205 sets a three-dimensional coordinate system on the first camera 20 side with the first marker 70 detected by the first marker detection unit 204 as the origin (for example, the center of the first marker 70 or any of 4 corners). The extrinsic parameter calculation unit 205 calculates a matrix that determines the correspondence relationship between the set coordinate points of the three-dimensional coordinate system on the first camera 20 side and the coordinate points of the three-dimensional coordinate system of the camera of the first camera 20 as the first extrinsic parameter. The first extrinsic parameter calculated by the extrinsic parameter calculation unit 205 is information that can specify the position and orientation of the first camera 20 in the three-dimensional coordinate system on the first camera 20 side. Then, the extrinsic parameter calculation unit 205 transmits the calculated first extrinsic parameter and the first intrinsic parameter received from the intrinsic parameter calculation unit 203 to the data server 10 as first camera information indicating the camera information of the first camera 20.
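As one hedged sketch of how such an external parameter could be computed in practice (OpenCV's solvePnP is used here as a stand-in for the patent's calculation; the marker side length, the corner ordering, and the function and variable names are assumptions), the known 3D corner positions of a square marker in the marker-origin coordinate system and their detected 2D positions in the image give the camera's rotation and translation:

```python
# Hedged sketch: estimate a camera's external parameter (rotation R, translation t)
# in a coordinate system whose origin is the center of a square marker.
# marker_side (in meters) and the corner ordering are illustrative assumptions.
import cv2
import numpy as np

def estimate_extrinsic(marker_corners_2d, camera_matrix, dist_coeffs, marker_side=0.20):
    h = marker_side / 2.0
    # 3D marker corners in the marker-origin coordinate system (marker lies in the z = 0 plane),
    # ordered to match the detected 2D corners.
    object_points = np.array([[-h,  h, 0.0],
                              [ h,  h, 0.0],
                              [ h, -h, 0.0],
                              [-h, -h, 0.0]], dtype=np.float32)
    image_points = np.asarray(marker_corners_2d, dtype=np.float32)
    ok, rvec, tvec = cv2.solvePnP(object_points, image_points, camera_matrix, dist_coeffs)
    if not ok:
        return None
    R, _ = cv2.Rodrigues(rvec)  # 3x3 rotation mapping marker coordinates to camera coordinates
    # [R | tvec] maps marker-origin coordinates to camera coordinates; equivalently,
    # it fixes the camera's position and orientation in the marker-origin coordinate system.
    return R, tvec
```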
The spatial feature detection unit 206 detects spatial features around the first mark 70 based on the first internal parameter and the first external parameter received from the external parameter calculation unit 205 and the feature data received from the point/line segment feature detection unit 202. The spatial feature detection unit 206 then transmits first spatial feature information, indicating the peripheral information of the first marker 70 and information on the spatial features of the entire image, to the data server 10. The information related to the spatial features (spatial feature information) may include, for example, sets of an identifier such as a straight line, a plane, or a corner, and its position information. The data server 10 manages (stores) the first camera information, the first marker information, the first spatial feature information, and the first identification information transmitted from the first camera 20 in association with each other.
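Purely for readability, the association managed by the data server 10 can be pictured as a record keyed by the marker's identification information; this layout and the field names are assumptions made for illustration and are not specified in the patent:

```python
# Illustrative record layout only; the field names are assumptions.
from dataclasses import dataclass, field

@dataclass
class MarkerRecord:
    identification_info: int                                  # ID encoded in the marker pattern
    camera_info: dict = field(default_factory=dict)           # internal and external parameters
    marker_info: dict = field(default_factory=dict)           # marker size, position in the image, etc.
    spatial_feature_info: list = field(default_factory=list)  # (identifier, position) pairs

# The data server could then keep, e.g., {1: record_from_first_camera, 2: record_from_second_camera}.
```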
The CPU103 executes the program stored in the storage device 104 to realize the functions of the first camera 20 described above. However, the present invention is not limited to this, and all or a part of the plurality of functions of the first camera 20 may be realized by a dedicated hardware circuit.
Fig. 6 is a diagram showing an example of functions of the second camera 30. In the example of fig. 6, the functions related to the present embodiment are mainly illustrated, but the functions of the second camera 30 are not limited to this. As shown in fig. 6, the second camera 30 includes an image acquisition unit 301, a point/line feature detection unit 302, an internal parameter calculation unit 303, a second marker detection unit 304, an external parameter calculation unit 305, a spatial feature detection unit 306, a lens distortion information calculation unit 350, a line/arc feature detection unit 351, a marker detection unit 352, and a marker information storage unit 353. Since the basic functions are the same as those of the first camera 20, the functions of the second camera 30 will be described below with appropriate simplification.
The image acquisition section 301 acquires a second image captured by the second camera 30. Specifically, the image acquisition unit 301 acquires the second image captured by the imaging element 102.
The lens distortion information calculation unit 350 analyzes the second image acquired by the image acquisition unit 301, and the line segment/circular arc feature detection unit 351 detects line segment data (straight lines or circular arcs) from it. The lens distortion information calculation unit 350 calculates lens distortion information of the second camera 30 based on the line segment data detected by the line segment/circular arc feature detection unit 351. Alternatively, the mark detection unit 352 may detect a marker based on the line segment data, and the lens distortion information calculation unit 350 may calculate the lens distortion information using the shape and size information of the marker detected by the mark detection unit 352. The lens distortion model is expressed by a higher-order formula using distortion coefficients in the radial direction and in the circumferential direction of the lens, and the lens distortion information is represented by these coefficients. The lens distortion information calculation unit 350 transmits the calculated lens distortion information to the internal parameter calculation unit 303, and sends the lens distortion information and an image from which the distortion of the second image has been removed to the point/line segment feature detection unit 302. When the lens distortion information calculation unit 350 cannot generate an image from which the distortion has been removed, it sends the lens distortion information and the second image as-is to the point/line segment feature detection unit 302. In this example, the lens distortion information is calculated using the second image captured by the second camera 30, but the present invention is not limited to this, and known lens distortion information may be provided in advance.
The point/line segment feature detection unit 302 analyzes the second image, or the image from which the distortion of the second image has been removed, using the lens distortion information calculated by the lens distortion information calculation unit 350. More specifically, the point/line segment feature detection unit 302 extracts feature data such as features of the second mark 80, peripheral information of the second mark 80, and spatial features of the entire image. Instead of analyzing the image from which the lens distortion has been removed, the result of correcting the line segment data detected by the line segment/circular arc feature detection unit 351 using the lens distortion information may be extracted as the feature data.
The second marker detecting unit 304 detects the second marker 80 in the second image based on the feature data detected by the point/line segment feature detecting unit 302 and the marker identification information stored in the marker information storing unit 353, and transmits second marker information indicating the position and shape of the detected second marker 80 to the internal parameter calculating unit 303, the external parameter calculating unit 305, and the data server 10. The second mark detection unit 304 recognizes second identification information corresponding to the second pattern of the detected second mark 80, and transmits the recognized second identification information to the data server 10 together with the second mark information. The second mark information includes, for example, information indicating the size of one side of the second mark 80, the size of one side of a rectangle constituting the second pattern, the position of the second mark 80 in the second image, and the like.
The internal parameter calculation unit 303 calculates a second internal parameter indicating the internal parameters of the second camera 30 based on the lens distortion information received from the lens distortion information calculation unit 350, the known lens focal length of the second camera 30, the second marker information received from the second marker detection unit 304, and the like. The second internal parameter is represented by a matrix that determines the correspondence relationship between coordinate points of the three-dimensional camera coordinate system of the second camera 30 (e.g., a three-dimensional coordinate system with the lens center of the camera as the origin) and coordinate points of the coordinate system (two-dimensional coordinate system) of the second image captured by the second camera 30. The internal parameter calculation unit 303 sends the calculated second internal parameter and the lens distortion information received from the lens distortion information calculation unit 350 to the external parameter calculation unit 305. The lens distortion information can be regarded as one of the internal parameters, and hereinafter the term "second internal parameter" includes the lens distortion information. As a method of calculating the second internal parameter, various known techniques can be used. In this example, the second internal parameter is calculated using the second image captured by the second camera 30, but the present invention is not limited to this, and a known second internal parameter may be provided in advance.
The external parameter calculation unit 305 calculates a second external parameter indicating the external parameter of the second camera 30 based on the second internal parameter received from the internal parameter calculation unit 303 and the second marker information received from the second marker detection unit 304. The external parameter calculation unit 305 sets a three-dimensional coordinate system on the second camera 30 side with the second marker 80 detected by the second marker detection unit 304 as the origin (for example, the center of the second marker 80 or any of the 4 corners). The extrinsic parameter calculation unit 305 calculates a matrix that determines the set correspondence relationship between the coordinate points of the three-dimensional coordinate system on the second camera 30 side and the coordinate points of the three-dimensional coordinate system of the camera of the second camera 30 as the second extrinsic parameters. The second external parameter calculated by the external parameter calculation unit 305 is information capable of specifying the position and orientation of the second camera 30 in the three-dimensional coordinate system on the second camera 30 side. Then, the extrinsic parameter calculation unit 305 transmits the calculated second extrinsic parameter and the second intrinsic parameter received from the intrinsic parameter calculation unit 303 to the data server 10 as second camera information indicating the camera information of the second camera 30.
The spatial feature detection unit 306 detects spatial features around the second mark 80 based on the second internal parameter and the second external parameter received from the external parameter calculation unit 305 and the feature data received from the point/line segment feature detection unit 302. The spatial feature detection unit 306 then transmits second spatial feature information, indicating the peripheral information of the second marker 80 and information on the spatial features of the entire image, to the data server 10. The data server 10 manages (stores) the second camera information, the second marker information, the second spatial feature information, and the second identification information transmitted from the second camera 30 in association with each other. In the following description, the first marker information and the second marker information managed by the data server 10 may be simply referred to as "marker information" when they are not distinguished from each other. Similarly, the first and second identification information may be simply referred to as "identification information", and the first and second spatial feature information as "spatial feature information", when they are not distinguished from each other.
The CPU103 executes the program stored in the storage device 104 to realize the functions of the second camera 30 described above. However, the present invention is not limited to this, and may be configured such that all or a part of the plurality of functions of the second camera 30 is realized by a dedicated hardware circuit, for example.
Fig. 7 is a diagram showing an example of functions of the mobile camera 40. In the example of fig. 7, the functions related to the present embodiment are mainly illustrated, but the functions of the mobile camera 40 are not limited to these. As shown in fig. 7, the moving camera 40 includes an image acquisition unit 401, a point/line segment feature detection unit 402, an internal parameter calculation unit 403, a marker detection unit 404, an external parameter calculation unit 405, a spatial feature detection unit 406, a lens distortion information calculation unit 450, a line segment/circular arc feature detection unit 451, a mark detection unit 452, and a mark information storage unit 453. The mobile camera 40 further includes a matching unit 407, an information conversion unit 408, a storage unit 409, a simulation unit 410, and a data storage unit 454.
Hereinafter, the functions of the mobile camera 40 will be described. The image acquisition unit 401 acquires a moving image captured by the mobile camera 40. More specifically, the image acquisition unit 401 sequentially acquires the third images (acquires moving images) captured in time series by the imaging element 102.
The lens distortion information calculation unit 450 analyzes the third image acquired by the image acquisition unit 401, and the line segment/circular arc feature detection unit 451 detects line segment data (straight lines or circular arcs) from it. The lens distortion information calculation unit 450 calculates lens distortion information of the mobile camera 40 based on the line segment data detected by the line segment/circular arc feature detection unit 451. Alternatively, the mark detection unit 452 may detect a marker based on the line segment data, and the lens distortion information calculation unit 450 may calculate the lens distortion information using the shape and size information of the marker detected by the mark detection unit 452. The lens distortion model is expressed by a higher-order formula using distortion coefficients in the radial direction and in the circumferential direction of the lens, and the lens distortion information is represented by these coefficients. The lens distortion information calculation unit 450 transmits the calculated lens distortion information to the internal parameter calculation unit 403, and sends the lens distortion information and an image from which the distortion of the third image has been removed to the point/line segment feature detection unit 402. When the lens distortion information calculation unit 450 cannot generate an image from which the distortion has been removed, it sends the lens distortion information and the third image as-is to the point/line segment feature detection unit 402. In this example, the lens distortion information is calculated using an image (third image) of the moving image captured by the moving camera 40, but the present invention is not limited to this, and known lens distortion information may be provided in advance.
The point/line segment feature detection unit 402 analyzes the third image, or the image from which the distortion of the third image has been removed, using the lens distortion information calculated by the lens distortion information calculation unit 450. More specifically, the point/line segment feature detection unit 402 extracts feature data included in the moving image, such as features of a marker, information on the periphery of the marker, and spatial features of the entire image (features of straight lines or planes corresponding to the floor, walls, and the like). Instead of analyzing the image from which the lens distortion has been removed, the result of correcting the line segment data detected by the line segment/circular arc feature detection unit 451 using the lens distortion information may be extracted as the feature data.
The marker detection unit 404 detects a marker (in this example, the first marker 70 or the second marker 80) in the moving image based on the feature data detected by the point/line segment feature detection unit 402 and the marker identification information stored in the marker information storage unit 453. The marker detection unit 404 transmits third marker information indicating the position and shape of the detected marker to the internal parameter calculation unit 403 and the external parameter calculation unit 405. The third mark information includes, for example, information indicating the size of one side of the mark, the size of one side of a rectangle constituting a pattern of the mark, the position of the mark in the moving image, and the like. The mark detection unit 404 recognizes the identification information corresponding to the pattern of the detected mark, and transmits the recognized third identification information and the third mark information to the external parameter calculation unit 405 and the data storage unit 454. For example, the marker detection unit 404 may detect the marker and recognize the third identification information using images of a plurality of frames.
The internal parameter calculation unit 403 calculates a third internal parameter indicating the internal parameters of the mobile camera 40 based on the lens distortion information received from the lens distortion information calculation unit 450, the known lens focal length and lens distortion coefficients of the mobile camera 40, the third marker information received from the marker detection unit 404, and the like. The third internal parameter is represented by a matrix that determines the correspondence relationship between coordinate points of the three-dimensional camera coordinate system of the mobile camera 40 (e.g., a three-dimensional coordinate system with the lens center of the camera as the origin) and coordinate points of the coordinate system (two-dimensional coordinate system) of an image (third image) of the moving image captured by the mobile camera 40. The internal parameter calculation unit 403 transmits the calculated third internal parameter and the lens distortion information received from the lens distortion information calculation unit 450 to the external parameter calculation unit 405. The lens distortion information can be regarded as one of the internal parameters, and hereinafter the term "third internal parameter" includes the lens distortion information. As a method of calculating the internal parameters, various known techniques can be used. In this example, the internal parameters are calculated using an image (third image) of the moving image captured by the moving camera 40, but the present invention is not limited to this, and known internal parameters may be provided in advance.
The external parameter calculation unit 405 calculates a third external parameter indicating an external parameter of the mobile camera 40 based on the third internal parameter received from the internal parameter calculation unit 403 and the third marker information received from the marker detection unit 404. In this example, the external parameter calculation unit 405 sets a three-dimensional coordinate system on the mobile camera 40 side with the marker detected first by the marker detection unit 404 (in this example, the first marker 70) as the origin (for example, the center of the first marker 70 or any of its 4 corners). The external parameter calculation unit 405 calculates, as the third external parameter, a matrix that determines the correspondence relationship between coordinate points of this three-dimensional coordinate system on the mobile camera 40 side and coordinate points of the three-dimensional camera coordinate system of the mobile camera 40. The third external parameter calculated by the external parameter calculation unit 405 is information that can specify the position and orientation of the mobile camera 40 in the three-dimensional coordinate system on the mobile camera 40 side. In this example, even when the second and subsequent markers (in this example, the second marker 80) are detected, the external parameter calculation unit 405 continues to calculate the third external parameter, without changing the origin, based on the moving images sequentially captured by the moving camera 40 (the movement amount, acceleration, and the like of the subject obtained from the moving images can also be used).
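In other words, once the origin has been fixed at the first detected marker, the moving camera's pose at frame $t$ can be maintained by composing the incremental motion estimated between consecutive, overlapping frames. In generic notation (not the patent's), writing $T^{A}_{B}$ for the 4 × 4 homogeneous transform that expresses frame $B$ in frame $A$:

$$
T^{\mathrm{ref}}_{\mathrm{cam}(t)} = T^{\mathrm{ref}}_{\mathrm{cam}(t-1)} \cdot T^{\mathrm{cam}(t-1)}_{\mathrm{cam}(t)}
$$

where $T^{\mathrm{cam}(t-1)}_{\mathrm{cam}(t)}$ is the camera motion estimated from the sequentially captured images.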
The external parameter calculation unit 405 calculates a third external parameter of the mobile camera 40 each time the third marker information is received from the marker detection unit 404 (i.e., each time a marker is newly detected). The external parameter calculation unit 405 then transmits the calculated third external parameter and the third internal parameter received from the internal parameter calculation unit 403 to the data storage unit 454 as third camera information indicating the camera information of the moving camera 40. The external parameter calculation unit 405 also transmits the third marker information received from the marker detection unit 404 to the data storage unit 454. That is, in this example, every time a marker is detected by the marker detection unit 404, the third camera information and the third marker information corresponding to that marker are transmitted to the data storage unit 454.
The spatial feature detection unit 406 detects spatial features around the marker based on the third internal parameter and the third external parameter received from the external parameter calculation unit 405 and the feature data received from the point/line segment feature detection unit 402. The spatial feature detection unit 406 then transmits third spatial feature information, indicating the peripheral information of the marker and information on the spatial features of the entire image, to the data storage unit 454. In this example, the third spatial feature information may include sets of an identifier such as a straight line, a plane, or a corner, and its position information. The data storage unit 454 stores the third camera information, the third marker information, the third spatial feature information, and the third identification information of the mobile camera 40.
The matching unit 407 reads a set of the third marker information and the third identification information from the data storage unit 454, and checks whether identification information matching the third identification information included in that set exists in the data server 10. When matching identification information exists in the data server 10, the matching unit 407 sets a reference coordinate system. Specifically, when the first marker 70 detected from the first image captured by the first camera 20, or the second marker 80 detected from the second image captured by the second camera 30, is the same as a marker detected from the moving image captured by the moving camera 40, a reference coordinate system is set with that marker as its reference, and the identification information of the marker set as the reference coordinate system is transmitted to the data server 10. In this example, the reference coordinate system is referenced to the first marker 70. The third identification information, together with an indication of which identification information has been set as the reference coordinate system, is transmitted to the information conversion unit 408 and the simulation unit 410. In addition, when matching identification information exists in the data server 10, the matching unit 407 instructs the data server 10 to transmit the first or second camera information, the first or second marker information, and the first or second spatial feature information associated with that identification information to the information conversion unit 408. The data server 10 that has received the instruction transmits the first or second camera information, the first or second marker information, and the first or second spatial feature information to the information conversion unit 408.
When receiving the first or second camera information, the first or second marker information, and the first or second spatial feature information from the data server 10, the information conversion unit 408 converts the first or second camera information, the first or second marker information, and the first or second spatial feature information in accordance with the reference coordinate system described above. That is, when the first marker 70 included in the first image captured by the first camera 20 or the second marker 80 included in the second image captured by the second camera 30 is detected from the moving image captured by the moving camera 40, the information conversion unit 408 converts the first camera information of the first camera 20 and the like or the second camera information of the second camera 30 and the like in accordance with the reference coordinate system. The specific contents of the information conversion unit 408 will be described later. The information converted by the information conversion unit 408 (converted information) is stored in the storage unit 409.
The simulation unit 410 reads the converted information of each of the first camera 20 and the second camera 30 stored in the storage unit 409, reads and converts the corresponding information of the mobile camera 40 from the data storage unit 454, and compares the two to detect differences. More specific functions of the simulation unit 410 will be described later.
Fig. 8 is a functional block diagram for explaining the more specific contents of the information conversion unit 408, the storage unit 409, and the simulation unit 410. The portion A shown in fig. 8 corresponds to the portion A shown in fig. 7. As described above, when identification information matching the third identification information included in the set read out from the data storage unit 454 exists in the data server 10, the matching unit 407 instructs the data server 10 to transmit the information of the first camera 20 or the second camera 30 associated with that identification information (the first or second camera information, the first or second marker information, and the first or second spatial feature information) to the information conversion unit 408. The data server 10 that has received the instruction transmits the information of the first camera 20 or the second camera 30 to the information conversion unit 408. Further, the information conversion unit 408 receives from the matching unit 407 the third identification information, including the identification information set as the reference coordinate system.
That is, every time the first marker 70 included in the first image captured by the first camera 20 or the second marker 80 included in the second image captured by the second camera 30 is detected from the moving image captured by the moving camera 40, the information of the first camera 20 or the second camera 30 is input to the information conversion unit 408, and the conversion by the information conversion unit 408 is performed. In this example, when the first marker 70 is detected from the moving image of the moving camera 40, the first camera information, the first marker information, and the first spatial feature information of the first camera 20 are input to the information conversion unit 408. Then, when the second marker 80 is detected from the moving image captured after the moving camera 40 has moved to a position where the second imaging area can be imaged, the second camera information, the second marker information, and the second spatial feature information of the second camera 30 are input to the information conversion unit 408.
As shown in fig. 8, the information conversion section 408 has a calculation section 421, a marker information conversion section 422, and a spatial feature information conversion section 423.
The calculation unit 421 converts the first camera information of the first camera 20 or the second camera information of the second camera 30 in accordance with the reference coordinate system and outputs the first or second reference coordinate system camera information. Fig. 9 is a diagram showing an example of functions of the calculation unit 421. As shown in fig. 9, the calculation unit 421 includes, for example, a first calculation unit 501 and a second calculation unit 502. In the example of fig. 9, functions related to the present embodiment are mainly illustrated, but the function of the calculation unit 421 is not limited to this.
The first calculation unit 501 calculates first reference coordinate system camera information indicating the position of the first camera 20 based on the first image including the first marker 70 captured by the first camera 20 and the first moving image including the first marker 70 captured by the moving camera 40. More specifically, when the first marker 70 is detected from the moving image captured by the moving camera 40, the first calculation unit 501 converts the first camera information into first reference coordinate system camera information indicating the position of the first camera 20 in the reference coordinate system, based on the first image. In this example, the moving image including the first marker 70 captured by the moving camera 40 corresponds to the "first moving image", but is not limited thereto. In this example, the first reference coordinate system camera information is a first external parameter of the first camera 20 that determines the correspondence relationship between coordinate points of the reference coordinate system and coordinate points of the three-dimensional camera coordinate system of the first camera 20 (hereinafter sometimes referred to as the "converted first external parameter", i.e., the first external parameter expressed in the reference coordinate system), but is not limited thereto. The first reference coordinate system camera information may be any information indicating the correspondence relationship between the reference coordinate system shared by the first camera 20, the second camera 30, and the moving camera 40, and the coordinate system of the first camera 20. A specific calculation method of the first reference coordinate system camera information will be described below.
The first calculation unit 501 of the present embodiment converts the first external parameter into the first reference coordinate system camera information when receiving the information of the first camera 20 from the data server 10, that is, when detecting the first marker 70 from the moving image captured by the moving camera 40. The first calculation unit 501 calculates a first external parameter of the first camera 20 in the reference coordinate system as the first reference coordinate system camera information based on the three-dimensional position (x, y, z) and orientation (rotation) of the first mark 70 in the reference coordinate system, the first internal parameter of the first camera 20, and the position and orientation of the first mark 70 in the first image.
In this example, the first internal parameter of the first camera 20 is included in the first camera information of the first camera 20 received from the data server 10. In addition, the information indicating the position of the first marker 70 in the first image is included in the first marker information received from the data server 10. As described above, the first camera information and the first marker information of the first camera 20 received from the data server 10 are found based on the first image. In addition, in this example, the position of the first mark 70 in the reference coordinate system is set as the origin of the reference coordinate system, and thus can be specified without using the moving image of the moving camera 40. Therefore, it can be considered that the first calculation unit 501 calculates the first reference coordinate system camera information indicating the position of the first camera 20 in the reference coordinate system based on the first image. The first reference coordinate system camera information is an example of "first camera position information", but is not limited thereto. In addition, as described above, in this example, the first reference coordinate system camera information is the converted first external parameter of the first camera 20, but is not limited thereto.
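As an illustrative aid only (not part of the claimed embodiment), the calculation performed by the first calculation unit 501 can be sketched in Python as follows. The use of OpenCV's solvePnP, the function names, and the 10 cm marker size are assumptions introduced here for explanation; the embodiment does not prescribe a particular solver.

import numpy as np
import cv2

def first_extrinsics_in_reference_frame(marker_corners_ref, marker_corners_img, K1, dist1):
    # marker_corners_ref: (N, 3) corners of the first marker in the reference coordinate
    #   system; since the first marker is placed at the origin, these are known by design.
    # marker_corners_img: (N, 2) the same corners as detected in the first image
    #   (part of the first marker information).
    # K1, dist1: first internal parameters (camera matrix and distortion coefficients).
    ok, rvec, tvec = cv2.solvePnP(marker_corners_ref.astype(np.float32),
                                  marker_corners_img.astype(np.float32), K1, dist1)
    if not ok:
        raise RuntimeError("pose estimation failed")
    R, _ = cv2.Rodrigues(rvec)   # rotation mapping reference-frame points into the camera frame
    return R, tvec               # together: the "converted first external parameter"

# Example with an assumed 10 cm square marker centred at the reference-frame origin:
marker_corners_ref = np.array([[-0.05, 0.05, 0.0], [0.05, 0.05, 0.0],
                               [0.05, -0.05, 0.0], [-0.05, -0.05, 0.0]])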
The second calculation unit 502 calculates second reference coordinate system camera information indicating the position of the second camera 30 based on the second image including the second marker 80 captured by the second camera 30 disposed separately from the first camera 20 and the second moving image including the second marker 80 captured by the moving camera 40. More specifically, when the second marker 80 is detected from the moving image captured by the moving camera 40, the second calculation unit 502 converts the second camera information into second reference coordinate system camera information indicating the position of the second camera 30 in the reference coordinate system based on the second image and the moving image captured by the moving camera 40. In this example, the moving image including the second marker 80 captured by the moving camera 40 corresponds to the "second moving image", but is not limited thereto. In this example, the second reference coordinate system camera information is a second external parameter of the second camera 30 that conforms to the reference coordinate system (hereinafter sometimes referred to as the "converted second external parameter") and determines the correspondence relationship between coordinate points of the reference coordinate system and coordinate points of the three-dimensional camera coordinate system of the second camera 30, but is not limited thereto. The second reference coordinate system camera information may be information indicating a correspondence relationship between the reference coordinate system and the coordinate system of the second camera 30. A specific calculation method of the second reference coordinate system camera information will be described below.
The second calculation unit 502 of the present embodiment converts the second external parameter into the second reference coordinate system camera information when receiving the information of the second camera 30 from the data server 10, that is, when the second marker 80 is detected from the moving image captured by the moving camera 40. As described above, when the second marker 80 is detected, the third camera information and the third marker information corresponding to the second marker 80 are transmitted from the data storage unit 454 to the information conversion unit 408. The second calculation unit 502 determines the three-dimensional position (x, y, z) and orientation (rotation) of the second marker 80 in the reference coordinate system (in this example, with the first marker 70 as the origin) based on the third camera information and the third marker information corresponding to the second marker 80 received from the data storage unit 454. More specifically, the second calculation unit 502 determines the position and orientation of the second marker 80 in the reference coordinate system based on the position and orientation of the second marker 80 indicated by the third marker information and on the third external parameter and the third internal parameter included in the third camera information. The third external parameter included in the third camera information represents, for example, the movement amount and the rotation amount of the moving camera 40, that is, the difference between the position and orientation of the moving camera 40 at the time of detection of the first marker 70 and those at the time of detection of the second marker 80.
As described above, in this example, the third camera information and the third marker information corresponding to the second marker 80 are found based on the moving image captured by the moving camera 40. Therefore, it can be considered that the second calculation section 502 determines the position and orientation of the second mark 80 in the reference coordinate system based on the moving image captured by the moving camera 40.
The second calculation unit 502 calculates the second external parameter of the second camera 30 in the reference coordinate system as the second reference coordinate system camera information based on the position and orientation of the second mark 80 in the reference coordinate system, the second internal parameter of the second camera 30, and the position and orientation of the second mark 80 in the second image. In this example, the second internal parameter of the second camera 30 is included in the second camera information received from the data server 10. In addition, the information indicating the position of the second marker 80 in the second image is included in the second marker information received from the data server 10. As described above, the second camera information and the second marker information of the second camera 30 are found based on the second image.
From the above, it is considered that the second calculation unit 502 of the present embodiment calculates the second reference coordinate system camera information indicating the position of the second camera 30 in the reference coordinate system based on the second image and the moving image captured by the moving camera 40. The second reference coordinate system camera information is an example of "second camera position information", but is not limited thereto. In addition, as described above, in this example, the second reference coordinate system camera information is the converted second external parameter of the second camera 30, but is not limited thereto. The first reference coordinate system camera information and the second reference coordinate system camera information calculated by the calculation unit 421 as described above are stored in the storage unit 409. In the following description, the first reference coordinate system camera information and the second reference coordinate system camera information may be referred to as "reference coordinate system camera information" when they are not distinguished from each other.
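For explanation only, the chain of conversions performed by the second calculation unit 502 can be sketched as below. The use of 4x4 homogeneous transforms, OpenCV's solvePnP, and all function and variable names are assumptions of this sketch, not requirements of the embodiment.

import numpy as np
import cv2

def T(R, t):
    # Build a 4x4 homogeneous transform from a 3x3 rotation and a translation vector.
    M = np.eye(4)
    M[:3, :3] = R
    M[:3, 3] = np.ravel(t)
    return M

def pose_from_pnp(obj_pts, img_pts, K, dist):
    # Transform mapping object-frame points into the camera frame.
    ok, rvec, tvec = cv2.solvePnP(obj_pts.astype(np.float32), img_pts.astype(np.float32), K, dist)
    assert ok
    R, _ = cv2.Rodrigues(rvec)
    return T(R, tvec)

def second_extrinsics_in_reference_frame(T_mov1_marker1,        # first marker -> moving camera at detection time 1
                                         T_mov2_mov1,           # moving-camera motion from time 1 to time 2
                                                                #   (the movement/rotation amounts described above)
                                         T_mov2_marker2,        # second marker -> moving camera at detection time 2
                                         marker2_corners_local, # (N, 3) second-marker corners in its own frame
                                         marker2_corners_img,   # (N, 2) the corners detected in the second image
                                         K2, dist2):            # second internal parameters
    # Pose of the second marker in the reference coordinate system (first marker = origin).
    T_ref_marker2 = np.linalg.inv(T_mov1_marker1) @ np.linalg.inv(T_mov2_mov1) @ T_mov2_marker2
    # Express the second-marker corners in the reference frame ...
    corners_h = np.c_[marker2_corners_local, np.ones(len(marker2_corners_local))]
    corners_ref = (T_ref_marker2 @ corners_h.T).T[:, :3]
    # ... and solve the pose of the second camera against the second image: this plays the role
    # of the "converted second external parameter" in the reference coordinate system.
    return pose_from_pnp(corners_ref, marker2_corners_img, K2, dist2)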
The description is continued with reference to fig. 8. The marker information converting unit 422 is an example of a "first converting unit" and converts information on the position and shape of the first marker 70 included in the first image in accordance with the reference coordinate system to output first converted information, or converts information on the position and shape of the second marker 80 included in the second image in accordance with the reference coordinate system to output second converted information. The following is a detailed description. In this example, the marker information conversion unit 422 converts the first marker information or the second marker information in accordance with the above-described reference coordinate system. For example, the marker information converting unit 422 can perform the conversion using the first reference coordinate system camera information or the second reference coordinate system camera information calculated by the calculating unit 421, and the first internal parameters of the first camera 20 or the second internal parameters of the second camera 30. Hereinafter, the first marker information converted by the marker information conversion unit 422 may be referred to as "converted first marker information", and the second marker information converted by the marker information conversion unit 422 may be referred to as "converted second marker information". When the two are not distinguished from each other, they are referred to as "converted marker information". The converted marker information is stored in the storage unit 409. In this example, the converted first marker information corresponds to the above-mentioned "first conversion completed information", and the converted second marker information corresponds to the above-mentioned "second conversion completed information", but the present invention is not limited thereto.
Fig. 10 is a diagram showing an example of functions of the marker information conversion unit 422. As shown in fig. 10, the marker information converting section 422 includes a first position shape converting section 511 and a second position shape converting section 512. Further, the function of the marker information converting section 422 is not limited to this.
The first position/shape conversion unit 511 converts the information on the position and shape of the first marker 70 in the first image into information on the position and shape in the reference coordinate system described above. For example, the first position/shape converting unit 511 converts the first marker information acquired from the data server 10 in accordance with the reference coordinate system using the first reference coordinate system camera information (in this example, the converted first external parameter) and the first internal parameter of the first camera 20 acquired from the data server 10. Hereinafter, the first marker information converted in accordance with the reference coordinate system is referred to as "converted first marker information".
Similarly, the second position/shape conversion unit 512 converts the information on the position and shape of the second marker 80 in the second image into information on the position and shape in the reference coordinate system described above. For example, the second position/shape converting unit 512 converts the second marker information acquired from the data server 10 in accordance with the reference coordinate system using the second reference coordinate system camera information (in this example, the converted second external parameter) and the second internal parameter of the second camera 30 acquired from the data server 10. Hereinafter, the second marker information converted in accordance with the reference coordinate system is referred to as "converted second marker information".
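As one possible illustration (not the embodiment's prescribed procedure) of how a position/shape conversion unit can map marker corners detected in a fixed camera's image into the reference coordinate system, the sketch below intersects back-projected pixel rays with the known marker plane. The plane parameters and function names are assumptions of this sketch.

import numpy as np

def image_points_to_reference_plane(pts_img, K, R, t, plane_n, plane_d):
    # pts_img: (N, 2) pixel coordinates of the marker corners in the fixed camera's image.
    # K: internal parameters; R, t: converted external parameter (reference frame -> camera frame).
    # plane_n, plane_d: marker plane in the reference frame, satisfying n . x + d = 0.
    K_inv = np.linalg.inv(K)
    cam_center = -R.T @ np.ravel(t)                # camera centre expressed in the reference frame
    out = []
    for (u, v) in pts_img:
        ray_cam = K_inv @ np.array([u, v, 1.0])    # viewing ray in the camera frame
        ray_ref = R.T @ ray_cam                    # the same ray in the reference frame
        s = -(plane_n @ cam_center + plane_d) / (plane_n @ ray_ref)
        out.append(cam_center + s * ray_ref)       # intersection: corner position in the reference frame
    return np.array(out)

# For the first marker the reference frame is anchored on the marker itself, so its plane can be
# taken (as an assumption of this sketch) as z = 0: plane_n = (0, 0, 1), plane_d = 0.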
The description is continued with reference to fig. 8. The spatial feature information conversion unit 423 converts the first or second spatial feature information in accordance with the reference coordinate system described above. For example, the spatial feature information conversion unit 423 may perform the conversion using the first or second reference coordinate system camera information calculated by the calculation unit 421 and the known first internal parameters of the first camera 20 or the known second internal parameters of the second camera 30. Hereinafter, the first spatial feature information converted by the spatial feature information conversion unit 423 may be referred to as "converted first spatial feature information", and the second spatial feature information after the conversion may be referred to as "converted second spatial feature information". When the two are not distinguished from each other, they are referred to as "converted spatial feature information". The converted spatial feature information is stored in the storage unit 409.
Fig. 11 is a diagram showing an example of functions of the spatial feature information conversion unit 423. As shown in fig. 11, the spatial feature information converting unit 423 includes a first feature information converting unit 521 and a second feature information converting unit 522. The function of the spatial feature information conversion unit 423 is not limited to this.
The first feature information converting unit 521 converts the first spatial feature information in accordance with the reference coordinate system. The first feature information conversion unit 521 performs conversion such as scaling, translation (parallel shift), and rotation. For example, the first feature information converting unit 521 converts the first spatial feature information acquired from the data server 10 in accordance with the reference coordinate system using the first reference coordinate system camera information (in this example, the converted first external parameter) and the first internal parameter of the first camera 20 acquired from the data server 10.
Similarly, the second feature information converting unit 522 converts the second spatial feature information in accordance with the reference coordinate system. For example, the second feature information converting unit 522 converts the second spatial feature information acquired from the data server 10 in accordance with the reference coordinate system using the above-described second reference coordinate system camera information (in this example, the converted second external parameter) and the second internal parameter of the second camera 30 acquired from the data server 10.
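Purely as an illustrative sketch of the scaling, translation, and rotation mentioned above, spatial feature points reconstructed in a fixed camera's coordinate system could be re-expressed in the reference coordinate system as follows. The similarity-transform formulation, the order of operations, and the names are assumptions, not the embodiment's definition of the conversion.

import numpy as np

def convert_features_to_reference(points_cam, R, t, scale=1.0):
    # points_cam: (N, 3) spatial feature points in the fixed camera's coordinate system.
    # R, t: converted external parameter (reference frame -> camera frame); its inverse
    #   maps camera-frame points back into the reference frame.
    # scale: scale correction (1.0 when the feature points are already metric); applying it
    #   before the rigid transform is an assumption of this sketch.
    p = scale * np.asarray(points_cam)
    return (p - np.ravel(t)) @ R        # row-wise equivalent of R.T @ (p_i - t)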
The description is continued with reference to fig. 8. As shown in fig. 8, the storage unit 409 includes a camera information storage unit 431, a marker information storage unit 432, and a spatial feature information storage unit 433. The camera information storage unit 431 is a storage area for storing the reference coordinate system camera information calculated by the calculation unit 421. The marker information storage unit 432 is a storage area for storing the converted marker information converted by the marker information conversion unit 422. The spatial feature information storage unit 433 is a storage area for storing the converted spatial feature information converted by the spatial feature information conversion unit 423.
As shown in fig. 8, the simulation unit 410 includes a converted information acquisition unit 441, a marker information/spatial feature information conversion unit 442, a difference detection unit 443, a recalculation unit 444, and a correction unit 445.
The converted information acquiring unit 441 acquires the converted information (including the reference coordinate system camera information, the converted marker information, and the converted spatial feature information) from the storage unit 409. In this example, the converted information acquiring unit 441 transmits the converted information of each of the first camera 20 and the second camera 30 to the difference detecting unit 443 at the timing when the converted information of each of the first camera 20 and the second camera 30 is acquired.
The marker information/spatial feature information conversion unit 442 receives, from the verification unit 407, the third identification information including the identification information of the marker set as the reference of the reference coordinate system, converts the information on the moving camera 40 side in accordance with the reference coordinate system, and transmits the converted information to the difference detection unit 443. For example, the marker information/spatial feature information conversion section 442 has a function of converting the third marker information of the moving camera 40 in accordance with the above-described reference coordinate system. The marker information/spatial feature information converting unit 442 is an example of a "second converting unit" and converts information on the position and shape of the first marker 70 included in the moving image captured by the moving camera 40 in accordance with the reference coordinate system and outputs third converted information. The marker information/spatial feature information conversion unit 442 converts information on the position and shape of the second marker 80 included in the moving image captured by the moving camera 40 in accordance with the reference coordinate system and outputs fourth converted information. More specific details will be described later. The marker information/spatial feature information conversion unit 442 also has a function of converting the third spatial feature information of the moving camera 40 in accordance with the reference coordinate system.
Fig. 12 is a diagram showing an example of functions of the marker information/spatial feature information conversion section 442. As shown in fig. 12, for example, the mark information/spatial feature information converting section 442 includes a third position shape converting section 531, a fourth position shape converting section 532, a third feature information converting section 533, and a fourth feature information converting section 534. Note that the function of the marker information/spatial feature information conversion section 442 is not limited to this. For example, the marker information/spatial feature information conversion unit 442 may have a function of constructing a space of the reference coordinate system (for example, constructing a 3D model representing a three-dimensional point group) based on a moving image captured by the moving camera 40.
The third position/shape converting unit 531 converts the information of the position and shape of the first mark 70 in the moving image captured by the moving camera 40 into the information of the position and shape in the reference coordinate system described above. For example, the marker information/spatial feature information conversion section 442 can perform the conversion using the third internal parameter and the third external parameter of the moving camera 40 when the first marker 70 is detected by the moving camera 40. In this example, since the position of the first mark 70 in the reference coordinate system is set as the origin of the reference coordinate system, information of the position and shape of the first mark 70 in the moving image captured by the moving camera 40 can be used. In this example, the information on the position and shape of the first mark 70 converted by the third position/shape conversion unit 531 corresponds to the above-described "third converted information", but is not limited thereto. Similarly, the fourth position-shape converting unit 532 converts the information of the position and shape of the second mark 80 in the moving image captured by the moving camera 40 into the information of the position and shape in the reference coordinate system described above. For example, the marker information/spatial feature information conversion section 442 can perform the conversion using the third internal parameter and the third external parameter of the moving camera 40 when the second marker 80 is detected by the moving camera 40. In this example, the information on the position and shape of the second mark 80 converted by the fourth position/shape converting unit 532 corresponds to the above-described "fourth converted information", but is not limited thereto.
The third feature information converting unit 533 converts feature information of the peripheral information of the first mark 70 in the moving image captured by the moving camera 40 into feature information in the reference coordinate system described above. For example, the marker information/spatial feature information conversion section 442 can perform the conversion using the third internal parameter and the third external parameter of the moving camera 40 when the first marker 70 is detected by the moving camera 40. In this example, since the position of the first mark 70 in the reference coordinate system is set as the origin of the reference coordinate system, the feature information of the peripheral information of the first mark 70 in the moving image captured by the moving camera 40 can be used. Similarly, the fourth feature information conversion unit 534 converts the feature information of the peripheral information of the second mark 80 in the moving image captured by the moving camera 40 into the feature information in the reference coordinate system described above. For example, the marker information/spatial feature information conversion section 442 can perform the conversion using the third internal parameter and the third external parameter of the moving camera 40 when the second marker 80 is detected by the moving camera 40.
The description is continued with reference to fig. 8. The difference detection unit 443 detects a difference between the converted marker information on the first camera 20 and the second camera 30 side and the converted marker information on the moving camera 40 side. In this example, the difference detection portion 443 detects a first difference indicating a difference between information on the position and shape of the first marker 70 based on the first image and information on the position and shape of the first marker 70 based on the moving image captured by the moving camera 40. In addition, the difference detection portion 443 detects a second difference indicating a difference between the information of the position and shape of the second mark 80 based on the second image and the information of the position and shape of the second mark 80 based on the moving image captured by the moving camera 40. More specifically, as described below.
The difference detection unit 443 detects a first difference indicating a difference between the information on the position and shape of the first mark 70 converted by the first position/shape conversion unit 511 (in this example, corresponding to "first conversion completed information") and the information on the position and shape of the first mark 70 converted by the third position/shape conversion unit 531 (in this example, corresponding to "third conversion completed information"). That is, the difference detection unit 443 compares the first conversion-completed information and the third conversion-completed information to detect a first difference. As described above, the information on the position and shape of the first mark 70 converted by the first position-shape converting unit 511 is obtained by converting the position and shape of the first mark 70 in the first image into information on the position and shape in the reference coordinate system. The information on the position and shape of the first mark 70 converted by the third position and shape converting unit 531 is obtained by converting the position and shape of the first mark 70 in the moving image captured by the moving camera 40 into information on the position and shape in the reference coordinate system.
The difference detecting unit 443 detects a second difference indicating a difference between the information on the position and shape of the second marker 80 converted by the second position/shape converting unit 512 (in this example, corresponding to "second converted information") and the information on the position and shape of the second marker 80 converted by the fourth position/shape converting unit 532 (in this example, corresponding to "fourth converted information"). That is, the difference detection unit 443 compares the second conversion-completed information and the fourth conversion-completed information to detect a second difference. As described above, the information on the position and shape of the second marker 80 converted by the second position/shape conversion unit 512 is obtained by converting the position and shape of the second marker 80 in the second image into information on the position and shape in the reference coordinate system. The information on the position and shape of the second marker 80 converted by the fourth position/shape converting unit 532 is obtained by converting the position and shape of the second marker 80 in the moving image captured by the moving camera 40 into information on the position and shape in the reference coordinate system. For example, the difference detection unit 443 may detect the first difference and the second difference by converting the viewpoint of the marker as seen from the fixed camera using the third external parameter of the moving camera 40, or by performing the opposite viewpoint conversion.
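By way of illustration only, the first difference or the second difference could, for example, be quantified as the mean distance between corresponding marker corners after both sides have been converted into the reference coordinate system. The metric below is an assumption of this sketch, not the embodiment's definition of the difference.

import numpy as np

def marker_difference(corners_from_fixed_camera, corners_from_moving_camera):
    # Both arguments: (N, 3) marker corner positions in the reference coordinate system,
    # one obtained from the fixed camera's image, the other from the moving camera's video.
    diffs = np.linalg.norm(np.asarray(corners_from_fixed_camera)
                           - np.asarray(corners_from_moving_camera), axis=1)
    return float(diffs.mean())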
The difference detecting unit 443 detects a third difference indicating a difference between the spatial feature information converted by the first feature information converting unit 521 and the spatial feature information converted by the third feature information converting unit 533. That is, the difference detection unit 443 performs detailed matching including spatial feature information (not limited to complete matching, but may be matching based on partial matching or approximation), and detects the third difference. As described above, the spatial feature information converted by the first feature information conversion unit 521 is obtained by converting the feature information of the peripheral information of the first mark 70 in the first image into the feature information in the reference coordinate system described above. The spatial feature information converted by the third feature information converting unit 533 is obtained by converting the feature information of the peripheral information of the first mark 70 in the moving image captured by the moving camera 40 into the feature information in the reference coordinate system described above.
Similarly, the difference detecting unit 443 detects a fourth difference indicating a difference between the spatial feature information converted by the second feature information converting unit 522 and the spatial feature information converted by the fourth feature information converting unit 534. As described above, the spatial feature information converted by the second feature information conversion unit 522 is obtained by converting the feature information of the peripheral information of the second mark 80 in the second image into the feature information in the reference coordinate system. The spatial feature information converted by the fourth feature information conversion unit 534 is obtained by converting feature information of the peripheral information of the second mark 80 in the moving image captured by the moving camera 40 into spatial feature information in the reference coordinate system.
The first to fourth differences detected by the difference detection unit 443 as described above are input to the correction unit 445. The difference detection unit 443 transmits the respective values of the first difference and the third difference, or the second difference and the fourth difference, to the correction unit 445. The present invention is not limited to this, and only the first difference or the second difference may be detected and transmitted to the correction unit 445. The correction unit 445 corrects the first reference coordinate system camera information (in this example, the converted first external parameters) based on the first difference, and corrects the second reference coordinate system camera information (in this example, the converted second external parameters) based on the second difference. Alternatively, the correcting unit 445 may correct the first reference coordinate system camera information (in this example, the converted first external parameters) based on the first difference and the third difference, and may correct the second reference coordinate system camera information (in this example, the converted second external parameters) based on the second difference and the fourth difference. More specifically, the correcting unit 445 corrects the first reference coordinate system camera information and the second reference coordinate system camera information so that the first difference, the third difference, the second difference, and the fourth difference are equal to or less than the allowable value. The allowable value can be arbitrarily changed according to design conditions and the like.
When the first reference coordinate system camera information and the second reference coordinate system camera information are corrected by the correcting unit 445, the corrected first reference coordinate system camera information and second reference coordinate system camera information are input to the recalculating unit 444. In this example, when the corrected first reference coordinate system camera information and second reference coordinate system camera information are acquired from the correcting unit 445, the recalculating unit 444 acquires the converted marker information and the converted spatial feature information from the converted information acquiring unit 441. Then, the recalculation unit 444 recalculates the converted marker information and the converted spatial feature information acquired from the converted information acquisition unit 441, using the corrected first reference coordinate system camera information and second reference coordinate system camera information acquired from the correction unit 445. Then, the recalculation unit 444 inputs the recalculated marker information and spatial feature information to the difference detection unit 443. Then, the difference detection unit 443 detects the first difference and the third difference, and the second difference and the fourth difference again, and inputs them to the correction unit 445.
When the first difference, the third difference, the second difference, and the fourth difference input from the difference detection unit 443 are equal to or less than the allowable value, the correction unit 445 outputs the first reference coordinate system camera information and the second reference coordinate system camera information at that time, and outputs the converted spatial feature information (spatial feature information converted in accordance with the reference coordinate system) at that time.
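A minimal sketch of this correct / recalculate / re-detect loop is given below, posing the correction as a small least-squares refinement of the converted external parameter. The use of scipy, the tolerance value, and all helper names are assumptions of this sketch and are not prescribed by the embodiment.

import numpy as np
from scipy.optimize import least_squares

def refine_converted_extrinsics(rvec0, tvec0, recompute_corners, corners_from_moving_camera,
                                allowable=0.005):
    # rvec0, tvec0: initial converted external parameter of a fixed camera (axis-angle + translation).
    # recompute_corners(rvec, tvec) -> (N, 3): marker corners in the reference frame recomputed
    #   from that camera's image using the candidate parameter (the recalculation step).
    # corners_from_moving_camera: (N, 3) the same corners converted from the moving camera's video.
    def residual(x):
        return (recompute_corners(x[:3], x[3:]) - corners_from_moving_camera).ravel()

    x0 = np.concatenate([np.ravel(rvec0), np.ravel(tvec0)])
    sol = least_squares(residual, x0)                    # the correction step
    rvec, tvec = sol.x[:3], sol.x[3:]
    diff = float(np.mean(np.linalg.norm(
        recompute_corners(rvec, tvec) - corners_from_moving_camera, axis=1)))  # re-detect the difference
    return (rvec, tvec) if diff <= allowable else None   # None: the allowable value was not reached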
For example, the output destinations of the first reference coordinate system camera information, the second reference coordinate system camera information, and the converted spatial feature information from the correction unit 445 may be a server that provides a final service, such as an information processing device, an edge server (in-store server), or a cloud server. For example, the server can generate service information based on processing using this information and provide the service information to the user (for example, output the service information to a terminal (the moving camera 40) held by the user). As the above-described processing, various kinds of recognition processing on images of the first camera 20 or the second camera 30 after the position correction are assumed, for example, processing for determining an area in which a person or an object cannot move due to the presence of a tall obstacle, processing for determining overlap of persons or objects, processing for superimposing data such as a movement path on an image of the first camera 20 or the second camera 30, and processing for rendering a 3D space. However, the above processing is not limited to these examples.
The CPU 103 executes the program stored in the storage device 104 to realize the functions of the moving camera 40 described above. However, the present invention is not limited to this, and all or a part of the plurality of functions of the moving camera 40 described above may be realized by a dedicated hardware circuit, for example. Note that the marker information storage unit 253, the marker information storage unit 353, the marker information storage unit 453, the data storage unit 454, and the storage unit 409 may be, for example, the storage device 104. In the present embodiment, the moving camera 40 functions as a "camera information calculation device" that calculates the first reference coordinate system camera information and the second reference coordinate system camera information described above.
Fig. 13 is a flowchart showing an example of the operation of the mobile camera 40 according to the present embodiment. The specific contents of each step are as described above, and therefore, the description thereof will be appropriately omitted. The order of the steps can be changed arbitrarily, and is not limited to the example of fig. 13.
As shown in fig. 13, when the marker detection unit 404 detects the first pattern (the first marker 70) (step S1), the marker information/spatial feature information converting section 442 in the simulation section 410 converts the information on the moving camera 40 side (step S2). As described above, the marker information/spatial feature information conversion unit 442 converts the information on the position and shape of the first marker 70 detected by the marker detection unit 404 in accordance with the reference coordinate system. The marker information/spatial feature information conversion unit 442 also converts the feature information of the peripheral information of the first marker 70 detected by the spatial feature detection unit 406 in accordance with the reference coordinate system. The specific contents are as described above.
The verification unit 407 then checks whether or not the first identification information corresponding to the first pattern (the first marker 70) detected by the marker detection unit 404 exists in the data server 10 (step S3). Here, the description is continued assuming that the first identification information exists in the data server 10. Next, the information conversion section 408 acquires the first camera information, the first marker information, and the first spatial feature information from the data server 10 (step S4).
Next, the information conversion section 408 converts the information (the first camera information, the first marker information, and the first spatial feature information) acquired in step S4 in accordance with the above-described reference coordinate system (step S5). The specific contents are as described above.
Then, when the moving camera 40, having moved to a position from which the imaging area of the second camera 30 can be captured, detects the second pattern (the second marker 80) (step S6), the marker information/spatial feature information conversion section 442 converts the information on the moving camera 40 side (step S7). As described above, the marker information/spatial feature information conversion unit 442 converts the information on the position and shape of the second marker 80 detected by the marker detection unit 404 in accordance with the reference coordinate system. Further, the feature information of the peripheral information of the second marker 80 detected by the spatial feature detecting unit 406 is converted in accordance with the reference coordinate system. The specific contents are as described above.
The verification unit 407 then checks whether or not the second identification information corresponding to the second pattern (the second marker 80) detected by the marker detection unit 404 exists in the data server 10 (step S8). Here, the description is continued assuming that the second identification information exists in the data server 10. Next, the information conversion section 408 acquires the second camera information, the second marker information, and the second spatial feature information from the data server 10 (step S9).
Next, the information conversion section 408 converts the information (the camera information of the second camera 30, the second marker information, and the second spatial feature information) acquired in step S9 in accordance with the above-described reference coordinate system (step S10). The specific contents are as described above.
Next, the difference detection unit 443 in the simulation unit 410 detects the first difference or the second difference (step S11). The specific contents are as described above. When the first difference or the second difference detected in step S11 exceeds the allowable value (yes in step S12), the correction unit 445 corrects the first reference coordinate system camera information and the second reference coordinate system camera information (step S13). Then, the recalculating unit 444 recalculates the marker information and the spatial feature information on the first camera 20 or the second camera 30 side using the corrected first reference coordinate system camera information and second reference coordinate system camera information (step S14), and the processing of step S11 and after is repeated.
On the other hand, when the first difference or the second difference detected in step S11 is equal to or less than the allowable value (no in step S12), the first reference coordinate system camera information and the second reference coordinate system camera information at that time are output, and the converted spatial feature information at that time is output (step S15).
As described above, the moving camera 40 calculates the first reference coordinate system camera information indicating the position of the first camera 20 based on the first image including the first marker 70 captured by the first camera 20 and the first moving image including the first marker 70 captured by the moving camera 40. In addition, the moving camera 40 calculates second reference coordinate system camera information indicating the position of the second camera 30 based on a second image including the second marker 80 captured by the second camera 30 disposed separately from the first camera 20 and a second moving image including the second marker 80 captured by the moving camera 40. More specifically, when the moving camera 40 detects the first mark 70 from the moving image captured while moving, it calculates first reference coordinate system camera information indicating the position of the first camera 20 in a reference coordinate system with respect to the first mark 70 based on the first image. Then, when the moving camera 40 detects the second mark 80 from the moving image captured, the second reference coordinate system camera information indicating the position of the second camera 30 in the reference coordinate system is calculated based on the second image and the captured moving image. That is, in the present embodiment, the imaging area of the first camera 20 and the imaging area of the second camera 30 are recognized from the moving image of the moving camera 40, and the first camera 20 and the second camera 30 are mapped to the reference coordinate system on the moving camera 40 side, respectively.
According to the present embodiment, for example, even when it is difficult to capture a common subject in the captured images of the first camera 20 and the second camera 30, the positional relationship between the first camera 20 and the second camera 30 in the reference coordinate system can be obtained. That is, according to the present embodiment, it is possible to obtain the positional relationship of a plurality of cameras that cannot capture a common subject in each captured image.
In addition, as described above, the moving camera 40 detects the first difference (difference of the first marker 70) representing the difference between the information of the position and shape of the first marker 70 based on the first image and the information of the position and shape of the first marker 70 based on the moving image of the moving camera 40. In addition, the moving camera 40 detects a second difference (difference of the second marker 80) representing a difference between the information of the position and shape of the second marker 80 based on the second image and the information of the position and shape of the second marker 80 based on the moving image of the moving camera 40. Then, the moving camera 40 corrects the first reference coordinate system camera information and the second reference coordinate system camera information so that the first difference or the second difference is equal to or smaller than an allowable value. This allows the positional relationship between the first camera 20 and the second camera 30 in the reference coordinate system to be obtained with higher accuracy. In the present embodiment, the positions of the first camera 20 and the second camera 30 in the reference coordinate system are obtained based on the first difference or the second difference, but the present invention is not limited to this, and the positions of the first camera 20 and the second camera 30 in the reference coordinate system may be obtained based on the first difference and the third difference (difference in the peripheral information of the first marker 70) or the second difference and the fourth difference (difference in the peripheral information of the second marker 80).
The embodiments have been described above, but the camera information calculation device, the system, the camera information calculation method, and the program disclosed in the present application are not limited to the above-described embodiments, and the components may be modified and embodied in the implementation stage without departing from the scope of the invention. In addition, various inventions can be formed by appropriate combinations of a plurality of constituent elements disclosed in the above embodiments. For example, some of the components may be deleted from all the components shown in the embodiments.
The following describes modifications. For example, the following modifications may be arbitrarily combined with each other.
(1) Modification example 1
In the above-described embodiment, the moving camera 40 functions as the "camera information calculation device" that calculates the first reference coordinate system camera information and the second reference coordinate system camera information, but a device different from the moving camera 40 may function as the "camera information calculation device". For example, as shown in fig. 14, the server 100 functioning as the "camera information calculation device" may be provided separately from the mobile camera 40. As shown in fig. 14, the system 2 of the present modification includes a data server 10, a server 100, a first camera 20, a second camera 30, and a mobile camera 40. The server 100 is connected to be able to communicate with the data server 10, the first camera 20, the second camera 30, and the mobile camera 40 via the network 50, respectively.
Fig. 15 is a diagram showing an example of the hardware configuration of the server 100. In the example of fig. 15, the minimum required hardware elements are illustrated, but the present invention is not limited to this, and the server 100 may be a system including other hardware elements (for example, an input device, a display device, a camera, and the like). As shown in fig. 15, the server 100 includes a CPU110, a storage device 120, a communication I/F unit 130, and a bus 140 connecting these units to each other.
The CPU110 corresponds to an example of a hardware processor. The CPU110 controls the operation of the server 100. CPU110 executes a program stored in storage device 120 to realize various functions of server 100. The storage device 120 stores various data such as programs. For example, the storage device 120 includes a ROM, which is a nonvolatile memory for storing a program, a RAM, which is a volatile memory having an operation area of the CPU110, and the like. The communication I/F unit 130 is an interface for connecting to the network 50.
Fig. 16 is a diagram showing an example of functions of the server 100. Note that the functions of the server 100 are not limited to the example of fig. 16. As shown in fig. 16, the server 100 includes a mobile camera information acquisition unit 111, a verification unit 407, an information conversion unit 408, a storage unit 409, a simulation unit 410, and an output unit 112. In this example, some of the plurality of functions (the check unit 407, the information conversion unit 408, the storage unit 409, and the simulation unit 410) of the mobile camera 40 described in the above embodiment are mounted on the server 100.
The moving camera information acquisition unit 111 shown in fig. 16 acquires the third camera information, the third marker information, the third identification information, and the third spatial feature information of the moving camera 40 from the moving camera 40. The output unit 112 shown in fig. 16 generates service information based on processing using the first reference coordinate system camera information and the second reference coordinate system camera information calculated in the server 100, and outputs the generated service information to the mobile camera 40 (the terminal of the user).
For example, the server 100 may be constituted by one computer or by a plurality of computers (a computer group). For example, the server 100 may be configured by a plurality of computers, and the plurality of functions of the server 100 may be distributed among and mounted on the plurality of computers.
(2) Modification 2
In the above-described embodiment, the first pattern (the first marker 70) corresponding to the first identification information is described as an example of the "first subject" included in the first image captured by the first camera 20. However, the present invention is not limited to this, and for example, a mode may be adopted in which the first marker 70 is not provided and a subject other than the first marker 70 is set as the first subject. That is, a subject other than the first marker 70 that is present in the imaging area of the first camera 20 and is commonly included in the first image of the first camera 20 and the moving image of the moving camera 40, for example, an object whose size and shape are known, or a boundary between a floor and a wall, may be used as the first subject.
Similarly, in the above-described embodiment, the second pattern (the second marker 80) corresponding to the second identification information is described as an example of the "second subject" included in the second image captured by the second camera 30. However, the present invention is not limited to this, and for example, a mode may be adopted in which the second marker 80 is not provided and a subject other than the second marker 80 is set as the second subject. That is, a subject other than the second marker 80 that is present in the imaging area of the second camera 30 and is commonly included in the second image of the second camera 30 and the moving image of the moving camera 40, for example, an object whose size and shape are known, or a boundary between a floor and a wall, may be used as the second subject.
(3) Modification 3
In the above-described embodiment, two cameras that can be installed at arbitrary positions and orientations are provided, together with two markers, one corresponding to each of the two cameras; however, the numbers of cameras and markers are not limited to two. For example, three or more cameras that can be installed at arbitrary positions and orientations may be provided, together with three or more markers, one corresponding to each of those cameras. In addition, as long as at least one marker can be imaged from a camera that can be installed at an arbitrary position and orientation, the number of markers may be smaller than the number of cameras rather than one marker per camera. Even in these cases, the same method as in the above-described embodiment can be used. Thus, for each of the three or more cameras, camera information specifying the position of that camera in the reference coordinate system on the moving camera 40 side, with the marker first detected by the moving camera 40 as a reference, can be calculated.
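Purely as an illustration of this generalization (with all names, the use of 4x4 homogeneous transforms, and the data layout being assumptions of this sketch), the chaining of the moving camera's relative motions for N markers could look like this:

import numpy as np

def markers_in_reference_frame(T_mov_marker, T_mov_motion):
    # T_mov_marker[i]: pose mapping marker_i's frame into the moving camera's frame at the
    #   moment marker_i is detected.
    # T_mov_motion[i]: moving-camera motion from the detection of marker_0 to the detection
    #   of marker_i (T_mov_motion[0] is the identity).
    # Returns each marker's pose in the reference coordinate system, i.e. the frame of the
    # marker first detected by the moving camera (marker_0).
    T_ref_mov0 = np.linalg.inv(T_mov_marker[0])     # moving camera at time 0 -> reference frame
    poses = []
    for T_cam_marker, T_motion in zip(T_mov_marker, T_mov_motion):
        poses.append(T_ref_mov0 @ np.linalg.inv(T_motion) @ T_cam_marker)
    return poses

# Each fixed camera can then be placed in the reference frame from its own marker's pose,
# in the same way as for the second camera in the embodiment above.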
(4) Modification example 4
The system 1 of the above-described embodiment is applied to the inside of a store (indoors), but is not limited thereto, and may also be applied outdoors, for example. For example, a fixed camera may be provided near a traffic signal, and a vehicle equipped with a camera may be used as the moving camera.
Description of reference numerals: 1 … system; 10 … data server; 20 … a first camera; 30 … second camera; 40 … moving the camera; 50 … network; 100 … server; 111 … moving the camera information acquisition section; 112 … output; 401 … an image acquisition unit; 402 … point/line feature detection unit; 403 … internal parameter calculation section; 404 … mark detection part; 405 … an external parameter calculation unit; 406 … a spatial feature detection section; 407 … checkup section; 408 … information conversion section; 409 … storage section; 410 … analog part; 421 … calculation part; 422 … mark information conversion part; 423 … space characteristic information conversion part; 441 …, a converted information acquisition unit; 442 … mark information/spatial feature information conversion section; 443 … differential detection unit; 444 … recalculation section; 445 … correcting section; 501 … a first calculation unit; 502 … a second calculation section; 511 … a first position shape converting part; 512 … second position shape converting part; 521 … a first characteristic information converting section; 522 … second characteristic information converting section; 531 … third position shape converting part; 532 … fourth position shape converting part; 533 … a third characteristic information converting section; 534 … fourth characteristic information conversion section.

Claims (9)

1. A camera information calculation device is provided with:
a first calculation unit that calculates first camera position information indicating a position of a first camera based on a first image including a first subject captured by the first camera and a first moving image including the first subject captured by a third camera; and
and a second calculation unit that calculates second camera position information indicating a position of a second camera, based on a second image including a second subject captured by the second camera disposed separately from the first camera and a second moving image including the second subject captured by the third camera.
2. The camera information computing device of claim 1, wherein,
the first camera position information is information representing a correspondence between a reference coordinate system common to the first camera, the second camera, and the third camera and a coordinate system of the first camera,
the second camera position information is information representing a correspondence between the reference coordinate system and a coordinate system of the second camera.
3. The camera information calculation device according to claim 2, wherein:
a first conversion unit that converts information on a position and a shape of the first object included in the first image in accordance with the reference coordinate system and outputs first converted information, and that converts information on a position and a shape of the second object included in the second image in accordance with the reference coordinate system and outputs second converted information;
a second conversion unit that converts information on a position and a shape of the first object included in the first moving image in accordance with the reference coordinate system to output third converted information, and converts information on a position and a shape of the second object included in the second moving image in accordance with the reference coordinate system to output fourth converted information; and
a difference detection unit that compares the first conversion-completed information and the third conversion-completed information to detect a first difference, and compares the second conversion-completed information and the fourth conversion-completed information to detect a second difference.
4. The camera information computing device of claim 3,
the camera information calculation device includes a correction unit that corrects the first camera position information based on the first difference and corrects the second camera position information based on the second difference.
5. The camera information computing device of any one of claims 1 to 4,
at least a part of the shooting area of the first moving image overlaps at least a part of the shooting area of the second moving image.
6. A system is provided with:
a first camera that captures a first image including a first subject;
a second camera that captures a second image including a second subject; and
and a third camera that captures a first moving image including the first subject and a second moving image including the second subject, and that has a first calculation unit that calculates first camera position information indicating a position of the first camera based on the first image and the first moving image, and a second calculation unit that calculates second camera position information indicating a position of the second camera based on the second image and the second moving image.
7. A system is provided with:
a first camera that captures a first image including a first subject;
a second camera that captures a second image including a second subject;
a third camera that captures a first moving image including the first subject and a second moving image including the second subject; and
and a server having a first calculation unit that calculates first camera position information indicating a position of the first camera based on the first image and the first moving image, and a second calculation unit that calculates second camera position information indicating a position of the second camera based on the second image and the second moving image.
8. A camera information calculation method, executed by a computer:
calculating first camera position information representing a position of a first camera based on a first image including a first subject captured by the first camera and a first moving image including the first subject captured by a third camera,
second camera position information representing a position of a second camera is calculated based on a second image including a second subject captured by the second camera configured separately from the first camera and a second moving image including the second subject captured by the third camera.
9. A program for causing a computer to execute processing of:
calculating first camera position information representing a position of a first camera based on a first image including a first subject captured by the first camera and a first moving image including the first subject captured by a third camera,
second camera position information representing a position of a second camera is calculated based on a second image including a second subject captured by the second camera configured separately from the first camera and a second moving image including the second subject captured by the third camera.
CN201980089997.3A 2019-01-23 2019-01-23 Camera information calculation device, camera information calculation system, camera information calculation method, and recording medium Active CN113330275B (en)

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
PCT/JP2019/002137 WO2020152810A1 (en) 2019-01-23 2019-01-23 Camera information calculation device, system, camera information calculation method and program

Publications (2)

Publication Number Publication Date
CN113330275A true CN113330275A (en) 2021-08-31
CN113330275B CN113330275B (en) 2023-05-09

Family

ID=71736855

Family Applications (1)

Application Number Title Priority Date Filing Date
CN201980089997.3A Active CN113330275B (en) 2019-01-23 2019-01-23 Camera information calculation device, camera information calculation system, camera information calculation method, and recording medium

Country Status (4)

Country Link
US (1) US20210348915A1 (en)
JP (1) JP7310835B2 (en)
CN (1) CN113330275B (en)
WO (1) WO2020152810A1 (en)

Family Cites Families (8)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
DE60133026T2 * 2000-08-28 2009-02-19 Cognitens Ltd. Accurate alignment of images in digital picture systems by adjusting points in the pictures
JP4708752B2 * 2004-09-28 2011-06-22 Canon Inc. Information processing method and apparatus
JP4960754B2 * 2007-04-25 2012-06-27 Canon Inc. Information processing apparatus and information processing method
JP6237326B2 * 2014-02-25 2017-11-29 Fujitsu Limited Posture estimation apparatus, posture estimation method, and computer program for posture estimation
JP2015204512A 2014-04-14 2015-11-16 Panasonic Intellectual Property Management Co., Ltd. Information processing apparatus, information processing method, camera, reception device, and reception method
US11361466B2 (en) * 2018-11-30 2022-06-14 Casio Computer Co., Ltd. Position information acquisition device, position information acquisition method, recording medium, and position information acquisition system
US20220058826A1 (en) * 2018-12-27 2022-02-24 Nec Communication Systems, Ltd. Article position managing apparatus, article position management system, article position managing method, and program
JP7124840B2 * 2020-03-24 2022-08-24 Casio Computer Co., Ltd. Information processing device, information processing system, information processing method and program

Patent Citations (4)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
JP2005241323A (en) * 2004-02-25 2005-09-08 Advanced Telecommunication Research Institute International Imaging system and calibration method
CN101583969A * 2007-01-16 2009-11-18 Panasonic Corporation (inventors: Icho Keiji, Misaki Masayuki, Kawamura Takashi, Isogai Kuniaki, Kawanishi Ryouichi, Ohmiya Jun, Nishiyama Hiromichi)
US20080240612A1 (en) * 2007-03-30 2008-10-02 Intel Corporation Non-overlap region based automatic global alignment for ring camera image mosaic
WO2017149869A1 * 2016-02-29 2017-09-08 Sony Corporation Information processing device, method, program, and multi-camera system

Also Published As

Publication number Publication date
CN113330275B (en) 2023-05-09
JPWO2020152810A1 (en) 2021-12-02
JP7310835B2 (en) 2023-07-19
US20210348915A1 (en) 2021-11-11
WO2020152810A1 (en) 2020-07-30

Similar Documents

Publication Publication Date Title
CN107328420B (en) Positioning method and device
CN110568447A (en) Visual positioning method, device and computer readable medium
JP2006252473A (en) Obstacle detector, calibration device, calibration method and calibration program
JPWO2018235163A1 (en) Calibration apparatus, calibration chart, chart pattern generation apparatus, and calibration method
JP2004334819A (en) Stereo calibration device and stereo image monitoring device using same
KR20180105875A (en) Camera calibration method using single image and apparatus therefor
CN112270719B (en) Camera calibration method, device and system
CN110956660A (en) Positioning method, robot, and computer storage medium
CN112272292B (en) Projection correction method, apparatus and storage medium
CN115376109B (en) Obstacle detection method, obstacle detection device, and storage medium
US11880993B2 (en) Image processing device, driving assistance system, image processing method, and program
CN110673607B (en) Feature point extraction method and device under dynamic scene and terminal equipment
KR102490521B1 (en) Automatic calibration through vector matching of the LiDAR coordinate system and the camera coordinate system
CN111383264B (en) Positioning method, positioning device, terminal and computer storage medium
CN113330275B (en) Camera information calculation device, camera information calculation system, camera information calculation method, and recording medium
CN114611635B (en) Object identification method and device, storage medium and electronic device
JP2008224323A (en) Stereoscopic photograph measuring instrument, stereoscopic photograph measuring method, and stereoscopic photograph measuring program
CN114608521A (en) Monocular distance measuring method and device, electronic equipment and storage medium
CN111656404B (en) Image processing method, system and movable platform
JP6492603B2 (en) Image processing apparatus, system, image processing method, and program
KR101934317B1 (en) System for automatic registration of images using association analysis of linear features
CN112750165A (en) Parameter calibration method, intelligent driving method and device, equipment and storage medium thereof
WO2024001847A1 (en) 2d marker, and indoor positioning method and apparatus
CN117115242B (en) Identification method of mark point, computer storage medium and terminal equipment
WO2024009528A1 (en) Camera parameter calculation device, camera parameter calculation method, and camera parameter calculation program

Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination
GR01 Patent grant