US20210112181A1 - Image processing device, image processing method, and recording medium - Google Patents
- Publication number
- US20210112181A1 (Application US 17/131,306)
- Authority
- US
- United States
- Prior art keywords
- image
- target object
- camera
- captured image
- image processing
- Prior art date
- Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
- Abandoned
Classifications
-
- G—PHYSICS
- G08—SIGNALLING
- G08B—SIGNALLING OR CALLING SYSTEMS; ORDER TELEGRAPHS; ALARM SYSTEMS
- G08B13/00—Burglar, theft or intruder alarms
- G08B13/18—Actuation by interference with heat, light, or radiation of shorter wavelength; Actuation by intruding sources of heat, light, or radiation of shorter wavelength
- G08B13/189—Actuation by interference with heat, light, or radiation of shorter wavelength; Actuation by intruding sources of heat, light, or radiation of shorter wavelength using passive radiation detection systems
- G08B13/194—Actuation by interference with heat, light, or radiation of shorter wavelength; Actuation by intruding sources of heat, light, or radiation of shorter wavelength using passive radiation detection systems using image scanning and comparing systems
- G08B13/196—Actuation by interference with heat, light, or radiation of shorter wavelength; Actuation by intruding sources of heat, light, or radiation of shorter wavelength using passive radiation detection systems using image scanning and comparing systems using television cameras
- G08B13/19678—User interface
-
- H—ELECTRICITY
- H04—ELECTRIC COMMUNICATION TECHNIQUE
- H04N—PICTORIAL COMMUNICATION, e.g. TELEVISION
- H04N23/00—Cameras or camera modules comprising electronic image sensors; Control thereof
-
- H04N5/225—
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06T—IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
- G06T1/00—General purpose image data processing
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06T—IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
- G06T19/00—Manipulating 3D models or images for computer graphics
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06T—IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
- G06T19/00—Manipulating 3D models or images for computer graphics
- G06T19/006—Mixed reality
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06T—IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
- G06T7/00—Image analysis
- G06T7/80—Analysis of captured images to determine intrinsic or extrinsic camera parameters, i.e. camera calibration
-
- G—PHYSICS
- G08—SIGNALLING
- G08B—SIGNALLING OR CALLING SYSTEMS; ORDER TELEGRAPHS; ALARM SYSTEMS
- G08B25/00—Alarm systems in which the location of the alarm condition is signalled to a central station, e.g. fire or police telegraphic systems
- G08B25/01—Alarm systems in which the location of the alarm condition is signalled to a central station, e.g. fire or police telegraphic systems characterised by the transmission medium
- G08B25/08—Alarm systems in which the location of the alarm condition is signalled to a central station, e.g. fire or police telegraphic systems characterised by the transmission medium using communication transmission lines
-
- H—ELECTRICITY
- H04—ELECTRIC COMMUNICATION TECHNIQUE
- H04N—PICTORIAL COMMUNICATION, e.g. TELEVISION
- H04N17/00—Diagnosis, testing or measuring for television systems or their details
- H04N17/002—Diagnosis, testing or measuring for television systems or their details for television cameras
-
- H—ELECTRICITY
- H04—ELECTRIC COMMUNICATION TECHNIQUE
- H04N—PICTORIAL COMMUNICATION, e.g. TELEVISION
- H04N23/00—Cameras or camera modules comprising electronic image sensors; Control thereof
- H04N23/60—Control of cameras or camera modules
- H04N23/63—Control of cameras or camera modules by using electronic viewfinders
-
- H04N5/23293—
-
- H—ELECTRICITY
- H04—ELECTRIC COMMUNICATION TECHNIQUE
- H04N—PICTORIAL COMMUNICATION, e.g. TELEVISION
- H04N5/00—Details of television systems
- H04N5/76—Television signal recording
-
- H—ELECTRICITY
- H04—ELECTRIC COMMUNICATION TECHNIQUE
- H04N—PICTORIAL COMMUNICATION, e.g. TELEVISION
- H04N5/00—Details of television systems
- H04N5/76—Television signal recording
- H04N5/765—Interface circuits between an apparatus for recording and another apparatus
- H04N5/77—Interface circuits between an apparatus for recording and another apparatus between a recording apparatus and a television camera
-
- H—ELECTRICITY
- H04—ELECTRIC COMMUNICATION TECHNIQUE
- H04N—PICTORIAL COMMUNICATION, e.g. TELEVISION
- H04N9/00—Details of colour television systems
- H04N9/79—Processing of colour television signals in connection with recording
- H04N9/87—Regeneration of colour television signals
- H04N9/8715—Regeneration of colour television signals involving the mixing of the reproduced video signal with a non-recorded signal, e.g. a text signal
-
- G06K9/00771—
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06T—IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
- G06T2207/00—Indexing scheme for image analysis or image enhancement
- G06T2207/10—Image acquisition modality
- G06T2207/10016—Video; Image sequence
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06T—IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
- G06T2207/00—Indexing scheme for image analysis or image enhancement
- G06T2207/20—Special algorithmic details
- G06T2207/20092—Interactive image processing based on input by user
- G06T2207/20096—Interactive definition of curve of interest
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06T—IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
- G06T2207/00—Indexing scheme for image analysis or image enhancement
- G06T2207/20—Special algorithmic details
- G06T2207/20092—Interactive image processing based on input by user
- G06T2207/20101—Interactive definition of point of interest, landmark or seed
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06T—IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
- G06T2207/00—Indexing scheme for image analysis or image enhancement
- G06T2207/30—Subject of image; Context of image processing
- G06T2207/30196—Human being; Person
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06T—IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
- G06T2207/00—Indexing scheme for image analysis or image enhancement
- G06T2207/30—Subject of image; Context of image processing
- G06T2207/30232—Surveillance
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06V—IMAGE OR VIDEO RECOGNITION OR UNDERSTANDING
- G06V20/00—Scenes; Scene-specific elements
- G06V20/50—Context or environment of the image
- G06V20/52—Surveillance or monitoring of activities, e.g. for recognising suspicious objects
Definitions
- the present invention relates to an image processing technique.
- An actual size and position of a person or object appearing in a video of a monitoring camera may be calculated using information (hereinafter referred to as camera parameters) on a position and attitude (posture) of the camera and a size and position, on an image, of the person or object appearing in the video.
- NPL 1 discloses a method in which a calibration pattern is image-captured by a camera and camera parameters (a rotation and translation of the camera) indicating a position and attitude of the camera are estimated from an association relation between three-dimensional coordinates of the calibration pattern in a real world and two-dimensional coordinates of the calibration pattern of the captured image.
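- As a concrete illustration of this estimation step, the following minimal sketch pairs known three-dimensional calibration points with their observed two-dimensional image coordinates and recovers the rotation and translation of the camera. It assumes OpenCV's solvePnP and already-known internal parameters; the coordinate values are placeholders, not values from the patent.

```python
# Minimal sketch of extrinsic calibration as in NPL 1, assuming OpenCV's
# solvePnP and already-known internal parameters; all numbers are placeholders.
import numpy as np
import cv2

# Three-dimensional coordinates of calibration points in the real world
# (metres; z is the height above the ground surface).
object_points = np.array([
    [0.0, 0.0, 0.0], [1.0, 0.0, 0.0], [1.0, 1.0, 0.0],
    [0.0, 1.0, 0.0], [0.5, 0.5, 1.0], [0.0, 0.5, 1.7],
], dtype=np.float64)

# Corresponding two-dimensional coordinates observed on the captured image (pixels).
image_points = np.array([
    [320.0, 400.0], [500.0, 410.0], [520.0, 300.0],
    [310.0, 295.0], [420.0, 250.0], [330.0, 150.0],
], dtype=np.float64)

# Internal parameters: focal length and image centre (an assumed example).
K = np.array([[800.0, 0.0, 320.0],
              [0.0, 800.0, 240.0],
              [0.0, 0.0, 1.0]])
dist = np.zeros(5)  # lens distortion neglected in this sketch

# Estimate the rotation and translation of the camera from the association
# between the three-dimensional and two-dimensional coordinates.
ok, rvec, tvec = cv2.solvePnP(object_points, image_points, K, dist)
R, _ = cv2.Rodrigues(rvec)  # rotation vector -> rotation matrix
print("rotation:\n", R, "\ntranslation:", tvec.ravel())
```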
- camera parameters previously calculated by executing calibration for a camera having been a past target may be acquired, or camera parameters defined on the basis of information such as a position and attitude upon installation of the camera may be acquired.
- NPL 1 Gang Xu and Saburo Tsuji, “Three-dimensional Vision”, Kyoritsu Shuppan, pp. 79-82, 1998
- However, it is difficult for camera parameters to always appropriately indicate a position and attitude or the like of a target camera. For example, in a method for calculating camera parameters by calibration, camera parameters indicating a position and attitude different from an actual position and attitude of a camera may be calculated due to a cause such as an input error of a corresponding point, lens distortion, and the like. Further, even when already-estimated camera parameters are acquired, it is difficult to know whether the camera parameters are appropriate. For example, a position and attitude of a camera may change with elapsed time, and therefore camera parameters estimated in the past and a current position and attitude of the camera may differ from each other.
- the object of the present invention is to provide a technique enabling a user to easily confirm whether camera parameters are appropriate.
- a first image processing device includes: an input means configured to accept inputting of an operation for movement, on a captured image captured by a camera, to a first image that is superimposed on the captured image on the basis of predetermined camera parameters indicating a position and attitude of the camera and indicates a target object having a predetermined shape and a predetermined size set on a real space; and a presentation means configured to present the first image indicating the target object in a manner of view relating to a position on the captured image after the movement on the basis of the camera parameters.
- a second image processing device includes: a display means configured to display a captured image captured by a camera; a parameter acquisition means configured to acquire camera parameters indicating a position and an attitude of the camera; an input means configured to accept designation of a first position in the captured image; and a presentation means configured to, based on the camera parameters, a predetermined shape and a predetermined size on the real space of the target object, and a second position on the real space relating to the first position, present a first image indicating the target object on the captured image as appearing in a camera defined by the camera parameters upon disposing the target object in the second position.
- a third image processing device includes: a first display means configured to display a captured image captured by a camera; a parameter acquisition means configured to acquire camera parameters indicating a position and an attitude of the camera; an input means configured to accept inputting of a dot or a line relating to the captured image; and a second display means configured to display a first image indicating a situation where the dot or line mapped on a plane representing a ground surface is viewed from a direction vertical to the plane, based on the camera parameters and a position of the dot or the line on the captured image.
- a first image processing method provided by the present invention includes: an input step of accepting inputting of an operation for movement, on a captured image captured by a camera, to a first image that is superimposed on the captured image on the basis of predetermined camera parameters indicating a position and attitude of the camera and indicates a target object having a predetermined shape and a predetermined size set on a real space; and a presentation step of presenting the first image indicating the target object in a manner of view relating to a position on the captured image after the movement on the basis of the camera parameters.
- a second image processing method provided by the present invention includes: a display step of displaying a captured image captured by a camera; a parameter acquisition step of acquiring camera parameters indicating a position and an attitude of the camera; an input step of accepting designation of a first position in the captured image; and a presentation step of, based on the camera parameters, a predetermined shape and a predetermined size on the real space of the target object, and a second position on the real space relating to the first position, presenting a first image indicating the target object on the captured image as appearing in a camera defined by the camera parameters upon disposing the target object in the second position.
- a third image processing method provided by the present invention includes: a first display step of displaying a captured image captured by a camera; a parameter acquisition step of acquiring camera parameters indicating a position and an attitude of the camera; an input step of accepting inputting of a dot or a line relating to the captured image; and a second display step of displaying a first image indicating a situation where the dot or line mapped on a plane representing a ground surface is viewed from a direction vertical to the plane, based on the camera parameters and a position of the dot or the line on the captured image.
- a program provided by the present invention causes a computer to operate as the first image processing device, the second image processing device, or the third image processing device.
- a technique enabling the user to easily confirm whether camera parameters are appropriate is provided.
- FIG. 1 is a block diagram illustrating an image processing device according to a first example embodiment.
- FIG. 2 is a diagram illustrating a situation where an image processing device has presented a predetermined object on a captured image.
- FIG. 3 is a flowchart illustrating a flow of processing executed by the image processing device of the first example embodiment.
- FIG. 4 is a diagram illustrating a captured image in which a first image has been presented by a presentation unit.
- FIG. 5 is a block diagram illustrating a hardware configuration of an image processing device.
- FIG. 6 is a diagram illustrating a situation where a first image indicating a target object of a planar shape is presented on a captured image.
- FIG. 7 is a block diagram illustrating an image processing device according to a second example embodiment.
- FIG. 8 is a diagram illustrating a captured image in which a line is input via a second input unit.
- FIG. 9 is a diagram illustrating an image indicating a situation where a dotted line mapped on a plane representing a ground surface is viewed from a direction vertical to the plane.
- FIG. 10 is a diagram illustrating an image in which a position and a field of view of a camera have been presented together with a projective line illustrated in FIG. 9( a ) .
- FIG. 11 is a flowchart illustrating a flow of processing executed by the image processing device of the second example embodiment.
- FIG. 12 is a diagram illustrating a situation where error information is presented on a captured image.
- FIG. 13 is a diagram illustrating a situation where the user moves a target object on a captured image.
- FIG. 14 is a diagram illustrating a situation where a target object is moved across a plurality of areas having different heights.
- FIG. 15 is a block diagram illustrating an image processing device according to a third example embodiment.
- FIG. 16 is a flowchart illustrating a flow of processing executed by the image processing device of the third example embodiment.
- FIG. 17 is a diagram illustrating a projective line of a target object presented on a captured image on the plane representing the ground surface illustrated in FIG. 9( a ) .
- FIG. 1 is a block diagram illustrating an image processing device 2000 according to a first example embodiment.
- an arrow indicates a flow of information.
- each block does not represent a configuration of a hardware unit but represents a configuration of a function unit.
- the image processing device 2000 includes a display unit 2020 , a parameter acquisition unit 2040 , an input unit 2060 , and a presentation unit 2080 .
- the display unit 2020 displays a captured image captured by a camera.
- the parameter acquisition unit 2040 acquires camera parameters indicating a position and attitude or the like of the camera.
- the camera parameters may include a parameter other than the position and attitude of the camera.
- the parameter other than the position and attitude of the camera will be described later.
- the input unit 2060 accepts a designation of a first position on a captured image.
- the presentation unit 2080 generates a first image indicating how the target object appears in a camera defined by the camera parameters when the target object is disposed in a second position on a real space relating to the first position.
- the first image is an image indicating how the target object looks when viewed from a point of view of the camera defined by the camera parameters.
- it is possible to determine the second position on the real space from the camera parameters and height information of the first position and the second position. “Disposing a target object in a second position” means that it is assumed that the target object exists in a position (the second position) on a real space relating to the first position on the captured image.
- the presentation unit 2080 generates the first image using the camera parameters, a predetermined shape and a predetermined size on the real space of the target object, and the second position. Further, the presentation unit 2080 presents the generated first image in the first position on the captured image.
- the target object is a virtual object having a planar shape or a solid shape.
- the predetermined size and the predetermined shape set for the target object are a size and a shape assumed in the real world.
- the predetermined size and the predetermined shape may be input by the user or may be previously stored in the inside or the outside of the image processing device 2000 .
- FIG. 2 is a diagram illustrating a situation where the image processing device 2000 has presented a predetermined object on a captured image.
- the predetermined object is a rectangular parallelepiped 20 .
- FIG. 2( a ) illustrates a situation where the rectangular parallelepiped 20 is viewed at an appropriate angle.
- a size of the rectangular parallelepiped 20 is 30 cm in width and depth and 170 cm in height.
- the rectangular parallelepiped 20 in this example is an object in which a shape and size of an average person are simplified.
- FIG. 2( b ) is a diagram in which the image processing device 2000 has presented the rectangular parallelepiped 20 on a captured image 10 .
- a first position 30 indicates a first position input to the input unit 2060 .
- the presentation unit 2080 presents a first image 40 in the first position 30 .
- the first image 40 is an image indicating in a pseudo manner, when the rectangular parallelepiped 20 disposed in a position equivalent to the first position 30 in a real world is image-captured by a camera specified by camera parameters, the rectangular parallelepiped 20 appearing in the camera.
- FIG. 3 is a flowchart illustrating a flow of processing executed by the image processing device 2000 of the first example embodiment.
- the display unit 2020 displays a captured image captured by a camera.
- the input unit 2060 accepts a designation of a first position on the captured image.
- the parameter acquisition unit 2040 acquires camera parameters indicating a position and attitude or the like of the camera.
- the presentation unit 2080 generates a first image. As described above, the first image indicates a target object on a captured image upon appearing in a camera specified by the camera parameters when being disposed in a second position.
- the presentation unit 2080 presents the generated first image in the first position on the captured image.
- processing (step S 106 ) of acquiring camera parameters may be executed before processing (step S 104 ) of accepting inputting of a first position.
- the user of the image processing device 2000 views an object presented by the presentation unit 2080 , and thereby the user can easily confirm whether camera parameters appropriately indicate a position and attitude or the like of a camera (hereinafter, a real camera) having captured a captured image displayed by the display unit 2020 .
- FIG. 4 is a diagram illustrating a captured image in which a first image has been presented by the presentation unit 2080 .
- FIG. 4( a ) is a diagram in which camera parameters acquired by the parameter acquisition unit 2040 indicate a position and attitude approximate to a position and attitude of a real camera.
- FIG. 4( b ) is a diagram in which camera parameters acquired by the parameter acquisition unit 2040 indicate a position and attitude different from a position and attitude of a real camera.
- a target object in FIG. 4 is a rectangular parallelepiped having a height of 170 cm and depth and width of 30 cm in the same manner as in the case of FIG. 2 .
- the first image presented by the presentation unit 2080 is presented on a captured image as if a target object disposed in a place appearing on the captured image had been image-captured by a camera installed in the position and attitude indicated by the camera parameters. Therefore, when the camera parameters indicate a position and attitude approximate to those of the real camera, there is no feeling of strangeness in size, angle, or other aspects of the manner of view when a person, an object, or the like appearing on the captured image is compared with the first image.
- a height of the target object is, for example, 170 cm, and therefore when the target object and a person are compared, their heights are expected to be substantially the same.
- a transverse side of a person appearing on the captured image 10 is designated as a first position, and therefore the first image 40 is presented in a transverse side of the person.
- sizes of the person and the rectangular parallelepiped indicated by the first image 40 are substantially the same, resulting in no feeling of strangeness.
- in FIG. 4(b), on the other hand, a height of the rectangular parallelepiped indicated by a first image 40-10 is approximately twice the height of a person, and therefore it is difficult to say that the first image 40-10 indicates an object (the rectangular parallelepiped 20) having a height of 170 cm disposed in a place appearing on a captured image 10-2.
- further, in each first image 40 presented on the captured image 10-2, a top surface of every rectangular parallelepiped is visible, appearing as if looked down on from close proximity.
- it is conceivable that this is because a depression angle of the sight line direction of the camera indicated by the camera parameters in FIG. 4(b) has come to be larger than the depression angle of the sight line direction of the real camera.
- the user may designate a plurality of first positions and dispose a plurality of target objects within one captured image.
- the user using the image processing device 2000 compares a first image presented by the presentation unit 2080 and a captured image and thereby can easily grasp whether camera parameters acquired by the parameter acquisition unit 2040 indicate a position and attitude approximate to a position and attitude of a camera having captured the captured image.
- when the camera parameters are appropriate, the user can determine that the combination of the camera parameters and the video of the monitoring camera is usable.
- when they are not, countermeasures may be taken, such as estimating the camera parameters again or correcting the position and attitude of the real camera.
- Each function configuration unit of the image processing device 2000 may be realized by a hardware component (e.g. a hard-wired electronic circuit) that realizes each function configuration unit or may be realized by a combination between a hardware component and a software component (e.g. a combination between an electronic circuit and a program that controls the circuit).
- FIG. 5 is a block diagram illustrating a hardware configuration of the image processing device 2000 .
- the image processing device 2000 includes a bus 1020 , a processor 1040 , a memory 1060 , a storage 1080 , and an input/output interface 1100 .
- the bus 1020 is a data transmission channel in order for the processor 1040 , the memory 1060 , the storage 1080 , and the input/output interface 1100 to mutually execute data transmission/reception.
- a method for mutually connecting the processor 1040 and the like is not limited to bus connection.
- the processor 1040 is an arithmetic processing unit such as a CPU (Central Processing Unit) or a GPU (Graphics Processing Unit), for example.
- the memory 1060 is a memory such as a RAM (Random Access Memory), a ROM (Read Only Memory), or the like, for example.
- the storage 1080 is a storage device such as a hard disk, an SSD (Solid State Drive), a memory card, or the like, for example. Further, the storage 1080 may be a memory such as a RAM, a ROM, or the like.
- the input/output interface 1100 is an input/output interface in order for the image processing device 2000 to transmit/receive data between itself and an input device, an external device, or the like.
- the image processing device 2000 acquires, for example, the captured image and the first position via the input/output interface 1100. Further, the image processing device 2000 outputs, for example, a captured image presenting a first image via the input/output interface 1100.
- the storage 1080 stores a program for realizing a function of the image processing device 2000 .
- the storage stores program modules for realizing functions of the display unit 2020 , the parameter acquisition unit 2040 , the input unit 2060 , and the presentation unit 2080 , respectively.
- the processor 1040 executes these program modules and thereby realizes the functions of the display unit 2020 , the parameter acquisition unit 2040 , the input unit 2060 , and the presentation unit 2080 , respectively.
- the processor 1040 may read these modules onto the memory 1060 and execute them, or may execute them without reading them onto the memory 1060.
- each program module may be stored on the memory 1060 .
- the image processing device 2000 may not include the storage 1080 .
- camera parameters may include a parameter other than a position and attitude of a camera.
- the camera parameters include, for example, an internal parameter indicating an internal characteristic of a camera such as a focal length, lens distortion, coordinates of a center of an image, and the like.
- the position and attitude of a camera is an external parameter indicating an external characteristic of the camera.
- the camera parameters may be calculated by associating two-dimensional coordinates on a captured image with three-dimensional coordinates on a real space.
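- For reference, this association is conventionally written in the pinhole-camera form consistent with NPL 1; the notation below (K for the internal parameters, R and t for the external ones) is the standard convention and an assumption of this note, not notation taken from the patent:

```latex
% Pinhole projection: a real-space point (X, Y, Z) maps to image
% coordinates (u, v) up to a scale factor s; K holds the internal
% parameters, (R, t) the external ones (position and attitude).
s \begin{pmatrix} u \\ v \\ 1 \end{pmatrix}
  = K \, \bigl[\, R \mid t \,\bigr]
    \begin{pmatrix} X \\ Y \\ Z \\ 1 \end{pmatrix},
\qquad
K = \begin{pmatrix} f_x & 0 & c_x \\ 0 & f_y & c_y \\ 0 & 0 & 1 \end{pmatrix}
```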
- the image processing device 2000 of the present example embodiment specifies height information (the z-coordinate) of the second position on the real space and thereby uniquely determines the second position on the real space relating to the first position on the captured image.
- in the following description, an origin on the real space is set on the ground surface immediately below the camera, the x-coordinate and the y-coordinate are set in a width direction and a depth direction parallel to the ground surface, respectively, and the z-coordinate is set in a direction vertical to the ground surface.
- a technique for executing mutual transformation between coordinates on an image and coordinates on a real space using camera parameters is a known technique and is described in, for example, NPL 1. Therefore, further detailed description on this technique will be omitted.
- the parameter acquisition unit 2040 acquires camera parameters.
- the parameter acquisition unit 2040 receives, for example, camera parameters transmitted from an external device. Further, the parameter acquisition unit 2040 accepts, for example, manual inputting of camera parameters. Further, the parameter acquisition unit 2040 reads, for example, camera parameters from a storage device storing camera parameters.
- the display unit 2020 displays a captured image on a display screen such as a display and the like.
- the display screen may be a stationary display or may be a portable display included in a mobile terminal and the like.
- the input unit 2060 may accept a designation of a first position using various methods capable of specifying a position on a captured image.
- the input unit 2060 accepts, for example, an operation (a click operation or the like) for designating any position on a captured image by an input device such as a mouse and the like. Further, when a captured image is displayed on a touch panel, the input unit 2060 accepts touch inputting or the like for any position on the captured image. Further, the input unit 2060 may accept inputting of coordinates indicating a position on a captured image.
- a target object is an object having, for example, a predetermined size and shape on a real space.
- Information defining a predetermined target object that is, for example, “a rectangular parallelepiped having a height of 170 cm and depth and width of 30 cm” as described above is previously stored in the inside or outside of the image processing device 2000 .
- the presentation unit 2080 uses this predetermined target object.
- the image processing device 2000 may include a function for accepting inputting of information defining a target object.
- the device may accept information indicating both a shape and a size on a real space of the target object, or may accept information indicating only one of the shape and the size.
- in the latter case, the shape of the target object is previously determined as, for example, a rectangular parallelepiped, and a designation of the size (depth, width, and height) is accepted from the user.
- the shape of the target object is not limited to a rectangular parallelepiped.
- the target object may be, for example, conical or spherical. Further, the target object may be an object indicating a shape of a person, an animal, or the like such as an avatar and the like.
- FIG. 6 is a diagram illustrating a situation where a first image 40 indicating a target object of a planar shape is presented on a captured image 10 .
- the user designates, for example, depth and width of a plane.
- when the camera parameters are appropriate, a plane indicated by the first image presented by the presentation unit 2080 appears parallel to the ground surface.
- the user compares the first image 40 presented by the presentation unit 2080 with the ground surface appearing on the captured image 10 and checks whether a plane represented by the first image 40 is parallel to the ground surface, and thereby may easily confirm whether the camera parameters appropriately indicate the position and attitude or the like of the real camera.
- further, since the size of the plane is designated, the user may compare it with an object or the like of a known size appearing within the captured image and check for any feeling of strangeness in appearance and size, thereby easily confirming whether the camera parameters appropriately indicate a position and attitude or the like of the real camera.
- the presentation unit 2080 generates, when a target object disposed in a second position appears in a camera determined by the camera parameters, an image indicating the target object on a captured image.
- the presentation unit 2080 executes, for example, the following processing.
- the presentation unit 2080 calculates a second position on a real space relating to a first position on a target image.
- a first position (two-dimensional coordinates) on a target image does not, by itself, uniquely determine a second position (three-dimensional coordinates) on a real space relating to the first position. Therefore, the presentation unit 2080 acquires information (a z-coordinate of the second position) indicating a height of the second position.
- the presentation unit 2080 calculates three-dimensional coordinates of the second position using two-dimensional coordinates of the first position, the height information of the second position, and camera parameters. As described above, when these pieces of information are used, two-dimensional coordinates on a captured image can be transformed to three-dimensional coordinates on a real space.
- the height information of the second position can be previously provided for the presentation unit 2080 or can be supplied from the outside. Alternatively, the height information of the second position may be set as a different height for each of a plurality of areas within a target image.
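- As one possible realization of the calculation above, the following sketch intersects the viewing ray through the first position with the horizontal plane given by the height information; the world-to-camera convention x_cam = R·X + t and the helper name pixel_to_world are assumptions of the sketch.

```python
# Minimal sketch of determining the second position: the viewing ray through
# the first position (u, v) is intersected with the horizontal plane given by
# the height information. K, R, t are the camera parameters; t has shape (3,).
import numpy as np

def pixel_to_world(u, v, height, K, R, t):
    """Return the 3D point where the ray through pixel (u, v) meets z = height."""
    cam_center = -R.T @ t                                  # camera position in the world
    ray = R.T @ np.linalg.inv(K) @ np.array([u, v, 1.0])   # ray direction in the world
    s = (height - cam_center[2]) / ray[2]                  # assumes the ray is not horizontal
    return cam_center + s * ray                            # the second position (x, y, z)
```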
- the presentation unit 2080 generates a first image indicating a target object to be presented on the captured image.
- the presentation unit 2080 calculates coordinates of each apex of the target object to be presented on the captured image to generate the first image.
- the presentation unit 2080 transforms three-dimensional coordinates of each apex in which the target object is disposed in the second position on the real space to two-dimensional coordinates of each apex on the captured image, using the camera parameters.
- the presentation unit 2080 generates the first image by connecting each apex with a straight line or the like.
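- A minimal sketch of this apex-projection step, using the 30 cm × 30 cm × 170 cm rectangular parallelepiped of FIG. 2 as the target object, is given below. It assumes OpenCV's projectPoints for the three-dimensional-to-two-dimensional transformation and neglects lens distortion; the helper name and drawing colour are illustrative.

```python
# Minimal sketch of generating the first image for the rectangular
# parallelepiped of FIG. 2 (width/depth 0.3 m, height 1.7 m): each apex is
# transformed to image coordinates and the apexes are connected with lines.
import numpy as np
import cv2

def draw_target_object(image, base, K, rvec, tvec, w=0.3, d=0.3, h=1.7):
    """Overlay the projected box whose bottom face is centred at base = (x, y, z)."""
    x, y, z = base
    corners = np.array([[x + sx * w / 2, y + sy * d / 2, z + sz * h]
                        for sz in (0, 1) for sx in (-1, 1) for sy in (-1, 1)],
                       dtype=np.float64)
    pts, _ = cv2.projectPoints(corners, rvec, tvec, K, np.zeros(5))
    pts = pts.reshape(-1, 2).astype(int)
    edges = [(0, 1), (1, 3), (3, 2), (2, 0),   # bottom face
             (4, 5), (5, 7), (7, 6), (6, 4),   # top face
             (0, 4), (1, 5), (2, 6), (3, 7)]   # vertical sides
    for a, b in edges:
        cv2.line(image, tuple(pts[a]), tuple(pts[b]), (0, 255, 0), 2)
    return image
```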
- An angle of the target object disposed in the real space is arbitrary.
- the presentation unit 2080 assumes that the target object has been disposed in the second position such that, for example, in an xyz space representing the real space, a width-direction side of the target object is parallel to the x-axis, a depth-direction side thereof is parallel to the y-axis, and a height-direction side thereof is parallel to the z-axis. Directions of these sides may be previously determined, or designations therefor by the user may be accepted.
- when a target object of a planar shape is used, a depth-direction side may be matched with a line on the ground surface, thereby making it easy to determine whether the target object and the ground surface are parallel to each other.
- alternatively, a depth-direction side of the target object may be matched with a line or the like of a floor tile of a known size, whereby the size of the tile may be measured; confirming that size makes the determination still easier.
- the image processing device 2000 enables the user to rotate a target object being presented on the captured image 10 using a mouse or the like.
- the image processing device 2000 determines a direction of rotating the target object in accordance with a direction of the drag. For example, a rotation direction upon being dragged in a left direction is regarded as clockwise rotation, and a rotation direction upon being dragged in a right direction is regarded as counter-clockwise rotation. Further, the image processing device 2000 determines an angle of rotation of the target object in accordance with a distance of the drag.
- the image processing device 2000 rotates the target object on the basis of the determined direction and angle around a straight line (e.g. a straight line parallel to the z-axis), as a rotation axis, passing through the second position.
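- A short sketch of this rotation, under the same conventions as the earlier sketches, rotates the apexes of the target object about a z-parallel axis passing through the second position:

```python
# Short sketch of the drag-to-rotate operation: the apexes of the target
# object are rotated about a straight line parallel to the z-axis passing
# through the second position; the angle comes from the drag distance.
import numpy as np

def rotate_about_vertical_axis(corners, second_position, angle_rad):
    """Rotate Nx3 world-space apexes around the z-parallel line through second_position."""
    c, s = np.cos(angle_rad), np.sin(angle_rad)
    Rz = np.array([[c,  -s,  0.0],
                   [s,   c,  0.0],
                   [0.0, 0.0, 1.0]])
    pivot = np.asarray(second_position, dtype=np.float64)
    return (corners - pivot) @ Rz.T + pivot
```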
- the user disposes the depth-direction side of the target object along a line of the ground surface and compares the target object on the captured image 10 with the ground surface.
- the second position is not limited to an internal point of the target object and may be located externally.
- the presentation unit 2080 may accept an operation for moving a target object on the captured image 10 .
- the user moves the target object on the captured image 10, for example, by an operation such as "dragging on the captured image 10 with the right button of a mouse."
- the input unit 2060 repeatedly acquires a position of a moving mouse pointer as the above-described first position. This acquisition is executed, for example, at a predetermined time interval.
- the presentation unit 2080 presents, in a first position on the captured image 10 newly acquired by the input unit 2060, the first image 40 newly generated on the basis of that first position, the previously acquired camera parameters, and the height information of a second position.
- the presentation unit 2080 deletes, from the captured image 10, the first image 40 that was presented in the previously acquired first position. By doing so, from the point of view of the user, the target object appears to be moving in the space appearing on the captured image 10.
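- The repeated acquire-and-redraw behaviour described above could be wired up roughly as in the following sketch, which reuses pixel_to_world and draw_target_object from the earlier sketches; the window name, the state dictionary, and the right-button-drag binding are illustrative assumptions.

```python
# Sketch of the move operation: while the right mouse button is dragged, each
# new pointer position is treated as a new first position, the previous first
# image is discarded (by redrawing on a copy of the captured image), and a
# newly generated first image is presented there.
import cv2

def on_mouse(event, u, v, flags, state):
    if event == cv2.EVENT_MOUSEMOVE and flags & cv2.EVENT_FLAG_RBUTTON:
        frame = state["captured"].copy()  # erases the previously presented first image
        base = pixel_to_world(u, v, state["height"], state["K"], state["R"], state["t"])
        draw_target_object(frame, base, state["K"], state["rvec"], state["tvec"])
        cv2.imshow("captured image", frame)

# After cv2.imshow("captured image", captured) has created the window:
# cv2.setMouseCallback("captured image", on_mouse, state)
```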
- FIG. 13 is a diagram illustrating a situation where the user moves a target object on the captured image 10 .
- a trajectory 170 indicates a trajectory in which the user has moved a target object.
- a first position 30 - 1 to a first position 30 - 5 indicate positions on the trajectory 170 , respectively.
- a first image 40 - 1 to a first image 40 - 5 indicate first images 40 presented in the first position 30 - 1 to the first position 30 - 5 , respectively.
- the first image 40 drawn with dotted lines indicates the first image 40 having already disappeared from the captured image 10
- the first image 40 drawn with solid lines indicates the first image 40 being currently presented.
- the first image 40 - 5 is being presented and the first image 40 - 1 to the first image 40 - 4 have disappeared.
- the user moves a target object so as to pass through a transverse side of a person or the like appearing on the captured image 10 and thereby confirms whether there is no feeling of strangeness in a manner of view of the target object.
- if there is no such feeling of strangeness, the camera parameters acquired by the parameter acquisition unit 2040 indicate a position and attitude approximate to the position and attitude of the camera having captured the captured image 10.
- the user can easily verify, for various positions on the captured image 10, whether there is any feeling of strangeness in the manner of view of the target object. Specifically, when the manner of view of the target object is presented via continuous movement, its correctness, or any strangeness perceptible to human vision, is further emphasized, which makes this an effective function for verification.
- height information may be set for each area having a step.
- as illustrated by the trajectory in FIG. 14, by moving a target object across the image, the user may easily verify whether there is any feeling of strangeness in the manner of view of the target object, seamlessly including the steps.
- FIG. 7 is a block diagram illustrating an image processing device 2000 according to a second example embodiment.
- an arrow indicates a flow of information.
- each block does not represent a configuration of a hardware unit but represents a configuration of a function unit.
- the image processing device 2000 of the second example embodiment includes a display unit 2020 , a parameter acquisition unit 2040 , a second input unit 2100 , and a second display unit 2120 .
- Functions included in the display unit 2020 and the parameter acquisition unit 2040 of the present example embodiment are the same as the functions included in the display unit 2020 and the parameter acquisition unit 2040 described in the first example embodiment, respectively.
- the second input unit 2100 accepts inputting of a point or line to a captured image displayed by the display unit 2020 .
- the second display unit 2120 displays, on the basis of the camera parameters, a position on the captured image of the input point or line, and height information on a real space of the input point or line, an image indicating the point or line upon mapping on a plane parallel to a ground surface.
- the second display unit 2120 displays, when it is assumed that the input point or line within the captured image exists within a field of view of a camera having captured the captured image, an image in which the point or line assumed to exist within the field of view of the camera is mapped on the plane parallel to the ground surface.
- the second display unit 2120 may perform display on the same display or the like on which the captured image is being displayed by the display unit 2020, or on a different display or the like.
- the height information of the input point or line on the real space may be previously provided for the second display unit 2120 or may be input to the second input unit 2100 together with the point or line.
- hereinafter, a case where the height information on the real space of the input point or line is previously provided for the second display unit 2120 is described.
- the second display unit 2120 maps a point or line existing on a captured image on a plane parallel to a ground surface in a real space.
- a mapping method of a point is described below.
- the second display unit 2120 transforms two-dimensional coordinates of a point on a captured image to three-dimensional coordinates on a real space.
- three-dimensional coordinates on the real space relating to two-dimensional coordinates on the captured image are not uniquely determined by themselves. Therefore, the second display unit 2120 uses the height information of the input point. Specifically, the height of the input point on the real space is assumed to be the given height information. Thereby, the second display unit 2120 may uniquely transform two-dimensional coordinates on the captured image to three-dimensional coordinates on the real space.
- a position of the input point on the plane parallel to the ground surface on the real space is represented by a width-direction coordinate and a depth-direction coordinate (the x-coordinate and the y-coordinate except the z-coordinate indicating height) of calculated three-dimensional coordinates.
- a technique for calculating, on the basis of camera parameters, two-dimensional coordinates of a point on a captured image, and height information on a real space of the point, three-dimensional coordinates on the real space relating to the two-dimensional coordinates is a known technique. Therefore, detailed description on this technique will be omitted.
- a principle of processing of mapping a line input onto a captured image on a plane parallel to a ground surface in a real space is the same as the above-described principle of processing of mapping a point.
- the second display unit 2120 maps, for example, each of two or more points (e.g. points of both ends) existing on an input line on a plane parallel to a ground surface in a real space.
- the second display unit 2120 connects these mapped points with a line such as a straight line and the like. By doing so, the line input onto the captured image is mapped on the plane parallel to the ground surface in the real space.
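- A sketch of this mapping step is given below: each end point of the input line is back-projected onto the ground plane (height 0 m here) with the pixel_to_world helper from the earlier sketch, and the mapped points are connected and drawn as seen from directly above. The canvas size and metres-per-pixel scale are assumptions.

```python
# Sketch of the second display unit's top view: line endpoints input on the
# captured image are mapped onto the ground plane and rendered from above.
import numpy as np
import cv2

def draw_top_view(line_pixels, K, R, t, size=600, m_per_px=0.02):
    """Render input line endpoints mapped onto the ground plane, viewed from above."""
    canvas = np.full((size, size, 3), 255, np.uint8)
    ground = [pixel_to_world(u, v, 0.0, K, R, t) for (u, v) in line_pixels]

    def to_canvas(p):  # keep x (width) and y (depth); the z-coordinate is dropped
        return (int(size / 2 + p[0] / m_per_px), int(size - p[1] / m_per_px))

    for a, b in zip(ground, ground[1:]):
        cv2.line(canvas, to_canvas(a), to_canvas(b), (0, 0, 255), 2)
    return canvas
```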
- FIG. 8 is a diagram illustrating a captured image 10 in which a line has been input via the second input unit 2100 .
- a dotted line 90 represents a line input to the second input unit 2100 .
- a pattern 100 is a line drawn on a ground surface on a real world appearing on a captured image.
- a pattern 100 - 1 and a pattern 100 - 2 are lines parallel to each other on the real world.
- a border 110 is a border between a wall and the ground surface on the real world appearing on the captured image.
- a border 110 - 1 and a border 110 - 2 vertically intersect with each other on the real world.
- the second display unit 2120 maps the dotted line 90 on a plane parallel to the ground surface.
- the second display unit 2120 displays a situation where the dotted line 90 mapped on the plane parallel to the ground surface is viewed from a direction vertical to the plane.
- FIG. 9 is a diagram illustrating an image representing a situation where the dotted line 90 mapped on a plane representing a ground surface is viewed from a direction vertical to the plane.
- FIG. 9(a) is a diagram in which the camera parameters indicate a position and attitude approximate to those of the real camera. As described above, in the real world (in the place appearing on the captured image), the pattern 100-1 and the pattern 100-2 are lines drawn parallel to each other. Therefore, in FIG. 9(a), a projective line 120-1, in which the dotted line 90-1 is mapped on a plane representing the ground surface, and a projective line 120-2, in which the dotted line 90-2 is mapped on that plane, are parallel or substantially parallel to each other.
- similarly, the border 110-3 and the border 110-4 vertically intersect with each other on the real world. Therefore, in FIG. 9(a), a projective line 120-3, in which the dotted line 90-3 is mapped on the plane representing the ground surface, and a projective line 120-4, in which the dotted line 90-4 is mapped on that plane, intersect with each other vertically or at a substantially vertical angle.
- FIG. 9( b ) is a diagram in which camera parameters indicate a position and attitude different from a position and attitude of a real camera.
- the projective line 120 - 1 and the projective line 120 - 2 may not have a parallel or substantially parallel relation, or the projective line 120 - 3 and the projective line 120 - 4 may not have a vertical or substantially vertical relation.
- when the user uses, on the captured image illustrated in FIG. 8, the pattern 100 and the border 110, whose relation in the real world is known or easily predicted, and views the result displayed for them by the second display unit 2120, the user may easily confirm whether the camera parameters appropriately indicate the position and attitude of the real camera.
- the method for using a pattern and the like on a ground surface is not limited to the above-described method.
- a method for inputting a plurality of points onto the pattern 100 - 1 and confirming whether the plurality of points are disposed on a straight line is conceivable, for example.
- FIG. 17 is a diagram illustrating a projective line 180 of a target object presented on a captured image on the plane representing the ground surface illustrated in FIG. 9( a ) .
- FIG. 17(a) illustrates a case in which the projective line 180 of the target object is presented when a first image indicating a still target object is presented on a captured image (e.g. FIG. 2(b)). On the other hand, FIG. 17(b) illustrates a case in which the projective line 180 of the target object is moved in accordance with movement of the target object on a captured image while an operation for moving the target object is being executed (e.g. FIG. 13).
- a trajectory 190 represents a trajectory of movement of the projective line 180 .
- when an object whose shape is known appears on the ground surface of the captured image, a line tracing the shape may be input to the second input unit 2100.
- when the camera parameters indicate a position and attitude approximate to those of the real camera, a shape of the line displayed by the second display unit 2120 represents a shape close to the original shape of the traced object.
- for example, when a perfectly circular object is traced and the camera parameters are appropriate, the shape of the line displayed by the second display unit 2120 becomes a perfect circle or a shape close to a perfect circle. On the other hand, when the camera parameters are not appropriate, the shape of the line presented by the second display unit 2120 becomes a shape (e.g. an elliptical shape) different from a perfect circle.
- the second display unit 2120 may present a position and a field of view of a camera on an image, together with a point and a line mapped on a plane parallel to a ground surface.
- FIG. 10 is a diagram illustrating an image in which a position and a field of view of a camera are presented, together with the projective lines illustrated in FIG. 9( a ) .
- a camera position 150 represents a position of the camera
- a field of view 160 represents a field of view of the camera.
- a system setter or the like handling the image processing device 2000 of the second example embodiment views a position relation of a point and a line mapped on a plane parallel to a ground surface and thereby confirms whether camera parameters appropriately indicate a position and attitude or the like of a real camera.
- the system setter or the like may further grasp a position relation between the mapped point and line and the position and the field of view of the camera. Therefore, the system setter or the like may more easily and accurately confirm whether the camera parameters appropriately indicate the position and attitude or the like of the real camera.
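- Both overlays can be derived from the camera parameters alone; the sketch below, under the same conventions as the earlier sketches, computes the camera position as -Rᵀt and approximates the field-of-view footprint by back-projecting the four image corners with pixel_to_world (assuming every corner ray points below the horizon).

```python
# Sketch of adding the camera position and field of view to the top view.
import numpy as np

def camera_position_and_footprint(K, R, t, image_w, image_h):
    centre = -R.T @ t  # camera position in world coordinates
    corners = [(0, 0), (image_w - 1, 0), (image_w - 1, image_h - 1), (0, image_h - 1)]
    footprint = [pixel_to_world(u, v, 0.0, K, R, t) for (u, v) in corners]
    return centre, footprint  # drawn on the top-view canvas like the mapped lines
```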
- FIG. 11 is a flowchart illustrating a flow of processing executed by the image processing device 2000 of the second example embodiment. Processing executed in steps S 102 and S 106 is the same as the processing executed in steps S 102 and S 106 of FIG. 3 .
- the second input unit 2100 accepts inputting of a point or line to a captured image displayed by the display unit 2020 .
- the second display unit 2120 displays an image indicating the point or line upon mapping on a plane parallel to a ground surface.
- the user inputs, onto the captured image, a line or the like whose original shape or positional relation is easily identified, and checks whether the line or the like displayed by the second display unit 2120 satisfies that original shape or positional relation, and thereby may easily confirm whether the camera parameters appropriately indicate a position and attitude or the like of the real camera.
- FIG. 15 is a block diagram illustrating an image processing device 3000 according to a third example embodiment.
- an arrow indicates a flow of information.
- each block does not represent a configuration of a hardware unit but represents a configuration of a function unit.
- the image processing device 3000 of the third example embodiment includes an input unit 3020 and a presentation unit 3040 .
- the input unit 3020 accepts inputting of an operation for moving a first image being presented on a captured image captured by a camera.
- the first image is an image in which a target object having a predetermined shape and a predetermined size on a real space is superimposed on the captured image on the basis of predetermined camera parameters indicating a position and attitude of the camera.
- when a position on the captured image in which the first image is being presented is designated as a position A, the first image is equivalent to a first image presented by the presentation unit 2080 upon designating the position A as a first position in the image processing device 2000 of the first example embodiment.
- a target object in the third example embodiment is the same as the target object described in the first example embodiment.
- the predetermined camera parameters in the third example embodiment are the same as the camera parameters described in the first example embodiment.
- the presentation unit 3040 presents, on the basis of the camera parameters, a first image indicating a target object in a manner of view relating to a position on the captured image after the movement.
- a method in which the presentation unit 3040 presents a first image relating to a target object to be moved is the same as “the method in which the presentation unit 2080 presents the first image 40 relating to a target object to be moved on the captured image 10 ” described in the first example embodiment.
- a hardware configuration of the image processing device 3000 is the same as the hardware configuration of the image processing device 2000 .
- FIG. 16 is a flowchart illustrating a flow of processing executed by the image processing device 3000 of the third example embodiment.
- the input unit 3020 accepts inputting of an operation for movement to a first image superimposed on a captured image.
- the presentation unit 3040 presents, on the basis of camera parameters, the first image indicating the target object in a manner of view relating to a position on the captured image after the movement.
- the flow of processing illustrated in FIG. 16 is one example, and a flow of processing executed by the image processing device 3000 is not limited to the flow illustrated in FIG. 16 .
- the user moves a target object so as to pass through a transverse side of a person or the like appearing on the captured image 10 and thereby may easily confirm whether there is no feeling of strangeness in a manner of view of the target object.
- when the manner of view of a target object is presented via continuous movement, its correctness, or any strangeness perceptible to human vision, is further emphasized, which makes this an effective function for verification.
- the image processing device 2000 may include functions as described below.
- the image processing device 2000 including the following functions is expressed as an image processing device 2000 of a first modified example.
- the image processing device 2000 of the first modified example may include the functions of the image processing device 2000 of the above-described first and second example embodiments or may not include these functions.
- camera parameters are estimated by, for example, a method in which “a calibration pattern or an object equivalent thereto is image-captured by a camera, and estimation is performed on the basis of an association relation between three-dimensional coordinates of the calibration pattern in a real world and two-dimensional coordinates of the calibration pattern of the captured image” (NPL 1).
- in this method, camera parameters are calculated so as to reduce an error (re-projection error) between two-dimensional coordinates obtained by projecting, using the estimated camera parameters, three-dimensional coordinates of the calibration pattern in the real world onto the captured image and two-dimensional coordinates of the calibration pattern appearing on the captured image.
- in general, the system setter or the like views only the camera parameters as an estimation result and does not view the error, which is an interim result. In contrast, when the error is also presented, accuracy in estimation of camera parameters may be enhanced.
- for example, when positions having large errors are concentrated on an edge of a captured image, it is conceivable that the error is increased due to a cause resulting from an input error of a corresponding point or lens distortion.
- in that case, when the selection of calibration patterns is changed so as not to use a calibration pattern image-captured in a position within a predetermined distance from an edge of the image to estimate camera parameters, accuracy of the camera parameters may be enhanced.
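- A sketch of computing the per-point re-projection error and excluding calibration points near the image edge is given below, under the same OpenCV assumptions as the earlier sketches; the 40-pixel margin is illustrative.

```python
# Sketch of the per-point re-projection error and the edge filter described above.
import numpy as np
import cv2

def reprojection_errors(object_points, image_points, K, rvec, tvec, dist=None):
    """Pixel distance between projected calibration points and observed ones."""
    proj, _ = cv2.projectPoints(object_points, rvec, tvec, K,
                                np.zeros(5) if dist is None else dist)
    return np.linalg.norm(proj.reshape(-1, 2) - image_points, axis=1)

def away_from_edges(image_points, width, height, margin=40):
    """Mask selecting points farther than `margin` pixels from every image edge."""
    u, v = image_points[:, 0], image_points[:, 1]
    return (u > margin) & (u < width - margin) & (v > margin) & (v < height - margin)
```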
- FIG. 12 is a diagram illustrating a situation where information (error information 140 ) indicating an error is presented on a captured image.
- a person is used to obtain a calibration pattern.
- a line 130 connecting the feet and the head of a person substantially standing erect is used as a calibration pattern.
- the error information 140 presented in a transverse side of the line 130 indicates a re-projection error relating to the line 130 .
- the image processing device 2000 may map the calibration pattern on a ground surface on the basis of the technique described in the second example embodiment and display the error in association with the calibration pattern mapped on the ground surface.
- an input means configured to accept inputting of an operation for movement, on a captured image captured by a camera, to a first image that is superimposed on the captured image on the basis of predetermined camera parameters indicating a position and attitude of the camera and indicates a target object having a predetermined shape and a predetermined size set on a real space;
- a presentation means configured to present the first image indicating the target object in a manner of view relating to a position on the captured image after the movement on the basis of the camera parameters.
- the input means accepts an operation for the movement by repeatedly accepting a designation of a first position on the captured image
- the presentation means generates, when a certain first position is designated, on the basis of the camera parameters, a predetermined shape and a predetermined size on a real space of the target object, and a second position on the real space relating to the first position, a first image indicating the target object on the captured image appearing in a camera determined by the camera parameters when the target object is disposed in the second position and presents the generated first image in the first position on the captured image.
- a second input means configured to accept inputting of a point or line to the captured image
- a second display means configured to display a second image indicating the point or line upon mapping on a plane parallel to a ground surface, on the basis of the camera parameters, a position on the captured image of the point or line, and height information on a real space of the point or line.
- an input means configured to accept a designation of a first position on a captured image
- a presentation means configured to present, on the basis of predetermined camera parameters indicating a position and attitude of a camera, a predetermined shape and a predetermined size on a real space of a target object, and a second position on the real space relating to the first position, a first image indicating the target object on the captured image appearing in a camera determined by the camera parameters when the target object is disposed in the second position in the first position on the captured image.
- the input means accepts designations of a plurality of first positions
- the presentation means presents first images indicating a plurality of target objects relating to the plurality of first positions in respective corresponding first positions on the captured image.
- the input means repeatedly accepts a designation of the first position
- the presentation means generates, when a certain first position is designated, a first image indicating the target object disposed in a second position on a real space relating to the first position and presents the generated first image in the first position on the captured image.
- an input means configured to accept inputting of a point or line to a captured image captured by a camera
- a display means configured to display a first image indicating the point or line upon mapping on a plane parallel to a around surface, on the basis of predetermined camera parameters indicating a position and attitude of the camera, a position on the captured image of the point or line, and height information on a real space of the point or line.
- the input step accepts an operation for the movement by repeatedly accepting a designation of a first position on the captured image
- the presentation step generates, when a certain first position is designated, on the basis of the camera parameters, a predetermined shape and a predetermined size on a real space of the target object, and a second position on the real space relating to the first position, a first image indicating the target object on the captured image appearing in a camera determined by the camera parameters when the target object is disposed in the second position and presents the generated first image in the first position on the captured image.
- the input step accepts designations of a plurality of first positions
- the presentation step presents first images indicating a plurality of target objects relating to the plurality of first positions in respective corresponding first positions on the captured image.
- the input step repeatedly accepts a designation of the first position
- the presentation step generates, when a certain first position is designated, a first image indicating the target object disposed in a second position on a real space relating to the first position and presents the generated first image in the first position on the captured image.
Landscapes
- Engineering & Computer Science (AREA)
- Physics & Mathematics (AREA)
- General Physics & Mathematics (AREA)
- Multimedia (AREA)
- Signal Processing (AREA)
- Theoretical Computer Science (AREA)
- Computer Hardware Design (AREA)
- Computer Graphics (AREA)
- General Engineering & Computer Science (AREA)
- Software Systems (AREA)
- Emergency Management (AREA)
- Business, Economics & Management (AREA)
- Human Computer Interaction (AREA)
- Computer Vision & Pattern Recognition (AREA)
- Health & Medical Sciences (AREA)
- Biomedical Technology (AREA)
- General Health & Medical Sciences (AREA)
- Studio Devices (AREA)
- Closed-Circuit Television Systems (AREA)
- Image Processing (AREA)
- Processing Or Creating Images (AREA)
Abstract
Description
- The present invention relates to an image processing technique.
- As one method for monitoring facilities and the like, there is a method that uses a video of a monitoring camera installed in the facilities and the like. An actual size and position of a person or object appearing in a video of the monitoring camera may be calculated using information on a position and attitude (posture) of the camera (hereinafter, referred to as camera parameters) and the size and position of the person or object on an image of the video. Through such calculation, it is possible, when, for example, an important person (a criminal of a case or the like) appears in a video of a monitoring camera, to grasp a height and the like of the person using the video of the monitoring camera.
- Camera parameters used in the above-described purpose and the like are estimated, for example, by calibration. NPL 1 discloses a method in which a calibration pattern is image-captured by a camera and camera parameters (a rotation and translation of the camera) indicating a position and attitude of the camera are estimated from an association relation between three-dimensional coordinates of the calibration pattern in a real world and two-dimensional coordinates of the calibration pattern of the captured image.
- Further, there is a case in which already-estimated camera parameters are acquired and used. For example, camera parameters previously calculated by executing calibration on the target camera in the past may be acquired, or camera parameters defined on the basis of information such as the position and attitude of the camera upon installation may be acquired.
- NPL 1: Gang Xu and Saburo Tsuji, “Three-dimensional Vision”, Kyoritsu Shuppan, pp. 79-82, 1998
- Camera parameters do not always appropriately indicate a position and attitude or the like of the target camera. For example, in a method for calculating camera parameters by calibration, due to a cause such as an input error of a corresponding point or lens distortion, camera parameters indicating a position and attitude different from the actual position and attitude of the camera may be calculated. Further, also when already-estimated camera parameters are acquired, it is difficult to know whether the camera parameters are appropriate. For example, the position and attitude of a camera may change with the elapse of time, and therefore camera parameters estimated in the past may differ from the current position and attitude of the camera.
- When the camera parameters do not appropriately indicate a position and attitude or the like of the target camera, a problem arises in that an error occurs in a calculation result, for example, upon calculating the height of an important person appearing in a video of the above-described monitoring camera.
- The present invention has been achieved in view of the above-described problem. An object of the present invention is to provide a technique enabling a user to easily confirm whether camera parameters are appropriate.
- A first image processing device provided by the present invention includes: an input means configured to accept inputting of an operation for movement, on a captured image captured by a camera, to a first image that is superimposed on the captured image on the basis of predetermined camera parameters indicating a position and attitude of the camera and indicates a target object having a predetermined shape and a predetermined size set on a real space; and a presentation means configured to present the first image indicating the target object in a manner of view relating to a position on the captured image after the movement on the basis of the camera parameters.
- A second image processing device provided by the present invention includes: a display means configured to display a captured image captured by a camera; a parameter acquisition means configured to acquire camera parameters indicating a position and an attitude of the camera; an input means configured to accept a designation of a first position in the captured image; and a presentation means configured to present, on the basis of the camera parameters, a predetermined shape and a predetermined size on a real space of a target object, and a second position on the real space relating to the first position, a first image indicating the target object on the captured image appearing in a camera defined by the camera parameters when the target object is disposed in the second position, in the first position on the captured image.
- A third image processing device provided by the present invention includes: a first display means configured to display a captured image captured by a camera; a parameter acquisition means configured to acquire camera parameters indicating a position and an attitude of the camera; an input means configured to accept inputting of a dot or a line relating to the captured image; and a second display means configured to display, on the basis of the camera parameters and a position of the dot or the line on the captured image, a first image indicating a situation where the dot or the line mapped on a plane representing a ground surface is viewed from a direction vertical to the plane.
- A first image processing method provided by the present invention includes: an input step of accepting inputting of an operation for movement, on a captured image captured by a camera, to a first image that is superimposed on the captured image on the basis of predetermined camera parameters indicating a position and attitude of the camera and indicates a target object having a predetermined shape and a predetermined size set on a real space; and a presentation step of presenting the first image indicating the target object in a manner of view relating to a position on the captured image after the movement on the basis of the camera parameters.
- A second image processing method provided by the present invention includes: a display step of displaying a captured image captured by a camera; a parameter acquisition step of acquiring camera parameters indicating a position and an attitude of the camera; an input step of accepting a designation of a first position in the captured image; and a presentation step of presenting, on the basis of the camera parameters, a predetermined shape and a predetermined size on a real space of a target object, and a second position on the real space relating to the first position, a first image indicating the target object on the captured image appearing in a camera defined by the camera parameters when the target object is disposed in the second position, in the first position on the captured image.
- A third image processing method provided by the present invention includes: a first display step of displaying a captured image captured by a camera; a parameter acquisition step of acquiring camera parameters indicating a position and an attitude of the camera; an input step of accepting inputting of a dot or a line relating to the captured image; and a second display step of displaying, on the basis of the camera parameters and a position of the dot or the line on the captured image, a first image indicating a situation where the dot or the line mapped on a plane representing a ground surface is viewed from a direction vertical to the plane.
- A program provided by the present invention causes a computer to operate as the first image processing device, the second image processing device, or the third image processing device.
- According to the present invention, a technique enabling the user to easily confirm whether camera parameters are appropriate is provided.
- The above-described object and other objects as well as features and advantages will become further apparent from the following description of preferred example embodiments and the following accompanying drawings.
- FIG. 1 is a block diagram illustrating an image processing device according to a first example embodiment.
- FIG. 2 is a diagram illustrating a situation where an image processing device has presented a predetermined object on a captured image.
- FIG. 3 is a flowchart illustrating a flow of processing executed by the image processing device of the first example embodiment.
- FIG. 4 is a diagram illustrating a captured image in which a first image has been presented by a presentation unit.
- FIG. 5 is a block diagram illustrating a hardware configuration of an image processing device.
- FIG. 6 is a diagram illustrating a situation where a first image indicating a target object of a planar shape is presented on a captured image.
- FIG. 7 is a block diagram illustrating an image processing device according to a second example embodiment.
- FIG. 8 is a diagram illustrating a captured image in which a line is input via a second input unit.
- FIG. 9 is a diagram illustrating an image indicating a situation where a dotted line mapped on a plane representing a ground surface is viewed from a direction vertical to the plane.
- FIG. 10 is a diagram illustrating an image in which a position and a field of view of a camera have been presented together with a projective line illustrated in FIG. 9(a).
- FIG. 11 is a flowchart illustrating a flow of processing executed by the image processing device of the second example embodiment.
- FIG. 12 is a diagram illustrating a situation where error information is presented on a captured image.
- FIG. 13 is a diagram illustrating a situation where the user moves a target object on a captured image.
- FIG. 14 is a diagram illustrating a situation where a target object is moved across a plurality of areas having different heights.
- FIG. 15 is a block diagram illustrating an image processing device according to a third example embodiment.
- FIG. 16 is a flowchart illustrating a flow of processing executed by the image processing device of the third example embodiment.
- FIG. 17 is a diagram illustrating a projective line of a target object presented on a captured image on the plane representing the ground surface illustrated in FIG. 9(a).
- Hereinafter, example embodiments of the present invention will be described using the accompanying drawings. In all the drawings, the same components are assigned with the same reference signs, and description thereof will be omitted, as appropriate.
- FIG. 1 is a block diagram illustrating an image processing device 2000 according to a first example embodiment. In FIG. 1, an arrow indicates a flow of information. Further, in FIG. 1, each block does not represent a configuration of a hardware unit but represents a configuration of a function unit.
- The image processing device 2000 includes a display unit 2020, a parameter acquisition unit 2040, an input unit 2060, and a presentation unit 2080.
- The display unit 2020 displays a captured image captured by a camera. The parameter acquisition unit 2040 acquires camera parameters indicating a position and attitude or the like of the camera. The camera parameters may include a parameter other than the position and attitude of the camera. The parameter other than the position and attitude of the camera will be described later.
- The input unit 2060 accepts a designation of a first position on a captured image. The presentation unit 2080 generates a first image indicating the target object on the captured image as appearing in a camera defined by the camera parameters upon disposing the target object in a second position on a real space relating to the first position. In other words, the first image is an image indicating how the target object looks when viewed from a point of view of the camera defined by the camera parameters. Further, it is possible to determine the second position on the real space from the camera parameters and height information of the first position and the second position. "Disposing a target object in a second position" means that it is assumed that the target object exists in a position (the second position) on the real space relating to the first position on the captured image. The presentation unit 2080 generates the first image using the camera parameters, a predetermined shape and a predetermined size on the real space of the target object, and the second position. Further, the presentation unit 2080 presents the generated first image in the first position on the captured image. The target object is a virtual object having a planar shape or a solid shape. The predetermined size and the predetermined shape set for the target object are a size and a shape assumed in the real world. The predetermined size and the predetermined shape may be input by the user or may be previously stored in the inside or the outside of the image processing device 2000.
- Using FIG. 2, specific description will be made. FIG. 2 is a diagram illustrating a situation where the image processing device 2000 has presented a predetermined object on a captured image. In FIG. 2, the predetermined object is a rectangular parallelepiped 20. FIG. 2(a) illustrates a situation where the rectangular parallelepiped 20 is viewed at an appropriate angle. As illustrated in FIG. 2(a), the size of the rectangular parallelepiped 20 is 30 cm in width and depth and 170 cm in height. The rectangular parallelepiped 20 in this example is an object in which the shape and size of an average person are simplified.
- FIG. 2(b) is a diagram in which the image processing device 2000 has presented the rectangular parallelepiped 20 on a captured image 10. A first position 30 indicates a first position input to the input unit 2060. The presentation unit 2080 presents a first image 40 in the first position 30. The first image 40 is an image indicating, in a pseudo manner, how the rectangular parallelepiped 20 would appear when the rectangular parallelepiped 20 disposed in a position equivalent to the first position 30 in the real world is image-captured by a camera specified by the camera parameters.
- FIG. 3 is a flowchart illustrating a flow of processing executed by the image processing device 2000 of the first example embodiment. In step S102, the display unit 2020 displays a captured image captured by a camera. In step S104, the input unit 2060 accepts a designation of a first position on the captured image. In step S106, the parameter acquisition unit 2040 acquires camera parameters indicating a position and attitude or the like of the camera. In step S108, the presentation unit 2080 generates a first image. As described above, the first image indicates the target object on the captured image as appearing in a camera specified by the camera parameters when the target object is disposed in a second position. In step S110, the presentation unit 2080 presents the generated first image in the first position on the captured image.
- The flow of processing illustrated in FIG. 3 is one example, and a flow of processing executed by the image processing device 2000 is not limited to the flow illustrated in FIG. 3. For example, the processing of acquiring camera parameters (step S106) may be executed before the processing of accepting inputting of a first position (step S104).
- According to the present example embodiment, the user of the image processing device 2000 views an object presented by the presentation unit 2080, and thereby the user can easily confirm whether the camera parameters appropriately indicate a position and attitude or the like of the camera (hereinafter, a real camera) having captured the captured image displayed by the display unit 2020. Hereinafter, using FIG. 4, detailed description will be made.
- FIG. 4 is a diagram illustrating a captured image in which a first image has been presented by the presentation unit 2080. FIG. 4(a) is a diagram in which the camera parameters acquired by the parameter acquisition unit 2040 indicate a position and attitude approximate to the position and attitude of the real camera. On the other hand, FIG. 4(b) is a diagram in which the camera parameters acquired by the parameter acquisition unit 2040 indicate a position and attitude different from the position and attitude of the real camera. The target object in FIG. 4 is a rectangular parallelepiped having a height of 170 cm and depth and width of 30 cm, in the same manner as in the case of FIG. 2.
- The first image presented by the presentation unit 2080 is presented on a captured image as if a target object disposed in a place appearing on the captured image had been image-captured by a camera installed in the position and attitude indicated by the camera parameters. Therefore, when the camera parameters indicate a position and attitude approximate to those of the real camera, there is no feeling of strangeness in the size and angle of the first image when it is compared with a person, an object, or the like appearing on the captured image. Since the height of the target object is, for example, 170 cm, the target object and a person are expected to appear at substantially the same height when compared.
- In FIG. 4, a transverse side of a person appearing on the captured image 10 is designated as a first position, and therefore the first image 40 is presented at a transverse side of the person. In FIG. 4(a), in any position, the sizes of a person and the rectangular parallelepiped indicated by the first image 40 are substantially the same, resulting in no feeling of strangeness. Further, in FIG. 4(a), in the same manner as a person, a wall, and the like appear when looked down on from a front-diagonally upward side, the rectangular parallelepiped indicated by each first image 40 is also looked down on from a front-diagonally upward side, and therefore there is no feeling of strangeness also in a manner of view depending on the angle of each rectangular parallelepiped.
- In contrast, in FIG. 4(b), there is a feeling of strangeness in the manner of view caused by the size and angle of the rectangular parallelepiped indicated by the first image 40. For example, the height of the rectangular parallelepiped indicated by a first image 40-10 is approximately twice the height of a person, and therefore it is difficult to say that the first image 40-10 indicates an object (the rectangular parallelepiped 20) having a height of 170 cm disposed in a place appearing on a captured image 10-2. Further, differently from the rectangular parallelepipeds indicated by the first images 40 presented on a captured image 10-1, in the captured image 10-2, the top surface of every rectangular parallelepiped is visible and appears as if looked down on from close proximity. In this manner, from the size and angle of each rectangular parallelepiped indicated by the first image 40, it is predictable that the depression angle of the sight line direction of the camera indicated by the camera parameters in FIG. 4(b) is larger than the depression angle of the sight line direction of the real camera.
- As illustrated in FIG. 4, the user may designate a plurality of first positions and dispose a plurality of target objects within one captured image.
- As described above, according to the image processing device 2000 of the present example embodiment, the user using the image processing device 2000 compares a first image presented by the presentation unit 2080 and a captured image and thereby can easily grasp whether the camera parameters acquired by the parameter acquisition unit 2040 indicate a position and attitude approximate to those of the camera having captured the captured image. When it is confirmed that such a position and attitude are indicated, the user can determine that the combination of the camera parameters and the video of the monitoring camera is usable. Conversely, when it is confirmed that such a position and attitude are not indicated, countermeasures may be taken, such as estimating the camera parameters again or correcting the position and attitude of the real camera.
- Hereinafter, the image processing device 2000 of the present example embodiment will be described in more detail.
- Each function configuration unit of the image processing device 2000 may be realized by a hardware component (e.g. a hard-wired electronic circuit) that realizes each function configuration unit or may be realized by a combination of a hardware component and a software component (e.g. a combination of an electronic circuit and a program that controls the circuit).
- FIG. 5 is a block diagram illustrating a hardware configuration of the image processing device 2000. The image processing device 2000 includes a bus 1020, a processor 1040, a memory 1060, a storage 1080, and an input/output interface 1100. The bus 1020 is a data transmission channel in order for the processor 1040, the memory 1060, the storage 1080, and the input/output interface 1100 to mutually transmit and receive data. However, a method for mutually connecting the processor 1040 and the like is not limited to bus connection. The processor 1040 is an arithmetic processing unit such as a CPU (Central Processing Unit) or a GPU (Graphics Processing Unit), for example. The memory 1060 is a memory such as a RAM (Random Access Memory) or a ROM (Read Only Memory), for example. The storage 1080 is a storage device such as a hard disk, an SSD (Solid State Drive), or a memory card, for example. Further, the storage 1080 may be a memory such as a RAM or a ROM. The input/output interface 1100 is an interface in order for the image processing device 2000 to transmit/receive data between itself and an input device, an external device, or the like. The image processing device 2000 acquires, for example, the captured image and the first position via the input/output interface 1100. Further, the image processing device 2000 outputs, for example, a captured image presenting a first image via the input/output interface 1100.
- The storage 1080 stores a program for realizing the functions of the image processing device 2000. Specifically, the storage stores program modules for realizing the functions of the display unit 2020, the parameter acquisition unit 2040, the input unit 2060, and the presentation unit 2080, respectively. The processor 1040 executes these program modules and thereby realizes the functions of the display unit 2020, the parameter acquisition unit 2040, the input unit 2060, and the presentation unit 2080, respectively. When executing the modules, the processor 1040 may read the modules onto the memory 1060 and execute them or may execute them without reading them onto the memory 1060.
- The hardware configuration of the image processing device 2000 is not limited to the configuration illustrated in FIG. 5. For example, each program module may be stored on the memory 1060. In this case, the image processing device 2000 may not include the storage 1080.
- When camera parameters are used, mutual transformation between the two-dimensional coordinates on the captured image and the three-dimensional coordinates on a real space may be made. However, it is difficult that the two-dimensional coordinates on the captured image uniquely determines, by itself, the three-dimensional coordinates on the real space relating to the two-dimensional coordinates. To uniquely determine the three-dimensional coordinates on the real space relating to the two-dimensional coordinates on the captured image, it is necessary to specify, for example, any one of an x-coordinate, a y-coordinate, and a z-coordinate of the three-dimensional coordinates. The
image processing device 2000 of the present example embodiment specifies height information (the z-coordinate) of the second position on the real space and thereby uniquely determines the second position on the real space relating to the first position on the captured image. In the present example embodiment, an origin on the real space is set on a ground surface immediately below a camera, the x-coordinate and the y-coordinate are set in a width direction and a depth direction parallel to the ground surface, respectively, and the z-coordinate is set in a direction vertical to the ground surface to make description. A technique for executing mutual transformation between coordinates on an image and coordinates on a real space using camera parameters is a known technique and is described in, for example, NPL 1. Therefore, further detailed description on this technique will be omitted. - There are various methods in which the
parameter acquisition unit 2040 acquires camera parameters. Theparameter acquisition unit 2040 receives, for example, camera parameters transmitted from an external device. Further, theparameter acquisition unit 2040 accepts, for example, manual inputting of camera parameters. Further, theparameter acquisition unit 2040 reads, for example, camera parameters from a storage device storing camera parameters. - The
display unit 2020 displays a captured image on a display screen such as a display and the like. The display screen may be a stationary display or may be a portable display included in a mobile terminal and the like. - The
input unit 2060 may accept a designation of a first position using various methods capable of specifying a position on a captured image. Theinput unit 2060 accepts, for example, an operation (a click operation or the like) for designating any position on a captured image by an input device such as a mouse and the like. Further, when a captured image is displayed on a touch panel, theinput unit 2060 accepts touch inputting or the like for any position on the captured image. Further, theinput unit 2060 may accept inputting of coordinates indicating a position on a captured image. - A target object is an object having, for example, a predetermined size and shape on a real space. Information defining a predetermined target object that is, for example, “a rectangular parallelepiped having a height of 170 cm and depth and width of 30 cm” as described above is previously stored in the inside or outside of the
image processing device 2000. In this case, thepresentation unit 2080 uses this predetermined handling object. - Further, the
image processing device 2000 may include a function for accepting inputting of information defining a target object. In this case, the device may accept information indicating both a shape and a size on a real space of the target object or may accept information indicating only any one of the shape and the size. In the latter case, the shape of the target object is previously determined as a shape of a rectangular parallelepiped, for example, and a designation of the size (depth and width and a height) is accepted from the user. - The shape of the target object is not limited to a rectangular parallelepiped. The target object may be, for example, conical or spherical. Further, the target object may be an object indicating a shape of a person, an animal, or the like such as an avatar and the like.
- Further, the target object may have a planar shape.
FIG. 6 is a diagram illustrating a situation where afirst image 40 indicating a target object of a planar shape is presented on a capturedimage 10. In this case, the user designates, for example, depth and width of a plane. When camera parameters appropriately indicate a position and attitude or the like of a real camera, a first image presented by thepresentation unit 2080 becomes parallel to a ground surface. The user compares thefirst image 40 presented by thepresentation unit 2080 with the ground surface appearing on the capturedimage 10 and checks whether a plane represented by thefirst image 40 is parallel to the ground surface, and thereby may easily confirm whether the camera parameters appropriately indicate the position and attitude or the like of the real camera. Further, the size of the plane is designated, and therefore, when an object or the like of a known size appearing within a captured image is compared with an appearance and a size on an image and a feeling of strangeness is confirmed, it is possible to easily confirm whether the camera parameters appropriately indicate a position and attitude or the like of the real camera. - As described above, the
presentation unit 2080 generates, when a target object disposed in a second position appears in a camera determined by the camera parameters, an image indicating the target object on a captured image. Thepresentation unit 2080 executes, for example, the following processing. - First, the
presentation unit 2080 calculates a second position on a real space relating to a first position on a target image. As described above, it is difficult that a first position (two-dimensional coordinates) on a target image uniquely determines, by itself, a second position (three-dimensional coordinates) on a real space relating to the first position. Therefore, thepresentation unit 2080 acquires information (a z-coordinate of the second position) indicating a height of the second position. The height information of the second position indicates, for example, a height (z=0) of a ground surface on the real space. When the height information of the second position is specified in this manner, a position on the real space relating to the first position on the target image is uniquely determined. Thepresentation unit 2080 calculates three-dimensional coordinates of the second position using two-dimensional coordinates of the first position, the height information of the second position, and camera parameters. As described above, when these pieces of information are used, two-dimensional coordinates on a captured image can be transformed to three-dimensional coordinates on a real space. The height information of the second position can be previously provided for thepresentation unit 2080 or can be supplied from the outside. Alternatively, the height information of the second position may be set as a different height for each of a plurality of areas within a target image. - The
presentation unit 2080 generates a first image indicating a target object to be presented on the captured image. When the target object has, for example, a shape of a rectangular parallelepiped or a cone, thepresentation unit 2080 calculates coordinates of each apex of the target object to be presented on the captured image to generate the first image. Specifically, thepresentation unit 2080 transforms three-dimensional coordinates of each apex in which the target object is disposed in the second position on the real space to two-dimensional coordinates of each apex on the captured image, using the camera parameters. Thepresentation unit 2080 generates the first image by connecting each apex with a straight line or the like. - An angle of the target object disposed in the real space is optional. The
presentation unit 2080 assumes that the target object has been disposed in the second position such that, for example, in an xyz space representing the real space, a width-direction side of the target object is parallel to the x-axis, a depth-direction side thereof is parallel to the y-axis, and a height-direction side thereof is parallel to the z-axis. Directions of these sides may be previously determined, or designations therefor by the user may be accepted. When, for example, in the capturedimage 10 ofFIG. 6 , a target object of a planar shape is used, a depth-direction side is matched with a line on a ground surface, and thereby it becomes possible to easily determine whether the target object and the ground surface are parallel to each other. In addition thereto, when, for example, a lattice of a predetermined width on a real space is drawn in a target object, a depth-direction side of the target object is matched with a line or the like of a tile having a known size of a floor face, and thereby a size of the tile may be measured. The size is confirmed, and thereby determination is more easily performed. Therefore, theimage processing device 2000, for example, enables the user to rotate a target object being presented on the capturedimage 10 using a mouse or the like. When, for example, the target object being presented on the capturedimage 10 or a periphery thereof has been dragged by a mouse or the like, theimage processing device 2000 determines a direction of rotating the target object in accordance with a direction of the drag. For example, a rotation direction upon being dragged in a left direction is regarded as clockwise rotation, and a rotation direction upon being dragged in a right direction is regarded as counter-clockwise rotation. Further, theimage processing device 2000 determines an angle of rotation of the target object in accordance with a distance of the drag. In this case, a relation between a distance of a drag and an angle of rotation is previously defined. Theimage processing device 2000 rotates the target object on the basis of the determined direction and angle around a straight line (e.g. a straight line parallel to the z-axis), as a rotation axis, passing through the second position. The user disposes the depth-direction side of the target object along a line of the ground surface and compares the target object on the capturedimage 10 with the ground surface. The second position is not limited to an internal point of the target object and may be located externally. - Further, the
presentation unit 2080 may accept an operation for moving a target object on the capturedimage 10. The user moves the target object on the capturedimage 10, for example, by an operation such as “dragging on the capturedimage 10 by the right button of a mouse.” In this case, theinput unit 2060 repeatedly acquires a position of a moving mouse pointer as the above-described first position. This acquisition is executed, for example, at a predetermined time interval. Thepresentation unit 2080 presents, in a first position on the capturedimage 10 newly acquired by theinput unit 2060, thefirst image 40 newly generated on the basis of the first position, a fixedly obtained camera parameters, and height information of a second position. Further, thepresentation unit 2080 deletes, from the capturedimage 10, thefirst image 40 having been presented in a first position acquired before the first position. By doing so, from a point of view of the user, the target object appears to be moving on a space appearing on the capturedimage 10. -
- FIG. 13 is a diagram illustrating a situation where the user moves a target object on the captured image 10. In FIG. 13, a trajectory 170 indicates a trajectory in which the user has moved a target object. A first position 30-1 to a first position 30-5 indicate positions on the trajectory 170, respectively. A first image 40-1 to a first image 40-5 indicate first images 40 presented in the first position 30-1 to the first position 30-5, respectively. A first image 40 drawn with dotted lines indicates a first image 40 having already disappeared from the captured image 10, and a first image 40 drawn with solid lines indicates a first image 40 being currently presented. In FIG. 13, since the currently designated first position 30 is the first position 30-5, the first image 40-5 is being presented and the first image 40-1 to the first image 40-4 have disappeared.
FIG. 13 , for example, the user moves a target object so as to pass through a transverse side of a person or the like appearing on the capturedimage 10 and thereby confirms whether there is no feeling of strangeness in a manner of view of the target object. In the case ofFIG. 13 , when there is no feeling of strangeness in a size and direction of the target object even upon moving the target object to a transverse side of any person, it is conceivable that camera parameters acquired by theparameter acquisition unit 2040 indicate a position and attitude approximate to a position and attitude of a camera having captured the capturedimage 10. When such a moving operation is provided, the user can easily verify, for various positions on the capturedimage 10, whether there is no feeling of strangeness in a manner of view of the target object. Specifically, when a manner of view of the target object is provided via continuous movement, rightfulness and a feeling of strangeness based on human visual sense is further emphasized, resulting in an effective function for verification. - Further, as illustrated in
FIG. 14 , for example, in a captured image in which areas having a step as in stairs appear, height information may be set for each area having a step. In this case, as a trajectory is illustrated inFIG. 14 , by moving a target object on an image, the user may easily verify whether there is no feeling of strangeness in a manner of view of the target object seamlessly including the steps. -
- FIG. 7 is a block diagram illustrating an image processing device 2000 according to a second example embodiment. In FIG. 7, an arrow indicates a flow of information. Further, in FIG. 7, each block does not represent a configuration of a hardware unit but represents a configuration of a function unit.
- The image processing device 2000 of the second example embodiment includes a display unit 2020, a parameter acquisition unit 2040, a second input unit 2100, and a second display unit 2120. Functions included in the display unit 2020 and the parameter acquisition unit 2040 of the present example embodiment are the same as the functions included in the display unit 2020 and the parameter acquisition unit 2040 described in the first example embodiment, respectively.
- The
second input unit 2100 accepts inputting of a point or line to a captured image displayed by thedisplay unit 2020. Thesecond display unit 2120 displays, on the basis of camera parameters, a position on the captured image of the input point or line, and height information on a real space of the input point or line, an image indicating the point or line upon mapping on a plane parallel to a around surface. In other words, thesecond display unit 2120 displays, when it is assumed that the input point or line within the captured image exists within a field of view of a camera having captured the captured image, an image in which the point or line assumed to exist within the field of view of the camera is mapped on the plane parallel to the ground surface. Thesecond display unit 2120 may perform display for the same display as a display or the like on which a captured image is being displayed by thedisplay unit 2020 or may perform display for a different display or the like. - The height information of the input point or line on the real space may be previously provided for the
second display unit 2120 or may be input to thesecond input unit 2100 together with the point or line. When the height information on the real space of the input point or line is previously provided for thesecond display unit 2120, the height information is set as, for example, a height (e.g. height information (z-coordinate)=0) of a ground surface on the real space. - As described above, the
second display unit 2120 maps a point or line existing on a captured image on a plane parallel to a ground surface in a real space. First, a mapping method of a point is described below. Thesecond display unit 2120 transforms two-dimensional coordinates of a point on a captured image to three-dimensional coordinates on a real space. As described above, three-dimensional coordinates on the real space relating to two-dimensional coordinates on the captured image are not uniquely determined. Therefore, thesecond display unit 2120 uses height information of the input point. Specifically, it is assumed that the height information on the real space of the input point is given height information. Thereby, thesecond display unit 2120 may uniquely transform two-dimensional coordinates on the captured image to three-dimensional coordinates on the real space. A position of the input point on the plane parallel to the ground surface on the real space is represented by a width-direction coordinate and a depth-direction coordinate (the x-coordinate and the y-coordinate except the z-coordinate indicating height) of calculated three-dimensional coordinates. - As described in the first example embodiment, a technique for calculating, on the basis of camera parameters, two-dimensional coordinates of a point on a captured image, and height information on a real space of the point, three-dimensional coordinates on the real space relating to the two-dimensional coordinates is a known technique. Therefore, detailed description on this technique will be omitted.
- A principle of processing of mapping a line input onto a captured image on a plane parallel to a ground surface in a real space is the same as the above-described principle of processing of mapping a point. The
second display unit 2120 maps, for example, each of two or more points (e.g. points of both ends) existing on an input line on a plane parallel to a ground surface in a real space. Thesecond display unit 2120 connects these mapped points with a line such as a straight line and the like. By doing so, the line input onto the captured image is mapped on the plane parallel to the ground surface in the real space. - Hereinafter, a utilization method of the
image processing device 2000 of the second example embodiment will be described.
- The user of the image processing device 2000 inputs to the second input unit 2100, for example, a pattern in the real world or a line tracing a border between a wall and a ground surface. FIG. 8 is a diagram illustrating a captured image 10 in which a line has been input via the second input unit 2100. A dotted line 90 represents a line input to the second input unit 2100. A pattern 100 is a line drawn on a ground surface in the real world appearing on the captured image. A pattern 100-1 and a pattern 100-2 are lines parallel to each other in the real world. A border 110 is a border between a wall and the ground surface in the real world appearing on the captured image. A border 110-1 and a border 110-2 vertically intersect with each other in the real world.
- The second display unit 2120 maps the dotted line 90 on a plane parallel to the ground surface. The second display unit 2120 displays a situation where the dotted line 90 mapped on the plane parallel to the ground surface is viewed from a direction vertical to the plane. FIG. 9 is a diagram illustrating an image representing a situation where the dotted line 90 mapped on a plane representing a ground surface is viewed from a direction vertical to the plane. FIG. 9(a) is a diagram in which the camera parameters indicate a position and attitude approximate to those of the real camera. As described above, in the real world (in the place appearing on the captured image), the pattern 100-1 and the pattern 100-2 are lines drawn parallel to each other. Therefore, in FIG. 9(a), in which the camera parameters indicate a position and attitude approximate to those of the real camera, a projective line 120-1 in which a dotted line 90-1 is mapped on the plane representing the ground surface and a projective line 120-2 in which a dotted line 90-2 is mapped on the plane representing the ground surface are parallel or substantially parallel to each other. Further, as described above, in the real world, a border 110-3 and a border 110-4 vertically intersect with each other. Therefore, in FIG. 9(a), a projective line 120-3 in which a dotted line 90-3 is mapped on the plane representing the ground surface and a projective line 120-4 in which a dotted line 90-4 is mapped on the plane representing the ground surface intersect with each other vertically or at a substantially vertical angle.
- On the other hand, FIG. 9(b) is a diagram in which the camera parameters indicate a position and attitude different from those of the real camera. In this case, the projective line 120-1 and the projective line 120-2 may not have a parallel or substantially parallel relation, or the projective line 120-3 and the projective line 120-4 may not have a vertical or substantially vertical relation.
- In this manner, when the user using the captured image illustrated in
FIG. 8 uses thepattern 100 and the border 110 in which a relation in a real world is known or easily predicted and views a result in which these are displayed by thesecond display unit 2120, the user may easily confirm whether camera parameters appropriately indicate a position and attitude of a real camera. - The method for using a pattern and the like on a ground surface is not limited to the above-described method. A method for inputting a plurality of points onto the pattern 100-1 and confirming whether the plurality of points are disposed on a straight line is conceivable, for example.
- Further, the
image processing device 2000 of the present example embodiment may map and present, on the plane, a target object being presented on a captured image in the first example embodiment.FIG. 17 is a diagram illustrating aprojective line 180 of a target object presented on a captured image on the plane representing the ground surface illustrated inFIG. 9(a) .FIG. 17(a) is a case in which theprojective line 180 of the target object is presented when a first image indicating a still target object is presented on a captured image (e.g.FIG. 2(b) ). On the other hand,FIG. 17(b) is a case in which theprojective line 180 of the target object is moved in accordance with movement of the target object on a captured image when an operation for moving the target object is being executed (e.g.FIG. 13 ). A trajectory 190 represents a trajectory of movement of theprojective line 180. - Further, when an object (a manhole or the like) in which an original shape is understandable appears on a ground surface of a captured image, a line tracing the shape may be input to the
second input unit 2100. When camera parameters indicate a position and attitude approximate to a position and attitude of a real camera, a shape of a line displayed by thesecond display unit 2120 represents a shape close to an original shape of a traced object. When, for example, a line is input so as to trace a manhole appearing on a captured image, a shape of the line displayed by thesecond display unit 2120 becomes a perfect circle or a shape close to a perfect circle. On the other hand, when camera parameters indicate a position and attitude different from a position and attitude of a real camera, a shape of a line presented by thesecond display unit 2120 becomes a shape (e.g. an elliptical shape) different from a perfect circle. - Further, the
second display unit 2120 may present a position and a field of view of a camera on an image, together with a point and a line mapped on a plane parallel to a ground surface.FIG. 10 is a diagram illustrating an image in which a position and a field of view of a camera are presented, together with the projective lines illustrated inFIG. 9(a) . InFIG. 10 , acamera position 150 represents a position of the camera, and a field ofview 160 represents a field of view of the camera. - A system setter or the like handling the
image processing device 2000 of the second example embodiment views the position relation of a point and a line mapped on a plane parallel to a ground surface and thereby confirms whether the camera parameters appropriately indicate a position and attitude or the like of the real camera. As illustrated in FIG. 10, when the position and the field of view of the camera are presented together with a point and a line mapped on a plane parallel to the ground surface, the system setter or the like may further grasp the position relation between the mapped point and line and the position and field of view of the camera. Therefore, the system setter or the like may more easily and accurately confirm whether the camera parameters appropriately indicate the position and attitude or the like of the real camera.
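- For illustration, the camera position and viewing direction presented on such an image can be derived from the extrinsic parameters; a minimal sketch (the decomposition x ~ K(RX + t) is assumed, as in the earlier sketches):

```python
import numpy as np

def camera_pose_on_ground(R, t):
    """Return the camera position on the plane parallel to the ground surface
    and the heading of the optical axis, from extrinsics R and t."""
    center = -R.T @ t                               # camera center in world coordinates
    optical_axis = R.T @ np.array([0.0, 0.0, 1.0])  # viewing direction in world coordinates
    yaw = np.arctan2(optical_axis[1], optical_axis[0])
    return center[:2], yaw
```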
- FIG. 11 is a flowchart illustrating a flow of processing executed by the image processing device 2000 of the second example embodiment. Processing executed in steps S102 and S106 is the same as the processing executed in steps S102 and S106 of FIG. 3. In step S202, the second input unit 2100 accepts inputting of a point or line to a captured image displayed by the display unit 2020. In step S204, the second display unit 2120 displays an image indicating the point or line upon mapping on a plane parallel to a ground surface.
- According to the image processing device 2000 of the present example embodiment, the user inputs a line or the like for which the original shape or position relation is easily identified on a captured image and checks whether the line or the like displayed by the second display unit 2120 satisfies that original shape or position relation, and thereby may easily confirm whether the camera parameters appropriately indicate a position and attitude or the like of the real camera.
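- A sketch of such a check, using the hypothetical back_project_to_height helper from the earlier sketch: two input lines are mapped onto the ground plane (z = 0), and the angle between the resulting projective lines is measured. For appropriate camera parameters, lines parallel in the real world should yield a substantially zero angle (or substantially 90 degrees for perpendicular borders):

```python
import numpy as np

def map_line_to_ground(K, R, t, pixel_a, pixel_b):
    a = back_project_to_height(K, R, t, pixel_a, z=0.0)[:2]
    b = back_project_to_height(K, R, t, pixel_b, z=0.0)[:2]
    return a, b

def angle_between_degrees(seg1, seg2):
    d1, d2 = seg1[1] - seg1[0], seg2[1] - seg2[0]
    cos = abs(float(np.dot(d1, d2))) / (np.linalg.norm(d1) * np.linalg.norm(d2))
    return float(np.degrees(np.arccos(np.clip(cos, 0.0, 1.0))))
```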
FIG. 15 is a block diagram illustrating animage processing device 3000 according to a third example embodiment. InFIG. 15 , an arrow indicates a flow of information. Further, inFIG. 15 , each block does not represent a configuration of a hardware unit but represents a configuration of a function unit. - The
image processing device 3000 of the third example embodiment includes aninput unit 3020 and apresentation unit 3040. Theinput unit 3020 accepts inputting of an operation for moving a first image being presented on a captured image captured by a camera. The first image is an image in which a target object having a predetermined shape and a predetermined size on a real space is superimposed on the captured image on the basis of predetermined camera parameters indicating a position and attitude of the camera. When, for example, a position on the captured image in which the first image is being presented is designated as a position A, the first image is equivalent to a first image presented by thepresentation unit 2080 upon designating the position A as a first position in theimage processing device 2000 of the first example embodiment. A target object in the third example embodiment is the same as the target object described in the first example embodiment. Further, predetermined camera parameters in the third example embodiment is the same as the camera parameters described in the first example embodiment. - The
- The presentation unit 3040 presents, on the basis of the camera parameters, the first image indicating the target object as it would appear at the position on the captured image after the movement. The method by which the presentation unit 3040 presents the first image for the target object to be moved is the same as "the method in which the presentation unit 2080 presents the first image 40 relating to a target object to be moved on the captured image 10" described in the first example embodiment.
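- To picture what the presentation unit 3040 computes, the sketch below stands a rectangular-solid target object of a predetermined size upright at a ground position and projects its vertices into the captured image each time the position changes; the camera parameters, the 0.5 m x 0.5 m x 1.7 m size, and the helper names are illustrative assumptions, not taken from the specification.

```python
import numpy as np

def project_points(P_world, K, R, t):
    """Project Nx3 world points into the image via x ~ K (R X + t)."""
    P_cam = P_world @ R.T + t
    uv = P_cam @ K.T
    return uv[:, :2] / uv[:, 2:3]

def target_object_vertices(ground_x, ground_y, size=(0.5, 0.5, 1.7)):
    """Vertices of a rectangular solid standing upright on the ground
    (z = 0) at the designated second position (ground_x, ground_y)."""
    w, d, h = size
    return np.array([[ground_x + sx * w / 2, ground_y + sy * d / 2, sz * h]
                     for sz in (0, 1) for sx in (-1, 1) for sy in (-1, 1)])

# Hypothetical camera parameters (same convention as the earlier sketch).
theta = np.deg2rad(30.0)
K = np.array([[800.0, 0.0, 320.0], [0.0, 800.0, 240.0], [0.0, 0.0, 1.0]])
R = np.array([[1.0, 0.0, 0.0],
              [0.0, -np.sin(theta), -np.cos(theta)],
              [0.0, np.cos(theta), -np.sin(theta)]])
t = -R @ np.array([0.0, 0.0, 3.0])

# Each time the move operation designates a new ground position, the
# vertices are re-projected; connecting them with lines yields the first image.
for ground_y in (4.0, 6.0, 8.0):
    vertices_2d = project_points(target_object_vertices(0.0, ground_y), K, R, t)
    print(vertices_2d.round(1))
```

- Because the vertices are re-projected from the camera parameters at every position, the target object is drawn larger near the camera and smaller farther away, which is exactly the manner of view the user inspects.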
- A hardware configuration of the image processing device 3000 is the same as that of the image processing device 2000.
- FIG. 16 is a flowchart illustrating the flow of processing executed by the image processing device 3000 of the third example embodiment. In step S302, the input unit 3020 accepts input of an operation for moving the first image superimposed on the captured image. In step S304, the presentation unit 3040 presents, on the basis of the camera parameters, the first image indicating the target object as it would appear at the position on the captured image after the movement.
- The flow of processing illustrated in FIG. 16 is one example, and the flow of processing executed by the image processing device 3000 is not limited to the flow illustrated in FIG. 16.
- According to the present example embodiment, as illustrated, for example, in FIG. 13 or FIG. 14, the user moves the target object so that it passes alongside a person or the like appearing on the captured image 10, and can thereby easily confirm whether the appearance of the target object looks unnatural. In particular, when the appearance of the target object is presented through continuous movement, its correctness, or any unnaturalness, is more strongly perceived by the human visual sense, which makes this an effective function for verification.
- The image processing device 2000 may include the functions described below. The image processing device 2000 including the following functions is referred to as the image processing device 2000 of a first modified example. The image processing device 2000 of the first modified example may or may not include the functions of the image processing device 2000 of the above-described first and second example embodiments.
- As described above, camera parameters are estimated by a method in which "a calibration pattern or an object equivalent thereto is image-captured by a camera, and estimation is performed on the basis of an association relation between three-dimensional coordinates of the calibration pattern in a real world and two-dimensional coordinates of the calibration pattern of the captured image" (NPL 1). Specifically, the camera parameters are calculated so as to reduce the error (re-projection error) between the two-dimensional coordinates obtained by projecting, using the estimated camera parameters, the three-dimensional coordinates of the calibration pattern in the real world onto the captured image and the two-dimensional coordinates of the calibration pattern appearing on the captured image. There is, for example, a method that calculates the estimation values of the camera parameters so as to minimize the square sum of these errors.
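- To make the minimized quantity concrete, the following is a minimal sketch of the re-projection error computation, assuming the same hypothetical pinhole model as in the earlier sketches; the function names are illustrative, not from the specification.

```python
import numpy as np

def reprojection_errors(world_pts, image_pts, K, R, t):
    """Per-pattern re-projection error: pixel distance between each observed
    2D point and the projection of its associated 3D calibration point."""
    cam = world_pts @ R.T + t
    projected = cam @ K.T
    projected = projected[:, :2] / projected[:, 2:3]
    return np.linalg.norm(projected - image_pts, axis=1)

def total_cost(world_pts, image_pts, K, R, t):
    """Square sum of the errors, the quantity the estimation minimizes
    when searching over the camera parameters."""
    return float(np.sum(reprojection_errors(world_pts, image_pts, K, R, t) ** 2))
```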
- Commonly, when a system setter or the like handling the image processing device 2000 performs the work of estimating camera parameters using the above-described calibration, the system setter or the like views only the camera parameters that are the estimation result and does not view the errors computed along the way. However, if these intermediate errors are presented to the system setter or the like, the accuracy of the estimated camera parameters can conceivably be enhanced. When, for example, positions having large errors are concentrated near an edge of the captured image, the errors have conceivably been increased by an input error of a corresponding point or by lens distortion. In such a case, the accuracy of the camera parameters may be enhanced by changing the selection of calibration patterns so that a pattern image-captured at a position within a predetermined distance from an edge of the image is not used to estimate the camera parameters.
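- One hypothetical way to realize the edge-exclusion rule just described is to mask out, before re-estimation, every calibration pattern whose image position lies within a margin of the frame border; the margin value and function name below are arbitrary assumptions.

```python
import numpy as np

def away_from_edges(image_pts, width, height, margin=50):
    """Boolean mask: True where a pattern's image position is at least
    `margin` pixels away from every edge of the captured image."""
    u, v = image_pts[:, 0], image_pts[:, 1]
    return ((u >= margin) & (u <= width - margin) &
            (v >= margin) & (v <= height - margin))

# keep = away_from_edges(pattern_positions, width=640, height=480)
# Re-estimate the camera parameters using only pattern_positions[keep].
```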
- The image processing device 2000 presents, for each position where a calibration pattern is image-captured, the error for the calibration pattern image-captured at that position, in the periphery of the corresponding position on the captured image. FIG. 12 is a diagram illustrating a situation where information indicating an error (error information 140) is presented on a captured image. In FIG. 12, a person is used to obtain the calibration pattern. Specifically, a line 130 connecting the feet and the head of a person standing substantially erect is used as the calibration pattern. The error information 140 presented beside the line 130 indicates the re-projection error relating to the line 130.
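- As an illustration of the presentation in FIG. 12, the sketch below draws each head-to-feet line (130) and writes its re-projection error (140) beside the line using OpenCV drawing primitives; the colors, offsets, and label format are arbitrary choices, not taken from the specification.

```python
import cv2

def draw_error_overlay(frame, lines, errors):
    """Draw each calibration line 130 and its error information 140 beside it.
    `lines` holds ((u_feet, v_feet), (u_head, v_head)) integer pixel pairs."""
    for (feet, head), err in zip(lines, errors):
        cv2.line(frame, feet, head, (0, 255, 0), 2)
        anchor = (head[0] + 10, (head[1] + feet[1]) // 2)  # beside the line
        cv2.putText(frame, f"err: {err:.1f}px", anchor,
                    cv2.FONT_HERSHEY_SIMPLEX, 0.5, (0, 0, 255), 1)
    return frame
```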
- The image processing device 2000 may map the calibration pattern onto the ground surface on the basis of the technique described in the second example embodiment and display the error in association with the calibration pattern mapped on the ground surface.
- While the example embodiments of the present invention have been described with reference to the drawings, these example embodiments are illustrative of the present invention, and various configurations other than the above may be employed.
- Hereinafter, examples of reference modes will be supplementarily noted.
- 1. An image processing device includes:
- an input means configured to accept inputting of an operation for movement, on a captured image captured by a camera, to a first image that is superimposed on the captured image on the basis of predetermined camera parameters indicating a position and attitude of the camera and indicates a target object having a predetermined shape and a predetermined size set on a real space; and
- a presentation means configured to present the first image indicating the target object in a manner of view relating to a position on the captured image after the movement on the basis of the camera parameters.
- 2. The image processing device according to 1., wherein
- the input means accepts an operation for the movement by repeatedly accepting a designation of a first position on the captured image, and
- the presentation means generates, when a certain first position is designated, on the basis of the camera parameters, a predetermined shape and a predetermined size on a real space of the target object, and a second position on the real space relating to the first position, a first image indicating the target object on the captured image appearing in a camera determined by the camera parameters when the target object is disposed in the second position and presents the generated first image in the first position on the captured image.
- 3. The image processing device according to 2., wherein
- the presentation means
-
- acquires height information of the second position and
- calculates the second position on the basis of the camera parameters, the first position, and the height information of the second position.
- 4. The image processing device according to 3., wherein the presentation means acquires information indicating a height of a ground surface in a real space as the height information of the second position.
- 5. The image processing device according to 3., wherein the presentation means acquires pieces of information of different heights for a plurality of areas on the captured image, respectively, as the height information of the second position.
- 6. The image processing device according to any one of 1. to 5., wherein the target object has a planar shape.
- 7. The image processing device according to any one of 1. to 6., comprising:
- a second input means configured to accept inputting of a point or line to the captured image; and
- a second display means configured to display a second image indicating the point or line upon mapping on a plane parallel to a ground surface, on the basis of the camera parameters, a position on the captured image of the point or line, and height information on a real space of the point or line.
- 8. An image processing device includes:
- an input means configured to accept a designation of a first position on a captured image; and
- a presentation means configured to present, on the basis of predetermined camera parameters indicating a position and attitude of a camera, a predetermined shape and a predetermined size on a real space of a target object, and a second position on the real space relating to the first position, a first image indicating the target object on the captured image appearing in a camera determined by the camera parameters when the target object is disposed in the second position in the first position on the captured image.
- 9. The image processing device according to 8., wherein
- the input means accepts designations of a plurality of first positions, and
- the presentation means presents first images indicating a plurality of target objects relating to the plurality of first positions in respective corresponding first positions on the captured image.
- 10. The image processing device according to 8. or 9., wherein
- the input means repeatedly accepts a designation of the first position, and
- the presentation means generates, when a certain first position is designated, a first image indicating the target object disposed in a second position on a real space relating to the first position and presents the generated first image in the first position on the captured image.
- 11. An image processing device comprising:
- an input means configured to accept inputting of a point or line to a captured image captured by a camera; and
- a display means configured to display a first image indicating the point or line upon mapping on a plane parallel to a ground surface, on the basis of predetermined camera parameters indicating a position and attitude of the camera, a position on the captured image of the point or line, and height information on a real space of the point or line.
- 12. An image processing method executed by a computer, the method comprising:
- an input step of accepting inputting of an operation for movement, on a captured image captured by a camera, to a first image that is superimposed on the captured image on the basis of predetermined camera parameters indicating a position and attitude of the camera and indicates a target object having a predetermined shape and a predetermined size set on a real space; and
- a presentation step of presenting the first image indicating the target object in a manner of view relating to a position on the captured image after the movement on the basis of the camera parameters.
- 13. The image processing method according to 12., wherein
- the input step accepts an operation for the movement by repeatedly accepting a designation of a first position on the captured image, and
- the presentation step generates, when a certain first position is designated, on the basis of the camera parameters, a predetermined shape and a predetermined size on a real space of the target object, and a second position on the real space relating to the first position, a first image indicating the target object on the captured image appearing in a camera determined by the camera parameters when the target object is disposed in the second position and presents the generated first image in the first position on the captured image.
- 14. The image processing method according to 13., wherein
- the presentation step
- acquires height information of the second position and
- calculates the second position on the basis of the camera parameters, the first position, and the height information of the second position.
- 15. The image processing method according to 14., wherein the presentation step acquires information indicating a height of a ground surface in a real space as the height information of the second position.
- 16. The image processing method according to 14., wherein the presentation step acquires pieces of information of different heights for a plurality of areas on the captured image, respectively, as the height information of the second position.
- 17. The image processing method according to any one of 12. to 16., wherein the target object has a planar shape.
- 18. The image processing method according to any one of 12. to 17., including:
- a second input step of accepting inputting of a point or line to the captured image, and
- a second display step of displaying a second image indicating the point or line upon mapping on a plane parallel to a ground surface, on the basis of the camera parameters, a position on the captured image of the point or line, and height information on a real space of the point or line.
- 19. An image processing method executed by a computer, the method comprising:
- an input step of accepting a designation of a first position on a captured image; and
- a presentation step of presenting, on the basis of predetermined camera parameters indicating a position and attitude of a camera, a predetermined shape and a predetermined size on a real space of a target object, and a second position on the real space relating to the first position, a first image indicating the target object on the captured image appearing in the camera determined by the camera parameters when the target object is disposed in the second position in the first position on the captured image.
- 20. The image processing method according to 19., wherein
- the input step accepts designations of a plurality of first positions, and
- the presentation step presents first images indicating a plurality of target objects relating to the plurality of first positions in respective corresponding first positions on the captured image.
- 21. The image processing method according to 19. or 20., wherein
- the input step repeatedly accepts a designation of the first position, and
- the presentation step generates, when a certain first position is designated, a first image indicating the target object disposed in a second position on a real space relating to the first position and presents the generated first image in the first position on the captured image.
- 22. An image processing method executed by a computer, the method comprising:
- an input step of accepting inputting of a point or line to a captured image captured by a camera; and
- a display step of displaying a first image indicating the point or line upon mapping on a plane parallel to a ground surface, on the basis of predetermined camera parameters indicating a position and attitude of the camera, a position on the captured image of the point or line, and height information on a real space of the point or line.
- 23. A program that causes a computer to operate as the image processing device according to any one of 1. to 11.
- This application is based upon and claims the benefit of priority from Japanese patent application No. 2014-191480, filed on Sep. 19, 2014 and Japanese patent application No. 2014-257137, filed on Dec. 19, 2014, the disclosures of which are incorporated herein in their entirety by reference.
Claims (16)
Priority Applications (3)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
US17/131,306 US20210112181A1 (en) | 2014-09-19 | 2020-12-22 | Image processing device, image processing method, and recording medium |
US18/241,301 US20230412903A1 (en) | 2014-09-19 | 2023-09-01 | Image processing device, image processing method, and recording medium |
US18/241,299 US20230412902A1 (en) | 2014-09-19 | 2023-09-01 | Image processing device, image processing method, and recording medium |
Applications Claiming Priority (7)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
JP2014-191480 | 2014-09-19 | ||
JP2014191480 | 2014-09-19 | ||
JP2014-257137 | 2014-12-19 | ||
JP2014257137 | 2014-12-19 | ||
PCT/JP2015/071750 WO2016042926A1 (en) | 2014-09-19 | 2015-07-31 | Image processing device, image processing method, and program |
US201715512340A | 2017-03-17 | 2017-03-17 | |
US17/131,306 US20210112181A1 (en) | 2014-09-19 | 2020-12-22 | Image processing device, image processing method, and recording medium |
Related Parent Applications (2)
Application Number | Title | Priority Date | Filing Date |
---|---|---|---|
PCT/JP2015/071750 Continuation WO2016042926A1 (en) | 2014-09-19 | 2015-07-31 | Image processing device, image processing method, and program |
US15/512,340 Continuation US10911645B2 (en) | 2014-09-19 | 2015-07-31 | Image processing device, image processing method, and recording medium |
Related Child Applications (2)
Application Number | Title | Priority Date | Filing Date |
---|---|---|---|
US18/241,299 Continuation US20230412902A1 (en) | 2014-09-19 | 2023-09-01 | Image processing device, image processing method, and recording medium |
US18/241,301 Continuation US20230412903A1 (en) | 2014-09-19 | 2023-09-01 | Image processing device, image processing method, and recording medium |
Publications (1)
Publication Number | Publication Date |
---|---|
US20210112181A1 true US20210112181A1 (en) | 2021-04-15 |
Family
ID=55532970
Family Applications (5)
Application Number | Title | Priority Date | Filing Date |
---|---|---|---|
US15/512,340 Active 2035-12-27 US10911645B2 (en) | 2014-09-19 | 2015-07-31 | Image processing device, image processing method, and recording medium |
US16/409,320 Abandoned US20190268509A1 (en) | 2014-09-19 | 2019-05-10 | Image processing device, image processing method, and recording medium |
US17/131,306 Abandoned US20210112181A1 (en) | 2014-09-19 | 2020-12-22 | Image processing device, image processing method, and recording medium |
US18/241,299 Pending US20230412902A1 (en) | 2014-09-19 | 2023-09-01 | Image processing device, image processing method, and recording medium |
US18/241,301 Pending US20230412903A1 (en) | 2014-09-19 | 2023-09-01 | Image processing device, image processing method, and recording medium |
Family Applications Before (2)
Application Number | Title | Priority Date | Filing Date |
---|---|---|---|
US15/512,340 Active 2035-12-27 US10911645B2 (en) | 2014-09-19 | 2015-07-31 | Image processing device, image processing method, and recording medium |
US16/409,320 Abandoned US20190268509A1 (en) | 2014-09-19 | 2019-05-10 | Image processing device, image processing method, and recording medium |
Family Applications After (2)
Application Number | Title | Priority Date | Filing Date |
---|---|---|---|
US18/241,299 Pending US20230412902A1 (en) | 2014-09-19 | 2023-09-01 | Image processing device, image processing method, and recording medium |
US18/241,301 Pending US20230412903A1 (en) | 2014-09-19 | 2023-09-01 | Image processing device, image processing method, and recording medium |
Country Status (3)
Country | Link |
---|---|
US (5) | US10911645B2 (en) |
JP (4) | JP6747292B2 (en) |
WO (1) | WO2016042926A1 (en) |
Families Citing this family (13)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
TWI439960B (en) | 2010-04-07 | 2014-06-01 | Apple Inc | Avatar editing environment |
JP2016213674A (en) * | 2015-05-08 | 2016-12-15 | キヤノン株式会社 | Display control system, display control unit, display control method, and program |
AU2017100670C4 (en) | 2016-06-12 | 2019-11-21 | Apple Inc. | User interfaces for retrieving contextually relevant media content |
KR102596477B1 (en) * | 2016-09-23 | 2023-11-02 | 애플 인크. | Avatar creation and editing |
DK201870374A1 (en) | 2018-05-07 | 2019-12-04 | Apple Inc. | Avatar creation user interface |
US11722764B2 (en) | 2018-05-07 | 2023-08-08 | Apple Inc. | Creative camera |
SG10201809572RA (en) * | 2018-10-29 | 2020-05-28 | Nec Asia Pacific Pte Ltd | Methods and apparatus to cluster and collect head-toe lines for automatic camera calibration |
JP7277187B2 (en) * | 2019-03-13 | 2023-05-18 | キヤノン株式会社 | Image processing device, imaging device, image processing method, and program |
JP7310252B2 (en) * | 2019-04-19 | 2023-07-19 | 株式会社リコー | MOVIE GENERATOR, MOVIE GENERATION METHOD, PROGRAM, STORAGE MEDIUM |
US11921998B2 (en) | 2020-05-11 | 2024-03-05 | Apple Inc. | Editing features of an avatar |
DK202070625A1 (en) | 2020-05-11 | 2022-01-04 | Apple Inc | User interfaces related to time |
US11714536B2 (en) | 2021-05-21 | 2023-08-01 | Apple Inc. | Avatar sticker editor user interfaces |
US11776190B2 (en) | 2021-06-04 | 2023-10-03 | Apple Inc. | Techniques for managing an avatar on a lock screen |
Family Cites Families (16)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
JP2622620B2 (en) | 1989-11-07 | 1997-06-18 | プロクシマ コーポレイション | Computer input system for altering a computer generated display visible image |
EP1102211A3 (en) * | 1999-11-19 | 2006-09-13 | Matsushita Electric Industrial Co., Ltd. | Image processor, method of providing image processing services and order processing method |
JP2001209827A (en) * | 1999-11-19 | 2001-08-03 | Matsushita Electric Ind Co Ltd | Image processor, image processing service providing method and order receiving processing method |
JP4217100B2 (en) | 2003-04-17 | 2009-01-28 | 本田技研工業株式会社 | Image composition method, apparatus, and program, and stereo model rendering method, apparatus, and program |
JP2005142938A (en) * | 2003-11-07 | 2005-06-02 | Casio Comput Co Ltd | Electronic camera, control program |
JP4244040B2 (en) * | 2005-03-10 | 2009-03-25 | 任天堂株式会社 | Input processing program and input processing apparatus |
US7801330B2 (en) | 2005-06-24 | 2010-09-21 | Objectvideo, Inc. | Target detection and tracking from video streams |
JP5120249B2 (en) * | 2006-03-15 | 2013-01-16 | オムロン株式会社 | Monitoring device and monitoring method, control device and control method, and program |
EP1862969A1 (en) * | 2006-06-02 | 2007-12-05 | Eidgenössische Technische Hochschule Zürich | Method and system for generating a representation of a dynamically changing 3D scene |
JP5223318B2 (en) * | 2007-12-07 | 2013-06-26 | ソニー株式会社 | Image processing apparatus, image processing method, and program |
US20110187703A1 (en) | 2010-01-29 | 2011-08-04 | Kedar Anil Patwardhan | Method and system for object tracking using appearance model |
JP5656567B2 (en) | 2010-11-05 | 2015-01-21 | キヤノン株式会社 | Video processing apparatus and method |
JP2013110551A (en) * | 2011-11-21 | 2013-06-06 | Sony Corp | Information processing device, imaging device, information processing method, and program |
US8836768B1 (en) | 2012-09-04 | 2014-09-16 | Aquifi, Inc. | Method and system enabling natural user interface gestures with user wearable glasses |
JP5516693B2 (en) * | 2012-10-29 | 2014-06-11 | 富士通モバイルコミュニケーションズ株式会社 | Portable information device and information processing program |
DE102013211492B4 (en) | 2013-06-19 | 2020-10-15 | Trimble Jena Gmbh | Determination of a measurement error |
- 2015
- 2015-07-31 US US15/512,340 patent/US10911645B2/en active Active
- 2015-07-31 JP JP2016548769A patent/JP6747292B2/en active Active
- 2015-07-31 WO PCT/JP2015/071750 patent/WO2016042926A1/en active Application Filing
- 2019
- 2019-05-10 US US16/409,320 patent/US20190268509A1/en not_active Abandoned
- 2020
- 2020-07-31 JP JP2020129925A patent/JP6996594B2/en active Active
- 2020-12-22 US US17/131,306 patent/US20210112181A1/en not_active Abandoned
- 2021
- 2021-12-13 JP JP2021201328A patent/JP7294396B2/en active Active
- 2023
- 2023-06-07 JP JP2023093623A patent/JP2023111962A/en active Pending
- 2023-09-01 US US18/241,299 patent/US20230412902A1/en active Pending
- 2023-09-01 US US18/241,301 patent/US20230412903A1/en active Pending
Patent Citations (9)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
US6463121B1 (en) * | 1999-10-13 | 2002-10-08 | General Electric Company | Interactive x-ray position and exposure control using image data as reference information |
US20040041822A1 (en) * | 2001-03-13 | 2004-03-04 | Canon Kabushiki Kaisha | Image processing apparatus, image processing method, studio apparatus, storage medium, and program |
US20040085451A1 (en) * | 2002-10-31 | 2004-05-06 | Chang Nelson Liang An | Image capture and viewing system and method for generating a synthesized image |
US20050226530A1 (en) * | 2004-04-08 | 2005-10-13 | Hajime Murayama | Image processing program, image processing method, image processing apparatus and storage medium |
US20080317373A1 (en) * | 2006-03-06 | 2008-12-25 | Koji Aoyama | Image processing apparatus, image processing method, recording medium, and program |
US20090225173A1 (en) * | 2008-03-05 | 2009-09-10 | Sony Corporation | Image capturing method, control method therefor, and program |
US20140375691A1 (en) * | 2011-11-11 | 2014-12-25 | Sony Corporation | Information processing apparatus, information processing method, and program |
US20130208005A1 (en) * | 2012-02-10 | 2013-08-15 | Sony Corporation | Image processing device, image processing method, and program |
US20170148176A1 (en) * | 2015-11-24 | 2017-05-25 | Fujitsu Limited | Non-transitory computer-readable storage medium, evaluation method, and evaluation device |
Also Published As
Publication number | Publication date |
---|---|
JP6747292B2 (en) | 2020-08-26 |
JP2020182251A (en) | 2020-11-05 |
JPWO2016042926A1 (en) | 2017-07-20 |
JP2023111962A (en) | 2023-08-10 |
JP7294396B2 (en) | 2023-06-20 |
US20230412903A1 (en) | 2023-12-21 |
US20230412902A1 (en) | 2023-12-21 |
US20190268509A1 (en) | 2019-08-29 |
JP6996594B2 (en) | 2022-01-17 |
JP2022022434A (en) | 2022-02-03 |
WO2016042926A1 (en) | 2016-03-24 |
US20170289411A1 (en) | 2017-10-05 |
US10911645B2 (en) | 2021-02-02 |
Similar Documents
Publication | Publication Date | Title |
---|---|---|
US20210112181A1 (en) | Image processing device, image processing method, and recording medium | |
US10235118B2 (en) | Augmented reality device and method for providing assistance to a worker at a remote site | |
US10086955B2 (en) | Pattern-based camera pose estimation system | |
US9805509B2 (en) | Method and system for constructing a virtual image anchored onto a real-world object | |
US10451403B2 (en) | Structure-based camera pose estimation system | |
US11490062B2 (en) | Information processing apparatus, information processing method, and storage medium | |
JP5117418B2 (en) | Information processing apparatus and information processing method | |
US9361731B2 (en) | Method and apparatus for displaying video on 3D map | |
JP2008275341A (en) | Information processor and processing method | |
JP6174968B2 (en) | Imaging simulation device | |
US20170116735A1 (en) | Optimized camera pose estimation system | |
US20180204387A1 (en) | Image generation device, image generation system, and image generation method | |
US20240071016A1 (en) | Mixed reality system, program, mobile terminal device, and method | |
WO2019012803A1 (en) | Designation device and designation method | |
US20170094244A1 (en) | Image processing device and image processing method | |
JP2020126208A5 (en) | ||
US9881419B1 (en) | Technique for providing an initial pose for a 3-D model | |
JP2013092888A (en) | Data processor | |
JP2019185475A (en) | Specification program, specification method, and information processing device | |
KR20210023663A (en) | Image processing method and image processing apparatus for generating 3d content using 2d images | |
JP2023125590A (en) | Bar arrangement inspection device, control method of the bar arrangement inspection device and bar arrangement inspection program |
Legal Events
Date | Code | Title | Description
---|---|---|---
| AS | Assignment | Owner name: NEC CORPORATION, JAPAN. Free format text: ASSIGNMENT OF ASSIGNORS INTEREST; ASSIGNOR: IKEDA, HIROO; REEL/FRAME: 054732/0957. Effective date: 20170302
| STPP | Information on status: patent application and granting procedure in general | Free format text: APPLICATION DISPATCHED FROM PREEXAM, NOT YET DOCKETED
| STPP | Information on status: patent application and granting procedure in general | Free format text: DOCKETED NEW CASE - READY FOR EXAMINATION
| STPP | Information on status: patent application and granting procedure in general | Free format text: NON FINAL ACTION MAILED
| STPP | Information on status: patent application and granting procedure in general | Free format text: RESPONSE TO NON-FINAL OFFICE ACTION ENTERED AND FORWARDED TO EXAMINER
| STPP | Information on status: patent application and granting procedure in general | Free format text: FINAL REJECTION MAILED
| STPP | Information on status: patent application and granting procedure in general | Free format text: RESPONSE AFTER FINAL ACTION FORWARDED TO EXAMINER
| STPP | Information on status: patent application and granting procedure in general | Free format text: ADVISORY ACTION MAILED
| STPP | Information on status: patent application and granting procedure in general | Free format text: DOCKETED NEW CASE - READY FOR EXAMINATION
| STPP | Information on status: patent application and granting procedure in general | Free format text: FINAL REJECTION MAILED
| STCB | Information on status: application discontinuation | Free format text: ABANDONED -- FAILURE TO RESPOND TO AN OFFICE ACTION