WO2016042926A1 - Image Processing Apparatus, Image Processing Method, and Program - Google Patents
Image Processing Apparatus, Image Processing Method, and Program
- Publication number
- WO2016042926A1 (PCT/JP2015/071750)
- Authority
- WO
- WIPO (PCT)
- Prior art keywords
- image
- camera
- captured image
- image processing
- target object
- Prior art date
Classifications
- H—ELECTRICITY
- H04—ELECTRIC COMMUNICATION TECHNIQUE
- H04N—PICTORIAL COMMUNICATION, e.g. TELEVISION
- H04N23/00—Cameras or camera modules comprising electronic image sensors; Control thereof
- G—PHYSICS
- G08—SIGNALLING
- G08B—SIGNALLING OR CALLING SYSTEMS; ORDER TELEGRAPHS; ALARM SYSTEMS
- G08B13/00—Burglar, theft or intruder alarms
- G08B13/18—Actuation by interference with heat, light, or radiation of shorter wavelength; Actuation by intruding sources of heat, light, or radiation of shorter wavelength
- G08B13/189—Actuation by interference with heat, light, or radiation of shorter wavelength; Actuation by intruding sources of heat, light, or radiation of shorter wavelength using passive radiation detection systems
- G08B13/194—Actuation by interference with heat, light, or radiation of shorter wavelength; Actuation by intruding sources of heat, light, or radiation of shorter wavelength using passive radiation detection systems using image scanning and comparing systems
- G08B13/196—Actuation by interference with heat, light, or radiation of shorter wavelength; Actuation by intruding sources of heat, light, or radiation of shorter wavelength using passive radiation detection systems using image scanning and comparing systems using television cameras
- G08B13/19678—User interface
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06T—IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
- G06T1/00—General purpose image data processing
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06T—IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
- G06T19/00—Manipulating 3D models or images for computer graphics
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06T—IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
- G06T19/00—Manipulating 3D models or images for computer graphics
- G06T19/006—Mixed reality
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06T—IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
- G06T7/00—Image analysis
- G06T7/80—Analysis of captured images to determine intrinsic or extrinsic camera parameters, i.e. camera calibration
- G—PHYSICS
- G08—SIGNALLING
- G08B—SIGNALLING OR CALLING SYSTEMS; ORDER TELEGRAPHS; ALARM SYSTEMS
- G08B25/00—Alarm systems in which the location of the alarm condition is signalled to a central station, e.g. fire or police telegraphic systems
- G08B25/01—Alarm systems in which the location of the alarm condition is signalled to a central station, e.g. fire or police telegraphic systems characterised by the transmission medium
- G08B25/08—Alarm systems in which the location of the alarm condition is signalled to a central station, e.g. fire or police telegraphic systems characterised by the transmission medium using communication transmission lines
- H—ELECTRICITY
- H04—ELECTRIC COMMUNICATION TECHNIQUE
- H04N—PICTORIAL COMMUNICATION, e.g. TELEVISION
- H04N17/00—Diagnosis, testing or measuring for television systems or their details
- H04N17/002—Diagnosis, testing or measuring for television systems or their details for television cameras
- H—ELECTRICITY
- H04—ELECTRIC COMMUNICATION TECHNIQUE
- H04N—PICTORIAL COMMUNICATION, e.g. TELEVISION
- H04N23/00—Cameras or camera modules comprising electronic image sensors; Control thereof
- H04N23/60—Control of cameras or camera modules
- H04N23/63—Control of cameras or camera modules by using electronic viewfinders
- H—ELECTRICITY
- H04—ELECTRIC COMMUNICATION TECHNIQUE
- H04N—PICTORIAL COMMUNICATION, e.g. TELEVISION
- H04N5/00—Details of television systems
- H04N5/76—Television signal recording
- H—ELECTRICITY
- H04—ELECTRIC COMMUNICATION TECHNIQUE
- H04N—PICTORIAL COMMUNICATION, e.g. TELEVISION
- H04N5/00—Details of television systems
- H04N5/76—Television signal recording
- H04N5/765—Interface circuits between an apparatus for recording and another apparatus
- H04N5/77—Interface circuits between an apparatus for recording and another apparatus between a recording apparatus and a television camera
- H—ELECTRICITY
- H04—ELECTRIC COMMUNICATION TECHNIQUE
- H04N—PICTORIAL COMMUNICATION, e.g. TELEVISION
- H04N9/00—Details of colour television systems
- H04N9/79—Processing of colour television signals in connection with recording
- H04N9/87—Regeneration of colour television signals
- H04N9/8715—Regeneration of colour television signals involving the mixing of the reproduced video signal with a non-recorded signal, e.g. a text signal
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06T—IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
- G06T2207/00—Indexing scheme for image analysis or image enhancement
- G06T2207/10—Image acquisition modality
- G06T2207/10016—Video; Image sequence
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06T—IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
- G06T2207/00—Indexing scheme for image analysis or image enhancement
- G06T2207/20—Special algorithmic details
- G06T2207/20092—Interactive image processing based on input by user
- G06T2207/20096—Interactive definition of curve of interest
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06T—IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
- G06T2207/00—Indexing scheme for image analysis or image enhancement
- G06T2207/20—Special algorithmic details
- G06T2207/20092—Interactive image processing based on input by user
- G06T2207/20101—Interactive definition of point of interest, landmark or seed
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06T—IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
- G06T2207/00—Indexing scheme for image analysis or image enhancement
- G06T2207/30—Subject of image; Context of image processing
- G06T2207/30196—Human being; Person
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06T—IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
- G06T2207/00—Indexing scheme for image analysis or image enhancement
- G06T2207/30—Subject of image; Context of image processing
- G06T2207/30232—Surveillance
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06V—IMAGE OR VIDEO RECOGNITION OR UNDERSTANDING
- G06V20/00—Scenes; Scene-specific elements
- G06V20/50—Context or environment of the image
- G06V20/52—Surveillance or monitoring of activities, e.g. for recognising suspicious objects
Definitions
- The present invention relates to image processing technology.
- The actual size and position of a person or object shown in surveillance camera video can be calculated using information on the position and orientation of the camera (hereinafter referred to as camera parameters) together with the size and position of the person or object on the image. With this calculation, for example, when an important person (such as the perpetrator of an incident) appears in the surveillance camera video, that person's height can be estimated from the video.
- Non-Patent Document 1 discloses a method for estimating camera parameters (the camera's rotation and translation) by capturing a calibration pattern with a camera and using the correspondence between the three-dimensional coordinates of the calibration pattern in the real world and the two-dimensional coordinates of the calibration pattern on the captured image.
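As a rough, non-authoritative illustration of this kind of correspondence-based estimation (Non-Patent Document 1 itself is not reproduced here), the following sketch uses OpenCV with an assumed 9x6 chessboard of 25 mm squares; the file names and pattern layout are illustrative assumptions:

```python
import numpy as np
import cv2

pattern = (9, 6)                       # assumed inner-corner layout
objp = np.zeros((pattern[0] * pattern[1], 3), np.float32)
objp[:, :2] = np.mgrid[0:pattern[0], 0:pattern[1]].T.reshape(-1, 2) * 0.025

obj_pts, img_pts = [], []              # 3D-2D correspondences per view
for path in ["view1.png", "view2.png", "view3.png"]:  # illustrative names
    gray = cv2.imread(path, cv2.IMREAD_GRAYSCALE)
    found, corners = cv2.findChessboardCorners(gray, pattern)
    if found:
        obj_pts.append(objp)
        img_pts.append(corners)

# Estimates the intrinsic matrix K plus one rotation/translation per view.
rms, K, dist, rvecs, tvecs = cv2.calibrateCamera(
    obj_pts, img_pts, gray.shape[::-1], None, None)
```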
- In practice, previously estimated camera parameters are often obtained and used. For example, camera parameters calculated by calibrating the target camera in the past may be obtained, or camera parameters defined from information such as the position and orientation at which the camera was installed may be obtained.
- However, such camera parameters do not always correctly represent the position and orientation of the target camera.
- For example, camera parameters representing a position and orientation different from those of the actual camera may be calculated due to input errors of corresponding points, lens distortion, and the like.
- When the camera parameters do not correctly represent the position and orientation of the target camera, an error occurs in the calculation result, for example when calculating the height of an important person shown in the surveillance camera video as described above.
- The present invention has been made in view of the above problems.
- An object of the present invention is to provide a technique for easily confirming whether camera parameters are appropriate.
- A first image processing apparatus provided by the present invention includes: input means for receiving an input of a movement operation, on a captured image captured by a camera, for a first image that represents a target object having a predetermined shape and a predetermined size in real space and that is superimposed on the captured image based on predetermined camera parameters representing the position and orientation of the camera; and presenting means for presenting the first image representing the target object in a manner corresponding to the position on the captured image after the movement, based on the camera parameters.
- A second image processing apparatus provided by the present invention includes: display means for displaying a captured image captured by a camera; parameter acquisition means for acquiring camera parameters representing the position and orientation of the camera; input means for accepting designation of a first position on the captured image; and presenting means for presenting, at the first position on the captured image, a first image representing the target object on the captured image as captured by the camera determined by the camera parameters when the target object is placed at a second position in real space corresponding to the first position.
- A third image processing apparatus provided by the present invention includes: first display means for displaying a captured image captured by a camera; parameter acquisition means for acquiring camera parameters representing the position and orientation of the camera; input means for receiving the input of a point or line on the captured image; and second display means for displaying a first image representing the point or line when mapped onto a plane parallel to the ground surface, based on the camera parameters, the position of the point or line on the captured image, and the height information of the point or line in real space.
- A first image processing method provided by the present invention is executed by a computer. The image processing method includes: an input step of receiving an input of a movement operation, on a captured image captured by a camera, for a first image that represents a target object having a predetermined shape and a predetermined size in real space and that is superimposed on the captured image based on predetermined camera parameters representing the position and orientation of the camera; and a presenting step of presenting the first image representing the target object in a manner corresponding to the position on the captured image after the movement, based on the camera parameters.
- A second image processing method provided by the present invention is executed by a computer. The image processing method includes: a display step of displaying a captured image captured by a camera; a parameter acquisition step of acquiring camera parameters representing the position and orientation of the camera; an input step of receiving designation of a first position on the captured image; and a presenting step of presenting, at the first position on the captured image, a first image representing the target object on the captured image as captured by the camera determined by the camera parameters when the target object is placed at a second position in real space corresponding to the first position, based on the camera parameters, the predetermined shape and size of the target object in real space, and the second position.
- A third image processing method provided by the present invention is executed by a computer. The image processing method includes: a first display step of displaying a captured image captured by a camera; a parameter acquisition step of acquiring camera parameters representing the position and orientation of the camera; an input step of receiving the input of a point or line on the captured image; and a second display step of displaying a first image representing the point or line when mapped onto a plane parallel to the ground surface, based on the camera parameters, the position of the point or line on the captured image, and the height information of the point or line in real space.
- A program provided by the present invention causes a computer to operate as the first, second, or third image processing apparatus provided by the present invention.
- According to the present invention, a technique for easily confirming whether camera parameters are appropriate is provided.
- FIG. 1 is a block diagram illustrating an image processing apparatus according to a first embodiment.
- FIG. 2 is a diagram illustrating how the image processing apparatus presents a predetermined object on a captured image.
- FIG. 3 is a flowchart illustrating the flow of processing executed by the image processing apparatus according to the first embodiment.
- FIG. 4 is a diagram illustrating a captured image on which the first image is presented by the presentation unit.
- FIG. 5 is a block diagram illustrating the hardware configuration of the image processing apparatus.
- FIG. 6 is a diagram illustrating how a first image representing a planar target object is presented on the captured image.
- FIG. 7 is a block diagram illustrating an image processing apparatus according to a second embodiment.
- FIG. 11 is a flowchart illustrating the flow of processing executed by the image processing apparatus according to the second embodiment.
- FIG. 12 is a diagram illustrating how error information is presented on a captured image.
- FIG. 13 is a diagram illustrating how a user moves the target object on a captured image.
- FIG. 15 is a block diagram illustrating an image processing apparatus according to a third embodiment.
- FIG. 16 is a flowchart illustrating the flow of processing executed by the image processing apparatus according to the third embodiment.
- FIG. 17 is a diagram illustrating the projection lines, on the plane representing the ground surface described in FIG. 9(a), of the target object presented on the captured image.
- FIG. 1 is a block diagram illustrating the image processing apparatus 2000 according to the first embodiment.
- In FIG. 1, arrows indicate the flow of information. Further, each block represents a configuration in functional units, not hardware units.
- The image processing apparatus 2000 includes a display unit 2020, a parameter acquisition unit 2040, an input unit 2060, and a presentation unit 2080.
- The display unit 2020 displays a captured image captured by the camera.
- The parameter acquisition unit 2040 acquires camera parameters representing the position and orientation of the camera.
- The camera parameters may include parameters other than the camera position and orientation; these are described later.
- The input unit 2060 accepts designation of a first position on the captured image.
- The presentation unit 2080 generates a first image representing the target object on the captured image as it would appear when captured by the camera determined by the camera parameters, with the target object placed at the second position in real space corresponding to the first position.
- In other words, the first image is an image representing how the target object looks from the viewpoint of the camera determined by the camera parameters.
- As described later, the second position in real space can be obtained from the camera parameters, the first position, and the height information of the second position.
- Here, "placing the target object at the second position" means assuming that the target object exists at the position in real space (the second position) corresponding to the first position on the captured image.
- The presentation unit 2080 generates the first image using the camera parameters, the predetermined shape and size of the target object in real space, and the second position. Furthermore, the presentation unit 2080 presents the generated first image at the first position on the captured image.
- The target object is a virtual object having a planar or three-dimensional shape.
- The predetermined size and shape set for the target object are a size and shape assumed in the real world. They may be input by the user, or stored in advance in storage inside or outside the image processing apparatus 2000.
- FIG. 2 is a diagram illustrating how the image processing apparatus 2000 presents a predetermined object on a captured image.
- In this example, the predetermined object is a rectangular parallelepiped 20.
- FIG. 2A shows the rectangular parallelepiped 20 viewed from an appropriate angle.
- The rectangular parallelepiped 20 has horizontal and vertical lengths of 30 cm and a height of 170 cm.
- The rectangular parallelepiped 20 in this example is an object that simplifies the shape and size of an average person.
- FIG. 2B is a diagram in which the image processing apparatus 2000 presents the rectangular parallelepiped 20 on the captured image 10.
- The first position 30 represents the first position input to the input unit 2060.
- The presentation unit 2080 presents the first image 40 at the first position 30.
- The first image 40 is an image that simulates how the rectangular parallelepiped 20 would look when captured by the camera.
- FIG. 3 is a flowchart illustrating the flow of processing executed by the image processing apparatus 2000 according to the first embodiment.
- In step S102, the display unit 2020 displays a captured image captured by the camera.
- In step S104, the input unit 2060 accepts designation of the first position on the captured image.
- In step S106, the parameter acquisition unit 2040 acquires camera parameters representing the position and orientation of the camera.
- The presentation unit 2080 generates the first image. As described above, the first image represents the target object on the captured image as it would appear when captured by the camera determined by the camera parameters, with the target object placed at the second position.
- The presentation unit 2080 then presents the generated first image at the first position on the captured image.
- Note that the process of acquiring camera parameters (step S106) may be performed before the process of receiving the input of the first position (step S104).
- According to the image processing apparatus 2000 of the present embodiment, by viewing the object presented by the presentation unit 2080, the user can easily confirm whether the camera parameters properly represent the position and orientation of the camera that captured the captured image displayed by the display unit 2020 (hereinafter, the real camera). This is described in detail below with reference to FIG. 4.
- FIG. 4 is a diagram illustrating captured images on which the first image is presented by the presentation unit 2080.
- FIG. 4A is a diagram for the case where the camera parameters acquired by the parameter acquisition unit 2040 represent a position and orientation that approximate those of the real camera.
- FIG. 4B is a diagram for the case where the camera parameters acquired by the parameter acquisition unit 2040 represent a position and orientation different from those of the real camera.
- The target object in FIG. 4 is a rectangular parallelepiped with a height of 170 cm and vertical and horizontal lengths of 30 cm, as in FIG. 2.
- The first image presented by the presentation unit 2080 is rendered on the captured image as if the target object placed at the designated location had been captured by a camera installed at the position and orientation represented by the camera parameters. Therefore, when the camera parameters represent a position and orientation that approximate those of the real camera, there is no incongruity in size or angle when the first image is compared with the people and objects shown on the captured image. For example, since the height of the target object is 170 cm, it should appear roughly the same height as a person standing next to it.
- In FIG. 4, since positions beside the people shown in the captured image 10 are designated as first positions, the first images 40 are presented beside the people.
- In FIG. 4A, the person and the rectangular parallelepiped represented by the first image 40 are almost the same size at every position, so there is no sense of incongruity.
- Also in FIG. 4A, each rectangular parallelepiped represented by a first image 40 appears to be looked down on diagonally from the front, in the same way that the people and walls are seen diagonally from above the front; there is no sense of incongruity in how each rectangular parallelepiped looks.
- In contrast, in FIG. 4B there is a sense of incongruity in both the size and the angle of the rectangular parallelepipeds represented by the first images 40.
- For example, the rectangular parallelepiped represented by the first image 40-10 is nearly twice as tall as the person, so the first image 40-10 cannot be said to represent a 170 cm tall object (the rectangular parallelepiped 20) placed at the location shown in the captured image 10-2.
- Further, the upper surfaces of all the rectangular parallelepipeds are visible in the captured image 10-2, as if they were being looked down on from nearby.
- From this, it can be inferred that the depression angle of the camera viewing direction represented by the camera parameters in FIG. 4B is larger than the depression angle of the viewing direction of the real camera.
- Note that the user can designate a plurality of first positions and place a plurality of target objects in one captured image.
- As described above, by comparing the first image presented by the presentation unit 2080 with the captured image, the user of the image processing apparatus 2000 can easily grasp whether the camera parameters acquired by the parameter acquisition unit 2040 represent a position and orientation that approximate those of the camera that captured the image. If they do, the user can determine that the combination of the camera parameters and the surveillance camera video may be used. Conversely, if they do not, the user can take actions such as re-estimating the camera parameters or correcting the recorded position and orientation of the real camera.
- Each functional component of the image processing apparatus 2000 may be realized by hardware (e.g., a hard-wired electronic circuit) that implements the component, or by a combination of hardware and software (e.g., a combination of an electronic circuit and a program that controls it).
- FIG. 5 is a block diagram illustrating the hardware configuration of the image processing apparatus 2000.
- The image processing apparatus 2000 includes a bus 1020, a processor 1040, a memory 1060, a storage 1080, and an input/output interface 1100.
- The bus 1020 is a data transmission path through which the processor 1040, the memory 1060, the storage 1080, and the input/output interface 1100 exchange data with one another.
- However, the method of connecting the processor 1040 and the other components is not limited to a bus connection.
- The processor 1040 is an arithmetic processing device such as a CPU (Central Processing Unit) or a GPU (Graphics Processing Unit).
- The memory 1060 is a memory such as a RAM (Random Access Memory) or a ROM (Read Only Memory).
- The storage 1080 is a storage device such as a hard disk, an SSD (Solid State Drive), or a memory card. The storage 1080 may also be a memory such as a RAM or a ROM.
- The input/output interface 1100 is an interface through which the image processing apparatus 2000 exchanges data with input devices, external apparatuses, and the like.
- For example, the image processing apparatus 2000 acquires the captured image, the first position, and the like via the input/output interface 1100, and outputs the captured image on which the first image is presented via the input/output interface 1100.
- The storage 1080 stores a program for realizing the functions of the image processing apparatus 2000. Specifically, it stores program modules that implement the functions of the display unit 2020, the parameter acquisition unit 2040, the input unit 2060, and the presentation unit 2080.
- The processor 1040 implements the functions of these units by executing the program modules. Here, the processor 1040 may execute the modules after reading them into the memory 1060, or may execute them without reading them into the memory 1060.
- Each program module may instead be stored in the memory 1060. In that case, the image processing apparatus 2000 need not include the storage 1080.
- The camera parameters may include parameters other than the position and orientation of the camera.
- For example, the camera parameters include internal parameters representing internal characteristics of the camera, such as the focal length, lens distortion, and image center coordinates.
- The position and orientation of the camera are external parameters that represent the external characteristics of the camera.
- The camera parameters can be calculated by associating two-dimensional coordinates on the captured image with three-dimensional coordinates in real space.
- Note that the three-dimensional coordinates in real space corresponding to given two-dimensional coordinates are not uniquely determined from the two-dimensional coordinates on the captured image alone.
- Therefore, the second position in real space corresponding to the first position on the captured image is uniquely determined by specifying the height information (z coordinate) of the second position in real space.
- For example, the origin in real space is set on the ground surface directly below the camera, the x and y coordinates are set in the horizontal and vertical directions parallel to the ground surface, and the z coordinate is set in the direction perpendicular to the ground surface.
- The technique of mutually converting coordinates on the image and coordinates in real space using the camera parameters is a known technique, described for example in Non-Patent Document 1, so further detailed explanation is omitted.
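The following is a minimal sketch, not part of the patent text, of the image-to-real-space direction of this known conversion, assuming a simple pinhole model with intrinsic matrix K, rotation R, and translation t, and no lens distortion; all names are illustrative:

```python
import numpy as np

def backproject_to_height(u, v, K, R, t, z0):
    """Map pixel (u, v) to the 3D point at a known height z0.

    Pinhole model: s * [u, v, 1]^T = K @ (R @ X + t).
    The ray through the pixel is X(s) = R.T @ (s * inv(K) @ p - t);
    s is chosen so that the z coordinate of X equals z0.
    """
    p = np.linalg.inv(K) @ np.array([u, v, 1.0])
    o = -R.T @ t            # camera center in world coordinates
    d = R.T @ p             # ray direction in world coordinates
    s = (z0 - o[2]) / d[2]  # scale at which the ray meets the plane z = z0
    return o + s * d        # (x, y, z0) in real space
```

With the coordinate system described above (z perpendicular to the ground surface), passing z0 = 0 maps a pixel onto the ground surface.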
- The parameter acquisition unit 2040 acquires camera parameters in various ways.
- For example, the parameter acquisition unit 2040 receives camera parameters transmitted from an external device.
- Alternatively, the parameter acquisition unit 2040 accepts manual input of camera parameters.
- Alternatively, the parameter acquisition unit 2040 reads camera parameters from a storage device that stores them.
- The display unit 2020 displays the captured image on a display screen such as a display.
- The display screen may be a stationary display, or a portable display provided in a mobile terminal or the like.
- The input unit 2060 can accept designation of the first position by any of various methods capable of specifying a position on the captured image. For example, the input unit 2060 receives an operation (such as a click) that designates an arbitrary position on the captured image with an input device such as a mouse. When the captured image is displayed on a touch panel, the input unit 2060 accepts a touch input on an arbitrary position on the captured image. The input unit 2060 may also accept input of coordinates representing a position on the captured image.
- The target object is, for example, an object having a predetermined size and shape in real space.
- For example, the information defining the predetermined target object described above, "a rectangular parallelepiped with a height of 170 cm and vertical and horizontal lengths of 30 cm", is stored in advance inside or outside the image processing apparatus 2000.
- The presentation unit 2080 uses this predetermined target object.
- The image processing apparatus 2000 may also have a function of accepting input of information that defines the target object.
- In this case, information representing both the shape and size of the target object in real space may be received, or information representing only the shape or only the size may be received.
- For example, the shape of the target object is determined in advance as a rectangular parallelepiped, and the designation of its size (vertical and horizontal lengths and height) is received from the user.
- The shape of the target object is not limited to a rectangular parallelepiped.
- For example, the target object may be a cone or a sphere.
- The target object may also be an object representing the shape of a person or an animal, such as an avatar.
- Further, the target object may have a planar shape.
- FIG. 6 is a diagram illustrating how a first image 40 representing a planar target object is presented on the captured image 10.
- In this case, for example, the user specifies the vertical and horizontal lengths of the plane.
- When the camera parameters appropriately represent the position and orientation of the real camera, the first image presented by the presentation unit 2080 appears parallel to the ground surface.
- Therefore, by comparing the first image 40 presented by the presentation unit 2080 with the ground surface shown in the captured image 10 and judging whether the plane represented by the first image 40 is parallel to the ground surface, the user can easily confirm whether the camera parameters appropriately represent the position and orientation of the real camera.
- Whether the camera parameters appropriately represent the position and orientation can also be confirmed easily by comparing the appearance and size of the first image with an object of known size that appears in the captured image.
- The presentation unit 2080 generates an image representing the target object on the captured image as it would appear to the camera determined by the camera parameters when the target object is placed at the second position. For example, the presentation unit 2080 performs the following processing.
- First, the presentation unit 2080 calculates the second position in real space corresponding to the first position on the captured image.
- As described above, the second position (three-dimensional coordinates) in real space is not uniquely determined from the first position (two-dimensional coordinates) alone. Therefore, the presentation unit 2080 acquires information indicating the height of the second position (the z coordinate of the second position).
- By doing so, the position in real space corresponding to the first position on the captured image is uniquely determined.
- The presentation unit 2080 calculates the three-dimensional coordinates of the second position using the two-dimensional coordinates representing the first position, the height information of the second position, and the camera parameters. As described above, using these pieces of information, two-dimensional coordinates on the captured image can be converted into three-dimensional coordinates in real space.
- The height information of the second position may be given to the presentation unit 2080 in advance or input from the outside. Alternatively, a different height may be set for each of a plurality of regions in the captured image.
- Next, the presentation unit 2080 generates the first image representing the target object to be presented on the captured image. For example, when the target object is a rectangular parallelepiped or a cone, the presentation unit 2080 calculates the coordinates of each vertex of the target object on the captured image in order to generate the first image. Specifically, using the camera parameters, the presentation unit 2080 converts the three-dimensional coordinates of each vertex of the target object placed at the second position in real space into the two-dimensional coordinates of that vertex on the captured image. Then, the presentation unit 2080 generates the first image by connecting the vertices with straight lines or the like.
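As a rough illustration of this vertex-projection step (again a sketch under the pinhole-model assumptions above, not the patent's own implementation), the vertices of an axis-aligned rectangular parallelepiped placed at the second position can be projected as follows:

```python
import numpy as np

def project_point(X, K, R, t):
    """Project a 3D world point onto the image plane (pinhole, no distortion)."""
    x = K @ (R @ X + t)
    return x[:2] / x[2]

def cuboid_image_vertices(base, w, d, h, K, R, t):
    """Image coordinates of the 8 vertices of a w x d x h cuboid whose
    bottom face is centered at `base` (the second position), with its
    sides parallel to the x, y, and z axes."""
    hw, hd = w / 2, d / 2
    corners = [base + np.array([sx * hw, sy * hd, sz * h])
               for sx in (-1, 1) for sy in (-1, 1) for sz in (0, 1)]
    return [project_point(c, K, R, t) for c in corners]
```

The first image is then obtained by drawing the 12 edges of the cuboid between these projected vertices.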
- Here, the angle at which the target object is placed in real space is arbitrary.
- For example, the presentation unit 2080 places the target object at the second position with its horizontal side parallel to the x axis, its vertical side parallel to the y axis, and its height-direction side parallel to the z axis.
- The directions of these sides may be determined in advance or designated by the user. For example, when using a planar target object as in the captured image 10 of FIG. 6, aligning one side of the target object with a line on the ground surface makes it easier to judge whether the target object and the ground surface are parallel.
- For this purpose, for example, the image processing apparatus 2000 enables the user to rotate the target object presented on the captured image 10 with a mouse or the like. For example, when the target object or its periphery on the captured image 10 is dragged with the mouse, the image processing apparatus 2000 determines the direction in which to rotate the target object according to the drag direction.
- For example, the rotation direction is clockwise when dragging to the left and counterclockwise when dragging to the right.
- Further, the image processing apparatus 2000 determines the angle by which to rotate the target object according to the drag distance; in this case, the relationship between drag distance and rotation angle is defined in advance. The image processing apparatus 2000 then rotates the target object in the determined direction and by the determined angle, using a straight line passing through the second position (for example, a straight line parallel to the z axis) as the rotation axis. The user can then align a side of the target object with a line on the ground surface and compare the target object on the captured image 10 with the ground surface.
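A minimal sketch of such a drag-driven rotation, assuming rotation about a vertical axis through the second position and an illustrative pixels-to-degrees ratio (neither is specified by the patent):

```python
import numpy as np

def rotate_about_vertical(points, pivot, degrees):
    """Rotate 3D points about the vertical (z) axis passing through `pivot`."""
    a = np.radians(degrees)
    Rz = np.array([[np.cos(a), -np.sin(a), 0.0],
                   [np.sin(a),  np.cos(a), 0.0],
                   [0.0,        0.0,       1.0]])
    return [pivot + Rz @ (p - pivot) for p in points]

def drag_to_angle(drag_dx_pixels, pixels_per_degree=5.0):
    """Map horizontal drag distance to a rotation angle (sign = direction).
    The 5 px/degree ratio is an arbitrary, predefined example value."""
    return drag_dx_pixels / pixels_per_degree
```

The rotated vertices would then be re-projected with the projection sketch above to redraw the first image.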
- Note that the second position is not limited to a point inside the target object; it may be outside the target object.
- Further, the presentation unit 2080 may accept an operation for moving the target object on the captured image 10.
- For example, the user moves the target object on the captured image 10 by an operation such as dragging on the captured image 10 with the right mouse button.
- In this case, the input unit 2060 repeatedly acquires the position of the moving mouse pointer as the first position described above, for example at predetermined time intervals.
- Then, based on the newly acquired first position, the camera parameters, and the height information of the second position, the presentation unit 2080 generates a new first image 40 and presents it at the newly acquired first position on the captured image 10.
- In addition, the presentation unit 2080 deletes from the captured image 10 the first image 40 that was presented at the previously acquired first position. By doing so, from the user's point of view, the target object appears to move through the space shown in the captured image 10.
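Putting the pieces together, a hypothetical pointer-move handler could look like the following; the `canvas` drawing API and `state` container are invented for illustration, and only the projection logic follows the text:

```python
def on_pointer_moved(u, v, state):
    # New second position: back-project the pointer onto the known height.
    X = backproject_to_height(u, v, state.K, state.R, state.t,
                              state.ground_height)
    # Re-render a 30 cm x 30 cm x 170 cm cuboid at the new position.
    verts = cuboid_image_vertices(X, 0.3, 0.3, 1.7,
                                  state.K, state.R, state.t)
    state.canvas.erase(state.last_first_image)   # drop the stale drawing
    state.last_first_image = state.canvas.draw_edges(verts)
```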
- FIG. 13 is a diagram illustrating how the user moves the target object on the captured image 10.
- In FIG. 13, a trajectory 170 represents the path along which the user moved the target object.
- Each of the first positions 30-1 to 30-5 represents a position on the trajectory 170.
- The first images 40-1 to 40-5 represent the first images 40 presented at the first positions 30-1 to 30-5, respectively.
- The first images 40 drawn with dotted lines represent first images 40 that have already been deleted from the captured image 10, and the first image 40 drawn with a solid line represents the first image 40 currently presented.
- In FIG. 13, the currently designated first position 30 is the first position 30-5; therefore, the first image 40-5 is presented, and the first images 40-1 to 40-4 have been deleted.
- For example, the user moves the target object so that it passes beside the people and objects shown on the captured image 10 and checks whether the target object looks unnatural.
- If there is no sense of incongruity in the size and orientation of the target object even when it is moved beside any person, the camera parameters acquired by the parameter acquisition unit 2040 can be considered to represent a position and orientation that approximate those of the camera that captured the captured image 10.
- In this way, the user can easily verify whether the appearance of the target object is free of incongruity at various positions on the captured image 10. In particular, providing a way to view the target object while moving it continuously appeals more strongly to human visual judgment of plausibility and incongruity, which works effectively for verification.
- Height information may also be set for each stepped region in a captured image showing stepped regions such as stairs.
- In this case, by moving the target object on the image along such a trajectory, the user can easily verify whether the appearance of the target object is free of incongruity, seamlessly including across the steps.
- FIG. 7 is a block diagram illustrating the image processing apparatus 2000 according to the second embodiment.
- In FIG. 7, arrows indicate the flow of information. Further, each block represents a configuration in functional units, not hardware units.
- In the second embodiment, the image processing apparatus 2000 includes a display unit 2020, a parameter acquisition unit 2040, a second input unit 2100, and a second display unit 2120.
- The functions of the display unit 2020 and the parameter acquisition unit 2040 of the present embodiment are the same as those described in the first embodiment.
- The second input unit 2100 receives the input of a point or line on the captured image displayed by the display unit 2020.
- The second display unit 2120 displays an image representing the input point or line when mapped onto a plane parallel to the ground surface, based on the camera parameters, the position of the point or line on the captured image, and the height information of the point or line in real space.
- In other words, the second display unit 2120 displays an image in which the input point or line is mapped onto a plane parallel to the ground surface, assuming that the point or line exists in the field of view of the camera that captured the image.
- The second display unit 2120 may display this image on the same display as the captured image displayed by the display unit 2020, or on a different display or the like.
- The height information of the input point or line in real space may be given to the second display unit 2120 in advance, or may be input to the second input unit 2100 together with the point or line.
- The second display unit 2120 maps the point or line on the captured image onto a plane parallel to the ground surface in real space.
- First, the method of mapping a point will be described.
- The second display unit 2120 converts the two-dimensional coordinates of the point on the captured image into three-dimensional coordinates in real space.
- As described above, the three-dimensional coordinates corresponding to two-dimensional coordinates on the captured image are not uniquely determined. Therefore, the second display unit 2120 uses the height information of the input point: the height of the input point in real space is assumed to be the given height. Accordingly, the second display unit 2120 can uniquely convert the two-dimensional coordinates on the captured image into three-dimensional coordinates in real space.
- The position of the input point on the plane parallel to the ground surface is given by the horizontal and vertical coordinates (the x and y coordinates, excluding the z coordinate representing height) of the calculated three-dimensional coordinates.
- The technique for calculating these coordinates is a known technique, so its detailed description is omitted.
- The principle of mapping a line input on the captured image onto a plane parallel to the ground surface is the same as that of mapping a point.
- Specifically, the second display unit 2120 maps two or more points on the input line (for example, the points at both ends) onto the plane parallel to the ground surface in real space.
- The second display unit 2120 then connects the mapped points with a straight line or the like. In this way, the line input on the captured image is mapped onto a plane parallel to the ground surface in real space.
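Reusing the back-projection sketch from earlier, this endpoint mapping can be illustrated as follows (again a hedged sketch; the function name is carried over from the earlier illustrative code):

```python
def map_line_to_ground(p0, p1, K, R, t, height=0.0):
    """Map a line drawn on the image onto the plane z = height,
    returning the (x, y) ground-plane coordinates of its endpoints."""
    a = backproject_to_height(p0[0], p0[1], K, R, t, height)
    b = backproject_to_height(p1[0], p1[1], K, R, t, height)
    return a[:2], b[:2]
```

Drawing a segment between the returned (x, y) pairs yields the projection line shown in the top-down view of FIG. 9.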
- FIG. 8 is a diagram illustrating a captured image 10 on which lines are input via the second input unit 2100.
- The dotted lines 90 represent lines input to the second input unit 2100.
- The patterns 100 are lines drawn on the ground surface in the real world shown in the captured image.
- Here, the pattern 100-1 and the pattern 100-2 are parallel to each other in the real world.
- The boundaries 110 are boundaries between walls and the ground surface in the real world shown in the captured image.
- Here, the boundary 110-1 and the boundary 110-2 intersect each other perpendicularly in the real world.
- The second display unit 2120 maps the dotted lines 90 onto a plane parallel to the ground surface, and then displays the mapped dotted lines 90 as viewed from a direction perpendicular to that plane.
- FIG. 9 is a diagram illustrating images of the dotted lines 90 mapped onto the plane representing the ground surface, viewed from a direction perpendicular to that plane.
- FIG. 9A is a diagram for the case where the camera parameters represent a position and orientation that approximate those of the real camera.
- As described above, the pattern 100-1 and the pattern 100-2 are drawn parallel to each other. For this reason, in FIG. 9A, the projection line 120-1, onto which the dotted line 90-1 is mapped, and the projection line 120-2, onto which the dotted line 90-2 is mapped, are parallel or nearly parallel to each other. Further, as described above, in the real world (at the place shown in the captured image) the boundary 110-3 and the boundary 110-4 intersect each other perpendicularly. Therefore, in FIG. 9A, the projection line 120-3, onto which the dotted line 90-3 is mapped, and the projection line 120-4, onto which the dotted line 90-4 is mapped, intersect each other perpendicularly or at an angle close to perpendicular.
- On the other hand, FIG. 9B is a diagram for the case where the camera parameters represent a position and orientation different from those of the real camera.
- In this case, the projection line 120-1 and the projection line 120-2 are not parallel or nearly parallel, or the projection line 120-3 and the projection line 120-4 do not intersect perpendicularly or nearly perpendicularly.
- In this way, using patterns, boundaries, and the like whose real-world relationships are known or easy to predict, such as the pattern 100 and the boundary 110 in the captured image shown in FIG. 8, the user can easily confirm whether the camera parameters appropriately represent the position and orientation of the real camera by looking at the results displayed by the second display unit 2120.
- Note that the method of using patterns on the ground surface is not limited to the method described above.
- For example, a method of inputting a plurality of points on the pattern 100-1 and confirming whether those points are arranged on a straight line is also conceivable.
- FIG. 17 is a diagram illustrating the projection line 180, on the plane representing the ground surface described with reference to FIG. 9, of the target object presented on the captured image.
- FIG. 17A shows the case where the projection line 180 of the target object is presented while the first image representing a stationary target object is presented on the captured image (e.g., FIG. 2B).
- FIG. 17B shows the case where, when an operation for moving the target object is performed (e.g., FIG. 13), the projection line 180 of the target object is moved in accordance with the movement of the target object on the captured image.
- Here, the trajectory 190 represents the trajectory of the movement of the projection line 180.
- For an object whose original shape is known, a line tracing that shape may be input to the second input unit 2100.
- If the camera parameters appropriately represent the position and orientation of the real camera, the shape of the line displayed by the second display unit 2120 is close to the original shape of the traced object.
- For example, when a line tracing a circular object is input, the shape of the line displayed by the second display unit 2120 is a perfect circle or a shape close to a perfect circle.
- Conversely, if the camera parameters do not appropriately represent the position and orientation of the real camera, the shape of the line presented by the second display unit 2120 is a shape different from a perfect circle (for example, an ellipse).
- FIG. 10 is a diagram illustrating an image in which the position and field of view of the camera are presented together with the projection lines shown in FIG. 9.
- In FIG. 10, a camera position 150 represents the position of the camera, and a field of view 160 represents the field of view of the camera.
- A system administrator or the like handling the image processing apparatus 2000 of the second embodiment checks whether the camera parameters appropriately represent the position and orientation of the real camera from the positional relationships among the points and lines mapped onto the plane parallel to the ground surface. Here, when the camera position and field of view are presented together with the mapped points and lines, as shown in FIG. 10, the administrator can additionally grasp the positional relationship between those points and lines and the camera's field of view. Therefore, the administrator can confirm more easily and accurately whether the camera parameters appropriately represent the position and orientation of the real camera.
- FIG. 11 is a flowchart illustrating the flow of processing executed by the image processing apparatus 2000 according to the second embodiment.
- The processes performed in steps S102 and S106 are the same as those performed in steps S102 and S106 of FIG. 3.
- In step S202, the second input unit 2100 receives the input of a point or line on the captured image displayed by the display unit 2020.
- In step S204, the second display unit 2120 displays an image representing the point or line when mapped onto a plane parallel to the ground surface.
- According to the present embodiment, the user inputs, on the captured image, a line or the like whose original shape or positional relationships are easy to identify, and checks whether the line or the like displayed by the second display unit 2120 satisfies that shape or those positional relationships. In this way, the user can easily confirm whether the camera parameters appropriately represent the position and orientation of the real camera.
- FIG. 15 is a block diagram illustrating an image processing apparatus 3000 according to a third embodiment.
- In FIG. 15, arrows indicate the flow of information. Further, each block represents a configuration in functional units, not hardware units.
- The image processing apparatus 3000 includes an input unit 3020 and a presentation unit 3040.
- The input unit 3020 receives the input of an operation for moving a first image presented on a captured image captured by a camera.
- The first image is an image of a target object having a predetermined shape and a predetermined size in real space, superimposed on the captured image based on predetermined camera parameters representing the position and orientation of the camera. For example, if the first image is presented at a position A on the captured image, it corresponds to the first image that the presentation unit 2080 of the first embodiment would present when the position A is designated as the first position in the image processing apparatus 2000.
- The target object in the third embodiment is the same as the target object described in the first embodiment.
- Likewise, the predetermined camera parameters in the third embodiment are the same as the camera parameters described in the first embodiment.
- The presentation unit 3040 presents the first image representing the target object in a manner corresponding to the position on the captured image after the movement, based on the camera parameters.
- The method by which the presentation unit 3040 presents the first image corresponding to the moved target object is the same as the method, described in the first embodiment, by which the presentation unit 2080 presents the first image 40 corresponding to the target object moved on the captured image 10.
- The hardware configuration of the image processing apparatus 3000 is the same as that of the image processing apparatus 2000.
- FIG. 16 is a flowchart illustrating the flow of processing executed by the image processing apparatus 3000 according to the third embodiment.
- The input unit 3020 receives the input of a movement operation for the first image superimposed on the captured image.
- The presentation unit 3040 presents the first image representing the target object in a manner corresponding to the position on the captured image after the movement, based on the camera parameters.
- According to the present embodiment, the user can verify the camera parameters by moving the target object so that it passes beside the people and objects shown on the captured image 10. As described in the first embodiment, viewing the target object while moving it continuously appeals more strongly to human visual judgment of plausibility and incongruity, which works effectively for verification.
- The image processing apparatus 2000 may further have the following functions. The image processing apparatus 2000 having the following functions is referred to as the image processing apparatus 2000 of the first modification.
- Note that the image processing apparatus 2000 of the first modification may or may not have the functions of the image processing apparatuses 2000 of the first and second embodiments described above.
- Camera parameters are estimated by, for example, capturing a calibration pattern or its equivalent with a camera and estimating them based on the correspondence between the three-dimensional coordinates of the calibration pattern in the real world and its two-dimensional coordinates on the captured image (Non-Patent Document 1). Specifically, the camera parameters are calculated so as to reduce the error (reprojection error) between the two-dimensional coordinates obtained by projecting the three-dimensional coordinates of the calibration pattern onto the captured image using the estimated camera parameters and the two-dimensional coordinates of the calibration pattern actually shown in the captured image. For example, there is a method of calculating the estimated camera parameters so that the sum of squared errors is minimized.
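A hedged sketch of computing this per-point reprojection error with OpenCV (assuming parameters in OpenCV's calibration format; this is not the patent's own code):

```python
import numpy as np
import cv2

def reprojection_errors(obj_pts, img_pts, K, dist, rvec, tvec):
    """Per-point reprojection error: the distance between each observed
    2D point and the projection of its 3D counterpart under the
    estimated camera parameters."""
    proj, _ = cv2.projectPoints(obj_pts, rvec, tvec, K, dist)
    return np.linalg.norm(proj.reshape(-1, 2) - img_pts.reshape(-1, 2),
                          axis=1)
```

The sum of squares of the returned values is the quantity that the least-squares estimation method mentioned above minimizes.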
- In general, the system administrator or the like sees only the camera parameters that are the final estimation result, and does not see the above error, which is an intermediate product of the estimation.
- However, the accuracy of camera parameter estimation can be improved by showing this intermediate error to the system administrator or the like. For example, when positions with large errors are concentrated near the edges of the captured image, the errors are considered to be large due to input errors of corresponding points or lens distortion.
- In this case, the accuracy of the camera parameters can be improved by changing how calibration patterns are selected, for example so that calibration patterns captured within a predetermined distance from the edge of the image are not used for camera parameter estimation.
- FIG. 12 is a diagram illustrating how information indicating the error (error information 140) is presented on the captured image.
- In FIG. 12, people are used to obtain calibration patterns.
- Specifically, a line 130 connecting the feet and head of a roughly upright person is used as a calibration pattern.
- The error information 140 presented beside each line 130 indicates the reprojection error corresponding to that line 130.
- Note that the image processing apparatus 2000 may map the calibration patterns onto the ground surface by the method described in the second embodiment and display each error in association with the calibration pattern mapped onto the ground surface.
Abstract
Description
FIG. 1 is a block diagram illustrating an image processing apparatus 2000 according to a first embodiment. In FIG. 1, the arrows represent the flow of information, and each block represents a functional unit rather than a hardware unit.
FIG. 3 is a flowchart illustrating the flow of processing executed by the image processing apparatus 2000 of the first embodiment. In step S102, the display unit 2020 displays a captured image captured by a camera. In step S104, the input unit 2060 receives designation of a first position on the captured image. In step S106, the parameter acquisition unit 2040 acquires camera parameters representing the position, orientation, and other properties of the camera. In step S108, the presentation unit 2080 generates a first image. As described above, the first image represents the target object as it would appear on the captured image, seen by the camera determined by the camera parameters, when the target object is placed at the second position. Then, in step S110, the presentation unit 2080 presents the generated first image at the first position on the captured image.
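As a concrete illustration of this S102-S110 loop, the following sketch uses OpenCV; `verification_loop` and `render_at` are hypothetical names invented for the example, and the generation of the first image (steps S106-S108) is delegated to a callback such as the `render_target_object` helper sketched further below.

```python
# A minimal, illustrative sketch of the S102-S110 flow (not code from the
# specification). "render_at" stands in for steps S106-S110: it converts a
# clicked first position into a second position and draws the target object.
import cv2

def verification_loop(captured_image, render_at):
    window = "captured image"
    frame = captured_image.copy()

    def on_mouse(event, x, y, flags, _param):
        nonlocal frame
        if event == cv2.EVENT_LBUTTONDOWN:       # S104: first position given
            frame = render_at(captured_image.copy(), (x, y))

    cv2.namedWindow(window)                      # S102: display the image
    cv2.setMouseCallback(window, on_mouse)
    while cv2.waitKey(30) != 27:                 # Esc ends the verification
        cv2.imshow(window, frame)
    cv2.destroyWindow(window)
```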
According to the present embodiment, by looking at the object presented by the presentation unit 2080, the user of the image processing apparatus 2000 can easily check whether the camera parameters appropriately represent the position, orientation, and other properties of the camera that captured the image displayed by the display unit 2020 (hereinafter, the real camera). This is described in detail below with reference to FIG. 4.
Each functional component of the image processing apparatus 2000 may be implemented by hardware components that realize it (e.g., hard-wired electronic circuits), or by a combination of hardware and software components (e.g., an electronic circuit together with a program that controls it).
As described above, the camera parameters may include parameters other than the position and orientation of the camera. For example, the camera parameters include intrinsic parameters representing internal characteristics of the camera, such as the focal length, the lens distortion, or the coordinates of the image center. The position and orientation of the camera are extrinsic parameters representing external characteristics of the camera. The camera parameters can be computed by associating two-dimensional coordinates on the captured image with three-dimensional coordinates in real space.
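For illustration, such an association can be solved with OpenCV's `solvePnP`; the intrinsic matrix and the four point pairs below are invented sample values, not values from this specification.

```python
# Illustrative sketch: estimate the extrinsic parameters (orientation and
# position) from 2D-3D correspondences. All numbers are made-up examples.
import cv2
import numpy as np

# Intrinsic parameters: focal length and image-center coordinates.
K = np.array([[1000.0,    0.0, 960.0],
              [   0.0, 1000.0, 540.0],
              [   0.0,    0.0,   1.0]])
dist = np.zeros(5)  # lens distortion assumed negligible in this sketch

# 3D real-space coordinates (meters; z-axis up, z = 0 on the ground) and
# the matching 2D pixel coordinates picked on the captured image.
object_points = np.array([[0.0, 0.0, 0.0], [2.0, 0.0, 0.0],
                          [2.0, 3.0, 0.0], [0.0, 3.0, 0.0]])
image_points = np.array([[512.0, 800.0], [1403.0, 790.0],
                         [1210.0, 430.0], [705.0, 435.0]])

ok, rvec, tvec = cv2.solvePnP(object_points, image_points, K, dist)
R, _ = cv2.Rodrigues(rvec)               # rotation matrix: camera orientation
camera_position = (-R.T @ tvec).ravel()  # camera position in real space
```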
The display unit 2020 displays the captured image on a display screen such as a display device. This display screen may be a stationary display, or a portable display provided on a mobile terminal or the like.
The input unit 2060 can receive the designation of the first position by any of various methods that can specify a position on the captured image. For example, the input unit 2060 receives an operation (such as a click) designating an arbitrary position on the captured image with an input device such as a mouse. When the captured image is displayed on a touch panel, the input unit 2060 receives a touch input or the like at an arbitrary position on the captured image. The input unit 2060 may also receive an input of coordinates representing a position on the captured image.
The target object is, for example, an object with a predetermined size and shape in real space. For example, information defining the aforementioned predetermined target object, "a cuboid 170 cm tall whose width and depth are each 30 cm," is stored in advance inside or outside the image processing apparatus 2000. In this case, the presentation unit 2080 uses this predetermined target object.
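For illustration only, such a stored definition could be expressed as a simple record like the one below; the type and field names are hypothetical, not a format prescribed by this specification.

```python
# Illustrative record for the predetermined target object described above.
from dataclasses import dataclass

@dataclass
class TargetObject:
    width_m: float = 0.30   # horizontal width in real space (meters)
    depth_m: float = 0.30   # horizontal depth in real space (meters)
    height_m: float = 1.70  # height in real space (meters)
```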
As described above, the presentation unit 2080 generates an image representing the target object as it would appear on the captured image, seen by the camera determined by the camera parameters, when the target object is placed at the second position. For example, the presentation unit 2080 performs processing such as the following.
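One possible concrete form of this processing is sketched below, reusing the camera model (`K`, `dist`, `rvec`, `tvec`) from the earlier calibration sketch; the wireframe rendering and the function name are illustrative assumptions, not the method fixed by this specification.

```python
# Illustrative sketch: place the cuboid at the second position on the ground
# plane, project its corners through the camera, and draw the wireframe as
# the first image.
def render_target_object(image, second_pos, K, dist, rvec, tvec,
                         w=0.30, d=0.30, h=1.70):
    x, y = second_pos  # second position on the ground plane (meters)
    corners = np.array([[0, 0, 0], [w, 0, 0], [w, d, 0], [0, d, 0],
                        [0, 0, h], [w, 0, h], [w, d, h], [0, d, h]],
                       dtype=float)
    corners += [x - w / 2, y - d / 2, 0.0]     # center the base on (x, y)
    pts, _ = cv2.projectPoints(corners, rvec, tvec, K, dist)
    pts = pts.reshape(-1, 2)
    for i, j in [(0, 1), (1, 2), (2, 3), (3, 0),   # bottom face
                 (4, 5), (5, 6), (6, 7), (7, 4),   # top face
                 (0, 4), (1, 5), (2, 6), (3, 7)]:  # vertical edges
        p1 = tuple(int(c) for c in pts[i])
        p2 = tuple(int(c) for c in pts[j])
        cv2.line(image, p1, p2, (0, 255, 0), 2)
    return image
```

If the rendered cuboid does not appear to stand on the ground, or its size looks wrong next to people in the scene, the camera parameters are likely inaccurate, which is exactly the visual check this embodiment provides.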
FIG. 7 is a block diagram illustrating the image processing apparatus 2000 according to a second embodiment. In FIG. 7, the arrows represent the flow of information, and each block represents a functional unit rather than a hardware unit.
FIG. 11 is a flowchart illustrating the flow of processing executed by the image processing apparatus 2000 of the second embodiment. The processing performed in steps S102 and S106 is the same as in steps S102 and S106 of FIG. 3. In step S202, the second input unit 2100 receives an input of a point or a line on the captured image displayed by the display unit 2020. In step S204, the second display unit 2120 displays an image representing that point or line as mapped onto a plane parallel to the ground surface.
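The mapping of step S204 amounts to a ray-plane intersection. The sketch below (illustrative names, reusing `K`, `R`, and `camera_position` from the earlier calibration sketch) back-projects a single pixel onto the plane z = height; a line is mapped by back-projecting its endpoints.

```python
# Illustrative sketch: map a pixel on the captured image onto a plane
# parallel to the ground surface (z = height) by intersecting the viewing
# ray through that pixel with the plane.
def backproject_to_plane(pixel, K, R, camera_position, height=0.0):
    u, v = pixel
    ray_cam = np.linalg.inv(K) @ np.array([u, v, 1.0])  # ray in camera frame
    ray_world = R.T @ ray_cam                           # ray in world frame
    t = (height - camera_position[2]) / ray_world[2]    # reach z = height
    return camera_position + t * ray_world              # 3D point on plane
```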
According to the image processing apparatus 2000 of the present embodiment, the user inputs on the captured image lines or the like whose true shape and positional relationship are easy to identify, and then checks whether the lines or the like displayed by the second display unit 2120 satisfy that shape and positional relationship. In this way the user can easily confirm whether the camera parameters appropriately represent the position, orientation, and other properties of the real camera.
FIG. 15 is a block diagram illustrating an image processing apparatus 3000 according to a third embodiment. In FIG. 15, the arrows represent the flow of information, and each block represents a functional unit rather than a hardware unit.
FIG. 16 is a flowchart illustrating the flow of processing executed by the image processing apparatus 3000 of the third embodiment. In step S302, the input unit 3020 receives an input of a movement operation on the first image superimposed on the captured image. In step S304, the presentation unit 3040 presents, based on the camera parameters, a first image representing the target object with an appearance corresponding to the position on the captured image after the movement.
According to the present embodiment, as illustrated in FIG. 13 and FIG. 14, for example, the user can move the target object so that it passes beside a person or other subject appearing in the captured image 10, and can thereby easily check whether the target object's appearance looks natural. In particular, presenting the target object's appearance while it moves continuously emphasizes the correctness, or any sense of incongruity, perceived by human vision, which makes the movement operation work effectively for verification.
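A continuous movement of this kind can be pictured as below, reusing the `render_target_object` helper from the earlier sketch; the straight-line path and frame rate are arbitrary choices of the example.

```python
# Illustrative sketch of the movement operation: slide the second position
# along a straight ground path and re-render the first image every frame so
# the changing appearance of the target object can be judged visually.
def animate_move(captured_image, K, dist, rvec, tvec, start, goal, steps=60):
    for i in range(steps + 1):
        a = i / steps
        pos = ((1 - a) * start[0] + a * goal[0],     # interpolated ground
               (1 - a) * start[1] + a * goal[1])     # position (meters)
        frame = render_target_object(captured_image.copy(), pos,
                                     K, dist, rvec, tvec)
        cv2.imshow("verification", frame)
        if cv2.waitKey(33) == 27:                    # ~30 fps; Esc aborts
            break
    cv2.destroyWindow("verification")
```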
The image processing apparatus 2000 may further have the following functions. An image processing apparatus 2000 having the following functions is referred to as the image processing apparatus 2000 of Modification 1. The image processing apparatus 2000 of Modification 1 may or may not also have the functions of the image processing apparatuses 2000 of the first and second embodiments described above.
1. An image processing apparatus comprising:
an input unit that receives an input of a movement operation, on the captured image, for a first image that represents a target object having a predetermined shape and a predetermined size in real space and that is superimposed, based on predetermined camera parameters representing the position and orientation of a camera, on a captured image captured by the camera; and
a presentation unit that presents, based on the camera parameters, a first image representing the target object with an appearance corresponding to the position on the captured image after the movement.
2. The image processing apparatus according to 1., wherein the input unit receives the movement operation by repeatedly receiving designation of a first position on the captured image, and
wherein, when a certain first position is designated, the presentation unit generates, based on the camera parameters, the predetermined shape and predetermined size of the target object in real space, and a second position in real space corresponding to the first position, a first image representing the target object on the captured image as seen by the camera determined by the camera parameters when the target object is placed at the second position, and presents the first image at the first position on the captured image.
3. The image processing apparatus according to 2., wherein the presentation unit:
acquires height information of the second position; and
calculates the second position based on the camera parameters, the first position, and the height information of the second position.
4. The image processing apparatus according to 3., wherein the presentation unit acquires, as the height information of the second position, information indicating the height of the ground surface in real space.
5. The image processing apparatus according to 3., wherein the presentation unit acquires, as the height information of the second position, different height information for each of a plurality of regions on the captured image.
6. The image processing apparatus according to any one of 1. to 5., wherein the target object has a planar shape.
7. The image processing apparatus according to any one of 1. to 6., further comprising:
a second input unit that receives an input of a point or a line on the captured image; and
a second display unit that displays a second image representing the point or line as mapped onto a plane parallel to the ground surface, based on the camera parameters, the position of the point or line on the captured image, and height information of the point or line in real space.
8. An image processing apparatus comprising:
an input unit that receives designation of a first position on a captured image; and
a presentation unit that presents, at the first position on the captured image, a first image representing the target object on the captured image as seen by the camera determined by predetermined camera parameters representing the position and orientation of the camera when the target object is placed at a second position in real space corresponding to the first position, based on the camera parameters, a predetermined shape and a predetermined size of the target object in real space, and the second position.
9. The image processing apparatus according to 8., wherein the input unit receives designation of a plurality of first positions, and
the presentation unit presents first images representing a plurality of target objects corresponding to the plurality of first positions, each at the corresponding first position on the captured image.
10. The image processing apparatus according to 8. or 9., wherein the input unit repeatedly receives designation of the first position, and
wherein, when a certain first position is designated, the presentation unit generates a first image representing the target object placed at the second position in real space corresponding to that first position, and presents it at that first position on the captured image.
11. An image processing apparatus comprising:
an input unit that receives an input of a point or a line on a captured image captured by a camera; and
a display unit that displays a first image representing the point or line as mapped onto a plane parallel to the ground surface, based on predetermined camera parameters representing the position and orientation of the camera, the position of the point or line on the captured image, and height information of the point or line in real space.
12. An image processing method executed by a computer, comprising:
an input step of receiving an input of a movement operation, on the captured image, for a first image that represents a target object having a predetermined shape and a predetermined size in real space and that is superimposed, based on predetermined camera parameters representing the position and orientation of a camera, on a captured image captured by the camera; and
a presentation step of presenting, based on the camera parameters, a first image representing the target object with an appearance corresponding to the position on the captured image after the movement.
13. The image processing method according to 12., wherein the input step receives the movement operation by repeatedly receiving designation of a first position on the captured image, and
wherein, when a certain first position is designated, the presentation step generates, based on the camera parameters, the predetermined shape and predetermined size of the target object in real space, and a second position in real space corresponding to the first position, a first image representing the target object on the captured image as seen by the camera determined by the camera parameters when the target object is placed at the second position, and presents the first image at the first position on the captured image.
14. The image processing method according to 13., wherein the presentation step:
acquires height information of the second position; and
calculates the second position based on the camera parameters, the first position, and the height information of the second position.
15. The image processing method according to 14., wherein the presentation step acquires, as the height information of the second position, information indicating the height of the ground surface in real space.
16. The image processing method according to 14., wherein the presentation step acquires, as the height information of the second position, different height information for each of a plurality of regions on the captured image.
17. The image processing method according to any one of 12. to 16., wherein the target object has a planar shape.
18. The image processing method according to any one of 12. to 17., further comprising:
a second input step of receiving an input of a point or a line on the captured image; and
a second display step of displaying a second image representing the point or line as mapped onto a plane parallel to the ground surface, based on the camera parameters, the position of the point or line on the captured image, and height information of the point or line in real space.
19. An image processing method executed by a computer, comprising:
an input step of receiving designation of a first position on a captured image; and
a presentation step of presenting, at the first position on the captured image, a first image representing the target object on the captured image as seen by the camera determined by predetermined camera parameters representing the position and orientation of the camera when the target object is placed at a second position in real space corresponding to the first position, based on the camera parameters, a predetermined shape and a predetermined size of the target object in real space, and the second position.
20. The image processing method according to 19., wherein the input step receives designation of a plurality of first positions, and
the presentation step presents first images representing a plurality of target objects corresponding to the plurality of first positions, each at the corresponding first position on the captured image.
21. The image processing method according to 19. or 20., wherein the input step repeatedly receives designation of the first position, and
wherein, when a certain first position is designated, the presentation step generates a first image representing the target object placed at the second position in real space corresponding to that first position, and presents it at that first position on the captured image.
22. An image processing method executed by a computer, comprising:
an input step of receiving an input of a point or a line on a captured image captured by a camera; and
a display step of displaying a first image representing the point or line as mapped onto a plane parallel to the ground surface, based on predetermined camera parameters representing the position and orientation of the camera, the position of the point or line on the captured image, and height information of the point or line in real space.
23. A program causing a computer to operate as the image processing apparatus according to any one of 1. to 11.
Claims (15)
- An image processing apparatus comprising: an input unit that receives an input of a movement operation, on the captured image, for a first image that represents a target object having a predetermined shape and a predetermined size in real space and that is superimposed, based on predetermined camera parameters representing the position and orientation of a camera, on a captured image captured by the camera; and a presentation unit that presents, based on the camera parameters, a first image representing the target object with an appearance corresponding to the position on the captured image after the movement.
- The image processing apparatus according to claim 1, wherein the input unit receives the movement operation by repeatedly receiving designation of a first position on the captured image, and wherein, when a certain first position is designated, the presentation unit generates, based on the camera parameters, the predetermined shape and predetermined size of the target object in real space, and a second position in real space corresponding to the first position, a first image representing the target object on the captured image as seen by the camera determined by the camera parameters when the target object is placed at the second position, and presents the first image at the first position on the captured image.
- The image processing apparatus according to claim 2, wherein the presentation unit acquires height information of the second position, and calculates the second position based on the camera parameters, the first position, and the height information of the second position.
- The image processing apparatus according to claim 3, wherein the presentation unit acquires, as the height information of the second position, information indicating the height of the ground surface in real space.
- The image processing apparatus according to claim 3, wherein the presentation unit acquires, as the height information of the second position, different height information for each of a plurality of regions on the captured image.
- The image processing apparatus according to any one of claims 1 to 5, wherein the target object has a planar shape.
- The image processing apparatus according to any one of claims 1 to 6, further comprising: a second input unit that receives an input of a point or a line on the captured image; and a second display unit that displays a second image representing the point or line as mapped onto a plane parallel to the ground surface, based on the camera parameters, the position of the point or line on the captured image, and height information of the point or line in real space.
- An image processing apparatus comprising: an input unit that receives designation of a first position on a captured image; and a presentation unit that presents, at the first position on the captured image, a first image representing the target object on the captured image as seen by the camera determined by predetermined camera parameters representing the position and orientation of the camera when the target object is placed at a second position in real space corresponding to the first position, based on the camera parameters, a predetermined shape and a predetermined size of the target object in real space, and the second position.
- The image processing apparatus according to claim 8, wherein the input unit receives designation of a plurality of first positions, and the presentation unit presents first images representing a plurality of target objects corresponding to the plurality of first positions, each at the corresponding first position on the captured image.
- The image processing apparatus according to claim 8 or 9, wherein the input unit repeatedly receives designation of the first position, and wherein, when a certain first position is designated, the presentation unit generates a first image representing the target object placed at the second position in real space corresponding to that first position, and presents it at that first position on the captured image.
- An image processing apparatus comprising: an input unit that receives an input of a point or a line on a captured image captured by a camera; and a display unit that displays a first image representing the point or line as mapped onto a plane parallel to the ground surface, based on predetermined camera parameters representing the position and orientation of the camera, the position of the point or line on the captured image, and height information of the point or line in real space.
- An image processing method executed by a computer, comprising: an input step of receiving an input of a movement operation, on the captured image, for a first image that represents a target object having a predetermined shape and a predetermined size in real space and that is superimposed, based on predetermined camera parameters representing the position and orientation of a camera, on a captured image captured by the camera; and a presentation step of presenting, based on the camera parameters, a first image representing the target object with an appearance corresponding to the position on the captured image after the movement.
- An image processing method executed by a computer, comprising: an input step of receiving designation of a first position on a captured image; and a presentation step of presenting, at the first position on the captured image, a first image representing the target object on the captured image as seen by the camera determined by predetermined camera parameters representing the position and orientation of the camera when the target object is placed at a second position in real space corresponding to the first position, based on the camera parameters, a predetermined shape and a predetermined size of the target object in real space, and the second position.
- An image processing method executed by a computer, comprising: an input step of receiving an input of a point or a line on a captured image captured by a camera; and a display step of displaying a first image representing the point or line as mapped onto a plane parallel to the ground surface, based on predetermined camera parameters representing the position and orientation of the camera, the position of the point or line on the captured image, and height information of the point or line in real space.
- A program causing a computer to operate as the image processing apparatus according to any one of claims 1 to 11.
Priority Applications (6)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
JP2016548769A JP6747292B2 (ja) | 2014-09-19 | 2015-07-31 | Image processing apparatus, image processing method, and program |
US15/512,340 US10911645B2 (en) | 2014-09-19 | 2015-07-31 | Image processing device, image processing method, and recording medium |
US16/409,320 US20190268509A1 (en) | 2014-09-19 | 2019-05-10 | Image processing device, image processing method, and recording medium |
US17/131,306 US20210112181A1 (en) | 2014-09-19 | 2020-12-22 | Image processing device, image processing method, and recording medium |
US18/241,301 US20230412903A1 (en) | 2014-09-19 | 2023-09-01 | Image processing device, image processing method, and recording medium |
US18/241,299 US20230412902A1 (en) | 2014-09-19 | 2023-09-01 | Image processing device, image processing method, and recording medium |
Applications Claiming Priority (4)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
JP2014-191480 | 2014-09-19 | ||
JP2014191480 | 2014-09-19 | ||
JP2014-257137 | 2014-12-19 | ||
JP2014257137 | 2014-12-19 |
Related Child Applications (3)
Application Number | Title | Priority Date | Filing Date |
---|---|---|---|
US15/512,340 A-371-Of-International US10911645B2 (en) | 2014-09-19 | 2015-07-31 | Image processing device, image processing method, and recording medium |
US16/409,320 Continuation US20190268509A1 (en) | 2014-09-19 | 2019-05-10 | Image processing device, image processing method, and recording medium |
US17/131,306 Continuation US20210112181A1 (en) | 2014-09-19 | 2020-12-22 | Image processing device, image processing method, and recording medium |
Publications (1)
Publication Number | Publication Date |
---|---|
WO2016042926A1 true WO2016042926A1 (ja) | 2016-03-24 |
Family
ID=55532970
Family Applications (1)
Application Number | Title | Priority Date | Filing Date |
---|---|---|---|
PCT/JP2015/071750 WO2016042926A1 (ja) | 2014-09-19 | 2015-07-31 | Image processing apparatus, image processing method, and program |
Country Status (3)
Country | Link |
---|---|
US (5) | US10911645B2 (ja) |
JP (4) | JP6747292B2 (ja) |
WO (1) | WO2016042926A1 (ja) |
Cited By (10)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
JP2021152919A (ja) * | 2016-09-23 | 2021-09-30 | Apple Inc. | Avatar creation and editing |
JP2022504444A (ja) * | 2018-10-29 | 2022-01-13 | NEC Corporation | Camera calibration method, camera, and program |
US11380077B2 (en) | 2018-05-07 | 2022-07-05 | Apple Inc. | Avatar creation user interface |
US11442414B2 (en) | 2020-05-11 | 2022-09-13 | Apple Inc. | User interfaces related to time |
US11481988B2 (en) | 2010-04-07 | 2022-10-25 | Apple Inc. | Avatar editing environment |
US11681408B2 (en) | 2016-06-12 | 2023-06-20 | Apple Inc. | User interfaces for retrieving contextually relevant media content |
US11714536B2 (en) | 2021-05-21 | 2023-08-01 | Apple Inc. | Avatar sticker editor user interfaces |
US11722764B2 (en) | 2018-05-07 | 2023-08-08 | Apple Inc. | Creative camera |
US11776190B2 (en) | 2021-06-04 | 2023-10-03 | Apple Inc. | Techniques for managing an avatar on a lock screen |
US11921998B2 (en) | 2020-05-11 | 2024-03-05 | Apple Inc. | Editing features of an avatar |
Families Citing this family (3)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
JP2016213674A (ja) * | 2015-05-08 | 2016-12-15 | Canon Inc. | Display control system, display control apparatus, display control method, and program |
JP7277187B2 (ja) | 2019-03-13 | 2023-05-18 | Canon Inc. | Image processing apparatus, imaging apparatus, image processing method, and program |
JP7310252B2 (ja) * | 2019-04-19 | 2023-07-19 | Ricoh Co., Ltd. | Moving image generation apparatus, moving image generation method, program, and storage medium |
Citations (3)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
JP2001209827A (ja) * | 1999-11-19 | 2001-08-03 | Matsushita Electric Ind Co Ltd | Image processing apparatus, image processing service providing method, and order processing method |
JP2005142938A (ja) * | 2003-11-07 | 2005-06-02 | Casio Comput Co Ltd | Electronic camera and control program |
JP2013021733A (ja) * | 2012-10-29 | 2013-01-31 | Fujitsu Mobile Communications Ltd | Portable information device |
Family Cites Families (22)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
JP2622620B2 (ja) | 1989-11-07 | 1997-06-18 | Proxima Corporation | Computer input system for changing a computer-generated display visible image |
US6463121B1 (en) * | 1999-10-13 | 2002-10-08 | General Electric Company | Interactive x-ray position and exposure control using image data as reference information |
EP1102211A3 (en) * | 1999-11-19 | 2006-09-13 | Matsushita Electric Industrial Co., Ltd. | Image processor, method of providing image processing services and order processing method |
WO2002073955A1 (en) * | 2001-03-13 | 2002-09-19 | Canon Kabushiki Kaisha | Image processing apparatus, image processing method, studio apparatus, storage medium, and program |
US7307654B2 (en) * | 2002-10-31 | 2007-12-11 | Hewlett-Packard Development Company, L.P. | Image capture and viewing system and method for generating a synthesized image |
JP4217100B2 (ja) | 2003-04-17 | 2009-01-28 | Honda Motor Co., Ltd. | Image compositing method, apparatus, and program, and method, apparatus, and program for rendering a three-dimensional model |
JP2005301492A (ja) * | 2004-04-08 | 2005-10-27 | Olympus Corp | Image history processing program, image history processing method, image history processing apparatus, and recording medium |
JP4244040B2 (ja) * | 2005-03-10 | 2009-03-25 | Nintendo Co., Ltd. | Input processing program and input processing apparatus |
US7801330B2 (en) | 2005-06-24 | 2010-09-21 | Objectvideo, Inc. | Target detection and tracking from video streams |
JP4730141B2 (ja) * | 2006-03-06 | 2011-07-20 | Sony Corporation | Image processing apparatus and method, recording medium, and program |
WO2007105792A1 (ja) * | 2006-03-15 | 2007-09-20 | Omron Corporation | Monitoring apparatus and monitoring method, control apparatus and control method, and program |
EP1862969A1 (en) * | 2006-06-02 | 2007-12-05 | Eidgenössische Technische Hochschule Zürich | Method and system for generating a representation of a dynamically changing 3D scene |
JP5223318B2 (ja) * | 2007-12-07 | 2013-06-26 | Sony Corporation | Image processing apparatus, image processing method, and program |
JP5040734B2 (ja) * | 2008-03-05 | 2012-10-03 | Sony Corporation | Image processing apparatus, image recording method, and program |
US20110187703A1 (en) | 2010-01-29 | 2011-08-04 | Kedar Anil Patwardhan | Method and system for object tracking using appearance model |
JP5656567B2 (ja) | 2010-11-05 | 2015-01-21 | Canon Inc. | Video processing apparatus and method |
JP6121647B2 (ja) * | 2011-11-11 | 2017-04-26 | Sony Corporation | Information processing apparatus, information processing method, and program |
JP2013110551A (ja) * | 2011-11-21 | 2013-06-06 | Sony Corp | Information processing apparatus, imaging apparatus, information processing method, and program |
JP2013165366A (ja) * | 2012-02-10 | 2013-08-22 | Sony Corp | Image processing apparatus, image processing method, and program |
US8836768B1 (en) | 2012-09-04 | 2014-09-16 | Aquifi, Inc. | Method and system enabling natural user interface gestures with user wearable glasses |
DE102013211492B4 (de) | 2013-06-19 | 2020-10-15 | Trimble Jena Gmbh | Bestimmung eines Messfehlers |
JP2017093803A (ja) * | 2015-11-24 | 2017-06-01 | Fujitsu Ltd. | Evaluation program, evaluation method, and evaluation apparatus |
-
2015
- 2015-07-31 US US15/512,340 patent/US10911645B2/en active Active
- 2015-07-31 JP JP2016548769A patent/JP6747292B2/ja active Active
- 2015-07-31 WO PCT/JP2015/071750 patent/WO2016042926A1/ja active Application Filing
-
2019
- 2019-05-10 US US16/409,320 patent/US20190268509A1/en not_active Abandoned
-
2020
- 2020-07-31 JP JP2020129925A patent/JP6996594B2/ja active Active
- 2020-12-22 US US17/131,306 patent/US20210112181A1/en not_active Abandoned
-
2021
- 2021-12-13 JP JP2021201328A patent/JP7294396B2/ja active Active
-
2023
- 2023-06-07 JP JP2023093623A patent/JP2023111962A/ja active Pending
- 2023-09-01 US US18/241,299 patent/US20230412902A1/en active Pending
- 2023-09-01 US US18/241,301 patent/US20230412903A1/en active Pending
Patent Citations (3)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
JP2001209827A (ja) * | 1999-11-19 | 2001-08-03 | Matsushita Electric Ind Co Ltd | Image processing apparatus, image processing service providing method, and order processing method |
JP2005142938A (ja) * | 2003-11-07 | 2005-06-02 | Casio Comput Co Ltd | Electronic camera and control program |
JP2013021733A (ja) * | 2012-10-29 | 2013-01-31 | Fujitsu Mobile Communications Ltd | Portable information device |
Cited By (16)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
US11869165B2 (en) | 2010-04-07 | 2024-01-09 | Apple Inc. | Avatar editing environment |
US11481988B2 (en) | 2010-04-07 | 2022-10-25 | Apple Inc. | Avatar editing environment |
US11681408B2 (en) | 2016-06-12 | 2023-06-20 | Apple Inc. | User interfaces for retrieving contextually relevant media content |
US11941223B2 (en) | 2016-06-12 | 2024-03-26 | Apple Inc. | User interfaces for retrieving contextually relevant media content |
JP2021152919A (ja) | 2016-09-23 | 2021-09-30 | Apple Inc. | Avatar creation and editing |
JP7166391B2 (ja) | 2016-09-23 | 2022-11-07 | Apple Inc. | Avatar creation and editing |
US11380077B2 (en) | 2018-05-07 | 2022-07-05 | Apple Inc. | Avatar creation user interface |
US11682182B2 (en) | 2018-05-07 | 2023-06-20 | Apple Inc. | Avatar creation user interface |
US11722764B2 (en) | 2018-05-07 | 2023-08-08 | Apple Inc. | Creative camera |
JP2022504444A (ja) | 2018-10-29 | 2022-01-13 | NEC Corporation | Camera calibration method, camera, and program |
JP7136344B2 (ja) | 2018-10-29 | 2022-09-13 | NEC Corporation | Camera calibration method, camera, and program |
US11442414B2 (en) | 2020-05-11 | 2022-09-13 | Apple Inc. | User interfaces related to time |
US11822778B2 (en) | 2020-05-11 | 2023-11-21 | Apple Inc. | User interfaces related to time |
US11921998B2 (en) | 2020-05-11 | 2024-03-05 | Apple Inc. | Editing features of an avatar |
US11714536B2 (en) | 2021-05-21 | 2023-08-01 | Apple Inc. | Avatar sticker editor user interfaces |
US11776190B2 (en) | 2021-06-04 | 2023-10-03 | Apple Inc. | Techniques for managing an avatar on a lock screen |
Also Published As
Publication number | Publication date |
---|---|
JP7294396B2 (ja) | 2023-06-20 |
JP2020182251A (ja) | 2020-11-05 |
JP6996594B2 (ja) | 2022-01-17 |
JP6747292B2 (ja) | 2020-08-26 |
JP2023111962A (ja) | 2023-08-10 |
US20190268509A1 (en) | 2019-08-29 |
US20230412903A1 (en) | 2023-12-21 |
US20230412902A1 (en) | 2023-12-21 |
JPWO2016042926A1 (ja) | 2017-07-20 |
US20170289411A1 (en) | 2017-10-05 |
US10911645B2 (en) | 2021-02-02 |
JP2022022434A (ja) | 2022-02-03 |
US20210112181A1 (en) | 2021-04-15 |
Similar Documents
Publication | Publication Date | Title |
---|---|---|
JP6996594B2 (ja) | Image processing apparatus, image processing method, and program | |
US11232593B2 (en) | Calibration apparatus, calibration system, and calibration method | |
US11039121B2 (en) | Calibration apparatus, chart for calibration, chart pattern generation apparatus, and calibration method | |
US10469829B2 (en) | Information processor and information processing method | |
US8265374B2 (en) | Image processing apparatus, image processing method, and program and recording medium used therewith | |
CN104574350B (zh) | Three-dimensional data acquisition method and system therefor | |
US9161027B2 (en) | Method and apparatus for providing camera calibration | |
JP6344050B2 (ja) | Image processing system, image processing apparatus, and program | |
CN110312111B (zh) | Apparatus, system, and method for automatic calibration of an image device | |
JP6174968B2 (ja) | Imaging simulation apparatus | |
JP2004062758A (ja) | Information processing apparatus and method | |
JP2007036482A (ja) | Information projection display apparatus and program | |
US11490062B2 (en) | Information processing apparatus, information processing method, and storage medium | |
CN107950019A (zh) | Information processing apparatus, information processing method, and program | |
US20240071016A1 (en) | Mixed reality system, program, mobile terminal device, and method | |
JPWO2018167918A1 (ja) | Projector, mapping data creation method, program, and projection mapping system | |
CN112669392B (zh) | Map positioning method and system applied to an indoor video surveillance system | |
JP2017116280A (ja) | Camera calibration system, camera calibration program, and camera calibration method | |
US11924561B2 (en) | Determining a camera control point for virtual production | |
JP6984583B2 (ja) | Information processing apparatus and information processing method | |
JP6124863B2 (ja) | Method, computer, and computer program for recognizing a pointing gesture position | |
JP2020057430A (ja) | Mixed reality system, program, mobile terminal device, and method |
Legal Events
Date | Code | Title | Description |
---|---|---|---|
121 | Ep: the epo has been informed by wipo that ep was designated in this application |
Ref document number: 15842350 Country of ref document: EP Kind code of ref document: A1 |
|
ENP | Entry into the national phase |
Ref document number: 2016548769 Country of ref document: JP Kind code of ref document: A |
|
WWE | Wipo information: entry into national phase |
Ref document number: 15512340 Country of ref document: US |
|
NENP | Non-entry into the national phase |
Ref country code: DE |
|
122 | Ep: pct application non-entry in european phase |
Ref document number: 15842350 Country of ref document: EP Kind code of ref document: A1 |