US11483528B2 - Information processing apparatus and information processing method - Google Patents
- Publication number
- US11483528B2 (application US16/964,818)
- Authority
- US
- United States
- Prior art keywords
- projection
- section
- posture
- imaging
- image
- Prior art date
- Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
- Active, expires
Classifications
- H04N9/3185—Geometric adjustment, e.g. keystone or convergence (H04N9/31, projection devices for colour picture display)
- G03B21/10—Projectors with built-in or built-on screen
- G03B21/145—Housing details, e.g. position adjustments thereof
- G03B21/147—Optical correction of image distortions, e.g. keystone
- G03B37/04—Panoramic or wide-screen photography with cameras or projectors providing touching or overlapping fields of view
- H04N9/3147—Multi-projection systems (H04N9/3141, constructional details thereof)
- H04N9/3191—Testing thereof (H04N9/31, projection devices for colour picture display)
- H04N9/3194—Testing thereof including sensor feedback
Definitions
- the present disclosure relates to an information processing apparatus and an information processing method. More particularly, the disclosure relates to an information processing apparatus and an information processing method for suppressing a decrease in the accuracy of image projection correction.
- Some methods representing the techniques involve three-dimensionally measuring the shape of a projection plane (screen) and geometrically correcting a projected image thereon, based on information regarding the measurements.
- calibration involves estimating the internal parameters (e.g., focal point distance, principal point, and lens distortion factor) of the projectors and cameras as well as the external parameters indicative of their positions and postures relative to each other.
- the fθ lens involves an effect of lens distortion far larger than that of the f tan θ lens.
- performing calibration on an fθ lens system while disregarding lens distortion, as is acceptable for an f tan θ lens system, can make the accuracy of projection correction lower than in the f tan θ case.
- the present disclosure has been made in view of the above circumstances and aims at suppressing a decrease in the accuracy of image projection correction.
- an information processing apparatus including a posture estimation section configured such that, by use of an image projection model using a distortion factor of an fθ lens with an image height of incident light expressed by a product of a focal point distance f and an incident angle θ of the incident light, the posture estimation section estimates a posture of a projection section for projecting an image and a posture of an imaging section for capturing a projection plane to which the image is projected.
- an information processing method including, by use of an image projection model using a distortion factor of an fθ lens with an image height of incident light expressed by a product of a focal point distance f and an incident angle θ of the incident light, estimating a posture of a projection section for projecting an image and a posture of an imaging section for capturing a projection plane to which the image is projected.
- by use of an image projection model using a distortion factor of an fθ lens with an image height of incident light expressed by a product of a focal point distance f and an incident angle θ of the incident light, a posture of a projection section for projecting an image and a posture of an imaging section for capturing a projection plane to which the image is projected are estimated.
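The difference between the two projection models can be sketched numerically. The snippet below is an illustrative aid, not part of the disclosure; the focal point distance is an arbitrary example value.

```python
import math

def image_height_f_theta(f, theta):
    """fθ model: image height of incident light is f * θ."""
    return f * theta

def image_height_f_tan_theta(f, theta):
    """f tan θ (ordinary) model: image height is f * tan θ."""
    return f * math.tan(theta)

# The two models diverge as the incident angle grows, which is why a
# calibration that ignores fθ lens distortion loses accuracy.
f = 8.0  # example focal point distance (arbitrary units)
for deg in (10, 45, 80):
    theta = math.radians(deg)
    print(deg,
          round(image_height_f_theta(f, theta), 3),
          round(image_height_f_tan_theta(f, theta), 3))
```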
- the disclosure thus makes it possible to suppress a decrease in the accuracy of image projection correction.
- FIG. 1 is a block diagram depicting a principal configuration example of a projection imaging system.
- FIG. 2 is a block diagram depicting a principal configuration example of a control apparatus.
- FIG. 3 is a functional block diagram depicting examples of major functional blocks implemented by a control section.
- FIG. 4 is a block diagram depicting a principal configuration example of a projection apparatus.
- FIG. 5 is a block diagram depicting a principal configuration example of an imaging apparatus.
- FIG. 6 is a flowchart explaining a typical flow of a calibration process.
- FIG. 7 is a view depicting how pixel-to-pixel correspondence is obtained using structured light.
- FIG. 8 is a flowchart explaining a typical flow of a posture estimation process.
- FIG. 9 is a flowchart explaining a typical flow of a parameter estimation process.
- FIG. 10 is a view depicting how distortion is typically corrected.
- FIG. 11 is a view depicting how a ray trace is typically performed with distortion taken into consideration.
- FIG. 12 is a view depicting how posture estimation is typically performed.
- FIG. 13 is a flowchart explaining a typical flow of a geometric correction process.
- FIG. 14 is a view depicting how geometric projection correction is typically performed with respect to a virtual viewpoint.
- FIGS. 15A, 15B, and 15C are views depicting how a virtual viewpoint is typically set.
- FIGS. 16A and 16B are views depicting how two-dimensional curved surface fitting is typically performed.
- FIG. 17 is a view depicting how model misalignment typically takes place.
- FIG. 18 is a view depicting how a model misalignment corresponding process is typically performed.
- FIGS. 19A, 19B, and 19C are block diagrams depicting another configuration example of the projection imaging system.
- Some methods representing such techniques involve getting the configured projectors to project patterns or markers to the screen and causing the cameras or sensors, which are also configured, to obtain information for correction purposes.
- the method using two-dimensional information involves a simplified apparatus configuration without the need for calibrating the projectors or cameras.
- this method does not guarantee that the corrected image is geometrically accurate (e.g., a straight line when corrected ought to be seen as a straight line from a camera point of view).
- the method using three-dimensional information, with its correction aligned with the screen shape, is more likely to guarantee the geometric accuracy of the resulting image but requires the following procedures for calibration.
- the projector projects patterns or markers to a target; the camera captures the projected patterns or markers; and the control apparatus obtains pixel-to-pixel correspondence between the projector and the camera by using the captured image and measures depth information (depth) by using the principle of triangulation.
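The triangulation step can be sketched as finding the midpoint of the shortest segment between a projector ray and a camera ray through a corresponding pixel pair. This midpoint formulation is one common choice, an assumption here rather than the exact procedure of the disclosure:

```python
import numpy as np

def triangulate_midpoint(o1, d1, o2, d2):
    """Midpoint of the shortest segment between two rays (origin o,
    direction d): a simple stand-in for recovering the depth of a
    projection-plane point from a projector ray and a camera ray."""
    d1 = np.asarray(d1, float) / np.linalg.norm(d1)
    d2 = np.asarray(d2, float) / np.linalg.norm(d2)
    o1, o2 = np.asarray(o1, float), np.asarray(o2, float)
    # Normal equations for the ray parameters t1, t2 minimizing
    # |(o1 + t1*d1) - (o2 + t2*d2)|
    a = np.array([[d1 @ d1, -(d1 @ d2)],
                  [d1 @ d2, -(d2 @ d2)]])
    b = np.array([(o2 - o1) @ d1, (o2 - o1) @ d2])
    t1, t2 = np.linalg.solve(a, b)
    return ((o1 + t1 * d1) + (o2 + t2 * d2)) / 2.0
```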
- the control apparatus is required to follow procedures for estimating the internal variables of the projector and camera and their relative positions and postures, i.e., the control apparatus requires calibration in the case of the method using three-dimensional information.
- the existing calibration methods are based on the assumption that they are applied to a system using a projector equipped with what is generally called the f tan θ lens (ordinary lens), of which the image height of light at an incident angle θ is represented by the product of a focal point distance f and tan θ (f·tan θ).
- the effect of lens distortion in the projector is very small compared with the effect from the internal and external parameters.
- the projector using the fθ lens is more suitable for image projection onto a curved surface such as the dome-type projection plane than the projector employing the f tan θ lens.
- the fθ lens is subject to a far higher effect of lens distortion than the f tan θ lens.
- performing calibration on an fθ lens system while disregarding lens distortion, as is acceptable for an f tan θ lens system, can make the accuracy of projection correction lower than in the f tan θ case.
- the method described therein involves projecting images to two projection planes, i.e., a plane in the optical axis direction of the projector and a plane perpendicular to that plane. For this reason, it is difficult to apply this method to image projection onto the dome-type spherical surface screen.
- an image projection model that uses the distortion factor of the fθ lens, with the image height of incident light represented by the product of the focal point distance f and the incident angle θ of the incident light, is used to estimate the posture of a projection section for projecting an image and the posture of an imaging section for capturing the projection plane to which the image is projected.
- an information processing apparatus includes a posture estimation section for estimating the posture of a projection section that projects an image and the posture of an imaging section that captures the projection plane to which the image is projected, through the use of an image projection model that uses the distortion factor of the fθ lens with the image height of incident light represented by the product of the focal point distance f and the incident angle θ of the incident light.
- the above configuration permits posture estimation while correcting the lens distortion of the fθ lens.
- the above configuration enables calibration of the internal and external parameters of the projection section and imaging section with sufficiently high accuracy. That is, image projection correction is easier to perform, and the accuracy of the parameters is more robust against environmental changes. This enables practical operation of an image projection system that uses the fθ lens.
- FIG. 1 is a block diagram depicting a principal configuration example of a projection imaging system to which the present technology is applied.
- a projection imaging system 100 is a system that projects images to a projection plane and calibrates parameters by using images captured of the images projected onto the projection plane.
- the projection imaging system 100 includes a control apparatus 111 , a projection apparatus 112 - 1 , an imaging apparatus 113 - 1 , a projection apparatus 112 - 2 , and an imaging apparatus 113 - 2 .
- the projection apparatus 112 - 1 , the imaging apparatus 113 - 1 , the projection apparatus 112 - 2 , and the imaging apparatus 113 - 2 are communicably connected with the control apparatus 111 via cables 115 - 1 to 115 - 4 , respectively.
- the projection apparatuses 112 - 1 and 112 - 2 will be referred to as the projection apparatus or apparatuses 112 in the case where there is no need for their individual explanation.
- the imaging apparatuses 113 - 1 and 113 - 2 will be referred to as the imaging apparatus or apparatuses 113 where there is no need for their individual explanation.
- the cables 115 - 1 to 115 - 4 will be referred to as the cable or cables 115 where there is no need for their individual explanation.
- the control apparatus 111 controls each projection apparatus 112 and each imaging apparatus 113 via the cables 115 .
- the control apparatus 111 is supplied with an image via a cable 114 .
- the control apparatus 111 feeds the image to each projection apparatus 112 that in turn projects the image to a dome-type (partially spherical surface-shaped) screen 121 .
- the control apparatus 111 causes each imaging apparatus 113 to capture the screen 121 (e.g., image projected onto the screen 121 ) and acquires the captured image.
- the control apparatus 111 calibrates the parameters of the projection apparatuses 112 and imaging apparatuses 113 by using the captured image, thereby calculating the parameters for geometrically correcting the images to be projected by the projection apparatuses 112 . Using the calculated parameters, the control apparatus 111 geometrically corrects images supplied from the outside and feeds the geometrically corrected images to the projection apparatuses 112 .
- the projection apparatuses 112 each have the function of what is generally called a projector. For example, under the control of the control apparatus 111 , the projection apparatuses 112 project to the screen 121 images supplied from the control apparatus 111 . The projection apparatuses 112 under the control of the control apparatus 111 operate in cooperation with each other to perform image projection such that a single projected image appears on the screen 121 (i.e., one projected image is displayed on the screen 121 ).
- the multiple projection apparatuses 112 perform image projection in such a manner that the images projected are arranged side by side with no gap therebetween on the screen 121 , thereby obtaining a projected image larger (with high resolution) than the image projected by a single projection apparatus 112 (i.e., such a projected image is displayed on the screen 121 ).
- the multiple projection apparatuses 112 perform image projection in such a manner that the images projected coincide with each other in position on the screen 121 , thereby acquiring an image brighter (of high dynamic range) than the image projected by a single projection apparatus 112 (i.e., such a projected image is displayed on the screen 121 ). That is, the projection imaging system 100 in such a case is what is generally called a multi-projection system that implements what is known as projection mapping.
- the imaging apparatuses 113 each have the function of what is generally called a camera. For example, under the control of the control apparatus 111 , the imaging apparatuses 113 capture the screen 121 (i.e., the screen 121 to which images are projected by the projection apparatuses 112 ) and feed data of the captured images (also called captured image data) to the control apparatus 111 . The captured images are used by the control apparatus 111 in calculating the parameters for geometrically correcting images (i.e., in calibrating the parameters of the projection apparatuses 112 and imaging apparatuses 113 ). That is, the imaging apparatuses 113 are used for geometrically correcting the images to be projected (i.e., for calculating the parameters for geometric correction).
- the screen 121 is an approximately dome-shaped (partially spherical surface-shaped) projection plane. Configured to be a curved surface, the screen 121 allows images to be projected (displayed) thereon with a wider viewing angle than when the images are projected onto a flat screen. This enables the user to experience more realistic sensations and a deeper sense of immersion.
- the projection apparatuses 112 and the imaging apparatuses 113 each include what is generally called the fθ lens (also known as the fish-eye lens) instead of what is generally called the f tan θ lens (ordinary lens). It follows that the images projected by the projection apparatuses 112 or captured by the imaging apparatuses 113 each have larger distortion, particularly in a peripheral region, than in the case of the f tan θ lens.
- FIG. 2 is a block diagram depicting a principal configuration example of the control apparatus 111 as an embodiment of the information processing apparatus to which the present technology is applied. It is to be noted that FIG. 2 depicts major processing blocks and principal data flows therebetween and does not cover the entire configuration of the control apparatus 111 . That is, the control apparatus 111 may include processing blocks that are not illustrated in FIG. 2 as well as data flows and processes other than those indicated by arrows in FIG. 2 .
- the control apparatus 111 includes a control section 201 , an input section 211 , an output section 212 , a storage section 213 , a communication section 214 , and a drive 215 .
- the control section 201 performs processes related to controls.
- the control section 201 controls any configured elements in the control apparatus 111 .
- the control section 201 also performs processes related to controls over other apparatuses such as the projection apparatuses 112 and imaging apparatuses 113 .
- the control section 201 may be configured in any manner desired.
- the control section 201 may include a CPU (Central Processing Unit), a ROM (Read Only Memory), and RAM (Random Access Memory), the CPU loading programs and data from the ROM into the RAM and executing and operating on the loaded programs and data to carry out relevant processes.
- the input section 211 includes input devices for accepting information from the outside such as the input from the user.
- the input section 211 may include a keyboard, a mouse, operation buttons, a touch panel, a camera, a microphone, and input terminals.
- the input section 211 may further include various sensors such as an acceleration sensor, an optical sensor, and a temperature sensor, as well as input equipment such as a barcode reader.
- the output section 212 includes output devices for outputting information such as images and sounds.
- the output section 212 may include a display unit, speakers, and output terminals.
- the storage section 213 includes storage media for storing information such as programs and data.
- the storage section 213 may include a hard disk, a RAM disk, and a nonvolatile memory.
- the communication section 214 includes a communication device for communicating with external apparatuses by sending and receiving information such as programs and data thereto and therefrom via predetermined communication media (e.g., suitable networks such as the Internet).
- the communication section 214 may include a network interface.
- the communication section 214 performs communication (i.e., exchanges programs and data), for example, with apparatuses external to the control apparatus 111 (e.g., projection apparatuses 112 and imaging apparatuses 113 ).
- the communication section 214 may have a wired communication function or a wireless communication function, or both.
- the drive 215 retrieves information (e.g., programs and data) from removable media 221 such as a magnetic disk, an optical disk, a magneto-optical disk, or a semiconductor memory attached to the drive 215 .
- the drive 215 supplies the information retrieved from the removable media 221 to the control section 201 , among others.
- the drive 215 can have the information (e.g., programs and data) supplied from the control section 201 stored in the attached piece of removable media 221 .
- FIG. 3 is a functional block diagram depicting examples of major functional blocks implemented by the control apparatus 111 performing programs, for example. As depicted in FIG. 3 , the control apparatus 111 executes programs to implement the functions of a sensing processing section 251 , a posture estimation section 252 , and a geometric correction section 253 , for example.
- the sensing processing section 251 performs processes related to sensing. For example, the sensing processing section 251 performs the process of detecting corresponding points between the pixels of the projection apparatus 112 and those of the imaging apparatus 113 by using captured images from the imaging apparatuses 113 . The sensing processing section 251 supplies the posture estimation section 252 with the result of the process (i.e., information regarding the corresponding points between the projection apparatus 112 and the imaging apparatus 113 ).
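The corresponding-point detection uses the structured light depicted in FIG. 7. Gray-code bit planes are one common form of such patterns; this choice is an assumption for illustration, since the embodiment only requires patterns that establish pixel-to-pixel correspondence:

```python
def to_gray(n):
    """Binary-reflected Gray code of integer n."""
    return n ^ (n >> 1)

def from_gray(g):
    """Inverse of to_gray."""
    n = 0
    while g:
        n ^= g
        g >>= 1
    return n

def gray_patterns(width, bits):
    """One bit-plane pattern per Gray-code bit: pattern[b][x] is bit b
    of the Gray code of projector column x.  Projecting and capturing
    these planes lets each camera pixel recover which projector column
    it observes."""
    return [[(to_gray(x) >> b) & 1 for x in range(width)]
            for b in range(bits)]

def decode_column(observed_bits):
    """Rebuild the projector column index from the per-plane bits seen
    at one camera pixel (least-significant plane first)."""
    g = sum(bit << b for b, bit in enumerate(observed_bits))
    return from_gray(g)
```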
- the posture estimation section 252 performs processes related to estimation of the postures of the projection apparatuses 112 and imaging apparatuses 113 . For example, using an image projection model that uses the distortion factor of the fθ lens, the posture estimation section 252 estimates the parameters (variables) related to the postures of at least either the projection apparatuses 112 or the imaging apparatuses 113 (i.e., calculates the estimates of the variables related to postures).
- the posture-related parameters may be of any suitable type.
- the parameters may include the internal parameters (also called internal variables) of at least either the projection apparatuses 112 or the imaging apparatuses 113 .
- the internal parameters may be of any suitable type.
- the internal parameters may include at least one of the focal point distance, principal point, or the parameter (k_inv) corresponding to inverse transformation of the lens distortion factor of the projection apparatus 112 or the imaging apparatus 113 .
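How these internal parameters enter the fθ model can be sketched as follows. The polynomial distortion form is an assumption modeled on common fish-eye parameterizations, not taken from the disclosure:

```python
import numpy as np

def project_f_theta(point_cam, f, cx, cy, k=()):
    """Pixel position of a 3D point (camera coordinates) under the fθ
    model.  f is the focal point distance, (cx, cy) the principal
    point, and k an optional polynomial distortion of the incident
    angle."""
    x, y, z = point_cam
    theta = np.arctan2(np.hypot(x, y), z)      # incident angle θ
    theta_d = theta * (1.0 + sum(c * theta ** (2 * (i + 1))
                                 for i, c in enumerate(k)))
    r = f * theta_d                            # image height = f * θ
    phi = np.arctan2(y, x)
    return cx + r * np.cos(phi), cy + r * np.sin(phi)

def undistort_theta(theta_d, k_inv=()):
    """Recover the ideal angle from the distorted one by applying the
    inverse-transformation polynomial (the parameter written k_inv
    above)."""
    return theta_d * (1.0 + sum(c * theta_d ** (2 * (i + 1))
                                for i, c in enumerate(k_inv)))
```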
- the posture-related parameters may include the external parameters (also called external variables) of at least either the projection apparatuses 112 or the imaging apparatuses 113 .
- the external parameters may be of any suitable type.
- the external parameters may include at least either a rotation matrix or a translation vector with respect to the origin of a world coordinate system of the projection apparatus 112 or of the imaging apparatus 113 .
- the posture estimation section 252 estimates the posture-related parameters, based on the information regarding the corresponding points between the projection apparatus 112 and the imaging apparatus 113 , the information being supplied from the sensing processing section 251 .
- the posture estimation section 252 also estimates the posture-related parameters, based on representative values of the internal parameters (also called internal variable representative values) of at least either the projection apparatuses 112 or the imaging apparatuses 113 having been determined beforehand.
- the posture estimation section 252 supplies the geometric correction section 253 with the obtained parameter estimates (at least either the internal parameter estimates (also called internal variable estimates) or the external parameter estimates (also called external variable estimates) of at least either the projection apparatuses 112 or the imaging apparatuses 113 ).
- the geometric correction section 253 performs processes related to geometric correction of images. For example, on the basis of the parameter estimates supplied from the posture estimation section 252 , the geometric correction section 253 calculates the parameters (e.g., vector data for geometric correction) for use in geometric correction of the images input from the outside via the cable 114 .
- the posture estimation section 252 includes an imaging variable estimation section 261 , a projection variable estimation section 262 , and a total optimization section 263 .
- the imaging variable estimation section 261 performs processes related to estimating at least either the internal parameters or the external parameters of the imaging apparatuses 113 (the parameters are also called imaging variables).
- the projection variable estimation section 262 performs processes related to estimating at least either the internal parameters or the external parameters of the projection apparatuses 112 (the parameters are also called projection variables).
- the total optimization section 263 performs processes related to optimizing the estimates of the imaging variables (also called imaging variable estimates) obtained by the imaging variable estimation section 261 and the estimates of the projection variables (also called the projection variable estimates) acquired by the projection variable estimation section 262 .
- the posture estimation section 252 estimates the imaging variables and projection variables and, through total optimization, obtains at least either the internal variable estimates or the external variable estimates of at least either the projection apparatuses 112 or the imaging apparatuses 113 .
- the posture estimation section 252 performs the above-described posture estimation by using the image projection model that uses the distortion factor of the fθ lens. That is, the imaging variable estimation section 261 obtains the imaging variable estimates by using the image projection model that uses the distortion factor of the fθ lens. Likewise, the projection variable estimation section 262 acquires the projection variable estimates by use of the image projection model that uses the distortion factor of the fθ lens. Similarly, the total optimization section 263 optimizes all of these parameters through the use of the image projection model that uses the distortion factor of the fθ lens.
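Such an optimization is typically driven by reprojection residuals. The reduced example below covers only one device's focal point distance and principal point under an undistorted fθ model; the variable choice is illustrative, and a total optimization would stack residuals like these over every projector and camera:

```python
import numpy as np

def project(p, f, cx, cy):
    """Minimal fθ projection (distortion omitted for brevity)."""
    x, y, z = p
    theta = np.arctan2(np.hypot(x, y), z)
    r = f * theta
    phi = np.arctan2(y, x)
    return cx + r * np.cos(phi), cy + r * np.sin(phi)

def residuals(params, points_3d, observed_px):
    """Reprojection residuals for one device's internal variables
    (f, cx, cy).  A nonlinear least-squares solver would minimize the
    stacked residuals of all devices jointly."""
    f, cx, cy = params
    out = []
    for p, (u, v) in zip(points_3d, observed_px):
        pu, pv = project(p, f, cx, cy)
        out.extend([pu - u, pv - v])
    return np.array(out)
```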
- the control apparatus 111 can suppress a decrease in the accuracy of image projection correction. This permits practical operation of the projection imaging system 100 using the fθ lens.
- the geometric correction section 253 includes a projection plane modeling section 271 , a virtual viewpoint position/projection direction estimation section 272 , a model misalignment corresponding processing section 273 , and a projection mask generation section 274 .
- the projection plane modeling section 271 performs processes related to projection plane modeling (functionalization of curved surface).
- the virtual viewpoint position/projection direction estimation section 272 performs processes related to estimating a virtual viewpoint position serving as a reference point for distortion correction and an image projection direction relative to that virtual viewpoint position.
- the model misalignment corresponding processing section 273 performs a corresponding process for suppressing misalignment between the actual projection plane and the model thereof (also called model misalignment).
- the projection mask generation section 274 performs processes related to generating projection masks for limiting the range in which the projection apparatuses 112 project images.
- FIG. 4 is a block diagram depicting a principal configuration example of the projection apparatus 112 as one embodiment of the information processing apparatus to which the present technology is applied. It is to be noted that FIG. 4 depicts major processing blocks and principal data flows therebetween and does not cover the entire configuration of the projection apparatus 112 . That is, the projection apparatus 112 may include processing blocks that are not illustrated in FIG. 4 as well as data flows and processes other than those indicated by arrows in FIG. 4 .
- the projection apparatus 112 includes a control section 301 , a projection section 302 , an input section 311 , an output section 312 , a storage section 313 , a communication section 314 , and a drive 315 .
- the control section 301 performs processes related to controls.
- the control section 301 controls any configured elements in the projection apparatus 112 .
- the control section 301 controls drive of the projection section 302 .
- the control section 301 may be configured in any manner desired.
- the control section 301 may include a CPU, a ROM, and RAM, the CPU loading programs and data from the ROM into the RAM and executing and operating on the loaded programs and data to carry out relevant processes.
- the projection section 302 under the control of the control section 301 performs processes related to image projection. For example, the projection section 302 acquires from the control section 301 the image data supplied from the control apparatus 111 and projects the acquired image to the screen 121 .
- the projection section 302 has the fθ lens as mentioned above, so that the image is projected to the screen 121 via the fθ lens.
- the input section 311 includes input devices for accepting information from the outside such as the input from the user.
- the input section 311 may include a keyboard, a mouse, operation buttons, a touch panel, a camera, a microphone, and input terminals.
- the input section 311 may further include various sensors such as an acceleration sensor, an optical sensor, and a temperature sensor, as well as input equipment such as a barcode reader.
- the output section 312 includes output devices for outputting information such as images and sounds.
- the output section 312 may include a display unit, speakers, and output terminals.
- the storage section 313 includes storage media for storing information such as programs and data.
- the storage section 313 may include a hard disk, a RAM disk, and a nonvolatile memory.
- the communication section 314 includes a communication device for communicating with external apparatuses by sending and receiving information such as programs and data thereto and therefrom via predetermined communication media (e.g., suitable networks such as the Internet).
- the communication section 314 may include a network interface.
- the communication section 314 performs communication (i.e., exchanges programs and data), for example, with apparatuses external to the projection apparatus 112 (e.g., control apparatus 111 ).
- the communication section 314 may have a wired communication function or a wireless communication function, or both.
- the drive 315 retrieves information (e.g., programs and data) from removable media 321 such as a magnetic disk, an optical disk, a magneto-optical disk, or a semiconductor memory attached to the drive 315 .
- the drive 315 supplies the information retrieved from the removable media 321 to the control section 301 , among others.
- the drive 315 can also have the information (e.g., programs and data) supplied from the control section 301 stored on the attached removable media 321 .
- FIG. 5 is a block diagram depicting a principal configuration example of the imaging apparatus 113 as one embodiment of the information processing apparatus to which the present technology is applied. It is to be noted that FIG. 5 depicts major processing blocks and principal data flows therebetween and does not cover the entire configuration of the imaging apparatus 113 . That is, the imaging apparatus 113 may include processing blocks that are not illustrated in FIG. 5 as well as data flows and processes other than those indicated by arrows in FIG. 5 .
- the imaging apparatus 113 includes a control section 401 , an imaging section 402 , an input section 411 , an output section 412 , a storage section 413 , a communication section 414 , and a drive 415 .
- the control section 401 performs processes related to controls.
- the control section 401 controls any configured elements in the imaging apparatus 113 .
- the control section 401 controls drive of the imaging section 402 .
- the control section 401 may be configured in any manner desired.
- the control section 401 may include a CPU, a ROM, and a RAM, the CPU loading programs and data from the ROM into the RAM and executing and operating on the loaded programs and data to carry out relevant processes.
- the imaging section 402 under the control of the control section 401 performs processes related to capturing an imaged subject. For example, the imaging section 402 captures the image projected to the screen 121 by the projection apparatus 112 so as to obtain captured image data. The imaging section 402 supplies the captured image data to the control section 401 . In turn, the control section 401 supplies the captured image data to the control apparatus 111 via the communication section 414 . Note that the imaging section 402 has the fθ lens as mentioned above, so that the imaging section 402 captures the screen 121 (i.e., projected image) via the fθ lens.
- the input section 411 includes input devices for accepting information from the outside such as the input from the user.
- the input section 411 may include a keyboard, a mouse, operation buttons, a touch panel, a camera, a microphone, and input terminals.
- the input section 411 may further include various sensors such as an acceleration sensor, an optical sensor, and a temperature sensor, as well as input equipment such as a barcode reader.
- the output section 412 includes output devices for outputting information such as images and sounds.
- the output section 412 may include a display unit, speakers, and output terminals.
- the storage section 413 includes storage media for storing information such as programs and data.
- the storage section 413 may include a hard disk, a RAM disk, and a nonvolatile memory.
- the communication section 414 includes a communication device for communicating with external apparatuses by sending and receiving information such as programs and data thereto and therefrom via predetermined communication media (e.g., suitable networks such as the Internet).
- the communication section 414 may include a network interface.
- the communication section 414 performs communication (i.e., exchanges programs and data), for example, with apparatuses external to the imaging apparatus 113 (e.g., control apparatus 111 ).
- the communication section 414 may have a wired communication function or a wireless communication function, or both.
- the drive 415 retrieves information (e.g., programs and data) from removable media 421 such as a magnetic disk, an optical disk, a magneto-optical disk, or a semiconductor memory attached to the drive 415 .
- the drive 415 supplies the information retrieved from the removable media 421 to the control section 401 , among others.
- the drive 415 can also have the information (e.g., programs and data) supplied from the control section 401 stored on the attached removable media 421 .
- the control apparatus 111 in the projection imaging system 100 performs a calibration process to calibrate the projection variables (internal and external parameters of the projection apparatuses 112 ) and the imaging variables (internal and external parameters of the imaging apparatuses 113 ).
- the sensing processing section 251 performs a sensing process in step S 101 to detect corresponding points.
- the Structured Light method is used to obtain pixel-to-pixel correspondence between the projection apparatus 112 (e.g., projector) and the imaging apparatus 113 (e.g., camera). More specifically, the projection apparatus 112 (projector) projects to the dome-type screen 121 patterns with their pixels encoded in the time direction (e.g., gray code or checker pattern) while switching the patterns in time series. Further, the imaging apparatus 113 (camera) captures a projected image of each of these patterns. On the basis of each of the patterns included in the captured images, the control apparatus 111 obtains the corresponding points between the pixels of the projection apparatus 112 and those of the imaging apparatus 113 . When the information regarding the corresponding points (i.e., corresponding points between the projection and imaging apparatuses) has been obtained, the process advances to step S 102 .
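- The Structured Light decoding step above can be sketched as follows. This is an illustrative sketch only, assuming the captured pattern images have already been binarized into 0/1 bit planes (most significant bit first); the function names are hypothetical.

```python
import numpy as np

def gray_to_binary(g):
    """Convert Gray-coded integers to plain binary integers."""
    g = np.asarray(g).copy()
    mask = g >> 1
    while mask.any():
        g ^= mask
        mask >>= 1
    return g

def decode_gray(bit_planes):
    """Decode time-multiplexed Gray-code captures into a per-pixel
    projector coordinate. bit_planes: sequence of (H, W) 0/1 arrays,
    most significant bit first."""
    codes = np.zeros(np.asarray(bit_planes[0]).shape, dtype=np.int64)
    for b in bit_planes:
        codes = (codes << 1) | np.asarray(b, dtype=np.int64)
    return gray_to_binary(codes)
```

Decoding the column patterns and the row patterns separately yields, for each camera pixel, the projector coordinate it observes, which is the pixel-to-pixel correspondence used in the subsequent steps.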
- In step S 102 , the posture estimation section 252 performs a posture estimation process based on the information regarding the corresponding points obtained in step S 101 , so as to obtain the internal and external variable estimates of each of the projection apparatuses 112 and imaging apparatuses 113 .
- posture estimation section 252 initially regards the internal and external variables of the projection apparatuses 112 and imaging apparatuses 113 as unknowns. After the projection apparatuses 112 and imaging apparatuses 113 have been suitably arranged relative to the screen 121 to which the projection apparatuses 112 project images, the posture estimation section 252 estimates the variables according to this arrangement. That is, with the present method, there is no need to perform preliminary calibration procedures on the internal and external variables of the projection apparatuses 112 and imaging apparatuses 113 .
- the projection apparatuses 112 and imaging apparatuses 113 may preferably retain as initial values the representative values of their internal variables (e.g., focal point distance, principal point, and lens distortion).
- the posture estimation section 252 may then perform the posture estimation process using these representative values.
- the focal point distance and the principal point may be set on the basis of the resolution of captured and projected images.
- An average of the values obtained by calibrating multiple projection apparatuses 112 and imaging apparatuses 113 beforehand may be used as the lens distortion factor.
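- As a sketch of how such initial values might be set up (the width-proportional focal length, the centered principal point, and the averaging are illustrative assumptions, not values prescribed above):

```python
import numpy as np

def initial_intrinsics(width, height, focal_scale=1.0):
    """Rough initial internal variables derived from image resolution:
    principal point at the image center, focal length proportional to
    the image width (focal_scale is a hypothetical tuning factor)."""
    fx = fy = focal_scale * width
    cx, cy = (width - 1) / 2.0, (height - 1) / 2.0
    return np.array([[fx, 0.0, cx],
                     [0.0, fy, cy],
                     [0.0, 0.0, 1.0]])

def representative_distortion(calibrated_ks):
    """Average the distortion factors calibrated beforehand on several
    devices to obtain a representative initial value."""
    return np.mean(np.asarray(calibrated_ks, dtype=float), axis=0)
```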
- These internal variables are used solely as the initial values.
- the posture estimation section 252 then estimates all of these internal variables again.
- the posture estimation section 252 further estimates the external variables of the projection apparatuses 112 and imaging apparatuses 113 . Obtaining the external variables does not require preparing their initial values beforehand. It is possible to automatically estimate the external variables in a state where they are completely unknown.
- After the internal and external variable estimates of each of the projection apparatuses 112 and imaging apparatuses 113 have been acquired by the above-described posture estimation process, the process advances to step S 103 .
- In step S 103 , the geometric correction section 253 obtains vector data for geometric correction by performing a geometric correction process using the internal and external variable estimates of each of the projection apparatuses 112 and imaging apparatuses 113 acquired in the processing of step S 102 .
- Upon completion of the processing in step S 103 , the calibration process is terminated.
- the imaging variable estimation section 261 in the posture estimation section 252 estimates the internal and external variables of the imaging apparatuses 113 (i.e., imaging variables) in step S 121 .
- the imaging variable estimation section 261 estimates the posture-related parameters of the imaging apparatuses 113 (i.e., imaging variables) by using the image projection model that uses the distortion factor of the fθ lens.
- the process advances to step S 122 .
- In step S 122 , the projection variable estimation section 262 estimates the internal and external variables of the projection apparatuses 112 (i.e., projection variables).
- the projection variables are estimated in a manner similar to the case where the imaging variables are estimated in step S 121 .
- In step S 123 , the total optimization section 263 optimizes the estimates of the imaging variables obtained in step S 121 (internal and external variable estimates of the imaging apparatuses 113 ) and the estimates of the projection variables acquired in step S 122 (internal and external variable estimates of the projection apparatuses 112 ).
- After each of the variable estimates is optimized and the processing of step S 123 is terminated, the posture estimation process comes to an end. The process then returns to the flowchart of FIG. 6 .
- the posture estimation section 252 performs the posture estimation process to individually estimate and optimize the internal parameters (focal point distance, principal point, and parameter k inv corresponding to inverse transformation of the lens distortion factor) and the external parameters (rotation matrix and translation vector with respect to the origin of a world coordinate system) of the projection apparatuses 112 , before finally and simultaneously optimizing the parameters to obtain the final estimates.
- The following describes the estimation of the imaging variables (step S 121 ) as well as the estimation of the projection variables (step S 122 ) performed during the above-described posture estimation process.
- the imaging variables and the projection variables are estimated basically using similar methods. These variables are estimated using the image projection model that uses the fθ lens.
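- The fθ (equidistant) image projection model referred to here maps a ray at angle θ from the optical axis to an image radius r = fθ. A minimal sketch, assuming a point given in camera coordinates with the z axis along the optical axis (ignoring lens distortion, which is handled separately by k inv):

```python
import numpy as np

def project_ftheta(P, f, cx, cy):
    """Project a 3-D point (camera coordinates, z forward) with an
    equidistant (f-theta) lens model: image radius r = f * theta,
    where theta is the angle between the ray and the optical axis."""
    x, y, z = P
    theta = np.arctan2(np.hypot(x, y), z)   # angle off the optical axis
    phi = np.arctan2(y, x)                  # azimuth around the axis
    r = f * theta
    return np.array([cx + r * np.cos(phi), cy + r * np.sin(phi)])
```

Unlike the perspective model, this mapping remains well defined even for rays at 90 degrees or more from the optical axis, which is what allows the wide fields of view used with the dome-type screen.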
- the internal parameters (e.g., focal point distance and principal point) of the projection apparatuses 112 and imaging apparatuses 113 may be expressed by the expression (12) below.
- the internal parameters may be expressed by the expression (13) below.
- the present method uses the parameter k inv corresponding to inverse transformation of the lens distortion factor.
- It is assumed that a three-dimensional space in which the target to be measured exists is an ideal space free of distortion and that the pixel value expressed in the image coordinate system of the projection apparatuses 112 (or imaging apparatuses 113 ) and corresponding to a point in the three-dimensional space includes distortion.
- a pixel p′ in the image coordinate system of the projection apparatuses 112 is subjected to distortion correction to obtain an ideal coordinate value p (free of distortion), which in turn is projected to a three-dimensional space to acquire a light ray on which exists a three-dimensional point P corresponding to the point p′.
- corresponding ideal coordinates are projected to the three-dimensional space to obtain multiple light rays of which the intersection point is measured as a three-dimensional point corresponding to each pixel.
- the value k is defined as the parameter for re-projecting a point from the three-dimensional space in the direction of a distorted two-dimensional image.
- the present method introduces the parameter k inv corresponding to inverse transformation of the lens distortion factor k in order to perform distortion correction on the coordinate values of the projection apparatuses 112 and imaging apparatuses 113 by use of the above-described expressions (7) to (9).
- Using the parameter k inv permits unified distortion correction on all pixels. Further, compared with methods of compensating each pixel, this method suppresses an increase in calculation costs.
- the parameter k inv is initially estimated by the procedure below using the value of the distortion factor k. Thereafter, the parameter k inv is estimated again in optimization steps (parameter estimation process), to be discussed later.
- k inv = [ k inv(1) k inv(2) k inv(3) k inv(4) ]^T (16)
- the points (pixels) for use in the estimation are to be a sufficient number of points sampled at equal intervals longitudinally and crosswise over the entire image.
- the lens distortion factor k in the direction of the distorted two-dimensional image (in the direction of re-projection) is to be given an appropriate initial value, such as a representative value generated from an average of the calibration values of multiple projection apparatuses 112 (projectors) and imaging apparatuses 113 (cameras).
- r = r′(1 + k inv(1) r′^2 + k inv(2) r′^4 + k inv(3) r′^6 + k inv(4) r′^8 ) (18)
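- The initial estimation of k inv described above can be sketched as a linear least-squares fit: ideal radii are sampled, distorted with the known factor k, and the inverse polynomial is fitted so that it maps the distorted radii back to the ideal ones. The sampling range and count below are illustrative assumptions.

```python
import numpy as np

def distort(r, k):
    """Forward radial distortion: ideal radius r -> distorted radius r'."""
    return r * (1 + k[0]*r**2 + k[1]*r**4 + k[2]*r**6 + k[3]*r**8)

def fit_k_inv(k, r_max=1.5, n=200):
    """Fit the inverse-distortion parameters k_inv so that
    r ~= r' * (1 + k_inv(1) r'^2 + k_inv(2) r'^4 + k_inv(3) r'^6
               + k_inv(4) r'^8),
    i.e. r - r' is linear in [r'^3, r'^5, r'^7, r'^9]."""
    r = np.linspace(1e-3, r_max, n)     # sampled ideal radii
    rd = distort(r, k)                  # corresponding distorted radii
    A = np.stack([rd**3, rd**5, rd**7, rd**9], axis=1)
    k_inv, *_ = np.linalg.lstsq(A, r - rd, rcond=None)
    return k_inv
```

Because the fit is solved for all sampled radii at once, the resulting k inv corrects every pixel with the same polynomial, matching the unified correction described above.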
- In steps S 121 and S 122 of FIG. 8 , the parameter k inv corresponding to inverse transformation of the lens distortion factor is re-estimated.
- the re-estimation is implemented by nonlinear optimization that involves minimizing the distance between corresponding light rays (triangulation error) obtained from the pixel-to-pixel correspondence between the projection apparatus 112 and the imaging apparatus 113 .
- a typical flow of the parameter estimation process constituted by steps S 121 and S 122 in FIG. 8 is explained below with reference to the flowchart of FIG. 9 .
- the Levenberg-Marquardt method may be used, for example.
- the imaging variable estimation section 261 (or projection variable estimation section 262 ) sets, for example, the parameter k inv as an estimation target parameter in step S 141 .
- Note that, instead of the parameter k inv , any other internal or external variable may be designated as the estimation target.
- multiple parameters may be designated at the same time.
- In step S 142 , the imaging variable estimation section 261 (or projection variable estimation section 262 ) performs distortion correction on each corresponding point by using the parameter k inv corresponding to inverse transformation of the lens distortion factor.
- the three-dimensional space is regarded as a distortion-free ideal space, and the captured or projected image is defined to include distortion caused by the fθ lens. That is, previously acquired pixel-to-pixel correspondence between the projection apparatus 112 and the imaging apparatus 113 has been obtained as the corresponding relations between the image coordinates of pixels each including distortion. Thus, in order to perform a ray trace in the three-dimensional space on the basis of the corresponding relations, it is necessary to use distortion-free image coordinates.
- a corresponding pixel 601 in an image 611 on a two-dimensional plane such as the one depicted on the left in FIG. 10 is subjected to distortion correction using the parameter k inv and the above-described expressions (7) to (11).
- An image coordinate value (corresponding pixel 602 ) is thus obtained in a distortion-free image 612 whose range is made wider than the initial rectangle due to the effect of the fθ lens, as depicted on the right in FIG. 10 .
- This distortion correction is carried out on each of the corresponding pixels.
- In step S 143 , the imaging variable estimation section 261 (or projection variable estimation section 262 ) calculates (approximates) triangulation points by tracing corresponding light rays.
- the light rays corresponding to a pixel can be traced by projecting distortion-free image coordinates in the direction of the three-dimensional space, by using the relations of the expressions (10) and (11) above. That is, as depicted in FIG. 11 , the distortion correction transforms a corresponding pixel 601 A in a distorted two-dimensional plane image 611 A to a corresponding pixel 602 A in an image 612 A.
- the distortion correction transforms a corresponding pixel 601 B in a distorted two-dimensional plane image 611 B to a corresponding pixel 602 B in an image 612 B.
- the corresponding pixels between the projection apparatus 112 and the imaging apparatus 113 are corrected for distortion and subjected to projection to obtain light rays of which the intersection point is measured as the three-dimensional point corresponding to the pixel.
- the distance between the light rays corresponding to a triangulation point is regarded as an error between the corresponding light rays (i.e., triangulation error).
- In the case where the error is zero, the corresponding light rays intersect with each other at one point in the three-dimensional space.
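- The triangulation point and the triangulation error for one pair of corresponding light rays can be computed in closed form as the midpoint and the length of the shortest segment between the two rays. A minimal sketch, assuming non-parallel rays:

```python
import numpy as np

def triangulate_rays(o1, d1, o2, d2):
    """Closest approach of two 3-D rays (origin o, direction d).
    Returns the midpoint of the shortest segment between them (the
    triangulation point) and the segment length (the triangulation
    error). Assumes the rays are not parallel."""
    d1 = d1 / np.linalg.norm(d1)
    d2 = d2 / np.linalg.norm(d2)
    w = o1 - o2
    a, b, c = d1 @ d1, d1 @ d2, d2 @ d2
    denom = a * c - b * b                      # zero only for parallel rays
    t1 = (b * (d2 @ w) - c * (d1 @ w)) / denom
    t2 = (a * (d2 @ w) - b * (d1 @ w)) / denom
    p1, p2 = o1 + t1 * d1, o2 + t2 * d2        # closest points on each ray
    return (p1 + p2) / 2.0, np.linalg.norm(p1 - p2)
```

When the two rays actually intersect, the returned error is zero and the midpoint coincides with the intersection point, as stated above.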
- In step S 144 , the imaging variable estimation section 261 (or projection variable estimation section 262 ) calculates an average error of the entire corresponding points on the basis of the measurement errors obtained in the processing of step S 143 .
- In step S 145 , the imaging variable estimation section 261 (or projection variable estimation section 262 ) determines whether or not all corresponding points have been processed. In the case where an unprocessed corresponding point is determined to exist, the process returns to step S 141 and the subsequent steps are repeated. That is, steps S 141 to S 145 are carried out on each of the corresponding points. Then, in the case where it is determined in step S 145 that all corresponding points have been processed, the process advances to step S 146 .
- In step S 146 , the imaging variable estimation section 261 (or projection variable estimation section 262 ) determines whether or not the average error calculated in step S 144 is equal to or smaller than a predetermined threshold value. In the case where the average error is determined to be larger than the predetermined threshold value, the process advances to step S 147 .
- In step S 147 , the imaging variable estimation section 261 (or projection variable estimation section 262 ) corrects the estimation parameters (e.g., parameter k inv and other parameters). Note that in order to achieve highly accurate estimation, the corresponding points of which the errors are very large are to be removed as needed by each processing block so that these points will not be used for the estimation.
- Upon completion of the processing in step S 147 , the process returns to step S 141 and the subsequent steps are repeated. That is, the processing in step S 141 to step S 147 is repeated until the average error of the entire corresponding points is optimized to be equal to or smaller than the predetermined threshold value.
- In the case where it is determined in step S 146 that the average error is equal to or smaller than the predetermined threshold value, the parameters are regarded as having been optimized, and the process returns to the flowchart of FIG. 8 .
- outliers are removed so that the points outside the screen 121 or the points of low sensing accuracy will not be used for the estimation.
- an error between corresponding light rays from the projection apparatus 112 and imaging apparatus 113 is obtained at the time of finding a triangulation point.
- the optimization for minimizing the error between the corresponding light rays and the removal of the sensing points with very large errors between the corresponding light rays are repeated. This enables still more accurate estimation of the internal and external variables in the case where highly accurate information regarding the corresponding points has been obtained from the sensing process.
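- The removal of sensing points with very large errors between corresponding light rays might be sketched as a simple statistical filter. The mean-plus-sigma threshold below is an illustrative assumption; the description above does not prescribe a specific criterion.

```python
import numpy as np

def remove_outliers(corrs, errors, sigma=3.0):
    """Drop corresponding points whose triangulation errors are far
    above the average (mean + sigma * std), so that they are not used
    in the next estimation round."""
    errors = np.asarray(errors, dtype=float)
    keep = errors <= errors.mean() + sigma * errors.std()
    kept_corrs = [c for c, k in zip(corrs, keep) if k]
    return kept_corrs, errors[keep]
```

Alternating this filter with the optimization step lets gross sensing failures be discarded while the remaining, reliable correspondences refine the internal and external variables.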
- In step S 123 of FIG. 8 , the parameters are optimized basically in a similar manner. It is to be noted, however, that during the total optimization, the process of updating all projection and imaging variables estimated as described above is repeated while the parameters targeted for the estimation are changed one after another.
- the projection plane modeling section 271 in the geometric correction section 253 reconfigures the projection plane in step S 161 by using the parameters related to posture estimation of the projection apparatuses 112 and imaging apparatuses 113 , thereby fitting a two-dimensional curved surface.
- In step S 162 , the virtual viewpoint position/projection direction estimation section 272 estimates a virtual viewpoint position and a direction of image projection relative to that virtual viewpoint position. For example, suppose that the processing in step S 161 has set a screen shape model 701 as illustrated in FIG. 14 . Then, the virtual viewpoint position/projection direction estimation section 272 sets a virtual viewpoint 702 at the front of the screen shape model 701 and establishes a projection direction (in front) relative to the virtual viewpoint 702 .
- the virtual viewpoint position/projection direction estimation section 272 selects, from a group of three-dimensional points measured as depicted in FIG. 15A , a group of points corresponding to the edge of the screen 121 as illustrated in FIG. 15B .
- the virtual viewpoint position/projection direction estimation section 272 further fits the selected group of points to a plane as pictured in FIG. 15C .
- a normal direction to that plane is regarded as the front direction as viewed from a viewpoint camera.
- Because the projection apparatuses 112 - 1 and 112 - 2 are generally arranged at approximately the same height, the horizontal direction is determined on the basis of that height of the projection apparatuses 112 .
- the vertical direction is finally determined as a vector that is at right angles to the other two directions. Such estimation of the virtual viewpoint direction and of the projection direction is automatically performed using measurement information. There is no need to designate the direction explicitly and manually.
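- The steps above (plane fit of the screen-edge points, front direction from the plane normal, horizontal direction from the projector arrangement, vertical direction as the orthogonal complement) can be sketched as follows; `right_hint` is a hypothetical stand-in for the direction derived from the projector heights.

```python
import numpy as np

def fit_plane(points):
    """Least-squares plane through a point cloud via SVD.
    Returns (centroid, unit normal)."""
    P = np.asarray(points, dtype=float)
    c = P.mean(axis=0)
    _, _, vt = np.linalg.svd(P - c)
    return c, vt[-1]                      # smallest-variance direction

def viewpoint_frame(edge_points, right_hint):
    """Front direction = normal of the plane fitted to the screen-edge
    points; horizontal = right_hint projected onto that plane;
    vertical = vector orthogonal to the other two."""
    c, front = fit_plane(edge_points)
    right = right_hint - (right_hint @ front) * front
    right = right / np.linalg.norm(right)
    up = np.cross(front, right)
    return c, front, right, up
```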
- the above processing allows the virtual viewpoint direction and the projection direction to be estimated.
- a geometric correction vector can be generated in such a manner as to permit viewing of the image from that position in a geometrically accurate manner.
- In step S 163 , the model misalignment corresponding processing section 273 determines whether or not the estimates correspond to model misalignment. In the case where it is determined that there is misalignment between the projection plane and its model (model misalignment) and that the estimates correspond to model misalignment, the process advances to step S 164 .
- In step S 164 , the model misalignment corresponding processing section 273 performs a model misalignment corresponding interpolation process.
- three-dimensional points 721 on the projection plane are measured using the information regarding the corresponding points between the projection apparatus 112 and the imaging apparatus 113 and the internal and external variable estimates of these apparatuses.
- this group of measured three-dimensional points is fitted with the screen shape model 701 of a two-dimensional curved surface (e.g., ellipsoid, hyperboloid, or cylinder) such as to minimize the least-square error.
- the points are usually modeled as an ellipsoid. This provides smooth, noise-resistant geometric correction through calculation of the geometric correction vector based on the model estimated from the whole pixels.
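- As an illustration of such a least-squares fit, the spherical special case (rather than a general ellipsoid) can be solved linearly, since |p|² = 2c·p + (r² − |c|²) is linear in the center c and in d = r² − |c|²:

```python
import numpy as np

def fit_sphere(points):
    """Linear least-squares sphere fit to measured 3-D points.
    Uses |p|^2 = 2 c.p + (r^2 - |c|^2); returns (center, radius)."""
    P = np.asarray(points, dtype=float)
    A = np.hstack([2.0 * P, np.ones((len(P), 1))])
    b = (P ** 2).sum(axis=1)
    sol, *_ = np.linalg.lstsq(A, b, rcond=None)
    center, d = sol[:3], sol[3]
    radius = np.sqrt(d + center @ center)
    return center, radius
```

A general ellipsoid adds axis lengths and orientation as unknowns but follows the same pattern: minimize the squared algebraic residual of the quadric over the whole measured point group, which is what makes the resulting correction smooth and noise-resistant.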
- However, the model-based geometric correction can entail an error when the screen shape is measured as three-dimensional points that deviate, due to distortion, for example, from the estimated model as depicted in FIG. 17 .
- a three-dimensional point 731 measured on the estimated screen shape model 701 is found deviating from a three-dimensional point 732 that ought to be measured on the actual (real-world) screen 121 .
- the geometric correction vector is generated from the position of an intersection point on a model with its radius set by the sphere center of a virtually arranged spherical model and an interpolated value of distances of triangulation points. That is, in the example of FIG. 18 , a three-dimensional point 741 on a screen shape model 701 A and a three-dimensional point 742 on a screen shape model 701 B are measured as the triangulation points. A three-dimensional point 743 is then interpolated between these three-dimensional points (i.e., an assumed intersection point on a screen shape model 701 C with its radius interpolated relative to the sphere center). The three-dimensional point 743 is then used to generate the geometric correction vector.
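- A sketch of the interpolation just described, assuming a spherical model: the distances of two triangulation points from the sphere center are blended, and the interpolated point is placed at that blended radius along the blended direction. The blending parameter `t` is an illustrative assumption.

```python
import numpy as np

def interpolated_model_point(center, p_a, p_b, t=0.5):
    """Interpolate between two triangulation points relative to a
    sphere center: blend their radii (distances from the center) and
    their directions, then place the result on the blended ray."""
    va, vb = p_a - center, p_b - center
    r = (1 - t) * np.linalg.norm(va) + t * np.linalg.norm(vb)   # blended radius
    d = (1 - t) * va / np.linalg.norm(va) + t * vb / np.linalg.norm(vb)
    d = d / np.linalg.norm(d)                                   # blended direction
    return center + r * d
```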
- In step S 165 , the model misalignment corresponding processing section 273 generates a correction vector corresponding to the model misalignment (also called the model misalignment corresponding correction vector).
- Upon completion of the processing in step S 165 , the process advances to step S 167 .
- On the other hand, in the case where it is determined in step S 163 that the estimates do not correspond to model misalignment, the process advances to step S 166 .
- In step S 166 , the model misalignment corresponding processing section 273 generates a correction vector not corresponding to the model misalignment (also called the model misalignment non-corresponding correction vector).
- Upon completion of the processing in step S 166 , the process advances to step S 167 .
- In step S 167 , the projection mask generation section 274 generates a projection mask such as to limit the range of image projection to an area inside the screen 121 (i.e., the projected image is confined within the screen 121 and does not protrude from it).
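- As a minimal illustration, for a screen boundary that appears as a circle in image coordinates (an assumption for illustration; the actual boundary comes from the measured screen edge), such a projection mask might be generated as a boolean image:

```python
import numpy as np

def projection_mask(h, w, center, radius):
    """Boolean mask keeping only the pixels that fall inside a circular
    screen boundary, so the projected image does not protrude from the
    screen. center = (cx, cy) in pixel coordinates."""
    yy, xx = np.mgrid[0:h, 0:w]
    return (xx - center[0]) ** 2 + (yy - center[1]) ** 2 <= radius ** 2
```

Multiplying the projected image by this mask blanks every pixel whose corrected position would land outside the screen area.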
- the projection mask generation section 274 outputs geometric correction vector data including the correction vector generated in step S 165 or S 166 and the projection mask generated in step S 167 .
- the geometric correction process is terminated. The process then returns to the flowchart of FIG. 6 .
- the control apparatus 111 implements correction of projection to the dome-type screen 121 for use with multiple projection apparatuses 112 each using the fθ lens, by means of a three-dimensional approach using the projection imaging system 100 .
- images are corrected on the basis of a two-dimensional curved surface model built on three-dimensional information representative of a screen shape.
- the processing thus guarantees the geometric accuracy of images, such as a straight line being seen as a correct straight line.
- the processing easily enables the steps ranging from calibration of the internal and external variables of the projection apparatuses 112 and imaging apparatuses 113 to projection correction, the steps not having been dealt with successfully by the methods involving the use of a projector-camera system based on the existing perspective projection optical assembly.
- geometrically accurate correction is implemented with respect to an established virtual viewpoint without the imaging apparatus 113 being arranged in the position desired to be the viewpoint. Further, from the virtual viewpoint, projection is performed to the screen 121 in an appropriate projection direction with suitable up-down/left-right image orientation maintained. Further, the projected image is corrected in a geometrically accurate manner relative to the front of the screen 121 on the basis of the information obtained with projection and imaging by the projection apparatus 112 and imaging apparatus 113 not in front of the screen 121 but from the side thereof.
- the parameter k inv corresponding to inverse transformation of the lens distortion factor of the projection apparatuses 112 is introduced in parameter estimation, the parameter k inv being used to uniformly correct the pixels for distortion. This eliminates the need for compensating each of the pixels with use of the lens distortion factor k.
- ray traces are performed by correcting the pixels of a distorted image for distortion and by projecting the pixels onto a distortion-free image.
- the optimization for minimizing the distance between corresponding light rays achieves re-estimation of the internal and external variables of the projection apparatuses 112 and imaging apparatuses 113 including the parameter k inv .
- some of the internal parameters of the projection apparatuses 112 and imaging apparatuses 113 need only be given sufficiently appropriate initial values (e.g., averages of calibration values). This still provides calibration of the whole internal and external variables of the projection apparatuses 112 and imaging apparatuses 113 that are arranged before the screen without recourse to preliminary calibration procedures for each of the apparatus enclosures.
- the control apparatus 111 estimates the internal and external variables of each of the projection apparatuses 112 and imaging apparatuses 113 .
- the present method is not limited to the foregoing examples.
- the control apparatus 111 may estimate the internal and external variables of either the projection apparatuses 112 or the imaging apparatuses 113 .
- the internal and external variables of the remaining apparatuses may be determined beforehand or may be estimated by the other apparatuses.
- the control apparatus 111 may estimate either the internal variables or the external variables.
- the other variables may be determined beforehand or may be estimated by the other apparatuses.
- the control apparatus 111 need only estimate at least either the internal variables or the external variables of either the projection apparatuses 112 or the imaging apparatuses 113 .
- the configuration of the imaging system to which the present technology is applied is not limited to the above-described example in FIG. 1 .
- the control apparatus 111 , the projection apparatus 112 , and the imaging apparatus 113 may be provided in desired numbers.
- the control apparatus 111 may, for example, be provided in plural numbers.
- the projection apparatus 112 and the imaging apparatus 113 need not be provided in equal numbers.
- the projection apparatuses 112 and the imaging apparatuses 113 are connected with the control apparatus 111 via the cables 115 .
- these apparatuses may be interconnected in any other suitable manner as long as they can communicate with each other.
- the control apparatus 111 may communicate with the projection apparatuses 112 and imaging apparatuses 113 by wire or wirelessly, or in both wired and wireless fashion.
- the control apparatus 111 , the projection apparatuses 112 , and the imaging apparatuses 113 may be interconnected communicably via any suitable communication network.
- the network may include a single or multiple communication networks.
- this network may include the Internet, public telephone networks, mobile broadband networks such as what is generally called the 3G or 4G networks, WAN (Wide Area Network), LAN (Local Area Network), wireless communication networks for communication based on the Bluetooth (registered trademark) standard, communication channels for short-range wireless communication such as NFC (Near Field Communication), communication channels for infrared ray communication, wired communication networks based on such standards as HDMI (High-Definition Multimedia Interface; registered trademark) and USB (Universal Serial Bus), or any other communication networks and communication channels based on any suitable communication standards.
- the projection apparatus 112 and the imaging apparatus 113 may be integrated into a single apparatus.
- the projection imaging system 100 may include a projection imaging apparatus 801 - 1 , a projection imaging apparatus 801 - 2 , and a control apparatus 111 .
- the projection imaging apparatus 801 - 1 includes a projection section 811 - 1 and an imaging section 812 - 1 .
- the projection section 811 - 1 has functions similar to those of the projection apparatus 112 - 1 in FIG. 1 .
- the imaging section 812 - 1 has functions similar to those of the imaging apparatus 113 - 1 in FIG. 1 . That is, the projection imaging apparatus 801 - 1 has the functions of the projection apparatus 112 - 1 and imaging apparatus 113 - 1 .
- the projection imaging apparatus 801 - 2 includes a projection section 811 - 2 and an imaging section 812 - 2 .
- the projection section 811 - 2 has functions similar to those of the projection apparatus 112 - 2 in FIG. 1 .
- the imaging section 812 - 2 has functions similar to those of the imaging apparatus 113 - 2 in FIG. 1 . That is, the projection imaging apparatus 801 - 2 has the functions of the projection apparatus 112 - 2 and imaging apparatus 113 - 2 .
- the control apparatus 111 is communicably connected with the projection imaging apparatus 801 - 1 via the cable 115 - 1 .
- the communication allows the control apparatus 111 to control the projection imaging apparatus 801 - 1 .
- the projection imaging apparatus 801 - 1 is supplied with an image, projects the image to the projection plane, and captures a projected image on the projection plane for image acquisition.
- the control apparatus 111 is also connected communicably with the projection imaging apparatus 801 - 2 via the cable 115 - 2 .
- the communication allows the control apparatus 111 to control the projection imaging apparatus 801 - 2 .
- the projection imaging apparatus 801 - 2 is supplied with an image, projects the image to the projection plane (e.g., screen 121 ), and captures a projected image on the projection plane for image acquisition.
- the projection imaging system 100 can perform image projection correction by use of the present technology and in a manner similar to the case in FIG. 1 .
- the projection imaging apparatuses 801 - 1 and 801 - 2 are referred to as the projection imaging apparatus 801 in the case where there is no need for their individual explanation.
- the projection sections 811 - 1 and 811 - 2 are referred to as the projection section 811 where there is no need for their individual explanation.
- the imaging sections 812 - 1 and 812 - 2 are referred to as the imaging section 812 where there is no need for their individual explanation.
- the projection imaging apparatus 801 may be provided in desired numbers. For example, three or more projection imaging apparatuses 801 may be provided. Further, there may be provided one or more projection sections 811 and one or more imaging sections 812 in the projection imaging apparatus 801 ; the projection section 811 and the imaging section 812 may be provided in different numbers. Moreover, each of the projection imaging apparatuses 801 may include different numbers of the projection sections 811 and imaging sections 812 . Further, the projection imaging system 100 may include the projection imaging apparatus 801 and either the projection apparatus 112 or the imaging apparatus 113 , or the projection imaging apparatus 801 and both the projection apparatus 112 and the imaging apparatus 113 in a mixed manner.
- the control apparatus 111 may be integrated with another apparatus.
- the projection imaging system 100 may include an imaging apparatus 820 , a projection apparatus 112 , and a projection imaging apparatus 801 .
- the imaging apparatus 820 includes the imaging section 812 - 1 and a control section 821 .
- the control section 821 has functions similar to those of the control apparatus 111 in FIG. 1 or in FIG. 19A . That is, the imaging apparatus 820 has the functions of the imaging apparatus 113 and the control apparatus 111 .
- the imaging apparatus 820 , the projection apparatus 112 , and the projection imaging apparatus 801 are connected communicably with each other via the cable 115 .
- Images are thus supplied to the imaging apparatus 820 via the cable 114 .
- the control section 821 in the imaging apparatus 820 controls, via the cable 115 , the projection section 811 - 1 in the projection apparatus 112 and the projection section 811 - 2 in the projection imaging apparatus 801 to project the supplied image to the projection plane (e.g., screen 121 ).
- the control section 821 further controls the imaging section 812 - 1 as well as the imaging section 812 - 2 in the projection imaging apparatus 801 via the cable 115 to capture the projected image on the projection plane.
- the control section 821 performs geometric correction on the image by using the present technology so that a geometrically corrected image will be projected.
- the projection imaging system 100 can perform image projection correction by use of the present technology and in a manner similar to the case in FIG. 1 .
- the control apparatus 111 may be integrated with an apparatus other than the imaging apparatus 113 , such as with the projection apparatus 112 or the projection imaging apparatus 801 . That is, the apparatus that includes the control section 821 may be configured in any suitable manner and may have the projection sections 811 and imaging sections 812 in desired numbers. Also, there may be one or multiple apparatuses each having the control section 821 . Further, in the projection imaging system 100 , the constituent elements other than the apparatus having the control section 821 may be configured as desired. The configuration of the projection imaging system 100 is thus not limited to that of the example in FIG. 19B .
- the entire configuration of the projection imaging system 100 may be integrated into a single apparatus.
- the whole system may be integrated into a projection imaging apparatus 830 .
- the projection imaging apparatus 830 includes a projection section 811 - 1 , a projection section 811 - 2 , an imaging section 812 - 1 , an imaging section 812 - 2 , and a control section 821 . That is, the projection imaging apparatus 830 is configured in a manner similar to the projection imaging system 100 in FIG. 1 . As described above, the present technology may be applied internally to the projection imaging apparatus 830 .
- the projection imaging apparatus 830 may be configured as desired and the configuration is not limited to that of the example in FIG. 19C .
- the control section 821 , the projection section 811 , and the imaging section 812 may each be provided in desired numbers.
- the series of processes described above may be executed either by hardware or by software.
- the programs constituting the software are installed from a network or from a recording medium.
- its recording medium is constituted by the removable media 221 on which the programs are recorded and which are distributed to users apart from the apparatus in order to deliver the recorded programs.
- a piece of the removable media 221 on which the programs are recorded may be attached to the drive 215 so as to have the programs installed into the storage section 213 following their retrieval from the attached piece of removable media 221 .
- its recording medium is constituted by the removable media 321 on which the programs are recorded and which are distributed to users apart from the apparatus in order to deliver the recorded programs.
- a piece of the removable media 321 on which the programs are recorded may be attached to the drive 315 so as to have the programs installed into the storage section 313 following their retrieval from the attached piece of the removable media 321 .
- its recording medium is constituted by the removable media 421 on which the programs are recorded and which are distributed to users apart from the apparatus in order to deliver the recorded programs.
- a piece of the removable media 421 on which the programs are recorded may be attached to the drive 415 so as to have the programs installed into the storage section 413 following their retrieval from the attached piece of removable media 421 .
- the programs may be offered via wired or wireless transmission media such as local area networks, the Internet, and digital satellite broadcasts.
- the programs may be received by the communication section 214 and installed into the storage section 213 .
- the programs may be received by the communication section 314 and installed into the storage section 313 .
- the programs may be received by the communication section 414 and installed into the storage section 413 .
- the programs may be preinstalled in a storage section or a ROM.
- the programs may be preinstalled in the storage section 213 or in a ROM (not depicted) inside the control section 201 .
- the programs may be preinstalled in the storage section 313 or in a ROM (not depicted) inside the control section 301 .
- the programs may be preinstalled in the storage section 413 or in a ROM (not depicted) inside the control section 401 .
- the present technology may be implemented as any of the components constituting an apparatus or any of the apparatuses configuring a system, such as a processor (e.g., video processor) in the form of a system LSI (Large Scale Integration), a module (e.g., video module) using multiple processors, a unit (e.g., video unit) using multiple modules, and a set (e.g., video set) supplementing a unit with other functions (i.e., as part of the apparatus).
- the present technology may also be applied to a network system including multiple apparatuses.
- the technology may be applied to cloud services that offer image-related (video-related) services to any type of terminal, such as computers, AV (Audio Visual) equipment, mobile information processing terminals, and IoT (Internet of Things) devices.
- systems, apparatuses, or processing sections to which the present technology is applied may be used for desired purposes in any type of field, such as transportation, healthcare, crime prevention, agriculture, livestock farming, mining, beauty care, factories, home electric appliances, climate, and nature monitoring.
- the present technology may be applied to systems and devices used for offering content for aesthetic or appreciative purposes.
- the present technology may be applied to systems and devices for transportation-related purposes such as for monitoring traffic conditions and controlling automated driving.
- the present technology may be applied to systems and devices for security purposes.
- the present technology may be applied to systems and devices for automated control of machines.
- the present technology may be applied to systems and devices for use in agriculture and livestock farming.
- the present technology may be applied to systems and devices for monitoring the state of nature such as volcanoes, forests and oceans, as well as the state of wildlife.
- the present technology may be applied to systems and devices for use in sports.
- the present technology may be implemented as any of the components constituting an apparatus or a system, such as a processor (e.g., video processor) in the form of a system LSI (Large Scale Integration), a module (e.g., video module) using multiple processors, a unit (e.g., video unit) using multiple modules, and a set (e.g., video set) supplementing a unit with other functions (i.e., as part of the apparatus).
- the term "system" refers to an aggregate of multiple components (e.g., apparatuses or modules (parts)). It does not matter whether all the components are housed in the same enclosure. Thus, a system may be configured with multiple apparatuses housed in separate enclosures and interconnected via a network, or with a single apparatus in a single enclosure that houses multiple modules.
- any configuration explained in the foregoing paragraphs as one apparatus (or processing section) may be divided into multiple apparatuses (or processing sections).
- the configurations explained above as multiple apparatuses (or processing sections) may be unified into one apparatus (or processing section).
- the configuration of each apparatus (or processing section) may obviously be supplemented with a configuration or configurations other than those discussed above.
- part of the configuration of an apparatus (or processing section) may be included in the configuration of another apparatus (or processing section), provided the configurations and the workings remain substantially the same for the system as a whole.
- the present technology may be implemented as a cloud computing setup in which a single function is processed cooperatively by multiple networked apparatuses on a shared basis.
- the above-described programs may be executed by any apparatus.
- the apparatus is only required to have necessary functions (e.g., functional blocks) and obtain necessary information for program execution.
- each of the steps discussed in reference to the above-described flowcharts may be executed either by a single apparatus or by multiple apparatuses on a shared basis. Further, if a single step includes multiple processes, these processes may be executed either by a single apparatus or by multiple apparatuses on a shared basis. In other words, multiple processes included in a single step may be executed as processes of multiple steps. Conversely, a process explained as made up of multiple steps may be executed as a single step.
- the programs executed by the computer may each be processed in such a manner that the processes of the steps describing the program are carried out chronologically, i.e., in the sequence depicted in this description, in parallel with other programs, or in otherwise appropriately timed fashion such as when the program is invoked as needed. That is, the above processes of steps may be carried out in sequences different from those discussed above as long as there is no conflict between the steps. Furthermore, the processes of the steps describing a given program may be performed in parallel with, or in combination with, the processes of other programs.
- An information processing apparatus including:
- a posture estimation section configured such that, by use of an image projection model using a distortion factor of an fθ lens with an image height of incident light expressed by a product of a focal point distance f and an incident angle θ of the incident light, the posture estimation section estimates a posture of a projection section for projecting an image and a posture of an imaging section for capturing a projection plane to which the image is projected.
- the posture estimation section estimates posture-related parameters of at least either the projection section or the imaging section.
- the posture-related parameters include internal parameters of at least either the projection section or the imaging section.
- the internal parameters include at least one of a focal point distance, a principal point, and a parameter corresponding to inverse transformation of the distortion factor regarding either the projection section or the imaging section.
- the posture-related parameters include external parameters of at least either the projection section or the imaging section.
- the external parameters include either a rotation matrix or a translation vector with respect to an origin of a world coordinate system of either the projection section or the imaging section.
- the posture estimation section optimizes the posture-related parameters in such a manner that an average error of the detected corresponding points becomes equal to or smaller than a predetermined threshold value.
- the posture estimation section corrects the parameters for use in estimation of the posture-related parameters
- the posture estimation section repeatedly estimates the posture-related parameters until the average error becomes equal to or smaller than the threshold value.
- the posture estimation section optimizes the posture-related parameters while removing as an outlier a corresponding point having a large error.
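The iterative optimization with outlier removal described above can be sketched as a simple loop. This is an illustrative reading only, not the patent's implementation: the patent does not fix the estimator or the error metric, and every name below (refine, reproj_error) is hypothetical.

```python
import numpy as np

def optimize_with_outlier_removal(points, params, refine, reproj_error,
                                  threshold, max_iter=20):
    """Re-estimate posture-related parameters, dropping the corresponding
    point with the largest error each round, until the average error
    becomes equal to or smaller than the threshold."""
    pts = list(points)
    for _ in range(max_iter):
        params = refine(params, pts)                      # re-estimate posture
        errs = np.array([reproj_error(params, p) for p in pts])
        if errs.mean() <= threshold:
            break                                         # average error small enough
        pts.pop(int(errs.argmax()))                       # remove the worst outlier
    return params, pts
```

With a perfect estimator this converges as soon as the remaining corresponding points agree with the model; the loop bound guards against degenerate inputs.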
- the information processing apparatus as stated in any one of paragraphs (2) to (11) above, further including:
- a geometric correction section configured such that, by use of the posture-related parameters estimated by the posture estimation section, the geometric correction section generates vector data for geometric correction of the image projected by the projection section.
- the geometric correction section estimates a virtual viewpoint position in front of the projection plane and a projection direction relative to the virtual viewpoint position, thereby generating the vector data for suppressing distortion of the virtual viewpoint position.
- the geometric correction section performs a model misalignment corresponding process for suppressing an error between an actual projection plane and the model.
- the geometric correction section generates a projection mask for limiting a range in which the image is to be projected.
- a corresponding point detection section configured to detect a corresponding point between the projection section and the imaging section, in which,
- the posture estimation section estimates the posture of the projection section and that of the imaging section.
- the information processing apparatus as stated in any one of paragraphs (1) to (17) above, further including:
- An information processing method including:
- by use of an image projection model using a distortion factor of an fθ lens with an image height of incident light expressed by a product of a focal point distance f and an incident angle θ of the incident light, estimating a posture of a projection section for projecting an image and a posture of an imaging section for capturing a projection plane to which the image is projected.
- 100 Projection imaging system, 111 Control apparatus, 112 Projection apparatus, 113 Imaging apparatus, 201 Control section, 251 Sensing processing section, 252 Posture estimation section, 253 Geometric correction section, 261 Imaging variable estimation section, 262 Projection variable estimation section, 263 Total optimization section, 271 Projection plane modeling section, 272 Virtual viewpoint position/projection direction estimation section, 273 Model misalignment corresponding processing section, 274 Projection mask generation section, 301 Control section, 302 Projection section, 401 Control section, 402 Imaging section, 801 Projection imaging apparatus, 811 Projection section, 812 Imaging section, 820 Imaging apparatus, 821 Control section, 830 Projection imaging apparatus
Landscapes
- Physics & Mathematics (AREA)
- Engineering & Computer Science (AREA)
- Multimedia (AREA)
- Signal Processing (AREA)
- General Physics & Mathematics (AREA)
- Geometry (AREA)
- Transforming Electric Information Into Light Information (AREA)
- Projection Apparatus (AREA)
Abstract
Description
[Math. 1]
X_c = R X + T  (1)
[Math. 6]
k = [k_1 k_2 k_3 k_4]^T  (6)
[Math. 7]
θ_d = θ(1 + k_1 θ^2 + k_2 θ^4 + k_3 θ^6 + k_4 θ^8)  (7)
[Math. 8]
x′ = (θ_d/r) a  (8)
[Math. 9]
y′ = (θ_d/r) b  (9)
[Math. 10]
u = f_x (x′ + α y′) + c_x  (10)
[Math. 11]
v = f_y y′ + c_y  (11)
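Equations (1) and (6) through (11) together define the forward fθ (fisheye) projection of a world point to pixel coordinates. The following NumPy sketch is illustrative only and not part of the patent; the function name is hypothetical, and the intermediate quantities a, b, r, and θ are assumed to follow the standard fθ model (a = X_c/Z_c, b = Y_c/Z_c, r = √(a² + b²), θ = arctan r), consistent with the equations not reproduced in this excerpt.

```python
import numpy as np

def project_ftheta(X, R, T, k, fx, fy, cx, cy, alpha=0.0):
    """Project a 3-D world point X to pixel coordinates (u, v)
    with the f-theta distortion model."""
    Xc = R @ X + T                       # Eq. (1): world -> camera coordinates
    a, b = Xc[0] / Xc[2], Xc[1] / Xc[2]  # normalized image coordinates
    r = np.hypot(a, b)
    theta = np.arctan(r)                 # incident angle
    # Eq. (7): distorted angle, with coefficients k = [k1 k2 k3 k4]^T (Eq. (6))
    theta_d = theta * (1 + k[0] * theta**2 + k[1] * theta**4
                         + k[2] * theta**6 + k[3] * theta**8)
    scale = theta_d / r if r > 1e-12 else 1.0
    x_p, y_p = scale * a, scale * b      # Eqs. (8) and (9)
    u = fx * (x_p + alpha * y_p) + cx    # Eq. (10), alpha is the skew term
    v = fy * y_p + cy                    # Eq. (11)
    return u, v
```

With all distortion coefficients at zero, a point on the optical axis maps to the principal point (c_x, c_y), as the equations require.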
[Math. 16]
k_inv = [k_inv(1) k_inv(2) k_inv(3) k_inv(4)]^T  (16)
[Math. 17]
r′^2 = x′^2 + y′^2  (17)
[Math. 18]
θ′ = r′(1 + k_inv(1) r′^2 + k_inv(2) r′^4 + k_inv(3) r′^6 + k_inv(4) r′^8)  (18)
[Math. 19]
a = (tan θ′/r′) x′  (19)
[Math. 20]
b = (tan θ′/r′) y′  (20)
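Equations (16) through (20) realize the parameter corresponding to inverse transformation of the distortion factor: a second polynomial with coefficients k_inv maps the distorted coordinates (x′, y′) back to the undistorted ray direction (a, b). A minimal sketch (illustrative; the function name is hypothetical):

```python
import numpy as np

def undistort_ftheta(x_p, y_p, k_inv):
    """Recover the undistorted normalized coordinates (a, b)
    from distorted coordinates (x', y')."""
    r2 = x_p * x_p + y_p * y_p                    # Eq. (17): r'^2
    r_p = np.sqrt(r2)
    if r_p < 1e-12:
        return 0.0, 0.0                           # point on the optical axis
    # Eq. (18): polynomial approximation of the inverse distortion
    theta_p = r_p * (1 + k_inv[0] * r2 + k_inv[1] * r2**2
                       + k_inv[2] * r2**3 + k_inv[3] * r2**4)
    s = np.tan(theta_p) / r_p
    return s * x_p, s * y_p                       # Eqs. (19) and (20)
```

With all k_inv coefficients at zero, θ′ reduces to r′ and the mapping becomes a = (tan r′/r′)x′, which exactly undoes the zero-distortion forward projection.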
-
- performs image distortion correction on the projection section and the imaging section by using the parameter corresponding to inverse transformation of the distortion factor, and
- performs a ray trace to detect a corresponding point by use of the projection section and the imaging section subjected to the distortion correction, thereby estimating the posture-related parameters.
(8)
-
- estimates the posture-related parameters of the projection section,
- estimates the posture-related parameters of the imaging section, and
- optimizes the estimated posture-related parameters of the projection section and the estimated posture-related parameters of the imaging section.
(12)
-
- obtains the projection plane by use of the posture-related parameters of the projection section and of the imaging section so as to model the obtained projection plane as a two-dimensional curved surface, and
- generates the vector data by use of the projection plane model thus obtained.
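Modeling the obtained projection plane as a two-dimensional curved surface could, for instance, be realized as a linear least-squares fit of a quadratic surface to the triangulated corresponding points. This is an assumption for illustration; the patent does not specify the surface parameterization.

```python
import numpy as np

def fit_quadratic_surface(points):
    """Fit z = c0 + c1*x + c2*y + c3*x^2 + c4*x*y + c5*y^2 to an
    (N, 3) array of 3-D points by linear least squares."""
    x, y, z = points[:, 0], points[:, 1], points[:, 2]
    A = np.column_stack([np.ones_like(x), x, y, x * x, x * y, y * y])
    coef, *_ = np.linalg.lstsq(A, z, rcond=None)
    return coef  # evaluate the fitted surface as A @ coef for new (x, y)
```

The fitted coefficients give a smooth surface model from which per-pixel correction vectors can then be generated.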
(14)
Claims (17)
Applications Claiming Priority (4)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
JP2018020612 | 2018-02-08 | ||
JPJP2018-020612 | 2018-02-08 | ||
JP2018-020612 | 2018-02-08 | ||
PCT/JP2019/002389 WO2019155903A1 (en) | 2018-02-08 | 2019-01-25 | Information processing device and method |
Publications (2)
Publication Number | Publication Date |
---|---|
US20210067753A1 US20210067753A1 (en) | 2021-03-04 |
US11483528B2 true US11483528B2 (en) | 2022-10-25 |
Family
ID=67547993
Family Applications (1)
Application Number | Title | Priority Date | Filing Date |
---|---|---|---|
US16/964,818 Active 2039-06-30 US11483528B2 (en) | 2018-02-08 | 2019-01-25 | Information processing apparatus and information processing method |
Country Status (3)
Country | Link |
---|---|
US (1) | US11483528B2 (en) |
CN (1) | CN111670574A (en) |
WO (1) | WO2019155903A1 (en) |
Families Citing this family (3)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
EP4044104A4 (en) * | 2020-12-04 | 2023-01-04 | Shenzhen Institutes of Advanced Technology Chinese Academy of Sciences | Panoramic presentation method and device therefor |
CN114765667A (en) * | 2021-01-13 | 2022-07-19 | 安霸国际有限合伙企业 | Fixed pattern calibration for multi-view stitching |
CN115103169B (en) * | 2022-06-10 | 2024-02-09 | 深圳市火乐科技发展有限公司 | Projection picture correction method, projection picture correction device, storage medium and projection device |
Citations (14)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
JP2001320652A (en) | 2000-05-11 | 2001-11-16 | Nec Corp | Projector |
JP2004015255A (en) | 2002-06-05 | 2004-01-15 | Sony Corp | Imaging apparatus and imaging method, image processor and image processing method, and program and recording medium |
JP2004282711A (en) | 2003-02-28 | 2004-10-07 | Victor Co Of Japan Ltd | Projection display device |
JP2005244835A (en) | 2004-02-27 | 2005-09-08 | Olympus Corp | Multiprojection system |
JP2007309660A (en) | 2006-05-16 | 2007-11-29 | Roland Dg Corp | Calibration method in three-dimensional shape measuring device |
US20120051666A1 (en) * | 2010-08-31 | 2012-03-01 | Hitachi Information & Communication Engineering, Ltd. | Image correcting device, method for creating corrected image, correction table creating device, method for creating correction table, program for creating correction table, and program for creating corrected image |
US8482595B2 (en) * | 2007-07-29 | 2013-07-09 | Nanophotonics Co., Ltd. | Methods of obtaining panoramic images using rotationally symmetric wide-angle lenses and devices thereof |
US20140169827A1 (en) | 2012-12-17 | 2014-06-19 | Ricoh Company, Ltd. | Developing device and image forming apparatus |
US20150189267A1 (en) * | 2013-12-27 | 2015-07-02 | Sony Corporation | Image projection device and calibration method thereof |
JP2015142157A (en) | 2014-01-27 | 2015-08-03 | パナソニックIpマネジメント株式会社 | Image projection system, projection controller, projection controlling program |
WO2016204068A1 (en) | 2015-06-19 | 2016-12-22 | ソニー株式会社 | Image processing apparatus and image processing method and projection system |
US10176595B2 (en) * | 2015-03-25 | 2019-01-08 | Vadas, Ltd | Image processing apparatus having automatic compensation function for image obtained from camera, and method thereof |
US20200380729A1 (en) * | 2017-03-27 | 2020-12-03 | Nec Corporation | Camera parameter estimation device, method and program |
US11195252B2 (en) * | 2016-12-06 | 2021-12-07 | SZ DJI Technology Co., Ltd. | System and method for rectifying a wide-angle image |
Family Cites Families (4)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
JP4631048B2 (en) * | 2005-02-14 | 2011-02-16 | 国立大学法人岩手大学 | Imaging apparatus and imaging system parameter calibration method |
JP2013005393A (en) * | 2011-06-21 | 2013-01-07 | Konica Minolta Advanced Layers Inc | Image processing method having wide-angle distortion correction processing, image processing apparatus and imaging apparatus |
CN103247030A (en) * | 2013-04-15 | 2013-08-14 | 丹阳科美汽车部件有限公司 | Fisheye image correction method of vehicle panoramic display system based on spherical projection model and inverse transformation model |
JP6448196B2 (en) * | 2014-02-13 | 2019-01-09 | 株式会社バンダイナムコエンターテインメント | Image generation system and program |
- 2019-01-25 CN CN201980011183.8A patent/CN111670574A/en active Pending
- 2019-01-25 WO PCT/JP2019/002389 patent/WO2019155903A1/en active Application Filing
- 2019-01-25 US US16/964,818 patent/US11483528B2/en active Active
Patent Citations (17)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
JP2001320652A (en) | 2000-05-11 | 2001-11-16 | Nec Corp | Projector |
JP2004015255A (en) | 2002-06-05 | 2004-01-15 | Sony Corp | Imaging apparatus and imaging method, image processor and image processing method, and program and recording medium |
US20040032649A1 (en) | 2002-06-05 | 2004-02-19 | Tetsujiro Kondo | Method and apparatus for taking an image, method and apparatus for processing an image, and program and storage medium |
JP2004282711A (en) | 2003-02-28 | 2004-10-07 | Victor Co Of Japan Ltd | Projection display device |
JP2005244835A (en) | 2004-02-27 | 2005-09-08 | Olympus Corp | Multiprojection system |
JP2007309660A (en) | 2006-05-16 | 2007-11-29 | Roland Dg Corp | Calibration method in three-dimensional shape measuring device |
US8482595B2 (en) * | 2007-07-29 | 2013-07-09 | Nanophotonics Co., Ltd. | Methods of obtaining panoramic images using rotationally symmetric wide-angle lenses and devices thereof |
US20120051666A1 (en) * | 2010-08-31 | 2012-03-01 | Hitachi Information & Communication Engineering, Ltd. | Image correcting device, method for creating corrected image, correction table creating device, method for creating correction table, program for creating correction table, and program for creating corrected image |
US20140169827A1 (en) | 2012-12-17 | 2014-06-19 | Ricoh Company, Ltd. | Developing device and image forming apparatus |
US20150189267A1 (en) * | 2013-12-27 | 2015-07-02 | Sony Corporation | Image projection device and calibration method thereof |
JP2015128242A (en) | 2013-12-27 | 2015-07-09 | ソニー株式会社 | Image projection device and calibration method of the same |
JP2015142157A (en) | 2014-01-27 | 2015-08-03 | パナソニックIpマネジメント株式会社 | Image projection system, projection controller, projection controlling program |
US10176595B2 (en) * | 2015-03-25 | 2019-01-08 | Vadas, Ltd | Image processing apparatus having automatic compensation function for image obtained from camera, and method thereof |
WO2016204068A1 (en) | 2015-06-19 | 2016-12-22 | ソニー株式会社 | Image processing apparatus and image processing method and projection system |
US20180232855A1 (en) | 2015-06-19 | 2018-08-16 | Sony Corporation | Image processing unit, image processing method, and projection system |
US11195252B2 (en) * | 2016-12-06 | 2021-12-07 | SZ DJI Technology Co., Ltd. | System and method for rectifying a wide-angle image |
US20200380729A1 (en) * | 2017-03-27 | 2020-12-03 | Nec Corporation | Camera parameter estimation device, method and program |
Non-Patent Citations (1)
Title |
---|
International Search Report and Written Opinion of PCT Application No. PCT/JP2019/002389, dated Apr. 16, 2019, 11 pages of ISRWO. |
Also Published As
Publication number | Publication date |
---|---|
US20210067753A1 (en) | 2021-03-04 |
CN111670574A (en) | 2020-09-15 |
WO2019155903A1 (en) | 2019-08-15 |
Legal Events
Date | Code | Title | Description |
---|---|---|---|
FEPP | Fee payment procedure |
Free format text: ENTITY STATUS SET TO UNDISCOUNTED (ORIGINAL EVENT CODE: BIG.); ENTITY STATUS OF PATENT OWNER: LARGE ENTITY |
STPP | Information on status: patent application and granting procedure in general |
Free format text: APPLICATION DISPATCHED FROM PREEXAM, NOT YET DOCKETED |
STPP | Information on status: patent application and granting procedure in general |
Free format text: DOCKETED NEW CASE - READY FOR EXAMINATION |
AS | Assignment |
Owner name: SONY CORPORATION, JAPAN |
Free format text: ASSIGNMENT OF ASSIGNORS INTEREST;ASSIGNORS:TAHARA, TOMU;KOBAYASHI, NAOKI;KATSUKI, YUGO;SIGNING DATES FROM 20200807 TO 20200817;REEL/FRAME:056047/0951 |
STPP | Information on status: patent application and granting procedure in general |
Free format text: NON FINAL ACTION MAILED |
STPP | Information on status: patent application and granting procedure in general |
Free format text: RESPONSE TO NON-FINAL OFFICE ACTION ENTERED AND FORWARDED TO EXAMINER |
STPP | Information on status: patent application and granting procedure in general |
Free format text: NOTICE OF ALLOWANCE MAILED -- APPLICATION RECEIVED IN OFFICE OF PUBLICATIONS |
STPP | Information on status: patent application and granting procedure in general |
Free format text: PUBLICATIONS -- ISSUE FEE PAYMENT RECEIVED |
STPP | Information on status: patent application and granting procedure in general |
Free format text: PUBLICATIONS -- ISSUE FEE PAYMENT VERIFIED |
STCF | Information on status: patent grant |
Free format text: PATENTED CASE |