US12250500B2 - Image projection system and image projection method - Google Patents
- Publication number
- US12250500B2 (application US18/326,006)
- Authority
- US
- United States
- Prior art keywords
- image
- laser light
- physical object
- light
- unit
- Prior art date
- Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
- Active, expires
Classifications
-
- G—PHYSICS
- G01—MEASURING; TESTING
- G01B—MEASURING LENGTH, THICKNESS OR SIMILAR LINEAR DIMENSIONS; MEASURING ANGLES; MEASURING AREAS; MEASURING IRREGULARITIES OF SURFACES OR CONTOURS
- G01B11/00—Measuring arrangements characterised by the use of optical techniques
- G01B11/24—Measuring arrangements characterised by the use of optical techniques for measuring contours or curvatures
- G01B11/25—Measuring arrangements characterised by the use of optical techniques for measuring contours or curvatures by projecting a pattern, e.g. one or more lines, moiré fringes on the object
- G01B11/2518—Projection by scanning of the object
-
- G—PHYSICS
- G01—MEASURING; TESTING
- G01B—MEASURING LENGTH, THICKNESS OR SIMILAR LINEAR DIMENSIONS; MEASURING ANGLES; MEASURING AREAS; MEASURING IRREGULARITIES OF SURFACES OR CONTOURS
- G01B11/00—Measuring arrangements characterised by the use of optical techniques
- G01B11/24—Measuring arrangements characterised by the use of optical techniques for measuring contours or curvatures
- G01B11/25—Measuring arrangements characterised by the use of optical techniques for measuring contours or curvatures by projecting a pattern, e.g. one or more lines, moiré fringes on the object
- G01B11/2545—Measuring arrangements characterised by the use of optical techniques for measuring contours or curvatures by projecting a pattern, e.g. one or more lines, moiré fringes on the object with one projection direction and several detection directions, e.g. stereo
-
- G—PHYSICS
- G06—COMPUTING OR CALCULATING; COUNTING
- G06T—IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
- G06T7/00—Image analysis
- G06T7/50—Depth or shape recovery
- G06T7/521—Depth or shape recovery from laser ranging, e.g. using interferometry; from the projection of structured light
-
- H—ELECTRICITY
- H04—ELECTRIC COMMUNICATION TECHNIQUE
- H04N—PICTORIAL COMMUNICATION, e.g. TELEVISION
- H04N9/00—Details of colour television systems
- H04N9/12—Picture reproducers
- H04N9/31—Projection devices for colour picture display, e.g. using electronic spatial light modulators [ESLM]
- H04N9/3129—Projection devices for colour picture display, e.g. using electronic spatial light modulators [ESLM] scanning a light beam on the display screen
-
- H—ELECTRICITY
- H04—ELECTRIC COMMUNICATION TECHNIQUE
- H04N—PICTORIAL COMMUNICATION, e.g. TELEVISION
- H04N9/00—Details of colour television systems
- H04N9/12—Picture reproducers
- H04N9/31—Projection devices for colour picture display, e.g. using electronic spatial light modulators [ESLM]
- H04N9/3129—Projection devices for colour picture display, e.g. using electronic spatial light modulators [ESLM] scanning a light beam on the display screen
- H04N9/3135—Driving therefor
-
- H—ELECTRICITY
- H04—ELECTRIC COMMUNICATION TECHNIQUE
- H04N—PICTORIAL COMMUNICATION, e.g. TELEVISION
- H04N9/00—Details of colour television systems
- H04N9/12—Picture reproducers
- H04N9/31—Projection devices for colour picture display, e.g. using electronic spatial light modulators [ESLM]
- H04N9/3141—Constructional details thereof
- H04N9/315—Modulator illumination systems
- H04N9/3161—Modulator illumination systems using laser light sources
-
- H—ELECTRICITY
- H04—ELECTRIC COMMUNICATION TECHNIQUE
- H04N—PICTORIAL COMMUNICATION, e.g. TELEVISION
- H04N9/00—Details of colour television systems
- H04N9/12—Picture reproducers
- H04N9/31—Projection devices for colour picture display, e.g. using electronic spatial light modulators [ESLM]
- H04N9/3179—Video signal processing therefor
- H04N9/3185—Geometric adjustment, e.g. keystone or convergence
-
- H—ELECTRICITY
- H04—ELECTRIC COMMUNICATION TECHNIQUE
- H04N—PICTORIAL COMMUNICATION, e.g. TELEVISION
- H04N9/00—Details of colour television systems
- H04N9/12—Picture reproducers
- H04N9/31—Projection devices for colour picture display, e.g. using electronic spatial light modulators [ESLM]
- H04N9/3191—Testing thereof
- H04N9/3194—Testing thereof including sensor feedback
-
- G—PHYSICS
- G06—COMPUTING OR CALCULATING; COUNTING
- G06T—IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
- G06T2207/00—Indexing scheme for image analysis or image enhancement
- G06T2207/10—Image acquisition modality
- G06T2207/10004—Still image; Photographic image
- G06T2207/10012—Stereo images
-
- G—PHYSICS
- G06—COMPUTING OR CALCULATING; COUNTING
- G06T—IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
- G06T2207/00—Indexing scheme for image analysis or image enhancement
- G06T2207/10—Image acquisition modality
- G06T2207/10028—Range image; Depth image; 3D point clouds
Definitions
- the present disclosure relates to an image projection system and an image projection method for projecting an image on a physical object.
- the present disclosure has been made in view of such problems as described above, and it is desirable to provide a technology that easily implements projection mapping with a high degree of accuracy.
- an image projection system including an image projection unit that irradiates a physical object with image laser light for forming pixels of an image to be projected on the physical object, a reference light irradiation unit that irradiates the physical object with reference laser light emitted through an emission port common to the image laser light, and a shape measurement unit that detects, at a position circumscribing the emission port, the reference laser light reflected from the physical object and measures a three-dimensional shape of the physical object on the basis of a result of the detection.
- an image projection method by an image projection system includes irradiating a physical object with image laser light for forming pixels of an image to be projected on the physical object, irradiating the physical object with reference laser light emitted through an emission port common to the image laser light, and detecting, at a position circumscribing the emission port, the reference laser light reflected from the physical object and measuring a three-dimensional shape of the physical object on the basis of a result of the detection.
- projection mapping with a high degree of accuracy can easily be implemented at a low cost.
- FIG. 1 is a block diagram depicting an example of a configuration of an image projection system that implements projection mapping according to an embodiment of the present disclosure;
- FIG. 2 is a block diagram depicting an example of a configuration of an existing system that simultaneously implements acquisition of a three-dimensional shape of a physical object and projection of an image;
- FIG. 3 is a schematic view illustrating the necessity for coordinate conversion in the existing technology;
- FIG. 4 is a view illustrating an image projection technology of a laser light scanning method adopted in the present embodiment;
- FIG. 5 is a block diagram depicting a detailed configuration of the image projection system according to the present embodiment;
- FIG. 6 is a block diagram depicting a configuration of an internal circuit of an image data outputting unit in the present embodiment;
- FIG. 7 is a block diagram depicting a configuration of functional blocks of the image data outputting unit and a shape measurement unit in the present embodiment;
- FIGS. 8A and 8B each depict a schematic view exemplifying a positional relation between image laser light and reference laser light when they are emitted in the present embodiment;
- FIG. 9 is a block diagram depicting another example of the configuration of the image projection system in the present embodiment;
- FIG. 10 is a view illustrating a mode in which a scanning controller in the present embodiment adjusts a scanning speed with laser light according to contents of an image; and
- FIG. 11 is a view depicting an example of setting of a rule when the scanning controller in the present embodiment adjusts the scanning speed with laser light.
- FIG. 1 depicts an example of a configuration of an image projection system that implements projection mapping according to an embodiment of the present disclosure.
- an image projection system 14 includes an image data outputting unit 10 and a light irradiation unit 12 .
- the image data outputting unit 10 outputs data of an image to be projected on a physical object 6 to the light irradiation unit 12 .
- the light irradiation unit 12 acquires the data and irradiates the physical object 6 with laser light, which represents a color of each pixel that configures an image, to project an image 8 on a surface of the physical object 6 .
- the light irradiation unit 12 further has a function of measuring a three-dimensional shape of the physical object 6 .
- the light irradiation unit 12 irradiates the physical object 6 with reference light (laser light for reference) such as infrared light and observes reflected light from the physical object 6 to acquire the distance to the surface of the physical object 6 .
- the light irradiation unit 12 applies the reference laser light through an emission port common to the laser light for forming each pixel of the image 8 .
- the light irradiation unit 12 applies the reference laser light coaxially with the laser light for forming each pixel of the image 8 . Consequently, the reference laser light is applied in units of pixels to, and reflected from, the physical object 6 .
- the light irradiation unit 12 performs emission and observation of the reference laser light at the same position together with projection of the image 8 to acquire the distance to the physical object 6 as a two-dimensional distribution on the surface of the physical object 6 .
- the two-dimensional distribution of the distance to the surface of the physical object 6 represents a position, a shape, and a posture of the physical object 6 .
- these parameters are hereinafter collectively referred to as a “three-dimensional shape” of the physical object 6 .
- the image data outputting unit 10 adaptively adjusts the image serving as the projection source, according to the three-dimensional shape of the physical object 6 obtained last, such that projection mapping is performed with a high degree of accuracy.
- the image data outputting unit 10 expands, contracts, or shades an image in such a manner as to conform to the physical object 6 in terms of the posture or unevenness.
- the image data outputting unit 10 performs such editing as translation, rotation, or deformation according to a movement of the physical object 6 . Since such image adjustment can be performed by applying general technology, the following description focuses on a mechanism for acquiring a three-dimensional shape of the physical object 6 with a high degree of accuracy while the image 8 is projected.
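As a rough illustration of the kind of adjustment described above, the following sketch shrinks a projection-source image according to a measured distance. The `reference_distance` parameter, the linear scaling rule, and the nearest-neighbour resampling are illustrative assumptions, not the embodiment's actual adjustment procedure.

```python
import numpy as np

def scale_for_distance(image, measured_distance, reference_distance):
    """Shrink (or enlarge) the projection-source image so that it keeps
    its apparent size on an object that moved to measured_distance.
    Nearest-neighbour resampling keeps the sketch dependency-free."""
    s = reference_distance / measured_distance  # s < 1 when the object recedes
    h, w = image.shape[:2]
    nh, nw = max(1, round(h * s)), max(1, round(w * s))
    ys = np.minimum((np.arange(nh) / s).astype(int), h - 1)
    xs = np.minimum((np.arange(nw) / s).astype(int), w - 1)
    return image[np.ix_(ys, xs)]

img = np.ones((720, 1280, 3), dtype=np.uint8)
print(scale_for_distance(img, 2.0, 1.0).shape)  # (360, 640, 3)
```

An object that doubles its distance thus receives an image half the size in each dimension, so that the projected picture keeps its apparent extent on the surface.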
- FIG. 2 depicts an example of a configuration of an existing system that simultaneously implements acquisition of a three-dimensional shape of the physical object 6 and projection of an image.
- This system is configured such that it includes, in addition to an image projection unit 112 that projects an image 8 on the physical object 6 , a shape measurement device 116 that measures a three-dimensional shape of the physical object 6 .
- FIG. 2 assumes, as a shape measuring technique of a physical object, a patterned stereo method that projects an infrared image of a specific pattern such as an array of dots on the physical object 6 and captures the projected infrared image by a stereo camera to obtain a parallax.
- an infrared irradiation unit 114 of the shape measurement device 116 irradiates the physical object 6 with an infrared image of a specific pattern.
- the stereo camera includes a right viewpoint camera 120 a and a left viewpoint camera 120 b , and includes an infrared pass filter 118 , which passes light of an infrared wavelength band therethrough, in front of an imaging plane of each of the right viewpoint camera 120 a and the left viewpoint camera 120 b .
- the stereo camera performs shooting in stereo of a picture of an infrared pattern with which the physical object 6 is irradiated.
- a shape acquisition unit 122 calculates a distance to the physical object 6 by the principle of trigonometry on the basis of a parallax appearing between two pictures of the infrared patterns in the captured stereo image.
- the distance value is obtained in units of feature points that configure the infrared pattern, such as dots.
- the shape acquisition unit 122 can acquire a distribution of distances on the surface of the physical object 6 at a granularity of the feature points and hence can calculate a three-dimensional shape of the physical object 6 .
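The trigonometric calculation mentioned above can be sketched with a standard pinhole stereo model; the focal length, baseline, and disparity figures below are illustrative, not values from the patent.

```python
def depth_from_disparity(focal_px, baseline_m, disparity_px):
    """Triangulate the distance from the parallax between the two
    infrared pictures: Z = f * B / d (pinhole stereo model)."""
    if disparity_px <= 0:
        raise ValueError("feature must appear shifted between the views")
    return focal_px * baseline_m / disparity_px

# a dot seen 40 px apart with an 800 px focal length and a 10 cm baseline
print(depth_from_disparity(800, 0.10, 40))  # 2.0 (metres)
```

Each feature point (dot) yields one such depth value, which is why the distance distribution is obtained only at the granularity of the feature points.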
- the three-dimensional shape in this case is information on a camera coordinate system that is based on the imaging plane of the right viewpoint camera 120 a or the left viewpoint camera 120 b.
- the shape measurement device 116 includes a coordinate conversion unit 124 and converts information relating to a three-dimensional shape of the physical object 6 on the camera coordinate system into information on a coordinate system that is based on the projection plane of the image projection unit 112 (plane of the light emitting element that is the projection source of the image 8 ).
- An image data outputting unit 110 adjusts an image to be projected, on the basis of information relating to the three-dimensional shape, the information being obtained after the coordinate conversion is performed, and outputs a resulting image to the image projection unit 112 .
- the image projection unit 112 projects the image, on which the result of the shape measurement is reflected in this manner, to the physical object 6 .
- FIG. 3 is a view illustrating the necessity for the coordinate conversion in the existing technology.
- the shape acquisition unit 122 obtains a distribution of distances (for example, a distance d) from an imaging plane 132 of one of the right viewpoint camera 120 a and the left viewpoint camera 120 b to the physical object 6 and further obtains information relating to a three-dimensional shape of the physical object with reference to the imaging plane 132 .
- information relating to a three-dimensional shape as viewed from a projection plane 130 of the image projection unit 112 may be required.
- when the physical object 6 moves away from the projection plane 130 , in order to make the image 8 look like a design on the physical object 6 , it may be necessary to reduce the image 9 of the projection source according to the increase in the distance from the projection plane 130 (for example, to a distance d′).
- the coordinate conversion unit 124 performs coordinate conversion of the information relating to the three-dimensional shape acquired by the shape acquisition unit 122 , to acquire information relating to the three-dimensional shape with reference to the projection plane 130 . Since the information after the conversion has a significant influence on the result of projection of an image, it may be necessary to strictly perform calibration in advance between the projection system of the image and the observation system of the infrared rays to obtain a conversion parameter with a high degree of accuracy. Even if this countermeasure is taken, the process for coordinate conversion increases the probability that an error may occur and gives rise to a problem of securing processing resources and a problem of delay time.
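The coordinate conversion performed by the coordinate conversion unit 124 amounts to re-expressing each measured point with calibrated extrinsic parameters. A minimal sketch follows, assuming a rotation R and translation t obtained by prior calibration; the numerical values are illustrative only.

```python
import numpy as np

def camera_to_projector(point_cam, rotation, translation):
    """Re-express a 3D point measured in the camera coordinate system
    in the projector coordinate system: p_proj = R @ p_cam + t."""
    return rotation @ np.asarray(point_cam, dtype=float) + translation

# identity rotation; projector origin 5 cm to the right of the camera
R = np.eye(3)
t = np.array([-0.05, 0.0, 0.0])
print(camera_to_projector([0.10, 0.0, 2.0], R, t))  # x shifts by -5 cm
```

Any error in R or t propagates directly into the projected result, which is why the calibration must be performed strictly in the existing system.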
- the infrared irradiation unit 114 and the shape measurement device 116 that includes various cameras may be required in addition to the image projection unit 112 , and the entire system is likely to become complicated and large-sized. As a result, there is also a problem that the design of the appearance is constrained or the production cost increases.
- Such problems as described above similarly arise as long as the projection plane 130 and the imaging plane 132 are provided in the system, irrespective of a shape measurement technique such as a grid projection method by which a grid-like infrared image is projected and the projected picture is observed to acquire a three-dimensional shape of the physical object 6 .
- a process for extracting corresponding points of infrared patterns in two images of a stereo image may be required, and this further increases the calculation cost.
- moreover, since a stereo camera may be required, this is disadvantageous in terms of the production cost and size reduction of the apparatus.
- there is also a known technology in which the image projection unit 112 includes a digital micromirror device and infrared light is applied, from the infrared irradiation unit 114 , in a superposed relation with the plane of a projection image (for example, refer to Uwe Lippmann and 9 others, “In Good Light: A New High-Speed Projector with Visible and Infrared Capabilities,” [online], Dec. 13, 2021, Tokyo Institute of Technology, [searched Apr. 28, 2022], Internet <URL: https://www.titech.ac.jp/english/news/2021/062614>).
- a three-dimensional shape of a region that coincides with a region on a surface of a physical object that is a projection destination can be acquired with a high resolution.
- laser light for projecting an image and reference laser light for obtaining a three-dimensional shape of a physical object are caused to be emitted through a common emission port such that they are applied in a superposed relation in units of pixels. Then, it is made possible to observe reflection of the reference laser light at a position circumscribing the emission port.
- a laser light scanning method is adopted for projection of an image, and irradiation with light and observation in units of pixels are repeated while the position of the emission destination is successively changed, to perform shape measurement time-divisionally together with projection of an image. Consequently, there is no necessity to additionally provide the projection plane 130 and the imaging plane 132 , and image projection with a high degree of accuracy is implemented at a low cost.
- FIG. 4 is a view illustrating an image projection technology of the laser light scanning method adopted in the present embodiment.
- the laser light scanning method is a technique of causing laser light corresponding to pixels to perform two-dimensional scanning using a mirror for deflection to thereby form an image on a physical object.
- an image laser light source 50 outputs laser light that includes components of red (R), green (G), and blue (B). The laser light is reflected by a mirror 52 and projected on the surface of the physical object 6 .
- the projected image 8 is depicted in a state in which it is viewed from the front.
- the image laser light source 50 generates laser light representative of a color of each pixel, in such a manner as to synchronize with the movement of the arrival point of the laser light. Consequently, an image 8 having pixels of the colors of the laser light outputted at the individual points of time is formed.
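The synchronization described above can be sketched as a mapping from the elapsed time within one frame to the pixel the deflected beam is pointing at. The constant-speed raster model and the 1280×720 default are simplifying assumptions; a real MEMS mirror scan is not constant-speed.

```python
def scan_position(t_in_frame, frame_period, width=1280, height=720):
    """Map elapsed time within one frame to the (row, col) the deflected
    beam points at, under an idealised constant-speed raster scan."""
    pixel_time = frame_period / (width * height)
    idx = min(int(t_in_frame / pixel_time), width * height - 1)
    return divmod(idx, width)  # (row, col)

# at the very end of a 60 fps frame the beam is at the last pixel
print(scan_position(1 / 60, 1 / 60))  # (719, 1279)
```

The image laser light source would look up the colour of pixel (row, col) at each instant and emit the corresponding R, G, and B components, so that the image is formed pixel by pixel.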
- a video projection apparatus that utilizes reflection by a mirror is disclosed, for example, in Japanese Patent Laid-Open No. 2017-83657 and so forth.
- the image data outputting unit 10 adjusts data of the image of the projection source according to the change, so that it is possible to cause the image 8 to look as if it were the surface itself of the physical object 6 .
- the image data outputting unit 10 reduces the image 8 , for example, in the vertical direction.
- adjustment of the image becomes more complicated also with changes in the position, the posture, or the shape of the physical object as described hereinabove.
- reference laser light for shape measurement is also reflected by the mirror 52 , so that irradiation with the reference laser light is performed along the path of, preferably in a coaxial relation with, the laser light for the image. Further, by utilizing such an irradiation mechanism at a “point” such that reflection of the reference laser light is detected at a position proximate to the mirror 52 , information relating to a three-dimensional shape of the physical object 6 with reference to the emission position of the image laser light is directly obtained.
- FIG. 5 depicts a detailed configuration of the image projection system according to the present embodiment.
- the image projection system 14 includes the image data outputting unit 10 and the light irradiation unit 12 .
- the light irradiation unit 12 includes, as depicted in FIG. 4 , the image laser light source 50 and the mirror 52 as an image projection unit that irradiates the physical object 6 with light for forming pixels of an image to be projected on the physical object 6 .
- the image laser light source 50 time-divisionally generates image laser light 57 , which represents a color of each pixel of the projection image, on the basis of image data I outputted from the image data outputting unit 10 .
- although the image laser light 57 includes three different kinds of light, that is, three laser beams corresponding, for example, to R, G, and B, the wavelength and the number of such laser beams are not restrictive as long as the laser beams represent colors corresponding to pixel values.
- as the mirror 52 , for example, a micro electro mechanical systems (MEMS) mirror is used.
- the MEMS mirror is a device that is small in size and low in power consumption and can be controlled with a high degree of accuracy in regard to the angle change around two axes by electromagnetic driving.
- the driving method of the mirror 52 is not specifically restrictive.
- the mirror 52 is changed in terms of the angle by a control signal M from the image data outputting unit 10 such that the image laser light 57 is reflected in such a manner as to arrive at an appropriate position on the physical object 6 .
- the light irradiation unit 12 further includes a reference laser light source 56 , a beam splitter 58 , a reference laser light pass filter 62 , and a shape measurement unit 60 .
- the reference laser light source 56 outputs reference laser light for measuring a three-dimensional shape of the physical object 6
- the beam splitter 58 superposes the reference laser light on the image laser light and introduces the resulting light to the mirror 52 .
- the reference laser light pass filter 62 passes therethrough light of the wavelength of the reference laser light
- the shape measurement unit 60 detects reflected light of the reference laser light to acquire the distance to the physical object 6 and further acquire three-dimensional shape information relating to the physical object 6 .
- the reference laser light source 56 , the beam splitter 58 , and the mirror 52 configure a reference light irradiation unit that applies reference laser light through an emission port common to the image laser light 57 .
- the reference laser light source 56 generates, as reference laser light 59 , near infrared laser light of a pulse width of, for example, 100 picoseconds to several nanoseconds.
- the beam splitter 58 is provided such that it superposes the image laser light 57 and the reference laser light 59 on each other and introduces the resulting light to the mirror 52 .
- the image laser light 57 and the reference laser light 59 are reflected in the superposed state by the mirror 52 and arrive at a position of each pixel (for example, a pixel 64 ) on the surface of the physical object 6 . It is to be noted that it is sufficient if the image laser light 57 and the reference laser light 59 advance in a substantially common axial direction and arrive at the physical object 6 , and whether or not they are superposed on each other or the degree of such superposition is not restrictive. In the description of the present embodiment, this state is referred to as “coaxial” in some cases.
- the shape measurement unit 60 detects light of the reference laser light reflected from the physical object 6 , to acquire information relating to a three-dimensional shape of the physical object 6 .
- the shape measurement unit 60 includes, for example, a direct time of flight (dTOF) sensor and is driven in synchronism with emission of the reference laser light 59 .
- the reference laser light source 56 cyclically generates a pulse of the reference laser light 59 in response to a synchronizing signal S that is inputted, as a trigger, from the shape measurement unit 60 .
- the shape measurement unit 60 repeatedly measures the time difference between the emission timing of the reference laser light 59 based on the outputting time of the synchronizing signal S and the detection timing of the reflection light 61 of the reference laser light 59 to acquire the distance to the physical object 6 .
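The dTOF distance calculation from the measured time difference is, in essence, d = cΔt/2, since the pulse travels the distance twice. A minimal sketch (the 13.34 ns figure is illustrative):

```python
C = 299_792_458.0  # speed of light in m/s

def dtof_distance(emit_time_s, detect_time_s):
    """Direct time of flight: the pulse travels to the object and back,
    so the one-way distance is c * delta_t / 2."""
    delta_t = detect_time_s - emit_time_s
    return C * delta_t / 2.0

# a pulse detected 13.34 ns after emission corresponds to roughly 2 m
print(dtof_distance(0.0, 13.34e-9))
```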
- the technique for measuring the distance to a physical object by detecting reflection of reference laser light is not limited to the dTOF; for example, an indirect time of flight (iTOF) technique may be adopted.
- the mirror 52 is used to two-dimensionally displace the arrival destination of the reference laser light 59 together with the image laser light 57 ; therefore, by acquiring the reflection position of the reference laser light 59 , the shape measurement unit 60 acquires the distance to the physical object 6 in units of pixels of the projection image.
- the shape measurement unit 60 has a function of measuring a three-dimensional shape of the physical object 6 .
- a front elevational view of the light reception surface of the shape measurement unit 60 as viewed in the direction of an arrow mark A is also depicted.
- the shape measurement unit 60 is structured such that light reception elements are arrayed on a surface thereof that has a hollow rectangular shape and has an opening 66 provided at the center thereof.
- the reference laser light pass filter 62 also has a similar shape.
- the opening 66 forms an exit for the image laser light 57 and the reference laser light 59 that are reflected from the mirror 52 .
- the shape measurement unit 60 detects reflected light of the reference laser light at a position circumscribing the emission port for the laser light.
- as long as the shape measurement unit 60 detects the reflected light of the reference laser light at a position circumscribing the emission port for the laser light, the shape of the opening 66 or of the surface on which the light reception elements are arrayed is not restrictive.
- laser light is applied sequentially for each pixel by the laser light scanning method. Therefore, even if the light reception surface of the shape measurement unit 60 is provided in such a manner as to circumscribe the emission port for the laser light as depicted in FIG. 5 , the distribution of the acquisition positions of the distance on the surface of the physical object 6 and the distribution of the pixels on the projection image do not interfere with each other. As a result, while the resolution of the projection image is maintained, three-dimensional shape information relating to the physical object 6 on the same coordinate system can directly be acquired with the same resolution. Note that it is sufficient if the distance between the light reception element array of the shape measurement unit 60 and the emission port for the laser light remains small enough that they can be regarded as being in contact with each other.
- by providing the light reception surface for reflected light such that it surrounds the opening 66 serving as the emission port for the laser light as depicted in FIG. 5 , it is possible to cause the irradiation axis of the image laser light (and the reference laser light) and the center axis 67 of the light reception surface (vertical axis passing through the center of the light reception surface) to coincide with each other. This makes it possible to perform projection of an image and acquisition of three-dimensional shape information on the same coordinate system whose origin is, for example, the center of the opening 66 .
- the reference laser light 59 also arrives at the position of the surface of the physical object 6 at which the image laser light 57 arrives, and reflection of the reference laser light 59 can also be detected almost without a blind angle. Therefore, three-dimensional shape information necessary for image projection can be obtained without omission.
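The time-division operation — steering both beams pixel by pixel, projecting the colour, and reading back the echo through the common port — might be sketched as follows. All callback names are hypothetical stand-ins for the hardware interfaces, not names from the patent.

```python
def project_and_measure(frame, set_mirror, emit_pixel, emit_pulse, read_echo):
    """Time-division loop: for every pixel, steer the mirror, emit the
    image laser light for that pixel's colour, emit a coaxial reference
    pulse through the same port, and record the echo-derived distance."""
    depth = []
    for row, line in enumerate(frame):
        depth_row = []
        for col, colour in enumerate(line):
            set_mirror(row, col)   # deflect both beams to this pixel
            emit_pixel(colour)     # R, G, B components of the pixel
            emit_pulse()           # reference pulse along the same axis
            depth_row.append(read_echo())
        depth.append(depth_row)
    return depth  # same resolution and coordinate system as the frame

# stub hardware: a flat object 1.5 m away
frame = [[(255, 0, 0), (0, 255, 0)], [(0, 0, 255), (255, 255, 255)]]
depth = project_and_measure(frame, lambda r, c: None, lambda c: None,
                            lambda: None, lambda: 1.5)
print(depth)  # [[1.5, 1.5], [1.5, 1.5]]
```

Because the distance is read at the same pixel the image laser light just formed, the depth map shares the projection image's resolution and coordinate system with no conversion step.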
- the image data outputting unit 10 acquires information F relating to a three-dimensional shape of the physical object 6 from the shape measurement unit 60 and performs adjustment of the image in such a manner as to correspond to the information F.
- the image data outputting unit 10 inputs data I of the image adjusted as occasion demands, to the image laser light source 50 .
- the time ⁇ t elapsed from emission of the reference laser light to detection of reflected light is given by the following expression.
- where F is the frame rate of the projection image, the number P of times by which a laser pulse can be emitted per one frame is given by P=1/(F·Δt).
- for example, if the resolution of the projection image is 1280×720 pixels, the number of times of measurement necessary to measure the distance with practical accuracy is approximately 500, and an ideal condition that reflected light of a pulse of laser light can be detected by all light reception elements is satisfied, then it is sufficient if approximately 100 light reception elements are disposed on the light reception surface of the shape measurement unit 60 .
- the shape measurement unit 60 calculates a final distance value in which the influence of detection errors is reduced by, for example, averaging the distance values obtained over the multiple detections.
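The pulse-budget arithmetic above can be sketched as follows. The 1 m maximum range and 30 fps frame rate are assumed values, chosen only so that the element count comes out near the "approximately 100" figure; the patent does not state them.

```python
C = 299_792_458.0  # speed of light in m/s

def pulse_budget(max_range_m, frame_rate_hz):
    """Pulses that fit into one frame if each must wait for its own
    echo from max_range_m before the next pulse is emitted."""
    round_trip = 2 * max_range_m / C  # Δt = 2d/c for the farthest echo
    return int((1 / frame_rate_hz) / round_trip)

def required_elements(pixels, measurements_per_pixel, pulses_per_frame):
    """Light reception elements needed so every pixel gathers
    measurements_per_pixel echoes within one frame, in the ideal case
    where every element detects every pulse."""
    return -(-pixels * measurements_per_pixel // pulses_per_frame)  # ceiling

p = pulse_budget(1.0, 30)  # roughly 5 million pulses per frame
print(required_elements(1280 * 720, 500, p))  # 93, i.e. roughly 100
```

With 1280×720 pixels and about 500 measurements per pixel, the pulse budget divides the workload across the array, which is why on the order of 100 elements suffice.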
- FIG. 6 depicts a configuration of an internal circuit of the image data outputting unit 10 .
- the image data outputting unit 10 includes a central processing unit (CPU) 23 , a graphics processing unit (GPU) 24 , and a main memory 26 .
- the components of the image data outputting unit 10 are connected to each other by a bus 30 .
- An input/output interface 28 is further connected to the bus 30 .
- a communication unit 32 for establishing communication with a server or the like, a storage unit 34 such as a hard disk drive or a nonvolatile memory, an outputting unit 36 that outputs data or a control signal to the image laser light source 50 and the mirror 52 , an inputting unit 38 that inputs data from the shape measurement unit 60 , and a recording medium driving unit 40 that drives a removable recording medium such as a magnetic disk, an optical disk, or a semiconductor memory are connected.
- the communication unit 32 includes a peripheral equipment interface of a universal serial bus (USB) or Institute of Electrical and Electronics Engineers (IEEE) 1394 or a network interface with a wired or wireless local area network (LAN).
- the CPU 23 controls the entire image data outputting unit 10 by executing an operating system stored in the storage unit 34 .
- the CPU 23 further executes various programs read out from the removable recording medium and loaded into the main memory 26 or downloaded via the communication unit 32 .
- the GPU 24 has a function of a geometry engine and a function of a rendering processor, performs a drawing process according to a drawing command from the CPU 23 , and outputs a result of the drawing process to the outputting unit 36 .
- the main memory 26 includes a random access memory (RAM) and stores programs and data necessary for processing.
- FIG. 7 depicts a configuration of functional blocks of the image data outputting unit 10 and the shape measurement unit 60 .
- the functional blocks depicted in FIG. 7 can be implemented, in hardware, by various sensors, a microprocessor, and so forth, in addition to the CPU 23 , the GPU 24 , and the main memory 26 depicted in FIG. 6 .
- the functional blocks can be implemented, in software, by a program that is loaded from a recording medium into a memory and provides various functions such as an information processing function, an image drawing function, a data inputting/outputting function, and a communication function. Accordingly, those skilled in the art will recognize that the functional blocks described above can be realized by hardware only, by software only, or by a combination of the two, and the implementation is not restricted to any of these.
- the image data outputting unit 10 and the shape measurement unit 60 may actually be a single device or may be implemented as three or more devices. Moreover, some of the functions of the image data outputting unit 10 depicted in FIG. 7 may be provided in the shape measurement unit 60 , or some of the functions of the shape measurement unit 60 depicted in FIG. 7 may be provided in the image data outputting unit 10 .
- the shape measurement unit 60 includes a synchronizing signal outputting unit 72 that outputs a synchronizing signal to the reference laser light source 56 , a detection unit 70 that detects reflection light of reference laser light, and a shape information acquisition unit 74 that acquires information relating to a three-dimensional shape of a physical object according to a result of the detection.
- the synchronizing signal outputting unit 72 generates a synchronizing signal that acts as a trigger to generation of a pulse of the reference laser light as described hereinabove and provides the synchronizing signal to the reference laser light source 56 .
- the detection unit 70 includes an array of light reception elements.
- the detection unit 70 detects the light reflected from a physical object when a pulse of reference laser light, generated by the reference laser light source 56 with the synchronizing signal as a trigger, is emitted to the physical object, and notifies the shape information acquisition unit 74 of the detection timing.
- the shape information acquisition unit 74 calculates an emission timing of a pulse of the reference laser light on the basis of the timing of the synchronizing signal generated by the synchronizing signal outputting unit 72 . Then, the shape information acquisition unit 74 calculates the distance to the reflection position on the physical object according to the expression given hereinabove, on the basis of the time difference between the emission timing and the detection timing of the reflected light.
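A minimal sketch of this timing calculation, assuming a known, calibrated delay between the synchronizing signal and the actual pulse emission (the delay constant is an assumption for illustration, not a value from the description):

```python
C = 3.0e8  # speed of light [m/s]

def distance_from_timestamps(sync_time_s: float,
                             emission_delay_s: float,
                             detection_time_s: float) -> float:
    """Distance to the reflection point, computed from the timestamp of
    the synchronizing signal. emission_delay_s is the assumed fixed delay
    between the synchronizing signal and pulse emission."""
    emission_time_s = sync_time_s + emission_delay_s
    return C * (detection_time_s - emission_time_s) / 2.0
```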
- the reference laser light 59 generated by the reference laser light source 56 is controlled in direction by rocking motion of the mirror 52 such that it is used for two-dimensional scanning of the surface of the physical object 6 together with the image laser light 57 .
- the shape information acquisition unit 74 associates the emission direction of the reference laser light, or the corresponding pixel position on the projection image, with the distance value calculated from the detection of the reflected light, to construct three-dimensional shape information relating to the physical object.
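The association of pixel positions with distance values can be pictured as a depth map. The function below is a hypothetical sketch; the tuple format of the samples is an assumption:

```python
import math

def build_shape_info(samples, width, height):
    """Associate each projection-image pixel position with the distance
    measured there; pixels with no detection remain NaN.

    samples: iterable of (x, y, distance_m) tuples, one per detected
    reflection of the reference laser light."""
    depth = [[math.nan] * width for _ in range(height)]
    for x, y, d in samples:
        depth[y][x] = d
    return depth
```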
- the image data outputting unit 10 includes a shape information acquisition unit 76 , an image generation unit 78 , an image adjustment unit 80 , an outputting unit 82 , and a scanning controller 84 .
- the shape information acquisition unit 76 acquires information relating to a three-dimensional shape of a physical object.
- the image generation unit 78 generates an image to be projected on the physical object.
- the image adjustment unit 80 adjusts the image to be projected, on the basis of the three-dimensional shape information relating to the physical object, and the outputting unit 82 outputs data of the image to be projected.
- the scanning controller 84 controls scanning of the surface of the physical object with the laser light.
- the shape information acquisition unit 76 acquires information relating to a three-dimensional shape of a physical object from the shape measurement unit 60 .
- the shape information acquisition unit 76 may sequentially acquire the information every time a distance value is measured by the shape measurement unit 60, or may acquire shape information in units of a predetermined number of frames of the projection image. Part of the acquisition process of three-dimensional shape information by the shape measurement unit 60 may be handled by the shape information acquisition unit 76.
- the image generation unit 78 generates data of a still picture or a moving picture to be projected on a physical object.
- the image generation unit 78 may acquire image data generated in advance, from an external apparatus such as a server or from an internal memory device.
- the image generation unit 78 itself may draw an image by using a program or model data stored in advance in the internal memory device or the like.
- the image generation unit 78 may acquire a situation of the real space from an undepicted inputting device such as a camera, a sensor, or a controller as needed and reflect the situation on an image to be drawn.
- the image generation unit 78 may acquire shape information relating to a physical object from the shape information acquisition unit 76 and change the contents themselves of the image to be projected, on the basis of the acquired shape information.
- the image adjustment unit 80 sequentially acquires information relating to a three-dimensional shape of a physical object from the shape information acquisition unit 76 , and performs appropriate adjustment for an image generated by the image generation unit 78 , according to the acquired information.
- the adjustment may be enlargement, reduction, deformation, rotation, shading, or the like of an image.
- the image adjustment unit 80 adjusts a frame to be projected next, on the basis of three-dimensional shape information obtained during a projection period of the immediately preceding frame.
- the temporal relation between acquisition of three-dimensional shape information and image adjustment is not restrictive.
- the outputting unit 82 outputs data of an image, for which an adjustment process has been performed as occasion demands, to the image laser light source 50 .
- the scanning controller 84 controls the angle of the mirror 52 such that the image laser light representative of each pixel arrives at an appropriate position on a surface of a physical object.
- the scanning controller 84 in the present embodiment further controls the mirror 52 such that the scanning speed with the laser light changes depending upon the position according to the contents of an image generated by the image generation unit 78 .
- the scanning controller 84 detects a region in which the accuracy required of the shape information is higher than a standard level, on the basis of characteristics of the image and so forth, and determines the region as a focused measurement region. For example, the scanning controller 84 determines, as a focused measurement region, a region of an image in which many textures exist, or a region containing an object exhibiting large movement. Then, the scanning controller 84 makes the scanning speed with the laser light lower when the image for that region is projected than for the other regions, thereby increasing the number of times the reference laser light is emitted and detected.
- as a result, the final distance value calculated by averaging the distance values over the number of detections becomes less likely to include an error.
- the accuracy of three-dimensional shape information obtained in the region becomes higher, and consequently, the image in the region can be projected with a higher degree of accuracy.
- flexible, region-by-region control can be performed in this manner.
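A hypothetical sketch of this decision logic; the thresholds are illustrative assumptions, not values from the description:

```python
def scan_speed_factor(texture_density: float, motion: float,
                      texture_threshold: float = 0.5,
                      motion_threshold: float = 0.5) -> float:
    """Scanning-speed multiplier for a region (1.0 = standard speed).

    A region rich in texture or containing a fast-moving object is treated
    as a focused measurement region: scanning it at 1/2 or 1/3 of the
    standard speed doubles or triples the number of reference-pulse
    detections obtained there."""
    textured = texture_density > texture_threshold
    moving = motion > motion_threshold
    if textured and moving:
        return 1.0 / 3.0  # strongest slowdown: triple the detections
    if textured or moving:
        return 0.5        # moderate slowdown: double the detections
    return 1.0            # standard region: no slowdown
```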
- FIGS. 8A and 8B each exemplify a positional relation upon emission of the image laser light and the reference laser light.
- FIGS. 8A and 8B are each a conceptual diagram depicting a cross section of the laser beams with which a physical object is irradiated.
- FIG. 8A illustrates a case in which laser beams 90a, 90b, and 90c of the three primary colors of R, G, and B are generated by the image laser light source 50.
- the arrangement of the laser beams 90a, 90b, and 90c is not restrictive.
- the center axis of a reference laser beam 92 is preferably aligned with the center axis of the green laser beam 90b.
- humans are highly sensitive to light of green wavelengths and are therefore likely to notice positional displacement of an image.
- by aligning the reference beam with the green beam, the projection accuracy in terms of visibility can be increased.
- FIG. 8B illustrates a case in which a single laser beam 94 is generated by the image laser light source 50.
- the center axis of a reference laser beam 96 is adjusted to the center axis of the laser beam 94 .
- FIG. 9 depicts another example of the configuration of the image projection system according to the present embodiment.
- elements identical to those of the image projection system 14 depicted in FIG. 5 are denoted by identical reference signs.
- an image projection system 14a depicted in FIG. 9 includes the image data outputting unit 10, the shape measurement unit 60, the mirror 52, and the reference laser light pass filter 62 that are depicted in FIG. 5.
- an image and reference laser light source 100 is provided in place of the image laser light source 50 and the reference laser light source 56 .
- the image and reference laser light source 100 is a laser module that generates a laser beam for image projection and a laser beam for reference from the same surface.
- in FIG. 9, a case in which the image and reference laser light source 100 generates four laser beams, namely, laser beams for R, G, and B and a laser beam for reference, is indicated by four arrow marks.
- the number of image laser beams is not restrictive.
- the positional relation between the image laser beams and the reference laser beam is made similar to the positional relations between them depicted in FIGS. 8 A and 8 B .
- the image and reference laser light source 100 generates image laser beams on the basis of the image data I obtained from the image data outputting unit 10 and generates a pulse of a reference laser beam in response to a synchronizing signal S from the shape measurement unit 60 .
- in other respects, the image projection system 14a may be similar to the system described hereinabove with reference to FIG. 5.
- with this configuration, the image projection system can be scaled down in comparison with the configuration depicted in FIG. 5.
- FIG. 10 is a view illustrating a mode in which the scanning controller 84 adjusts the scanning speed with the laser light according to contents of an image.
- the progress of projection of an image including 12 rows of pixels is indicated along a time axis directed rightward.
- the progress without adjustment of the scanning speed and the progress with adjustment are depicted on the upper and lower stages, respectively, and each rectangle arrayed on a stage represents the projection time period for one row.
- the frame rate set to the image is 60 fps as an example.
- the scanning controller 84 controls the mirror 52 such that the scanning speed for those rows ("Line 05" and "Line 06") is decreased, as depicted on the lower stage.
- for "Line 05" (the fifth row), the scanning speed is reduced to 1/2.
- for "Line 06" (the sixth row), the scanning speed is reduced to 1/3.
- a projection time period 102a for the fifth row is adjusted to twice the standard time period, and a projection time period 102b for the sixth row is adjusted to three times the standard time period.
- consequently, the distance is measured twice and three times as many times, respectively, and the accuracy of the shape information can be improved accordingly.
- such adjustment of the scanning speed causes, in some cases, the period of time required to project one frame to exceed 1/60 seconds; in other words, the frame rate deviates slightly from its set value.
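Using the numbers from FIG. 10 (12 rows at 60 fps, with the fifth row at 1/2 speed and the sixth at 1/3 speed), the overrun can be checked:

```python
FRAME_RATE = 60.0
ROWS = 12
standard_row_time = 1.0 / FRAME_RATE / ROWS  # seconds per row at standard speed

# ten rows at standard speed, the fifth row at 1/2 speed (2x the time),
# and the sixth row at 1/3 speed (3x the time)
frame_time = (10 * 1 + 1 * 2 + 1 * 3) * standard_row_time

print(frame_time)        # ~0.0208 s, which exceeds 1/60 s (~0.0167 s)
print(1.0 / frame_time)  # effective frame rate drops to about 48 fps
```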
- even so, this scheme facilitates control with a higher degree of freedom than is possible with a projector including light emitting elements arrayed two-dimensionally.
- the scanning controller 84 may set a focused measurement region row by row of an image as depicted in FIG. 10, or may instead set it pixel by pixel or region by region. The information on which the determination of a focused measurement region is based may be acquired by the scanning controller 84 analyzing an image generated by the image generation unit 78. Alternatively, a focused measurement region may be determined by the scanning controller 84 acquiring information regarding a texture used when the image generation unit 78 generates an image, a movement of a given object, or a position of an important object.
- information that associates a frame number and a focused measurement region in the frame with each other may be created in advance and included into image data that is read out when the image generation unit 78 generates an image.
- the scanning speed in a focused measurement region may be set in a plurality of stages according to the degree of accuracy required for the shape information, as in the example depicted in FIG. 10 in which the scanning speed is reduced to 1/2 for the fifth row and to 1/3 for the sixth row, or may otherwise be set in only one stage.
- the scanning controller 84 may naturally change the focused measurement region, or the scanning speed in the focused measurement region, over time according to the contents of the image. Moreover, the scanning controller 84 may switch whether or not the scanning speed is to be adjusted, depending on the appearance or disappearance of a focused measurement region. The scanning controller 84 may determine a focused measurement region, or the scanning speed in it, on the basis not only of the contents of an image but also of the shape of a physical object, a movement of the physical object, the distance to the physical object, the line of sight of an appreciator, or the like.
- the scanning controller 84 may determine, as a focused measurement region, a region of the image that is being projected on a portion of the surface of a physical object at which fine unevenness exists or at which the shape changes greatly. In such a case, the scanning controller 84 acquires the most recently obtained three-dimensional shape information from the shape information acquisition unit 76, or acquires information relating to a gazing point of an appreciator from an undepicted gazing point detector. Then, the scanning controller 84 determines, as a focused measurement region, a region in which the obtained parameter or parameters satisfy a condition set in advance.
- the number of parameters to be used for determination of a focused measurement region may be one or otherwise may be two or more in combination.
- FIG. 11 depicts an example of setting of a rule for the adjustment of the scanning speed with laser light by the scanning controller 84 .
- a rule setting table 140 associates a parameter 142 on which adjustment is based, a condition 144 for triggering adjustment, and a target value 146 for the scanning speed when the condition is satisfied with each other.
- the setting form and the contents of the adjustment rule in the present embodiment are not restrictive.
- FIG. 11 depicts, as an example, a setting in which, when the "speed V of object A" is the basis of adjustment and the condition "V1 ≤ V < V2" is satisfied, the region of the image in which the object A is represented is determined as a focused measurement region.
- the target value of the scanning speed with laser light in the focused measurement region is 1/2 the standard value.
- the target value of the scanning speed with laser light is 1/3 the standard value.
- in FIG. 11, it is also set that, when the "type of texture" is the basis of adjustment and a region of an image in which a texture "T1" is represented is made a focused measurement region, the scanning speed with laser light in the focused measurement region is 1/2 the standard value. It is to be noted that each of V1, V2, and T1 actually is a particular speed or a particular texture name.
- the scanning controller 84 retains such setting rules as depicted in FIG.
- the scanning controller 84 monitors the set parameters to determine whether or not they satisfy the conditions, thereby determining whether or not adjustment of the scanning speed with laser light is required, as well as determining a focused measurement region and a target value of the scanning speed.
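One hypothetical way to encode such a rule table; V1, V2, and "T1" are placeholders as in FIG. 11, and the condition attached to the 1/3 stage (V ≥ V2) is an assumption, since the description elides it:

```python
V1, V2 = 0.5, 2.0  # illustrative speed thresholds [m/s]; not from the patent

# Each rule: (parameter name, condition on its value, target speed factor).
RULES = [
    ("object_speed", lambda v: V1 <= v < V2, 1 / 2),
    ("object_speed", lambda v: v >= V2, 1 / 3),      # assumed condition
    ("texture", lambda t: t == "T1", 1 / 2),
]

def target_speed(observed: dict) -> float:
    """Return the slowest target factor among all satisfied rules, or the
    standard speed (1.0) when no rule condition is met."""
    matched = [factor for name, condition, factor in RULES
               if name in observed and condition(observed[name])]
    return min(matched, default=1.0)
```

Taking the minimum over matched rules means the region is scanned slowly enough to satisfy the most demanding rule.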
- an image is projected on a physical object by the laser light scanning method, and reference laser light for shape measurement is emitted to the physical object through an emission port common to the image laser light. Consequently, reflected light of the reference laser light can be detected at a position adjacent to the emission port for the image laser light, and three-dimensional shape information relating to the physical object, on a coordinate system based on the emission plane, can be acquired directly.
- the projection image and a two-dimensional array of distance values can both be obtained at high resolution without interfering with each other.
Description
D = c × Δt / 2
Δt = 2[m] / (3.0×10⁸[m/sec]) = 6.66[nsec]
P = (1/30)[sec] / 6.66[nsec] = 5×10⁶[dots]
If the resolution of the projection image is 1280×720 pixels, then the number p of times laser light can be emitted per pixel is given by the following expression.
p = 5×10⁶[dots] / (1280×720)[pixel] = 5.4[dots/pixel]
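These figures can be verified numerically (the 1 m distance to the physical object is the example the derivation assumes):

```python
c = 3.0e8  # speed of light [m/s]
D = 1.0    # distance to the physical object [m] (the derivation's example)

delta_t = 2 * D / c         # round-trip time over 2 m: ~6.66 ns
P = (1.0 / 30) / delta_t    # pulses available per 1/30-second frame: ~5e6
p = P / (1280 * 720)        # pulses available per pixel: ~5.4

print(delta_t, P, p)
```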
Claims (12)
Applications Claiming Priority (2)
| Application Number | Priority Date | Filing Date | Title |
|---|---|---|---|
| JP2022-091632 | 2022-06-06 | ||
| JP2022091632A JP2023178760A (en) | 2022-06-06 | 2022-06-06 | Image projection system and image projection method |
Publications (2)
| Publication Number | Publication Date |
|---|---|
| US20230396746A1 (en) | 2023-12-07 |
| US12250500B2 (en) | 2025-03-11 |
Family
ID=88976219
Family Applications (1)
| Application Number | Title | Priority Date | Filing Date |
|---|---|---|---|
| US18/326,006 (US12250500B2, Active, expires 2043-07-11) | Image projection system and image projection method | 2022-06-06 | 2023-05-31 |
Country Status (2)
| Country | Link |
|---|---|
| US (1) | US12250500B2 (en) |
| JP (1) | JP2023178760A (en) |
Families Citing this family (1)
| Publication number | Priority date | Publication date | Assignee | Title |
|---|---|---|---|---|
| CN118172716B (en) * | 2024-05-11 | 2024-10-18 | 中科微至科技股份有限公司 | A parcel status detection method for cross-belt cart based on RGB-D image |
Citations (3)
| Publication number | Priority date | Publication date | Assignee | Title |
|---|---|---|---|---|
| US20020133144A1 (en) * | 2001-03-19 | 2002-09-19 | Ball Semiconductor, Inc. | Laser irradiation mapping system |
| US9832436B1 (en) * | 2016-05-30 | 2017-11-28 | Panasonic Intellectual Property Management Co., Ltd. | Image projection system and image projection method |
| US20230221110A1 (en) * | 2016-03-09 | 2023-07-13 | Nikon Corporation | Detection device, detection system, detection method, and storage medium |
- 2022-06-06 JP JP2022091632A patent/JP2023178760A/en active Pending
- 2023-05-31 US US18/326,006 patent/US12250500B2/en active Active
Patent Citations (3)
| Publication number | Priority date | Publication date | Assignee | Title |
|---|---|---|---|---|
| US20020133144A1 (en) * | 2001-03-19 | 2002-09-19 | Ball Semiconductor, Inc. | Laser irradiation mapping system |
| US20230221110A1 (en) * | 2016-03-09 | 2023-07-13 | Nikon Corporation | Detection device, detection system, detection method, and storage medium |
| US9832436B1 (en) * | 2016-05-30 | 2017-11-28 | Panasonic Intellectual Property Management Co., Ltd. | Image projection system and image projection method |
Non-Patent Citations (2)
| Title |
|---|
| Nick Staff, "Sony's touchscreen projector technology feels like the future of interactivity", [online], Mar. 12, 2017, The Verge, [Searched Apr. 28, 2022], Internet, <URL:https://www.theverge.com/2017/3/12/14899804/sony-touchscreen-projector-display-prototype-sxsw-2017>, p. 1-7. |
| Uwe Lippmann, et al., "In Good Light: A New High-Speed Projector with Visible and Infrared Capabilities", [online], Dec. 13, 2021, Tokyo Institute of Technology, [Searched Apr. 28, 2022] , Internet, <URL:https://www.titech.ac.jp/english/news/2021/062614>, p. 1-3. |
Also Published As
| Publication number | Publication date |
|---|---|
| JP2023178760A (en) | 2023-12-18 |
| US20230396746A1 (en) | 2023-12-07 |
Similar Documents
| Publication | Publication Date | Title |
|---|---|---|
| US12126916B2 (en) | Camera array for a mediated-reality system | |
| EP3497671B1 (en) | Devices and methods for adjustable resolution depth mapping | |
| US9846968B2 (en) | Holographic bird's eye view camera | |
| JP6981680B2 (en) | Equipment and Methods for Providing Depth Mapping Along with Scanning of Laser Image Projections | |
| US10122976B2 (en) | Projection device for controlling a position of an image projected on a projection surface | |
| US20140362188A1 (en) | Image processing device, image processing system, and image processing method | |
| US11881001B2 (en) | Calibration apparatus, chart for calibration, and calibration method | |
| JP2016138878A (en) | System and method for generating three-dimensional image using lidar and video measurements | |
| JP6862751B2 (en) | Distance measuring device, distance measuring method and program | |
| KR101616176B1 (en) | Full body high speed three dimesional scanning apparatus | |
| EP4016250B1 (en) | Virtual/augmented reality system having dynamic region resolution | |
| TW201233141A (en) | Scanning projectors and image capture modules for 3D mapping | |
| JP2019512798A (en) | Wide-baseline stereo for short latency rendering | |
| WO2022050279A1 (en) | Three-dimensional measurement device | |
| US12250500B2 (en) | Image projection system and image projection method | |
| US20190082089A1 (en) | Electronic apparatus, motion sensor, position change detection program, and position change detection method | |
| JP7352239B2 (en) | Projection system, projection control device, projection control program, and projection system control method | |
| JP4221808B2 (en) | Three-dimensional data input method and apparatus | |
| US20230306676A1 (en) | Image generation device and image generation method | |
| US12468147B2 (en) | Multi-beam laser beam scanner in a picture generation unit | |
| JP2000292121A (en) | Three-dimensional measurement method and three- dimensional input device | |
| JP2018028579A (en) | Display device and display method | |
| JP7775610B2 (en) | Imaging device, imaging method, and program | |
| US20250216553A1 (en) | Hybrid direct and indirect time-of-flight imaging | |
| JP2023047715A (en) | Imaging system, imaging method and program |
Legal Events
| Date | Code | Title | Description |
|---|---|---|---|
| | FEPP | Fee payment procedure | ENTITY STATUS SET TO UNDISCOUNTED (ORIGINAL EVENT CODE: BIG.); ENTITY STATUS OF PATENT OWNER: LARGE ENTITY |
| | AS | Assignment | Owner name: SONY INTERACTIVE ENTERTAINMENT INC., JAPAN. Free format text: ASSIGNMENT OF ASSIGNORS INTEREST;ASSIGNOR:KOIZUMI, MAKOTO;REEL/FRAME:063818/0107. Effective date: 20230530 |
| | STPP | Information on status: patent application and granting procedure in general | DOCKETED NEW CASE - READY FOR EXAMINATION |
| | STPP | Information on status: patent application and granting procedure in general | NON FINAL ACTION MAILED |
| | STPP | Information on status: patent application and granting procedure in general | RESPONSE TO NON-FINAL OFFICE ACTION ENTERED AND FORWARDED TO EXAMINER |
| | STPP | Information on status: patent application and granting procedure in general | NOTICE OF ALLOWANCE MAILED -- APPLICATION RECEIVED IN OFFICE OF PUBLICATIONS |
| | STPP | Information on status: patent application and granting procedure in general | PUBLICATIONS -- ISSUE FEE PAYMENT VERIFIED |
| | STCF | Information on status: patent grant | PATENTED CASE |