EP1602894A1 - Projector comprising a device for measuring an angle of inclination - Google Patents

Projector comprising a device for measuring an angle of inclination

Info

Publication number
EP1602894A1
Authority
EP
European Patent Office
Prior art keywords
inclination
angle
image
projection
positional information
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Withdrawn
Application number
EP05011722A
Other languages
German (de)
English (en)
Inventor
Kazuo Mochizuki
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Sharp NEC Display Solutions Ltd
Original Assignee
NEC Viewtechnology Ltd
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by NEC Viewtechnology Ltd filed Critical NEC Viewtechnology Ltd
Publication of EP1602894A1

Classifications

    • G - PHYSICS
    • G01 - MEASURING; TESTING
    • G01B - MEASURING LENGTH, THICKNESS OR SIMILAR LINEAR DIMENSIONS; MEASURING ANGLES; MEASURING AREAS; MEASURING IRREGULARITIES OF SURFACES OR CONTOURS
    • G01B11/00 - Measuring arrangements characterised by the use of optical techniques
    • G01B11/26 - Measuring arrangements characterised by the use of optical techniques for measuring angles or tapers; for testing the alignment of axes
    • G - PHYSICS
    • G03 - PHOTOGRAPHY; CINEMATOGRAPHY; ANALOGOUS TECHNIQUES USING WAVES OTHER THAN OPTICAL WAVES; ELECTROGRAPHY; HOLOGRAPHY
    • G03B - APPARATUS OR ARRANGEMENTS FOR TAKING PHOTOGRAPHS OR FOR PROJECTING OR VIEWING THEM; APPARATUS OR ARRANGEMENTS EMPLOYING ANALOGOUS TECHNIQUES USING WAVES OTHER THAN OPTICAL WAVES; ACCESSORIES THEREFOR
    • G03B21/00 - Projectors or projection-type viewers; Accessories therefor
    • G03B21/132 - Overhead projectors, i.e. capable of projecting hand-writing or drawing during action

Definitions

  • the present invention relates to a projector having a device for measuring an angle of inclination between the projection optical axis of a projection device and a projection surface, and a method for correcting a trapezoidal distortion.
  • Japanese Patent Laid-open Publication No. 281597/97 discloses a process of adjusting the angle of a liquid crystal display unit.
  • the process has the step of detecting the installation angle of a liquid crystal projector and the step of detecting the distance between the liquid crystal projector and the projection surface onto which an image is projected. According to the disclosed process, the angle of the liquid crystal display unit needs to be mechanically adjusted.
  • the distance measuring device is provided independently of the projector.
  • the projected images are not utilized. Further, the measured distance and the actual distance to the screen do not always coincide, if the screen is surrounded by a frame projecting forwardly toward the projector, or the screen is set back away from the projector.
  • JP-A-69211/2001 discloses a process of correcting a distortion by projecting a beam spot onto a curved screen from an angle-controllable laser pointer.
  • a spot image is generated and projected onto a screen by the projector, and the beam spot and the projected spot image are captured by a camera for measuring their positions.
  • the spot image is moved until it overlaps with the beam spot, then the coordinate of the pixel of the spot image in the frame memory is replaced with the coordinate of the beam spot and stored in a coordinate transform parameter memory.
  • the inclination of the projector in the vertical direction, which easily causes a distortion, is detected by a gravity sensor when a screen is set vertically.
  • the distortion is corrected depending on the detected inclination. See the specification etc. of JP-A-5278/2003 for details.
  • the projector cannot correct the distortion correctly, if the screen is not set vertically or the screen is inclined in the horizontal plane with respect to the projection optical axis of the projector.
  • an image can be projected without distortion from the projector onto the screen by means of conventional techniques such as the conversion of the coordinates of a frame memory of the projector, etc.
  • a projector includes a projection device, a cross line positional information acquiring device for acquiring positional information of a cross line between a projection surface of the projection device and a plane crossing the projection surface, an inclination detecting device for calculating an angle of inclination between a projection optical axis of the projection device and the projection surface based on the positional information acquired by the cross line positional information acquiring device, and an image distortion correcting device for correcting a trapezoidal distortion of an input image supplied to the projection device, based on the angle of inclination calculated by the inclination detecting device.
  • a projector includes a projection device, a cross line positional information acquiring device for acquiring positional information of a cross line between a projection surface of the projection device and a plane crossing the projection surface, a vertical angle-of-inclination acquiring device for detecting an angle of inclination of a projection optical axis of the projection device in a vertical plane, an inclination detecting device for calculating an angle of inclination between the projection optical axis and the projection surface in a horizontal plane based on the positional information acquired by the cross line positional information acquiring device, and an image distortion correcting device for correcting a trapezoidal distortion of an input image supplied to the projection device, based on the angle of inclination in the horizontal plane calculated by the inclination detecting device and based on the angle of inclination in the vertical plane calculated by the vertical angle-of-inclination acquiring device.
  • a method of correcting a trapezoidal distortion of a projector includes the steps of acquiring positional information of a cross line between a projection surface of a projection device and a plane crossing the projection surface, calculating an angle of inclination between a projection optical axis of the projection device and the projection surface based on the acquired positional information, and correcting a trapezoidal distortion of an input image supplied to the projection device, based on the calculated angle of inclination.
  • the present invention is advantageous in that a trapezoidal distortion of an image can be corrected with a simple mechanism, because horizontal and vertical angles of inclination of the projection surface with respect to the projection optical axis of the projector can be calculated, based on the positional information of a cross line between a basically vertical wall surface serving as the projection surface and a plane crossing the wall surface.
  • if the projector further includes a vertical inclination sensor, then only the horizontal angle of inclination needs to be acquired based on the positional information of the cross line. It is sufficient, in this case, that only a cross line between a front wall and a ceiling appears in the imaging range of the image sensor, which is highly likely to occur. Therefore, the present invention is applicable to a wider range of uses, allowing for easy correction of a trapezoidal distortion due to horizontal and vertical inclinations of the projector.
  • a projector having a trapezoidal distortion correcting device will be described below with reference to Figs. 1 and 2A through 2C.
  • projector 10 has projection device 20 having projection lens 21 and display unit 22, image controller 23 for controlling an image generated by display unit 22, trapezoidal distortion correcting device 30, and central processing unit 60 for controlling the entire operation of projector 10.
  • Trapezoidal distortion correcting device 30 calculates an angle of inclination between a front wall serving as projection surface 70 and projector 10, and corrects a distortion of an image that is inputted to trapezoidal distortion correcting device 30.
  • Image controller 23 controls an image of display unit 22, based on an output signal from trapezoidal distortion correcting device 30, thereby correcting the distortion on the image displayed on projection surface 70.
  • the image distortion is automatically corrected according to a predetermined process by central processing unit 60.
  • trapezoidal distortion correcting device 30 has image sensor 50, image capturer 31, inclination detector 32, image input unit 41, and image distortion correcting circuit 33.
  • Image sensor 50 has imaging lens 51 and imaging device 53.
  • Imaging lens 51 is disposed on a front surface of projector 10 and has an optical axis in a predetermined direction and a predetermined imaging range.
  • Imaging device 53 is disposed perpendicularly to the optical axis of imaging lens 51. Imaging device 53 detects light that passes through imaging lens 51 and outputs desired positional information of an image represented by the detected light.
  • Imaging device 53 has an imaging surface covering the imaging range of imaging lens 51.
  • Imaging device 53 has a two-dimensional imaging device, such as an image pickup tube or a CCD (Charge-Coupled Device), for outputting an image as a collection of pixels.
  • Image capturer 31 captures an image from imaging device 53 as image information.
  • Inclination detector 32 analyzes positional information of the captured image and calculates the angle of inclination between the front wall and projector 10.
  • Image distortion correcting circuit 33 corrects a trapezoidal distortion of the image that is supplied to image input unit 41, based on the angle of inclination calculated by inclination detector 32.
  • Image input unit 41 is supplied with video information that represents an image that is projected by projection device 20, and supplies an output signal to image controller 23.
  • Projector 10 utilizes the positional information of a horizontal cross line between the front wall surface serving as projection surface 70 and a ceiling or a floor which crosses the front wall surface, and/or a vertical cross line between the front wall surface serving as projection surface 70 and a side wall surface which crosses the front wall surface. Specifically, a horizontal and/or vertical angle of inclination between projection optical axis 27 of projection device 20 of projector 10 and projection surface 70 is calculated, based on the positional information on the cross line acquired by imaging device 53 of image sensor 50.
  • the positional information of a cross line may be acquired by various processes. There are two processes available for acquiring the positional information of a cross line in an image by imaging device 53. According to the first process, the positional information of a cross line is acquired as a luminance change line in a captured image which represents the entire reflected light, emanating from the reflecting surfaces including projection surface 70 in front of projector 10, through imaging lens 51 to imaging device 53. In this process, the cross line needs to be included in the imaging range of image sensor 50. Since the imaging means is a digital camera, the cross line is usually recognized as a collection of luminance change spots, and the positional information can be obtained by analyzing the pixels if there is a certain amount of luminance change.
  • the cross line in an image generated by imaging device 53 can usually be detected as a change in the luminance of the reflected light. Filtering or other appropriate processing to image data may be used in order to acquire a clear boundary line.
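The first process can be sketched as a scan for luminance changes followed by a straight-line fit. The image representation, the threshold value, and both helper functions below are illustrative assumptions, not the patent's implementation:

```python
# Sketch: locate the wall/ceiling cross line as a luminance-change line in a
# captured grayscale image, then fit a straight line through the change points.

def find_luminance_edges(image, threshold=40):
    """For each column, return the first row where the luminance changes
    by more than `threshold` between adjacent rows."""
    edges = []
    for x in range(len(image[0])):
        for y in range(1, len(image)):
            if abs(image[y][x] - image[y - 1][x]) > threshold:
                edges.append((x, y))
                break
    return edges

def fit_line(points):
    """Least-squares fit y = a*x + b through the detected edge points."""
    n = len(points)
    sx = sum(p[0] for p in points)
    sy = sum(p[1] for p in points)
    sxx = sum(p[0] ** 2 for p in points)
    sxy = sum(p[0] * p[1] for p in points)
    a = (n * sxy - sx * sy) / (n * sxx - sx ** 2)
    b = (sy - a * sx) / n
    return a, b

# Synthetic 6x6 image: bright "ceiling" rows above a darker "wall", with the
# boundary one row lower on the right (an inclined cross line).
img = [[200] * 6,
       [200, 200, 200, 200, 200, 200],
       [80, 80, 80, 200, 200, 200],
       [80, 80, 80, 80, 80, 80],
       [80] * 6,
       [80] * 6]
pts = find_luminance_edges(img)
a, b = fit_line(pts)   # a > 0: the boundary slopes down to the right
```

Filtering the image first, as the text suggests, would simply make the per-column scan more reliable.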
  • the second process is applicable if the projection range of projection device 20 can be expanded sufficiently to cover a ceiling, a floor, or side walls.
  • two or more test patterns each consisting of a vertical or a horizontal straight line are projected from projection device 20 onto front surfaces including projection surface 70, and bent points of the test patterns appearing on the captured image of reflected light are acquired.
  • a cross line between a flat surface serving as the projection surface and a surface crossing the flat surface is calculated as a straight line joining the bent points.
  • each test pattern image consists of two straight lines which are joined with each other at a point positioned on the cross line between the ceiling and the wall surface.
  • Each set of the two crossing straight lines is determined by straight line equations using the coordinates of two detection points on each of the two straight lines.
  • the bent point is determined as the cross point between the two determined straight lines, and the cross line can be determined from the coordinates of the two bent points thus obtained, according to a straight line equation.
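The second process reduces to elementary line geometry: form a line equation from two sample points on each segment, intersect the two lines of each test pattern to get its bent point, then join the two bent points. The sample coordinates below are purely illustrative:

```python
# Sketch: each projected test pattern appears as two straight segments that
# bend at the wall/ceiling boundary.  Two sample points per segment give line
# equations; their intersection is the bent point; two bent points give the
# cross line.

def line_through(p, q):
    """Return (A, B, C) for the line A*x + B*y = C through points p and q."""
    a = q[1] - p[1]
    b = p[0] - q[0]
    return a, b, a * p[0] + b * p[1]

def intersect(l1, l2):
    """Intersection point of two lines in (A, B, C) form."""
    a1, b1, c1 = l1
    a2, b2, c2 = l2
    det = a1 * b2 - a2 * b1
    if det == 0:
        raise ValueError("parallel lines")
    return ((c1 * b2 - c2 * b1) / det, (a1 * c2 - a2 * c1) / det)

# First test pattern: "wall" segment through (0,0),(2,2) and "ceiling"
# segment through (2,3),(3,5); they meet at the bent point.
bend1 = intersect(line_through((0, 0), (2, 2)), line_through((2, 3), (3, 5)))
# Second test pattern, shifted to the right.
bend2 = intersect(line_through((4, 0), (6, 2)), line_through((6, 3), (7, 5)))
# The cross line joins the two bent points.
cross_line = line_through(bend1, bend2)
```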
  • a process of calculating an angle of inclination between the projection optical axis and the projection surface in the trapezoidal distortion correcting device of the projector according to the first embodiment of the present invention will be described below. It is assumed that projection optical axis 27 of projection lens 21 and the optical axis of imaging lens 51 lie parallel to each other. If projection optical axis 27 of projection lens 21 and the optical axis of imaging lens 51 do not lie parallel to each other, then an angle of inclination between projection optical axis 27 of projection lens 21 and projection surface 70 can be calculated, based on the relationship between projection optical axis 27 of projection lens 21 and the optical axis of imaging lens 51 which is known.
  • Figs. 3A and 3B are horizontal and vertical cross-sectional views, respectively, showing the projected image range and the image sensor imaging range when a projector is set such that the imaging range of the image sensor includes walls, a ceiling, and a floor adjacent to the front wall, as well as the front wall.
  • Fig. 4 is a view showing an image captured by the imaging device of the image sensor when a projector is set as shown in Figs. 3A and 3B.
  • Projected image range 71 is a range in which an image projected from projection lens 21 is displayed.
  • image sensor imaging range 72 is a range in which an image is captured by image sensor 50.
  • Projector 10 is usually set so as to project an image onto front wall 61 such that the image is displayed in the substantially central area of front wall 61 in the horizontal direction, and is designed to project an image with a slightly upward angle in the vertical direction. Therefore, the projected image is directed slightly upward with respect to the horizontal line of projector 10 in the front-to-back direction.
  • the imaging range of image sensor 50 is wider than the projected image range of projector 10. If projector 10 is set as shown in Figs. 3A, 3B, and 4, then front wall 61, right side wall 62, left side wall 63, ceiling 64, and floor 65 are included in the imaging range of image sensor 50.
  • Figs. 5A and 5B are horizontal and vertical cross-sectional views, respectively, showing a projected image range and an image sensor imaging range, when a projector is set such that an image is projected along a projection optical axis of the projector that is inclined with respect to a front wall in the horizontal plane.
  • Fig. 6 is a view showing an image captured by the imaging device of the image sensor when the projector is set as shown in Figs. 5A and 5B.
  • imaging device 53 of image sensor 50 captures an image as shown in Fig. 6.
  • cross lines between front wall 61 and ceiling 64 and between front wall 61 and floor 65 are captured as inclined cross lines, unlike those in the captured image shown in Fig. 4.
  • Inclination detector 32 of trapezoidal distortion correcting device 30 detects these cross lines from the image that is generated by imaging device 53 of image sensor 50 and captured by image capturer 31, according to the process described above, generates parameters for correcting an image distortion, and outputs the generated parameters to image distortion correcting circuit 33.
  • Fig. 7 is a view showing a cross line between the front wall and the ceiling that is detected from the captured image shown in Fig. 4.
  • Figs. 8A and 8B are views showing a cross line between the front wall and the ceiling that is detected from the captured image shown in Fig. 6.
  • Fig. 8A shows the captured image
  • Fig. 8B shows horizontal and vertical reference lines to determine the positional information of the cross line in the captured image.
  • the horizontal and vertical reference lines are provided as hypothetical lines and defined with respect to the origin that is established at the center of the captured image.
  • the cross line between the front wall and the ceiling that is detected is shown as a bold line.
  • the angle of inclination between front wall 61 and the main body of projector 10 can be determined by calculating the angle of inclination, using the positional information recognized by image sensor 50, of cross line 66b between image 61b of the front wall and image 64b of the ceiling in the captured image in Figs. 8A and 8B.
  • front wall 61 extends vertically and ceiling 64 extends horizontally, front wall 61 and ceiling 64 cross perpendicularly to each other, and the cross line which is generated by the connecting edges of front wall 61 and ceiling 64 extends perpendicularly to the vertical direction. If projector 10 is inclined only in a horizontal plane, then cross line 66b is detected from the image that is captured by image sensor 50 according to the process described above, as shown in Fig. 8A.
  • left vertical reference line V1 and right vertical reference line V2 are provided at given intervals on the left and right sides of center C of the captured image, respectively.
  • cross line 66b crosses left vertical reference line V1 and right vertical reference line V2 at cross points a0, a1, respectively, and central vertical reference line V0 that passes center C crosses cross line 66b at a cross point through which first horizontal reference line H1 passes.
  • the positional information of cross points a0, a1 can be represented by coordinates x, y in a two-dimensional coordinate system that has center C as its origin.
  • the section of cross line 66b between cross points a0, a1 is shown as a bold line.
  • cross line 67b between image 61b of the front wall and image 62b of the right side wall is captured as a vertical line in the image by imaging device 53 of image sensor 50.
  • the line segment between cross points a0, a1 of cross line 66b does not extend horizontally in the image that is captured by image sensor 50, though the actual cross line between front wall 61 and ceiling 64 extends horizontally.
  • Figs. 9A and 9B are plan and side elevational views, respectively, showing the relationship between the actual cross line between the front wall and the ceiling shown in Figs. 5A and 5B, and the image of the cross line in the image that is captured by the imaging device of the image sensor.
  • Broken line V in Figs. 9A and 9B represents hypothetical plane V for explaining the imaging surface of imaging device 53 of image sensor 50.
  • Hypothetical plane V extends perpendicularly to optical axis 55 that passes center C of the imaging surface of imaging device 53 and center 52 of imaging lens 51.
  • Hypothetical plane V is displayed at a reduced scale on the imaging surface of imaging device 53 which extends parallel to hypothetical plane V.
  • line segment a0 - a1 of cross line 66b is inclined to hypothetical plane V.
  • actual point a1 is captured as point a1' on hypothetical plane V, as shown in Fig. 9A.
  • the angle of inclination θ of front wall 61 with respect to a plane perpendicular to optical axis 55 of image sensor 50, which is the quantity ultimately to be determined, can be determined by the following equations.
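The patent states its equations with reference to the figures, which are not reproduced here. As an illustration of the kind of geometry involved, a simple pinhole-camera sketch can recover θ from the two cross points: a point on the ceiling edge at depth z and height h projects to image coordinates (f·X/z, f·h/z), so each cross point yields its depth, and the wall's horizontal inclination follows from how depth changes across the image. The focal length f, edge height h, and the model itself are assumptions for illustration:

```python
import math

def wall_inclination(a0, a1, f, h):
    """Angle (radians) between the image plane and the wall, in the horizontal
    plane, from two image points a0, a1 on the wall/ceiling edge.
    Pinhole model: image y = f*h/z  =>  z = f*h/y."""
    z0 = f * h / a0[1]           # depth of the edge point seen at a0
    z1 = f * h / a1[1]           # depth of the edge point seen at a1
    x0 = a0[0] * z0 / f          # real horizontal position of that point
    x1 = a1[0] * z1 / f
    return math.atan2(z1 - z0, x1 - x0)

# Head-on wall: both cross points at the same image height -> zero inclination.
theta = wall_inclination((-100, 50), (100, 50), f=500, h=1.0)
```

With a1 lower in the image than a0 (the right edge point farther away), the returned angle becomes positive, matching the inclined captured image of Fig. 8A.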
  • the horizontal angle of inclination may be determined by referring to a table which was prepared in advance and represents the relationship between the horizontal angle of inclination between optical axis 27 and projection surface 70, and the variables which are obtained from the positional information.
  • the above trigonometric values may be stored as data table in a memory, and the stored data may be read from the memory.
  • a table may be stored in and read from a memory which represents the relationship between points a0, a1 which are expressed by (x, y) coordinates such as (xa0, ya0), (xa1, ya1) on the image sensor, and angle data.
  • the relationship between points a0, a1 and any other data that are required for the final image conversion may also be stored in the memory. This process allows for great reduction of the amount of calculations to be performed by central processing unit 60.
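The table-lookup variant described above can be sketched as a precomputed mapping from observed cross-point data to angle data, so that no trigonometric function is evaluated at run time. The key granularity (vertical offset between a0 and a1) and the table contents are illustrative assumptions:

```python
import math

# Precomputed table: quantized vertical offset between cross points a0 and a1
# (in pixels, over a fixed horizontal baseline of 200 px) -> angle in degrees.
TABLE = {dy: round(math.degrees(math.atan2(dy, 200)), 1) for dy in range(-50, 51)}

def lookup_angle(a0, a1):
    """Read the horizontal inclination angle from the table instead of
    computing trigonometric values on the fly."""
    dy = a1[1] - a0[1]
    return TABLE[max(-50, min(50, dy))]   # clamp to the table's range
```

This trades memory for computation, which is the reduction in central processing unit load the text refers to.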
  • the lens of an image sensor usually has a distortion. It is often necessary, in the present embodiment as well, to take such a lens distortion into account in order to calculate the positional information accurately.
  • the positional information of a point obtained by the image sensor may be represented as positional information (x0, y0), and the positional information (x0, y0) may be converted into positional information (x0', y0') prior to the above calculation, by referring to a distortion correcting table. In this way, trapezoidal distortions can be corrected while the distortions of the lens of the image sensor are taken into account, and the lens distortions do not need to be corrected in a separate process.
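The pre-correction step can be sketched as a table lookup that replaces a measured point (x0, y0) with a corrected point before any angle calculation. The simple radial model used to fill the hypothetical table here is an assumption; a real table would come from calibrating the actual lens:

```python
def build_distortion_table(k=-1e-6, size=100, step=10):
    """Hypothetical distortion correcting table: grid point -> corrected point.
    A radial model x' = x*(1 + k*r^2) stands in for real calibration data."""
    table = {}
    for x in range(-size, size + 1, step):
        for y in range(-size, size + 1, step):
            scale = 1 + k * (x * x + y * y)   # radial distortion factor
            table[(x, y)] = (x * scale, y * scale)
    return table

def correct_point(table, p):
    """Replace measured (x0, y0) with corrected (x0', y0') by table lookup."""
    return table[p]

tbl = build_distortion_table()
```

The corrected coordinates then feed directly into the angle calculation above, so no separate distortion-correction pass is needed.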
  • Image distortion correcting circuit 33 generates corrective parameters for correcting the trapezoidal distortion of the image, by using the angle of inclination identified by the above process, according to a known process. These parameters are applied to the inputted image, and the trapezoidal distortion thereof can automatically be corrected.
  • Fig. 10 is a view showing an image captured by the imaging device of the image sensor when the projector is set such that the main body of the projector is inclined only vertically with respect to the front wall.
  • the angle of inclination of cross line 67c between image 61c of the front wall and image 62c of the right side wall can be calculated in the same manner as described above.
  • the angle of inclination of the main body of projector 10 can be then calculated with respect to front wall 61 in the vertical direction, which allows projector 10 to automatically correct a trapezoidal distortion of the projected image.
  • Fig. 11A shows the captured image
  • Fig. 11B shows the highlighted cross lines.
  • Cross line 66d and cross line 67d join at cross point e0.
  • Cross line 66d has left end point e1 in the image.
  • Cross line 67d and the limit line of the imaging range cross each other at cross point e2.
  • Central vertical reference line V0 and central horizontal reference line H0 pass image center C, which is the origin.
  • Right vertical reference line V3 and second horizontal reference line H2 pass cross point e0.
  • cross line 66d between image 61d of the front wall and image 64d of the ceiling, and cross line 67d between image 61d of the front wall and image 62d of the right side wall are extracted first. Then the x, y, and z coordinates of cross point e0, end point e1, and cross point e2 are calculated.
  • Line segments e0 - e1, e0 - e2 can be expressed explicitly by the coordinates (x, y, z) of cross point e0, angle θh formed between line segment e0 - e1 and the horizontal line, angle θv formed between line segment e0 - e2 and the vertical line, and angle θc formed between these line segments.
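These quantities can be computed directly from the image coordinates of the three points. The angle conventions below (unsigned angles against the horizontal and vertical, and the included angle from the dot product) are assumptions for illustration:

```python
import math

def segment_angles(e0, e1, e2):
    """Return (theta_h, theta_v, theta_c) for image points e0, e1, e2:
    theta_h: angle of segment e0-e1 against the horizontal,
    theta_v: angle of segment e0-e2 against the vertical,
    theta_c: angle between the two segments."""
    dx1, dy1 = e1[0] - e0[0], e1[1] - e0[1]
    dx2, dy2 = e2[0] - e0[0], e2[1] - e0[1]
    th = math.atan2(abs(dy1), abs(dx1))       # deviation from horizontal
    tv = math.atan2(abs(dx2), abs(dy2))       # deviation from vertical
    dot = dx1 * dx2 + dy1 * dy2
    tc = math.acos(dot / (math.hypot(dx1, dy1) * math.hypot(dx2, dy2)))
    return th, tv, tc
```

For a head-on wall, e0-e1 is exactly horizontal and e0-e2 exactly vertical, so θh = θv = 0 and θc is a right angle; any inclination of the projector shifts these values away from that baseline.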
  • Figs. 12A and 12B are views showing the relationship between the projector and the front wall, when the front wall faces the projector head-on and is inclined with respect to the projector.
  • the projector is assumed to be fixed, and the front wall is assumed to rotate with respect to the fixed projector.
  • Fig. 12B is a plan view showing line segment S representing a vertical wall that rotates in the imaging range of image sensor.
  • Line segment S rotates about pivot point Cr0, i.e., moves in the back-and-forth direction when viewed from image sensor 50.
  • reference point d0 positioned at the end of line segment S moves toward point d1 or point d2.
  • the movement of reference point d0 can be detected within the range represented by angle θ in the image that is captured by the image sensor.
  • the angle θ is defined by point d2, where reference point d0 overlaps with pivot point Cr0 when viewed from image sensor 50 (this situation normally does not occur, since it means that the wall rotates to an angular position where the wall is recognized as overlapping image sensor 50), and by point d1, where a hypothetical line drawn from image sensor 50 forms a right angle with line segment S (the right side of the wall is recognized as being turned toward image sensor 50, when viewed from image sensor 50).
  • the reference point which corresponds to the image of a cross line moves with the rotation of hypothetical plane V.
  • the angle of inclination of projector 10 can be determined with respect to the front wall, by rotating hypothetical plane V about the x-axis and the y-axis to transform the coordinates of cross points e0, e1, e2, and by finding a rotational angle at which line segment e0 - e1 and line segment e0 - e2 lie horizontally and vertically, respectively.
  • the inclination of the front wall can be calculated by identifying e0 (x, y, z), θh, θv, θc in the image, shown in Fig. 11B, that is captured by image sensor 50. Since the rotational angle of the image sensor can be identified together with these parameters, the angle of inclination of the main body of projector 10 can also be detected with respect to the vertical direction. Inclination detector 32 of trapezoidal distortion correcting device 30 calculates the parameters of the projection surface, such as its position and shape, based on the detected angle of inclination of the main body of projector 10 with respect to the wall surface. The calculated parameters are applied to image distortion correcting circuit 33, and projection device 20 automatically projects an appropriately shaped image, i.e., a distortion-free image, onto the wall surface.
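The rotation search described above can be sketched directly: rotate hypothetical plane V about the x- and y-axes and find the rotation at which segment e0-e1 becomes horizontal (equal y) and e0-e2 becomes vertical (equal x). The coarse one-degree grid search and the sample coordinates are illustrative assumptions; the patent does not prescribe a particular solver:

```python
import math

def rotate(p, ax, ay):
    """Rotate point p = (x, y, z) about the x-axis by ax, then the y-axis by ay."""
    x, y, z = p
    y, z = y * math.cos(ax) - z * math.sin(ax), y * math.sin(ax) + z * math.cos(ax)
    x, z = x * math.cos(ay) + z * math.sin(ay), -x * math.sin(ay) + z * math.cos(ay)
    return x, y, z

def unrotate(p, ax, ay):
    """Inverse of rotate: undo the y-axis rotation, then the x-axis one."""
    x, y, z = p
    x, z = x * math.cos(ay) - z * math.sin(ay), x * math.sin(ay) + z * math.cos(ay)
    y, z = y * math.cos(ax) + z * math.sin(ax), -y * math.sin(ax) + z * math.cos(ax)
    return x, y, z

def find_inclination(e0, e1, e2, step=math.radians(1)):
    """Grid-search the rotation making e0-e1 horizontal and e0-e2 vertical."""
    best, best_err = (0.0, 0.0), float("inf")
    for i in range(-45, 46):
        for j in range(-45, 46):
            ax, ay = i * step, j * step
            p0 = rotate(e0, ax, ay)
            p1 = rotate(e1, ax, ay)
            p2 = rotate(e2, ax, ay)
            err = abs(p1[1] - p0[1]) + abs(p2[0] - p0[0])
            if err < best_err:
                best, best_err = (ax, ay), err
    return best

# Synthetic check: start from axis-aligned cross segments on a wall at depth 5,
# tilt them by a known rotation, and recover angles that re-align them.
truth = [(1.0, 1.0, 5.0), (-2.0, 1.0, 5.0), (1.0, -2.0, 5.0)]
o0, o1, o2 = (unrotate(p, math.radians(10), math.radians(5)) for p in truth)
ax, ay = find_inclination(o0, o1, o2)
```

The recovered (ax, ay) is the inclination of the projector body relative to the wall; a real implementation would refine the coarse grid or solve the two alignment conditions analytically.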
  • projectors should preferably be installed such that a projected image is displayed not onto a ceiling or side walls, but onto the front wall only. Since the positional information of the cross line is available along with the detected angle of inclination, it is possible to reduce the projected image so that it falls on the front wall only.
  • the angle of inclination between the projector and the front wall is acquired by using the cross line between the front wall and a surface that is connected to the front wall.
  • this angle may also be acquired by using a contact line between the front wall and, for example, a horizontal desk in contact with the front wall.
  • Fig. 13 is a block diagram of a projector having a trapezoidal distortion correcting device according to the second embodiment of the present invention.
  • Trapezoidal distortion correcting device 30 has an inclination sensor (G sensor) using a conventional acceleration detector used, for example, for centering a machine when it is installed.
  • the inclination sensor is a vertical inclination sensor 54 for accurately measuring an angle of inclination with respect to the gravitational direction and for outputting the measured angle of inclination as numerical data.
  • Other details of the projector according to the second embodiment are identical to the projector according to the first embodiment, and those parts identical to the first embodiment are denoted by identical reference numerals, and will not be described in detail below.
  • a trapezoidal distortion of an image is corrected based only on the information obtained from an image that is captured by image sensor 50.
  • a horizontal cross line such as cross line 66 between front wall 61 and ceiling 64
  • a vertical cross line such as cross line 67 between front wall 61 and right side wall 62
  • an angle of inclination in the vertical direction is detected by vertical inclination sensor 54, and is inputted to inclination detector 32.
  • Image capturer 31 captures the image from imaging device 53, and supplies inclination detector 32 with the positional information of cross line 66 between front wall 61 and ceiling 64.
  • Inclination detector 32 calculates an angle of inclination in the horizontal direction based on the positional information.
  • Inclination detector 32 outputs the calculated angle of inclination in the horizontal direction, together with the angle of inclination in the vertical direction that is detected by vertical inclination sensor 54, to image distortion correcting circuit 33.
  • Image distortion correcting circuit 33 generates LSI control parameters, corrects trapezoidal distortions in the vertical and horizontal directions of the image that is inputted from image input unit 41, and outputs corrected image data to image controller 23.
  • vertical inclination sensor 54 is an acceleration sensor or a gravitational sensor utilizing the gravity.
  • vertical inclination sensor 54 may be a device for detecting the tilt angle of a tilting mechanism of the main body of projector 10.
  • a process for correcting a trapezoidal distortion according to the second embodiment will be described below with reference to Fig. 14.
  • This process includes identifying an angle of inclination in the horizontal direction based on the positional information of a cross line between a wall surface and a ceiling in the image sensor imaging range, acquiring an angle of inclination in the vertical direction from a vertical inclination sensor, and correcting an output image on a display unit of the projector.
  • image capturer 31 acquires image information from imaging device 53 of image sensor 50 in step S1.
  • inclination detector 32 acquires cross line 66b between image 61b of the front wall and image 64b of the ceiling from the image information in step S2, and acquires cross points a0, a1 of left and right reference lines V1, V2 and cross line 66b in the image in step S3. Inclination detector 32 then assigns coordinates to cross points a0, a1 in step S4. Thereafter, inclination detector 32 calculates the distance between the two cross points in a direction parallel to the optical axis, based on the distance between the two cross points in the vertical direction, the distance in the vertical direction between optical axis 55 and a limit line of the image sensor imaging range, and vertical angle φ0 of the image sensor imaging range, in step S5.
  • inclination detector 32 calculates an angle of inclination in the horizontal direction of the projector, based on the distance between the two cross points in the direction parallel to the optical axis, the distance between the two cross points in the horizontal direction, and horizontal angle ⁇ 0 of the image sensor imaging range in step S6, acquires an angle of inclination in the vertical direction of the projector from vertical inclination sensor 54 in step S7, and outputs the acquired and calculated angle data to image distortion correcting circuit 33.
  • Image distortion correcting circuit 33 generates LSI control parameters in step S8, and controls the projector image processing LSI circuit in step S9.
  • Input image 24 is then corrected, and display unit 22 generates output image 25 in step S10.
  • When output image 25 is projected onto projection surface 70, an image similar to input image 24 is displayed on projection surface 70.
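The geometry of steps S2 through S6 can be sketched as follows. This is a hedged reconstruction rather than the patent's exact formulas: it assumes the cross points are given in normalized hypothetical-image coordinates (optical axis at (0, 0), image edges at ±1), with θ0 and φ0 the vertical and horizontal half-angles of the image sensor imaging range:

```python
import math

def horizontal_inclination(a0, a1, phi0, theta0):
    """Estimate the horizontal inclination between the projector's
    optical axis and the front wall from two cross points a0, a1 on
    the wall/ceiling cross line (steps S4-S6, simplified).

    a0, a1 -- (x, y) normalized hypothetical-image coordinates
    phi0   -- horizontal half-angle of the imaging range (radians)
    theta0 -- vertical half-angle of the imaging range (radians)
    """
    def on_ceiling(p):
        x, y = p
        # Ray through pixel p: (tan(phi0)*x, tan(theta0)*y, 1).
        # Intersect it with the ceiling plane Y = h; the height h
        # cancels out of the final angle, so h = 1 is used.
        t = 1.0 / (math.tan(theta0) * y)
        return (t * math.tan(phi0) * x, t)   # (lateral X, depth Z)

    (x0, z0), (x1, z1) = on_ceiling(a0), on_ceiling(a1)
    # A wall square to the optical axis gives z0 == z1; otherwise the
    # depth difference over the lateral span is the inclination.
    return math.atan2(z1 - z0, x1 - x0)
```

If both cross points appear at the same height in the image, the wall is square to the optical axis and the returned angle is zero; a cross line that rises toward one side yields a correspondingly nonzero angle.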
  • Fig. 15 is a block diagram of the projector having the trapezoidal distortion correcting device according to the third embodiment of the present invention.
  • In the first and second embodiments, image sensor 50 and image capturer 31 acquire positional information of a cross line between a plane which serves as the projection surface and a plane which crosses that plane.
  • In the third embodiment, laser positioning device 150 and position acquirer 131 are used instead to acquire positional information of the cross line.
  • Laser positioning device 150 points laser beams at desired positions by means of laser pointer 151, which can project a laser beam within a predetermined range that includes the image projected by the projector.
  • Position acquirer 131 acquires positional information of the cross line in a hypothetical image, and outputs the acquired positional information to inclination detector 32.
  • Inclination detector 32 processes the supplied positional information to calculate the angle of inclination between projection optical axis 27 of projection device 20 and projection surface 70.
  • The projector according to the third embodiment is identical in arrangement and operation to the projector according to the first embodiment, except that laser positioning device 150 and position acquirer 131 are provided instead of image sensor 50 and image capturer 31.
  • Various processes are available to acquire positional information of a cross line by means of laser positioning device 150.
  • A typical process will be described below as an example.
  • A laser positioning device is used that can control the direction of the laser beam and determine the position of the laser beam in a hypothetical image from a signal that is input when the laser beam overlaps the cross line.
  • Figs. 16A through 16C are views showing an arrangement and operation of the laser positioning device.
  • Fig. 16A shows the manner in which the laser pointer operates.
  • Fig. 16B shows the manner in which the movement of the laser pointer is limited.
  • Fig. 16C shows the manner in which the laser positioning device is installed.
  • Laser positioning device 150 has laser pointer 151, which is movable about pivot point 152 in the vertical and horizontal directions, as shown in Fig. 16A.
  • Laser pointer 151 has a tubular member that passes through plate 154, which has an H-shaped aperture defined therein, as shown in Fig. 16B. The laser beam is therefore projected from laser pointer 151 only in directions along the H-shaped aperture.
  • Pivot point 152 is combined with a displacement acquirer (not shown) for measuring displacements (angles) in the horizontal and vertical directions.
  • Laser pointer 151 is manually moved.
  • Laser positioning device 150 is installed near projection lens 21 of projector 10, as shown in Fig. 16C.
  • Figs. 17A and 17B are views that illustrate a process for acquiring a cross line between a front wall serving as a projection surface and a ceiling that is joined to an upper edge of the front wall.
  • Fig. 17A shows a vertical cross-sectional view of a room.
  • Fig. 17B shows the situation in which two points a0, a1 are acquired in a hypothetical image in the position acquirer.
  • Laser pointer 151 is moved vertically and horizontally, as shown in Fig. 17A, to point the laser beam at the cross line between the front wall and the ceiling.
  • A button provided on laser positioning device 150 is then pressed to identify the position, and laser positioning device 150 outputs the position of the laser beam as positional information into a hypothetical image in position acquirer 131.
  • The angle of inclination of projection optical axis 27 of projector 10 with respect to projection surface 70 can then be acquired by the process according to the first embodiment, described above with reference to Figs. 8A, 8B and 9A, 9B.
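As a sketch of how the displacement acquirer's readings might be turned into the hypothetical-image coordinates used by position acquirer 131 (the function name, the pan/tilt convention, and the half-angle normalization are all assumptions, since the patent does not spell them out):

```python
import math

def pointer_to_image(pan, tilt, phi0, theta0):
    """Map the pan/tilt displacement angles (radians), measured at
    pivot point 152 when the button is pressed, to normalized
    coordinates in the hypothetical image: the optical axis maps to
    (0, 0) and the edges of the range to +/-1.

    phi0, theta0 -- assumed horizontal/vertical half-angles of the
                    hypothetical image.  The pointer pivot is treated
    as coincident with the sensor origin; since laser positioning
    device 150 is installed near projection lens 21, the small offset
    is neglected in this sketch."""
    return (math.tan(pan) / math.tan(phi0),
            math.tan(tilt) / math.tan(theta0))
```

Two such points taken on the wall/ceiling cross line give the pair a0, a1 needed by the first-embodiment inclination calculation.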

Landscapes

  • Physics & Mathematics (AREA)
  • General Physics & Mathematics (AREA)
  • Projection Apparatus (AREA)
  • Transforming Electric Information Into Light Information (AREA)
EP05011722A 2004-05-31 2005-05-31 Projecteur comprenant un dispositif de mesure d'angle d'inclinaison Withdrawn EP1602894A1 (fr)

Applications Claiming Priority (2)

Application Number Priority Date Filing Date Title
JP2004161494 2004-05-31
JP2004161494A JP3960390B2 (ja) 2004-05-31 2004-05-31 台形歪み補正装置を備えたプロジェクタ

Publications (1)

Publication Number Publication Date
EP1602894A1 true EP1602894A1 (fr) 2005-12-07

Family

ID=34937090

Family Applications (1)

Application Number Title Priority Date Filing Date
EP05011722A Withdrawn EP1602894A1 (fr) 2004-05-31 2005-05-31 Projecteur comprenant un dispositif de mesure d'angle d'inclinaison

Country Status (4)

Country Link
US (1) US7452084B2 (fr)
EP (1) EP1602894A1 (fr)
JP (1) JP3960390B2 (fr)
CN (1) CN100501559C (fr)

Cited By (1)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
WO2015052381A1 (fr) * 2013-10-11 2015-04-16 Outotec (Finland) Oy Procédé et agencement pour préparer des anodes coulées (1) à utiliser en électro-affinage de métaux

Families Citing this family (48)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
JP2007078821A (ja) * 2005-09-12 2007-03-29 Casio Comput Co Ltd 投影装置、投影方法及びプログラム
US20070182936A1 (en) * 2006-02-08 2007-08-09 Canon Kabushiki Kaisha Projection display apparatus
JP3880609B1 (ja) * 2006-02-10 2007-02-14 シャープ株式会社 画像投影方法及びプロジェクタ
DE102006030194B4 (de) * 2006-06-30 2015-07-02 Airbus Operations Gmbh Integrierter Projektor in einem Sitz zur optischen Anzeige von Informationen
JP5010202B2 (ja) * 2006-07-31 2012-08-29 Necディスプレイソリューションズ株式会社 プロジェクタおよび投射画像調整方法
JP5439733B2 (ja) * 2008-03-31 2014-03-12 リコーイメージング株式会社 撮像装置
JP5256899B2 (ja) * 2008-07-18 2013-08-07 セイコーエプソン株式会社 画像補正装置、画像補正方法、プロジェクタおよびプロジェクションシステム
JP5266953B2 (ja) * 2008-08-19 2013-08-21 セイコーエプソン株式会社 投写型表示装置および表示方法
US8911096B2 (en) 2008-12-10 2014-12-16 Nikon Corporation Projection apparatus for projecting and processing an image
WO2010067688A1 (fr) * 2008-12-10 2010-06-17 株式会社ニコン Dispositif de projection
JP5446753B2 (ja) * 2008-12-10 2014-03-19 株式会社ニコン 投影装置
US8773529B2 (en) 2009-06-04 2014-07-08 Sypro Optics Gmbh Projector with automatic focusing and illustration procedure
JP5257616B2 (ja) 2009-06-11 2013-08-07 セイコーエプソン株式会社 プロジェクター、プログラム、情報記憶媒体および台形歪み補正方法
JP5736535B2 (ja) * 2009-07-31 2015-06-17 パナソニックIpマネジメント株式会社 投写型映像表示装置及び画像調整方法
KR101087870B1 (ko) * 2009-09-02 2011-11-30 채광묵 원격 위치 지시용 송신장치 및 수신장치
JP5442393B2 (ja) * 2009-10-29 2014-03-12 日立コンシューマエレクトロニクス株式会社 表示装置
TWI439788B (zh) * 2010-01-04 2014-06-01 Ind Tech Res Inst 投影校正系統及方法
US8506090B2 (en) 2010-03-22 2013-08-13 Microvision, Inc. Projection system with image orientation correction and corresponding method
JP5625490B2 (ja) * 2010-05-25 2014-11-19 セイコーエプソン株式会社 プロジェクター、投射状態調整方法及び投射状態調整プログラム
JP5671901B2 (ja) * 2010-09-15 2015-02-18 セイコーエプソン株式会社 投射型表示装置およびその制御方法
CN102271237A (zh) * 2011-02-25 2011-12-07 鸿富锦精密工业(深圳)有限公司 投影装置及其矫正梯形失真的方法
JP2013065061A (ja) * 2011-09-15 2013-04-11 Funai Electric Co Ltd プロジェクタ
JP5884439B2 (ja) * 2011-11-24 2016-03-15 アイシン精機株式会社 車両周辺監視用画像生成装置
JP5924020B2 (ja) * 2012-02-16 2016-05-25 セイコーエプソン株式会社 プロジェクター、及び、プロジェクターの制御方法
JP5924042B2 (ja) * 2012-03-14 2016-05-25 セイコーエプソン株式会社 プロジェクター、及び、プロジェクターの制御方法
JP6172495B2 (ja) 2012-12-28 2017-08-02 株式会社リコー 校正装置、装置、プロジェクタ、3次元スキャナ、校正方法、方法、プログラム、及び記憶媒体
KR102144541B1 (ko) * 2014-05-08 2020-08-18 주식회사 히타치엘지 데이터 스토리지 코리아 2방향 거리 검출 장치
CN104952035A (zh) * 2015-06-12 2015-09-30 联想(北京)有限公司 一种信息处理方法及电子设备
CN106331549A (zh) * 2015-06-30 2017-01-11 中强光电股份有限公司 投影机装置
CN106331666B (zh) * 2015-07-03 2020-02-07 中兴通讯股份有限公司 一种投影终端梯形校正方法、装置及投影终端
CN105203050A (zh) * 2015-10-08 2015-12-30 扬中中科维康智能科技有限公司 一种激光跟踪仪跟踪反射镜与横轴夹角检测方法
CN106612422B (zh) * 2015-12-31 2018-08-28 北京一数科技有限公司 一种投影校正方法及装置
CN105954961A (zh) * 2016-05-06 2016-09-21 联想(北京)有限公司 一种信息处理方法及投影设备
KR101820905B1 (ko) * 2016-12-16 2018-01-22 씨제이씨지브이 주식회사 촬영장치에 의해 촬영된 이미지 기반의 투사영역 자동보정 방법 및 이를 위한 시스템
WO2018120011A1 (fr) * 2016-12-30 2018-07-05 深圳前海达闼云端智能科技有限公司 Procédé et dispositif de correction d'image projetée, et robot
CN109212874B (zh) * 2017-07-05 2021-04-30 成都理想境界科技有限公司 一种扫描投影设备
CN109660703B (zh) * 2017-10-12 2021-10-26 台湾东电化股份有限公司 光学机构的补正方法
CN107835399A (zh) * 2017-10-31 2018-03-23 潍坊歌尔电子有限公司 一种投影校正的方法、装置和投影设备
CN108050991B (zh) * 2017-11-16 2020-09-11 长江存储科技有限责任公司 基于扫描电子显微镜测量侧壁倾斜角的方法
US11259013B2 (en) * 2018-09-10 2022-02-22 Mitsubishi Electric Corporation Camera installation assistance device and method, and installation angle calculation method, and program and recording medium
JP7283904B2 (ja) * 2019-01-18 2023-05-30 株式会社トプコン 測定装置及び測定装置の制御方法
JP7207151B2 (ja) * 2019-05-16 2023-01-18 セイコーエプソン株式会社 光学デバイス、光学デバイスの制御方法、および画像表示装置
CN110809141A (zh) * 2019-09-29 2020-02-18 深圳市火乐科技发展有限公司 梯形校正方法、装置、投影仪及存储介质
CN113452971B (zh) * 2020-03-25 2023-01-03 苏州佳世达光电有限公司 一种投影装置的自动水平梯形校正方法
CN112104851B (zh) * 2020-09-15 2022-04-08 成都极米科技股份有限公司 画面校正的检测方法、装置和检测系统
CN112902876B (zh) * 2021-01-14 2022-08-26 西北工业大学 拼焊板旋压成形曲面构件焊缝偏转测量方法
JP7318669B2 (ja) * 2021-01-27 2023-08-01 セイコーエプソン株式会社 表示方法および表示システム
CN113547512B (zh) * 2021-08-04 2022-09-06 长春电子科技学院 一种钳体加工用的智能检测机械手

Citations (2)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
JP2003289485A (ja) * 2002-03-27 2003-10-10 Seiko Epson Corp 投写型画像表示装置及び平面被投写体
EP1395050A1 (fr) * 2002-08-30 2004-03-03 Seiko Precision Inc. Appareil de détection d'angle et projecteur équipé de cet appareil pour corriger automatiquement de l'erreur trapézoidale d'image

Family Cites Families (6)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
JP2536110Y2 (ja) 1990-01-05 1997-05-21 富士通テン株式会社 音場構成装置
JPH06253241A (ja) 1993-02-26 1994-09-09 Matsushita Electric Ind Co Ltd 投写型ディスプレイの投写歪補正方法
US6520646B2 (en) * 1999-03-03 2003-02-18 3M Innovative Properties Company Integrated front projection system with distortion correction and associated method
JP3519393B2 (ja) 2001-12-26 2004-04-12 株式会社東芝 投射型表示装置
US7150536B2 (en) * 2003-08-08 2006-12-19 Casio Computer Co., Ltd. Projector and projection image correction method thereof
JP3914938B2 (ja) 2004-04-30 2007-05-16 Necビューテクノロジー株式会社 プロジェクタの台形歪み補正装置と該台形歪み補正装置を備えたプロジェクタ

Patent Citations (2)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
JP2003289485A (ja) * 2002-03-27 2003-10-10 Seiko Epson Corp 投写型画像表示装置及び平面被投写体
EP1395050A1 (fr) * 2002-08-30 2004-03-03 Seiko Precision Inc. Appareil de détection d'angle et projecteur équipé de cet appareil pour corriger automatiquement de l'erreur trapézoidale d'image

Non-Patent Citations (1)

* Cited by examiner, † Cited by third party
Title
PATENT ABSTRACTS OF JAPAN vol. 2003, no. 12 5 December 2003 (2003-12-05) *


Also Published As

Publication number Publication date
US20050270496A1 (en) 2005-12-08
CN100501559C (zh) 2009-06-17
CN1704837A (zh) 2005-12-07
JP2005347790A (ja) 2005-12-15
US7452084B2 (en) 2008-11-18
JP3960390B2 (ja) 2007-08-15

Similar Documents

Publication Publication Date Title
EP1602894A1 (fr) Projecteur comprenant un dispositif de mesure d'angle d'inclinaison
US7226173B2 (en) Projector with a plurality of cameras
US7139424B2 (en) Stereoscopic image characteristics examination system
EP1517550B1 (fr) Projecteur avec dispositif de mesure d'angle d'inclinaison
US7661825B2 (en) Projector having horizontal displacement sensors for correcting distortion
JP5401940B2 (ja) 投写光学系のズーム比測定方法、そのズーム比測定方法を用いた投写画像の補正方法及びその補正方法を実行するプロジェクタ
CN109949728B (zh) 一种显示面板的检测装置
KR100481399B1 (ko) 촬상 시스템, 상기 시스템에서 화상 데이터를 제어하도록사용되는 프로그램, 상기 시스템에서 촬상 화상의 왜곡을보정하기 위한 방법 및 상기 방법의 순서를 기억시키는기록 매체
JP3996610B2 (ja) プロジェクタ装置とその画像歪補正方法
JP2002071309A (ja) 3次元画像検出装置
JP3742085B2 (ja) 傾斜角度測定装置を有するプロジェクタ
JP6582683B2 (ja) 角度算出システム、角度算出装置、プログラム、および角度算出方法
JP3741136B2 (ja) 障害物適応投射型表示装置
JP3842988B2 (ja) 両眼立体視によって物体の3次元情報を計測する画像処理装置およびその方法又は計測のプログラムを記録した記録媒体
JP3926311B2 (ja) 傾斜角度測定装置を有するプロジェクタ
JP2007089042A (ja) 撮像装置
JP2005331585A (ja) 距離傾斜角度測定装置を有するプロジェクタ
JP2899553B2 (ja) 固体撮像素子の位置調整方法
JP7173825B2 (ja) カメラシステム、その制御方法およびプログラム
JP3914938B2 (ja) プロジェクタの台形歪み補正装置と該台形歪み補正装置を備えたプロジェクタ
JP3742086B2 (ja) 傾斜角度測定装置を有するプロジェクタ
JP2007333525A (ja) 距離測定装置
JP4535769B2 (ja) 傾斜角度測定装置を備えたプロジェクタ
JP3939866B2 (ja) ヘッドライトの光軸調整方法
JP3730979B2 (ja) 傾斜角度測定装置を有するプロジェクタ

Legal Events

Date Code Title Description
PUAI Public reference made under article 153(3) epc to a published international application that has entered the european phase

Free format text: ORIGINAL CODE: 0009012

17P Request for examination filed

Effective date: 20050829

AK Designated contracting states

Kind code of ref document: A1

Designated state(s): AT BE BG CH CY CZ DE DK EE ES FI FR GB GR HU IE IS IT LI LT LU MC NL PL PT RO SE SI SK TR

AX Request for extension of the european patent

Extension state: AL BA HR LV MK YU

AKX Designation fees paid

Designated state(s): DE FR GB

RAP1 Party data changed (applicant data changed or rights of an application transferred)

Owner name: NEC DISPLAY SOLUTIONS, LTD.

17Q First examination report despatched

Effective date: 20120828

GRAP Despatch of communication of intention to grant a patent

Free format text: ORIGINAL CODE: EPIDOSNIGR1

INTG Intention to grant announced

Effective date: 20140620

RIN1 Information on inventor provided before grant (corrected)

Inventor name: MOCHIZUKI, KAZUO

STAA Information on the status of an ep patent application or granted ep patent

Free format text: STATUS: THE APPLICATION IS DEEMED TO BE WITHDRAWN

18D Application deemed to be withdrawn

Effective date: 20141031