WO2018167918A1 - Projector, method of creating data for mapping, program, and projection mapping system - Google Patents


Info

Publication number
WO2018167918A1
WO2018167918A1 (PCT/JP2017/010700, JP2017010700W)
Authority
WO
WIPO (PCT)
Prior art keywords
projection
projector
data
dimensional
mapping
Prior art date
Application number
PCT/JP2017/010700
Other languages
French (fr)
Japanese (ja)
Inventor
青柳 寿和
Original Assignee
NEC Display Solutions, Ltd.
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by NEC Display Solutions, Ltd.
Priority to PCT/JP2017/010700 priority Critical patent/WO2018167918A1/en
Priority to JP2019505625A priority patent/JP6990694B2/en
Publication of WO2018167918A1 publication Critical patent/WO2018167918A1/en

Links

Images

Classifications

    • G - PHYSICS
    • G03 - PHOTOGRAPHY; CINEMATOGRAPHY; ANALOGOUS TECHNIQUES USING WAVES OTHER THAN OPTICAL WAVES; ELECTROGRAPHY; HOLOGRAPHY
    • G03B - APPARATUS OR ARRANGEMENTS FOR TAKING PHOTOGRAPHS OR FOR PROJECTING OR VIEWING THEM; APPARATUS OR ARRANGEMENTS EMPLOYING ANALOGOUS TECHNIQUES USING WAVES OTHER THAN OPTICAL WAVES; ACCESSORIES THEREFOR
    • G03B21/00 - Projectors or projection-type viewers; Accessories therefor
    • G - PHYSICS
    • G06 - COMPUTING; CALCULATING OR COUNTING
    • G06T - IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T19/00 - Manipulating 3D models or images for computer graphics
    • G - PHYSICS
    • G09 - EDUCATION; CRYPTOGRAPHY; DISPLAY; ADVERTISING; SEALS
    • G09G - ARRANGEMENTS OR CIRCUITS FOR CONTROL OF INDICATING DEVICES USING STATIC MEANS TO PRESENT VARIABLE INFORMATION
    • G09G5/00 - Control arrangements or circuits for visual indicators common to cathode-ray tube indicators and other visual indicators
    • G09G5/36 - Control arrangements or circuits for visual indicators characterised by the display of a graphic pattern, e.g. using an all-points-addressable [APA] memory
    • H - ELECTRICITY
    • H04 - ELECTRIC COMMUNICATION TECHNIQUE
    • H04N - PICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N5/00 - Details of television systems
    • H04N5/74 - Projection arrangements for image reproduction, e.g. using eidophor

Definitions

  • The present invention relates to a projector, a mapping data creation method, a program, and a projection mapping system.
  • FIG. 1A shows an example of a system that performs projection mapping.
  • This system includes a projector 101 and a video processing device 105 such as a personal computer.
  • The projector 101 and the video processing device 105 are connected via a video signal cable.
  • The video processing device 105 creates a projection mapping video 106 and supplies the corresponding video signal to the projector 101.
  • The projector 101 projects an image based on the video signal from the video processing device 105 toward the three-dimensional object 104 in the projection area 103.
  • The projection mapping video 106 is displayed on a projection surface consisting of the five surfaces 104a to 104e of the three-dimensional object 104.
  • FIG. 1B shows an example of the projection mapping video 106.
  • The rectangular original images 141a to 141e are the images to be displayed on the surfaces 104a to 104e of the three-dimensional object 104 shown in FIG. 1A.
  • The images 142a to 142e are obtained by deforming the rectangular original images 141a to 141e according to the outer shapes of the surfaces 104a to 104e, respectively.
  • The projection mapping video 106 is obtained by arranging the images 142a to 142e so that they correspond to the surfaces 104a to 104e, respectively.
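For a planar surface, the deformation of each rectangular original image into its surface outline is a homography warp. A minimal sketch, assuming OpenCV (cv2) is available and the four corner positions of a surface in the projector image are already known; the function name and corner data are illustrative, not taken from the patent:

```python
import cv2
import numpy as np

def warp_to_surface(original, dst_quad, out_size):
    """Warp a rectangular original image onto a quadrilateral surface
    outline in the projector image (dst_quad: four corners, in pixels)."""
    h, w = original.shape[:2]
    src = np.float32([[0, 0], [w, 0], [w, h], [0, h]])
    dst = np.float32(dst_quad)
    H = cv2.getPerspectiveTransform(src, dst)
    # Render the deformed image on a canvas the size of the full projector
    # frame; overlaying all surfaces yields one projection mapping frame.
    return cv2.warpPerspective(original, H, out_size)
```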
  • Methods for creating the projection mapping video 106 include a method that works in a two-dimensional space and a method that works in a three-dimensional space.
  • FIG. 2 schematically shows the method for creating a projection mapping video in a two-dimensional space.
  • First, a frame indicating the outer shape of each surface of the three-dimensional object in the projection video is created on the video processing device 105.
  • The frame 146 indicated by a solid line indicates the outer shape of the surface 104c of the three-dimensional object illustrated in FIG. 1A.
  • The image of the frame created on the video processing device 105 is then actually projected from the projector 101.
  • The frame is deformed on the video processing device 105 until the projected image of the frame matches the outer shape of the corresponding surface of the three-dimensional object 104.
  • In the example of FIG. 2, the frame 147 projected onto the three-dimensional object 104 corresponds to the frame 146 created on the video processing device 105, but the frame 147 does not match the outer shape of the corresponding surface of the three-dimensional object 104. Therefore, the frame 146 is deformed on the video processing device 105 until the frame 147 matches the outer shape of the corresponding surface.
  • FIG. 3 schematically shows the method for creating a projection mapping video in a three-dimensional space.
  • A virtual three-dimensional space is constructed on the video processing device 105, and a virtual three-dimensional object 104-1 corresponding to the three-dimensional object 104 and a virtual camera 148 are arranged in that space.
  • A video is assigned to each surface of the virtual three-dimensional object 104-1 and played back, and the video playing on each surface is captured with the virtual camera 148.
  • The image captured by the virtual camera 148 is used as the projection mapping video 106.
  • Creating a video with the virtual camera 148 in this way is called rendering.
  • The angle of view of the virtual camera 148 is set equal to the angle of view of the projector 101.
  • In actual projection, the projector 101 is placed at the position where the virtual camera 148 was placed, and projects the projection mapping video 106 created with the virtual camera 148 toward the three-dimensional object 104.
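The requirement that the virtual camera's angle of view equal the projector's can be expressed as a pinhole intrinsic matrix for the rendering camera. A minimal sketch, under the assumption of symmetric horizontal and vertical angles of view:

```python
import numpy as np

def camera_intrinsics(fov_h_deg, fov_v_deg, width, height):
    """Pinhole intrinsics for a rendering camera whose angle of view
    matches the projector's (illustrative; symmetric angles assumed)."""
    fx = (width / 2.0) / np.tan(np.radians(fov_h_deg) / 2.0)
    fy = (height / 2.0) / np.tan(np.radians(fov_v_deg) / 2.0)
    return np.array([[fx, 0.0, width / 2.0],
                     [0.0, fy, height / 2.0],
                     [0.0, 0.0, 1.0]])
```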
  • Techniques related to projection mapping are described in Patent Documents 1 and 2.
  • In the method of FIG. 2, it is necessary to carry out adjustment work such as deforming the frame 146 on the video processing device 105 until the projected frame 147 matches the outer shape of the corresponding surface of the three-dimensional object 104. Such adjustment work is very time-consuming and troublesome for the operator.
  • An object of the present invention is to solve the above problems and to provide a projector, a mapping data creation method, a program, and a projection mapping system capable of generating mapping data from which a projection mapping video matching the outer shape of each surface of a three-dimensional object can be created.
  • A projector according to the present invention includes a display element having an image forming surface composed of a plurality of pixels, and a projection lens that projects an image formed on the image forming surface toward a three-dimensional object. The projector further includes a reception unit for receiving a data creation request, and a mapping data generation unit for generating mapping data for creating a projection mapping video in response to the data creation request.
  • The mapping data generation unit includes: a three-dimensional sensor unit that measures the three-dimensional object three-dimensionally and outputs three-dimensional position data representing, in three-dimensional coordinates, the position and shape of each surface of the three-dimensional object onto which the image is projected; a coordinate conversion unit that converts the coordinate system of the three-dimensional position data into a projector coordinate system in which the projection area of the image is expressed in three-dimensional coordinates with, as the origin, the point where the principal rays from the pixels located at the diagonal corners of the image forming surface intersect; and a data generation unit that, for each surface of the three-dimensional object, acquires the position of the surface and the ridge lines indicating its outer shape based on the three-dimensional position data coordinate-converted by the coordinate conversion unit, and generates the mapping data based on the position and the ridge lines.
  • A mapping data creation method according to the present invention is performed by a projector that includes a display element having an image forming surface composed of a plurality of pixels and a projection lens that projects an image formed on the image forming surface toward a three-dimensional object. The method includes a mapping data generation step of generating mapping data for creating a projection mapping video in response to a data creation request.
  • The mapping data generation step includes: measuring the three-dimensional object three-dimensionally to obtain three-dimensional position data representing, in three-dimensional coordinates, the position and shape of each surface of the three-dimensional object onto which the image is projected; converting the coordinate system of the three-dimensional position data into a projector coordinate system in which the projection area of the image is expressed in three-dimensional coordinates with, as the origin, the point where the principal rays from the pixels located at the diagonal corners of the image forming surface intersect; and, for each surface of the three-dimensional object, acquiring the position of the surface and the ridge lines indicating its outer shape based on the coordinate-converted three-dimensional position data and generating the mapping data based on the position and the ridge lines.
  • A program according to the present invention causes a projector, which includes a display element having an image forming surface composed of a plurality of pixels and a projection lens that projects an image formed on the image forming surface toward a three-dimensional object, to execute a mapping data generation process.
  • The mapping data generation process includes: processing to measure the three-dimensional object three-dimensionally and obtain three-dimensional position data representing, in three-dimensional coordinates, the position and shape of each surface of the three-dimensional object onto which the image is projected; and processing to convert the coordinate system of the three-dimensional position data into a projector coordinate system in which the projection area of the image is expressed in three-dimensional coordinates with, as the origin, the point where the principal rays from the pixels located at the diagonal corners of the image forming surface intersect.
  • A projection mapping system according to the present invention includes the above projector and a video processing device capable of mutual communication with the projector.
  • In the projection mapping system, the video processing device creates a projection mapping video based on the data generated by the projector for a projection mapping video creation tool.
  • FIG. 4 is a block diagram illustrating the configuration of a projector according to a first embodiment of the present invention.
  • FIGS. 5A to 5D are schematic diagrams for explaining the relationship among the angle of view, the projection optical axis (or projection center axis), and the projection center point with launch projection, without launch, with an enlarged angle of view, and with an upward lens shift.
  • Further drawings include a flowchart showing a procedure for generating a data file, a schematic diagram showing an example of perspective projection, a schematic diagram for explaining the angle of view in launch projection, a schematic diagram for explaining the angle-of-view central axis, and a schematic diagram for explaining the projection target surface showing one surface of a three-dimensional object.
  • Block diagrams also show the configurations of projectors according to second and third embodiments of the present invention.
  • FIG. 4 is a block diagram showing a configuration of the projector according to the first embodiment of the present invention.
  • As shown in FIG. 4, the projector includes a communication control unit 1, a parameter storage unit 3, a projection unit 5, a projector data generation unit 6, a mapping data generation unit 7, a file storage unit 8, a projector projection design data storage unit 9, a view angle symmetrization unit 10, and an attitude sensor unit 11.
  • The communication control unit 1 includes control means such as a CPU (Central Processing Unit) and is connected to an external device via the communication input/output unit 2 so that the two can communicate with each other.
  • RS-232C, a wired LAN (Local Area Network), a wireless LAN, or the like can be used as the communication means between the communication input/output unit 2 and the external device, but the communication means is not limited to these.
  • The external device is an information processing apparatus such as a personal computer and includes a projection mapping video creation tool.
  • The projection mapping video creation tool is a tool for creating a projection mapping video in a three-dimensional space, a tool for creating a projection mapping video in a two-dimensional space, or the like. Since these tools already exist, a detailed description is omitted here.
  • The communication control unit 1 controls the operation of each unit of the projector and exchanges data and instruction signals (or control signals) with the information processing apparatus. For example, in accordance with an instruction from the information processing apparatus, the communication control unit 1 controls the projector data generation unit 6 and the mapping data generation unit 7 and causes them to generate the data necessary to create a projection mapping video (two-dimensional data or three-dimensional data of the three-dimensional object, or projector data).
  • The parameter storage unit 3 stores the parameters used in the mapping data creation processing necessary to create a projection mapping video (a view angle symmetrization selection parameter, a segmentation parameter, a polygon meshing parameter, a file format parameter, a mapping mode parameter, and the like).
  • The view angle symmetrization selection parameter is setting information indicating whether or not view angle symmetrization is performed.
  • The segmentation parameter is a threshold value for extracting each surface of the three-dimensional object from the three-dimensional position data (point cloud data) obtained by three-dimensionally measuring the shape of the surface of the three-dimensional object.
  • The polygon meshing parameter is a parameter (polygon shape, mesh coarseness, and the like) necessary for converting each extracted surface of the three-dimensional object into a polygon mesh.
  • The file format parameter designates the formats of the two-dimensional data file and the three-dimensional data file.
  • The mapping mode parameter designates one of three modes: mapping for two-dimensional space, mapping for three-dimensional space, and no mapping.
  • A video signal is input from the information processing apparatus to the projection unit 5 via the projection video input unit 4.
  • The projection unit 5 projects video based on the input video signal from the projection video input unit 4 and includes a video processing unit 12, a distortion correction unit 13, a projection lens unit 14, and a distortion correction coefficient calculation unit 15.
  • The video processing unit 12 performs processing such as converting the resolution of the input video signal to the resolution of the display device and adjusting image quality.
  • The distortion correction unit 13 applies, to the video signal processed by the video processing unit 12, a correction according to the distortion correction coefficient for the distortion (for example, trapezoidal distortion) of an image projected on a projection surface that does not directly face the projector.
  • The distortion correction coefficient calculation unit 15 calculates the distortion correction coefficient required by the distortion correction unit 13 based on distortion correction information set by the user or on the horizontal and vertical inclination information of the projection surface supplied from the view angle symmetrization unit 10.
  • The user can set the distortion correction information using an operation unit (not shown).
  • The projection lens unit 14 includes a display device that forms an image based on the video signal from the distortion correction unit 13 and a projection lens that projects the image formed by the display device.
  • The projection lens includes a lens that can move in the direction of the optical axis, a zoom mechanism that changes the angle of view according to the zoom position (the position of the lens on the optical axis), and a lens shift mechanism that shifts the projection lens as a whole in a direction orthogonal to the optical axis.
  • The projection lens unit 14 supplies zoom/shift position information indicating the zoom position and the lens shift position of the projection lens to the view angle symmetrization unit 10.
  • The display device can be regarded as a display element having an image forming surface composed of a plurality of pixels; an image based on the video signal from the distortion correction unit 13 is formed on this image forming surface.
  • A liquid crystal display element, a DMD (digital micromirror device), or the like can be used as the display element.
  • The projector projection design data storage unit 9 stores design data related to projection by the projector.
  • The design data is optical data for obtaining the angle of view and the projection center point from the zoom position and the lens shift position of the projection lens, that is, data for calculating how the angle of view and the projection center point change with the zoom position and the lens shift position. It is based on values determined by the optical design.
  • The attitude sensor unit 11 includes an attitude sensor capable of detecting rotation angles over a full 360° range, for example a triaxial acceleration sensor, and detects the inclination of the projector with respect to the horizontal plane.
  • The output of the attitude sensor unit 11 is supplied to the projector data generation unit 6 and the mapping data generation unit 7.
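For illustration, a triaxial acceleration sensor's gravity reading gives the tilt of a device with respect to the horizontal plane. A minimal sketch under assumed axis conventions, which the patent does not specify:

```python
import math

def tilt_from_accel(ax, ay, az):
    """Estimate pitch and roll (radians) from a static gravity reading
    of a triaxial accelerometer, assuming x right, y up, z forward."""
    pitch = math.atan2(az, math.sqrt(ax * ax + ay * ay))  # front-back tilt
    roll = math.atan2(ax, math.sqrt(ay * ay + az * az))   # left-right tilt
    return pitch, roll
```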
  • The view angle symmetrization unit 10 acquires the zoom/shift position information from the projection lens unit 14, the design data from the projector projection design data storage unit 9, and the view angle symmetrization selection parameter and the mapping mode parameter from the parameter storage unit 3.
  • In accordance with the mapping mode parameter and the view angle symmetrization selection parameter, the view angle symmetrization unit 10 calculates, based on the zoom/shift position information and the design data, a projection direction vector indicating the direction of the projection center axis of the projector, the angle of view, and the horizontal and vertical inclination angles of the projection surface with respect to a projection surface directly facing the projector.
  • The angle of view expresses the range of the projection light from the projection lens (the image projection range) as an angle.
  • The range of the projection light in the horizontal direction, expressed as an angle, is called the horizontal angle of view.
  • The range of the projection light in the vertical direction, expressed as an angle, is called the vertical angle of view.
  • The projection center axis is the central axis of the projection light from the projection lens (corresponding to the central ray) and can be used as the reference for the angle of view.
  • The angle of view, the projection center axis, and the projection center point change according to the zoom position and the lens shift position.
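As a numerical sketch of these definitions: in a symmetric configuration, the horizontal and vertical angles of view follow from the size of the image forming surface and its distance from the projection center point. All symbols here are assumptions for illustration, not values from the patent:

```python
import math

def angles_of_view(w, h, d):
    """Horizontal and vertical angles of view (degrees) for a symmetric
    setup: image plane of size w x h at distance d from the projection
    center point (illustrative geometry)."""
    fov_h = 2.0 * math.degrees(math.atan((w / 2.0) / d))
    fov_v = 2.0 * math.degrees(math.atan((h / 2.0) / d))
    return fov_h, fov_v
```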
  • FIG. 5A schematically shows the relationship among the angle of view, the projection optical axis, and the projection center point with launch projection.
  • FIG. 5B schematically shows the relationship among the angle of view, the projection optical axis, and the projection center point without launch.
  • FIG. 5C schematically shows the relationship among the angle of view, the projection optical axis, and the projection center point when the angle of view is enlarged.
  • FIG. 5D schematically shows the relationship among the angle of view, the projection center axis, and the projection center point when the lens is shifted upward.
  • The projection optical axis is the axis that passes through the center of the image forming surface and is perpendicular to the image forming surface.
  • FIGS. 5A to 5D correspond to cross sections in the vertical direction.
  • Launch projection is a projection form in which the image is projected above the projection optical axis so that, for example, the image from a projector placed on a table is projected above the height of the table surface.
  • Lens shift is a projection form in which the image is projected shifted vertically and horizontally with respect to the projection optical axis.
  • Launch projection is one form of lens shift.
  • The projection center point 102 is the point where the straight lines connecting each of the four corner points of the projection area 103 with the corresponding corner point of the image forming area of the display device 100 meet.
  • The projection area 103 is the image of the image forming area of the display device 100 inverted vertically and horizontally.
  • Let the four corner points of the image forming area of the display device 100 be points A, B, C, and D, and the four corner points of the projection area 103 be points a, b, c, and d.
  • Points a, b, c, and d correspond to points A, B, C, and D, respectively, and the arrangement of points a, b, c, and d is inverted vertically and horizontally with respect to the arrangement of points A, B, C, and D.
  • The projection center point 102 is shown as the intersection of the principal rays: the ray that exits point A and reaches point a via the lens, the ray that exits point B and reaches point b via the lens, the ray that exits point C and reaches point c via the lens, and the ray that exits point D and reaches point d via the lens.
  • Such an intersection of principal rays can be defined, for example, at the center of the aperture stop of the projection lens and can be calculated based on lens design data.
  • In the example with launch shown in FIG. 5A, the projection optical axis 109 passes through the center of the lower end of the projection area 103, and the projection center point 102 is located above the projection optical axis 109. In this case, the projection center axis does not coincide with the projection optical axis 109.
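In practice, several principal rays never intersect exactly, so the projection center point can be computed as the least-squares point closest to the four corner rays. A sketch, assuming each ray is given as an origin and a direction:

```python
import numpy as np

def nearest_point_to_rays(origins, directions):
    """Least-squares point closest to a set of 3D rays; a sketch of how
    the projection center point could be computed from the corner
    principal rays (origins, directions: Nx3 arrays)."""
    A = np.zeros((3, 3))
    b = np.zeros(3)
    for o, d in zip(origins, directions):
        d = d / np.linalg.norm(d)
        P = np.eye(3) - np.outer(d, d)  # projection onto plane orthogonal to the ray
        A += P
        b += P @ o
    return np.linalg.solve(A, b)
```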
  • In the example without launch shown in FIG. 5B, the projection optical axis 109 passes through the center of the projection area 103, and the projection center point 102 is located on the projection optical axis 109.
  • In this case, the projection center axis coincides with the projection optical axis 109.
  • FIG. 5C is an example in which the angle of view is enlarged compared with the example of FIG. 5B. As in FIG. 5B, the projection optical axis 109 passes through the center of the projection area 103 and the projection center point 102 is located on the projection optical axis 109, but the projection center point 102 is arranged closer to the display device 100 than in the example of FIG. 5B. In this case as well, the projection center axis coincides with the projection optical axis 109.
  • FIG. 5D is an example in which the lens is shifted so that the projection area 103 moves upward compared with the example of FIG. 5B.
  • In this example, the projection optical axis 109 passes through the center of the lower end of the projection area 103, and the projection center point 102 is located above the projection optical axis 109.
  • In this case, the projection center axis does not coincide with the projection optical axis 109.
  • As described above, the angle of view, the projection center axis, and the projection center point change according to the zoom position and the lens shift position; in other words, they must be determined according to the zoom position and the lens shift position.
  • The process of calculating the projection direction vector, the angle of view, and the tilt angle differs depending on whether view angle symmetrization is performed.
  • When view angle symmetrization is not performed, the view angle symmetrization unit 10 executes the following processes (A1) to (A3).
  • (A1) The left and right angles of view in the horizontal direction and the upper and lower angles of view in the vertical direction are obtained based on the zoom/shift position information (zoom position and lens shift position) and the design data.
  • (A2) Regarding the projection optical axis as the projection center axis, a vector whose horizontal and vertical components are both 0 is set as the projection direction vector.
  • (A3) Both the horizontal and vertical inclinations are set to zero.
  • When view angle symmetrization is performed, the view angle symmetrization unit 10 executes the following processes (B1) to (B3).
  • (B1) The left and right angles of view in the horizontal direction and the upper and lower angles of view in the vertical direction are obtained based on the zoom/shift position information (zoom position and lens shift position) and the design data.
  • (B2) A projection center axis and a projection plane are determined such that the projection plane is perpendicular to the projection center axis and, when distortion correction is performed so that the image projected on the projection plane becomes rectangular, the left and right horizontal angles of view with respect to the projection center axis are equal and the upper and lower vertical angles of view are equal. The projection direction vector is set in the direction of this projection center axis.
  • (B3) The horizontal and vertical inclinations, with respect to a plane perpendicular to the projection optical axis, of the projection plane perpendicular to the projection center axis determined in (B2) are obtained.
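As a one-axis sketch of (B2): if the upper and lower half-angles about the projection optical axis are θT and θB, an axis tilted by (θT - θB)/2 toward the larger side leaves equal half-angles of (θT + θB)/2 on both sides. This is a reading of the figure geometry, not a formula stated in the patent:

```python
def symmetrize_vertical(theta_t_deg, theta_b_deg):
    """Tilt of the projection center axis and the resulting symmetric
    half-angle, for asymmetric upper/lower half-angles (degrees)."""
    tilt = (theta_t_deg - theta_b_deg) / 2.0  # rotate axis toward the larger side
    half = (theta_t_deg + theta_b_deg) / 2.0  # equal half-angle after symmetrization
    return tilt, half
```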
  • The view angle symmetrization unit 10 supplies the projection direction vector and the calculated angle of view to the projector data generation unit 6, and supplies the calculated tilt angle to the distortion correction coefficient calculation unit 15. Note that when the mapping mode parameter specifies the no-mapping mode, the view angle symmetrization unit 10 does not calculate the projection direction vector, the angle of view, or the tilt.
  • The projector data generation unit 6 generates a projector data file for setting up a rendering camera and a projector in a virtual three-dimensional space. This projector data file is used by a tool on the information processing apparatus side that creates a projection mapping video in a three-dimensional space.
  • The projector data generation unit 6 includes an initial projector data generation unit 16, a projector data world coordinate conversion unit 17, a projector data vertical offset unit 18, and a projector data file generation unit 19.
  • The initial projector data generation unit 16 sets the projector position coordinates to the origin of the projector coordinate system, that is, the projection center point of the projector (the projection center point 102 shown in FIGS. 5A to 5D), namely the coordinates (0, 0, 0).
  • The initial projector data generation unit 16 generates projector data including the projector position coordinates, the projection direction vector from the view angle symmetrization unit 10, and the angle of view (the upper, lower, left, and right angles of view).
  • The projector data world coordinate conversion unit 17 converts the projection direction vector included in the projector data generated by the initial projector data generation unit 16 into a world coordinate system referenced to the horizontal plane, based on the inclination of the projector with respect to the horizontal plane detected by the attitude sensor unit 11.
  • The projector data vertical offset unit 18 changes the vertical coordinate of the projector position included in the projector data from the projector data world coordinate conversion unit 17, matching the vertical-coordinate adjustment applied to the three-dimensional position data by the mapping data generation unit 7. For this coordinate change, the vertical offset amount is supplied from the mapping data generation unit 7 to the projector data vertical offset unit 18.
  • The projector data file generation unit 19 generates a projector data file 31 based on the projector data from the projector data vertical offset unit 18.
  • The projector data file 31 is stored in the file storage unit 8.
  • The mapping data generation unit 7 generates a two-dimensional data file 32 indicating the outer shape position of each surface of the three-dimensional object, or a three-dimensional data file 33 of the three-dimensional object.
  • The two-dimensional data file 32 is used by a tool that creates a projection mapping video in a two-dimensional space.
  • The three-dimensional data file 33 is used by a tool that creates a projection mapping video in a three-dimensional space.
  • The two-dimensional data file 32 and the three-dimensional data file 33 are stored in the file storage unit 8.
  • The mapping data generation unit 7 includes a three-dimensional sensor unit 20, a calibration data storage unit 21, a three-dimensional position data projector coordinate conversion unit 22, a three-dimensional position data world coordinate conversion unit 23, a vertical offset calculation unit 24, a three-dimensional position data vertical offset unit 25, a three-dimensional position data segmentation unit 26, a perspective projection unit 27, a three-dimensional position data polygon meshing unit 29, a two-dimensional data file generation unit 28, and a three-dimensional data file generation unit 30.
  • The three-dimensional sensor unit 20 includes a three-dimensional sensor that is oriented in the optical axis direction of the projection lens and three-dimensionally measures each surface of the three-dimensional object that is the projection target.
  • The detection range of the three-dimensional sensor includes the entire projectable area onto which the image from the projection lens can be projected.
  • FIG. 6 is a schematic diagram for explaining the relative positional relationship between the detection range of the three-dimensional sensor and the projectable area.
  • The three-dimensional sensor 108 is oriented in the optical axis direction of the projection lens 101a.
  • The projection area can be enlarged or reduced using the zoom function and can be moved up, down, left, and right using the lens shift function.
  • The projection area 113 is the projection area with the lens shifted in the upper right direction and the zoom minimized.
  • The projection area 114 is the projection area with the lens shifted in the upper right direction and the zoom maximized.
  • The projection area 115 is the projection area with the lens shifted in the upper left direction and the zoom minimized.
  • The projection area 116 is the projection area with the lens shifted in the upper left direction and the zoom maximized.
  • The projection area 117 is the projection area with the lens shifted in the lower right direction and the zoom minimized.
  • The projection area 118 is the projection area with the lens shifted in the lower right direction and the zoom maximized.
  • The projection area 119 is the projection area with the lens shifted in the lower left direction and the zoom minimized.
  • The projection area 120 is the projection area with the lens shifted in the lower left direction and the zoom maximized.
  • The detection range of the three-dimensional sensor 108 includes the entire projectable area onto which the image from the projection lens 101a can be projected, that is, the whole of the projection areas 113 to 120. The three-dimensional sensor 108 can therefore measure the three-dimensional position of a three-dimensional object placed anywhere in the projectable area.
  • The three-dimensional sensor unit 20 supplies the three-dimensional position data output by the three-dimensional sensor 108 to the three-dimensional position data projector coordinate conversion unit 22.
  • As the three-dimensional sensor 108, for example, a TOF (Time-of-Flight) sensor or a triangulation sensor can be used, but the sensor is not limited to these methods.
  • The TOF method performs three-dimensional measurement by projecting light toward an object and measuring the time until the projected light is reflected by the object and returns.
  • Triangulation methods include the passive triangulation method and the active triangulation method.
  • In the passive triangulation method, an object is photographed simultaneously by two cameras arranged side by side, and measurement is performed using the principle of triangulation based on the difference in the object's position between the images captured by the two cameras; this is also called the stereo camera method.
  • The active triangulation method irradiates an object with light and performs three-dimensional measurement using the principle of triangulation based on information about the light reflected from the object.
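Both measurement principles reduce to short formulas. A sketch with assumed symbols (c: speed of light, t: round-trip time, f: focal length in pixels, b: stereo baseline, d: disparity in pixels):

```python
def tof_depth(round_trip_time_s, c=299_792_458.0):
    """TOF: the light travels to the object and back, so halve the path."""
    return c * round_trip_time_s / 2.0

def stereo_depth(focal_px, baseline_m, disparity_px):
    """Passive triangulation (stereo camera): depth from disparity."""
    return focal_px * baseline_m / disparity_px
```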
  • The calibration data storage unit 21 stores calibration data.
  • The calibration data includes the parameters (rotation amount and translation amount) for converting the coordinate system of the three-dimensional sensor 108 to the coordinate system of the projector, a reference zoom position, and a reference lens shift position.
  • The rotation amount and the translation amount can be obtained by performing a calibration that measures the positional relationship between the coordinate system of the three-dimensional sensor and the coordinate system of the projector.
  • The reference zoom position and the reference lens shift position are the zoom position and the lens shift position at the time the calibration was performed.
  • The three-dimensional position data projector coordinate conversion unit 22 acquires the calibration data (rotation amount, translation amount, reference zoom position, and reference lens shift position) from the calibration data storage unit 21 and the design data from the projector projection design data storage unit 9.
  • Based on the calibration data and the design data, the three-dimensional position data projector coordinate conversion unit 22 converts the three-dimensional position data of the three-dimensional object from the three-dimensional sensor into three-dimensional position data in the projector coordinate system whose origin is the projection center point.
  • The calibration data and the design data can together be regarded as coordinate conversion data for converting the coordinate system of the three-dimensional sensor into the projector coordinate system whose origin is the projection center point.
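Applying this coordinate conversion data can be sketched as a rigid transform followed by a shift of the origin from the reference projection center point to the current one; the names and conventions below are illustrative:

```python
import numpy as np

def sensor_to_projector(points, R, t, ref_center, cur_center):
    """Convert Nx3 sensor-frame points into the projector frame.
    R, t: calibration rotation/translation (sensor to projector at the
    reference zoom/shift); ref_center, cur_center: projection center
    points, both expressed in the reference projector frame."""
    p = points @ R.T + t                  # sensor -> projector (reference origin)
    return p - (cur_center - ref_center)  # re-origin at the current projection center
```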
  • The three-dimensional position data world coordinate conversion unit 23 acquires the inclination of the projector with respect to the horizontal plane from the attitude sensor unit 11 and the three-dimensional position data coordinate-converted into the projector coordinate system from the three-dimensional position data projector coordinate conversion unit 22.
  • Based on the inclination of the projector with respect to the horizontal plane, the three-dimensional position data world coordinate conversion unit 23 converts the three-dimensional position data from the projector coordinate system into a world coordinate system referenced to the horizontal plane.
  • The three-dimensional position data converted into the world coordinate system is supplied to the vertical offset calculation unit 24 and the three-dimensional position data vertical offset unit 25.
  • The vertical offset calculation unit 24 obtains the minimum value of the vertical coordinates of the three-dimensional position data converted into the world coordinate system by the three-dimensional position data world coordinate conversion unit 23. When the minimum vertical coordinate is negative, the vertical offset calculation unit 24 outputs a vertical offset amount equal to the absolute value of that minimum; when the minimum vertical coordinate is positive, it outputs a vertical offset amount of 0. The output of the vertical offset calculation unit 24 is supplied to the projector data vertical offset unit 18 and the three-dimensional position data vertical offset unit 25.
  • Based on the vertical offset amount calculated by the vertical offset calculation unit 24, the three-dimensional position data vertical offset unit 25 offsets, in the vertical direction, the three-dimensional position data converted into the world coordinate system by the three-dimensional position data world coordinate conversion unit 23. Specifically, the vertical offset amount is added to the vertical coordinate of the three-dimensional position data.
  • The vertically offset three-dimensional position data is supplied to the three-dimensional position data segmentation unit 26, the three-dimensional position data polygon meshing unit 29, and the three-dimensional data file generation unit 30.
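The vertical offset logic is compact enough to state directly. A sketch, assuming an Nx3 point array whose second column is the vertical world coordinate:

```python
import numpy as np

def apply_vertical_offset(points_world):
    """Lift the point cloud so its lowest point sits at height 0, as done
    by the vertical offset units (column 1 assumed vertical)."""
    y_min = points_world[:, 1].min()
    offset = -y_min if y_min < 0 else 0.0  # abs(min) when negative, else 0
    shifted = points_world.copy()
    shifted[:, 1] += offset
    return shifted, offset                 # offset also shifts the projector position
```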
  • The three-dimensional position data segmentation unit 26 acquires the segmentation parameter from the parameter storage unit 3.
  • Based on the segmentation parameter, the three-dimensional position data segmentation unit 26 detects, from the three-dimensional position data supplied by the three-dimensional position data vertical offset unit 25, each surface of the three-dimensional object and the ridge lines indicating its shape.
  • The detection results for the surfaces and ridge lines are supplied to the perspective projection unit 27 and the three-dimensional position data polygon meshing unit 29.
  • The perspective projection unit 27 acquires the projector data from the projector data vertical offset unit 18.
  • Using the projector data, the perspective projection unit 27 perspectively projects the ridge lines detected by the three-dimensional position data segmentation unit 26 onto the projection surface, thereby generating two-dimensional data indicating the outer shape position of each surface of the three-dimensional object in the projection video.
  • The two-dimensional data is supplied to the two-dimensional data file generation unit 28.
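The perspective projection of ridge-line points from the projector coordinate system into projector image coordinates can be sketched as a standard pinhole projection. Here fx, fy, cx, cy are assumed to derive from the projector data (angle of view and resolution), which the patent does not spell out:

```python
import numpy as np

def project_points(points, fx, fy, cx, cy):
    """Pinhole perspective projection of Nx3 projector-frame points
    (z forward) to 2D pixel coordinates; a sketch of the perspective
    projection unit's mapping, not the patent's exact formulation."""
    x, y, z = points[:, 0], points[:, 1], points[:, 2]
    u = fx * x / z + cx
    v = fy * y / z + cy
    return np.stack([u, v], axis=1)
```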
  • The two-dimensional data file generation unit 28 acquires the file format parameter from the parameter storage unit 3.
  • The two-dimensional data file generation unit 28 generates the two-dimensional data file 32 from the two-dimensional data created by the perspective projection unit 27, in the file format indicated by the file format parameter.
  • The two-dimensional data file 32 is stored in the file storage unit 8.
  • The three-dimensional position data polygon meshing unit 29 acquires the polygon meshing parameter and the file format parameter from the parameter storage unit 3.
  • The three-dimensional position data polygon meshing unit 29 converts the surfaces detected by the three-dimensional position data segmentation unit 26 into polygon meshes based on the polygon meshing parameter and the file format indicated by the file format parameter, as sketched below.
  • The polygon-meshed surface data is supplied to the three-dimensional data file generation unit 30.
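Polygon meshing of a detected, roughly planar surface can be sketched as a simple grid triangulation; a real implementation would use a meshing library driven by the polygon meshing parameter, so the following is only a toy illustration:

```python
import numpy as np

def grid_triangles(rows, cols):
    """Triangle index list for a rows x cols grid of surface sample
    points: two triangles per grid cell (a stand-in for meshing
    parameterized by polygon shape and coarseness)."""
    tris = []
    for r in range(rows - 1):
        for c in range(cols - 1):
            i = r * cols + c
            tris.append([i, i + 1, i + cols])             # upper-left triangle
            tris.append([i + 1, i + cols + 1, i + cols])  # lower-right triangle
    return np.array(tris)
```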
  • The three-dimensional data file generation unit 30 acquires the file format parameter from the parameter storage unit 3.
  • The three-dimensional data file generation unit 30 generates the three-dimensional data file 33 from the surface data polygon-meshed by the three-dimensional position data polygon meshing unit 29, in the file format indicated by the file format parameter.
  • The three-dimensional data file 33 is stored in the file storage unit 8.
  • FIG. 7 shows an example of the projection mapping system.
  • The projection mapping system includes a projector 201 and a video processing device 205 such as a personal computer.
  • The projector 201 and the video processing device 205 are connected via a communication unit 207 so that they can communicate with each other.
  • The communication unit 207 may consist of a communication cable and a video signal cable.
  • The video signal cable is used by the video processing device 205, described later, to supply a projection mapping video for three-dimensional space to the projector 201.
  • The communication cable is used by the projector 201, described later, to supply data such as the projector data file 31, the two-dimensional data file 32, and the three-dimensional data file 33 to the video processing device 205. The communication cable and the video signal cable can also be combined into a single cable.
  • The communication unit 207 may instead include a wireless communication unit.
  • The projector 201 has the configuration described with reference to FIGS. 4 to 6 and projects video based on the video signal from the video processing device 205 onto each surface of the three-dimensional object 104.
  • The video processing device 205 includes at least one of a projection mapping video creation tool for two-dimensional space and a projection mapping video creation tool for three-dimensional space.
  • Projection mapping is performed according to the following procedure.
  • The projector 201 executes the mapping data creation process in response to a data creation instruction (data creation request) from the video processing device 205.
  • In this mapping data creation process, data for a two-dimensional-space projection mapping video creation tool or data for a three-dimensional-space projection mapping video creation tool is created.
  • In this example, the video processing device 205 is equipped with a projection mapping video creation tool for three-dimensional space, and the projector 201 creates data for that tool.
  • This data includes the three-dimensional data of the three-dimensional object and the projector data for setting up a rendering camera and a projector in a virtual three-dimensional space.
  • The video processing device 205 acquires the data generated by the projector 201 for the three-dimensional-space projection mapping video creation tool, that is, the three-dimensional data file 33 and the projector data file 31. The tool then creates a three-dimensional-space projection mapping video using the three-dimensional data file 33 and the projector data file 31 acquired from the projector 201. The video processing device 205 supplies this projection mapping video to the projector 201, and the projector 201 projects an image based on it onto each surface of the three-dimensional object 104.
  • When the video processing device 205 is instead equipped with a projection mapping video creation tool for two-dimensional space, the projector 201 creates data for that tool, namely the two-dimensional data file 32, instead of the three-dimensional data file 33.
  • The two-dimensional data file 32 contains two-dimensional data indicating the outer shape position of each surface of the three-dimensional object in the projection video and is supplied from the projector 201 to the video processing device 205.
  • The two-dimensional-space projection mapping video creation tool uses the two-dimensional data file 32 to create a two-dimensional-space projection mapping video.
  • FIG. 8 shows the procedure of the mapping data creation process.
  • First, the operator performs an input operation (data creation request) on the video processing device 205 to set the parameters necessary for creating a projection mapping video.
  • A parameter setting start instruction and parameter setting screen information are supplied from the video processing device 205 to the projector 201.
  • The parameter setting screen information is supplied to the projection unit 5, and the communication control unit 1 causes the projection unit 5 to project the parameter setting screen in accordance with the parameter setting start instruction.
  • The operator inputs the necessary parameter information while referring to the parameter setting screen.
  • The parameters include those necessary for the mapping data creation process (the view angle symmetrization selection parameter, segmentation parameter, polygon meshing parameter, file format parameter, mapping mode parameter, and the like).
  • The input parameter information is supplied from the video processing device 205 to the projector 201.
  • The communication control unit 1 stores the parameters input by the operator in the parameter storage unit 3.
  • Next, the view angle symmetrization unit 10 acquires the zoom/shift position information from the projection lens unit 14, the design data from the projector projection design data storage unit 9, and the view angle symmetrization selection parameter and the mapping mode parameter from the parameter storage unit 3.
  • In accordance with the mapping mode parameter and the view angle symmetrization selection parameter, the view angle symmetrization unit 10 calculates, based on the zoom/shift position information and the design data, a projection direction vector indicating the direction of the projection center axis of the projector, the angle of view, and the horizontal and vertical inclination angles of the projection surface with respect to a projection surface directly facing the projector.
  • The calculation of the angle of view, the projection direction, and the tilt by the view angle symmetrization unit 10 consists of the following first and second processes.
  • (First process) When the angle of view is not to be symmetrized vertically and horizontally, that is, when the mapping mode parameter is set to two-dimensional space mapping, or when the mapping mode parameter is set to three-dimensional space mapping and the view angle symmetrization selection parameter is set to "no view angle symmetrization", the view angle symmetrization unit 10 executes the first process.
  • In the first process, the view angle symmetrization unit 10 obtains the angle of view from the zoom position and lens shift position of the projection lens unit 14 and from the projection-related design data stored in the projector projection design data storage unit 9 (the projector's angle of view, zoom characteristics, lens shift characteristics, and the like).
  • The view angle symmetrization unit 10 sets the projection direction vector to the direction of the projection optical axis, that is, a vector whose horizontal and vertical components are each 0.
  • Since the projection plane is not changed, the view angle symmetrization unit 10 sets both the horizontal inclination and the vertical inclination to zero.
  • (Second process) When the angle of view is to be symmetrized (three-dimensional space mapping with view angle symmetrization selected), the view angle symmetrization unit 10 executes the second process.
  • In the second process, the view angle symmetrization unit 10 likewise obtains the angle of view from the zoom position and lens shift position of the projection lens unit 14 and from the projection-related design data stored in the projector projection design data storage unit 9 (the projector's angle of view, zoom characteristics, lens shift characteristics, and the like).
  • The view angle symmetrization unit 10 then determines a projection plane perpendicular to the projection center axis such that, when distortion correction is performed so that the image projected on the projection plane becomes rectangular, the left and right horizontal angles of view with respect to the projection center axis are equal and the upper and lower vertical angles of view are equal, and it sets the projection direction vector in the direction of the projection center axis.
  • The view angle symmetrization unit 10 obtains the horizontal and vertical inclinations of this projection plane perpendicular to the projection center axis with respect to the plane perpendicular to the projection optical axis.
  • The projection unit 5 performs distortion correction so that the projected image on the projection plane becomes rectangular. However, since the projected video area after this distortion correction is smaller than before the correction, the correction also affects the angles of view. It is therefore desirable to determine the projection center axis and the inclination of the projection plane, relative to the projection surface obtained when the projector and the screen directly face each other, while taking into account the change in the projected video area after distortion correction, so that the left and right horizontal angles of view are equal and the upper and lower vertical angles of view are equal.
  • Normally, the distortion correction coefficient calculation unit 15 calculates a distortion correction coefficient according to the user's distortion correction settings, and the distortion correction unit 13 corrects the distortion that occurs when projecting on a screen that does not directly face the projector. However, when two-dimensional space mapping or three-dimensional space mapping is set in the mapping mode parameter, the distortion correction coefficient calculation unit 15 calculates the distortion correction coefficient not from the user's distortion correction settings but from the inclination of the projection surface supplied by the view angle symmetrization unit 10, and the distortion correction unit 13 corrects the distortion based on that coefficient.
  • The processing in the view angle symmetrization unit 10 and the distortion correction processing in the projection unit 5 are executed every time the user adjusts the zoom position or the lens shift position of the projection lens unit 14. This eliminates the problem of deviation between the projected video and the three-dimensional object when using a tool for creating a projection mapping video in a three-dimensional space in which the angle of view can only be set to be vertically and horizontally symmetrical.
  • FIG. 9A shows a projection state in which, in launch projection, the projection optical axis passes through the center of the lower side of the projection area, and FIG. 9B shows the projected video area before and after distortion correction.
  • In this projection state, the projection center axis 110 does not coincide with the projection optical axis 109.
  • With respect to the projection optical axis 109, the upper angle of view is θT and the lower angle of view is θB.
  • With respect to the projection center axis 110, the upper angle of view is θ′T and the lower angle of view is θ′B.
  • The projection center axis 110 and the projection plane perpendicular to it are obtained, and the inclination of that projection plane with respect to the projection surface obtained when the projector 201 and the screen directly face each other is determined.
  • The view angle symmetrization unit 10 supplies the inclination of the projection surface to the distortion correction coefficient calculation unit 15; the distortion correction coefficient calculation unit 15 calculates a distortion correction coefficient based on that inclination, and the distortion correction unit 13 performs distortion correction based on the coefficient so that the projected image on the projection plane becomes rectangular. With this distortion correction, the projected video area 122 is corrected to the projected video area 123 as shown in FIG. 9B, and as a result the projection area 103 is corrected to the projection area 121 as shown in FIG. 9A.
  • FIG. 10A shows the projection state when the lens is shifted in the upper left direction, and FIG. 10B shows the projected video area before and after distortion correction.
  • With respect to the projection optical axis, the upper angle of view is θ′T and the lower angle of view is θ′B, and the right angle of view is θ′R and the left angle of view is θ′L. Since θ′T ≠ θ′B, the vertical angle of view is not vertically symmetrical; since θ′R ≠ θ′L, the horizontal angle of view is not horizontally symmetrical.
  • Here as well, the view angle symmetrization unit 10 supplies the inclination of the projection surface to the distortion correction coefficient calculation unit 15; the distortion correction coefficient calculation unit 15 calculates a distortion correction coefficient based on that inclination, and the distortion correction unit 13 performs distortion correction based on the coefficient so that the projected image on the projection plane becomes rectangular. With this distortion correction, the projected video area 122 is corrected to the projected video area 123 as shown in FIG. 10B, and as a result the projection area 103 is corrected to the projection area 121 as shown in FIG. 10A.
  • In step S12, the communication control unit 1 of the projector 201 determines whether a mapping data generation start instruction has been received. After adjusting the position, projection direction, zoom position, and lens shift position of the projector 201 so that the projected image falls on the three-dimensional object that is the projection target, the user performs an input operation on the video processing device 205 instructing it to start mapping data generation. In response to this input operation, a mapping data generation start instruction is supplied from the video processing device 205 to the projector 201. When the mapping data generation start instruction is received, the communication control unit 1 determines in steps S13 and S14 whether the mapping mode parameter is set to the two-dimensional space mapping mode or the three-dimensional space mapping mode.
  • If the mapping mode parameter is set to the two-dimensional space mapping mode, the communication control unit 1 causes the mapping data generation unit 7 to generate the two-dimensional data file 32 in step S15.
  • If the mapping mode parameter is set to the three-dimensional space mapping mode, the communication control unit 1 causes the mapping data generation unit 7 to generate the three-dimensional data file 33 and causes the projector data generation unit 6 to generate the projector data file 31. If the no-mapping mode is set in the mapping mode parameter, the mapping data creation process ends without generating any data file.
  • In accordance with the mapping data generation start instruction from the communication control unit 1, the mapping data generation unit 7 executes the process of generating the three-dimensional data file 33, and the projector data generation unit 6 executes the process of generating the projector data file 31.
  • FIG. 11 shows the procedure for generating the three-dimensional data file 33.
  • First, the three-dimensional sensor unit 20 measures the three-dimensional object three-dimensionally and outputs three-dimensional position data. This three-dimensional position data is point cloud data expressed in three-dimensional coordinates in the coordinate system of the three-dimensional sensor 108.
  • In step S21, the three-dimensional position data projector coordinate conversion unit 22 acquires the rotation amount, the translation amount, the reference zoom position, and the reference lens shift position from the calibration data storage unit 21, the current zoom position and lens shift position from the projection lens unit 14, and the design data from the projector projection design data storage unit 9.
  • Based on the rotation amount, the translation amount, the reference zoom position, the reference lens shift position, the current zoom position and lens shift position, and the design data, the three-dimensional position data projector coordinate conversion unit 22 converts the coordinate system of the three-dimensional position data from the three-dimensional sensor unit 20 into the coordinate system of the projector 201 whose origin is the projection center point.
  • FIG. 12 schematically shows the positional relationship between the coordinate system of the three-dimensional sensor 108 and the coordinate system of the projector 201.
  • the origin of the coordinate system of the three-dimensional sensor 108 does not coincide with the origin (projection center point) of the coordinate system of the projector 201, and the detection direction of the three-dimensional sensor 108 does not coincide with the projection direction of the projector 201. For this reason, the coordinate system of the three-dimensional position data measured by the three-dimensional sensor 108 differs from the coordinate system of the projector 201.
  • the coordinate transformation in step S21 can be defined by rotation amounts about the three coordinate axes (X, Y, Z) and translation amounts along those axes. Obtaining the rotation amount and the translation amount is called calibration.
  • the projection center point that is the origin of the projector coordinate system is not a fixed position; it moves with zoom and lens shift operations. The zoom position and lens shift position at the time of calibration are therefore stored in the calibration data storage unit 21 as the reference zoom position and the reference lens shift position, respectively.
  • the three-dimensional position data projector coordinate conversion unit 22 first obtains the coordinates of the reference projection center point at the time of calibration, using the design data, the reference zoom position, and the reference lens shift position. Next, it obtains the coordinates of the current projection center point based on the current zoom position and lens shift position, and obtains the translation amount for the coordinate transformation from the coordinates of the reference projection center point to the coordinates of the current projection center point.
  • the three-dimensional position data projector coordinate conversion unit 22 then converts the three-dimensional position data from the three-dimensional sensor unit 20 based on the rotation amount and the translation amount stored in the calibration data storage unit 21, and further translates the coordinates from the reference projection center point to the current projection center point based on the current zoom position and lens shift position. As a result, the coordinate system of the three-dimensional position data from the three-dimensional sensor unit 20 is converted into a projector coordinate system with the current projection center point as the origin, as sketched in the example below.
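  Conceptually, this conversion is a rigid transform (the calibrated rotation and translation) followed by a re-centering of the origin on the current projection center point. The following Python sketch illustrates the idea; the function name, the NumPy representation, and the argument layout are illustrative assumptions, not part of the patent.

```python
import numpy as np

def to_projector_coords(points, rotation, translation, ref_center, cur_center):
    """Convert a sensor-space point cloud into projector coordinates.

    points:      (N, 3) point cloud from the 3D sensor
    rotation:    3x3 rotation matrix obtained by calibration
    translation: 3-vector obtained by calibration
    ref_center:  projection center point at the reference zoom/lens shift
    cur_center:  projection center point at the current zoom/lens shift
    """
    pts = np.asarray(points, dtype=float)
    # Rigid transform into the projector coordinate system used at calibration.
    in_projector = pts @ np.asarray(rotation).T + np.asarray(translation)
    # The origin moved from the reference projection center point to the
    # current one because zoom and lens shift changed; shift the coordinates
    # so that the current projection center point becomes the origin.
    return in_projector - (np.asarray(cur_center) - np.asarray(ref_center))
```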
  • In step S22, the three-dimensional position data world coordinate conversion unit 23 converts the three-dimensional position data, which has been coordinate-converted into the projector coordinate system, into a world coordinate system referenced to the horizontal plane, based on the inclination of the projector with respect to the horizontal plane detected by the attitude sensor unit 11.
  • In some cases the projector 201 is installed tilted in the front-rear or left-right direction, or installed upside down. As shown in FIG. 13, the installation state of the projector 201 includes various states such as a horizontal state 124, an upside down state 125, an upward state 126, a downward state 127, a right rotation state 128, and a left rotation state 129. The three-dimensional position data world coordinate conversion unit 23 therefore performs the conversion into the horizontal-plane-referenced world coordinate system shown in FIG. 12, based on the inclination detected by the attitude sensor unit 11, as sketched below.
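  A minimal sketch of this leveling step, assuming the attitude sensor reports the tilt as pitch (front-rear) and roll (left-right) angles; the axis conventions and function name are illustrative assumptions.

```python
import numpy as np

def to_world_coords(points, pitch, roll):
    """Rotate projector-coordinate points into a horizontal-plane-referenced
    world coordinate system, undoing the projector tilt reported by the
    attitude sensor (pitch and roll in radians)."""
    cp, sp = np.cos(pitch), np.sin(pitch)
    cr, sr = np.cos(roll), np.sin(roll)
    Rx = np.array([[1, 0, 0], [0, cp, -sp], [0, sp, cp]])  # pitch about X
    Rz = np.array([[cr, -sr, 0], [sr, cr, 0], [0, 0, 1]])  # roll about Z
    R_tilt = Rz @ Rx  # the projector's tilt expressed as one rotation matrix
    # Row-vector form: p @ R_tilt applies R_tilt.T to each point, i.e. the
    # inverse tilt, which rotates the tilted data back to horizontal.
    return np.asarray(points, dtype=float) @ R_tilt
```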
  • the projection mapping video creation tool for three-dimensional space can create not only planar video but also three-dimensional projection mapping video that makes the video projected on the surfaces of a three-dimensional object appear stereoscopic.
  • In that case, a viewpoint is set at a three-dimensional position, and the video for the surfaces of the three-dimensional object is created so that it appears stereoscopic when viewed from that viewpoint.
  • Typically the viewpoint is the position of the eyes of a person standing on the ground, so the data must be converted into a world coordinate system referenced to a horizontal plane parallel to the ground before the viewpoint can be set.
  • In step S23, the vertical offset amount calculation unit 24 calculates a vertical offset amount such that all vertical coordinates of the three-dimensional position data become 0 or more, and the three-dimensional position data vertical offset unit 25 adds the vertical offset to all vertical coordinates of the three-dimensional position data (see the sketch below). Specifically, the vertical offset amount calculation unit 24 obtains the minimum value of the vertical coordinates of the three-dimensional position data; if it is negative, it outputs the absolute value of that minimum as the vertical offset amount, and if it is positive, it outputs 0 as the vertical offset amount. In step S24, the three-dimensional position data segmentation unit 26 detects, from the three-dimensional position data, the surfaces of the three-dimensional object to be projected and the ridge lines indicating their outer shapes, based on the segmentation parameters stored in the parameter storage unit 3.
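  The vertical offset computation reduces to a one-line rule. A minimal sketch, assuming the vertical axis is the second (Y) column of an (N, 3) NumPy array:

```python
import numpy as np

def apply_vertical_offset(points):
    """Shift the point cloud so every vertical (Y) coordinate is >= 0, and
    return the applied offset, which is later added to the projector
    position coordinate as well (step S32)."""
    pts = np.asarray(points, dtype=float)
    y_min = pts[:, 1].min()
    offset = -y_min if y_min < 0 else 0.0  # |minimum| if negative, else 0
    pts[:, 1] += offset
    return pts, offset
```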
  • FIG. 14 shows an example of the segmentation process.
  • the three-dimensional position data segmentation unit 26 first obtains a normal vector 132 for each point of the point cloud data 130 of the three-dimensional position data.
  • the normal vector 132 is a composite vector of the normal vectors 131 of the triangular surfaces formed by the point of interest and its neighboring points.
  • the three-dimensional position data segmentation unit 26 then compares the normal vectors of adjacent points; if the difference is smaller than the threshold set by the segmentation parameter, the points are treated as belonging to the same surface. Thereby, the surfaces 133 and 134 can be extracted.
  • finally, the three-dimensional position data segmentation unit 26 calculates the ridge line 135, which is the intersection line between the adjacent surfaces 133 and 134.
  • here the segmentation parameter is a threshold for the difference between normal vectors, but its value differs depending on the segmentation method; a minimal sketch of this normal-comparison segmentation follows.
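  The sketch below grows regions over a precomputed point adjacency, merging neighbors whose unit normals differ by less than the threshold. The region-growing strategy, the angle-threshold form, and the neighbor structure are illustrative assumptions.

```python
import numpy as np

def segment_by_normals(normals, neighbors, angle_threshold_deg):
    """Label points so that neighbors with nearly parallel unit normals get
    the same surface label (normal-difference segmentation)."""
    cos_thresh = np.cos(np.radians(angle_threshold_deg))
    labels = -np.ones(len(normals), dtype=int)
    current = 0
    for seed in range(len(normals)):
        if labels[seed] != -1:
            continue
        labels[seed] = current
        stack = [seed]
        while stack:
            i = stack.pop()
            for j in neighbors[i]:  # precomputed adjacency, e.g. k-nearest
                if labels[j] == -1 and normals[i] @ normals[j] > cos_thresh:
                    labels[j] = current  # same surface: normals nearly parallel
                    stack.append(j)
        current += 1
    return labels
```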
  • next, the three-dimensional position data polygon meshing unit 29 converts the surfaces detected by the three-dimensional position data segmentation unit 26 into polygon meshes, based on the polygon meshing parameters and the file format parameters stored in the parameter storage unit 3.
  • the polygons that can be handled differ depending on the format of the three-dimensional data file to be generated; for example, some formats can handle only triangular polygons, some can handle only triangular and quadrangular polygons, and some can handle arbitrary polygons.
  • the three-dimensional position data polygon meshing unit 29 therefore performs the meshing using the file format parameter that specifies the polygon shape and the polygon meshing parameter that specifies the polygon coarseness. Finer polygons express curved surfaces more smoothly, but increase the amount of computation in the projection mapping video creation tool.
  • FIG. 15A shows an example in which a polygon mesh is formed using both triangular and quadrangular polygons, and FIG. 15B shows an example in which a polygon mesh is formed using only triangular polygons.
  • compared with the virtual three-dimensional object 302, the virtual three-dimensional object 303, whose polygon mesh is formed only of triangular polygons, has finer polygons; a sketch of such a triangles-only conversion follows.
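  For a format that accepts only triangular polygons, quadrangular or larger faces must be split. A minimal sketch using fan triangulation, which is valid for convex faces; the face representation (lists of vertex indices) is an illustrative assumption:

```python
def to_triangles(faces):
    """Fan-triangulate polygon faces given as lists of vertex indices."""
    tris = []
    for face in faces:
        for k in range(1, len(face) - 1):
            tris.append((face[0], face[k], face[k + 1]))  # fan around vertex 0
    return tris
```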
  • the three-dimensional data file generation unit 30 then generates a three-dimensional data file 33 in the designated file format, based on the file format parameter, from either the three-dimensional position data from the three-dimensional position data vertical offset unit 25 or the polygon-meshed three-dimensional position data from the three-dimensional position data polygon meshing unit 29.
  • depending on the file format, the three-dimensional data file generation unit 30 also calculates information such as the normal vector of each polygon vertex and the normal vector of each mesh surface, and generates a three-dimensional data file 33 that includes this information.
  • when the three-dimensional data file 33 is generated using the three-dimensional position data from the three-dimensional position data vertical offset unit 25, the file contains the raw measured three-dimensional positions in the world coordinate system. A sketch of the normal computation mentioned above follows.
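  Face and vertex normals can be derived from the mesh itself: a face normal is the normalized cross product of two edge vectors, and a vertex normal averages the normals of the faces sharing that vertex. A minimal sketch under those standard definitions:

```python
import numpy as np

def face_and_vertex_normals(vertices, triangles):
    """Per-face normals via cross products; per-vertex normals by averaging
    the normals of the faces that share each vertex."""
    v = np.asarray(vertices, dtype=float)
    face_n = np.zeros((len(triangles), 3))
    vert_n = np.zeros_like(v)
    for f, (a, b, c) in enumerate(triangles):
        n = np.cross(v[b] - v[a], v[c] - v[a])
        length = np.linalg.norm(n)
        n = n / length if length > 0 else n  # guard degenerate faces
        face_n[f] = n
        for i in (a, b, c):
            vert_n[i] += n
    lengths = np.linalg.norm(vert_n, axis=1, keepdims=True)
    vert_n = vert_n / np.where(lengths == 0, 1.0, lengths)
    return face_n, vert_n
```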
  • FIG. 16 shows a procedure for generating the projector data file 31.
  • In step S30, the initial projector data generation unit 16 sets the origin of the projector coordinate system, that is, the projection center point of the projector (coordinates (0, 0, 0)), as the projector position coordinates, and generates initial projector data including the projection direction vector, the angle of view, and the projector position coordinates.
  • FIG. 17 shows an example of the projection direction vector, the vertical angle of view, the horizontal angle of view, and the projection center coordinates when no field angle symmetrization is set as the field angle symmetrization selection parameter.
  • the projection optical axis 109 passes through the center of the lower end of the projection area 103.
  • In this case, the lower angle θB is 0, and the upper angle θ′T and the lower angle θB do not match; nevertheless, the projection optical axis 109 is regarded as the projection center axis 110.
  • the origin of the projector coordinate system, that is, the projection center point 102 of the projector 201 (coordinates (0, 0, 0)), is set as the coordinates of the projector position 138.
  • the projection direction vector 136 is the direction of the projection optical axis 109 (its horizontal and vertical components are each 0).
  • FIG. 18 shows an example of the projection direction vector, the vertical angle of view, the horizontal angle of view, and the projection center coordinates when field angle symmetrization is set as the field angle symmetrization selection parameter.
  • in this case the vertical angle of view among the angles of view 137 is not vertically symmetric about the optical axis.
  • the projection direction vector 136 is therefore a vector pointing in the direction about which the vertical angle of view is symmetric up and down and the horizontal angle of view is symmetric left and right, as sketched below.
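  A simplified sketch of computing such a symmetrizing direction from asymmetric edge angles. The exact geometry in the patent is derived from the design data and lookup tables, so the formulas, axis conventions, and function name here are illustrative assumptions (exact for each plane separately, approximate when combined):

```python
import numpy as np

def symmetrize_angles(theta_top, theta_bottom, theta_left, theta_right):
    """From asymmetric edge angles of the projection beam (radians, measured
    from the projection optical axis), compute a projection center axis about
    which the vertical and horizontal angles of view are symmetric."""
    elevation = (theta_top - theta_bottom) / 2.0  # tilt of the new center axis
    azimuth = (theta_right - theta_left) / 2.0
    half_v = (theta_top + theta_bottom) / 2.0     # symmetric vertical half-angle
    half_h = (theta_right + theta_left) / 2.0     # symmetric horizontal half-angle
    # Unit projection direction vector (x right, y up, z along the optical axis).
    direction = np.array([np.cos(elevation) * np.sin(azimuth),
                          np.sin(elevation),
                          np.cos(elevation) * np.cos(azimuth)])
    return direction, half_v, half_h
```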
  • In step S31, the projector data world coordinate conversion unit 17 converts the projection direction vector generated by the initial projector data generation unit 16 into the world coordinate system referenced to the horizontal plane, based on the inclination of the projector with respect to the horizontal plane detected by the attitude sensor unit 11, in the same manner as the processing in the three-dimensional position data world coordinate conversion unit 23.
  • In step S32, the projector data vertical offset unit 18 adds the vertical offset amount calculated by the vertical offset amount calculation unit 24 to the vertical coordinate of the projector position coordinates, in the same manner as the processing in the three-dimensional position data vertical offset unit 25.
  • In step S33, the projector file generation unit 19 generates the projector data file 31.
  • a three-dimensional space projection mapping video creation tool constructs a three-dimensional object in a virtual three-dimensional space based on the contents of the three-dimensional data file 33, and assigns video to each surface of the virtual object and reproduces it.
  • a rendering camera is then set based on the contents of the projector data file 31 and rendering is performed to create a projection video; if this projection video is projected from the projector, projection mapping can be performed in a state that exactly matches the outer shape of the three-dimensional object (see the sketch after this item).
  • when a rendering camera is instead set at a viewpoint position, for example to set illumination and reproduce how light strikes the object, the projector is set in the tool according to the contents of the projector data file 31, the rendering camera is set at the viewpoint position, rendering is performed, and the rendered image is converted into a projector-projected image using the positional relationship with the three-dimensional object to create the projection video. If this projection video is projected from the projector, projection mapping can likewise be performed in a state that exactly matches the outer shape of the three-dimensional object.
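  How a tool might consume the projector data file can be sketched as follows. The ProjectorData structure and the camera methods (set_position, look_along, set_fov) are hypothetical placeholders for whatever API a given creation tool exposes; the patent does not prescribe them.

```python
from dataclasses import dataclass

@dataclass
class ProjectorData:
    position: tuple        # projector position coordinates (world coordinates)
    direction: tuple       # projection direction vector (world coordinates)
    fov_vertical: float    # symmetrized vertical angle of view, in degrees
    fov_horizontal: float  # symmetrized horizontal angle of view, in degrees

def configure_rendering_camera(camera, data: ProjectorData):
    """Place a tool's rendering camera where the real projector sits so the
    rendered frame coincides with what the projector casts onto the object."""
    camera.set_position(data.position)       # hypothetical tool API
    camera.look_along(data.direction)        # hypothetical tool API
    camera.set_fov(data.fov_vertical, data.fov_horizontal)
```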
  • similarly, the mapping data generation unit 7 executes the generation processing of the two-dimensional data file 32 in accordance with the mapping data generation start instruction from the communication control unit 1, and the projector data generation unit 6 executes the processing up to the projector data vertical offset unit 18.
  • FIG. 19 shows a procedure for generating the two-dimensional data file 32.
  • In step S40, three-dimensional position data is acquired; in step S41, the coordinate system of the three-dimensional position data is converted into the projector coordinate system; in step S42, it is converted into the world coordinate system; in step S43, the vertical offset is applied; and in step S44, the surfaces and ridge lines of the three-dimensional object are detected.
  • the processes in steps S40 to S44 are the same as the processes in steps S20 to S24 shown in FIG. 11.
  • in parallel, the projector data generation unit 6 performs the processing of steps S30 to S33 shown in FIG. 16.
  • In step S45, the perspective projection unit 27 generates two-dimensional data by perspectively projecting the ridge lines of the three-dimensional space detected by the three-dimensional position data segmentation unit 26 onto the projection surface of the projector, based on the projector data from the projector data vertical offset unit 18 (a minimal sketch follows).
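  Perspective projection onto the projector's image plane follows the standard pinhole model: a 3D point (x, y, z) in projector coordinates maps to (f*x/z, f*y/z). A minimal sketch, with the image-plane distance f as an assumed parameter:

```python
import numpy as np

def perspective_project(points, focal):
    """Pinhole projection of 3D points (projector coordinates, z > 0) onto
    a 2D plane at distance `focal` from the projection center."""
    pts = np.asarray(points, dtype=float)
    z = pts[:, 2]
    return np.column_stack((focal * pts[:, 0] / z,   # x' = f * x / z
                            focal * pts[:, 1] / z))  # y' = f * y / z
```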
  • In step S46, the two-dimensional data file generation unit 28 generates a two-dimensional data file 32 in the file format specified by the file format parameters stored in the parameter storage unit 3, from the two-dimensional data generated by the perspective projection unit 27.
  • a two-dimensional space projection mapping video creation tool allocates video to each surface based on the outer-shape positions of the surfaces of the three-dimensional object in the two-dimensional data and creates a projection video. If this projection video is projected from the projector, projection mapping can be performed in a state that matches the outer shape of the three-dimensional object.
  • as described above, in the present embodiment the three-dimensional sensor 108 measures the three-dimensional object three-dimensionally, and the resulting three-dimensional position data (the positions and ridge lines of the projection surfaces on the three-dimensional object) is converted into the coordinate system of the projector. Thereby, three-dimensional position data of the three-dimensional object in the projector coordinate system can be obtained. Further, two-dimensional data indicating the position and outer shape of each projection surface on the three-dimensional object is created using the coordinate-converted three-dimensional position data, and a video for two-dimensional projection mapping can be created by assigning video to each projection surface based on the two-dimensional data.
  • since the video for two-dimensional projection mapping is created based on the three-dimensional position data of the three-dimensional object in the projector coordinate system, when the video is projected from the projector the range of the projected video matches the projection surfaces of the three-dimensional object.
  • in addition, three-dimensional data indicating the position and outer shape of each projection surface on the three-dimensional object is generated, and projector data including a projection direction vector, an angle of view, and projector position coordinates is generated.
  • a three-dimensional object is constructed in a virtual three-dimensional space, video is assigned to each surface of the virtual object and reproduced, a rendering camera is set based on the projector data, and rendering is performed; thereby a video for three-dimensional projection mapping can be created. Since the video for three-dimensional projection mapping is also created based on the three-dimensional position data of the three-dimensional object in the projector coordinate system, when the video is projected from the projector the range of the projected video matches the projection surfaces of the three-dimensional object.
  • when a rendering camera is set at the viewpoint position, for example to set illumination and reproduce how light strikes the object, the projector is set according to the contents of the projector data file; the range of the projected video then also matches the projection surfaces of the three-dimensional object.
  • further, in the present embodiment the angle of view of the projector can be made vertically and horizontally symmetric, and a projector data file including a projection direction vector, an angle of view, and projector position coordinates that reflect the distortion correction can be generated. Therefore, even a three-dimensional space projection mapping video creation tool in which only a vertically and horizontally symmetric angle of view can be set for the rendering camera or the projector can create video that matches the projection surfaces of the three-dimensional object. Furthermore, because the data is expressed in a world coordinate system referenced to a horizontal plane parallel to the ground, it can be used as-is in such a tool to create three-dimensional projection mapping video for stereoscopic video representation.
  • Next, a problem relating to field angle symmetrization will be described with reference to FIGS. 21A to 21F.
  • In projectors, launch projection as shown in FIG. 21A is common.
  • the projection optical axis 109 passes through the center of the lower side of the projection area 103.
  • In this case, in the horizontal angle of view the left angle θL and the right angle θR are equal, but in the vertical angle of view the lower angle θB is 0 and does not match the upper angle θT.
  • assume that the projection direction, the angle of view (zoom), and the position of the projector 201 are determined so that projection can be performed on the projection target surface 153, which represents one surface of the three-dimensional object.
  • the projection target surface 153 is perpendicular to the ground. Note that the position, angle of view (zoom), and direction of the projector 201 are not changed after this positioning.
  • An image for three-dimensional projection mapping is created using the three-dimensional data (three-dimensional data of the projection target surface 153) and projector data including a position, an angle of view, and a projection direction vector.
  • here, the position is the coordinates of the projection center point 102, and the projector data is set as the parameters of the rendering camera 148.
  • the virtual imaging surface 151 of the rendering camera 148 is perpendicular to the optical center axis 152.
  • an image to be projected on the virtual projection target surface 154 reproduced in the virtual three-dimensional space is allocated based on the three-dimensional data.
  • the video assigned to the virtual projection target surface 154 is captured by the rendering camera 148.
  • the captured image appears as the captured virtual projection target surface 155 on the virtual imaging surface 151.
  • the image created as described above is projected from the projector 201.
  • as shown in FIG. 21F, the projection center axis 110 passing through the center of the projection image 156 projected onto the projection target surface 153 is located above the view angle center axis 150.
  • for this reason, the projection image 156 is shifted upward with respect to the projection target surface 153.
  • moreover, although the virtual imaging surface 151 of the rendering camera 148 is perpendicular to the optical center axis 152, the projection surface of the projection area 103 is inclined with respect to the view angle center axis 150; since the virtual imaging surface 151 and the projection surface are not parallel, the projected image 156 is distorted.
  • in the projector of this embodiment, such shifting and distortion of the projected video can be suppressed.
  • the principle will be described with reference to FIGS. 21G to 21K.
  • a projection center axis 110 connecting the projection center point 102 and the center of the projected image and a projection plane perpendicular to the projection center axis 110 are obtained, and a distortion correction coefficient is set.
  • θ″T and θ″B are smaller than θ′T and θ′B, respectively, and θ″L and θ″R are smaller than θL and θR, respectively. For this reason, the angle of view becomes narrower than when distortion correction is not performed.
  • assume that the projection direction and the position of the projector 201 are determined so that projection can be performed on the projection target surface 153, which represents one surface of the three-dimensional object.
  • the projection target surface 153 is perpendicular to the ground.
  • thereafter, the position, direction, and angle of view (zoom) of the projector 201 are not changed.
  • An image for three-dimensional projection mapping is created using the three-dimensional data (three-dimensional data of the projection target surface 153) and projector data including a position, an angle of view, and a projection direction vector.
  • here, the position is the coordinates of the projection center point 102, and the projector data is set as the parameters of the rendering camera 148.
  • the virtual imaging surface 151 of the rendering camera 148 is perpendicular to the optical center axis 152.
  • an image to be projected on the virtual projection target surface 154 reproduced in the virtual three-dimensional space is allocated based on the three-dimensional data.
  • an image assigned to the virtual projection target surface 154 is captured by the rendering camera 148.
  • the captured image appears as the captured virtual projection target surface 155 on the virtual imaging surface 151.
  • the image created as described above is projected from the projector 201.
  • the projection center axis 110 passing through the center of the projection image 156 projected onto the projection target surface 153 coincides with the field angle center axis 150.
  • since the virtual imaging surface 151 of the rendering camera 148 is perpendicular to the optical center axis 152, and the projection surface of the projection area 103 is also perpendicular to the view angle center axis 150, the projected video 155 coincides with the projection target surface 153 and no distortion occurs.
  • the operation has been described with respect to the example of the launch projection shown in FIG. 21A.
  • FIG. 22 is a block diagram showing a configuration of a projector according to the second embodiment of the present invention.
  • the projector shown in FIG. 22 includes a user interface unit 34 and an external storage device 35 in place of the communication control unit 1 and the communication input/output unit 2, and differs from the first embodiment in this respect.
  • the user interface unit 34 is an operation unit that receives an input operation by the user and performs the same control as the communication control unit 1, and includes, for example, an on-screen display and a key input unit.
  • the external storage device 35 is a removable storage device such as a USB (Universal Serial Bus) memory or an SD card.
  • the projector data file 31, the two-dimensional data file 32, and the three-dimensional data file 33 are supplied from the file storage unit 8 to the external storage device 35.
  • the information processing apparatus can read the projector data file 31, the two-dimensional data file 32, and the three-dimensional data file 33 from the external storage device 35.
  • in this way, the projector data file 31, the two-dimensional data file 32, and the three-dimensional data file 33 can be provided to the information processing apparatus without using a communication cable.
  • in each of the above embodiments, a program may be provided that causes a computer to execute the processing corresponding to each unit of the projector (the projector data generation unit 6, the mapping data generation unit 7, the view angle symmetrization unit 10, the distortion correction unit 13, the distortion correction coefficient calculation unit 15, and so on).
  • by executing this program, the computer can realize the functions corresponding to each of these units.
  • the program may be provided in a computer usable or computer readable medium, or may be provided through a network such as the Internet.
  • the computer-usable or computer-readable medium includes media capable of recording or reading information by magnetic, optical, electronic, electromagnetic, or infrared means.
  • such media include, for example, semiconductor memories, semiconductor or solid-state storage devices, magnetic tape, removable computer diskettes, random access memory (RAM), read-only memory (ROM), magnetic disks, optical disks, and magneto-optical disks.
  • FIG. 23 is a block diagram showing a configuration of a projector according to the third embodiment of the present invention.
  • the projector includes a display element 400, a projection lens 401, a reception unit 402, and a mapping data generation unit 403.
  • the display element 400 includes an image forming surface including a plurality of pixels.
  • the projection lens 401 projects an image formed on the image forming surface. An image is projected from the projection lens 401 toward a three-dimensional object.
  • the accepting unit 402 accepts a data creation request.
  • the mapping data generation unit 403 generates mapping data for creating a projection mapping video in response to a data creation request.
  • the mapping data generation unit 403 includes a three-dimensional sensor unit 404, a coordinate conversion unit 406, and a data generation unit 407.
  • the three-dimensional sensor unit 404 includes, for example, a three-dimensional sensor 405 that is arranged in a direction in which an image is projected.
  • the three-dimensional sensor unit 404 three-dimensionally measures a three-dimensional object with the three-dimensional sensor 405, and outputs three-dimensional position data representing, in three-dimensional coordinates, the position and shape of each surface of the three-dimensional object onto which the image is projected.
  • the coordinate conversion unit 406 converts the coordinate system of the three-dimensional position data into a projector coordinate system in which the projection area of the image is represented by three-dimensional coordinates, with the point where the principal rays from the pixels located at diagonal corners of the image forming surface intersect as the origin. Based on the three-dimensional position data coordinate-converted by the coordinate conversion unit 406, the data generation unit 407 acquires, for each surface of the three-dimensional object onto which an image is projected, the position of the surface and a ridge line indicating its outer shape, and generates mapping data (two-dimensional data or three-dimensional data of the three-dimensional object) based on the position and the ridge line.
  • the projector according to the present embodiment may further include a zoom mechanism that can move at least some of the lenses constituting the projection lens 401 in the optical axis direction of the projection lens 401, a lens shift mechanism that can move them in a shift direction orthogonal to the optical axis direction, and a first lookup table indicating how the position where the principal rays intersect changes according to the zoom position and the lens shift position.
  • in this case, the coordinate conversion unit 406 may determine the position where the principal rays intersect by referring to the first lookup table with the current zoom position of the zoom mechanism and the current lens shift position of the lens shift mechanism, for example by interpolating between table entries as sketched below.
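  A sketch of one way such a table lookup could work, using bilinear interpolation over a grid of calibrated entries; the grid layout, argument names, and interpolation choice are illustrative assumptions, since the patent only specifies that the table maps zoom and lens shift positions to the intersection point:

```python
import numpy as np

def principal_ray_origin(lut, zoom_positions, shift_positions, zoom, shift):
    """Bilinear interpolation into a first-lookup-table-like grid.

    lut[i, j] holds the 3D point where the principal rays intersect for
    zoom_positions[i] and shift_positions[j] (both sorted ascending).
    """
    i = np.clip(np.searchsorted(zoom_positions, zoom) - 1,
                0, len(zoom_positions) - 2)
    j = np.clip(np.searchsorted(shift_positions, shift) - 1,
                0, len(shift_positions) - 2)
    tz = (zoom - zoom_positions[i]) / (zoom_positions[i + 1] - zoom_positions[i])
    ts = (shift - shift_positions[j]) / (shift_positions[j + 1] - shift_positions[j])
    # Interpolate along zoom at the two bracketing shift entries, then mix.
    top = (1 - tz) * lut[i, j] + tz * lut[i + 1, j]
    bottom = (1 - tz) * lut[i, j + 1] + tz * lut[i + 1, j + 1]
    return (1 - ts) * top + ts * bottom
```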
  • further, after converting the coordinate system of the three-dimensional position data into the projector coordinate system, the coordinate conversion unit 406 may convert it into a world coordinate system referenced to the horizontal plane.
  • the projector according to the present embodiment may further include: a second lookup table indicating how the angle of view, which expresses the projection range of the image as angles, changes according to the zoom position and the lens shift position; a view angle symmetrization unit that refers to the second lookup table with the current zoom position of the zoom mechanism and the current lens shift position of the lens shift mechanism to acquire the current angle of view, sets a projection center axis representing the central ray of the projection light beam from the projection lens 401 based on that angle of view, and outputs view angle symmetrization information including a projection direction vector indicating the direction of the projection center axis and the current angle of view; and a projector data generation unit that generates projector data including projector position coordinates, which are the coordinates of the origin of the projector coordinate system, and the projection direction vector and current angle of view indicated by the view angle symmetrization information.
  • the projector data generation unit may convert the direction of the projection direction vector into the world coordinate system referenced to the horizontal plane.
  • the data generation unit 407 may generate two-dimensional data by perspectively projecting the ridge line on the projection surface based on the projector data.
  • the field angle symmetrizing unit may output projection plane tilt information indicating the tilt of the plane perpendicular to the projection center axis with respect to the plane perpendicular to the optical axis of the projection lens 401.
  • the projector may further include a distortion correction coefficient calculation unit that calculates, based on the projection plane tilt information, a distortion correction coefficient for correcting distortion of the image projected on the projection plane, and a distortion correction unit that performs distortion correction on the input video signal based on the distortion correction coefficient; an image based on the distortion-corrected video signal may then be formed on the image forming surface of the display element 400.
  • the data generation unit 407 may create a two-dimensional data file in which the generated two-dimensional data is stored in a predetermined file format.
  • the projector may further include output means for outputting a two-dimensional data file.
  • similarly, the data generation unit 407 may create a three-dimensional data file in which the generated three-dimensional data is stored in a predetermined file format, and may generate the three-dimensional data by converting each surface of the three-dimensional object into a polygon mesh. In this case, the projector may further include output means for outputting the three-dimensional data file. In the projector according to the present embodiment, the data generation unit 407 generates the three-dimensional data file in a predetermined file format, and the projector data generation unit generates the projector data in a predetermined file format.
  • the output means may be a communication means capable of mutual communication with an external information processing apparatus or a removable storage means.
  • in the projector described above, a mapping data creation method is performed in which mapping data is generated in response to a mapping data creation request.
  • in this method, the three-dimensional sensor unit 404 three-dimensionally measures the three-dimensional object using the three-dimensional sensor 405 arranged in the direction in which the image is projected, and outputs three-dimensional position data representing, in three-dimensional coordinates, the position and shape of each surface onto which the image on the three-dimensional object is projected; the coordinate conversion unit 406 converts the coordinate system of the three-dimensional position data into a projector coordinate system in which the projection area of the image is expressed in three-dimensional coordinates with the point where the principal rays from the pixels located at diagonal corners of the image forming surface intersect as the origin; and the data generation unit 407, based on the coordinate-converted three-dimensional position data, acquires for each surface of the three-dimensional object the position of the surface and a ridge line indicating the outer shape of the surface, and generates two-dimensional data or three-dimensional data of the three-dimensional object based on the position and the ridge line.
  • a program may also be used that causes a computer to execute mapping data generation processing for generating mapping data in response to a mapping data creation request.
  • the mapping data generation processing includes: processing for three-dimensionally measuring the three-dimensional object using the three-dimensional sensor 405 arranged in the direction in which the image is projected and outputting three-dimensional position data representing, in three-dimensional coordinates, the position and shape of each surface onto which the image is projected; processing for converting the coordinate system of the three-dimensional position data into a projector coordinate system in which the projection area of the image is expressed in three-dimensional coordinates with the point where the principal rays from the pixels located at diagonal corners of the image forming surface intersect as the origin; and processing for acquiring, for each surface of the three-dimensional object based on the coordinate-converted three-dimensional position data, the position of the surface and a ridge line indicating the outer shape of the surface, and generating two-dimensional data or three-dimensional data of the three-dimensional object based on the position and the ridge line.
  • the program may be provided on a computer-usable or computer-readable medium, or may be provided via a network such as the Internet.
  • the computer-usable or computer-readable medium includes media capable of recording or reading information by magnetic, optical, electronic, electromagnetic, or infrared means.
  • such media include, for example, semiconductor memories, semiconductor or solid-state storage devices, magnetic tape, removable computer diskettes, random access memory (RAM), read-only memory (ROM), magnetic disks, optical disks, and magneto-optical disks.

Landscapes

  • Engineering & Computer Science (AREA)
  • Physics & Mathematics (AREA)
  • General Physics & Mathematics (AREA)
  • Theoretical Computer Science (AREA)
  • Computer Hardware Design (AREA)
  • Signal Processing (AREA)
  • Multimedia (AREA)
  • Computer Graphics (AREA)
  • General Engineering & Computer Science (AREA)
  • Software Systems (AREA)
  • Projection Apparatus (AREA)
  • Transforming Electric Information Into Light Information (AREA)
  • Controls And Circuits For Display Device (AREA)
  • Image Processing (AREA)

Abstract

Data for mapping is generated with which it is possible to create a projection mapping video that matches the external form of each surface of a solid object. A projector has a mapping data generation unit (403) for generating data for mapping in response to a data creation request. The mapping data generation unit has: a three-dimensional sensor (405) for measuring a solid object in three dimensions and outputting three-dimensional position data that represents, in three-dimensional coordinates, the position and shape of each surface of the solid object onto which an image is projected; a coordinate conversion unit (406) for converting the coordinate system of the three-dimensional position data into a projector coordinate system in which the image projection area is represented by three-dimensional coordinates, with the point where the principal rays from the pixels located at the diagonal corners of the image formation surface intersect set as the origin; and a data generation unit (407) for acquiring the position of each surface of the solid object and a ridge line indicating the external form of each surface, and generating the data for mapping.

Description

Projector, mapping data creation method, program, and projection mapping system
The present invention relates to a projector, a mapping data creation method, a program, and a projection mapping system.
Recently, projection mapping, which uses a projector to project an image on a three-dimensional object such as a building and express the image, has attracted attention. In this projection mapping, it is necessary to create an image (projection mapping image) for projection from a projector so as to match the outer shape of the surface of the three-dimensional object.
FIG. 1A shows an example of a system that performs projection mapping. This system includes a projector 101 and a video processing apparatus 105 including a personal computer. The projector 101 and the video processing device 105 are connected via a video signal cable.
The video processing device 105 creates a projection mapping video 106 and supplies the video signal to the projector 101. The projector 101 projects an image based on the image signal from the image processing device 105 toward the three-dimensional object 104 in the projection area 103. The projection mapping image 106 is displayed on a projection surface including the five surfaces 104 a to 104 e of the three-dimensional object 104.
FIG. 1B shows an example of the projection mapping video 106.
The rectangular original images 141a to 141e are images to be displayed on the surfaces 104a to 104e of the three-dimensional object 104 shown in FIG. 1A. The images 142a to 142e are images obtained by deforming the rectangular original images 141a to 141e according to the outer shapes of the surfaces 104a to 104e, respectively. The projection mapping image 106 is obtained by assigning these images 142a to 142e so as to correspond to the surfaces 104a to 104e, respectively.
The method for creating the projection mapping video 106 includes a method for creating a projection mapping video in a two-dimensional space and a method for creating a projection mapping video in a three-dimensional space.
FIG. 2 schematically shows a method for creating a projection mapping video in a two-dimensional space. On the video processing device 105, a frame indicating the outer shape of each surface of the three-dimensional object in the projection video is created. In FIG. 2, a frame 146 indicated by a solid line is a frame indicating the outer shape of the surface 104 c of the three-dimensional object illustrated in FIG. 1A. The image of the frame formed on the video processing device 105 is actually projected from the projector 101. The frame is deformed on the video processing device 105 so that the projected image of the frame matches the outer shape of the corresponding surface of the three-dimensional object 104.
In the example illustrated in FIG. 2, the frame 147 projected onto the three-dimensional object 104 corresponds to the frame 146 created on the video processing device 105, but the frame 147 does not match the outer shape of the corresponding surface of the three-dimensional object 104. Therefore, the frame 146 is deformed on the video processing device 105 so that the frame 147 matches the outer shape of the corresponding surface of the three-dimensional object 104.
FIG. 3 schematically shows a method for creating a projection mapping video in a three-dimensional space. A virtual three-dimensional space is formed on the video processing apparatus 105, and a virtual three-dimensional object 104-1 and a virtual camera 148 corresponding to the three-dimensional object 104 are arranged in the three-dimensional space. A video is assigned to each surface of the virtual three-dimensional object 104-1 and reproduced, and a video reproduced on each surface of the virtual three-dimensional object 104-1 is captured using the virtual camera 148. An image captured by the virtual camera 148 is used as the projection mapping image 106.
In general, creating a video using a virtual camera 148 is called rendering. The angle of view of the virtual camera 148 is set to be equal to the angle of view of the projector 101. In the actual three-dimensional space, the projector 101 is placed at the position where the virtual camera 148 was placed, and the projector 101 projects the projection mapping video 106 created using the virtual camera 148 toward the three-dimensional object 104.
Techniques related to projection mapping are described in Patent Documents 1 and 2.
Patent Document 1: Utility Model Registration No. 199779. Patent Document 2: JP 2012-215633 A.
In the projection mapping video creation method in the two-dimensional space shown in FIG. 2, the operator must perform adjustment work such as deforming the frame 146 on the video processing device 105 so that the projected frame 147 matches the outer shape of the corresponding surface of the three-dimensional object 104. Such adjustment work is very troublesome for the operator.
In the projection mapping video creation method in the three-dimensional space shown in FIG. 3, it is necessary to identify, in the actual three-dimensional space, the position where the virtual camera 148 was placed and to place the projector 101 accurately at that position. However, since it is difficult to make the position of the virtual camera 148 in the virtual space coincide exactly with the position of the projector 101 in the real space, it is difficult to obtain a projection mapping video that matches the outer shape of each surface of the three-dimensional object 104.
An object of the present invention is to solve the above problems and to provide a projector, a mapping data creation method, a program, and a projection mapping system capable of generating mapping data from which a projection mapping video matching the outer shape of each surface of a three-dimensional object can be created.
In order to achieve the above object, according to one aspect of the present invention, there is provided a projector having a display element with an image forming surface composed of a plurality of pixels and a projection lens that projects an image formed on the image forming surface toward a three-dimensional object, the projector comprising: a reception unit that receives a data creation request; and a mapping data generation unit that generates mapping data for creating a projection mapping video in response to the data creation request, wherein the mapping data generation unit includes: a three-dimensional sensor unit that three-dimensionally measures the three-dimensional object and outputs three-dimensional position data representing, in three-dimensional coordinates, the position and shape of each surface onto which the image is projected; a coordinate conversion unit that converts the coordinate system of the three-dimensional position data into a projector coordinate system in which the projection area of the image is expressed in three-dimensional coordinates with the point where the principal rays from the pixels located at diagonal corners of the image forming surface intersect as the origin; and a data generation unit that acquires, based on the coordinate-converted three-dimensional position data, for each surface of the three-dimensional object, the position of the surface and a ridge line indicating its outer shape, and generates the mapping data based on the position and the ridge line.
According to another aspect of the invention, there is provided a mapping data creation method performed by a projector having a display element with an image forming surface composed of a plurality of pixels and a projection lens that projects an image formed on the image forming surface toward a three-dimensional object, the method including a mapping data generation step of generating mapping data for creating a projection mapping video in response to a data creation request, the mapping data generation step including: three-dimensionally measuring the three-dimensional object and acquiring three-dimensional position data representing, in three-dimensional coordinates, the position and shape of each surface onto which the image is projected; converting the coordinate system of the three-dimensional position data into a projector coordinate system in which the projection area of the image is expressed in three-dimensional coordinates with the point where the principal rays from the pixels located at diagonal corners of the image forming surface intersect as the origin; and acquiring, based on the coordinate-converted three-dimensional position data, for each surface of the three-dimensional object, the position of the surface and a ridge line indicating its outer shape, and generating the mapping data based on the position and the ridge line.
According to yet another aspect of the invention, there is provided a program for causing a computer of a projector, which has a display element with an image forming surface composed of a plurality of pixels and a projection lens that projects an image formed on the image forming surface toward a three-dimensional object, to execute mapping data generation processing for generating mapping data for creating a projection mapping video in response to a data creation request, the mapping data generation processing including: processing for three-dimensionally measuring the three-dimensional object and acquiring three-dimensional position data representing, in three-dimensional coordinates, the position and shape of each surface onto which the image is projected; processing for converting the coordinate system of the three-dimensional position data into a projector coordinate system in which the projection area of the image is expressed in three-dimensional coordinates with the point where the principal rays from the pixels located at diagonal corners of the image forming surface intersect as the origin; and processing for acquiring, based on the coordinate-converted three-dimensional position data, for each surface of the three-dimensional object, the position of the surface and a ridge line indicating its outer shape, and generating the mapping data based on the position and the ridge line.
According to yet another aspect of the invention, there is provided a projection mapping system having the above projector and a video processing device capable of mutual communication with the projector, wherein the video processing device creates a projection mapping video based on data for a projection mapping video creation tool generated by the projector.
The drawings accompanying this description are briefly described as follows:
  • A schematic diagram showing an example of a system that performs projection mapping.
  • A schematic diagram showing an example of a projection mapping video generated by the system shown in FIG. 1A.
  • A schematic diagram for explaining a method of creating a projection mapping video in a two-dimensional space.
  • A schematic diagram for explaining a method of creating a projection mapping video in a three-dimensional space.
  • A block diagram showing the configuration of a projector according to the first embodiment of the present invention.
  • A schematic diagram for explaining the relationship among the angle of view, the projection optical axis, and the projection center point in the case of launch projection.
  • A schematic diagram for explaining the relationship among the angle of view, the projection optical axis, and the projection center point in the case of no launch.
  • A schematic diagram for explaining the relationship among the angle of view, the projection optical axis, and the projection center point when the angle of view is enlarged.
  • A schematic diagram for explaining the relationship among the angle of view, the projection center axis, and the projection center point when the lens is shifted upward.
  • A schematic diagram for explaining the relative positional relationship between the detection range of the three-dimensional sensor and the projectable area.
  • A schematic diagram showing an example of a projection mapping system.
  • A flowchart showing a procedure of the mapping data creation processing.
  • A schematic diagram showing a projection state in launch projection in which the projection optical axis passes through the center of the lower side of the projection area.
  • A schematic diagram showing the projection video area before and after distortion correction.
  • A schematic diagram showing a projection state when the lens is shifted in the upper left direction.
  • A schematic diagram showing the projection video area before and after distortion correction.
  • A flowchart showing a procedure for generating the three-dimensional data file 33.
  • A schematic diagram showing the positional relationship between the coordinate system of the three-dimensional sensor and the coordinate system of the projector.
  • A schematic diagram for explaining installation states of the projector.
  • A schematic diagram showing an example of segmentation processing.
  • A schematic diagram showing an example of polygon meshing using both triangular and quadrangular polygons.
  • A schematic diagram showing an example of polygon meshing using only triangular polygons.
  • A flowchart showing a procedure for generating the projector data file.
  • A schematic diagram showing an example of the projection direction vector, the vertical angle of view, the horizontal angle of view, and the projection center coordinates when no field angle symmetrization is set in the field angle symmetrization selection parameter.
  • A schematic diagram showing an example of the projection direction vector, the vertical angle of view, the horizontal angle of view, and the projection center coordinates when field angle symmetrization is set in the field angle symmetrization selection parameter.
  • A flowchart showing a procedure for generating the two-dimensional data file 32.
  • A schematic diagram showing an example of perspective projection.
  • A schematic diagram for explaining the angle of view of launch projection.
  • A schematic diagram for explaining the view angle center axis.
  • A schematic diagram for explaining a projection target surface representing one surface of a three-dimensional object.
  • A schematic diagram for explaining the relationship between the virtual imaging surface of the rendering camera and the optical center axis.
  • A schematic diagram for explaining the image captured by the rendering camera.
  • A schematic diagram for explaining the relationship between the video on the projection target surface and the projection target surface.
  • A schematic diagram for explaining the angle of view when distortion correction is performed so that the video projected on the projection surface becomes square.
  • A schematic diagram for explaining the relationship between a projection target surface representing one surface of a three-dimensional object and the projection direction and position of the projector.
  • A schematic diagram for explaining the relationship between the virtual imaging surface of the rendering camera and the optical center axis.
  • A schematic diagram showing a state in which the video allocated to the virtual projection target surface is captured by the rendering camera.
  • A schematic diagram for explaining the relationship between the projection center axis passing through the center of the projection video projected on the projection target surface and the view angle center axis.
  • A block diagram showing the configuration of a projector according to the second embodiment of the present invention.
  • A block diagram showing the configuration of a projector according to the third embodiment of the present invention.
Next, embodiments of the present invention will be described with reference to the drawings.
(First embodiment)
FIG. 4 is a block diagram showing the configuration of a projector according to the first embodiment of the present invention.
Referring to FIG. 4, the projector includes a communication control unit 1, a parameter storage unit 3, a projection unit 5, a projector data generation unit 6, a mapping data generation unit 7, a file storage unit 8, a projector projection design data storage unit 9, an angle-of-view symmetrization unit 10, and an attitude sensor unit 11.
The communication control unit 1 includes control means such as a CPU (Central Processing Unit) and is connected to an external device through the communication input/output unit 2 so that the two can communicate with each other. As the communication means between the communication input/output unit 2 and the external device, RS-232C, a wired LAN (Local Area Network), a wireless LAN, or the like can be used, but the communication means is not limited to these.
The external device is, for example, an information processing apparatus such as a personal computer and is provided with a projection mapping video creation tool. The projection mapping video creation tool is, for example, a tool for creating a projection mapping video in a three-dimensional space or a tool for creating a projection mapping video in a two-dimensional space. Since these tools are existing ones, their detailed description is omitted here.
The communication control unit 1 controls the operation of each unit of the projector and transmits and receives data and instruction signals (or control signals) to and from the information processing apparatus. For example, in response to an instruction from the information processing apparatus, the communication control unit 1 controls the projector data generation unit 6 and the mapping data generation unit 7 so as to generate the data required for creating a projection mapping video (two-dimensional or three-dimensional data of the three-dimensional object, and projector data).
The parameter storage unit 3 stores the parameters used in the mapping data creation processing required for creating a projection mapping video (an angle-of-view symmetrization selection parameter, a segmentation parameter, a polygon meshing parameter, a file format parameter, a mapping mode parameter, and the like).
The angle-of-view symmetrization selection parameter is setting information indicating whether or not angle-of-view symmetrization is performed. The segmentation parameter is a threshold used for extracting each surface of the three-dimensional object from the three-dimensional position data (point cloud data) obtained by three-dimensionally measuring the shape of the surface of the three-dimensional object. The polygon meshing parameter specifies the information needed to convert each extracted surface of the three-dimensional object into a polygon mesh (the polygon shape, the mesh coarseness, and the like). The file format parameter designates the formats of the two-dimensional data file and the three-dimensional data file. The mapping mode parameter designates one of three modes: mapping for two-dimensional space, mapping for three-dimensional space, and no mapping.
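Purely for illustration, the parameter set described above could be held in a structure such as the following sketch; the field names and default values are assumptions introduced here, not part of the embodiment.

    from dataclasses import dataclass

    # Hypothetical representation of the parameters held in the parameter
    # storage unit 3; names and defaults are invented for illustration.
    @dataclass
    class MappingParameters:
        symmetrize_view_angle: bool = False   # angle-of-view symmetrization selection
        segmentation_threshold: float = 0.01  # [m] threshold for surface extraction
        polygon_shape: str = "triangle"       # "triangle" or "quad"
        mesh_pitch: float = 0.05              # [m] mesh coarseness
        file_format: str = "obj"              # format of the 2D/3D data files
        mapping_mode: str = "3d"              # "2d", "3d", or "none"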
A video signal is input from the information processing apparatus to the projection unit 5 through the projection video input unit 4. The projection unit 5 projects a video based on the video signal input from the projection video input unit 4 and includes a video processing unit 12, a distortion correction unit 13, a projection lens unit 14, and a distortion correction coefficient calculation unit 15.
The video processing unit 12 performs processing such as converting the resolution of the input video signal to the resolution of the display device and adjusting the image quality. Following a distortion correction coefficient, the distortion correction unit 13 corrects, in the video signal processed by the video processing unit 12, the distortion (for example, trapezoidal distortion) of an image projected onto a projection surface that does not squarely face the projector. The distortion correction coefficient calculation unit 15 calculates the distortion correction coefficient required by the distortion correction unit 13 on the basis of either distortion correction information set by the user or the horizontal and vertical tilt information of the projection surface supplied from the angle-of-view symmetrization unit 10. The user can set the information for distortion correction with an operation unit (not shown).
The projection lens unit 14 includes a display device that forms an image based on the video signal from the distortion correction unit 13 and a projection lens that projects the image formed by the display device. The projection lens includes a lens movable in the direction of the optical axis, a zoom mechanism that changes the angle of view according to the zoom position corresponding to the position of that lens on the optical axis, and a lens shift mechanism that shifts the whole projection lens in directions orthogonal to the optical axis. The projection lens unit 14 supplies zoom/shift position information indicating the zoom position and the lens shift position of the projection lens to the angle-of-view symmetrization unit 10. Here, the display device can be called a display element having an image forming surface made up of a plurality of pixels; an image based on the video signal from the distortion correction unit 13 is formed on the image forming surface. A liquid crystal display element, a DMD (digital micromirror device), or the like can be used as the display element.
The projector projection design data storage unit 9 stores design data related to the projection performed by the projector. The design data is optical data for obtaining the angles of view and the projection center point from the zoom position and the lens shift position of the projection lens, that is, data for calculating how the angles of view and the projection center point change with the zoom position and the lens shift position. This data is based on data determined in the optical design.
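As a minimal sketch of how such design data might be consumed, the following assumes the data includes a table of angle of view versus zoom position, sampled during optical design and interpolated at run time; the table values are invented, and the lens-shift dependence of the projection center point is omitted.

    import numpy as np

    # Hypothetical design-data table: zoom position (0.0-1.0) versus the
    # horizontal and vertical half-angles of view in degrees. The numbers
    # are invented; real values come from the optical design.
    ZOOM_POSITIONS = np.array([0.0, 0.5, 1.0])
    H_HALF_ANGLE   = np.array([14.0, 18.0, 22.0])   # degrees
    V_HALF_ANGLE   = np.array([8.0, 10.5, 13.0])    # degrees

    def angle_of_view(zoom: float) -> tuple[float, float]:
        """Interpolate the horizontal/vertical half-angles for a zoom position."""
        h = float(np.interp(zoom, ZOOM_POSITIONS, H_HALF_ANGLE))
        v = float(np.interp(zoom, ZOOM_POSITIONS, V_HALF_ANGLE))
        return h, v

    print(angle_of_view(0.25))  # -> (16.0, 9.25)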
The attitude sensor unit 11 includes an attitude sensor capable of detecting rotation angles of 360° front-to-back and 360° left-to-right, for example a three-axis acceleration sensor, and detects the tilt of the projector with respect to the horizontal plane. The output of the attitude sensor unit 11 is supplied to the projector data generation unit 6 and the mapping data generation unit 7.
The angle-of-view symmetrization unit 10 acquires the zoom/shift position information from the projection lens unit 14, the design data from the projector projection design data storage unit 9, and the angle-of-view symmetrization selection parameter and the mapping mode parameter from the parameter storage unit 3. In accordance with the mapping mode parameter and the angle-of-view symmetrization selection parameter, the angle-of-view symmetrization unit 10 calculates, on the basis of the zoom/shift position information and the design data, a projection direction vector indicating the direction of the projection center axis of the projector, the angles of view, and the horizontal and vertical tilt angles of the projection surface relative to a projection surface squarely facing the projector.
Here, the angle of view expresses the range of the projection light from the projection lens (the projection range of the image) as an angle; the range of the projection light in the horizontal direction expressed as an angle is called the horizontal angle of view, and the range of the projection light in the vertical direction expressed as an angle is called the vertical angle of view. The projection center axis is the center axis of the projection light from the projection lens (corresponding to the central ray) and can be used as the reference for the angles of view. The angles of view, the projection center axis, and the projection center point change according to the zoom position and the lens shift position.
FIG. 5A schematically shows the relationship among the angles of view, the projection optical axis, and the projection center point with launch; FIG. 5B shows the same relationship without launch; FIG. 5C shows it when the angle of view is enlarged; and FIG. 5D shows the relationship among the angles of view, the projection center axis, and the projection center point when the lens is shifted upward. Here, the projection optical axis is the axis that passes through the center of the image forming surface and intersects the image forming surface perpendicularly. FIGS. 5B to 5D correspond to vertical cross sections.
Usually, a projection form called launch is used, in which the video is projected above the projection optical axis so that, when the projector is placed on a table, the video is projected above the height of the table. There is also a projection form called lens shift, in which the video is projected while being moved up, down, left, or right relative to the projection optical axis; launch is one form of lens shift.
For example, as shown in FIG. 5A, the projection center point 102 is the point at which the straight lines connecting the four corner points of the projection area 103 with the corresponding four corner points of the image forming area of the display device 100 intersect. Here, the projection area 103 is the image of the image forming area of the display device 100 inverted vertically and horizontally. In practice, because refraction occurs in the lens, the lines connecting the four corner points of the projection area 103 with the four corner points of the image forming area of the display device 100 are not straight; the projection center point 102 must be determined in consideration of the lens configuration of the projection lens.
For example, let the four corner points of the image forming area of the display device 100 be points A, B, C, and D, and let the four corner points of the projection area 103 be points a, b, c, and d. Points a, b, c, and d correspond to points A, B, C, and D, respectively, and the arrangement of points a, b, c, and d is inverted vertically and horizontally with respect to the arrangement of points A, B, C, and D. In this case, the projection center point 102 is the point at which the chief ray that exits from point A and reaches point a through the lens, the chief ray that exits from point B and reaches point b through the lens, the chief ray that exits from point C and reaches point c through the lens, and the chief ray that exits from point D and reaches point d through the lens intersect one another. Such an intersection of chief rays can be defined, for example, at the center of the aperture stop of the projection lens and can be calculated on the basis of the lens design data.
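Numerically, such an intersection can be estimated as the point closest, in the least-squares sense, to the four chief rays A→a through D→d. The sketch below assumes ideal straight rays; the corner coordinates are invented (display corners at z = 0, an inverted, 100x magnified image at z = 3000), chosen so that the four rays do meet at a single point.

    import numpy as np

    def nearest_point_to_lines(points, dirs):
        """Least-squares point closest to a set of 3D lines (point + direction)."""
        A = np.zeros((3, 3))
        b = np.zeros(3)
        for p, d in zip(points, dirs):
            d = d / np.linalg.norm(d)
            m = np.eye(3) - np.outer(d, d)   # projector onto the plane normal to d
            A += m
            b += m @ p
        return np.linalg.solve(A, b)

    # Invented corner coordinates: display corners A-D and projected corners a-d.
    display = np.array([[-8., -4.5, 0.], [8., -4.5, 0.],
                        [8., 4.5, 0.], [-8., 4.5, 0.]])
    screen  = np.array([[800., 450., 3000.], [-800., 450., 3000.],
                        [-800., -450., 3000.], [800., -450., 3000.]])
    rays = screen - display              # chief-ray directions A->a, ..., D->d
    print(nearest_point_to_lines(display, rays))   # ~ [0, 0, 29.7]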
In the example without launch shown in FIG. 5B, the projection optical axis 109 passes through the center of the projection area 103, and the projection center point 102 lies on the projection optical axis 109. In this case, the projection center axis coincides with the projection optical axis 109.
FIG. 5C is an example in which the angle of view is enlarged compared with the example of FIG. 5B. As in FIG. 5B, the projection optical axis 109 passes through the center of the projection area 103 and the projection center point 102 lies on the projection optical axis 109, but the projection center point 102 is located closer to the display device 100 than in the example of FIG. 5B. In this case too, the projection center axis coincides with the projection optical axis 109.
FIG. 5D is an example in which a lens shift is performed so that the projection area 103 is shifted upward compared with the example of FIG. 5B. As in the example of FIG. 5A, the projection optical axis 109 passes through the center of the lower end of the projection area 103, and the projection center point 102 is located above the projection optical axis 109. In this case, the projection center axis does not coincide with the projection optical axis 109.
As can be seen from the examples of FIGS. 5A to 5D, the angles of view, the projection center axis, and the projection center point change according to the zoom position and the lens shift position. In other words, the angles of view, the projection center axis, and the projection center point need to be determined according to the zoom position and the lens shift position.
Referring again to FIG. 4, the processing for calculating the projection direction vector, the angles of view, and the tilt angles differs depending on whether angle-of-view symmetrization is performed.
When the mapping mode parameter is set to the two-dimensional space mapping mode, or when the mapping mode parameter is set to the three-dimensional space mapping mode and the angle-of-view symmetrization selection parameter is set to no symmetrization, the angle-of-view symmetrization unit 10 executes the following processes (A1) to (A3).
(A1) Obtain the left and right horizontal angles of view and the upper and lower vertical angles of view on the basis of the zoom/shift position information (zoom position and lens shift position) and the design data.
(A2) Regard the projection optical axis as the projection center axis, and set as the projection direction vector a vector whose horizontal and vertical components are both 0.
(A3) Set both the horizontal tilt and the vertical tilt to 0.
On the other hand, when the mapping mode parameter is set to the three-dimensional space mapping mode and the angle-of-view symmetrization selection parameter is set to symmetrization enabled, the angle-of-view symmetrization unit 10 executes the following processes (B1) to (B3); a sketch covering both the (A) and (B) branches follows the list.
(B1) Obtain the left and right horizontal angles of view and the upper and lower vertical angles of view on the basis of the zoom/shift position information (zoom position and lens shift position) and the design data.
(B2) Determine a projection center axis and a projection surface perpendicular to that axis such that, when distortion correction is performed so that the video projected on that projection surface becomes square, the left and right horizontal angles of view with respect to the projection center axis are equal and the upper and lower vertical angles of view are equal, and set the projection direction vector to the direction of that projection center axis.
(B3) Obtain the horizontal and vertical tilts of the projection surface determined in (B2), which is perpendicular to the projection center axis, with respect to the plane perpendicular to the projection optical axis.
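The following is a first-order sketch of the two branches, assuming the half-angles from (A1)/(B1) are measured from the projection optical axis and ignoring the image shrinkage caused by the subsequent distortion correction (which, as noted later, the embodiment does take into account); all names are illustrative.

    import numpy as np

    def projection_axis(theta_top, theta_bottom, theta_right, theta_left,
                        symmetrize):
        """Sketch of the (A)/(B) branches. Angles in degrees, measured from
        the projection optical axis (positive up / right). With
        symmetrize=False the optical axis is taken as the projection center
        axis and the tilts are 0 (A2/A3); with symmetrize=True the axis is
        tilted so the half-angles on either side of it become equal (B2),
        and the tilts of the perpendicular projection surface are returned
        (B3)."""
        if not symmetrize:
            return (np.array([0.0, 0.0, 1.0]),
                    (theta_right, theta_left), (theta_top, theta_bottom),
                    0.0, 0.0)
        tilt_h = (theta_right - theta_left) / 2.0
        tilt_v = (theta_top - theta_bottom) / 2.0
        half_h = (theta_right + theta_left) / 2.0
        half_v = (theta_top + theta_bottom) / 2.0
        d = np.array([np.tan(np.radians(tilt_h)),
                      np.tan(np.radians(tilt_v)), 1.0])
        return d / np.linalg.norm(d), (half_h, half_h), (half_v, half_v), \
               tilt_h, tilt_v

    # Launch projection as in FIG. 5A: all light above the optical axis.
    print(projection_axis(30.0, 0.0, 20.0, 20.0, symmetrize=True))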
The angle-of-view symmetrization unit 10 supplies the projection direction vector and the calculated angles of view to the projector data generation unit 6 and supplies the calculated tilt angles to the distortion correction coefficient calculation unit 15. When the mapping mode parameter is set to the no-mapping mode, the angle-of-view symmetrization unit 10 does not execute the calculation of the projection direction vector, the angles of view, or the tilts.
The projector data generation unit 6 generates a projector data file used for setting up the rendering camera and the projector in a virtual three-dimensional space. This projector data file is used by the tool on the information processing apparatus side that creates a projection mapping video in a three-dimensional space.
The projector data generation unit 6 includes an initial projector data generation unit 16, a projector data world coordinate conversion unit 17, a projector data vertical offset unit 18, and a projector data file generation unit 19.
The initial projector data generation unit 16 sets, as the projector position coordinates, the origin of the projector coordinate system, that is, the coordinates (0, 0, 0), which is the projection center point of the projector (the projection center point 102 shown in FIGS. 5A to 5D). The initial projector data generation unit 16 generates projector data including the projector position coordinates and the projection direction vector and angles of view (upper, lower, left, and right) supplied from the angle-of-view symmetrization unit 10.
On the basis of the tilt of the projector with respect to the horizontal plane detected by the attitude sensor unit 11, the projector data world coordinate conversion unit 17 converts the direction of the projection vector included in the projector data generated by the initial projector data generation unit 16 into a world coordinate system whose reference is the horizontal plane.
The projector data vertical offset unit 18 changes the vertical coordinate of the projector position coordinates included in the projector data from the projector data world coordinate conversion unit 17 to match the adjustment of the vertical coordinates of the three-dimensional position data performed in the mapping data generation unit 7. To perform this coordinate change, the vertical offset amount is supplied from the mapping data generation unit 7 to the projector data vertical offset unit 18.
The projector data file generation unit 19 generates the projector data file 31 based on the projector data from the projector data vertical offset unit 18. The projector data file 31 is stored in the file storage unit 8.
The mapping data generation unit 7 generates the two-dimensional data file 32, which indicates the outline positions of the surfaces of the three-dimensional object, or the three-dimensional data file 33 of the three-dimensional object. The two-dimensional data file 32 is used by the tool that creates a projection mapping video in a two-dimensional space, and the three-dimensional data file 33 is used by the tool that creates a projection mapping video in a three-dimensional space. The two-dimensional data file 32 and the three-dimensional data file 33 are stored in the file storage unit 8.
The mapping data generation unit 7 includes a three-dimensional sensor unit 20, a calibration data storage unit 21, a three-dimensional position data projector coordinate conversion unit 22, a three-dimensional position data world coordinate conversion unit 23, a vertical offset calculation unit 24, a three-dimensional position data vertical offset unit 25, a three-dimensional position data segmentation unit 26, a perspective projection unit 27, a three-dimensional position data polygon meshing unit 29, a two-dimensional data file generation unit 28, and a three-dimensional data file generation unit 30.
The three-dimensional sensor unit 20 has a three-dimensional sensor, oriented along the optical axis direction of the projection lens, that three-dimensionally measures each surface of the three-dimensional object that is the projection target. The detection range of the three-dimensional sensor includes the entire projectable area in which the projection lens can project an image.
FIG. 6 is a schematic diagram for explaining the relative positional relationship between the detection range of the three-dimensional sensor and the projectable area. As shown in FIG. 6, the three-dimensional sensor 108 is oriented along the optical axis direction of the projection lens 101a. The projection area can be enlarged or reduced with the zoom function and moved up, down, left, and right with the lens shift function.
The projection area 113 is the projection area when the lens is shifted to the upper right and the zoom is minimized. The projection area 114 is the projection area when the lens is shifted to the upper right and the zoom is maximized. The projection area 115 is the projection area when the lens is shifted to the upper left and the zoom is minimized. The projection area 116 is the projection area when the lens is shifted to the upper left and the zoom is maximized.
The projection area 117 is the projection area when the lens is shifted to the lower right and the zoom is minimized. The projection area 118 is the projection area when the lens is shifted to the lower right and the zoom is maximized. The projection area 119 is the projection area when the lens is shifted to the lower left and the zoom is minimized. The projection area 120 is the projection area when the lens is shifted to the lower left and the zoom is maximized.
The detection range of the three-dimensional sensor 108 includes the entire projectable area in which the image from the projection lens 101a can be projected, that is, all of the projection areas 113 to 120. The three-dimensional sensor 108 can therefore measure the three-dimensional position of a three-dimensional object placed anywhere in the projectable area. The three-dimensional sensor unit 20 supplies the three-dimensional position data output by the three-dimensional sensor 108 to the three-dimensional position data projector coordinate conversion unit 22.
As the three-dimensional sensor 108, for example, a TOF (Time of Flight) sensor or a triangulation sensor can be used, but the sensor is not limited to these types. The TOF method performs three-dimensional measurement by projecting light toward an object and measuring the time taken for the projected light to be reflected by the object and return. Triangulation methods include, for example, the passive triangulation method and the active triangulation method. The passive triangulation method, also called the stereo camera method, photographs the object simultaneously with two cameras arranged side by side and performs three-dimensional measurement by applying the principle of triangulation to the difference in the object's position between the images captured by the two cameras. The active triangulation method irradiates the object with light and performs three-dimensional measurement by applying the principle of triangulation to information about the light reflected from the object.
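As a point of reference for the TOF method (standard optics, not something specific to this embodiment), the measured round-trip time Δt converts to distance as d = c·Δt/2, where c is the speed of light; a round trip of 20 ns, for example, corresponds to a distance of about 3 m (3×10^8 m/s × 20×10^-9 s / 2).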
Referring again to FIG. 4, the calibration data storage unit 21 stores calibration data. The calibration data includes the parameters for converting the coordinate system of the three-dimensional sensor 108 into the coordinate system of the projector (a rotation amount and a translation amount), a reference zoom position, and a reference lens shift position. The rotation amount and the translation amount are obtained by performing a calibration that measures the positional relationship between the coordinate system of the three-dimensional sensor and the coordinate system of the projector. The reference zoom position and the reference lens shift position are the zoom position and the lens shift position at the time the calibration was performed.
The three-dimensional position data projector coordinate conversion unit 22 acquires the calibration data (rotation amount, translation amount, reference zoom position, and reference lens shift position) from the calibration data storage unit 21, the design data from the projector projection design data storage unit 9, and the zoom/shift position information from the projection lens unit 14. On the basis of the calibration data and the design data, the three-dimensional position data projector coordinate conversion unit 22 converts the three-dimensional position data of the three-dimensional object from the three-dimensional sensor into three-dimensional position data in the projector coordinate system whose origin is the projection center point. The calibration data and the design data can be called coordinate conversion data for converting the coordinate system of the three-dimensional sensor into the projector coordinate system whose origin is the projection center point.
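A minimal sketch of this conversion, assuming the calibration yields a 3×3 rotation matrix R and a translation vector t that map a sensor-frame point p to R·p + t in the projector frame; any zoom/shift-dependent movement of the projection center point derived from the design data would be folded into t here. The sample values are invented.

    import numpy as np

    def sensor_to_projector(points, R, t):
        """Map Nx3 sensor-frame points into the projector coordinate system.
        R (3x3) and t (3,) come from calibration; the zoom/shift-dependent
        shift of the projection center point is assumed folded into t."""
        return points @ R.T + t

    # Example with an identity rotation and an invented sensor-to-projector offset.
    R = np.eye(3)
    t = np.array([0.05, -0.02, 0.0])    # meters, hypothetical
    cloud = np.array([[0.0, 0.0, 2.0], [0.3, 0.1, 2.5]])
    print(sensor_to_projector(cloud, R, t))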
The three-dimensional position data world coordinate conversion unit 23 acquires the tilt of the projector with respect to the horizontal plane from the attitude sensor unit 11 and the three-dimensional position data converted into the projector coordinate system from the three-dimensional position data projector coordinate conversion unit 22. On the basis of the tilt of the projector with respect to the horizontal plane, the three-dimensional position data world coordinate conversion unit 23 converts the three-dimensional position data in the projector coordinate system into a world coordinate system whose reference is the horizontal plane. The three-dimensional position data converted into the world coordinate system is supplied to the vertical offset calculation unit 24 and the three-dimensional position data vertical offset unit 25.
The vertical offset calculation unit 24 obtains the minimum vertical coordinate from the three-dimensional position data converted into the world coordinate system by the three-dimensional position data world coordinate conversion unit 23. If the minimum vertical coordinate is negative, the vertical offset calculation unit 24 outputs a vertical offset amount equal to the absolute value of that minimum; if the minimum vertical coordinate is positive, it outputs a vertical offset amount of 0. The output of the vertical offset calculation unit 24 is supplied to the projector data vertical offset unit 18 and the three-dimensional position data vertical offset unit 25.
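A compact sketch of these two steps, assuming the attitude sensor reports pitch and roll angles and that the Y axis of the world coordinate system is vertical; the axis conventions and the sample values are assumptions, not taken from the embodiment.

    import numpy as np

    def to_world(points, pitch_rad, roll_rad):
        """Rotate projector-frame points into a world frame whose Y axis is
        vertical, assuming the attitude sensor reports pitch (about X) and
        roll (about Z). Axis conventions here are illustrative."""
        cp, sp = np.cos(pitch_rad), np.sin(pitch_rad)
        cr, sr = np.cos(roll_rad), np.sin(roll_rad)
        Rx = np.array([[1, 0, 0], [0, cp, -sp], [0, sp, cp]])
        Rz = np.array([[cr, -sr, 0], [sr, cr, 0], [0, 0, 1]])
        return points @ (Rz @ Rx).T

    def vertical_offset(world_points):
        """Offset that lifts the lowest point up to Y = 0, as in unit 24."""
        y_min = world_points[:, 1].min()
        return -y_min if y_min < 0 else 0.0

    pts = np.array([[0.0, -0.3, 2.0], [0.2, 0.4, 2.5]])
    world = to_world(pts, pitch_rad=np.radians(3.0), roll_rad=0.0)
    off = vertical_offset(world)
    world[:, 1] += off                  # the adjustment performed by unit 25
    print(off, world)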
On the basis of the vertical offset amount calculated by the vertical offset calculation unit 24, the three-dimensional position data vertical offset unit 25 offsets, in the vertical direction, the three-dimensional position data converted into the world coordinate system by the three-dimensional position data world coordinate conversion unit 23. Specifically, the offset is performed by adding the vertical offset amount to the vertical coordinate of the three-dimensional position data. The vertically offset three-dimensional position data is supplied to the three-dimensional position data segmentation unit 26, the three-dimensional position data polygon meshing unit 29, and the three-dimensional data file generation unit 30.
The three-dimensional position data segmentation unit 26 acquires the segmentation parameter from the parameter storage unit 3. On the basis of the segmentation parameter, the three-dimensional position data segmentation unit 26 detects, from the three-dimensional position data supplied by the three-dimensional position data vertical offset unit 25, the surfaces of the three-dimensional object and the ridge lines that describe their shapes. The detection results for the surfaces and ridge lines are supplied to the perspective projection unit 27 and the three-dimensional position data polygon meshing unit 29.
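The embodiment specifies only that a threshold (the segmentation parameter) controls the surface extraction, not the algorithm itself. As one illustrative possibility, planar surfaces can be peeled off the point cloud with a RANSAC-style loop in which that threshold acts as the inlier distance; the iteration count and minimum inlier count below are assumptions.

    import numpy as np

    def ransac_planes(points, threshold, min_inliers=100, iters=200, seed=0):
        """Illustrative plane segmentation: repeatedly fit a plane to three
        random points and collect all points within `threshold` of it; the
        segmentation parameter of the embodiment plays the role of
        `threshold`."""
        rng = np.random.default_rng(seed)
        remaining = points.copy()
        planes = []
        while len(remaining) >= min_inliers:
            best = None
            for _ in range(iters):
                p0, p1, p2 = remaining[rng.choice(len(remaining), 3,
                                                  replace=False)]
                n = np.cross(p1 - p0, p2 - p0)
                norm = np.linalg.norm(n)
                if norm < 1e-9:          # degenerate (collinear) sample
                    continue
                n /= norm
                dist = np.abs((remaining - p0) @ n)
                inliers = dist < threshold
                if best is None or inliers.sum() > best[0].sum():
                    best = (inliers, n, p0)
            if best is None or best[0].sum() < min_inliers:
                break
            inliers, n, p0 = best
            planes.append((n, p0, remaining[inliers]))
            remaining = remaining[~inliers]
        return planes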
The perspective projection unit 27 acquires the projector data from the projector data vertical offset unit 18. Using the projector data, the perspective projection unit 27 perspectively projects the ridge lines detected by the three-dimensional position data segmentation unit 26 onto the projection surface, thereby generating two-dimensional data indicating the outline position of each surface of the three-dimensional object in the projected video. This two-dimensional data is supplied to the two-dimensional data file generation unit 28.
The two-dimensional data file generation unit 28 acquires the file format parameter from the parameter storage unit 3 and generates, from the two-dimensional data created by the perspective projection unit 27, the two-dimensional data file 32 based on the file format indicated by the file format parameter. The two-dimensional data file 32 is stored in the file storage unit 8.
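A minimal sketch of such a perspective projection, assuming the projector data places the projector at position `pos` looking along +Z with symmetric half-angles (the situation after angle-of-view symmetrization); a general projection direction vector would add a rotation before the pinhole division. All names and numbers are illustrative.

    import numpy as np

    def project_to_image(points, pos, half_h_deg, half_v_deg):
        """Perspectively project 3D ridge points into normalized projector
        image coordinates in [0, 1] x [0, 1]."""
        rel = points - pos
        x = rel[:, 0] / rel[:, 2]        # pinhole division
        y = rel[:, 1] / rel[:, 2]
        u = 0.5 + 0.5 * x / np.tan(np.radians(half_h_deg))
        v = 0.5 - 0.5 * y / np.tan(np.radians(half_v_deg))
        return np.stack([u, v], axis=1)

    ridge = np.array([[0.0, 0.0, 3.0], [0.5, 0.25, 3.0]])
    print(project_to_image(ridge, pos=np.zeros(3),
                           half_h_deg=20.0, half_v_deg=12.0))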
The three-dimensional position data polygon meshing unit 29 acquires the polygon meshing parameter and the file format parameter from the parameter storage unit 3. On the basis of the polygon meshing parameter and the file format indicated by the file format parameter, the three-dimensional position data polygon meshing unit 29 converts the surfaces detected by the three-dimensional position data segmentation unit 26 into polygon meshes. The data of the polygon-meshed surfaces is supplied to the three-dimensional data file generation unit 30.
The three-dimensional data file generation unit 30 acquires the file format parameter from the parameter storage unit 3 and generates, from the data of the surfaces polygon-meshed by the three-dimensional position data polygon meshing unit 29, the three-dimensional data file 33 based on the file format indicated by the file format parameter. The three-dimensional data file 33 is stored in the file storage unit 8.
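The embodiment leaves the meshing algorithm to the polygon meshing parameter. As one illustrative possibility for the triangular-polygon case, a planar segment produced by the segmentation unit 26 can be triangulated in the plane's own 2D coordinates; `scipy.spatial.Delaunay` is assumed to be available, and decimation to the parameterized mesh coarseness is omitted.

    import numpy as np
    from scipy.spatial import Delaunay

    def mesh_planar_segment(points, normal):
        """Illustrative triangular meshing of one planar segment: project
        the segment's points onto an in-plane 2D basis, triangulate there,
        and reuse the indices as 3D triangles."""
        normal = normal / np.linalg.norm(normal)
        # Build two in-plane basis vectors orthogonal to the normal.
        helper = np.array([1.0, 0.0, 0.0])
        if abs(normal @ helper) > 0.9:
            helper = np.array([0.0, 1.0, 0.0])
        u = np.cross(normal, helper)
        u /= np.linalg.norm(u)
        v = np.cross(normal, u)
        uv = np.stack([points @ u, points @ v], axis=1)
        return Delaunay(uv).simplices    # (M, 3) vertex indices into points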
Next, the configuration of a projection mapping system using the projector of this embodiment will be described.
FIG. 7 shows an example of the projection mapping system. Referring to FIG. 7, the projection mapping system includes a projector 201 and a video processing device 205 such as a personal computer. The projector 201 and the video processing device 205 are connected through a communication means 207 so that they can communicate with each other. For example, the communication means 207 may consist of a communication cable and a video signal cable. In this case, the video signal cable is used by the video processing device 205, described later, to supply a projection mapping video for three-dimensional space to the projector 201, and the communication cable is used, for example, by the projector 201, described later, to supply data such as the projector data file 31, the two-dimensional data file 32, and the three-dimensional data file 33 to the video processing device 205. The communication cable and the video signal cable can also be combined into a single cable, and the communication means 207 may include a wireless communication means.
The projector 201 has the configuration described with reference to FIGS. 4 to 6 and projects a video based on the video signal from the video processing device 205 onto each surface of the three-dimensional object 104. The video processing device 205 is equipped with at least one of a projection mapping video creation tool for two-dimensional space and a projection mapping video creation tool for three-dimensional space.
In the projection mapping system shown in FIG. 7, projection mapping is performed according to the following procedure.
First, the projector 201 executes the mapping data creation processing in response to a data creation instruction (data creation request) from the video processing device 205. This mapping data creation processing creates data for the projection mapping video creation tool for two-dimensional space or data for the projection mapping video creation tool for three-dimensional space. Here, for convenience, it is assumed that the video processing device 205 is equipped with the projection mapping video creation tool for three-dimensional space, and the projector 201 therefore creates data for that tool. This data includes the three-dimensional data of the three-dimensional object and the projector data for setting up the rendering camera and the projector in a virtual three-dimensional space.
Next, the video processing device 205 acquires the data generated by the projector 201 for the projection mapping video creation tool for three-dimensional space, that is, the three-dimensional data file 33 and the projector data file 31. The projection mapping video creation tool for three-dimensional space then creates a projection mapping video for three-dimensional space using the three-dimensional data file 33 and the projector data file 31 acquired from the projector 201.
The video processing device 205 supplies the projection mapping video for three-dimensional space to the projector 201, and the projector 201 projects a video based on that projection mapping video onto each surface of the three-dimensional object 104.
When the video processing device 205 is equipped with the projection mapping video creation tool for two-dimensional space, the projector 201 creates, instead of the three-dimensional data file 33, the two-dimensional data file 32, which is the data for that tool. The two-dimensional data file 32 holds two-dimensional data indicating the outline position of each surface of the three-dimensional object in the projected video and is supplied from the projector 201 to the video processing device 205. In the video processing device 205, the projection mapping video creation tool for two-dimensional space creates a projection mapping video for two-dimensional space using the two-dimensional data file 32.
Next, the operation of the mapping data creation processing of the projector 201 of this embodiment will be described in detail.
FIG. 8 shows a procedure of the mapping data creation processing.
In step S10, the operator performs, on the video processing device 205, an input operation (data creation request) for setting the parameters required to create a projection mapping video. In response to this input operation, a parameter setting start instruction and parameter setting screen information are supplied from the video processing device 205 to the projector 201. In the projector 201, the parameter setting screen information is supplied to the projection unit 5, and the communication control unit 1 causes the projection unit 5 to project the parameter setting screen in accordance with the parameter setting start instruction.
The operator inputs the necessary parameter information while referring to the parameter setting screen. Here, the parameters include those required for the mapping data creation processing (the angle-of-view symmetrization selection parameter, the segmentation parameter, the polygon meshing parameter, the file format parameter, the mapping mode parameter, and the like). The input parameter information is supplied from the video processing device 205 to the projector 201, where the communication control unit 1 stores the parameters input by the operator in the parameter storage unit 3.
Next, in step S11, the angle-of-view symmetrization unit 10 acquires the zoom/shift position information from the projection lens unit 14, the design data from the projector projection design data storage unit 9, and the angle-of-view symmetrization selection parameter and the mapping mode parameter from the parameter storage unit 3. In accordance with the mapping mode parameter and the angle-of-view symmetrization selection parameter, the angle-of-view symmetrization unit 10 calculates, on the basis of the zoom/shift position information and the design data, the projection direction vector indicating the direction of the projection center axis of the projector, the angles of view, and the horizontal and vertical tilt angles of the projection surface relative to a projection surface squarely facing the projector.
The calculation of the angles of view, the projection direction, and the tilts by the angle-of-view symmetrization unit 10 includes the following first and second processes.
(First process)
The angle-of-view symmetrization unit 10 executes the first process when angle-of-view symmetrization, which makes the angles of view vertically and horizontally symmetric, is not performed, that is, when the mapping mode parameter is set to mapping for two-dimensional space, or when the mapping mode parameter is set to mapping for three-dimensional space and the angle-of-view symmetrization selection parameter is set to no symmetrization.
In the first process, the angle-of-view symmetrization unit 10 obtains the angles of view from the zoom position and lens shift position of the projection lens unit 14 and from the design data related to projection stored in the projector projection design data storage unit 9, such as the projector's angles of view, zoom characteristics, and lens shift characteristics. The angle-of-view symmetrization unit 10 then sets the projection direction vector to the direction of the projection optical axis, that is, to a vector whose horizontal and vertical components are both 0. As for the tilt of the projection surface, the angle-of-view symmetrization unit 10 does not change the projection surface and therefore sets both the horizontal tilt and the vertical tilt to 0.
(Second process)
The angle-of-view symmetrization unit 10 executes the second process when angle-of-view symmetrization, which makes the angles of view vertically and horizontally symmetric, is performed, that is, when the mapping mode parameter is set to mapping for three-dimensional space and the angle-of-view symmetrization selection parameter is set to symmetrization enabled.
In the second process, the angle-of-view symmetrization unit 10 obtains the angles of view from the zoom position and lens shift position of the projection lens unit 14 and from the design data related to projection stored in the projector projection design data storage unit 9, such as the projector's angles of view, zoom characteristics, and lens shift characteristics. Next, the angle-of-view symmetrization unit 10 determines a projection surface perpendicular to the projection center axis such that, when distortion correction is performed so that the video projected on that projection surface becomes square, the left and right horizontal angles of view and the upper and lower vertical angles of view with respect to the projection center axis are equal, and sets the projection direction vector to the direction of that projection center axis. The angle-of-view symmetrization unit 10 then obtains the horizontal and vertical tilts of that projection surface, which is perpendicular to the projection center axis, with respect to the plane perpendicular to the projection optical axis.
Note that the projection unit 5 performs distortion correction so that the projected video on the projection surface becomes square, and because the projected video area after this distortion correction is smaller than before the correction, the distortion correction also affects the angles of view. It is therefore desirable to determine the projection center axis for which the left and right horizontal angles of view are equal and the upper and lower vertical angles of view are equal, as well as the tilt of the projection surface relative to the projection surface obtained when the projector and the screen squarely face each other, while also taking into account the change in the projected video area after distortion correction.
Here, the operations of the angle-of-view symmetrization unit 10 and the projection unit 5 will be described in more detail.
In the projection unit 5, the distortion correction coefficient calculation unit 15 normally calculates the distortion correction coefficient according to the distortion correction settings made by the user, and the distortion correction unit 13 corrects the distortion that occurs when projecting onto a screen that does not squarely face the projector. However, when the mapping mode parameter is set to mapping for two-dimensional space or mapping for three-dimensional space, the distortion correction coefficient calculation unit 15 calculates the distortion correction coefficient not from the user's distortion correction settings but from the tilt of the projection surface supplied by the angle-of-view symmetrization unit 10, and the distortion correction unit 13 corrects the distortion on the basis of that coefficient. The angle-of-view symmetrization and the distortion correction processing in the projection unit 5 are executed every time the user adjusts the zoom position and the lens shift position of the projection lens unit 14. This eliminates the problem that a gap arises between the projected video and the three-dimensional object when a tool that can only set vertically and horizontally symmetric angles of view is used to create a projection mapping video in a three-dimensional space.
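The embodiment does not disclose the internal form of the distortion correction coefficient. Purely as an illustration of one conventional way such a correction can be realized, the mapping between the display image and a projection surface tilted by the computed angles can be modeled as the homography induced by rotating the viewing direction, and the input video pre-warped by its inverse. The sketch below follows that assumption (image coordinates measured from the principal point, focal length f in pixels) and is not the patented method.

    import numpy as np

    def rotation_homography(tilt_h_rad, tilt_v_rad, f):
        """Homography K R K^-1 induced by rotating the viewing direction
        by the given horizontal/vertical tilts; one standard keystone
        model, offered only as an illustration."""
        ch, sh = np.cos(tilt_h_rad), np.sin(tilt_h_rad)
        cv, sv = np.cos(tilt_v_rad), np.sin(tilt_v_rad)
        Ry = np.array([[ch, 0, sh], [0, 1, 0], [-sh, 0, ch]])   # horizontal tilt
        Rx = np.array([[1, 0, 0], [0, cv, -sv], [0, sv, cv]])   # vertical tilt
        K = np.array([[f, 0, 0], [0, f, 0], [0, 0, 1]])
        return K @ (Rx @ Ry) @ np.linalg.inv(K)

    H = rotation_homography(np.radians(5), np.radians(15), f=1000.0)
    H_inv = np.linalg.inv(H)    # pre-warp applied to the input video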
The elimination of the above problem will be explained with reference to FIGS. 9 and 10.
FIG. 9A shows a projection state in launch projection in which the projection optical axis passes through the center of the lower side of the projection area, and FIG. 9B shows the projected video area before and after distortion correction.
In the example of FIG. 9, the projection center axis 110 does not coincide with the projection optical axis 109. For the vertical angles of view with respect to the projection optical axis 109, let the upper angle be θT and the lower angle be θB; for the vertical angles of view with respect to the projection center axis 110, let the upper angle be θ'T and the lower angle be θ'B. Since the lower angle θB = 0 and the upper angle θT therefore does not equal the lower angle θB, the angles are not vertically symmetric. The angle-of-view symmetrization unit 10 determines the projection center axis 110 and a projection surface perpendicular to it such that the vertical angles of view become vertically symmetric (θ'T = θ'B) when distortion correction is performed so that the video projected on the projection surface becomes square, and obtains the tilt of that projection surface relative to the projection surface obtained when the projector 201 and the screen squarely face each other. The angle-of-view symmetrization unit 10 then supplies the tilt of the projection surface to the distortion correction coefficient calculation unit 15, the distortion correction coefficient calculation unit 15 calculates the distortion correction coefficient from that tilt, and the distortion correction unit 13 performs distortion correction based on the coefficient so that the projected video on the projection surface becomes square. With this distortion correction, the projected video area 122 is corrected to the projected video area 123 as shown in FIG. 9B, and as a result the projection area 103 is corrected to the projection area 121 as shown in FIG. 9A.
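As a first-order check of this behavior (ignoring the image shrinkage caused by the correction itself, which the embodiment does account for): tilting the axis upward by φ = (θT − θB)/2 gives θ'T = θT − φ = (θT + θB)/2 and θ'B = θB + φ = (θT + θB)/2, so θ'T = θ'B. For instance, with θT = 30° and θB = 0° in a launch configuration like that of FIG. 9A, φ = 15° and both half-angles become 15°.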
FIG. 10A shows a projection state when the lens is shifted in the upper-left direction, and FIG. 10B shows the projected video areas before and after distortion correction. Also in this example, for the vertical angle of view referenced to the projection center axis 110, let the upper angle be θ′T and the lower angle be θ′B, and for the horizontal angle of view referenced to the projection center axis 110, let the right angle be θ′R and the left angle be θ′L.
Since θ′T ≠ θ′B, the vertical angle of view is not vertically symmetrical, and since θ′R ≠ θ′L, the horizontal angle of view is not horizontally symmetrical either. In this case, the view angle symmetrization unit 10 obtains the projection center axis 110, and the projection plane perpendicular to the projection center axis 110, such that the vertical angle of view becomes vertically symmetrical (θ′T = θ′B) and the horizontal angle of view becomes horizontally symmetrical (θ′R = θ′L) when distortion correction is performed so that the video projected on the projection plane becomes square, and then obtains the inclination of that projection plane relative to the projection plane obtained when the projector 201 directly faces the screen. The view angle symmetrization unit 10 supplies this inclination to the distortion correction coefficient calculation unit 15, the distortion correction coefficient calculation unit 15 calculates a distortion correction coefficient based on it, and the distortion correction unit 13 performs distortion correction based on the distortion correction coefficient so that the projected video on the projection plane becomes square. By this distortion correction, the projection video area 122 is corrected to the projection video area 123 as shown in FIG. 10B, and as a result, the projection area 103 is corrected to the projection area 121 as shown in FIG. 10A.
Referring again to FIG. 8, in step S12 the communication control unit 1 of the projector 201 determines whether or not a mapping data generation start instruction has been received. After adjusting the position, projection direction, zoom position, and lens shift position of the projector 201 so that the projected video falls on the three-dimensional object that is the projection target, the user performs an input operation on the information processing device 205 to instruct the start of mapping data generation. In response to this input operation, a mapping data generation start instruction is supplied from the information processing device 205 to the projector 201.
When the mapping data generation start instruction is received, in steps S13 and S14 the communication control unit 1 determines whether the two-dimensional space mapping mode or the three-dimensional space mapping mode is set in the mapping mode parameter.
If the two-dimensional space mapping mode is set in the mapping mode parameter, the communication control unit 1 causes the mapping data generation unit 7 to generate the two-dimensional data file 32 in step S15. On the other hand, if the three-dimensional space mapping mode is set in the mapping mode parameter, in step S16 the communication control unit 1 causes the mapping data generation unit 7 to generate the three-dimensional data file 33, and causes the projector data generation unit 6 to generate the projector data file 31. If the no-mapping mode is set in the mapping mode parameter, the mapping data creation process ends without generating any data file.
Hereinafter, the operation will be described separately for the two-dimensional space mapping mode and the three-dimensional space mapping mode.
(Three-dimensional space mapping mode)
In the three-dimensional space mapping mode, in accordance with the mapping data generation start instruction from the communication control unit 1, the mapping data generation unit 7 executes the generation process of the three-dimensional data file 33, and the projector data generation unit 6 executes the generation process of the projector data file 31.
FIG. 11 shows a procedure for generating the three-dimensional data file 33.
In step S20, the three-dimensional sensor unit 20 measures the three-dimensional object three-dimensionally and outputs three-dimensional position data. This three-dimensional position data is represented as point cloud data expressed in three-dimensional coordinates in the coordinate system of the three-dimensional sensor 108.
In step S21, the three-dimensional position data projector coordinate conversion unit 22 acquires the rotation amount, translation amount, reference zoom position, and reference lens shift position from the calibration data storage unit 21, acquires the current zoom position and lens shift position from the projection lens unit 14, and acquires the design data from the projector projection design data storage unit 9. The three-dimensional position data projector coordinate conversion unit 22 then converts the coordinate system of the three-dimensional position data from the three-dimensional sensor unit 20 into the coordinate system of the projector 201 with the projection center point as the origin, based on the rotation amount, translation amount, reference zoom position, reference lens shift position, current zoom position and lens shift position, and design data.
FIG. 12 schematically shows the positional relationship between the coordinate system of the three-dimensional sensor 108 and the coordinate system of the projector 201.
As shown in FIG. 12, the origin of the coordinate system of the three-dimensional sensor 108 does not coincide with the origin (projection center point) of the coordinate system of the projector 201, and the detection direction of the three-dimensional sensor 108 does not coincide with the projection direction of the projector 201 either. For this reason, the coordinate system of the three-dimensional position data measured by the three-dimensional sensor 108 differs from the coordinate system of the projector 201. The coordinate transformation in step S21 can be defined by a rotation amount about the three coordinate axes (XYZ) and a translation amount representing movement along the three coordinate axes (XYZ). Obtaining this rotation amount and translation amount is called calibration.
In the projector 201, the projection center point that serves as the origin of the projector coordinate system is not always at the same position but moves with zoom and lens shift operations. Therefore, together with the rotation amount and translation amount, the zoom position and lens shift position at the time calibration was performed are stored in the calibration data storage unit 21 as the reference zoom position and the reference lens shift position, respectively.
The three-dimensional position data projector coordinate conversion unit 22 first obtains the coordinates of the reference projection center point at the time of calibration, using the design data, the reference zoom position, and the reference lens shift position. Next, the three-dimensional position data projector coordinate conversion unit 22 obtains the coordinates of the current projection center point based on the current zoom position and lens shift position, and obtains the translation amount for the coordinate transformation from the coordinates of the reference projection center point to the coordinates of the current projection center point.
Next, the three-dimensional position data projector coordinate conversion unit 22 performs a coordinate transformation of the three-dimensional position data from the three-dimensional sensor unit 20 based on the rotation amount and translation amount stored in the calibration data storage unit 21. It then moves the coordinates by the translation amount from the coordinates of the reference projection center point to the coordinates of the current projection center point, obtained from the current zoom position and lens shift position. As a result, the coordinate system of the three-dimensional position data from the three-dimensional sensor unit 20 is converted into the projector coordinate system with the current projection center point as the origin.
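The two-step conversion just described can be pictured with the following minimal numpy sketch; the rotation matrix, translation vector, and projection center coordinates are placeholder values standing in for the calibration data and design data, and all names are hypothetical.

    import numpy as np

    def sensor_to_projector(points, R, t, ref_center, cur_center):
        # points: (N, 3) point cloud from the 3-D sensor.
        # R, t: calibration rotation (3x3) and translation (3,) into the
        # projector frame whose origin is the reference projection center.
        p = points @ R.T + t                   # sensor frame -> reference projector frame
        return p - (cur_center - ref_center)   # re-origin at the current projection center

    R = np.eye(3)                              # placeholder calibration rotation
    t = np.array([0.02, -0.05, 0.10])          # placeholder translation (metres)
    ref_center = np.zeros(3)                   # projection center at calibration time
    cur_center = np.array([0.0, 0.01, 0.005])  # moved by zoom / lens shift
    pts = np.array([[0.5, 0.2, 2.0], [0.6, 0.3, 2.1]])
    print(sensor_to_projector(pts, R, t, ref_center, cur_center))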
In step S22, the three-dimensional position data world coordinate conversion unit 23 converts the three-dimensional position data, which has been transformed into the projector coordinate system, into a world coordinate system referenced to the horizontal plane, based on the inclination of the projector with respect to the horizontal plane detected by the attitude sensor unit 11.
The projector 201 may be installed tilted in the front-rear or left-right direction, or installed upside down. For example, as shown in FIG. 13, the installation state of the projector 201 includes various states such as a horizontal state 124, an upside-down state 125, an upward state 126, a downward state 127, a right-rotated state 128, and a left-rotated state 129. For this reason, the three-dimensional position data world coordinate conversion unit 23 performs a conversion to the world coordinate system referenced to the horizontal plane, as shown in FIG. 12, based on the inclination from the horizontal detected by the attitude sensor unit 11.
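As an illustration of this levelling step, the sketch below applies the inverse of the projector's pitch and roll, as an attitude sensor might report them, to points in the projector frame; the axis conventions and names are assumptions made only for this example.

    import numpy as np

    def level_to_world(points, pitch_rad, roll_rad):
        # Rotate projector-frame points into a world frame whose XZ plane is
        # horizontal, given the projector's pitch (about X) and roll (about Z).
        cp, sp = np.cos(pitch_rad), np.sin(pitch_rad)
        cr, sr = np.cos(roll_rad), np.sin(roll_rad)
        Rx = np.array([[1, 0, 0], [0, cp, -sp], [0, sp, cp]])  # pitch
        Rz = np.array([[cr, -sr, 0], [sr, cr, 0], [0, 0, 1]])  # roll
        R_tilt = Rx @ Rz
        # For a rotation matrix the inverse is the transpose, and
        # points @ R_tilt applies that transpose to each row vector.
        return points @ R_tilt

    pts = np.array([[0.0, 0.0, 2.0]])
    print(level_to_world(pts, pitch_rad=np.radians(10.0), roll_rad=0.0))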
In a projection mapping video creation tool for three-dimensional space, not only planar video but also three-dimensional projection mapping that produces a stereoscopic video expression can be created as the video projected onto the surfaces of a three-dimensional object. To produce a stereoscopic expression on flat surfaces, the position of the viewpoint must be set. That is, the viewpoint is set at a three-dimensional position, and the video for the surfaces of the three-dimensional object is created so that it appears stereoscopic when viewed from that position. Here, the viewpoint is the position of the eyes of a person standing on the ground, so for the viewpoint setting the data must be converted in advance into a world coordinate system referenced to a horizontal plane parallel to the ground.
Since the vertical coordinates of the coordinate-transformed three-dimensional position data may include negative values, the vertical coordinates need to be adjusted. In this vertical coordinate adjustment, in order to make all vertical coordinates of the three-dimensional position data 0 or more, the vertical offset amount calculation unit 24 calculates a vertical offset amount, and the three-dimensional position data vertical offset unit 25 adds the vertical offset amount to all vertical coordinates of the three-dimensional position data. Specifically, the vertical offset amount calculation unit 24 obtains the minimum value of the vertical coordinates of the three-dimensional position data; if it is negative, the unit outputs the absolute value of that minimum as the vertical offset amount, and otherwise it outputs 0 as the vertical offset amount.
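This adjustment is simple enough to state directly in code; a minimal sketch (names hypothetical):

    import numpy as np

    def apply_vertical_offset(points, y_axis=1):
        # Shift the point cloud so that no vertical coordinate is negative;
        # returns the offset applied (0 when the minimum is already >= 0).
        y_min = points[:, y_axis].min()
        offset = -y_min if y_min < 0 else 0.0
        points[:, y_axis] += offset
        return offset

    pts = np.array([[0.0, -0.3, 2.0], [0.5, 1.2, 2.4]])
    print(apply_vertical_offset(pts), pts[:, 1].min())  # 0.3 0.0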
In step S24, the three-dimensional position data segmentation unit 26 detects, from the three-dimensional position data, the surfaces of the three-dimensional object that is the projection target and the ridge lines indicating their outlines, based on the segmentation parameters stored in the parameter storage unit 3.
FIG. 14 shows an example of the segmentation process. The three-dimensional position data segmentation unit 26 first obtains a normal vector 132 for each point of the point cloud data 130 of the three-dimensional position data. Here, the normal vector 132 is the composite of the normal vectors 131 of the triangular faces formed by the point of interest and its neighboring points. Next, the three-dimensional position data segmentation unit 26 compares the normal vectors of adjacent points, and if the difference is smaller than the threshold set by the segmentation parameter, it assigns the points to the same surface. In this way, the surfaces 133 and 134 can be extracted. When the surface extraction is complete, the three-dimensional position data segmentation unit 26 calculates the ridge line 135, which is the line of intersection between the adjacent surfaces 133 and 134.
In the segmentation process described above, the threshold set by the segmentation parameter determines to what extent a curved region is extracted as a single surface. In this example the segmentation parameter is a threshold on the difference between normal vectors, but its meaning and value differ depending on the segmentation method.
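One way to picture this segmentation is the following region-growing sketch over an organized point grid (rows and columns as a depth sensor might deliver them); the normal estimation and the angular threshold stand in for the segmentation parameter, and the whole sketch is a simplified assumption rather than the actual processing of unit 26.

    import numpy as np
    from collections import deque

    def grid_normals(P):
        # Unit normals for an organized (H, W, 3) point grid, from the cross
        # product of horizontal and vertical neighbour differences (the last
        # row/column reuse wrapped neighbours, acceptable for a sketch).
        dx = np.roll(P, -1, axis=1) - P
        dy = np.roll(P, -1, axis=0) - P
        n = np.cross(dx, dy)
        return n / (np.linalg.norm(n, axis=2, keepdims=True) + 1e-12)

    def segment(P, cos_threshold=0.95):
        # Label 4-connected regions whose adjacent normals agree to within
        # the threshold (cos_threshold plays the role of the parameter).
        H, W, _ = P.shape
        n = grid_normals(P)
        labels = -np.ones((H, W), dtype=int)
        next_label = 0
        for sy in range(H):
            for sx in range(W):
                if labels[sy, sx] != -1:
                    continue
                labels[sy, sx] = next_label
                queue = deque([(sy, sx)])
                while queue:
                    y, x = queue.popleft()
                    for ny, nx in ((y - 1, x), (y + 1, x), (y, x - 1), (y, x + 1)):
                        if (0 <= ny < H and 0 <= nx < W and labels[ny, nx] == -1
                                and np.dot(n[y, x], n[ny, nx]) > cos_threshold):
                            labels[ny, nx] = next_label
                            queue.append((ny, nx))
                next_label += 1
        return labels

Ridge lines would then be recovered along the boundaries between differently labelled regions, for example as the intersection lines of planes fitted to each region.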
In step S25, the three-dimensional position data polygon meshing unit 29 converts the surfaces detected by the three-dimensional position data segmentation unit 26 into polygon meshes, based on the polygon meshing parameters and the file format parameters stored in the parameter storage unit 3.
The polygons that can be handled differ depending on the format of the three-dimensional data file to be generated. For example, there are formats that can handle only triangular polygons, formats that can handle only triangular and quadrangular polygons, and formats that can handle general polygons. For this reason, the three-dimensional position data polygon meshing unit 29 performs polygonization using the file format parameter, which specifies the polygon shape, and the polygon meshing parameter, which specifies the polygon coarseness. Making the polygons finer allows curved surfaces to be expressed more smoothly, but increases the amount of computation in the tool that creates the projection mapping video.
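For instance, a mesh built from quadrangles could be down-converted for a triangles-only file format with a sketch like the following (a hypothetical helper, not the actual meshing unit):

    def quads_to_triangles(quads):
        # Split each quad (a, b, c, d) into two triangles sharing the a-c
        # diagonal, for file formats that accept triangular polygons only.
        tris = []
        for a, b, c, d in quads:
            tris.append((a, b, c))
            tris.append((a, c, d))
        return tris

    print(quads_to_triangles([(0, 1, 2, 3)]))  # [(0, 1, 2), (0, 2, 3)]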
FIG. 15A shows an example of polygon meshing using both triangular and quadrangular polygons, and FIG. 15B shows an example of polygon meshing using only triangular polygons. As can be seen from FIGS. 15A and 15B, the virtual three-dimensional objects 302 and 303 meshed only with triangular polygons have finer polygons than the virtual three-dimensional objects 300 and 301 meshed with both triangular and quadrangular polygons.
In step S26, the three-dimensional data file generation unit 30 generates, based on the file format parameter, a three-dimensional data file 33 conforming to the designated file format, from either the three-dimensional position data from the three-dimensional position data vertical offset unit 25 or the polygon-meshed three-dimensional position data from the three-dimensional position data polygon meshing unit 29. Depending on the file format, the three-dimensional data file generation unit 30 also calculates information such as the normal vector of each polygon vertex and the normal vector of each mesh face, and generates a three-dimensional data file 33 that includes this information.
When the three-dimensional data file 33 is generated using the three-dimensional position data from the three-dimensional position data vertical offset unit 25, the three-dimensional data file 33 contains the raw measured three-dimensional positions in the world coordinate system.
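As one concrete illustration (the patent itself does not name a format), the widely used Wavefront OBJ format stores vertices as 'v x y z' lines and polygons as 1-based 'f' index lines, and accepts both triangles and quadrangles; a minimal writer might look as follows:

    def write_obj(path, vertices, faces):
        # vertices: list of (x, y, z); faces: list of 0-based vertex-index
        # tuples of length 3 or 4. OBJ face indices are written 1-based.
        with open(path, "w") as f:
            for x, y, z in vertices:
                f.write(f"v {x} {y} {z}\n")
            for face in faces:
                f.write("f " + " ".join(str(i + 1) for i in face) + "\n")

    write_obj("face.obj",
              [(0, 0, 0), (1, 0, 0), (1, 1, 0), (0, 1, 0)],
              [(0, 1, 2, 3)])  # a single quadrangular polygon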
Next, generation processing of the projector data file 31 will be described. FIG. 16 shows a procedure for generating the projector data file 31.
In step S30, the initial projector data generation unit 16 sets the origin of the projector coordinate system, that is, the projection center of the projector, namely the coordinates (0, 0, 0), as the projector position coordinates, and combines them with the projection direction vector and angle of view from the view angle symmetrization unit 10 to generate initial projector data consisting of the projection direction vector, the angle of view, and the projector position coordinates.
FIG. 17 shows an example of the projection direction vector, vertical angle of view, horizontal angle of view, and projection center coordinates when the view angle symmetrization selection parameter is set to no view angle symmetrization. In this example, the projection optical axis 109 passes through the center of the lower end of the projection area 103. The lower angle θB = 0 and the upper angle θT does not coincide with the lower angle θB, but the projection optical axis 109 is regarded as the projection center axis 110. The origin of the projector coordinate system, which is the projection center point 102 of the projector 201, namely the coordinates (0, 0, 0), is set as the coordinates of the projector position 138. The projection direction vector 136 is the direction of the projection optical axis 109 (its horizontal and vertical components are each 0).
FIG. 18 shows an example of the projection direction vector, vertical angle of view, horizontal angle of view, and projection center coordinates when the view angle symmetrization selection parameter is set to view angle symmetrization enabled. In this example, the lower angle θB = 0 and the upper angle θT does not coincide with the lower angle θB, so the vertical component of the angle of view 137 is not vertically symmetrical. The projection center axis 110 is set so that the vertical angle of view becomes vertically symmetrical (θ′T = θ′B) and the horizontal angle of view becomes horizontally symmetrical (θ′R = θ′L) when distortion correction is performed so that the video projected on the projection plane perpendicular to the projection center axis becomes square. The projection direction vector 136 is then a vector indicating the direction in which the upper and lower angles of view are symmetric and the left and right angles of view are symmetric.
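Collecting the quantities named above, the initial projector data amounts to a small record; the sketch below shows the two cases, with the field names, axis convention (+y up, +z forward), and all numeric values being hypothetical:

    import math
    from dataclasses import dataclass

    @dataclass
    class ProjectorData:
        position: tuple       # projection center point = projector origin
        direction: tuple      # projection direction vector
        h_half_angles: tuple  # (theta_L, theta_R) in degrees
        v_half_angles: tuple  # (theta_T, theta_B) in degrees

    # No symmetrization (FIG. 17): the optical axis is taken as the center
    # axis, leaving the vertical half-angles asymmetric (theta_B = 0).
    no_sym = ProjectorData((0, 0, 0), (0, 0, 1), (20.0, 20.0), (30.0, 0.0))

    # With symmetrization (FIG. 18): the direction vector is tilted so that
    # the half-angles match (15 degrees up, as in the earlier sketch).
    phi = math.radians(15.0)
    with_sym = ProjectorData((0, 0, 0), (0.0, math.sin(phi), math.cos(phi)),
                             (20.0, 20.0), (15.0, 15.0))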
In step S31, the projector data world coordinate conversion unit 17 converts the projection direction vector generated by the initial projector data generation unit 16 into the world coordinate system referenced to the horizontal plane, based on the inclination of the projector with respect to the horizontal plane detected by the attitude sensor unit 11, in the same manner as the processing in the three-dimensional position data world coordinate conversion unit 23.
In step S32, the projector data vertical offset unit 18 adjusts the vertical coordinate of the projector position coordinates by adding the vertical offset amount calculated by the vertical offset amount calculation unit 24, in the same manner as the processing in the three-dimensional position data vertical offset unit 25.
In step S33, the projector file generation unit 19 generates a projector data file 31.
After the three-dimensional data file 33 and the projector data file 31 are generated, the information processing device reads in the three-dimensional data file 33 and the projector data file 31. A three-dimensional space projection mapping video creation tool constructs the three-dimensional object in a virtual three-dimensional space based on the contents of the three-dimensional data file 33, assigns a video to each surface of the three-dimensional object and plays it back, and, in accordance with the contents of the projector data file 31, sets a rendering camera and performs rendering to create the projection video. If this projection video is projected from the projector, projection mapping can be performed with the video exactly fitting the outline of the three-dimensional object.
In three-dimensional projection mapping that produces a stereoscopic video expression, when a rendering camera is placed at the viewpoint position in order to set lighting and reproduce how light falls, the projector is set according to the contents of the projector data file 31, the rendering camera is set at the viewpoint position, and rendering is performed. The rendered video is then converted into the projector's projection video using the positional relationship with the three-dimensional object, creating the projection video. If this projection video is projected from the projector, projection mapping can be performed with the video exactly fitting the outline of the three-dimensional object.
(Two-dimensional space mapping mode)
In the two-dimensional space mapping mode, in accordance with the mapping data generation start instruction from the communication control unit 1, the mapping data generation unit 7 executes the generation process of the two-dimensional data file 32, and the projector data generation unit 6 executes the processing up to the projector data vertical offset unit 18.
FIG. 19 shows a procedure for generating the two-dimensional data file 32.
In step S40, three-dimensional position data is acquired; in step S41, the coordinate system of the three-dimensional position data is converted into the projector coordinate system; in step S42, it is converted into the world coordinate system; in step S43, the vertical offset is applied; and in step S44, the surfaces and ridge lines of the three-dimensional object are detected. The processing in steps S40 to S44 is the same as that in steps S20 to S24 shown in FIG. 11. Meanwhile, the projector data generation unit 6 performs the processing of steps S30 to S33 shown in FIG. 16.
After the processing in step S44, in step S45 the perspective projection unit 27 perspectively projects the ridge lines in the three-dimensional space detected by the three-dimensional position data segmentation unit 26 onto the projection plane of the projector, based on the projector data from the projector data vertical offset unit 18, and thereby generates two-dimensional data.
In step S46, the two-dimensional data file generation unit 28 generates, from the two-dimensional data generated by the perspective projection unit 27, a two-dimensional data file 32 conforming to the file format designated by the file format parameters stored in the parameter storage unit 3.
As shown in FIG. 20, in the perspective projection, for each point of the point cloud data of the three-dimensional position data that constitutes a ridge line in the three-dimensional space, the intersection is obtained between the straight line connecting that point with the projector position coordinates indicating the projection center point 102 in the projector data and a virtually set projection plane 139 perpendicular to the projection direction vector 136 in the projector data. Taking these intersections as projection points, the projection plane 139 and the projection points of the ridge lines on it are converted into two-dimensional data, and by interpolating between the projection points, two-dimensional data indicating the outline position of each surface of the three-dimensional object is generated.
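The line-plane intersection described here can be written compactly; in the sketch below the projection plane is placed at unit distance from the projection center along the direction vector, and the 2-D coordinates are taken along two assumed in-plane axes (all names are hypothetical, and the world up-vector choice breaks down if the direction vector is vertical):

    import numpy as np

    def project_ridge_points(points, center, direction):
        # Perspectively project 3-D points onto the plane perpendicular to
        # `direction` at unit distance from `center`; returns (N, 2) coords.
        d = direction / np.linalg.norm(direction)
        up = np.array([0.0, 1.0, 0.0])
        u = np.cross(up, d)
        u /= np.linalg.norm(u)            # in-plane horizontal axis
        v = np.cross(d, u)                # in-plane vertical axis
        rel = points - center
        depth = rel @ d                   # distance along the direction vector
        hit = rel / depth[:, None]        # scale each ray to the plane at depth 1
        return np.stack([hit @ u, hit @ v], axis=1)

    pts = np.array([[0.5, 0.25, 2.0], [-0.5, 0.25, 2.0]])
    print(project_ridge_points(pts, np.zeros(3), np.array([0.0, 0.0, 1.0])))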
After the two-dimensional data file 32 is generated, the information processing device reads in the two-dimensional data file 32. Based on the contents of the two-dimensional data file 32, a two-dimensional space projection mapping video creation tool assigns a video to each surface of the three-dimensional object based on the outline position of that surface in the two-dimensional data, and creates the projection video. If this projection video is projected from the projector, projection mapping can be performed in a state that matches the outline of the three-dimensional object.
According to the projector of the present embodiment described above, the three-dimensional sensor 108 measures a three-dimensional object three-dimensionally, and the three-dimensional position data obtained as the measurement result (the positions and ridge lines of the projection surfaces on the three-dimensional object) is coordinate-transformed into the coordinate system of the projector. In this way, three-dimensional position data of the three-dimensional object in the projector coordinate system can be obtained.
Furthermore, using the coordinate-transformed three-dimensional position data, two-dimensional data indicating the position and outline of each projection surface on the three-dimensional object is created. By assigning a video to each projection surface on the three-dimensional object based on this two-dimensional data, a video for two-dimensional projection mapping can be created. Since this video for two-dimensional projection mapping is created based on the three-dimensional position data of the three-dimensional object in the projector coordinate system, when the video is projected from the projector, the range of the projected video coincides with the projection surfaces of the three-dimensional object.
In addition, using the coordinate-transformed three-dimensional position data, three-dimensional data indicating the position and outline of each projection surface on the three-dimensional object is generated, and projector data including the projection direction vector, the angle of view, and the projector position coordinates is also generated. By constructing a three-dimensional object in a virtual three-dimensional space based on the three-dimensional data, assigning a video to each surface of this virtual three-dimensional object and playing it back, and setting a rendering camera based on the projector data and performing rendering, a video for three-dimensional projection mapping can be created. Since this video for three-dimensional projection mapping is also created based on the three-dimensional position data of the three-dimensional object in the projector coordinate system, when the video is projected from the projector, the range of the projected video coincides with the projection surfaces of the three-dimensional object.
Furthermore, in three-dimensional projection mapping that produces a stereoscopic video expression, when a rendering camera is placed at the viewpoint position in order to set lighting and reproduce how light falls, the projector is set according to the contents of the projector data file, the rendering camera is set at the viewpoint position, and rendering is performed. The rendered video is then converted into the projector's projection video based on the positional relationship with the three-dimensional object, creating the projection video. When this video is projected from the projector, the range of the projected video coincides with the projection surfaces of the three-dimensional object.
In addition, distortion correction can be applied so that the angle of view of the projector becomes vertically and horizontally symmetrical, and a projector data file including the projection direction vector, angle of view, and projector position coordinates corresponding to that distortion correction can be generated. Therefore, even with a tool for creating projection mapping video in a three-dimensional space in which only a vertically and horizontally symmetrical angle of view can be set for the rendering camera or the projector, a video that coincides with the projection surfaces of the three-dimensional object can be created.
Furthermore, the data in the world coordinate system referenced to a horizontal plane parallel to the ground can be used as-is for creating video for three-dimensional projection mapping with stereoscopic video expression in a tool for creating projection mapping video in a three-dimensional space.
Hereinafter, the operation and effect regarding the view angle symmetrization in the creation of the video for three-dimensional projection mapping will be described in detail.
First, a problem relating to field angle symmetrization will be described with reference to FIGS. 21A to 21F.
Launch projection as shown in FIG. 21A is a common projection form for projectors. In this example, the projection optical axis 109 passes through the center of the lower side of the projection area 103. In this case, in the horizontal angle of view, the left angle θL and the right angle θR are equal, but in the vertical angle of view, the lower angle θB is 0 and does not coincide with the upper angle θT.
For a rendering camera in a virtual three-dimensional space, it is common that the vertical angle of view is set vertically symmetric and the horizontal angle of view is set horizontally symmetric. When such a rendering camera is used, as shown in FIG. 21B, the view angle center axis 150 must be determined so that the vertical angle of view becomes vertically symmetrical (θ′T = θ′B), and the projection optical axis 109 must be set to coincide with the view angle center axis 150.
As illustrated in FIG. 21C, the projection direction, angle of view (zoom), and position of the projector 201 are determined so that projection can be performed onto the projection target surface 153, which represents one surface of the three-dimensional object. Here, the projection target surface 153 is perpendicular to the ground. After this positioning, the position, angle of view (zoom), and direction of the projector 201 are not changed.
A video for three-dimensional projection mapping is created using the three-dimensional data (the three-dimensional data of the projection target surface 153) and projector data consisting of the position, the angle of view, and the projection direction vector. Here, the position is the coordinates of the projection center point 102; the angle of view is θL, θR (θL = θR) and θ′T, θ′B (θ′T = θ′B); and the projection direction vector is the direction of the view angle center axis 150.
In creating a video for 3D projection mapping, first, projector data is set as a parameter of the rendering camera 148. With this setting, for example, as illustrated in FIG. 21D, the virtual imaging surface 151 of the rendering camera 148 is perpendicular to the optical center axis 152.
Next, based on the three-dimensional data, the video to be projected is assigned to the virtual projection target surface 154 reproduced in the virtual three-dimensional space. Then, as shown in FIG. 21E, the video assigned to the virtual projection target surface 154 is captured by the rendering camera 148. The captured video appears as the captured virtual projection target surface 155 within the virtual imaging surface 151.
The image created as described above is projected from the projector 201. In this case, for example, as illustrated in FIG. 21F, the projection center axis 110 passing through the center of the projection image 156 projected onto the projection target surface 153 is located above the view angle center axis 150. For this reason, the projection image 156 is shifted upward with respect to the projection target surface 153.
In addition, while the virtual imaging surface 151 of the rendering camera 148 is perpendicular to the optical center axis 152, the projection surface of the projection area 103 is inclined with respect to the view angle center axis 150; because the virtual imaging surface 151 and the projection surface are not parallel, the projection video 156 is distorted. For example, as shown in FIG. 21E, in the capture by the rendering camera 148, the distance between the upper sides and the distance between the lower sides of the captured virtual projection target surface 155 and the virtual projection target surface 154 are the same. In contrast, as shown in FIG. 21F, in the projection by the projector 201, the distance between the lower sides of the video 157 on the projection target surface and the projection target surface 153 is greater than the distance between their upper sides, so the projection video 156 widens toward the bottom. The projection video 156 is therefore distorted. The phenomenon that produces this distortion is the same as the distortion that occurs with tilted or oblique projection.
According to the projector of the present embodiment, the occurrence of the above-described shift and distortion of the projection video with respect to the projection target surface can be suppressed. The principle is explained below with reference to FIGS. 21G to 21K.
As shown in FIG. 21G, the upper and lower angles of the video's angle of view are made equal (θ″T = θ″B) when distortion correction is performed so that the video projected on the projection plane perpendicular to the projection center axis becomes square. Specifically, the projection center axis 110 connecting the projection center point 102 and the center of the projection video, and the projection plane perpendicular to the projection center axis 110, are obtained, and the distortion correction coefficient is set accordingly. Since the distortion-corrected video is smaller than the original video, θ″T and θ″B are smaller than θ′T and θ′B, respectively, and θ″L and θ″R are smaller than θL and θR, respectively. The angle of view is therefore narrower than without distortion correction.
Next, as illustrated in FIG. 21H, the projection direction and position of the projector 201 are determined so that projection can be performed onto the projection target surface 153, which represents one surface of the three-dimensional object. In this example, the projection target surface 153 is perpendicular to the ground. From this point on, the position, direction, and angle of view (zoom) of the projector 201 are not changed.
A video for three-dimensional projection mapping is created using the three-dimensional data (the three-dimensional data of the projection target surface 153) and projector data consisting of the position, the angle of view, and the projection direction vector. Here, the position is the coordinates of the projection center point 102; the angle of view is θL, θR (θL = θR) and θ′T, θ′B (θ′T = θ′B); and the projection direction vector is the direction of the projection center axis 110 (= the view angle center axis 150).
In creating a video for 3D projection mapping, first, projector data is set as a parameter of the rendering camera 148. With this setting, for example, as illustrated in FIG. 21I, the virtual imaging surface 151 of the rendering camera 148 is perpendicular to the optical center axis 152.
Next, based on the three-dimensional data, the video to be projected is assigned to the virtual projection target surface 154 reproduced in the virtual three-dimensional space. Then, as shown in FIG. 21J, the video assigned to the virtual projection target surface 154 is captured by the rendering camera 148. The captured video appears as the captured virtual projection target surface 155 within the virtual imaging surface 151.
The video created as described above is projected from the projector 201. In this case, for example, as shown in FIG. 21K, the projection center axis 110 passing through the center of the projection video 156 projected onto the projection target surface 153 coincides with the view angle center axis 150. In addition, while the virtual imaging surface 151 of the rendering camera 148 is perpendicular to the optical center axis 152, the projection surface of the projection area 103 is also perpendicular to the view angle center axis 150, so the projection video 156 coincides with the projection target surface 153 and no distortion occurs.
In the above description, the operation was explained for the launch projection example shown in FIG. 21A, but the problems of shift and distortion of the projection video also arise when the position of the projection video is shifted vertically or horizontally by lens shift, as shown in FIGS. 5B to 5D. In that case, horizontal distortion correction is performed in addition to vertical distortion correction. That is, distortion correction in the vertical and horizontal directions is performed so that the angle of view of the projector 201 becomes vertically and horizontally symmetrical, and projector data including the projection direction vector, the angle of view, and the projector position coordinates is generated.
(Second Embodiment)
FIG. 22 is a block diagram showing a configuration of a projector according to the second embodiment of the present invention.
The projector shown in FIG. 22 includes a user interface unit 34 and an external storage device 35 in place of the communication control unit 1 and the communication input / output unit 2, and is different from the first embodiment in this respect.
The user interface unit 34 is an operation unit that receives an input operation by the user and performs the same control as the communication control unit 1, and includes, for example, an on-screen display and a key input unit.
The external storage device 35 is a removable storage device such as a USB (Universal Serial Bus) memory or an SD card. The projector data file 31, the two-dimensional data file 32, and the three-dimensional data file 33 are supplied from the file storage unit 8 to the external storage device 35. By connecting the external storage device 35 to the information processing apparatus, the information processing apparatus can read the projector data file 31, the two-dimensional data file 32, and the three-dimensional data file 33 from the external storage device 35.
According to the projector of the present embodiment, in addition to the effects described in the first embodiment, even when mutual communication with the information processing device via a communication cable is difficult, or when wireless communication with the information processing device is difficult, the projector data file 31, the two-dimensional data file 32, and the three-dimensional data file 33 can still be provided to the information processing device by using the external storage device 35. Here, the communication cable would be used, for example, to send the projector data file 31, the two-dimensional data file 32, and the three-dimensional data file 33.
In the first and second embodiments described above, a program may be provided for causing a computer to execute the processing corresponding to each unit of the projector (the projector data generation unit 6, the mapping data generation unit 7, the view angle symmetrization unit 10, the distortion correction unit 13, the distortion correction coefficient calculation unit 15, and so on). In this case, by executing the program, the computer can realize the functions corresponding to each of these units. The program may be provided on a computer-usable or computer-readable medium, or may be provided via a network such as the Internet. Here, a computer-usable or computer-readable medium includes any medium on which information can be recorded or from which information can be read magnetically, optically, electronically, electromagnetically, by infrared, and so on. Examples of such media include semiconductor memories, semiconductor or solid-state storage devices, magnetic tape, removable computer diskettes, random access memory (RAM), read-only memory (ROM), magnetic disks, optical disks, and magneto-optical disks.
(Third Embodiment)
FIG. 23 is a block diagram showing a configuration of a projector according to the third embodiment of the present invention.
Referring to FIG. 23, the projector includes a display element 400, a projection lens 401, a reception unit 402, and a mapping data generation unit 403.
The display element 400 includes an image forming surface including a plurality of pixels. The projection lens 401 projects an image formed on the image forming surface. An image is projected from the projection lens 401 toward a three-dimensional object.
The accepting unit 402 accepts a data creation request. The mapping data generation unit 403 generates mapping data for creating a projection mapping video in response to a data creation request. The mapping data generation unit 403 includes a three-dimensional sensor unit 404, a coordinate conversion unit 406, and a data generation unit 407.
The three-dimensional sensor unit 404 includes, for example, a three-dimensional sensor 405 arranged to face the direction in which the image is projected. The three-dimensional sensor unit 404 measures a three-dimensional object three-dimensionally with the three-dimensional sensor 405, and outputs three-dimensional position data representing, in three-dimensional coordinates, the position and shape of each surface of the three-dimensional object onto which the image is projected.
The coordinate conversion unit 406 converts the coordinate system of the three-dimensional position data into a projector coordinate system that represents the projection area of the image in three-dimensional coordinates, with the origin at the point where the principal rays from the pixels located at the diagonal corners of the image forming surface intersect.
Based on the three-dimensional position data coordinate-transformed by the coordinate conversion unit 406, the data generation unit 407 acquires, for each surface of the three-dimensional object onto which the image is projected, the position of the surface and the ridge lines indicating its outline, and generates mapping data (two-dimensional data or three-dimensional data of the three-dimensional object) based on those positions and ridge lines.
The projector of the present embodiment may further include a zoom mechanism capable of moving at least some of the lenses constituting the projection lens 401 in the optical axis direction of the projection lens 401; a lens shift mechanism capable of shifting the projection lens 401 in a shift direction orthogonal to the optical axis direction; and a first lookup table indicating the change in the position where the principal rays intersect, according to the zoom position, which is the position of the at least some of the lenses in the optical axis direction, and the lens shift position, which is the position of the projection lens 401 in the shift direction. In this case, the coordinate conversion unit 406 may refer to the first lookup table and determine the position where the principal rays intersect from the current zoom position of the zoom mechanism and the current lens shift position of the lens shift mechanism.
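A lookup table of this kind could be realized as a small grid indexed by zoom and lens shift positions and bilinearly interpolated at the current settings; the sketch below is only a schematic of the idea, with invented axes and sample values:

    import numpy as np

    zoom_axis = np.array([0.0, 0.5, 1.0])    # normalized zoom positions
    shift_axis = np.array([-1.0, 0.0, 1.0])  # normalized lens shift positions
    # Example table entries: projection-center offset (mm) per grid point.
    center_z = np.array([[10.0, 12.0, 14.0],
                         [11.0, 13.0, 15.0],
                         [12.0, 14.0, 16.0]])

    def lookup(zoom, shift):
        # Bilinear interpolation of the table at the current (zoom, shift).
        i = int(np.clip(np.searchsorted(zoom_axis, zoom) - 1, 0, len(zoom_axis) - 2))
        j = int(np.clip(np.searchsorted(shift_axis, shift) - 1, 0, len(shift_axis) - 2))
        tz = (zoom - zoom_axis[i]) / (zoom_axis[i + 1] - zoom_axis[i])
        ts = (shift - shift_axis[j]) / (shift_axis[j + 1] - shift_axis[j])
        a = center_z[i, j] * (1 - tz) + center_z[i + 1, j] * tz
        b = center_z[i, j + 1] * (1 - tz) + center_z[i + 1, j + 1] * tz
        return a * (1 - ts) + b * ts

    print(lookup(0.25, 0.5))  # interpolated offset at the current settings

An actual table would hold the full three-dimensional projection center point rather than a single offset, but the interpolation pattern is the same.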
In the projector according to the present embodiment, the coordinate conversion unit 406 may convert the coordinate system of the three-dimensional position data into the projector coordinate system, and then convert the coordinate system into the world coordinate system based on the horizontal plane.
Further, in the projector according to the present embodiment, a second lookup table indicating a change in the angle of view representing the projection range of the image as an angle according to the zoom position and the lens shift position, and a second lookup table are provided. With reference to the current zoom angle of the zoom mechanism and the current lens shift position of the lens shift mechanism, the current angle of view is acquired, and the central ray of the projection light beam from the projection lens 401 is represented based on the angle of view. An angle-of-view symmetrizing unit that sets a projection center axis and outputs angle-of-view symmetrization information including a projection direction vector indicating the direction of the projection center axis and the current angle of view, and the coordinates of the origin of the projector coordinate system A projector data generation unit for generating projector data, including the projector position coordinates, and the projection direction vector and the current angle of view indicated by the angle of view symmetrization information; It may further include a. In this case, the projector data generation unit may convert the direction of the projection vector into a world coordinate system based on the horizontal plane. Furthermore, the data generation unit 407 may generate two-dimensional data by perspectively projecting the ridge line on the projection surface based on the projector data.
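The perspective projection of the ridge lines might look as follows, taking the projector data (projector position, projection direction vector, angle of view) as inputs; the choice of an up vector and the normalization to the angle of view are illustrative assumptions, and points are assumed to lie in front of the projector.

```python
import numpy as np

def project_ridge_points(points_world: np.ndarray,
                         proj_pos: np.ndarray,
                         proj_dir: np.ndarray,
                         up: np.ndarray,
                         half_fov_rad: float) -> np.ndarray:
    """Perspectively project Nx3 world-space ridge points into
    normalized 2D coordinates on the projection plane."""
    # Orthonormal basis around the projection center axis.
    w = proj_dir / np.linalg.norm(proj_dir)      # viewing direction
    u = np.cross(up, w); u /= np.linalg.norm(u)  # right
    v = np.cross(w, u)                           # true up
    rel = points_world - proj_pos
    x, y, z = rel @ u, rel @ v, rel @ w          # z: depth along the axis (> 0)
    # Points at the edge of the angle of view map to +/-1.
    scale = 1.0 / np.tan(half_fov_rad)
    return np.stack([scale * x / z, scale * y / z], axis=1)
```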
In the projector of the present embodiment, the angle-of-view symmetrization unit may also output projection plane tilt information indicating the tilt of the plane perpendicular to the projection center axis relative to the plane perpendicular to the optical axis of the projection lens 401. In this case, the projector may further include a distortion correction coefficient calculation unit that calculates, based on the projection plane tilt information, a distortion correction coefficient for correcting distortion of the image projected on the projection plane, and a distortion correction unit that performs distortion correction on the input video signal based on that coefficient; an image based on the distortion-corrected video signal is then formed on the image forming surface of the display element 400.
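One plausible realization of the distortion correction coefficients is a 3x3 homography that pre-warps the frame so that it appears undistorted on the tilted projection plane; the four corner correspondences would be derived from the projection plane tilt information. A self-contained direct-linear-transform solver as a sketch:

```python
import numpy as np

def homography_from_corners(src: np.ndarray, dst: np.ndarray) -> np.ndarray:
    """Solve the 3x3 homography H with dst ~ H @ src for four point
    pairs (direct linear transform). src and dst are 4x2 arrays of
    corner coordinates."""
    rows = []
    for (x, y), (X, Y) in zip(src, dst):
        rows.append([x, y, 1, 0, 0, 0, -X * x, -X * y, -X])
        rows.append([0, 0, 0, x, y, 1, -Y * x, -Y * y, -Y])
    A = np.asarray(rows, dtype=float)
    # The homography is the null-space vector of A (last row of V^T).
    _, _, vt = np.linalg.svd(A)
    return vt[-1].reshape(3, 3)
```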
Furthermore, in the projector of the present embodiment, the data generation unit 407 may create a two-dimensional data file containing the two-dimensional data in a predetermined file format. In this case, the projector may further include output means for outputting the two-dimensional data file.
Furthermore, in the projector of the present embodiment, the data generation unit 407 may create a three-dimensional data file containing the three-dimensional data in a predetermined file format, and may generate the three-dimensional data by converting each surface of the three-dimensional object into a polygon mesh. In this case, the projector may further include output means for outputting the three-dimensional data file.
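The disclosure leaves the file format open ("a predetermined format"); Wavefront OBJ is one widely used choice for polygon meshes and keeps the sketch short:

```python
def write_obj(path: str, vertices, faces) -> None:
    """Write a polygon mesh as a Wavefront OBJ file.

    vertices : iterable of (x, y, z) tuples
    faces    : iterable of vertex-index tuples (0-based here;
               OBJ indices are 1-based, converted on write)
    """
    with open(path, "w", encoding="ascii") as f:
        for x, y, z in vertices:
            f.write(f"v {x} {y} {z}\n")
        for face in faces:
            f.write("f " + " ".join(str(i + 1) for i in face) + "\n")
```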
Also, in the projector of the present embodiment, the data generation unit 407 may create a three-dimensional data file containing the three-dimensional data in a predetermined file format, and the projector data generation unit may create a projector data file containing the projector data in a predetermined file format; the projector may further include output means for outputting the three-dimensional data file and the projector data file.
The output means described above may be communication means capable of mutual communication with an external information processing apparatus, or removable storage means.
In the projector of the present embodiment, a mapping data creation method that generates mapping data in response to a mapping data creation request is carried out. In this method, the three-dimensional sensor unit 404 three-dimensionally measures the three-dimensional object using the three-dimensional sensor 405 arranged to face the direction in which the image is projected, and outputs three-dimensional position data representing, in three-dimensional coordinates, the position and shape of each surface of the three-dimensional object on which the image is projected; the coordinate conversion unit 406 converts the coordinate system of the three-dimensional position data into the projector coordinate system, which expresses the projection area of the image in three-dimensional coordinates whose origin is the point where the principal rays from the pixels located at diagonal corners of the image forming surface intersect; and the data generation unit 407 acquires, based on the coordinate-converted three-dimensional position data, for each surface of the three-dimensional object, the position of the surface and the ridge lines indicating its outline, and generates two-dimensional or three-dimensional data of the three-dimensional object based on the positions and ridge lines.
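Structurally, the method reduces to three calls in sequence; the object interfaces below are hypothetical stand-ins for units 404, 406, and 407, shown only to make the data flow explicit.

```python
def create_mapping_data(sensor, coord_converter, data_generator):
    """End-to-end sketch of the mapping data creation method.

    The three arguments are hypothetical stand-ins for the
    three-dimensional sensor unit 404, the coordinate conversion
    unit 406, and the data generation unit 407.
    """
    position_data = sensor.measure()                  # 3D measurement
    projector_coords = coord_converter.convert(position_data)
    return data_generator.generate(projector_coords)  # 2D or 3D mapping data
```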
In the projector of the present embodiment, a program may be used that causes a computer to execute mapping data generation processing that generates mapping data in response to a mapping data creation request. In this case, the mapping data generation processing may include: processing of three-dimensionally measuring the three-dimensional object using the three-dimensional sensor 405 arranged to face the direction in which the image is projected, and outputting three-dimensional position data representing, in three-dimensional coordinates, the position and shape of each surface of the three-dimensional object on which the image is projected; processing of converting the coordinate system of the three-dimensional position data into the projector coordinate system, which expresses the projection area of the image in three-dimensional coordinates whose origin is the point where the principal rays from the pixels located at diagonal corners of the image forming surface intersect; and processing of acquiring, based on the coordinate-converted three-dimensional position data, for each surface of the three-dimensional object, the position of the surface and the ridge lines indicating its outline, and generating two-dimensional or three-dimensional data of the three-dimensional object based on the positions and ridge lines.
The program may be provided on a computer-usable or computer-readable medium, or via a network such as the Internet. Here, a computer-usable or computer-readable medium includes any medium on which information can be recorded or from which it can be read magnetically, optically, electronically, electromagnetically, or by infrared means. Examples of such media include semiconductor memories, semiconductor or solid-state storage devices, magnetic tape, removable computer diskettes, random access memory (RAM), read-only memory (ROM), magnetic disks, optical disks, and magneto-optical disks.
400 Display element
401 Projection lens
402 Reception unit
403 Mapping data generation unit
404 Three-dimensional sensor unit
406 Coordinate conversion unit
407 Data generation unit

Claims (17)

1.  A projector comprising a display element having an image forming surface composed of a plurality of pixels, and a projection lens that projects an image formed on the image forming surface, the image being projected from the projection lens toward a three-dimensional object, the projector comprising:
    a reception unit that receives a data creation request; and
    a mapping data generation unit that generates, in response to the data creation request, mapping data for creating a projection mapping video,
    wherein the mapping data generation unit comprises:
    a three-dimensional sensor unit that three-dimensionally measures the three-dimensional object and outputs three-dimensional position data representing, in three-dimensional coordinates, the position and shape of each surface of the three-dimensional object on which the image is projected;
    a coordinate conversion unit that converts the coordinate system of the three-dimensional position data into a projector coordinate system expressing the projection area of the image in three-dimensional coordinates whose origin is the point where principal rays from pixels located at diagonal corners of the image forming surface intersect; and
    a data generation unit that acquires, based on the three-dimensional position data converted by the coordinate conversion unit, for each of the surfaces of the three-dimensional object, the position of the surface and ridge lines indicating the outline of the surface, and generates the mapping data based on the positions and ridge lines.
2.  The projector according to claim 1, further comprising:
    a zoom mechanism capable of moving at least some of the plurality of lenses constituting the projection lens along the optical axis of the projection lens;
    a lens shift mechanism capable of shifting the projection lens in a shift direction orthogonal to the optical axis; and
    a first lookup table indicating how the position where the principal rays intersect changes with a zoom position, which is the position of the at least some lenses along the optical axis, and a lens shift position, which is the position of the projection lens in the shift direction,
    wherein the coordinate conversion unit refers to the first lookup table to determine the position where the principal rays intersect from the current zoom position of the zoom mechanism and the current lens shift position of the lens shift mechanism.
3.  The projector according to claim 2, wherein the coordinate conversion unit converts the coordinate system of the three-dimensional position data into the projector coordinate system and then into a world coordinate system referenced to the horizontal plane.
4.  The projector according to claim 2 or 3, further comprising:
    a second lookup table indicating how the angle of view, which expresses the projection range of the image as an angle, changes with the zoom position and the lens shift position;
    an angle-of-view symmetrization unit that refers to the second lookup table to obtain the current angle of view from the current zoom position of the zoom mechanism and the current lens shift position of the lens shift mechanism, sets, based on the angle of view, a projection center axis representing the central ray of the projection light flux from the projection lens, and outputs angle-of-view symmetrization information including a projection direction vector indicating the direction of the projection center axis and the current angle of view; and
    a projector data generation unit that generates projector data including projector position coordinates indicating the coordinates of the origin of the projector coordinate system, and the projection direction vector and the current angle of view indicated by the angle-of-view symmetrization information.
5.  The projector according to claim 4, wherein the projector data generation unit converts the direction of the projection vector into a world coordinate system referenced to the horizontal plane.
6.  The projector according to claim 4 or 5, wherein the data generation unit generates two-dimensional data by perspectively projecting the ridge lines onto a projection plane based on the projector data.
7.  The projector according to any one of claims 4 to 6, wherein the angle-of-view symmetrization unit outputs projection plane tilt information indicating the tilt of a plane perpendicular to the projection center axis relative to a plane perpendicular to the optical axis of the projection lens, the projector further comprising:
    a distortion correction coefficient calculation unit that calculates, based on the projection plane tilt information, a distortion correction coefficient for correcting distortion of an image projected on the projection plane; and
    a distortion correction unit that performs distortion correction based on the distortion correction coefficient on an input video signal,
    wherein an image based on the distortion-corrected video signal is formed on the image forming surface.
8.  The projector according to any one of claims 1 to 7, wherein the data generation unit creates a two-dimensional data file containing the two-dimensional data in a predetermined file format.
9.  The projector according to claim 8, further comprising output means for outputting the two-dimensional data file.
10.  The projector according to any one of claims 1 to 7, wherein the data generation unit generates three-dimensional data by converting each of the surfaces of the three-dimensional object into a polygon mesh.
11.  The projector according to any one of claims 1 to 7 and 10, wherein the data generation unit creates a three-dimensional data file containing the three-dimensional data in a predetermined file format.
12.  The projector according to claim 11, further comprising output means for outputting the three-dimensional data file.
13.  The projector according to any one of claims 4 to 7, wherein:
    the data generation unit creates a three-dimensional data file containing the three-dimensional data in a predetermined file format;
    the projector data generation unit creates a projector data file containing the projector data in a predetermined file format; and
    the projector further comprises output means for outputting the three-dimensional data file and the projector data file.
14.  The projector according to any one of claims 9, 12, and 13, wherein the output means is communication means capable of mutual communication with an external information processing apparatus, or removable storage means.
15.  A mapping data creation method performed by a projector comprising a display element having an image forming surface composed of a plurality of pixels, and a projection lens that projects an image formed on the image forming surface, the image being projected from the projection lens toward a three-dimensional object, the method comprising:
    a mapping data generation step of generating, in response to a data creation request, mapping data for creating a projection mapping video,
    wherein the mapping data generation step includes:
    three-dimensionally measuring the three-dimensional object and acquiring three-dimensional position data representing, in three-dimensional coordinates, the position and shape of each surface of the three-dimensional object on which the image is projected;
    converting the coordinate system of the three-dimensional position data into a projector coordinate system expressing the projection area of the image in three-dimensional coordinates whose origin is the point where principal rays from pixels located at diagonal corners of the image forming surface intersect; and
    acquiring, based on the coordinate-converted three-dimensional position data, for each of the surfaces of the three-dimensional object, the position of the surface and ridge lines indicating the outline of the surface, and generating the mapping data based on the positions and ridge lines.
16.  A program for causing a computer of a projector, the projector comprising a display element having an image forming surface composed of a plurality of pixels, and a projection lens that projects an image formed on the image forming surface, the image being projected from the projection lens toward a three-dimensional object, to execute mapping data generation processing that generates, in response to a data creation request, mapping data for creating a projection mapping video,
    wherein the mapping data generation processing includes:
    processing of three-dimensionally measuring the three-dimensional object and acquiring three-dimensional position data representing, in three-dimensional coordinates, the position and shape of each surface of the three-dimensional object on which the image is projected;
    processing of converting the coordinate system of the three-dimensional position data into a projector coordinate system expressing the projection area of the image in three-dimensional coordinates whose origin is the point where principal rays from pixels located at diagonal corners of the image forming surface intersect; and
    processing of acquiring, based on the coordinate-converted three-dimensional position data, for each of the surfaces of the three-dimensional object, the position of the surface and ridge lines indicating the outline of the surface, and generating the mapping data based on the positions and ridge lines.
17.  A projection mapping system comprising:
    the projector according to any one of claims 1 to 14; and
    a video processing apparatus capable of mutual communication with the projector,
    wherein the video processing apparatus creates a projection mapping video based on data for a projection mapping video creation tool generated by the projector.
PCT/JP2017/010700 2017-03-16 2017-03-16 Projector, method of creating data for mapping, program, and projection mapping system WO2018167918A1 (en)

Priority Applications (2)

Application Number Priority Date Filing Date Title
PCT/JP2017/010700 WO2018167918A1 (en) 2017-03-16 2017-03-16 Projector, method of creating data for mapping, program, and projection mapping system
JP2019505625A JP6990694B2 (en) 2017-03-16 2017-03-16 Projector, data creation method for mapping, program and projection mapping system

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
PCT/JP2017/010700 WO2018167918A1 (en) 2017-03-16 2017-03-16 Projector, method of creating data for mapping, program, and projection mapping system

Publications (1)

Publication Number Publication Date
WO2018167918A1 true WO2018167918A1 (en) 2018-09-20

Family

ID=63523477

Family Applications (1)

Application Number Title Priority Date Filing Date
PCT/JP2017/010700 WO2018167918A1 (en) 2017-03-16 2017-03-16 Projector, method of creating data for mapping, program, and projection mapping system

Country Status (2)

Country Link
JP (1) JP6990694B2 (en)
WO (1) WO2018167918A1 (en)



Patent Citations (6)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
JP2001061121A (en) * 1999-08-23 2001-03-06 Nec Corp Projector
JP2001320652A (en) * 2000-05-11 2001-11-16 Nec Corp Projector
JP2002062842A (en) * 2000-08-11 2002-02-28 Nec Corp Projection video correction system and its method
JP2005291839A (en) * 2004-03-31 2005-10-20 Brother Ind Ltd Projecting device and three-dimensional shape detection device
JP2012078490A (en) * 2010-09-30 2012-04-19 Sanyo Electric Co Ltd Projection image display device, and image adjusting method
JP2013098712A (en) * 2011-10-31 2013-05-20 Sanyo Electric Co Ltd Projection type video display device and image adjustment method

Cited By (7)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
JP2021021810A (en) * 2019-07-26 2021-02-18 sPods株式会社 Projection system, display controller, and method for projection
CN113630588A (en) * 2020-05-08 2021-11-09 精工爱普生株式会社 Control method of image projection system and image projection system
CN113630588B (en) * 2020-05-08 2023-04-21 精工爱普生株式会社 Control method of image projection system and image projection system
CN114157700A (en) * 2022-02-09 2022-03-08 国科星图(深圳)数字技术产业研发中心有限公司 Dam reservoir safety monitoring system
CN114157700B (en) * 2022-02-09 2022-04-15 国科星图(深圳)数字技术产业研发中心有限公司 Dam reservoir safety monitoring system
JP7281576B1 (en) 2022-03-31 2023-05-25 Kddi株式会社 Video projection system and video projection method
JP2023151126A (en) * 2022-03-31 2023-10-16 Kddi株式会社 Image projection system and image projection method

Also Published As

Publication number Publication date
JPWO2018167918A1 (en) 2020-05-14
JP6990694B2 (en) 2022-01-12

Similar Documents

Publication Publication Date Title
WO2021103347A1 (en) Projector keystone correction method, apparatus, and system, and readable storage medium
JP6764533B2 (en) Calibration device, chart for calibration, chart pattern generator, and calibration method
TWI253006B (en) Image processing system, projector, information storage medium, and image processing method
WO2018068719A1 (en) Image stitching method and apparatus
WO2018167918A1 (en) Projector, method of creating data for mapping, program, and projection mapping system
EP3547260B1 (en) System and method for automatic calibration of image devices
JP6494239B2 (en) Control device, control method, and program
CN105308503A (en) System and method for calibrating a display system using a short throw camera
JP2020187358A (en) Projection system, projection apparatus and calibrating method for displayed image thereof
JP2007036482A (en) Information projection display and program
JP2014178393A (en) Image projection system and image projection method
KR102222290B1 (en) Method for gaining 3D model video sequence
JP6804056B2 (en) Projection type display device, control method of projection type display device, and program
KR20200045176A (en) Robust display system and method using depth camera and flight vehicle
JP2016085380A (en) Controller, control method, and program
JP2004228824A (en) Stack projection device and its adjusting method
CN114286066A (en) Projection correction method, projection correction device, storage medium and projection equipment
JP2015139087A (en) Projection device
JP2014134611A (en) Geometric distortion correction device, projector, and geometric distortion correction method
US10089726B2 (en) Image processing apparatus, image processing method, and storage medium, relating to generating an image corresponding to a predetermined three-dimensional shape by transforming a captured image
JP2007033087A (en) Calibration device and method
TWM594322U (en) Camera configuration system with omnidirectional stereo vision
JP2010220177A (en) Stereo image processing apparatus, stereo image processing method, and stereo image processing program
JP2006338167A (en) Image data creation method
KR102535136B1 (en) Method And Apparatus for Image Registration

Legal Events

Date Code Title Description
121 Ep: the epo has been informed by wipo that ep was designated in this application

Ref document number: 17900889

Country of ref document: EP

Kind code of ref document: A1

ENP Entry into the national phase

Ref document number: 2019505625

Country of ref document: JP

Kind code of ref document: A

NENP Non-entry into the national phase

Ref country code: DE

122 Ep: pct application non-entry in european phase

Ref document number: 17900889

Country of ref document: EP

Kind code of ref document: A1