WO2019163449A1 - Appareil de traitement d'image, procédé de traitement d'image et programme - Google Patents

Appareil de traitement d'image, procédé de traitement d'image et programme Download PDF

Info

Publication number
WO2019163449A1
WO2019163449A1 (PCT/JP2019/003183, JP2019003183W)
Authority
WO
WIPO (PCT)
Prior art keywords
display
projection plane
image
plane
image processing
Prior art date
Application number
PCT/JP2019/003183
Other languages
English (en)
Japanese (ja)
Inventor
正俊 石井
Original Assignee
キヤノン株式会社
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Priority claimed from JP2018235364A external-priority patent/JP2019146155A/ja
Application filed by キヤノン株式会社 filed Critical キヤノン株式会社
Publication of WO2019163449A1 publication Critical patent/WO2019163449A1/fr
Priority to US16/942,803 priority Critical patent/US11962946B2/en

Classifications

    • HELECTRICITY
    • H04ELECTRIC COMMUNICATION TECHNIQUE
    • H04NPICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N5/00Details of television systems
    • H04N5/74Projection arrangements for image reproduction, e.g. using eidophor

Definitions

  • the present invention relates to a technique for generating an image for a display system that displays a wide-field image.
  • Patent Document 1 describes a method of displaying an image on a spherical wide viewing angle screen with a concave surface facing the viewer. According to the method described in Patent Document 1, an image to be displayed on a screen is generated by performing a mapping process in which a planar image is pasted into a spherical shape.
  • a wide-field image can also be displayed by arranging a plurality of flat display screens so as to cover the viewer's field of view instead of a spherical screen.
  • each display screen is arranged so that a normal line of each of the display screens has an intersection point.
  • mapping a flat image onto each display screen in such a display system results in an unnatural display image.
  • an object of the present invention is to generate a more natural display image for an image display system capable of displaying an image with a wide viewing angle.
  • the present invention is an image processing apparatus that generates a display image to be displayed on a display system having a display unit, using one input image obtained by imaging with one imaging apparatus.
  • Block diagram showing the hardware configuration of the image processing apparatus
  • Block diagram showing the functional configuration of the image processing apparatus
  • Diagram showing the positional relationship of the viewpoint position, the display unit, the plane projection surface, and the cylindrical projection surface
  • Diagram showing the correspondence between the plane projection surface and the input image
  • Diagram showing the relationship between the display unit, the plane projection surface, and the cylindrical projection surface
  • Diagram showing the placement of virtual cameras
  • Diagram showing an example of a curved display
  • FIG. 4 shows an example of a display system assumed in this embodiment.
  • a display unit that displays an image is configured by three displays, the center display 401, the left display 402, and the right display 403.
  • the center display 401, the left display 402, and the right display 403 are arranged so as to draw an isosceles trapezoid when viewed from above.
  • Each display is a flat-panel display device such as a liquid crystal display.
  • the center display 401 is disposed in front of the viewer.
  • the left display 402 is disposed so as to be in contact with the left end of the center display 401 as viewed from the viewer, forming an angle θsc with the center display 401.
  • the right display 403 is disposed so as to be in contact with the right end of the center display 401 as viewed from the viewer, likewise forming the angle θsc with the center display 401. The three displays are therefore arranged so that the normals from the display surfaces have an intersection, and each display image is displayed on the intersection side of its display surface. By arranging three planar display screens (displays) in this way, the display unit covers the viewer's field of view.
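The trapezoidal arrangement described above can be sketched numerically. The following is a minimal top-view sketch, assuming the viewer at the origin, the center display at distance d_view, and an opening angle θsc between adjacent panels; the function and parameter names are illustrative, not taken from the patent.

```python
import math

def display_layout(w_sc, theta_sc_deg, d_view):
    """Top-view (X, Z) endpoints of the three flat displays.

    The center display is perpendicular to the viewing axis at
    distance d_view; the side displays fold toward the viewer by
    180 - theta_sc degrees (theta_sc is the interior opening angle
    with the center display), so the three panels trace part of an
    isosceles trapezoid. Illustrative sketch only.
    """
    t = math.radians(180.0 - theta_sc_deg)  # fold angle of each wing
    # Center display endpoints (viewer at the origin, +Z forward).
    c_left = (-w_sc / 2.0, d_view)
    c_right = (w_sc / 2.0, d_view)
    # Wings extend from the shared edges, rotated toward the viewer.
    l_end = (c_left[0] - w_sc * math.cos(t), c_left[1] - w_sc * math.sin(t))
    r_end = (c_right[0] + w_sc * math.cos(t), c_right[1] - w_sc * math.sin(t))
    return l_end, c_left, c_right, r_end
```

With θsc = 180° the three panels degenerate into one flat screen, which is a quick sanity check on the geometry.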
  • the angle at which the viewer looks at the image displayed on the display unit (three displays) in the display system is called a display angle.
  • the display images displayed on each display are all generated from a common input image.
  • three display images are generated based on an input image captured using one imaging device (for example, a digital camera).
  • FIG. 1 shows a hardware configuration of an image processing apparatus according to this embodiment.
  • the CPU 101 uses the RAM 102 as a work memory, executes programs stored in the ROM 103 and the hard disk drive (HDD) 105, and controls each component to be described later via the system bus 100. Thereby, various processes described later are executed.
  • the HDD I / F 104 is an interface such as serial ATA (SATA), for example, and connects a secondary storage device such as the HDD 105 or an optical disk drive.
  • the CPU 101 can read data from the HDD 105 and write data to the HDD 105 via the HDD I / F 104.
  • the CPU 101 can expand the data stored in the HDD 105 in the RAM 102 and similarly store the data expanded in the RAM 102 in the HDD 105.
  • the CPU 101 can execute the data expanded in the RAM 102 as a program.
  • the input I / F 106 is a serial bus interface such as USB or IEEE1394, for example, and is connected to an input device 107 such as a keyboard and a mouse.
  • the CPU 101 can read data from the input device 107 via the input I / F 106.
  • the output I / F 108 is a video output interface such as DVI or HDMI (registered trademark), for example, and connects an output device 109 such as a liquid crystal display or a projector.
  • the CPU 101 can send data to the output device 109 via the output I / F 108 to execute display.
  • the output device 109 is a display system having the display unit shown in FIG.
  • FIG. 2 is a block diagram showing a functional configuration of the image processing apparatus according to the present embodiment.
  • the image processing apparatus includes a projection plane setting unit 201, an image acquisition unit 202, a display system information acquisition unit 203, a viewpoint information acquisition unit 204, an imaging parameter acquisition unit 205, a display image generation unit 206, and an image output unit 207.
  • the projection plane setting unit 201 sets two projection planes for generating a display image to be displayed on each display in the display system from the input image.
  • two virtual projection surfaces are used: a planar virtual projection surface (hereinafter referred to as a plane projection surface) and a cylindrical virtual projection surface (hereinafter referred to as a cylindrical projection surface).
  • the projection plane setting unit 201 sets a plane projection plane and a cylindrical projection plane.
  • the plane projection plane is set according to the aspect ratio of the input image and the imaging angle of view when the input image obtained by imaging is captured.
  • the cylindrical projection surface is a projection surface having a shape constituted by a free-form surface; here, it has a shape obtained by cutting out a part of the side surface of a cylinder. It can be regarded as a plane curved in the horizontal direction.
  • when viewed from above, the cylindrical projection surface is a smooth arc, unlike the cornered shape (part of an isosceles trapezoid) drawn by the three displays in the display system.
  • the projection plane setting unit 201 sets a cylindrical projection plane according to the viewpoint position and the size and positional relationship of each display in the display system.
  • the image acquisition unit 202 acquires an image obtained by imaging and outputs the acquired image to the display image generation unit 206 as an input image.
  • the display system information acquisition unit 203 acquires information related to a display unit (here, a display) in the display system.
  • information indicating the number of displays, the shape and size of each display surface, and the positional relationship among a plurality of displays is acquired.
  • the viewpoint information acquisition unit 204 acquires viewpoint information indicating the viewer's viewpoint position.
  • the viewpoint information is information indicating the three-dimensional position of the viewer's viewpoint when viewing the image display unit in the display system.
  • a display image to be displayed on the display system is generated in advance, before the viewer views it.
  • the display angle varies depending on the distance from which the viewer views the display. Therefore, in this embodiment, in order to generate a display image in advance, it is necessary to assume the position from which the viewer will view the display.
  • the viewpoint position is specified by acquiring, as viewpoint information, a viewpoint position desirable for viewing the display.
  • the imaging parameter acquisition unit 205 acquires the imaging parameters of the imaging apparatus that were set when the input image was acquired by imaging.
  • the imaging parameter acquisition unit 205 can acquire imaging parameters based on metadata attached to the input image. Alternatively, the imaging parameters may be acquired based on information entered by the user from the input device 107.
  • the display image generation unit 206 generates a display image to be displayed on each display from one input image based on the positional relationship between the viewpoint position and each display. Details of the display image generation unit 206 will be described later.
  • the image output unit 207 outputs the generated three display images to each display.
  • FIG. 3 is a flowchart showing the flow of image processing in this embodiment.
  • the CPU 101 reads a program for realizing the flowchart shown in FIG. 3 stored in the ROM 103 or the HDD 105, and executes it using the RAM 102 as a work area.
  • the CPU 101 plays a role as each functional configuration shown in FIG.
  • each step is denoted by “S”.
  • in step S301, the image acquisition unit 202 acquires captured image data representing a captured image stored in the HDD 105 as an input image, and stores the acquired image data in the RAM 102.
  • the imaging parameter acquisition unit 205 acquires imaging parameters from the metadata attached to the captured image data.
  • the imaging parameter acquisition unit 205 acquires information for specifying an imaging angle of view at the time of imaging and a lens projection method as imaging parameters.
  • the display system information acquisition unit 203 acquires display system information related to the image display unit in the display system.
  • the display system information acquisition unit 203 acquires information indicating the number of displays that display images, the shape and size of each display surface, and the arrangement of each display.
  • the number of displays is three.
  • the shape of each display is a flat rectangle with a width Wsc and a height Hsc.
  • the opening angle between the center display and the right display, and the opening angle between the center display and the left display, are both the angle θsc.
  • the viewing angle (display angle) when viewing the display unit composed of three displays from the viewpoint position is 2φ.
  • the display system information is acquired from the input device 107 to the RAM 102 based on a user instruction.
  • these pieces of information may be stored in advance in the HDD 105 as a display system information package, and may be selected from the HDD 105 as necessary.
  • the viewpoint information acquisition unit 204 acquires viewpoint information from the input device 107 based on a user instruction.
  • the viewpoint information acquisition unit 204 acquires a distance Dview from the center position on the screen of the center display 401 as viewpoint information.
  • the projection plane setting unit 201 sets a plane projection plane and a cylindrical projection plane used when generating a display image. Details of the projection plane setting process will be described later.
  • the display image generation unit 206 generates display image data indicating a display image to be displayed on each display. Details of the display image generation processing will be described later.
  • the image output unit 207 outputs a display image corresponding to each display generated from the RAM 102 to the output device 109 via the output I / F 108.
  • the generated display image may be stored in the HDD 105.
  • in the projection plane setting process, two projection planes used in the display image generation process are set.
  • the first projection plane is a planar projection plane, and is a projection plane for arranging the input image in the virtual space.
  • the second projection plane is a cylindrical projection plane, and is a projection plane for projecting the input image onto the configuration of the display unit.
  • the second projection plane has a role of approximating the input image to the configuration (shape) of the display unit.
  • the display unit in this embodiment is arranged so that the three flat displays draw part of an isosceles trapezoid when viewed from above.
  • a planar projection plane (first projection plane) is projected onto a cylindrical projection plane (second projection plane), and a display image of each display is generated based on the image. That is, it is desirable that the second projection plane is a projection plane that is more similar to the shape drawn by the three displays than the plane and has no corners when viewed from above. The distance between the point on the cylindrical projection plane and the viewpoint position changes smoothly in the horizontal direction.
  • when the plane projection plane is associated with such a second projection plane, a display image is displayed on the display unit as if an image projected on the cylindrical projection plane were being viewed from the viewpoint position. As a result, the phenomenon in which the subject appears bent near the boundary where two displays are adjacent can be suppressed.
  • FIG. 5 is a flowchart showing details of the projection plane setting process executed by the projection plane setting unit 201. Hereinafter, each step of the flowchart will be described.
  • the projection plane setting unit 201 generates a plane projection plane as the first projection plane.
  • the plane projection plane is configured by a rectangular plane having the same aspect ratio as the captured image. Further, the projection plane setting unit 201 calculates the size and position of the plane projection plane so that the expected angle when viewing the plane projection plane from the viewpoint position coincides with the shooting angle of view, and arranges it in the virtual space. In this way, by matching the viewpoint position and the position of the imaging device at the time of shooting in the virtual space, a display image with little spatial distortion can be generated.
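The constraint described above, that the expected angle of the plane projection plane from the viewpoint equals the shooting angle of view, fixes the plane's width once a distance is chosen: w = 2·d·tan θ. A minimal sketch with illustrative names:

```python
import math

def plane_projection(theta_half_deg, aspect_hw, distance):
    """Size a plane projection surface so that its expected angle
    from the viewpoint equals the shooting angle of view.

    theta_half_deg : horizontal half angle of view at capture time
    aspect_hw      : height / width of the input image
    distance       : chosen Z distance of the plane (free parameter)

    Returns (width, height, z). Illustrative sketch only.
    """
    width = 2.0 * distance * math.tan(math.radians(theta_half_deg))
    height = width * aspect_hw  # keep the captured aspect ratio
    return width, height, distance
```

Because only the angle is constrained, the distance is a free scale factor; any choice yields the same angular geometry.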
  • FIG. 7 is a diagram illustrating the relationship between the viewpoint position, the display unit, and the virtual projection plane.
  • the virtual space XYZ three-dimensional coordinates with the viewpoint position as the origin are defined.
  • the planar projection plane is parallel to the XY plane, and is arranged at a position such that the Z axis passes through the center of the planar projection plane.
  • each display of the display unit is arranged so that the distance between the origin (viewpoint position) and the center position of the center display becomes Dview. That is, the center coordinates of the center display are (0, 0, Dview).
  • the three displays are arranged so as to be symmetric about the Z axis.
  • let the half angle of the horizontal shooting angle of view of the input image be θ.
  • the expected half angle of the plane projection plane from the viewpoint position is then also θ, as shown in FIG. 7.
  • the projection plane setting unit 201 acquires a display angle that is an expected angle when the display unit is viewed from the viewpoint position.
  • the angle between the straight line connecting the viewpoint position to the vertical center point of the left end of the left display and the straight line connecting the viewpoint position to the vertical center point of the right end of the right display is the horizontal display angle.
  • the angle φ is half of the display angle.
  • the projection plane setting unit 201 sets the central angle of the arc of the cylindrical projection plane to the display angle 2φ acquired in S502.
  • the projection plane setting unit 201 generates a cylindrical projection plane in the virtual space.
  • the cylindrical projection surface has a shape obtained by cutting the side surface of the cylinder with the central angle set in S503.
  • the height of the cylindrical projection surface is set so that the ratio of the length and height of the arc matches the ratio of the width and height of the planar projection surface.
  • the projection plane setting unit 201 arranges the cylindrical projection plane set according to the display system information in the virtual space so that the center of the cylinder matches the viewpoint position. Accordingly, the cylindrical projection surface is arranged at a position where the Z axis in the virtual space passes through the center of the cylindrical projection surface.
  • the cross section of the cylinder is a fan shape (circular sector) with the same central angle as the display angle, and the center of the circle coincides with the viewpoint position.
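The first-embodiment cylindrical surface can be sketched as a sampled arc. This assumes, per the text, that the arc spans the display angle 2φ and that the circle center coincides with the viewpoint (the origin); names are illustrative.

```python
import math

def cylinder_arc(phi_half_deg, radius, n=9):
    """Top-view (X, Z) vertices of the cylindrical projection
    surface's arc. The arc spans the display angle 2*phi and its
    circle center coincides with the viewpoint at the origin, so
    every vertex is at the same distance from the viewer.
    Illustrative sketch only.
    """
    phi = math.radians(phi_half_deg)
    # Sample the angle uniformly from -phi to +phi.
    return [(radius * math.sin(a), radius * math.cos(a))
            for a in (phi * (2 * i / (n - 1) - 1.0) for i in range(n))]
```

Every sampled vertex lies at the same distance from the viewpoint, which is exactly the property that makes the viewer-to-surface distance vary smoothly across the arc.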
  • the display image generation unit 206 generates a display image corresponding to each display by rendering the cylindrical projection plane with a virtual camera arranged at the viewpoint position in the virtual space.
  • FIG. 2B is a block diagram illustrating a detailed configuration of the display image generation unit 206.
  • the first calculation unit 2061 calculates the correspondence between the input image and the planar projection plane.
  • the second calculation unit 2063 calculates a correspondence relationship between the planar projection surface and the cylindrical projection surface.
  • the virtual camera setting unit 2062 sets a virtual camera corresponding to each of the plurality of displays on the virtual space. In the display system assumed in this embodiment, since there are three displays, three virtual cameras are also set.
  • the rendering processing unit 2064 generates a display image by calculation, using the correspondence between the input image and the plane projection plane and the correspondence between the plane projection plane and the cylindrical projection plane.
  • FIG. 6 is a flowchart showing details of the display image generation processing in the present embodiment.
  • the first calculation unit 2061 performs association between the three-dimensional coordinates of each vertex of the plane projection plane in the virtual space and the two-dimensional coordinates indicating the pixel position of the input image. This is the same processing as UV mapping in general CG rendering.
  • the coordinates of the four corners of the plane projection plane are associated with the coordinates indicating the pixel positions of the four corners of the input image.
  • the pixel position of each pixel is indicated by a UV coordinate system in which the upper left pixel is the origin (0, 0).
  • the first calculation unit 2061 acquires the UV coordinates of each vertex in the input image and associates it with the three-dimensional coordinates of each vertex of the planar projection plane.
  • the UV coordinates of each pixel other than the four corners are calculated by linear interpolation. If the lens projection method is equidistant projection or equisolid angle projection, such as a fisheye lens, the same processing may be performed after re-projecting with central projection.
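The fisheye re-projection mentioned above can be sketched as a radial remap. This assumes the ideal equidistant model r = f·θ for the fisheye and r′ = f·tan θ for the central (perspective) projection; real lenses deviate from these models, so this is illustrative only.

```python
import math

def equidistant_to_central(r_fish, f):
    """Radial remap from an equidistant-projection (fisheye) image
    to central projection, applied before the UV association.
    In the equidistant model the image radius is r = f * theta;
    in central projection it is r' = f * tan(theta). Sketch under
    the stated ideal-lens assumption.
    """
    theta = r_fish / f           # recover the incidence angle
    return f * math.tan(theta)   # radius in the central projection
```

Near the optical axis the two projections agree (tan θ ≈ θ), and the correction grows toward the edge of the fisheye field.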
  • the second calculation unit 2063 calculates a correspondence relationship between the cylindrical projection plane and the planar projection plane in the virtual space.
  • the angle θs can be calculated using the angle between the line segment connecting the point S_L and the origin and the X axis (Equation (1)).
  • the angle θs can also be calculated by subtracting the angle calculated by Equation (1) from the display angle 2φ (Equation (2)).
  • the angle θs is half of the display angle, that is, φ (Equation (3)).
  • next, the points on the plane projection plane are associated with the points on the cylindrical projection plane.
  • the association is made so that corresponding points occupy the same fractional position on each projection plane.
  • that is, the ratio of the length from the point S_L′ to the point S′ on the plane projection plane to the width of the plane projection plane is equal to the ratio of the arc length from the point S_L to the point S on the cylindrical projection plane to the arc length of the cylindrical projection plane, as shown in Equation (4).
  • the X coordinate x_s′ of the point S′ is expressed using the angle θs as shown in Equation (5).
  • the second calculation unit 2063 substitutes Equation (1), (2), or (3) for the angle θs in Equation (5) according to the position of x_s, and thereby calculates the point x_s′ on the plane projection plane from the point x_s on the cylindrical projection plane. In this way, the second calculation unit 2063 calculates the correspondence that associates the X coordinates of the cylindrical projection plane and the plane projection plane.
  • the association of the Y coordinate that is the height direction will be described.
  • let the height of the cylindrical projection plane be H_curve.
  • let the Y coordinate of the point S on the cylindrical projection plane be y_s,
  • and let the Y coordinate of the point S′ on the plane projection plane be y_s′.
  • the ratio of the height from the lower side to the point S′ on the plane projection surface to the height of the plane projection surface is equal to the ratio of the height from the lower side to the point S on the cylindrical projection surface to the height of the cylindrical projection surface, as shown in Equation (6).
  • the second calculation unit 2063 calculates a correspondence relationship for associating the Y coordinates of the cylindrical projection plane and the planar projection plane with Expression (7).
  • since the plane projection plane is a plane, the Z coordinate does not change at any point on the plane projection plane. Therefore, the Z coordinate of the cylindrical projection plane coincides with the Z coordinate of the plane projection plane set in the projection plane setting process.
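The proportionality rules above can be combined into one mapping from a cylindrical-surface point to a plane-surface point. Since Equations (4) to (7) are not reproduced in this extract, the sketch below only encodes the stated proportionality relations (equal arc-length fractions map to equal width fractions; equal height fractions correspond); variable names are illustrative.

```python
def cylinder_to_plane(theta_s, y_s, phi_half, w_flat, h_curve, h_flat):
    """Map a point on the cylindrical projection surface to the
    plane projection surface.

    theta_s  : angle of the point, measured from one edge of the arc
    y_s      : height of the point on the cylindrical surface
    phi_half : half the display angle (arc spans 2 * phi_half)
    w_flat   : width of the plane projection surface
    h_curve  : height of the cylindrical surface (H_curve)
    h_flat   : height of the plane projection surface

    Sketch of the proportionality rules in the text, not the
    patent's exact Equations (4)-(7).
    """
    # X: the fraction of arc swept maps linearly onto the width,
    # centered so that theta_s = phi_half lands at x = 0.
    x_flat = (theta_s / (2.0 * phi_half)) * w_flat - w_flat / 2.0
    # Y: the same fractional height on both surfaces.
    y_flat = (y_s / h_curve) * h_flat
    return x_flat, y_flat
```

The Z coordinate needs no mapping of its own, since the plane's Z is constant as noted above.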
  • in step S603, the virtual camera setting unit 2062 sets the position and orientation of the virtual cameras used for the display image rendering processing.
  • FIG. 10 is a diagram for explaining the position and orientation of the virtual camera in the present embodiment.
  • the virtual camera setting unit 2062 prepares a total of three virtual cameras corresponding to each display. Specifically, the virtual camera A is set in the virtual space for the center display, the virtual camera B is set for the left display, and the virtual camera C is set for the right display.
  • the position of each virtual camera is the viewpoint position, that is, the center of the cylindrical projection plane.
  • the orientation of the virtual camera is set so that the optical axis of the virtual camera faces the center of the display corresponding to each virtual camera.
  • the virtual camera setting unit 2062 sets the angle of view of the virtual camera.
  • the virtual camera setting unit 2062 sets the expected angle when the display corresponding to each virtual camera is viewed from the viewpoint position as the angle of view of each virtual camera.
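The camera setup above (all three cameras at the viewpoint, each aimed at its display's center, each with a field of view equal to that display's expected angle) can be sketched from the display layout. The layout assumptions mirror the earlier trapezoid description; all names are illustrative.

```python
import math

def virtual_cameras(w_sc, theta_sc_deg, d_view):
    """Yaw (radians from +Z) and horizontal field of view for the
    three virtual cameras, all positioned at the viewpoint origin.
    Each camera's optical axis points at its display's center and
    its angle of view is that display's expected angle. Sketch only.
    """
    t = math.radians(180.0 - theta_sc_deg)        # wing fold angle
    yaw = lambda p: math.atan2(p[0], p[1])        # top-view bearing
    edge = (w_sc / 2.0, d_view)                   # shared right edge
    far = (edge[0] + w_sc * math.cos(t), edge[1] - w_sc * math.sin(t))
    mid = ((edge[0] + far[0]) / 2.0, (edge[1] + far[1]) / 2.0)
    cams = {
        "center": {"yaw": 0.0, "fov": 2.0 * yaw(edge)},
        "right": {"yaw": yaw(mid), "fov": yaw(far) - yaw(edge)},
    }
    # The left camera mirrors the right one about the Z axis.
    cams["left"] = {"yaw": -cams["right"]["yaw"],
                    "fov": cams["right"]["fov"]}
    return cams
```

Note that for the side displays the optical axis is not exactly the bisector of the subtended angle; a production renderer would use an asymmetric frustum, which this sketch omits.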
  • in step S605, the rendering processing unit 2064 executes rendering processing for each of the three virtual cameras, and generates display image data representing the display images. Specifically, first, the three-dimensional coordinates on the cylindrical projection surface projected onto each pixel of the image obtained when the virtual camera captures the virtual space are calculated. Next, based on the correspondence between the plane projection plane and the cylindrical projection plane calculated in S602, the three-dimensional coordinates on the cylindrical projection plane are converted into three-dimensional coordinates on the plane projection plane. Further, based on the correspondence between the plane projection plane and the input image calculated in S601, the three-dimensional coordinates on the plane projection plane are converted into positions on the input image. Thereby, each pixel of the image obtained by the virtual camera corresponds to a position on the input image.
  • the rendering processing unit 2064 calculates the pixel value of the pixel in the image obtained by the virtual camera by sampling based on the calculated position on the input image. Specifically, the rendering processing unit 2064 acquires pixel values of four pixels around the calculated position in the input image. The rendering processing unit 2064 determines the pixel value of the pixel in the image obtained by the virtual camera by performing an interpolation operation according to the calculated position on the pixel value of the four pixels acquired from the input image. By performing the process on all the pixels of each virtual camera, a display image to be displayed on each display is generated.
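The four-pixel interpolation described above is standard bilinear sampling. A minimal sketch for a single-channel image, with illustrative names and no edge handling beyond clamping to the last row and column:

```python
def sample_bilinear(img, u, v):
    """Sample an image at a fractional pixel position (u, v) by
    interpolating the four surrounding pixels, as in the rendering
    step. img is a row-major list of rows of grayscale values;
    u is the horizontal and v the vertical coordinate in pixels.
    """
    x0, y0 = int(u), int(v)
    x1 = min(x0 + 1, len(img[0]) - 1)   # clamp at the right edge
    y1 = min(y0 + 1, len(img) - 1)      # clamp at the bottom edge
    fx, fy = u - x0, v - y0             # fractional offsets
    top = img[y0][x0] * (1 - fx) + img[y0][x1] * fx
    bot = img[y1][x0] * (1 - fx) + img[y1][x1] * fx
    return top * (1 - fy) + bot * fy
```

Running this for every pixel of each virtual camera, with (u, v) obtained from the chained projection-plane correspondences, yields the three display images.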
  • display images to be displayed on the three display screens are generated based on one input image.
  • as an alternative, a corresponding image could be captured for each display screen.
  • three imaging devices are arranged so that the optical axis passes through the center of each display screen at the same viewpoint position in the scene, and the scene is photographed by each imaging device.
  • in that case, the display image corresponding to each display screen is produced from the image captured by the corresponding imaging device.
  • this method requires a plurality of imaging devices at the time of shooting.
  • moreover, it is necessary to set the imaging parameters of each imaging device in the same manner so that the images connect naturally between the display screens when displayed on the display unit.
  • appropriate imaging parameters differ depending on the brightness of the scene and moving objects, and the imaging parameters of other imaging devices must be adjusted each time the imaging parameters of any imaging device are changed. Therefore, in the present embodiment, by generating each display image based on one input image obtained by shooting by one imaging device, it is possible to reduce the shooting load for displaying on the display unit.
  • since the display images are all generated from an image captured under common imaging parameters, the appearance of the same subject, such as its brightness, substantially matches between the display images. Therefore, the display screens connect naturally with one another, realizing a display system in which the viewer can feel as if present in the scene.
  • consider the case where the display angle is larger than the shooting angle of view.
  • suppose that each pixel on the display unit is associated with the plane projection plane only within the region bounded by the line segments connecting both ends of the plane projection plane to the viewpoint position. In this case, only a part of the image is displayed on the left and right displays of the display unit. That is, when the display angle is larger than the shooting angle of view, the horizontal width of the connected display images on the display unit is longer than the horizontal width of the single input image, so an area where no image is displayed appears on the display unit.
  • therefore, the input image is enlarged, based on the angle of view when the input image was captured and the display angle of the display unit, so that the angle of view of the input image covers the display angle.
  • the plane projection plane corresponding to the input image and the cylindrical projection plane corresponding to the display image are associated with each other so that their horizontal widths coincide. This suppresses the appearance of areas on the display unit where no image is displayed.
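The required enlargement can be sketched as a ratio of tangents: comparing the two widths at the same distance from the viewpoint, a shooting half angle θ must be stretched to cover a display half angle φ. This simple width-match factor is an illustrative assumption, not the patent's exact formula.

```python
import math

def enlargement_factor(theta_half_deg, phi_half_deg):
    """Horizontal scale applied so the input image's angle of view
    covers the display angle, when the display angle 2*phi exceeds
    the shooting angle of view 2*theta. At a common distance d, the
    widths are 2*d*tan(theta) and 2*d*tan(phi), so their ratio is
    the factor. Illustrative sketch only.
    """
    t = math.tan(math.radians(theta_half_deg))
    p = math.tan(math.radians(phi_half_deg))
    return max(1.0, p / t)  # never shrink when phi <= theta
```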
  • suppose instead that the plane projection plane were directly mapped to each display in the virtual space.
  • the center display is parallel to the plane projection plane, whereas the left and right displays are not.
  • as a result, points spaced at a given interval on the plane projection plane would map to one interval on the center display and a different interval on the left and right displays.
  • a display image to be displayed on each display is generated using a cylindrical projection plane different from the display unit in the display system.
  • the plane projection plane is associated with the cylindrical projection plane so that the positions of the predetermined intervals on the plane projection plane are evenly spaced on the cylindrical projection plane.
  • the input image is an image obtained by capturing the soccer field from the touch line side.
  • consider a subject that viewers know to be a straight line, such as a touch line on a soccer field; the display image is generated from an input image in which such a subject appears as a straight line.
  • the curvature of the cylindrical projection surface is changed in order to reduce the uncomfortable feeling caused by perceiving linear distortion with respect to such a subject.
  • the display system in this embodiment is the same as that in the first embodiment.
  • the overall processing flow performed in the image processing apparatus is the same as that in the flowchart of FIG. 3 described in the first embodiment, and a description thereof will be omitted.
  • details of the projection plane setting process different from the first embodiment will be described.
  • the same reference numerals are attached to the same components as in the first embodiment, and their descriptions are omitted.
  • FIG. 11 is a flowchart showing details of the projection plane setting process executed by the projection plane setting unit 201 in the present embodiment.
  • in the projection plane setting process, two projection planes are set: a plane projection plane corresponding to the input image and a cylindrical projection plane having a shape obtained by cutting out a part of the side surface of a cylinder.
  • the projection plane setting unit 201 generates a plane projection plane as the first projection plane.
  • the details of the processing are the same as the processing described in S501 of the first embodiment, and a description thereof will be omitted.
  • the projection plane setting unit 201 sets the central angle of the arc described by the cylindrical projection plane.
  • in the present embodiment, the horizontal angle of view at which the input image was captured is set as the central angle of the cylindrical projection plane.
  • in step S1103, the projection plane setting unit 201 determines the installation position at which the cylindrical projection plane is placed in the virtual space so that the horizontal angle subtended by the cylindrical projection plane as seen from the viewpoint position matches the horizontal display angle of the display unit (displays).
  • the projection plane setting unit 201 generates a cylindrical projection plane and arranges it in the virtual space.
  • the cylindrical projection plane has a shape cut out at the central angle set in S1102 and is placed at the installation position in the virtual space determined in S1103.
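The placement determined in S1103 can be illustrated with a small geometric sketch. The layout below is an assumption made for exposition (viewpoint at the origin, arc concave toward the viewer, circle center on the Z axis; names invented): it solves for the distance from the viewpoint to the circle center so that the endpoints of the cut-out arc subtend the display angle.

```python
import math

def cylinder_center_distance(central_angle, display_angle, radius=1.0):
    """Distance from the viewpoint (origin) to the center of the circle
    of the cylindrical projection plane, chosen so that the arc's
    endpoints subtend `display_angle` horizontally at the viewpoint.
    With the circle center at (0, C), an endpoint sits at
    x = R*sin(c/2), z = C - R*cos(c/2); requiring tan(d/2) = x / z
    gives C = R*cos(c/2) + R*sin(c/2) / tan(d/2)."""
    half_c = central_angle / 2.0   # half of the arc's central angle
    half_d = display_angle / 2.0   # half of the desired subtended angle
    return radius * (math.cos(half_c) + math.sin(half_c) / math.tan(half_d))
```

When the central angle equals the display angle the circle center falls farther from the viewpoint than the radius, which is why, as noted below, the circle center need not coincide with the viewpoint position.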
  • FIG. 12 is a diagram illustrating a positional relationship among the cylindrical projection plane, the plane projection plane, and the display unit (display) when the virtual space is viewed on the XZ plane.
  • the Z axis passes through the center of the cylindrical projection plane, and the projection plane is perpendicular to the XZ plane.
  • note that the center of the circle of the cylindrical projection plane does not necessarily coincide with the viewpoint position.
  • the height of the cylindrical projection plane is set so that the ratio of the arc length to the height matches the width-to-height ratio of the plane projection plane.
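A minimal sketch of this height rule, assuming the arc length of the cut-out is radius times central angle (the function name is invented for illustration):

```python
def cylinder_height(radius, central_angle, plane_width, plane_height):
    """Height of the cylindrical projection plane, chosen so that the
    ratio of arc length to height equals the width-to-height ratio of
    the plane projection plane (i.e. the mapped image keeps its aspect
    ratio when measured along the curved surface)."""
    arc_length = radius * central_angle          # length of the cut-out arc
    return arc_length * plane_height / plane_width
```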
  • the cylindrical projection plane set as described above has a smaller curvature than the cylindrical projection plane described in the first embodiment. Accordingly, when an input image containing a horizontally linear subject is rendered, distortion of the straight line can be further reduced. As a result, when the display image is shown on the display system, the discomfort caused by the viewer perceiving horizontal straight-line distortion in the display image can be reduced.
  • in the above example, the central angle of the circle of the cylindrical projection plane is matched to the imaging angle of view at which the input image was captured.
  • however, the central angle of the cylindrical projection plane may instead be set according to the type of subject and the degree of distortion; this amounts to controlling the curvature of the cylindrical projection plane.
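The equivalence between choosing a central angle and controlling curvature can be sketched as follows: holding the arc length fixed (so the rendered extent is unchanged) while shrinking the central angle increases the radius, which reduces the curvature. The helper below is a hypothetical illustration, not part of the embodiment:

```python
def curvature(arc_length, central_angle):
    """Curvature (1 / radius) of a cylindrical projection plane whose
    cut-out arc has the given length and central angle.  A smaller
    central angle at fixed arc length yields a larger radius, i.e. a
    flatter surface and less straight-line distortion."""
    radius = arc_length / central_angle
    return 1.0 / radius
```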
  • FIG. 13 shows an example of the display unit in the present embodiment.
  • the viewer's viewpoint is located at the center of the cylinder along which the curved display 1301 lies, and the curved display 1301 is arranged so that the angle it subtends as seen from the viewer is 2θ.
  • the curved display 1301 is arranged so that the center of the curved display 1301 lies at a position separated from the viewer by a distance D_view.
  • the distance D_view is assumed to coincide with the radius of the cylinder along which the curved display 1301 lies.
  • the height of the curved display 1301 is denoted H_sc.
  • in the present embodiment, a display image for the curved display is generated based on an image obtained by projecting the plane projection plane (first projection plane) onto the cylindrical projection plane (second projection plane).
  • unlike the first embodiment, the cylindrical projection plane is set to match the shape of the curved display constituting the display unit.
  • FIG. 14 is a flowchart showing details of the projection plane setting process executed by the projection plane setting unit 201 in the present embodiment. Hereinafter, each step of the flowchart will be described.
  • the projection plane setting unit 201 generates a plane projection plane as the first projection plane.
  • the plane projection plane is a rectangular plane having the same aspect ratio as the captured image. The projection plane setting unit 201 further calculates the size and position of the plane projection plane so that the angle it subtends from the viewpoint position coincides with the imaging angle of view, and places it in the virtual space.
  • the projection plane setting unit 201 acquires the display angle, which is the angle subtended by the display unit as seen from the viewpoint position. As shown in FIG. 13, in the present embodiment, the horizontal display angle is defined by the straight lines connecting the viewpoint position to the vertically central points of the left and right edges of the curved display. If the display angle is 2θ, the angle θ is half the display angle.
  • the projection plane setting unit 201 generates a cylindrical projection plane in the virtual space.
  • the cylindrical projection plane is set to have the same shape as the curved display constituting the display unit. That is, the central angle of the cylinder is set to coincide with the display angle 2θ, and the height of the cylindrical projection plane is set to match the height H_sc of the curved display.
  • the projection plane setting unit 201 places the cylindrical projection plane, set according to the display system information, in the virtual space so that the center of the cylinder coincides with the viewpoint position. Accordingly, the cylindrical projection plane is positioned so that the Z axis of the virtual space passes through its center.
  • the cross section of the cylinder has a fan shape with the same central angle as the display angle, and the center of the cylinder coincides with the viewpoint position.
  • the cylindrical projection surface has the same shape as the curved display.
  • the flow of the display image data generation process is basically the same as the process described with reference to FIG. 6 of the first embodiment.
  • in the first embodiment, the virtual camera setting unit 2062 prepared a total of three virtual cameras in S603, one for each display.
  • in the present embodiment, by contrast, a single virtual camera corresponding to the curved display is sufficient.
  • the position of the virtual camera is the viewpoint position, that is, the center of the cylindrical projection plane.
  • the orientation of the virtual camera is set so that the optical axis faces the Z axis.
  • the angle of view of the virtual camera is set to match the angle subtended by the curved display as seen from the viewpoint position.
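For a pinhole virtual camera, matching the horizontal angle of view to the display angle fixes the focal length in pixels. The following is a standard pinhole relation, shown here as an illustrative sketch (names are invented, not from the embodiment):

```python
import math

def virtual_camera_focal_px(image_width_px, display_angle):
    """Pinhole focal length, in pixels, for a virtual camera whose
    horizontal angle of view equals the display angle (the angle the
    curved display subtends at the viewpoint):
    tan(angle / 2) = (width / 2) / focal."""
    return (image_width_px / 2.0) / math.tan(display_angle / 2.0)
```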
  • FIG. 16 shows the relationship between the cylindrical projection plane and the virtual camera in this embodiment.
  • alternatively, a display image may be generated by setting a plurality of virtual cameras.
  • FIG. 17 shows an example in which a display image is generated using three virtual cameras.
  • each virtual camera is set so that the combined angles of view of the multiple virtual cameras cover the central angle of the cylindrical projection plane, that is, the display angle of the curved display; a display image can then be generated by combining the rendered images.
  • the rendering ranges of adjacent virtual cameras may also be set to overlap.
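One way to split the display angle among several virtual cameras with overlapping rendering ranges is sketched below; the parameterisation (yaw of each camera's optical axis plus a shared overlap width) is an assumption for illustration, not taken from the embodiment:

```python
def split_display_angle(display_angle, n_cameras, overlap=0.0):
    """Return (yaw, angle_of_view) for each of n virtual cameras such
    that together they cover `display_angle`, with `overlap` radians
    shared between each pair of neighbours.  With overlap = 0 the
    fields of view tile the display angle exactly."""
    fov = (display_angle + (n_cameras - 1) * overlap) / n_cameras
    step = fov - overlap                       # yaw increment between cameras
    first = -display_angle / 2.0 + fov / 2.0   # leftmost camera's yaw
    return [(first + i * step, fov) for i in range(n_cameras)]
```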
  • the present invention can also be applied to a case where the display unit is configured as a curved screen onto which a plurality of projectors each project a partial image.
  • in that case, rendering is performed by setting the orientation and angle of view of each virtual camera to correspond to the projection range of its projector. The corresponding display image is then projected from each projector, and alpha blending is applied to the overlapping portions of the video so that the result appears as a single continuous picture.
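Alpha blending of overlapping projector regions is commonly done with complementary linear ramps. The sketch below (a hypothetical helper, not from the embodiment) returns the weight for one projector; the neighbouring projector uses one minus that weight, so the two contributions sum to one everywhere in the overlap:

```python
def blend_weight(x, ramp_start, ramp_end):
    """Linear alpha ramp across the overlap between two projectors:
    full weight (1.0) before the overlap region, zero after it, and a
    linear fall-off in between.  `x` is a horizontal coordinate in the
    same units as the ramp boundaries."""
    if x <= ramp_start:
        return 1.0
    if x >= ramp_end:
        return 0.0
    return (ramp_end - x) / (ramp_end - ramp_start)
```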
  • as described above, by setting the second projection plane in accordance with the shape of the display unit and performing the processing, a natural display image can be generated.
  • a display image is generated after the projection plane setting process.
  • Such a projection plane setting process is necessary when a display image to be displayed on the display system assumed in the above-described embodiment is generated for the first time.
  • the projection plane setting process is not necessarily required.
  • for example, information indicating the plane projection plane and the cylindrical projection plane in the virtual space may be set in advance based on the display system information and stored in the ROM 103 or the HDD 105.
  • the information indicating the correspondence between the input image and the plane projection plane calculated in S601, and the information indicating the correspondence between the plane projection plane and the cylindrical projection plane calculated in S602, are likewise stored in advance. In this case, the processing shown in FIG. 5 is unnecessary, and the flow may proceed to S306 after S304. In S601 and S602, instead of performing the calculations, it suffices to read the stored correspondence information. Naturally, because the stored correspondence between the plane projection plane and the cylindrical projection plane matches the intended display system, a display image with little discomfort can be generated.
  • when the display system is changed, the stored information indicating the projection planes in the virtual space may be read and adjusted.
  • that is, a plane projection plane corresponding to the imaging angle of view and the viewpoint information, and a cylindrical projection plane corresponding to the display system, are arranged.
  • the projection planes are then set and stored again, changing the shape of the cylindrical projection plane in accordance with a change of the display angle, or changing the position of the cylindrical projection plane in the virtual space in accordance with a change of the viewpoint information.
  • the display angle, which indicates the angle at which the viewer views the display unit from the viewpoint position in the present embodiment, can also be regarded as the range of the input image used for display.
  • in the above-described embodiments, a projection plane called a cylindrical projection plane is set as the second projection plane.
  • the cylindrical projection plane is a developable surface obtained by curving a plane horizontally with respect to the viewpoint.
  • the second projection plane is desirably a surface whose shape is intermediate between the input image, which is planar with respect to the viewpoint, and the display unit.
  • the display unit may also be a screen onto which images are projected by projectors.
  • in such a display system, a plurality of projectors, one for each of a plurality of screens, are installed so that a display image can be projected onto each screen.
  • the above embodiments can also be applied to a display system in which a plurality of projectors project onto a white wall.
  • if the wall onto which the image is projected has, when viewed from above, the same shape as the displays shown in FIG. 4, the same effect can be obtained by generating the display images using the cylindrical projection plane.
  • in that case, the display system information is acquired by regarding the area of the white wall onto which the image is projected as the display unit.
  • in the above description, a display unit using large displays as shown in FIG. 4 has been taken as an example.
  • however, a similar display system can be configured even with a display unit just large enough to cover only the viewer's head, using, for example, display screens on a plurality of planes.
  • each display screen constituting the display unit is planar, and the screens are arranged so that their normals intersect on the image display side, whereby a wide-field image with a sense of reality can be displayed.
  • in this case as well, the display image for each display screen is generated from one common input image, as in the above-described embodiments.
  • thereby, a wide-field image without a sense of incongruity can be generated.
  • the viewpoint information is specified by designating a desired viewer position.
  • the position of the viewer may be actually detected, and the viewpoint information may be set according to the viewer who is actually viewing the display system.
  • in the above-described embodiments, the plane projection plane is set as a rectangular plane having the same aspect ratio as the captured image.
  • however, when only a partial area of the input image is displayed, the plane projection plane may be set as a rectangular plane having the aspect ratio of that partial area.
  • in that case, the first calculation unit 2061 associates the coordinates of the four corners of the plane projection plane with the pixel positions of the four corners of the partial area in the input image, so that a display image generated from the partial area of the input image can be displayed on the display unit.
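The four-corner association can be realized, for example, with bilinear interpolation: a normalized position on the plane projection plane is mapped into the quadrilateral formed by the four corner pixels of the partial area. The function below is an illustrative sketch (the name and the corner ordering are assumptions, not from the embodiment):

```python
def sample_partial_region(u, v, corners):
    """Bilinearly interpolate between the four input-image corner
    pixels associated with the four corners of the plane projection
    plane.  `corners` lists (x, y) pixel positions in the order
    top-left, top-right, bottom-right, bottom-left; (u, v) in [0, 1]^2
    are normalized coordinates on the plane projection plane."""
    (x0, y0), (x1, y1), (x2, y2), (x3, y3) = corners
    top = (x0 + (x1 - x0) * u, y0 + (y1 - y0) * u)        # along the top edge
    bottom = (x3 + (x2 - x3) * u, y3 + (y2 - y3) * u)     # along the bottom edge
    return (top[0] + (bottom[0] - top[0]) * v,
            top[1] + (bottom[1] - top[1]) * v)
```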
  • the present invention can also be realized by supplying a program that implements one or more functions of the above-described embodiments to a system or apparatus via a network or a storage medium, and having one or more processors in the computer of the system or apparatus read and execute the program. It can also be realized by a circuit (for example, an ASIC: application-specific integrated circuit) that implements one or more functions.

Landscapes

  • Engineering & Computer Science (AREA)
  • Multimedia (AREA)
  • Signal Processing (AREA)
  • Controls And Circuits For Display Device (AREA)

Abstract

The invention concerns an image processing apparatus that generates display images for the displays of a display system having a display device, characterized in that the display device has positioned within it a plurality of flat screens capable of displaying images, and the image processing apparatus has generation means that generates a plurality of display images, one for each of the plurality of flat screens, from a single input image, based on the relationship between a viewpoint position and each of the plurality of flat screens.
PCT/JP2019/003183 2018-02-20 2019-01-30 Image processing apparatus, image processing method, and program WO2019163449A1 (fr)

Priority Applications (1)

Application Number Priority Date Filing Date Title
US16/942,803 US11962946B2 (en) 2018-02-20 2020-07-30 Image processing apparatus, display system, image processing method, and medium

Applications Claiming Priority (4)

Application Number Priority Date Filing Date Title
JP2018-028111 2018-02-20
JP2018028111 2018-02-20
JP2018-235364 2018-12-17
JP2018235364A JP2019146155A (ja) Image processing apparatus, image processing method, and program

Related Child Applications (1)

Application Number Title Priority Date Filing Date
US16/942,803 Continuation US11962946B2 (en) 2018-02-20 2020-07-30 Image processing apparatus, display system, image processing method, and medium

Publications (1)

Publication Number Publication Date
WO2019163449A1 true WO2019163449A1 (fr) 2019-08-29

Family

ID=67688085

Family Applications (1)

Application Number Title Priority Date Filing Date
PCT/JP2019/003183 WO2019163449A1 (fr) 2018-02-20 2019-01-30 Appareil de traitement d'image, procédé de traitement d'image et programme

Country Status (1)

Country Link
WO (1) WO2019163449A1 (fr)

Cited By (1)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US11561511B2 (en) 2019-12-30 2023-01-24 Electronics And Telecommunications Research Institute Method and apparatus for generating hologram with wide viewing angle

Citations (6)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
JPH0463092A (ja) * 1990-06-29 1992-02-28 Sanyo Electric Co Ltd 3次元シーン表示システム
JP2002350999A (ja) * 2001-05-24 2002-12-04 Toppan Printing Co Ltd 映像表示システム及び映像プログラム
JP2005277825A (ja) * 2004-03-25 2005-10-06 Hitachi Ltd マルチカメラ映像表示装置
JP2005347813A (ja) * 2004-05-31 2005-12-15 Olympus Corp 画像変換方法および画像変換装置、並びにマルチプロジェクションシステム
JP2013211672A (ja) * 2012-03-30 2013-10-10 Namco Bandai Games Inc 曲面投影立体視装置
JP2017502583A (ja) * 2013-12-09 2017-01-19 シゼイ シジブイ カンパニー リミテッド 多面上映館の映像の生成方法とその記憶媒体およびこれを用いた映像管理装置

Similar Documents

Publication Publication Date Title
WO2018188499A1 (fr) Image processing method and device, video processing method and device, virtual reality device, and storage medium
US8007110B2 (en) Projector system employing depth perception to detect speaker position and gestures
US10863154B2 (en) Image processing apparatus, image processing method, and storage medium
US11962946B2 (en) Image processing apparatus, display system, image processing method, and medium
KR102539427B1 (ko) 화상 처리장치, 화상 처리방법, 및 기억매체
KR102049456B1 (ko) 광 필드 영상을 생성하는 방법 및 장치
US20200329227A1 (en) Information processing apparatus, information processing method and storage medium
JP2020120236A (ja) Video display device and method
JP2008187729A (ja) Video signal processing device
US11244423B2 (en) Image processing apparatus, image processing method, and storage medium for generating a panoramic image
CN111694528A (zh) 显示墙的排版辨识方法以及使用此方法的电子装置
KR20200103115A (ko) 가상 물체 표시 제어 장치, 가상 물체 표시 시스템, 가상 물체 표시 제어 방법, 및 가상 물체 표시 제어 프로그램
US20160260244A1 (en) Three-dimensional image processing apparatus and three-dimensional image processing system
JP2018010473A (ja) Image processing device, image processing method, and image processing program
JP6719596B2 (ja) Image generation device and image display control device
US10935878B2 (en) Image processing apparatus, image processing method, and program
WO2019163449A1 (fr) Image processing apparatus, image processing method, and program
JP5249733B2 (ja) Video signal processing device
WO2019163128A1 (fr) Virtual object display control device, virtual object display system, virtual object display control method, and virtual object display control program
JP2019146010A (ja) Image processing device, image processing method, and program
JP5645448B2 (ja) Image processing device, image processing method, and program
CN110264406B (zh) Image processing device and image processing method
JP5781017B2 (ja) Video conversation system
JP2019146004A (ja) Image processing device, image processing method, and program
JP2019146157A (ja) Image processing device, image processing method, and program

Legal Events

Date Code Title Description
121 Ep: the epo has been informed by wipo that ep was designated in this application

Ref document number: 19758061

Country of ref document: EP

Kind code of ref document: A1

NENP Non-entry into the national phase

Ref country code: DE

122 Ep: pct application non-entry in european phase

Ref document number: 19758061

Country of ref document: EP

Kind code of ref document: A1