WO2019207922A1 - Display imaging device - Google Patents

Display imaging device

Info

Publication number
WO2019207922A1
Authority
WO
WIPO (PCT)
Prior art keywords
camera
light receiving
display
receiving position
positions
Prior art date
Application number
PCT/JP2019/006080
Other languages
French (fr)
Japanese (ja)
Inventor
Shinji Kimura (木村 真治)
Original Assignee
NTT DOCOMO, INC. (株式会社NTTドコモ)
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by NTT DOCOMO, INC. (株式会社NTTドコモ)
Priority to JP2020516058A (patent JP7002644B2)
Publication of WO2019207922A1

Classifications

    • GPHYSICS
    • G03PHOTOGRAPHY; CINEMATOGRAPHY; ANALOGOUS TECHNIQUES USING WAVES OTHER THAN OPTICAL WAVES; ELECTROGRAPHY; HOLOGRAPHY
    • G03BAPPARATUS OR ARRANGEMENTS FOR TAKING PHOTOGRAPHS OR FOR PROJECTING OR VIEWING THEM; APPARATUS OR ARRANGEMENTS EMPLOYING ANALOGOUS TECHNIQUES USING WAVES OTHER THAN OPTICAL WAVES; ACCESSORIES THEREFOR
    • G03B15/00Special procedures for taking photographs; Apparatus therefor
    • GPHYSICS
    • G03PHOTOGRAPHY; CINEMATOGRAPHY; ANALOGOUS TECHNIQUES USING WAVES OTHER THAN OPTICAL WAVES; ELECTROGRAPHY; HOLOGRAPHY
    • G03BAPPARATUS OR ARRANGEMENTS FOR TAKING PHOTOGRAPHS OR FOR PROJECTING OR VIEWING THEM; APPARATUS OR ARRANGEMENTS EMPLOYING ANALOGOUS TECHNIQUES USING WAVES OTHER THAN OPTICAL WAVES; ACCESSORIES THEREFOR
    • G03B17/00Details of cameras or camera bodies; Accessories therefor
    • G03B17/18Signals indicating condition of a camera member or suitability of light
    • HELECTRICITY
    • H04ELECTRIC COMMUNICATION TECHNIQUE
    • H04NPICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N23/00Cameras or camera modules comprising electronic image sensors; Control thereof
    • H04N23/60Control of cameras or camera modules
    • HELECTRICITY
    • H04ELECTRIC COMMUNICATION TECHNIQUE
    • H04NPICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N7/00Television systems
    • H04N7/18Closed-circuit television [CCTV] systems, i.e. systems in which the video signal is not broadcast

Definitions

  • the present invention relates to a display imaging device.
  • Japanese Patent Application Laid-Open No. 2004-133867 discloses a technique of capturing an image of a user in front of a display with a camera provided behind a transmissive display that displays a call partner. According to this method, the user can be imaged from the front while the user and the call partner face each other, so there is an increased possibility that a video call in which the lines of sight coincide can be realized.
  • An object of one embodiment of the present invention is to provide a display imaging device capable of realizing a state in which the lines of sight coincide and of improving image quality.
  • A display imaging apparatus includes a transmissive display including a periodic structure member, a camera that images the front of the display from behind the display, and a generation unit that generates a front image based on captured images of the camera. The generation unit generates the front image so that the periodic structure member is not displayed, based on a plurality of captured images that are acquired when the light receiving position of the camera is at each of a plurality of positions and in which the periodic structure member is displayed at different positions.
  • With this configuration, the front of the display can be imaged from the front, from behind the display, with the camera.
  • For example, the user in front of the display can be imaged from the front, so that a state in which the lines of sight coincide can be realized.
  • a front image is generated so that the periodic structure member is not displayed. Since the periodic structure member is not displayed in the front image, the image quality can be improved.
  • According to one embodiment of the present invention, it is possible to provide a display imaging device capable of realizing a state in which the lines of sight coincide and of improving image quality.
  • FIG. 1 shows a schematic configuration of a display imaging device 7 according to the embodiment.
  • the display imaging device 7 includes a display 1, a camera 2, and a control device 6.
  • the display 1, the camera 2, and the control device 6 are supported at a desired position by being supported by a support member (not shown), for example.
  • FIG. 1 also shows a user U1 who is a user of the display imaging device 7.
  • an XYZ orthogonal coordinate system is shown.
  • the X axis indicates the horizontal direction of the display 1.
  • the Y axis indicates the vertical direction of the display 1.
  • the Z axis indicates the front-rear direction of the display 1.
  • the display 1 is a transmissive display, for example, a self-luminous display in which each display pixel includes an organic light emitting diode (OLED: Organic Light Emitting Diode).
  • a front surface 1a of the display 1 is a display surface of the display 1.
  • the display 1 has a viewing angle forward, and light (video light or the like) from the display 1 travels forward from the front surface 1a within the viewing angle range.
  • the image displayed on the front surface 1a of the display 1 is brightest when the display 1 is viewed from the front.
  • the camera 2 is provided behind the display 1.
  • the camera 2 images the front of the display 1 through the display 1.
  • An example of the imaging target is the user U1.
  • the camera 2 repeats imaging at a predetermined imaging frame rate, a plurality of captured images that are continuous in time series are acquired.
  • the plurality of captured images acquired in this way can constitute a video in front of the display 1.
  • the light from the imaging object located within the range of the angle of view of the camera 2 enters the camera 2.
  • the spread of the angle of view of the camera 2 is conceptually shown as an angle in FIG. 1.
  • the camera 2 receives incident light at the light receiving position R.
  • the light receiving position R is the position of the imaging start point of the camera 2, and a scene viewed forward from the light receiving position R is a captured image of the camera 2.
  • the camera 2 includes a photoelectric conversion element, an image processing circuit, and the like.
  • the position of the photoelectric conversion element may be the light receiving position R of the camera 2.
  • the distance between the light receiving position R of the camera 2 and the back surface 1b of the display 1 is referred to as a distance A1 and is illustrated.
  • a distance between the light receiving position R and the user U1 is referred to as a distance A2 and illustrated.
  • the distance A1 is sufficiently smaller than the distance A2.
  • the distance A2 is usually about several tens of centimeters to several meters, although it depends on the usage mode of the display imaging device 7.
  • the distance A1 may be set to about several tenths to several hundredths of the distance A2, for example, about several mm to several cm.
  • the camera 2 may be provided behind the display 1 in a state where the end 2a of the camera 2 is close to or in contact with the back surface 1b of the display 1 so that the distance A1 is as small as possible.
  • the display 1 includes a plurality of display pixels 11 and a periodic structure member 15.
  • Each display pixel 11 may be a sub-pixel of the display 1, and in this case, each display pixel 11 includes a light emitting element corresponding to one of the three primary colors of R, G, and B.
  • the example of a light emitting element is the above-mentioned organic light emitting diode.
  • the periodic structure member 15 includes signal lines that exist in a mesh shape for driving the display 1.
  • the luminance value of the light emitting element of each display pixel 11 is adjusted in accordance with an electrical signal (control signal or the like) from the signal line.
  • the periodic structure member 15 may include members necessary for signal line wiring.
  • the periodic structure member 15 is also a lattice frame that partitions the display pixels 11 into a mesh shape. In the example shown in FIG. 2, one display pixel 11 is included in one mesh. However, a plurality of display pixels 11 may be included in one mesh. For example, when one signal line running in the horizontal direction drives the light emitting elements of the display pixels 11 located on both sides of the signal line, two display pixels 11 may be included in one mesh in the vertical direction.
  • two display pixels 11 may be included in one mesh in the horizontal direction. Since the number of display pixels 11 included in one mesh is flexible, the shape of the mesh is not limited to a square, and may be, for example, a rectangle.
  • the periodic structure member 15 includes a plurality of horizontal members 15x and a plurality of vertical members 15y. Each horizontal member 15x extends in the horizontal direction of the display 1, and the horizontal members 15x are arranged at equal intervals in the vertical direction of the display 1. Each vertical member 15y extends in the vertical direction of the display 1, and the vertical members 15y are arranged at equal intervals in the horizontal direction of the display 1. Details of the dimensions of the periodic structure member 15 will be described with reference to FIG. 2B.
  • In the captured image of the camera 2, which images the front of the display 1 through the display 1, a scene as shown in FIG. 2B appears.
  • That is, the periodic structure member 15 of the display 1 is reflected in the captured image of the camera 2.
  • the periodic structure member 15 usually appears as a lattice frame of black or a color close to black.
  • In FIG. 2B, the following symbols are used for the dimensions of the periodic structure member 15.
  • Vertical member width Wx: the length of the vertical member 15y in the horizontal direction, i.e., the member width of the periodic structure member 15 in the horizontal direction.
  • Vertical member interval Dx: the horizontal distance between adjacent vertical members 15y. It is also the periodic interval (mesh size) of the periodic structure member 15 in the horizontal direction.
  • the vertical member interval Dx may be larger than the vertical member width Wx (for example, twice or more).
  • Horizontal member width Wy: the length of the horizontal member 15x in the vertical direction, i.e., the member width of the periodic structure member 15 in the vertical direction.
  • Horizontal member interval Dy: the vertical distance between adjacent horizontal members 15x. It is also the periodic interval (mesh size) of the periodic structure member 15 in the vertical direction.
  • the horizontal member interval Dy may be larger than the horizontal member width Wy (for example, twice or more).
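  • To make these dimensions concrete, the following sketch (an illustration under assumptions, not taken from the patent; the function name and example numbers are hypothetical) builds a boolean mask, in imaging-pixel coordinates, of the area covered by a grid with member widths Wx, Wy and intervals Dx, Dy. Shifting the offset mimics the apparent shift of the periodic structure member 15 discussed later.

      import numpy as np

      def grid_mask(h: int, w: int, Wx: int, Dx: int, Wy: int, Dy: int,
                    off_x: int = 0, off_y: int = 0) -> np.ndarray:
          """Boolean (h, w) mask: True where the periodic structure covers a pixel.

          Wx/Wy are the member widths of the vertical/horizontal members and
          Dx/Dy their intervals, all expressed in imaging pixels; off_x/off_y
          shift the grid, mimicking a shift of the light receiving position.
          """
          xs = (np.arange(w) + off_x) % Dx      # position within a mesh, horizontally
          ys = (np.arange(h) + off_y) % Dy      # position within a mesh, vertically
          vertical_members = xs < Wx            # columns covered by vertical members 15y
          horizontal_members = ys < Wy          # rows covered by horizontal members 15x
          return horizontal_members[:, None] | vertical_members[None, :]

      # Example: a 60-pixel mesh with 5-pixel-wide members, shifted 10 pixels diagonally.
      # m0 = grid_mask(480, 640, Wx=5, Dx=60, Wy=5, Dy=60)
      # m1 = grid_mask(480, 640, Wx=5, Dx=60, Wy=5, Dy=60, off_x=10, off_y=10)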
  • the size of the imaging pixel (not shown) of the camera 2 will be described.
  • the imaging pixels of the camera 2 may be sufficiently smaller than the mesh. If the number of horizontal imaging pixels in one mesh is Nx and the number of vertical imaging pixels is Ny, Nx and Ny may be several to several hundreds.
  • the vertical member interval Dx and the horizontal member interval Dy can also be expressed in units of the number of imaging pixels of the camera 2. In that case, the vertical member interval Dx corresponds to Nx pixels, and the horizontal member interval Dy corresponds to Ny pixels.
  • the dimensions related to the periodic structure member 15 described above, that is, the vertical member interval Dx, the vertical member width Wx, the horizontal member interval Dy, and the horizontal member width Wy may be measured, for example, by performing calibration in advance.
  • the size of an imaging pixel of the camera 2 may be equal to or less than the vertical member width Wx in the horizontal direction and equal to or less than the horizontal member width Wy in the vertical direction. That is, the vertical member width Wx and the horizontal member width Wy can be equal to or larger than the size of one imaging pixel of the camera 2. This is because, as described above, the distance A1 between the light receiving position R of the camera 2 and the back surface 1b of the display 1 is small, and the camera 2 images the periodic structure member 15 from a close range. In this case, the horizontal members 15x and the vertical members 15y appear in the captured image, and the possibility that the periodic structure member 15 is reflected in the captured image increases.
  • Depending on the difference in size between the vertical member width Wx and the horizontal member width Wy, only the horizontal member 15x or only the vertical member 15y may be reflected. Due to the reflection of the periodic structure member 15, the possibility that the quality (image quality) of the image captured by the camera 2 is lowered also increases. This problem is improved by the control device 6 described below.
  • FIG. 3 shows a block diagram of the control device 6.
  • the control device 6 is configured to be communicable with the camera 2, for example, and can perform various controls relating to imaging by the camera 2 and acquire a captured image of the camera 2.
  • the control device 6 includes an acquisition unit 61 and a generation unit 62 as its functional blocks.
  • the acquisition unit 61 can include a physical element (for example, a holding unit 61a described later) that moves the camera 2.
  • the acquisition unit 61 is a part that acquires a captured image of the camera 2 when the light receiving position R of the camera 2 is at each of a plurality of positions.
  • the captured images acquired here are a plurality of captured images in which the periodic structure member 15 is displayed at different positions.
  • the camera 2 is configured to be movable.
  • the light receiving position R of the camera 2 moves between a plurality of positions in accordance with the movement of the camera 2.
  • the acquisition unit 61 functions as a moving unit that moves the camera 2 itself by controlling the holding unit 61a that holds the camera 2.
  • the camera 2 may be controlled to vibrate minutely.
  • the holding unit 61a can be realized in various forms using known methods.
  • the holding unit 61a may be, for example, a component that moves the camera 2 minutely using a vibrator. Instead of moving the holding unit 61a itself, a vibrator may be attached to the main body of the camera 2, the image sensor (CMOS, CCD, etc.), the lens, or the like to give a minute movement, or the minute vibration may be realized by controlling the light reflection direction with a MEMS (Micro Electro Mechanical Systems) mirror provided inside the camera 2.
  • Alternatively, the light receiving positions R of the camera 2 may exist simultaneously at a plurality of positions.
  • As such a camera, for example, a light field camera, a 3D camera, or the like can be used.
  • FIG. 4B shows a schematic configuration of the optical system of the light field camera 21.
  • An imaging target of the light field camera 21 is conceptually illustrated as an imaging target 30.
  • In the light field camera 21, a microlens array 23 is disposed between the main lens 24 and an image sensor 22 such as a CMOS or CCD, in front of the image sensor 22. Since the principle of the light field camera is well known, detailed description is omitted.
  • With the light field camera 21, a plurality of captured images (an image group) similar to those obtained when the light receiving position R of the camera 2 is moved can be acquired without moving the camera itself. Moreover, such an image group can be acquired in one imaging operation (one shot). In this case, the number of captured images constituting the image group divided by the time required for one shot can be regarded as the imaging frame rate.
  • In addition, since an operation system for causing movement, such as the holding unit 61a shown in FIG. 4A, becomes unnecessary, an effect equivalent to substantially increasing the movement speed of the light receiving position R of the camera 2 (in the case of vibration, increasing the vibration frequency) is obtained. Thereby, the imaging frame rate of the camera 2 can be increased.
  • In the following, the form in which the light receiving position R of the camera 2 moves between a plurality of positions and a captured image of the camera 2 is acquired at each position is mainly described.
  • In the form in which the light receiving positions R of the camera 2 exist simultaneously at a plurality of positions, light receiving positions corresponding to the respective positions to which the light receiving position R of the camera 2 would be moved are used instead.
  • For example, a light field camera or the like adjusted to have such a plurality of light receiving positions may be used. In that case, it is sufficient to understand that the movement of the light receiving position R of the camera 2 in the following description is unnecessary.
  • the acquisition unit 61 moves the light receiving position R of the camera 2 to a plurality of positions.
  • the acquisition unit 61 may move the light receiving position R of the camera 2 in a direction inclined with respect to the horizontal direction and the vertical direction (hereinafter referred to as “inclination direction”).
  • the plurality of positions may be defined by coordinates (XY plane coordinates) parallel to the back surface 1b of the display 1.
  • the plurality of positions may be the same position in the Z-axis direction. In this case, even if the light receiving position R of the camera 2 moves, the distance A1 (see FIG. 1) between the light receiving position R of the camera 2 and the back surface 1b does not change.
  • the plurality of positions are determined so that minute parallax (fine parallax) is generated between the captured images acquired at the respective light receiving positions.
  • the occurrence of fine parallax means that in the captured images acquired at the respective light receiving positions, the imaging target is displayed at approximately the same position (overlapping), while the periodic structure member 15 is displayed at a different position (not overlapping).
  • Such fine parallax occurs because the distance A1 between the light receiving position R of the camera 2 and the back surface 1b of the display 1 depends on the light receiving position R and the imaging target, as described above with reference to FIG. This is because it is smaller than the distance A2 between the user U1.
  • the plurality of positions are arranged side by side in a straight line in the movement direction of the plurality of positions (for example, the above-described tilt direction).
  • the plurality of positions may be three or more positions.
  • a distance between the light receiving positions R of the camera 2 for causing fine parallax (movement distance of the light receiving position R) will be described.
  • Since the distance A1 is small, the movement of the light receiving position R of the camera 2 on the XY plane can be regarded as movement on the plane including the periodic structure member 15 of the display 1. Therefore, the periodic structure member 15 in the captured image of the camera 2 moves by approximately the moving distance of the light receiving position R of the camera 2.
  • the minimum unit of the moving distance of the light receiving position R of the camera 2 is defined as follows.
  • Movement distance Δx: the horizontal distance (separation distance) between adjacent positions along the movement direction (for example, the inclination direction). Depending on how the coordinates are set, Δx may be a negative value.
  • Movement distance Δy: the vertical distance (separation distance) between adjacent positions along the movement direction. Depending on how the coordinates are set, Δy may be a negative value. As described above, the movement distance Δx and the movement distance Δy can take negative values, but hereinafter the magnitudes of the movement distance Δx and of the movement distance Δy are simply referred to as the movement distance Δx and the movement distance Δy.
  • the movement distance Δx and the movement distance Δy are set as follows, using the dimensions of the periodic structure member 15 shown in FIG. 2B.
  • the movement distance Δx is set to be not less than the vertical member width Wx and less than the vertical member interval Dx, that is, Wx ≤ |Δx| < Dx.
  • the movement distance Δy is set to be not less than the horizontal member width Wy and less than the horizontal member interval Dy, that is, Wy ≤ |Δy| < Dy.
  • Further, the movement distance Δx may be set to be less than half of the vertical member interval Dx, that is, Wx ≤ |Δx| < Dx/2.
  • Likewise, the movement distance Δy may be set to be less than half of the horizontal member interval Dy, that is, Wy ≤ |Δy| < Dy/2.
  • As described above, the vertical member interval Dx and the horizontal member interval Dy can be represented by the numbers of imaging pixels Nx and Ny of the camera 2.
  • In that case, the movement distance Δx is expressed as less than Nx or less than Nx/2.
  • Similarly, the movement distance Δy is expressed as less than Ny or less than Ny/2.
  • Since the movement distance Δx is equal to or greater than the vertical member width Wx, the vertical member 15y does not overlap at the same position in the captured images acquired before and after a movement of the light receiving position R of the camera 2.
  • Since the movement distance Δx is less than the vertical member interval Dx, different vertical members 15y (for example, adjacent vertical members 15y) do not overlap at the same position in the images acquired before and after the movement either.
  • Further, when the movement distance Δx is less than half of the vertical member interval Dx, the vertical member 15y does not overlap at the same position even across the three captured images acquired over two movements of the light receiving position R.
  • The same applies to the vertical direction: since the movement distance Δy is equal to or greater than the horizontal member width Wy, the horizontal member 15x does not overlap at the same position in the captured images acquired before and after the movement. Further, since the movement distance Δy is less than the horizontal member interval Dy, different horizontal members 15x (for example, adjacent horizontal members 15x) do not overlap at the same position in the images acquired before and after the movement. Further, when the movement distance Δy is less than half of the horizontal member interval Dy, the horizontal member 15x does not overlap at the same position even across the three captured images acquired over the two movements of the light receiving position R.
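  • As a concrete check of these conditions, the following sketch (an illustration, not part of the patent; the function name and example numbers are assumptions) tests whether a candidate displacement of the light receiving position satisfies Wx ≤ |Δx| < Dx/2 and Wy ≤ |Δy| < Dy/2, with all quantities expressed in imaging pixels.

      def displacement_ok(dx: float, dy: float,
                          Wx: float, Dx: float,
                          Wy: float, Dy: float) -> bool:
          """True if the displacement keeps a grid member from landing on the
          same pixel in any of three consecutive captured images."""
          return (Wx <= abs(dx) < Dx / 2) and (Wy <= abs(dy) < Dy / 2)

      # Hypothetical dimensions: member width 5 px, mesh pitch 60 px.
      print(displacement_ok(10, 10, Wx=5, Dx=60, Wy=5, Dy=60))   # True
      print(displacement_ok(35, 35, Wx=5, Dx=60, Wy=5, Dy=60))   # False: 35 >= Dx/2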
  • the movement distance Δx can also be made sufficiently small; in that case, the movement distance Δx may be zero.
  • likewise, the movement distance Δy can be made sufficiently small; in that case, the movement distance Δy may be zero.
  • When the light receiving position R is moved in the inclination direction, the inclination angle is less than 90°. If the movement distance Δx and the movement distance Δy are the same, the angle of the inclination direction is set to 45°.
  • Since the movement distance Δx and the movement distance Δy may vary depending on the vertical member width Wx, the vertical member interval Dx, the horizontal member width Wy, and the horizontal member interval Dy, the inclination angle may be a value different from 45°.
  • For example, the inclination angle can be within a range of 45° ± 10°, 45° ± 20°, or 45° ± 30°.
  • When the inclination angle is 0° or 90°, the acquisition unit 61 moves the light receiving position R of the camera 2 substantially in the horizontal direction or the vertical direction.
  • the acquisition unit 61 moves the light receiving position R of the camera 2 in conjunction with the imaging frame rate of the camera 2. Specifically, when the captured image is acquired (at the time of imaging), the acquisition unit 61 moves the light receiving position R of the camera 2 so that the light receiving position R of the camera 2 is located at one of a plurality of positions. When the imaging frame rate of the camera 2 is 60 fps, the acquisition unit 61 moves the light receiving position R of the camera 2 60 times per second.
  • the acquisition unit 61 may reciprocate the light receiving position R of the camera 2.
  • the acquisition unit 61 may vibrate the light receiving position R of the camera 2 (at a high speed and minutely) as described above with reference to FIG.
  • the length of one of the reciprocating paths (that is, the width of vibration) may be set to be less than the vertical member interval Dx in the horizontal direction and less than the horizontal member interval Dy in the vertical direction.
  • the generation unit 62 is a part that generates an image in front of the display 1 (hereinafter also referred to as “front image”) based on the captured image acquired by the acquisition unit 61.
  • the generation unit 62 generates a front image so that the periodic structure member 15 is not displayed.
  • a front image is an image in which the reflection of the periodic structure member 15 is reduced as compared with a captured image of the camera 2 (for example, an image as shown in FIG. 2B).
  • the front image may be an image without the periodic structure member 15 (removed).
  • the generation unit 62 may generate a front image based on the captured images of the camera 2 that are sequentially acquired when the light receiving position R of the camera 2 is at each position on the reciprocating path.
  • a time series median filter may be used to generate the forward image by the generation unit 62. Since the median filter itself is a known technique, the description thereof is omitted here.
  • the function of the median filter may be provided in the generation unit 62 itself, or the generation unit 62 may use a median filter function provided in another element (which may be a server (not shown) outside the control device 6).
  • the generation unit 62 applies the median filter to the captured images acquired at at least three light receiving positions arranged along the movement direction (for example, the inclination direction) of the light receiving position R among the plurality of positions.
  • the resulting image may be generated as a front image.
  • Specifically, for each pixel position, an image composed of imaging pixels whose luminance value is the median of the luminance values of the three corresponding imaging pixels is generated as the front image.
  • the captured images acquired when the light receiving position R of the camera 2 is at each of the three positions are three time-series images acquired while the light receiving position R of the camera 2 moves in conjunction with the imaging frame rate.
  • a median filter is applied to a plurality of such time-series captured images (image group).
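  • As a minimal illustration of this per-pixel, time-series median filtering (a sketch under assumptions, not the patent's implementation; it assumes three grayscale frames of equal size held as NumPy arrays):

      import numpy as np

      def front_image_from_three(frames: list) -> np.ndarray:
          """Per-pixel temporal median of three captured images.

          Because the grid can cover a given pixel in at most one of the three
          frames (fine parallax), the median picks a grid-free luminance value
          for every pixel.
          """
          assert len(frames) == 3
          stack = np.stack(frames, axis=0)              # shape: (3, H, W)
          return np.median(stack, axis=0).astype(frames[0].dtype)

      # Usage sketch with placeholder frames f0, f1, f2 captured at the three
      # light receiving positions:
      # front = front_image_from_three([f0, f1, f2])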
  • Hereinafter, a case where there are three positions will be described.
  • FIG. 5A to 5C show captured images acquired when the light receiving position R of the camera 2 is at each of the three positions.
  • an arrow AR indicates the moving direction of the light receiving position R. Since the light receiving position R reciprocates as described above, the arrow AR indicates the direction of the forward or backward path of the reciprocating path.
  • the light receiving positions R of the camera 2 from which the captured images shown in FIGS. 5A to 5C are acquired are referred to as light receiving positions R0 to R2.
  • one coordinate of the lattice point of the periodic structure member 15 in the captured image is represented as P (x, y).
  • the lattice point is a point where a horizontal member 15x and a vertical member 15y intersect in the periodic structure member 15.
  • FIG. 5A shows a captured image acquired at the light receiving position R0.
  • the periodic structure member 15 is present in front of the user U1 who is the imaging target.
  • One coordinate of the lattice point of the periodic structure member 15 is P (x0, y0).
  • FIG. 5B shows a captured image acquired at the light receiving position R1.
  • the light receiving position R1 is a position moved from the light receiving position R0 by the movement distance Δx and the movement distance Δy.
  • the periodic structure member 15 is present in front of the user U1.
  • In the captured image acquired at the light receiving position R1, fine parallax occurs with respect to the captured image acquired at the light receiving position R0. Therefore, the position of the user U1 is the same while the position of the periodic structure member 15 has moved.
  • FIG. 5C shows a captured image acquired at the light receiving position R2.
  • the light receiving position R2 is a position moved from the light receiving position R1 by the movement distance Δx and the movement distance Δy.
  • the periodic structure member 15 is present in front of the user U1.
  • the movement distance Δx and the movement distance Δy are set to satisfy Wx ≤ |Δx| < Dx/2 and Wy ≤ |Δy| < Dy/2, respectively.
  • the vertical member 15y does not overlap and the horizontal member 15x does not overlap at the same position in the three captured images acquired at the respective light receiving positions R0 to R2. Therefore, when imaging pixels at the same position in the three imaging images are compared, at most one imaging pixel displays the vertical member 15y among the three imaging pixels. At least two imaging pixels are imaging pixels that display an imaging target (for example, user U1) instead of the vertical member 15y. Similarly, among the three imaging pixels, at most one imaging pixel displays the horizontal member 15x.
  • At least two imaging pixels are imaging pixels that display an imaging target, not the horizontal member 15x. Therefore, if a median filter is applied to these three captured images, a single image is obtained in which, as shown in FIG. 5D, the horizontal members 15x and the vertical members 15y of the periodic structure member 15 are not displayed (or their reflection is reduced), and this image can be generated as the front image.
  • Since the acquisition unit 61 reciprocates the light receiving position R of the camera 2, after the light receiving position R moves from the light receiving position R0 to the light receiving position R2 along the direction of the arrow AR, it then moves from the light receiving position R2 back to the light receiving position R0 along the direction opposite to the arrow AR.
  • If the arrow AR is the forward direction of the reciprocating path, the light receiving position R passes through the light receiving position R0, the light receiving position R1, and the light receiving position R2 in this order on the forward path, and passes through the light receiving position R2, the light receiving position R1, and the light receiving position R0 in this order on the return path.
  • the generation unit 62 may generate a first front image based on the captured images acquired when the light receiving position R of the camera 2 is at each position on the forward path of the reciprocating path (light receiving position R0 to light receiving position R2), and generate a second front image based on the captured images acquired when the light receiving position R is at each position on the return path (light receiving position R2 to light receiving position R0).
  • These first front image and second front image may be repeatedly generated while the reciprocal movement continues.
  • the light receiving position R2 is the turn-back position from the forward path to the return path, and the light receiving position R0 is the turn-back position from the return path to the forward path.
  • the captured image acquired when the light receiving position R of the camera 2 is at the turn-back position of these round-trip paths may be used in common for generating the first front image and the second front image.
  • The captured images acquired at the light receiving position R0 and the light receiving position R2 are each one of the three captured images acquired on the forward path of the reciprocating path (light receiving position R0 to light receiving position R2) and also one of the three captured images acquired on the return path (light receiving position R2 to light receiving position R0). Therefore, the captured image acquired at the light receiving position R2 may be used in common for the median filter applied to the three captured images acquired on the forward path and for the median filter applied to the three captured images acquired on the subsequent return path.
  • Similarly, the captured image acquired at the light receiving position R0 may be used in common for the median filter applied to the three captured images acquired on the return path and for the median filter applied to the three captured images acquired on the subsequent forward path.
  • the generation timing of the front image by the generation unit 62 will be described.
  • Δt is the reciprocal of the imaging frame rate of the camera 2.
  • the forward image corresponding to time t′0 is generated based on three captured images (time-series image group) acquired at times t0 to t2 (each light receiving position R).
  • the forward image corresponding to time t′1 is generated based on the three captured images acquired at times t2 to t4.
  • the forward image corresponding to time t′2 is generated based on the three captured images acquired at times t4 to t6.
  • the forward image corresponding to time t′3 is generated based on the three captured images acquired at times t6 to t8.
  • The same applies to the times t′ thereafter.
  • When the imaging frame rate of the camera 2 is 60 fps, Δt is 1/60 second and Δt′ is 1/30 second. That is, the video display frame rate when the front images generated by the generation unit 62 are displayed is 30 fps, half of the 60 fps imaging frame rate of the camera 2.
  • the time t ′ may be associated with any one of the three times t, and may be the same time as the last time t of the three times t, for example.
  • time t′0 is time t2.
  • Time t′1 is time t4.
  • Time t′2 is time t6.
  • Time t′3 is time t8. The same applies to time t ′ thereafter.
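  • The frame bookkeeping described above can be sketched as follows (an illustration under assumptions, not from the patent; the function name is hypothetical). Captured frames are indexed 0, 1, 2, ... at interval Δt, and front image k is built from captured frames 2k, 2k+1, and 2k+2, so consecutive front images share the turn-back frame and the display rate is half of the imaging frame rate.

      def frames_for_front_image(k: int):
          """Indices of the three captured frames used for front image k."""
          return (2 * k, 2 * k + 1, 2 * k + 2)

      imaging_fps = 60
      display_fps = imaging_fps / 2            # front images appear every 2 new captures
      print(frames_for_front_image(0))         # (0, 1, 2) -> front image at t'0
      print(frames_for_front_image(1))         # (2, 3, 4) -> front image at t'1 (frame 2 shared)
      print(display_fps)                       # 30.0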
  • FIG. 6 is a flowchart showing an example of processing (image generation method) executed by the control device 6.
  • the acquisition unit 61 reciprocates the light receiving position R of the camera 2 as described above.
  • In step S1, the generation unit 62 obtains the luminance value f(t) of each imaging pixel in the captured image acquired at the light receiving position R at time t.
  • the light receiving position R here is, for example, the light receiving position R0 of FIG. 5A in the case of the forward path of the reciprocating movement, and the light receiving position R2 of FIG. 5C in the case of the return path.
  • the luminance value f (t) is represented by a numerical value, and the numerical value may be larger as the luminance is higher.
  • In step S2, the acquisition unit 61 moves the light receiving position R of the camera 2 diagonally (in the inclination direction) by a minute amount, as part of reciprocating the light receiving position R of the camera 2.
  • the minute amounts are the movement distance Δx and the movement distance Δy. Since the time required for the movement is Δt, the time when the movement is completed is t + Δt. In the captured image acquired after the movement, fine parallax occurs with respect to the captured image acquired before the movement.
  • In step S3, the generation unit 62 obtains the luminance value f(t + Δt) of each imaging pixel in the captured image acquired at the light receiving position R at time t + Δt.
  • the light receiving position R here is, for example, the light receiving position R1 of FIG. 5B.
  • In step S4, the acquisition unit 61 moves the light receiving position R of the camera 2 diagonally by a minute amount again. That is, the light receiving position R of the camera 2 moves further in the inclination direction by the movement distance Δx and the movement distance Δy.
  • the time at which this movement is completed is t + 2Δt.
  • fine parallax occurs with respect to the captured image acquired before the movement.
  • In step S5, the generation unit 62 obtains the luminance value f(t + 2Δt) of each imaging pixel of the captured image acquired at the light receiving position R at time t + 2Δt.
  • the light receiving position R here is, for example, the light receiving position R2 of FIG. 5C in the case of the forward path of the reciprocating path, and the light receiving position R0 of FIG. 5A in the case of the return path.
  • In step S6, the generation unit 62 applies a median filter to the luminance value f(t), the luminance value f(t + Δt), and the luminance value f(t + 2Δt).
  • In step S7, the generation unit 62 sets the luminance value after the median filter at time t′ to f(t′).
  • the luminance value f(t′) may be, for example, the median of the three luminance values f(t), f(t + Δt), and f(t + 2Δt).
  • An image obtained by applying the median filter is generated as a forward image at time t ′.
  • the time t′ may be the same time as the time t + 2Δt.
  • the front image generated in this way is a front image in which the periodic structure member 15 is not displayed, as described above with reference to FIG. 5D.
  • In step S8, the acquisition unit 61 switches the moving direction of the light receiving position R of the camera 2. That is, the moving direction of the light receiving position R is switched between the forward path and the return path of the reciprocating path. Thereafter, the process returns to step S2.
  • the reason why the process returns to step S2 instead of step S1 is that, as described above, the captured image acquired at the turn-back position of the reciprocating path is used in common on the forward path and the return path. Therefore, the generation unit 62 uses the luminance value f(t + 2Δt) obtained in the previous step S5 as the luminance value f(t) of each imaging pixel in the captured image at time t. That is, after the captured-image information at time t + 2Δt used in the previous loop is substituted for the captured-image information at time t used in the next loop, the processes of steps S2 to S8 are repeatedly executed.
  • a forward image is repeatedly generated so that the periodic structure member 15 as described above is not displayed.
  • When the processing corresponds to the forward path of the reciprocating path, the front image generated in step S7 is the first front image described above; when it corresponds to the return path, it is the second front image described above.
  • In steps S2 and S4, the example in which the light receiving position R of the camera 2 is moved obliquely by a minute amount has been described.
  • However, if the movement distance Δx or the movement distance Δy is zero, the light receiving position R is moved not obliquely but horizontally or vertically.
  • Further, in the form in which the light receiving positions R of the camera 2 exist simultaneously at a plurality of positions, steps S2 and S4 are not necessary.
  • In that case, the process of “switching the moving direction of the light receiving position” in step S8 is also unnecessary. A sketch of the overall loop is shown below.
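  • The following sketch summarizes the loop of FIG. 6 (an illustration under assumptions, not the patent's implementation): capture_frame(), move_receiving_position(), and emit_front_image() are hypothetical callbacks standing in for the camera 2, the holding unit 61a, and the output of the generation unit 62.

      import numpy as np

      def run_front_image_loop(capture_frame, move_receiving_position, emit_front_image):
          """Loop corresponding to steps S1-S8: capture, move, capture, move,
          capture, per-pixel median, then reverse the movement direction and
          reuse the turn-back frame as the next f(t)."""
          direction = +1                              # +1: forward path, -1: return path
          f_t = capture_frame()                       # S1: frame at time t
          while True:
              move_receiving_position(direction)      # S2: minute oblique move
              f_t1 = capture_frame()                  # S3: frame at time t + dt
              move_receiving_position(direction)      # S4: minute oblique move
              f_t2 = capture_frame()                  # S5: frame at time t + 2*dt
              stack = np.stack([f_t, f_t1, f_t2])     # S6: gather the three frames
              front = np.median(stack, axis=0)        # S6-S7: per-pixel median
              emit_front_image(front.astype(f_t.dtype))
              direction = -direction                  # S8: switch forward/return path
              f_t = f_t2                              # reuse turn-back frame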
  • the display imaging device 7 described above includes a display 1, a camera 2, and a control device 6.
  • the display 1 includes a periodic structure member 15.
  • the camera 2 images the front of the display 1 from behind the display 1.
  • the control device 6 includes a generation unit 62.
  • the generation unit 62 generates a front image of the display 1 based on the captured images of the camera 2. More specifically, the generation unit 62 generates the front image so that the periodic structure member 15 is not displayed, based on a plurality of captured images that are acquired when the light receiving position R of the camera 2 is at each of the plurality of positions and in which the periodic structure member 15 is displayed at different positions.
  • the front of the display 1 can be imaged from the front by the camera 2 from behind the display 1.
  • the user U1 in front of the display 1 can be imaged from the front, so that a state in which the lines of sight coincide can be realized.
  • a video call system 9 shown in FIG. 7 includes a control unit 5 and a communication device 8 in addition to the display imaging device 7.
  • the display imaging device 7 includes the camera 2 and the control device 6 (see FIG. 1).
  • the user U1 of the video call system 9 is making a video call with the user U2 who is a call partner.
  • the user U1 is viewing the display 1 from the front (Z-axis positive direction). Therefore, the camera 2 can image the user U1 from the front (Z-axis negative direction) via the display 1 (FIG. 2 and the like).
  • the communication device 8 transmits and receives video data, audio data, and the like to and from a video call system (not shown) used by the user U2.
  • the video data includes video data of user U1 and video data of user U2.
  • the audio data includes user U1 audio data and user U2 audio data.
  • the control unit 5 is a part that executes various controls necessary for operating the display imaging device 7 in the video call system 9.
  • the control unit 5 executes processing for receiving display data for the display 1 (for example, video data of the user U2) via the communication device 8 and transmitting captured image data from the camera 2 (for example, video data of the user U1) via the communication device 8.
  • the control unit 5 may be a constituent element of the display imaging device 7.
  • elements necessary for the video call system 9 such as a speaker and a microphone (not shown) may be included in the display imaging device 7.
  • In the display imaging device 7, as described so far, a front image is generated such that the periodic structure member 15 is not displayed. Therefore, the image quality can be improved.
  • In the video call system 9, the video of the user U1 is presented to the user U2, and the video quality can be improved because the periodic structure member 15 is not reflected in the video of the user U1.
  • the light receiving position R of the camera 2 may be moved between a plurality of positions by moving the camera 2 as described above with reference to FIG. Thereby, it is possible to acquire a captured image when the light receiving position R of the camera 2 is at each of a plurality of positions using a normal camera that is not a light field camera, a 3D camera, or the like.
  • a light field camera, a 3D camera, or the like in which the light receiving positions R exist simultaneously at a plurality of positions may be used.
  • an effect similar to that of increasing the moving speed of the light receiving position R of the camera 2 can be obtained because the operation system for moving the camera 2 becomes unnecessary. Thereby, the imaging frame rate of the camera 2 can be increased.
  • the generation unit 62 may generate a front image based on the captured images of the camera 2 that are sequentially acquired when the light receiving position R of the camera 2 is at each position on a reciprocating path passing through the plurality of positions.
  • the plurality of positions may include a return position of the round-trip path.
  • the generation unit 62 may generate a first front image based on the captured images of the camera 2 acquired when the light receiving position R of the camera 2 is at the plurality of positions on the forward path of the reciprocating path, and generate a second front image based on the captured images of the camera 2 acquired when it is at the plurality of positions on the return path.
  • the generation unit 62 may generate the first front image and the second front image by using the captured image of the camera 2 acquired at the turn-back position of the reciprocating path in common. If the captured image of the camera 2 at the turn-back position were not used in common, one front image would be generated every time three captured images are acquired, so the video display frame rate of the front image would decrease to one third of the imaging frame rate of the camera 2.
  • When the captured image of the camera 2 at the turn-back position of the reciprocating path is used in common, one front image is generated every time two captured images are acquired, so the video display frame rate of the front image can be increased to half of the imaging frame rate of the camera 2.
  • the periodic structure member 15 may include a horizontal member 15x and a vertical member 15y.
  • the horizontal members 15x may extend in the horizontal direction of the display 1 and be arranged at equal intervals in the vertical direction of the display 1.
  • the vertical members 15y may extend in the vertical direction of the display 1 and be arranged at equal intervals in the horizontal direction of the display 1.
  • the plurality of positions may be arranged in a direction inclined with respect to the horizontal direction and the vertical direction.
  • If the light receiving position were moved only in the vertical direction, the positions of the horizontal members in the captured images could be kept from overlapping, but the positions of the vertical members would overlap (and vice versa for movement only in the horizontal direction).
  • By arranging the plurality of positions in the inclined direction, both the positions of the horizontal members and the positions of the vertical members in the captured images can be kept from overlapping. Therefore, the reflection of the periodic structure member 15 in the front image generated by the generation unit 62 can be further reduced, and the image quality can be further improved.
  • the distance between adjacent positions among the plurality of positions may be equal to or greater than the member width of the periodic structure member 15 (vertical member width Wx, horizontal member width Wy) and less than half of the periodic interval (vertical member interval Dx, horizontal member interval Dy).
  • the generation unit 62 may generate, as the front image, an image obtained by applying a median filter to the captured images of the camera 2 acquired when the light receiving position R of the camera 2 is at each of three positions.
  • When imaging pixels at the same position in the three captured images are compared, at most one of the three imaging pixels displays the vertical member 15y. At least two imaging pixels are imaging pixels that display the imaging target (for example, the user U1), not the vertical member 15y. Similarly, among the three imaging pixels, at most one imaging pixel displays the horizontal member 15x. At least two imaging pixels are imaging pixels that display the imaging target, not the horizontal member 15x. Therefore, if a median filter is applied to these three captured images, a captured image in which the periodic structure member 15 is not displayed (from which the periodic structure member 15 has been removed) is obtained, and this captured image can be generated as a front image of the display 1.
  • FIG. 8 shows a block diagram of a control device 6A included in the display imaging device 7A according to such a modification.
  • the control device 6A is different from the control device 6 (FIG. 3) in that it includes an acquisition unit 61A and a generation unit 62A instead of the acquisition unit 61 and the generation unit 62, and further includes a storage unit 63.
  • the acquisition unit 61A reciprocates the light receiving position R of the camera 2 in the same manner as the acquisition unit 61 (FIG. 3). In the case of a light field camera or the like, the movement is not necessary as described above.
  • Whereas the acquisition unit 61 reciprocates the light receiving position R of the camera 2 so as to pass through at least three positions, the acquisition unit 61A reciprocates the light receiving position R of the camera 2 so as to pass through at least two positions.
  • the two positions may be the turn-back positions of the reciprocating path.
  • Hereinafter, the case where the acquisition unit 61A reciprocates the light receiving position R of the camera 2 so as to pass through two positions will be described.
  • the movement of the light receiving position R of the camera 2 by the acquisition unit 61A is also determined so that the fine parallax described so far occurs.
  • the minimum unit of the moving distance of the light receiving position R of the camera 2 by the acquisition unit 61A is defined as follows. Movement distance ΔxA: the horizontal distance (separation distance) between the two positions. Depending on how the coordinates are set, ΔxA may be a negative value. Movement distance ΔyA: the vertical distance (separation distance) between the two positions. Depending on how the coordinates are set, ΔyA may be a negative value.
  • Hereinafter, the magnitude of the movement distance ΔxA is simply referred to as the movement distance ΔxA,
  • and the magnitude of the movement distance ΔyA is simply referred to as the movement distance ΔyA.
  • the movement distance ΔxA is set to be not less than the vertical member width Wx and less than the vertical member interval Dx, that is, Wx ≤ |ΔxA| < Dx.
  • the movement distance ΔyA is set to be not less than the horizontal member width Wy and less than the horizontal member interval Dy, that is, Wy ≤ |ΔyA| < Dy.
  • the generation unit 62A is a part that generates a front image based on the captured images acquired when the light receiving position R of the camera 2 is at each of the two positions. Specifically, the generation unit 62A generates the front image so that the periodic structure member 15 is not displayed (or is removed), by not using, among the imaging pixels at the same position in the two captured images, the imaging pixel that displays the periodic structure member 15.
  • the storage unit 63 is a part that stores in advance information related to the imaging pixels that display the periodic structure member 15. As an example of such information, the storage unit 63 stores in advance the luminance value L of the imaging pixel when the imaging pixel of the captured image of the camera 2 displays the periodic structure member 15.
  • the luminance value L may be a value measured in advance by calibration. Even if such a measurement is not performed, a sufficiently small value (for example, the minimum luminance value) may be set as the luminance value L, since the luminance of an imaging pixel at the position of the periodic structure member 15 is low.
  • the generation unit 62A generates a front image using, among the imaging pixels of the two captured images, the imaging pixel whose luminance value is farther from the luminance value L stored in the storage unit 63. This is because an imaging pixel whose luminance value is close to the luminance value L is likely to be an imaging pixel that displays the periodic structure member 15, so such an imaging pixel is not used. For example, when, of the imaging pixels at the same position in the two captured images, one imaging pixel has a luminance value of 10 and the other has a luminance value of 40, the imaging pixel with the luminance value of 40, which is farther from the luminance value L (in this case, 10 or less), is adopted. A sketch of this per-pixel selection follows.
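  • The per-pixel selection can be illustrated as follows (a sketch under assumptions, not the patent's implementation; the function name is hypothetical and the frames are assumed to be grayscale NumPy arrays of equal size):

      import numpy as np

      def front_image_from_two(f0: np.ndarray, f1: np.ndarray, L: float) -> np.ndarray:
          """For each pixel, keep the luminance value that is farther from L.

          L is the pre-stored luminance that an imaging pixel takes when it
          displays the periodic structure member, so the value closer to L is
          discarded as likely showing the grid.
          """
          d0 = np.abs(f0.astype(np.int32) - L)       # |f(t) - L|
          d1 = np.abs(f1.astype(np.int32) - L)       # |f(t + dt) - L|
          return np.where(d0 >= d1, f0, f1)

      # Example matching the text: pixel values 10 and 40 with L = 10 -> 40 is kept.
      print(front_image_from_two(np.array([[10]], dtype=np.uint8),
                                 np.array([[40]], dtype=np.uint8), L=10))   # [[40]]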
  • FIG. 9A shows a captured image acquired at the light receiving position R0A.
  • One coordinate of the lattice point of the periodic structure member 15 is PA (x0A, y0A).
  • FIG. 9B shows a captured image acquired at the light receiving position R1A.
  • the light receiving position R1A is a position moved from the light receiving position R0A by the movement distance ΔxA and the movement distance ΔyA.
  • In the captured image acquired at the light receiving position R1A, fine parallax occurs with respect to the captured image acquired at the light receiving position R0A: the position of the periodic structure member 15 has moved while the position of the user U1 is the same.
  • Since the movement distance ΔxA and the movement distance ΔyA are set to satisfy Wx ≤ |ΔxA| < Dx and Wy ≤ |ΔyA| < Dy, respectively,
  • the vertical member 15y does not overlap and the horizontal member 15x does not overlap at the same position in the two captured images at the light receiving position R0A and the light receiving position R1A.
  • When imaging pixels at the same position in the two captured images are compared, the luminance value of the imaging pixel displaying the horizontal member 15x and/or the vertical member 15y is closer to the luminance value L stored in the storage unit 63 than the luminance value of the imaging pixel not displaying them. Therefore, by combining, for each position, the imaging pixel whose luminance value is farther from the luminance value L among the imaging pixels of these two captured images, a single image from which the periodic structure member 15 has been removed, as shown in FIG. 9C, is obtained, and this image can be generated as a front image in which the periodic structure member 15 is not displayed.
  • Also in this modification, the acquisition unit 61A reciprocates the light receiving position R of the camera 2. Therefore, the generation unit 62A may generate a first front image based on the captured images acquired at the light receiving position R0A and the light receiving position R1A on the forward path of the reciprocating path,
  • and generate a second front image based on the captured images acquired at the light receiving position R1A and the light receiving position R0A on the return path.
  • Also in this case, the captured image acquired at the turn-back position of the reciprocating path can be used in common.
  • the video frame rate when displaying the forward image generated by the generation unit 62A is the same as the imaging frame rate of the camera 2. For example, when the imaging frame rate of the camera 2 is 60 fps, the video frame rate of the front image is also 60 fps.
  • FIG. 10 is a flowchart showing an example of processing (image generation method) executed by the control device 6A.
  • In step S11, the generation unit 62A obtains the luminance value f(t) of each imaging pixel in the captured image acquired at the light receiving position R at time t.
  • the light receiving position R here is, for example, the light receiving position R0A of FIG. 9A in the case of the forward path of the reciprocating path, and the light receiving position R1A of FIG. 9B in the case of the return path.
  • In step S12, the acquisition unit 61A moves the light receiving position R of the camera 2 diagonally by a minute amount.
  • the minute amounts are the movement distance ΔxA and the movement distance ΔyA described above. Since the time required for the movement is Δt, the time when the movement is completed is t + Δt. In the captured image acquired after the movement, fine parallax occurs with respect to the captured image acquired before the movement.
  • In step S13, the generation unit 62A obtains the luminance value f(t + Δt) of each imaging pixel in the captured image acquired at the light receiving position R at time t + Δt.
  • the light receiving position R here is, for example, the light receiving position R1A of FIG. 9B in the case of the forward path of the reciprocating path, and the light receiving position R0A of FIG. 9A in the case of the return path.
  • In step S14, the generation unit 62A uses the luminance value L stored in the storage unit 63 to obtain the two difference values |f(t) − L| and |f(t + Δt) − L|.
  • In step S15, the generation unit 62A sets, as f(t′), the luminance value used in the calculation of the larger of the two difference values obtained in step S14. That is, an image composed of the imaging pixels whose luminance values, of the two values obtained in steps S11 and S13, are farther from the luminance value L is generated as the front image corresponding to time t′.
  • the time t′ may be the same time as the time t + Δt in step S13.
  • In step S16, the acquisition unit 61A switches the moving direction of the light receiving position R of the camera 2. That is, the moving direction of the light receiving position R is switched between the forward path and the return path of the reciprocating path. Thereafter, the process returns to step S12.
  • the reason why the process returns to step S12 instead of step S11 is that, as described above, the captured image acquired at the turn-back position of the reciprocating path is used in common on the forward path and the return path. Therefore, the generation unit 62A uses the luminance value f(t + Δt) obtained in the previous step S13 as the luminance value f(t) of each imaging pixel in the captured image at time t. That is, the information of the captured image at time t + Δt used in the previous loop is substituted into the information of the captured image at time t used in the next loop, and then the processes of steps S12 to S16 are repeatedly executed.
  • a forward image is repeatedly generated so that the periodic structure member 15 as described above is not displayed.
  • When the processing corresponds to the forward path of the reciprocating path, the front image generated in step S15 is the first front image described above; when it corresponds to the return path, it is the second front image described above.
  • The example in which the light receiving position R of the camera 2 is moved diagonally by a minute amount in step S12 has been described. However, as described above, if the movement distance ΔxA or the movement distance ΔyA is zero, the light receiving position R is moved not diagonally but horizontally or vertically. Further, as described above with reference to FIG. 4B, in the form in which the light receiving positions R of the camera 2 are simultaneously present at a plurality of positions, the movement of step S12 described above is unnecessary. The process of switching the moving direction of the light receiving position in step S16 is also unnecessary.
  • The storage unit 63 stores in advance the luminance value L of an imaging pixel, that is, the luminance value that an imaging pixel of the captured image of the camera 2 takes when it displays the periodic structure member 15.
  • The distance between adjacent positions among the plurality of positions is not less than the member width of the periodic structure member 15 (vertical member width Wx, horizontal member width Wy) and less than the periodic interval (vertical member interval Dx, horizontal member interval Dy).
  • The generation unit 62A generates, as a front image in which the periodic structure member 15 is not displayed, an image obtained by combining, from the imaging pixels in the captured images of the camera 2 acquired when the light receiving position R of the camera 2 is at each of two positions among the plurality of positions, the imaging pixels whose luminance values are farther from the luminance value L stored in the storage unit 63 (steps S14 and S15).
  • Of the two imaging pixels at the same position, the luminance value of the imaging pixel at which the horizontal member 15x and/or the vertical member 15y is located is closer to the luminance value L stored in the storage unit 63 than the luminance value of the imaging pixel at which the horizontal member 15x and/or the vertical member 15y is not located. Therefore, by combining, for each position, the imaging pixel whose luminance value is farther from the luminance value L out of the imaging pixels of these two captured images, one captured image in which the periodic structure member 15 is not displayed is obtained.
  • This captured image can be generated as a front image of the display 1. In this case, since the front image can be generated using only two captured images, the video frame rate of the front image can be increased compared with the case where the front image is generated using three captured images.
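As an illustration only (not the implementation disclosed here), the per-pixel selection of steps S11 to S16 can be sketched as follows, assuming grayscale frames given as NumPy arrays that are aligned on the subject, and caller-supplied placeholder callables capture, move, and switch_direction:

import numpy as np

def front_image_two_frames(f_t, f_t_dt, L):
    # Keep, per pixel, the luminance value whose difference from L is larger
    # (steps S14 and S15): a pixel showing the periodic structure member 15
    # has a luminance close to L, so the other capture's pixel is kept.
    diff_t = np.abs(f_t.astype(np.float32) - L)
    diff_t_dt = np.abs(f_t_dt.astype(np.float32) - L)
    return np.where(diff_t_dt >= diff_t, f_t_dt, f_t)

def two_position_stream(capture, move, switch_direction, L):
    # Outline of the S11-S16 loop: the frame taken at the turn-back position
    # is reused as f(t) of the next iteration, so each new capture yields one
    # new front image and the video frame rate equals the imaging frame rate.
    f_t = capture()                                    # S11
    while True:
        move()                                         # S12: diagonal move by (ΔxA, ΔyA)
        f_t_dt = capture()                             # S13
        yield front_image_two_frames(f_t, f_t_dt, L)   # S14, S15
        switch_direction()                             # S16
        f_t = f_t_dt                                   # reuse the turn-back frame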
  • The display imaging device 7 is used in the video call system 9.
  • However, the use of the display imaging device 7 is not limited to the video call system 9.
  • For example, the display imaging device 7 may be used for the user U1 to display his/her own video.
  • In that case, the display 1 displays the video captured by the camera 2. Accordingly, simply by facing the display 1, the user U1 can take a picture while checking how he/she appears, without having to worry about the position of the camera 2.
  • Such an application of the display imaging device 7 has the advantage that, in a photo booth (an ID photo machine, a sticker-photo printing machine) or the like, the shooting preview screen can be checked with the user's line of sight facing forward at the time of shooting. Even when augmented reality (AR) signage placed in a shopping center or the like is used to composite CG onto the user's video, the experience becomes more natural. In addition, applications that stream live video of the user can stream with a more natural gaze.
  • Each functional block may be realized by one device that is physically and/or logically coupled, or may be realized by two or more devices that are physically and/or logically separated and connected directly and/or indirectly (for example, by wire and/or wirelessly).
  • The control device 6 or the like may function as a computer that performs the processing of the acquisition unit 61, the generation unit 62, and so on.
  • FIG. 11 is a diagram illustrating an example of a hardware configuration of the control device 6 and the like according to the present embodiment.
  • the control device 6 will be described as an example.
  • The control unit 5, the control device 6A, and the communication device 8 can be described in the same manner as the control device 6.
  • the control device 6 may be physically configured as a computer device including a processor 1001, a memory 1002, a storage 1003, a communication device 1004, an input device 1005, an output device 1006, a bus 1007, and the like.
  • control device 6 can be read as a circuit, a device, a unit, or the like.
  • the hardware configuration of the control device 6 may be configured to include one or a plurality of the devices illustrated in the figure, or may be configured not to include some devices.
  • Each function in the control device 6 is realized by loading predetermined software (a program) onto hardware such as the processor 1001 and the memory 1002, so that the processor 1001 performs arithmetic operations and controls communication by the communication device 1004 and reading and/or writing of data in the memory 1002 and the storage 1003.
  • the processor 1001 controls the entire computer by operating an operating system, for example.
  • the processor 1001 may be configured by a central processing unit (CPU) including an interface with peripheral devices, a control device, an arithmetic device, a register, and the like.
  • the acquisition units 61 and 61A and the generation units 62 and 62A described above may be realized by the processor 1001.
  • the processor 1001 reads a program (program code), a software module, and data from the storage 1003 and / or the communication device 1004 to the memory 1002, and executes various processes according to these.
  • As the program, a program that causes a computer to execute at least a part of the operations described in the above embodiments is used.
  • For example, the acquisition unit 61 and the generation unit 62 may be realized by a control program that is stored in the memory 1002 and runs on the processor 1001, and other functional blocks may be realized similarly.
  • Although the above-described various processes have been described as being executed by one processor 1001, they may be executed simultaneously or sequentially by two or more processors 1001.
  • the processor 1001 may be implemented by one or more chips.
  • the program may be transmitted from a network via a telecommunication line.
  • The memory 1002 is a computer-readable recording medium, and may be configured by, for example, at least one of ROM (Read Only Memory), EPROM (Erasable Programmable ROM), EEPROM (Electrically Erasable Programmable ROM), RAM (Random Access Memory), and the like.
  • the memory 1002 may be referred to as a register, a cache, a main memory (main storage device), or the like.
  • the memory 1002 can store a program (program code), a software module, and the like that can be executed to implement the wireless communication method according to the embodiment of the present invention.
  • The storage 1003 is a computer-readable recording medium, and may be configured by, for example, at least one of an optical disc such as a CD-ROM (Compact Disc ROM), a hard disk drive, a flexible disk, a magneto-optical disk (for example, a compact disc, a digital versatile disc, or a Blu-ray (registered trademark) disc), a smart card, a flash memory (for example, a card, a stick, or a key drive), a floppy (registered trademark) disk, a magnetic strip, and the like.
  • the storage 1003 may be referred to as an auxiliary storage device.
  • the storage medium described above may be, for example, a database, server, or other suitable medium including the memory 1002 and / or the storage 1003.
  • the communication device 1004 is hardware (transmission / reception device) for performing communication between computers via a wired and / or wireless network, and is also called a network device, a network controller, a network card, a communication module, or the like.
  • the communication device 8 described above may be realized by the communication device 1004.
  • the input device 1005 is an input device (for example, a keyboard, a mouse, a microphone, a switch, a button, a sensor, etc.) that accepts an external input.
  • the output device 1006 is an output device (for example, a display, a speaker, an LED lamp, etc.) that performs output to the outside.
  • the input device 1005 and the output device 1006 may have an integrated configuration (for example, a touch panel).
  • the devices such as the processor 1001 and the memory 1002 are connected by a bus 1007 for communicating information.
  • the bus 1007 may be configured with a single bus or may be configured with different buses between apparatuses.
  • The control device 6 may be configured to include hardware such as a microprocessor, a digital signal processor (DSP), an application specific integrated circuit (ASIC), a programmable logic device (PLD), and a field programmable gate array (FPGA), and a part or all of each functional block may be realized by this hardware.
  • the processor 1001 may be implemented by at least one of these hardware.
  • the phrase “based on” does not mean “based only on”, unless expressly specified otherwise. In other words, the phrase “based on” means both “based only on” and “based at least on.”
  • Any reference to elements using designations such as "first" and "second" used herein does not generally limit the quantity or order of those elements. These designations can be used herein as a convenient way of distinguishing between two or more elements. Thus, a reference to first and second elements does not mean that only two elements can be employed, or that the first element must precede the second element in some way.

Landscapes

  • Engineering & Computer Science (AREA)
  • Multimedia (AREA)
  • Signal Processing (AREA)
  • Physics & Mathematics (AREA)
  • General Physics & Mathematics (AREA)
  • Studio Devices (AREA)
  • Indication In Cameras, And Counting Of Exposures (AREA)
  • Closed-Circuit Television Systems (AREA)

Abstract

This display imaging device 7 is provided with: a transmission type display 1 which comprises a periodic structure member 15; a camera 2 which images the front side of the display 1 from the rear side of the display 1; and a generation unit 62 which generates a front image on the basis of a captured image obtained from the camera 2. The generation unit 62 generates a front image such that the periodic structure member 15 is not displayed on the basis of a plurality of captured images which are obtained when a light-receiving position R of the camera is at a plurality of locations and thus the periodic structure member 15 in the images is displayed at different locations.

Description

Display imaging device
The present invention relates to a display imaging device.
Techniques for realizing a video call in which the users' lines of sight coincide have been proposed. Patent Literature 1 discloses a technique of capturing an image of a user in front of a display with a camera provided behind a transmissive display that displays the call partner. According to this technique, the front of the user can be imaged in a state where the user and the call partner face each other, which increases the possibility of realizing a video call with the lines of sight coinciding.
JP 2010-232828 A
In a transmissive display, members including signal lines and the like that exist for driving the display are periodically arranged. When such a member (hereinafter also referred to as a "periodic structure member" or the like) is reflected in the camera, the quality (image quality) of the captured image is degraded. This problem is not examined at all in Patent Literature 1. An object of one aspect of the present invention is to provide a display imaging device capable of realizing a state in which the lines of sight coincide and of improving image quality.
A display imaging device according to one aspect of the present invention includes a transmissive display including a periodic structure member, a camera that images the front of the display from behind the display, and a generation unit that generates a front image based on captured images of the camera, wherein the generation unit generates the front image such that the periodic structure member is not displayed, based on a plurality of captured images that are acquired when the light receiving position of the camera is at each of a plurality of positions and in which the periodic structure member is therefore displayed at different positions in the images.
According to the above display imaging device, the front of the display can be imaged from the front by the camera from behind the display. When the device is used for a video call or the like, the user in front of the display can be imaged from the front, so that a state in which the lines of sight coincide can be realized. Further, a front image is generated such that the periodic structure member is not displayed, based on a plurality of captured images in which the periodic structure member is displayed at different positions. Since the periodic structure member is not displayed in the front image, the image quality can also be improved.
According to one aspect of the present invention, a display imaging device capable of realizing a state in which the lines of sight coincide and of improving image quality is provided.
FIG. 1 is a schematic configuration diagram of a display imaging device according to an embodiment.
FIG. 2 is a diagram conceptually showing an enlarged front view of a part of the display and the scene seen by the camera.
FIG. 3 is a block diagram of a control device.
FIG. 4 is a diagram showing examples of techniques for acquiring captured images at a plurality of light receiving positions.
FIG. 5 is a diagram showing an example of generation of a front image.
FIG. 6 is a flowchart showing an example of processing executed by the control device.
FIG. 7 is a schematic configuration diagram of a video call system.
FIG. 8 is a block diagram of a control device.
FIG. 9 is a diagram showing an example of generation of a front image.
FIG. 10 is a flowchart showing an example of processing executed by the control device.
FIG. 11 is a diagram showing an example of a hardware configuration of the control device and the like.
Hereinafter, embodiments of the present invention will be described with reference to the drawings. In the drawings, the same elements are denoted by the same reference signs, and redundant description is omitted.
FIG. 1 shows a schematic configuration of a display imaging device 7 according to the embodiment. The display imaging device 7 includes a display 1, a camera 2, and a control device 6. The display 1, the camera 2, and the control device 6 are supported by, for example, a support member (not shown) and arranged at desired positions. FIG. 1 also shows a user U1 who is a user of the display imaging device 7. In some figures, an XYZ orthogonal coordinate system is shown. The X axis indicates the horizontal direction of the display 1. The Y axis indicates the vertical direction of the display 1. The Z axis indicates the front-rear direction of the display 1.
The display 1 is a transmissive display, for example a self-luminous display in which each display pixel includes an organic light emitting diode (OLED). A front surface 1a of the display 1 is the display surface of the display 1. The display 1 has a viewing angle toward the front, and light from the display 1 (video light or the like) travels forward from the front surface 1a within the range of the viewing angle. The video displayed on the front surface 1a of the display 1 is brightest when the display 1 is viewed from the front.
The camera 2 is provided behind the display 1. The camera 2 images the front of the display 1 through the display 1. An example of the imaging target is the user U1. When the camera 2 repeats imaging at a predetermined imaging frame rate, a plurality of captured images that are continuous in time series are acquired. The plurality of captured images acquired in this way can constitute a video of the area in front of the display 1.
Light from an imaging object located within the range of the angle of view of the camera 2 enters the camera 2. In FIG. 1, the spread of the angle of view of the camera 2 is conceptually shown using an angle α. The camera 2 receives incident light at a light receiving position R. The light receiving position R is the position of the imaging start point of the camera 2, and the scene viewed forward from the light receiving position R becomes the captured image of the camera 2. The camera 2 includes a photoelectric conversion element, an image processing circuit, and the like. The position of the photoelectric conversion element may be the light receiving position R of the camera 2.
The distance between the light receiving position R of the camera 2 and the back surface 1b of the display 1 is referred to as a distance A1. The distance between the light receiving position R and the user U1 is referred to as a distance A2. The distance A1 is sufficiently smaller than the distance A2. The distance A2 is usually about several tens of centimeters to several meters, although it depends on how the display imaging device 7 is used. The distance A1 may be set to about one several-tenth to one several-hundredth of the distance A2, for example about several millimeters to several centimeters. The camera 2 may be provided behind the display 1 with the end 2a of the camera 2 close to or in contact with the back surface 1b of the display 1 so that the distance A1 is as small as possible.
When the front of the display 1 is imaged by the camera 2, the problem described below with reference to FIG. 2 can arise. FIG. 2A conceptually shows an enlarged front view of a part of the display 1. The display 1 includes a plurality of display pixels 11 and a periodic structure member 15. Each display pixel 11 may be a sub-pixel of the display 1; in that case, each display pixel 11 includes a light emitting element corresponding to one of the three primary colors R, G, and B. An example of the light emitting element is the above-mentioned organic light emitting diode. The periodic structure member 15 includes signal lines that exist in a mesh shape for driving the display 1. For example, the luminance value of the light emitting element of each display pixel 11 is adjusted in accordance with an electrical signal (a control signal or the like) from the signal lines. In addition, the periodic structure member 15 may include members necessary for wiring the signal lines. The periodic structure member 15 is also a lattice frame that partitions the display pixels 11 into a mesh shape. In the example shown in FIG. 2, one display pixel 11 is included in one mesh. However, a plurality of display pixels 11 may be included in one mesh. For example, when one signal line running in the horizontal direction drives the light emitting elements of the display pixels 11 located on both sides of the signal line, two display pixels 11 may be included in one mesh in the vertical direction. When one signal line running in the vertical direction drives the light emitting elements of the display pixels 11 located on both sides of the signal line, two display pixels 11 may be included in one mesh in the horizontal direction. Since the number of display pixels 11 included in one mesh is flexible, the shape of the mesh is not limited to a square and may be, for example, a rectangle.
The periodic structure member 15 includes a plurality of horizontal members 15x and a plurality of vertical members 15y. Each horizontal member 15x extends in the horizontal direction of the display 1, and the horizontal members 15x are arranged at equal intervals in the vertical direction of the display 1. Each vertical member 15y extends in the vertical direction of the display 1, and the vertical members 15y are arranged at equal intervals in the horizontal direction of the display 1. Details of the dimensions of the periodic structure member 15 will be described with reference to FIG. 2B.
The camera 2, which images the front of the display 1 through the display 1, captures, for example, a scene as shown in FIG. 2B. As illustrated, the periodic structure member 15 of the display 1 is reflected in the camera 2. The periodic structure member 15 usually appears as a lattice frame that is black or close to black. In FIG. 2B, the following reference signs are used for the dimensions of the periodic structure member 15.
Vertical member width Wx: the length of a vertical member 15y in the horizontal direction, that is, the member width of the periodic structure member 15 in the horizontal direction.
Vertical member interval Dx: the horizontal distance between adjacent vertical members 15y. It is also the periodic interval (mesh size) of the periodic structure member 15 in the horizontal direction. The vertical member interval Dx may be larger than the vertical member width Wx (for example, twice or more).
Horizontal member width Wy: the length of a horizontal member 15x in the vertical direction, that is, the member width of the periodic structure member 15 in the vertical direction.
Horizontal member interval Dy: the vertical distance between adjacent horizontal members 15x. It is also the periodic interval (mesh size) of the periodic structure member 15 in the vertical direction. The horizontal member interval Dy may be larger than the horizontal member width Wy (for example, twice or more).
The size of an imaging pixel (not shown) of the camera 2 will now be described. Compared with the mesh of the periodic structure member 15, an imaging pixel of the camera 2 may be sufficiently smaller than the mesh. If the number of imaging pixels in the horizontal direction within one mesh is Nx and the number of imaging pixels in the vertical direction is Ny, Nx and Ny may each be from several to several hundreds. The vertical member interval Dx and the horizontal member interval Dy can also be expressed in units of the number of imaging pixels of the camera 2. In that case, the size of the vertical member interval Dx is Nx, and the size of the horizontal member interval Dy is Ny.
The dimensions related to the periodic structure member 15 described above, that is, the vertical member interval Dx, the vertical member width Wx, the horizontal member interval Dy, and the horizontal member width Wy, may be measured in advance, for example by performing a calibration.
Compared with the line width of the periodic structure member 15, the size of an imaging pixel of the camera 2 can be equal to or less than the vertical member width Wx in the horizontal direction and equal to or less than the horizontal member width Wy in the vertical direction. That is, the vertical member width Wx and the horizontal member width Wy can be equal to or larger than the size of one imaging pixel of the camera 2. This is because, as described above, the distance A1 between the light receiving position R of the camera 2 and the back surface 1b of the display 1 is small, so that the camera 2 images the periodic structure member 15 from a very short distance. In this case, the horizontal members 15x and the vertical members 15y are displayed in the captured image, and the possibility that the periodic structure member 15 is reflected in the captured image increases. Note that, depending on the difference in size between the vertical member width Wx and the horizontal member width Wy and so on, only the horizontal members 15x may be reflected, or only the vertical members 15y may be reflected. The reflection of the periodic structure member 15 also increases the possibility that the quality (image quality) of the image captured by the camera 2 is degraded. This problem is improved by the control device 6 described below.
FIG. 3 shows a block diagram of the control device 6. The control device 6 is configured to be able to communicate with the camera 2, for example, and can perform various controls relating to imaging by the camera 2 and acquire captured images of the camera 2. The control device 6 includes, as its functional blocks, an acquisition unit 61 and a generation unit 62. Note that the acquisition unit 61 can include a physical element (for example, a holding unit 61a described later) that moves the camera 2.
The acquisition unit 61 is a part that acquires captured images of the camera 2 when the light receiving position R of the camera 2 is at each of a plurality of positions. The captured images acquired here are a plurality of captured images in which the periodic structure member 15 is displayed at different positions. In one embodiment, the camera 2 is configured to be movable. The light receiving position R of the camera 2 moves between the plurality of positions in accordance with the movement of the camera 2. For example, in the technique illustrated in FIG. 4A, the acquisition unit 61 also functions as a moving unit that moves the camera 2 itself by controlling a holding unit 61a that holds the camera 2. As described later, the camera 2 may be controlled so as to vibrate (minutely). The holding unit 61a can be realized in various ways using known techniques. For example, the holding unit 61a may be a component that moves minutely using a vibrator; alternatively, instead of moving the holding unit 61a, a vibrator may be provided on the body of the camera 2, the image sensor (CMOS, CCD, or the like), the lens, or the like to give a minute movement; or the same effect may be realized by control in which the reflection direction of light is controlled by a MEMS (Micro Electro Mechanical Systems) mirror provided inside the camera 2 so as to capture the same video as when the camera 2 is minutely vibrated.
In one embodiment, the light receiving positions R of the camera 2 exist at a plurality of positions simultaneously. As such a camera, for example, a light field camera, a 3D camera, or the like can be used. FIG. 4B shows a schematic configuration of the optical system of a light field camera 21. The imaging target of the light field camera 21 is conceptually illustrated as an imaging target 30. In the light field camera 21, a microlens array 23 is disposed in front of an image sensor 22 such as a CMOS/CCD, between a main lens 24 and the image sensor 22. Since the principle of the light field camera is well known, a detailed description is omitted. If the light field camera 21 is used, for example, a plurality of captured images (an image group) similar to those obtained when the light receiving position R of the camera 2 is moved can be acquired without moving the camera itself. Moreover, such an image group can be acquired in one imaging operation (one shot). In this case, a value obtained by dividing the number of captured images constituting the image group by the time required for one shot can be regarded as the imaging frame rate. According to the light field camera 21, since an operation system for causing movement, such as the holding unit 61a shown in FIG. 4A, becomes unnecessary, an effect equivalent to substantially increasing the moving speed of the light receiving position R of the camera 2 (the vibration frequency in the case of vibration) is obtained. This also makes it possible to increase the imaging frame rate of the camera 2.
Hereinafter, the description will mainly focus on the form in which, as described with reference to FIG. 4A, the light receiving position R of the camera 2 moves between a plurality of positions and a captured image of the camera 2 is acquired at each position. In the form in which, as described with reference to FIG. 4B, the light receiving positions R of the camera 2 exist at a plurality of positions simultaneously, a light field camera or the like adjusted so that there are a plurality of light receiving positions corresponding to the respective positions to which the light receiving position R of the camera 2 would be moved may be used. In that case, the movement of the light receiving position R of the camera 2 in the following description should be understood as being unnecessary.
Returning to FIG. 3, the acquisition unit 61 moves the light receiving position R of the camera 2 to a plurality of positions. In one embodiment, the acquisition unit 61 may move the light receiving position R of the camera 2 in a direction inclined with respect to the horizontal direction and the vertical direction (hereinafter referred to as the "inclined direction"). The plurality of positions may be defined by coordinates parallel to the back surface 1b of the display 1 (XY plane coordinates). The plurality of positions may be the same position in the Z-axis direction. In this case, even if the light receiving position R of the camera 2 moves, the distance A1 (see FIG. 1) between the light receiving position R of the camera 2 and the back surface 1b does not change.
The plurality of positions are determined so that a minute parallax (fine parallax) occurs between the captured images acquired at the respective light receiving positions. The occurrence of fine parallax means that parallax occurs between the captured images such that, in the captured images acquired at the respective light receiving positions, the imaging target is displayed at approximately the same position (overlaps), whereas the periodic structure member 15 is displayed at different positions (does not overlap). Such fine parallax occurs because, as described above with reference to FIG. 1, the distance A1 between the light receiving position R of the camera 2 and the back surface 1b of the display 1 is smaller than the distance A2 between the light receiving position R and the user U1 who is the imaging target. The plurality of positions are arranged on a straight line along the movement direction of the plurality of positions (for example, the above-described inclined direction). The plurality of positions may be three or more positions.
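The magnitude of this effect can be checked with simple pinhole geometry; the following back-of-the-envelope sketch uses hypothetical distances (they are not values given in this description) and is only meant to show why the structure shifts far more than the subject.

def angular_shift_rad(camera_shift_mm, object_distance_mm):
    # Approximate angular shift of an object seen from the light receiving
    # position when that position moves by camera_shift_mm (small-angle
    # approximation): nearer objects shift by a larger angle in the image.
    return camera_shift_mm / object_distance_mm

structure_shift = angular_shift_rad(0.1, 10.0)     # periodic structure member at A1 of about 10 mm (assumed)
subject_shift = angular_shift_rad(0.1, 1000.0)     # user U1 at A2 of about 1 m (assumed)
print(structure_shift / subject_shift)             # about 100: the structure moves, the subject barely does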
The distance between the light receiving positions R of the camera 2 for causing fine parallax (the movement distance of the light receiving position R) will now be described. First, since the distance A1 between the light receiving position R of the camera 2 and the back surface 1b of the display 1 is sufficiently small, the movement of the light receiving position R of the camera 2 on the XY plane can be regarded as movement on the plane containing the periodic structure member 15 of the display 1. Therefore, the periodic structure member 15 in the captured image of the camera 2 moves by the same amount as the movement distance of the light receiving position R of the camera 2. Here, the minimum unit of the movement distance of the light receiving position R of the camera 2 is defined as follows.
Movement distance Δx: the horizontal distance (separation distance) between positions adjacent to each other in the movement direction (for example, the inclined direction). Depending on how the coordinates are set, Δx may be a negative value.
Movement distance Δy: the vertical distance (separation distance) between positions adjacent to each other in the movement direction. Depending on how the coordinates are set, Δy may be a negative value.
As described above, the movement distance Δx and the movement distance Δy can take negative values, but hereinafter the magnitude of the movement distance Δx and the magnitude of the movement distance Δy are simply referred to as the movement distance Δx and the movement distance Δy.
The movement distance Δx and the movement distance Δy are set as follows using the dimensions of the periodic structure member 15 shown in FIG. 2B. The movement distance Δx is set to be not less than the vertical member width Wx and less than the vertical member interval Dx, that is, Wx ≤ |Δx| < Dx. The movement distance Δy is set to be not less than the horizontal member width Wy and less than the horizontal member interval Dy, that is, Wy ≤ |Δy| < Dy. In the control of the acquisition unit 61 described below, the movement distance Δx is set to be less than half of the vertical member interval Dx, that is, Wx ≤ |Δx| < Dx/2, and the movement distance Δy is set to be less than half of the horizontal member interval Dy, that is, Wy ≤ |Δy| < Dy/2.
As described above, the sizes of the vertical member interval Dx and the horizontal member interval Dy can also be expressed by the numbers of imaging pixels Nx and Ny of the camera 2. In that case, the movement distance Δx is expressed as being less than Nx, or less than Nx/2, and the movement distance Δy is expressed as being less than Ny, or less than Ny/2.
The technical significance of the movement distance Δx and the movement distance Δy will be described. Since the movement distance Δx is equal to or greater than the vertical member width Wx, the vertical members 15y do not overlap at the same position in the captured images acquired before and after the movement of the light receiving position R of the camera 2. Further, since the movement distance Δx is less than the vertical member interval Dx, different vertical members 15y (for example, adjacent vertical members 15y) do not overlap at the same position in the captured images acquired before and after the movement of the light receiving position R of the camera 2 either. Furthermore, when the movement distance Δx is less than half of the vertical member interval Dx, the vertical members 15y do not overlap even at the same position in the three captured images acquired over two movements of the light receiving position R.
Similarly, since the movement distance Δy is equal to or greater than the horizontal member width Wy, the horizontal members 15x do not overlap at the same position in the captured images acquired before and after the movement. Further, since the movement distance Δy is less than the horizontal member interval Dy, different horizontal members 15x (for example, adjacent horizontal members 15x) do not overlap at the same position in the images acquired before and after the movement. Furthermore, when the movement distance Δy is less than half of the horizontal member interval Dy, the horizontal members 15x do not overlap even at the same position in the three captured images acquired over two movements of the light receiving position R.
Note that, if the horizontal member width Wy is so small that the horizontal members 15x are not reflected in the captured image of the camera 2, the movement distance Δy can be made sufficiently small; in that case, the movement distance Δy may be zero. If the vertical member width Wx is so small that the vertical members 15y are not reflected in the captured image of the camera 2, the movement distance Δx can also be made sufficiently small; in that case, the movement distance Δx may be zero.
When the acquisition unit 61 moves the light receiving position R of the camera 2 in the inclined direction, the angle of the inclined direction with respect to the horizontal direction (the inclination angle) is expressed, using the movement distance Δx and the movement distance Δy, as inclination angle = arctan(Δy/Δx). The inclination angle is less than 90°. If the movement distance Δx and the movement distance Δy are the same, the angle of the inclined direction is set to 45°. Considering that the movement distance Δx and the movement distance Δy can differ depending on the vertical member width Wx, the vertical member interval Dx, the horizontal member width Wy, and the horizontal member interval Dy, the inclination angle may be a value different from 45°. When the movement distance Δx and the movement distance Δy are reasonably close to each other, the inclination angle can be, for example, within a range of 45° ± 10°, 45° ± 20°, or 45° ± 30°.
When one of the movement distances Δx and Δy is zero, the inclination angle is 0° or 90°. In that case, the acquisition unit 61 moves the light receiving position R of the camera 2 substantially in the horizontal direction or the vertical direction.
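For concreteness, the constraints on Δx and Δy and the inclination angle can be written as a small check. This is only a sketch; the numeric dimensions below are hypothetical placeholders in imaging-pixel units, not values taken from this description.

import math

Wx, Dx = 2, 30   # hypothetical vertical-member width and horizontal period (imaging pixels)
Wy, Dy = 2, 30   # hypothetical horizontal-member width and vertical period (imaging pixels)

def movement_ok(dx, dy, median_of_three=True):
    # Wx <= |dx| < Dx and Wy <= |dy| < Dy; with the three-image median filter
    # described below the upper bounds tighten to Dx/2 and Dy/2. A zero
    # component is allowed when the corresponding member is not visible.
    ux = Dx / 2 if median_of_three else Dx
    uy = Dy / 2 if median_of_three else Dy
    return (dx == 0 or Wx <= abs(dx) < ux) and (dy == 0 or Wy <= abs(dy) < uy)

def inclination_angle_deg(dx, dy):
    # Inclination of the movement direction with respect to the horizontal: arctan(dy/dx).
    return 90.0 if dx == 0 else math.degrees(math.atan(abs(dy) / abs(dx)))

print(movement_ok(3, 3), inclination_angle_deg(3, 3))   # True 45.0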
Returning to FIG. 3, the acquisition unit 61 moves the light receiving position R of the camera 2 in conjunction with the imaging frame rate of the camera 2. Specifically, the acquisition unit 61 moves the light receiving position R of the camera 2 so that, when a captured image is acquired (at the time of imaging), the light receiving position R of the camera 2 is located at one of the plurality of positions. When the imaging frame rate of the camera 2 is 60 fps, the acquisition unit 61 moves the light receiving position R of the camera 2 60 times per second.
Furthermore, the acquisition unit 61 may move the light receiving position R of the camera 2 back and forth. In this case, as described above with reference to FIG. 4A, the acquisition unit 61 may vibrate the light receiving position R of the camera 2 (at high speed and minutely). The length of one leg of the reciprocating path (that is, the width of the vibration) may be set to be less than the vertical member interval Dx in the horizontal direction and less than the horizontal member interval Dy in the vertical direction.
Returning to FIG. 3, the generation unit 62 will be described. The generation unit 62 is a part that generates an image of the area in front of the display 1 (hereinafter also referred to as a "front image") based on the captured images acquired by the acquisition unit 61. The generation unit 62 generates the front image so that the periodic structure member 15 is not displayed. Such a front image is an image in which the reflection of the periodic structure member 15 is reduced compared with a captured image of the camera 2 (for example, an image as shown in FIG. 2B). The front image may be an image in which the periodic structure member 15 is absent (has been removed).
When the acquisition unit 61 moves the light receiving position R of the camera 2 back and forth, the generation unit 62 may generate the front image based on the captured images of the camera 2 acquired in order when the light receiving position R of the camera 2 is at each position of the reciprocating path. A median filter in the time direction may be used for such generation of the front image by the generation unit 62. Since the median filter itself is a known technique, its description is omitted here. The function of the median filter may be provided in the generation unit 62 itself, or the generation unit 62 may use a median filter function provided in another element (which may be a server or the like, not shown, outside the control device 6).
When the median filter is used, the generation unit 62 may generate, as the front image, an image obtained by applying the median filter to the captured images acquired at at least three light receiving positions that are lined up in the movement direction of the light receiving position R (for example, the inclined direction) among the plurality of positions. For example, an image obtained by combining, for each pixel position, an imaging pixel whose luminance value is the median of the luminance values of the three corresponding imaging pixels is generated as the front image. The captured images acquired when the light receiving position R of the camera 2 is at each of the three positions are three time-series captured images acquired while the light receiving position R of the camera 2 moves in conjunction with the imaging frame rate. The median filter is applied to such a plurality of time-series captured images (an image group). Hereinafter, the case where there are three positions will be described.
An example of generation of the front image by the generation unit 62 will be described with reference to FIG. 5. FIGS. 5A to 5C show the captured images acquired when the light receiving position R of the camera 2 is at each of the three positions. In the figures, an arrow AR indicates the movement direction of the light receiving position R. Since the light receiving position R moves back and forth as described above, the arrow AR indicates the direction of the forward path or the return path of the reciprocating path. The light receiving positions R of the camera 2 at which the captured images shown in FIGS. 5A to 5C are acquired are referred to as light receiving positions R0 to R2. Here, the coordinates of one of the lattice points of the periodic structure member 15 in the captured image are expressed as P(x, y). A lattice point is a point at which a horizontal member 15x and a vertical member 15y of the periodic structure member 15 intersect.
FIG. 5A shows the captured image acquired at the light receiving position R0. In the captured image, the periodic structure member 15 is present in front of the user U1 who is the imaging target. The coordinates of one of the lattice points of the periodic structure member 15 are P(x0, y0).
FIG. 5B shows the captured image acquired at the light receiving position R1. The light receiving position R1 is a position moved from the light receiving position R0 by the movement distance Δx and the movement distance Δy. In this captured image, the coordinates of the lattice point are P(x1, y1), where x1 = x0 + Δx and y1 = y0 + Δy. In the captured image acquired at the light receiving position R1 as well, the periodic structure member 15 is present in front of the user U1. In the captured image acquired at the light receiving position R1, fine parallax occurs with respect to the captured image acquired at the light receiving position R0, so that the position of the user U1 is the same whereas the position of the periodic structure member 15 has moved.
FIG. 5C shows the captured image acquired at the light receiving position R2. The light receiving position R2 is a position moved from the light receiving position R1 by the movement distance Δx and the movement distance Δy. In this captured image, the coordinates of the lattice point are P(x2, y2), where x2 = x1 + Δx and y2 = y1 + Δy. In the captured image acquired at the light receiving position R2 as well, the periodic structure member 15 is present in front of the user U1. In the captured image acquired at the light receiving position R2, fine parallax occurs with respect to the captured images acquired at the light receiving positions R0 and R1, so that the position of the user U1 is the same whereas the position of the periodic structure member 15 has moved.
Here, as described above, the movement distance Δx and the movement distance Δy are set so that Wx ≤ |Δx| < Dx/2 and Wy ≤ |Δy| < Dy/2, respectively. In this case, at the same position in the three captured images acquired at the light receiving positions R0 to R2, neither the vertical members 15y nor the horizontal members 15x overlap. Therefore, when the imaging pixels at the same position in the three captured images are compared, at most one of the three imaging pixels displays a vertical member 15y; at least two of the imaging pixels display the imaging target (for example, the user U1) rather than a vertical member 15y. Similarly, at most one of the three imaging pixels displays a horizontal member 15x, and at least two of the imaging pixels display the imaging target rather than a horizontal member 15x. Therefore, if the median filter is applied to these three captured images, one captured image is obtained in which the horizontal members 15x and the vertical members 15y of the periodic structure member 15 are not displayed (in which their reflection is reduced), as shown in FIG. 5D, and this captured image can be generated as the front image.
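As a minimal illustration of this per-pixel median (an illustrative sketch, not the disclosed implementation), assuming the three captures are equally sized grayscale NumPy arrays aligned on the subject:

import numpy as np

def front_image_median(frame_r0, frame_r1, frame_r2):
    # Per-pixel median over the three captures: at most one of the three
    # pixels at any position shows the periodic structure member 15, so the
    # median always keeps a pixel that shows the imaging target.
    stack = np.stack([frame_r0, frame_r1, frame_r2], axis=0)
    return np.median(stack, axis=0).astype(frame_r0.dtype)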
When the acquisition unit 61 moves the light receiving position R of the camera 2 back and forth, the light receiving position R moves from the light receiving position R0 to the light receiving position R2 along the direction of the arrow AR, and then moves from the light receiving position R2 back to the light receiving position R0 along the direction opposite to the arrow AR. When the arrow AR is the direction of the forward path of the reciprocating path, the light receiving position R passes through the light receiving positions R0, R1, and R2 in this order on the forward path, and passes through the light receiving positions R2, R1, and R0 in this order on the return path. The generation unit 62 may generate a first front image based on the captured images acquired when the light receiving position R of the camera 2 is at each position of the forward path of the reciprocating path (from the light receiving position R0 to the light receiving position R2), and may generate a second front image based on the captured images acquired when the light receiving position R is at each position of the return path (from the light receiving position R2 to the light receiving position R0). The first front image and the second front image may be generated repeatedly while the reciprocating movement continues. Here, the light receiving position R2 is the turn-back position from the forward path to the return path, and the light receiving position R0 is the turn-back position from the return path to the forward path. The captured images acquired when the light receiving position R of the camera 2 is at these turn-back positions of the reciprocating path may be used in common for generating the first front image and the second front image. Specifically, a captured image acquired at the light receiving position R0 or R2 is one of the three captured images acquired on the forward path of the reciprocating path (from the light receiving position R0 to the light receiving position R2) and is also one of the three captured images acquired on the return path (from the light receiving position R2 to the light receiving position R0). Therefore, the captured image acquired at the light receiving position R2 may be used in common by the median filter applied to the three captured images acquired on the forward path and by the median filter applied to the three captured images acquired on the subsequent return path. Likewise, the captured image acquired at the light receiving position R0 may be used in common by the median filter applied to the three captured images acquired on the return path and by the median filter applied to the three captured images acquired on the subsequent forward path.
The timing at which the generation unit 62 generates front images will now be described. First, the imaging times t of the camera 2 are expressed as t = t0, t1, t2, t3, t4, t5, t6, t7, t8, t9, and so on. If the interval between the times t is Δt, Δt is the reciprocal of the imaging frame rate of the camera 2. Next, the times t′ to which the front images generated by the generation unit 62 correspond are expressed as t′ = t′0, t′1, t′2, t′3, and so on. In this case, the front image corresponding to the time t′0 is generated based on the three captured images (a time-series image group) acquired at the times t0 to t2 (at the respective light receiving positions R). The front image corresponding to the time t′1 is generated based on the three captured images acquired at the times t2 to t4. The front image corresponding to the time t′2 is generated based on the three captured images acquired at the times t4 to t6. The front image corresponding to the time t′3 is generated based on the three captured images acquired at the times t6 to t8. The same applies to the subsequent times t′. In this case, two intervals of the time t, namely t = t0, t2, t4, t6, t8, and so on, correspond to one interval of the time t′. Therefore, if the interval between the times t′ is Δt′, Δt′ is twice Δt. When the imaging frame rate of the camera 2 is 60 fps, Δt is 1/60 second and Δt′ is 1/30 second. That is, when the front images generated by the generation unit 62 are displayed as a video, the video display frame rate is 30 fps, which is half of the 60 fps imaging frame rate of the camera 2.
The time t′ may be associated with any one of the corresponding three times t, and may, for example, be the same as the last of the three times t. In this case, the time t′0 is the time t2, the time t′1 is the time t4, the time t′2 is the time t6, and the time t′3 is the time t8. The same applies to the subsequent times t′.
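As a hypothetical illustration (not from the patent text) of this timing relationship, the following sketch maps output times t′ to capture times, assuming consecutive triples of captures share their boundary frame:

```python
def output_times(capture_fps=60.0, num_outputs=4):
    """Output times t' when consecutive capture triples (t0..t2, t2..t4, ...)
    share their boundary frame: t'_k coincides with the capture time t_(2k+2),
    so the output interval is 2*Δt (30 fps for 60 fps capture)."""
    dt = 1.0 / capture_fps                     # capture interval Δt
    return [(2 * k + 2) * dt for k in range(num_outputs)]

# output_times(60.0, 4) returns the times of t2, t4, t6 and t8 in seconds.
```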
FIG. 6 is a flowchart showing an example of the processing (an image generation method) executed by the control device 6. It is assumed here that the acquisition unit 61 reciprocates the light receiving position R of the camera 2 as described above.
 In step S1, the generation unit 62 obtains the luminance value f(t) of each imaging pixel in the captured image acquired at the light receiving position R at the time t. The light receiving position R here is, for example, the light receiving position R0 of FIG. 5(a) in the case of the forward path of the reciprocating movement, and the light receiving position R2 of FIG. 5(c) in the case of the return path of the reciprocating movement. The luminance value f(t) is expressed as a numerical value, and the higher the luminance, the larger the numerical value may be.
 In step S2, the acquisition unit 61 moves the light receiving position R of the camera 2 obliquely (in the oblique direction) by a minute amount. This is because the acquisition unit 61 reciprocates the light receiving position R of the camera 2. The minute amount consists of the movement distance Δx and the movement distance Δy. Since the time required for the movement is Δt, the time at which this movement is completed is t + Δt. In the captured image acquired after the movement, a fine parallax occurs with respect to the captured image acquired before the movement.
 In step S3, the generation unit 62 obtains the luminance value f(t + Δt) of each imaging pixel in the captured image acquired at the light receiving position R at the time t + Δt. The light receiving position R here is, for example, the light receiving position R1 of FIG. 5(b) described above.
 In step S4, the acquisition unit 61 moves the light receiving position R of the camera 2 obliquely by a minute amount again. That is, the light receiving position R of the camera 2 further moves in the oblique direction by the movement distance Δx and the movement distance Δy. The time at which this movement is completed is t + 2Δt. In the captured image acquired after the movement, a fine parallax occurs with respect to the captured image acquired before the movement.
 In step S5, the generation unit 62 obtains the luminance value f(t + 2Δt) of each imaging pixel in the captured image acquired at the light receiving position R at the time t + 2Δt. The light receiving position R here is, for example, the light receiving position R2 of FIG. 5(c) in the case of the forward path of the reciprocating path, and the light receiving position R0 of FIG. 5(a) in the case of the return path of the reciprocating path.
 In step S6, the generation unit 62 applies a median filter to the luminance value f(t), the luminance value f(t + Δt) and the luminance value f(t + 2Δt).
 In step S7, the generation unit 62 takes the luminance value after the median filter at the time t′ as f(t′). The luminance value f(t′) may be, for example, the median of the three luminance values f(t), f(t + Δt) and f(t + 2Δt). The image obtained by applying the median filter is generated as the front image at the time t′. The time t′ may be the same as the time t + 2Δt. The front image generated in this way is a front image in which the periodic structure member 15 is not displayed, as described above with reference to FIG. 5(d).
 In step S8, the acquisition unit 61 switches the moving direction of the light receiving position R of the camera 2. That is, the moving direction of the light receiving position R switches between the forward path and the return path of the reciprocating path. Thereafter, the processing returns to step S2. The reason the processing returns to step S2 rather than step S1 is that, as described above, the captured image acquired at a turn-back position of the reciprocating path is used in common by the forward path and the return path. For this purpose, the generation unit 62 uses the luminance value f(t + 2Δt) obtained in the previous step S5 as the luminance value f(t) of each imaging pixel in the captured image at the time t. That is, the information of the captured image at the time t + 2Δt used in the previous loop is substituted for the information of the captured image at the time t used in the next loop, and then the processing of steps S2 to S8 is executed repeatedly.
By repeatedly executing the processing of the flowchart as described above, front images in which the periodic structure member 15 is not displayed, as described above, are generated repeatedly. When a loop of the flowchart corresponds to the forward path of the reciprocating path of the light receiving position R, the front image generated in step S7 is the first front image described above. When a loop of the flowchart corresponds to the return path of the reciprocating path of the light receiving position R, the front image generated in step S7 is the second front image described above. By repeatedly executing the processing of the flowchart, the first front image and the second front image are generated repeatedly.
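A minimal sketch (not part of the patent text) of this loop (steps S1 to S8 of FIG. 6) follows; capture_frame, move_diagonally and switch_direction are hypothetical helpers standing in for the camera and the actuator, and front_image_from_three is the per-pixel median sketch given earlier:

```python
def run_reciprocating_capture(capture_frame, move_diagonally, switch_direction, num_outputs):
    """Sketch of the loop of FIG. 6 (steps S1 to S8): the frame captured at a
    turn-back position is reused as the first frame of the next pass, so one
    front image is produced for every two new captures."""
    f_t = capture_frame()                            # S1: frame at the current turn-back position
    outputs = []
    for _ in range(num_outputs):
        move_diagonally()                            # S2: shift the light receiving position by (Δx, Δy)
        f_t1 = capture_frame()                       # S3: frame after the first shift
        move_diagonally()                            # S4: shift again
        f_t2 = capture_frame()                       # S5: frame at the opposite turn-back position
        outputs.append(front_image_from_three(f_t, f_t1, f_t2))  # S6-S7: per-pixel median
        switch_direction()                           # S8: reverse the movement direction
        f_t = f_t2                                   # reuse the turn-back frame in the next pass
    return outputs
```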
In the above steps S2 and S4, an example in which the light receiving position R of the camera 2 is moved obliquely by a minute amount has been described. However, as described above, when either the movement distance Δx or the movement distance Δy is zero, the light receiving position R is moved in the horizontal direction or the vertical direction rather than obliquely. Further, in the form in which the light receiving positions R of the camera 2 are present at a plurality of positions simultaneously, as described above with reference to FIG. 4(b), the processing of the above steps S2 and S4 is unnecessary. The processing of "switching the moving direction of the light receiving position" in the above step S8 is also unnecessary.
The display imaging device 7 described above includes the display 1, the camera 2 and the control device 6. The display 1 includes the periodic structure member 15. The camera 2 images the front of the display 1 from behind the display 1. The control device 6 includes the generation unit 62. The generation unit 62 generates a front image of the display 1 based on the captured images of the camera 2. More specifically, the generation unit 62 generates the front image, based on a plurality of captured images that are acquired when the light receiving position R of the camera 2 is at each of a plurality of positions and in which the periodic structure member 15 therefore appears at different positions in the images, such that the periodic structure member 15 is not displayed.
 First, according to the display imaging device 7, the front of the display 1 can be imaged from the front by the camera 2 from behind the display 1. When the device is used for a video call or the like, the user U1 in front of the display 1 can be imaged from the front, so that a state in which the lines of sight coincide can be realized.
 An example of a video call will be described with reference to FIG. 7. The video call system 9 shown in FIG. 7 includes a control unit 5 and a communication device 8 in addition to the display imaging device 7. Although not shown in FIG. 7, the display imaging device 7 also includes the camera 2 and the control device 6 (see FIG. 1), as described above. In this example, the user U1 of the video call system 9 is making a video call with the user U2, who is the call partner. The user U1 views the display 1 from the front (the positive Z-axis direction). Therefore, the camera 2 can image the user U1 from the front (the negative Z-axis direction) through the display 1 (FIG. 2 and the like). The communication device 8 transmits and receives video data, audio data and the like to and from a video call system (not shown) used by the user U2. The video data includes video data of the user U1 and video data of the user U2. The audio data includes audio data of the user U1 and audio data of the user U2.
 The control unit 5 is a part that executes the various kinds of control necessary for operating the display imaging device 7 in the video call system 9. The control unit 5 executes transmission and reception processing for receiving, via the communication device 8, data to be displayed on the display 1 (for example, the video data of the user U2) and for transmitting, via the communication device 8, the image data captured by the camera 2 (for example, the video data of the user U1). The control unit 5 may be a constituent element of the display imaging device 7. In addition, other elements necessary for the video call system 9, such as a speaker and a microphone (not shown), may be included in the display imaging device 7.
 Furthermore, according to the display imaging device 7, as described so far, front images are generated in which the periodic structure member 15 is not displayed. Therefore, the image quality can also be improved. When the device is used in the video call system 9, the video of the user U1 is presented to the user U2, and because the periodic structure member 15 is not reflected in the video of the user U1, the video quality can be improved accordingly.
 As described above with reference to FIG. 4(a), the light receiving position R of the camera 2 may be moved between the plurality of positions by moving the camera 2. This makes it possible to acquire the captured images for the cases where the light receiving position R of the camera 2 is at each of the plurality of positions using an ordinary camera rather than a light field camera, a 3D camera or the like.
 As described above with reference to FIG. 4(b), a light field camera, a 3D camera or the like in which the light receiving positions R are present at the plurality of positions simultaneously may also be used. In this case, since an operating mechanism for moving the camera 2 becomes unnecessary, an effect substantially equivalent to increasing the moving speed of the light receiving position R of the camera 2 is obtained. This also makes it possible to raise the imaging frame rate of the camera 2.
When the acquisition unit 61 reciprocates the light receiving position R of the camera 2, the generation unit 62 may generate the front images based on the captured images of the camera 2 acquired in sequence when the light receiving position R of the camera 2 is at each position of the reciprocating path passing through the plurality of positions. In this case, the plurality of positions may include the turn-back positions of the reciprocating path. The generation unit 62 may generate the first front image based on the captured images of the camera 2 acquired when the light receiving position R of the camera 2 is at the plurality of positions on the forward path of the reciprocating path, and generate the second front image based on the captured images of the camera 2 acquired when the light receiving position R is at the plurality of positions on the return path of the reciprocating path. The generation unit 62 may generate the first front image and the second front image by commonly using a captured image of the camera 2 acquired at a turn-back position of the reciprocating path. For example, if the captured image of the camera 2 at the turn-back position of the reciprocating path is not used in common, one front image is generated every time three captured images are acquired, so that the video display frame rate of the front images falls to one third of the imaging frame rate of the camera 2. In contrast, if the captured image of the camera 2 at the turn-back position of the reciprocating path is used in common, one front image is generated every time two captured images are acquired, so that the video display frame rate of the front images can be raised to one half of the imaging frame rate of the camera 2.
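As a hypothetical worked example (not from the patent text), the following sketch computes the resulting front-image rate from the capture rate, the number of frames combined per front image, and whether the turn-back frame is shared:

```python
def front_image_fps(capture_fps, frames_per_image, share_turnback):
    """Front-image display rate: without sharing, each front image consumes
    `frames_per_image` captures; when the turn-back frame is shared, each new
    front image needs only `frames_per_image - 1` fresh captures."""
    new_frames = frames_per_image - 1 if share_turnback else frames_per_image
    return capture_fps / new_frames

# With 60 fps capture and three frames per front image:
#   front_image_fps(60, 3, False) -> 20.0  (no sharing)
#   front_image_fps(60, 3, True)  -> 30.0  (turn-back frame shared)
# The two-frame modification with sharing gives front_image_fps(60, 2, True) -> 60.0.
```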
The periodic structure member 15 may include the horizontal members 15x and the vertical members 15y. The horizontal members 15x may extend in the horizontal direction of the display 1 and be arranged at equal intervals in the vertical direction of the display 1. The vertical members 15y may extend in the vertical direction of the display 1 and be arranged at equal intervals in the horizontal direction of the display 1. The plurality of positions may be aligned in a direction oblique to the horizontal direction and the vertical direction. When the display 1 includes such a periodic structure member 15, if the light receiving position R of the camera 2 is moved only in the horizontal direction, the positions of the vertical members in the captured images can be kept from overlapping, but the positions of the horizontal members overlap. If the light receiving position R of the camera 2 is moved only in the vertical direction, the positions of the horizontal members in the captured images can be kept from overlapping, but the positions of the vertical members overlap. By moving the light receiving position R of the camera 2 in the oblique direction, neither the positions of the horizontal members nor the positions of the vertical members overlap between the captured images. Therefore, the reflection of the periodic structure member 15 in the front images generated by the generation unit 62 can be further reduced, and the image quality can be further improved.
 The distance between adjacent positions among the plurality of positions may be equal to or greater than the member width of the periodic structure member 15 (the vertical member width Wx, the horizontal member width Wy) and less than half of the periodic interval (the vertical member interval Dx, the horizontal member interval Dy). The generation unit 62 may generate, as the front image, an image obtained by applying a median filter to the captured images of the camera 2 acquired when the light receiving position R of the camera 2 is at each of three positions among the plurality of positions.
 If the distances between the plurality of positions are set as described above, then when the imaging pixels at the same position in the three captured images are compared, at most one of the three imaging pixels displays the vertical member 15y; at least two of them display the imaging target (for example, the user U1) rather than the vertical member 15y. Similarly, at most one of the three imaging pixels displays the horizontal member 15x, and at least two of them display the imaging target rather than the horizontal member 15x. Therefore, by applying a median filter to these three captured images, a captured image in which the periodic structure member 15 is not displayed (in which the periodic structure member 15 has, so to speak, been removed) is obtained, and this captured image can be generated as the front image of the display 1.
[Modification]
 In the above embodiment, an example has been described in which a front image in which the periodic structure member 15 is not displayed is generated based on captured images acquired at at least three light receiving positions. In contrast, according to the modification described below, a front image in which the periodic structure member 15 is not displayed can be generated based on captured images acquired at at least two light receiving positions.
FIG. 8 shows a block diagram of a control device 6A included in a display imaging device 7A according to such a modification. The control device 6A differs from the control device 6 (FIG. 3) in that it includes an acquisition unit 61A and a generation unit 62A in place of the acquisition unit 61 and the generation unit 62, and further includes a storage unit 63.
 The acquisition unit 61A reciprocates the light receiving position R of the camera 2 in the same manner as the acquisition unit 61 (FIG. 3). As described above, in the case of a light field camera or the like, the movement is unnecessary. Here, whereas the acquisition unit 61 reciprocates the light receiving position R of the camera 2 so that it passes through at least three positions, the acquisition unit 61A reciprocates the light receiving position R of the camera 2 so that it passes through at least two positions. The two positions may be the turn-back positions of the reciprocating path. In the following, a case where the acquisition unit 61A reciprocates the light receiving position R of the camera 2 so that it passes through two positions will be described.
The movement of the light receiving position R of the camera 2 by the acquisition unit 61A is also determined so that the fine parallax described so far occurs. The minimum units of the movement distance of the light receiving position R of the camera 2 by the acquisition unit 61A are defined as follows.
  Movement distance ΔxA: the horizontal distance (separation distance) between the two positions. Depending on how the coordinates are set, ΔxA may be a negative value.
  Movement distance ΔyA: the vertical distance (separation distance) between the two positions. Depending on how the coordinates are set, ΔyA may be a negative value.
 Hereinafter, the magnitude of the movement distance ΔxA is simply referred to as the movement distance ΔxA, and the magnitude of the movement distance ΔyA is simply referred to as the movement distance ΔyA.
The movement distance ΔxA is set to be equal to or greater than the vertical member width Wx and less than the vertical member interval Dx, that is, Wx ≤ |ΔxA| < Dx. The movement distance ΔyA is set to be equal to or greater than the horizontal member width Wy and less than the horizontal member interval Dy, that is, Wy ≤ |ΔyA| < Dy.
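A minimal, hypothetical check of these constraints (not part of the patent text), taking the symbols Wx, Dx, Wy and Dy as arguments:

```python
def valid_two_position_offset(dx_a, dy_a, wx, dx, wy, dy):
    """True if the two-position offsets (ΔxA, ΔyA) satisfy the constraints of
    the modification: Wx <= |ΔxA| < Dx and Wy <= |ΔyA| < Dy."""
    return (wx <= abs(dx_a) < dx) and (wy <= abs(dy_a) < dy)
```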
The generation unit 62A is a part that generates a front image based on the captured images acquired when the light receiving position R of the camera 2 is at each of the two positions. Specifically, the generation unit 62A generates the front image such that the periodic structure member 15 is not displayed (or is removed) by not using, among the imaging pixels at the same position in the two captured images, the imaging pixel that displays the periodic structure member 15.
 The storage unit 63 is a part that stores in advance information about imaging pixels that display the periodic structure member 15. As an example of such information, the storage unit 63 stores in advance the luminance value L that an imaging pixel of a captured image of the camera 2 has when the imaging pixel displays the periodic structure member 15. The luminance value L may be a value measured in advance by calibration. Alternatively, without performing a measurement, a sufficiently small value (for example, the minimum value) may be used as the luminance value L, taking advantage of the fact that the luminance value of an imaging pixel at the position of the periodic structure member 15 is low.
 The generation unit 62A generates the front image using, among the imaging pixels of the two captured images, the imaging pixels whose luminance values are farther from the luminance value L stored in the storage unit 63. This is because an imaging pixel whose luminance value is close to the luminance value L is highly likely to be an imaging pixel displaying the periodic structure member 15, and such imaging pixels should not be used. For example, when, of the imaging pixels at the same position in the two captured images, one imaging pixel has a luminance value of 10 and the other has a luminance value of 40, the imaging pixel with the luminance value of 40, which is farther from the luminance value L (less than 10 in this case), is adopted.
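A minimal sketch (not from the patent text) of this per-pixel selection, assuming the two frames are grayscale NumPy arrays of identical shape and that grid_luminance stands in for the stored luminance value L:

```python
import numpy as np

def front_image_from_two(frame_a, frame_b, grid_luminance):
    """Per-pixel selection for the two-frame modification: keep, at each
    pixel, whichever of the two samples is farther from the stored grid
    luminance L, since the sample closer to L is likely showing the
    periodic structure member."""
    dist_a = np.abs(frame_a.astype(np.int32) - int(grid_luminance))
    dist_b = np.abs(frame_b.astype(np.int32) - int(grid_luminance))
    return np.where(dist_a >= dist_b, frame_a, frame_b)
```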
An example of generation of a front image by the generation unit 62A will be described with reference to FIG. 9. FIG. 9(a) shows the captured image acquired at the light receiving position R0A. The coordinates of one of the lattice points of the periodic structure member 15 are PA(x0A, y0A).
 FIG. 9(b) shows the captured image acquired at the light receiving position R1A. The light receiving position R1A is a position moved from the light receiving position R0A by the movement distance ΔxA and the movement distance ΔyA. In this captured image, the coordinates of the lattice point are PA(x1A, y1A), where x1A = x0A + ΔxA and y1A = y0A + ΔyA. In the captured image acquired at the light receiving position R1A, a fine parallax has occurred with respect to the captured image acquired at the light receiving position R0A, so that the position of the periodic structure member 15 has moved while the position of the user U1 remains the same.
 Here, as described above, the movement distance ΔxA and the movement distance ΔyA are set so that Wx ≤ |ΔxA| < Dx and Wy ≤ |ΔyA| < Dy, respectively. In this case, at any given position in the two captured images acquired at the light receiving positions R0A and R1A, neither the vertical member 15y nor the horizontal member 15x overlaps between the images. Therefore, when the imaging pixels at the same position in the two captured images are compared, the luminance value of the imaging pixel that displays the horizontal member 15x and/or the vertical member 15y is closer to the luminance value L stored in the storage unit 63 than the luminance value of the imaging pixel that does not display the horizontal member 15x or the vertical member 15y. Therefore, by combining, from the imaging pixels of these two captured images, the imaging pixels having luminance values farther from the luminance value L, a single captured image from which the periodic structure member 15 has been removed, as shown in FIG. 9(c), is obtained, and this captured image can be generated as a front image in which the periodic structure member 15 is not displayed.
As described above, the acquisition unit 61A reciprocates the light receiving position R of the camera 2. Therefore, the generation unit 62A may generate a first front image based on the captured images acquired at the light receiving positions R0A and R1A on the forward path of the reciprocating path, and generate a second front image based on the captured images acquired at the light receiving positions R1A and R0A on the return path of the reciprocating path. A captured image acquired at a turn-back position of the reciprocating path can also be used in common for generating the first front image and the second front image. In this case, when the front images generated by the generation unit 62A are displayed as a video, the video frame rate is the same as the imaging frame rate of the camera 2. For example, when the imaging frame rate of the camera 2 is 60 fps, the video frame rate of the front images is also 60 fps.
FIG. 10 is a flowchart showing an example of the processing (an image generation method) executed by the control device 6A.
 In step S11, the generation unit 62A obtains the luminance value f(t) of each imaging pixel in the captured image acquired at the light receiving position R at the time t. The light receiving position R here is, for example, the light receiving position R0A of FIG. 9(a) in the case of the forward path of the reciprocating path, and the light receiving position R1A of FIG. 9(b) in the case of the return path of the reciprocating path.
 In step S12, the acquisition unit 61A moves the light receiving position R of the camera 2 obliquely by a minute amount. The minute amount consists of the above-described movement distance ΔxA and movement distance ΔyA. Since the time required for the movement is Δt, the time at which this movement is completed is t + Δt. In the captured image acquired after the movement, a fine parallax occurs with respect to the captured image acquired before the movement.
 In step S13, the generation unit 62A obtains the luminance value f(t + Δt) of each imaging pixel in the captured image acquired at the light receiving position R at the time t + Δt. The light receiving position R here is, for example, the light receiving position R1A of FIG. 9(b) in the case of the forward path of the reciprocating path, and the light receiving position R0A of FIG. 9(a) in the case of the return path of the reciprocating path.
 In step S14, the generation unit 62A uses the luminance value L stored in the storage unit 63 to obtain the difference value (|f(t) − L|) between the luminance value f(t) obtained in the previous step S11 and the luminance value L, and the difference value (|f(t + Δt) − L|) between the luminance value f(t + Δt) obtained in the previous step S13 and the luminance value L.
 In step S15, the generation unit 62A takes, as f(t′), the luminance value used in calculating the larger of the two difference values obtained in the previous step S14. That is, the image obtained by combining the imaging pixels having, of the two luminance values obtained in the previous steps S11 and S13, the luminance value farther from the luminance value L is generated as the front image corresponding to the time t′. The time t′ may be the same as the time t + Δt of the previous step S13.
 In step S16, the acquisition unit 61A switches the moving direction of the light receiving position R of the camera 2. That is, the moving direction of the light receiving position R switches between the forward path and the return path of the reciprocating path. Thereafter, the processing returns to step S12. The reason the processing returns to step S12 rather than step S11 is that, as described above, the captured image acquired at a turn-back position of the reciprocating path is used in common by the forward path and the return path. For this purpose, the generation unit 62A uses the luminance value f(t + Δt) obtained in the previous step S13 as the luminance value f(t) of each imaging pixel in the captured image at the time t. That is, the information of the captured image at the time t + Δt used in the previous loop is substituted for the information of the captured image at the time t used in the next loop, and then the processing of steps S12 to S16 is executed repeatedly.
By repeatedly executing the processing of the flowchart as described above, front images in which the periodic structure member 15 is not displayed, as described above, are generated repeatedly. When a loop of the flowchart corresponds to the forward path of the reciprocating path of the light receiving position R, the front image generated in step S15 is the first front image described above. When a loop of the flowchart corresponds to the return path of the reciprocating path of the light receiving position R, the front image generated in step S15 is the second front image described above. By repeatedly executing the processing of the flowchart, the first front image and the second front image are generated repeatedly.
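A minimal sketch (not part of the patent text) of this loop (steps S11 to S16 of FIG. 10), reusing the hypothetical helpers from the earlier sketch and the front_image_from_two function above:

```python
def run_two_position_capture(capture_frame, move_diagonally, switch_direction,
                             grid_luminance, num_outputs):
    """Sketch of the loop of FIG. 10: only two light receiving positions are
    used and the turn-back frame is reused, so one front image is produced
    for every single new capture (output rate equals the capture rate)."""
    f_t = capture_frame()                      # S11: frame at the current turn-back position
    outputs = []
    for _ in range(num_outputs):
        move_diagonally()                      # S12: shift by (ΔxA, ΔyA)
        f_t1 = capture_frame()                 # S13: frame at the other turn-back position
        outputs.append(front_image_from_two(f_t, f_t1, grid_luminance))  # S14-S15
        switch_direction()                     # S16: reverse the movement direction
        f_t = f_t1                             # reuse the turn-back frame in the next pass
    return outputs
```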
In the above step S12, an example in which the light receiving position R of the camera 2 is moved obliquely by a minute amount has been described. However, as described above, when either the movement distance Δx or the movement distance Δy is zero, the light receiving position R is moved in the horizontal direction or the vertical direction rather than obliquely. Further, in the form in which the light receiving positions R of the camera 2 are present at a plurality of positions simultaneously, as described above with reference to FIG. 4(b), the processing of the above step S12 is unnecessary. The processing of "switching the moving direction of the light receiving position" in the above step S16 is also unnecessary.
In the display imaging device 7A according to the modification described above, the storage unit 63 stores in advance the luminance value that an imaging pixel of a captured image of the camera 2 has when the imaging pixel displays the periodic structure member 15. The distance between adjacent positions among the plurality of positions is equal to or greater than the member width of the periodic structure member 15 (the vertical member width Wx, the horizontal member width Wy) and less than the periodic interval (the vertical member interval Dx, the horizontal member interval Dy). The generation unit 62A generates, as a front image in which the periodic structure member 15 is not displayed, an image obtained by combining, from among the imaging pixels in the captured images of the camera 2 acquired when the light receiving position R of the camera 2 is at each of two positions among the plurality of positions, the imaging pixels having luminance values farther from the luminance value L stored in the storage unit 63 (steps S14 and S15).
When the imaging pixels at the same position in the two captured images are compared, the luminance value of the imaging pixel in which the horizontal member 15x and/or the vertical member 15y is located is closer to the luminance value L stored in the storage unit 63 than the luminance value of the imaging pixel in which neither the horizontal member 15x nor the vertical member 15y is located. Therefore, by combining, from the imaging pixels of these two captured images, the imaging pixels having luminance values farther from the luminance value L, a single captured image in which the periodic structure member 15 is not displayed is obtained, and this captured image can be generated as the front image of the display 1. In this case, since a front image can be generated from only two captured images, the video frame rate of the front images can be made higher than when a front image is generated using three captured images.
 Although the embodiments of the present invention have been described above, the present invention is not limited to the above embodiments. In the above embodiments, the example in which the display imaging device 7 is used in the video call system 9 has been described, but the applications of the display imaging device 7 are not limited to the video call system 9. For example, the display imaging device 7 may be used so that the user U1 displays video of himself or herself. In this case, the display 1 displays the video captured by the camera 2. Therefore, simply by facing the display 1, the user U1 can shoot while checking how he or she appears, without having to be conscious of the position of the camera 2. In photo machines (ID photo booths, photo sticker printers) and the like, such a use of the display imaging device 7 has the advantage that the shooting preview screen can be checked while the eyes remain directed to the front at the time of shooting. It also allows more natural enjoyment in cases where CG is composited with a user's video on augmented reality (AR) signage installed in a shopping center or the like. Furthermore, in applications that stream live video of the user while displaying it to the user, streaming can be performed with a more natural expression.
 The block diagrams used in the description of the above embodiments show blocks in units of functions. These functional blocks (constituent units) are realized by any combination of hardware and/or software. The means for realizing each functional block is not particularly limited. That is, each functional block may be realized by one physically and/or logically coupled device, or may be realized by two or more physically and/or logically separated devices connected directly and/or indirectly (for example, by wire and/or wirelessly).
For example, the control device 6 and the like in one embodiment of the present invention may function as a computer that performs the processing of the acquisition unit 61, the generation unit 62 and the like. FIG. 11 is a diagram showing an example of the hardware configuration of the control device 6 and the like according to the present embodiment. In the following, the control device 6 is described as an example; the control unit 5, the control device 6A and the communication device 8 can be described in the same manner as the control device 6. The control device 6 may be physically configured as a computer device including a processor 1001, a memory 1002, a storage 1003, a communication device 1004, an input device 1005, an output device 1006, a bus 1007 and the like.
 In the following description, the term "device" can be read as a circuit, a device, a unit or the like. The hardware configuration of the control device 6 may be configured to include one or more of each of the devices shown in the figure, or may be configured without including some of the devices.
 Each function of the control device 6 is realized by loading predetermined software (a program) onto hardware such as the processor 1001 and the memory 1002, whereby the processor 1001 performs computation and controls communication by the communication device 1004 and reading and/or writing of data in the memory 1002 and the storage 1003.
 The processor 1001 controls the entire computer by, for example, running an operating system. The processor 1001 may be configured as a central processing unit (CPU) including an interface with peripheral devices, a control device, an arithmetic device, registers and the like. For example, the above-described acquisition units 61 and 61A and generation units 62 and 62A may be realized by the processor 1001.
 The processor 1001 also reads programs (program code), software modules and data from the storage 1003 and/or the communication device 1004 into the memory 1002, and executes various kinds of processing in accordance with them. As the programs, programs that cause a computer to execute at least part of the operations described in the above embodiments are used. For example, the acquisition unit 61 and the generation unit 62 may be realized by a control program that is stored in the memory 1002 and runs on the processor 1001, and the other functional blocks may be realized in the same manner. Although the various kinds of processing described above have been explained as being executed by a single processor 1001, they may be executed simultaneously or sequentially by two or more processors 1001. The processor 1001 may be implemented with one or more chips. The programs may be transmitted from a network via a telecommunication line.
The memory 1002 is a computer-readable recording medium, and may be configured with, for example, at least one of a ROM (Read Only Memory), an EPROM (Erasable Programmable ROM), an EEPROM (Electrically Erasable Programmable ROM), a RAM (Random Access Memory) and the like. The memory 1002 may also be called a register, a cache, a main memory (main storage device) or the like. The memory 1002 can store programs (program code), software modules and the like that are executable for carrying out the image generation method according to one embodiment of the present invention.
The storage 1003 is a computer-readable recording medium, and may be configured with, for example, at least one of an optical disc such as a CD-ROM (Compact Disc ROM), a hard disk drive, a flexible disk, a magneto-optical disk (for example, a compact disc, a digital versatile disc or a Blu-ray (registered trademark) disc), a smart card, a flash memory (for example, a card, a stick or a key drive), a floppy (registered trademark) disk, a magnetic strip and the like. The storage 1003 may also be called an auxiliary storage device. The above-mentioned storage medium may be, for example, a database, a server or another appropriate medium including the memory 1002 and/or the storage 1003.
 The communication device 1004 is hardware (a transmission and reception device) for performing communication between computers via a wired and/or wireless network, and is also called, for example, a network device, a network controller, a network card or a communication module. For example, the above-described communication device 8 may be realized by the communication device 1004.
 The input device 1005 is an input device (for example, a keyboard, a mouse, a microphone, a switch, a button, a sensor or the like) that accepts input from the outside. The output device 1006 is an output device (for example, a display, a speaker, an LED lamp or the like) that performs output to the outside. The input device 1005 and the output device 1006 may be of an integrated configuration (for example, a touch panel).
 The devices such as the processor 1001 and the memory 1002 are connected by the bus 1007 for communicating information. The bus 1007 may be configured with a single bus, or may be configured with different buses between the devices.
 The control device 6 may also be configured to include hardware such as a microprocessor, a digital signal processor (DSP), an ASIC (Application Specific Integrated Circuit), a PLD (Programmable Logic Device) or an FPGA (Field Programmable Gate Array), and part or all of each functional block may be realized by such hardware. For example, the processor 1001 may be implemented with at least one of these pieces of hardware.
Although the present embodiments have been described in detail above, it is apparent to those skilled in the art that the present embodiments are not limited to the embodiments described in this specification. The present embodiments can be implemented as modified and altered modes without departing from the spirit and scope of the present invention defined by the recitations of the claims. Therefore, the description in this specification is for the purpose of illustration and does not have any restrictive meaning with respect to the present embodiments.
 The processing procedures, sequences, flowcharts and the like of the aspects/embodiments described in this specification may be reordered as long as no contradiction arises. For example, for the methods described in this specification, elements of the various steps are presented in an exemplary order, and the methods are not limited to the specific order presented.
 The description "based on" used in this specification does not mean "based only on" unless otherwise specified. In other words, the description "based on" means both "based only on" and "based at least on".
 Where designations such as "first" and "second" are used in this specification, any reference to such elements does not generally limit the quantity or order of those elements. These designations may be used in this specification as a convenient way of distinguishing between two or more elements. Therefore, a reference to first and second elements does not mean that only two elements may be employed there or that the first element must precede the second element in some way.
 As long as "include", "including" and variations thereof are used in this specification or the claims, these terms are intended to be inclusive in the same manner as the term "comprising". Furthermore, the term "or" used in this specification or the claims is intended not to be an exclusive OR.
 In this specification, a plurality of devices is also included unless the context or the technology clearly indicates that only one device exists.
 Throughout the present disclosure, the plural is included unless the singular is clearly indicated by the context.
DESCRIPTION OF SYMBOLS: 1... display, 2... camera, 7, 7A... display imaging device, 15... periodic structure member, 15x... horizontal member, 15y... vertical member, 61, 61A... acquisition unit, 62, 62A... generation unit, Dx... vertical member interval, Dy... horizontal member interval, Wx... vertical member width, Wy... horizontal member width.

Claims (7)

  1.  A display imaging device comprising:
      a transmissive display including a periodic structure member;
      a camera that images the front of the display from behind the display; and
      a generation unit that generates a front image based on captured images of the camera,
      wherein the generation unit generates the front image, based on a plurality of captured images that are acquired when a light receiving position of the camera is at each of a plurality of positions and in which the periodic structure member is thereby displayed at different positions in the images, such that the periodic structure member is not displayed.
  2.  The display imaging device according to claim 1, wherein the light receiving position of the camera moves between the plurality of positions in accordance with movement of the camera.
  3.  The display and imaging device according to claim 1, wherein the light receiving position of the camera is present at the plurality of positions simultaneously.
  4.  The display and imaging device according to any one of claims 1 to 3, wherein
     the generation unit generates the front image based on captured images of the camera acquired in sequence when the light receiving position of the camera is at each position of a round-trip path passing through the plurality of positions,
     the plurality of positions include a turnaround position of the round-trip path,
     the generation unit generates a first front image based on captured images of the camera acquired when the light receiving position of the camera is at a plurality of positions on the outward leg of the round-trip path, and generates a second front image based on captured images of the camera acquired when the light receiving position is at a plurality of positions on the return leg of the round-trip path, and
     the generation unit uses the captured image of the camera acquired when the light receiving position of the camera is at the turnaround position of the round-trip path in common to generate the first front image and the second front image.
  5.  The display and imaging device according to any one of claims 1 to 4, wherein
     the periodic structure member includes:
       a plurality of horizontal members extending in a horizontal direction of the display and arranged at equal intervals in a vertical direction of the display; and
       a plurality of vertical members extending in the vertical direction and arranged at equal intervals in the horizontal direction, and
     the plurality of positions are aligned in a direction inclined with respect to the horizontal direction and the vertical direction.
  6.  The display and imaging device according to any one of claims 1 to 5, wherein
     a distance between adjacent positions among the plurality of positions is equal to or greater than a member width of the periodic structure member and less than one half of a periodic interval, and
     the generation unit generates, as the front image, an image obtained by applying a median filter to the captured images of the camera acquired when the light receiving position of the camera is at each of three of the plurality of positions.
  7.  The display and imaging device according to any one of claims 1 to 5, further comprising a storage unit that stores in advance a luminance value that an imaging pixel of a captured image of the camera takes when the imaging pixel displays the periodic structure member, wherein
     a distance between adjacent positions among the plurality of positions is equal to or greater than a member width of the periodic structure member and less than a periodic interval, and
     the generation unit generates, as the front image, an image obtained by combining, from among the imaging pixels in the captured images of the camera acquired when the light receiving position of the camera is at each of two of the plurality of positions, imaging pixels having luminance values distant from the luminance value stored in the storage unit.
PCT/JP2019/006080 2018-04-26 2019-02-19 Display imaging device WO2019207922A1 (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
JP2020516058A JP7002644B2 (en) 2018-04-26 2019-02-19 Display imaging device

Applications Claiming Priority (2)

Application Number Priority Date Filing Date Title
JP2018085407 2018-04-26
JP2018-085407 2018-04-26

Publications (1)

Publication Number Publication Date
WO2019207922A1 true WO2019207922A1 (en) 2019-10-31

Family

ID=68294540

Family Applications (1)

Application Number Title Priority Date Filing Date
PCT/JP2019/006080 WO2019207922A1 (en) 2018-04-26 2019-02-19 Display imaging device

Country Status (2)

Country Link
JP (1) JP7002644B2 (en)
WO (1) WO2019207922A1 (en)

Citations (3)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
JP2007305050A (en) * 2006-05-15 2007-11-22 Olympus Imaging Corp Camera, composite image photographing method, program and recording medium
US20080165267A1 (en) * 2007-01-09 2008-07-10 Cok Ronald S Image capture and integrated display apparatus
JP2015046019A (en) * 2013-08-28 2015-03-12 キヤノン株式会社 Image processing device, imaging device, imaging system, image processing method, program, and storage medium

Also Published As

Publication number Publication date
JP7002644B2 (en) 2022-01-20
JPWO2019207922A1 (en) 2021-01-07

Similar Documents

Publication Publication Date Title
KR102142643B1 (en) Three-dimensional telepresence system
US11665331B2 (en) Dynamic vision sensor and projector for depth imaging
KR102534698B1 (en) Pass-through display of captured images
JP5456020B2 (en) Information processing apparatus and method
US10739111B2 (en) Cloaking systems and methods
US20170345398A1 (en) Minimal-latency tracking and display for matching real and virtual worlds in head-worn displays
KR100910175B1 (en) Image sensor for generating a three dimensional image
JP2011182317A (en) Imaging apparatus
JP2007116208A (en) Compound eye imaging apparatus
TWI556625B (en) Image projection apparatus
TW201837861A (en) Depth processing system
KR102598428B1 (en) Multistripe laser for laser-based projector displays
WO2019207922A1 (en) Display imaging device
JP2005149127A (en) Imaging display device and method, and image sending and receiving system
KR100928332B1 (en) Stereoscopic Imaging System Using Dynamic Pinhole Array and Its Image Display Method
JP2012163371A (en) Information processor, information processing method
US20240223743A1 (en) Display device and display system
US10778893B2 (en) Detection device, display device and detection method
JP2005107361A (en) Scanning-type image display device and image photographing apparatus having the device
KR100689040B1 (en) Image recording device and image processing device of displaying 3d image
KR101118745B1 (en) Method and system for imaging display
KR20150139031A (en) Autostereoscopic three dimensional proector and method for displaying thereof
TW202347258A (en) Depth processing system and operational method thereof
CN118264793A (en) Display device and display system
JP6192222B2 (en) Imaging device

Legal Events

Date Code Title Description
121 Ep: the epo has been informed by wipo that ep was designated in this application

Ref document number: 19792693

Country of ref document: EP

Kind code of ref document: A1

ENP Entry into the national phase

Ref document number: 2020516058

Country of ref document: JP

Kind code of ref document: A

NENP Non-entry into the national phase

Ref country code: DE

122 Ep: pct application non-entry in european phase

Ref document number: 19792693

Country of ref document: EP

Kind code of ref document: A1