WO2023276156A1 - Spatial image display device, spatial image display method, and program - Google Patents

Spatial image display device, spatial image display method, and program Download PDF

Info

Publication number
WO2023276156A1
WO2023276156A1 (PCT/JP2021/025200)
Authority
WO
WIPO (PCT)
Prior art keywords
image
display
visual effect
unit
display device
Prior art date
Application number
PCT/JP2021/025200
Other languages
French (fr)
Japanese (ja)
Inventor
健也 鈴木
大地 並河
馨亮 長谷川
誠 武藤
精一 紺谷
泰治 中村
信博 平地
Original Assignee
Nippon Telegraph and Telephone Corporation (日本電信電話株式会社)
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Nippon Telegraph and Telephone Corporation
Priority to JP2023531331A priority Critical patent/JPWO2023276156A1/ja
Priority to PCT/JP2021/025200 priority patent/WO2023276156A1/en
Publication of WO2023276156A1 publication Critical patent/WO2023276156A1/en

Links

Images

Classifications

    • H: ELECTRICITY
    • H04: ELECTRIC COMMUNICATION TECHNIQUE
    • H04N: PICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N13/00: Stereoscopic video systems; Multi-view video systems; Details thereof
    • H04N13/10: Processing, recording or transmission of stereoscopic or multi-view image signals
    • H04N13/106: Processing image signals
    • H04N13/122: Improving the 3D impression of stereoscopic images by modifying image signal contents, e.g. by filtering or adding monoscopic depth cues
    • H04N13/30: Image reproducers
    • H04N13/346: Image reproducers using prisms or semi-transparent mirrors
    • H04N13/388: Volumetric displays, i.e. systems where the image is built up from picture elements distributed through a volume
    • H04N13/395: Volumetric displays with depth sampling, i.e. the volume being constructed from a stack or sequence of 2D image planes

Definitions

  • The present disclosure relates to a spatial image display device, a spatial image display method, and a program.
  • As described in Patent Document 1, it is known to display a spatial image composed of a plurality of virtual images by displaying the virtual images at different distances in the same direction from the observer. This allows the observer to observe the spatial image with a stronger stereoscopic effect. By designing the distances between the virtual images to be short, the three-dimensional effect can be further enhanced.
  • As described in Patent Document 2, a stereoscopic image can also be provided by displaying a plurality of images with different luminances at different distances in the same direction from the observer.
  • However, the plurality of virtual images are virtual images formed on two-dimensional virtual image planes. Therefore, in a configuration in which a plurality of virtual image planes are positioned at different distances in the same direction as viewed from the observer, if an object (for example, a ball used in a sports game) moves along the observation direction, the virtual image of the object may be displayed on one virtual image plane at one time and on another virtual image plane at another time. In such a case, the observer sees the virtual image of the object jump instantaneously from one virtual image plane to another, which looks unnatural compared to the movement of the object in real space, and the observer may not obtain a sufficient sense of realism.
  • Likewise, when the technology described in Patent Document 2 is applied to an image showing a moving object so that a virtual image of the object is displayed on each of a plurality of virtual image planes, parallax between the plurality of virtual images may occur depending on the position of the observer, and a sufficient sense of presence may not be obtained.
  • An object of the present disclosure, made in view of such circumstances, is to provide a spatial image display device, a spatial image display method, and a program capable of giving the observer a sufficient sense of realism when virtual images of objects are displayed on a plurality of virtual image planes.
  • A spatial image display device according to the present disclosure includes: a plurality of display units that display images; a plurality of optical elements that reflect the image light emitted from the images displayed on the plurality of display units and are arranged such that the reflected image light reaches the eyes of an observer, thereby displaying virtual images of the images at different distances in the same direction from the observer; a determination unit that determines a visual effect to be added to an object image including an image of an object, based on the position of the object, which is a specific subject whose image is included in the image; and a display controller that controls the display units to display an image based on the object image and the visual effect.
  • A spatial image display method according to the present disclosure is a method for a spatial image display device that includes a plurality of display units that display images, and a plurality of optical elements that reflect the image light emitted from the images displayed on the plurality of display units and are arranged such that the reflected image light reaches an observer's eyes, thereby displaying virtual images of the images at different distances in the same direction from the observer. The method includes: determining a visual effect to be added to an object image including an image of an object, based on the position of the object, which is a specific subject whose image is included in the image; and controlling the display units to display an image based on the object image and the visual effect.
  • A program according to the present disclosure causes a computer to function as the spatial image display device described above.
  • According to the spatial image display device, the spatial image display method, and the program of the present disclosure, it is possible to give the observer a sufficient sense of realism when virtual images of objects are displayed on a plurality of virtual image planes.
  • FIG. 1 is a schematic diagram of a spatial image display device according to a first embodiment.
  • FIG. 2 is a schematic diagram of the virtual image display unit shown in FIG. 1.
  • FIG. 3 is a diagram for explaining a virtual image displayed by the virtual image display unit shown in FIG. 2.
  • FIG. 4 is a diagram showing the virtual image as seen from the observer.
  • FIG. 5 is a diagram showing an example of the visual effects stored in the visual effect storage unit shown in FIG. 1.
  • FIG. 6A is a diagram showing an example of the position of an object in real space.
  • FIG. 6B is a diagram showing an example of a virtual image displayed based on the position of the object shown in FIG. 6A.
  • FIG. 7A is a diagram showing another example of the position of an object in real space.
  • FIG. 7B is a diagram showing an example of a virtual image displayed based on the position of the object shown in FIG. 7A.
  • FIG. 8A is a diagram showing still another example of the position of an object in real space.
  • FIG. 8B is a diagram showing an example of a virtual image displayed based on the position of the object shown in FIG. 8A.
  • FIG. 9 is a flow chart showing an example of the operation of the spatial image display device shown in FIG. 1.
  • FIG. 10A is a diagram showing a first modification of the visual effects.
  • FIG. 10B is a diagram showing an example of a virtual image displayed based on the visual effect shown in FIG. 10A.
  • FIG. 10C is a diagram showing another example of a virtual image displayed based on the visual effect shown in FIG. 10A.
  • FIG. 11A is a diagram showing a second modification of the visual effects.
  • FIG. 11B is a diagram showing an example of a virtual image displayed based on the visual effect shown in FIG. 11A.
  • FIG. 11C is a diagram showing another example of a virtual image displayed based on the visual effect shown in FIG. 11A.
  • FIG. 12A is a diagram showing a third modification of the visual effects.
  • FIG. 12B is a diagram showing an example of a virtual image displayed based on the visual effect shown in FIG. 12A.
  • FIG. 12C is a diagram showing another example of a virtual image displayed based on the visual effect shown in FIG. 12A.
  • FIG. 13 is a schematic diagram of a spatial image display device according to a second embodiment.
  • FIG. 14 is a sequence diagram showing an example of operations in the spatial image display device shown in FIG. 13.
  • FIG. 15 is a schematic diagram of a spatial image display device according to a third embodiment.
  • FIG. 16 is a sequence diagram showing an example of operations in the spatial image display device shown in FIG. 15.
  • FIG. 17 is a hardware block diagram of the spatial image display device.
  • FIG. 1 is a schematic diagram of a spatial image display device 1 according to the first embodiment.
  • As shown in FIG. 1, the spatial image display device 1 includes a virtual image display unit 2 and a visual effect determination unit 3.
  • The virtual image display unit 2 includes a plurality of display units 21 and a plurality of optical elements 22.
  • The display unit 21 is configured by an organic EL (electroluminescence) display, a liquid crystal panel, or the like.
  • The organic EL display, liquid crystal panel, or the like may be part of a device such as a tablet or VR goggles.
  • The optical element 22 is composed of, for example, a half mirror that transmits part of the incident light and reflects the remaining part.
  • In this example, the virtual image display unit 2 includes two display units 211 and 212 and two optical elements 221 and 222; however, the configuration is not limited to this, and three or more display units 21 and three or more optical elements 22 may be provided.
  • The display unit 21 displays images. Specifically, the display unit 21 displays an image based on an object image under the control of the visual effect determination unit 3.
  • An object image is a part of an image generated by an imaging device such as a camera and includes an image of an object OB.
  • The object OB may be any object; for example, it can be an object in the image that draws the observer's attention.
  • In this example, the objects OB are a first player PL1, a second player PL2 who is an opponent of the first player PL1, and a ball B used by the first player PL1 and the second player PL2 during their game.
  • The imaging device is arranged so that the first player PL1 is positioned on the front side and the second player PL2 on the back side with respect to its imaging surface.
  • An image based on the object image includes an image in which the visual effect determined by the visual effect determination unit 3 has been added to the object image.
  • The image based on the object image may also be an object image to which no visual effect has been added, that is, the object image itself. In other words, the display unit 21 displays either an image obtained by adding the visual effect determined by the visual effect determination unit 3 to the object image, or an image to which no visual effect has been added.
  • The image obtained by adding a visual effect to the object image is an image obtained by processing the object image according to the type of visual effect, as will be described in detail later.
  • The display unit 212 displays an object image IM2 including an image of the second player PL2, which is one of the objects OB. The display unit 212 may also display an object image IM3 including an image of the ball B, which is one of the objects OB, under the control of the visual effect determination unit 3.
  • When the display unit 212 displays the object image IM3, the display unit 211 need not display it. Conversely, the display unit 211 may display the object image IM3, in which case the display unit 212 need not display it.
  • The optical elements 221 and 222 reflect the image light emitted from the images displayed on the display units 211 and 212, respectively.
  • The optical elements 221 and 222 are arranged such that the reflected image light reaches the observer's eyes, thereby displaying virtual images VI of the images at different distances in the same direction from the observer.
  • The distance from the observer to the virtual image plane VF1, which is one virtual image plane VF, and the distance from the observer to the virtual image plane VF2, which is the other virtual image plane VF, are different from each other.
  • The direction from the observer to the virtual image plane VF1 may be the same as the direction from the observer to the virtual image plane VF2.
  • The position of the virtual image plane VF1 can be calculated by a known method based on the positions of the display unit 211 and the optical element 221, and the display unit 211 and the optical element 221 are arranged accordingly.
  • Specifically, the virtual image plane VF1 is located on the plane symmetrical to the display surface of the display unit 211 with respect to the optical element 221, so the display unit 211 and the optical element 221 may be arranged such that this plane of symmetry coincides with the desired position of the virtual image plane VF1.
  • Similarly, the position of the virtual image plane VF2 can be calculated by a known method based on the positions of the display unit 212 and the optical element 222, and the display unit 212 and the optical element 222 are arranged accordingly.
  • That is, the optical element 222 may be arranged such that the display surface of the display unit 212 is symmetrical, with respect to the optical element 222, to the desired position of the virtual image plane VF2.
  • In this example, the optical elements 221 and 222 are arranged so that the virtual image planes VF1 and VF2 are positioned on an actual court.
  • The optical element 221 is arranged so that the virtual image plane VF1 is positioned closer to the observer than the actual net on the court.
  • The optical element 222 is arranged so that the virtual image plane VF2 is positioned farther from the observer than the actual net on the court.
  • As a result, the observer observes the virtual image VI1 displayed on the virtual image plane VF1 on the near side, and the virtual image VI2 displayed on the virtual image plane VF2 on the far side.
  • The virtual image VIb shown in FIG. 4 is displayed on either the virtual image plane VF1 or the virtual image plane VF2.
  • The virtual image plane VF on which the virtual image VIb is displayed will be described in detail later. The observer thus observes a spatial image composed of the virtual image VI1, the virtual image VI2, and the virtual image VIb.
  • Such a virtual image is generally called a Pepper's ghost; to the observer, it looks as if the display units 211 and 212 were located on the virtual image planes VF1 and VF2, respectively.
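The Pepper's ghost placement described above is pure mirror geometry: every point of a display surface maps to its reflection across the plane of the half mirror, and those reflected points form the virtual image plane. The following Python sketch illustrates this relationship; the function name and the 45-degree mirror example are illustrative assumptions, not taken from the patent.

```python
import numpy as np

def mirror_image(point, plane_point, plane_normal):
    """Reflect a point across a plane (the surface of the half mirror).

    Reflecting the corners of a display surface this way yields the
    corresponding virtual image plane (VF1 for display unit 211 and
    optical element 221, VF2 for 212 and 222).
    """
    p = np.asarray(point, dtype=float)
    n = np.asarray(plane_normal, dtype=float)
    n = n / np.linalg.norm(n)                      # unit normal of the mirror
    distance = np.dot(p - np.asarray(plane_point, dtype=float), n)
    return p - 2.0 * distance * n                  # mirror-symmetric point

# Hypothetical setup: a display surface point 1 unit above a 45-degree
# half mirror through the origin appears 1 unit behind the mirror.
p_virtual = mirror_image([0.0, 0.0, 1.0],
                         plane_point=[0.0, 0.0, 0.0],
                         plane_normal=[0.0, 1.0, 1.0])
# p_virtual is [0.0, -1.0, 0.0]
```

Because reflection preserves distances, the virtual image appears at the same optical distance behind the mirror as the display surface is in front of it, which is why the display units appear to stand on the virtual image planes.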
  • The visual effect determination unit 3 includes a communication unit 31, a visual effect storage unit 32, a determination unit 33, and a display control unit 34.
  • The communication unit 31 is configured by a communication interface. Standards such as Ethernet (registered trademark), FDDI (Fiber Distributed Data Interface), and Wi-Fi (registered trademark) may be used for the communication interface.
  • The visual effect storage unit 32 is configured by memories such as an HDD (Hard Disk Drive), an SSD (Solid State Drive), an EEPROM (Electrically Erasable Programmable Read-Only Memory), a ROM (Read-Only Memory), and a RAM (Random Access Memory).
  • The determination unit 33 and the display control unit 34 constitute a control unit (controller).
  • The control unit may be composed of dedicated hardware such as an ASIC (Application Specific Integrated Circuit) or an FPGA (Field-Programmable Gate Array), of a processor, or of both.
  • The communication unit 31 receives, from an external device, object image information indicating an object image and object position information indicating the position of the object OB.
  • An object image is an image including an image of the object OB, which is a specific subject.
  • The external device may be a video processing device that generates the object image information and the object position information, or any device that acquires object image information and object position information generated by such a video processing device.
  • The position of the object OB may be indicated by the distance, in a predetermined direction, from a reference plane, which is a plane in real space corresponding to the virtual image plane VF on which the virtual image is displayed, to the object OB.
  • The predetermined direction is the normal direction of the imaging surface of the camera that images the object. Note that the positional relationship of the specific subject as viewed from the camera in the predetermined direction is substantially the same as the positional relationship of the virtual image of the specific subject as viewed from the observer.
  • In this example, the predetermined direction is the direction from a predetermined position (for example, the center position) of the range in which the first player PL1, located on the near side as seen from the imaging device, exists toward a predetermined position (for example, the center position) of the range in which the second player PL2 exists.
  • The visual effect storage unit 32 stores the position of the object OB in real space in association with a visual effect.
  • The visual effect is the effect added to the object image when the virtual image display unit 2 displays the virtual image VI of the object image indicated by the object image information.
  • In this example, the position is indicated by the distance from the reference plane to the object OB in real space, and the visual effect storage unit 32 stores the distance and the visual effect in association with each other.
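Conceptually, the visual effect storage unit 32 behaves like a small table keyed by distance bands, which the determination unit 33 consults. In the minimal Python sketch below, the band boundaries are invented for illustration (the actual contents of the table shown in FIG. 5 are not reproduced in the text); only the effect names appear in this description.

```python
# Hypothetical distance bands; only the effect names come from the text.
VISUAL_EFFECT_TABLE = [
    (1.0, "none"),                   # distance below 1.0
    (3.0, "enlargement/reduction"),  # 1.0 up to (but not including) 3.0
    (float("inf"), "flash light"),   # 3.0 and beyond
]

def lookup_visual_effect(distance):
    """Return the visual effect stored for the given distance from the
    reference plane to the object, as the determination unit 33 would."""
    for upper_bound, effect in VISUAL_EFFECT_TABLE:
        if distance < upper_bound:
            return effect
    return "none"  # unreachable with the open-ended last band
```

A dictionary of exact distances would not work here, since the object's distance varies continuously; band boundaries (or interpolation) are the natural representation.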
  • The determination unit 33 determines the visual effect to be added to the object image including the image of the object OB, based on the position of the object OB, which is a specific subject whose image is included in the image.
  • The position of the object OB may be the position of the object OB in real space or the position of the image of the object OB in the video space.
  • Specifically, the determination unit 33 determines the visual effect to be added to the object image indicated by the object image information.
  • In this example, the determination unit 33 determines the visual effect based on the distance from the reference plane to the object OB in real space. Specifically, the determination unit 33 extracts the visual effect stored in the visual effect storage unit 32 in correspondence with the distance, and determines the extracted visual effect as the visual effect to be added to the object image.
  • The determination unit 33 may also determine, from among the plurality of display units 21 and based on the distance, the display unit 21 on which the object image is to be displayed.
  • Specifically, the determination unit 33 determines, as the display unit 21 that displays the object OB, the display unit 21 that displays the virtual image VI on the virtual image plane VF corresponding to the reference plane, among the plurality of reference planes, whose distance to the object OB is shortest.
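The shortest-distance rule above fits in a few lines. In this Python sketch the unit identifiers 211 and 212 mirror the two-display configuration of the text, while the dictionary representation is an illustrative assumption.

```python
def choose_display_unit(distances_by_unit):
    """Pick the display unit whose associated reference plane is closest
    to the object, per the determination unit 33's shortest-distance rule.

    distances_by_unit maps a display unit id to the distance from that
    unit's reference plane to the object in real space.
    """
    return min(distances_by_unit, key=distances_by_unit.get)

# FIG. 6A example: the ball B is 8.2 from reference plane S1 (unit 211)
# and 0.8 from reference plane S2 (unit 212), so unit 212 is chosen.
chosen = choose_display_unit({211: 8.2, 212: 0.8})
# chosen == 212
```

The same rule generalizes unchanged to three or more display units, which the text allows.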
  • The display control unit 34 controls the plurality of display units 21 to display images based on the object image.
  • The virtual image display unit 2 controlled by the display control unit 34 may be interchangeable; for example, it may be a tablet that is later replaced with VR goggles. That is, the display control unit 34 may be compatible with multiple devices.
  • The display control unit 34 controls the display unit 21 to display, as the image based on the object image, an image obtained by adding the visual effect to the object image.
  • Specifically, the display control unit 34 controls the display unit 21 to display an image obtained by adding the visual effect determined by the determination unit 33 to the object image of the object OB indicated by the object image information received by the communication unit 31.
  • When the determination unit 33 has determined the display unit 21, the display control unit 34 controls that display unit 21 to display the image obtained by adding the visual effect to the object image.
  • In this example, the display control unit 34 controls the display unit 21 determined by the determination unit 33 to display an image obtained by adding the visual effect to the object image IM3 including the image of the ball B, which is the object OB.
  • The display control unit 34 may also control a predetermined display unit 21 to display the object image of a predetermined object OB indicated by the object image information.
  • In this example, the display control unit 34 controls the display unit 211 to display the object image IM1 including the image of the first player PL1, which is a predetermined object OB.
  • The display control unit 34 also controls the display unit 212 to display the object image IM2 including the image of the second player PL2, which is a predetermined object OB.
  • In the example shown in FIG. 6A, the object image includes images of the first player PL1, the second player PL2, and the ball B. In real space, the distance between the first reference plane S1 and the second reference plane S2 is 9. The ball B, which is one of the objects, is positioned between the first reference plane S1 and the second reference plane S2, and the distance between the first reference plane S1 and the ball B is 8.2.
  • As described above, the determination unit 33 determines, as the display unit 21 that displays the object OB, the display unit 21 that displays the virtual image VI on the virtual image plane VF corresponding to the reference plane whose distance to the object OB is shortest.
  • Here, the distance from the first reference plane S1 to the ball B is 8.2, and the distance from the second reference plane S2 to the ball B is 0.8. Therefore, the determination unit 33 determines, as the display unit 21 that displays the virtual image VIb of the ball B, the display unit 212, which displays the virtual image VI2 on the virtual image plane VF2 (see FIG. 3) corresponding to the second reference plane S2, whose distance to the ball B is shortest.
  • The determination unit 33 also determines, as the visual effect to be added to the object OB, the visual effect "none" stored in the visual effect storage unit 32 in correspondence with the distance 0.8 from the second reference plane S2 to the ball B.
  • The display control unit 34 controls the display unit 212 to display the object image IM3 including the image of the ball B, which is the object OB, without any visual effect. Furthermore, the display control unit 34 controls the display unit 211 to display the object image IM1 including the image of the first player PL1, which is a predetermined object OB, and controls the display unit 212 to display the object image IM2 including the image of the second player PL2, which is a predetermined object OB. As a result, as shown in FIG. 6B, the observer observes the virtual image VI1 of the first player PL1 on the near side, and the virtual image VI2 of the second player PL2 and the virtual image VIb of the ball B on the far side.
  • The observer can therefore observe the virtual image VI1 of the first player PL1, the virtual image VI2 of the second player PL2, and the virtual image VIb of the ball B in a positional relationship similar to that of the respective objects OB in real space.
  • In the example shown in FIG. 7A, the object image includes images of the first player PL1, the second player PL2, and the ball B, as in the example shown in FIG. 6A. The distance between the first reference plane S1 and the second reference plane S2 is 9 in real space. The ball B, which is one of the objects, is positioned between the first reference plane S1 and the second reference plane S2, and, unlike the example shown in FIG. 6A, the distance between the first reference plane S1 and the ball B is 7.
  • The determination unit 33 determines, as the display unit 21 that displays the object image including the image of the ball B, the display unit 212, which displays the virtual image VI on the virtual image plane VF2 corresponding to the second reference plane S2, whose distance to the ball B is shortest.
  • The determination unit 33 also determines, as the visual effect to be added to the object OB, the visual effect "enlargement/reduction" stored in the visual effect storage unit 32 in correspondence with the distance 2 from the second reference plane S2 to the ball B.
  • Specifically, the determination unit 33 decides to add a visual effect that reduces the object image including the image of the ball B as the distance between the ball B and the second reference plane S2 becomes shorter, and enlarges it as that distance becomes longer.
  • Alternatively, the determination unit 33 may determine the display unit 211, which displays the virtual image on the virtual image plane corresponding to the first reference plane S1, as the display unit 21 that displays the object image including the image of the ball B, and, when the distance between the ball B and the first reference plane S1 is within a predetermined range, decide to enlarge the object image as that distance becomes shorter and reduce it as the distance becomes longer.
  • The display control unit 34 controls the display unit 212 to display an image to which the visual effect of enlarging or reducing the image of the ball B has been added.
  • The enlarging or reducing visual effect can be, for example, one in which the image of the ball B is enlarged as the ball B approaches the first player PL1 and reduced as the ball B moves away from the first player PL1.
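One plausible realization of this enlargement/reduction effect is a scale factor that grows as the ball recedes from the far reference plane S2, that is, as it approaches the near-side first player PL1. The linear mapping and the 0.5 to 1.5 range in this Python sketch are illustrative assumptions; the patent does not specify the mapping.

```python
def enlargement_scale(distance_to_s2, plane_separation):
    """Scale factor for the ball image: smallest when the ball is at the
    far reference plane S2, largest when it is at the near plane S1.

    distance_to_s2 is the distance from S2 to the ball in real space;
    plane_separation is the distance between S1 and S2 (9 in the text's
    examples). Both the linearity and the 0.5..1.5 range are assumptions.
    """
    t = max(0.0, min(1.0, distance_to_s2 / plane_separation))
    return 0.5 + t  # 0.5x at S2, 1.5x at S1

# With the planes 9 apart, a ball 2 away from S2 is drawn a bit smaller
# than natural size, consistent with "reduce as it nears S2".
factor = enlargement_scale(2.0, 9.0)
```

Clamping `t` to [0, 1] keeps the scale bounded even if the reported position briefly falls outside the region between the two reference planes.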
  • Furthermore, the display control unit 34 controls the display unit 211 to display the object image IM1 including the image of the first player PL1, which is a predetermined object OB, and controls the display unit 212 to display the object image IM2 including the image of the second player PL2, which is a predetermined object OB.
  • As a result, as shown in FIG. 7B, the observer observes the virtual image VI1 of the first player PL1 on the near side, and the virtual image VI2 of the second player PL2 and the virtual image VIb of the ball B on the far side. The observer can therefore observe the virtual images of the first player PL1, the second player PL2, and the ball B in a positional relationship similar to their positional relationship in real space. Furthermore, since the virtual image display unit 2 reduces or enlarges the image of the ball B according to the change in distance, the observer can perceive the movement of the ball in real space with a more realistic feeling.
  • In the example shown in FIG. 8A, the object image includes images of the first player PL1, the second player PL2, and the ball B, as in the examples shown in FIGS. 6A and 7A. The distance between the first reference plane S1 and the second reference plane S2 is 9 in real space. The ball B, which is one of the objects, is positioned between the first reference plane S1 and the second reference plane S2 and, unlike the examples shown in FIGS. 6A and 7A, the distance between the first reference plane S1 and the ball B is 5.
  • The determination unit 33 determines, as the display unit 21 that displays the object image including the image of the ball B, the display unit 212, which displays the virtual image on the virtual image plane VF2 corresponding to the second reference plane S2, whose distance to the ball B, the object OB, is shortest.
  • The determination unit 33 also determines, as the visual effect to be added to the object OB, the visual effect "flash light" stored in the visual effect storage unit 32 in correspondence with the distance 4 from the second reference plane S2 to the ball B.
  • The display control unit 34 controls the display unit 212 to display the object image including the image of the ball B with a visual effect added that makes the image of the ball B look as bright as a flash light (the shaded portion in FIG. 8B). Furthermore, the display control unit 34 controls the display unit 211 to display the object image IM1 including the image of the first player PL1, which is a predetermined object OB, and controls the display unit 212 to display the object image IM2 including the image of the second player PL2, which is a predetermined object OB.
  • As a result, as shown in FIG. 8B, the observer observes the virtual image VI1 of the first player PL1 on the near side, and the virtual image VI2 of the second player PL2 and the virtual image VIb of the ball B on the far side. The observer can therefore observe the virtual images of the first player PL1, the second player PL2, and the ball B in a positional relationship similar to their positional relationship in real space. Furthermore, by observing the image of the ball B with the flash-light visual effect added, the observer can more strongly recognize that the ball B is approaching.
  • FIG. 9 is a flow chart showing an example of the operation of the spatial image display device 1 according to the first embodiment.
  • The operation of the spatial image display device 1 described with reference to FIG. 9 corresponds to an example of the spatial image display method of the spatial image display device 1 according to the first embodiment.
  • In step S11, the communication unit 31 receives the object image information and the object position information from the external device via a communication network.
  • In step S12, the determination unit 33 determines the visual effect to be added to the object image, which includes the image of the object indicated by the object image information, based on the position of the object, which is a specific subject whose image is included in the image.
  • In step S13, the display control unit 34 determines, from among the plurality of display units 21, the display unit 21 on which the object image including the image of the object OB is to be displayed.
  • In step S14, the display control unit 34 controls the display unit 21 to display the image based on the object image and the visual effect. Specifically, the display control unit 34 controls the display unit 21 determined in step S13 to display an image obtained by adding the visual effect determined by the determination unit 33 to the object image of the object OB indicated by the object image information. Furthermore, the display control unit 34 may control a predetermined display unit 21 to display the object image of a predetermined object OB indicated by the object image information.
  • In step S15, the display unit 21 displays the image under the control of the display control unit 34. Specifically, the display unit 21 displays the image obtained by adding the visual effect determined by the determination unit 33 to the object image of the object OB indicated by the object image information. At this time, the display unit 21 may also display the object image of a predetermined object.
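Steps S11 through S15 form a simple pipeline: receive, decide the effect, pick the display unit, compose the image, display it. The Python sketch below wires together hypothetical stand-ins for the communication unit 31, determination unit 33, display control unit 34, and display unit 21; all four callables are assumptions for illustration.

```python
def run_display_pipeline(receive, determine_effect, choose_unit, compose):
    """Sketch of the flow of FIG. 9 (steps S11-S15)."""
    object_image, position = receive()        # S11: communication unit 31
    effect = determine_effect(position)       # S12: determination unit 33
    unit_id = choose_unit(position)           # S13: display control unit 34
    frame = compose(object_image, effect)     # S14: add the visual effect
    return unit_id, frame                     # S15: that unit displays frame

# Toy run with stand-in callables echoing the FIG. 6A example, where the
# ball at distance 0.8 goes to display unit 212 with no visual effect.
unit_id, frame = run_display_pipeline(
    receive=lambda: ("ball_image", 0.8),
    determine_effect=lambda pos: "none",
    choose_unit=lambda pos: 212,
    compose=lambda img, fx: (img, fx),
)
# unit_id == 212, frame == ("ball_image", "none")
```

Keeping the four stages as separate callables mirrors the device's division into communication, determination, and display-control roles, so any stage (for example, the effect table) can be swapped without touching the others.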
  • As described above, the spatial image display device 1 determines, based on the position of the object OB, which is a specific subject whose image is included in the image, the visual effect to be added to the object image including the image of the object OB, and controls the display unit 21 to display the image based on the object image and the visual effect. Therefore, the spatial image display device 1 can display virtual images of the object on a plurality of virtual image planes VF, and can provide a sufficient sense of realism to an observer observing the spatial image formed by the virtual images on the plurality of virtual image planes VF.
  • Moreover, the spatial image display device 1 receives object video information indicating an object video and object position information indicating the position of the object OB, and determines, based on the position indicated by the object position information, the visual effect to be added to the object video indicated by the object video information. Therefore, the spatial image display device 1 can display a virtual image obtained by adding a visual effect to the object video at a location remote from where the object OB is imaged.
  • Moreover, the spatial image display device 1 determines a visual effect, such as changing the brightness of each object image, based on the distance. Therefore, the spatial image display device 1 can display the virtual image VI of the moving object OB on a plurality of virtual image planes VF so that the observer can intuitively recognize the position of the object OB, giving a stronger sense of realism to the observer who observes the stereoscopic image formed by those virtual images.
  • Moreover, the spatial image display device 1 determines, from among the plurality of display units 21, the display unit 21 on which the image based on the object image is to be displayed, based on the distance, and controls the determined display unit 21 to display the image obtained by adding the visual effect to the object image. Therefore, the spatial image display device 1 can display the virtual image VI of the moving object OB on a plurality of virtual image planes VF so that the observer can intuitively recognize the position of the object OB, giving a stronger sense of realism to the observer who observes the stereoscopic image formed by those virtual images.
  • Although the spatial image display device 1 includes the virtual image display unit 2 in the first embodiment described above, the present disclosure is not limited to this configuration.
  • the spatial image display device 1 may not include the virtual image display unit 2 and may control an external virtual image display device to display the object image. In such a configuration, the spatial image display device 1 does not execute step S15 of the flowchart described above.
  • In the first embodiment described above, the position of the object OB is indicated by the distance from the reference plane to the object OB in real space, but the present disclosure is not limited to this.
  • the position of object OB may be indicated by the height from a predetermined horizontal plane (for example, ground plane) to object OB in real space.
  • the visual effect storage unit 32 stores heights and visual effects in association with each other, as shown in FIG. 10A.
  • The determination unit 33 then determines the visual effect based on the height.
  • the display control unit 34 controls the display unit 21 to display an image obtained by adding the visual effect determined by the determination unit 33 to the object image.
  • the determining unit 33 determines that the visual effect is "none" (see FIG. 10A). Then, the display control unit 34 controls the display unit 21 to display the object image without visual effects. This allows the observer to observe a spatial image formed by virtual images VI1, VI2, and VIb as shown in FIG. 10B.
  • the determination unit 33 determines the visual effect “50% transparency” (see FIG. 10A), and the display control unit 34 controls the display unit 21 to change the transparency of the object image to 50% and display it. As a result, the observer can observe a spatial image composed of the virtual images VI1 and VI2 as shown in FIG. 10C and the virtual image VIb of the object image to which the visual effect of 50% transparency has been added.
  • the determination unit 33 determines the visual effect as “80% transparency” (see FIG. 10A), and the display control unit 34 controls the display unit 21 to change the transparency of the object image to 80% and display it.
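The height-to-effect association described above can be sketched as a simple lookup, in the spirit of FIG. 10A. The height thresholds and effect labels below are invented for illustration; the actual associations are whatever the visual effect storage unit 32 holds.

```python
# Illustrative height-to-visual-effect table (cf. FIG. 10A).
# Thresholds and effect names are hypothetical examples.

HEIGHT_EFFECTS = [
    (1.0, "none"),                       # up to 1.0 m above the horizontal plane
    (2.0, "transparency 50%"),           # up to 2.0 m
    (float("inf"), "transparency 80%"),  # any greater height
]

def effect_for_height(height_m):
    """Return the visual effect associated with the object's height."""
    for upper_bound, effect in HEIGHT_EFFECTS:
        if height_m <= upper_bound:
            return effect
    return "none"
```

The determination unit would consult such a table with the height indicated by the object position information, and the display control unit would then apply the returned effect.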
  • In the example described above, the determination unit 33 determines the visual effect based on the position of the object OB, but the visual effect may instead be determined based on the rate of change of the position. The rate of change of the position can be, for example, the change in the position of the image of the object OB over a few consecutive frames of the video.
  • the visual effect storage unit 32 stores the rate of change in position and the visual effect in association with each other, as shown in FIG. 11A.
  • the display control unit 34 also controls the display unit 21 to display an image obtained by adding the visual effect determined by the determination unit 33 to the object image.
  • the determination unit 33 determines the visual effect to be "none" (see FIG. 11A). Then, the display control unit 34 controls the display unit 21 to display the object image without visual effects. This allows the observer to observe a spatial image formed by virtual images VI1, VI2, and VIb as shown in FIG. 11B.
  • the determining unit 33 determines the visual effect to be "overlapping multiple frames" (see FIG. 11A). Then, the display control unit 34 controls the display unit 21 to display an image with a visual effect of superimposing and displaying a plurality of frames of object images. As a result, the observer can observe a spatial image composed of the virtual images VI1 and VI2 as shown in FIG. 11C and the virtual image VIb of the image displayed by overlapping the object images of a plurality of frames.
  • the determination unit 33 determines the visual effect as "50% transparency”. Then, the display control unit 34 controls the display unit 21 to change the transparency of the object image to 50% and display it.
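The change-rate-based determination above can be sketched as follows, in the spirit of FIG. 11A. The numeric thresholds are hypothetical; the actual associations are stored in the visual effect storage unit 32.

```python
# Sketch: derive a per-frame change rate from recent positions of the
# object's image, then map it to a visual effect (cf. FIG. 11A).

def position_change_rate(positions):
    """Average per-frame displacement over the last few frames."""
    if len(positions) < 2:
        return 0.0
    deltas = [abs(b - a) for a, b in zip(positions, positions[1:])]
    return sum(deltas) / len(deltas)

def effect_for_change_rate(rate):
    # Illustrative thresholds: slow -> no effect, moderate -> motion-trail
    # style multi-frame overlap, fast -> 50% transparency.
    if rate < 0.1:
        return "none"
    if rate < 0.5:
        return "overlapping multiple frames"
    return "transparency 50%"
```

A fast-moving ball would thus be drawn with overlapping frames or increased transparency, softening the jump between virtual image planes.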
  • the determination unit 33 may determine the visual effect to be added to the object image based on whether the position of the object OB is within a predetermined range.
  • the visual effect storage unit 32 stores visual effects in association with whether the position of the object OB is within a predetermined range.
  • In the example shown in FIG. 12A, the area inside and on the lines of a court used in a tennis match is within the predetermined range (IN), and the area outside the lines is outside the predetermined range (OUT).
  • the display control unit 34 also controls the display unit 21 to display an image obtained by adding the visual effect determined by the determination unit 33 to the object image.
  • the determination unit 33 determines that no visual effect is to be added. Then, the display control unit 34 controls the virtual image display unit 2 to display the object image without visual effects. This allows the observer to observe a spatial image formed by virtual images VI1, VI2, and VIb as shown in FIG. 12B.
  • the determination unit 33 determines the visual effect "flash light". Then, the display control unit 34 controls the virtual image display unit 2 to display the object OB by adding a visual effect such that the image of the object OB looks bright like flash light. As a result, the observer can observe a spatial image composed of the virtual images VI1 and VI2 as shown in FIG. 12C and the virtual image VIb of the image added with the visual effect of making it look bright like flash light.
  • the determination unit 33 determines the visual effect as "50% transparency". Then, the display control unit 34 controls the virtual image display unit 2 so that the transparency of the object image is changed to 50% and displayed.
  • Although the determination unit 33 determines the visual effect based on whether the position of the object OB in real space is within a predetermined range, the visual effect may instead be determined based on whether the image of the object OB in the video is within a predetermined range.
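The IN/OUT judgement of FIG. 12A can be sketched as a simple rectangle test. The court dimensions below are illustrative stand-ins, not values taken from the specification; positions on the lines count as IN, matching the description above.

```python
# Sketch of the predetermined-range (IN/OUT) visual-effect decision.
# Court extents are hypothetical example values, in metres.

COURT_X = (0.0, 23.77)
COURT_Y = (0.0, 10.97)

def effect_for_court_position(x, y):
    """'none' when the object is inside or on the court lines (IN),
    'flash light' when it is outside them (OUT)."""
    on_or_inside = (COURT_X[0] <= x <= COURT_X[1]
                    and COURT_Y[0] <= y <= COURT_Y[1])
    return "none" if on_or_inside else "flash light"
```

Using closed comparisons (`<=`) is what makes a ball landing exactly on the line count as IN.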
  • the determination unit 33 determines the display unit 21 to display the object image based on the position of the object OB, but the present invention is not limited to this.
  • the determination unit 33 may determine to display the object image on each of the multiple display units 21.
  • the position is indicated by the distance in a predetermined direction from the reference plane, which is a plane in real space, to the object, corresponding to the virtual image plane on which the image is displayed.
  • the determination unit 33 changes, for each display unit 21, the luminance of the object image to be displayed according to the position of the object OB, so that the observer can observe the spatial image with a higher stereoscopic effect.
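The per-display luminance change above can be sketched as a depth-weighted split: the two virtual image planes bracketing the object's distance share the luminance in proportion to proximity. The plane distances below are hypothetical; this is one plausible weighting, not the claimed method.

```python
# Sketch: distribute luminance across display units according to the
# object's distance, so adjacent virtual image planes blend smoothly.
# Plane distances are illustrative example values.

PLANE_DISTANCES = [1.0, 2.0, 3.0]  # distance of each virtual image plane

def luminance_weights(object_distance):
    """One weight per display unit; bracketing planes share the luminance
    linearly, all other planes get zero."""
    d = object_distance
    weights = [0.0] * len(PLANE_DISTANCES)
    if d <= PLANE_DISTANCES[0]:
        weights[0] = 1.0
        return weights
    if d >= PLANE_DISTANCES[-1]:
        weights[-1] = 1.0
        return weights
    for i in range(len(PLANE_DISTANCES) - 1):
        near, far = PLANE_DISTANCES[i], PLANE_DISTANCES[i + 1]
        if near <= d <= far:
            t = (d - near) / (far - near)
            weights[i] = 1.0 - t
            weights[i + 1] = t
            return weights
    return weights
```

Because the weights always sum to one, the total perceived brightness stays constant while the apparent depth shifts continuously between planes.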
  • the determination unit 33 may further determine irradiation light to be applied to a member that defines the space in which the virtual image is displayed.
  • Members that define the space in which the virtual image is displayed can be, for example, floors, walls, and pillars.
  • the irradiation light is visible light.
  • the determination unit 33 may determine the color, intensity, and the like of the visible light, or may determine a projection image formed by the visible light according to the irradiation position.
  • the display control unit 34 controls the irradiation device to emit the irradiation light determined by the determination unit 33.
  • the irradiation device may be a lighting device, a projection device, or the like.
  • For example, when the ball B is positioned outside the line on the court, the determination unit 33 decides to irradiate the real court with white light of higher intensity than before. Then, the display control unit 34 controls the irradiation device to irradiate the court with white light.
  • As a result, even when the observer cannot clearly perceive whether the ball B is positioned within or on the line of the court or outside the line, for example because the actual ball B moves very fast, the observer can instantly recognize the range of the court in which the ball B is positioned from the white-light illumination. Therefore, the observer can grasp the state of the game without missing the moment.
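The irradiation decision described above can be sketched as follows. The intensity values and the dictionary shape are hypothetical; the specification only says that brighter white light is used when the ball is out.

```python
# Sketch of the irradiation-light decision: when the ball is judged to be
# outside the court lines, light the court with brighter white light.
# base_intensity and the doubling rule are illustrative assumptions.

def decide_irradiation(ball_out, base_intensity=0.5):
    if ball_out:
        return {"color": "white", "intensity": min(1.0, base_intensity * 2)}
    return {"color": "white", "intensity": base_intensity}
```

The returned settings would then be handed to whatever lighting or projection device plays the role of the irradiation device.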
  • FIG. 13 is a schematic diagram of a spatial image display device 1-1 according to the second embodiment.
  • functional units that are the same as those in the first embodiment are denoted by the same reference numerals, and descriptions thereof are omitted.
  • the spatial image display device 1-1 includes a virtual image display section 2, a visual effect determination section 3-1, and a video processing section 4-1.
  • the video processing unit 4-1 includes an input unit 41 and an object extraction unit 42.
  • the input unit 41 is configured by an input interface that receives input of information.
  • the object extraction unit 42 constitutes a control unit.
  • the input unit 41 accepts input of image information indicating an image generated by the imaging device.
  • the object extraction unit 42 extracts the image of the object OB from the image information whose input is accepted by the input unit 41. Any method may be used for the object extraction unit 42 to extract the object video.
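Since the extraction method is left open, one simple candidate is background subtraction, sketched here with nested Python lists standing in for grayscale frames. This is an illustrative stand-in, not the claimed method.

```python
# Sketch of one possible object-extraction approach: mark pixels that
# differ from a background frame by more than a threshold.
# Frames are nested lists of grayscale values (0-255); threshold is an
# illustrative assumption.

def extract_object_mask(frame, background, threshold=30):
    """Return a boolean mask that is True where the object likely is."""
    return [[abs(p - b) > threshold for p, b in zip(frow, brow)]
            for frow, brow in zip(frame, background)]
```

The True region of the mask corresponds to the object image that the object extraction unit 42 would pass on to the visual effect determination unit.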
  • the visual effect determination unit 3-1 includes a communication unit 31-1, a visual effect storage unit 32, a determination unit 33-1, and a display control unit 34.
  • the communication unit 31-1 receives object position information. Specifically, the communication unit 31-1 receives the object position information from an external device via a communication network.
  • the determination unit 33-1 determines the visual effect to be added to the object video including the image of the object OB based on the position of the object OB. Specifically, the determination unit 33-1 determines the visual effect to be added to the object video extracted by the object extraction unit 42, based on the position of the object OB indicated by the object position information received by the communication unit 31-1.
  • a specific method for determining the visual effect by the determining unit 33-1 is the same as the specific method for determining the visual effect by the determining unit 33 in the first embodiment described above.
  • FIG. 14 is a flow chart showing an example of the operation of the spatial image display device 1-1 according to the second embodiment.
  • the operation of the spatial image display device 1-1 described with reference to FIG. 14 corresponds to an example of the spatial image display method of the spatial image display device 1-1 according to the second embodiment.
  • In step S21, the input unit 41 accepts input of video information indicating the video generated by the imaging device.
  • In step S22, the object extraction unit 42 extracts object video information indicating an object video including the image of the object OB from the video information.
  • In step S23, the communication unit 31-1 receives object position information from an external device via the communication network.
  • Thereafter, the spatial image display device 1-1 executes the processing from step S24 to step S27. The processing from step S24 to step S27 is the same as the processing from step S12 to step S15 in the first embodiment.
  • the visual effect determination unit 3-1 and the video processing unit 4-1 may be configured separately.
  • the video processing unit 4-1 has a communication unit configured by a communication interface, and this communication unit transmits the object video information indicating the object video extracted by the object extraction unit 42 to the communication unit 31-1 of the visual effect determination unit 3-1.
  • FIG. 15 is a schematic diagram of a spatial image display device 1-2 according to the third embodiment.
  • the same reference numerals are given to the same functional units as in the second embodiment, and the description thereof is omitted.
  • the spatial image display device 1-2 includes a virtual image display section 2, a visual effect determination section 3-2, and an image processing section 4-2.
  • the video processing unit 4-2 includes an input unit 41, an object extraction unit 42, and an object position estimation unit 43.
  • the object position estimation unit 43 estimates the position of the object OB. Any method may be used by the object position estimation unit 43 to estimate the position of the object OB. For example, the object position estimation unit 43 may use deep learning to estimate, from the video information input to the input unit 41, the distance from the reference plane in real space to the object OB, which indicates the position of the object OB.
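Since the estimation method is left open, a learned estimator is one option; as a minimal stand-in, distance can also be derived from the object's apparent size under a pinhole-camera model. All values and names here are hypothetical illustrations.

```python
# Sketch of one simple position-estimation alternative: infer distance
# from apparent size under a pinhole-camera model.
# distance = focal_length_px * real_size_m / apparent_size_px

def estimate_distance(apparent_size_px, real_size_m, focal_length_px):
    """Estimate the distance (in metres) from the camera to the object."""
    if apparent_size_px <= 0:
        raise ValueError("apparent size must be positive")
    return focal_length_px * real_size_m / apparent_size_px
```

For example, a ball of known diameter that appears smaller in the frame is estimated to be farther from the reference plane; a deep-learning estimator would replace this function while keeping the same interface.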
  • the visual effect determination unit 3-2 includes a visual effect storage unit 32, a determination unit 33-2, and a display control unit 34.
  • the determination unit 33-2 determines a visual effect to be added to the object image including the image of the object OB, based on the position of the object OB. Specifically, the determination unit 33-2 determines the visual effect to be added to the object image extracted by the object extraction unit 42, based on the position estimated by the object position estimation unit 43.
  • a specific method for determining the visual effect by the determining unit 33-2 is the same as the specific method for determining the visual effect by the determining unit 33 in the first embodiment described above.
  • FIG. 16 is a sequence diagram showing an example of the operation of the spatial image display device 1-2 according to the third embodiment.
  • the operation of the spatial image display device 1-2 described with reference to FIG. 16 corresponds to an example of the spatial image display method of the spatial image display device 1-2 according to the third embodiment.
  • In step S31, the input unit 41 accepts input of image information indicating an image generated by an imaging device such as a camera.
  • In step S32, the object extraction unit 42 extracts object video information indicating the video of the object OB from the video information.
  • In step S33, the object position estimation unit 43 estimates the object position indicating the position of the object OB.
  • Thereafter, the spatial image display device 1-2 executes the processing from step S34 to step S37.
  • the processing from step S34 to step S37 is the same as the processing from step S12 to step S15 in the first embodiment.
  • the object position estimation unit 43 estimates the height from a predetermined horizontal plane to the object OB in real space, which indicates the position of the object OB.
  • the object position estimation unit 43 estimates the change rate of the position of the image of the object OB in the video.
  • the object position estimation unit 43 estimates whether or not the position of the object OB in real space is within a predetermined range.
  • FIG. 17 is a block diagram showing a schematic configuration of the computer 100 functioning as the determination units 33, 33-1, 33-2, and the display control unit 34, respectively.
  • the computer 100 may be a general-purpose computer, a dedicated computer, a workstation, a PC (Personal Computer), an electronic notepad, or the like.
  • Program instructions may be program code, code segments, etc. for performing the required tasks.
  • the computer 100 includes a processor 110, a ROM (Read Only Memory) 120, a RAM (Random Access Memory) 130, a storage 140, an input unit 150, a display unit 160, and a communication interface (I/F) 170.
  • the processor 110 is specifically a CPU (Central Processing Unit), an MPU (Micro Processing Unit), a GPU (Graphics Processing Unit), a DSP (Digital Signal Processor), an SoC (System on a Chip), or the like, and may be configured by a plurality of processors of the same or different types.
  • the processor 110 controls each configuration and executes various arithmetic processing. That is, the processor 110 reads a program from the ROM 120 or the storage 140 and executes the program using the RAM 130 as a work area. The processor 110 performs control of each configuration and various arithmetic processing according to programs stored in the ROM 120 or the storage 140.
  • the ROM 120 or storage 140 stores the program according to the present disclosure.
  • the program may be stored in a storage medium readable by the computer 100.
  • a program can be installed in the computer 100 by using such a storage medium.
  • the storage medium storing the program may be a non-transitory storage medium.
  • the non-transitory storage medium is not particularly limited, but may be, for example, a CD-ROM, a DVD-ROM, a USB (Universal Serial Bus) memory, or the like.
  • this program may be downloaded from an external device via a network.
  • the ROM 120 stores various programs and various data.
  • RAM 130 temporarily stores programs or data as a work area.
  • the storage 140 is configured by a HDD (Hard Disk Drive) or SSD (Solid State Drive) and stores various programs including an operating system and various data.
  • the input unit 150 includes one or more input interfaces that receive user's input operations and acquire information based on the user's operations.
  • the input unit 150 is a pointing device, keyboard, mouse, etc., but is not limited to these.
  • the display unit 160 includes one or more output interfaces that output information.
  • the display unit 160 is a display that outputs information as video or a speaker that outputs information as audio, but is not limited to these.
  • the display unit 160 also functions as the input unit 150 when it is a touch panel type display.
  • a communication interface (I/F) 170 is an interface for communicating with an external device.
  • (Appendix 1) A spatial image display device comprising: a plurality of display units for displaying images; a plurality of optical elements arranged so that image light emitted from the images displayed on the plurality of display units is reflected and the reflected image light reaches an observer's eye, whereby virtual images of the images are displayed at different distances in the same direction from the observer; and a control unit, wherein the control unit determines a visual effect to be added to an object image including an image of an object, which is a specific subject whose image is included in the image, based on the position of the object, and controls the display units to display the image based on the object image and the visual effect.
  • (Appendix 2) The spatial image display device according to appendix 1, further comprising a communication unit that receives object video information indicating the object video and object position information indicating the position of the object, wherein the control unit determines the visual effect to be added to the object video indicated by the object video information based on the position indicated by the object position information.
  • (Appendix 3) The spatial image display device according to appendix 1 or 2, wherein the control unit estimates the position of the object and determines the visual effect based on the position.
  • (Appendix 4) The spatial image display device according to any one of appendices 1 to 3, wherein the position is indicated by the distance from a reference plane, which is a plane in real space, to the object, corresponding to the virtual image plane on which the virtual image is displayed, and the control unit determines, from among the plurality of display units, the display unit on which to display the object image based on the distance, and controls the determined display unit to display the image obtained by adding the visual effect to the object image.
  • (Appendix 5) The spatial image display device according to any one of appendices 1 to 4, wherein the position is indicated by a height from a predetermined horizontal plane to the object in real space, and the control unit determines the visual effect based on the height.
  • (Appendix 6) The spatial image display device according to any one of appendices 1 to 5, wherein the control unit determines the visual effect based on the rate of change of the position.
  • (Appendix 7) The spatial image display device according to any one of appendices 1 to 6, wherein the control unit determines the visual effect based on whether the position is within a predetermined range.
  • (Appendix 8) The spatial image display device according to appendix 3, wherein the position is indicated by a distance in a predetermined direction from a reference plane, which is a plane in real space, to the object, corresponding to the virtual image plane on which the virtual image is displayed, and the control unit determines, based on the distance, the visual effect of changing the brightness of each of the object images displayed on the plurality of display units.
  • (Appendix 9) The spatial image display device according to any one of appendices 1 to 8, wherein the control unit determines irradiation light to be applied to a member defining the space in which the virtual image is displayed.
  • (Appendix 10)
  • (Appendix 11) A spatial image display method for a spatial image display device comprising a plurality of display units that display images and a plurality of optical elements arranged so that image light emitted from the images displayed on the plurality of display units is reflected and the reflected image light reaches an observer's eye, whereby virtual images of the images are displayed at different distances in the same direction from the observer, the method comprising: determining a visual effect to be added to an object image including an image of an object, which is a specific subject whose image is included in the image, based on the position of the object; and controlling the plurality of display units to display the image based on the object image and the visual effect.
  • (Appendix 12) A non-transitory storage medium storing a program executable by a computer, the program causing the computer to function as the determination unit and the display control unit according to any one of appendices 1 to 10.


Abstract

A spatial image display device (1) of the present disclosure comprises: a plurality of display units (21) that display images; a plurality of optical elements (22) that each reflect image light emitted from the images displayed on the plurality of display units (21) and are arranged so that the reflected image light reaches an observer's eye, whereby virtual images (VI) of the images are displayed at different distances in the same direction from the observer; a determination unit (33) that determines a visual effect to be added to an object image containing an image of an object (OB), which is a specific subject whose image is included in the image, based on the position of the object (OB); and a display control unit (34) that controls the display units (21) so that the image is displayed on the basis of the object image and the visual effect.

Description

SPATIAL IMAGE DISPLAY DEVICE, SPATIAL IMAGE DISPLAY METHOD, AND PROGRAM

The present disclosure relates to a spatial image display device, a spatial image display method, and a program.

A technique is known for displaying a virtual image in space by reflecting image light, which is light emitted from an image displayed on a display device, so that the reflected image light reaches the eyes of an observer.

It is also known to display a spatial image composed of a plurality of virtual images by displaying the virtual images at different distances in the same direction from the observer (Patent Literature 1). This allows the observer to observe the spatial image with a stronger stereoscopic effect. The stereoscopic effect can be further enhanced by designing the distances between the virtual images to be short.

A stereoscopic image can also be provided by displaying a plurality of images with different luminances at different distances in the same direction from the observer (Patent Literature 2). To the observer, a stereoscopic image appears to be formed even at positions where no image exists, and as a result the observer can view the image with a strong stereoscopic effect.

Patent Literature 1: WO 2017/038091
Patent Literature 2: JP 2009-237310 A

However, in the technique described in Patent Literature 1, each of the plurality of virtual images is formed on a two-dimensional virtual image plane. Therefore, in a configuration in which a plurality of virtual image planes are positioned at different distances in the same direction as viewed from the observer, if an object (for example, a ball used in a sports game) moves in a direction corresponding to the observer's viewing direction, the virtual image of the object may be displayed on one virtual image plane at one time and on another virtual image plane at another time. In such a case, the virtual image of the object appears to the observer to jump instantaneously from one virtual image plane to another, which looks unnatural compared with the movement of the object in real space. The observer may therefore not obtain a sufficient sense of realism.

Also, if the technique described in Patent Literature 2 is applied to video showing a moving object so that a virtual image of the object is displayed on each of a plurality of virtual image planes, parallax may arise between the plurality of virtual images depending on the observer's position, again preventing a sufficient sense of realism.

In view of such circumstances, an object of the present disclosure is to provide a spatial image display device, a spatial image display method, and a program that can give an observer a sufficient sense of realism when virtual images of an object are displayed on a plurality of virtual image planes.

To solve the above problem, a spatial image display device according to the present disclosure includes: a plurality of display units that display images; a plurality of optical elements arranged so that image light emitted from the images displayed on the plurality of display units is reflected and the reflected image light reaches an observer's eye, whereby virtual images of the images are displayed at different distances in the same direction from the observer; a determination unit that determines a visual effect to be added to an object image including an image of an object, which is a specific subject whose image is included in the image, based on the position of the object; and a display control unit that controls the display units to display the image based on the object image and the visual effect.

To solve the above problem, a spatial image display method according to the present disclosure is a spatial image display method for a spatial image display device including a plurality of display units that display images and a plurality of optical elements arranged so that image light emitted from the images displayed on the plurality of display units is reflected and the reflected image light reaches an observer's eye, whereby virtual images of the images are displayed at different distances in the same direction from the observer, the method including: determining a visual effect to be added to an object image including an image of an object, which is a specific subject whose image is included in the image, based on the position of the object; and controlling the display units to display the image based on the object image and the visual effect.

Also, to solve the above problem, a program according to the present disclosure causes a computer to function as the spatial image display device described above.

According to the spatial image display device, the spatial image display method, and the program of the present disclosure, it is possible to give an observer a sufficient sense of realism when virtual images of an object are displayed on a plurality of virtual image planes.
第1の実施形態に係る空間像表示装置の概略図である。FIG. 1 is a schematic diagram of a spatial image display device according to the first embodiment.
図1に示す虚像表示部の概略図である。FIG. 2 is a schematic diagram of the virtual image display unit shown in FIG. 1.
図2に示す虚像表示部によって表示される虚像を説明するための図である。FIG. 3 is a diagram for explaining virtual images displayed by the virtual image display unit shown in FIG. 2.
観察者から見た虚像を示す図である。FIG. 4 is a diagram showing the virtual images as seen from the observer.
図1に示す視覚効果記憶部に記憶されている視覚効果の一例を示す図である。FIG. 5 is a diagram showing an example of the visual effects stored in the visual effect storage unit shown in FIG. 1.
実空間におけるオブジェクトの位置の一例を示す図である。FIG. 6A is a diagram showing an example of the position of an object in real space.
図6Aに示すオブジェクトの位置に基づいて表示される虚像の一例を示す図である。FIG. 6B is a diagram showing an example of virtual images displayed based on the object position shown in FIG. 6A.
実空間におけるオブジェクトの位置の他の例を示す図である。FIG. 7A is a diagram showing another example of the position of an object in real space.
図7Aに示すオブジェクトの位置に基づいて表示される虚像の一例を示す図である。FIG. 7B is a diagram showing an example of virtual images displayed based on the object position shown in FIG. 7A.
実空間におけるオブジェクトの位置のさらなる他の例を示す図である。FIG. 8A is a diagram showing still another example of the position of an object in real space.
図8Aに示すオブジェクトの位置に基づいて表示される虚像の一例を示す図である。FIG. 8B is a diagram showing an example of virtual images displayed based on the object position shown in FIG. 8A.
図1に示す空間像表示装置における動作の一例を示すフローチャートである。FIG. 9 is a flowchart showing an example of the operation of the spatial image display device shown in FIG. 1.
視覚効果の第1の変形例を示す図である。FIG. 10A is a diagram showing a first modification of the visual effects.
図10Aに示す視覚効果に基づいて表示される虚像の一例を示す図である。FIG. 10B is a diagram showing an example of virtual images displayed based on the visual effect shown in FIG. 10A.
図10Aに示す視覚効果に基づいて表示される虚像の他の例を示す図である。FIG. 10C is a diagram showing another example of virtual images displayed based on the visual effect shown in FIG. 10A.
視覚効果の第2の変形例を示す図である。FIG. 11A is a diagram showing a second modification of the visual effects.
図11Aに示す視覚効果に基づいて表示される虚像の一例を示す図である。FIG. 11B is a diagram showing an example of virtual images displayed based on the visual effect shown in FIG. 11A.
図11Aに示す視覚効果に基づいて表示される虚像の他の例を示す図である。FIG. 11C is a diagram showing another example of virtual images displayed based on the visual effect shown in FIG. 11A.
視覚効果の第3の変形例を示す図である。FIG. 12A is a diagram showing a third modification of the visual effects.
図12Aに示す視覚効果に基づいて表示される虚像の一例を示す図である。FIG. 12B is a diagram showing an example of virtual images displayed based on the visual effect shown in FIG. 12A.
図12Aに示す視覚効果に基づいて表示される虚像の他の例を示す図である。FIG. 12C is a diagram showing another example of virtual images displayed based on the visual effect shown in FIG. 12A.
第2の実施形態に係る空間像表示装置の概略図である。FIG. 13 is a schematic diagram of a spatial image display device according to a second embodiment.
図13に示す空間像表示装置における動作の一例を示すシーケンス図である。FIG. 14 is a sequence diagram showing an example of the operation of the spatial image display device shown in FIG. 13.
第3の実施形態に係る空間像表示装置の概略図である。FIG. 15 is a schematic diagram of a spatial image display device according to a third embodiment.
図15に示す空間像表示装置における動作の一例を示すシーケンス図である。FIG. 16 is a sequence diagram showing an example of the operation of the spatial image display device shown in FIG. 15.
空間像表示装置のハードウェアブロック図である。FIG. 17 is a hardware block diagram of the spatial image display device.
<<第1の実施形態>>
 図1を参照して第1の実施形態の全体構成について説明する。図1は、第1の実施形態に係る空間像表示装置1の概略図である。
<<First Embodiment>>
The overall configuration of the first embodiment will be described with reference to FIG. FIG. 1 is a schematic diagram of a spatial image display device 1 according to the first embodiment.
 図1に示すように、第1の実施形態に係る空間像表示装置1は、虚像表示部2と、視覚効果決定部3とを備える。 As shown in FIG. 1, the spatial image display device 1 according to the first embodiment includes a virtual image display section 2 and a visual effect determination section 3.
 <虚像表示部の構成>
 図2に示すように、虚像表示部2は、複数の表示部21と、複数の光学素子22とを備える。表示部21は、有機EL(Electro Luminescence)、液晶パネル等によって構成される。有機EL、液晶パネル等は、タブレット、VRゴーグル等のデバイスに取り付けられたものであってよい。光学素子22は、入射する光の一部を透過させ、該光の残りの一部を反射させる、例えば、ハーフミラーによって構成される。図2に示す例では、虚像表示部2は、2つの表示部211及び212と、2つの光学素子221及び222とを備えるが、これに限られず、3つ以上の表示部21及び3つ以上の光学素子22を備えてもよい。
<Configuration of virtual image display unit>
As shown in FIG. 2, the virtual image display unit 2 includes a plurality of display units 21 and a plurality of optical elements 22. The display unit 21 is configured by an organic EL (Electro Luminescence) display, a liquid crystal panel, or the like. The organic EL display, liquid crystal panel, etc. may be one attached to a device such as a tablet or VR goggles. The optical element 22 is configured by, for example, a half mirror that transmits part of the incident light and reflects the remaining part of the light. In the example shown in FIG. 2, the virtual image display unit 2 includes two display units 211 and 212 and two optical elements 221 and 222; however, the configuration is not limited to this, and three or more display units 21 and three or more optical elements 22 may be provided.
 表示部21は、映像を表示する。具体的には、表示部21は、視覚効果決定部3の制御に基づいて、オブジェクト映像に基づく映像を表示する。オブジェクト映像は、カメラ等の撮像装置によって生成された映像の一部であって、オブジェクトOBの像を含む映像である。オブジェクトOBは、任意の被写体であるが、例えば、映像に含まれる、観察者によって注目され得る被写体とすることができる。撮像装置がテニスの試合を撮像することによって生成された映像においては、オブジェクトOBは、例えば、第1の選手PL1、該第1の選手PL1の対戦相手である第2の選手PL2、並びに第1の選手PL1及び第2の選手PL2が試合中に用いているボールBである。また、本例において、撮像装置は、該撮像装置の撮像面に対して、第1の選手PL1が手前側に、第2の選手PL2が奥側に位置するように配置されている。 The display unit 21 displays images. Specifically, the display unit 21 displays an image based on an object image under the control of the visual effect determination unit 3. An object image is a part of an image generated by an imaging device such as a camera and includes an image of an object OB. The object OB may be any subject; for example, it can be a subject included in the image that is likely to draw the observer's attention. In an image generated by an imaging device capturing a tennis match, the objects OB are, for example, a first player PL1, a second player PL2 who is the opponent of the first player PL1, and a ball B used by the first player PL1 and the second player PL2 during the match. Also, in this example, the imaging device is arranged so that the first player PL1 is positioned on the near side and the second player PL2 on the far side with respect to the imaging surface of the imaging device.
 オブジェクト映像に基づく映像は、オブジェクト映像に、視覚効果決定部3によって決定された視覚効果が付加された映像を含む。オブジェクト映像に基づく映像は、オブジェクト映像に、視覚効果決定部3によって決定された視覚効果が付加されていない映像、すなわちオブジェクト映像そのものを含んでもよい。すなわち、表示部21は、オブジェクト映像に、視覚効果決定部3によって決定された視覚効果が付加された映像を表示する。また、表示部21は、オブジェクト映像に、視覚効果が付加されていない映像を表示してもよい。なお、オブジェクト映像に視覚効果が付加された映像とは、オブジェクト映像に加工が施された映像である。該映像は、追って詳細に説明するように、視覚効果の種類に応じてオブジェクト映像に加工が施されることによって生成される。 An image based on the object image includes an image in which the visual effect determined by the visual effect determination unit 3 is added to the object image. An image based on the object image may also include an image in which the visual effect determined by the visual effect determination unit 3 is not added, that is, the object image itself. In other words, the display unit 21 displays an image obtained by adding the visual effect determined by the visual effect determination unit 3 to the object image. The display unit 21 may also display an object image to which no visual effect is added. Note that an image in which a visual effect is added to the object image is an image obtained by processing the object image. As will be described later in detail, such an image is generated by processing the object image according to the type of visual effect.
 上述したような、撮像装置がテニスの試合を撮像した映像を生成する例では、図3に示すように、表示部211は、オブジェクトOBの1つである第1の選手PL1の像を含むオブジェクト映像IM1を表示する。表示部212は、オブジェクトOBの1つである第2の選手PL2の像を含むオブジェクト映像IM2を表示する。さらに、表示部212は、視覚効果決定部3の制御に基づいて、オブジェクトOBの1つであるボールBの像を含むオブジェクト映像IM3を表示することがある。このとき表示部211は、オブジェクト映像IM3を表示しなくてもよい。また、表示部211は、オブジェクト映像IM3を表示することがある。このとき表示部212は、オブジェクト映像IM3を表示しなくてもよい。 In the above-described example in which the imaging device generates video of a tennis match, as shown in FIG. 3, the display unit 211 displays an object image IM1 including the image of the first player PL1, which is one of the objects OB. The display unit 212 displays an object image IM2 including the image of the second player PL2, which is one of the objects OB. Furthermore, the display unit 212 may display an object image IM3 including the image of the ball B, which is one of the objects OB, under the control of the visual effect determination unit 3. In this case, the display unit 211 need not display the object image IM3. Alternatively, the display unit 211 may display the object image IM3, in which case the display unit 212 need not display the object image IM3.
 光学素子221及び222は、複数の表示部211及び212に表示された映像から出射された映像光をそれぞれ反射させる。光学素子221及び222は、反射した映像光が観察者の眼に到達することによって、観察者から同じ方向の異なる距離に、映像の虚像VIが表示されるように配置されている。図2及び図3に示す例では、観察者から、一の虚像面VFである虚像面VF1までの距離と、観察者から、他の虚像面VFである虚像面VF2までの距離とは互いに異なる。さらに、観察者から虚像面VF1への方向と、観察者から虚像面VF2への方向とは同じであってもよく、このような構成では、観察者にとって、虚像面VF1は虚像面VF2より手前に位置する。虚像面VF1の位置は、表示部211及び光学素子221の位置に基づいて、公知の手法により算出することができ、虚像面VF1が所望の位置に配置されるように表示部211及び光学素子221が配置される。例えば、虚像面VF1は、光学素子221を挟んで表示部211の表示面と対称となる平面に位置するため、虚像面VF1を表示させる所望の位置に対して表示部211の表示面が対称となるよう光学素子221を配置してもよい。同様にして、虚像面VF2の位置は、表示部212及び光学素子222の位置に基づいて、公知の手法により算出することができ、虚像面VF2が所望の位置に配置されるように表示部212及び光学素子222が配置される。例えば、虚像面VF2は、光学素子222を挟んで表示部212の表示面と対称となる平面に位置するため、虚像面VF2を表示させる所望の位置に対して表示部212の表示面が対称となるよう光学素子222を配置してもよい。 The optical elements 221 and 222 respectively reflect image light emitted from the images displayed on the display units 211 and 212. The optical elements 221 and 222 are arranged so that, when the reflected image light reaches the observer's eyes, virtual images VI of the images are displayed at different distances in the same direction from the observer. In the example shown in FIGS. 2 and 3, the distance from the observer to the virtual image plane VF1, which is one virtual image plane VF, differs from the distance from the observer to the virtual image plane VF2, which is another virtual image plane VF. Furthermore, the direction from the observer to the virtual image plane VF1 may be the same as the direction from the observer to the virtual image plane VF2; in such a configuration, the virtual image plane VF1 is located nearer to the observer than the virtual image plane VF2. The position of the virtual image plane VF1 can be calculated by a known method based on the positions of the display unit 211 and the optical element 221, and the display unit 211 and the optical element 221 are arranged so that the virtual image plane VF1 is located at a desired position. For example, since the virtual image plane VF1 lies on the plane symmetrical to the display surface of the display unit 211 with respect to the optical element 221, the optical element 221 may be arranged so that the display surface of the display unit 211 is symmetrical to the desired position at which the virtual image plane VF1 is to be displayed. Similarly, the position of the virtual image plane VF2 can be calculated by a known method based on the positions of the display unit 212 and the optical element 222, and the display unit 212 and the optical element 222 are arranged so that the virtual image plane VF2 is located at a desired position. For example, since the virtual image plane VF2 lies on the plane symmetrical to the display surface of the display unit 212 with respect to the optical element 222, the optical element 222 may be arranged so that the display surface of the display unit 212 is symmetrical to the desired position at which the virtual image plane VF2 is to be displayed.
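The mirror-symmetry relationship described above can be illustrated with a small geometric sketch. This is not part of the disclosure; it simply shows, under the simplifying assumption of a flat half mirror, that the virtual image of a display pixel lies at the reflection of that pixel across the mirror plane, which is how a desired virtual image plane position can be checked for a given display/mirror layout.

```python
import numpy as np

def reflect_point(point, plane_point, plane_normal):
    """Reflect a 3-D point across the plane defined by a point on the
    plane and its normal. For a flat half mirror, the virtual image of
    a display pixel lies at this mirrored position."""
    n = np.asarray(plane_normal, dtype=float)
    n = n / np.linalg.norm(n)  # normalize so the projection distance is metric
    p = np.asarray(point, dtype=float)
    d = np.dot(p - np.asarray(plane_point, dtype=float), n)  # signed distance to plane
    return p - 2.0 * d * n

# Example layout (hypothetical numbers): a display surface centered 0.3 m
# above the origin, a half mirror through the origin tilted 45 degrees
# between the up axis (y) and the viewing axis (z).
mirror_normal = np.array([0.0, 1.0, -1.0])
display_center = np.array([0.0, 0.3, 0.0])
virtual_center = reflect_point(display_center, np.zeros(3), mirror_normal)
# The virtual image plane appears 0.3 m along the viewing axis: [0, 0, 0.3]
```

Reflecting the result again across the same plane returns the original display position, which is a quick sanity check on the symmetry argument in the text.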
 上述した、撮像装置がテニスの試合を撮像する例においては、光学素子221及び222は、虚像面VF1及び虚像面VF2が実物のコート上に位置するように配置される。具体的には、光学素子221は、虚像面VF1が実物のコート上における実物のネットより観察者に近い側に位置するように配置される。また、光学素子222は、虚像面VF2が実物のコート上における実物のネットより観察者から遠い側に位置するように配置される。このように、複数の表示部211及び表示部212、並びに光学素子221及び光学素子222を組み合わせることで、実物のコート、ネット等が配置された空間内に虚像を投影することができ、観察者が有する立体感を高めることができる。 In the above-described example in which the imaging device captures a tennis match, the optical elements 221 and 222 are arranged so that the virtual image planes VF1 and VF2 are positioned on the actual court. Specifically, the optical element 221 is arranged so that the virtual image plane VF1 is positioned on the side of the actual net on the actual court that is closer to the observer. The optical element 222 is arranged so that the virtual image plane VF2 is positioned on the side of the actual net on the actual court that is farther from the observer. By combining the display units 211 and 212 with the optical elements 221 and 222 in this way, virtual images can be projected into the space in which the actual court, net, and so on are arranged, which enhances the observer's sense of depth.
 これによって、図4に示すように、観察者は、自身にとって手前側に虚像面VF1に表示された虚像VI1を観察し、奥側に虚像面VF2に表示された虚像VI2を観察する。なお、図4に示されている、虚像VIbは、虚像面VF1及び虚像面VF2のいずれかに表示される。虚像VIbが表示される虚像面VFについては追って詳細に説明する。したがって、観察者は、虚像VI1、虚像VI2、及び虚像VIbによって構成される空間像を観察する。このような虚像は、一般にペッパーズゴースト等と呼ばれ、観察者からはあたかも虚像面VF1及び虚像面VF2にそれぞれ表示部211及び表示部212があるように見える。 As a result, as shown in FIG. 4, the observer observes the virtual image VI1 displayed on the virtual image plane VF1 on the near side, and observes the virtual image VI2 displayed on the virtual image plane VF2 on the far side. The virtual image VIb shown in FIG. 4 is displayed on either the virtual image plane VF1 or the virtual image plane VF2. The virtual image plane VF on which the virtual image VIb is displayed will be described later in detail. Therefore, the observer observes a spatial image composed of the virtual image VI1, the virtual image VI2, and the virtual image VIb. Such a virtual image is generally called a pepper's ghost or the like, and it looks to the observer as if the display portions 211 and 212 are on the virtual image planes VF1 and VF2, respectively.
 <視覚効果決定部の構成>
 図1に示すように、視覚効果決定部3は、通信部31と、視覚効果記憶部32と、決定部33と、表示制御部34とを備える。
<Configuration of visual effect determination unit>
As shown in FIG. 1, the visual effect determination unit 3 includes a communication unit 31, a visual effect storage unit 32, a determination unit 33, and a display control unit 34.
 通信部31は、通信インターフェースによって構成される。通信インターフェースには、例えば、イーサネット(登録商標)、FDDI(Fiber Distributed Data Interface)、Wi-Fi(登録商標)等の規格が用いられてもよい。視覚効果記憶部32は、HDD(Hard Disk Drive)、SSD(Solid State Drive)、EEPROM(Electrically Erasable Programmable Read-Only Memory)、ROM(Read-Only Memory)およびRAM(Random Access Memory)等のメモリによって構成される。決定部33及び表示制御部34は、制御部(コントローラ)を構成する。制御部は、ASIC(Application Specific Integrated Circuit)、FPGA(Field-Programmable Gate Array)等の専用のハードウェアによって構成されてもよいし、プロセッサによって構成されてもよいし、双方を含んで構成されてもよい。 The communication unit 31 is configured by a communication interface. Standards such as Ethernet (registered trademark), FDDI (Fiber Distributed Data Interface), and Wi-Fi (registered trademark) may be used for the communication interface. The visual effect storage unit 32 is configured by memory such as an HDD (Hard Disk Drive), SSD (Solid State Drive), EEPROM (Electrically Erasable Programmable Read-Only Memory), ROM (Read-Only Memory), or RAM (Random Access Memory). The determination unit 33 and the display control unit 34 constitute a control unit (controller). The control unit may be configured by dedicated hardware such as an ASIC (Application Specific Integrated Circuit) or FPGA (Field-Programmable Gate Array), may be configured by a processor, or may include both.
 通信部31は、オブジェクト映像を示すオブジェクト映像情報、及びオブジェクトOBの位置を示すオブジェクト位置情報を外部の装置から受信する。オブジェクト映像は、特定の被写体であるオブジェクトOBの像を含む映像である。外部の装置は、オブジェクト映像情報及びオブジェクト位置情報を生成する映像処理装置であってもよいし、映像処理装置によって生成されたオブジェクト映像情報及びオブジェクト位置情報を取得した任意の装置であってもよい。 The communication unit 31 receives, from an external device, object video information indicating an object image and object position information indicating the position of the object OB. The object image is an image including the image of the object OB, which is a specific subject. The external device may be a video processing device that generates the object video information and the object position information, or may be any device that has acquired the object video information and the object position information generated by a video processing device.
 一例として、オブジェクトOBの位置は、虚像が表示される虚像面VFに対応する、実空間上の面である基準面からオブジェクトOBまでの所定方向の距離によって示されてもよい。所定方向は、オブジェクトを撮像するカメラの撮像面の法線方向である。なお、カメラから所定方向に見た特定の被写体の位置関係と、観察者から見た特定の被写体の虚像の位置関係とは略同じである。また、上述した、撮像装置がテニスの試合を撮像する例においては、所定方向は、撮像装置から、手前側に位置する第1の選手PL1が存在する範囲の所定位置(例えば、中心位置)に向かう方向であって、撮像装置から、第2の選手PL2が存在する範囲の所定位置(例えば、中心位置)に向かう方向である。 As an example, the position of the object OB may be indicated by the distance, in a predetermined direction, from a reference plane, which is a plane in real space corresponding to the virtual image plane VF on which the virtual image is displayed, to the object OB. The predetermined direction is the normal direction of the imaging surface of the camera that images the object. Note that the positional relationship of specific subjects as seen from the camera in the predetermined direction is substantially the same as the positional relationship of the virtual images of those subjects as seen from the observer. In the above-described example in which the imaging device captures a tennis match, the predetermined direction is the direction from the imaging device toward a predetermined position (for example, the center position) of the range in which the first player PL1, located on the near side, exists, and likewise the direction from the imaging device toward a predetermined position (for example, the center position) of the range in which the second player PL2 exists.
 視覚効果記憶部32は、実空間における、オブジェクトOBの位置と、視覚効果とを対応付けて記憶している。視覚効果は、虚像表示部2がオブジェクト映像情報によって示されるオブジェクト映像の虚像VIを表示するにあたって、該オブジェクト映像に付される視覚的な効果である。図5に示す例では、位置は、実空間における、基準面からオブジェクトOBまでの距離で示され、視覚効果記憶部32は、該距離と、視覚効果とを対応付けて記憶している。 The visual effect storage unit 32 associates and stores the position of the object OB in the real space with the visual effect. The visual effect is a visual effect given to the object image when the virtual image display unit 2 displays the virtual image VI of the object image indicated by the object image information. In the example shown in FIG. 5, the position is indicated by the distance from the reference plane to the object OB in real space, and the visual effect storage unit 32 stores the distance and the visual effect in association with each other.
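The distance-to-effect association held by the visual effect storage unit 32 can be sketched as a simple lookup table. The concrete distance ranges below are hypothetical (the disclosure does not give the contents of FIG. 5); they are chosen only so that the three worked examples later in the text (distances 0.8, 2, and 4 from the second reference plane) fall into distinct rows.

```python
# Hypothetical stand-in for the Fig. 5 table: [low, high) distance ranges
# (in the same units as the reference-plane distances) mapped to an effect.
EFFECT_TABLE = [
    (0.0, 1.0, "none"),    # 「なし」  - object is close to the virtual image plane
    (1.0, 3.0, "scale"),   # 「拡大/縮小」 - enlarge/reduce with distance
    (3.0, 4.5, "flash"),   # 「フラッシュ光」 - flash-light highlight
]

def lookup_effect(distance):
    """Return the visual effect whose [low, high) range contains the
    given reference-plane-to-object distance; default to no effect."""
    for low, high, effect in EFFECT_TABLE:
        if low <= distance < high:
            return effect
    return "none"
```

With these assumed ranges, `lookup_effect(0.8)` yields no effect, `lookup_effect(2)` yields the enlarge/reduce effect, and `lookup_effect(4)` yields the flash-light effect, matching the three examples.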
 決定部33は、映像に像が含まれる特定の被写体であるオブジェクトOBの位置に基づいて、オブジェクトOBの像を含むオブジェクト映像に付加する視覚効果を決定する。オブジェクトOBの位置は、実空間におけるオブジェクトOBの位置であってもよいし、映像空間におけるオブジェクトOBの像の位置であってもよい。第1の実施形態では、決定部33は、通信部31によって受信されたオブジェクト映像情報に対応する、通信部31によって受信されたオブジェクト位置情報に基づいて、該映像情報が示すオブジェクト映像に付加する視覚効果を決定する。 The determination unit 33 determines, based on the position of the object OB, which is a specific subject whose image is included in the image, a visual effect to be added to the object image including the image of the object OB. The position of the object OB may be the position of the object OB in real space, or the position of the image of the object OB in the image space. In the first embodiment, the determination unit 33 determines the visual effect to be added to the object image indicated by the object video information received by the communication unit 31, based on the object position information received by the communication unit 31 that corresponds to that object video information.
 図5に示す例において、決定部33は、実空間における、基準面からオブジェクトOBまでの距離に基づいて視覚効果を決定する。具体的には、決定部33は、該距離に対応して、視覚効果記憶部32に記憶されている視覚効果を抽出する。そして、決定部33は、オブジェクト映像に付加する視覚効果を、抽出した視覚効果と決定する。 In the example shown in FIG. 5, the determination unit 33 determines the visual effect based on the distance from the reference plane to the object OB in real space. Specifically, the determination unit 33 extracts the visual effect stored in the visual effect storage unit 32 corresponding to the distance. Then, the determination unit 33 determines the visual effect to be added to the object video as the extracted visual effect.
 また、オブジェクトOBの位置が、基準面からオブジェクトOBまでの距離で示される構成において、決定部33は、該距離に基づいて、複数の表示部21の中から、オブジェクト映像を表示させる表示部21を決定してもよい。一例として、決定部33は、オブジェクトOBを表示させる表示部21を、複数の基準面のうち、オブジェクトOBまでの距離が最も短い基準面に対応する虚像面VFに虚像VIを表示させる表示部21と決定する。 In a configuration in which the position of the object OB is indicated by the distance from a reference plane to the object OB, the determination unit 33 may determine, based on that distance, which of the plurality of display units 21 is to display the object image. As an example, the determination unit 33 determines that the display unit 21 that is to display the object OB is the display unit 21 that displays a virtual image VI on the virtual image plane VF corresponding to the reference plane, among the plurality of reference planes, with the shortest distance to the object OB.
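The nearest-reference-plane rule just described amounts to an argmin over the per-plane distances. A minimal sketch (not from the disclosure; the index-to-display mapping is an assumption for illustration):

```python
def choose_display(object_distances):
    """Given the distance from each reference plane to the object, where
    index i corresponds to the display unit paired with reference plane i,
    return the index of the display unit whose reference plane is nearest,
    as the determination unit 33 does in the example above."""
    return min(range(len(object_distances)), key=lambda i: object_distances[i])

# Fig. 6A worked example: 8.2 from the first plane, 0.8 from the second,
# so the second display unit (212 in the text) is chosen.
nearest = choose_display([8.2, 0.8])  # -> 1
```

On a tie, `min` keeps the first (nearer-to-observer) plane; the disclosure does not specify tie-breaking, so that choice is arbitrary here.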
 表示制御部34は、オブジェクト映像に基づく映像を表示するよう複数の表示部21を制御する。表示制御部34が制御する虚像表示部2は可変であってよく、例えば、タブレットであってもよいし、その後、VRゴーグルに取り換えられてもよい。すなわち、表示制御部34は、マルチデバイス対応であってもよい。 The display control unit 34 controls the plurality of display units 21 to display images based on the object images. The virtual image display unit 2 controlled by the display control unit 34 may be variable, and may be, for example, a tablet, and may be replaced with VR goggles later. That is, the display control unit 34 may be compatible with multiple devices.
 具体的には、表示制御部34は、オブジェクト映像に基づく映像に視覚効果を付加した映像を表示するよう表示部21を制御する。本実施形態では、表示制御部34は、通信部31によって受信されたオブジェクト映像情報が示す、オブジェクトOBのオブジェクト映像に、決定部33によって決定された視覚効果を付加した映像を表示するよう表示部21を制御する。このとき、決定部33が、距離に基づいて、複数の表示部21の中から、オブジェクト映像を表示させる構成において、表示制御部34は、決定部33によって決定された表示部21がオブジェクト映像に視覚効果を付した映像を表示するよう制御する。図3に示す例では、表示制御部34は、オブジェクトOBであるボールBの像を含むオブジェクト映像IM3に視覚効果を付加した映像を表示するよう、決定部33によって決定された表示部21を制御する。 Specifically, the display control unit 34 controls the display unit 21 to display an image in which a visual effect is added to the image based on the object image. In this embodiment, the display control unit 34 controls the display unit 21 to display an image in which the visual effect determined by the determination unit 33 is added to the object image of the object OB indicated by the object video information received by the communication unit 31. In the configuration in which the determination unit 33 determines, based on the distance, which of the plurality of display units 21 is to display the object image, the display control unit 34 controls the display unit 21 determined by the determination unit 33 to display the object image with the visual effect added. In the example shown in FIG. 3, the display control unit 34 controls the display unit 21 determined by the determination unit 33 to display an image in which a visual effect is added to the object image IM3 including the image of the ball B, which is the object OB.
 また、表示制御部34は、オブジェクト映像情報が示す、所定のオブジェクトOBのオブジェクト映像を表示するよう所定の表示部21を制御してもよい。図3に示す例では、表示制御部34は、所定のオブジェクトOBである第1の選手PL1の像を含むオブジェクト映像IM1を表示するよう表示部211を制御する。また、表示制御部34は、所定のオブジェクトOBである第2の選手PL2の像を含むオブジェクト映像IM2を表示するよう表示部212を制御する。 Also, the display control unit 34 may control the predetermined display unit 21 to display the object video of the predetermined object OB indicated by the object video information. In the example shown in FIG. 3, the display control unit 34 controls the display unit 211 to display the object image IM1 including the image of the first player PL1, which is the predetermined object OB. The display control unit 34 also controls the display unit 212 to display the object image IM2 including the image of the second player PL2, which is the predetermined object OB.
 ここで、図5に示す例における、決定部33及び表示制御部34の処理の一例について詳細に説明する。 Here, an example of the processing of the determination unit 33 and the display control unit 34 in the example shown in FIG. 5 will be described in detail.
 図6Aに示す例では、オブジェクト映像は、第1の選手PL1、第2の選手PL2、及びボールBそれぞれの像を含んでいる。また、実空間において、第1の基準面S1と、第2の基準面S2との間の距離は9である。また、実空間において、オブジェクトの1つであるボールBは、第1の基準面S1と第2の基準面S2との間に位置し、第1の基準面S1とボールBとの間の距離が8.2である。 In the example shown in FIG. 6A, the object image includes images of the first player PL1, the second player PL2, and the ball B. In real space, the distance between the first reference plane S1 and the second reference plane S2 is 9. Also, in real space, the ball B, which is one of the objects, is located between the first reference plane S1 and the second reference plane S2, and the distance between the first reference plane S1 and the ball B is 8.2.
 上述したように、決定部33は、オブジェクトOBを表示させる表示部21を、オブジェクトOBまでの距離が最も短い基準面に対応する虚像面VFに虚像VIを表示させる表示部21と決定する。本例では、第1の基準面S1からボールBまでの距離は8.2であり、第2の基準面S2からボールBまでの距離は0.8である。このため、決定部33は、ボールBの虚像VIbを表示させる表示部21を、ボールBまでの距離が最も短い第2の基準面S2に対応する虚像面VF2(図3参照)に虚像VI2を表示する表示部212と決定する。 As described above, the determination unit 33 determines that the display unit 21 that is to display the object OB is the display unit 21 that displays a virtual image VI on the virtual image plane VF corresponding to the reference plane with the shortest distance to the object OB. In this example, the distance from the first reference plane S1 to the ball B is 8.2, and the distance from the second reference plane S2 to the ball B is 0.8. Therefore, the determination unit 33 determines that the display unit that is to display the virtual image VIb of the ball B is the display unit 212, which displays the virtual image VI2 on the virtual image plane VF2 (see FIG. 3) corresponding to the second reference plane S2, the plane with the shortest distance to the ball B.
 そして、決定部33は、第2の基準面S2からオブジェクトOBであるボールBまでの距離0.8に対応して、視覚効果記憶部32に記憶されている視覚効果である「なし」をオブジェクトOBに付加する視覚効果と決定する。 Then, corresponding to the distance 0.8 from the second reference plane S2 to the ball B, which is the object OB, the determination unit 33 determines that the visual effect to be added to the object OB is "none", which is the visual effect stored in the visual effect storage unit 32 for that distance.
 表示制御部34は、視覚効果を付さずにオブジェクトOBであるボールBの像を含むオブジェクト映像IM3を表示するよう表示部212を制御する。さらに、表示制御部34は、所定のオブジェクトOBである第1の選手PL1の像を含むオブジェクト映像IM1を表示するよう表示部211を制御する。また、表示制御部34は、所定のオブジェクトOBである第2の選手PL2の像を含むオブジェクト映像IM2を表示するよう表示部212を制御する。これにより、観察者は、図6Bに示すように、手前側に第1の選手PL1の虚像VI1を観察し、奥側に第2の選手PL2の虚像VI2とボールBの虚像VIbを観察する。したがって、観察者は、第1の選手PL1の虚像VI1、第2の選手PL2の虚像VI2、及びボールBの虚像VIbを、実空間におけるそれぞれのオブジェクトOBの位置関係に類似した位置関係で観察することができる。 The display control unit 34 controls the display unit 212 to display the object image IM3 including the image of the ball B, which is the object OB, without any visual effect. Furthermore, the display control unit 34 controls the display unit 211 to display the object image IM1 including the image of the first player PL1, which is a predetermined object OB, and controls the display unit 212 to display the object image IM2 including the image of the second player PL2, which is a predetermined object OB. As a result, as shown in FIG. 6B, the observer observes the virtual image VI1 of the first player PL1 on the near side, and the virtual image VI2 of the second player PL2 and the virtual image VIb of the ball B on the far side. Therefore, the observer can observe the virtual image VI1 of the first player PL1, the virtual image VI2 of the second player PL2, and the virtual image VIb of the ball B in a positional relationship similar to that of the respective objects OB in real space.
 図7Aに示す例では、図6Aに示す例と同様に、オブジェクト映像は、第1の選手PL1、第2の選手PL2、及びボールBそれぞれの像を含んでいる。また、実空間において、第1の基準面S1と、第2の基準面S2との間の距離は9である。また、実空間において、オブジェクトの1つであるボールBは、第1の基準面S1と第2の基準面S2との間に位置し、図6Aに示す例と異なり、第1の基準面S1とボールBとの間の距離が7である。 In the example shown in FIG. 7A, as in the example shown in FIG. 6A, the object image includes images of the first player PL1, the second player PL2, and the ball B, and the distance between the first reference plane S1 and the second reference plane S2 in real space is 9. Also, in real space, the ball B, which is one of the objects, is located between the first reference plane S1 and the second reference plane S2; unlike the example shown in FIG. 6A, however, the distance between the first reference plane S1 and the ball B is 7.
 本例では、第1の基準面S1からボールBまでの距離は7であり、第2の基準面S2からボールBまでの距離は2である。このため、決定部33は、ボールBまでの距離が最も短い第2の基準面S2に対応する虚像面VF2に虚像VIを表示する表示部212を、ボールBの像を含むオブジェクト映像を表示させる表示部21と決定する。 In this example, the distance from the first reference plane S1 to the ball B is 7, and the distance from the second reference plane S2 to the ball B is 2. Therefore, the determination unit 33 determines that the display unit that is to display the object image including the image of the ball B is the display unit 212, which displays virtual images VI on the virtual image plane VF2 corresponding to the second reference plane S2, the plane with the shortest distance to the ball B.
 そして、決定部33は、第2の基準面S2からオブジェクトOBであるボールBまでの距離2に対応して、視覚効果記憶部32に記憶されている視覚効果である「拡大/縮小」をオブジェクトOBに付加する視覚効果と決定する。一例として、決定部33は、ボールBと第2の基準面S2との距離が短くなるほど、ボールBの像を含むオブジェクト映像を縮小し、距離が長くなるほど、ボールBの像を含むオブジェクト映像を拡大するような視覚効果を付加するよう決定する。なお、決定部33が、第1の基準面S1に虚像を表示する表示部211を、ボールBの像を含むオブジェクト映像を表示させる表示部21と決定し、ボールBと第1の基準面S1との距離が所定範囲内であった場合、ボールBと第1の基準面S1との距離が短くなるほど、オブジェクト映像を拡大し、距離が長くなるほど、オブジェクト映像を縮小するような視覚効果を付加するよう決定してもよい。 Then, corresponding to the distance 2 from the second reference plane S2 to the ball B, which is the object OB, the determination unit 33 determines that the visual effect to be added to the object OB is "enlargement/reduction", which is the visual effect stored in the visual effect storage unit 32 for that distance. As an example, the determination unit 33 determines that a visual effect is to be added such that the object image including the image of the ball B is reduced as the distance between the ball B and the second reference plane S2 becomes shorter, and enlarged as that distance becomes longer. Note that when the determination unit 33 determines that the display unit that is to display the object image including the image of the ball B is the display unit 211, which displays virtual images on the first reference plane S1, and the distance between the ball B and the first reference plane S1 is within a predetermined range, the determination unit 33 may instead determine that a visual effect is to be added such that the object image is enlarged as the distance between the ball B and the first reference plane S1 becomes shorter, and reduced as that distance becomes longer.
 これに伴い、表示制御部34は、ボールBの映像を拡大又は縮小させる視覚効果を付加した映像を表示するよう表示部212を制御する。拡大又は縮小させる視覚効果とは、例えば、ボールBが第1の選手PL1に近づくほどボールBの映像を拡大させ、ボールBが第1の選手PL1から遠ざかるほどボールBの映像を縮小させるような視覚効果とすることができる。さらに、表示制御部34は、所定のオブジェクトOBである第1の選手PL1の像を含むオブジェクト映像IM1を表示するよう表示部211を制御し、所定のオブジェクトOBである第2の選手PL2の像を含むオブジェクト映像IM2を表示するよう表示部212を制御する。これにより、観察者は、図7Bに示すように、手前側に第1の選手PL1の虚像VI1を観察し、奥側に第2の選手PL2の虚像VI2とボールBの虚像VIbを観察する。したがって、観察者は、第1の選手PL1、第2の選手PL2、及びボールBの虚像を、実空間におけるそれぞれの位置関係に類似した位置関係で観察することができる。さらに、虚像表示部2は、距離の変化に応じてボールBの像を小さくしたり大きくしたりするため、観察者は、実空間におけるボールの移動をより臨場感をもって認識することができる。 Accordingly, the display control unit 34 controls the display unit 212 to display an image to which a visual effect of enlarging or reducing the image of the ball B is added. The enlarging/reducing visual effect can be, for example, one in which the image of the ball B is enlarged as the ball B approaches the first player PL1, and reduced as the ball B moves away from the first player PL1. Furthermore, the display control unit 34 controls the display unit 211 to display the object image IM1 including the image of the first player PL1, which is a predetermined object OB, and controls the display unit 212 to display the object image IM2 including the image of the second player PL2, which is a predetermined object OB. As a result, as shown in FIG. 7B, the observer observes the virtual image VI1 of the first player PL1 on the near side, and the virtual image VI2 of the second player PL2 and the virtual image VIb of the ball B on the far side. Therefore, the observer can observe the virtual images of the first player PL1, the second player PL2, and the ball B in a positional relationship similar to their respective positional relationships in real space. Furthermore, since the virtual image display unit 2 shrinks and enlarges the image of the ball B according to the change in distance, the observer can perceive the movement of the ball in real space with a greater sense of realism.
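The enlarge/reduce effect described above can be sketched as a simple interpolation of a scale factor from the reference-plane distance. This is an illustrative sketch only: the disclosure does not specify the mapping, so the linear form and the `near_scale`/`far_scale` values below are assumptions.

```python
def scale_factor(distance_to_plane, max_distance, near_scale=1.5, far_scale=1.0):
    """Linearly interpolate a drawing scale for the ball image: natural
    size (far_scale) when the ball reaches the virtual image plane on
    which it is displayed, growing toward near_scale as its distance
    from that plane (toward the observer-side plane) increases."""
    t = max(0.0, min(1.0, distance_to_plane / max_distance))  # clamp to [0, 1]
    return far_scale + (near_scale - far_scale) * t

# Fig. 7A numbers (assumed units): distance 2 from the second reference
# plane, with 9 between the planes, gives a mild enlargement.
factor = scale_factor(2.0, 9.0)
```

A nonlinear mapping (for example, one matching perspective foreshortening) could be substituted without changing the control flow.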
 図8Aに示す例では、図6A及び図7Aに示す例と同様に、オブジェクト映像は、第1の選手PL1、第2の選手PL2、及びボールBそれぞれの像を含んでいる。また、実空間において、第1の基準面S1と、第2の基準面S2との間の距離は9である。また、実空間において、オブジェクトの1つであるボールBは、第1の基準面S1と第2の基準面S2との間に位置し、図6A及び図7Aに示す例と異なり、第1の基準面S1とボールBとの間の距離が5である。 In the example shown in FIG. 8A, as in the examples shown in FIGS. 6A and 7A, the object image includes images of the first player PL1, the second player PL2, and the ball B, and the distance between the first reference plane S1 and the second reference plane S2 in real space is 9. Also, in real space, the ball B, which is one of the objects, is located between the first reference plane S1 and the second reference plane S2; unlike the examples shown in FIGS. 6A and 7A, however, the distance between the first reference plane S1 and the ball B is 5.
 In this example, the distance from the first reference plane S1 to the ball B is 5, and the distance from the second reference plane S2 to the ball B is 4. Accordingly, the determination unit 33 determines that the display unit 21 that is to display the object video including the image of the ball B is the display unit 212, which displays its virtual image on the virtual image plane VF2 corresponding to the second reference plane S2, the reference plane with the shortest distance to the ball B, which is the object OB.
 Then, corresponding to the distance 4 from the second reference plane S2 to the ball B, the determination unit 33 determines that the visual effect to be added to the object OB is "flash light", the visual effect stored for that distance in the visual effect storage unit 32.
 Along with this, the display control unit 34 controls the display unit 212 to display the object video including the image of the ball B with an added visual effect that makes the image of the ball B appear as bright as a flash of light (the shaded portion in FIG. 8B). Further, the display control unit 34 controls the display unit 211 to display the object image IM1 including the image of the first player PL1, which is a predetermined object OB, and controls the display unit 212 to display the object image IM2 including the image of the second player PL2, which is a predetermined object OB. As a result, as shown in FIG. 8B, the observer observes the virtual image VI1 of the first player PL1 on the near side and the virtual image VI2 of the second player PL2 and the virtual image VIb of the ball B on the far side. The observer can therefore observe the virtual images of the first player PL1, the second player PL2, and the ball B in a positional relationship similar to their respective positional relationship in real space. Furthermore, by observing the image of the ball B with the flash-light visual effect added, the observer can more strongly recognize that the ball B is approaching.
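The selection performed by the determination unit 33 in the FIG. 8A example — assigning the object video to the display unit whose reference plane is nearest the object — can be sketched as follows. This is a minimal illustration, not the disclosed implementation; the dictionary of plane-to-object distances stands in for information the device derives from the object position.

```python
def choose_display(plane_distances):
    """Return the display unit whose reference plane is nearest the object.

    plane_distances: mapping of display-unit name -> distance from that
    unit's reference plane to the object (e.g. the ball B).
    """
    return min(plane_distances, key=plane_distances.get)

# Illustrative values from FIG. 8A: S1 -> B is 5, S2 -> B is 4,
# so the ball's object video is assigned to display unit 212.
```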
 <Operation of Spatial Image Display Device>
 Here, the operation of the spatial image display device 1 according to the first embodiment will be described with reference to FIG. 9. FIG. 9 is a flowchart showing an example of the operation of the spatial image display device 1 according to the first embodiment. The operation of the spatial image display device 1 described with reference to FIG. 9 corresponds to an example of the spatial image display method of the spatial image display device 1 according to the first embodiment.
 In step S11, the communication unit 31 receives object video information and object position information from an external device via a communication network.
 In step S12, the determination unit 33 determines, based on the position of the object, which is a specific subject whose image is included in the video, the visual effect to be added to the object video, that is, the video of the object included in the object video information.
 In step S13, the display control unit 34 determines, from among the plurality of display units 21, the display unit 21 that is to display the object video including the image of the object OB.
 In step S14, the display control unit 34 controls the display unit 21 to display the video based on the object video and the visual effect. Specifically, the display control unit 34 controls the display unit 21 determined in step S13 to display a video in which the visual effect determined by the determination unit 33 has been added to the object video of the object OB indicated by the object video information. Furthermore, the display control unit 34 may control a predetermined display unit 21 to display the object video of a predetermined object OB indicated by the object video information.
 In step S15, the display unit 21 displays the object video under the control of the display control unit 34. Specifically, the display unit 21 displays a video in which the visual effect determined by the determination unit 33 has been added to the object video of the object OB indicated by the object video information. At this time, the display unit 21 may also display the object video of a predetermined object.
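Steps S11 through S15 can be summarized as the following sketch of one display cycle. The function and variable names are assumptions introduced for illustration, not identifiers from the disclosure; each step is injected as a callable so the sketch stays independent of any concrete device.

```python
def run_display_cycle(receive, decide_effect, decide_display, render):
    """One pass of steps S11-S15, with each processing step injected."""
    video_info, position_info = receive()       # S11: receive via the network
    effect = decide_effect(position_info)       # S12: determine the visual effect
    display = decide_display(position_info)     # S13: determine the display unit
    frame = render(video_info, effect)          # S14: compose the controlled video
    return display, frame                       # S15: the chosen display shows it
```

For example, wiring in trivial stand-ins makes the data flow visible:

```python
result = run_display_cycle(
    receive=lambda: ("ball_video", 4),                      # video info, distance
    decide_effect=lambda d: "flash" if d < 5 else "none",   # distance-based rule
    decide_display=lambda d: "212",
    render=lambda video, effect: (video, effect),
)
```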
 As described above, according to the first embodiment, the spatial image display device 1 determines, based on the position of the object, which is a specific subject whose image is included in the video, the visual effect to be added to the object video including the image of the object OB, and controls the display unit 21 to display a video based on the object video and the visual effect. The spatial image display device 1 can therefore display virtual images of objects on a plurality of virtual image planes VF, and can thus give a sufficient sense of realism to an observer observing a spatial image formed by the virtual images on the plurality of virtual image planes VF.
 Further, according to the first embodiment, the spatial image display device 1 receives object video information indicating the object video and object position information indicating the position of the object OB, and determines, based on the position indicated by the object position information, the visual effect to be added to the object video indicated by the object video information. The spatial image display device 1 can therefore display a virtual image in which a visual effect has been added to the object video included in the video at a location remote from the place where the object OB is imaged.
 Further, according to the first embodiment, the spatial image display device 1 determines, based on the distance, a visual effect that changes the brightness of each object video. The spatial image display device 1 can therefore display the virtual image VI of the moving object OB on a plurality of virtual image planes VF in a manner that lets the observer intuitively recognize the position of the object OB, thereby giving a greater sense of realism to the observer observing the stereoscopic image formed by the virtual images.
 Further, according to the first embodiment, the spatial image display device 1 determines, based on the distance, the display unit 21 that is to display the video based on the object video from among the plurality of display units 21, and controls the display unit 21 determined by the determination unit 33 to display the video in which the visual effect has been added to the object video. The spatial image display device 1 can therefore display the virtual image VI of the moving object OB on a plurality of virtual image planes VF in a manner that lets the observer intuitively recognize the position of the object OB, thereby giving a greater sense of realism to the observer observing the stereoscopic image formed by the virtual images.
 In the first embodiment described above, the spatial image display device 1 includes the virtual image display unit 2, but this is not restrictive. For example, the spatial image display device 1 may omit the virtual image display unit 2 and instead control an external virtual image display device to display the object video. In such a configuration, the spatial image display device 1 does not execute step S15 of the flowchart described above.
 (First Modification)
 In the embodiment described above, the position of the object OB is indicated by the distance from the reference plane to the object OB in real space, but this is not restrictive. For example, the position of the object OB may be indicated by the height from a predetermined horizontal plane (for example, the ground plane) to the object OB in real space.
 In such a configuration, the visual effect storage unit 32 stores heights and visual effects in association with each other, as shown in FIG. 10A. The determination unit 33 determines the visual effect based on the height. Further, the display control unit 34 controls the display unit 21 to display a video in which the visual effect determined by the determination unit 33 has been added to the object video.
 In this example, when the height is 1.5, the determination unit 33 determines that the visual effect is "none" (see FIG. 10A). The display control unit 34 then controls the display unit 21 to display the object video without adding a visual effect. The observer can thereby observe a spatial image formed by the virtual images VI1, VI2, and VIb as shown in FIG. 10B.
 When the height is 2.5, the determination unit 33 determines that the visual effect is "50% transparency" (see FIG. 10A), and the display control unit 34 controls the display unit 21 to display the object video with its transparency changed to 50%. The observer can thereby observe a spatial image formed by the virtual images VI1 and VI2 as shown in FIG. 10C and the virtual image VIb of the object video with the 50% transparency visual effect added.
 When the height is 3 or more, the determination unit 33 determines that the visual effect is "80% transparency" (see FIG. 10A), and the display control unit 34 controls the display unit 21 to display the object video with its transparency changed to 80%.
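The height-to-effect association of FIG. 10A can be written as a small lookup. Only the three sample heights (1.5, 2.5, 3 or more) are given above, so the band boundary of 2.0 between "none" and "50% transparency" is an assumption for illustration; representing an effect as a transparency percentage is likewise an assumed encoding.

```python
def effect_for_height(height):
    """Map the object's height above the reference horizontal plane
    to a transparency effect, following the FIG. 10A example.

    Band boundaries below 3.0 are assumed; the example only fixes
    the values at heights 1.5, 2.5, and >= 3.
    """
    if height >= 3.0:
        return {"transparency": 80}   # 80% transparency
    if height >= 2.0:                 # assumed lower bound of the 50% band
        return {"transparency": 50}   # 50% transparency
    return {"transparency": 0}        # no visual effect
```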
 (Second Modification)
 In the embodiment described above, the determination unit 33 determines the visual effect based on the position of the object OB; the determination unit 33 may instead determine the visual effect based on the rate of change of the position of the image of the object OB in the video. The rate of change of the position can be, for example, the change in the position of the image of the object OB over a unit number of frames constituting the video.
 In such a configuration, the visual effect storage unit 32 stores rates of change of position and visual effects in association with each other, as shown in FIG. 11A. The display control unit 34 controls the display unit 21 to display a video in which the visual effect determined by the determination unit 33 has been added to the object video.
 In this example, when the rate of change of the position is less than a threshold, the determination unit 33 determines that the visual effect is "none" (see FIG. 11A). The display control unit 34 then controls the display unit 21 to display the object video without adding a visual effect. The observer can thereby observe a spatial image formed by the virtual images VI1, VI2, and VIb as shown in FIG. 11B.
 When the rate of change of the position is equal to or greater than the threshold, the determination unit 33 determines that the visual effect is "superimpose multiple frames" (see FIG. 11A). The display control unit 34 then controls the display unit 21 to display a video with a visual effect in which multiple frames of the object video are displayed superimposed. The observer can thereby observe a spatial image formed by the virtual images VI1 and VI2 as shown in FIG. 11C and the virtual image VIb of the video in which multiple frames of the object video are displayed superimposed.
 When the rate of change of the position cannot be determined, the determination unit 33 determines that the visual effect is "50% transparency". The display control unit 34 then controls the display unit 21 to display the object video with its transparency changed to 50%.
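The three cases of the second modification can be sketched as a single rule. The concrete threshold value, and the convention of passing `None` when the rate cannot be determined, are assumptions for illustration; FIG. 11A itself only names the effects.

```python
def effect_for_rate(rate, threshold=10.0):
    """Map the image's per-unit-frame position change rate to an effect.

    rate: change in the image position over the unit number of frames,
          or None when the rate cannot be determined.
    threshold: assumed illustrative value; the disclosure only says
          "less than / at least a threshold".
    """
    if rate is None:
        return "50% transparency"      # rate undeterminable
    if rate >= threshold:
        return "superimpose frames"    # fast movement: overlay multiple frames
    return "none"                      # slow movement: no added effect
```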
 (Third Modification)
 The determination unit 33 may also determine the visual effect to be added to the object video based on whether the position of the object OB is within a predetermined range.
 For example, as shown in FIG. 12A, the visual effect storage unit 32 stores visual effects in association with whether the position of the object OB is within the predetermined range. In the example shown in FIG. 12A, inside and on the lines of a court used in a tennis match is within the predetermined range (IN), and outside the lines is outside the predetermined range (OUT). The display control unit 34 controls the display unit 21 to display a video in which the visual effect determined by the determination unit 33 has been added to the object video.
 In this example, when the position of the object OB in real space is within the predetermined range, the determination unit 33 determines that the visual effect is "none". The display control unit 34 then controls the virtual image display unit 2 to display the object video without adding a visual effect. The observer can thereby observe a spatial image formed by the virtual images VI1, VI2, and VIb as shown in FIG. 12B.
 When the position of the object OB in real space is outside the predetermined range, the determination unit 33 determines that the visual effect is "flash light". The display control unit 34 then controls the virtual image display unit 2 to display the object OB with an added visual effect that makes the image of the object OB appear as bright as a flash of light. The observer can thereby observe a spatial image formed by the virtual images VI1 and VI2 as shown in FIG. 12C and the virtual image VIb of the video with the flash-light visual effect added.
 When it is unclear whether the position of the object OB in real space is within the predetermined range, the determination unit 33 determines that the visual effect is "50% transparency". The display control unit 34 then controls the virtual image display unit 2 to display the object video with its transparency changed to 50%.
 In the third modification, the determination unit 33 determines the visual effect based on whether the position of the object OB in real space is within the predetermined range; the determination unit 33 may instead determine the visual effect based on whether the image of the object OB in the video space is within a predetermined range.
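The in/out decision of the tennis-court example can be sketched as follows. The court half-dimensions (singles court, in metres) and the coordinate convention centred on the court are assumptions for illustration; the disclosure only distinguishes IN (on or inside the lines), OUT, and an undeterminable position.

```python
def effect_for_court_position(pos, half_width=4.115, half_length=11.885):
    """Decide the effect from whether the ball is inside/on the court lines.

    pos: (x, y) in metres measured from the court centre, or None when
         the position cannot be determined. Half-dimensions of a singles
         tennis court are assumed for illustration.
    """
    if pos is None:
        return "50% transparency"          # position undeterminable
    x, y = pos
    on_or_in = abs(x) <= half_width and abs(y) <= half_length  # lines count as IN
    return "none" if on_or_in else "flash light"
```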
 (Fourth Modification)
 In the embodiment described above, the determination unit 33 determines the display unit 21 that is to display the object video based on the position of the object OB, but this is not restrictive. For example, the determination unit 33 may determine that each of the plurality of display units 21 is to display the object video. In such a configuration, the position is indicated by the distance in a predetermined direction from a reference plane, which is a plane in real space corresponding to the virtual image plane on which the image is displayed, to the object, and the determination unit 33 determines, based on the distance, a visual effect that changes the brightness of the object video displayed on each of the plurality of display units 21. Specifically, the determination unit 33 changes the brightness of the object video displayed on each display unit 21, per display unit 21, according to the position of the object OB, so that the observer can observe the spatial image with a stronger stereoscopic effect.
 (Fifth Modification)
 In the embodiment described above, the determination unit 33 may further determine irradiation light with which a member defining the space in which the virtual images are displayed is irradiated. The members defining the space in which the virtual images are displayed can be, for example, a floor, walls, or pillars. The irradiation light is visible light; for example, the determination unit 33 may determine the color, intensity, and the like of the visible light, or may determine a projection video formed by the visible light according to the irradiation position. In such a configuration, the display control unit 34 controls an irradiation device to emit the irradiation light determined by the determination unit 33. The irradiation device can be a lighting device, a projection device, or the like.
 For example, in the example of a video of a tennis match described above, when the ball B is positioned outside the lines on the court, the determination unit 33 determines that the real court is to be irradiated with white light of higher intensity than before the ball B was positioned outside the lines. The display control unit 34 then controls the irradiation device to irradiate the court with the white light. In such an example, even when the observer cannot clearly perceive whether the ball B was positioned inside or on the lines of the court or outside them, for example because the actual ball B moves quickly, the white-light irradiation allows the observer to instantly recognize the region of the court where the ball B was positioned. The observer can therefore follow the state of the match without missing the moment.
 <<Second Embodiment>>
 The overall configuration of the second embodiment will be described with reference to FIG. 13. FIG. 13 is a schematic diagram of a spatial image display device 1-1 according to the second embodiment. In the second embodiment, functional units identical to those of the first embodiment are given the same reference numerals, and their description is omitted.
 As shown in FIG. 13, the spatial image display device 1-1 according to the second embodiment includes a virtual image display unit 2, a visual effect determination unit 3-1, and a video processing unit 4-1.
 <Configuration of Video Processing Unit>
 The video processing unit 4-1 includes an input unit 41 and an object extraction unit 42. The input unit 41 is configured by an input interface that receives input of information. The object extraction unit 42 constitutes a control unit.
 The input unit 41 receives input of video information indicating a video generated by an imaging device.
 The object extraction unit 42 extracts the video of the object OB from the video information whose input has been received by the input unit 41. The object extraction unit 42 may extract the object video by any method.
 <Configuration of Visual Effect Determination Unit>
 As shown in FIG. 13, the visual effect determination unit 3-1 includes a communication unit 31-1, a visual effect storage unit 32, a determination unit 33-1, and a display control unit 34.
 The communication unit 31-1 receives object position information. Specifically, it receives the object position information from an external device via a communication network.
 The determination unit 33-1 determines, based on the position of the object OB, the visual effect to be added to the object video including the image of the object OB. Specifically, the determination unit 33-1 determines the visual effect to be added to the object video extracted by the object extraction unit 42 based on the position of the object OB indicated by the object position information received by the communication unit 31-1. The specific method by which the determination unit 33-1 determines the visual effect is the same as the specific method by which the determination unit 33 determines the visual effect in the first embodiment described above.
 <Operation of Spatial Image Display Device>
 Here, the operation of the spatial image display device 1-1 according to the second embodiment will be described with reference to FIG. 14. FIG. 14 is a flowchart showing an example of the operation of the spatial image display device 1-1 according to the second embodiment. The operation of the spatial image display device 1-1 described with reference to FIG. 14 corresponds to an example of the spatial image display method of the spatial image display device 1-1 according to the second embodiment.
 In step S21, the input unit 41 receives input of video information indicating a video generated by an imaging device.
 In step S22, the object extraction unit 42 extracts, from the video information, object video information indicating the object video including the image of the object OB.
 In step S23, the communication unit 31-1 receives object position information from an external device via the communication network.
 Subsequently, the spatial image display device 1-1 executes the processing from step S24 to step S27. The processing from step S24 to step S27 is the same as the processing from step S12 to step S15 in the first embodiment.
 In the second embodiment, the visual effect determination unit 3-1 and the video processing unit 4-1 may be configured as separate bodies. In such a configuration, the video processing unit 4-1 has a communication unit configured by a communication interface, and this communication unit transmits object video information indicating the object video extracted by the object extraction unit 42 to the communication unit 31-1 of the visual effect determination unit 3-1.
 The first to fifth modifications of the first embodiment described above may also be applied to the second embodiment.
 <<Third Embodiment>>
 The overall configuration of the third embodiment will be described with reference to FIG. 15. FIG. 15 is a schematic diagram of a spatial image display device 1-2 according to the third embodiment. Functional units identical to those of the second embodiment are given the same reference numerals, and their description is omitted.
 As shown in FIG. 15, the spatial image display device 1-2 according to the third embodiment includes a virtual image display unit 2, a visual effect determination unit 3-2, and a video processing unit 4-2.
 <Configuration of Video Processing Unit>
 The video processing unit 4-2 includes an input unit 41, an object extraction unit 42, and an object position estimation unit 43.
 The object position estimation unit 43 estimates the position of the object OB. The object position estimation unit 43 may estimate the position of the object OB by any method. For example, the object position estimation unit 43 may use deep learning to estimate, based on the video information input via the input unit 41, the distance from the reference plane to the object OB in real space, which indicates the position of the object OB.
 <Configuration of Visual Effect Determination Unit>
 As shown in FIG. 15, the visual effect determination unit 3-2 includes a visual effect storage unit 32, a determination unit 33-2, and a display control unit 34.
 The determination unit 33-2 determines, based on the position of the object OB, the visual effect to be added to the object video including the image of the object OB. Specifically, the determination unit 33-2 determines the visual effect to be added to the object video extracted by the object extraction unit 42 based on the position estimated by the object position estimation unit 43. The specific method by which the determination unit 33-2 determines the visual effect is the same as the specific method by which the determination unit 33 determines the visual effect in the first embodiment described above.
 <Operation of Spatial Image Display Device>
 Here, the operation of the spatial image display device 1-2 according to the third embodiment will be described with reference to FIG. 16. FIG. 16 is a sequence diagram showing an example of the operation of the spatial image display device 1-2 according to the third embodiment. The operation of the spatial image display device 1-2 described with reference to FIG. 16 corresponds to an example of the spatial image display method of the spatial image display device 1-2 according to the third embodiment.
 In step S31, the input unit 41 receives input of video information indicating a video generated by an imaging device such as a camera.
 In step S32, the object extraction unit 42 extracts, from the video information, object video information indicating the video of the object OB.
 In step S33, the object position estimation unit 43 estimates the object position indicating the position of the object OB.
 Subsequently, the spatial image display device 1-2 executes the processing from step S34 to step S37. The processing from step S34 to step S37 is the same as the processing from step S12 to step S15 in the first embodiment.
 Note that the first to fifth modifications of the first embodiment described above may be applied to the third embodiment. In a configuration to which the first modification is applied, the object position estimation unit 43 estimates the height from a predetermined horizontal plane to the object OB in real space, which indicates the position of the object OB. In a configuration to which the second modification is applied, the object position estimation unit 43 estimates the rate of change of the position of the image of the object OB in the video. In a configuration to which the third modification is applied, the object position estimation unit 43 estimates whether or not the position of the object OB in real space is within a predetermined range.
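Purely for illustration, the sequence of steps S31 to S37 described above can be sketched as the following processing pipeline. All function names are hypothetical placeholders, and the mapping of the later steps to S34-S37 is an assumption based on the first embodiment:

```python
# Hypothetical sketch of the flow of FIG. 16: receive a video frame,
# extract the object video, estimate the object position, determine the
# visual effect, and hand the result to display control. All names are
# illustrative placeholders, not identifiers from the specification.
def process_frame(frame, extract, estimate_position, determine_effect, display):
    obj_video = extract(frame)               # S32: object extraction (unit 42)
    obj_position = estimate_position(frame)  # S33: object position estimation (unit 43)
    effect = determine_effect(obj_position)  # roughly S34: visual-effect determination
    display(obj_video, effect)               # roughly S35-S37: display control
    return obj_video, obj_position, effect
```

The callables are injected so that the same pipeline shape applies whether the position is a distance, a height, or a rate of change, as in the modifications above.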
 <Program>
 The determination units 33, 33-1, and 33-2 and the display control unit 34 described above can be realized by the computer 100. A program for causing a computer to function as the determination units 33, 33-1, and 33-2 and the display control unit 34 may also be provided. The program may be stored in a storage medium or provided through a network. FIG. 17 is a block diagram showing a schematic configuration of the computer 100 functioning as each of the determination units 33, 33-1, and 33-2 and the display control unit 34. Here, the computer 100 may be a general-purpose computer, a dedicated computer, a workstation, a PC (Personal Computer), an electronic notepad, or the like. The program instructions may be program code, code segments, or the like for executing the required tasks.
 As shown in FIG. 17, the computer 100 includes a processor 110, a ROM (Read Only Memory) 120, a RAM (Random Access Memory) 130, a storage 140, an input unit 150, a display unit 160, and a communication interface (I/F) 170. These components are communicably connected to one another via a bus 180. The processor 110 is specifically a CPU (Central Processing Unit), an MPU (Micro Processing Unit), a GPU (Graphics Processing Unit), a DSP (Digital Signal Processor), an SoC (System on a Chip), or the like, and may be configured by a plurality of processors of the same or different types.
 The processor 110 controls each of these components and executes various kinds of arithmetic processing. That is, the processor 110 reads a program from the ROM 120 or the storage 140 and executes the program using the RAM 130 as a work area. The processor 110 controls each of the above components and performs various kinds of arithmetic processing in accordance with programs stored in the ROM 120 or the storage 140. In the embodiments described above, the program according to the present disclosure is stored in the ROM 120 or the storage 140.
 The program may be stored in a storage medium readable by the computer 100. Using such a storage medium, the program can be installed in the computer 100. Here, the storage medium storing the program may be a non-transitory storage medium. The non-transitory storage medium is not particularly limited, and may be, for example, a CD-ROM, a DVD-ROM, a USB (Universal Serial Bus) memory, or the like. The program may also be downloaded from an external device via a network.
 The ROM 120 stores various programs and various data. The RAM 130 temporarily stores programs or data as a work area. The storage 140 is configured by an HDD (Hard Disk Drive) or an SSD (Solid State Drive) and stores various programs, including an operating system, and various data.
 The input unit 150 includes one or more input interfaces that receive a user's input operations and acquire information based on those operations. For example, the input unit 150 is a pointing device, a keyboard, a mouse, or the like, but is not limited to these.
 The display unit 160 includes one or more output interfaces that output information. For example, the display unit 160 is a display that outputs information as video or a speaker that outputs information as audio, but is not limited to these. When the display unit 160 is a touch-panel display, it also functions as the input unit 150.
 The communication interface (I/F) 170 is an interface for communicating with an external device.
 With regard to the above embodiments, the following appendices are further disclosed.
 (Appendix 1)
 A spatial image display device comprising:
 a plurality of display units that display video;
 a plurality of optical elements arranged to each reflect image light emitted from the video displayed on the plurality of display units, such that the reflected image light reaches an observer's eyes and virtual images of the video are displayed at different distances in the same direction from the observer; and
 a control unit,
 wherein the control unit:
 determines, based on the position of an object, which is a specific subject whose image is included in the video, a visual effect to be added to an object video including the image of the object; and
 controls the display units to display the video based on the object video and the visual effect.
 (Appendix 2)
 The spatial image display device according to Appendix 1, further comprising a communication unit that receives object video information indicating the object video and object position information indicating the position of the object, wherein the control unit determines the visual effect to be added to the object video indicated by the object video information, based on the position indicated by the object position information.
 (Appendix 3)
 The spatial image display device according to Appendix 1 or 2, wherein the control unit estimates the position of the object and determines the visual effect based on the estimated position.
 (Appendix 4)
 The spatial image display device according to any one of Appendices 1 to 3, wherein the position is indicated by a distance from a reference plane, which is a plane in real space corresponding to the virtual image plane on which the virtual image is displayed, to the object, and the control unit determines, based on the distance, a display unit from among the plurality of display units on which the object video is to be displayed, and controls the determined display unit to display a video in which the visual effect is added to the object video.
 (Appendix 5)
 The spatial image display device according to any one of Appendices 1 to 4, wherein the position is indicated by a height from a predetermined horizontal plane to the object in real space, and the control unit determines the visual effect based on the height.
 (Appendix 6)
 The spatial image display device according to any one of Appendices 1 to 5, wherein the control unit determines the visual effect based on the rate of change of the position.
 (Appendix 7)
 The spatial image display device according to any one of Appendices 1 to 6, wherein the control unit determines the visual effect based on whether the position is within a predetermined range.
 (Appendix 8)
 The spatial image display device according to Appendix 3, wherein the position is indicated by a distance in a predetermined direction from a reference plane, which is a plane in real space corresponding to the virtual image plane on which the virtual image is displayed, to the object, and the control unit determines the visual effect such that the luminance of the object video displayed on each of the plurality of display units is changed based on the distance.
 (Appendix 9)
 The spatial image display device according to any one of Appendices 1 to 8, wherein the control unit determines irradiation light to be applied to a member that defines the space in which the virtual image is displayed, and controls an irradiation device to emit the determined irradiation light.
 (Appendix 10)
 The spatial image display device according to any one of Appendices 1 to 9, wherein the control unit is multi-device compatible.
 (Appendix 11)
 A spatial image display method for a spatial image display device comprising a plurality of display units that display video, and a plurality of optical elements arranged to each reflect image light emitted from the video displayed on the plurality of display units, such that the reflected image light reaches an observer's eyes and virtual images of the video are displayed at different distances in the same direction from the observer, the method comprising:
 determining, based on the position of an object, which is a specific subject whose image is included in the video, a visual effect to be added to an object video including the image of the object; and
 controlling the plurality of display units to display the video based on the object video and the visual effect.
 (Appendix 12)
 A non-transitory storage medium storing a program executable by a computer, the program causing the computer to function as the determination unit and the display control unit according to any one of Appendices 1 to 10.
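The luminance-changing effect described for the case of Appendix 8 amounts to weighting the brightness of the object video on each display layer by the object's distance from the reference plane. A minimal sketch, assuming exactly two display layers and an illustrative layer spacing (both assumptions, not taken from the specification):

```python
# Hypothetical sketch of a luminance cross-fade between two display
# layers, in the spirit of Appendix 8: the closer the object is to the
# reference plane, the brighter the near layer. The default layer
# spacing of 1.0 is an illustrative assumption.
def layer_brightness(distance: float, layer_spacing: float = 1.0) -> tuple[float, float]:
    """Return (near_layer, far_layer) brightness weights in [0, 1]."""
    t = min(max(distance / layer_spacing, 0.0), 1.0)  # clamp ratio to [0, 1]
    return (1.0 - t, t)  # nearer object -> brighter near layer
```

Because the two weights always sum to 1, the perceived overall brightness of the object stays roughly constant while its apparent depth shifts between the two virtual-image planes.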
 All documents, patent applications, and technical standards described in this specification are incorporated herein by reference to the same extent as if each individual document, patent application, or technical standard were specifically and individually indicated to be incorporated by reference.
 Although the above embodiments have been described as representative examples, it will be apparent to those skilled in the art that many modifications and substitutions are possible within the spirit and scope of the present disclosure. Therefore, the present invention should not be construed as being limited by the above embodiments, and various modifications and changes are possible without departing from the scope of the claims. For example, a plurality of constituent blocks described in the configuration diagrams of the embodiments may be combined into one, or one constituent block may be divided.
1, 1-1, 1-2    Spatial image display device
2              Virtual image display unit
3, 3-1, 3-2    Visual effect determination unit
4-1, 4-2       Video processing unit
21, 211, 212   Display unit
22, 221, 222   Optical element
31, 31-1       Communication unit
32             Visual effect storage unit
33, 33-1, 33-2 Determination unit
34             Display control unit
41             Input unit
42             Object extraction unit
43             Object position estimation unit
100            Computer
110            Processor
120            ROM
130            RAM
140            Storage
150            Input unit
160            Output unit
170            Communication interface
180            Bus

Claims (12)

  1.  A spatial image display device comprising:
     a plurality of display units that display video;
     a plurality of optical elements arranged to each reflect image light emitted from the video displayed on the plurality of display units, such that the reflected image light reaches an observer's eyes and virtual images of the video are displayed at different distances in the same direction from the observer;
     a determination unit that determines, based on the position of an object, which is a specific subject whose image is included in the video, a visual effect to be added to an object video including the image of the object; and
     a display control unit that controls the display units to display the video based on the object video and the visual effect.
  2.  The spatial image display device according to claim 1, further comprising a communication unit that receives object video information indicating the object video and object position information indicating the position of the object,
     wherein the determination unit determines the visual effect to be added to the object video indicated by the object video information, based on the position indicated by the object position information.
  3.  The spatial image display device according to claim 1 or 2, further comprising an object position estimation unit that estimates the position of the object,
     wherein the determination unit determines the visual effect based on the position estimated by the object position estimation unit.
  4.  The spatial image display device according to any one of claims 1 to 3, wherein the position is indicated by a distance from a reference plane, which is a plane in real space corresponding to the virtual image plane on which the virtual image is displayed, to the object,
     the determination unit determines, based on the distance, a display unit from among the plurality of display units on which a video based on the object video is to be displayed, and
     the display control unit controls the display unit determined by the determination unit to display a video in which the visual effect is added to the object video.
  5.  The spatial image display device according to any one of claims 1 to 4, wherein the position is indicated by a height from a predetermined horizontal plane to the object in real space, and
     the determination unit determines the visual effect based on the height.
  6.  The spatial image display device according to any one of claims 1 to 5, wherein the determination unit determines the visual effect based on the rate of change of the position.
  7.  The spatial image display device according to any one of claims 1 to 6, wherein the determination unit determines the visual effect based on whether the position is within a predetermined range.
  8.  The spatial image display device according to claim 3, wherein the position is indicated by a distance in a predetermined direction from a reference plane, which is a plane in real space corresponding to the virtual image plane on which the virtual image is displayed, to the object, and
     the determination unit determines the visual effect such that the luminance of the object video displayed on each of the plurality of display units is changed based on the distance.
  9.  The spatial image display device according to any one of claims 1 to 8, wherein the determination unit determines irradiation light to be applied to a member that defines the space in which the virtual image is displayed, and
     the display control unit controls an irradiation device to emit the irradiation light determined by the determination unit.
  10.  The spatial image display device according to any one of claims 1 to 9, wherein the display control unit is multi-device compatible.
  11.  A spatial image display method for a spatial image display device comprising a plurality of display units that display video, and a plurality of optical elements arranged to each reflect image light emitted from the video displayed on the plurality of display units, such that the reflected image light reaches an observer's eyes and virtual images of the video are displayed at different distances in the same direction from the observer, the method comprising:
     determining, based on the position of an object, which is a specific subject whose image is included in the video, a visual effect to be added to an object video including the image of the object; and
     controlling the display units to display the video based on the object video and the visual effect.
  12.  A program for causing a computer to function as the determination unit and the display control unit according to any one of claims 1 to 10.
PCT/JP2021/025200 2021-07-02 2021-07-02 Spatial image display device, spatial image display method, and program WO2023276156A1 (en)

Priority Applications (2)

Application Number Priority Date Filing Date Title
JP2023531331A JPWO2023276156A1 (en) 2021-07-02 2021-07-02
PCT/JP2021/025200 WO2023276156A1 (en) 2021-07-02 2021-07-02 Spatial image display device, spatial image display method, and program


Publications (1)

Publication Number Publication Date
WO2023276156A1 (en) 2023-01-05

Family

ID=84691058


Citations (8)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
JP2000261832A (en) * 1999-03-08 2000-09-22 Nippon Telegr & Teleph Corp <Ntt> Three-dimensional image display method and head mount display device
JP2000333211A (en) * 1999-05-18 2000-11-30 Nippon Telegr & Teleph Corp <Ntt> Three-dimensional display method and head mount display device
JP2003058912A (en) * 2001-05-18 2003-02-28 Sony Computer Entertainment Inc Display device and image processing method
JP2004163644A (en) * 2002-11-13 2004-06-10 Nippon Telegr & Teleph Corp <Ntt> Three-dimensional display method
JP2004240090A (en) * 2003-02-05 2004-08-26 Pioneer Electronic Corp Display device and its method
JP2009267557A (en) * 2008-04-23 2009-11-12 Seiko Epson Corp Image display apparatus and image display method
JP2016018560A (en) * 2014-07-08 2016-02-01 Samsung Electronics Co., Ltd. Device and method to display object with visual effect
JP2018073172A (en) * 2016-10-31 2018-05-10 Sony Interactive Entertainment Inc. Information processing device and image generation method


Legal Events

Date Code Title Description
121 EP: the EPO has been informed by WIPO that EP was designated in this application (ref document number: 21948460; country of ref document: EP; kind code of ref document: A1)
WWE WIPO information: entry into national phase (ref document number: 2023531331; country of ref document: JP)
NENP Non-entry into the national phase (ref country code: DE)
122 EP: PCT application non-entry in European phase (ref document number: 21948460; country of ref document: EP; kind code of ref document: A1)