WO2005124450A1 - Exhibition system - Google Patents

Exhibition system

Info

Publication number
WO2005124450A1
Authority
WO
WIPO (PCT)
Prior art keywords
image
screen
viewer
real
display device
Application number
PCT/JP2005/010916
Other languages
French (fr)
Japanese (ja)
Inventor
Ikuro Choh
Original Assignee
Cad Center Corporation
Application filed by Cad Center Corporation
Publication of WO2005124450A1


Classifications

    • G PHYSICS
    • G09 EDUCATION; CRYPTOGRAPHY; DISPLAY; ADVERTISING; SEALS
    • G09F DISPLAYING; ADVERTISING; SIGNS; LABELS OR NAME-PLATES; SEALS
    • G09F 19/00 Advertising or display means not otherwise provided for
    • G09F 19/12 Advertising or display means not otherwise provided for using special optical effects
    • A HUMAN NECESSITIES
    • A47 FURNITURE; DOMESTIC ARTICLES OR APPLIANCES; COFFEE MILLS; SPICE MILLS; SUCTION CLEANERS IN GENERAL
    • A47F SPECIAL FURNITURE, FITTINGS, OR ACCESSORIES FOR SHOPS, STOREHOUSES, BARS, RESTAURANTS OR THE LIKE; PAYING COUNTERS
    • A47F 10/00 Furniture or installations specially adapted to particular types of service systems, not otherwise provided for
    • A47F 10/06 Furniture or installations specially adapted to particular types of service systems, not otherwise provided for for restaurant service systems
    • G PHYSICS
    • G09 EDUCATION; CRYPTOGRAPHY; DISPLAY; ADVERTISING; SEALS
    • G09F DISPLAYING; ADVERTISING; SIGNS; LABELS OR NAME-PLATES; SEALS
    • G09F 19/00 Advertising or display means not otherwise provided for
    • G09F 19/12 Advertising or display means not otherwise provided for using special optical effects
    • G09F 19/18 Advertising or display means not otherwise provided for using special optical effects involving the use of optical projection means, e.g. projection of images on clouds
    • H ELECTRICITY
    • H04 ELECTRIC COMMUNICATION TECHNIQUE
    • H04N PICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N 5/00 Details of television systems
    • H04N 5/74 Projection arrangements for image reproduction, e.g. using eidophor

Definitions

  • the present invention relates to a display device.
  • Patent Document 1 discloses an exhibition device. This exhibition device performs an exhibition that combines a projected image with exhibits. Patent Document 1 discloses an exhibition example in which an image of a dinosaur is combined with a miniature of a forest.
  • Patent Document 2 discloses a virtual operation system. This virtual operation system can provide a virtual experience as if, from a personal computer at home, for example, one were touching the real thing or moving the real thing through the Internet. Patent Document 2 discloses a specific example of remotely controlling a toy robot or the like via the Internet.
  • Patent Document 3 discloses a pointing device.
  • The pointing device includes an input flat plate arranged in front of the display device; an optical sensor unit that projects and receives scanning light within a plane a predetermined distance from the input flat plate; a retroreflective member that retroreflects the light rays projected from the light projection unit of the optical sensor unit; imaging means that uses the retroreflected light received by the optical sensor unit to image a pointer on the input flat plate and converts it into an electric signal; and image processing means that analyzes the obtained imaging signal to calculate the coordinate position of the pointer.
  • Patent Document 4 discloses a virtual space movement control device that moves a virtual space image displayed on a display screen of a display unit.
  • The virtual space movement control device includes detection means that detects touch operations or drag operations of a plurality of pointing parts performed on the display screen; viewpoint position information generating means that generates viewpoint position information of the virtual space image based on the detection result of the detection means; and stereoscopic image generating means that generates virtual space image data viewed from the viewpoint indicated by the viewpoint position information and outputs the data to the display means.
  • Patent Document 1: Japanese Patent Application Laid-Open No. 5-35192 (Abstract, etc.)
  • Patent Document 2: Japanese Patent Application Laid-Open No. 2002-170055 (Abstract, etc.)
  • Patent Document 3: Japanese Patent Application Laid-Open No. 2004-5271 (Claims, etc.)
  • Patent Document 4: Japanese Patent Application Laid-Open No. 2004-5272 (Claims, etc.)
  • The display device includes a screen that is provided between the real object to be exhibited and the viewing position and that is transparent enough for a viewer at the viewing position to see the real object through it, and a projector that projects onto the screen an image of the real object in substantially the same posture as the real object appears when viewed through the screen from the viewer's position, so that the image overlaps the real object.
  • This display device allows the viewer to perceive that the image of the real object formed on the screen corresponds one-to-one with the real object, and can therefore give the viewer the illusion that the real image formed on the screen is the real object itself.
  • This display device can thus give the real image a sense of reality derived from the real object that actually exists.
  • The exhibition apparatus further includes viewing position detecting means that detects the viewing position or viewpoint position of the viewer, and image changing means that changes the image of the real object formed on the screen so that, viewed from the position detected by the viewing position detecting means, the image of the real object overlaps the real object.
  • In this way, the real image appears to overlap the real object in a manner that takes the viewer's position into account, and it is not necessary to fix in advance the position from which the real image appears to overlap the real object.
  • In the display device, in addition to the configuration of each of the above-described inventions, the viewing position detecting means includes a distance sensor that detects the distance from the screen to the viewer, and an imaging device that captures an image of a predetermined range on the viewer side of the screen.
  • The display device may be configured such that, when a plurality of viewers are included in the captured image, the viewer viewpoint specifying means specifies the viewpoint position of the viewer located at the center of the captured image or of the viewer closest to the center of the screen.
  • The exhibition device may further be configured such that the viewing position detecting means includes contact position detecting means that detects the position at which the viewer touches the screen, and viewer viewpoint specifying means that specifies the position of the viewer based on the contact position detected by the contact position detecting means and on the positional relationship between the screen and the viewer, assuming that the viewer reached out and touched the screen.
  • the position where the viewer is present can be specified by simple processing.
  • The screen is a transmissive screen having a transmittance of 50% or more and 80% or less, and a light projecting member that illuminates the real object is arranged on the real-object side of the screen.
  • The device further includes light amount control means that, when the real image is formed on the screen, executes at least one of a first control that reduces the amount of light output by the light projecting member and a second control that increases the amount of light of the real image formed on the screen.
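  • A minimal sketch of such light amount control is given below. The spot lamp and projector interfaces, the brightness scale, and the dim/boost levels are illustrative assumptions, not part of the patent, which only specifies the two controls themselves.

```python
# Hypothetical sketch of the light amount control means described above.
# The set_brightness() interface and the 0.0-1.0 brightness scale are
# assumptions made for illustration only.

class LightAmountController:
    def __init__(self, spot_lamp, projector, dim_level=0.4, boost_level=1.0):
        self.spot_lamp = spot_lamp    # light projecting member illuminating the real object
        self.projector = projector    # projector forming the real image on the screen
        self.dim_level = dim_level
        self.boost_level = boost_level

    def on_real_image_shown(self):
        """Executed when a real image is formed on the screen."""
        # First control: reduce the light output of the light projecting member.
        self.spot_lamp.set_brightness(self.dim_level)
        # Second control: increase the light amount of the projected real image.
        self.projector.set_brightness(self.boost_level)

    def on_real_image_hidden(self):
        """Restore normal illumination when no real image is projected."""
        self.spot_lamp.set_brightness(1.0)
        self.projector.set_brightness(0.0)
```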
  • In addition to the configuration of each of the above-described inventions, the display device further includes operation detecting means that detects an operation of the viewer on the screen, and image control means that rotates, enlarges, or reduces the real image formed on the screen based on the operation detected by the operation detecting means and the display state of the image at that time.
  • Thus the viewer can operate the screen on which the real image is formed so as to overlap the real object, thereby rotating, enlarging, or reducing the real image formed on the screen.
  • The display device may further include operation detecting means that detects an operation of the viewer on the screen, and image forming means that forms an image, video, or character information related to the real object on the screen based on the operation detected by the operation detecting means and the display state of the image at that time.
  • FIG. 1 is a view showing a display device according to Embodiment 1 of the present invention.
  • FIG. 2 is a cross-sectional view of the display device of FIG. 1.
  • FIG. 3 is a cross-sectional view of the display device of FIG. 1.
  • FIG. 4 is an explanatory view showing a layout and a structure of a transmission screen, a retroreflective tape, and two infrared light emitting and receiving units of the exhibition apparatus of FIG. 1.
  • FIG. 5 is a block diagram showing a hardware configuration of a control system of the exhibition device of FIG. 1.
  • FIG. 6 is a block diagram showing a control system of the exhibition device of FIG. 1.
  • FIG. 7 is a diagram showing data stored in the hard disk device of FIG. 5.
  • FIG. 8 is a diagram showing an example of an image based on the three-dimensional CG data of FIG. 7.
  • FIG. 9 is a diagram showing an example of another image based on the three-dimensional CG data of FIG. 7.
  • FIG. 10 is a diagram showing still another example of an image based on the three-dimensional CG data of FIG. 7.
  • FIG. 11 is a diagram showing a projection state of the initial image of FIG. 8 onto a transmission screen.
  • FIG. 12 is a view showing a display device according to Embodiment 2 of the present invention.
  • FIG. 13 is a block diagram showing a hardware configuration of a control system of the exhibition device in FIG. 12.
  • FIG. 14 is a block diagram showing a control system of the display device in FIG. 12.
  • FIG. 15 is a diagram showing data stored in the hard disk device of FIG. 13.
  • FIG. 16 is a diagram showing a geometrical positional relationship between a real object, a viewer, and a transmission screen.
  • FIG. 17 is a diagram showing an example of an image based on three-dimensional CG data.
  • FIG. 18 is a diagram showing a state of projection of the image of FIG. 17 onto a transmission screen.
  • FIG. 19 is a diagram showing a situation where three viewers are standing in front of a transmissive screen.
  • FIG. 20 is a diagram showing a state in which a viewer is touching a transmission screen.
  • FIG. 21 is a block diagram showing a hardware configuration of a control system of the display device according to Embodiment 3 of the present invention.
  • FIG. 22 is a block diagram showing a control system of the display device in FIG. 21.
  • FIG. 23 is a diagram showing data stored in the hard disk device of FIG. 21.
  • FIG. 24 is a cross-sectional view showing a modification of the display device having a spot lamp that outputs incandescent light and a spot lamp that outputs monochromatic light.
  • FIG. 25 is a cross-sectional view of a display device having a plurality of spot lamps and a real object.
  • the display device according to the first embodiment of the present invention is a display device that can be suitably used when exhibiting arts and crafts that cannot be directly touched.
  • The display device according to the first embodiment of the present invention presents real objects such as arts and crafts together with an image based on computer graphics that the viewer can freely operate.
  • FIG. 1 is a diagram showing a display device according to Embodiment 1 of the present invention.
  • Fig. 1 (A) is a side view of the exhibition device.
  • FIG. 1B is a front view of the display device.
  • 2 and 3 are cross-sectional views of the display device of FIG.
  • Fig. 2 (A) is a cross-sectional view of the display device of Fig. 1 (A) taken along line A-A.
  • FIG. 2 (B) is a B-B cross-sectional view of the display device of FIG. 1 (A).
  • FIG. 3 is a C-C cross-sectional view of the display device shown in FIGS. 1 (A) and 1 (B).
  • The exhibition apparatus includes an entry restriction frame 1, a mounting table 2, a cover member 3, and a spot lamp 4 as light emitting means.
  • the entry restriction frame 1 has four column members 5.
  • the four long pillar members 5 are arranged so as to be erected at the four corners of the square.
  • the four column members 5 are connected to each other by the beam members 6, and are assembled into a cubic frame structure.
  • The entry restriction frame 1 is installed on the floor 7.
  • The four column members 5 of the entry restriction frame 1 may be fixed to the floor 7. The entry restriction frame 1 is about 2.4 m tall, higher than an adult.
  • The entry restriction frame 1 has four horizontal bar members 8 connecting the four erected column members 5.
  • The four horizontal bar members 8 bridge the adjacent column members 5 at or above the height of an adult's knee.
  • A person who wants to enter the entry restriction frame 1 needs to step over a horizontal bar member 8.
  • A person attempting to enter the entry restriction frame 1 must therefore lift his or her feet significantly. As a result, a viewer or a thief cannot easily enter the entry restriction frame 1.
  • The entry restriction frame 1 is also reinforced by the four horizontal bar members 8.
  • a transparent reinforcing glass may be provided between the four pillar members 5 that are erected.
  • the mounting table 2 is arranged at the center inside the entrance restriction frame 1.
  • The mounting table 2 is at least as high as an adult's waist.
  • the mounting table 2 has a substantially quadrangular prism shape.
  • the cover member 3 covers the real object 9 mounted on the upper surface of the mounting table 2.
  • The cover member 3 is formed of a transparent material, and covers the real object 9 mounted on the upper surface of the mounting table 2. Examples of the material of the cover member 3 include glass and a transparent acrylic plate.
  • the spot lamp 4 is disposed on the ceiling of the entry restriction frame 1 so that the spot light illuminates the real object 9 mounted on the mounting table 2.
  • the spot lamp 4 is a kind of lighting equipment and outputs a spot light having a brightness according to the supplied electric power.
  • the spot light of the spot lamp 4 is incandescent. The greater the power supplied, the brighter the spotlight. The smaller the power supplied, the darker the spotlight. When no power is supplied, the spot lamp 4 is turned off. In the first embodiment, the spot lamp 4 is always supplied with power.
  • the display device includes a translucent screen 11 as a screen, an opaque plate 12, and a projector 13.
  • the transmission screen 11 has a rectangular shape.
  • The transmissive screen 11 is disposed between a pair of column members 5 of the entry restriction frame 1 so that its longitudinal direction is horizontal and it extends from about the height of an adult's waist to about the height of the head.
  • the transmission screen 11 has a transmittance of about 65%.
  • the opaque plate 12 has a rectangular shape.
  • the opaque plate 12 is disposed between the pair of column members 5 of the entrance restriction frame 1 above the transmissive screen 11.
  • the opaque plate 12 has a height up to the ceiling of the restricted access frame 1.
  • the opaque plate 12 is disposed so that its surface is substantially parallel to the surface of the transmission screen 11.
  • The opaque plate 12 is disposed so as to be shifted slightly toward the outside of the entry restriction frame 1 relative to the transmissive screen 11. As shown by interval B in FIG. 3, a gap of several centimeters is thus formed between the front surface of the transmissive screen 11 (the surface facing the outside of the entry restriction frame 1) and the back surface of the opaque plate 12 (the surface facing the inside of the entry restriction frame 1).
  • When a video signal or still image data is input, the projector 13 outputs an image based on that video signal or still image data from its output unit.
  • the projector 13 is disposed on the ceiling of the entrance restriction frame 1 in such a manner that the center of the output image is the center of the transmissive screen 11.
  • The image output from the projector 13 is projected on the transmissive screen 11. From the front side of the transmissive screen 11 (outside the entry restriction frame 1), the image projected on the transmissive screen 11 can be seen.
  • the projector 13 is disposed so as to be positioned obliquely upward at 45 degrees from the center of the transmissive screen 11.
  • The opaque plate 12 is located on the line of sight from a viewer 14 standing in front of the transmissive screen 11 to the output unit of the projector 13.
  • Therefore, the output unit of the projector 13 is not visible to the viewer 14 standing in front of the transmissive screen 11, and the viewer 14 is not dazzled by direct light from the output unit of the projector 13.
  • The output of the projector 13 is corrected by a trapezoidal distortion (keystone) correction function so that the image is projected onto the transmissive screen 11 with a rectangular outline.
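  • A hedged sketch of such a pre-correction is shown below, implemented here as a perspective warp with OpenCV. The corner coordinates and the idea of measuring where the uncorrected image lands on the screen are illustrative assumptions; the patent only states that the projector's built-in correction is used.

```python
# Hedged sketch of trapezoidal (keystone) pre-correction: warp the source
# frame so that, after oblique projection, it appears rectangular on the
# screen. "projected_corners" are where the frame's corners would land on
# the screen without correction; measuring them is an assumption.
import cv2
import numpy as np

def keystone_prewarp(frame, projected_corners, out_size):
    h, w = frame.shape[:2]
    frame_corners = np.float32([[0, 0], [w, 0], [w, h], [0, h]])
    trapezoid = np.float32(projected_corners)
    # Homography from the measured trapezoid back to the frame rectangle;
    # warping the frame with it pre-distorts the content so that the
    # oblique projection appears rectangular.
    M = cv2.getPerspectiveTransform(trapezoid, frame_corners)
    return cv2.warpPerspective(frame, M, out_size)
```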
  • the display device includes a retroreflective tape 21 and two infrared light emitting and receiving units 22 and 23.
  • the retroreflective tape 21 and the two infrared light emitting and receiving units 22, 23 are operation detecting means and contact position detecting means.
  • FIG. 4 is an explanatory diagram showing the layout and structure of the transmissive screen 11, the retroreflective tape 21, and the two infrared light emitting and receiving units 22, 23 of the display device of FIG.
  • the retroreflective tape 21 is a tape that reflects irradiated light in the irradiation direction.
  • The retroreflective tape 21 has, for example, a structure in which a plurality of bead bodies 24, each consisting of a transparent sphere of glass or the like with a reflective layer adhered to half of its surface, are arranged so that the reflective layer faces the side where the tape adheres.
  • The retroreflective tape 21 is attached along the left and right sides and the lower side of the transmissive screen 11, on a surface just outside the transmissive screen 11.
  • The light irradiated onto the retroreflective tape 21 enters a bead body 24 from the front side of the tape, is reflected by the reflective layer, and exits from the front side of the tape back in the irradiation direction.
  • The plurality of bead bodies 24 may be encapsulated in the tape.
  • One of the two infrared light emitting and receiving units 22 and 23 includes an infrared LED (Light Emitting Diode) 25, a polygon mirror 26, and an infrared CCD (Charge-Coupled Device) 27.
  • the infrared LED 25 outputs infrared light. Infrared rays are a type of light.
  • The polygon mirror 26 is a polygonal mirror having, for example, six mirror surfaces.
  • the infrared CCD 27 has a plurality of infrared light receiving elements.
  • the infrared light receiving element outputs a light receiving level signal corresponding to the amount of received infrared light.
  • the infrared CCD 27 outputs an infrared image based on the light receiving level signals of the plurality of infrared light receiving elements.
  • One infrared projection / reception unit 22 is provided near one end of the translucent screen 11 in the longitudinal direction.
  • One infrared light emitting / receiving unit 22 is provided in a gap between the transmission screen 11 and the opaque plate 12.
  • the plurality of infrared light receiving elements are arranged along the surface of the transmission screen 11.
  • In one infrared light emitting/receiving unit 22, the infrared LED 25 outputs infrared light.
  • The polygon mirror 26 reflects the infrared light output from the infrared LED 25 with one of its mirror surfaces.
  • the infrared light reflected by the polygon mirror 26 travels along the surface of the transmission screen 11 and enters the retroreflective tape 21 provided along the outer periphery of the transmission screen 11.
  • the retroreflective tape 21 reflects the incident infrared light in the incident direction.
  • the infrared light reflected by the retroreflective tape 21 returns to one of the infrared light emitting / receiving units 22 along substantially the same path as the case of incidence.
  • the infrared light returned to one infrared light emitting / receiving unit 22 is received by a certain infrared light receiving element among the plurality of infrared light receiving elements of the infrared CCD 27.
  • the polygon mirror 26 rotates.
  • the direction of infrared light output from one infrared light emitting and receiving unit 22 changes.
  • the infrared rays output in the different directions proceed along the surface of the transmissive screen 11 and are incident on the retroreflective tape 21 at a site different from the one before.
  • the retro-reflective tape 21 reflects the irradiated infrared rays in the direction of incidence.
  • the infrared light reflected by the retroreflective tape 21 returns to one of the infrared light emitting and receiving units 22 along substantially the same path as the case of incidence.
  • the infrared light that has returned to the one infrared light emitting / receiving unit 22 is received by a certain infrared light receiving element different from the previous one among the plurality of infrared light receiving elements of the infrared CCD 27.
  • the path of the infrared light output from one infrared light emitting / receiving unit 22 gradually moves along the surface of the transmission screen 11 as the polygon mirror 26 rotates.
  • the infrared light receiving element that receives infrared light at the infrared CCD 27 changes.
  • the plurality of infrared light receiving elements of the infrared CCD 27 receive infrared light once.
  • the plurality of infrared light receiving elements of the infrared CCD 27 receive infrared light again.
  • One infrared light emitting/receiving unit 22 scans a predetermined range of the surface of the transmissive screen 11 (range C in FIG. 4) with infrared light every time the polygon mirror 26 rotates by an angle obtained by dividing 360 degrees by the number of mirror surfaces. One infrared light emitting/receiving unit 22 repeats this scanning. If there is no obstacle within the predetermined range on the surface of the transmissive screen 11 during the period required for one scan, each of the plurality of infrared light receiving elements receives infrared light once.
  • the infrared CCD 27 of the infrared emitting / receiving unit 22 outputs an infrared image every time the polygon mirror 26 rotates by an angle obtained by dividing 360 degrees by the number of surfaces.
  • When an obstacle such as a finger is present, the infrared light receiving element that receives the infrared light passing through the position of the obstacle stops receiving infrared light.
  • The infrared CCD 27 then outputs an infrared image in which a shadow is formed at the portion that did not receive infrared light.
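  • A minimal sketch of locating such shadows is shown below, modelling one unit's output as a one-dimensional array of received-light levels, one value per scan direction. The threshold and data layout are assumptions; the patent only states that blocked directions appear as a shadow in the infrared image.

```python
# Hedged sketch: locate shadows in one infrared light emitting/receiving
# unit's output, modelled as a 1-D array of received-light levels, one value
# per scan direction. The 0.5 threshold is an illustrative assumption.
import numpy as np

def find_shadow_directions(levels, angles, threshold=0.5):
    """levels: received-light level per scan direction (0..1).
    angles: corresponding scan angle (radians) for each element.
    Returns the centre angle of each contiguous shadowed run."""
    shadowed = np.asarray(levels) < threshold
    angles = np.asarray(angles, dtype=float)
    centres = []
    start = None
    for i, s in enumerate(shadowed):
        if s and start is None:
            start = i
        elif not s and start is not None:
            centres.append(float(np.mean(angles[start:i])))  # centre of the shadow
            start = None
    if start is not None:
        centres.append(float(np.mean(angles[start:])))
    return centres
```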
  • the other infrared projection / reception unit 23 is provided near the other end in the longitudinal direction of the transmission screen 11.
  • the other infrared light emitting / receiving unit 23 is disposed in a gap between the transmission screen 11 and the opaque plate 12.
  • the configuration and operation of the other infrared projection / reception unit 23 are the same as the configuration and operation of the infrared projection / reception unit 22, and are denoted by the same reference numerals and description thereof will be omitted.
  • the infrared CCD 27 outputs an infrared image every time the polygon mirror 26 rotates by an angle obtained by dividing 360 degrees by the number of surfaces.
  • One infrared light emitting/receiving unit 22 and the other infrared light emitting/receiving unit 23 are disposed at the two ends of the upper side of the transmissive screen 11. Because the two infrared light emitting/receiving units 22, 23 are disposed at different positions, the predetermined range of the screen surface scanned by the other infrared light emitting/receiving unit 23 and the predetermined range scanned by one infrared light emitting/receiving unit 22 (the ranges C and D indicated by dashed lines in FIG. 4) overlap in part but do not coincide with each other.
  • A point on the transmissive screen 11 (for example, point E in FIG. 4) is therefore seen from one infrared light emitting/receiving unit 22 in a direction different from the direction in which it is seen from the other infrared light emitting/receiving unit 23.
  • The display device includes a control device 31. Although not shown in FIGS. 1 to 3, the control device 31 is arranged at a position that is not visible from the viewer.
  • FIG. 5 is a block diagram showing a hardware configuration of a control system of the exhibition apparatus of FIG.
  • FIG. 6 is a block diagram showing a control system of the exhibition device of FIG.
  • the control device 31 can be realized by a personal computer or the like.
  • The control device 31 has an input/output port 32, a CPU (Central Processing Unit) 33, a memory 34, a hard disk device 35, and a system bus 36 connecting these.
  • the input / output port 32 is connected to the projector 13, one infrared light emitting / receiving unit 22, and the other infrared light emitting / receiving unit 23.
  • the one infrared light emitting and receiving unit 22 and the other infrared light emitting and receiving unit 23 output an infrared image of the infrared CCD 27 to the input / output port 32 every time scanning is performed.
  • FIG. 7 is a diagram showing data stored in the hard disk device 35 of FIG.
  • the hard disk device 35 stores, for example, programs such as a projection image generating program 41, a moving image reproducing program 42, and an operation detecting program 43, and display data such as three-dimensional computer graphics data 44 and moving image data 45.
  • The three-dimensional computer graphics data 44 is data for generating still image data of an image that includes an image of the real object 9 as viewed from an arbitrary direction.
  • The three-dimensional computer graphics data 44 includes, for example, modeling data of the real object 9 and data of an image pasted onto the surface of the modeling data.
  • As the data of the image pasted onto the surface of the modeling data, for example, image data obtained from a photographed image can be used.
  • FIG. 8 is a diagram showing one image 51 based on still image data generated based on the three-dimensional computer graphics data 44 of FIG.
  • In the image 51, the same real image as when the real object 9 (silver cup) is viewed from the front side of the transmissive screen 11 is formed.
  • FIG. 9 is a diagram showing another image 52 based on still image data generated based on the three-dimensional computer graphics data 44 of FIG.
  • In the image 52, a real image larger than the real image in the image 51 of FIG. 8 is formed.
  • FIG. 10 is a diagram showing still another image 53 based on still image data generated based on the three-dimensional computer graphics data 44 of FIG.
  • In the image 53, the same real image as when the real object 9 (silver cup) is viewed from above is formed.
  • the moving image data 45 is data for generating a video signal.
  • The moving image data 45 is, for example, moving image data that introduces the manufacturing process of the real object 9.
  • the projection image generation program 41 is read into the memory 34 and executed by the CPU 33.
  • the projection image generation unit 61 shown in FIG. 6 as the image changing unit and the image control unit is generated.
  • the projection image generation unit 61 generates still image data from the three-dimensional computer graphics data 44, and outputs the generated still image data to the projector 13.
  • the moving image reproducing program 42 is read into the memory 34 and executed by the CPU 33. As a result, the moving image reproducing unit 62 shown in FIG. 6 as an image forming unit is generated. The moving image reproducing unit 62 generates a video signal based on the moving image data 45, and outputs the generated video signal to the projector 13.
  • the operation detection program 43 is read into the memory 34 and executed by the CPU 33.
  • the operation detecting unit 63 shown in FIG. 6 is generated as the operation detecting unit and the viewer viewpoint specifying unit.
  • The operation detection unit 63 determines an operation on the transmissive screen 11 based on the infrared images input from the two infrared light emitting/receiving units 22, 23, and outputs an instruction to the projection image generation unit 61 or the moving image reproduction unit 62 according to the determined operation.
  • The real object 9 (here, a silver cup) is placed on the mounting table 2, and the spot lamp 4 illuminates it.
  • a viewer or the like can check the appearance and color of the real object 9 through the translucent screen 11.
  • The projection image generation unit 61 reads the three-dimensional computer graphics data 44 and, based on the read three-dimensional computer graphics data 44, generates two-dimensional still image data in which an image of the real object 9 is projected onto a predetermined plane. Here, the projection image generation unit 61 generates still image data of the image 51 shown in FIG. 8.
  • the projection image generator 61 outputs the generated still image data to the projector 13 via the input / output port 32.
  • The projector 13 outputs the image 51 shown in FIG. 8.
  • An image projected by the projector 13 is formed on the transmission screen 11.
  • FIG. 11 is a diagram showing a state where the initial image 51 shown in FIG. 8 is projected on the transmissive screen 11.
  • Within the image projected on the transmissive screen 11, the real image is formed substantially at the center of the transmissive screen 11.
  • The real image formed on the transmissive screen 11 is positioned so that, for a viewer 14 of standard height (for example, 175 cm) standing at the position of point A shown in FIG. 2 (the position from which the viewer 14 views), the real image overlaps the real object 9 seen through the transmissive screen 11.
  • If the height or standing position of the viewer 14 differs somewhat from these assumptions, the real object 9 and the real image deviate slightly, but the real object 9 and the real image rarely appear to be separated.
  • The viewer 14 can see both the real object 9 and the real image. As a result, the viewer 14 can be given the illusion that the real object 9 itself is at hand.
  • When the viewer 14 or the like stands at the position of point A shown in FIG. 2 and reaches out to the transmissive screen 11, the two infrared light emitting/receiving units 22, 23 output two infrared images in which the shadow of the finger of the hand is formed. This set of infrared images is input to the operation detection unit 63 via the input/output port 32. The two infrared light emitting/receiving units 22, 23 output one such set of infrared images for each scan. Hereinafter, the two infrared images output from the two infrared light emitting/receiving units 22, 23 for each scan are referred to as a set of infrared images.
  • the operation detection unit 63 determines an operation on the transmission screen 11 based on each set of infrared images, and outputs an instruction to the projection image generation unit 61 or the moving image reproduction unit 62 according to the determined operation.
  • the operation detecting unit 63 first determines the position and the number of fingers based on each set of infrared images. As described above, when infrared light is blocked by a finger, a shadow is formed in the infrared image. The operation detection unit 63 determines the position and number of shadows in each infrared image.
  • When one shadow appears in each of the two infrared images, the operation detection unit 63 determines that the shadows are cast by one finger. The operation detection unit 63 then specifies, from the position of the shadow in each infrared image, the direction of the finger as seen from each infrared light emitting/receiving unit (if the shadow has a width, the direction of the center of the shadow). The operation detection unit 63 uses the specified direction of the finger from each infrared light emitting/receiving unit and the distance between one infrared light emitting/receiving unit 22 and the other infrared light emitting/receiving unit 23 to specify the position of the one finger on the transmissive screen 11 based on the principle of triangulation. Accordingly, when a finger is present at point E in FIG. 4, for example, the operation detection unit 63 can determine that there is one finger at that position.
  • When two shadows appear in each of the two infrared images, the operation detection unit 63 determines that the shadows are cast by two fingers. The operation detection unit 63 assumes that the right-hand shadow in each of the two infrared images is the shadow of one finger and the left-hand shadow in each of the two infrared images is the shadow of the other finger, and specifies the direction of each finger as seen from each infrared light emitting/receiving unit. The operation detection unit 63 then specifies the positions of the two fingers on the transmissive screen 11 from the specified directions of each finger and the distance between one infrared light emitting/receiving unit 22 and the other infrared light emitting/receiving unit 23, based on the principle of triangulation.
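  • A minimal sketch of this triangulation step is shown below. The coordinate frame (origin at one unit, x-axis along the screen's upper edge, y-axis down the screen surface) and the angle convention are assumptions made for illustration.

```python
# Hedged sketch of triangulating a finger position from the two shadow
# directions and the known distance between the two infrared units.
import math

def triangulate_finger(alpha, beta, baseline):
    """alpha: shadow direction seen from unit 22, measured from the screen's
    upper edge; beta: same for unit 23; baseline: distance between the two
    units. Returns the finger position (x, y) on the screen surface."""
    s = math.sin(alpha + beta)
    if abs(s) < 1e-9:
        raise ValueError("degenerate geometry: directions are parallel")
    r = baseline * math.sin(beta) / s   # distance from unit 22 to the finger
    return r * math.cos(alpha), r * math.sin(alpha)

# Example: finger seen at 60 degrees from unit 22 and 45 degrees from unit 23,
# with the two units 1.2 m apart along the top edge of the screen.
x, y = triangulate_finger(math.radians(60), math.radians(45), 1.2)
```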
  • Based on the specified number and positions of the fingers, the operation detection unit 63 determines the content of the operation instruction for the transmissive screen 11. Alternatively, the operation detection unit 63 determines the content of the operation instruction on the transmissive screen 11 based on the change of the finger positions relative to the finger positions in the previous set of infrared images. After determining the content of the operation instruction, the operation detection unit 63 outputs an instruction to the projection image generation unit 61 or the moving image reproduction unit 62 according to the determined operation instruction.
  • When the positions of two fingers are specified both in a certain set of infrared images and in the preceding set of infrared images, and the change in the finger positions corresponds to an enlargement gesture, the operation detection unit 63 determines that an operation instructing enlargement of the image projected on the transmissive screen 11 has been performed. If it determines that the operation is an operation instructing enlargement of the image, the operation detection unit 63 outputs an instruction to enlarge the image to the projection image generation unit 61.
  • When the positions of two fingers are specified both in a certain set of infrared images and in the preceding set of infrared images, and the change in the finger positions corresponds to a reduction gesture, the operation detection unit 63 determines that an operation instructing reduction of the image projected on the transmissive screen 11 has been performed. If it determines that the operation is an operation instructing reduction of the image, the operation detection unit 63 outputs an instruction to reduce the image to the projection image generation unit 61.
  • When the positions of two fingers are specified both in a certain set of infrared images and in the preceding set of infrared images, and the change in the finger positions corresponds to a rotation gesture, the operation detection unit 63 determines that an operation instructing rotation of the image projected on the transmissive screen 11 has been performed. If it determines that the operation is an operation instructing rotation of the image, the operation detection unit 63 outputs an instruction to rotate the image to the projection image generation unit 61.
  • When the detected operation corresponds to an instruction to reproduce the moving image, the operation detection unit 63 determines that an operation instructing reproduction of the moving image has been input. If it determines that the operation is an operation instructing reproduction of a moving image, the operation detection unit 63 outputs an instruction to reproduce the moving image to the moving image reproduction unit 62.
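  • A hedged sketch of classifying the two-finger operation from the previous and current finger positions is shown below. The spacing and rotation thresholds are illustrative assumptions; the exact decision criteria are not spelled out in the excerpt above.

```python
# Hedged sketch: classify a two-finger operation from the previous and
# current finger positions. Thresholds are illustrative assumptions.
import math

def classify_two_finger_operation(prev, curr, spread_tol=0.02, angle_tol=0.1):
    """prev, curr: ((x1, y1), (x2, y2)) finger positions from the previous
    and current set of infrared images (screen coordinates, metres).
    Returns 'enlarge', 'reduce', 'rotate', or None."""
    def spread(p):
        return math.dist(p[0], p[1])
    def angle(p):
        return math.atan2(p[1][1] - p[0][1], p[1][0] - p[0][0])

    d_spread = spread(curr) - spread(prev)
    d_angle = angle(curr) - angle(prev)

    if d_spread > spread_tol:
        return "enlarge"   # fingers moved apart
    if d_spread < -spread_tol:
        return "reduce"    # fingers moved together
    if abs(d_angle) > angle_tol:
        return "rotate"    # the pair of fingers rotated
    return None
```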
  • the projection image generation unit 61 reads the three-dimensional computer graphics data 44. From the three-dimensional computer graphics data 44, two-dimensional still image data in which the image of the real object 9 is projected onto a predetermined plane is generated.
  • The projection image generation unit 61 generates still image data for projecting an image obtained by enlarging, reducing, or rotating the image currently projected by the projector 13 on the transmissive screen 11.
  • The image 51 initially output to the projector 13 by the projection image generation unit 61 is used as the reference.
  • the projection image generation unit 61 outputs the generated still image data to the projector 13.
  • The projector 13 outputs an image based on the newly input still image data. On the transmissive screen 11, the image newly projected by the projector 13 is formed.
  • For example, the image 53 shown in FIG. 10, which contains an image of the real object 9 as viewed from above, is projected onto the transmissive screen 11.
  • In this way, the image of the real object, which is initially formed on the transmissive screen 11 so as to be superimposed on the real object 9, can be enlarged, reduced, or rotated by operating the transmissive screen 11. As a result, it is possible to give the operating viewer 14 the illusion that he or she is holding the exhibited real object 9 in hand.
  • The operation detection unit 63 then instructs the projection image generation unit 61 to display the initial image 51 shown in FIG. 8. As a result, the image 51 overlapping the real object 9 shown in FIG. 8 is again formed on the transmissive screen 11. This makes it possible to give the operating viewer 14 the illusion that the real object 9 that he or she had been holding and examining has been placed back on the mounting table 2.
  • At this time, the projection image generation unit 61 may generate a series of still image data in which the real image formed on the transmissive screen 11 is continuously enlarged, reduced, and rotated until it matches the real image in the image based on the initial screen data shown in FIG. 8, and may then generate the still image data of the image 51 overlapping the real object 9 shown in FIG. 8.
  • In this case, the image of the real object formed on the transmissive screen 11 comes to overlap the real object 9 after its size and orientation have changed smoothly. By smoothly changing the image of the real object formed on the transmissive screen 11 until it overlaps the real object 9 in this way, the sense of unity between the real image formed on the transmissive screen 11 and the real object 9 can be further enhanced.
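  • A minimal sketch of such a smooth return is shown below: the current scale and rotation of the real image are interpolated back to the initial values over a number of frames, one still image being generated per step. The frame count and linear easing are assumptions.

```python
# Hedged sketch of smoothly returning the real image to its initial pose.
def transition_to_initial(current_scale, current_angle, frames=30,
                          initial_scale=1.0, initial_angle=0.0):
    """Yield (scale, angle) pairs stepping smoothly back to the initial pose;
    one still image would be generated and projected per pair."""
    for i in range(1, frames + 1):
        t = i / frames
        scale = current_scale + (initial_scale - current_scale) * t
        angle = current_angle + (initial_angle - current_angle) * t
        yield scale, angle
```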
  • the moving image reproducing unit 62 generates a video signal from the moving image data 45 when a moving image reproducing instruction is input.
  • the moving image reproducing unit 62 outputs the generated video signal to the projector 13.
  • the projector 13 outputs a moving image based on the video signal.
  • the moving image projected from the projector 13 is displayed on the transmission screen 11.
  • The moving image reproduction unit 62 then instructs the projection image generation unit 61 to display the initial image 51 shown in FIG. 8. As a result, the image 51 overlapping the real object 9 shown in FIG. 8 is displayed on the transmissive screen 11.
  • As described above, the display device forms on the transmissive screen 11 an image of the real object in substantially the same posture as the real object 9 seen through the transmissive screen 11, so that the image overlaps the real object 9. This allows the viewer 14 to recognize that the real image formed on the transmissive screen 11 corresponds one-to-one with the real object 9, and gives the illusion that the real image formed on the transmissive screen 11 is the real object 9 itself.
  • The display device of the first embodiment allows the viewer 14 to view the real object 9 and the virtual real image from the same viewpoint and to feel them as a unified whole. For example, the virtual real image can convey the size and texture of the real object 9, which are difficult to grasp from the virtual real image alone. As a result, the display device of the first embodiment can give the real image a sense of reality.
  • The viewer 14 can enlarge, reduce, or rotate the real image formed on the transmissive screen 11 by operating the transmissive screen 11. Therefore, the viewer 14 can freely view, through the image of the real object, the real object 9 that cannot be touched directly. As a result, a high learning effect for art objects can be expected.
  • the operation on the transmission screen 11 is detected by the retroreflective tape 21 and the two infrared light emitting and receiving units 22 and 23.
  • In Embodiment 1, by employing this configuration, it is possible to detect an operation on the transmissive screen 11 without disposing a member for detecting the operation so as to overlap the transmissive screen 11. Since no member is arranged over the transmissive screen 11, neither the visibility of the real object 9 seen through the transmissive screen 11 nor the sharpness of the real image formed on the transmissive screen 11 is impaired.
  • In Embodiment 1, the two infrared light emitting/receiving units 22, 23 are arranged above the erected transmissive screen 11. Therefore, compared with a case where the two infrared light emitting/receiving units 22, 23 are arranged below the transmissive screen 11, for example, infrared rays other than those from the infrared LED 25 are less likely to enter the infrared CCD 27. Also, since infrared light is used as the light for detecting operations, the influence of visible light is reduced. If visible light were used as the light for detecting operations, detection would be susceptible to, for example, the light of the spot lamp 4 illuminating the real object 9.
  • the operation detecting section 63 can easily specify the position of the hand to be operated based on the infrared image in which the shadow of the hand is clearly seen.
  • the initial image 51 shown in FIG. 8 is projected on the transmission screen 11 at a predetermined position.
  • Alternatively, the initial image 51 shown in FIG. 8, projected at a predetermined position on the transmissive screen 11, may be made movable left and right and up and down on the transmissive screen 11, and enlargeable or reducible, in accordance with operations of the viewer 14.
  • In this case, the viewer 14 can move the real image so that it overlaps the real object 9 as seen from his or her own viewpoint.
  • an image to be subsequently projected on the transmissive screen 11 may be projected based on the moved position.
  • In Embodiment 1, the real image formed based on the three-dimensional computer graphics data 44 is an image of the real object 9 mounted on the mounting table 2 itself.
  • Alternatively, an image in which a changed color is applied to the real object 9 may be used as the real image formed based on the three-dimensional computer graphics data 44.
  • In Embodiment 1, the moving image based on the moving image data 45 is simply projected continuously on the transmissive screen 11.
  • a video signal based on the moving image data 45 may be generated stepwise according to an operation of the viewer 14 on the translucent screen 11. As a result, for example, the viewer 14 can relive the coloring process of the real object 9 and the like.
  • moving image data 45 is stored in hard disk device 35.
  • various data such as slide data and audio data may be reproduced on an exhibition device to form an image, a video, or character information related to a real object on the transparent screen 11.
  • the operation on the transmission screen 11 is detected by the retroreflective tape 21 and the two infrared light emitting and receiving units 22 and 23.
  • the operation on the transmissive screen 11 may be detected by a touch panel or the like disposed so as to overlap the transmissive screen 11.
  • In that case, however, the viewer 14 feels not so much that he or she is operating the real image formed on the transmissive screen 11, but rather that he or she is operating the touch panel.
  • For this reason, the operation on the transmissive screen 11 is detected by the retroreflective tape 21 and the two infrared light emitting/receiving units 22, 23, so that the sense of unity can be maintained.
  • a screen having a transmittance of 65% is used as the transmission screen 11.
  • the transmission screen 11 may have a transmittance of 10% or more and 90% or less.
  • a polarizing screen may be used in place of the transmission screen 11.
  • the polarizing screen is not suitable for a large-scale display device in which a plurality of viewers 14 can view at the same time, because the color of the image changes only by slightly changing the viewing angle.
  • Since the projector 13 is disposed on the real object 9 side of the transmissive screen 11, the area occupied by the display device is smaller than when the projector 13 is disposed on the viewer 14 side of the transmissive screen 11, for example.
  • The display device according to the second embodiment differs from the display device according to the first embodiment in that the position and size of the real image formed on the transmissive screen 11 are controlled in accordance with the height, viewpoint (eye) position, standing position, and the like of the viewer 14 standing in front of the transmissive screen 11. In the following description, this difference will mainly be described.
  • FIG. 12 is a diagram showing a display device according to Embodiment 2 of the present invention.
  • FIG. 12A is a side view of the display device.
  • FIG. 12B is a front view of the display device.
  • FIG. 13 is a block diagram showing a hardware configuration of a control system of the exhibition device in FIG. 12.
  • FIG. 14 is a block diagram showing a control system of the display device in FIG. 12.
  • FIG. 15 is a diagram showing data stored in the hard disk device 35 of FIG. 13.
  • the exhibition device includes a distance sensor 71 as a viewing position detecting unit and an imaging device 72 as a viewing position detecting unit.
  • the distance sensor 71 is attached to the center of the lower end of the transmission screen 11 toward the front of the transmission screen 11.
  • the distance sensor 71 outputs infrared light or other light, and receives reflected light of the light.
  • The distance sensor 71 calculates the distance to the object reflecting the light based on the time from when the light is output until the reflected light is received.
  • the distance sensor 71 outputs the calculated distance to the input / output port 32, as shown in FIG.
  • the imaging device 72 has a plurality of light receiving elements (not shown).
  • the light receiving element outputs a light receiving level signal corresponding to the amount of received light.
  • the plurality of light receiving elements are arranged vertically and horizontally in the same plane.
  • the surface on which the plurality of light receiving elements are arranged is called a light receiving surface.
  • the imaging device 72 is attached to the center of the upper end of the opaque plate 12 in such a manner that the light receiving surface thereof is directed downward by a predetermined angle in front of the transmission screen 11.
  • the imaging device 72 outputs a captured image based on the light receiving level signals of the plurality of light receiving elements to the input / output port 32, as shown in FIG.
  • the hard disk device 35 stores a projection image generation program 81 and a human detection program 82, as shown in FIG.
  • the projection image generation program 81 is read into the memory 34 and executed by the CPU 33.
  • the projection image generation unit 91 shown in FIG. 14 as the image changing unit and the image control unit is generated.
  • the projection image generation unit 91 generates still image data based on the three-dimensional computer graphics data 44, and sends the generated still image data to the projector 13. Output.
  • the human detection program 82 is read into the memory 34 and executed by the CPU 33.
  • a person detecting unit 92 shown in FIG. 14 is generated as a viewer viewpoint specifying unit and a viewing position detecting unit.
  • The captured image captured by the imaging device 72 and the distance to the light-reflecting object calculated by the distance sensor 71 are input to the human detection unit 92.
  • The components other than those described above are the same as the components of the display device according to the first embodiment; they are referred to by the same reference numerals and names, and their description is omitted.
  • In the initial state, in which there is no person around the display device, the distance sensor 71 outputs light toward the area in front of the transmissive screen 11.
  • the imaging device 72 images a predetermined range on the near side of the transmission screen 11. Spot lamp 4 is lit.
  • the projector 13 is off.
  • the projector 13 can be turned off by setting its luminance to 0. When the projector 13 cannot be turned off, a black image may be output.
  • When the viewer 14 stands in front of the transmissive screen 11 to view the real object 9 (the silver cup) illuminated by the light of the spot lamp 4, the distance sensor 71 receives the reflected light of the light it emitted. The distance sensor 71 calculates the distance to the viewer 14 and outputs the distance to the input/output port 32. The imaging device 72 outputs a captured image including the image of the viewer 14 to the input/output port 32.
  • Information on the distance to the viewer 14 detected by the distance sensor 71 and the captured image are input to the human detection unit 92.
  • Based on the distance information and the captured image at that time, the human detection unit 92 identifies the position of the head of the viewer 14 standing in front of the transmissive screen 11 as the viewpoint position. Instead of the head, a position such as the eyes or the space between the eyebrows may be specified as the viewpoint position.
  • Specifically, the viewpoint position (L, H) is specified based on the installation height of the imaging device 72, the shooting angle θ corresponding to each pixel of the captured image, and the distance L to the viewer 14.
  • the human detection unit 92 specifies the position of the head of the viewer 14 (that is, the viewpoint position) in the captured image by image processing.
  • The human detection unit 92 assumes that the distance to the viewer 14 detected by the distance sensor 71 is the distance from the transmissive screen 11 to the viewer 14, and virtually arranges the captured image at that distance from the transmissive screen 11.
  • the relative position of the identified head with respect to the transmissive screen 11 and the real object 9 is identified based on the virtual arrangement.
  • The human detection unit 92 outputs information on the specified relative position of the head to the projection image generation unit 91.
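  • A hedged sketch of this viewpoint estimation is shown below, assuming a simple pinhole camera mounted at a known height roughly in the plane of the screen and tilted downward by a known angle. The parameter names and the pinhole model itself are illustrative assumptions.

```python
# Hedged sketch of estimating the viewpoint (L, H) from the distance sensor
# reading and the vertical pixel position of the detected head.
import math

def estimate_viewpoint(distance_l, head_row, image_height, vertical_fov,
                       cam_height, cam_tilt):
    """distance_l: distance from the screen to the viewer (distance sensor 71).
    head_row: pixel row of the detected head in the captured image.
    vertical_fov, cam_tilt: radians. Returns (L, H), the viewer's distance
    from the screen and the height of the viewpoint."""
    # Angle of the head below the camera's optical axis (pinhole model).
    offset = (head_row - image_height / 2) / (image_height / 2)
    angle_below_axis = offset * (vertical_fov / 2)
    # Total downward angle from the horizontal to the head.
    theta = cam_tilt + angle_below_axis
    viewpoint_height = cam_height - distance_l * math.tan(theta)
    return distance_l, viewpoint_height
```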
  • Based on the information on the relative position of the head (that is, the viewpoint position) and the three-dimensional computer graphics data 44, the projection image generation unit 91 generates still image data such that the real image formed on the transmissive screen 11 matches the real object 9 when viewed from the position of the head.
  • Specifically, as shown in FIG. 16, the projection image generation unit 91 first specifies the position 102 at which the line connecting the head position (that is, the viewpoint position) specified by the human detection unit 92 and the center of the real object 9 intersects the transmissive screen 11. This position 102 is the center of the real image formed on the transmissive screen 11. In addition, the projection image generation unit 91 specifies the size 101 of the real image formed on the transmissive screen 11.
  • After specifying the center 102 and the size 101 of the real image formed on the transmissive screen 11, the projection image generation unit 91 generates still image data in which the real image is formed at the specified part.
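  • A minimal sketch of the geometry of FIG. 16 is shown below: the centre 102 is the intersection of the line from the viewpoint to the centre of the real object 9 with the screen plane, and the size 101 follows from similar triangles. The coordinate convention (screen in the plane z = 0, viewer at z > 0, real object at z < 0) is an assumption made for illustration.

```python
# Hedged sketch of computing the on-screen centre 102 and size 101 of the
# real image from the viewpoint and the real object's centre.
import numpy as np

def real_image_center_and_scale(viewpoint, object_center, object_size):
    """viewpoint, object_center: (x, y, z) in metres, screen plane at z = 0.
    Returns the on-screen centre (x, y) of the real image and its size."""
    v = np.asarray(viewpoint, dtype=float)
    o = np.asarray(object_center, dtype=float)
    t = v[2] / (v[2] - o[2])      # parameter where the line crosses z = 0
    p = v + t * (o - v)           # intersection point on the screen
    return (p[0], p[1]), object_size * t   # similar-triangle size ratio

# Example: eye 1.6 m high, 1.0 m in front of the screen; cup centre 1.1 m
# high, 0.8 m behind the screen; cup 0.3 m tall (all values illustrative).
center, size = real_image_center_and_scale((0.2, 1.6, 1.0), (0.0, 1.1, -0.8), 0.3)
```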
  • the projection image generation unit 91 outputs the generated still image data to the projector 13.
  • the projector 13 starts outputting and outputs an image based on the still image data.
  • An image projected by the projector 13 is formed on the transmissive screen 11.
  • FIG. 17 is a diagram showing an image 111 based on still image data generated based on the three-dimensional computer graphics data 44.
  • In the image 111, the image of the real object 9 (here, the silver cup) as seen from the front side of the transmissive screen 11 is formed with its center shifted toward the lower left in FIG. 17.
  • FIG. 18 is a diagram showing a state where image 111 shown in FIG. 17 is projected on transmissive screen 11.
  • The real image formed on the transmissive screen 11 is formed at a position shifted toward the lower left from the center of the transmissive screen 11.
  • To the viewer 14 standing to the left in front of the transmissive screen 11, the real image formed on the transmissive screen 11 and the real object 9 seen through the transmissive screen 11 appear to overlap completely.
  • In Embodiment 2, based on the detection of the viewer 14 by the distance sensor 71, the projection image generation unit 91 generates still image data, and the projector 13 starts outputting an image based on the still image data.
  • the operation of the display device after the display of the initial screen is the same as the operation of the display device according to the first embodiment, and a description thereof will be omitted.
  • As described above, the display device according to the second embodiment controls the position and size of the real image formed on the transmissive screen 11 according to the height and standing position of the viewer 14 standing in front of the transmissive screen 11.
  • The display device according to the second embodiment adjusts the position of the real image on the transmissive screen 11 in this way. This allows the viewer 14 to see the real image formed on the transmissive screen 11 superimposed on the real object 9, irrespective of the viewer's height and standing position. Further, unlike Embodiment 1, the standing position of the viewer 14 does not have to be fixed in advance. In the case of the display device according to the second embodiment, the position from which the viewer 14 views may be anywhere on the front side of the transmissive screen 11.
  • FIG. 19 is a diagram showing a situation where three viewers 14 are standing in front of the transmissive screen 11.
  • When a plurality of viewers 14 stand in front of the transmissive screen 11, the human detection unit 92 may identify the viewer closest to the center of the transmissive screen 11 (in FIG. 19, the middle one of the three viewers 14) as the representative viewer 14 operating the transmissive screen 11, and the real image formed on the transmissive screen 11 may be made to overlap the real object 9 as seen by that representative viewer 14.
  • The other viewers 14, who do not operate the transmissive screen 11, merely watch the operations of the representative viewer 14, so a sufficient viewing effect is obtained even with this kind of control. The same effect can also be expected if the viewer located at the center of the captured image is selected instead of the viewer located closest to the center of the transmissive screen 11 (see the selection sketch below).
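As a rough illustration of the two selection rules mentioned above (closest to the screen center, or closest to the center of the captured image), the following sketch is illustrative only; the data layouts and function names are assumptions, not interfaces from the patent.

```python
def pick_representative(heads, screen_center_x=0.0):
    """heads: list of (x, z) head positions in screen coordinates.
    Returns the head closest to the horizontal center of the transmissive screen 11."""
    return min(heads, key=lambda head: abs(head[0] - screen_center_x))

def pick_by_image_center(boxes, image_width):
    """boxes: list of (x_min, y_min, x_max, y_max) person detections in the captured image.
    Returns the detection whose horizontal center is closest to the image center."""
    cx = image_width / 2
    return min(boxes, key=lambda b: abs((b[0] + b[2]) / 2 - cx))
```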
  • In the above description, the human detection unit 92 specifies the position of the viewer 14 accurately based on the detection information of the distance sensor 71 and the imaging device 72.
  • Alternatively, the human detection unit 92 may have the viewer 14 touch the transmissive screen 11 and specify the approximate position of the head of the viewer 14 from the touched position, based on the positional relationship between the transmissive screen 11 and the viewer 14 that can be assumed when the viewer 14 touches the screen with an outstretched arm.
  • FIG. 20 is a diagram showing a state in which the viewer 14 is touching the transmission screen 11.
  • In that case, the human detection unit 92 may display a message such as "Touch here" on the transmissive screen 11 to prompt the viewer 14 to touch the transmissive screen 11 (a rough sketch of this estimate follows below).
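A rough sketch of the touch-based head estimate is given below. It assumes a typical arm reach and a fixed offset from a comfortable touch height to eye height; both constants and the function name are hypothetical, chosen only to illustrate the idea.

```python
def head_from_touch(touch_x, touch_z, arm_reach=0.55, eye_above_touch=0.10):
    """Rough head (eye) position estimated from a touch on the transmissive screen 11.

    touch_x, touch_z : touched position on the screen plane (m)
    arm_reach        : assumed distance from the screen to the viewer's eyes (m)
    eye_above_touch  : assumed height of the eyes above a comfortable touch point (m)

    Returns (x, y, z), with y measured from the screen plane toward the viewer (negative side).
    """
    return touch_x, -arm_reach, touch_z + eye_above_touch
```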
  • In this configuration, the operation detection unit 63 serves as viewer viewpoint specifying means for specifying the position of the viewer 14.
  • In the above description, the human detection unit 92 specifies the position of the viewer 14 and adjusts the formation position of the real image when the initial screen is displayed. In addition to this, for example, the human detection unit 92 may specify the position of the viewer 14 and adjust the formation position of the real image each time the projection image generation unit 91 outputs still image data.
  • In that case, the real image can be shifted to follow the eyes of the viewer 14 so that it is always formed at a position overlapping the real object 9 (a minimal per-frame update loop is sketched below). On the other hand, a control device 31 capable of high-speed image processing is then required, and the display device becomes more expensive.
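For reference, a per-frame variant might look like the loop below. This is an illustrative sketch only; human_detector, renderer, and compute_center are hypothetical stand-ins for the human detection unit 92 (with sensors 71 and 72), the projection image generation unit 91 together with the projector 13, and the screen-intersection computation sketched earlier.

```python
import time

def tracking_loop(human_detector, renderer, compute_center, obj_center, frame_rate=30.0):
    """Re-specify the viewer position and re-place the real image for every output frame."""
    period = 1.0 / frame_rate
    while True:                                    # runs until the exhibit is shut down
        eye = human_detector.head_position()       # current viewpoint of the viewer 14
        center = compute_center(eye, obj_center)   # where the real image should sit on the screen
        renderer.render_at(center)                 # regenerate the still image and project it
        time.sleep(period)
```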
  • In Embodiment 2, the projector 13 is turned off when there is no viewer 14 around the display device. Alternatively, for example, when there is no viewer 14 around the display device, the projector 13 may output an image other than the real image. This makes the display easier to notice for viewers 14 who are far away from the display device.
  • The display device according to Embodiment 3 differs from the display device according to Embodiment 2 in that the spot lamp 4 is dimmed in accordance with the projection of an image onto the transmissive screen 11.
  • the following description focuses mainly on this difference.
  • FIG. 21 is a block diagram showing a hardware configuration of a control system of the display device according to Embodiment 3 of the present invention.
  • FIG. 22 is a block diagram showing a control system of the display device of FIG.
  • FIG. 23 is a diagram showing data stored in the hard disk device 35 of FIG.
  • the spot lamp 4 is connected to the input / output port 32 as shown in FIG.
  • the light amount of the spot lamp 4 is controlled by the control system of the exhibition device.
  • the hard disk device 35 according to the third embodiment stores a projection image generation program 121 and a lamp control program 122, as shown in FIG.
  • the projection image generation program 121 is read into the memory 34 and executed by the CPU 33.
  • As a result, the projection image generation unit 131 shown in FIG. 22 is generated as image changing means, light amount control means, and image control means.
  • the projection image generation unit 131 generates still image data based on the three-dimensional computer graphics data 44, and outputs the generated still image data to the projector 13.
  • The lamp control program 122 is read into the memory 34 and executed by the CPU 33. As a result, the lamp control unit 132 shown in FIG. 22 is generated as light amount control means. The lamp control unit 132 controls the light amount of the spot lamp 4 within a range of 0 to 100% (a minimal control sketch follows below).
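A minimal sketch of such a lamp control unit is shown below. The Dimmer class is a stand-in for whatever dimming hardware interface the real system uses (DMX, a serial dimmer pack, etc.); none of these names come from the patent.

```python
class Dimmer:
    """Placeholder hardware driver; replace with the actual dimmer interface."""
    def write(self, level):
        print(f"spot lamp 4 output: {level:.0f}%")

class LampController:
    """Clamps a requested output level to 0-100% and hands it to the dimmer."""
    def __init__(self, dimmer):
        self._dimmer = dimmer

    def set_level(self, percent):
        self._dimmer.write(max(0.0, min(100.0, percent)))

    def turn_on(self):
        self.set_level(100.0)

    def turn_off(self):
        self.set_level(0.0)

# Usage: lamp = LampController(Dimmer()); lamp.set_level(40)   # dim to 40%
```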
  • The components other than those described above are the same as those of the display device according to Embodiment 2; they are referred to using the same reference numerals and names, and their description is omitted.
  • In an initial state in which no person is around the display device, the distance sensor 71 outputs light in the forward direction of the transmissive screen 11.
  • the imaging device 72 images a predetermined range on the near side of the transmission screen 11.
  • the lamp control unit 132 lights the spot lamp 4 at 100% output.
  • the projector 13 is off.
  • When the viewer 14 stands in front of the transmissive screen 11 to view the real object 9 (here, the silver cup) illuminated by the light of the spot lamp 4, the distance sensor 71 outputs the distance to the viewer 14.
  • the human detection unit 92 specifies the position of the head of the viewer 14 and outputs information on the relative position of the specified head to the translucent screen 11 and the real object 9 to the projection image generation unit 131.
  • Based on the information on the relative position of the head and the three-dimensional computer graphics data 44, the projection image generation unit 131 generates still image data such that the real image formed on the transmissive screen 11 matches the real object 9 when viewed from the position of the head, and outputs it to the projector 13. The projector 13 starts up and outputs an image based on the still image data, and the image projected by the projector 13 is formed on the translucent screen 11.
  • The projection image generation unit 131 also instructs the lamp control unit 132 to turn off the lamp (a short sequencing sketch follows below).
  • the lamp control unit 132 turns off the spot lamp 4.
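The start-of-display sequence in Embodiment 3 (render the matching image, start projection, darken the spot lamp) can be summarized as below. This is an illustrative sketch; proj_gen, projector, and lamp_ctrl stand in for units 131, 13, and 132, and the methods they expose are hypothetical.

```python
def show_real_image(proj_gen, projector, lamp_ctrl, head_position):
    """Render the still image that matches the real object 9 for this viewpoint,
    start projecting it, then darken the spot lamp 4."""
    frame = proj_gen.render_for(head_position)
    projector.display(frame)
    lamp_ctrl.turn_off()        # or lamp_ctrl.set_level(20) to merely dim the lamp
```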
  • If the transmittance of the transmissive screen 11 is lower than 50%, the real object 9 becomes invisible as soon as the spot lamp 4 is turned off, and it is difficult to give the impression that the real object 9 placed on the mounting table 2 has moved toward the near side. If the transmittance of the transmissive screen 11 is higher than 80%, the real object 9 remains visible even when the spot lamp 4 is turned off, and it is likewise difficult to give the impression that the real object 9 placed on the mounting table 2 has moved toward the near side. For this reason, the transmittance of the transmissive screen 11 is set between 50% and 80%.
  • the operation detection unit 63 instructs the projection image generation unit 131 to display the previously displayed initial image.
  • After displaying on the transmissive screen 11 the initial image 51 that was displayed previously, and after a predetermined time has elapsed, the projection image generation unit 131 outputs a turn-off instruction to the projector 13 and outputs a turn-on instruction to the lamp control unit 132.
  • the lamp control unit 132 turns on the spot lamp 4.
  • the projector 13 is turned off.
  • The operation of the display device from the time the initial screen is displayed until the display of the initial screen ends is the same as that of the display device according to Embodiment 2, and a description thereof is omitted.
  • As described above, in Embodiment 3 the spot lamp 4 is turned off when the real image is displayed on the transmissive screen 11. This can give the viewer 14 the illusion that the real object 9 placed on the mounting table 2 has come toward the near side, and the attention of the viewer 14 moves back and forth between the real object 9 and the real image. The same effect can also be expected by merely reducing the light amount of the spot lamp 4 instead of turning the spot lamp 4 off.
  • When the display of the real image on the transmissive screen 11 ends, the spot lamp 4 is turned on. This can give the viewer 14 the illusion that the real image formed on the transmissive screen 11 has returned to the real object 9 on the far side of the transmissive screen 11, and the point of interest of the viewer 14 again moves between the real image and the real object 9. The same effect can also be expected by merely increasing the light amount of the spot lamp 4 instead of turning the spot lamp 4 on.
  • In Embodiment 3, only the spot lamp 4 is lit when there is no person around the display device, and only the projector 13 is lit when a viewer 14 comes.
  • Instead, the lighting of the spot lamp 4 and the lighting of the projector 13 may be controlled stepwise, or may be controlled so as to change gradually (a simple cross-fade sketch follows below).
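One way to realize the gradual change is a simple cross-fade between the spot lamp 4 and the projector 13. The sketch below is illustrative; it assumes lamp and projector objects exposing set_level() and set_brightness() methods, which are not defined in the patent.

```python
import time

def cross_fade(lamp, projector, steps=20, duration=2.0):
    """Dim the spot lamp 4 while the projected image is brought up, over `duration` seconds."""
    for i in range(steps + 1):
        fraction = i / steps
        lamp.set_level(100.0 * (1.0 - fraction))        # spot lamp fades out
        projector.set_brightness(100.0 * fraction)      # projected image fades in
        time.sleep(duration / steps)
```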
  • Alternatively, both the spot lamp 4 and the projector 13 may be kept off when there is no person around the display device, and only the spot lamp 4 may be turned on when a viewer 14 comes.
  • In the embodiments described above, the display device has one spot lamp 4 that outputs incandescent light.
  • Instead, the display device may have a spot lamp 4 that outputs incandescent light and a spot lamp 141 that outputs monochromatic light.
  • The display device may be controlled so that the plurality of spot lamps 4 and 141 are lit alternately, or so that they are lit simultaneously. When the plurality of spot lamps 4 and 141 are lit at the same time, the apparent color of the real object 9 can be changed by mixing their light.
  • FIG. 24 is a cross-sectional view showing a modification of the display device having the spot lamp 4 that outputs incandescent light and the spot lamp 141 that outputs monochromatic light.
  • In the embodiments described above, the display device has one spot lamp 4 and one real object 9.
  • Alternatively, the display device may have a plurality of sets of spot lamps 4, 151 and real objects 9, 152.
  • In that case, the display device may switch which of the plurality of spot lamps 4 and 151 is lit, thereby switching the point of interest of the viewer 14 between the plurality of real objects 9 and 152 (a simple scheduling sketch is given after FIG. 25 below).
  • FIG. 25 is a cross-sectional view showing a modification of the display device having a plurality of sets of spot lamps 4 and 151 and real objects 9 and 152.
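A trivial way to switch the viewer's point of interest between several exhibits is to light one spot lamp at a time on a timer. The sketch below is illustrative only; the controller interface (turn_on/turn_off) is assumed, not taken from the patent.

```python
import itertools
import time

def cycle_exhibits(lamp_controllers, dwell_seconds=30.0):
    """Light one spot lamp at a time so attention moves between the real objects 9 and 152.
    Runs until interrupted."""
    for active in itertools.cycle(range(len(lamp_controllers))):
        for i, lamp in enumerate(lamp_controllers):
            if i == active:
                lamp.turn_on()
            else:
                lamp.turn_off()
        time.sleep(dwell_seconds)
```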
  • In the embodiments described above, the entry restriction frame 1 of the display device is assembled as a cubic frame structure.
  • Alternatively, the entry restriction frame 1 may be formed in a hexagonal column shape or another columnar shape.
  • The display device may also be installed in a pillar of a building or the like.
  • In the embodiments described above, the transmissive screen 11 is provided only between one pair of column members 5 of the entry restriction frame 1.
  • Alternatively, transmissive screens may be provided between all of the pairs of column members 5 of the entry restriction frame 1. This allows a plurality of viewers 14 to operate their respective transmissive screens at the same time and to view at their own pace.
  • the display device according to the present invention can be used to display arts, crafts, commodities, and the like that actually exist.

Abstract

An exhibition system with which an appreciator can view an actually existing real object with a sense of realism when it is exhibited. The screen (11) of the exhibition system (1), disposed between the exhibited real object (9) and an appreciation position A, is translucent and is large enough for the appreciator (14) at the appreciation position A to see the real object (9) through it. A projector (13) projects onto the screen (11) an image of the real object in substantially the same attitude as the real object (9) appears when viewed through the screen (11) from the viewpoint of the appreciator (14), so that the image is seen superposed on the real object (9).

Description

明 細 書  Specification
展示装置  Exhibition equipment
技術分野  Technical field
[0001] 本発明は、展示装置に関する。  The present invention relates to a display device.
背景技術  Background art
[0002] 特許文献 1は、展示装置を開示する。この展示装置は、投射映像と展示物を合成し た展示を行なう。特許文献 1は、恐竜の映像と森のミニチュアとを合成する展示例を 開示する。  [0002] Patent Document 1 discloses a display device. This exhibition device performs an exhibition that combines the projected image and the exhibits. Patent Literature 1 discloses an exhibition example in which an image of a dinosaur is combined with a forest miniature.
[0003] 特許文献 2は、仮想操作システムを開示する。この仮想操作システムは、インターネ ットを介して自宅のパソコン等力 あた力も実物に触れたり、実物を動かしたりしてい るかのようなバーチャル体験ができる。特許文献 2は、玩具のロボットなどをインター ネットを介して遠隔操作する具体例を開示する。  [0003] Patent Document 2 discloses a virtual operation system. This virtual operation system can provide a virtual experience as if the power of a personal computer at home, for example, is touching the real thing or moving the real thing through the Internet. Patent Document 2 discloses a specific example of remotely controlling a toy robot or the like via the Internet.
[0004] 特許文献 3は、ポインティングデバイスを開示する。このポインティングデバイスは、 表示装置の前方に配置される入力平面板と、入力平面板に対して所定の距離を有 する平面内に走査光を投受光する光センサユニットと、光センサユニットの投光する 光線を再帰反射する再帰反射部材と、光センサユニットの受光した再帰反射光を利 用して、入力平面板上の指示体を撮像し電気信号に変換する撮像手段と、撮像手 段により得られた撮像信号を解析して指示体の座標位置を算出する画像処理手段と を有する。  [0004] Patent Document 3 discloses a pointing device. The pointing device includes an input plane plate arranged in front of the display device, an optical sensor unit for projecting and receiving scanning light within a plane having a predetermined distance from the input plane plate, and a light projection unit of the optical sensor unit. A retroreflective member that retroreflects light rays, imaging means that uses the retroreflected light received by the optical sensor unit to image a pointer on the input flat plate, and converts the pointer into an electric signal, and an imaging means. Image processing means for analyzing the obtained imaging signal to calculate the coordinate position of the pointer.
[0005] 特許文献 4は、表示手段の表示画面に表示される仮想空間画像を移動させる仮想 空間移動制御装置を開示する。この仮想空間移動制御装置は、表示画面において 行われる、複数の指示部位力 なる指示体のタツチ操作またはドラック操作を検出す る検出手段と、検出手段の検出結果に基づいて、仮想空間画像の視点位置情報を 生成する視点位置情報生成手段と、視点位置情報が指示する視点から見た仮想空 間画像データを生成し、表示手段へ出力する立体画像生成手段とを有する。  [0005] Patent Document 4 discloses a virtual space movement control device that moves a virtual space image displayed on a display screen of a display unit. The virtual space movement control device includes a detection unit that detects a touch operation or a drag operation of a plurality of pointing parts, which are performed on a display screen, and a viewpoint of a virtual space image based on a detection result of the detection unit. The apparatus includes viewpoint position information generating means for generating position information, and stereoscopic image generating means for generating virtual space image data viewed from the viewpoint indicated by the viewpoint position information and outputting the data to the display means.
[0006] 特許文献 1 :特開平 5— 35192号公報 (要約書など)  Patent Document 1: Japanese Patent Application Laid-Open No. 5-35192 (Abstract, etc.)
特許文献 2:特開 2002— 170055号公報(要約書など) 特許文献 3:特開 2004— 5271号公報 (特許請求の範囲など) Patent Document 2: Japanese Patent Application Laid-Open No. 2002-170055 (Abstract, etc.) Patent Document 3: Japanese Patent Application Laid-Open No. 2004-5271 (Claims, etc.)
特許文献 4:特開 2004 - 5272号公報 (特許請求の範囲など)  Patent Document 4: Japanese Patent Application Laid-Open No. 2004-5272 (Claims, etc.)
発明の開示  Disclosure of the invention
発明が解決しょうとする課題  Problems to be solved by the invention
[0007] 従来、美術品、工芸品、商品などの実物を展示する場合、その実物は、展示ケース に収容されて展示される。し力しながら、このように展示ケースに実物を収容して展示 した場合、その実物を自由に手にとって鑑賞したり、実物の裏側を眺めたりすることは できな 、と!/、う問題がある。  [0007] Conventionally, when real objects such as arts, crafts, and commodities are displayed, the real objects are displayed in a display case. However, if you put the real thing in the display case and display it in this way, you will not be able to freely watch the real thing and look at the back of the real thing! is there.
[0008] そこで、展示する実物を予め撮像し、その撮像した写真や静止画などを提供するこ とが考えられる。これにより、展示ケースに収容されている実物の裏側を見せたりする ことができる。また、この静止画などを提供する際に、上述する各特許文献の装置な どを利用することが考えられる。  [0008] Therefore, it is conceivable to take an image of a real object to be exhibited in advance and provide the photographed still image or the like. This makes it possible to show the back side of the real thing stored in the display case. In providing such a still image or the like, it is conceivable to use the devices described in the above-mentioned patent documents.
[0009] し力しながら、このように展示ケースに収容されている実物とは別に、たとえば展示 ケースの隣において実物を撮像した写真や静止画などを提供するとしても、鑑賞者 は、たえず視点を動力して、その写真や静止画などに映し出されている実物の像と、 展示ケースに収容されている実物とを、自分の頭の中で対応付けなければならない  [0009] In spite of this, while providing a photograph or a still image of the real object next to the real case, for example, in addition to the real object housed in the display case, the viewer always gets the viewpoint. Must be able to associate the real image shown in the photograph or still image with the real object contained in the display case in your own head.
[0010] たとえば写真において工芸品の一部を拡大した像を提供する場合、その拡大され た部分を展示ケースに収容されている実物に嵌め込むように想像力を働力せなけれ ば、鑑賞者は、その拡大された部分の細工の細力さやすばらしさなどを、実感をもつ て理解することは困難である。このように、写真などに、実物の大きさや質感などを持 たせることは、難しい。 [0010] For example, when providing a magnified image of a part of a handicraft in a photograph, the viewer must use his or her imagination to fit the magnified part into the real thing contained in the display case. It is difficult to understand with real feeling the fineness and fineness of the work of the enlarged part. In this way, it is difficult to give photographs and the like the real size and texture.
[0011] 本発明は、現実に存在する実物を展示する場合に鑑賞者が臨場感をもって鑑賞す ることができる展示装置を得ることを目的とする。  [0011] It is an object of the present invention to provide a display device that allows a viewer to view with realism when exhibiting a real object that actually exists.
課題を解決するための手段  Means for solving the problem
[0012] 本発明に係る展示装置は、展示対象である実物と鑑賞位置との間に設けられ、そ の鑑賞位置にいる鑑賞者力もその実物が見えるサイズでありかつ透過性のあるスクリ ーンと、鑑賞者の視点で、スクリーン越しに見たときの実物と略同じ姿勢である実物の 像を、実物と重なって見えるようにスクリーンに投影するプロジェクタとを有するもので ある。 [0012] The display device according to the present invention is provided between the real object to be displayed and the viewing position, and the viewer's power at the viewing position is large enough to see the real object and is a transparent screen. And, from the viewer's point of view, the real posture that is almost the same as the real posture when viewed through the screen And a projector for projecting an image on a screen so that the image overlaps with the real object.
[0013] この展示装置は、鑑賞者に、スクリーンに形成される実物の像が実物と一対一に対 応するものであると認知させ、スクリーンに形成される実物の像が実物そのものである ような錯覚を与えることができる。この展示装置は、現実に存在する実物の像に、現 実感を持たせることができる。  [0013] This display device allows the viewer to perceive that the image of the real object formed on the screen corresponds one-to-one with the real object, so that the real image formed on the screen is the real object itself. Illusion can be given. This display device can give a real image to a real image that actually exists.
[0014] 本発明に係る展示装置は、上述した各発明の構成に加えて、鑑賞者の鑑賞位置ま たは視点の位置を検出する鑑賞位置検出手段と、鑑賞位置検出手段が検出した位 置に応じて、その位置力 見て実物の像が実物と重なるように、スクリーンに形成され る実物の像を変える像変更手段とを有するものである。  [0014] In addition to the configuration of each of the above-described inventions, the exhibition apparatus according to the present invention further includes a viewing position detecting unit that detects a viewing position or a viewpoint position of a viewer, and a position detected by the viewing position detecting unit. Image change means for changing the image of the real object formed on the screen so that the image of the actual object overlaps the actual object in view of the positional force.
[0015] この構成を採用すれば、鑑賞者の 、る位置力も見て、実物の像は実物と重なって 見える。また、実物の像が実物と重なって見える位置を、指定する必要がなくなる。  [0015] If this configuration is adopted, the real image appears to overlap with the real object, taking into account the viewer's position. Further, it is not necessary to specify a position where the real image is seen overlapping with the real image.
[0016] 本発明に係る展示装置は、上述した各発明の構成に加えて、鑑賞位置検出手段 1S スクリーン力 鑑賞者までの距離を検出する距離センサと、スクリーンの鑑賞者側 の所定の範囲を撮像する撮像装置と、距離センサが検出した距離および撮像装置 が撮像した画像に基づ!/、て、鑑賞者の視点の位置を特定する鑑賞者視点特定手段 とを有するちのである。  [0016] The display device according to the present invention, in addition to the configuration of each of the above-described inventions, includes a viewing position detecting unit 1S, a screen force, a distance sensor for detecting a distance to a viewer, and a predetermined range on the viewer side of the screen. An imaging device for imaging, and a viewer viewpoint specifying means for specifying the position of the viewpoint of the viewer based on the distance detected by the distance sensor and the image captured by the imaging device.
[0017] この構成を採用すれば、鑑賞者の視点位置を正確に特定することかできる。  With this configuration, it is possible to accurately specify the viewpoint position of the viewer.
[0018] 本発明に係る展示装置は、上述した各発明の構成に加えて、鑑賞者視点特定手 段が、撮像された画像に複数の鑑賞者が写っている場合には、その画像の真中にい る鑑賞者あるいはスクリーンの真中に最も近 、ところに 、る鑑賞者の視点位置を特定 するものである。 [0018] In addition to the configuration of each of the above-described inventions, the display device according to the present invention may be configured such that, when a plurality of viewers are included in a captured image, the viewer viewpoint specifying means is located at the center of the image. This is to identify the viewpoint position of the viewer who is closest to the center of the viewer or the screen.
[0019] この構成を採用すれば、スクリーンの前に複数の鑑賞者がいたとしても、撮像画像 の真中に 、る鑑賞者あるいはスクリーンの真中に最も近 、ところに 、る鑑賞者からは 、実物の像が実物と重なって見える。  [0019] By employing this configuration, even if there are a plurality of viewers in front of the screen, the viewer is in the middle of the captured image or closest to the center of the screen. Image appears to overlap with the real one.
[0020] 本発明に係る展示装置は、上述した各発明の構成に加えて、鑑賞位置検出手段 力 スクリーンに触れた位置を検出する接触位置検出手段と、接触位置検出手段が 検出した接触位置に基づ!/、て、鑑賞者が腕を伸ばして触れたと仮定したスクリーンと 鑑賞者との位置関係に基づいて、鑑賞者の位置を特定する鑑賞者視点特定手段と を有するものである。 [0020] In addition to the configuration of each of the above-described inventions, the exhibition device according to the present invention further includes a viewing position detecting unit, a contact position detecting unit that detects a position touching the power screen, and a contact position detected by the contact position detecting unit. Based on the screen assuming that the viewer reached out and touched And a viewer viewpoint specifying means for specifying the position of the viewer based on the positional relationship with the viewer.
[0021] この構成を採用すれば、鑑賞者のいる位置を簡単な処理にて特定することかできる  By employing this configuration, the position where the viewer is present can be specified by simple processing.
[0022] 本発明に係る展示装置は、上述した各発明の構成に加えて、スクリーンが、透過率 力 50%以上でかつ 80%以下である透過スクリーンであり、且つ、スクリーンより実物 側に配設され、実物を照らす投光部材と、スクリーンに実物の像を形成する場合には 、投光部材が出力する光量を下げる第一の制御およびスクリーンに形成していた実 物の像を消す場合には、投光部材が出力する光量を上げる第二の制御の中の少な くとも一方の制御を実行する光量制御手段とを有するものである。 [0022] In the display device according to the present invention, in addition to the configuration of each of the above-described inventions, the screen is a transmissive screen having a transmissivity of 50% or more and 80% or less, and is arranged on the real side of the screen. The first control to reduce the amount of light output by the light projecting member and the actual light image formed on the screen when the real image is formed on the screen and the light projecting member that illuminates the real object And a light amount control means for executing at least one of the second controls for increasing the light amount output from the light projecting member.
[0023] この構成を採用すれば、スクリーンに実物の像を形成する場合には、第一の制御に よって、投光部材が出力する光量が下がるので、鑑賞者に、スクリーンに形成される 実物の像を、スクリーンの向こう側にある実物がスクリーンに移動してきたものである かのように錯覚させることができる。また、この構成を採用すれば、スクリーンに形成し ていた実物の像を消す場合には、第二の制御によって、投光部材が出力する光量が 上がるので、鑑賞者に、スクリーンに形成される実物の像力 スクリーンの向こう側に ある実物のところへ移動して 、つたかのような錯覚を与えることができる。これら 2つの 中のいずれか一方の制御を実行することで、スクリーンに形成される実物の像と実物 との一致感を強めることができる。  With this configuration, when a real image is formed on the screen, the amount of light output from the light projecting member is reduced by the first control. Can be illusioned as if the real object on the other side of the screen had moved to the screen. In addition, if this configuration is adopted, when erasing the real image formed on the screen, the second control increases the amount of light output from the light emitting member, so that the light is output to the screen by the viewer. Real Image Power You can move to the real object on the other side of the screen to give the illusion of a warmth. Executing one of these two controls can enhance the sense of match between the real image formed on the screen and the real object.
[0024] 本発明に係る展示装置は、上述した各発明の構成に加えて、スクリーンに対する鑑 賞者の操作を検出する操作検出手段と、操作検出手段が検出した操作およびその 時点での像の表示状態に基づいて、スクリーンに形成する実物の像を回転あるいは 拡縮させる像制御手段とを有するものである。  The display device according to the present invention, in addition to the configuration of each of the above-described inventions, further includes an operation detecting means for detecting an operation of the viewer on the screen, an operation detected by the operation detecting means, and an image of the image at that time. Image control means for rotating or enlarging or reducing the real image formed on the screen based on the display state.
[0025] この構成を採用すれば、鑑賞者に、実物と重なって 、る実物の像が形成されるスク リーンに対して操作をさせることで、このスクリーンに形成される実物の像を回転したり 、拡大したり、縮小したりすることができる。  [0025] With this configuration, the viewer can operate the screen on which the real image is formed by overlapping the real object, thereby rotating the real image formed on the screen. And can be enlarged or reduced.
[0026] 本発明に係る展示装置は、上述した各発明の構成に加えて、スクリーンに対する鑑 賞者の操作を検出する操作検出手段と、操作検出手段が検出した操作およびその 時点での像の表示状態に基づいて、実物に関連する画像、映像あるいは文字情報 をスクリーンに形成する画像形成手段とを有するものである。 [0026] In addition to the configuration of each of the above-described inventions, the display device according to the present invention further includes an operation detecting means for detecting an operation of the viewer on the screen, an operation detected by the operation detecting means, Image forming means for forming an image, video, or character information related to the real thing on the screen based on the display state of the image at the time.
[0027] この構成を採用すれば、鑑賞者に、スクリーンに対して操作をさせることで、スクリー ン越しに見える実物に関連する画像、映像あるいは文字情報を鑑賞させることができ る。  [0027] With this configuration, it is possible to allow a viewer to view an image, a video, or character information related to a real thing seen through a screen by operating the screen.
発明の効果  The invention's effect
[0028] 本発明では、現実に存在する実物を展示する場合に鑑賞者が臨場感をもって鑑賞 することができる。  [0028] According to the present invention, when exhibiting a real thing that is actually present, the viewer can appreciate the presence with a sense of realism.
図面の簡単な説明  Brief Description of Drawings
[0029] [図 1]本発明の実施の形態 1に係る展示装置を示す図である。 FIG. 1 is a view showing a display device according to Embodiment 1 of the present invention.
[図 2]図 1の展示装置の断面図である。  FIG. 2 is a cross-sectional view of the display device of FIG. 1.
[図 3]図 1の展示装置の断面図である。  FIG. 3 is a cross-sectional view of the display device of FIG. 1.
[図 4]図 1の展示装置の透過スクリーン、再帰反射テープおよび 2つの赤外線投受光 ユニットのレイアウトおよび構造を示す説明図である。  FIG. 4 is an explanatory view showing a layout and a structure of a transmission screen, a retroreflective tape, and two infrared light emitting and receiving units of the exhibition apparatus of FIG. 1.
[図 5]図 1の展示装置の制御系のハードウェア構成を示すブロック図である。  FIG. 5 is a block diagram showing a hardware configuration of a control system of the exhibition device of FIG. 1.
[図 6]図 1の展示装置の制御系を示すブロック図である。  FIG. 6 is a block diagram showing a control system of the exhibition device of FIG. 1.
[図 7]図 5のハードディスク装置に記憶されるデータを示す図である。  FIG. 7 is a diagram showing data stored in the hard disk device of FIG. 5.
[図 8]図 7の三次元 CGデータに基づく画像の一例を示す図である。  FIG. 8 is a diagram showing an example of an image based on the three-dimensional CG data of FIG. 7.
[図 9]図 7の三次元 CGデータに基づく他の画像の例を示す図である。  FIG. 9 is a diagram showing an example of another image based on the three-dimensional CG data of FIG. 7.
[図 10]図 7の三次元 CGデータに基づくさらに他の画像の例を示す図である。  FIG. 10 is a diagram showing still another example of an image based on the three-dimensional CG data of FIG. 7.
[図 11]透過スクリーンへの、図 8の初期画像の投影状態を示す図である。  FIG. 11 is a diagram showing a projection state of the initial image of FIG. 8 onto a transmission screen.
[図 12]本発明の実施の形態 2に係る展示装置を示す図である。  FIG. 12 is a view showing a display device according to Embodiment 2 of the present invention.
[図 13]図 12の展示装置の制御系のハードウェア構成を示すブロック図である。  FIG. 13 is a block diagram showing a hardware configuration of a control system of the exhibition device in FIG. 12.
[図 14]図 12の展示装置の制御系を示すブロック図である。  FIG. 14 is a block diagram showing a control system of the display device in FIG. 12.
[図 15]図 13のハードディスク装置に記憶されるデータを示す図である。  FIG. 15 is a diagram showing data stored in the hard disk device of FIG.
[図 16]実物、鑑賞者および透過スクリーンの幾何学的位置関係を示す図である。  FIG. 16 is a diagram showing a geometrical positional relationship between a real object, a viewer, and a transmission screen.
[図 17]三次元 CGデータに基づく画像の一例を示す図である。  FIG. 17 is a diagram showing an example of an image based on three-dimensional CG data.
[図 18]透過スクリーンへの、図 17の画像の投影状態を示す図である。 [図 19]透過スクリーンの前に、三人の鑑賞者が立っている状況を示す図である。 FIG. 18 is a diagram showing a state of projection of the image of FIG. 17 onto a transmission screen. FIG. 19 is a diagram showing a situation where three viewers are standing in front of a transmissive screen.
[図 20]鑑賞者が透過スクリーンに触れている状態を示す図である。  FIG. 20 is a diagram showing a state in which a viewer is touching a transmission screen.
[図 21]本発明の実施の形態 3に係る展示装置の制御系のハードウェア構成を示すブ ロック図である。  FIG. 21 is a block diagram showing a hardware configuration of a control system of the display device according to Embodiment 3 of the present invention.
[図 22]図 21の展示装置の制御系を示すブロック図である。  FIG. 22 is a block diagram showing a control system of the display device in FIG. 21.
[図 23]図 21のハードディスク装置に記憶されるデータを示す図である。  FIG. 23 is a diagram showing data stored in the hard disk device of FIG. 21.
[図 24]白熱色の光を出力するスポットランプと、単色光を出力するスポットランプとを 有する展示装置の変形例を示す断面図である。  FIG. 24 is a cross-sectional view showing a modification of the display device having a spot lamp that outputs incandescent light and a spot lamp that outputs monochromatic light.
[図 25]複数のスポットランプおよび実物を有する展示装置の断面図である。  FIG. 25 is a cross-sectional view of a display device having a plurality of spot lamps and a real object.
符号の説明  Explanation of symbols
[0030] 4 スポットランプ(投光部材)、 9 実物、 11 透過スクリーン (スクリーン)、 13 プロジ ェクタ、 14 鑑賞者、 21 再帰反射テープ (操作検出手段、接触位置検出手段)、 22 , 23 赤外線投受光ユニット (操作検出手段、接触位置検出手段)、 61 投影画像 生成部 (像変更手段、像制御手段)、 63 操作検出部 (操作検出手段、鑑賞者視点 特定手段)、 71 距離センサ (鑑賞位置検出手段)、 72 撮像装置 (鑑賞位置検出手 段)、 91 投影画像生成部 (像変更手段、像制御手段)、 92 人検出部 (鑑賞者視点 特定手段、鑑賞位置検出手段)、 131 投影画像生成部 (像変更手段、光量制御手 段、像制御手段)、 132 ランプ制御部 (光量制御手段)  [0030] 4 spot lamps (light emitting members), 9 real objects, 11 transmissive screens (screens), 13 projectors, 14 viewers, 21 retroreflective tapes (operation detecting means, contact position detecting means), 22, 23 infrared Light receiving unit (operation detection means, contact position detection means), 61 projected image generation unit (image change means, image control means), 63 operation detection unit (operation detection means, viewer viewpoint identification means), 71 distance sensor (viewing position Detection means), 72 imaging device (appreciation position detection means), 91 projected image generation unit (image changing means, image control means), 92 person detection unit (viewer viewpoint identification means, appreciation position detection means), 131 projected image Generator (image changing means, light quantity control means, image control means), 132 Lamp control section (light quantity control means)
発明を実施するための最良の形態  BEST MODE FOR CARRYING OUT THE INVENTION
[0031] 以下、本発明の実施の形態に係る展示装置を、図面に基づいて説明する。  Hereinafter, a display device according to an embodiment of the present invention will be described with reference to the drawings.
[0032] 実施の形態 1.  Embodiment 1.
本発明の実施の形態 1に係る展示装置は、直接触れることができない美術品ゃェ 芸品などを展示する際に好適に利用することができる展示装置である。本発明の実 施の形態 1に係る展示装置は、美術品や工芸品などの実物とともに、そのコンビユー タグラフィックス〖こよる像を、鑑賞者が自由に操作できる状態で提供する。  The display device according to the first embodiment of the present invention is a display device that can be suitably used when exhibiting arts and crafts that cannot be directly touched. The display device according to the first embodiment of the present invention provides an image based on the combi- ter graphics as well as real objects such as arts and crafts in a state where the viewer can freely operate it.
[0033] 図 1は、本発明の実施の形態 1に係る展示装置を示す図である。図 1 (A)は、展示 装置の側面図である。図 1 (B)は、展示装置の正面図である。図 2および図 3は、図 1 の展示装置の断面図である。図 2 (A)は、図 1 (B)の展示装置の A— A断面図である 。図 2 (B)は、図 1 (A)の展示装置の B— B断面図である。図 3は、図 1 (A)および (B) の展示装置の C C断面図である。 FIG. 1 is a diagram showing a display device according to Embodiment 1 of the present invention. Fig. 1 (A) is a side view of the exhibition device. FIG. 1B is a front view of the display device. 2 and 3 are cross-sectional views of the display device of FIG. Fig. 2 (A) is a cross-sectional view of the display device of Fig. 1 (A) taken along line A-A. . FIG. 2 (B) is a BB cross-sectional view of the display device of FIG. 1 (A). FIG. 3 is a CC cross-sectional view of the display device shown in FIGS. 1 (A) and 1 (B).
[0034] 図 1から図 3に示すように、実施の形態 1に係る展示装置は、立入制限枠 1と、載置 台 2と、カバー部材 3と、投光手段としてのスポットランプ 4とを有する。  As shown in FIGS. 1 to 3, the exhibition apparatus according to the first embodiment includes an access restriction frame 1, a mounting table 2, a cover member 3, and a spot lamp 4 as a light emitting means. Have.
[0035] 立入制限枠 1は、 4本の柱部材 5を有する。 4本の長尺な柱部材 5は、四角形の四 隅に立設されるように配置される。 4本の柱部材 5は、梁部材 6により互いに連結され 、立方体形状の骨組み構造に組み立てられる。立入制限枠 1は、床 7の上に設置さ れる。立入制限枠 1の 4本の柱部材 5は、床 7に固定されていてもよい。立入制限枠 1 は、大人の背より高い、約 2. 4mの高さを有する。  The entry restriction frame 1 has four column members 5. The four long pillar members 5 are arranged so as to be erected at the four corners of the square. The four column members 5 are connected to each other by the beam members 6, and are assembled into a cubic frame structure. Restricted access frame 1 is installed on floor 7. The four column members 5 of the entry restriction frame 1 may be fixed to the floor 7. Restricted access frame 1 is about 2.4m tall, higher than an adult.
[0036] 立入制限枠 1は、立設される 4本の柱部材 5を接続する 4本の横棒部材 8を有する。  The entry restriction frame 1 has four horizontal bar members 8 connecting the four column members 5 to be erected.
4本の横棒部材 8は、大人の膝の高さあるいはそれより上側となる高さ位置において 、隣り合う柱部材 5の間を橋渡すように設けられる。立入制限枠 1内に入ろうとする人 は、この横棒部材 8を跨ぎ越す必要がある。立入制限枠 1内に入ろうとする人は、立 入制限枠 1内に立ち入ろうとする場合には、足を大きく上げなければならない。その 結果、鑑賞者や泥棒などは、安易に立入制限枠 1内に立ち入ることができなくなる。 また、立入制限枠 1は、この 4本の横棒部材 8により補強される。  The four horizontal bar members 8 are provided so as to bridge between the adjacent column members 5 at the height of the adult's knee or at a height above it. A person who wants to enter the restricted access frame 1 needs to cross over the horizontal bar member 8. A person who attempts to enter Restricted Access Box 1 must raise his or her feet significantly when attempting to enter Restricted Access Box 1. As a result, a viewer or a thief cannot easily enter the restricted access frame 1. The restricted access frame 1 is reinforced by the four horizontal bar members 8.
[0037] なお、立入制限枠 1において、立設される 4本の柱部材 5の間には、透明な強化ガ ラスを配設するようにしてもよい。これにより、鑑賞者などの立入制限枠 1内への立ち 入りを、より効果的に制限し、後述する実物 9の盗難などをより効果的に防止すること ができる。  [0037] In the entrance restriction frame 1, a transparent reinforcing glass may be provided between the four pillar members 5 that are erected. As a result, it is possible to more effectively restrict the entrance of the viewer into the restricted access frame 1 and more effectively prevent theft of the real object 9 described later.
[0038] 載置台 2は、立入制限枠 1の内部の中央に配設される。載置台 2は、大人の腰の高 さあるいはそれ以上の高さを有する。載置台 2は、略四角柱形状を有する。  The mounting table 2 is arranged at the center inside the entrance restriction frame 1. The mounting table 2 has the height of the waist of an adult or more. The mounting table 2 has a substantially quadrangular prism shape.
[0039] 載置台 2の上面の上には、展示対象である美術品、工芸品、商品、その他の実物 9 が載置される。図 1では、載置台 2の上面の上に、銀色のカップが載置されている。 実物 9を立入制限枠 1の内部の中央に配設することで、鑑賞者などは、鑑賞のために 、実物 9に直接触れることができない。  [0039] On the upper surface of the mounting table 2, arts, crafts, commodities, and other real objects 9 to be displayed are mounted. In FIG. 1, a silver cup is mounted on the upper surface of the mounting table 2. By arranging the real thing 9 in the center of the inside of the restricted access frame 1, the viewer cannot directly touch the real thing 9 for viewing.
[0040] カバー部材 3は、載置台 2の上面に載置される実物 9を被覆する。カバー部材 3は、 透明な材料で形成され、載置台 2の上面の上に載置される実物 9を被覆する。カバ 一部材 3の材料としては、たとえばガラス、透明アクリル板などがある。透明なカバー 部材 3で実物 9を被覆することにより、実物 9の鑑賞を妨げることなぐ実物 9に塵や埃 が付着したりしてしまうことを防止することができる。 [0040] The cover member 3 covers the real object 9 mounted on the upper surface of the mounting table 2. The cover member 3 is formed of a transparent material, and covers the real object 9 mounted on the upper surface of the mounting table 2. Hippopotamus Examples of the material of the one member 3 include glass and a transparent acrylic plate. By covering the actual object 9 with the transparent cover member 3, it is possible to prevent dust or dust from adhering to the actual object 9 without hindering the appreciation of the actual object 9.
[0041] スポットランプ 4は、そのスポット光が載置台 2に載置される実物 9を照らすように、立 入制限枠 1の天井部分に配設される。スポットランプ 4は、照明器具の一種であり、供 給される電力に応じた明るさのスポット光を出力する。このスポットランプ 4のスポット 光は、白熱色である。供給される電力が大きくなれば、スポット光は、明るくなる。供給 される電力が小さくなれば、スポット光は、暗くなる。電力が供給されない状態では、 スポットランプ 4は、消灯する。この実施の形態 1では、スポットランプ 4には、常に電 力が供給されている。 The spot lamp 4 is disposed on the ceiling of the entry restriction frame 1 so that the spot light illuminates the real object 9 mounted on the mounting table 2. The spot lamp 4 is a kind of lighting equipment and outputs a spot light having a brightness according to the supplied electric power. The spot light of the spot lamp 4 is incandescent. The greater the power supplied, the brighter the spotlight. The smaller the power supplied, the darker the spotlight. When no power is supplied, the spot lamp 4 is turned off. In the first embodiment, the spot lamp 4 is always supplied with power.
[0042] また、実施の形態 1に係る展示装置は、スクリーンとしての透過スクリーン 11と、不 透明の板材 12と、プロジェクタ 13とを有する。  The display device according to the first embodiment includes a translucent screen 11 as a screen, an opaque plate 12, and a projector 13.
[0043] 透過スクリーン 11は、長方形形状を有する。透過スクリーン 11は、その長尺方向が 水平方向に沿った方向となるように、且つ、大人の腰の高さから頭の高さまでの範囲 にわたる高さとなるように、立入制限枠 1の 1組の柱部材 5と柱部材 5との間に配設さ れる。透過スクリーン 11は、約 65%の透過率を有する。  [0043] The transmission screen 11 has a rectangular shape. The transmissive screen 11 is a set of entrance restriction frames 1 so that the lengthwise direction of the translucent screen 11 is parallel to the horizontal direction and the height extends from the height of the adult's waist to the height of the head. It is disposed between the pillar members 5 of the first and second members. The transmission screen 11 has a transmittance of about 65%.
[0044] 不透明の板材 12は、長方形形状を有する。不透明の板材 12は、透過スクリーン 11 の上方において、立入制限枠 1の 1組の柱部材 5と柱部材 5との間に配設される。不 透明の板材 12は、立入制限枠 1の天井までにいたる高さを有する。  The opaque plate 12 has a rectangular shape. The opaque plate 12 is disposed between the pair of column members 5 of the entrance restriction frame 1 above the transmissive screen 11. The opaque plate 12 has a height up to the ceiling of the restricted access frame 1.
[0045] 不透明の板材 12は、その面が透過スクリーン 11の面と略平行となるように配設され る。不透明の板材 12は、透過スクリーン 11より立入制限枠 1の外側にずれて配設さ れる。図 3の間隔 Bに示すように、透過スクリーン 11の表面(立入制限枠 1の外側とな る面)と、不透明の板材 12の裏面(立入制限枠 1の内側となる面)との間には、数セン チメートルの間隔が形成される。  The opaque plate 12 is disposed so that its surface is substantially parallel to the surface of the transmission screen 11. The opaque plate material 12 is disposed so as to be shifted outside the entrance restriction frame 1 from the transmission screen 11. As shown in the interval B in FIG. 3, between the surface of the transmissive screen 11 (the surface outside the restricted access frame 1) and the back surface of the opaque plate 12 (the surface inside the restricted access frame 1). Will form a gap of several centimeters.
[0046] プロジェクタ 13は、映像信号あるいは静止画データが入力されると、その映像信号 あるいは静止画データに基づく画像を出力部から出力する。プロジェクタ 13は、出力 した画像の中心が透過スクリーン 11の中心となる姿勢で、立入制限枠 1の天井部分 に配設される。プロジェクタ 13が出力する画像は、透過スクリーン 11に投影される。 透過スクリーン 11の前側(立入制限枠 1の外側)からは、プロジェクタ 13が透過スクリ ーン 11に投影された画像を見ることができる。 When the video signal or the still image data is input, the projector 13 outputs an image based on the video signal or the still image data from the output unit. The projector 13 is disposed on the ceiling of the entrance restriction frame 1 in such a manner that the center of the output image is the center of the transmissive screen 11. The image output from the projector 13 is projected on the transmission screen 11. From the front side of the transmissive screen 11 (outside the entrance restriction frame 1), the projector 13 can see the image projected on the transmissive screen 11.
[0047] プロジェクタ 13は、その透過スクリーン 11の中心から上方へ斜め 45度方向に位置 するように配設される。透過スクリーン 11を対してプロジェクタ 13をこのような角度方 向に配設することで、透過スクリーン 11の前側に立つ人(以下、鑑賞者という) 14の 視点とプロジェクタ 13の出力部との間に、不透明の板材 12が位置することになる。透 過スクリーン 11の前側に立つ鑑賞者 14からは、プロジェクタ 13の出力部は見えない 。鑑賞者 14は、プロジェクタ 13の出力部からの直接光によって幻惑されてしまったり することがない。 [0047] The projector 13 is disposed so as to be positioned obliquely upward at 45 degrees from the center of the transmissive screen 11. By arranging the projector 13 in such an angle direction with respect to the transmissive screen 11, a viewer standing in front of the transmissive screen 11 (hereinafter referred to as “viewer”) 14 and a viewpoint of an output unit of the projector 13. The opaque plate 12 will be located. From the viewer 14 standing in front of the translucent screen 11, the output of the projector 13 is not visible. The viewer 14 is not dazzled by the direct light from the output unit of the projector 13.
[0048] なお、このように透過スクリーン 11の垂直方向に対して 45度の角度方向力 画像を 投影する場合、透過スクリーン 11の垂直方向から画像を投影する場合とは異なり、透 過スクリーン 11に投影される画像は、台形の輪郭になる。そのため、プロジェクタ 13 は、透過スクリーン 11に画像が四角形の輪郭にて投影されるように、台形歪み補正 機能によりその出力を補正するとよい。  When projecting an image in the direction of an angle of 45 ° with respect to the vertical direction of the transmissive screen 11 as described above, unlike when projecting an image from the vertical direction of the transmissive screen 11, The projected image has a trapezoidal outline. Therefore, it is preferable that the output of the projector 13 is corrected by the trapezoidal distortion correction function so that the image is projected on the translucent screen 11 with a rectangular outline.
[0049] また、実施の形態 1に係る展示装置は、再帰反射テープ 21と、 2つの赤外線投受 光ユニット 22, 23とを有する。再帰反射テープ 21および 2つの赤外線投受光ユニット 22, 23は、操作検出手段および接触位置検出手段である。図 4は、図 1の展示装置 の透過スクリーン 11、再帰反射テープ 21および 2つの赤外線投受光ユニット 22, 23 のレイアウトおよび構造を示す説明図である。  The display device according to the first embodiment includes a retroreflective tape 21 and two infrared light emitting and receiving units 22 and 23. The retroreflective tape 21 and the two infrared light emitting and receiving units 22, 23 are operation detecting means and contact position detecting means. FIG. 4 is an explanatory diagram showing the layout and structure of the transmissive screen 11, the retroreflective tape 21, and the two infrared light emitting and receiving units 22, 23 of the display device of FIG.
[0050] 再帰反射テープ 21は、照射された光を、その照射方向へ反射するテープである。  [0050] The retroreflective tape 21 is a tape that reflects irradiated light in the irradiation direction.
再帰反射テープ 21は、たとえば、ガラスなど力もなる透明な球体の表面の半分に反 射層を付着させた複数のビーズ体 24を、その反射層がテープの付着面側となる姿 勢で並べた構造を有するものがある。この再帰反射テープ 21では、テープの付着面 が透過スクリーン 11の外側となるように、透過スクリーン 11の左右両辺と下辺にそつ て配設する。再帰反射テープ 21に照射される光は、テープの表側からビーズ体 24 内へ入射した後、反射層で反射され、テープの表側からその照射方向へ反射される 。なお、複数のビーズ体 24は、テープに封入されていても、テープ内においてカプセ ル化されていてもよい。 [0051] 2つの赤外線投受光ユニット 22, 23の中の一方の赤外線投受光ユニット 22は、赤 外線 LED (Light Emitting Diode) 25と、ポリゴンミラー 26と、赤外線 CCD (Cha rge -Coupled Device) 27とを有する。赤外線 LED25は、赤外線を出力する。赤 外線は、光の一種である。ポリゴンミラー 26は、たとえば六面鏡などの多面鏡である。 赤外線 CCD27は、複数の赤外線受光素子を有する。赤外線受光素子は、赤外線 の受光光量に応じた受光レベル信号を出力する。赤外線 CCD27は、複数の赤外線 受光素子の受光レベル信号に基づく赤外線画像を出力する。 The retroreflective tape 21 is, for example, a plurality of beads 24 each having a reflective layer adhered to half of the surface of a transparent sphere such as glass, which is also strong, arranged in such a manner that the reflective layer is on the side where the tape adheres. Some have a structure. In the retroreflective tape 21, the left and right sides and the lower side of the transmissive screen 11 are arranged so that the surface where the tape is attached is outside the transmissive screen 11. The light irradiated on the retroreflective tape 21 enters the bead body 24 from the front side of the tape, is reflected by the reflective layer, and is reflected from the front side of the tape in the irradiation direction. Note that the plurality of bead bodies 24 may be encapsulated in a tape or encapsulated in the tape. [0051] One of the two infrared light emitting and receiving units 22 and 23 includes an infrared LED (Light Emitting Diode) 25, a polygon mirror 26, and an infrared CCD (Charge-Coupled Device) 27. And The infrared LED 25 outputs infrared light. Infrared rays are a type of light. The polygon mirror 26 is, for example, a polygon mirror such as a hexagon mirror. The infrared CCD 27 has a plurality of infrared light receiving elements. The infrared light receiving element outputs a light receiving level signal corresponding to the amount of received infrared light. The infrared CCD 27 outputs an infrared image based on the light receiving level signals of the plurality of infrared light receiving elements.
[0052] 一方の赤外線投受光ユニット 22は、透過スクリーン 11の長尺方向の一端寄りに配 設される。一方の赤外線投受光ユニット 22は、透過スクリーン 11と不透明の板材 12 との間の隙間に配設される。複数の赤外線受光素子は、透過スクリーン 11の面に沿 つて配列される。  One infrared projection / reception unit 22 is provided near one end of the translucent screen 11 in the longitudinal direction. One infrared light emitting / receiving unit 22 is provided in a gap between the transmission screen 11 and the opaque plate 12. The plurality of infrared light receiving elements are arranged along the surface of the transmission screen 11.
[0053] 図 4にお 、て、点線で示すように、一方の赤外線投受光ユニット 22では、赤外線し ED25が赤外線を出力する。ポリゴンミラー 26は、赤外線 LED25から出力された赤 外線を 1つの鏡面で反射する。反射する。ポリゴンミラー 26で反射された赤外線は、 透過スクリーン 11の面にそって進み、透過スクリーン 11の外周に沿って設けられた 再帰反射テープ 21に入射する。再帰反射テープ 21は、入射した赤外線をその入射 した方向へ反射する。再帰反射テープ 21で反射された赤外線は、入射の場合と略 同じ経路で一方の赤外線投受光ユニット 22へ戻る。一方の赤外線投受光ユニット 22 に戻った赤外線は、赤外線 CCD27の複数の赤外線受光素子の中の、ある赤外線 受光素子に受光される。  In FIG. 4, as indicated by a dotted line, one infrared light emitting / receiving unit 22 emits infrared light, and the ED 25 outputs infrared light. The polygon mirror 26 reflects the infrared light output from the infrared LED 25 on one mirror surface. reflect. The infrared light reflected by the polygon mirror 26 travels along the surface of the transmission screen 11 and enters the retroreflective tape 21 provided along the outer periphery of the transmission screen 11. The retroreflective tape 21 reflects the incident infrared light in the incident direction. The infrared light reflected by the retroreflective tape 21 returns to one of the infrared light emitting / receiving units 22 along substantially the same path as the case of incidence. The infrared light returned to one infrared light emitting / receiving unit 22 is received by a certain infrared light receiving element among the plurality of infrared light receiving elements of the infrared CCD 27.
[0054] ポリゴンミラー 26は回転する。ポリゴンミラー 26が回転し、赤外線を反射している鏡 面の向きが変化すると、一方の赤外線投受光ユニット 22から出力される赤外線の方 向は、変わる。この異なる方向へ出力される赤外線は、透過スクリーン 11の面にそつ て進み、再帰反射テープ 21の先ほどとは異なる部位に入射する。再帰反射テープ 2 1は、照射された赤外線をその入射した方向へ反射する。再帰反射テープ 21で反射 された赤外線は、入射の場合と略同じ経路で一方の赤外線投受光ユニット 22へ戻る 。一方の赤外線投受光ユニット 22に戻った赤外線は、赤外線 CCD27の複数の赤外 線受光素子の中の、先ほどとは異なるある赤外線受光素子により受光される。 [0055] 一方の赤外線投受光ユニット 22から出力される赤外線の経路は、ポリゴンミラー 26 の回転にしたがって、透過スクリーン 11の面にそって少しずつ移動する。透過スクリ ーン 11の面の上での経路の位置が変わると、赤外線 CCD27にお 、て赤外線を受 光する赤外線受光素子は、変わる。ポリゴンミラー 26が 360度をその面数で割った角 度を回転する期間に、赤外線 CCD27の複数の赤外線受光素子は、赤外線を 1回ず っ受光する。ポリゴンミラー 26がその期間を超えてさらに回転すると、赤外線 CCD2 7の複数の赤外線受光素子は、赤外線をもう一度ずっ受光する。 [0054] The polygon mirror 26 rotates. When the polygon mirror 26 rotates and the direction of the mirror surface that reflects infrared light changes, the direction of infrared light output from one infrared light emitting and receiving unit 22 changes. The infrared rays output in the different directions proceed along the surface of the transmissive screen 11 and are incident on the retroreflective tape 21 at a site different from the one before. The retro-reflective tape 21 reflects the irradiated infrared rays in the direction of incidence. The infrared light reflected by the retroreflective tape 21 returns to one of the infrared light emitting and receiving units 22 along substantially the same path as the case of incidence. The infrared light that has returned to the one infrared light emitting / receiving unit 22 is received by a certain infrared light receiving element different from the previous one among the plurality of infrared light receiving elements of the infrared CCD 27. The path of the infrared light output from one infrared light emitting / receiving unit 22 gradually moves along the surface of the transmission screen 11 as the polygon mirror 26 rotates. When the position of the path on the surface of the transmission screen 11 changes, the infrared light receiving element that receives infrared light at the infrared CCD 27 changes. During a period in which the polygon mirror 26 rotates by an angle obtained by dividing 360 degrees by the number of surfaces, the plurality of infrared light receiving elements of the infrared CCD 27 receive infrared light once. When the polygon mirror 26 further rotates beyond that period, the plurality of infrared light receiving elements of the infrared CCD 27 receive infrared light again.
[0056] つまり、一方の赤外線投受光ユニット 22は、ポリゴンミラー 26が 360度をその面数 で割った角度を回転する期間毎に、透過スクリーン 11の面の所定の範囲(図 4にお いて一点鎖線で示す範囲) Cを赤外線で走査する。また、一方の赤外線投受光ュ- ット 22は、その走査を繰り返す。 1回の走査に要する期間において、その透過スクリ ーン 11の面の所定の範囲に障害物がない場合、複数の赤外線受光素子は、赤外線 を 1回ずっ受光する。一方の赤外線投受光ユニット 22の赤外線 CCD27は、ポリゴン ミラー 26が 360度をその面数で割った角度を回転する期間毎に、赤外線画像を出 力する。  In other words, one infrared projection / reception unit 22 moves the polygon mirror 26 by a predetermined range (see FIG. 4) of the surface of the transmission screen 11 every time the polygon mirror 26 rotates by an angle obtained by dividing 360 degrees by the number of surfaces. Scan the area C with infrared light. Further, one infrared light emitting and receiving unit 22 repeats the scanning. If there is no obstacle in a predetermined range on the surface of the transmission screen 11 during a period required for one scan, the plurality of infrared light receiving elements receive infrared light once. The infrared CCD 27 of the infrared emitting / receiving unit 22 outputs an infrared image every time the polygon mirror 26 rotates by an angle obtained by dividing 360 degrees by the number of surfaces.
[0057] そして、その透過スクリーン 11の面上方の所定の範囲の中に障害物(たとえば指な どの操作時に使用するもの)がある場合、障害物がある位置を通過する赤外線を受 光する赤外線受光素子は、赤外線を受光しなくなる。赤外線 CCD27は、その赤外 線を受光しなカゝつた部分に影が形成されている赤外線画像を出力する。  If there is an obstacle (for example, a finger or the like used during operation) in a predetermined range above the surface of the transmissive screen 11, the infrared ray that receives the infrared ray passing through the position where the obstacle is located The light receiving element stops receiving infrared light. The infrared CCD 27 outputs an infrared image in which a shadow is formed in a small portion that does not receive the infrared rays.
[0058] 他方の赤外線投受光ユニット 23は、透過スクリーン 11の長尺方向の他端寄りに配 設される。他方の赤外線投受光ユニット 23は、透過スクリーン 11と不透明の板材 12 との間の隙間に配設される。他方の赤外線投受光ユニット 23の構成および動作は、 一方の赤外線投受光ユニット 22の構成および動作と同様であり、同一の符号を付し てその説明を省略する。他方の赤外線投受光ユニット 23においても、その赤外線 C CD27は、そのポリゴンミラー 26が 360度をその面数で割った角度を回転する期間 毎に、赤外線画像を出力する。  [0058] The other infrared projection / reception unit 23 is provided near the other end in the longitudinal direction of the transmission screen 11. The other infrared light emitting / receiving unit 23 is disposed in a gap between the transmission screen 11 and the opaque plate 12. The configuration and operation of the other infrared projection / reception unit 23 are the same as the configuration and operation of the infrared projection / reception unit 22, and are denoted by the same reference numerals and description thereof will be omitted. Also in the other infrared projection / reception unit 23, the infrared CCD 27 outputs an infrared image every time the polygon mirror 26 rotates by an angle obtained by dividing 360 degrees by the number of surfaces.
[0059] 図 4に示すように、一方の赤外線投受光ユニット 22と他方の赤外線投受光ユニット 23とは、透過スクリーン 11の上側の両端部に配設されている。そのため、他方の赤 外線投受光ユニット 23が走査する透過スクリーン 11の面の所定の範囲(図 4におい て一点鎖線で示す範囲 C)と、一方の赤外線投受光ユニット 22が走査する透過スクリ ーン 11の面の所定の範囲(図 4において一点鎖線で示す範囲 D)とは、互いに重な る部分があるものの、 2つの赤外線投受光ユニット 22, 23の配設位置が異なるため 一致しない。透過スクリーン 11上のある点(たとえば図 4の点 E)は、一方の赤外線投 受光ユ ット 22から見た方向と、他方の赤外線投受光ユニット 23から見た方向とは、 異なる方向となる。 As shown in FIG. 4, one infrared light emitting / receiving unit 22 and the other infrared light emitting / receiving unit 23 are disposed at both ends on the upper side of the transmission screen 11. So the other red A predetermined range of the surface of the transmission screen 11 scanned by the external light emitting / receiving unit 23 (a range C indicated by a dashed line in FIG. 4) and a predetermined range of the surface of the transmission screen 11 scanned by one infrared light emitting / receiving unit 22 (Range D shown by a dashed line in FIG. 4) does not coincide with each other, although there are overlapping portions, because the two infrared light emitting / receiving units 22, 23 are disposed at different positions. A point on the transmissive screen 11 (for example, point E in FIG. 4) is in a direction different from the direction viewed from one infrared light emitting / receiving unit 22 and the direction viewed from the other infrared light emitting / receiving unit 23. .
[0060] また、実施の形態 1に係る展示装置は、制御装置 31を有する。なお、制御装置 31 は、図 1から図 3には図示していないが、鑑賞者力も見えない位置に配置されている 。図 5は、図 1の展示装置の制御系のハードウェア構成を示すブロック図である。図 6 は、図 1の展示装置の制御系を示すブロック図である。  The display device according to the first embodiment includes a control device 31. Although not shown in FIGS. 1 to 3, the control device 31 is arranged at a position where the viewer's power is not visible. FIG. 5 is a block diagram showing a hardware configuration of a control system of the exhibition apparatus of FIG. FIG. 6 is a block diagram showing a control system of the exhibition device of FIG.
[0061] 制御装置 31は、パーソナルコンピュータなどで実現することができる。制御装置 31 は、入出力ポート 32と、 CPU (Central Processing Unit:中央処理装置) 33と、 メモリ 34と、ハードディスク装置 35と、これらを接続するシステムノ ス 36とを有する。  [0061] The control device 31 can be realized by a personal computer or the like. The control device 31 has an input / output port 32, a CPU (Central Processing Unit: central processing unit) 33, a memory 34, a hard disk device 35, and a system node 36 connecting these.
[0062] 入出力ポート 32には、プロジェクタ 13と、一方の赤外線投受光ユニット 22と、他方 の赤外線投受光ユニット 23と、が接続される。一方の赤外線投受光ユニット 22およ び他方の赤外線投受光ユニット 23は、走査の度に、入出力ポート 32へ赤外線 CCD 27の赤外線画像を出力する。  The input / output port 32 is connected to the projector 13, one infrared light emitting / receiving unit 22, and the other infrared light emitting / receiving unit 23. The one infrared light emitting and receiving unit 22 and the other infrared light emitting and receiving unit 23 output an infrared image of the infrared CCD 27 to the input / output port 32 every time scanning is performed.
[0063] ハードディスク装置 35は、データを記憶する。図 7は、図 5のハードディスク装置 35 に記憶されるデータを示す図である。  [0063] The hard disk device 35 stores data. FIG. 7 is a diagram showing data stored in the hard disk device 35 of FIG.
[0064] ハードディスク装置 35は、たとえば、投影画像生成プログラム 41、動画再生プログ ラム 42、操作検出プログラム 43などのプログラムや、三次元コンピュータグラフィック スデータ 44、動画データ 45などの表示データを記憶する。  The hard disk device 35 stores, for example, programs such as a projection image generating program 41, a moving image reproducing program 42, and an operation detecting program 43, and display data such as three-dimensional computer graphics data 44 and moving image data 45.
[0065] 三次元コンピュータグラフィックスデータ 46は、実物 9を任意の方向力 見たときの 像を含む画像の静止画データを生成するためのデータである。三次元コンピュータ グラフィックスデータ 46は、たとえば、実物 9のモデリングデータと、そのモデリングデ ータの表面に貼り付けられる画像のデータとで構成される。このモデリングデータの 表面に貼り付けられる画像のデータには、たとえば実物 9を撮像した画像力 切り出 した画像のデータなどを利用することができる。 The three-dimensional computer graphics data 46 is data for generating still image data of an image including an image when the real object 9 is viewed in any direction. The three-dimensional computer graphics data 46 includes, for example, modeling data of the real object 9 and data of an image attached to the surface of the modeling data. The data of the image pasted on the surface of this modeling data includes, for example, image power It is possible to use the data of the image obtained.
[0066] FIG. 8 is a diagram showing one image 51 based on still image data generated from the three-dimensional computer graphics data 44 of FIG. 7. In the image 51 of FIG. 8, an image of the real object is formed that is identical to the real object 9 (a silver cup) as seen from the front side of the transmissive screen 11.
[0067] FIG. 9 is a diagram showing another image 52 based on still image data generated from the three-dimensional computer graphics data 44 of FIG. 7. In the image 52 of FIG. 9, an image of the real object is formed that is larger than the real-object image in the image 51 of FIG. 8.
[0068] FIG. 10 is a diagram showing still another image 53 based on still image data generated from the three-dimensional computer graphics data 44 of FIG. 7. In the image 53 of FIG. 10, an image of the real object is formed that is identical to the real object 9 (a silver cup) as seen from above.
[0069] The moving image data 45 is data for generating a video signal. The moving image data 45 is, for example, moving image data that introduces the manufacturing process of the real object 9.
[0070] The projection image generation program 41 is read into the memory 34 and executed by the CPU 33. This creates the projection image generation unit 61 shown in FIG. 6, which serves as image changing means and image control means. The projection image generation unit 61 generates still image data from the three-dimensional computer graphics data 44 and outputs the generated still image data to the projector 13.
[0071] The moving image playback program 42 is read into the memory 34 and executed by the CPU 33. This creates the moving image playback unit 62 shown in FIG. 6, which serves as image forming means. The moving image playback unit 62 generates a video signal based on the moving image data 45 and outputs the generated video signal to the projector 13.
[0072] The operation detection program 43 is read into the memory 34 and executed by the CPU 33. This creates the operation detection unit 63 shown in FIG. 6, which serves as operation detection means and viewer viewpoint specifying means. The operation detection unit 63 determines operations on the transmissive screen 11 based on the infrared images input from the two infrared light emitting/receiving units 22, 23, and outputs instructions to the projection image generation unit 61 and the moving image playback unit 62 according to the determined operation.
[0073] Next, the operation of the exhibition device according to Embodiment 1 of the present invention will be described.
[0074] In the exhibition device, the real object 9 (here, a silver cup) is placed on the mounting table 2 and illuminated by the light of the spot lamp 4. A viewer or the like can thereby confirm the appearance and color of the real object 9 through the transmissive screen 11.
[0075] The projection image generation unit 61 reads the three-dimensional computer graphics data 44 and, from the read three-dimensional computer graphics data 44, generates two-dimensional still image data in which an image of the real object 9 is projected onto a predetermined plane. Here, the projection image generation unit 61 generates still image data of the image shown in FIG. 8.
[0076] The projection image generation unit 61 outputs the generated still image data to the projector 13 via the input/output port 32. The projector 13 outputs the image 51 shown in FIG. 8. The image projected by the projector 13 is formed on the transmissive screen 11. FIG. 11 is a diagram showing a state in which the initial image 51 shown in FIG. 8 is projected onto the transmissive screen 11.
[0077] In the image projected onto the transmissive screen 11 based on the image 51 shown in FIG. 8, the image of the real object is projected at approximately the center of the transmissive screen 11. The real-object image formed on the transmissive screen 11 based on this initial image is positioned and sized so that, when a viewer 14 of standard height (for example, 175 cm) stands at the position of point A shown in FIG. 2 (the viewing position) and looks toward the transmissive screen 11, the real-object image overlaps the real object 9 seen through the transmissive screen 11. For a viewer 14 who is not of standard height, the real object 9 and the real-object image are slightly misaligned, but they hardly ever appear separated.
[0078] By forming the real-object image projected onto the transmissive screen 11 so that it overlaps the real object 9 in this way, the viewer 14 sees both the real object 9 and the real-object image. This can give the viewer 14 the illusion that the exhibited real object 9 itself is right at hand.
[0079] When the viewer 14 or the like, standing at the position of point A shown in FIG. 2, reaches out toward the transmissive screen 11, the two infrared light emitting/receiving units 22, 23 disposed at the upper left and right of the transmissive screen 11 output two infrared images in which shadows of the fingers of that hand are formed. This set of infrared images is input to the operation detection unit 63 via the input/output port 32. The two infrared light emitting/receiving units 22, 23 output one set of infrared images per scan. Hereinafter, the two infrared images output by the two infrared light emitting/receiving units 22, 23 for each scan are referred to as one set of infrared images.
[0080] The operation detection unit 63 determines the operation on the transmissive screen 11 based on each set of infrared images, and outputs an instruction to the projection image generation unit 61 or the moving image playback unit 62 according to the determined operation.
[0081] Specifically, for example, the operation detection unit 63 first determines the position and number of fingers based on each set of infrared images. As described above, when infrared light is blocked by a finger, a shadow is formed in the infrared image. The operation detection unit 63 determines the position and number of shadows in each infrared image.
[0082] For example, when one shadow is formed in each image of a set of infrared images, the operation detection unit 63 determines that the shadows in the two infrared images are the shadow of a single finger. The operation detection unit 63 also identifies, from the position of the shadow in each infrared image, the direction of the finger as seen from each infrared light emitting/receiving unit (if the shadow has a width, the direction of the center of the shadow). Using the identified direction of the finger from each infrared light emitting/receiving unit and the distance between the one infrared light emitting/receiving unit 22 and the other infrared light emitting/receiving unit 23, the operation detection unit 63 identifies the position of that finger on the transmissive screen 11 based on the principle of triangulation. Thus, for example, when a finger is placed at point E in FIG. 4, the operation detection unit 63 can determine that there is one finger at that position.
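The triangulation step described in [0082] can be illustrated with a minimal sketch. The coordinate frame, angle convention, and function name below are assumptions for illustration only; the patent does not specify an implementation. The idea is that each unit reports only a bearing toward the finger, and the two bearings plus the known baseline between the units determine the point.

```python
import math

def triangulate(angle_left: float, angle_right: float, baseline: float):
    """Locate a finger on the screen plane from two bearing angles.

    angle_left / angle_right: bearings (radians) measured from the screen's
    top edge toward the finger, as seen by the left and right units.
    baseline: distance between the two units along the top edge.
    Returns (x, y): x measured from the left unit along the top edge,
    y measured downward into the screen.
    """
    # Each unit sees the finger along a ray; intersect the two rays.
    # Ray from the left unit:  y = x * tan(angle_left)
    # Ray from the right unit: y = (baseline - x) * tan(angle_right)
    tl, tr = math.tan(angle_left), math.tan(angle_right)
    x = baseline * tr / (tl + tr)
    y = x * tl
    return x, y

# Example: units 1.0 m apart, finger seen at 40 degrees from the left unit
# and 55 degrees from the right unit.
print(triangulate(math.radians(40), math.radians(55), 1.0))
```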
[0083] As another example, when two shadows are formed in each infrared image, the operation detection unit 63 determines that the shadows in the two infrared images are the shadows of two fingers. The operation detection unit 63 then assumes that the right-hand shadows in the two infrared images belong to the first finger and the left-hand shadows in the two infrared images belong to the other finger, and identifies the direction of each finger as seen from each infrared light emitting/receiving unit. Using the identified direction of each finger from each infrared light emitting/receiving unit and the distance between the one infrared light emitting/receiving unit 22 and the other infrared light emitting/receiving unit 23, the operation detection unit 63 identifies the positions of the two fingers on the transmissive screen 11 based on the principle of triangulation.
[0084] After identifying the number and positions of fingers on the transmissive screen 11 based on each set of infrared images, the operation detection unit 63 determines the content of the operation instruction given to the transmissive screen 11 based on the identified number and positions of fingers. Alternatively, the operation detection unit 63 determines the content of the operation instruction based on how the identified finger positions have changed relative to the finger positions in an earlier set of infrared images. After determining the content of the operation instruction, the operation detection unit 63 outputs an instruction to the projection image generation unit 61 or the moving image playback unit 62 according to the determined operation instruction.
[0085] Specifically, for example, when the positions of two fingers have been identified in a given set of infrared images and the positions of two fingers were also identified in an earlier set of infrared images, and the distance between the two identified fingers has become wider than before, the operation detection unit 63 determines that an operation instructing enlargement of the image projected on the transmissive screen 11 has been performed. When it determines that the operation instructs enlargement of the image, the operation detection unit 63 outputs an instruction to enlarge the image to the projection image generation unit 61.
[0086] As another example, when the positions of two fingers have been identified in a given set of infrared images and in an earlier set of infrared images, and the distance between the two identified fingers has become narrower than before, the operation detection unit 63 determines that an operation instructing reduction of the image projected on the transmissive screen 11 has been performed. When it determines that the operation instructs reduction of the image, the operation detection unit 63 outputs an instruction to reduce the image to the projection image generation unit 61.
[0087] As yet another example, when the positions of two fingers have been identified in a given set of infrared images and in an earlier set of infrared images, and the two identified fingers are moving in substantially the same direction, the operation detection unit 63 determines that an operation instructing rotation of the image projected on the transmissive screen 11 has been performed. When it determines that the operation instructs rotation of the image, the operation detection unit 63 outputs an instruction to rotate the image to the projection image generation unit 61.
[0088] As a further example, when the position of a single finger in a given set of infrared images is identified as lying within a predetermined position range, the operation detection unit 63 determines that an operation instructing playback of the moving image has been performed. When it determines that the operation instructs playback of the moving image, the operation detection unit 63 outputs an instruction to play the moving image to the moving image playback unit 62.
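The decision logic described in paragraphs [0085] to [0088] amounts to a simple gesture classifier over successive sets of finger positions. The sketch below is a hedged illustration only; the thresholds, play-button region, function name, and command strings are assumptions, not part of the patent.

```python
from typing import Optional

def classify_gesture(prev: list, curr: list,
                     play_region=((0.0, 0.0), (0.1, 0.1)),
                     spread_threshold: float = 0.01) -> Optional[str]:
    """Classify an operation from previous and current finger positions.

    prev, curr: lists of (x, y) finger positions on the screen, one entry per
    detected finger, taken from two consecutive scans.
    Returns "enlarge", "reduce", "rotate", "play", or None.
    """
    if len(curr) == 1:
        (x, y) = curr[0]
        (x0, y0), (x1, y1) = play_region
        # One finger inside the predetermined region -> play the moving image.
        if x0 <= x <= x1 and y0 <= y <= y1:
            return "play"
        return None

    if len(prev) == 2 and len(curr) == 2:
        def gap(pts):
            (ax, ay), (bx, by) = pts
            return ((ax - bx) ** 2 + (ay - by) ** 2) ** 0.5

        delta = gap(curr) - gap(prev)
        if delta > spread_threshold:
            return "enlarge"          # fingers moved apart
        if delta < -spread_threshold:
            return "reduce"           # fingers moved together
        # Both fingers translated in roughly the same direction -> rotate.
        moves = [(cx - px, cy - py) for (px, py), (cx, cy) in zip(prev, curr)]
        dot = moves[0][0] * moves[1][0] + moves[0][1] * moves[1][1]
        if dot > 0:
            return "rotate"
    return None
```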
[0089] When an instruction to enlarge the image, an instruction to reduce the image, or an instruction to rotate the image is input, the projection image generation unit 61 reads the three-dimensional computer graphics data 44 and, from the three-dimensional computer graphics data 44, generates two-dimensional still image data in which an image of the real object 9 is projected onto a predetermined plane.
[0090] The projection image generation unit 61 generates still image data for projecting an image obtained by enlarging, reducing, or rotating the image currently being projected onto the transmissive screen 11 by the projector 13, taking that current image as the reference. At the time of the first operation, the image 51 output to the projector 13 as the initial screen serves as the reference. The projection image generation unit 61 outputs the generated still image data to the projector 13. The projector 13 outputs a screen based on the newly input still image data. The image newly projected by the projector 13 is formed on the transmissive screen 11.
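Paragraph [0090] treats each new frame as a transform applied relative to the currently projected image. A hedged sketch of that cumulative view state follows; the step sizes and the idea of rendering from a stored scale and angle are illustrative assumptions only.

```python
class ViewState:
    """Cumulative view parameters of the projected real-object image."""

    def __init__(self):
        self.scale = 1.0       # 1.0 corresponds to the initial image 51
        self.angle_deg = 0.0   # rotation of the object about its axis

    def apply(self, command: str, scale_step: float = 1.1,
              angle_step: float = 15.0) -> None:
        # Each command modifies the view relative to the current image,
        # as described in [0090].
        if command == "enlarge":
            self.scale *= scale_step
        elif command == "reduce":
            self.scale /= scale_step
        elif command == "rotate":
            self.angle_deg = (self.angle_deg + angle_step) % 360

view = ViewState()
for cmd in ["enlarge", "enlarge", "rotate"]:
    view.apply(cmd)
# The next still image would be rendered from this cumulative view.
print(view.scale, view.angle_deg)
```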
[0091] Thus, for example, when an instruction to enlarge the image is given while the initial image 51 shown in FIG. 8 is being projected on the transmissive screen 11, the image 52 shown in FIG. 9, which contains an enlarged image of the real object 9, is projected onto the transmissive screen 11. As another example, when an instruction to rotate the image is given while the image 52 of FIG. 9, which contains an enlarged image of the real object 9, is being projected on the transmissive screen 11, the image 53 shown in FIG. 10, which contains an image of the real object 9 viewed from above, is projected onto the transmissive screen 11.
[0092] In this way, the real-object image, which was initially formed on the transmissive screen 11 so as to overlap the real object 9, can be enlarged, reduced, or rotated by operating on the transmissive screen 11. As a result, the operating viewer 14 can be given the illusion of holding and viewing the exhibited real object 9 itself in his or her hands.
[0093] If no further operation is detected over a predetermined period, the operation detection unit 63 instructs the projection image generation unit 61 to display the initial image 51 shown in FIG. 8. As a result, the image 51 overlapping the real object 9 shown in FIG. 8 is displayed on the transmissive screen 11. This can give the operating viewer 14 the illusion of having placed the real object 9, which he or she had been holding and viewing, back onto the mounting table 2.
[0094] The projection image generation unit 61 may also generate still image data in which the real-object image is continuously enlarged, reduced, and rotated until the real-object image formed on the transmissive screen 11 matches the real-object image in the initial image shown in FIG. 8, and only thereafter generate the still image data of the image 51 overlapping the real object 9 shown in FIG. 8. In that case, the real-object image formed on the transmissive screen 11 changes smoothly in size and orientation before it comes to overlap the real object 9. By smoothly changing the real-object image formed on the transmissive screen 11 in this way so that it comes to overlap the real object 9, the sense of unity between the real-object image formed on the transmissive screen 11 and the real object 9 can be made even stronger.
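A return-to-initial transition of the kind described in [0094] can be sketched as a simple interpolation of scale and rotation over a few frames. The frame count, parameter names, and linear easing are illustrative assumptions only.

```python
def return_to_initial(scale: float, angle_deg: float, frames: int = 30):
    """Yield (scale, angle) pairs that ease the current view back to the
    initial pose (scale 1.0, rotation 0), so the projected image appears to
    glide back onto the real object instead of snapping."""
    for i in range(1, frames + 1):
        t = i / frames                       # linear interpolation factor
        yield (scale + (1.0 - scale) * t,
               angle_deg * (1.0 - t))

# Example: from 2x zoom rotated 45 degrees back to the initial pose.
for s, a in return_to_initial(2.0, 45.0, frames=5):
    print(f"scale={s:.2f} angle={a:.1f}")
```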
[0095] When an instruction to play the moving image is input, the moving image playback unit 62 generates a video signal from the moving image data 45. The moving image playback unit 62 outputs the generated video signal to the projector 13. The projector 13 outputs a moving image based on this video signal. The moving image projected by the projector 13 is displayed on the transmissive screen 11.
[0096] When generation of the video signal based on the moving image data 45 is finished, the moving image playback unit 62 instructs the projection image generation unit 61 to display the initial image 51 shown in FIG. 8. As a result, the image 51 overlapping the real object 9 shown in FIG. 8 is displayed on the transmissive screen 11.
[0097] As described above, the exhibition device of Embodiment 1 can show, on the transmissive screen 11, a real-object image having the same pose as the real object 9 seen through the transmissive screen 11, superimposed on the real object 9 on the mounting table 2. This allows the viewer 14 to recognize that the real-object image formed on the transmissive screen 11 corresponds one-to-one to the real object 9, and gives the viewer the illusion that the real-object image formed on the transmissive screen 11 is the real object 9 itself.
[0098] The exhibition device of Embodiment 1 therefore lets the viewer 14 view the real object 9 and the virtual real-object image from the same viewpoint, and gives the two a sense of being one and the same. For example, the size and texture of the real object 9, which are difficult to convey with a virtual image alone, can be imparted to that virtual image. As a result, the exhibition device of Embodiment 1 can give the real-object image a sense of reality.
[0099] Moreover, by operating on the transmissive screen 11, the viewer 14 can enlarge, reduce, or rotate the real-object image formed on the transmissive screen 11. The viewer 14 can therefore freely appreciate, through the real-object image, a real object 9 that cannot be touched. As a result, a high learning effect can be expected for works of art and the like.
[0100] In Embodiment 1, operations on the transmissive screen 11 are detected by the retroreflective tape 21 and the two infrared light emitting/receiving units 22, 23. With this configuration, operations on the transmissive screen 11 can be detected without placing any member for detecting those operations over the transmissive screen 11. Since no member is arranged over the transmissive screen 11, the clarity of the real object 9 seen through the transmissive screen 11 and of the real-object image formed on the transmissive screen 11 is not impaired.
[0101] In Embodiment 1, the two infrared light emitting/receiving units 22, 23 are disposed on the upper side of the upright transmissive screen 11. Compared with, for example, disposing the two infrared light emitting/receiving units 22, 23 on the lower side of the transmissive screen 11, infrared light other than that from the infrared LED 25 is therefore less likely to enter the infrared CCD 26. Furthermore, since infrared light is used as the light for detecting operations, the influence of visible light is also reduced. If visible light were used as the light for detecting operations, it would be easily affected by, for example, the light of the spot lamp 4 illuminating the real object 9. As a result, in the infrared images of Embodiment 1, the shadow of the hand operating on the transmissive screen 11 is formed sharply. The operation detection unit 63 can easily identify the position of the operating hand from these infrared images in which the shadow of the hand appears clearly.
[0102] In Embodiment 1, the initial image 51 shown in FIG. 8 is projected at a predetermined position on the transmissive screen 11. Alternatively, for example, the initial image 51 shown in FIG. 8, projected at the predetermined position on the transmissive screen 11, may be made movable left, right, up, and down on the transmissive screen 11, or enlargeable and reducible, in response to operations by the viewer 14. The viewer 14 can then move the real-object image so that it overlaps the real object 9 from his or her own line of sight. In this case, images subsequently projected onto the transmissive screen 11 are preferably projected with the moved position as the reference.
[0103] In Embodiment 1, the real-object image formed based on the three-dimensional computer graphics data 44 depicts the real object 9 placed on the mounting table 2 as it is. Alternatively, for example, when the real object 9 is an old work of art whose colors have faded, the real-object image formed based on the three-dimensional computer graphics data 44 may be an image in which the lost colors are added to the real object 9. This makes it possible to appreciate a work of art or the like in its original appearance, which is difficult to grasp from the real object 9 alone.
[0104] In Embodiment 1, the moving image data 45 is simply projected continuously onto the transmissive screen 11. Alternatively, for example, the video signal based on the moving image data 45 may be generated in stages in response to operations on the transmissive screen 11 by the viewer 14. This allows the viewer 14 to relive, for example, the process of coloring the real object 9.
[0105] In Embodiment 1, the moving image data 45 is stored in the hard disk device 35. In addition, for example, various other data such as slide data and audio data may be played back on the exhibition device so that images, video, or text information related to the real object are formed on the transmissive screen 11.
[0106] In Embodiment 1, operations on the transmissive screen 11 are detected by the retroreflective tape 21 and the two infrared light emitting/receiving units 22, 23. Alternatively, for example, operations on the transmissive screen 11 may be detected by a touch panel or the like arranged over the transmissive screen 11. However, when hand movements are detected with a touch panel or the like, the viewer 14 feels as though he or she is operating the touch panel rather than operating the real-object image formed on the transmissive screen 11. As a result, it becomes difficult to give the viewer 14 a sense of unity between the real-object image formed on the transmissive screen 11 and the real object 9. Detecting operations on the transmissive screen 11 with the retroreflective tape 21 and the two infrared light emitting/receiving units 22, 23 is preferable because such a sense of unity can be maintained.
[0107] In Embodiment 1, a screen with a transmittance of 65% is used as the transmissive screen 11. Alternatively, for example, a transmissive screen 11 with a transmittance of 10% or more and 90% or less may be used. A polarizing screen may also be used in place of the transmissive screen 11. However, the color of the image on a polarizing screen changes with even a slight change in viewing angle, so it is unsuitable for a large exhibition device viewed by multiple viewers 14 at the same time. In addition, since the projector 13 is disposed on the real-object 9 side of the transmissive screen 11, the floor area occupied by the exhibition device is smaller than when, for example, the projector 13 is disposed on the viewer 14 side of the transmissive screen 11.
[0108] Embodiment 2.
The exhibition device according to Embodiment 2 differs from the exhibition device according to Embodiment 1 in that it controls the position and size of the real-object image formed on the transmissive screen 11 according to the height, viewpoint (eye) position, standing position, and the like of the viewer 14 standing in front of the transmissive screen 11. The following description focuses mainly on this difference.
[0109] FIG. 12 is a diagram showing an exhibition device according to Embodiment 2 of the present invention. FIG. 12(A) is a side view of the exhibition device. FIG. 12(B) is a front view of the exhibition device. FIG. 13 is a block diagram showing the hardware configuration of the control system of the exhibition device of FIG. 12. FIG. 14 is a block diagram showing the control system of the exhibition device of FIG. 12. FIG. 15 is a diagram showing the data stored in the hard disk device 35 of FIG. 13.
[0110] As shown in FIG. 12, the exhibition device according to Embodiment 2 has a distance sensor 71 as viewing position detection means and an imaging device 72 as viewing position detection means.
[0111] The distance sensor 71 is attached to the central portion of the lower edge of the transmissive screen 11, facing the area in front of the transmissive screen 11. The distance sensor 71 outputs infrared or other light and receives the reflected light. The distance sensor 71 calculates the distance to the object that reflected the light based on the time from when the light is output until the reflected light is received. As shown in FIG. 13, the distance sensor 71 outputs the calculated distance to the input/output port 32.
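The time-of-flight calculation in [0111] is simply half the round-trip time multiplied by the propagation speed. A one-function sketch follows, assuming the sensor reports the emit-to-receive round trip in seconds; the function name is illustrative.

```python
SPEED_OF_LIGHT = 299_792_458.0  # meters per second

def distance_from_round_trip(round_trip_seconds: float) -> float:
    """Distance to the reflecting object from the emit-to-receive time."""
    return SPEED_OF_LIGHT * round_trip_seconds / 2.0

# Example: a 10-nanosecond round trip corresponds to about 1.5 m.
print(distance_from_round_trip(10e-9))
```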
[0112] The imaging device 72 has a plurality of light receiving elements, not shown. Each light receiving element outputs a light reception level signal corresponding to the amount of light received. The plurality of light receiving elements are arranged vertically and horizontally in the same plane. The plane in which the plurality of light receiving elements are arranged is called the light receiving surface. The imaging device 72 is attached to the central portion of the upper edge of the opaque plate 12 in a posture in which its light receiving surface faces downward by a predetermined angle toward the area in front of the transmissive screen 11. As shown in FIG. 13, the imaging device 72 outputs a captured image based on the light reception level signals of the plurality of light receiving elements to the input/output port 32.
[0113] The hard disk device 35 according to Embodiment 2 also stores a projection image generation program 81 and a person detection program 82, as shown in FIG. 15.
[0114] The projection image generation program 81 is read into the memory 34 and executed by the CPU 33. This creates the projection image generation unit 91 shown in FIG. 14, which serves as image changing means and image control means. The projection image generation unit 91 generates still image data based on the three-dimensional computer graphics data 44 and outputs the generated still image data to the projector 13.
[0115] The person detection program 82 is read into the memory 34 and executed by the CPU 33. This creates the person detection unit 92 shown in FIG. 14, which serves as viewer viewpoint specifying means and viewing position detection means. The person detection unit 92 receives the captured image captured by the imaging device 72 and the distance, calculated by the distance sensor 71, to the object that reflected the light.
[0116] In the exhibition device according to Embodiment 2, the components other than those described above are the same as the components of the exhibition device according to Embodiment 1; the same reference numerals and names are used and their description is omitted.
[0117] Next, the operation of the exhibition device according to Embodiment 2 of the present invention will be described.
[0118] In the initial state, in which there is no one around the exhibition device, the distance sensor 71 outputs light toward the area in front of the transmissive screen 11. The imaging device 72 images a predetermined range on the near side of the transmissive screen 11. The spot lamp 4 is lit. The projector 13 is off.
[0119] The projector 13 can be turned off by setting its luminance to 0. If the projector 13 is one that cannot be turned off, it may instead be made to output a completely black image.
[0120] When a viewer 14 stands in front of the transmissive screen 11 to view the real object 9 (a silver cup) illuminated by the light of the spot lamp 4, the distance sensor 71 receives the light reflected by the viewer 14. The distance sensor 71 calculates the distance to the viewer 14 and outputs it to the input/output port 32. The imaging device 72 outputs a captured image containing the image of the viewer 14 to the input/output port 32.
[0121] The information on the distance to the viewer 14 detected by the distance sensor 71 and the captured image are input to the person detection unit 92. When the distance information from the distance sensor 71 is input, the person detection unit 92 identifies the position of the head of the viewer 14 standing in front of the transmissive screen 11 based on that distance information and the captured image at that time. Instead of the head, a position such as the eyes or the point between the eyebrows may be identified as the viewpoint position. In doing so, the viewpoint position (L, H) is identified based on, for example, the installation height of the imaging device 72, the imaging angle Θ corresponding to each pixel of the captured image, and the distance L to the viewer 14.
[0122] FIG. 16 is a diagram showing the geometric positional relationship among the real object 9, the viewer 14, and the transmissive screen 11. The person detection unit 92 identifies, by image processing, the position of the head of the viewer 14 (that is, the viewpoint position) within the captured image. Taking the distance to the viewer 14 detected by the distance sensor 71 as the distance from the transmissive screen 11 to the viewer 14, the person detection unit 92 virtually places the captured image at that distance from the transmissive screen 11 and, based on this virtual placement, identifies the relative position of the identified head with respect to the transmissive screen 11 and the real object 9. The person detection unit outputs information on the identified relative position of the head to the projection image generation unit 91.
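A minimal sketch of the viewpoint estimation described in [0121] and [0122] follows. The camera mounting height and the per-pixel downward-angle model are illustrative assumptions; the patent only states that the viewpoint position (L, H) is derived from the sensor distance and the imaging angle of the pixel in which the head appears.

```python
import math

def estimate_viewpoint(distance_L: float, pixel_angle_deg: float,
                       camera_height: float = 2.0) -> tuple:
    """Estimate the viewer's viewpoint (L, H) in front of the screen.

    distance_L: distance from the screen to the viewer (distance sensor 71).
    pixel_angle_deg: downward angle, from horizontal, of the line of sight of
    the camera pixel in which the viewer's head appears (imaging device 72).
    camera_height: assumed mounting height of the camera above the floor.
    Returns (L, H): distance from the screen and head height above the floor.
    """
    # The head lies on the ray leaving the camera at pixel_angle_deg, at a
    # horizontal distance of distance_L from the screen plane.
    drop = distance_L * math.tan(math.radians(pixel_angle_deg))
    head_height = camera_height - drop
    return distance_L, head_height

# Example: viewer 1.5 m from the screen, head seen 10 degrees below horizontal.
print(estimate_viewpoint(1.5, 10.0))   # -> (1.5, about 1.74)
```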
[0123] Based on the information on the relative position of the head (that is, the viewpoint position) and the three-dimensional computer graphics data 44, the projection image generation unit 91 generates still image data such that the real-object image formed on the transmissive screen 11 coincides with the real object 9 when seen from that head position.
[0124] Specifically, for example, as shown in FIG. 16, the projection image generation unit 91 first identifies the position 102 at which the transmissive screen 11 intersects the center line connecting the head position identified by the person detection unit 92 (that is, the viewpoint position) and the center of the real object 9. This position 102 becomes the center of the real-object image formed on the transmissive screen 11. The projection image generation unit 91 also identifies the size 101 of the real-object image to be formed on the transmissive screen 11, as shown in FIG. 16.
[0125] After identifying the center 102 and the size 101 of the real-object image to be formed on the transmissive screen 11, the projection image generation unit 91 generates still image data in which the real-object image is formed at the identified location. The projection image generation unit 91 outputs the generated still image data to the projector 13. The projector 13 starts output and outputs an image based on the still image data. The image projected by the projector 13 is formed on the transmissive screen 11.
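The geometry of paragraphs [0124] and [0125] reduces to intersecting the viewpoint-to-object line with the screen plane and scaling the object by similar triangles. The sketch below works in a simplified coordinate system that is an assumption for illustration (screen in the plane x = 0, viewer on the positive-x side, real object on the negative-x side).

```python
def image_on_screen(eye, obj_center, obj_size):
    """Where and how large to draw the object image on the screen plane x = 0.

    eye: (x, y, z) viewpoint position, with x > 0 (viewer side).
    obj_center: (x, y, z) center of the real object, with x < 0.
    obj_size: physical size (e.g. height) of the real object.
    Returns ((y, z) image center on the screen, image size on the screen).
    """
    ex, ey, ez = eye
    ox, oy, oz = obj_center
    # Parameter t at which the eye-to-object line crosses the plane x = 0.
    t = ex / (ex - ox)
    center = (ey + t * (oy - ey), ez + t * (oz - ez))
    # Similar triangles: the image is scaled by the ratio of the eye-to-screen
    # distance to the eye-to-object distance along the viewing axis.
    scale = ex / (ex - ox)
    return center, obj_size * scale

# Example: eye 1.5 m in front of the screen at height 1.6 m, cup 0.5 m behind
# the screen at height 1.0 m, cup 0.2 m tall.
print(image_on_screen((1.5, 0.0, 1.6), (-0.5, 0.0, 1.0), 0.2))
```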
[0126] FIG. 17 is a diagram showing an image 111 based on still image data generated from the three-dimensional computer graphics data 44. In the image 111 of FIG. 17, an image of the real object identical to the real object 9 (here, a silver cup) as seen from the front side of the transmissive screen 11 is formed with its center shifted toward the lower left in FIG. 17.
[0127] FIG. 18 is a diagram showing a state in which the image 111 shown in FIG. 17 is projected onto the transmissive screen 11. In this way, when the viewer 14 standing in front of the transmissive screen 11 stands toward the left in FIG. 18 and is shorter than the standard height, the real-object image is formed on the transmissive screen 11 at a position shifted toward the lower left from the center of the transmissive screen 11. As a result, to the viewer 14 standing toward the left in front of the transmissive screen 11, the real-object image formed on the transmissive screen 11 and the real object 9 seen through the transmissive screen 11 appear to overlap completely. Thus, by controlling the size and position of the real-object image formed on the transmissive screen 11 so that it overlaps the real object 9, the viewer 14 can be given the illusion that the exhibited real object 9 itself has appeared at hand, regardless of the viewer's height or standing position.
[0128] When the viewer 14 approaches the transmissive screen 11, the same processing causes the image to be projected larger. As a result, to the viewer 14 who has approached the transmissive screen 11, the real-object image formed on the transmissive screen 11 and the real object 9 seen through the transmissive screen 11 appear to overlap completely.
[0129] Furthermore, based on the detection of the viewer 14 by the distance sensor 71, the projection image generation unit 91 generates the still image data, and the projector 13 starts outputting an image based on that still image data. This can give the viewer 14 approaching the transmissive screen 11 the illusion that the real object 9 placed on the mounting table 2 has moved toward him or her.
[0130] In the exhibition device according to Embodiment 2, the operation after this initial screen is displayed is the same as the operation of the exhibition device according to Embodiment 1, so its description is omitted.
[0131] As described above, the exhibition device of Embodiment 2 controls the position and size of the real-object image formed on the transmissive screen 11 according to the height and position of the viewer 14 standing in front of the transmissive screen 11. The exhibition device of Embodiment 2 adjusts the position of the real-object image on the transmissive screen 11. This allows the real-object image formed on the transmissive screen 11 to be shown to the viewer 14 overlapping the real object 9, regardless of the viewer's height and standing position. It is also unnecessary to designate the standing position of the viewer 14 as in Embodiment 1. In the case of the exhibition device of Embodiment 2, the viewer 14 may view from any position on the front side of the transmissive screen 11.
[0132] Embodiment 2 has been described for the case where a single viewer 14 stands in front of the transmissive screen 11, but several viewers may stand in front of the transmissive screen 11 at once. FIG. 19 is a diagram showing a situation in which three viewers 14 are standing in front of the transmissive screen 11. When a plurality of viewers stand in front of the transmissive screen 11 in this way, the person detection unit 92 may identify the viewer closest to the middle of the transmissive screen 11 (the middle one of the three viewers 14 in FIG. 19) as the representative viewer 14 who operates on the transmissive screen 11, and show the real-object image formed on the transmissive screen 11 overlapping the real object 9 to that representative viewer 14. The other viewers 14, who do not operate on the transmissive screen 11, merely look on at the operations of the representative viewer 14, so such control still yields a sufficient viewing effect. A similar effect can be expected if, instead of identifying the viewer closest to the middle of the transmissive screen 11, the viewer in the middle of the captured image is identified.
[0133] In Embodiment 2, the person detection unit 92 accurately identifies the position of the viewer 14 based on the detection information of the distance sensor 71 and the imaging device 72. Alternatively, for example, as shown in FIG. 20, the person detection unit 92 may have the viewer 14 touch the transmissive screen 11 and identify the approximate position of the viewer's head based on the touched position and on the positional relationship between the transmissive screen 11 and the viewer 14 that is assumed when the viewer 14 touches the screen with an outstretched arm. FIG. 20 is a diagram showing a state in which the viewer 14 is touching the transmissive screen 11. This makes it possible to identify the position of the viewer 14 by simple processing with the same hardware configuration as the exhibition device of Embodiment 1, and the same effect as the exhibition device of Embodiment 2 can be expected. In order to have the viewer 14 touch the transmissive screen 11, the person detection unit 92 may display a message such as "Touch here" on the transmissive screen 11.
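The approximation in [0133] can be sketched as a fixed offset from the touch point, on the assumption that the viewer touches the screen with an outstretched arm; the arm-reach and eye-offset values below are placeholder figures, not taken from the patent.

```python
def head_from_touch(touch_x: float, touch_y: float,
                    arm_reach: float = 0.6, eye_drop: float = 0.15):
    """Rough viewpoint estimate from a touch on the screen.

    touch_x, touch_y: touch position on the screen (horizontal, height).
    arm_reach: assumed horizontal distance from the eyes to the fingertip.
    eye_drop: assumed height of the eyes below the raised fingertip.
    Returns (distance_from_screen, head_x, head_y).
    """
    return arm_reach, touch_x, touch_y - eye_drop

# Example: touch 0.3 m right of screen center, 1.5 m above the floor.
print(head_from_touch(0.3, 1.5))   # -> (0.6, 0.3, 1.35)
```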
[0134] When the position of the viewer 14 is detected by the method shown in FIG. 20, the retroreflective tape 21 and the two infrared light emitting/receiving units 22, 23 serve as contact position detection means that detect the position touched on the transmissive screen 11, and the operation detection unit 63 serves as viewer viewpoint specifying means that identifies the position of the viewer 14 from the detected contact position, based on the positional relationship between the transmissive screen 11 and the viewer 14 that is assumed when the viewer 14 touches the screen with an outstretched arm.
[0135] In Embodiment 2, the person detection unit 92 identifies the position of the viewer 14 and adjusts the formation position of the real-object image when the initial screen is displayed. Alternatively, for example, the person detection unit 92 may identify the position of the viewer 14 and adjust the formation position of the real-object image every time the projection image generation unit 91 outputs still image data. In that case, even if the viewer 14 moves, the real-object image can be shifted to follow the viewer's line of sight so that it is always formed at a position overlapping the real object 9. However, when control is performed so that the real-object image always overlaps the real object in this way, the control device 31 must be capable of high-speed image processing, which makes the exhibition device expensive.
[0136] In Embodiment 2, the projector 13 is off when there is no viewer 14 around the exhibition device. Alternatively, for example, the projector 13 may output images other than the real-object image when there is no viewer 14 around the exhibition device. This makes the exhibition device more likely to catch the eye of viewers 14 who are still some distance away.
[0137] Embodiment 3.
The exhibition device according to Embodiment 3 differs from the exhibition device according to Embodiment 2 in that it dims the spot lamp 4 in accordance with the projection of images onto the transmissive screen 11. The following description focuses mainly on this difference.
[0138] FIG. 21 is a block diagram showing the hardware configuration of the control system of an exhibition device according to Embodiment 3 of the present invention. FIG. 22 is a block diagram showing the control system of the exhibition device of FIG. 21. FIG. 23 is a diagram showing the data stored in the hard disk device 35 of FIG. 21.
[0139] In the exhibition device according to Embodiment 3, the spot lamp 4 is connected to the input/output port 32, as shown in FIG. 21. The amount of light of the spot lamp 4 is controlled by the control system of the exhibition device. The hard disk device 35 according to Embodiment 3 also stores a projection image generation program 121 and a lamp control program 122, as shown in FIG. 23.
[0140] The projection image generation program 121 is read into the memory 34 and executed by the CPU 33. This creates the projection image generation unit 131 shown in FIG. 22, which serves as image changing means, light amount control means, and image control means. The projection image generation unit 131 generates still image data based on the three-dimensional computer graphics data 44 and outputs the generated still image data to the projector 13.
[0141] The lamp control program 122 is read into the memory 34 and executed by the CPU 33. This creates the lamp control unit 132 shown in FIG. 22, which serves as light amount control means. The lamp control unit 132 controls the light output of the spot lamp 4 between 0 and 100%.
[0142] In the exhibition device according to Embodiment 3, the components other than those described above are the same as the components of the exhibition device according to Embodiment 2; the same reference numerals and names are used and their description is omitted.
[0143] Next, the operation of the exhibition device according to Embodiment 3 of the present invention will be described.
[0144] 展示装置の周囲に人がいない初期状態においては、距離センサ 71は、透過スクリ ーン 11の手前方向へ光を出力する。撮像装置 72は、透過スクリーン 11の手前側の 所定の範囲を撮像する。ランプ制御部 132は、スポットランプ 4を 100%の出力で点 灯する。プロジェクタ 13は、消灯している。  In an initial state in which no person is around the display device, the distance sensor 71 outputs light in the forward direction of the transmission screen 11. The imaging device 72 images a predetermined range on the near side of the transmission screen 11. The lamp control unit 132 lights the spot lamp 4 at 100% output. The projector 13 is off.
[0145] スポットランプ 4の光に照らされている実物 9 (ここでは銀色のカップ)を鑑賞するた めに、透過スクリーン 11の手前に鑑賞者 14が立つと、距離センサ 71は、鑑賞者 14 までの距離を出力する。人検出部 92は、その鑑賞者 14の頭の位置を特定し、その 特定した頭の透過スクリーン 11および実物 9に対する相対位置に関する情報を、投 影画像生成部 131へ出力する。  [0145] When the viewer 14 stands in front of the transmissive screen 11 to view the real object 9 (here, the silver cup) illuminated by the light of the spot lamp 4, the distance sensor 71 The distance to is output. The human detection unit 92 specifies the position of the head of the viewer 14 and outputs information on the relative position of the specified head to the translucent screen 11 and the real object 9 to the projection image generation unit 131.
[0146] 投影画像生成部 131は、この頭の相対位置の情報と、三次元コンピュータグラフィ ックスデータ 44とに基づいて、透過スクリーン 11に形成する実物の像力 その頭の位 置から見たときに、実物 9と一致する静止画データを生成し、プロジェクタ 13へ出力 する。プロジェクタ 13は、出力を開始し、静止画データに基づく画像を出力する。透 過スクリーン 11には、プロジェクタ 13により投影された画像が形成される。  [0146] Based on the information on the relative position of the head and the three-dimensional computer graphics data 44, the projection image generation unit 131 generates a real image formed on the transmissive screen 11 when viewed from the position of the head. Then, still image data that matches the real object 9 is generated and output to the projector 13. The projector 13 starts outputting and outputs an image based on the still image data. An image projected by the projector 13 is formed on the translucent screen 11.
[0147] At about the same time as it outputs the still image data to the projector 13, the projection image generation unit 131 instructs the lamp control unit 132 to turn the lamp off, and the lamp control unit 132 turns off the spot lamp 4.
[0148] When the spot lamp 4 is turned off as the projection onto the transmissive screen 11 starts, the real object 9 can no longer be seen through the transmissive screen 11, because the transmissive screen 11, whose transmittance is 65%, lies between the real object 9 and the viewer 14. As a result, the viewer 14 can be given the illusion that the real object 9 placed on the mounting table 2 has moved toward the viewer.

[0149] When this effect, the illusion that the real object 9 placed on the mounting table 2 has moved toward the viewer, is desired, a transmissive screen 11 with a transmittance of 50% to 80% should be used. If the transmittance of the transmissive screen 11 is lower than 50%, the real object 9 disappears from view as soon as the spot lamp 4 is turned off, which makes it difficult to give the impression that the real object 9 on the mounting table 2 has moved toward the viewer. If the transmittance of the transmissive screen 11 is higher than 80%, the real object 9 remains faintly visible even after the spot lamp 4 is turned off, which again makes it difficult to give that impression.
[0150] When no operation has been performed for a certain period, the operation detection unit 63 instructs the projection image generation unit 131 to display the initial image again. The projection image generation unit 131 displays the previously shown initial image 51 on the transmissive screen 11, and after a predetermined time has elapsed it instructs the projector 13 to turn off and instructs the lamp control unit 132 to turn the lamp on. The lamp control unit 132 turns on the spot lamp 4, and the projector 13 turns off.
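Taken together, paragraphs [0144] to [0150] describe a simple sequence: spot lamp on while nobody is present, projected image while a viewer is present, and a timed return to the initial state after a period without operation. The sketch below expresses one possible form of that sequencing, reusing the hypothetical lamp_unit and image_unit interfaces from the earlier sketch; the sensor methods, timeout value, and polling intervals are assumptions.

    import time

    IDLE_TIMEOUT_S = 30.0  # assumed value; the patent only says "a predetermined time"

    def exhibition_loop(person_sensor, image_unit, lamp_unit, projector):
        """One possible control loop for Embodiment 3: spot lamp in the idle state,
        projected image while a viewer is present, lamp again after a timeout."""
        lamp_unit.set_output(100.0)  # initial state: spot lamp at full output
        projector.off()
        while True:
            head = person_sensor.detect_head()       # assumed to return None if nobody is present
            if head is not None:
                image_unit.show_matched_image(head)  # start projecting the matched image
                lamp_unit.set_output(0.0)            # turn the spot lamp off
                last_activity = time.monotonic()
                while time.monotonic() - last_activity < IDLE_TIMEOUT_S:
                    if person_sensor.operation_detected():
                        last_activity = time.monotonic()  # viewer is still interacting
                    time.sleep(0.05)
                projector.off()                      # return to the initial state
                lamp_unit.set_output(100.0)
            time.sleep(0.1)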
[0151] When the projection onto the transmissive screen 11 ends and the spot lamp 4 is turned on at that moment, the viewer 14 can be given the illusion that the real object 9, which until a moment before the viewer had felt as if holding in the hand, has returned to the mounting table 2. For the same reason as when the spot lamp 4 is turned off at the start of projection, a transmissive screen 11 with a transmittance of 50% to 80% should be used when this effect is desired.
[0152] In the display device according to Embodiment 3, the operation from the first display of the initial screen until the initial screen is finally cleared is the same as that of the display device according to Embodiment 2, and its description is omitted.
[0153] As described above, in the display device of Embodiment 3, the spot lamp 4 is turned off when the image of the real object is displayed on the transmissive screen 11. This gives the viewer 14 the illusion that the real object 9 placed on the mounting table 2 has come closer, and the viewer 14's attention moves back and forth between the real object 9 and its image. A similar effect can be expected by merely lowering the light quantity of the spot lamp 4 instead of turning it off.
[0154] In the display device of Embodiment 3, the spot lamp 4 is turned on when the display of the image of the real object on the transmissive screen 11 ends. This gives the viewer 14 the illusion that the image that had been formed on the transmissive screen 11 has returned to the real object 9 on the far side of the transmissive screen 11, and again the viewer 14's attention moves back and forth between the real object 9 and its image. A similar effect can be expected by merely raising the light quantity of the spot lamp 4 instead of turning it on.
[0155] In Embodiment 3, only the spot lamp 4 is lit while no one is near the display device, and only the projector 13 is lit once a viewer has arrived. Alternatively, the lighting of the spot lamp 4 and of the projector 13 may, for example, be controlled in steps or controlled so as to change gradually.
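One way to realize the gradual variant mentioned in paragraph [0155] is to cross-fade the spot lamp and the projected image over a short interval instead of switching them instantaneously. The sketch below reuses the hypothetical lamp interface from the earlier sketches and assumes a projector that accepts a brightness level, which the patent does not require.

    import time

    def cross_fade_to_projection(lamp_unit, projector, duration_s=2.0, steps=20):
        """Gradually dim the spot lamp while the projected image brightens, so the
        hand-off between the real object and its image is not abrupt."""
        for i in range(steps + 1):
            level = 100.0 * i / steps
            lamp_unit.set_output(100.0 - level)  # spot lamp: 100% -> 0%
            projector.set_brightness(level)      # projected image: 0% -> 100%
            time.sleep(duration_s / steps)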
[0156] Alternatively, only the lighting of the spot lamp 4 may be controlled according to the presence or absence of a person. In this modification, a projector 13 without a turn-off function can be used, so that the same effect as that of the display device of Embodiment 3 can be expected while keeping down the total cost of the display device.
[0157] As a further alternative, the spot lamp 4 and the projector 13 may both be kept off while no one is near the display device, only the spot lamp 4 may be lit when a viewer arrives, and only the projector 13 may be lit after the viewer 14 performs an operation.
[0158] In Embodiment 3, the display device has one spot lamp 4 that outputs incandescent-colored light. Alternatively, as shown in FIG. 24, the display device may have both a spot lamp 4 that outputs incandescent-colored light and a spot lamp 141 that outputs monochromatic light. The display device may switch between the spot lamps 4 and 141 or light them at the same time; when the spot lamps 4 and 141 are lit simultaneously, the apparent color of the real object 9 can be changed by the mixing of their light. FIG. 24 is a cross-sectional view showing a modification of the display device having a spot lamp 4 that outputs incandescent-colored light and a spot lamp 141 that outputs monochromatic light.

[0159] In Embodiment 3, the display device has one spot lamp 4 and one real object 9. Alternatively, as shown in FIG. 25, the display device may have a plurality of sets of spot lamps 4, 151 and real objects 9, 152, and may switch the attention of the viewer 14 between the real objects 9 and 152 by switching which of the spot lamps 4 and 151 is lit. FIG. 25 is a cross-sectional view showing a modification of the display device having a plurality of sets of spot lamps 4, 151 and real objects 9, 152.
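For the multi-exhibit variant of FIG. 25, guiding the viewer's attention from one real object to the next reduces to lighting one spot lamp at a time. A minimal sketch under the same assumed lamp interface:

    def highlight_exhibit(lamp_units, index):
        """Light only the spot lamp belonging to exhibit `index` (e.g. lamps 4 and 151),
        so that the viewer's attention moves to the corresponding real object."""
        for i, lamp in enumerate(lamp_units):
            lamp.set_output(100.0 if i == index else 0.0)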
[0160] Each of the embodiments described above is an example of a preferred embodiment of the present invention, but the present invention is not limited to them, and various modifications and changes are possible.
[0161] For example, in each of the above embodiments, the entry restriction frame 1 of the display device is assembled as a cubic frame structure. Alternatively, the entry restriction frame 1 may be formed as a hexagonal prism or in another columnar shape. The display device may also be installed inside a pillar of a building or the like.
[0162] In each of the above embodiments, the transmissive screen 11 is arranged only between one pair of column members 5 of the entry restriction frame 1. Alternatively, transmissive screens may be arranged between every pair of column members 5 of the entry restriction frame 1. This allows several viewers 14 to operate their respective transmissive screens at the same time and to view the exhibit at their own pace.
Industrial Applicability
[0163] The display device according to the present invention can be used to exhibit actually existing works of art, crafts, merchandise, and the like.

Claims

[1] A display device comprising:
a screen provided between a real object to be exhibited and a viewing position, the screen being transparent and of a size such that a viewer at the viewing position can see the real object through it; and
a projector that projects onto the screen an image of the real object in substantially the same posture as the real object seen through the screen from the viewer's viewpoint, so that the image appears superimposed on the real object.
[2] The display device according to claim 1, further comprising:
viewing position detecting means for detecting the viewing position or the viewpoint position of the viewer; and
image changing means for changing the image of the real object formed on the screen in accordance with the position detected by the viewing position detecting means, so that the image of the real object overlaps the real object when viewed from that position.
[3] The display device according to claim 2, wherein the viewing position detecting means comprises:
a distance sensor that detects the distance from the screen to the viewer;
an imaging device that images a predetermined range on the viewer side of the screen; and
viewer viewpoint specifying means for specifying the position of the viewer's viewpoint based on the distance detected by the distance sensor and the image captured by the imaging device.
[4] The display device according to claim 3, wherein, when a plurality of viewers appear in the captured image, the viewer viewpoint specifying means specifies the viewpoint position of the viewer at the center of the image or of the viewer closest to the center of the screen.
[5] The display device according to claim 2, wherein the viewing position detecting means comprises:
contact position detecting means for detecting a position at which the screen is touched; and
viewer viewpoint specifying means for specifying the position of the viewer based on the contact position detected by the contact position detecting means and on the positional relationship between the screen and the viewer that is assumed when the viewer reaches out an arm to touch the screen.
[6] The display device according to claim 1, wherein the screen is a transmissive screen having a transmittance of 50% to 80%, the display device further comprising:
a light projecting member arranged on the real object side of the screen to illuminate the real object; and
light quantity control means for executing at least one of a first control that lowers the light quantity output by the light projecting member when the image of the real object is formed on the screen, and a second control that raises the light quantity output by the light projecting member when the image of the real object formed on the screen is erased.
[7] The display device according to claim 1, further comprising:
operation detecting means for detecting an operation performed by the viewer on the screen; and
image control means for rotating, enlarging, or reducing the image of the real object formed on the screen based on the operation detected by the operation detecting means and the display state of the image at that time.
[8] The display device according to claim 1, further comprising:
operation detecting means for detecting an operation performed by the viewer on the screen; and
image forming means for forming, on the screen, an image, video, or text information related to the real object based on the operation detected by the operation detecting means and the display state of the image at that time.
PCT/JP2005/010916 2004-06-15 2005-06-15 Exhibition system WO2005124450A1 (en)

Applications Claiming Priority (2)

Application Number Priority Date Filing Date Title
JP2004-177086 2004-06-15
JP2004177086A JP2006003414A (en) 2004-06-15 2004-06-15 Exhibiting device

Publications (1)

Publication Number Publication Date
WO2005124450A1 true WO2005124450A1 (en) 2005-12-29

Family

ID=35509856

Family Applications (1)

Application Number Title Priority Date Filing Date
PCT/JP2005/010916 WO2005124450A1 (en) 2004-06-15 2005-06-15 Exhibition system

Country Status (2)

Country Link
JP (1) JP2006003414A (en)
WO (1) WO2005124450A1 (en)


Families Citing this family (6)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
JP2011248041A (en) 2010-05-26 2011-12-08 Seiko Epson Corp Mounting device and projection type display apparatus
JP2015232634A (en) * 2014-06-10 2015-12-24 セイコーエプソン株式会社 Display device
JP2015232633A (en) * 2014-06-10 2015-12-24 セイコーエプソン株式会社 Display device
JP2016001211A (en) * 2014-06-11 2016-01-07 セイコーエプソン株式会社 Display device
JP6449120B2 (en) * 2015-08-31 2019-01-09 日本電信電話株式会社 Aerial image display device and aerial image display method
JP6726889B2 (en) * 2016-06-20 2020-07-22 パナソニックIpマネジメント株式会社 Video display system

Citations (7)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
JPH03221008A (en) * 1990-01-25 1991-09-30 Fujitsu Ltd Display device for display object
JPH0535192A (en) * 1991-07-25 1993-02-12 Sony Corp Display device
JPH0933856A (en) * 1995-07-24 1997-02-07 Denso Corp Display device
JPH11164291A (en) * 1997-09-26 1999-06-18 Denso Corp Video information display system
JP2003058092A (en) * 2001-08-10 2003-02-28 Masaaki Matsumura Window advertisement electronic display and method of using the same
JP2003173237A (en) * 2001-09-28 2003-06-20 Ricoh Co Ltd Information input-output system, program and storage medium
JP2004054065A (en) * 2002-07-23 2004-02-19 Saeilo Japan Inc Show window interactive display device


Cited By (4)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
JP2009540349A (en) * 2006-06-07 2009-11-19 コーニンクレッカ フィリップス エレクトロニクス エヌ ヴィ Optical feedback on the selection of physical objects
US9336700B2 (en) 2006-06-07 2016-05-10 Koninklijke Philips N.V. Light feedback on physical object selection
JP2014042655A (en) * 2012-08-27 2014-03-13 Adc Technology Inc Display apparatus
JP2017080516A (en) * 2017-01-17 2017-05-18 エイディシーテクノロジー株式会社 Display device

Also Published As

Publication number Publication date
JP2006003414A (en) 2006-01-05

Similar Documents

Publication Publication Date Title
ES2354985T3 (en) SYSTEM AND METHOD FOR OPERATION IN THE 3D VIRTUAL SPACE.
WO2005124450A1 (en) Exhibition system
KR101795644B1 (en) Projection capture system, programming and method
JP6059223B2 (en) Portable projection capture device
JP3968477B2 (en) Information input device and information input method
JP3092162B2 (en) Multiple image synthesis device
EP1530119A2 (en) Stereoscopic two-dimensional image display device and method
JP2000010194A (en) Picture display method and device
JP4843901B2 (en) Display device
JP2007318754A (en) Virtual environment experience display device
JP2004227332A (en) Information display method
KR101756948B1 (en) Apparatus of electronic bulletin board
Fisher et al. Augmenting reality with projected interactive displays
JP2000090285A (en) Video display device
JP2007200353A (en) Information processor and information processing method
JP2010033604A (en) Information input device and information input method
WO2018207490A1 (en) Contactless three-dimensional touch panel, contactless three-dimensional touch panel system, contactless three-dimensional touch panel control method, program and recording medium
Lee Projector-based location discovery and tracking
WO2023162690A1 (en) Floating video display device
JP2000039949A (en) Video display device
WO2017054114A1 (en) Display system and display method therefor
WO2004114108A1 (en) 3-dimensional image display method and device
JP2004258287A (en) Video display system
JP2002108561A (en) Interactive projection display picture system
Spassova Interactive ubiquitous displays based on steerable projection

Legal Events

Date Code Title Description
AK Designated states

Kind code of ref document: A1

Designated state(s): AE AG AL AM AT AU AZ BA BB BG BR BW BY BZ CA CH CN CO CR CU CZ DE DK DM DZ EC EE EG ES FI GB GD GE GH GM HR HU ID IL IN IS KE KG KM KP KR KZ LC LK LR LS LT LU LV MA MD MG MK MN MW MX MZ NA NG NI NO NZ OM PG PH PL PT RO RU SC SD SE SG SK SL SM SY TJ TM TN TR TT TZ UA UG US UZ VC VN YU ZA ZM ZW

AL Designated countries for regional patents

Kind code of ref document: A1

Designated state(s): BW GH GM KE LS MW MZ NA SD SL SZ TZ UG ZM ZW AM AZ BY KG KZ MD RU TJ TM AT BE BG CH CY CZ DE DK EE ES FI FR GB GR HU IE IS IT LT LU MC NL PL PT RO SE SI SK TR BF BJ CF CG CI CM GA GN GQ GW ML MR NE SN TD TG

121 Ep: the epo has been informed by wipo that ep was designated in this application
NENP Non-entry into the national phase

Ref country code: DE

WWW Wipo information: withdrawn in national office

Country of ref document: DE

122 Ep: pct application non-entry in european phase