WO2004056153A1 - Audio information support system - Google Patents
Audio information support system
- Publication number
- WO2004056153A1 (PCT/JP2003/016110)
- Authority
- WO
- WIPO (PCT)
- Prior art keywords
- image
- support system
- information support
- audio
- electromagnetic wave
- Prior art date
Classifications
-
- H—ELECTRICITY
- H04—ELECTRIC COMMUNICATION TECHNIQUE
- H04R—LOUDSPEAKERS, MICROPHONES, GRAMOPHONE PICK-UPS OR LIKE ACOUSTIC ELECTROMECHANICAL TRANSDUCERS; DEAF-AID SETS; PUBLIC ADDRESS SYSTEMS
- H04R1/00—Details of transducers, loudspeakers or microphones
- H04R1/20—Arrangements for obtaining desired frequency or directional characteristics
- H04R1/32—Arrangements for obtaining desired frequency or directional characteristics for obtaining desired directional characteristic only
- H04R1/40—Arrangements for obtaining desired frequency or directional characteristics for obtaining desired directional characteristic only by combining a number of identical transducers
-
- H—ELECTRICITY
- H04—ELECTRIC COMMUNICATION TECHNIQUE
- H04R—LOUDSPEAKERS, MICROPHONES, GRAMOPHONE PICK-UPS OR LIKE ACOUSTIC ELECTROMECHANICAL TRANSDUCERS; DEAF-AID SETS; PUBLIC ADDRESS SYSTEMS
- H04R3/00—Circuits for transducers, loudspeakers or microphones
- H04R3/12—Circuits for transducers, loudspeakers or microphones for distributing signals to two or more loudspeakers
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06F—ELECTRIC DIGITAL DATA PROCESSING
- G06F3/00—Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
- G06F3/16—Sound input; Sound output
-
- H—ELECTRICITY
- H04—ELECTRIC COMMUNICATION TECHNIQUE
- H04R—LOUDSPEAKERS, MICROPHONES, GRAMOPHONE PICK-UPS OR LIKE ACOUSTIC ELECTROMECHANICAL TRANSDUCERS; DEAF-AID SETS; PUBLIC ADDRESS SYSTEMS
- H04R1/00—Details of transducers, loudspeakers or microphones
- H04R1/20—Arrangements for obtaining desired frequency or directional characteristics
- H04R1/32—Arrangements for obtaining desired frequency or directional characteristics for obtaining desired directional characteristic only
- H04R1/40—Arrangements for obtaining desired frequency or directional characteristics for obtaining desired directional characteristic only by combining a number of identical transducers
- H04R1/403—Arrangements for obtaining desired frequency or directional characteristics for obtaining desired directional characteristic only by combining a number of identical transducers loud-speakers
Definitions
- the invention of this application relates to a voice information support system. More specifically, the invention of this application relates to a new voice information support system that can realize voice information support individually corresponding to each user who is viewing an image displayed on a screen or the like.
- Conventionally, there has been known an information system in which a screen is mounted on a table top and video is projected by a projector from a lower rear portion thereof (see Patent Document 1). With this information system, many users can watch the images simultaneously around the table.
- Patent Document 1: Japanese Patent Application Laid-Open No. 2000-175773
- However, each user can only watch the video projected on the table; no user can control the sound (voice, music, signal sounds, etc.; the same applies hereinafter) related to the video or listen to the audio information of interest whenever desired.
- A mode in which audio is output together with the video is also proposed in Patent Document 1, but the audio information is simply sent from speakers to all the users surrounding the table, so that all users always hear the same audio.
- According to a first aspect, as illustrated in the functional block diagram of FIG. 1, there is provided a voice information support system comprising: an image display device (1) for displaying an image; an audio output device (2) that outputs an electromagnetic wave modulated based on audio information toward one or more positions in the image displayed by the image display device (1); and an audio reproduction terminal (3) having conversion means (31) for receiving the electromagnetic wave at a position in the image and converting it into an electric signal, and audio reproduction means (32) for reproducing, as sound, the electric signal obtained by the conversion means (31).
- According to a second aspect, as illustrated in the functional block diagram of FIG. 2, there is provided a voice information support system comprising: an image display device (1) that displays an image; an audio output device (2) that outputs an electromagnetic wave modulated based on audio information toward one or more positions in the image displayed by the image display device (1); an audio reproduction terminal (3) having conversion means (31) that receives the electromagnetic wave at a position in the image and converts it into an electric signal, audio reproduction means (32) for reproducing the electric signal obtained by the conversion means (31), and ID transmission means (33) for transmitting an ID; and an ID detection device (4) for detecting the ID transmitted by the ID transmission means (33) of the audio reproduction terminal (3).
- According to a third aspect, as illustrated in the accompanying functional block diagram, there is provided a voice information support system comprising: an image display device (1) that displays an image; an audio output device (2) that outputs an electromagnetic wave modulated based on audio information toward one or more positions in the image displayed by the image display device (1); an audio reproduction terminal (3) having conversion means (31) that receives the electromagnetic wave at a position in the image and converts it into an electric signal, and audio reproduction means (32) for reproducing the electric signal obtained by the conversion means (31); and a position detection device (5) for detecting the position of the audio reproduction terminal (3).
- According to a fourth aspect, as illustrated in the accompanying functional block diagram, there is provided a voice information support system comprising: an image display device (1) that displays an image; an audio output device (2) that outputs an electromagnetic wave modulated based on audio information toward one or more positions in the image displayed by the image display device (1); an audio reproduction terminal (3) having conversion means (31) that receives the electromagnetic wave at a position in the image and converts it into an electric signal, audio reproduction means (32) for reproducing the electric signal obtained by the conversion means (31), and ID transmission means (33) for transmitting an ID; an ID detection device (4) for detecting the ID transmitted by the ID transmission means (33) of the audio reproduction terminal (3); and a position detection device (5) for detecting the position of the audio reproduction terminal (3).
- According to a fifth aspect, the image display device (1) includes screen means (11) for displaying an image and image projection means (12) for projecting an image onto the screen means (11); according to a sixth aspect, the screen means (11) has a flat, curved, or uneven image display surface.
- In further aspects, the image display surface of the screen means (11) is translucent; the image projection means (12) projects an image onto the screen means (11) from the image display surface side; or the image projection means (12) projects an image onto the screen means (11) from the side opposite to the image display surface.
- According to a tenth aspect, the image display device (1) is a cathode ray tube display. In further aspects, the image display device (1) is a flat panel display, the flat panel display being one of a liquid crystal display, a plasma display, an electroluminescence display, a light-emitting diode display, a fluorescent display tube display, and a field emission display.
- In another aspect, as exemplified in the accompanying functional block diagram, the audio output device (2) includes electromagnetic wave irradiation means (22) having an electromagnetic wave source that outputs the electromagnetic wave.
- the electromagnetic wave source is provided corresponding to each of a plurality of positions in the image.
- the audio information support system is characterized in that the electromagnetic wave source is capable of changing an irradiation direction toward a plurality of positions in the image.
- According to a seventeenth aspect, the electromagnetic wave source is a light source that outputs light as the electromagnetic wave, and according to an eighteenth aspect, the light source is a light-emitting diode or a laser.
- a nineteenth aspect of the present invention provides a voice information support system characterized by irradiating light from the light source to a position in the image through an optical cable.
- According to a twentieth aspect, the conversion means (31) of the audio reproduction terminal (3) is photoelectric conversion means that receives the light from the light source of the electromagnetic wave irradiation means and performs photoelectric conversion, and in a further aspect the photoelectric conversion means is a solar cell. In another aspect, the conversion means (31) of the audio reproduction terminal (3) can be worn on a part of the body of the terminal user and, in a twenty-third aspect, the part of the body is a hand or a foot. According to a twenty-fourth aspect, the conversion means (31) of the audio reproduction terminal (3) can be mounted on or incorporated in a pointing rod held by the terminal user. In a further aspect, the audio reproduction means (32) of the audio reproduction terminal (3) is an earphone, a headphone, or a speaker; according to a twenty-sixth aspect, the audio reproduction terminal (3) is a powerless terminal that does not require a separate drive power supply; and according to a twenty-seventh aspect, the ID transmission means (33) of the audio reproduction terminal (3) is an RFID tag (34), and the ID detection device (4) is a reader/writer (41) that performs ID authentication communication with the RFID tag (34).
- According to a further aspect, as exemplified in the accompanying functional block diagram, the ID transmission means (33) of the audio reproduction terminal (3) is an optical ID tag (35), and the ID detection device (4) is an infrared sensor (42) that receives the ID infrared light emitted by the optical ID tag (35) and outputs the ID data. According to a twenty-ninth aspect, as exemplified in the functional block diagram of FIG. 9, the optical ID tag (35) includes an infrared light source (35a) for the ID infrared light, ID storage means (35b) for storing ID data, and modulation means (35c) for modulating the ID infrared light in accordance with the ID data, and the infrared sensor (42) receives the ID infrared light modulated and transmitted by the optical ID tag (35) and outputs the ID data.
- In a further aspect, as exemplified in the functional block diagram of FIG. 10, the position detection device (5) includes an infrared light source (51) for position infrared light, infrared imaging means (52) for imaging the position infrared light, and position detection means (53) for detecting the position of the audio reproduction terminal (3), and the audio reproduction terminal (3) has reflecting means (36) for reflecting the position infrared light transmitted by the position detection device (5). In yet another aspect, the voice information support system comprises a touch panel provided on the image display surface of the image display device (1), and position detection means for detecting the position of the audio reproduction terminal (3) based on the position on the touch panel touched by the terminal user.
- FIG. 1 is a functional block diagram for explaining the invention of this application.
- FIG. 2 is a functional block diagram for explaining the invention of this application.
- FIG. 3 is a functional block diagram for explaining the invention of this application.
- FIG. 4 is a functional block diagram for explaining the invention of this application.
- FIG. 5 is a functional block diagram for explaining the invention of this application.
- FIG. 6 is a functional block diagram for explaining the invention of this application.
- FIG. 7 is a functional block diagram for explaining the invention of this application.
- FIG. 8 is a functional block diagram for explaining the invention of this application.
- FIG. 9 is a functional block diagram for explaining the invention of this application.
- FIG. 10 is a functional block diagram for explaining the invention of this application.
- FIG. 11 is a schematic diagram showing an embodiment of the invention of this application.
- FIG. 12 is a schematic diagram showing another embodiment of the invention of this application.
- FIG. 13 is a schematic diagram showing an embodiment of the invention of the present application when performing ID authentication.
- FIG. 14 is a schematic diagram showing another embodiment of the invention of this application when performing ID authentication.
- FIG. 15 is a schematic diagram showing an embodiment of the invention of this application when performing location authentication.
- FIG. 16 is a schematic diagram showing an embodiment of the invention of this application when performing ID authentication and location authentication.
- FIG. 17 is a diagram for explaining the embodiment in FIG.
- FIG. 18 is a schematic diagram showing another embodiment of the invention of this application when performing ID authentication and location authentication.
- FIG. 19 is a schematic diagram showing still another embodiment of the invention of this application that performs location authentication.
- FIG. 20 is a schematic diagram showing an example of a case where a spherical screen body is used.
- FIG. 21 is a schematic diagram showing an example when a CRT display is used.
- FIG. 22 is a schematic diagram showing an example when a liquid crystal display is used.
- FIG. 23 is a schematic diagram showing an example of a case where a plasma display is used.
- FIG. 24 is a schematic diagram showing an example of a case where an electroluminescent display is used.
- FIGS. 25(a) and 25(b) are schematic diagrams showing examples in the case of using a flat light-emitting diode display and a spherical light-emitting diode display, respectively.
- FIG. 26 is a schematic diagram showing an example of a case where an infrared light source is arranged above a top screen unit.
- FIG. 27 is a schematic diagram showing another example in which the infrared light source is disposed above the top screen unit.
- FIG. 28 is a schematic diagram illustrating an example of the pointing rod.
- FIG. 29 is a schematic diagram illustrating an example of the image display surface.
- FIG. 30 is a schematic diagram showing another embodiment of the location authentication.
- FIG. 31 is a schematic diagram showing an example of a three-dimensional screen body.
BEST MODE FOR CARRYING OUT THE INVENTION
- FIG. 11 and FIG. 12 show an embodiment of the invention of this application.
- In this embodiment, the image display device (1) comprises a table-like body (100) corresponding to the screen means (11) and a projector device (102) corresponding to the image projection means (12).
- The table-like body (100) has a flat, translucent top screen portion (101), and the projector device (102) is installed below the top screen portion (101) so that images are projected onto it from the back side opposite to its image display surface.
- In one configuration, a single projector device (102) projects an image onto the entire surface of the top screen portion (101); in another, two projector devices (102) are arranged, the top screen portion (101) is divided into two screens, and an image is projected onto each screen.
- Each projector device (102) comprises a projector body (103) and an optical system (104), such as a mirror, that redirects the projected image upward and guides it to the back of the top screen portion (101).
- Below the top screen portion (101) are arranged an infrared light source array (200), composed of a plurality of infrared light sources (201) corresponding to the electromagnetic wave irradiation means (22), and a voice control unit (202) having a function corresponding to the modulation means (21).
- each infrared light source (201) of the infrared light source array (200) is arranged in an array corresponding to each of a plurality of positions in the image displayed on the top screen (101).
- the infrared light modulated based on the audio information by the audio control unit (202) is emitted from the back of the top screen unit (101) to each position in the image.
- For example, 6 x 16 spot positions are set for the two images divided and displayed on the top screen portion (101), and 6 x 16 infrared light sources (201), one corresponding to each of these spot positions, are arranged.
- the projection area is kept free so as not to hinder the projector device (102) (projector body (103) and optical system (104)) from projecting an image on the top screen (101).
- The infrared light emitted from each infrared light source (201) is modulated by the voice control unit (202) based on the audio information to be output. More specifically, for example, the intensity of the emitted infrared light may be modulated by controlling the driving voltage of the infrared light source (201) according to the voltage level of the audio signal.
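As a rough illustration of this intensity-modulation scheme (a minimal sketch, not taken from the patent; the function names, bias level, and sample rate are assumptions), the drive level of an infrared source can be offset and scaled by the audio waveform so that the emitted intensity never goes negative:

```python
import numpy as np

def audio_to_drive_levels(audio, bias=0.6, depth=0.35):
    """Map an audio waveform in [-1, 1] to LED drive levels in [0, 1].

    The bias keeps the infrared source always on; the audio signal rides
    on top of it as an intensity (amplitude) modulation.
    """
    audio = np.clip(np.asarray(audio, dtype=float), -1.0, 1.0)
    return np.clip(bias + depth * audio, 0.0, 1.0)

# Example: a 1 kHz tone sampled at 16 kHz, converted to the drive levels
# that a D/A stage feeding an infrared LED could output.
t = np.arange(0, 0.01, 1 / 16000)
levels = audio_to_drive_levels(np.sin(2 * np.pi * 1000 * t))
```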
- the sound modulated infrared light of the infrared light source array (200) is radiated to a plurality of spot positions in the image.
- The audio reproduction terminal (3) has the conversion means (31) and the audio reproduction means (32) (see FIG. 1), so that it can receive the voice-modulated infrared light, photoelectrically convert it, and reproduce the resulting electric signal as audio. In this embodiment, a finger-mounted terminal section (301) provided with a solar cell (302) corresponding to the conversion means (31) is attached to a finger of the terminal user (600), and an earphone (300), connected to the solar cell (302) via a cable (not shown) or the like, is worn on the terminal user's ear. The solar cell (302) receives the voice-modulated infrared light irradiated at a spot position and performs photoelectric conversion.
- the electric signal obtained by photoelectric conversion is sent to the earphone (300) and output as it is, and the terminal user (600) hears the sound.
- This sound is, of course, the same as the original sound information.
- As a result, a plurality of terminal users (600) can not only view the images projected on the top screen portion (101) of the table-like body (100), but each terminal user (600) can also listen, at any time, to the audio information of interest related to the image. In other words, voice information support is realized for each individual terminal user (600).
- In order to further promote the individualization (which can also be called personalization) of voice information support for each user, this system can also be configured to provide voice information support based on the ID and the position of the terminal user or of the terminal itself.
- FIG. 13 shows an embodiment of voice information support based on ID.
- In this case, an RFID tag (303) corresponding to the ID transmission means (33) is provided on the finger-mounted terminal section (301) integrally with the solar cell (302) (see the enlarged conceptual diagram in the figure).
- On the table-like body (100), reader/writers (401) are arranged at the four corners of the top plate where they do not obstruct the image displayed on the top screen portion (101), and the output of each reader/writer (401) is connected to an ID authentication unit (400).
- the reader / writer (401) and the ID authentication unit (400) correspond to the ID detection device (4) (see FIGS. 2 and 7).
- The solar cell (302) is oriented with its light receiving surface facing downward so that, on the top screen portion (101), it can receive the voice-modulated infrared light emitted from the back of the top screen portion (101) by the infrared light source array (200), and the RFID tag (303) can perform ID authentication communication with the reader/writer (401).
- the terminal user (600) may turn the finger-mounted terminal (301) upside down on the reader / writer (401).
- When the terminal user (600), wearing the finger-mounted terminal section (301) on a finger, brings the RFID tag (303) close to the reader/writer (401), the necessary data communication is performed between the RFID tag (303) and the reader/writer (401), and the ID data read by the reader/writer (401) is sent to the ID authentication unit (400). Based on the ID data authenticated by the ID authentication unit (400), it is then possible to determine automatically which terminal user (600) is currently viewing the image, and audio information suited to that terminal user (600) can be transmitted via the voice control unit (202).
- For example, if a voice information database storing audio information corresponding to predetermined ID data is prepared, together with voice information search and selection means that searches that database for, and selects, the audio information corresponding to the ID data authenticated by the ID authentication unit (400), individual support using audio information can be realized for each terminal user (600) having such ID data.
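A minimal sketch of such search and selection means follows (the table contents, ID strings, and function name are illustrative assumptions, not part of the patent):

```python
# Hypothetical mapping from authenticated ID data to an audio programme.
AUDIO_DATABASE = {
    "ID-0001": "commentary_japanese.wav",
    "ID-0002": "commentary_english.wav",
}

def select_audio(authenticated_id, database=AUDIO_DATABASE, default=None):
    """Return the audio item registered for this ID, or a default
    (e.g. a generic track, or None to provide no individual support)."""
    return database.get(authenticated_id, default)
```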
- FIG. 14 shows another embodiment of voice information support based on ID.
- In this case, in place of the RFID tag (303), the finger-mounted terminal section (301) has an optical ID tag (304) integrated with the solar cell (302) (see the enlarged conceptual diagram in the figure).
- As shown in FIG. 9, the optical ID tag (304) includes an infrared light source (35a), such as an infrared LED beacon, that emits infrared light for ID, ID storage means (35b), such as an ID memory, for storing ID data, and modulation means (35c), such as a modulation circuit, for modulating the ID infrared light in accordance with the ID data.
- a plurality of infrared sensors (402) that receive infrared light for ID from the optical ID tag (304) and output ID data are arranged at appropriate positions.
- the output of each infrared sensor (402) is connected to an ID authentication unit (400).
- The optical ID tag (304) modulates and emits the ID infrared light in accordance with the ID data stored in it, and the infrared sensor (402) receives the light, demodulates it, and extracts the ID bit string.
- The ID authentication unit (400), on receiving the ID bit string, authenticates the ID data, and audio information suited only to the terminal user (600) having that ID data is transmitted via the voice control unit (202).
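One simple way to realize such an ID bit string over the infrared link is on-off keying, sketched below; the bit period, string length, and framing are assumptions for illustration, since the patent does not prescribe a particular modulation format:

```python
def encode_id_bits(id_bits, samples_per_bit=8):
    """Expand an ID bit string into an on/off sample stream that could
    drive the infrared LED beacon of an optical ID tag."""
    return [int(b) for b in id_bits for _ in range(samples_per_bit)]

def decode_id_bits(samples, samples_per_bit=8, n_bits=16):
    """Recover the ID bit string from received light samples by majority
    vote within each bit period (the infrared sensor side)."""
    bits = []
    for i in range(n_bits):
        chunk = samples[i * samples_per_bit:(i + 1) * samples_per_bit]
        bits.append("1" if 2 * sum(chunk) >= len(chunk) else "0")
    return "".join(bits)

# Round trip check.
assert decode_id_bits(encode_id_bits("1010110011110000")) == "1010110011110000"
```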
- FIG. 15 shows an embodiment of voice information support based on position.
- the finger-mounted terminal section (301) is configured such that a retroreflective sheet (305) corresponding to the reflecting means (36) (see FIG. 10) is disposed integrally with the solar cell (302). (See the enlarged conceptual diagram in the figure).
- a device integrally including an infrared LED (502) and an infrared camera (501) is disposed at an appropriate position.
- The infrared LED (502) corresponds to the infrared light source (51) (see FIG. 10) that emits the position infrared light, and the infrared camera (501) corresponds to the infrared imaging means (52) (see FIG. 10) that images the position infrared light reflected back by the reflection sheet (305) of the finger-mounted terminal section (301). The output of the infrared camera (501) is connected to a position authentication unit (500) corresponding to the position detection means (53) (see FIG. 10), which detects the position of the audio reproduction terminal (3) based on the position infrared light image captured by the infrared camera (501).
- the position authentication unit (500) may be incorporated in an integrated housing of the infrared camera (501) and the infrared LED (502), or may be provided as an external unit.
- The infrared LED (502) constantly emits position infrared light over almost the entire surface of the top screen portion (101). When the finger-mounted terminal section (301) enters this emission area, its retroreflective sheet (305) receives the position infrared light and reflects it back in almost the same direction as the incident direction, so that the reflected infrared light returns to the infrared camera (501).
- The infrared camera (501) captures this position infrared light as a bright point through a visible-light cut filter, and the position authentication unit (500), on receiving the image data, detects the camera-relative position of the bright point from the frame coordinates by image processing. The detected position is authenticated as the position of the finger-mounted terminal section (301) on the top screen portion (101), so that it can be determined automatically at which position in the current image the finger-mounted terminal section (301) is located.
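The bright-point detection itself can be as simple as thresholding the camera frame (already filtered to infrared) and taking the centroid of the bright pixels; a minimal sketch, where the threshold value and the use of NumPy are assumptions:

```python
import numpy as np

def detect_bright_point(frame, threshold=200):
    """Return the (row, col) centroid of the bright point in a grayscale
    infrared frame, or None if no pixel exceeds the threshold."""
    mask = np.asarray(frame) >= threshold
    if not mask.any():
        return None
    rows, cols = np.nonzero(mask)
    return float(rows.mean()), float(cols.mean())
```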
- The coordinates of the positions on the top screen portion (101) that the infrared light source array (200) irradiates with voice-modulated infrared light are stored in advance in the voice control unit (202). By selecting the irradiation position that matches the authenticated position coordinates of the finger-mounted terminal section (301) and irradiating that position with the voice-modulated infrared light, audio information can be transmitted to the finger-mounted terminal section (301) of the terminal user (600), that is, to the position in the image pointed to by the user's finger.
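Selecting the irradiation position can then amount to a nearest-neighbour lookup against the stored spot coordinates; a sketch under the assumption that the coordinates are kept as a simple list of (x, y) pairs:

```python
def nearest_spot(terminal_xy, spot_coords, max_distance=None):
    """Return the index of the stored irradiation spot closest to the
    authenticated terminal position, or None if it lies too far away."""
    best_index, best_d2 = None, float("inf")
    for i, (sx, sy) in enumerate(spot_coords):
        d2 = (sx - terminal_xy[0]) ** 2 + (sy - terminal_xy[1]) ** 2
        if d2 < best_d2:
            best_index, best_d2 = i, d2
    if max_distance is not None and best_d2 > max_distance ** 2:
        return None
    return best_index
```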
- The retroreflective sheet (305) can be composed of, for example, a plurality of corner-cube retroreflectors arranged in a sheet.
- The retroreflective sheet (305) is arranged on the side opposite to the solar cell (302), which receives the voice-modulated infrared light emitted from below by the infrared light source array (200); in other words, the light receiving/reflecting surface of the retroreflective sheet (305) faces upward.
- this voice information support system even more excellent individual voice information support can be realized by combining the voice information support based on the ID and the voice information support based on the position.
- FIG. 16 shows an embodiment in which the voice information support based on both the ID and the position is performed by combining the embodiments of FIG. 13 and FIG.
- FIG. 17 is a conceptual diagram for further describing the present embodiment, and will be appropriately referred to in the following description.
- Position infrared light is constantly emitted from the infrared LED (502), retroreflected by the reflection sheet (305) of the finger-mounted terminal section (301) on the top screen portion (101), and imaged by the infrared camera (501). The position authentication unit (500) performs position authentication based on this imaging data; in other words, as long as the retroreflected position infrared light is imaged, the finger-mounted terminal section (301) can be tracked (see FIG. 17).
- ID authentication is performed as described above.
- If the position coordinates of each reader/writer (401) are stored in advance, then whenever the position coordinates of the finger-mounted terminal section (301) detected and authenticated by the infrared camera (501) and the position authentication unit (500) coincide with the position coordinates of a reader/writer (401), it can be determined automatically which reader/writer (401) has performed the ID authentication, and the tracked finger-mounted terminal section (301) can thus be associated with the ID data (see FIG. 17).
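A sketch of this ID-to-track association by coordinate matching (the reader/writer positions, the distance tolerance, and all names are assumptions for illustration):

```python
def reader_at_position(track_xy, reader_positions, tolerance=0.05):
    """When an ID is read, determine at which reader/writer the tracked
    terminal currently is; the authenticated ID can then be bound to
    that track.

    reader_positions: dict mapping a reader label to its (x, y)
    coordinates on the screen.  Returns the label, or None if the track
    is not at any reader.
    """
    for label, (rx, ry) in reader_positions.items():
        if (rx - track_xy[0]) ** 2 + (ry - track_xy[1]) ** 2 <= tolerance ** 2:
            return label
    return None
```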
- In this way, audio information suited only to the terminal user (600) having the ID data can be transmitted to the position in the image in which that user is interested, and this individual voice information support can be performed continuously while the finger-mounted terminal section (301) is being tracked.
- FIG. 18 shows an embodiment in which the voice information support based on both the ID and the position is performed by combining the embodiments of FIGS. 14 and 15 described above.
- the finger-mounted terminal section (301) intermittently emits infrared light for ID from the optical ID tag (304) at a preset time interval.
- This ID infrared light reaches two different sensors via separate paths: the infrared sensor (402) and the infrared camera (501).
- the infrared sensor (402) demodulates the infrared light for ID to extract an ID bit string, and the ID authentication unit (400) performs ID authentication.
- the infrared camera (501) captures the ID infrared light as a bright spot through a visible light cut filter, and the position authentication unit (500) performs position authentication.
- Since the ID infrared light is emitted intermittently, its output timing can be recognized from the changes in the bright spot; that is, the output of the infrared sensor (402) can be synchronized with the camera image of the infrared camera (501).
- Meanwhile, the infrared LED (502) constantly emits position infrared light, which is retroreflected by the reflection sheet (305) of the finger-mounted terminal section (301) on the top screen portion (101) back to the infrared camera (501), and position authentication is performed by the position authentication unit (500) based on the resulting imaging data.
- Unlike the intermittent ID infrared light, the bright spot of the position infrared light, that is, of the reflection sheet (305), is unlikely to be lost, so the position of the reflection sheet (305), and hence of the finger-mounted terminal section (301), can always be tracked.
- In this way, the ID authenticated by the infrared sensor (402) and the ID authentication unit (400) can be associated with the position authenticated by the infrared camera (501) and the position authentication unit (500).
- Audio information corresponding to the authenticated ID data is then searched for and selected, and infrared light modulated based on that information is irradiated onto the position in the image that matches the authenticated position coordinates of the finger-mounted terminal section (301).
- As a result, audio information suited only to the terminal user (600) having the ID data can be transmitted to the position in the image in which that user is interested, and this individual voice information support can always be performed accurately while the finger-mounted terminal section (301) is being tracked.
- FIG. 19 shows an embodiment in which voice information support is performed based on location authentication using the touch panel.
- The touch panel (503) is provided on the top screen portion (101) of the table-like body (100). The touch panel (503) is transparent, so that the terminal user (600) can see the image displayed on the top screen portion (101) through it. When the terminal user (600) touches the touch panel (503), the touched position is detected as the position of the finger-mounted terminal section (301).
- In the above embodiments, a table-like body (100) having a planar top screen portion (101) is used; however, a curved screen portion can also be used.
- a form in which an image is projected from a back surface onto a screen body such as a spherical surface or a hemispherical surface can be considered.
- FIG. 20 shows an example of a translucent spherical screen body (700) and a projector device (701).
- A plurality of projector devices (701) are arranged inside the 360-degree spherical screen body (700), and each projector device (701) projects an image onto the spherical area assigned to it.
- In this case as well, an appropriate number of infrared light sources (201) may be arranged at appropriate positions so that each spot position in the image can be irradiated with the voice-modulated infrared light.
- FIG. 21 shows an example of a CRT display (801) and an infrared light source (800).
- An appropriate number of infrared light sources (800) may be arranged at appropriate positions outside the cathode ray tube (803) so that the voice-modulated infrared light can pass through the phosphor screen (802) of the cathode ray tube display (801) and irradiate appropriate spot positions in the image.
- Since the cathode ray tube (803) is usually provided with an outer coating and an inner coating, it is necessary to ensure that the infrared light can sufficiently pass through both coatings, for example by cutting away the coatings at the portions through which the infrared light is to be transmitted or by using a coating material that transmits infrared light.
- Of course, the present invention is not limited to the example shown in FIG. 21.
- FIG. 22 shows an example of a transmissive liquid crystal display (805) and an infrared light source (804).
- In the figure, (806) is a backlight light source, (807) is a polarizing filter, (808) is a glass substrate, (809) is a transparent electrode, (810) is a liquid crystal layer, (811) is a color filter, (812) is a glass substrate, (813) is a polarizing filter, and (814) is the display surface.
- An appropriate number of infrared light sources (804) may be arranged at appropriate positions so that the voice-modulated infrared light can pass through these layers and irradiate appropriate spot positions on the display surface (814). A material that transmits infrared light may be used at the relevant portions of each layer. Of course, the present invention is not limited to the example shown in FIG. 22.
- FIG. 23 shows an example of a plasma display (816) and an infrared light source (815).
- In the figure, (817) is a rear glass substrate, (818) is an address electrode (also referred to as a data electrode), (819) is a phosphor, (820) is a partition, (821) is a display electrode (scanning electrode), (822) is a dielectric layer, and (823) is a front glass substrate.
- An appropriate number of infrared light sources (815) may be arranged at appropriate positions so that the voice-modulated infrared light can pass through these layers and irradiate appropriate spot positions on the glass substrate (823) on which the image is displayed. A material that transmits infrared light may be used at the relevant portions of each layer. Of course, the present invention is not limited to the example shown in FIG. 23.
- FIG. 24 shows an example of a transparent electroluminescent display (825) and an infrared light source (824).
- In the figure, (826) is a glass substrate, (827) is a transparent electrode, (828) is an insulating layer, (829) is a light-emitting layer, (830) is an insulating layer, and (831) is a glass substrate.
- An appropriate number of infrared light sources (824) may be arranged at appropriate positions so that the voice-modulated infrared light can pass through these layers and irradiate appropriate spot positions on the glass substrate (831). A material that transmits infrared light may be used at the relevant portions of each layer. Of course, the present invention is not limited to the example shown in FIG. 24.
- FIGS. 25(a) and 25(b) show a flat light-emitting diode display and a spherical light-emitting diode display (833), each composed of multiple light-emitting diode panels (834), together with infrared light sources (832). The light-emitting diode panel (834) is a panel body having a plurality of small LEDs (835) arranged in an array, and a plurality of such panels are connected to form the flat or spherical light-emitting diode display (833). The infrared light sources (832) may be installed in each light-emitting diode panel (834) itself together with the other small LEDs, or in the gaps between the light-emitting diode panels (834); it is sufficient if the appropriate spot positions can be irradiated with the voice-modulated infrared light.
- Of course, the present invention is not limited to the example shown in FIG. 25.
- In any case, infrared light sources may be provided at appropriate positions so that the voice-modulated infrared light can be applied to appropriate spot positions on the image display surface. In the above examples, a plurality of infrared light sources (800), (804), (815), and (824) are arranged separately, but a surface-emitting light source may instead be provided at an appropriate position.
- In the above examples, the infrared light sources are provided in the same number as the positions in the image to be irradiated. However, if the irradiation direction of each infrared light source can be controlled so as to be directed to each position in the image, the number of light sources need not equal the number of irradiation positions; it may be smaller, or even a single light source may be used.
- In the above examples, the voice-modulated infrared light is emitted from the back side, that is, the side opposite to the image display surface; however, it may also be emitted from the image display surface side.
- FIG. 26 and FIG. 27 each show an example of this case, and an infrared light source (201) is arranged above the top plate screen (101).
- FIG. 26 shows a case in which the number of light sources equals the number of irradiation positions, while FIG. 27 shows a case in which each infrared light source (201) can control its irradiation direction by pan-tilt drive, as described above, so that the number of light sources is smaller than the number of positions.
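When a steerable source serves several positions in this way, the drive reduces to converting a target spot into pan and tilt angles; a purely geometric sketch, where the coordinate convention and the mounting height are assumptions:

```python
import math

def pan_tilt_for_spot(spot_xy, source_xyz):
    """Compute pan and tilt angles (in degrees) that aim a source mounted
    at source_xyz toward a spot on the screen plane z = 0."""
    dx = spot_xy[0] - source_xyz[0]
    dy = spot_xy[1] - source_xyz[1]
    dz = -source_xyz[2]                      # the screen lies below the source
    pan = math.degrees(math.atan2(dy, dx))
    tilt = math.degrees(math.atan2(dz, math.hypot(dx, dy)))
    return pan, tilt
```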
- each infrared light source (201) itself can be a collection of multiple infrared LEDs .
- As the electromagnetic wave irradiation means (22) of the audio output device (2), various light sources that output other light, such as visible light, may be used in addition to the infrared light sources (201). Not only LEDs but also lasers can be used as the light source, and a mode in which the light from these sources is guided to each spot position in the image through an optical cable such as an optical fiber is also conceivable.
- The finger-mounted terminal section (301), which includes the solar cell (302), the RFID tag (303), the optical ID tag (304), and the reflection sheet (305), is worn on a finger of the terminal user (600), but it may instead be a body-worn terminal worn on another part of the body, such as a hand or a foot. Anything that can touch or approach a spot position in the displayed image can be used; therefore, not only a terminal attachable to a part of the body as described above, but also one attachable to a pointing rod (900) held by the terminal user (600), as shown in FIG. 28, is conceivable.
- A rod-mounted terminal section (901) is detachably attached to the tip of an extendable pointing rod (900) and is connected, through the inside of the pointing rod (900) (the wiring may also pass outside), via a conducting wire (902), to an audio reproduction terminal section (903) such as an earphone, headphone, or speaker, so that the electric signal obtained from a solar cell (not shown) is reproduced as audio.
- earphones or headphones as the sound reproduction terminal unit (903) can be worn on the ears of the terminal user (600), and speakers can be carried by the terminal user (600).
- This voice information support system can also be applied as a voice guide system for the visually impaired.
- In this case, for example, a small hole (1000) may be provided at each irradiation position on the image display surface, as shown in FIG. 29. A visually impaired person can then find the irradiation position by touching the hole (1000) and receive voice information support appropriately.
- If a bit string indicating that the user is visually impaired is embedded in the ID data, this can be recognized at the time of ID authentication, and audio information suited to the visually impaired can be provided accordingly.
- It is also possible to navigate the user from the irradiation position or hole (1000) currently indicated by the finger or pointing rod toward the next irradiation position or hole (1000), for example by giving voice instructions such as "The next hole is slightly to the right" or "There is a hole a little ahead." Of course, this navigation can also be applied to sighted users.
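A sketch of how such a spoken hint could be derived from the current and next hole coordinates (the 8-way quantisation and the phrasing are assumptions for illustration):

```python
import math

def direction_hint(current_xy, next_xy):
    """Return a rough spoken direction from the current hole to the next
    one, quantised to eight directions (x to the right, y straight ahead)."""
    angle = math.degrees(math.atan2(next_xy[1] - current_xy[1],
                                    next_xy[0] - current_xy[0]))
    names = ["to the right", "to the upper right", "straight ahead",
             "to the upper left", "to the left", "to the lower left",
             "straight back", "to the lower right"]
    return "The next hole is {}.".format(names[int(((angle + 22.5) % 360) // 45)])

print(direction_hint((0.0, 0.0), (0.3, 0.3)))  # The next hole is to the upper right.
```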
- In the embodiment of FIG. 30, the infrared camera (501) and the infrared LED (502) for position authentication are disposed below the top screen portion (101), and the reflection sheet (305) is provided on the same side as the solar cell (302) so as to receive and retroreflect the position infrared light from below (see the enlarged view in the figure).
- In the above embodiments, what corresponds to the screen means (11) of the image display device (1) (the table-like body (100) having the top screen portion (101), or the spherical screen body (700)) has a flat or curved image display surface without irregularities; however, it is also possible to use screen means (11) having an uneven image display surface formed by combining flat and curved surfaces. FIG. 31 shows an example of this case: a three-dimensional map is displayed by projecting an image from a projector device (1101) onto a three-dimensional screen body (1100) having an uneven image display surface, and voice-modulated infrared light is irradiated from an infrared light source (1102) onto appropriate positions on it.
- The image display device (1) is not limited to one comprising the various screen means (11) and image projection means (12) of the above embodiments; any device that displays an image can be used. For example, a device in which an image is painted or printed on the screen itself, and which can transmit the electromagnetic wave typified by the infrared light, can also be used in the audio information support system of the invention of this application.
Industrial Applicability
- As described above in detail, according to the invention of this application, a user can easily hear the audio information associated with a position in a displayed image simply by pointing to or touching that position, and a completely new voice information support system is provided that realizes voice information support individually for each user viewing the image.
Landscapes
- Engineering & Computer Science (AREA)
- Health & Medical Sciences (AREA)
- Otolaryngology (AREA)
- Physics & Mathematics (AREA)
- Acoustics & Sound (AREA)
- Signal Processing (AREA)
- General Health & Medical Sciences (AREA)
- Theoretical Computer Science (AREA)
- Human Computer Interaction (AREA)
- General Engineering & Computer Science (AREA)
- General Physics & Mathematics (AREA)
- Audiology, Speech & Language Pathology (AREA)
- Optical Communication System (AREA)
- User Interface Of Digital Computer (AREA)
- Circuit For Audible Band Transducer (AREA)
- Headphones And Earphones (AREA)
- Controls And Circuits For Display Device (AREA)
- Devices For Indicating Variable Information By Combining Individual Elements (AREA)
Abstract
Description
Claims
Priority Applications (3)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
EP03780790A EP1583392A4 (en) | 2002-12-16 | 2003-12-16 | AUDIO INFORMATION SUPPORT SYSTEM |
AU2003289362A AU2003289362A1 (en) | 2002-12-16 | 2003-12-16 | Audio information support system |
US10/539,350 US8073155B2 (en) | 2002-12-16 | 2003-12-16 | System for reproducing audio information corresponding to a position within a displayed image |
Applications Claiming Priority (2)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
JP2002-364471 | 2002-12-16 | ||
JP2002364471A JP4228069B2 (ja) | 2002-12-16 | 2002-12-16 | 音声情報支援システム |
Publications (1)
Publication Number | Publication Date |
---|---|
WO2004056153A1 true WO2004056153A1 (ja) | 2004-07-01 |
Family
ID=32588239
Family Applications (1)
Application Number | Title | Priority Date | Filing Date |
---|---|---|---|
PCT/JP2003/016110 WO2004056153A1 (ja) | 2002-12-16 | 2003-12-16 | 音声情報支援システム |
Country Status (7)
Country | Link |
---|---|
US (1) | US8073155B2 (ja) |
EP (1) | EP1583392A4 (ja) |
JP (1) | JP4228069B2 (ja) |
KR (1) | KR20060028757A (ja) |
CN (1) | CN1754404A (ja) |
AU (1) | AU2003289362A1 (ja) |
WO (1) | WO2004056153A1 (ja) |
Families Citing this family (9)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
- JP2007005927A (ja) * | 2005-06-21 | 2007-01-11 | National Institute Of Information & Communication Technology | Audio information providing system and audio information providing method |
- JP2007005926A (ja) * | 2005-06-21 | 2007-01-11 | National Institute Of Information & Communication Technology | Audio information providing system and audio information providing method |
US9524047B2 (en) * | 2009-01-27 | 2016-12-20 | Disney Enterprises, Inc. | Multi-touch detection system using a touch pane and light receiver |
- JP5448611B2 (ja) | 2009-07-02 | 2014-03-19 | Canon Inc. | Display control device and control method |
- KR20110080894A (ko) * | 2010-01-07 | 2011-07-13 | Samsung Electronics Co., Ltd. | Multi-touch input processing method and apparatus |
- KR102052372B1 (ko) * | 2013-01-11 | 2019-12-05 | LG Electronics Inc. | Data transmission/reception device using earphone and control method thereof |
MX2015010932A (es) * | 2013-02-22 | 2016-06-10 | Ashley Diana Black Internat Holdings Llc | Dispositivo para la buena condicion del tejido de fascia. |
EP3179744B1 (en) * | 2015-12-08 | 2018-01-31 | Axis AB | Method, device and system for controlling a sound image in an audio zone |
KR102448771B1 (ko) | 2021-11-29 | 2022-09-29 | (주) 케이앤지앰테크 | 조형용 풍력발전시스템 |
Citations (4)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
- JPH06202648A (ja) * | 1992-12-25 | 1994-07-22 | Casio Comput Co Ltd | Input device and electronic musical instrument equipped with the input device |
- JPH08263006A (ja) * | 1995-03-23 | 1996-10-11 | Unitec Kk | Voice guide device |
- JPH10222108A (ja) * | 1997-02-10 | 1998-08-21 | Syst Sekkei:Kk | Radio wave detection warning device |
- JP2002116858A (ja) * | 2000-10-06 | 2002-04-19 | Sony Corp | Electronic POP advertising device |
Family Cites Families (9)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
US3245157A (en) * | 1963-10-04 | 1966-04-12 | Westinghouse Electric Corp | Audio visual teaching system |
US3273260A (en) * | 1964-10-06 | 1966-09-20 | Tutortape Lab Inc | Audio-visual communication systems and methods |
US4336018A (en) * | 1979-12-19 | 1982-06-22 | The United States Of America As Represented By The Secretary Of The Navy | Electro-optic infantry weapons trainer |
IL81146A (en) * | 1986-01-26 | 1990-04-29 | Avish Jacob Weiner | Sound-producing amusement or educational devices |
- JPH09214430A (ja) * | 1996-01-31 | 1997-08-15 | Sony Corp | Infrared transmission device |
US6445369B1 (en) * | 1998-02-20 | 2002-09-03 | The University Of Hong Kong | Light emitting diode dot matrix display system with audio output |
GB2361575B (en) * | 1999-03-16 | 2002-01-30 | Innomind Internat Ltd | Display, and device having a display |
EP1906294B1 (en) * | 1999-05-03 | 2018-07-18 | Symbol Technologies, LLC | Wearable communication system |
US7099568B2 (en) | 2000-03-21 | 2006-08-29 | Sony Corporation | Information playback apparatus and electronic pop advertising apparatus |
-
2002
- 2002-12-16 JP JP2002364471A patent/JP4228069B2/ja not_active Expired - Lifetime
-
2003
- 2003-12-16 KR KR1020057011156A patent/KR20060028757A/ko not_active Application Discontinuation
- 2003-12-16 AU AU2003289362A patent/AU2003289362A1/en not_active Abandoned
- 2003-12-16 WO PCT/JP2003/016110 patent/WO2004056153A1/ja active Application Filing
- 2003-12-16 EP EP03780790A patent/EP1583392A4/en not_active Withdrawn
- 2003-12-16 US US10/539,350 patent/US8073155B2/en not_active Expired - Fee Related
- 2003-12-16 CN CNA200380106385XA patent/CN1754404A/zh active Pending
Patent Citations (4)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
- JPH06202648A (ja) * | 1992-12-25 | 1994-07-22 | Casio Comput Co Ltd | Input device and electronic musical instrument equipped with the input device |
- JPH08263006A (ja) * | 1995-03-23 | 1996-10-11 | Unitec Kk | Voice guide device |
- JPH10222108A (ja) * | 1997-02-10 | 1998-08-21 | Syst Sekkei:Kk | Radio wave detection warning device |
- JP2002116858A (ja) * | 2000-10-06 | 2002-04-19 | Sony Corp | Electronic POP advertising device |
Non-Patent Citations (1)
Title |
---|
See also references of EP1583392A4 * |
Also Published As
Publication number | Publication date |
---|---|
EP1583392A1 (en) | 2005-10-05 |
KR20060028757A (ko) | 2006-04-03 |
US8073155B2 (en) | 2011-12-06 |
JP2004200815A (ja) | 2004-07-15 |
AU2003289362A1 (en) | 2004-07-09 |
JP4228069B2 (ja) | 2009-02-25 |
EP1583392A4 (en) | 2008-12-03 |
CN1754404A (zh) | 2006-03-29 |
US20080025548A1 (en) | 2008-01-31 |
AU2003289362A8 (en) | 2004-07-09 |
Similar Documents
Publication | Publication Date | Title |
---|---|---|
US10342131B1 (en) | PCB laminated structure and mobile terminal having the same | |
KR20170005649A (ko) | 3차원 깊이 카메라 모듈 및 이를 구비하는 이동 단말기 | |
KR20170076435A (ko) | 휴대 장치의 액세서리 장치 | |
WO2004056153A1 (ja) | 音声情報支援システム | |
KR20190008067A (ko) | 이동 단말기 | |
JP3843321B2 (ja) | 情報支援システム | |
CN209707863U (zh) | 电子设备及其背光单元和感测模组 | |
US11630485B2 (en) | Housing structures and input-output devices for electronic devices | |
CN209044429U (zh) | 一种设备 | |
KR20160038409A (ko) | 이동 단말기 및 그 제어 방법 | |
KR20160115603A (ko) | 전자 디바이스 및 그 제어방법 | |
JP7471432B2 (ja) | 液晶モジュール、電子デバイス、および画面インタラクションシステム | |
CN109407758A (zh) | 一种设备 | |
CN209167942U (zh) | 设备 | |
CN209330167U (zh) | 一种设备 | |
KR20180006580A (ko) | 주변환경에 따라 패턴 및 색상을 표현하는 스마트 의복 및 그 제어방법 | |
CN109416741B (zh) | 电子设备 | |
KR20210046323A (ko) | 이동 단말기 및 이에 결합하는 보조 장치 | |
US12130666B2 (en) | Housing structures and input-output devices for electronic devices | |
CN213990948U (zh) | 用于电子设备的发光麦克风组件 | |
CN209167943U (zh) | 一种设备 | |
KR102637419B1 (ko) | 이동 단말기 및 그의 3d 영상 변환 방법 | |
CN209167946U (zh) | 一种设备 | |
CN209167940U (zh) | 一种设备 | |
KR101635033B1 (ko) | 이동 단말기 |
Legal Events
Date | Code | Title | Description |
---|---|---|---|
AK | Designated states |
Kind code of ref document: A1 Designated state(s): AE AG AL AM AT AU AZ BA BB BG BR BW BY BZ CA CH CN CO CR CU CZ DE DK DM DZ EC EE EG ES FI GB GD GE GH GM HR HU ID IL IN IS KE KG KP KR KZ LC LK LR LS LT LU LV MA MD MG MK MN MW MX MZ NI NO NZ OM PG PH PL PT RO RU SC SD SE SG SK SL SY TJ TM TN TR TT TZ UA UG US UZ VC VN YU ZA ZM ZW |
|
AL | Designated countries for regional patents |
Kind code of ref document: A1 Designated state(s): BW GH GM KE LS MW MZ SD SL SZ TZ UG ZM ZW AM AZ BY KG KZ MD RU TJ TM AT BE BG CH CY CZ DE DK EE ES FI FR GB GR HU IE IT LU MC NL PT RO SE SI SK TR BF BJ CF CG CI CM GA GN GQ GW ML MR NE SN TD TG |
|
121 | Ep: the epo has been informed by wipo that ep was designated in this application | ||
WWE | Wipo information: entry into national phase |
Ref document number: 1020057011156 Country of ref document: KR Ref document number: 20038A6385X Country of ref document: CN |
|
WWE | Wipo information: entry into national phase |
Ref document number: 2003780790 Country of ref document: EP |
|
WWP | Wipo information: published in national office |
Ref document number: 2003780790 Country of ref document: EP |
|
WWP | Wipo information: published in national office |
Ref document number: 1020057011156 Country of ref document: KR |
|
WWE | Wipo information: entry into national phase |
Ref document number: 10539350 Country of ref document: US |
|
WWP | Wipo information: published in national office |
Ref document number: 10539350 Country of ref document: US |