WO2001041102A1 - Dispositif et procede de simulation d'experience - Google Patents

Device and method for simulating an experience

Info

Publication number
WO2001041102A1
WO2001041102A1 (PCT/JP2000/008381)
Authority
WO
WIPO (PCT)
Prior art keywords
motion
image
movement
display device
video
Prior art date
Application number
PCT/JP2000/008381
Other languages
English (en)
Japanese (ja)
Inventor
Toshiya Iinuma
Kenji Oyamada
Masahiro Seto
Original Assignee
Sanyo Electric Co., Ltd.
Sanyo Electric Software Co., Ltd.
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Priority claimed from JP34131299A external-priority patent/JP2001154570A/ja
Priority claimed from JP37113599A external-priority patent/JP2001183968A/ja
Application filed by Sanyo Electric Co., Ltd., Sanyo Electric Software Co., Ltd. filed Critical Sanyo Electric Co., Ltd.
Publication of WO2001041102A1 publication Critical patent/WO2001041102A1/fr

Classifications

    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06T IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T11/00 2D [Two Dimensional] image generation
    • G PHYSICS
    • G09 EDUCATION; CRYPTOGRAPHY; DISPLAY; ADVERTISING; SEALS
    • G09B EDUCATIONAL OR DEMONSTRATION APPLIANCES; APPLIANCES FOR TEACHING, OR COMMUNICATING WITH, THE BLIND, DEAF OR MUTE; MODELS; PLANETARIA; GLOBES; MAPS; DIAGRAMS
    • G09B19/00 Teaching not covered by other main groups of this subclass
    • A HUMAN NECESSITIES
    • A63 SPORTS; GAMES; AMUSEMENTS
    • A63G MERRY-GO-ROUNDS; SWINGS; ROCKING-HORSES; CHUTES; SWITCHBACKS; SIMILAR DEVICES FOR PUBLIC AMUSEMENT
    • A63G31/00 Amusement arrangements
    • A63G31/16 Amusement arrangements creating illusions of travel

Definitions

  • the present invention relates to a simulated experience apparatus and method that give a user a simulated experience by rocking the user in accordance with a video image.
  • Virtual reality is a technology that uses video, audio, and other stimuli to give the illusion of being in another place, and is used in fields such as games, music, and movies.
  • A simulated experience device using virtual reality technology reproduces software that imitates the motion, vibration, video, and sound of an actual experience, so that the experience can be provided safely and repeatedly.
  • However, the software used in such a simulated experience device must appeal to human senses and requires specialized know-how, making it difficult to create.
  • the simulated experience device disclosed in Japanese Patent Application Laid-Open No. 11-153949 comprises motion-video detecting means for detecting video motion from a video signal, voice-motion detecting means for detecting motion corresponding to the audio signal, user-motion detecting means for detecting the user's movement, motion generating means for generating motion signals from the video motion, the audio, and the user's movement, ride means on which the user rides and which is driven on the basis of the motion signals, and reproduction-effect control means for controlling the reproduction effects of the video and audio signals in accordance with the user's movement.
  • In this device, motion drive control is performed on the basis of actually measured values such as the video motion, the audio, and the user's movement, rather than on software data dedicated to motion drive, so versatility is improved because the device does not depend on dedicated software.
  • the present invention is intended to solve such problems and to provide a more realistic simulated experience.
  • a first simulated experience device comprises a display device for displaying video, video signal supply means for supplying a video signal to the display device, motion detecting means for detecting the motion of the video on the basis of the video signal supplied to the display device, and pseudo-effect means for stimulating the user.
  • the pseudo-effect means classifies the video motion detected by the motion detecting means into motion in the central portion of the image display area of the display device and motion in its peripheral portion, or into motion of the subject and motion of the background, and stimulates the user in accordance with the motion of the peripheral or background video.
  • the pseudo-effect means comprises, for example, ride means arranged in front of the display device and on which the user rides, rocking means for rocking the ride means to stimulate the user, and rocking control means that classifies the video motion detected by the motion detecting means into the central portion of the image display area of the display device and its peripheral portion, or into the subject and the background, and drives and controls the rocking means in accordance with the motion of the peripheral or background video.
  • the rocking control means includes, for example, means for driving the rocking means in accordance with the motion of the central-portion or subject video when the magnitude of the motion of the peripheral or background video is smaller than a first predetermined value.
  • the rocking control means also includes means for driving the rocking means in accordance with the motion of the video at the bottom of the screen, or with motion unrelated to the video, when the magnitude of the motion of the central-portion or subject video is smaller than a second predetermined value.
  • the first simulated experience method includes a first step of supplying a video signal to a display device, a second step of detecting the motion of the video on the basis of the video signal supplied to the display device, and a third step of giving the user a stimulus; the third step classifies the video motion detected in the second step into the central and peripheral portions of the image display area of the display device, or into the subject and the background, and gives the stimulus to the user in accordance with the motion of the peripheral or background video.
  • In the third step, for example, when the motion of the peripheral or background video is small, the rocking means is driven in accordance with the motion of the central-portion or subject video.
  • In the third step, for example, the rocking means is driven in accordance with the motion of the video at the bottom of the screen, or with motion unrelated to the video, when the motion of the central-portion or subject video is also small.
  • a second simulated experience device comprises a display device for displaying video, video signal supply means for supplying a video signal to the display device, motion detecting means for detecting the motion of the video from the video signal supplied to the display device, and pseudo-effect means for stimulating the user.
  • the pseudo-effect means comprises means for giving the user a stimulus of a strength corresponding to the video motion detected by the motion detecting means, and means for gradually reducing the strength of the stimulus when the stimulus has remained at a predetermined maximum strength for a predetermined time or longer.
  • the pseudo-effect means comprises, for example, ride means arranged in front of the display device and on which the user rides, rocking means for rocking the ride means to stimulate the user, and rocking control means for driving and controlling the rocking means.
  • the rocking control means includes means for driving the rocking means in accordance with the video motion detected by the motion detecting means, and means for gradually returning the rocking means toward the center position when the rocking amount of the rocking means has remained at its maximum for a predetermined time or longer.
  • when the video motion suddenly increases, the rocking control means slightly rocks the ride means in the direction opposite to the video motion before rocking the ride means in the direction corresponding to the video motion.
  • a second simulated experience method includes a first step of supplying a video signal to a display device, a second step of detecting the motion of the video from the video signal supplied to the display device, and a third step of giving the user a stimulus; the third step gives the user a stimulus of a strength corresponding to the video motion detected in the second step, and gradually reduces the strength of the stimulus when it has remained at a predetermined maximum for a predetermined time or longer.
  • In the third step, for example, ride means on which the user rides is arranged in front of the display device, rocking means rocks the ride means to stimulate the user, and the rocking means is gradually returned toward the center position when its rocking amount has remained at the maximum for a predetermined time or longer.
  • In the third step, for example, when the video motion detected in the second step suddenly increases, the ride means is slightly rocked in the direction opposite to the video motion before being rocked in the direction corresponding to the video motion.
  • FIG. 1 is a block diagram showing the configuration of the virtual experience device.
  • FIG. 2 is a block diagram showing the configuration of the swing control means in FIG.
  • FIG. 3 is a flowchart showing the operation of the virtual experience device.
  • FIG. 6 is a block diagram showing still another example of the simulated experience device.
  • Figure 1 shows the configuration of the virtual experience device.
  • Reference numeral 1 denotes a video source serving as video signal supply means, such as a VTR, CD-ROM, TV broadcast, or TV game; 2 denotes video capturing means that captures the video signal from the video source 1 in units of at least one frame; 3 denotes display means comprising a CRT, a liquid crystal display, or the like; and 4 denotes display control means for displaying the video signal captured by the video capturing means 2 on the display means 3.
  • Reference numeral 61 denotes a classification unit that classifies the motion vectors detected for each small area by the motion vector detecting means 5 according to their position on the display means 3; 62 denotes a peripheral average motion vector calculator that averages the motion vectors of the peripheral region classified by the classification unit 61; 63 denotes a central average motion vector calculator that averages the motion vectors of the central region; 64 denotes a lower average motion vector calculator that averages the motion vectors of the lower region; and 69 denotes a motion signal generator that generates a motion signal on the basis of the average values calculated by the average motion vector calculators 62, 63, and 64, or of the video-independent motion generated by the unrelated motion generating unit 68.
  • the classification unit 61 classifies the motion vectors for the small areas detected by the motion vector detecting means 5 into the central portion of the screen, the peripheral portion around the center, and the lower portion of the screen.
  • the peripheral average motion vector calculator 62 calculates the average motion vector of the region surrounding the center; the central average motion vector calculator 63 calculates the average motion vector of the central region; and the lower average motion vector calculator 64 calculates the average motion vector of the lower region.
  • the peripheral motion determining unit 65 determines whether the peripheral average motion vector calculated by the peripheral average motion vector calculator 62 is smaller than a first predetermined value. If it is smaller, the peripheral motion determining unit 65 issues a determination instruction to the central motion determining unit 66. If it is equal to or larger than the first predetermined value, the peripheral motion determining unit 65 sends the peripheral average motion vector to the motion signal generator 69, which generates a motion signal based on it.
  • When it receives the determination instruction from the peripheral motion determining unit 65, the central motion determining unit 66 determines whether the central average motion vector calculated by the central average motion vector calculator 63 is smaller than a second predetermined value. If it is smaller, the central motion determining unit 66 issues a determination instruction to the lower motion determining unit 67. If it is equal to or larger than the second predetermined value, the central motion determining unit 66 sends the central average motion vector to the motion signal generator 69, which generates a motion signal based on it.
  • If the average motion vector at the bottom of the screen calculated by the lower average motion vector calculator 64 is equal to or larger than a third predetermined value, the lower motion determining unit 67 sends it to the motion signal generator 69, which generates a motion signal based on it.
  • When it receives an operation instruction from the lower motion determining unit 67, that is, when the lower average motion vector is smaller than the third predetermined value, the unrelated motion generating unit 68 generates a motion vector unrelated to the video and sends it to the motion signal generator 69, which generates a motion signal based on it.
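The cascade performed by the determining units 65 to 67 and the motion signal generator 69 can be sketched as follows. This is an illustrative sketch only: the function name, the vector representation, and the threshold values are assumptions, not taken from the patent.

```python
# Illustrative sketch of the decision cascade of units 65-67 feeding the
# motion signal generator 69. All names and thresholds are assumptions.

def generate_motion_signal(peripheral_avg, central_avg, lower_avg,
                           t1, t2, t3, unrelated_motion):
    """Select the motion the rocking means should follow.

    Each *_avg argument is an average motion vector (dx, dy); t1, t2,
    and t3 stand in for the first to third predetermined values.
    """
    def magnitude(v):
        return (v[0] ** 2 + v[1] ** 2) ** 0.5

    if magnitude(peripheral_avg) >= t1:   # unit 65: periphery moves enough
        return peripheral_avg
    if magnitude(central_avg) >= t2:      # unit 66: fall back to the center
        return central_avg
    if magnitude(lower_avg) >= t3:        # unit 67: bottom of the screen
        return lower_avg
    return unrelated_motion               # unit 68: motion unrelated to video

# Example: strong peripheral motion takes precedence.
print(generate_motion_signal((3.0, 0.0), (1.0, 0.0), (0.5, 0.0),
                             1.0, 1.0, 1.0, (0.1, 0.1)))  # → (3.0, 0.0)
```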
  • Reference numeral 11 denotes effect control means for controlling the drive of the effect devices in accordance with the motion signal from the rocking control means 6; 12 denotes audio output means for outputting the sound accompanying the video from the video source 1 to the speaker 13; 14 denotes blowing means that blows air toward the user in accordance with the control signal from the effect control means 11, that is, in accordance with the video motion; 15 denotes sound image control means for controlling the sound image of the sound output from the speaker 13 in accordance with the control signal from the effect control means 11; and 16 denotes lighting means that irradiates the user with light to enhance the sense of reality.
  • The ride means 7, the speaker 13, the blowing means 14, and the lighting means 16 give the user 9 a sense of realism and stimuli that enhance it, so the rocking control means 6, the ride means 7, the rocking means 8, the effect control means 11, the speaker 13, the blowing means 14, the sound image control means 15, and the lighting means 16 correspond to the pseudo-effect means of the present invention.
  • Figure 3 shows the operation of the virtual experience device.
  • the video capturing means 2 captures the video from the video source 1 (step S1).
  • the motion vector detecting means 5 detects a motion vector for each small area from the video captured in step S1 (step S2).
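The patent does not prescribe a particular detection algorithm; per-small-area motion vectors of this kind are commonly obtained by block matching between consecutive frames, as in the following sketch (the function name, block size, and search range are illustrative assumptions):

```python
def block_motion_vector(prev, curr, top, left, size=8, search=4):
    """Estimate the motion vector of one small area by exhaustive block
    matching: find the shift (mx, my) that minimizes the sum of absolute
    differences between the block of `curr` at (top, left) and the
    correspondingly displaced block of `prev`."""
    h, w = len(prev), len(prev[0])
    best_sad, best_mv = None, (0, 0)
    for my in range(-search, search + 1):
        for mx in range(-search, search + 1):
            y0, x0 = top - my, left - mx  # where the block would have come from
            if y0 < 0 or x0 < 0 or y0 + size > h or x0 + size > w:
                continue  # candidate block falls outside the previous frame
            sad = sum(abs(curr[top + r][left + c] - prev[y0 + r][x0 + c])
                      for r in range(size) for c in range(size))
            if best_sad is None or sad < best_sad:
                best_sad, best_mv = sad, (mx, my)
    return best_mv

# Example: a synthetic frame whose content moved 2 pixels to the right.
prev = [[(3 * x + 7 * y) % 256 for x in range(32)] for y in range(32)]
curr = [[prev[y][x - 2] if x >= 2 else 0 for x in range(32)] for y in range(32)]
print(block_motion_vector(prev, curr, 12, 12))  # → (2, 0)
```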
  • The rocking control means 6 fetches the motion vectors for the small areas detected in step S2, and the classification unit 61 classifies them into the center of the image, the periphery of the center, and the bottom of the screen (step S3).
  • The peripheral average motion vector calculator 62 calculates the peripheral average motion vector (step S4), the central average motion vector calculator 63 calculates the central average motion vector (step S5), and the lower average motion vector calculator 64 calculates the average motion vector at the bottom of the screen (step S6).
  • the peripheral motion determining unit 65 determines whether the peripheral average motion vector is smaller than the first predetermined value (step S7). If it is equal to or larger than the first predetermined value, the process proceeds to step S8, in which the motion signal generator 69 generates a motion signal based on the peripheral average motion vector and the rocking means 8 rocks the ride means 7 based on that signal. If it is determined in step S7 that the peripheral average motion vector is smaller than the first predetermined value, the central motion determining unit 66 determines whether the central average motion vector is smaller than the second predetermined value (step S9).
  • If it is determined in step S9 that the central average motion vector is equal to or larger than the second predetermined value, the motion signal generator 69 generates a motion signal based on it in step S10. If the central average motion vector is smaller than the second predetermined value, it is determined whether the average motion vector at the bottom of the screen is smaller than the third predetermined value (step S11). If it is equal to or larger than the third predetermined value, the motion signal generator 69 generates a motion signal based on the average motion vector at the bottom of the screen in step S12; otherwise, a motion signal based on motion unrelated to the video is generated in step S13. The rocking means 8 rocks the ride means 7 based on the generated motion signal.
  • The motion signal generated by the motion signal generator 69 is also sent to the effect control means 11, which controls the blowing means 14, the sound image control means 15, and the lighting means 16 in accordance with the motion signal in order to enhance the sense of reality given to the user on the ride means 7.
  • FIG. 4 shows the processing procedure of the motion signal generator 69 of the rocking control means 6 in steps S8, S10, S12, and S13 above.
  • the motion signal generator 69 basically operates the rocking means 8 so as to rock the ride means 7 in accordance with the direction and magnitude of the average motion vector input to it (step S27). For example, if the image tilts to the right, the ride means 7 is tilted to the right.
  • In step S20, the motion signal generator 69 determines from the average motion vector input to it whether the video motion has increased suddenly. If it has, the process proceeds to step S21; if it has not, the process proceeds to step S23.
  • In step S21, a slight motion in the direction opposite to the average motion vector input to the motion signal generator 69 is generated. In step S22, this slight motion is added to the motion corresponding to the average motion vector, and the rocking means 8 is operated to rock the ride means 7 in accordance with the magnitude of the added motion.
  • In step S23, it is determined whether the ride means 7 is tilted from the center position in any of the front, rear, left, or right directions. If it is tilted, the process proceeds to step S24; if it is not, the process proceeds to step S27.
  • In step S24, it is determined whether the inclination of the ride means 7 has remained the same for a predetermined time. If it has, the process proceeds to step S26; if it has not, the process proceeds to step S25. When the ride means 7 maintains the same inclination, the user 9 has usually become accustomed to that state, so in step S26 the rocking means 8 is controlled to gradually return the ride means 7 to the center position.
  • In step S25, it is determined whether the inclination of the ride means 7 in the front-rear or left-right direction is at its limit (maximum inclination) position. If it is, the process proceeds to step S26; if it is not, the process proceeds to step S27.
  • In step S26, since the ride means 7 cannot move any further in response to additional video motion when it is at the limit position, the rocking means 8 is controlled to gradually return the ride means 7 to the center position.
  • In step S27, the rocking means 8 is operated so as to rock the ride means 7 in accordance with the direction and magnitude of the average motion vector input to the motion signal generator 69.
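Under the assumptions that the tilt of the ride means can be modeled as a single scalar and that the gains, thresholds, and hold times are free parameters (the patent specifies the flow of FIG. 4 but no concrete values), steps S20 to S27 can be sketched as a simple controller:

```python
class SwingController:
    """Illustrative sketch of the FIG. 4 flow (steps S20-S27).

    Tilt is modeled as one scalar (positive = right); all numeric
    parameters are assumptions, not values from the patent.
    """

    def __init__(self, limit=10.0, hold_steps=5, surge_threshold=3.0,
                 counter_gain=0.5, return_step=1.0):
        self.tilt = 0.0          # current inclination of the ride means
        self.held = 0            # cycles the same inclination has lasted
        self.prev_motion = 0.0   # last average motion input
        self.limit = limit
        self.hold_steps = hold_steps
        self.surge_threshold = surge_threshold
        self.counter_gain = counter_gain
        self.return_step = return_step

    def step(self, motion):
        """One control cycle; returns the list of tilt commands issued."""
        commands = []
        # S20-S22: on a sudden increase in motion, first rock slightly
        # in the opposite direction.
        if abs(motion - self.prev_motion) > self.surge_threshold:
            commands.append(-self.counter_gain * motion)
        self.prev_motion = motion

        # S23-S26: if tilted and either held too long or at the limit
        # position, drift gradually back toward the center.
        if self.tilt != 0.0 and (self.held >= self.hold_steps
                                 or abs(self.tilt) >= self.limit):
            delta = min(self.return_step, abs(self.tilt))
            self.tilt -= delta if self.tilt > 0 else -delta
            self.held = 0
            commands.append(self.tilt)
            return commands

        # S27: otherwise follow the direction and magnitude of the motion.
        new_tilt = max(-self.limit, min(self.limit, self.tilt + motion))
        self.held = self.held + 1 if new_tilt == self.tilt else 0
        self.tilt = new_tilt
        commands.append(self.tilt)
        return commands

ctrl = SwingController()
print(ctrl.step(5.0))  # sudden surge: counter jerk, then tilt → [-2.5, 5.0]
print(ctrl.step(5.0))  # reaches the limit position → [10.0]
print(ctrl.step(0.0))  # at the limit: drifts back toward the center
```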
  • In the above embodiment, the classification unit 61 classifies the motion vectors detected by the motion vector detecting means 5 into the center of the screen on the display means 3, the periphery of the center, and the bottom of the screen, but the present invention is not limited to this; the motion vectors may instead be classified into the subject and the background in the video. In that case, the same configuration as described above can be realized by reading the central portion in the above embodiment as the subject and the peripheral portion as the background.
  • In the above embodiment, a configuration for displaying a two-dimensional image on the display means 3 has been described, but the present invention is not limited to this, and may also be applied to a configuration in which a video signal obtained by converting a two-dimensional image into a three-dimensional image is displayed on stereoscopic display means 18.
  • This embodiment is shown in FIG. 6. Note that the components denoted by the same reference numerals as in FIG. 1 have the same functions, so their description is omitted.
  • Not only may the video signal obtained by converting 2D video into 3D video be displayed, but 3D video from the video source 1, that is, a left-eye image L and a right-eye image R, may also be output and displayed on the stereoscopic display means 18 based on those signals.
  • the video source 1 supplies a three-dimensional image, specifically a left-eye image L and a right-eye image R, and the display control means 4 performs a stereoscopic display operation based on the video signals.
  • the video capturing means 2 captures the left-eye image L and the right-eye image R, the motion vector detecting means 5 detects the motion vector from the video captured by the video capturing means 2 while taking the result of the means 19 into consideration, and the motion vector is then sent to the rocking control means 6 to drive the ride means 7.

Landscapes

  • Engineering & Computer Science (AREA)
  • Physics & Mathematics (AREA)
  • General Physics & Mathematics (AREA)
  • Theoretical Computer Science (AREA)
  • Business, Economics & Management (AREA)
  • Entrepreneurship & Innovation (AREA)
  • Educational Administration (AREA)
  • Educational Technology (AREA)
  • Processing Or Creating Images (AREA)

Abstract

The invention relates to a simulated experience device comprising a display unit for displaying images, video signal supply means for supplying video signals to the display unit, motion detecting means for detecting the motion of the images on the basis of the video signals supplied to the display unit, and pseudo-effect means for giving a stimulus to the person having the experience. The pseudo-effect means is characterized in that it classifies the motion of the images detected by the motion detecting means into motion at the center and motion at the periphery of an image display area, or into motion of the subject and motion of the background, and gives a stimulus to the person having the experience in accordance with the motion of the images in the peripheral portion or in the background.
PCT/JP2000/008381 1999-11-30 2000-11-28 Dispositif et procede de simulation d'experience WO2001041102A1 (fr)

Applications Claiming Priority (4)

Application Number Priority Date Filing Date Title
JP11/341312 1999-11-30
JP34131299A JP2001154570A (ja) 1999-11-30 1999-11-30 疑似体験装置及び疑似体験方法
JP11/371135 1999-12-27
JP37113599A JP2001183968A (ja) 1999-12-27 1999-12-27 疑似体験装置及び疑似体験方法

Publications (1)

Publication Number Publication Date
WO2001041102A1 true WO2001041102A1 (fr) 2001-06-07

Family

ID=26576943

Family Applications (1)

Application Number Title Priority Date Filing Date
PCT/JP2000/008381 WO2001041102A1 (fr) 1999-11-30 2000-11-28 Dispositif et procede de simulation d'experience

Country Status (1)

Country Link
WO (1) WO2001041102A1 (fr)

Citations (3)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
JPH0854820A (ja) * 1994-08-12 1996-02-27 Sega Enterp Ltd ドライブゲーム装置及びその背景画表示方法
JPH10277261A (ja) * 1998-04-13 1998-10-20 Namco Ltd 画像合成装置及びこれを用いた仮想体験装置
JPH11153949A (ja) * 1997-11-20 1999-06-08 Sony Corp 体感モーション装置


Similar Documents

Publication Publication Date Title
US10181212B2 (en) Method and system for reducing motion sickness in virtual reality ride systems
US11790616B2 (en) Immersive virtual display
US10062247B2 (en) Vibration generation system, storage medium having stored therein vibration generation program, and vibration generation method
JP6504157B2 (ja) 体感導入装置、体感導入システム、及び体感導入方法
JPH0819662A (ja) 画像表示装置を用いた遊戯装置
US10758821B2 (en) Operation input system, operation input device, and game system for adjusting force feedback control
JPWO2020090477A1 (ja) Vr酔い低減システム、ヘッドマウントディスプレイ、vr酔い低減方法及びプログラム
JP2009061161A (ja) プログラム、情報記憶媒体、及び、ゲームシステム
McMenemy et al. A hitchhiker's guide to virtual reality
JPH11146978A (ja) 3次元ゲーム装置及び情報記憶媒体
KR20160099075A (ko) 영상 시뮬레이팅 시스템, 플랫폼 제어 장치 및 플랫폼 제어 방법
JPH11153949A (ja) 体感モーション装置
JPH08131659A (ja) 疑似現実感発生装置
WO2001041102A1 (fr) Dispositif et procede de simulation d'experience
JP2001183968A (ja) 疑似体験装置及び疑似体験方法
JP3838173B2 (ja) 情報処理装置および方法、記録媒体、並びにプログラム
JP2009061159A (ja) プログラム、情報記憶媒体、及び、ゲームシステム
KR20180039415A (ko) 서보모터 기반 미세진동형 모션플랫폼 시스템
JP4212015B2 (ja) 画像生成装置及び情報記憶媒体
JP2000331184A (ja) 画像生成装置及び情報記憶媒体
JP2001154570A (ja) 疑似体験装置及び疑似体験方法
CN110947175A (zh) 一种高仿真亲临式三屏体感赛车
KR20160095663A (ko) 영상 시뮬레이팅 시스템, 플랫폼 제어 장치 및 플랫폼 제어 방법
JP3631890B2 (ja) 電子遊戯装置
JP2001079264A (ja) ゲーム装置

Legal Events

Date Code Title Description
AK Designated states

Kind code of ref document: A1

Designated state(s): US

AL Designated countries for regional patents

Kind code of ref document: A1

Designated state(s): AT BE CH CY DE DK ES FI FR GB GR IE IT LU MC NL PT SE TR

121 Ep: the epo has been informed by wipo that ep was designated in this application
DFPE Request for preliminary examination filed prior to expiration of 19th month from priority date (pct application filed before 20040101)
122 Ep: pct application non-entry in european phase