CN207591284U - Virtual reality experience device - Google Patents
- Publication number: CN207591284U
- Application number: CN201721233518.5U
- Authority: CN (China)
- Prior art keywords: image, experiencer, experience, virtual reality, steps
- Legal status: Active (the legal status is an assumption and is not a legal conclusion; Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed)
Abstract
The utility model relates to a virtual reality experience device including an image device that provides images to the experiencer and a boarding device that provides physical motion to the experiencer. The boarding device includes a boarding part that provides a space in which the experiencer can ride, and a gyro mechanism that causes the boarding part to perform at least one of pitching, yawing, rolling, and reciprocating motion. The stimulus the experiencer perceives visually and the stimulus perceived through physical motion can therefore be made consistent with each other. This prevents the experiencer from feeling a sense of incongruity, improves immersion, and as a result improves the sense of presence.
Description
Technical field
The utility model relates to virtual reality experience devices and, more specifically, to a virtual reality experience device capable of providing both images and physical motion.
Background technology
In general, virtual reality (VR) means a specific environment or situation created with a computer, and an interface between a person and the computer that allows the person to interact with it as if it were an actual surrounding situation or environment.

Virtual reality is also called artificial reality, cyberspace, a virtual world, a virtual environment, a synthetic environment, an artificial environment, and the like.

The purpose of virtual reality is to let people see and operate, as if present in person, environments that are difficult to experience in daily life without experiencing them directly; it has recently been applied in fields such as education, advanced programming, and remote operation.
A conventional virtual reality experience device is disclosed in Korean Registered Utility Model Publication No. 0342223.
However, such a conventional virtual reality experience device has the problem of a low sense of presence. More specifically, it provides only images to the experiencer, so the stimulus the experiencer perceives visually and the stimulus perceived through physical motion are inconsistent with each other. Although attempts have also been made to provide the experiencer with both images and physical motion, the motion shown in the image and the motion actually provided are inconsistent with each other. In addition, in the conventional device the actual field of view of the experiencer and the field of view of the image are inconsistent. The experiencer therefore feels a sense of incongruity, immersion is low, and consequently the sense of presence is low.
Utility model content
Technical problem
Therefore, an object of the utility model is to provide a virtual reality experience device capable of improving the sense of presence.
Technical solution
To achieve the above object, the utility model provides a virtual reality experience device including: an image device that provides images to the experiencer; and a boarding device that provides motion to the experiencer. The boarding device includes: a boarding part that provides a space in which the experiencer can ride; and a gyro mechanism that causes the boarding part to perform at least one of pitching, yawing, rolling, and reciprocating motion.
The gyro mechanism may include: a first mechanism that yaws the boarding part; a second mechanism that pitches the boarding part; and a third mechanism that rolls the boarding part.
The first mechanism may be formed so as to be rotatable and reciprocable with respect to a structure that supports the gyro mechanism; the second mechanism may be supported on the first mechanism and formed so as to be rotatable about an axis perpendicular to the rotation axis of the first mechanism; and the third mechanism may be supported on the second mechanism and formed so as to be rotatable about an axis perpendicular to the rotation axis of the second mechanism. The boarding part may be fixedly coupled to the third mechanism.
A first actuator that generates the driving force required for the rotary motion of the first mechanism may be formed between the structure and the first mechanism; a third actuator that generates the driving force required for the rotary motion of the second mechanism may be formed between the first mechanism and the second mechanism; and a fourth actuator that generates the driving force required for the rotary motion of the third mechanism may be formed between the second mechanism and the third mechanism.
The first mechanism may be formed so as to reciprocate the boarding part.

The first mechanism may be formed so as to be able to reciprocate with respect to the structure.

A second actuator that generates the driving force required for the reciprocating motion of the first mechanism may be formed between the structure and the first mechanism.

The first mechanism may be formed so that the portion supporting the second mechanism can reciprocate in directions away from and toward the structure.

A second actuator that generates the driving force required for the reciprocating motion of the portion supporting the second mechanism may be formed in the first mechanism.
Utility model effect
The virtual reality experience device of the utility model includes an image device that provides images to the experiencer and a boarding device that provides physical motion to the experiencer, the boarding device including a boarding part that provides a space in which the experiencer can ride and a gyro mechanism that causes the boarding part to perform at least one of pitching, yawing, rolling, and reciprocating motion. The stimulus the experiencer perceives visually and the stimulus perceived through physical motion can therefore be made consistent with each other. This prevents the experiencer from feeling a sense of incongruity and improves immersion, and as a result the sense of presence can be improved.
Description of the drawings
Fig. 1 is a perspective view illustrating a virtual reality experience device according to one embodiment of the utility model;
Fig. 2 is a perspective view illustrating the image and motion provided by the virtual reality experience device of Fig. 1;
Fig. 3 is a system diagram illustrating the components of the virtual reality experience device of Fig. 1;
Fig. 4 is a flowchart illustrating part of a first control method for the virtual reality experience device of Fig. 1;
Fig. 5 is a flowchart illustrating another part of the first control method of Fig. 4;
Fig. 6 is a chart presented to explain the field-of-view calculation concept of Fig. 5;
Figs. 7 to 10 are flowcharts each illustrating a further part of the first control method of Fig. 4;
Fig. 11 is a flowchart illustrating part of a second control method for the virtual reality experience device of Fig. 1;
Fig. 12 is a flowchart illustrating part of a third control method for the virtual reality experience device of Fig. 1;
Figs. 13 to 19 are flowcharts each illustrating another part of the third control method of Fig. 12;
Fig. 20 is a system diagram illustrating the components of a virtual reality experience device according to another embodiment of the utility model;
Fig. 21 is a flowchart illustrating part of a control method for the virtual reality experience device of Fig. 20;
Figs. 22 to 24 are flowcharts each illustrating another part of the control method of Fig. 21;
Fig. 25 is a perspective view illustrating a virtual reality experience device according to yet another embodiment of the utility model;
Figs. 26 to 29 are plan views each illustrating motion provided by the virtual reality experience device of Fig. 25;
Fig. 30 is a perspective view illustrating a virtual reality experience device according to still another embodiment of the utility model;
Figs. 31 to 34 are perspective views illustrating motion provided by the virtual reality experience device of Fig. 30.
Specific embodiment
Hereinafter, the virtual reality experience device of the utility model is described in detail with reference to the accompanying drawings.
Fig. 1 is a perspective view illustrating a virtual reality experience device according to one embodiment of the utility model, Fig. 2 is a perspective view illustrating the image and motion provided by the virtual reality experience device of Fig. 1, and Fig. 3 is a system diagram illustrating the components of the virtual reality experience device of Fig. 1. Fig. 4 is a flowchart illustrating part of a first control method for the virtual reality experience device of Fig. 1, and Fig. 5 is a flowchart illustrating another part of the first control method of Fig. 4. Fig. 6 is a chart presented to explain the field-of-view calculation concept of Fig. 5; it shows whether a discrepancy arises, depending on whether the field of view is corrected, when the experiencer is moved upward while keeping the gaze directed downward. Figs. 7 to 10 are flowcharts each illustrating a further part of the first control method of Fig. 4. Fig. 11 is a flowchart illustrating part of a second control method for the virtual reality experience device of Fig. 1. Figs. 12 to 19 are flowcharts each illustrating part of a third control method for the virtual reality experience device of Fig. 1.
Referring to Figs. 1 to 3, the virtual reality experience device according to one embodiment of the utility model may include: an image device 100 that provides a virtual reality image to the experiencer; a boarding device 200 that provides physical motion to the experiencer; and a control device (not shown) that controls the image device 100 and the boarding device 200. Hereinafter, the virtual reality image provided to the experiencer is referred to as the experience image, and the physical motion provided to the experiencer is referred to as the experience motion.
The image device 100 may include an image display part 110 that shows the experience image so that the experiencer can experience virtual reality visually, and an image control part 120 that controls the image display part 110. In the present embodiment the image control part 120 is included in the image device 100, but it may instead be included in the control device (not shown).
As shown in Fig. 2, the image device 100 may be formed so as to provide, out of an experience image surrounding the experiencer (hereinafter, the omnidirectional image) (FPk), the image corresponding to the experiencer's field of view (hereinafter, the field-of-view corresponding image) (VPk) (this scheme is hereinafter referred to as the field-of-view corresponding image presentation scheme), so that the experiencer sees an image as if it were the actual environment. That is, the experience image is formed so that, of the omnidirectional image (FPk), the image (VPk) corresponding to the particular position toward which the experiencer's gaze is directed is shown on the image display part 110.
Specifically, the image device 100 may be formed, for example, as a head-mounted display (HMD: Head Mount Display) worn on the experiencer's head, and may further include a first detection part 130 that detects the motion of the image device 100. The first detection part 130 may be formed, for example, as a gyro sensor or an acceleration sensor.
The image device 100 may be formed in the following manner: the omnidirectional image (FPk) is stored in the image control part 120; the measured value of the first detection part 130 (the motion of the image device 100 detected by the first detection part 130) is periodically transmitted to the image control part 120; the image control part 120 computes the experiencer's field of view based on the measured value of the first detection part 130; the image control part 120 sends, out of the omnidirectional image (FPk), the image corresponding to the computed field of view of the experiencer to the image display part 110; and the image display part 110 plays the image received from the image control part 120.
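As a sketch of the scheme just described, in which the image corresponding to the experiencer's field of view is cut out of the omnidirectional image (FPk), the following assumes the omnidirectional image is stored as an equirectangular grid of pixels covering 360° by 180° and crops the viewport (VPk) around the computed gaze direction. The function name, field-of-view constants, and frame layout are assumptions of this illustration, not part of the patent.

```python
def select_view(frame, yaw_deg, pitch_deg, fov_h=90, fov_v=60):
    """Crop the viewport (VPk) matching the experiencer's gaze out of an
    equirectangular omnidirectional frame (FPk).

    frame: 2-D list of pixel rows covering 360 deg (width) x 180 deg (height).
    """
    height = len(frame)
    width = len(frame[0])
    # Map the gaze angles to the centre pixel of the viewport.
    cx = int(((yaw_deg % 360) / 360.0) * width)
    cy = int(((90 - pitch_deg) / 180.0) * height)
    half_w = int(fov_h / 360.0 * width) // 2
    half_h = int(fov_v / 180.0 * height) // 2
    rows = []
    for dy in range(-half_h, half_h):
        y = min(max(cy + dy, 0), height - 1)   # clamp at the poles
        # Wrap horizontally, since yaw is periodic.
        row = [frame[y][(cx + dx) % width] for dx in range(-half_w, half_w)]
        rows.append(row)
    return rows
```

With the defaults above, a 360x180 frame yields a 90x60 viewport centred on the gaze direction.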
On the other hand, the motion of the image device 100 detected by the first detection part 130 is affected not only by changes in the experiencer's gaze but also by the motion (experience motion) of the boarding device 200. For example, when the boarding device 200 moves upward while the experiencer keeps the gaze directed straight ahead, the first detection part 130 still detects that the image device 100 has moved upward. Consequently, when the experiencer changes gaze while the boarding device 200 is stationary, the motion of the image device 100 detected by the first detection part 130 matches the motion of the image device 100 caused by the change of the experiencer's gaze, and the experiencer's field of view computed from the measured value of the first detection part 130 matches the actual field of view of the experiencer. But when the boarding device 200 is moving, the motion of the image device 100 detected by the first detection part 130 does not match the motion of the image device 100 caused by the gaze change, and the field of view computed from the measured value of the first detection part 130 does not match the actual field of view of the experiencer.
In view of this, in the present embodiment the image device 100 may be formed so that, when computing the experiencer's field of view, the motion of the image device 100 caused by the motion (experience motion) of the boarding device 200 is excluded (hereinafter, the field-of-view correction scheme). That is, the boarding device 200 may include a second detection part 240 that detects the motion (experience motion) of the boarding device 200, and the image control part 120 of the image device 100 may be formed so as to subtract the measured value of the second detection part 240 (the motion of the image device attributable to the motion of the boarding device 200) from the measured value of the first detection part 130, and to compute the experiencer's field of view from the resulting value (θ1 - θ2) (the motion of the image device caused by the change of the experiencer's gaze).
Specifically, if the angle from a reference vector (α) (for example, the vector directed toward the experiencer's front at the experience start time) to the vector (β) of the experiencer's gaze direction is called the first angle (θ1), and the angle from the reference vector (α) to the vector (γ) directed toward the experiencer's front at the detection time (the normal vector of the backrest of the chair 212 included in the boarding part 210 described later) is called the second angle (θ2), then the first detection part 130 detects the first angle (θ1) and sends it to the image control part 120, the second detection part 240 detects the second angle (θ2) and sends it to the image control part 120, and the image control part 120 can subtract the second angle (θ2) from the first angle (θ1) and compute the experiencer's field of view from the resulting value (θ1 - θ2). As a result, as shown in Fig. 6, an image corresponding to the actual field of view of the experiencer can be provided.
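The field-of-view correction reduces to a small arithmetic step. The sketch below, with illustrative names, subtracts the second angle (θ2) reported by the second detection part 240 from the first angle (θ1) reported by the first detection part 130; the wrap-around normalisation is an assumption added for robustness, since angle sensors are periodic.

```python
def corrected_gaze(theta1_deg, theta2_deg):
    """Gaze angle of the experiencer relative to the boarding part:
    theta1 (HMD motion) minus theta2 (boarding-part motion),
    normalised to the range (-180, 180]."""
    diff = (theta1_deg - theta2_deg) % 360.0
    if diff > 180.0:
        diff -= 360.0
    return diff
```

For example, if the boarding device has yawed to 350° while the HMD reads 10°, the experiencer has actually turned only 20° relative to the chair.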
The second detection part 240 may be formed from a gyro sensor, an acceleration sensor, or the like mounted on the boarding part 210 described later, or may be formed in an RSI (Robot Sensor Interface) scheme that senses the motion of each joint of the robot arm 221 described later and computes the motion of the boarding part 210 from it.
So that the experiencer can experience virtual reality through physical motion, the boarding device 200 may include: a boarding part 210 that provides a space in which the experiencer can ride; a driving part 220 that moves the boarding part 210 linearly or rotationally so as to provide the experience motion; and a drive control part 230 that controls the driving part 220. In the present embodiment the drive control part 230 is included in the boarding device 200, but it may instead be included in the control device (not shown).
The boarding part 210 may include: a chair 212 on which the experiencer sits; a seat belt 214 that prevents the experiencer from leaving the chair 212; and a handle 216 that the experiencer can grip so as to feel psychologically secure.
The boarding part 210 may further include: a holder (not shown) on which the image device 100 can be detachably placed; a detachment prevention means (not shown) that prevents the image device 100 from moving farther than a predetermined distance from the holder (not shown); and a cable (not shown) or the like for supplying power from the holder (not shown) to the image device 100.
The driving part 220 may be formed so as to provide the experiencer with physical motion as if the experiencer were riding an actual apparatus, while being relatively unconstrained by space. That is, the motion shown in the experience image is provided not by an actual apparatus but by the driving part 220 operating within a defined confined space narrower than the space in which the actual apparatus would run.
This driving part 220 may be formed from various configurations capable of moving the boarding part 210 in three-dimensional space; in the present embodiment it may be formed from a robot arm (Robot Arm) 221 that includes a plurality of arms (Arm) and joints and can move with multiple degrees of freedom (for example, six degrees of freedom). In this case, the boarding part 210 may be detachably coupled to the free end of the robot arm 221.
The number of boarding parts 210 and the number of driving parts 220 may be adjusted appropriately. That is, one driving part 220 may be coupled to one boarding part 210 so as to provide a virtual reality experience to one experiencer at a time. Alternatively, one driving part 220 may be coupled to a plurality of boarding parts 210 so as to provide virtual reality experiences to several experiencers at a time and improve throughput. Alternatively, there may be a plurality of driving parts 220, each coupled to at least one boarding part 210, so as to improve throughput further. That is, there may be a plurality of virtual reality experience devices. In that case, the plurality of virtual reality experience devices may provide experience images and experience motions independently of one another, so as to provide several kinds of virtual reality simultaneously.
The control device (not shown) may be formed from a server or a computer electrically connected to the image device 100 and the boarding device 200, and may include at least a part of the editing part 310 described later and the control unit (C) described later (in the present embodiment, the unified control part 320 described later).
On the other hand, in order to publicize the device and attract experiencers, the virtual reality experience device according to the present embodiment may also include a public screen 400 that provides the virtual reality image to non-experiencers. In this case, the image control part 120 may be formed so as to control both the image display part 110 and the public screen 400.
The virtual reality experience device of this configuration may be formed so as to operate according to the first control method shown in Figs. 4 to 10.

That is, as described above, the virtual reality experience device is formed not only so that the experience image is provided through the field-of-view corresponding image presentation scheme and the field-of-view correction scheme, but also so that the experience image and experience motion to be provided at each moment from the experience start time to the experience end time are prepared in advance, the prepared experience image and experience motion are provided in sequence, and the experience image and experience motion are synchronized with each other. Here, synchronization means that the image-type motion (visual motion) shown in the experience image and the experience motion (physical motion) coincide.
Specifically, if the experience image and the experience motion diverge from each other (for example, if the image device 100 provides a descending image while the boarding device 200 provides an ascending motion), the experiencer feels a sense of incongruity, immersion is low, and as a result the sense of presence is low.
In view of this, the virtual reality experience device according to the present embodiment may be formed as follows: the experience image and the experience motion are specified in advance by an editing part 310 that defines the experience image and the experience motion, and a control unit (C) that controls the image device 100 (more precisely, the image display part 110) and the boarding device 200 (more precisely, the driving part 220) based on the editing part 310 synchronizes the experience image and the experience motion during the experience.
More specifically, the editing part 310, as software held in the control device (not shown), may form a timestamp code (TC) that defines a plurality of time points included in the experience time from the experience start time to the experience end time as first to n-th timestamps (Time Stamp) (T1 to Tn); may form a first database (DB1) that specifies the first to n-th images (FP1 to FPn) to be played as the experience image at the first to n-th timestamps (T1 to Tn), respectively; and may form a second database (DB2) that specifies the first to n-th motions (M1 to Mn) to be performed as the experience motion at the first to n-th timestamps (T1 to Tn), respectively.

The timestamp code (TC) may be stored in the unified control part 320 described later, the first database (DB1) may be stored in the image control part 120, and the second database (DB2) may be stored in the drive control part 230.
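The timestamp code (TC) and the two databases can be pictured as simple keyed tables. The sketch below is purely illustrative: the timestamp spacing, the image identifiers, and the motion fields are assumptions, not values from the patent.

```python
# Timestamp code TC: the ordered timestamps T1..Tn (here, every 10 ms).
TC = [k * 10 for k in range(1, 6)]

# First database DB1: each timestamp Tk maps to the omnidirectional
# image FPk to play at that moment.
DB1 = {t: f"omnidirectional_frame_{t}" for t in TC}

# Second database DB2: each timestamp Tk maps to the motion Mk the
# driving part should perform at that moment.
DB2 = {t: {"pitch": 0.1 * t, "yaw": 0.0, "roll": 0.0} for t in TC}

def lookup(tk):
    """Select the image and motion for timestamp Tk, as the image control
    part and drive control part each do on receiving Tk."""
    return DB1[tk], DB2[tk]
```

In the device, DB1 lives in the image control part 120 and DB2 in the drive control part 230; only the timestamp itself crosses between components.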
The control unit (C) may be formed from the image control part 120, the drive control part 230, and a unified control part 320 for controlling the image control part 120 and the drive control part 230. The unified control part 320 may be provided in the control device (not shown).
The unified control part 320 may be formed so as to, based on the timestamp code (TC), send the first to n-th timestamps (T1 to Tn) in sequence to the image control part 120 and the drive control part 230 at predetermined time intervals (for example, 10 ms) once the experience has started.
For synchronization during the experience, the unified control part 320 may be formed so as to send any given timestamp (Tk) among the first to n-th timestamps (T1 to Tn) to the image control part 120 and the drive control part 230 simultaneously.
The image control part 120 may be formed so as to substitute the timestamp (Tk) received from the unified control part 320 into the first database (DB1), select the image (FPk) corresponding to the received timestamp (Tk) among the first to n-th images (FP1 to FPn), and send the field-of-view corresponding image (VPk) within the selected image (FPk) to the image display part 110. In the present embodiment, the image control part 120 may also be formed so as to send the image (VPk) sent to the image display part 110 to the public screen 400 as well.
Moreover, for synchronization during operation, the image control part 120 may be formed so as to compare the target image with the actual image at a predetermined frequency (for example, 60 Hz) and make them coincide.
Specifically, the image control part 120 may be formed so as to compare the real timestamp (Tk'), i.e. the timestamp corresponding to the image being sent to the image display part 110, with the target timestamp (Tk), i.e. the timestamp received from the unified control part 320. By comparing timestamps rather than comparing image data directly, the burden on the image control part 120 can be reduced and its processing speed improved.
When the real timestamp (Tk') is earlier than the target timestamp (Tk), the image control part 120 may instruct the image display part 110 to play the images between the image corresponding to the real timestamp (Tk') and the image corresponding to the target timestamp (Tk) at a playback speed faster than the predetermined playback speed.

When the real timestamp (Tk') is later than the target timestamp (Tk), the image control part 120 may instruct the image display part 110 to play the images after the image corresponding to the real timestamp (Tk') at a playback speed slower than the predetermined playback speed.

Alternatively, when the real timestamp (Tk') is later than the target timestamp (Tk), the image control part 120 may instruct the image display part 110 to play the image corresponding to the real timestamp (Tk') repeatedly.
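The playback-speed rule just described condenses into one comparison. The sketch below returns a playback-rate multiplier; the concrete fast and slow rates are illustrative assumptions, since the patent specifies only faster and slower than the predetermined speed.

```python
def playback_rate(real_ts, target_ts, normal=1.0, fast=1.5, slow=0.5):
    """Choose the playback rate from the real timestamp Tk' (what is
    actually showing) and the target timestamp Tk (what the unified
    control part sent)."""
    if real_ts < target_ts:
        return fast    # image lags the target: play faster to catch up
    if real_ts > target_ts:
        return slow    # image leads the target: slow down (or repeat a frame)
    return normal      # in sync
```

The drive-side rule for the driving part 220 is symmetric, with actuation speed in place of playback speed.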
The drive control part 230 may be formed so as to substitute the timestamp (Tk) received from the unified control part 320 into the second database (DB2), select the motion (Mk) corresponding to the received timestamp (Tk) among the first to n-th motions (M1 to Mn), and send the selected motion (Mk) to the driving part 220.
Moreover, for synchronization during the experience, the drive control part 230 may be formed so as to compare the target motion with the actual motion at predetermined time intervals (for example, 12 ms) and make them coincide.
Specifically, the drive control part 230 may be formed so as to compare the real timestamp (Tk"), i.e. the timestamp corresponding to the motion actually being performed by the driving part 220, with the target timestamp (Tk), i.e. the timestamp received from the unified control part 320. By comparing timestamps rather than comparing data directly, the burden on the drive control part 230 can be reduced and its processing speed improved.
When the real timestamp (Tk") is earlier than the target timestamp (Tk), the drive control part 230 may instruct the driving part 220 to perform the motions between the motion corresponding to the real timestamp (Tk") and the motion corresponding to the target timestamp (Tk) at a speed faster than the predetermined actuation speed.

When the real timestamp (Tk") is later than the target timestamp (Tk), the drive control part 230 may instruct the driving part 220 to perform the motions after the motion corresponding to the real timestamp (Tk") at an actuation speed slower than the predetermined actuation speed.
The drive control part 230 may be formed so as to calculate the real timestamp (Tk") using the second database (DB2). Specifically, the measured value of the second detection part 240 is also transmitted to the drive control part 230 at predetermined time intervals (for example, 12 ms), and the drive control part 230 may be formed so as to substitute the measured value of the second detection part 240 into the second database (DB2) and calculate the timestamp corresponding to that measured value as the real timestamp (Tk"). In this case, although the burden on the drive control part 230 increases somewhat, no separate device for calculating the real timestamp (Tk") is needed, so cost can be saved.
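How the drive control part recovers the real timestamp (Tk") by an inverse lookup in the second database (DB2) might look like the sketch below. Matching the measurement to the nearest stored motion is an assumption of this illustration; the patent says only that the timestamp corresponding to the measured value is calculated from the database.

```python
def real_timestamp(measured_motion, db2):
    """Return the timestamp whose stored motion is closest to the
    measured one (sum of absolute differences over the motion
    components), standing in for Tk" in the text."""
    def distance(motion):
        return sum(abs(motion[k] - measured_motion[k]) for k in motion)
    return min(db2, key=lambda t: distance(db2[t]))
```

The timer alternative described next trades this lookup for a dedicated counter, shifting cost from computation to hardware.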
Alternatively, the drive control part 230 may include a timer (not shown) that counts the operating time of the driving part 220, and may be formed so as to calculate, as the real timestamp (Tk"), the time extracted from the timer (not shown) at predetermined time intervals (for example, 12 ms). In this case, since a separate device (the timer (not shown)) for calculating the real timestamp (Tk") is added, cost may increase somewhat, but the burden on the drive control part 230 can be reduced.
The first control method is described below.

That is, the first control method may include an editing step of editing the experience image and the experience motion before the experience, and execution steps of carrying out the experience.
In the editing step, the timestamp code (TC), the first database (DB1), and the second database (DB2) may be formed in the editing part 310; the timestamp code (TC) is stored in the unified control part 320, the first database (DB1) is stored in the image control part 120, and the second database (DB2) is stored in the drive control part 230.
In the execution steps, the experience can begin once the experiencer has boarded the boarding device 200 and the image device 100 has been worn on the experiencer's head.
Start to experience, then in the 1st step (S1), the portion 320 that is uniformly controlled can will be as the institute of initial timestamp
It states stamp (T1) at the first time and is stored as object time stamp (Tk).
Then, in a second step (S2), the integrated control unit 320 may send the target timestamp (Tk) stored therein to the image control unit 120 and the drive control unit 230 simultaneously.
Then, in step 3-1-1 (S311), the image control unit 120 may substitute the target timestamp (Tk) received in the second step (S2) into the first database (DB1) and select, from the first to n-th images (omnidirectional images) (FP1 to FPn), the image (omnidirectional image) (FPk) corresponding to the target timestamp (Tk).
Then, in step 3-1-2 (S312), the first detection unit 130 may send its measured value to the image control unit 120, and the second detection unit 240 may send its measured value to the image control unit 120.
Then, in step 3-1-3 (S313), the image control unit 120 may compute the experiencer's field of view based on the measured values of the first detection unit 130 and the second detection unit 240.
Then, in step 3-1-4 (S314), the image control unit 120 may select, from the image (omnidirectional image) (FPk) chosen in step 3-1-1 (S311), the image corresponding to the experiencer's field of view computed in step 3-1-3 (S313) (the field-of-view image) (VPk), and send it to the image display unit 110 and the public screen 400.
Then, in step 3-1-5 (S315), the image display unit 110 and the public screen 400 may each play the image (VPk) received through step 3-1-4 (S314).
Here, to simplify the logic, the public screen 400 is formed so as to provide non-experiencers with the same image as the image (VPk) displayed on the image display unit 110, but it is not limited to this. When there are several experiencers, in order to avoid the question of whose image should be played on the public screen 400, the public screen 400 may, as in the third control method described later, be formed so as to provide non-experiencers with an image independent of the image (VPk) displayed on the image display unit 110; or, to save the time and expense of operating the public screen 400, the public screen 400 itself may be omitted.
Meanwhile, in step 3-2-1 (S321), the drive control unit 230 may substitute the target timestamp (Tk) received in the second step (S2) into the second database (DB2), select from the first to n-th motions (M1 to Mn) the motion (Mk) corresponding to the target timestamp (Tk), and send the selected motion (Mk) to the driving unit 220.
Then, in step 3-2-2 (S322), the driving unit 220 may perform the motion received through step 3-2-1 (S321).
Meanwhile, when at least one of step 3-1-4 (S314) and step 3-2-2 (S322) ends, in a fourth step (S4) the integrated control unit 320 may judge whether the experience has ended. That is, the integrated control unit 320 may judge whether the target timestamp (Tk) stored therein coincides with the n-th timestamp (Tn), the final timestamp.
In the fourth step (S4), when the experience is judged to have ended (when the target timestamp (Tk) coincides with the n-th timestamp (Tn)), the experience terminates; when the experience is judged to be still in progress (when the target timestamp (Tk) does not coincide with the n-th timestamp (Tn)), the process may proceed to a fifth step (S5), described later.
In the fifth step (S5), the integrated control unit 320 may judge whether the predetermined time (the interval between timestamps) has elapsed since the target timestamp (Tk) was sent in the second step (S2).
In the fifth step (S5), when the predetermined time is judged to have elapsed, the process proceeds to a sixth step (S6), described later; when it has not yet elapsed, steps 7-1-1 (S711) and 7-2-1 (S721), described later, may proceed simultaneously.
In the sixth step (S6), the integrated control unit 320 may store the timestamp following the one stored so far as the target timestamp (Tk) as the new target timestamp (Tk). For example, when the timestamp stored so far as the target timestamp (Tk) is the first timestamp (T1), the second timestamp (T2) may be stored as the new target timestamp (Tk).
After the sixth step (S6) ends, the process may return to the second step (S2).
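Steps S1 to S6 above amount to a timestamp-driven dispatch loop. The following is a minimal sketch under stated assumptions: the databases are modeled as dicts keyed by timestamp, the image and drive paths are collapsed to single calls, and all names are hypothetical.

```python
# Sketch of the integrated control unit's loop (steps S1–S6): step through
# timestamps T1..Tn, broadcasting each target stamp to the image-control
# and drive-control sides simultaneously.
TIMESTAMPS = [1, 2, 3]                    # T1..Tn (n = 3 here)
DB1 = {t: f"FP{t}" for t in TIMESTAMPS}   # timestamp -> omnidirectional image
DB2 = {t: f"M{t}" for t in TIMESTAMPS}    # timestamp -> motion

played, performed = [], []

def send_target(tk):
    played.append(DB1[tk])     # image side (S311..S315, simplified)
    performed.append(DB2[tk])  # drive side (S321..S322, simplified)

tk = TIMESTAMPS[0]             # S1: first stamp becomes the target
while True:
    send_target(tk)            # S2 + S311/S321
    if tk == TIMESTAMPS[-1]:   # S4: target equals final stamp Tn?
        break                  # experience ends
    tk = TIMESTAMPS[TIMESTAMPS.index(tk) + 1]  # S6: next stamp, then back to S2

print(played)     # → ['FP1', 'FP2', 'FP3']
print(performed)  # → ['M1', 'M2', 'M3']
```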
In step 7-1-1 (S711), the image control unit 120 may calculate the actual timestamp (Tk').
Then, in step 7-1-2 (S712), the image control unit 120 may judge whether the actual timestamp (Tk') calculated in step 7-1-1 (S711) coincides with the target timestamp (Tk) received in the second step (S2).
In step 7-1-2 (S712), when the target timestamp (Tk) is judged to coincide with the actual timestamp (Tk'), the process returns to step 3-1-2 (S312); when they are judged not to coincide, the process may proceed to step 7-1-3 (S713), described later.
In step 7-1-3 (S713), the image control unit 120 may judge whether the actual timestamp (Tk') is earlier than the target timestamp (Tk).
In step 7-1-3 (S713), when the actual timestamp (Tk') is judged to be earlier than the target timestamp (Tk), the process proceeds to step 7-1-4 (S714), described later; when the actual timestamp (Tk') is judged to be later than the target timestamp (Tk), the process may proceed to step 7-1-5 (S715), described later.
In step 7-1-4 (S714), the image control unit 120 may instruct the image display unit 110 to play the images between the image corresponding to the actual timestamp (Tk') and the image corresponding to the target timestamp (Tk) at a playback speed faster than the predetermined playback speed.
When step 7-1-4 (S714) ends, the process may return to step 3-1-2 (S312).
In step 7-1-5 (S715), the image control unit 120 may instruct the image display unit 110 to play the images after the image corresponding to the actual timestamp (Tk') at a playback speed slower than the predetermined playback speed. Alternatively, the image control unit 120 may instruct the image display unit 110 to play the image corresponding to the actual timestamp (Tk') repeatedly.
When step 7-1-5 (S715) ends, the process may return to step 3-1-2 (S312).
Here, steps 7-1-1 to 7-1-5 (S711 to S715) may be carried out at a predetermined frequency (for example, 60 Hz).
Moreover, when in step 7-1-2 (S712) the target timestamp (Tk) is judged to coincide with the actual timestamp (Tk'), or when step 7-1-4 (S714) or step 7-1-5 (S715) ends, the process returns to step 3-1-2 (S312) so that any change in the experiencer's field of view during this time is reflected.
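The image-side correction of steps S711 to S715 reduces to a three-way comparison between the actual and target timestamps. A minimal sketch, with hypothetical names and the playback instructions reduced to descriptive strings:

```python
# Sketch of the image-side correction (S711–S715), run at e.g. 60 Hz:
# compare the actual stamp Tk' with the target Tk and adjust playback.
def image_correction(tk_actual, tk_target):
    if tk_actual == tk_target:       # S712: already in sync
        return "keep speed"
    if tk_actual < tk_target:        # S713/S714: image lags the target
        return "play Tk'..Tk faster than normal"
    # S715: image leads the target
    return "play after Tk' slower (or repeat Tk')"

print(image_correction(5, 5))  # → keep speed
print(image_correction(4, 5))  # → play Tk'..Tk faster than normal
print(image_correction(6, 5))  # → play after Tk' slower (or repeat Tk')
```

In every branch control then returns to step S312, so a fresh field-of-view computation follows each correction.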
Meanwhile, in step 7-2-1 (S721), the second detection unit 240 may send its measured value (the actual motion of the driving unit 220) to the drive control unit 230.
Then, in step 7-2-2 (S722), the drive control unit 230 may calculate the actual timestamp (Tk") based on the measured value of the second detection unit 240 received in step 7-2-1 (S721), and judge whether the calculated actual timestamp (Tk") coincides with the target timestamp (Tk) received in the second step (S2).
In step 7-2-2 (S722), when the target timestamp (Tk) is judged to coincide with the actual timestamp (Tk"), the process returns to the fifth step (S5); when they are judged not to coincide, the process may proceed to step 7-2-3 (S723), described later.
In step 7-2-3 (S723), the drive control unit 230 may judge whether the actual timestamp (Tk") is earlier than the target timestamp (Tk).
In step 7-2-3 (S723), when the actual timestamp (Tk") is judged to be earlier than the target timestamp (Tk), the process proceeds to step 7-2-4 (S724), described later; when it is judged to be later than the target timestamp (Tk), the process may proceed to step 7-2-5 (S725), described later.
In step 7-2-4 (S724), the drive control unit 230 may instruct the driving unit 220 to perform the motions between the motion corresponding to the actual timestamp (Tk") and the motion corresponding to the target timestamp (Tk) at a driving speed faster than the predetermined driving speed.
When step 7-2-4 (S724) ends, the process may return to the fifth step (S5).
In step 7-2-5 (S725), the drive control unit 230 may instruct the driving unit 220 to perform the motions after the motion corresponding to the actual timestamp (Tk") at a driving speed slower than the predetermined driving speed.
When step 7-2-5 (S725) ends, the process may return to the fifth step (S5).
Here, steps 7-2-1 to 7-2-5 (S721 to S725) may be carried out at a predetermined time interval (for example, 12 ms).
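The drive-side correction mirrors the image-side one, but scales the driving speed rather than the playback speed. A sketch under stated assumptions: the 1.5x and 0.5x factors are illustrative only, since the patent specifies no particular speeds.

```python
# Sketch of the drive-side correction (S721–S725), run every 12 ms:
# compare the actual stamp Tk" with the target Tk and scale the driving
# speed so the motion catches up or holds back.
def drive_speed_factor(tk_actual, tk_target):
    if tk_actual == tk_target:
        return 1.0   # S722: in sync, keep the predetermined speed
    if tk_actual < tk_target:
        return 1.5   # S724: motion lags -> faster than predetermined
    return 0.5       # S725: motion leads -> slower than predetermined

print(drive_speed_factor(8, 10))   # → 1.5
print(drive_speed_factor(12, 10))  # → 0.5
```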
In this execution step, the first step (S1) is carried out once at the start of the experience, and the second to 7-2-5 steps (S2 to S725) are carried out repeatedly until the image and motion corresponding to the final timestamp have been provided, after which the execution step ends.
Here, with the virtual reality experience device according to this embodiment including the image device 100 and the boarding device 200, the stimulus the experiencer perceives visually and the stimulus the experiencer perceives through physical motion can be made to coincide. This prevents the experiencer from feeling a sense of incongruity, increases immersion and, as a result, can improve the sense of presence.
Moreover, as the experience image and the experience motion are synchronized, the stimulus perceived visually and the stimulus perceived through physical motion can coincide still further.
Moreover, as synchronization between the experience image and the experience motion is carried out in stages (before the experience, at the experience start time, and during the experience) and periodically during the experience, the stimulus perceived visually and the stimulus perceived through physical motion can be made to coincide still more effectively.
Moreover, as the image device 100 is formed so as to provide the field-of-view image (VPk) within the omnidirectional image (FPk), the sense of presence can be further improved.
Moreover, as the image device 100 is formed so as to exclude the motion of the boarding device 200 when computing the experiencer's field of view, the situation where the motion of the boarding device 200 causes the experiencer's actual field of view to diverge from the image's field of view can be prevented in advance.
Moreover, as the driving unit 220 of the boarding device 200 is formed of a robotic arm 221, motions comparable to those of boarding a real device can be provided to the experiencer while being relatively free of space constraints.
Meanwhile, the virtual reality experience device may also be formed so as to be operable by a second control method, shown in Figure 11.
That is, the virtual reality experience device is formed largely as in the case where it operates by the first control method, except that the image control unit 120 is formed so as not to perform the work of comparing the target image with the actual image and making them coincide, and the drive control unit 230 is formed so as not to perform the work of comparing the target motion with the actual motion and making them coincide, so that synchronization is carried out only before the experience and at the experience start time.
The second control method is described below.
Compared with the first control method, the second control method replaces the steps shown in Figure 8 with the steps shown in Figure 11 and deletes the steps shown in Figures 9 and 10. That is, the second control method may include the steps shown in Figures 4, 5, 7 and 11. Accordingly, in the fifth step (S5), when the predetermined time is judged to have elapsed, the process proceeds to the sixth step (S6); when it has not yet elapsed, step 3-1-2 (S312) and the fifth step (S5) may proceed simultaneously.
Specifically, the second control method may include an editing step of editing the experience image and the experience motion before the experience, and an execution step of carrying out the experience.
In the editing step, the timestamp code (TC), the first database (DB1) and the second database (DB2) may be formed in the editing unit 310; the timestamp code (TC) is stored in the integrated control unit 320, the first database (DB1) is stored in the image control unit 120, and the second database (DB2) is stored in the drive control unit 230.
In the execution step, the experience may start after the experiencer boards the boarding device 200 and the image device 100 is mounted on the experiencer's head.
Once the experience starts, in the first step (S1) the integrated control unit 320 may store the first timestamp (T1), i.e. the initial timestamp, as the target timestamp (Tk).
Then, in the second step (S2), the integrated control unit 320 may send the target timestamp (Tk) stored therein to the image control unit 120 and the drive control unit 230 simultaneously.
Then, in step 3-1-1 (S311), the image control unit 120 may substitute the target timestamp (Tk) received in the second step (S2) into the first database (DB1) and select, from the first to n-th images (omnidirectional images) (FP1 to FPn), the image (omnidirectional image) (FPk) corresponding to the target timestamp (Tk).
Then, in step 3-1-2 (S312), the first detection unit 130 may send its measured value to the image control unit 120, and the second detection unit 240 may send its measured value to the image control unit 120.
Then, in step 3-1-3 (S313), the image control unit 120 may compute the experiencer's field of view based on the measured values of the first detection unit 130 and the second detection unit 240.
Then, in step 3-1-4 (S314), the image control unit 120 may select, from the image (omnidirectional image) (FPk) chosen in step 3-1-1 (S311), the image corresponding to the experiencer's field of view computed in step 3-1-3 (S313) (the field-of-view image) (VPk), and send it to the image display unit 110 and the public screen 400.
Then, in step 3-1-5 (S315), the image display unit 110 and the public screen 400 may each play the image (VPk) received through step 3-1-4 (S314).
Here, to simplify the logic, the public screen 400 is formed so as to provide non-experiencers with the same image as the image (VPk) displayed on the image display unit 110, but it is not limited to this. When there are several experiencers, in order to avoid the question of whose image should be played on the public screen 400, the public screen 400 may, as in the third control method described later, be formed so as to provide non-experiencers with an image independent of the image (VPk) displayed on the image display unit 110; or, to save the time and expense of operating the public screen 400, the public screen 400 itself may be omitted.
Meanwhile, in step 3-2-1 (S321), the drive control unit 230 may substitute the target timestamp (Tk) received in the second step (S2) into the second database (DB2), select from the first to n-th motions (M1 to Mn) the motion (Mk) corresponding to the target timestamp (Tk), and send the selected motion (Mk) to the driving unit 220.
Then, in step 3-2-2 (S322), the driving unit 220 may perform the motion received through step 3-2-1 (S321).
Meanwhile, when at least one of step 3-1-4 (S314) and step 3-2-2 (S322) ends, in the fourth step (S4) the integrated control unit 320 may judge whether the experience has ended. That is, the integrated control unit 320 may judge whether the target timestamp (Tk) stored therein coincides with the n-th timestamp (Tn), the final timestamp.
In the fourth step (S4), when the experience is judged to have ended (when the target timestamp (Tk) coincides with the n-th timestamp (Tn)), the experience terminates; when the experience is judged to be still in progress (when they do not coincide), the process may proceed to the fifth step (S5), described later.
In the fifth step (S5), the integrated control unit 320 may judge whether the predetermined time (the interval between timestamps) has elapsed since the target timestamp (Tk) was sent in the second step (S2).
In the fifth step (S5), when the predetermined time is judged to have elapsed, the process proceeds to the sixth step (S6); when it has not yet elapsed, step 3-1-2 (S312) and the fifth step (S5) may proceed simultaneously.
In the sixth step (S6), the integrated control unit 320 may store the timestamp following the one stored so far as the target timestamp (Tk) as the new target timestamp (Tk). For example, when the timestamp stored so far as the target timestamp (Tk) is the first timestamp (T1), the second timestamp (T2) may be stored as the new target timestamp (Tk).
After the sixth step (S6) ends, the process may return to the second step (S2).
In a virtual reality experience device of this composition, the load imposed on the image control unit 120 and the drive control unit 230 is lightened, the processing speeds of the image control unit 120 and the drive control unit 230 increase, and the error rate can be reduced. The experience image can therefore be formed at a higher definition, and the experience motion can be formed more precisely. This is disadvantageous for synchronization, but a considerable degree of synchronization is already achieved in the editing unit 310, so it does not become a major problem.
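The trade-off between the first and second control methods can be seen by comparing the per-interval work each requires: the second method's loop only looks up and plays, with no compare-and-correct branch. A schematic sketch (hypothetical names; the work items are labels, not real operations):

```python
# First method (Figs. 8–10): each interval may also run a runtime
# correction. Second method (Fig. 11): the correction steps are deleted,
# so synchronization relies entirely on the pre-edited databases.
def tick_first_method(tk_target, tk_actual):
    work = ["lookup", "play"]
    if tk_actual != tk_target:       # S711.. / S721..: runtime correction
        work.append("correct speed")
    return work

def tick_second_method(tk_target):
    return ["lookup", "play"]        # no comparison, lighter controllers

print(tick_first_method(5, 4))   # → ['lookup', 'play', 'correct speed']
print(tick_second_method(5))     # → ['lookup', 'play']
```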
Meanwhile, the virtual reality experience device may also be operable by a third control method, shown in Figures 12 to 19.
That is, the virtual reality experience device is formed largely as in the case where it operates by the first control method, except that the public screen 400 may be formed so as to provide non-experiencers with an image (SPk) of the omnidirectional image (FPk) observed from a predetermined field of view, independent of the experience image supplied to the experiencer.
Specifically, the editing unit 310 may be formed so as to create a third database (DB3) that defines the public images (SP1 to SPn) to be played on the public screen 400 at the first to n-th timestamps (T1 to Tn) respectively.
The third database (DB3) may be stored in the image control unit 120.
Moreover, the image control unit 120 may be formed so as to substitute the timestamp (Tk) received from the integrated control unit 320 into the third database (DB3), select from the public images (SP1 to SPn) the public image (SPk) corresponding to the received timestamp (Tk), and send the selected public image (SPk) to the public screen 400. Here, the image (SPk) sent to the public screen 400 may be an image observed from a field of view different from that of the image (VPk) sent to the image display unit 110.
Moreover, on the same principle as for the experience image, the image control unit 120 may be formed so as to compare the target public image with the actual public image at a predetermined frequency (for example, 60 Hz) and make them coincide. A detailed description is omitted here to avoid repetition.
The third control method is described below.
Compared with the first control method, the third control method replaces the steps shown in Figures 3, 4 and 6 to 10 with the steps shown in Figures 12 to 19. That is, the third control method may include the steps shown in Figures 12 to 19.
Accordingly, in the editing step, the third database (DB3) may also be formed in the editing unit 310, and the third database (DB3) may be stored in the image control unit 120 together with the first database (DB1).
Moreover, in step 3-1-4 (S314), the image control unit 120 may select, from the image (omnidirectional image (FPk)) chosen in step 3-1-1 (S311), the image corresponding to the experiencer's field of view computed in step 3-1-3 (S313) (the field-of-view image (VPk)), and send it to the image display unit 110.
Moreover, in step 3-1-5 (S315), the image display unit 110 may play the image (VPk) received through step 3-1-4 (S314).
Moreover, in step 3-3-1 (S331), the image control unit 120 may substitute the target timestamp (Tk) received in the second step (S2) into the third database (DB3), select from the public images (SP1 to SPn) the public image (SPk) corresponding to the target timestamp (Tk), and send it to the public screen 400.
Moreover, in step 3-3-2 (S332), the public screen 400 may play the public image (SPk) received through step 3-3-1 (S331).
Moreover, when at least one of step 3-1-4 (S314), step 3-2-2 (S322) and step 3-3-2 (S332) ends, the process proceeds to the fourth step (S4).
Moreover, in the fifth step (S5), when the predetermined time is judged not to have elapsed, steps 7-1-1 (S711), 7-2-1 (S721) and 7-3-1 (S731), described later, may proceed simultaneously.
In step 7-3-1 (S731), the image control unit 120 may calculate the actual timestamp (Tk"'), i.e. the timestamp corresponding to the public image being played on the public screen 400.
Then, in step 7-3-2 (S732), the image control unit 120 may judge whether the actual timestamp (Tk"') calculated in step 7-3-1 (S731) coincides with the target timestamp (Tk) received in the second step (S2).
In step 7-3-2 (S732), when the target timestamp (Tk) is judged to coincide with the actual timestamp (Tk"'), the process returns to the second step (S2); when they are judged not to coincide, the process may proceed to step 7-3-3 (S733), described later.
In step 7-3-3 (S733), the image control unit 120 may judge whether the actual timestamp (Tk"') is earlier than the target timestamp (Tk).
In step 7-3-3 (S733), when the actual timestamp (Tk"') is judged to be earlier than the target timestamp (Tk), the process proceeds to step 7-3-4 (S734), described later; when it is judged to be later than the target timestamp (Tk), the process may proceed to step 7-3-5 (S735), described later.
In step 7-3-4 (S734), the image control unit 120 may instruct the public screen 400 to play the public images between the public image corresponding to the actual timestamp (Tk"') and the public image corresponding to the target timestamp (Tk) at a playback speed faster than the predetermined playback speed.
When step 7-3-4 (S734) ends, the process may return to the second step (S2).
In step 7-3-5 (S735), the image control unit 120 may instruct the public screen 400 to play the public images after the public image corresponding to the actual timestamp (Tk"') at a playback speed slower than the predetermined playback speed. Alternatively, the image control unit 120 may instruct the public screen 400 to play the public image corresponding to the actual timestamp (Tk"') repeatedly.
When step 7-3-5 (S735) ends, the process may return to the second step (S2).
Here, steps 7-3-1 to 7-3-5 (S731 to S735) may be carried out at a predetermined frequency (for example, 60 Hz).
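The public-screen path thus reuses the same select-then-correct pattern as the experience image, only keyed into the third database. A minimal sketch with hypothetical names:

```python
# Sketch of the public-screen path (S331 + S731–S735): the public image
# SPk is selected from the third database by the same target stamp, and
# corrected by the same compare-and-adjust rule as the experience image.
DB3 = {1: "SP1", 2: "SP2", 3: "SP3"}  # timestamp -> public image

def select_public_image(tk):
    return DB3[tk]                    # S331: lookup by target stamp

def public_correction(tk_actual, tk_target):
    if tk_actual == tk_target:        # S732: in sync, back to S2
        return "in sync"
    if tk_actual < tk_target:         # S734: screen lags the target
        return "faster"
    return "slower or repeat"         # S735: screen leads the target

print(select_public_image(2))    # → SP2
print(public_correction(1, 2))   # → faster
```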
In a virtual reality experience device of this composition, even when there are several experiencers, a public image can be provided to publicize the experience and attract prospective experiencers.
Meanwhile, in the foregoing embodiments the virtual reality experience device is formed so that the experience image and the experience motion provide a predetermined image and motion over time, independently of the experiencer's will, so that the experiencer undergoes the virtual reality experience as if watching a film. However, as shown in Figures 20 to 24, the virtual reality experience device may also be formed so that the experience image and the experience motion are provided as an image and motion corresponding to the experiencer's will, so that the experiencer undergoes the virtual reality experience as if playing a game.
Figure 20 is a system diagram illustrating the constituent elements of a virtual reality experience device according to another embodiment of the utility model, Figure 21 is a flowchart illustrating part of the control method for the virtual reality experience device of Figure 20, and Figures 22 to 24 are flowcharts each illustrating another part of the control method of Figure 21.
Here, the image device 100, the boarding device 200 and the control device (not shown) may be formed as in the preceding embodiments, but in this case in such a manner that the experience image is provided as an image corresponding to the experiencer's will and the experience motion is provided as a motion corresponding to the experiencer's will.
Specifically, the virtual reality experience device according to this embodiment may include the image device 100, the boarding device 200, the control device (not shown) and the public screen 400, and may further include an operating device 500 that receives input data from the experiencer; it may be formed so that the image device 100 provides the experiencer with a virtual reality image corresponding to the input data, and the boarding device 200 provides the experiencer with a physical motion corresponding to the input data.
The operating device 500 may be formed, for example, of a joystick, a haptic device, buttons, a sensor measuring the movement of the experiencer's gaze (the first detection unit 130), and the like, so that the input data includes information on position, directionality, speed, acceleration, rotation and so forth.
Moreover, the experience image may be formed of game content based on the input data.
The image device 100 and the boarding device 200 may each be formed as a master device. That is, the image device 100 may be formed so as to receive the input data from the operating device 500 and provide the experience image based on that input data, and the boarding device 200 may likewise be formed so as to receive the input data from the operating device 500 and provide the experience motion based on that input data. In that case, however, the capacities of the first database (DB1) and the second database (DB2) become large, and considerable time and expense are needed to build the first database (DB1) and the second database (DB2).
In view of this, as in this embodiment, the image device 100 may be formed as a master device and the boarding device 200 as a slave device. That is, the image device 100 may be formed so as to receive the input data from the operating device 500 and provide the experience image based on that input data, while the boarding device 200 is formed so as to provide the experience motion based on the experience image. The boarding device 200 could instead be formed as the master device and the image device 100 as the slave device, but since the image device 100 is subject to greater constraints than the boarding device 200, it is preferable that the image device 100 be formed as the master device and the boarding device 200 as the slave device.
To this end, the editing unit 310 may form the first database (DB1) variably according to the input data, and form the second database (DB2) variably according to the experience image. That is, the first database (DB1) may be formed so that the input data is the input value and the experience image is the output value, and the second database (DB2) may be formed so that the experience image is the input value and the experience motion is the output value.
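The master/slave chaining of the two databases can be sketched as two composed lookups: DB1 maps input data to an experience image, and DB2 maps that image to a motion. The keys below are hypothetical; a real database would cover continuous input ranges rather than discrete labels.

```python
# Sketch of the interactive embodiment's chained databases.
DB1 = {"stick_left": "image_turn_left",    # input data -> experience image
       "stick_right": "image_turn_right"}
DB2 = {"image_turn_left": "roll_left",     # experience image -> motion
       "image_turn_right": "roll_right"}

def control_unit(input_data):
    image = DB1[input_data]   # master side: image follows the input
    motion = DB2[image]       # slave side: motion follows the image
    return image, motion

print(control_unit("stick_left"))  # → ('image_turn_left', 'roll_left')
```

Because the motion is derived from the image rather than directly from the input, only DB1 needs to cover the full input space, which keeps the combined database size manageable.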
Moreover, the control unit (C) may be formed so as to store the first database (DB1) and the second database (DB2), receive the input data from the operating device 500, and control the image device 100 (more precisely, the image display unit 110) and the boarding device 200 (more precisely, the driving unit 220) based on the first database (DB1), the second database (DB2) and the input data. That is, the control unit (C) may be formed so as to substitute the input data received from the operating device 500 into the first database (DB1), select the experience image corresponding to that input data, send the selected experience image to the image display unit 110, substitute the selected experience image into the second database (DB2), select the motion corresponding to that image, and send it to the driving unit 220.
Moreover, for synchronization during the experience, the control unit (C) may be formed so as to compare the target motion with the actual motion at a predetermined time interval (for example, 12 ms) and make them coincide.
Specifically, the transmitting measured values of second test section 240 give the control unit (C), the control unit (C) can
With actual act corresponding with the measured value of second test section 240 to be moved with being sent to the target of the driving portion 220
The mode for making to be compared is formed.
Moreover, when actual act and target action difference, the control unit (C) can assign instruction, described to make
Driving portion 220 is performed with the actuating speed faster than the actuating speed being determined in advance.
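A minimal sketch of the correction rule just described, assuming a single scalar degree of freedom: when the measured actual action deviates from the target beyond some tolerance, a faster-than-nominal speed is commanded. The speed and tolerance values are hypothetical.

```python
NOMINAL_SPEED = 1.0   # predetermined operating speed of the driving unit 220
BOOST_SPEED = 1.5     # faster speed commanded when actual lags target
TOLERANCE = 0.01      # mismatch below this counts as "consistent"

def command_speed(target_action, actual_action):
    """Control unit C's decision: boost the drive speed on mismatch."""
    if abs(target_action - actual_action) > TOLERANCE:
        return BOOST_SPEED   # driving unit 220 instructed to catch up
    return NOMINAL_SPEED
```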
A virtual reality experience device of this configuration may operate by the real-time control method shown in Figures 21 to 24.
That is, in the editing step, the first database DB1 and the second database DB2 may be formed in the editing unit 310, and the first database DB1 and the second database DB2 may be stored in the control unit C.
In the execution step, the experiencer boards the boarding device 200, the image device 100 is mounted on the experiencer's head, and the experience may begin.
When the experience begins, in step 1' (S1'), the operating device 500 may receive input data from the experiencer.
Then, in step 2' (S2'), the operating device 500 may transmit the input data received in step 1' (S1') to the control unit C.
Then, in step 3' (S3'), the control unit C may substitute the input data received in step 2' (S2') into the first database DB1 and select the image (omnidirectional video) FPk corresponding to that input data.
Then, in step 4-1-1' (S411'), the first detection unit 130 may send its measured value to the control unit C, and the second detection unit 240 may send its measured value to the control unit C.
Then, in step 4-1-2' (S412'), the control unit C may compute the experiencer's visual field based on the measured value of the first detection unit 130 and the measured value of the second detection unit 240.
Then, in step 4-1-3' (S413'), the control unit C may select, from the image (omnidirectional video) FPk selected in step 3' (S3'), the image corresponding to the visual field computed in step 4-1-2' (S412') (the field-of-view image VPk), and send it to the image display unit 110 and the public screen 400.
Then, in step 4-1-4' (S414'), the image display unit 110 and the public screen 400 may respectively play the image VPk received in step 4-1-3' (S413').
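Steps 4-1-2' and 4-1-3' can be sketched as follows: the headset orientation (first detection unit 130) and the seat orientation (second detection unit 240) combine into one gaze direction, and a window centred on that direction is cropped out of the omnidirectional frame. Treating the frame as one column per degree, the additive combination, and the 90-degree field of view are all illustrative assumptions, not specifics from the patent.

```python
def visual_field(headset_yaw_deg, seat_yaw_deg):
    """Experiencer's gaze direction relative to the world, wrapped to [0, 360)."""
    return (headset_yaw_deg + seat_yaw_deg) % 360.0

def select_view(omnidirectional_frame, gaze_deg, fov_deg=90.0):
    """Crop the slice of the 360-degree frame FPk centred on the gaze direction."""
    width = len(omnidirectional_frame)            # columns covering 360 degrees
    centre = int(gaze_deg / 360.0 * width)
    half = int(fov_deg / 360.0 * width / 2)
    # Wrap around the seam of the omnidirectional frame with modular indexing.
    return [omnidirectional_frame[(centre + i) % width] for i in range(-half, half)]

frame = list(range(360))                          # stand-in for one frame of FPk
view = select_view(frame, visual_field(30.0, 15.0))   # the VPk sent to unit 110
```

Tracking the seat orientation as well as the headset is what keeps the view stable while the driving unit pitches or yaws the boarding part under the experiencer.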
Meanwhile, in step 4-2-1' (S421'), the control unit C may select the action Mk corresponding to the image (omnidirectional video) FPk selected in step 3' (S3'), and send the selected action Mk to the driving unit 220.
Then, in step 4-2-2' (S422'), the driving unit 220 may execute the action Mk received in step 4-2-1' (S421').
Meanwhile, when at least one of step 4-1-4' (S414') and step 4-2-2' (S422') ends, then in step 5' (S5') the control unit C may judge whether the experience has ended. That is, the control unit C may judge whether a separately set experience termination condition (for example, game over in game content) is satisfied.
In step 5' (S5'), when the experience is judged to have ended (when the experience termination condition is satisfied), the experience ends; when the experience is judged to be in progress (when the experience termination condition is not satisfied), the method may proceed to step 6' (S6') described later.
In step 6' (S6'), the second detection unit 240 may send its measured value (the actual action of the driving unit 220) to the control unit C.
Then, in step 7' (S7'), the control unit C may judge whether the measured value of the second detection unit 240 received in step 6' (S6') (the actual action Mk' of the driving unit 220) coincides with the target action Mk of the driving unit 220.
In step 7' (S7'), when the actual action Mk' of the driving unit 220 is judged to coincide with the target action Mk, the method returns to step 1' (S1'); when the actual action Mk' is judged not to coincide with the target action Mk, the method may proceed to step 8' (S8') described later.
In step 8' (S8'), the control unit C may issue an instruction to the driving unit 220 so that the driving unit 220 executes at an operating speed faster than the predetermined operating speed.
When step 8' (S8') ends, the method may return to step 1' (S1').
Here, steps 6' (S6') to 8' (S8') may be carried out at a predetermined time interval (for example, 12 ms).
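The loop formed by steps 6' to 8' can be sketched as below: on every 12 ms tick the measured actual action Mk' is compared with the target Mk, and on mismatch a larger (faster) step is commanded. The simulated scalar actuator and the step sizes are hypothetical stand-ins for the driving unit 220.

```python
INTERVAL_MS = 12  # predetermined interval between comparisons (from the text)

def run_sync_loop(target, actual, nominal_step=1.0, boost_step=2.0, ticks=10):
    """Advance a simulated actuator toward `target`, boosting while it lags.

    Returns the final position and how many milliseconds of ticks elapsed.
    """
    elapsed_ms = 0
    for _ in range(ticks):
        if abs(target - actual) < 1e-9:   # step 7': consistent -> back to S1'
            break
        elapsed_ms += INTERVAL_MS         # step 6': one measurement per interval
        # Step 8': command a faster-than-nominal step while far from target.
        step = boost_step if abs(target - actual) > nominal_step else nominal_step
        step = min(step, abs(target - actual))   # do not overshoot the target
        actual += step if target > actual else -step
    return actual, elapsed_ms
```

With a target of 5.0 starting from 0.0, the actuator converges in three boosted/nominal ticks (36 ms) instead of five nominal ones, which is the point of the speed-up instruction.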
In the foregoing embodiment the driving unit 220 is formed by the robot arm 221, but as shown in Figures 25 to 29, the driving unit 220 may instead be formed by a gyro mechanism 222 that causes the boarding part 210 to pitch, yaw, roll, and reciprocate. Here, reciprocation means the boarding part 210 moving away from and toward the structure 223 that supports the gyro mechanism 222.
Figure 25 is a perspective view illustrating a virtual reality experience device according to another embodiment of the present utility model, and Figures 26 to 29 are plan views each illustrating an action provided by the virtual reality experience device of Figure 25.
The gyro mechanism 222 may include: a first mechanism 2221 which, as shown in Figure 27, yaws the boarding part 210 and, as shown in Figure 29, reciprocates it; a second mechanism 2222 which, as shown in Figure 26, pitches the boarding part 210; and a third mechanism 2223 which, as shown in Figure 28, rolls the boarding part 210.
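Each of the three nested mechanisms contributes one rotation to the boarding part 210. A minimal sketch of how those rotations compose into a single orientation, using plain 3x3 rotation matrices; the assignment of yaw/pitch/roll to the z/y/x axes and the outermost-first composition order are illustrative conventions, not specified by the patent.

```python
import math

def rot_z(a):  # yaw, contributed by the first mechanism 2221
    c, s = math.cos(a), math.sin(a)
    return [[c, -s, 0.0], [s, c, 0.0], [0.0, 0.0, 1.0]]

def rot_y(a):  # pitch, contributed by the second mechanism 2222
    c, s = math.cos(a), math.sin(a)
    return [[c, 0.0, s], [0.0, 1.0, 0.0], [-s, 0.0, c]]

def rot_x(a):  # roll, contributed by the third mechanism 2223
    c, s = math.cos(a), math.sin(a)
    return [[1.0, 0.0, 0.0], [0.0, c, -s], [0.0, s, c]]

def matmul(a, b):
    """3x3 matrix product."""
    return [[sum(a[i][k] * b[k][j] for k in range(3)) for j in range(3)]
            for i in range(3)]

def boarding_orientation(yaw, pitch, roll):
    """Orientation of the boarding part 210: outermost mechanism applied first."""
    return matmul(rot_z(yaw), matmul(rot_y(pitch), rot_x(roll)))
```

Because the mechanisms are nested (each inner one is carried by the outer one), the matrix product mirrors the physical kinematic chain structure 223 → 2221 → 2222 → 2223.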
The first mechanism 2221 may be formed so as to be rotatable and reciprocable relative to the structure 223.
Specifically, a first engaging groove (not shown) into which the first mechanism 2221 is inserted is formed in the structure 223, and the first mechanism 2221 may include: a base part 2221a inserted into the first engaging groove (not shown); and an arm part 2221b extending from the base part 2221a toward the side opposite the structure 223 and supporting the second mechanism 2222.
The base part 2221a, while inserted in the first engaging groove (not shown), may be formed so as to be rotatable about the depth direction of the first engaging groove (not shown) as a rotation axis, and so as to be reciprocable along that depth direction.
Moreover, between the structure 223 and the first mechanism 2221 (more precisely, the base part 2221a), a first actuator (not shown) generating the driving force required for the rotary motion of the first mechanism 2221 and a second actuator (not shown) generating the driving force required for the reciprocating motion of the first mechanism 2221 may be formed.
The first actuator (not shown) and the second actuator (not shown) may each be formed with a motor, a reducer, and a transmission mechanism (for example, a pulley, a sprocket, a belt, or a chain).
Although not separately shown, the first mechanism 2221 may also be formed so as to be rotatable relative to the structure 223 while the portion supporting the second mechanism 2222 reciprocates in directions away from and toward the structure 223. That is, the arm part 2221b may include a first arm part 2221ba fixedly coupled to the base part 2221a and a second arm part 2221bb coupled to the first arm part 2221ba so as to be reciprocable while supporting the second mechanism 2222, and the base part 2221a, while inserted in the first engaging groove (not shown), may be formed so as only to be rotatable about the depth direction of the first engaging groove (not shown) as a rotation axis. In this case, the first actuator (not shown) may be formed between the structure 223 and the first mechanism 2221, and a second actuator (not shown) generating the driving force required for the reciprocating motion of the second arm part 2221bb may be formed between the first arm part 2221ba and the second arm part 2221bb.
The second mechanism 2222 may be supported on the first mechanism 2221 (more precisely, the arm part 2221b) and formed so as to rotate about a direction perpendicular to the rotation axis of the first mechanism 2221.
Specifically, a second engaging groove (not shown) extending in a direction perpendicular to the depth direction of the first engaging groove (not shown) is formed in the arm part 2221b of the first mechanism 2221, and the second mechanism 2222 may include a hinge part (not shown) inserted into the second engaging groove (not shown) and an annular ring part 2222b extending from the hinge part (not shown) and supporting the third mechanism 2223. Here, the hinge part (not shown) may be formed so as to extend from the outer periphery of the ring part 2222b in the radial direction of the ring part 2222b.
The hinge part (not shown), while inserted in the second engaging groove (not shown), may be formed so as to be rotatable about the depth direction of the second engaging groove (not shown) as a rotation axis.
Moreover, between the arm part 2221b of the first mechanism 2221 and the second mechanism 2222 (more precisely, its hinge part (not shown)), a third actuator (not shown) generating the driving force required for the rotary motion of the second mechanism 2222 may be formed.
The third actuator (not shown) may be formed similarly to the first actuator (not shown).
The third mechanism 2223 is supported on the second mechanism 2222 (more precisely, the ring part 2222b) and may rotate about a direction perpendicular to both the rotation axis of the first mechanism 2221 and the rotation axis of the second mechanism 2222. In this case, the boarding part 210 may be fixedly coupled to the third mechanism 2223.
Specifically, the third mechanism 2223 may be formed as an annular ring concentric with the second mechanism 2222 (more precisely, the ring part 2222b), and the outer circumferential surface of the third mechanism 2223 may be rotatably coupled to the inner circumferential surface of the second mechanism 2222 (more precisely, the ring part 2222b).
Moreover, between the inner circumferential surface of the second mechanism 2222 and the outer circumferential surface of the third mechanism 2223, a fourth actuator (not shown) generating the driving force required for the rotary motion of the third mechanism 2223 may be formed.
Here, the third mechanism 2223 may be coupled such that, with its entire outer circumferential surface facing the entire inner circumferential surface of the second mechanism 2222, it can slide in the circumferential direction relative to the inner circumferential surface of the second mechanism 2222.
A virtual reality device including the gyro mechanism 222 of this configuration can provide experience actions to the experiencer even in a narrower space than a virtual reality device including the robot arm 221.
In the embodiment shown in Figures 25 to 29 the gyro mechanism 222 is formed so as to provide all of pitching, yawing, rolling, and reciprocation, but the gyro mechanism 222 may also be formed so as to provide only some of pitching, yawing, rolling, and reciprocation.
As shown in Figures 30 to 34, the driving unit 220 may also be formed with both the robot arm 221 and the gyro mechanism 222. In this case, the boarding part 210 may be coupled to the third mechanism 2223 of the gyro mechanism 222, and the gyro mechanism 222 may be coupled to the free end of the robot arm 221.
Figure 30 is a perspective view illustrating a virtual reality experience device according to yet another embodiment of the present utility model, and Figures 31 to 34 are perspective views each illustrating an action provided by the virtual reality experience device of Figure 30.
This makes it possible to provide actions that cannot be embodied with the robot arm 221 alone.
For example, referring to Figures 31 and 32, with the robot arm 221 holding the boarding part 210 at its upper limit position, the gyro mechanism 222 causes at least one of pitching, yawing, rolling, and reciprocation in the boarding part 210, so the experiencer can take various postures even at the uppermost position.
As another example, referring to Figure 33, with the robot arm 221 holding the boarding part 210 at its forward limit position, the gyro mechanism 222 causes at least one of pitching, yawing, rolling, and reciprocation in the boarding part 210, so the experiencer can take various postures even at the foremost position.
As yet another example, referring to Figure 34, while the robot arm 221 revolves the boarding part 210 relative to the ground, the gyro mechanism 222 adds at least one of pitching, yawing, rolling, and reciprocation to the boarding part 210, so the experiencer can also spin while revolving and take various postures.
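The combined configuration can be sketched as a simple superposition: the robot arm 221 sets a base pose, and the gyro mechanism 222 layers its own degrees of freedom on top, yielding postures neither device reaches alone. The dictionary-of-DOFs representation and the additive combination are deliberately simplified illustrations, not the patent's kinematics.

```python
def combined_pose(arm_pose, gyro_offsets):
    """Superpose the gyro mechanism's offsets on the robot arm's base pose.

    Both arguments are dicts mapping degree-of-freedom names (hypothetical)
    to values; missing entries default to 0.0.
    """
    return {dof: arm_pose.get(dof, 0.0) + gyro_offsets.get(dof, 0.0)
            for dof in set(arm_pose) | set(gyro_offsets)}

# Arm holds the boarding part at its upper limit; the gyro still adds pitch,
# which the arm alone could not provide at that position.
pose = combined_pose({"height": 3.0}, {"pitch": 15.0})
```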
As a result, the limits on the actions the driving unit 220 can provide are reduced and the freedom in producing images is improved, so the constraints on the virtual reality to be embodied can be reduced.
Industrial Applicability
The present utility model relates to virtual reality experience devices and, more specifically, to a virtual reality experience device capable of providing an image and a physical action.
Claims (10)
1. A virtual reality experience device, comprising:
an image device which provides an image to an experiencer; and
a boarding device which provides an action to the experiencer;
wherein the boarding device includes:
a boarding part which provides a space in which the experiencer can ride; and
a gyro mechanism which causes at least one of pitching, yawing, rolling, and reciprocation in the boarding part.
2. The virtual reality experience device according to claim 1, wherein
the gyro mechanism includes:
a first mechanism which yaws the boarding part;
a second mechanism which pitches the boarding part; and
a third mechanism which rolls the boarding part.
3. The virtual reality experience device according to claim 2, wherein
the first mechanism is formed so as to be rotatable relative to a structure supporting the gyro mechanism,
the second mechanism is supported on the first mechanism and formed so as to be rotatable about an axis perpendicular to the rotation axis of the first mechanism, and
the third mechanism is supported on the second mechanism and formed so as to be rotatable about an axis perpendicular to the rotation axis of the second mechanism.
4. The virtual reality experience device according to claim 3, wherein
the boarding part is fixedly coupled to the third mechanism.
5. The virtual reality experience device according to claim 3, wherein
a first actuator generating the driving force required for the rotary motion of the first mechanism is formed between the structure and the first mechanism,
a third actuator generating the driving force required for the rotary motion of the second mechanism is formed between the first mechanism and the second mechanism, and
a fourth actuator generating the driving force required for the rotary motion of the third mechanism is formed between the second mechanism and the third mechanism.
6. The virtual reality experience device according to claim 3, wherein
the first mechanism is formed so as also to reciprocate the boarding part.
7. The virtual reality experience device according to claim 6, wherein
the first mechanism is formed so as to be reciprocable relative to the structure.
8. The virtual reality experience device according to claim 7, wherein
a second actuator generating the driving force required for the reciprocating motion of the first mechanism is formed between the structure and the first mechanism.
9. The virtual reality experience device according to claim 6, wherein
the first mechanism is formed such that the portion supporting the second mechanism can reciprocate in directions away from and toward the structure.
10. The virtual reality experience device according to claim 9, wherein
a second actuator generating the driving force required for the reciprocating motion of the portion supporting the second mechanism is formed in the first mechanism.
Applications Claiming Priority (2)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
KR1020160123261A KR101885128B1 (en) | 2016-03-11 | 2016-09-26 | Virtual reality experience apparatus |
KR10-2016-0123261 | 2016-09-26 |
Publications (1)
Publication Number | Publication Date |
---|---|
CN207591284U true CN207591284U (en) | 2018-07-10 |
Family
ID=62759753
Family Applications (1)
Application Number | Title | Priority Date | Filing Date |
---|---|---|---|
CN201721233518.5U Active CN207591284U (en) | 2016-09-26 | 2017-09-25 | Virtual reality experience device |
Country Status (1)
Country | Link |
---|---|
CN (1) | CN207591284U (en) |
- 2017-09-25 CN CN201721233518.5U patent/CN207591284U/en active Active
Similar Documents
Publication | Publication Date | Title |
---|---|---|
CN107533377A (en) | Virtual reality experience device | |
US10293257B2 (en) | Systems and methods for programmatically generating non-stereoscopic images for presentation and 3D viewing in a physical gaming and entertainment suite | |
AU685953B2 (en) | System for human trajectory learning | |
KR20210127936A (en) | Augmented Cognitive Method and Apparatus for Simultaneous Feedback of Psychomotor Learning | |
US5913727A (en) | Interactive movement and contact simulation game | |
EP0880380B1 (en) | System for human trajectory learning in virtual environments | |
US6741911B2 (en) | Natural robot control | |
US10362299B1 (en) | System for introducing physical experiences into virtual reality (VR) worlds | |
CN108701429A (en) | The virtual and/or augmented reality that physics interactive training is carried out with operating robot is provided | |
US10675766B1 (en) | System for introducing physical experiences into virtual reality (VR) worlds | |
WO2017094356A1 (en) | Video producing device, method for controlling video producing device, display system, video production control program, and computer-readable recording medium | |
CN204543523U (en) | A kind of virtual reality wears display system | |
US10832490B2 (en) | Virtual reality experience apparatus capable of providing experiencing user with virtual reality image and physical motion | |
CN207591284U (en) | Virtual reality experience device | |
Liebermann et al. | The use of feedback-based technologies | |
Zago et al. | Observing human movements helps decoding environmental forces | |
KR102083528B1 (en) | Experience apparatus | |
Nawahdah et al. | Helping physical task learning by automatic adjustment of a virtual teacher's rotation angle | |
KR20190075358A (en) | Experience apparatus |
Legal Events
Date | Code | Title | Description |
---|---|---|---|
GR01 | Patent grant | ||