JP4129072B2 - Motion-based control device - Google Patents

Motion-based control device

Info

Publication number
JP4129072B2
Authority
JP
Japan
Prior art keywords
motion
video
frame
control
base
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Expired - Fee Related
Application number
JP08309298A
Other languages
Japanese (ja)
Other versions
JPH11276719A (en)
Inventor
Minoru Kato
Masayuki Oshiro
Tsukasa Shiina
Original Assignee
Hitachi KE Systems, Ltd.
Hitachi Industrial Equipment Systems Co., Ltd.
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Hitachi KE Systems, Ltd. and Hitachi Industrial Equipment Systems Co., Ltd.
Priority to JP08309298A
Priority claimed from US09/276,739
Publication of JPH11276719A
Application granted
Publication of JP4129072B2

Description

[0001]
BACKGROUND OF THE INVENTION
The present invention relates to control in a simulation ride system that operates a motion base in accordance with a video, and in particular to a method of synchronizing the video and the motion base.
[0002]
[Prior art]
Conventionally, because the video scenario and the motion base operation are fixed in advance (a non-interactive system), only the start times of the video device and the motion base control device are matched; after that, the two devices are controlled independently.
[0003]
[Problems to be solved by the invention]
However, in the conventional technology, the video and the motion base operation, although synchronized at the start time, may later drift out of sync because of differences in the processing capability of each control device.
When the synchronization between the video and the motion base breaks down, motion data originally generated to match a particular video scene is executed during a different scene.
This causes discomfort to the people on the motion base and, as a result, destroys the sense of immersion in the video world that the simulation ride system aims for.
The problem to be solved by the present invention is to provide a correction means for when synchronization between the video and the motion base operation is lost.
[0004]
[Means for Solving the Problems]
The present invention includes the following means for solving the above-mentioned problems without impairing the immersive feeling of the person on the motion base.
(1) A correcting means that adjusts the motion base operation with reference to the video.
(2) A means for correcting using the frame NO in a video synchronization command that the video device can output for each frame.
[0005]
(3) The following correction methods are provided.
○ A means that compares the frame NO in the video synchronization command (the frame currently displayed by the video device) with the frame NO of the motion data being executed by the motion base, obtains a correction speed from the difference, and synchronizes the motion data with the video by changing the motion base operation speed.
[0006]
○ A means that, when using the frame NO, changes the motion data to the motion data of the frame NO of the currently displayed image output from the video device, then executes the subsequent motion data in order, thereby synchronizing with the video.
[0007]
(Function)
A motion base control device having the means described above can, even when a deviation between the video and the motion base operation occurs, restore synchronization by correcting the motion base operation rather than the video (correcting the video could result in dropped frames).
[0008]
DETAILED DESCRIPTION OF THE INVENTION
An embodiment of the present invention will be described with reference to FIG. FIG. 1 shows a system configuration for implementing the present invention.
In the video device, the video control device 1-1 drives the projector 1-2 to project the video onto the screen 1-3.
The motion base 1-5 is controlled by the motion base control device 1-4. The motion base control device 1-4 and the video control device 1-1 are connected via a LAN 1-6 and can exchange data.
[0009]
When the video device starts the video, the video start command 2-1 and the video synchronization command 2-2 shown in FIG. 2 are transmitted over the LAN 1-6 and received by the motion base control device 1-4. The motion base control device 1-4 is divided into a man-machine controller processor (MCP) 3-1, which handles the interface with the LAN shown in FIG. 3, and a real-time control processor (RCP) 3-2, which performs motion control in real time; the two are coupled by a DPRAM 3-4 with an interrupt function.
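The MCP/RCP hand-off above can be pictured as a shared-memory mailbox plus an interrupt line. The following is a minimal Python sketch under assumed names (`DpramMailbox`, `write_command`, and `wait_command` are illustrative; the patent defines no API), with a `threading.Event` standing in for the DPRAM interrupt function:

```python
import threading

class DpramMailbox:
    """Sketch of the DPRAM-with-interrupt coupling between the LAN-side
    processor (MCP) and the real-time processor (RCP). All names are
    illustrative assumptions, not from the patent."""

    def __init__(self):
        self._area = {}                      # shared command area in the DPRAM
        self._interrupt = threading.Event()  # stands in for the interrupt line

    def write_command(self, cmd):
        # MCP side (process (1)): write the command, then raise an interrupt
        # so the RCP can recognize it.
        self._area["command"] = cmd
        self._interrupt.set()

    def wait_command(self, timeout=None):
        # RCP handler side (processes (2)-(3)): wake on the interrupt and
        # read the command out of the shared area; None on timeout.
        if not self._interrupt.wait(timeout):
            return None
        self._interrupt.clear()
        return self._area.get("command")

box = DpramMailbox()
box.write_command("VIDEO_START")
print(box.wait_command(0.1))  # -> VIDEO_START
```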
[0010]
First, when the video start command 2-1 is received by the LAN control system 3-3 of the MCP 3-1, the LAN control system 3-3 writes the command into the DPRAM 3-4 area used to pass data to the RCP 3-2, and raises an interrupt so that the RCP 3-2 can recognize it (process (1)).
When the interrupt reaches the RCP 3-2 (process (2)), the handler 3-10 of the RCP 3-2 runs (process (3)) and, via the man-machine controller processor control (process (4)), transfers the data to the MCL control 3-5 that governs the motion (process (5)).
The MCL control 3-5 determines whether the motion base is in a startable state. If it is, the external clock process 3-6, which triggers the periodic activation of the time control 3-7 (servo control), is started (process A).
The external clock process 3-6 activates the time control 3-7 once per cycle, i.e. the time interval in which the video device displays one frame (process B). The time control acquires data, arranged in frame-NO order, from the motion data file 3-8 (process D) and performs servo control using it as a command value (process E), realizing the motion base operation.
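The per-cycle loop of processes B, D, and E reduces to fetching the motion data for the current frame NO and handing it to the servo as a command value. A minimal sketch, in which the motion data table, values, and function names are illustrative assumptions (the patent gives only the FIG. 4 format, not concrete data):

```python
# Frame NO -> command value; a simplified stand-in for the motion data
# file 3-8 in the FIG. 4 format.
motion_data_file = {0: 0.0, 1: 0.5, 2: 1.0, 3: 0.7}

def run(ticks, servo_out):
    """One iteration per external-clock cycle (one video-frame interval):
    fetch the motion data for the current frame NO (process D) and output
    it as a servo command value (process E), advancing in frame-NO order."""
    frame_no = 0
    for _ in range(ticks):
        command = motion_data_file.get(frame_no)
        if command is None:          # no more motion data: stop
            break
        servo_out.append(command)    # process E: command value to the servo
        frame_no += 1

out = []
run(4, out)
print(out)  # -> [0.0, 0.5, 1.0, 0.7]
```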
[0011]
During operation, the motion data for each frame, in the format shown in FIG. 4, is fetched every external clock cycle and the command value is output to the servo (process E), realizing control that matches the video.
However, in practice the frame display cycle of the video device begins to vary, and a discrepancy arises with the motion control device, which assumes the cycle to be constant.
This state is described with reference to FIG. 5. In the video, the platform is supposed to drop at frame N+1, so the motion base must synchronize with this and operate in the falling state 5-1. However, if the synchronization slips (in FIG. 5 the motion base is delayed), the operation remains in the horizontal state 5-2.
When this delay occurs, the video and the motion base operation are no longer synchronized, which causes discomfort to the person on the motion base and destroys the feeling of immersion in the video.
[0012]
Therefore, the present invention corrects the motion base operation, realizing a system that can remain synchronized by correction even when the frame timing of the video device is disturbed.
First, the frame NO in the video synchronization command (the frame currently displayed by the video device) is compared with the frame NO of the motion data being executed by the motion base; the correction speed is obtained from the difference, and the motion data and the video are synchronized by changing the motion base operation speed.
[0013]
FIG. 6 shows the flow of this process.
The motion base control device passes the video synchronization command to the MCL control 3-5 shown in FIG. 3 in the same manner as the video start command. The MCL control extracts the frame NO (FeNO, 2-4 in FIG. 2) from the video synchronization command (step 6-1). At the same time, it extracts the frame NO (FmNO, 4-1 in FIG. 4) of the motion data being executed by the time control 3-7 shown in FIG. 3 (step 6-2). The two frame NOs are compared (step 6-3); if they are equal, synchronization is judged to be established and no correction is performed. If they differ, the correction speed ΔV is calculated according to Equation 1 (step 6-4).
[0014]
The correction speed ΔV is calculated as follows (Equation 1):

ΔV = Vn − ΔL / (T + ΔT)

Vn: movement speed to the target position
T: arrival time
ΔL: movement distance from the current position to the target position
ΔT: synchronization deviation time, calculated from the frame NO difference as ΔT = (FeNO − FmNO) × S
FeNO: frame NO of the video synchronization command
FmNO: frame NO of the motion data being executed by the motion control device
S: one frame period
[0015]
Next, the magnitude relation of the frame NOs is compared (step 6-5). If the frame NO of the video synchronization command is larger, the video is judged to be ahead, and the correction speed calculated by Equation 1 is added to the original operation speed to speed up the motion base (step 6-6).
If the relation is reversed, the video is judged to be behind (step 6-5), and the calculated correction speed is subtracted to reduce the operation speed (step 6-7).
This processing realizes the speed correction.
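Equation 1 and the branch of steps 6-3 to 6-7 can be sketched in Python. One assumption is made explicit in the code: ΔT is taken as a magnitude so that the subtraction in step 6-7 actually reduces the speed, as the text states; the patent itself writes ΔT = (FeNO − FmNO) × S with the signed difference.

```python
def correction_speed(v_n, delta_l, t_arrive, fe_no, fm_no, s):
    """Equation 1: dV = Vn - dL/(T + dT), with dT = |FeNO - FmNO| * S.
    Taking dT as a magnitude is an assumption; the sign of the deviation
    is handled by the add/subtract branch below (steps 6-6 / 6-7)."""
    d_t = abs(fe_no - fm_no) * s
    return v_n - delta_l / (t_arrive + d_t)

def corrected_speed(v_n, delta_l, t_arrive, fe_no, fm_no, s):
    if fe_no == fm_no:               # step 6-3: in sync, no correction
        return v_n
    d_v = correction_speed(v_n, delta_l, t_arrive, fe_no, fm_no, s)
    if fe_no > fm_no:                # video ahead: speed up (step 6-6)
        return v_n + d_v
    return v_n - d_v                 # video behind: slow down (step 6-7)

# Video two frames ahead at an assumed 30 fps (S = 1/30 s):
print(round(corrected_speed(1.0, 1.0, 1.0, 32, 30, 1.0 / 30), 4))  # -> 1.0625
```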
[0016]
Next, the frame correction will be described with reference to FIG. 7.
As in the speed correction, the processing up to the comparison of the frame NO difference is the same (steps 7-1 to 7-3). If the frame NOs differ, the currently executed frame NO is changed to the frame NO of the video synchronization command (step 7-4).
Because the motion control apparatus executes the motion data in frame-NO order, changing the frame NO causes the motion data corresponding to the new frame NO to be executed next (step 7-5), achieving the correction.
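The frame correction of steps 7-1 to 7-5 reduces to adopting the video's frame NO and continuing execution from there. A minimal sketch, with illustrative function names and data:

```python
def frame_correction(motion_frame_no, video_frame_no):
    """Steps 7-1 to 7-3: compare the two frame NOs; step 7-4: if they
    differ, adopt the video's frame NO as the new execution position."""
    if motion_frame_no == video_frame_no:
        return motion_frame_no       # in sync: keep executing as-is
    return video_frame_no            # jump to the video's frame NO

def execute_from(frame_no, motion_data, n):
    """Step 7-5: execution continues in frame-NO order from the new NO;
    returns the next n command values (illustrative helper)."""
    stop = min(frame_no + n, len(motion_data))
    return [motion_data[f] for f in range(frame_no, stop)]

data = [0.0, 0.2, 0.5, 0.9, 0.7, 0.3]     # command values by frame NO
new_no = frame_correction(1, 3)            # motion base is behind the video
print(execute_from(new_no, data, 3))       # -> [0.9, 0.7, 0.3]
```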
[0017]
[Effects of the invention]
The motion base control apparatus according to the present invention provides a fine correction, because the time-based speed correction adjusts the operation on every frame, as well as a correction that is effective when the synchronization has shifted greatly, namely the one-shot frame correction.
According to the present invention, the synchronization shift between the motion base control device and the video device can be eliminated, making it possible to construct a simulation ride system that produces a stronger sense of immersion.
[Brief description of the drawings]
FIG. 1 is a system configuration diagram according to an embodiment of the present invention.
FIG. 2 is an I / F command specification diagram between a video and a motion control device.
FIG. 3 is a diagram showing a configuration and data flow of a motion control device.
FIG. 4 is a diagram showing a format of motion data.
FIG. 5 is an explanatory diagram showing a difference between video and motion control.
FIG. 6 is a flowchart of a speed correction function.
FIG. 7 is a flowchart of a frame correction function.
[Explanation of symbols]
1-1 Video control device 1-2 Projector 1-3 Screen 1-4 Motion base control device 1-5 Motion base 1-6 LAN
3-1 Man-machine control processor (MCP)
3-2 Real-time control processor (RCP)
3-4 DPRAM
3-5 Motion control logic (MCL)

Claims (2)

  1. A motion base control device comprising: a video device that displays a video on a screen; and a control device that controls the operation of a motion base by sequentially executing motion base motion data created from the video data so that the motion base operates in accordance with the video; wherein the control device includes correcting means for operating the motion base in synchronization with the video data of the video device, the correcting means comprising: means for receiving from the video device a video frame NO, which is the frame NO of the video currently being displayed; means for detecting a motion frame NO, which is the frame NO of the motion base motion data currently being executed by the motion base; means for obtaining the difference between the video frame NO and the motion frame NO; means for calculating from the difference a corrected motion base operation speed; and means for operating the motion base at the calculated speed, thereby eliminating the difference between the video frame NO and the motion frame NO and performing an operation synchronized with the video.
  2. A motion base control device comprising: a video device that displays a video on a screen; and a control device that controls the operation of a motion base by sequentially executing motion base motion data created from the video data so that the motion base operates in accordance with the video; wherein the control device includes correcting means for operating the motion base in synchronization with the video data of the video device, the correcting means comprising: means for receiving from the video device a video frame NO, which is the frame NO of the video currently being displayed; means for detecting a motion frame NO, which is the frame NO of the motion base motion data currently being executed by the motion base; means for obtaining the difference between the video frame NO and the motion frame NO; and means for, when there is a difference, changing the motion frame NO to the video frame NO, thereby eliminating the difference and performing an operation synchronized with the video.
JP08309298A 1998-03-30 1998-03-30 Motion-based control device Expired - Fee Related JP4129072B2 (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
JP08309298A JP4129072B2 (en) 1998-03-30 1998-03-30 Motion-based control device

Applications Claiming Priority (6)

Application Number Priority Date Filing Date Title
JP08309298A JP4129072B2 (en) 1998-03-30 1998-03-30 Motion-based control device
US09/276,739 US6413090B1 (en) 1998-03-30 1999-03-26 VR motion base control apparatus and it's supporting structure
US09/769,342 US6336811B2 (en) 1998-03-30 2001-01-26 VR motion base control apparatus and it's supporting structure
US09/769,301 US6409509B2 (en) 1998-03-30 2001-01-26 VR motion base control apparatus and it's supporting structure
US10/066,692 US6641399B2 (en) 1998-03-30 2002-02-06 VR motion base control apparatus and it's supporting structure
US10/073,336 US20020150865A1 (en) 1998-03-30 2002-02-13 VR motion base control apparatus and it's supporting structure

Publications (2)

Publication Number Publication Date
JPH11276719A JPH11276719A (en) 1999-10-12
JP4129072B2 true JP4129072B2 (en) 2008-07-30

Family

ID=13792547

Family Applications (1)

Application Number Title Priority Date Filing Date
JP08309298A Expired - Fee Related JP4129072B2 (en) 1998-03-30 1998-03-30 Motion-based control device

Country Status (1)

Country Link
JP (1) JP4129072B2 (en)

Families Citing this family (2)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
WO2004102506A1 (en) * 2003-05-14 2004-11-25 D-Box Technology Inc. Flexible interface for controlling a motion platform
KR101154122B1 (en) * 2012-02-20 2012-06-11 씨제이포디플렉스 주식회사 System and method for controlling motion using time synchronization between picture and motion


Legal Events

Date Code Title Description
RD02 Notification of acceptance of power of attorney

Free format text: JAPANESE INTERMEDIATE CODE: A7422

Effective date: 20050204

A711 Notification of change in applicant

Free format text: JAPANESE INTERMEDIATE CODE: A712

Effective date: 20050204

A621 Written request for application examination

Free format text: JAPANESE INTERMEDIATE CODE: A621

Effective date: 20050204

A131 Notification of reasons for refusal

Free format text: JAPANESE INTERMEDIATE CODE: A131

Effective date: 20070522

A521 Written amendment

Free format text: JAPANESE INTERMEDIATE CODE: A523

Effective date: 20070706

A131 Notification of reasons for refusal

Free format text: JAPANESE INTERMEDIATE CODE: A131

Effective date: 20071204

TRDD Decision of grant or rejection written
A01 Written decision to grant a patent or to grant a registration (utility model)

Free format text: JAPANESE INTERMEDIATE CODE: A01

Effective date: 20080513

A61 First payment of annual fees (during grant procedure)

Free format text: JAPANESE INTERMEDIATE CODE: A61

Effective date: 20080516

FPAY Renewal fee payment (event date is renewal date of database)

Free format text: PAYMENT UNTIL: 20110523

Year of fee payment: 3

R150 Certificate of patent or registration of utility model

Free format text: JAPANESE INTERMEDIATE CODE: R150

LAPS Cancellation because of no payment of annual fees