The invention proposes a method by which a remote user provides instructions and guidance to a local engineer at the location of an industrial plant in which a process control system is running, and by which related data is collected.
Fig. 1 schematically shows an industrial plant in which a process control system 10 is provided. The process control system 10 is a computerized process control system for controlling an industrial process. The process may be any kind of industrial process, such as electrical power generation, transmission and distribution, water purification and distribution, oil and gas production and distribution, petrochemical, chemical, pharmaceutical and food processes, as well as pulp and paper production. These are only some examples of processes to which the system may be applied; there exist countless other industrial processes. The process may also be another type of industrial process, such as the manufacturing of goods. The process may be monitored through one or more process monitoring computers, which communicate with a server handling monitoring and control of the process.
The process control system 10 in Fig. 1 therefore includes a number of process monitoring computers 12 and 14. These computers, which may here also be considered to form operator terminals, are connected to a first data bus B1. There is also a gateway 16 connected to this first data bus B1, which gateway 16 is connected to at least one wireless network WN. The gateway is also connected to a public data communication network, which is here the Internet IN. A video communication device 32 is connected to the wireless network WN. The wireless network WN may be a local area network, such as a wireless local area network (WLAN). It may also be a Bluetooth network, i.e. a network of a number of interconnected Bluetooth nodes. It may furthermore be a mobile communication network.
There is furthermore a second data bus B2, and between the first and second data buses B1 and B2 there are connected a server 18 providing control and protection of the process, and a database 20 in which data relating to process control and protection is stored. Data relating to control may here include process data such as measurements and control commands, while data relating to protection may include alarm and event data as well as data on which alarms and events can be generated, such as measurements made in the process. Faceplates of process control objects may also be provided, where a faceplate may include process control data from the database 20 concerning a process control object. Moreover, between the two buses B1 and B2 there is connected an optional object data server 21. This object data server 21 includes data about all the process control objects, such as blueprints, specifications and manuals of these process control objects.
A number of further devices 24, 26, 28 and 30 are furthermore connected to the second data bus B2. These further devices 24, 26, 28 and 30 are field devices, which are interfaces to the process being controlled. A field device is typically an interface via which measurements of the process are obtained and via which control commands are given. A field device is therefore also a process control object. In one variation of the invention, a first field device is a first process control object 24, a second field device is a second process control object 26, and a third field device is a third process control object 28.
Fig. 2 shows a block diagram of a number of units provided in the video communication device 32. The video communication device 32 is provided with a housing 49. In the housing 49 there is provided a bus 33, and to this bus 33 there are connected an optional short-range communication unit 46 or proximity sensor, a video projector 48, a camera 34, a sound recording controller 36, a program memory 39, a processor 40 and a radio communication circuit 42. There may also be at least one further sensor, such as a temperature sensor, an accelerometer, an ambient light sensor and a gyroscope (not shown). The radio communication circuit 42 is in turn connected to an antenna 44, where the radio communication circuit 42 and antenna 44 are provided for communication with the wireless network WN. The radio communication circuit 42 and antenna 44 together form one type of communication interface, used for communicating with the process control system and with other entities. It may therefore be a WiFi or WLAN interface. It may also be a mobile communication interface. It should furthermore be realized that there may be two communication interfaces in the video communication device, one mobile communication interface and one WiFi interface. The sound recording controller 36 is connected to a microphone 35, where the sound recording controller 36 and microphone 35 together form a sound recording unit that may be used for recording sounds at the location of the process control system. Although not shown, the video communication device 32 may also comprise a sound emitting unit, such as a loudspeaker and earphones. It is also possible that a microphone and earphones are combined in a headset connected to the video communication device 32. The short-range communication unit 46 may also be considered to be a type of sensor, object sensor or proximity sensor, for sensing a process control object to be serviced. This sensor may operate using near field communication (NFC) technology. In the program memory 39 there is provided software code, which forms a control unit 38 when being run by the processor 40. The control unit 38 is more particularly configured to perform a number of functions under the control of the remote user.
The video communication device 32 is movable within the premises of the industrial plant. It can thus be moved from one location to another. It may also be positioned so that it can capture video images and make digital presentations via the projector 48. For this purpose the housing 49 may be placed on a tripod 50, which is schematically shown in Fig. 3. The camera 34 has a field of view, i.e. an area in which it senses its surroundings. The field of view can be changed in various ways. It can be increased through a zoom-out command and decreased through a zoom-in command. The field of view may also be shifted or moved using various types of panning commands. In order to obtain panning, the orientation of the camera may be changed. In a similar way, the projector has a presentation or projection area, i.e. an area in which presentation information can be visualized. This projection area may be centered on the line of sight of the projector, and may have any suitable shape, such as circular, rectangular or quadratic. The projection area may also be moved through changing the orientation of the projector. The camera 34 can change its orientation in three dimensions, and the projector 48 can likewise change its orientation in three dimensions. These orientations may furthermore be changed independently of each other. In one variation, the orientations may be changed jointly through changing the orientation of the whole housing 49. Figs. 4a and 4b schematically show movements for achieving such reorientation. It can be seen that the housing 49 can be rotated 360 degrees in the horizontal plane around a vertical axis of the tripod 50. It can also be seen that the housing 49 can be tilted up and down vertically. In order to obtain such movement, the video communication device 32 may be provided with at least one motor. As indicated above, there may also be more than one motor, for instance one motor for providing vertical movement and another for providing horizontal movement. In order to obtain individual movement of the camera and the projector, there may also be two such pairs of motors, where one pair is provided for the camera 34 and another pair for the projector 48. Individual movement may thus be provided even though the camera and projector are provided inside the same housing.
As is evident from the above, the projector 48 can change orientation independently of the camera. The camera 34 and the projector 48 can thereby be pointed in different directions.
As can be seen in Fig. 1, the video communication device 32 can access the Internet IN via the wireless network WN. This allows the video communication device 32 to be operated remotely, i.e. from somewhere else outside the plant. The video communication device 32 can thereby be operated by a remote user, for instance via a computer of the remote user 52. This situation is schematically shown in Fig. 5, where it can be seen that a computer 51 of the remote user 52 can communicate with the video communication device 32 via the Internet. In this way the control unit 38 of the video communication device 32 may, for instance, receive commands from the remote user 52 via a web site to which the remote user 52 can log in. As an alternative, control commands may be sent directly from the computer 51 of the remote user 52.
The remote user 52 is thereby able to obtain the video images captured by the camera 34 of the video communication device 32, which video images are then presented to him or her via the computer 51 of the remote user 52. This is shown in Fig. 6. Fig. 5 schematically indicates the transmission of a video stream VS, a three-dimensional model 3DM and camera data CD from the video communication device 32 to the computer 51 of the remote user 52. Information from the sensors may also be wirelessly transmitted to the remote user via the Internet. Information provided back to the plant by the remote user may likewise be transmitted via the Internet. More details about the transmission will be given shortly.
Some embodiments of the invention will now be described in more detail.
In industrial environments, such as industrial plants in which a process is run via a process control system, it is of the utmost importance that production is kept running, since even a minor interruption of production can be very costly. Because of this, maintenance is considered very important for keeping production operational.
Maintenance can be very expensive, because the help of external experts must sometimes be called in to handle advanced operations for which ordinary personnel lack the professional competence.
Collaboration between local workers and an expert over a telephone line is often not good enough. The expert may need to see what is happening at the site and may be required to instruct field personnel without any risk of misunderstanding. Sending photographs back and forth is also a slow way of sharing information, and is therefore not truly effective either.
Finding a suitable expert and having that expert fly to the site can take a long time. In the case of a sudden failure where the assistance of an external expert is needed in order to resume production, this leads to long downtimes, since the expert may have to travel a long distance to reach the site.
Having an external expert fly to the site can also be very expensive. Not only are there the costs associated with the expert (travel, accommodation etc.), but the production stop while the plant personnel wait for the expert to arrive can be very costly for the plant owner.
The situation described above is addressed through the use of the video communication device.
In operation, i.e. when something has occurred at a location in the plant, the video communication device is brought to this location at the industrial site and placed at the spot where assistance is needed. The device may be placed, for instance, in the center of a room. The video communication device may be placed at this location by a local user in order to help solve a problem at the location, for instance the fact that one or more machines or process control objects may be malfunctioning, or that the process behaves abnormally at this location.
Once the device has been brought to the location, a number of activities can be carried out.
In a first variation, the remote user is provided with context data related to the video stream. This first variation will now be described with reference to Figs. 7, 8 and 9, where Fig. 7 schematically shows the camera of the video communication device being used to capture video of a part of a process control object at a location of the industrial site, Fig. 8 schematically shows the presentation of this video together with a three-dimensional view of the location and the video communication device, and Fig. 9 shows a flow chart of method steps performed by the video communication device for sending video to the remote user.
According to the first variation, the control unit 38 first makes the video communication device 32 scan the area at the location, step 56. This may be done through the control unit 38 controlling the motors to rotate the housing 49 around the vertical axis of rotation and to tilt the housing 49 up and down at different tilt angles, in the way shown in Figs. 4a and 4b. In this way, different video images of the three-dimensional space around the site of the video communication device are captured using the camera 34.
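The scanning sweep of step 56 can be pictured as stepping the housing through a grid of pan and tilt angles and capturing an image at each pose. A minimal sketch, with hypothetical step sizes (the description does not specify any particular angular resolution):

```python
def sweep_poses(pan_step_deg=30, tilt_angles_deg=(-20, 0, 20)):
    """Generate (pan, tilt) capture poses covering the full 360-degree
    horizontal rotation of the housing at a few tilt angles."""
    poses = []
    for tilt in tilt_angles_deg:
        for pan in range(0, 360, pan_step_deg):
            poses.append((pan, tilt))
    return poses

poses = sweep_poses()  # 12 pan steps x 3 tilt angles = 36 poses
```

At each returned pose the device would drive its motors to that orientation and capture one video image of the surroundings.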
After the scanning, the control unit 38 analyzes the captured images of the area and investigates whether the location can be identified through comparison with pre-stored three-dimensional models of locations and their objects, i.e. process control objects and possible other objects present at those locations. If the video images have been identified and a pre-stored model therefore exists, step 58, then this model is fetched, step 60. Such a pre-stored three-dimensional model may be provided in the video communication device 32. Alternatively, it may be obtained or fetched from a server, such as the server 21. If a pre-stored model exists, then data about previous placements of the video communication device at the location, together with the video streams recorded by the video communication device at those previous placements and the camera orientations used, may be stored together with the model. The previous placement data and associated historical video streams may then also be fetched. Any three-dimensional model previously made of the location is thus fetched. If, however, there is no pre-stored model, step 58, then a new three-dimensional model 3DM of the location and the various objects at the location is created by the control unit 38, step 62. In this case a model is thus obtained through being created. The model may, for instance, be created using augmented reality functionality. If the device includes an infrared sensor, it is also possible to use infrared technology, such as Microsoft Kinect. A 3D map of the natural features at the location may be constructed using various feature extraction methods, for instance corner or edge detection performed on 2D RGB data and 3D RGBD (red, green, blue, depth) data. Using this sparse map, it is also possible to determine the site of the video communication device 32 carrying the camera 34. It is furthermore possible to determine the orientation or pose of the camera 34, i.e. to determine in which direction the camera 34 is pointing. The orientation may be computed based on registration algorithms. These algorithms may be used to locate features of the real-world map in the current frame or video image, and to determine the orientation of the camera 34 based on this.
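The registration step can be illustrated with the classic rigid-alignment (Kabsch) procedure: given 3D feature points from the map and the corresponding points as observed from the current frame, solve for the rotation that best aligns them, which yields the camera orientation. This is only an illustrative stand-in for whatever registration algorithm is actually used; the point correspondences below are synthetic.

```python
import numpy as np

def kabsch_rotation(map_pts, obs_pts):
    """Best-fit rotation R (obs_i ~= R @ map_i) between two point
    sets, via SVD (the Kabsch algorithm)."""
    P = map_pts - map_pts.mean(axis=0)   # center both point clouds
    Q = obs_pts - obs_pts.mean(axis=0)
    H = P.T @ Q
    U, _, Vt = np.linalg.svd(H)
    d = np.sign(np.linalg.det(Vt.T @ U.T))  # guard against reflection
    D = np.diag([1.0, 1.0, d])
    return Vt.T @ D @ U.T

# Synthetic check: rotate some map features by a known yaw and recover it.
yaw = np.radians(30)
R_true = np.array([[np.cos(yaw), -np.sin(yaw), 0.0],
                   [np.sin(yaw),  np.cos(yaw), 0.0],
                   [0.0,          0.0,         1.0]])
map_pts = np.array([[1., 0., 0.], [0., 1., 0.], [0., 0., 1.], [1., 1., 1.]])
obs_pts = map_pts @ R_true.T            # each point rotated by R_true
R_est = kabsch_rotation(map_pts, obs_pts)
```

The recovered rotation `R_est` equals the known `R_true`, i.e. the camera's yaw relative to the map has been determined from the correspondences alone.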
Process control objects, i.e. real-world objects, may be provided with object identifiers, such as NFC tags or barcodes. If such an identifier is read, it is possible to obtain information about what type of object it is. The type may be identified through the camera 34 detecting a visual object identifier, such as a barcode. Alternatively, the short-range communication unit may be set to read a tag carrying the object identifier. Such an identifier may, for instance, be used to fetch data associated with the object from a database in the process control system. In order to simplify the fetching of such data, the control unit 38 may therefore store object identifiers associated with the objects in the 3D model 3DM of the location.
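Associating read object identifiers with objects in the model 3DM can be as simple as a lookup table keyed by the identifier; the identifier strings and object fields below are invented purely for illustration.

```python
class ObjectRegistry:
    """Maps object identifiers (as read from NFC tags or barcodes)
    to the corresponding objects in the 3D model of the location."""

    def __init__(self):
        self._by_id = {}

    def register(self, object_id, model_object):
        self._by_id[object_id] = model_object

    def lookup(self, object_id):
        # Returns None for unknown identifiers, e.g. an unreadable tag.
        return self._by_id.get(object_id)

registry = ObjectRegistry()
registry.register("NFC-0026", {"name": "second process control object",
                               "model_node": 26})
obj = registry.lookup("NFC-0026")
```

With such a registry in place, a scanned tag or barcode immediately resolves to a model object, whose identifier can then be used for a database query.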
The scanning or short-range communication may also be used by the control unit 38 to determine the site of the video communication device 32 at the location of the industrial site, step 63, i.e. its site in relation to the layout and the various objects at the location. This may also involve adding the video communication device to the three-dimensional model 3DM of the location.
The steps described above may be performed before a communication session with the remote user 52 is started. Such a session may be carried out using WiFi and a TCP connection established over the Internet. Alternatively, these steps are performed only after the communication session has started. In both cases, the control unit 38 investigates whether a communication session is set up or in progress. Such a session at least involves an audio communication session between the remote user and the local user, carried out via the computer 51 and the sound generating and sound recording equipment of the video communication device 32. It also involves the transfer of a live video stream VS, which may be a one-way video stream from the video communication device 32 in the process control system to the computer 51 of the remote user 52.
In some instances, it may also involve a two-way video conference, i.e. video is also provided by the computer 51 of the remote user 52 and sent to the video communication device 32. The video images captured by the camera 34 may thus be transmitted to the remote user 52. Moreover, data of the remote user 52 may be projected at the location under the control of the remote user.
If no session is in progress, step 64, then the control unit 38 awaits one being started by the local user or by the remote user 52. If, however, a session is in progress, step 64, then the control unit 38 controls the camera 34 to record a video stream, step 66. It thus controls the camera 34 to capture video images VI of the location. The control unit 38 also continuously determines the camera orientation, step 68, for instance based on the line of sight of the viewfinder of the camera 34. The control unit 38 thereby determines the current orientation of the video communication device 32. The orientation may be provided as a solid angle in relation to the site of the video communication device and a reference angle. Alternatively or additionally, it is possible to determine the orientation using a gyroscope and/or an accelerometer.
During the communication session, the model 3DM may be transmitted from the video communication device to the remote user 52. More particularly, the three-dimensional model 3DM may be transmitted together with camera data CD (in the video stream VS or alongside it), step 70, where the camera data may comprise the camera orientation. The camera data may also comprise the site of the camera, i.e. the site of the video communication device. It is furthermore possible that the control unit 38 modifies the model of the location so that the video communication device and its orientation become a part of the model. The camera data may thus be provided as a part of the model. In this case the orientation, provided as a part of the three-dimensional model, is continuously transmitted to the device of the remote user. Alternatively, changes in orientation occurring during the transfer of video images from the video communication device to the device of the remote user may be transmitted as updates of the three-dimensional model. The computer of the remote user may then use these updates in modifying the three-dimensional model.
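One way to picture the "updates" alternative: rather than resending the whole model 3DM, the device emits a small camera-data message whenever the orientation changes, and the remote computer applies it to its local copy of the model. The message layout below is purely illustrative; the description does not prescribe any wire format.

```python
import json

def orientation_update(azimuth_deg, elevation_deg):
    """Encode a camera-orientation change as a small model update."""
    return json.dumps({"type": "camera_update",
                       "orientation": {"azimuth": azimuth_deg,
                                       "elevation": elevation_deg}})

def apply_update(model, message):
    """Apply a received update to the remote copy of the 3D model."""
    msg = json.loads(message)
    if msg["type"] == "camera_update":
        model["camera"]["orientation"] = msg["orientation"]
    return model

# The remote computer holds a copy of the model with the device in it.
remote_model = {"camera": {"site": [0.0, 0.0, 0.0],
                           "orientation": {"azimuth": 0, "elevation": 0}}}
remote_model = apply_update(remote_model, orientation_update(45, -10))
```

Such delta messages are tiny compared with the model itself, which suits continuous transmission alongside a live video stream.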
The remote user thus receives the video stream together with the model 3DM and the camera data CD. The remote user can then watch the captured video and obtain a three-dimensional view of the location using the model 3DM. The remote user is thereby able to see the situation as it would be seen in the field.
An example of this is shown in Figs. 7 and 8. Fig. 7 shows how the video communication device 32 is at the same location as the first, second and third process control objects 24, 26 and 28, and how it captures video images of a part of the second process control object 26. It thereby records video for the remote user 52. Fig. 8 shows a video image VI of the video stream VS as it may appear when shown on the computer 51 of the remote user 52. The video image may comprise a lot of useful information. However, it may lack context. This context is provided through the model 3DM being transferred together with the video stream VS and the camera data CD. Fig. 8 shows a screen that the remote user 52 may see on the display of his or her computer 51. The view comprises the live video stream from the camera, in which the image VI is presented. In addition, context information is provided through an overview image OI of the location, which overview image is obtained through visualizing the model 3DM of the location together with the video communication device VCD and its orientation. It is here possible that the representation of the video communication device, together with the orientation of the image, is placed in the model by the computer 51 of the remote user. Alternatively, this is done by the control unit 38 of the video communication device 32. In the latter case, a modified model showing the video communication device and the orientation is provided.
The control unit 38 then investigates whether the communication session has ended. If it has not, step 72, then video continues to be recorded and the camera orientation determined, step 68, and transmitted to the remote user together with the model, step 70. If, however, the communication session is ended, step 72, then operation is also ended, step 74.
According to the first variation, it is thus possible to track the current site and orientation of the camera 34 during a video conference while also constructing a map of the environment, allowing the remote user 52 to have better location context awareness. As can be seen in Fig. 8, the remote user 52 both sees the current camera view VI and is able to obtain a very good overview of the surroundings using the small overview image OI in the lower right corner.
The remote user may here also navigate in the constructed 3D view, and is therefore not limited to observing the current frame of the video session, but can freely "explore" the known 3D model constructed from the video frames.
The video conference call (the purpose of which is to share the environment of one of the users) is thus not limited to simply streaming video data, but may also include data about the site and the current pose or orientation of the camera, where the orientation may be provided as the orientation of the line of sight of the camera viewfinder.
If a previous model exists, it is also possible for the remote user to fetch previously recorded video streams of the location, together with the sites of the video communication device and the camera orientations used when these historical video streams were recorded.
Some advantages of the invention may be better appreciated from the following scenario:
1. A local maintenance engineer is performing maintenance on the factory floor when he identifies a potential problem. He calls a remote user and initiates a video call in order to obtain advice on the situation.
2. Using the camera of the video communication device, he makes it scan the location to show the current state and the surroundings of a process control object (in this example the second process control object 26).
3. The different video frames are processed to form a map of the environment. In forming this map, the site of the video communication device is also determined. Using this map and the current video image, the current orientation or pose of the camera is computed.
4. The map 3DM of the environment is then sent over the network during the call, together with the video stream and the camera data.
5. Since the remote user can see the map of the environment and the video stream at the same time, this additional information helps him orient himself in a world with a dynamically changing camera orientation. The remote user obtains much better context awareness than an ordinary video conference system can give.
6. Because the video collaboration system is efficient, the remote user obtains a crystal-clear understanding of the local environment at the factory site, and the two users can then attempt to resolve the situation.
The first variation has a number of further advantages. In the first variation, camera and map data, i.e. the camera data and the three-dimensional model, are transferred together with the video stream. This improves context awareness relative to a conventional video stream, promoting less confusion and a better sense of position.
The stream data is used to create a holistic picture of the location. The remote user is able to use this 3D model to navigate a viewpoint independently of the physical camera site; this gives the remote user greatly improved context awareness.
Another advantage is that the number of questions that would otherwise have to be asked is reduced; questions such as "Where are you now?" and "What part is it that I am seeing now?", and other orienting questions that a remotely collaborating engineer must normally attend to, are avoided. Communication will also become more precise, and communication mistakes related to location will be less common.
The collaboration between the two users will also become more efficient. The time required for completing a video collaboration task will most likely be improved.
It is furthermore possible that safety is enhanced. Because the remote user has better context awareness of the situation at hand, he can observe whether the local user is performing the correct actions.
A second variation will now be described with reference to Figs. 10, 11a, 11b and 12, where Fig. 10 schematically shows the location with the video communication device 32, which provides a projection area PA in which a presentation item PI is projected, Figs. 11a and 11b schematically show the location with the video communication device 32 and the presentation item PI when the projection area PA is being moved, and Fig. 12 schematically shows a flow chart of method steps for operating the video communication device 32 by the remote user 52.
When at the location, the video communication device 32 is advantageously used to obtain data from the location to be provided to the remote user 52, as well as to receive instructions from the remote user 52 for the local user at the location. This may be done via two-way voice or video communication.
When a communication session is in progress, the control unit 38 thus fetches sensor measurements from the sensors, such as the temperature sensor and the ambient light sensor, and transmits these sensor measurements to the computer 51 of the remote user 52, step 76. The camera 34 also captures video VS and transmits it to the remote user 52, step 78.
The remote user 52 may now wish to obtain more data related to some of the process control objects he sees in the video stream VS. He may, for instance, want data on the temperature in a tank or the voltage of a transformer. In order to do this, he may select the object in the video or in a previously obtained model of the location. He may, for instance, detect an object identifier in the video and send this object identifier to the video communication device. He may also select the object in the model, and the selection may then be transferred to the control unit 38. The control unit 38 may then fetch data about the object from the database 20. It may, for instance, fetch a faceplate with current data of the process control object.
The control unit 38 may thus receive a process control object selection from the remote user 52, step 80, and based on this selection fetch the process control object data from the process control system, for instance from the database 20, and transmit this process control object data to the computer 51 of the remote user 52, step 82. The remote user 52 can thereby select an object in the model of the location, and when the object is selected, he may obtain additional data, such as a faceplate with operational information.
After a process control object has been selected, or if no process control object is selected, the control unit 38 may receive a presentation item PI from the remote user. The remote user 52 may more particularly provide a presentation item to be projected by the projector. The presentation item PI may be a digital presentation item. It may be a digital still image, such as an image of an arrow or a circle, a presentation such as a slide show, or a text string with instructions. It may also be a drawing made by the remote user 52. The presentation item may thus be a presentation item generated by the remote user, comprising instructions and visual indicators. The presentation item PI may thereby become a marking image that is presented to the local user by the projector 48. If such a presentation item PI is received, step 84, then a selection of a site for the presentation item may also be received. The remote user may select the site of the presentation item PI in the 3D model 3DM of the location. The site selection may likewise be transferred to the control unit 38. The control unit 38 then associates the presentation item with the selected site, step 86. The site of the presentation item may be set using a solid angle in relation to the site of the video communication device and a reference angle, together with a radius. A presentation item may thereby be assigned to a space in the three-dimensional model of the location. In this way, it is also possible to assign more than one presentation item.
Thereafter, the control unit 38 awaits camera control commands that may come from the remote user 52. A camera control command may comprise a field of view control command, such as a zoom command, which changes the size of the field of view while retaining the same line of sight, or an orientation control command, which changes the line of sight. Orientation control commands typically comprise panning commands. The remote user 52 can thereby change the orientation of the camera 34 through rotating or tilting it. The remote user 52 can also zoom in and zoom out. If a command is received, step 88, it is acted upon by the control unit 38. If the command is a field of view command, it is used for controlling the field of view of the camera, step 90. A zoom command is forwarded to the camera 34, which then zooms in or out depending on the type of the control command. If tilting or rotation is required, the control unit 38 operates the corresponding motor in order to obtain the desired movement.
Thereafter, the control unit 38 may receive projector control commands from the remote user 52. A projector control command may comprise a command to project a presentation item PI. In some cases such a command may also be a command to project the presentation item at a specific desired location. If a projector control command is received, step 92, the projector 48 is operated by the control unit 38 according to the command. This involves, if the control command is a command to project a presentation item PI, controlling the projector 48 to project the presentation item PI in the presentation area PA of the projector, step 94. If the command is a command to project at a specific location, the projector is controlled so as to project the presentation item PI at that location. A command may also comprise a command to change the orientation of the presentation area. In that case the projector may be moved, using the same motor as is used for the camera 34 or another motor, and controlled to project the presentation item so that it appears at the desired location, step 94. The remote user can thus control the video communication device to project a presentation item at a selected point or region in space at the site. This may involve projecting the presentation item at the real-world location corresponding to the associated point in the three-dimensional model. If, according to the location of the presentation item, a real-world object at this location would be in front of the presentation item, the part of the presentation item obscured by that object cannot be displayed.
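That an obscured part cannot be displayed follows from simple line-of-sight geometry: the projector cannot reach a surface that lies behind an intervening object. A hedged sketch, modelling obstacles as spheres purely for illustration (none of these names come from the embodiment):

```python
import math

def occluded(projector, item, obstacles):
    """True if any spherical obstacle (centre, radius) cuts the straight
    line from the projector to the item's anchored point."""
    px, py, pz = projector
    ix, iy, iz = item
    vx, vy, vz = ix - px, iy - py, iz - pz
    length2 = vx * vx + vy * vy + vz * vz
    for (cx, cy, cz), r in obstacles:
        # parameter of the point on the projector->item segment closest to the centre
        t = ((cx - px) * vx + (cy - py) * vy + (cz - pz) * vz) / length2
        if 0.0 < t < 1.0:  # obstacle lies between projector and item
            dx = px + t * vx - cx
            dy = py + t * vy - cy
            dz = pz + t * vz - cz
            if math.sqrt(dx * dx + dy * dy + dz * dz) < r:
                return True
    return False

# A pillar half-way along the line of projection blocks the item
blocked = occluded((0.0, 0.0, 0.0), (4.0, 0.0, 0.0), [((2.0, 0.0, 0.0), 0.5)])
```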
If the projector is reoriented so that the presentation area PA moves, the presentation item PI can be set to remain at the location selected by the user. Because the presentation item interacts with the model, this also means that the presentation item can be retained for later sessions at that location. Furthermore, the projection of presentation items proceeds independently of the video display.
The control unit 38 can therefore keep the control of the projector 48 separate from, or independent of, the control of the camera 34. For example, if the stream of the camera 34 is zoomed in on a detail such that the presentation item PI falls outside the field of view of the camera 34, the presentation item PI will still be displayed. The control of what is presented in the presentation area is thus independent of the control of the field of view of the camera. As is clear from the zoom example above, this means that the location of a presentation item in the presentation area PA of the projector 48 may lie outside the field of view of the camera. It also means that the presentation area PA may differ from the field of view of the camera 34. When the camera control command is a command controlling the orientation of the camera and the projector control command is a command controlling the orientation of the projector, it can likewise be seen that the control of the projector orientation is performed independently of the control of the camera orientation, which means that controlling the camera orientation does not influence the control of the projector orientation.
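The independence of the two controls can be illustrated by keeping two separate orientation states, so that panning the camera leaves the projector, and hence the presentation area, untouched. Angles are in degrees and the 20-degree half-width is an arbitrary assumption:

```python
def in_view(direction, center, half_width):
    """True if a bearing (degrees) lies within a field centred on
    `center` with the given angular half-width."""
    return abs((direction - center + 180.0) % 360.0 - 180.0) <= half_width

# Separate orientation states: panning the camera does not move the projector.
camera_center, projector_center = 0.0, 60.0
item_direction = 60.0        # bearing of the presentation item PI
camera_center += 90.0        # the remote user pans the camera away

camera_sees = in_view(item_direction, camera_center, 20.0)          # now False
projector_covers = in_view(item_direction, projector_center, 20.0)  # still True
```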
As can be seen from Figures 11a and 11b, the projection area PA of the projector 48 may be movable. If there are several presentation items that fit in the presentation area at its current location, these presentation items can be displayed individually or simultaneously, based on commands from the remote user.
If, for example, a number of presentation items are provided, some of which lie outside the current location of the presentation area PA, the projector 48 can be redirected so that one or more of these presentation items are projected. Once the items have been specified, the remote user can simply select the presentation item to be displayed, and the control unit 38 will control one or more motors to redirect the projector so that its presentation area covers the selected presentation item.
Thereafter, the capturing of video continues, step 78, and the various commands from the remote user are awaited, steps 80, 84, 88 and 92. This type of operation continues for as long as the session lasts.
The remote user 52 can also send commands controlling the projector 48, the camera 34 and the various sensors (such as temperature sensors). Through the video communication device 32, the remote user is thus able to gain knowledge of the operation of the process control objects at the site, and to obtain other information such as the temperature at the site. In order to observe the site, the remote user 52 can also rotate the camera and obtain view data of the site. Through the voice connection, the remote user can also communicate with local users and receive spoken opinions about possible problems at the site.
The remote user can then determine suitable actions, such as which process control objects, and which parts of them, are to be actuated and when. For example, the remote user may provide a number of presentation items, such as arrows and descriptive text, and assign them to different points in the virtual model. The remote user may also provide timing commands setting the order in which the presentation items are to be displayed. The commands and presentation items can then be sent to the video communication device 32 and displayed by the projector 48 in the order determined by the remote user 52. If the presentation items are provided in the presentation area at its current location, they can be displayed simultaneously. When a new presentation item to be displayed lies outside the current field of the projector 48 (i.e. outside the presentation area at its current location), the projector 48 can be moved or redirected so that the presentation area covers the location of the new presentation item. This movement of the projector 48 can be carried out independently of the camera 34. In this way the remote user 52 is able to present information at one location, such as instructions for actuating a certain process control object, while simultaneously monitoring another object at a location not covered by the projector 48.
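One way to picture the timing commands is as an ordered plan that also notes where the projector must be redirected because the next presentation item falls outside the current presentation area. This is an illustrative sketch, not the embodiment's algorithm: directions are reduced to a single bearing and the 20-degree coverage threshold is assumed:

```python
def presentation_schedule(items):
    """Order presentation items by their timing command and mark the ones
    that need the projector redirected first, i.e. whose bearing differs
    from the previous item's by more than the assumed 20-degree coverage."""
    plan, current = [], None
    ordered = sorted(items.items(), key=lambda kv: kv[1][0])
    for name, (order, bearing) in ordered:
        redirect = current is not None and abs(bearing - current) > 20.0
        plan.append((name, redirect))
        current = bearing
    return plan

plan = presentation_schedule({
    "arrow":  (1, 10.0),   # shown first
    "text":   (2, 15.0),   # close to the arrow: same presentation area
    "circle": (3, 80.0),   # far away: the projector must be redirected
})
```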
The second modification thus provides a method that allows a remote user to remotely instruct field personnel via a live video stream. The video communication device also allows local and remote users to communicate audibly. It further allows the remote user to obtain an overview of the environment via the camera. The remote user can also roll, pan and zoom the camera at the site, in order to obtain a better overview of the scene from the remote location. Since a 3D camera is used, the remote user will, in case he needs additional spatial information about the site, also be able to see the 3D model of the environment. Moreover, by using the projector to project information onto the real world, the remote user is also able to add presentation items or information, such as markings and drawings, to the physical world; that is, the remote user can visually share information and markings with local users at the site.
All the sensors, together with the camera and sound recording equipment, allow the remotely connected user to see, hear and experience the situation at the plant. The projector and the sound generating device can in turn be used to convey information fed back by the remote user to the personnel at the site. The projector thus allows the remote user to visually pass information back to the plant personnel.
By being allowed to control the video communication device, the remote user can browse the environment with the camera by rotating, tilting and zooming. When the remote user has information he wants to share with the local users at the site, he can use the projector to "draw" the information in the presentation area. The remote user can use text, images or simply draw objects on the remote screen. The drawing is then projected into the scene using the projector. Since the camera records a 3D model of the environment, the annotation can also be an object that is left behind.
All the visual information provided by the remote user can be augmented reality information, meaning that any marking or drawing added by the remote user is saved at, or connected to, the point where it was added, using the 3D model constructed of the environment. This means that if the remote user rotates the camera after adding a marking, the marking will remain at the same point.
As can be seen in Figure 11a, the remote user has added a presentation item PI. When the remote user rotates the video communication device 32, as can be seen in Figure 11b, for instance in order to obtain a better overview, the presentation item PI is still projected correctly even though the location of the presentation area PA has changed. It can thus be seen that even if the presentation area is moved, the real-world location at which the presentation item is projected is retained.
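Keeping the presentation item at the same real-world location while the device turns amounts to recomputing the device-relative bearing of the anchored model point after every rotation. A minimal two-dimensional sketch, where the function and variable names are assumptions:

```python
import math

def bearing_to(device_pos, device_yaw, point):
    """Bearing, relative to the device's current yaw in degrees, at which
    the projector must aim so the item stays on its real-world anchor."""
    dx = point[0] - device_pos[0]
    dy = point[1] - device_pos[1]
    return (math.degrees(math.atan2(dy, dx)) - device_yaw) % 360.0

anchor = (2.0, 2.0)                             # model point the item PI is attached to
before = bearing_to((0.0, 0.0), 0.0, anchor)    # 45 degrees
after = bearing_to((0.0, 0.0), 30.0, anchor)    # 15 degrees: the projector compensates
```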
Imagine the following scenario:
1. A serious problem unexpectedly occurs on one of the offshore platforms of a Norwegian natural gas company. The problem is rare and technically complex, and the operators at the site need expert support to resume production.
2. Because all the experts are located far away, flying an expert to the site would take at least 48 hours.
3. The operators contact a support company, where an expert on the technical problem in question can immediately help solve the problem on the offshore platform.
4. The operators discuss the matter with the expert, who instructs the operators at the site to bring the video communication device to the specific part of the production process, so that the remote expert can see the problem in question.
5. The remote expert observes the conditions at the site using the camera, the sound recording equipment and the sensors.
6. Based on the information from the offshore platform, the remote expert can now instruct the operators at the site to perform certain operations to correct the problem.
7. The remote expert shares information with the offshore users by voice and, where possible, visually. The possibility of sharing information by sound and vision is highly effective, because the remote expert is able to immediately "point out" where the operators at the site should act.
With the second modification, the remote user is given the possibility of providing support immediately anywhere in the world. The experts no longer need to be present every time their assistance is required; in many cases they can instead solve the problem from their office. The remote user is given a level of context awareness that cannot be achieved by merely using a video stream, without building a 3D model of the real world.
The local user, such as a service engineer, is given a discreet and natural way of viewing augmented reality information, which in many cases is better than viewing augmented reality information through wearable glasses or a handheld screen.
Annotations can be added by the remote user to the environment and projected onto the surfaces of physical equipment, for the local maintenance engineer to see. The possibility of adding annotations to the 3D model of the world and displaying these annotations at the site using the projector means that an annotation added to the 3D model stays fixed at its position even if the camera is turned to cover another location.
The annotations and/or markings added to the environment and to the 3D model of the world can also be recorded and saved as part of the maintenance history of the industrial plant. If the video communication device is brought back to a known location, this information can also be retrieved again in the future.
It will also be appreciated that the two modifications can be combined. The activities of the two modifications can thus be carried out in the same communication session. In this case, the knowledge of the site that the remote user obtains in the first modification can be used for controlling the video communication device, in particular when presentation items are used.
The video communication device was described above as being provided with a projector and comprising a tripod. It will be appreciated that the video communication device may also be a handheld device such as a video camera, a portable computer or a mobile phone, such as a smartphone. The control unit may, as described above, be provided in the form of a processor together with a memory, the memory comprising computer program code for performing its functions. The computer program code may also be provided on one or more data carriers, where the program code, when loaded into a memory and run by the processor, performs the functions of the control unit. One such data carrier 96 with computer program code 98, in the form of a CD-ROM disc, is schematically shown in Figure 13.
In addition to the embodiments already mentioned, the invention can be varied in many more ways. It should therefore be realized that the present invention is only to be limited by the following claims.