CN105659170B - Method and video communication device for transmitting video to a remote user - Google Patents
Method and video communication device for transmitting video to a remote user
- Publication number
- CN105659170B, CN201380077726.9A, CN201380077726A
- Authority
- CN
- China
- Prior art keywords
- communication device
- video
- video communication
- camera (filming apparatus)
- remote user
- Prior art date
- Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
- Active
Classifications
-
- H—ELECTRICITY
- H04—ELECTRIC COMMUNICATION TECHNIQUE
- H04N—PICTORIAL COMMUNICATION, e.g. TELEVISION
- H04N21/00—Selective content distribution, e.g. interactive television or video on demand [VOD]
- H04N21/40—Client devices specifically adapted for the reception of or interaction with content, e.g. set-top-box [STB]; Operations thereof
- H04N21/41—Structure of client; Structure of client peripherals
- H04N21/422—Input-only peripherals, i.e. input devices connected to specially adapted client devices, e.g. global positioning system [GPS]
- H04N21/4223—Cameras
-
- H—ELECTRICITY
- H04—ELECTRIC COMMUNICATION TECHNIQUE
- H04N—PICTORIAL COMMUNICATION, e.g. TELEVISION
- H04N13/00—Stereoscopic video systems; Multi-view video systems; Details thereof
- H04N13/10—Processing, recording or transmission of stereoscopic or multi-view image signals
- H04N13/194—Transmission of image signals
-
- H—ELECTRICITY
- H04—ELECTRIC COMMUNICATION TECHNIQUE
- H04N—PICTORIAL COMMUNICATION, e.g. TELEVISION
- H04N21/00—Selective content distribution, e.g. interactive television or video on demand [VOD]
- H04N21/40—Client devices specifically adapted for the reception of or interaction with content, e.g. set-top-box [STB]; Operations thereof
- H04N21/41—Structure of client; Structure of client peripherals
- H04N21/422—Input-only peripherals, i.e. input devices connected to specially adapted client devices, e.g. global positioning system [GPS]
-
- H—ELECTRICITY
- H04—ELECTRIC COMMUNICATION TECHNIQUE
- H04N—PICTORIAL COMMUNICATION, e.g. TELEVISION
- H04N21/00—Selective content distribution, e.g. interactive television or video on demand [VOD]
- H04N21/40—Client devices specifically adapted for the reception of or interaction with content, e.g. set-top-box [STB]; Operations thereof
- H04N21/41—Structure of client; Structure of client peripherals
- H04N21/422—Input-only peripherals, i.e. input devices connected to specially adapted client devices, e.g. global positioning system [GPS]
- H04N21/42202—Input-only peripherals, i.e. input devices connected to specially adapted client devices, e.g. global positioning system [GPS] environmental sensors, e.g. for detecting temperature, luminosity, pressure, earthquakes
-
- H—ELECTRICITY
- H04—ELECTRIC COMMUNICATION TECHNIQUE
- H04N—PICTORIAL COMMUNICATION, e.g. TELEVISION
- H04N21/00—Selective content distribution, e.g. interactive television or video on demand [VOD]
- H04N21/40—Client devices specifically adapted for the reception of or interaction with content, e.g. set-top-box [STB]; Operations thereof
- H04N21/43—Processing of content or additional data, e.g. demultiplexing additional data from a digital video stream; Elementary client operations, e.g. monitoring of home network or synchronising decoder's clock; Client middleware
- H04N21/44—Processing of video elementary streams, e.g. splicing a video clip retrieved from local storage with an incoming video stream or rendering scenes according to encoded video stream scene graphs
- H04N21/44008—Processing of video elementary streams, e.g. splicing a video clip retrieved from local storage with an incoming video stream or rendering scenes according to encoded video stream scene graphs involving operations for analysing video streams, e.g. detecting features or characteristics in the video stream
-
- H—ELECTRICITY
- H04—ELECTRIC COMMUNICATION TECHNIQUE
- H04N—PICTORIAL COMMUNICATION, e.g. TELEVISION
- H04N21/00—Selective content distribution, e.g. interactive television or video on demand [VOD]
- H04N21/40—Client devices specifically adapted for the reception of or interaction with content, e.g. set-top-box [STB]; Operations thereof
- H04N21/47—End-user applications
- H04N21/478—Supplemental services, e.g. displaying phone caller identification, shopping application
- H04N21/4788—Supplemental services, e.g. displaying phone caller identification, shopping application communicating with other users, e.g. chatting
-
- H—ELECTRICITY
- H04—ELECTRIC COMMUNICATION TECHNIQUE
- H04N—PICTORIAL COMMUNICATION, e.g. TELEVISION
- H04N21/00—Selective content distribution, e.g. interactive television or video on demand [VOD]
- H04N21/60—Network structure or processes for video distribution between server and client or between remote clients; Control signalling between clients, server and network components; Transmission of management data between server and client, e.g. sending from server to client commands for recording incoming content stream; Communication details between server and client
- H04N21/65—Transmission of management data between client and server
- H04N21/658—Transmission by the client directed to the server
- H04N21/6587—Control parameters, e.g. trick play commands, viewpoint selection
-
- H—ELECTRICITY
- H04—ELECTRIC COMMUNICATION TECHNIQUE
- H04N—PICTORIAL COMMUNICATION, e.g. TELEVISION
- H04N21/00—Selective content distribution, e.g. interactive television or video on demand [VOD]
- H04N21/80—Generation or processing of content or additional data by content creator independently of the distribution process; Content per se
- H04N21/81—Monomedia components thereof
- H04N21/8126—Monomedia components thereof involving additional data, e.g. news, sports, stocks, weather forecasts
-
- H—ELECTRICITY
- H04—ELECTRIC COMMUNICATION TECHNIQUE
- H04N—PICTORIAL COMMUNICATION, e.g. TELEVISION
- H04N21/00—Selective content distribution, e.g. interactive television or video on demand [VOD]
- H04N21/80—Generation or processing of content or additional data by content creator independently of the distribution process; Content per se
- H04N21/81—Monomedia components thereof
- H04N21/816—Monomedia components thereof involving special video data, e.g 3D video
-
- H—ELECTRICITY
- H04—ELECTRIC COMMUNICATION TECHNIQUE
- H04N—PICTORIAL COMMUNICATION, e.g. TELEVISION
- H04N7/00—Television systems
- H04N7/14—Systems for two-way working
- H04N7/141—Systems for two-way working between two video terminals, e.g. videophone
- H04N7/142—Constructional details of the terminal equipment, e.g. arrangements of the camera and the display
-
- H—ELECTRICITY
- H04—ELECTRIC COMMUNICATION TECHNIQUE
- H04N—PICTORIAL COMMUNICATION, e.g. TELEVISION
- H04N7/00—Television systems
- H04N7/18—Closed-circuit television [CCTV] systems, i.e. systems in which the video signal is not broadcast
- H04N7/183—Closed-circuit television [CCTV] systems, i.e. systems in which the video signal is not broadcast for receiving images from a single remote source
-
- G—PHYSICS
- G05—CONTROLLING; REGULATING
- G05B—CONTROL OR REGULATING SYSTEMS IN GENERAL; FUNCTIONAL ELEMENTS OF SUCH SYSTEMS; MONITORING OR TESTING ARRANGEMENTS FOR SUCH SYSTEMS OR ELEMENTS
- G05B2219/00—Program-control systems
- G05B2219/30—Nc systems
- G05B2219/32—Operator till task planning
- G05B2219/32014—Augmented reality assists operator in maintenance, repair, programming, assembly, use of head mounted display with 2-D 3-D display and voice feedback, voice and gesture command
Abstract
The present invention relates to a method, a computer program product and a video communication device (32) for transmitting video to a remote user. The video communication device (32) comprises: a communication interface for providing a communication session with a device of the remote user; a camera for capturing images; and a control unit configured to obtain a three-dimensional model of the site position, control the camera to capture video images (VI) of the position, determine a current orientation of the video communication device, and transmit the three-dimensional model (3DM) and the orientation together with a video stream comprising the captured video images from the video communication device to the device of the remote user.
Description
Technical field
The present invention relates generally to process control systems. More particularly, the invention relates to a method, a computer program product and a video communication device for transmitting video to a remote user from an industrial site where a process control system is run.
Background art
In a process control system it is extremely important to keep production running at all times, because even a slight interruption in production may be very costly. Maintenance is therefore considered extremely important in order to keep production operating.
A tool that is useful for performing local maintenance in a process control system is described in SE1300138.
Maintenance can be very expensive, because external experts must sometimes be called in to help with advanced operations that persons without the required expertise cannot handle on their own.
Cooperation between a local worker and an expert over a telephone line is not effective enough in all situations, because the expert cannot see what the local worker is doing. Sending photographs back and forth is also a slow way of sharing information. The expert may need to see what is happening on site and may be required to instruct the on-site personnel without any risk of misunderstanding.
In such situations it is highly advantageous to send video of the site to the external expert. It may then also be desirable to move the camera closer in order to obtain details of the scene. However, it may then be difficult to retain an overall picture of the site.
Three-dimensional models are known to be used in a number of different situations.
WO 2012/018497 describes a system in which a three-dimensional model of images of a scene of interest is forwarded to a number of palm computers. Live images are acquired and supplied to a central device, which provides the three-dimensional model and outputs the model to hand-held devices at the scene.
US 2010/0315416 discloses a system with a central computer holding a 3D model. In this system a camera is used to capture images of an object, and these images are sent to the central computer and combined with the 3D model.
DE 102011085003 discusses entering digital images into a virtual environment, where the virtual environment is a 3D environment.
WO 2011/008660 discusses transmitting location information of a video capturing device in a video stream.
However, none of these documents discusses using a three-dimensional model to improve the context of a video stream.
There is therefore room for improvement in the field, in particular with regard to providing context for a captured video stream.
Summary of the invention
The present invention is concerned with the problem of providing context for a video stream transmitted from a process control system to a remote user.
According to a first aspect of the invention, this object is achieved by a method of transmitting a video stream to a remote user from an industrial site where a process control system is run. The method is performed by a video communication device placed at a position in the industrial site, in relation to a communication session held with a device of the remote user, and comprises:
obtaining a three-dimensional model of the position,
capturing video images of the position using a camera,
determining a current orientation of the camera, and
transmitting the three-dimensional model and the orientation together with a video stream comprising the captured video images to the device of the remote user.
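Purely as an illustration, the four steps of the method can be sketched in code. The interfaces used below (`camera`, `comm`, `model_store`) and the message field names are hypothetical assumptions made for the sketch; the method as claimed does not prescribe any particular API or message format.

```python
def transmit_with_context(camera, comm, model_store, position_id):
    """One iteration of the claimed method under assumed interfaces:
    obtain the 3D model of the position, capture a video image,
    read the current camera orientation, and send all three to the
    remote user's device over an established communication session."""
    model = model_store.get(position_id)   # obtain three-dimensional model
    frame = camera.capture()               # capture a video image
    orientation = camera.orientation()     # determine current orientation
    comm.send({                            # stream image with its context
        "model": model,
        "orientation": orientation,
        "frame": frame,
    })
```

In a real device the loop would run per frame, with the model typically sent only once per session since it does not change while the device stays at the position.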
According to a second aspect of the invention, this object is achieved by a video communication device for transmitting a video stream to a remote user from an industrial site where a process control system is run. The video communication device is placed at a position in the industrial site and comprises:
a communication interface for providing a communication session with a device of the remote user,
a camera for capturing images, and
a control unit configured to
obtain a three-dimensional model of the position,
control the camera to capture video images of the position,
determine a current orientation of the camera, and
transmit the three-dimensional model and the orientation together with a video stream comprising the captured video images to the device of the remote user.
According to a third aspect of the invention, this object is achieved by a computer program product for transmitting a video stream to a remote user from an industrial site where a process control system is run. The computer program product is provided on a data carrier and comprises computer program code configured to be loaded into the video communication device. When the computer program code is loaded and the video communication device, which comprises a camera, is placed at a position in the industrial site and holds a communication session with a device of the remote user, the code causes the video communication device to:
obtain a three-dimensional model of the position,
capture video images of the position using the camera,
determine a current orientation of the camera, and
transmit the three-dimensional model and the orientation together with a video stream comprising the captured video images to the device of the remote user.
The invention has a number of advantages. It provides context data associated with the captured video stream. This improves context awareness compared with a conventional video stream, which reduces confusion when the remote user interprets the video stream information and improves positional awareness. Through the use of the model it is possible to obtain a full picture of the position, in which an overview can be combined with details.
Detailed description of the invention
The invention will now be described in more detail with reference to the accompanying drawings, in which:
Fig. 1 schematically shows an industrial plant in which a process control system operates an industrial process together with a video communication device,
Fig. 2 schematically shows a block diagram of units inside the housing of the video communication device,
Fig. 3 shows a perspective view of the video communication device in the form of a housing on a tripod,
Fig. 4a and 4b show perspective views of the video communication device indicating various possible movements of the housing,
Fig. 5 schematically shows the video communication device communicating with a computer of the remote user via the Internet,
Fig. 6 schematically shows the remote user with his computer, on which video of a position in the process control system is displayed,
Fig. 7 schematically shows the use of the camera of the video communication device for capturing video of a part of a process control object at the position,
Fig. 8 schematically shows the video together with a presentation of a three-dimensional view of the position and the video communication device,
Fig. 9 schematically shows a flow chart of a method of sending video to the remote user, performed by the video communication device,
Fig. 10 schematically shows the position, where the video communication device provides a projection area in which a presentation item is provided,
Fig. 11a and 11b schematically show the position with the video communication device and the presentation item when the projection area is moved,
Fig. 12 schematically shows a flow chart of a method of operating the video communication device by the remote user, and
Fig. 13 schematically shows a data carrier with computer program code, in the form of a CD-ROM disc, for implementing the control unit of the video communication device.
Detailed description of embodiments
The invention proposes a method by which a remote user collects relevant data and provides instructions and guidance to a local engineer at a position of an industrial plant in which a process control system is run.
Fig. 1 schematically shows an industrial plant in which a process control system 10 is provided. The process control system 10 is a computerized process control system for controlling an industrial process. The process can be any type of industrial process, such as electrical power generation, transmission and distribution processes, water purification and distribution processes, oil and gas production and distribution processes, petrochemical, chemical, pharmaceutical and food processes, as well as pulp and paper production processes. These are just some examples of processes to which the system can be applied. There exist countless other industrial processes. The process may also be another type of industrial process, such as the manufacturing of goods. The process may be monitored through one or more process monitoring computers, which communicate with servers handling the monitoring and control of the process.
The process control system 10 in Fig. 1 therefore includes a number of process monitoring computers 12 and 14. These computers, which may here also be considered to form operator terminals, are connected to a first data bus B1. There is also a gateway 16 connected to this first data bus B1, which gateway 16 is connected to at least one wireless network WN. The gateway is also connected to a public data communication network, which is here the Internet IN. A video communication device 32 is connected to the wireless network WN. The wireless network WN may be a local area network, such as a wireless local area network (WLAN). It may also be a Bluetooth network, i.e. a network of a number of interconnected Bluetooth nodes. It may furthermore be a mobile communication network.
There is furthermore a second data bus B2, and between the first and second data buses B1 and B2 there are connected a server 18 providing control and protection of the process, and a database 20 in which data relating to the control and protection of the process is stored. Control-related data may here comprise process data such as measurements and control commands, while protection-related data may comprise alarm and event data as well as data on which alarms and events can be generated, such as measurements obtained in the process. Panels of process control objects may also be provided, and such a panel may comprise process control data from the database 20 relating to a process control object. Between the two buses B1 and B2 there is furthermore connected an optional object data server 21. The object data server 21 comprises data about all the process control objects, such as blueprints, specifications and manuals concerning these process control objects.
Moreover, a number of further devices 24, 26, 28 and 30 are connected to the second data bus B2. These further devices 24, 26, 28 and 30 are field devices, which are interface devices to the process being controlled. A field device is typically an interface via which measurements of the process are obtained and via which control commands are given. The field devices are therefore also process control objects. In one variant of the invention, a first field device is a first process control object 24, a second field device is a second process control object 26 and a third field device is a third process control object 28.
Fig. 2 shows a block diagram of a number of units provided in the video communication device 32. The video communication device 32 is provided with a housing 49. In the housing 49 there is provided a bus 33, and to this bus 33 there are connected an optional short-range communication unit 46 or proximity sensor, a video projector 48, a camera 34, a recording controller 36, a program memory 39, a processor 40 and a radio communication circuit 42. There may also be at least one further sensor, such as a temperature sensor, an accelerometer, an ambient light sensor or a gyroscope (not shown). The radio communication circuit 42 is furthermore connected to an antenna 44, where the radio communication circuit 42 and antenna 44 are provided for communication with the wireless network WN. The radio communication circuit 42 and the antenna 44 together form one type of communication interface for communicating with the process control system and with other entities. It may therefore be a WiFi or WLAN interface. It may also be a mobile communication interface. It should furthermore be realized that the video communication device may comprise two communication interfaces, one mobile communication interface and one WiFi interface. The recording controller 36 is connected to a microphone 35, and the recording controller 36 and microphone 35 together form a recording unit, which can be used to record sound at the position in the process control system. Although not shown, the video communication device 32 may also comprise a sound emitting unit, such as a loudspeaker or earphones. It is also possible for a microphone and earphones to be combined in a headset connected to the video communication device 32. The short-range communication unit 46 may also be considered to be a type of sensor, an object sensor or proximity sensor, for sensing a process control object that is to be serviced. This sensor may be realized through near field communication (NFC) technology.
In the program memory 39 there is provided software code which, when being run by the processor 40, forms a control unit 38. The control unit 38 is more particularly configured to perform a number of functions under the control of the remote user.
The video communication device 32 is movable within the premises of the industrial plant. It can thus be moved from one position to another. It may also be positioned so that it can capture video images and present digital presentations via the projector 48. For this purpose the housing 49 may be placed on a tripod 50, which is schematically shown in Fig. 3. The camera 34 has a field of view, i.e. an area in which it detects its surroundings. The field of view can be changed in different ways. It can be increased through zoom-out commands and decreased through zoom-in commands. The field of view can also be shifted or moved using various types of panning commands. In order to obtain a panning, the orientation of the camera may be changed. In a similar manner the projector has a presentation area or projection area, i.e. an area in which presentation information can be visualized. The presentation area may be centered around the line of sight of the projector and may have any suitable shape, such as circular, rectangular or square. The presentation area can also be moved by changing the orientation of the projector. The camera 34 can change its orientation in three dimensions, and the projector 48 can likewise change its orientation in three dimensions. These orientations may furthermore be changed independently of each other. In one variant, the orientations may be changed jointly by changing the orientation of the entire housing 49. Fig. 4a and Fig. 4b schematically show movements for achieving such reorientation. It can be seen that the housing 49 can be rotated 360 degrees in the horizontal plane around a vertical axis of the tripod 50. It can also be seen that the housing 49 can be tilted vertically up and down. In order to obtain such movement, the video communication device 32 may be provided with at least one motor. As indicated above, there may also be more than one motor, for instance one motor for providing vertical movement and another for providing horizontal movement. In order to obtain individual movement of the camera and the projector, there may also be two such pairs of motors, where one pair is provided for the camera 34 and the other pair for the projector 48. Furthermore, although the camera and the projector are provided inside the same housing, such individual movement can still be provided.
As can be seen from the above, the projector 48 can change orientation independently of the camera. The camera 34 and the projector 48 can thus be pointed in different directions.
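The reorientation described above — 360-degree rotation around the vertical tripod axis combined with vertical tilting — can be modelled with a small sketch. The class below is a hypothetical model of such a motorized two-axis mount; the tilt limits are assumed values, since the description does not state a mechanical range.

```python
class PanTiltMount:
    """Minimal model of a motorized two-axis mount (hypothetical API)."""

    TILT_MIN, TILT_MAX = -45.0, 45.0  # assumed mechanical tilt limits

    def __init__(self):
        self.pan_deg = 0.0
        self.tilt_deg = 0.0

    def pan(self, delta_deg):
        # Full 360-degree rotation around the vertical axis, wrapping at 360
        self.pan_deg = (self.pan_deg + delta_deg) % 360.0
        return self.pan_deg

    def tilt(self, delta_deg):
        # Vertical up/down tilt, clamped to the mechanical range
        self.tilt_deg = min(self.TILT_MAX,
                            max(self.TILT_MIN, self.tilt_deg + delta_deg))
        return self.tilt_deg

    def orientation(self):
        # The (pan, tilt) pair is the "current orientation" streamed as context
        return (self.pan_deg, self.tilt_deg)
```

A device with separate motor pairs for camera and projector would simply hold two such mounts, one per unit.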
As can be seen in Fig. 1, the video communication device 32 has access to the Internet IN via the wireless network WN. This allows the video communication device 32 to be remotely operated, i.e. operated from another location outside the plant. The video communication device 32 can thereby be operated by a remote user, for instance via the computer 51 of the remote user 52. This situation is schematically shown in Fig. 5, where it can be seen that the computer 51 of the remote user 52 can communicate with the video communication device 32 via the Internet. In this way the control unit 38 of the video communication device 32 may, for instance, receive commands from the remote user 52 via a web site to which the remote user 52 can log in. As an alternative, control commands may be sent directly from the computer 51 of the remote user 52.
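A minimal sketch of how the control unit might dispatch such remote-user commands is given below. The command names and the device interface are assumptions made for illustration; the description leaves the command protocol open.

```python
def handle_command(device, command):
    """Dispatch one remote-user command (a dict) to a device action.

    Hypothetical protocol: each command has a "name" and, for the
    movement commands, an optional "deg" argument.
    """
    actions = {
        "zoom_in":  lambda d: d.zoom(+1),
        "zoom_out": lambda d: d.zoom(-1),
        "pan":      lambda d: d.pan(command.get("deg", 0)),
        "tilt":     lambda d: d.tilt(command.get("deg", 0)),
    }
    name = command["name"]
    if name not in actions:
        raise ValueError(f"unknown command: {name}")
    return actions[name](device)
```

Whether commands arrive via a logged-in web site or directly from the remote user's computer, the dispatch step at the device end could look the same.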
The remote user 52 can thereby obtain the video images captured by the camera 34 of the video communication device 32, which video images are then presented to him via the computer 51 of the remote user 52. This is shown in Fig. 6. Fig. 5 schematically indicates the transmission of a video stream VS, a three-dimensional model 3DM and camera data CD from the video communication device 32 to the computer 51 of the remote user 52. Information from the sensors may also be transmitted wirelessly to the remote user via the Internet. Information provided by the remote user back to the plant may likewise be transmitted via the Internet. More details about these transmissions will be given shortly.
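The three transmitted items — the video stream VS, the three-dimensional model 3DM and the camera data CD — could, for example, be multiplexed over one connection by tagging each message with its channel. The byte layout below is a hypothetical example for illustration only, not a format taken from the description.

```python
CHANNELS = {"VS": 1, "3DM": 2, "CD": 3}  # video stream, 3D model, camera data


def tag_message(channel, payload):
    """Prefix a payload with a one-byte channel tag and a 4-byte length."""
    return bytes([CHANNELS[channel]]) + len(payload).to_bytes(4, "big") + payload


def split_messages(buffer):
    """Split a received byte buffer back into (channel, payload) pairs."""
    names = {v: k for k, v in CHANNELS.items()}
    out, i = [], 0
    while i < len(buffer):
        channel = names[buffer[i]]
        n = int.from_bytes(buffer[i + 1:i + 5], "big")
        out.append((channel, buffer[i + 5:i + 5 + n]))
        i += 5 + n
    return out
```

The same channel scheme would extend naturally to the sensor data and the return channel from the remote user mentioned above.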
Some embodiments of the invention will now be described in more detail.
In industrial environments, such as an industrial plant in which a process is run by a process control system, it is extremely important to keep production running at all times, because even a slight interruption in production may be very costly. Maintenance is therefore considered extremely important in order to keep production operating.
Maintenance can be very expensive, because external experts must sometimes be called in to help with advanced operations that persons without the required expertise cannot handle on their own.
Cooperation between local workers and an expert over a telephone line is often not good enough. The expert may need to see what is happening on site and may be required to instruct the on-site personnel without any risk of misunderstanding. Sending photographs back and forth is also a slow way of sharing information, and is therefore not truly effective either.
Finding a suitable expert and having that expert fly to the site can take a long time. In the case of a sudden failure where external expert assistance is needed in order to continue production, this leads to long downtimes, since the expert may have to travel a long distance to reach the site.
Having an external expert fly to the site can also be very expensive. Not only are there the costs associated with the expert (travel, accommodation, etc.), but the production interruption while the plant personnel wait for the expert to arrive can be very costly for the plant owner.
The situation above is addressed through the use of the video communication device.
In operation, i.e. when something happens at a location in the plant, the video communication device is brought to that location at the industrial site and placed at the spot where assistance is needed. The device may be placed, for example, in the centre of a room. The video communication device may be placed at this location by a local user in order to help solve a problem at the location, for example the fact that one or more machines or process control objects may have failed, or that the process behaves abnormally there.
Once the device has been brought to the location, a number of activities can be carried out.
In a first variation, the remote user is provided with context data related to the video stream. The first variation is described with reference to Figs. 7, 8 and 9, where Fig. 7 schematically shows the filming apparatus of the video communication device being used to capture video of a part of a process control object at the location of the industrial site, Fig. 8 schematically shows the presentation of this video together with a three-dimensional view of the location and the video communication device, and Fig. 9 schematically shows a flow chart of a method of sending video to a remote user, performed by the video communication device.
According to the first variation, the control unit 38 first makes the video communication device 32 scan the area at the location, step 56. This may be done by the control unit 38 controlling the motor shown in Figs. 4a and 4b to rotate the housing 49 about the vertical rotation axis, in combination with controlling the motor to tilt the housing 49 up and down at different tilt angles. In this way the video communication device, using the filming apparatus 34, captures different video images of the three-dimensional space around its site.
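The scanning in step 56 amounts to stepping the two motors through a grid of pan and tilt angles and capturing an image at each pose. The patent gives no algorithm for this, so the following is only a minimal sketch; the function name and the angular step values are illustrative assumptions, not taken from the source.

```python
def scan_poses(pan_step_deg=30, tilt_min_deg=-30, tilt_max_deg=60, tilt_step_deg=30):
    """Enumerate (pan, tilt) motor poses for one full horizontal sweep.

    Each pose corresponds to one video image captured during the scan;
    together the images cover the 3D space around the device's site.
    """
    poses = []
    for pan in range(0, 360, pan_step_deg):  # rotation about the vertical axis
        for tilt in range(tilt_min_deg, tilt_max_deg + 1, tilt_step_deg):
            poses.append((pan, tilt))
    return poses

poses = scan_poses()  # 12 pan positions x 4 tilt positions with the defaults
```

The control unit would drive the motors to each pose in turn and hand the captured frame to the model-building step described next.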
After the scanning, the control unit 38 analyses the images captured in the area and investigates whether it can identify them against a pre-stored three-dimensional model of the location and of the objects at the location, i.e. the process control objects and any other objects that may be present there. If it has identified the video images and a pre-stored model therefore exists, step 58, the model is fetched, step 60. This pre-stored three-dimensional model may be provided in the video communication device 32. Alternatively, it may be obtained or fetched from a server, such as the server 21. If a pre-stored model exists, then data about previous sites of the video communication device at the location, together with the video streams recorded at those previous sites and the corresponding filming apparatus orientations, may have been saved together with the model. The previous site data and the associated historical video streams may then also be fetched.
Thus, if any three-dimensional model of the location has been made, it is fetched. However, if no pre-stored model exists, step 58, a new three-dimensional model 3DM of the location and of the various objects at the location is created by the control unit 38, step 62. In this case the model may thus be obtained by creating it. The model may, for example, be created using augmented reality functionality. If the data presenting device includes an infrared sensor, the use of infrared technology, such as Microsoft Kinect, is also possible. A 3D mapping of the natural features at the location may be constructed using various feature extraction methods, for example corner or edge detection performed on 2D RGB data and 3D RGBD (red, green, blue, depth) data. Using this sparse mapping it is also possible to determine the site of the video communication device 32 carrying the filming apparatus 34. It is furthermore possible to establish the orientation or pose of the filming apparatus 34, i.e. to determine in which direction the filming apparatus 34 is pointing. The orientation may be calculated based on registration algorithms. These algorithms can be used to locate the features of the current frame or video image in the real-world mapping, and to determine the orientation of the filming apparatus 34 based on this.
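The registration idea can be illustrated with a deliberately simplified 2D example: if the bearing of each mapped feature from the device's site is known, and the current frame shows each feature at some angle from the camera's line of sight, the pan orientation of the filming apparatus is whatever value aligns the two. A real system would perform full 6-DoF registration on RGBD data; this toy version, with all names invented here, just averages the per-feature estimates.

```python
def estimate_pan(world_bearings_deg, observed_offsets_deg):
    """Estimate the camera pan angle from registered features.

    world_bearings_deg:   bearing of each mapped feature as seen from the
                          device site, relative to the reference angle.
    observed_offsets_deg: angle of the same feature in the current frame,
                          relative to the camera's line of sight.
    Subtracting the observation from the world bearing leaves the pan;
    averaging is the least-squares solution for this simplified model.
    """
    estimates = [w - o for w, o in zip(world_bearings_deg, observed_offsets_deg)]
    return sum(estimates) / len(estimates)

# Two features mapped at bearings 40 and 80 degrees, seen in the current
# frame at -5 and +35 degrees from the optical axis: the camera pans at 45.
pan = estimate_pan([40.0, 80.0], [-5.0, 35.0])
```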
The process control objects, i.e. real-world objects, may be provided with object identifiers, such as NFC tags or bar codes. If such an identifier is read, it is possible to obtain information about what type of objects they are. The type may be identified through the filming apparatus 34 detecting a visual object identifier, such as a bar code. Alternatively, a short-range communication unit may be provided to read a tag carrying the object identifier. Such a bar code may be used, for example, to fetch data associated with the object from a database in the process control system. In order to simplify the fetching of such data, the control unit 38 may therefore store the object identifiers in association with the corresponding objects in the three-dimensional model 3DM of the location.
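The association kept by the control unit 38 between read identifiers and model objects can be pictured as a simple lookup table. This sketch is purely illustrative; the identifier values and object keys are invented for the example.

```python
# Hypothetical association kept by the control unit: objects in the 3D model,
# and a table mapping each scanned identifier (bar code / NFC tag value)
# to the key of its object in the model.
model_objects = {
    "obj-26": {"name": "second process control object", "site": (2.0, 1.0, 0.0)},
}
id_to_object = {"BC-4711": "obj-26"}  # filled in as identifiers are read

def fetch_object_key(identifier):
    """Resolve a read identifier to its model object key, or None if unknown."""
    return id_to_object.get(identifier)

key = fetch_object_key("BC-4711")
```

With the key resolved, the control unit could then use the identifier to fetch the object's data from the process control system database.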
The scanning or the short-range communication may also be used by the control unit 38 to determine the site of the video communication device 32 at the industrial site, step 63, i.e. its site in relation to the layout and the various objects at the location. This may also involve adding the video communication device to the three-dimensional model 3DM of the location.
The steps above may be performed before a communication session with the remote user 52 is started. Such a session may be carried out using WiFi and a TCP connection set up over the internet. Alternatively, these steps are performed only after the communication session has started. In both cases, the control unit 38 investigates whether a communication session is set up or ongoing. Such a session at least involves an audio communication session between the remote user and the local user, carried out via the computer 51 and the sound generating and sound recording devices of the video communication device 32. It also involves the transmission of a live video stream VS, which may be a one-way video stream from the video communication device 32 in the process control system to the computer 51 of the remote user 52.
In some instances a two-way video conference may also be involved, i.e. video is also provided by the computer 51 of the remote user 52 and sent to the video communication device 32. The video images captured by the filming apparatus 34 may thus be transmitted to the remote user 52. Moreover, data of the remote user 52 may be projected at the location under the control of the remote user.
If there is no ongoing session, step 64, the control unit 38 waits for a session to be initiated by the local user or by the remote user 52.
However, if a session is in progress, step 64, the control unit 38 controls the filming apparatus 34 to record a video stream, step 66. It thus controls the filming apparatus 34 to capture video images VI of the location. The control unit 38 also continuously determines the filming apparatus orientation, step 68, for example based on the line of sight of the viewfinder of the filming apparatus 34. The control unit 38 thereby determines the current orientation of the video communication device 32. The orientation may be provided as a solid angle in relation to the site of the video communication device and a reference angle. Alternatively or additionally, a gyroscope and/or an accelerometer may be used in determining the orientation.
During a communication session the model 3DM may be transmitted from the video communication device to the remote user 52. More particularly, the three-dimensional model 3DM may be transmitted together with the filming apparatus data CD (in the video stream VS or alongside it), step 70, where the filming apparatus data may comprise the filming apparatus orientation. The filming apparatus data may also comprise the site of the filming apparatus, i.e. the site of the video communication device. It is furthermore possible that the control unit 38 modifies the model of the location so that the video communication device and its orientation become part of the model. The filming apparatus data may thus be provided as a part of the model. In this case the orientation may therefore be provided as a part of the three-dimensional model that is continuously transmitted to the device of the remote user. Alternatively, changes in orientation during the transmission of video images from the video communication device to the device of the remote user may be transmitted as updates to the three-dimensional model. The computer of the remote user may then use these updates when modifying the three-dimensional model.
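Step 70 can be pictured as a stream of small metadata messages sent alongside the video frames: the full model 3DM once, then orientation changes as incremental model updates that the remote computer merges into its copy. The patent specifies no wire format, so the message layout below is purely illustrative.

```python
import json

def make_camera_update(site, pan_deg, tilt_deg, frame_no):
    """Incremental update to the 3D model: a new filming apparatus pose."""
    return json.dumps({
        "type": "model_update",
        "frame": frame_no,  # video frame this pose belongs to
        "camera": {"site": site, "pan": pan_deg, "tilt": tilt_deg},
    })

def apply_update(model, message):
    """Remote side: merge a received pose update into its copy of 3DM."""
    msg = json.loads(message)
    if msg["type"] == "model_update":
        model["camera"] = msg["camera"]
    return model

model = {"objects": ["obj-26"], "camera": None}
model = apply_update(model, make_camera_update([2.0, 1.0, 0.0], 45.0, 10.0, 1042))
```

Only the pose travels with each frame; the bulky geometry of 3DM need not be resent while the device stays at the same site.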
The remote user then receives the video stream together with the model 3DM and the filming apparatus data CD. The remote user can then watch the captured video and also obtain a three-dimensional view of the location using the model 3DM. The remote user is thereby able to see what he would be able to see in the field.
An example of this is shown in Figs. 7 and 8. Fig. 7 shows how the video communication device 32 is at the same location as the first, second and third process control objects 24, 26 and 28, and how it captures video images of a part of the second process control object 26. It thereby records video for the remote user 52. Fig. 8 shows a video image VI of the video stream VS as it may appear when displayed on the computer 51 of the remote user 52. The video image may contain much useful information, but it may lack context. This context is provided through the model 3DM being transmitted together with the video stream VS and the filming apparatus data CD. Fig. 8 shows the screen that the remote user 52 may see on the display of his computer 51. The view comprises the live video stream from the filming apparatus, in which the image VI is presented. In addition, context information is provided through an overview image OI of the location, which is obtained by visualising the model 3DM of the location together with the video communication device VCD and its orientation. Here it is possible for the computer 51 of the remote user to place a representation of the video communication device, together with the orientation of the image, into the model. Alternatively, this is done by the control unit 38 of the video communication device 32. In the latter case a modified model, in which the video communication device and the orientation are shown, is provided.
The control unit 38 then investigates whether the communication session has ended. If it has not ended, step 72, video continues to be recorded and the filming apparatus orientation continues to be determined, step 68, and is sent to the remote user together with the model, step 70. However, if the communication session has ended, step 72, the operation is also ended, step 74.
Through the first variation it can thus be seen that it is possible to track the current site and orientation of the filming apparatus 34 during the video conference while also constructing a mapping of the environment, enabling the remote user 52 to have better awareness of the location context. As can be seen in Fig. 8, the remote user 52 both sees the current filming apparatus view VI and is able to use the small overview image OI in the lower right corner to get a very good overview of the surroundings. Here the remote user may also browse in the constructed 3D view, and is therefore not limited to observing the current frame of the video stream, but may freely "explore" the known 3D model constructed from the video frames.
The conference call (the purpose of which is to share the environment of one user) is thus not limited to simple streaming of video data, but may also include data on the site and the current pose or orientation of the filming apparatus, where the orientation may be provided as the orientation of the line of sight of the filming apparatus viewfinder. If a prior model exists, and historical video streams were recorded when it was made, the remote user may also be able to fetch video streams previously recorded at the location, together with the corresponding sites of the video communication device and filming apparatus orientations.
Some advantages of the invention may be better appreciated from the following scenario:
1. A local maintenance engineer is doing some maintenance on the factory floor when he identifies a potential problem; he calls a remote user and initiates a video call to get advice on the situation.
2. Using the filming apparatus of the video communication device, he makes it scan the location to show the present state of a process control object (in this example the second process control object 26) and its surroundings.
3. The different video frames are processed to form a mapping of the environment. In the formation of this mapping, the site of the video communication device is also determined. Using this mapping and the current video image, the current orientation or pose of the filming apparatus is calculated.
4. The mapping 3DM of the environment, together with the filming apparatus data, is then sent over the network during the call, together with the video stream.
5. Since the remote user can see the environment mapping and the video stream at the same time, this additional information helps him orient himself in a world with a dynamically changing filming apparatus orientation. The remote user obtains far better context awareness than an ordinary video conference system can give.
6. Thanks to this efficient video collaboration, the remote user obtains a crystal-clear understanding of the local environment at the factory site, and the two users can then try to resolve the situation.
The first variation has several further advantages. In the first variation the filming apparatus and mapping data, i.e. the filming apparatus data and the three-dimensional model, are transmitted together with the video stream. This improves context awareness compared with a conventional video stream, promoting less confusion and a better sense of position.
The streamed data is used to create a complete picture of the location. The remote user is able to use this 3D model to navigate a viewpoint independently of the physical filming apparatus site; this gives the remote user greatly improved context awareness.
Another advantage is that the number of questions that need to be asked is reduced: questions such as "Where are you now?" and "What part am I looking at now?", and other orientation questions that remotely collaborating engineers currently have to deal with, are avoided.
Communication also becomes more accurate; communication errors related to position become less common.
Cooperation between the two users also becomes more efficient, and the time needed to complete a task requiring video collaboration is very likely improved.
It is furthermore possible that safety is enhanced. With better situational awareness of the case at hand, the remote user can observe whether the local user is carrying out the correct actions.
A second variation will now be described with reference to Figs. 10, 11a, 11b and 12, where Fig. 10 schematically shows the location with the video communication device 32 providing a presentation area PA in which a presentation item PI is projected, Figs. 11a and 11b schematically show the location with the video communication device 32 and the presentation item PI when the presentation area PA is moved, and Fig. 12 schematically shows a flow chart of a method of operating the video communication device 32 by the remote user 52.
When at the location, the video communication device 32 is advantageously used for obtaining data from the location to be provided to the remote user 52, and for receiving instructions from the remote user 52 to the local user at the location. This may be done through two-way voice or video communication.
When a communication session is ongoing, the control unit 38 therefore fetches sensor measurement values from the sensors, such as the temperature sensor and the ambient light sensor, and transmits these sensor measurement values to the computer 51 of the remote user 52, step 76. The filming apparatus 34 also captures video VS and transmits it to the remote user 52, step 78.
The remote user 52 may now wish to obtain more data related to some of the process control objects he sees in the video stream VS. He may, for example, wish to obtain data on the temperature in a tank or the voltage of a transformer. To accomplish this, he may select an object in the video or in the previously obtained model of the location. He may, for example, detect an object identifier in the video and send this object identifier to the video communication device. He may also select an object in the model, and this selection may then be conveyed to the control unit 38. The control unit 38 may then fetch data about the object from the database 20. It may, for example, fetch a panel with the current data of the process control object.
The control unit 38 may thus receive a process control object selection from the remote user 52, step 80, and based on this selection fetch the process control object data from the process control system, for example from the database 20, and transmit this process control object data to the computer 51 of the remote user 52, step 82. The remote user 52 may thus select an object in the model of the location, and when the object has been selected he may obtain additional data, such as a panel with operational information.
After a process control object has been selected, or if no process control object is selected, the control unit 38 may receive a presentation item PI from the remote user. The remote user 52 may more particularly provide a presentation item to be projected by the projector. The presentation item PI may be a digitised presentation item. It may be a digital still image, such as the image of an arrow or a circle, a presentation such as a slide show, or a text string with instructions. It may also be a drawing made by the remote user 52. The presentation item may thus be a remote-user-generated presentation item comprising instructions and visual indicators. The presentation item PI may thereby become an image presented by the projector 48 to the local user. If such a presentation item PI is received, step 84, a selection of a site for the presentation item may also be received. The remote user may select the site of the presentation item PI in the 3D model 3DM of the location, and this site selection may also be conveyed to the control unit 38. The control unit 38 then associates the presentation item with the selected site, step 86. The site of the presentation item may be set using a solid angle in relation to the site of the video communication device and a reference angle, together with a radius. A presentation item may thereby be assigned to a space in the three-dimensional model of the location. In this way it is also possible to specify more than one presentation item.
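Step 86 anchors the projected item PI by a direction (a solid angle relative to the device site and a reference angle) plus a radius, which together fix a point in the model. A sketch of that conversion, under an assumed pan/tilt convention (pan in the horizontal plane from the reference angle, tilt above horizontal) and with invented names:

```python
import math

def item_anchor(device_site, pan_deg, tilt_deg, radius):
    """Cartesian site of a projected item anchored by (solid angle, radius)
    relative to the site of the video communication device."""
    pan, tilt = math.radians(pan_deg), math.radians(tilt_deg)
    dx = radius * math.cos(tilt) * math.cos(pan)
    dy = radius * math.cos(tilt) * math.sin(pan)
    dz = radius * math.sin(tilt)
    x, y, z = device_site
    return (x + dx, y + dy, z + dz)

# Device at height 1 m; item anchored 2.5 m straight along the reference angle.
anchor = item_anchor((0.0, 0.0, 1.0), 0.0, 0.0, 2.5)
```

Because the anchor is a fixed point in the model, the projector can keep the item at the same real-world spot even after it is redirected.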
Thereafter the control unit 38 awaits filming apparatus control commands that may come from the remote user 52. A filming apparatus control command may comprise a field-of-view control command, such as a zoom command, which changes the size of the field of view while retaining the same line of sight, or an orientation control command, which changes the line of sight. Orientation control commands typically include panning commands. The remote user 52 is thereby able to change the orientation of the filming apparatus 34 through rotating or tilting it. The remote user 52 may also zoom in and zoom out. If commands are received, step 88, they are acted upon by the control unit 38. If a command is a field-of-view command, it is used to control the field of view of the filming apparatus, step 90. A zoom command is forwarded to the filming apparatus 34, which then zooms in or out depending on the type of the control command. If tilting or rotation is required, the control unit 38 instead operates the corresponding motor to obtain the desired movement.
Thereafter the control unit 38 may receive projector control commands from the remote user 52. A projector control command may comprise a command to project the presentation item PI. In some cases such a command may also be a command to project the presentation item at a specific desired site. If a projector control command is received, step 92, the projector 48 is operated by the control unit 38 according to the command. This involves, if the control command is a command to project the presentation item PI, controlling the projector 48 to project the presentation item PI in the presentation area PA of the projector, step 94. If the command is to project at a specific site, the projector is controlled so as to project the presentation item PI at this site. A command may also comprise a command to change the orientation of the presentation area. In this case the projector may be moved using the same motor as is used for the filming apparatus 34, or another motor, and controlled to project the presentation item so that it appears at the desired site, step 94. The remote user can thus control the video communication device to project a presentation item in a selected space or at a selected site at the location. This may involve projecting the presentation item at the real-world site corresponding to the associated site in the three-dimensional model. If, according to the presentation item site, a real-world object at the location would be in front of the presentation item, the part of the presentation item that is blocked by the real-world object will not be presented.
If the projector is redirected so that the presentation area PA moves, the presentation item PI can be set to remain at the site selected by the user. As the presentation item interacts with the model, this also means that the presentation item can be retained for later sessions at the location. Furthermore, the projection of presentation items is carried out independently of the video presentation.
The control unit 38 may thus separate the control of the projector 48 from the control of the filming apparatus 34, or make them independent. For example, if the stream of the filming apparatus 34 is zoomed in on a detail so that the presentation item PI ends up outside the field of view of the filming apparatus 34, the presentation item PI will still be presented. The control of what is presented in the presentation area is thus carried out independently of the control of the filming apparatus field of view. As is evident from the zoom example above, this means that the site of a presentation item in the presentation area PA of the projector 48 may lie outside the filming apparatus field of view. It also means that the presentation area PA may differ from the field of view of the filming apparatus 34. When the filming apparatus control command is a command controlling the orientation of the filming apparatus, and the projector control command is a command controlling the orientation of the projector, it can likewise be seen that the control of the projector orientation is performed independently of the control of the filming apparatus orientation; the control of the filming apparatus orientation thus does not influence the control of the projector orientation.
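The independence of the two orientation controls can be sketched as the control unit routing each command to its own actuator state, so that moving the filming apparatus never touches the projector pose. The class and command layout here are illustrative assumptions only.

```python
class PanTiltAxis:
    """State of one motor-driven pan/tilt pair (degrees)."""
    def __init__(self):
        self.pan = 0.0
        self.tilt = 0.0

    def orient(self, pan, tilt):
        self.pan, self.tilt = pan, tilt

class ControlUnit:
    """Routes orientation commands; camera and projector are independent."""
    def __init__(self):
        self.filming_apparatus = PanTiltAxis()
        self.projector = PanTiltAxis()

    def handle(self, command):
        target = (self.filming_apparatus if command["device"] == "camera"
                  else self.projector)
        target.orient(command["pan"], command["tilt"])

cu = ControlUnit()
cu.handle({"device": "projector", "pan": -30.0, "tilt": 0.0})  # keep PI in place
cu.handle({"device": "camera", "pan": 90.0, "tilt": 20.0})     # look elsewhere
```

After the camera command, the projector pose is unchanged, which is exactly the independence the text describes.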
From Figs. 11a and 11b it can be seen that the presentation area PA of the projector 48 is movable. If there are several presentation items that fit within the presentation area at its present site, these presentation items may, on the orders of the remote user, be presented individually or simultaneously.
If, for example, presentation items are provided of which some lie outside the current site of the presentation area PA, the projector 48 may be redirected so that one or more of these presentation items are projected. After they have been specified, the remote user may simply select the presentation item to be presented, and the control unit 38 will control one or more motors to redirect the projector so that its presentation area covers the selected presentation item.
Thereafter the capturing of video continues, step 78, and the various commands from the remote user are awaited, steps 80, 84, 88 and 92. This type of operation continues for as long as the session lasts.
The remote user 52 may thus send commands controlling the projector 48, the filming apparatus 34 and the various sensors (such as the temperature sensor).
Through the video communication device 32 the remote user is able to gain knowledge of the operation of the process control objects at the location, and to obtain other information, such as the temperature at the location. In order to observe the location, the remote user 52 may also rotate the filming apparatus and obtain visual data of the location. Through the voice connection the remote user may furthermore communicate with the local user and receive spoken opinions on what may be the problem at the location.
The remote user may then decide on appropriate actions, such as which process control objects, and which parts of them, are to be actuated and when. For example, the remote user may provide a number of presentation items, such as arrows and descriptive text, assigned to different sites in the virtual model. The remote user may also provide timing commands giving the order in which the presentation items are to be presented. The commands and presentation items may then be sent to the video communication device 32 and presented by the projector 48 in the order determined by the remote user 52. If the presentation items are provided within the presentation area at its current site, they may be presented simultaneously. When a new presentation item to be presented lies outside the present field of view of the projector 48 (i.e. outside the presentation area at its current site), the projector 48 may be moved or redirected so that the presentation area covers the site of the new presentation item. This movement of the projector 48 can be carried out independently of the filming apparatus 34. In this way the remote user 52 is able to present information at one place, such as instructions for actuating a certain process control object, while simultaneously monitoring another object at another place not covered by the projector 48.
The second variation thus provides a method allowing a remote user to remotely instruct field personnel through a live video stream. The video communication device also allows the local user and the remote user to communicate audibly. It further allows the remote user to obtain an overview of the environment through the filming apparatus. The remote user can also roll, pan and zoom the filming apparatus at the site in order to obtain a better overview of the site from the remote position. Since a 3D filming apparatus is used, the remote user will, should he need additional spatial information about the location, be able to see a 3D model of the environment.
Moreover, by using the projector's ability to project information onto the real world, the remote user may also add presentation items or other information, such as markings and drawings, to the physical world; that is, the remote user can visually share information and markings with the local user at the location.
All the sensors, together with the filming apparatus and the sound recording device, allow the remotely connected user to see, hear and experience the situation at the plant. The projector and the sound generating device can then be used to convey information fed back from the remote user to the field personnel. The projector is there to allow the remote user to visually convey information back to the plant personnel.
By allowing the remote user to control the video communication device, the remote user can browse the environment with the filming apparatus through rotation, tilting and zooming. When the remote user has information that he wants to share with the local user at the site, the remote user can use the projector to "draw" the information in the presentation area. The remote user can use text or images, or simply draw objects, on the remote screen. The drawing is then projected onto the site using the projector. Since the filming apparatus records a 3D model of the environment, an annotation can also become a persistent object.
All visual information provided by the remote user can be augmented reality information, meaning that any marking or drawing the remote user adds is saved at, or connected to, the point where it was added, using the constructed 3D model of the environment. This means that if the remote user rotates the filming apparatus after a marking has been added, the marking will remain at the same point.
As can be seen in Fig. 11a, the remote user has added a presentation item PI. When the remote user, as can be seen in Fig. 11b, rotates the video communication device 32, for example in order to obtain a better overview, the presentation item PI is still projected correctly even though the site of the presentation area PA has changed. It can thus be seen that even if the presentation area is moved, the real-world site at which the presentation item is projected is retained.
Imagine the following scenario:
1. A serious problem has unexpectedly occurred on one of a Norwegian natural gas company's offshore platforms. The problem is rare and technically complex, and the site operators need expert support to resume production.
2. Because all the experts are at remote locations, flying an expert to the site would take at least 48 hours.
3. The operator contacts a support company, where an expert on this kind of technical problem can immediately help resolve the problem on the offshore platform.
4. The operator discusses with the expert, who instructs the site operators to bring the video communication device to the specific part of the production process, so that the remote expert can see the problem.
5. The remote expert observes the field conditions using the filming apparatus, the sound recording device and the sensors.
6. Based on the information from the offshore platform, the remote expert can now instruct the site operators to perform certain operations to correct the problem.
7. The remote expert shares information with the offshore users using voice and, possibly, visual means. The possibility for the remote expert to share information by sound and vision is extremely effective, since the remote expert is able to "point out" immediately where on the site the operator should take action.
Through the second variation, the remote user is given the possibility of providing support immediately, anywhere in the world. Remote users no longer need to be physically present each time their assistance is required; in many cases they can solve the problem from their office. The remote user is given a level of context awareness that could not be achieved using a video stream alone, without building a 3D model of the real world.
The local user, such as a service engineer, is given a low-key and natural way of viewing the augmented reality information, which in many cases is better than viewing augmented reality information through wearable glasses or a handheld screen.
Annotations can be added to the environment by the remote user and projected onto physical device surfaces for the local maintenance engineer to view.
There is thus the possibility of adding annotations to the 3D model of the world and displaying these annotations at the site using the projector.
An annotation added to the 3D model also stays fixed at its position, even if the filming apparatus then covers another site.
Markings and annotations added to the environment and/or to the 3D model of the world can also be recorded and saved as a part of the maintenance history of the industrial plant. If the video communication device is brought back to some known location, this information can then be retrieved again in the future.
It will also be appreciated that the two variations can be combined. The activities of the two variations could then be carried out in the same communication session. In this case, the knowledge of the position obtained by the remote user in the first variation can be used to control the video communication device, in particular when the projector is used.
The video communication device described above is provided with a projector and comprises a tripod. It will be appreciated that the video communication device may instead be a handheld device such as a video camera, a portable computer or a mobile phone, for instance a smartphone.
The control unit may, as described above, be provided in the form of a processor together with a memory, the memory comprising computer program code for performing its functions. The computer program code may also be provided on one or more data carriers, which perform the functions of the control unit when the program code thereon is loaded into the memory and run by the processor. One such data carrier 96 with computer program code 98, in the form of a CD ROM disc, is schematically shown in Figure 13.
In addition to the embodiments already mentioned, the invention may be varied in many more ways. It should therefore be realized that the invention is only limited by the following claims.
Claims (13)
1. A method for transmitting video to a remote user from an industrial site of an operational process control system (10), the method being performed by a video communication device (32) placed at a location of the industrial site and performed in relation to a communication session carried out between the video communication device and a device (51) of the remote user, the method comprising:
obtaining a three-dimensional model of the location and of the various objects at the location,
obtaining a position of the video communication device at the location,
capturing video images of the location using a filming apparatus (34) of the video communication device,
determining a current orientation of the filming apparatus,
adding a representation of the video communication device to the three-dimensional model of the location, so that the video communication device is shown in the three-dimensional model, and
in the communication session from the video communication device to the device of the remote user, transmitting the three-dimensional model, together with the current orientation of the filming apparatus and the position of the video communication device, together with a video stream comprising the captured video images, to the device of the remote user.
2. The method as claimed in claim 1, further comprising investigating whether a pre-stored three-dimensional model of the location exists, and, if the pre-stored three-dimensional model exists, retrieving and using the pre-stored three-dimensional model, and otherwise creating the three-dimensional model based on scanning the location.
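The retrieve-or-create logic of claim 2 can be sketched as follows. This is an illustrative outline under assumed names, not the patented implementation: `repository` stands in for whatever store holds pre-stored models, and `scan_location` stands in for building a 3D model from a scan of the surroundings.

```python
from typing import Callable

def obtain_model(location_id: str,
                 repository: dict,
                 scan_location: Callable[[str], dict]) -> dict:
    """Investigate whether a pre-stored 3D model of the location exists;
    retrieve and use it if so, otherwise create one by scanning."""
    model = repository.get(location_id)
    if model is not None:
        return model                        # retrieve and use the pre-stored model
    model = scan_location(location_id)      # otherwise create it from a scan
    repository[location_id] = model         # keep it for future sessions
    return model

repo = {"boiler-hall": {"vertices": 1200, "source": "pre-stored"}}
fresh = obtain_model("boiler-hall", repo, lambda lid: {"source": "scan"})
print(fresh["source"])
```

A newly scanned model is stored back into the repository, which is what later makes the previous position, orientation and video recordings of claim 4 retrievable.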
3. The method as claimed in claim 1, wherein the current orientation of the filming apparatus is provided as a solid angle relative to the position of the video communication device and a reference angle.
4. The method as claimed in claim 2, further comprising, if a pre-stored three-dimensional model exists, obtaining a previous position of the video communication device, together with a previous orientation of the filming apparatus and a video recording made at the previous position.
5. The method as claimed in any one of claims 1-4, wherein the current orientation of the filming apparatus is provided as a part of the three-dimensional model and is continuously sent to the device of the remote user.
6. The method as claimed in any one of claims 1-4, wherein, during the transmission of video images from the video communication device to the device of the remote user, the current orientation of the filming apparatus is transmitted as updates of the three-dimensional model.
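The method claims above can be summarized in a small sketch: the orientation is expressed as angles relative to a reference angle at the device position (claim 3), the initial transmission bundles the 3D model, the position and the orientation with the video stream (claim 1), and subsequent stream messages carry only orientation updates (claim 6). Everything here is an assumed illustration; the message fields and function names are hypothetical.

```python
import math
from dataclasses import dataclass

@dataclass
class Orientation:
    # Pan and tilt in radians, relative to a reference angle tied to
    # the device position (a solid-angle style orientation, claim 3).
    pan: float
    tilt: float

def orientation_update(reference_pan: float, raw_pan: float,
                       raw_tilt: float) -> Orientation:
    """Normalize the raw camera angles against the reference angle so the
    remote side can place the viewing direction in the 3D model."""
    pan = (raw_pan - reference_pan) % (2 * math.pi)
    return Orientation(pan=pan, tilt=raw_tilt)

def initial_payload(model: dict, position: tuple,
                    orientation: Orientation, frame: bytes) -> dict:
    # Claim 1: the 3D model, device position and camera orientation are
    # transmitted together with the video stream.
    return {"model": model, "position": position,
            "orientation": (orientation.pan, orientation.tilt), "frame": frame}

def stream_update(orientation: Orientation, frame: bytes) -> dict:
    # Claim 6: during streaming, only orientation updates accompany frames.
    return {"orientation": (orientation.pan, orientation.tilt), "frame": frame}

o = orientation_update(reference_pan=math.pi / 2, raw_pan=math.pi, raw_tilt=0.1)
msg = initial_payload({"id": "site-3d"}, (4.0, 1.5, 0.0), o, b"\x00")
print(round(msg["orientation"][0], 3))
```

The design choice this illustrates is that the heavy payload (the 3D model) is sent once, while the cheap, frequently changing part (the orientation) is streamed continuously, which is what lets the remote side keep the device's viewing direction aligned with the model.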
7. A video communication device (32) for transmitting video to a remote user (52) from an industrial site of an operational process control system (10), the video communication device (32) being placed at a location of the industrial site and comprising:
a communication interface for providing a communication session between the video communication device and a device (51) of the remote user (52),
a filming apparatus (34) for capturing images, and
a control unit (38) configured to:
obtain a three-dimensional model of the location and of the various objects at the location,
obtain a position of the video communication device at the location,
control the filming apparatus to capture video images of the location,
determine a current orientation of the filming apparatus,
add a representation of the video communication device to the three-dimensional model of the location, so that the video communication device is shown in the three-dimensional model, and
in the communication session from the video communication device to the device of the remote user, transmit the three-dimensional model, together with the current orientation of the filming apparatus and the position of the video communication device, together with a video stream from the filming apparatus comprising the captured video images, to the device of the remote user.
8. The video communication device as claimed in claim 7, wherein the control unit is further configured to investigate whether a pre-stored three-dimensional model of the location exists, to retrieve and use the pre-stored three-dimensional model if it exists, and otherwise to create the three-dimensional model based on scanning the location using the filming apparatus.
9. The video communication device as claimed in claim 7, wherein the current orientation of the filming apparatus is provided as a solid angle relative to the position of the video communication device and a reference angle.
10. The video communication device as claimed in claim 8, wherein, if a pre-stored three-dimensional model exists, the control unit is further configured to obtain a previous position of the video communication device, together with a previous orientation of the filming apparatus and a video recording made at the previous position.
11. The video communication device as claimed in any one of claims 7-10, wherein the current orientation of the filming apparatus is provided as a part of the three-dimensional model and is continuously sent to the device of the remote user.
12. The video communication device as claimed in any one of claims 7-10, wherein, during the transmission of video images from the video communication device to the device of the remote user, the current orientation of the filming apparatus is transmitted as updates of the three-dimensional model.
13. A computer-readable medium having stored thereon computer program code (98), the computer program code (98) being configured to, when loaded into a video communication device (32) comprising a filming apparatus (34) and placed at a location of an industrial site of an operational process control system (10), and when the video communication device carries out a communication session with a device (51) of a remote user (52), cause the video communication device (32), for transmitting video from the industrial site to the remote user (52), to:
obtain a three-dimensional model of the location and of the various objects at the location,
obtain a position of the video communication device at the location,
capture video images of the location using the filming apparatus (34),
determine a current orientation of the filming apparatus,
add a representation of the video communication device to the three-dimensional model of the location, so that the video communication device is shown in the three-dimensional model, and
in the communication session from the video communication device to the device of the remote user, transmit the three-dimensional model, together with the current orientation of the filming apparatus and the position of the video communication device, together with a video stream comprising the captured video images, to the device of the remote user.
Applications Claiming Priority (1)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
PCT/EP2013/063539 WO2014206473A1 (en) | 2013-06-27 | 2013-06-27 | Method and video communication device for transmitting video to a remote user |
Publications (2)
Publication Number | Publication Date |
---|---|
CN105659170A CN105659170A (en) | 2016-06-08 |
CN105659170B true CN105659170B (en) | 2019-02-12 |
Family
ID=48703510
Family Applications (1)
Application Number | Title | Priority Date | Filing Date |
---|---|---|---|
CN201380077726.9A Active CN105659170B (en) | 2013-06-27 | 2013-06-27 | For transmitting the method and video communication device of video to remote user |
Country Status (4)
Country | Link |
---|---|
US (1) | US9628772B2 (en) |
EP (1) | EP3014367B1 (en) |
CN (1) | CN105659170B (en) |
WO (1) | WO2014206473A1 (en) |
Families Citing this family (84)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
US9411327B2 (en) | 2012-08-27 | 2016-08-09 | Johnson Controls Technology Company | Systems and methods for classifying data in building automation systems |
US8713600B2 (en) * | 2013-01-30 | 2014-04-29 | Almondnet, Inc. | User control of replacement television advertisements inserted by a smart television |
EP2818948B1 (en) * | 2013-06-27 | 2016-11-16 | ABB Schweiz AG | Method and data presenting device for assisting a remote user to provide instructions |
US20160035246A1 (en) * | 2014-07-31 | 2016-02-04 | Peter M. Curtis | Facility operations management using augmented reality |
US20160054900A1 (en) * | 2014-08-25 | 2016-02-25 | Chuck Surack | Computer Implemented System and Method for Producing 360 Degree Perspective Images |
US9760635B2 (en) | 2014-11-07 | 2017-09-12 | Rockwell Automation Technologies, Inc. | Dynamic search engine for an industrial environment |
US10959107B2 (en) * | 2015-04-14 | 2021-03-23 | ETAK Systems, LLC | Systems and methods for delivering a close out package for work done at a telecommunications site |
US9947135B2 (en) * | 2015-04-14 | 2018-04-17 | ETAK Systems, LLC | Close-out audit systems and methods for cell site installation and maintenance |
US12039230B2 (en) | 2015-04-14 | 2024-07-16 | ETAK Systems, LLC | Systems and methods for coordinating initiation, preparing, vetting, scheduling, constructing, and implementing a power plant implementation |
US11797723B2 (en) * | 2015-04-14 | 2023-10-24 | ETAK Systems, LLC | Systems and methods for coordinating initiation, preparing, vetting, scheduling, constructing, and implementing a power plant implementation |
US10255719B2 (en) * | 2015-04-14 | 2019-04-09 | ETAK Systems, LLC | Systems and methods for satellite data capture for telecommunications site modeling |
US11790124B2 (en) * | 2015-04-14 | 2023-10-17 | ETAK Systems, LLC | Systems and methods for coordinating initiation, preparing, vetting, scheduling, constructing, and implementing a power plant implementation |
US12030630B2 (en) | 2015-04-14 | 2024-07-09 | ETAK Systems, LLC | Systems and methods for coordinating initiation, preparing, vetting, scheduling, constructing, and implementing a power plant implementation |
US10893419B2 (en) * | 2015-04-14 | 2021-01-12 | ETAK Systems, LLC | Systems and methods for coordinating initiation, preparing, vetting, scheduling, constructing, and implementing a small cell implementation |
US10235808B2 (en) | 2015-08-20 | 2019-03-19 | Microsoft Technology Licensing, Llc | Communication system |
US20170053455A1 (en) * | 2015-08-20 | 2017-02-23 | Microsoft Technology Licensing, Llc | Asynchronous 3D annotation of a Video Sequence |
US10169917B2 (en) | 2015-08-20 | 2019-01-01 | Microsoft Technology Licensing, Llc | Augmented reality |
US10534326B2 (en) | 2015-10-21 | 2020-01-14 | Johnson Controls Technology Company | Building automation system with integrated building information model |
US11947785B2 (en) | 2016-01-22 | 2024-04-02 | Johnson Controls Technology Company | Building system with a building graph |
US11268732B2 (en) | 2016-01-22 | 2022-03-08 | Johnson Controls Technology Company | Building energy management system with energy analytics |
WO2017173167A1 (en) | 2016-03-31 | 2017-10-05 | Johnson Controls Technology Company | Hvac device registration in a distributed building management system |
US10325155B2 (en) | 2016-04-19 | 2019-06-18 | Rockwell Automation Technologies, Inc. | Analyzing video streams in an industrial environment to identify potential problems and select recipients for a display of video streams related to the potential problems |
US10613729B2 (en) * | 2016-05-03 | 2020-04-07 | Johnson Controls Technology Company | Building and security management system with augmented reality interface |
US10417451B2 (en) | 2017-09-27 | 2019-09-17 | Johnson Controls Technology Company | Building system with smart entity personal identifying information (PII) masking |
US11774920B2 (en) | 2016-05-04 | 2023-10-03 | Johnson Controls Technology Company | Building system with user presentation composition based on building context |
US10901373B2 (en) | 2017-06-15 | 2021-01-26 | Johnson Controls Technology Company | Building management system with artificial intelligence for unified agent based control of building subsystems |
US10505756B2 (en) | 2017-02-10 | 2019-12-10 | Johnson Controls Technology Company | Building management system with space graphs |
US10684033B2 (en) | 2017-01-06 | 2020-06-16 | Johnson Controls Technology Company | HVAC system with automated device pairing |
US10957102B2 (en) * | 2017-01-16 | 2021-03-23 | Ncr Corporation | Virtual reality maintenance and repair |
US11900287B2 (en) | 2017-05-25 | 2024-02-13 | Johnson Controls Tyco IP Holdings LLP | Model predictive maintenance system with budgetary constraints |
US10452043B2 (en) | 2017-02-10 | 2019-10-22 | Johnson Controls Technology Company | Building management system with nested stream generation |
US20190095518A1 (en) | 2017-09-27 | 2019-03-28 | Johnson Controls Technology Company | Web services for smart entity creation and maintenance using time series data |
US11764991B2 (en) | 2017-02-10 | 2023-09-19 | Johnson Controls Technology Company | Building management system with identity management |
US11307538B2 (en) | 2017-02-10 | 2022-04-19 | Johnson Controls Technology Company | Web services platform with cloud-based feedback control |
US11360447B2 (en) | 2017-02-10 | 2022-06-14 | Johnson Controls Technology Company | Building smart entity system with agent based communication and control |
US10515098B2 (en) | 2017-02-10 | 2019-12-24 | Johnson Controls Technology Company | Building management smart entity creation and maintenance using time series data |
US11994833B2 (en) | 2017-02-10 | 2024-05-28 | Johnson Controls Technology Company | Building smart entity system with agent based data ingestion and entity creation using time series data |
US10417245B2 (en) | 2017-02-10 | 2019-09-17 | Johnson Controls Technology Company | Building management system with eventseries processing |
WO2018175912A1 (en) | 2017-03-24 | 2018-09-27 | Johnson Controls Technology Company | Building management system with dynamic channel communication |
US11327737B2 (en) | 2017-04-21 | 2022-05-10 | Johnson Controls Tyco IP Holdings LLP | Building management system with cloud management of gateway configurations |
US10788229B2 (en) | 2017-05-10 | 2020-09-29 | Johnson Controls Technology Company | Building management system with a distributed blockchain database |
US11022947B2 (en) | 2017-06-07 | 2021-06-01 | Johnson Controls Technology Company | Building energy optimization system with economic load demand response (ELDR) optimization and ELDR user interfaces |
WO2019018304A1 (en) | 2017-07-17 | 2019-01-24 | Johnson Controls Technology Company | Systems and methods for agent based building simulation for optimal control |
EP3655825B1 (en) | 2017-07-21 | 2023-11-22 | Johnson Controls Tyco IP Holdings LLP | Building management system with dynamic rules with sub-rule reuse and equation driven smart diagnostics |
US10619882B2 (en) | 2017-07-27 | 2020-04-14 | Johnson Controls Technology Company | Building management system with scorecard for building energy and equipment performance |
US10962945B2 (en) | 2017-09-27 | 2021-03-30 | Johnson Controls Technology Company | Building management system with integration of data into smart entities |
US11195401B2 (en) | 2017-09-27 | 2021-12-07 | Johnson Controls Tyco IP Holdings LLP | Building risk analysis system with natural language processing for threat ingestion |
CN107846553B (en) * | 2017-10-26 | 2024-05-14 | 中国工商银行股份有限公司 | Remote control method, device and remote control system for collecting image information |
US11281169B2 (en) | 2017-11-15 | 2022-03-22 | Johnson Controls Tyco IP Holdings LLP | Building management system with point virtualization for online meters |
US10809682B2 (en) | 2017-11-15 | 2020-10-20 | Johnson Controls Technology Company | Building management system with optimized processing of building system data |
US11127235B2 (en) | 2017-11-22 | 2021-09-21 | Johnson Controls Tyco IP Holdings LLP | Building campus with integrated smart environment |
CN108156443A (en) * | 2017-12-21 | 2018-06-12 | 成都真实维度科技有限公司 | A kind of operation VR live streaming platforms and its business model |
US10832558B2 (en) * | 2018-01-08 | 2020-11-10 | Honeywell International Inc. | Systems and methods for augmenting reality during a site survey using an unmanned aerial vehicle |
US11954713B2 (en) | 2018-03-13 | 2024-04-09 | Johnson Controls Tyco IP Holdings LLP | Variable refrigerant flow system with electricity consumption apportionment |
US11016648B2 (en) | 2018-10-30 | 2021-05-25 | Johnson Controls Technology Company | Systems and methods for entity visualization and management with an entity node editor |
US11927925B2 (en) | 2018-11-19 | 2024-03-12 | Johnson Controls Tyco IP Holdings LLP | Building system with a time correlated reliability data stream |
US11221661B2 (en) | 2019-01-14 | 2022-01-11 | Rockwell Automation Technologies, Inc. | Method for auto-discovery and categorization of a plants power and energy smart devices for analytics |
US20200234220A1 (en) | 2019-01-18 | 2020-07-23 | Johnson Controls Technology Company | Smart building automation system with employee productivity features |
US10788798B2 (en) | 2019-01-28 | 2020-09-29 | Johnson Controls Technology Company | Building management system with hybrid edge-cloud processing |
US11894944B2 (en) | 2019-12-31 | 2024-02-06 | Johnson Controls Tyco IP Holdings LLP | Building data platform with an enrichment loop |
US12021650B2 (en) | 2019-12-31 | 2024-06-25 | Tyco Fire & Security Gmbh | Building data platform with event subscriptions |
US20210200912A1 (en) | 2019-12-31 | 2021-07-01 | Johnson Controls Technology Company | Building data platform with graph based policies |
US20210200174A1 (en) | 2019-12-31 | 2021-07-01 | Johnson Controls Technology Company | Building information model management system with hierarchy generation |
US11769066B2 (en) | 2021-11-17 | 2023-09-26 | Johnson Controls Tyco IP Holdings LLP | Building data platform with digital twin triggers and actions |
US12100280B2 (en) | 2020-02-04 | 2024-09-24 | Tyco Fire & Security Gmbh | Systems and methods for software defined fire detection and risk assessment |
US11537386B2 (en) | 2020-04-06 | 2022-12-27 | Johnson Controls Tyco IP Holdings LLP | Building system with dynamic configuration of network resources for 5G networks |
CN111885398B (en) * | 2020-07-20 | 2021-12-07 | 贝壳找房(北京)科技有限公司 | Interaction method, device and system based on three-dimensional model, electronic equipment and storage medium |
CN111787341B (en) * | 2020-05-29 | 2023-12-05 | 北京京东尚科信息技术有限公司 | Guide broadcasting method, device and system |
US11874809B2 (en) | 2020-06-08 | 2024-01-16 | Johnson Controls Tyco IP Holdings LLP | Building system with naming schema encoding entity type and entity relationships |
US11397773B2 (en) | 2020-09-30 | 2022-07-26 | Johnson Controls Tyco IP Holdings LLP | Building management system with semantic model integration |
US11954154B2 (en) | 2020-09-30 | 2024-04-09 | Johnson Controls Tyco IP Holdings LLP | Building management system with semantic model integration |
US20220138362A1 (en) | 2020-10-30 | 2022-05-05 | Johnson Controls Technology Company | Building management system with configuration by building model augmentation |
US12061453B2 (en) | 2020-12-18 | 2024-08-13 | Tyco Fire & Security Gmbh | Building management system performance index |
WO2022197964A1 (en) | 2021-03-17 | 2022-09-22 | Johnson Controls Tyco IP Holdings LLP | Systems and methods for determining equipment energy waste |
US11899723B2 (en) | 2021-06-22 | 2024-02-13 | Johnson Controls Tyco IP Holdings LLP | Building data platform with context based twin function processing |
US11796974B2 (en) | 2021-11-16 | 2023-10-24 | Johnson Controls Tyco IP Holdings LLP | Building data platform with schema extensibility for properties and tags of a digital twin |
US11934966B2 (en) | 2021-11-17 | 2024-03-19 | Johnson Controls Tyco IP Holdings LLP | Building data platform with digital twin inferences |
US11704311B2 (en) | 2021-11-24 | 2023-07-18 | Johnson Controls Tyco IP Holdings LLP | Building data platform with a distributed digital twin |
US11714930B2 (en) | 2021-11-29 | 2023-08-01 | Johnson Controls Tyco IP Holdings LLP | Building data platform with digital twin based inferences and predictions for a graphical building model |
US12013673B2 (en) | 2021-11-29 | 2024-06-18 | Tyco Fire & Security Gmbh | Building control system using reinforcement learning |
WO2023180456A1 (en) * | 2022-03-23 | 2023-09-28 | Basf Se | Method and apparatus for monitoring an industrial plant |
US12061633B2 (en) | 2022-09-08 | 2024-08-13 | Tyco Fire & Security Gmbh | Building system that maps points into a graph schema |
US12013823B2 (en) | 2022-09-08 | 2024-06-18 | Tyco Fire & Security Gmbh | Gateway system that maps points into a graph schema |
CN115643365B (en) * | 2022-12-26 | 2023-03-28 | 北京天智航医疗技术服务有限公司 | Remote operation control method and device and electronic equipment |
Citations (2)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
US6985620B2 (en) * | 2000-03-07 | 2006-01-10 | Sarnoff Corporation | Method of pose estimation and model refinement for video representation of a three dimensional scene |
WO2009074600A1 (en) * | 2007-12-10 | 2009-06-18 | Abb Research Ltd | A computer implemented method and system for remote inspection of an industrial process |
Family Cites Families (6)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
US20110007150A1 (en) | 2009-07-13 | 2011-01-13 | Raytheon Company | Extraction of Real World Positional Information from Video |
IL202460A (en) * | 2009-12-01 | 2013-08-29 | Rafael Advanced Defense Sys | Method and system of generating a three-dimensional view of a real scene |
WO2012018497A2 (en) | 2010-07-25 | 2012-02-09 | Raytheon Company | ENHANCED SITUATIONAL AWARENESS AND TARGETING (eSAT) SYSTEM |
US8587583B2 (en) * | 2011-01-31 | 2013-11-19 | Microsoft Corporation | Three-dimensional environment reconstruction |
DE102011085003A1 (en) | 2011-10-21 | 2013-04-25 | Siemens Aktiengesellschaft | Method for visualizing spatial relationships of manufacturing plant, involves providing navigation tool to select viewer views of digital images in virtual environment |
SE1300138A1 (en) | 2013-02-21 | 2013-02-25 | Abb Technology Ltd | Method and data presentation device for assisting a user to service a process control object |
-
2013
- 2013-06-27 CN CN201380077726.9A patent/CN105659170B/en active Active
- 2013-06-27 WO PCT/EP2013/063539 patent/WO2014206473A1/en active Application Filing
- 2013-06-27 EP EP13732486.9A patent/EP3014367B1/en active Active
- 2013-06-27 US US14/895,121 patent/US9628772B2/en active Active
Patent Citations (3)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
US6985620B2 (en) * | 2000-03-07 | 2006-01-10 | Sarnoff Corporation | Method of pose estimation and model refinement for video representation of a three dimensional scene |
WO2009074600A1 (en) * | 2007-12-10 | 2009-06-18 | Abb Research Ltd | A computer implemented method and system for remote inspection of an industrial process |
CN101939765A (en) * | 2007-12-10 | 2011-01-05 | Abb研究有限公司 | A computer implemented method and system for remote inspection of an industrial process |
Also Published As
Publication number | Publication date |
---|---|
US9628772B2 (en) | 2017-04-18 |
US20160127712A1 (en) | 2016-05-05 |
EP3014367B1 (en) | 2017-03-15 |
EP3014367A1 (en) | 2016-05-04 |
WO2014206473A1 (en) | 2014-12-31 |
CN105659170A (en) | 2016-06-08 |
Similar Documents
Publication | Publication Date | Title |
---|---|---|
CN105659170B (en) | Method and video communication device for transmitting video to a remote user | |
CN105814500B (en) | Method and data presenting device for assisting a remote user to provide instructions | |
KR102366293B1 (en) | System and method for monitoring field based augmented reality using digital twin | |
KR102289745B1 (en) | System and method for real-time monitoring field work | |
JP2023153815A (en) | 3d mapping of process control environment | |
US9854206B1 (en) | Privacy-aware indoor drone exploration and communication framework | |
JP2019533372A (en) | Panorama image display control method, apparatus, and storage medium | |
CN110888428B (en) | Mobile robot, remote terminal, computer readable medium, control system, and control method | |
JP2005117621A (en) | Image distribution system | |
US12112439B2 (en) | Systems and methods for immersive and collaborative video surveillance | |
US11758081B2 (en) | Server and method for displaying 3D tour comparison | |
JP2022057771A (en) | Communication management device, image communication system, communication management method, and program | |
JP2022148262A (en) | Remote operation system, remote operation moving body, remote operation method, and program | |
WO2024001799A1 (en) | Anti-collision method for virtual reality (vr) device, and electronic device | |
WO2022176450A1 (en) | Information processing device, information processing method, and program | |
US20150153715A1 (en) | Rapidly programmable locations in space | |
KR102196683B1 (en) | Device and method for photographing 3d tour | |
US20240323240A1 (en) | Communication control server, communication system, and communication control method | |
JP2006067393A (en) | Viewing system | |
US20230308705A1 (en) | Communication terminal, communication management system, remote operation method, processing method, and communication management method | |
SE1500055A1 (en) | Method and data presenting device for facilitating work at an industrial site assisted by a remote user and a process control system | |
JP2024123375A (en) | Information processing device, control program, and control method | |
SE1300676A1 (en) | Procedure and data presentation arrangements to assist a remote user to provide instructions to a local user | |
CN118575471A (en) | Virtual reality conference system | |
CN105630142A (en) | Method and device for publishing and transferring identification information, and information identification system |
Legal Events
Date | Code | Title | Description |
---|---|---|---|
C06 | Publication | ||
PB01 | Publication | ||
C10 | Entry into substantive examination | ||
SE01 | Entry into force of request for substantive examination | ||
TA01 | Transfer of patent application right | ||
TA01 | Transfer of patent application right |
Effective date of registration: 20180510 Address after: Baden, Switzerland Applicant after: ABB TECHNOLOGY LTD. Address before: Zurich Applicant before: ABB T & D Technology Ltd. |
|
GR01 | Patent grant | ||
GR01 | Patent grant |