CN109062407A - Remote mobile terminal three-dimensional display & control system and method based on VR technology - Google Patents
- Publication number
- CN109062407A (application number CN201810838456.3A)
- Authority
- CN
- China
- Prior art keywords
- mobile terminal
- video
- data
- camera
- display
- Prior art date
- Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
- Pending
Classifications
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06F—ELECTRIC DIGITAL DATA PROCESSING
- G06F3/00—Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
- G06F3/01—Input arrangements or combined input and output arrangements for interaction between user and computer
- G06F3/011—Arrangements for interaction with the human body, e.g. for user immersion in virtual reality
- G06F3/012—Head tracking input arrangements
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06T—IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
- G06T15/00—3D [Three Dimensional] image rendering
- G06T15/005—General purpose rendering architectures
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06T—IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
- G06T3/00—Geometric image transformations in the plane of the image
- G06T3/40—Scaling of whole images or parts thereof, e.g. expanding or contracting
- G06T3/4038—Image mosaicing, e.g. composing plane images from plane sub-images
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06T—IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
- G06T7/00—Image analysis
- G06T7/80—Analysis of captured images to determine intrinsic or extrinsic camera parameters, i.e. camera calibration
Landscapes
- Engineering & Computer Science (AREA)
- Theoretical Computer Science (AREA)
- Physics & Mathematics (AREA)
- General Physics & Mathematics (AREA)
- General Engineering & Computer Science (AREA)
- Human Computer Interaction (AREA)
- Computer Graphics (AREA)
- Computer Vision & Pattern Recognition (AREA)
- Processing Or Creating Images (AREA)
Abstract
A remote mobile terminal three-dimensional display and control system and method based on VR technology, characterized by comprising: a mobile terminal carried on a trolley, including a video acquisition module and more than one camera in data communication with that module; a video server, including a VR video processing module and in data connection with the mobile terminal; and a display terminal, including a VR video display control module and in data communication with the video server. The present invention generates, transmits, processes and stitches panoramic video from binocular-camera footage. Realizing panoramic transmission with a binocular camera resolves the latency problem: the video streams sent back by the two cameras are seamlessly stitched, closely reproducing human binocular observation and giving the user a highly realistic, immersive visual experience.
Description
Technical field
The present invention relates to the field of virtual reality (VR), and in particular to a remote mobile terminal three-dimensional display and control system and method based on VR technology.
Background art
With the development of electronic, computer and control technology, remotely controlled mobile devices such as intelligent robots, unmanned aerial vehicles and remotely operated vehicles have advanced enormously and are widely applied, profoundly affecting modern industrial production, national defense construction, resource exploration and many other fields. They can travel along paths guided by a remote operator or planned autonomously, and can perform complex tasks in place of humans in extreme and hazardous environments, extending human perception and operational capacity. For example, in 2012 the American Curiosity rover landed on Mars to carry out exploration, and in 2013 China's Yutu (Jade Rabbit) lunar rover landed on the Moon to survey its topography.
However, as the working environments of smart devices grow ever more complex and the precision and complexity of their tasks keep increasing, making remotely controlled mobile operations fully autonomous in unstructured environments through artificial intelligence alone remains unrealistic. We therefore consider organically combining human intelligence with the intelligence of the remote executing mechanism: human intelligence handles high-level tasks such as perceptual understanding, problem solving, mission planning and task decomposition, while the intelligence of the executing mechanism handles low-level work such as signal perception and feedback, path planning, precise motion, information processing, and routine repetitive tasks. The human-machine intelligent system formed in this way brings the respective strengths of human and machine into full play. Such human-machine coordination and interaction not only enhances the remote executing mechanism's ability to complete its tasks but also broadens the equipment's range of application. To give full play to human initiative, the operator must be able to interact adequately with the distal environment.
Virtual reality (VR) technology is at the forefront of today's science and is the most powerful human-machine interface technology to date. It integrates computer graphics, psychology, sensing, network communication and other techniques to construct in a computer a virtual environment consistent with the objective world. Through natural interaction, the user directly perceives lifelike, real-time, multi-channel information and experiences a sense of being physically present.
Therefore, we contemplate applying virtual reality technology to remote control of mobile devices, providing users with a real-time remote three-dimensional display and control system.
Summary of the invention
In view of the background above, a remote mobile terminal three-dimensional display and control system and method based on VR technology is designed. A binocular camera is built into the remotely controlled mobile terminal, and the remote surroundings are sent back in real time by wireless transmission. At the control end, virtual reality technology is used to reconstruct the observed remote environment in three dimensions. At the display and control end, the user need only put on VR glasses together with a mobile terminal device such as a mobile phone to observe the remote environment in 3D and gain a truly immersive experience. Moreover, the user can interact with the remotely controlled mobile device: simply by turning the head, the user can operate the equipment and observe the remote environment in any direction in real time, achieving a true 360-degree three-dimensional panoramic display.
Embodiments of the present invention are now further elaborated.
A remote mobile terminal three-dimensional display and control system based on VR technology includes:
a mobile terminal carried on a trolley, including a video acquisition module and more than one camera in data communication with that module; the mobile terminal films its surroundings to form data information and executes the control data received from the video server;
a video server, including a VR video processing module and in data connection with both the mobile terminal and the display terminal; on one hand it performs three-dimensional scene reconstruction and stitching on the acquisition data from the video acquisition module and transfers the reconstructed scene to the display terminal, and on the other hand it processes the viewing-angle data from the display terminal into control signals and transmits them to the mobile terminal;
a display terminal, including a VR video display control module, which presents the received signal realistically, generates viewing-angle signals, and transmits them to the video server.
Further, the cameras are two Raspberry Pi cameras forming a binocular Raspberry Pi camera; a two-channel video signal can be obtained, restoring the real scene more faithfully.
Further, wireless data communication is used between the mobile terminal and the server, preferably a WHDI wireless connection; this avoids the interference caused by running numerous data cables to the device.
Further, the video server uses a matching algorithm in which feature points are matched under constraints such as epipolar geometry, performing three-dimensional scene reconstruction and stitching on the data acquired by the binocular Raspberry Pi camera.
A remote mobile terminal control method based on VR technology, characterized by comprising the following steps:
Step 1: a mobile terminal equipped with a binocular Raspberry Pi camera is loaded onto a trolley;
Step 2: the Raspberry Pi camera acquires data, which the video acquisition module transmits to the video server;
Step 3: the video server uses a matching algorithm to match feature points under constraints such as epipolar geometry, performs three-dimensional scene reconstruction and stitching on the data acquired by the binocular Raspberry Pi camera, and transfers the reconstructed scene to the display terminal;
Step 4: the user puts on VR glasses with a built-in display terminal to watch the real scene and moves; the gyroscope and gravity sensor of the display terminal generate a gyroscope angle signal and a gravity sensing signal, forming viewing-angle data that is transmitted to the video server;
Step 5: the video server receives the viewing-angle data, processes it into control data, and transmits it to the Raspberry Pi camera of the mobile terminal;
Step 6: the Raspberry Pi camera passes the data on to the Arduino board, which controls the trolley's motion according to the control data.
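Step 5 sends processed control data from the server down to the trolley. A minimal sketch of how such a control message might be encoded for the wireless link follows; the two-float wire format and the field names are assumptions for illustration, not specified by the patent:

```python
import struct

# Hypothetical control packet: the server reduces the viewing-angle data to a
# linear speed and a deviation angle, and the receiving side unpacks them.
# The "<ff" format (two little-endian 32-bit floats) is an assumed layout.
FMT = "<ff"

def encode_control(speed_mps: float, angle_deg: float) -> bytes:
    """Pack (speed, deviation angle) into an 8-byte payload."""
    return struct.pack(FMT, speed_mps, angle_deg)

def decode_control(payload: bytes) -> tuple:
    """Unpack the payload back into (speed, deviation angle)."""
    return struct.unpack(FMT, payload)

pkt = encode_control(0.5, -15.0)
print(len(pkt), decode_control(pkt))   # 8 (0.5, -15.0)
```

A fixed binary layout like this keeps each control message at 8 bytes, which suits a link whose instructions must be sent continuously (see Step 6).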
Beneficial effects: compared with the prior art, the present invention offers a virtual-reality observation experience based on a remotely controlled mobile device. The user can let a remotely controlled mobile device such as an intelligent robot, an unmanned aerial vehicle or an intelligent trolley go to a specific scene in the user's place; the situation at that scene is transmitted in real time, processed, and delivered to the user's mobile phone, and through VR glasses the user can reproduce the scene. The invention controls the camera gimbal through gravity sensing, so the remote mobile device can be steered by moving the handheld device: the user merely changes his or her own direction of motion to control the robot's movement and observe the desired picture. The invention generates, transmits, processes and stitches panoramic video from binocular-camera footage; realizing panoramic transmission with a binocular camera resolves the latency problem, seamlessly stitches the video sent back by the two cameras, closely reproduces human binocular observation, and gives the user a highly realistic, immersive visual experience.
Brief description of the drawings
Fig. 1 is a schematic diagram of the principle of the present invention;
Fig. 2 is a functional module composition diagram of the invention;
Fig. 3 is the three-dimensional reconstruction geometric model of the invention based on the binocular camera;
Fig. 4 is the VR video display control module diagram of the invention based on a mobile-phone terminal;
Fig. 5 is a photograph of the carrier of the invention, the trolley;
Fig. 6 is a photograph of the development board of the invention;
Fig. 7 is the APP interface diagram of the invention;
Fig. 8 is the real-time video transmission diagram of the invention.
Specific embodiments
The design purpose of the present invention is to bring the user a real-time, remote three-dimensional spatial experience with a genuine sense of presence, and to change the stiff, passive situation of human interaction with mechanical equipment. In this way people can probe dangerous and extreme locations and the many uncertainties of the working environment while staying clear of them, remotely controlling and operating the equipment from a safe and comfortable environment, thereby extending human perception and operational capacity.
The embodiments of the present invention are further explained below with reference to Figs. 1 to 8, with the intention of helping the reader better understand the essence of the invention; they place no restriction on the implementation or protection scope of the invention.
A remote mobile terminal three-dimensional display and control system based on VR technology, comprising:
a mobile terminal carried on a trolley, including a video acquisition module and more than one camera in data communication with that module; the mobile terminal films its surroundings to form data information and executes the control data received from the video server;
a video server, including a VR video processing module and in data connection with both the mobile terminal and the display terminal; on one hand it performs three-dimensional scene reconstruction and stitching on the acquisition data from the video acquisition module and transfers the reconstructed scene to the display terminal, and on the other hand it processes the viewing-angle data from the display terminal into control signals and transmits them to the mobile terminal;
a display terminal, including a VR video display control module, which presents the received signal realistically, generates viewing-angle signals, and transmits them to the video server.
The mobile terminal includes a video acquisition module and more than one camera in data communication with that module. In this embodiment Raspberry Pi cameras are chosen, two of them forming a binocular Raspberry Pi camera so as to simulate a person's two eyes; the camera chosen is a GoPro with high shooting quality.
Referring to Fig. 5, the mobile terminal is carried on a trolley. The trolley has a compact body and can move quickly and nimbly; in small spaces where large filming equipment cannot easily be placed, the smaller trolley is easy to position and convenient for shooting.
Wireless data communication is used between the mobile terminal and the server, so that the trolley moves conveniently without any concern about the limits of control distance.
The binocular Raspberry Pi camera carried on the trolley films video and sends it to the video acquisition module for processing; the video acquisition module uses wireless transmission to deliver the real-time video to the same video server.
The two cameras of the binocular Raspberry Pi camera shoot simultaneously and pass the video back in real time over the wireless link to the same video server, meeting the requirements of low latency and high resolution.
In this embodiment, WHDI (Wireless Home Digital Interface) is taken as the wireless connection. Using MIMO technology and OFDM modulation, it can achieve a transfer rate of up to 3 Gbps, operates in the 4.9 GHz to 5.875 GHz band with 20 MHz or 40 MHz channels, complies with global 5 GHz spectrum regulations, has an effective transmission range of up to 30 meters, can penetrate walls, and has a latency of less than 1 millisecond. It can carry the full-HD audio/video stream entirely without compression, supporting video resolutions up to 1920x1080p.
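As a sanity check on those figures, the raw bit rate of one uncompressed 1080p stream can be computed directly; the 24-bit color depth and 60 fps frame rate below are assumptions, since the passage only states the 1920x1080p resolution:

```python
# Uncompressed video bit rate: width x height x bits-per-pixel x frames/s.
width, height = 1920, 1080
bits_per_pixel = 24        # assumed 8-bit RGB
fps = 60                   # assumed frame rate
bitrate_gbps = width * height * bits_per_pixel * fps / 1e9
print(f"{bitrate_gbps:.2f} Gbps")   # 2.99 Gbps, just under WHDI's 3 Gbps
```

Under these assumptions a single uncompressed full-HD stream fits within the stated 3 Gbps link rate, consistent with WHDI's claim of compression-free full-HD transport.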
The video server must also process the received video information synchronously; the present embodiment uses three-dimensional reconstruction for further processing.
Three-dimensional reconstruction of binocular video means recovering the depth information of the filmed object from the two video streams sent back; the key is to determine the geometric model parameters of the cameras. Referring to Fig. 3, in the three-dimensional reconstruction geometric model based on the binocular Raspberry Pi camera, a spatial point p is imaged by the camera and mapped to the point (x, y) on the image, where Oc is the camera's optical center, and WCS, DCS and ICS are respectively the world coordinate system, the device (camera) coordinate system and the image coordinate system. The geometric transformation from the spatial point p to the camera image can be described by the camera's intrinsic parameters. O1 and O2 are the optical centers of the left and right cameras; their relative position can be described by a rotation matrix R and a translation vector T. Once R and T are determined, the positional relationship of the two cameras is fixed. This step is called extrinsic camera calibration: an optical triangle is solved through a series of known points p1, p2 and p12 to estimate the optimal R and T. Here p1 and p2 are called a point pair; they are the imaging points of the same spatial point in different cameras. The process of finding such point pairs is called stereo matching, the most critical, and arguably the most difficult, step of three-dimensional reconstruction. The matching algorithm generally works on feature points: the feature points of the left and right images are extracted (commonly with the SIFT algorithm) and then matched under constraints such as epipolar geometry, thereby recovering the depth information of the filmed object from the two returned video streams.
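The triangulation step described above — recovering a spatial point from a matched point pair once R and T are known — can be illustrated with a minimal linear (DLT) triangulation sketch. The intrinsic matrix K, the baseline and the test point below are invented illustration values, not parameters from the patent; the 60 mm baseline mimics the spacing of human eyes:

```python
import numpy as np

def projection_matrix(K, R, T):
    """Build the 3x4 projection matrix P = K [R | T]."""
    return K @ np.hstack([R, T.reshape(3, 1)])

def triangulate(P1, P2, p1, p2):
    """Recover a 3D point from one matched point pair (p1, p2) via DLT."""
    A = np.array([
        p1[0] * P1[2] - P1[0],
        p1[1] * P1[2] - P1[1],
        p2[0] * P2[2] - P2[0],
        p2[1] * P2[2] - P2[1],
    ])
    _, _, Vt = np.linalg.svd(A)   # null vector of A is the homogeneous point
    X = Vt[-1]
    return X[:3] / X[3]           # de-homogenize

K = np.array([[800.0, 0.0, 320.0],
              [0.0, 800.0, 240.0],
              [0.0, 0.0, 1.0]])
P1 = projection_matrix(K, np.eye(3), np.zeros(3))                  # left camera
P2 = projection_matrix(K, np.eye(3), np.array([-60.0, 0.0, 0.0]))  # right camera

X_true = np.array([100.0, -50.0, 1000.0])  # a point 1 m in front of the rig
h = np.append(X_true, 1.0)
p1 = (P1 @ h)[:2] / (P1 @ h)[2]            # the two imaging points = a point pair
p2 = (P2 @ h)[:2] / (P2 @ h)[2]
X_est = triangulate(P1, P2, p1, p2)
print(np.round(X_est, 3))                  # recovers the original 3D point
```

With exact correspondences the SVD solution recovers the point exactly; with real SIFT matches the same machinery yields a least-squares depth estimate per point pair.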
The two returned video streams are compared, rectified and stitched by the server, and the final three-dimensional reconstruction and stitching yields a synthesized 3D real-scene effect. Building on existing image stitching, the images captured by the binocular camera are stitched, and the pictures taken by the binocular camera can be synthesized, significantly alleviating the video-latency problem.
When an object moves quickly, after the image seen by the human eye disappears, the eye continues to retain that image for roughly 0.1 to 0.4 seconds; this phenomenon is called persistence of vision and is an intrinsic property of the human eye. The transmission delay of our invention is less than 0.04 seconds, so according to this characteristic of the human eye, our invention can be considered to achieve real-time transmission; refer to Fig. 8.
The video server transmits the processed data to the display terminal. The VR video display control module of the display terminal may contain a mobile phone; after the phone receives the data, it presents it to the human eye, and the user sees the real-time 3D video captured by the trolley's binocular camera.
After observing the real-time video, the user may select the direction and angle they wish to view, so a selection signal must be generated and transmitted to the trolley.
A matching APP can be written for the mobile phone. In this embodiment the phone runs the Android system, with two operating modes, a wearing mode and a hand-held mode; the APP interface is shown in Fig. 7.
The main mode is the wearing mode, in which the 3D video effect can be viewed; the hand-held mode is the secondary working mode. The user may freely choose between the two modes.
In the secondary mode, the user controls the trolley's movement with the keys in the APP, but in hand-held mode the user can only watch the ordinary real-time panoramic video taken by the trolley's binocular camera.
The main mode, the wearing mode, is the operating mode used in this embodiment. In the main mode, the display terminal uses the phone's gyroscope and gravity sensor to perceive head movement and convert it into parameters; the parameters are transmitted to the data server as viewing-angle data, processed into control instructions, and sent to the trolley to control its travel.
In wearing mode, the viewing-angle data is sent to the trolley as control instructions that drive it. The phone's gravity-sensing module is used: the phone captures gravity data and converts it into the corresponding forward, backward, turn-left or turn-right variables. Referring to Fig. 6, the parameters reach the Raspberry Pi camera, which passes the data on to the Arduino board. By continuously sending instructions to the mobile robot on which the trolley is built, the trolley's motion is controlled: when the left and right wheel speeds are equal and positive the trolley advances, when they are equal and negative it reverses, and a speed difference between the left and right motors steers the robot. The robot is kept moving by an uninterrupted stream of instructions; if the instructions stop, the robot no longer advances.
The trolley is fitted with infrared obstacle-avoidance sensors and halts when an obstacle is detected, which protects the hardware of the invention during normal operation and extends the trolley's service life.
The trolley has an intelligent anti-collision function, so when the user sends an instruction that conflicts with the anti-collision instruction, the APP prompts that the operation is wrong, making faulty operation impossible.
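The differential-drive rule described above (equal positive wheel speeds drive forward, equal negative speeds reverse, a left/right difference steers) can be sketched as follows; the function names and the simple additive mixing are assumptions for illustration:

```python
def wheel_speeds(v: float, turn: float) -> tuple:
    """Mix a forward command v and a turn command into (left, right) speeds.
    turn > 0 speeds up the left wheel relative to the right, turning right."""
    return v + turn, v - turn

def motion(left: float, right: float) -> str:
    """Classify the trolley's motion per the rule in the text."""
    if left == right:
        if left > 0:
            return "forward"
        if left < 0:
            return "reverse"
        return "stop"
    return "turn right" if left > right else "turn left"

print(motion(*wheel_speeds(0.5, 0.0)))   # forward
print(motion(*wheel_speeds(-0.5, 0.0)))  # reverse
print(motion(*wheel_speeds(0.0, 0.3)))   # turn right
```

In the embodiment the equivalent mixing would run on the Arduino board, which receives the continuous instruction stream and sets the two motor speeds accordingly.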
Fig. 4 shows the VR video display control module based on the mobile-phone terminal. On the phone, the APP must be initialized before use, including acquiring the trolley's gimbal angle, front-wheel angle, camera connection and IP address over the network; if an account already exists, the account information is checked.
While the trolley travels, the control unit collects a large amount of real-time data, including the ambient temperature, the operating temperature inside the trolley, the trolley's travel speed and turning angle, and the gravity-sensing signal and gyroscope angle acquired by the phone.
Through calls to the encapsulation library functions, the real-time data completes the synthesis of the two video channels into VR, while the multidimensional data is dimension-reduced into the control signals of trolley horizontal speed and trolley deviation angle.
A remote mobile terminal control method based on VR technology, characterized by comprising the following steps:
Step 1: a mobile terminal equipped with a binocular Raspberry Pi camera is loaded onto a trolley;
Step 2: the Raspberry Pi camera acquires data, which the video acquisition module transmits to the video server;
Step 3: the video server uses a matching algorithm to match feature points under constraints such as epipolar geometry, performs three-dimensional scene reconstruction and stitching on the data acquired by the binocular Raspberry Pi camera, and transfers the reconstructed scene to the display terminal;
Step 4: the user puts on VR glasses with a built-in display terminal to watch the real scene and moves; the gyroscope and gravity sensor of the display terminal generate a gyroscope angle signal and a gravity sensing signal, forming viewing-angle data that is transmitted to the video server;
Step 5: the video server receives the viewing-angle data, processes it into control data, and transmits it to the Raspberry Pi camera of the mobile terminal;
Step 6: the Raspberry Pi camera passes the data on to the Arduino board, which controls the trolley's motion according to the control data.
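Step 5's processing of the viewing-angle data amounts, per the embodiment above, to a dimension reduction down to two control scalars: trolley horizontal speed and trolley deviation angle. A sketch follows; the clamping, angle wrapping and the specific sensor-to-command mapping are assumptions, as the patent does not give the formula:

```python
def reduce_control(gyro_yaw_deg: float, pitch_g: float,
                   max_speed: float = 1.0) -> tuple:
    """Collapse multidimensional sensor data into (speed, deviation angle).
    pitch_g: forward tilt from the gravity sensor, in g, drives speed;
    gyro_yaw_deg: gyroscope heading, wrapped to [-180, 180), steers."""
    speed = max_speed * max(-1.0, min(1.0, pitch_g))       # clamp tilt to +-1 g
    angle = ((gyro_yaw_deg + 180.0) % 360.0) - 180.0       # wrap the heading
    return speed, angle

print(reduce_control(270.0, 0.5))   # (0.5, -90.0)
print(reduce_control(-10.0, 2.0))   # (1.0, -10.0)
```

Reducing the many raw readings to just these two scalars keeps the downlink control message small, which matches the continuous-instruction control scheme of Step 6.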
Through binocular vision technology, the present invention uses two cameras to photograph an object simultaneously from different angles and positions and transmits the resulting images to the attached computer; the two cameras simulate a human's two eyes capturing information about the outside world, and the computer simulates the human brain processing the input images. In the existing technology, only the stitching of binocular-camera images is realized, i.e. the images captured by the binocular camera are stitched to form a whole picture.
The above is only a preferred embodiment of the present invention and is not intended to limit its scope; the above embodiment of the invention may also be varied in many ways. All simple, equivalent changes and modifications made according to the claims and description of this application fall within the protection scope of the invention patent. Matters not described in detail in the present invention are conventional technical content.
Claims (9)
1. A remote mobile terminal three-dimensional display and control system based on VR technology, characterized by comprising:
a mobile terminal carried on a trolley, comprising a video acquisition module and more than one camera in data connection with the video acquisition module, the mobile terminal filming to form data information and executing the control data received from the video server;
a video server, comprising a VR video processing module, the video server being in data communication with the mobile terminal and the display terminal; on one hand it performs three-dimensional scene reconstruction and stitching on the acquisition data from the video acquisition module and transfers the reconstructed scene to the display terminal, and on the other hand it processes the viewing-angle data from the display terminal into control signals and transmits them to the mobile terminal;
a display terminal, comprising a VR video display control module, which presents the received signal realistically, generates angle signals, and transmits them to the video server.
2. The remote mobile terminal three-dimensional display and control system based on VR technology according to claim 1, characterized in that: the camera is a Raspberry Pi camera.
3. The remote mobile terminal three-dimensional display and control system based on VR technology according to claim 2, characterized in that: two Raspberry Pi cameras are provided, forming a binocular Raspberry Pi camera.
4. The remote mobile terminal three-dimensional display and control system based on VR technology according to claim 1, characterized in that: wireless data communication is used between the mobile terminal and the server.
5. The remote mobile terminal three-dimensional display and control system based on VR technology according to claim 4, characterized in that: the wireless data communication takes a WHDI wireless connection.
6. The remote mobile terminal three-dimensional display and control system based on VR technology according to claim 1, characterized in that: the trolley is equipped with more than one sensor, the sensors being in data communication with the video acquisition module.
7. The remote mobile terminal three-dimensional display and control system based on VR technology according to claim 1, characterized in that: a mobile phone is built into the VR video display control module, the mobile phone serving as the data receiving device and signal generating device of the display terminal.
8. The remote mobile terminal three-dimensional display & control system based on VR technology according to claim 6, characterized in that: the mobile terminal includes two operating modes, a worn mode and a hand-held mode.
9. A remote mobile terminal control method based on VR technology, characterized by comprising the following steps:
Step 1: mount the mobile terminal equipped with the binocular Raspberry Pi camera on the cart;
Step 2: the Raspberry Pi camera acquires data, which the video acquisition module then transmits to the video server;
Step 3: using a matching algorithm based on feature points combined with constraint conditions such as epipolar geometry, the video server matches the data acquired by the binocular Raspberry Pi camera, performs three-dimensional scene reconstruction and stitching, and transfers the reconstructed scene to the display terminal;
Step 4: the user wears the VR glasses with the built-in display terminal, views the live scene and moves; the gyroscope and gravity sensor of the display terminal generate a gyroscope angle signal and a gravity sensing signal, which form view-angle data transmitted to the video server;
Step 5: the video server receives the view-angle data, processes it to form control data, and transmits the control data to the Raspberry Pi camera of the mobile terminal;
Step 6: the Raspberry Pi camera relays the data to the Arduino board, which controls the motion mode of the cart according to the control data.
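The matching in Step 3 exploits the epipolar constraint: for a rectified binocular pair, corresponding points lie on the same image row, so the correspondence search reduces to a one-dimensional scan, and depth follows by triangulation from the resulting disparity. The patent does not disclose a concrete algorithm; the sketch below is a deliberately naive block-matching illustration (function names and parameters are illustrative, not from the patent), assuming already-rectified grayscale images:

```python
import numpy as np

def block_match_disparity(left, right, block=5, max_disp=16):
    """Naive SSD block matching along epipolar (horizontal) lines.

    Assumes `left` and `right` are rectified grayscale images of equal shape.
    """
    h, w = left.shape
    half = block // 2
    disp = np.zeros((h, w), dtype=np.float32)
    for y in range(half, h - half):
        for x in range(half + max_disp, w - half):
            patch = left[y - half:y + half + 1, x - half:x + half + 1]
            best_ssd, best_d = None, 0
            for d in range(max_disp):
                # Candidate patch shifted left by d in the right image.
                cand = right[y - half:y + half + 1, x - d - half:x - d + half + 1]
                ssd = float(np.sum((patch - cand) ** 2))
                if best_ssd is None or ssd < best_ssd:
                    best_ssd, best_d = ssd, d
            disp[y, x] = best_d
    return disp

def disparity_to_depth(disp, focal_px, baseline_m):
    """Triangulate depth Z = f * B / d, valid where disparity > 0."""
    with np.errstate(divide="ignore"):
        return np.where(disp > 0, focal_px * baseline_m / disp, 0.0)
```

A production pipeline would first calibrate and rectify the two Raspberry Pi cameras and use a robust matcher (for example semi-global matching) rather than this O(h·w·d) loop, but the geometry, a 1-D epipolar search followed by Z = f·B/d, is the same.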
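Steps 4 through 6 close the control loop: head-pose angles from the display terminal become a motion command for the cart's Arduino board. The patent does not specify the command protocol or the mapping; the sketch below assumes a hypothetical two-byte protocol (opcode plus magnitude) and an illustrative dead-zone mapping from yaw and pitch to cart motion:

```python
def angles_to_command(yaw_deg: float, pitch_deg: float, dead_zone: float = 10.0) -> bytes:
    """Map headset view angles to a two-byte cart command.

    Hypothetical protocol (not from the patent): opcode F/L/R/S
    followed by one speed/turn-rate byte in 0-255.
    """
    if pitch_deg < -20.0:                        # looking sharply down: stop
        return b"S\x00"
    turn = min(int(abs(yaw_deg) / 90.0 * 255), 255)
    if yaw_deg > dead_zone:                      # head turned right
        return b"R" + bytes([turn])
    if yaw_deg < -dead_zone:                     # head turned left
        return b"L" + bytes([turn])
    return b"F\x64"                              # facing forward: fixed forward speed
```

On the Raspberry Pi side such a command could be relayed to the Arduino over a serial link (for example with pyserial); on the Arduino, a small `Serial.read()` dispatch loop would translate the opcode and magnitude into motor-driver outputs.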
Priority Applications (1)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
CN201810838456.3A CN109062407A (en) | 2018-07-27 | 2018-07-27 | Remote mobile terminal three-dimensional display & control system and method based on VR technology |
Publications (1)
Publication Number | Publication Date |
---|---|
CN109062407A true CN109062407A (en) | 2018-12-21 |
Family
ID=64835497
Family Applications (1)
Application Number | Title | Priority Date | Filing Date |
---|---|---|---|
CN201810838456.3A Pending CN109062407A (en) | 2018-07-27 | 2018-07-27 | Remote mobile terminal three-dimensional display & control system and method based on VR technology |
Country Status (1)
Country | Link |
---|---|
CN (1) | CN109062407A (en) |
Citations (4)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
CN205726125U (en) * | 2016-03-30 | 2016-11-23 | 重庆邮电大学 | A kind of novel robot Long-Range Surveillance System |
CN106993177A (en) * | 2016-10-12 | 2017-07-28 | 深圳市圆周率软件科技有限责任公司 | A kind of 720 degree of panorama acquisition systems of binocular |
WO2017173735A1 (en) * | 2016-04-07 | 2017-10-12 | 深圳市易瞳科技有限公司 | Video see-through-based smart eyeglasses system and see-through method thereof |
CN107277442A (en) * | 2017-06-20 | 2017-10-20 | 南京第五十五所技术开发有限公司 | One kind is based on intelligent panoramic real-time video VR inspections supervising device and monitoring method |
Cited By (11)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
CN110176022A (en) * | 2019-05-23 | 2019-08-27 | 广西交通科学研究院有限公司 | A kind of tunnel overall view monitoring system and method based on video detection |
CN110176022B (en) * | 2019-05-23 | 2023-03-28 | 广西交通科学研究院有限公司 | Tunnel panoramic monitoring system and method based on video detection |
CN110809148A (en) * | 2019-11-26 | 2020-02-18 | 大连海事大学 | Sea area search system and three-dimensional environment immersive experience VR intelligent glasses |
CN111491125A (en) * | 2020-04-20 | 2020-08-04 | 深圳市谦视智能科技有限责任公司 | AR intelligence glasses and record appearance |
CN111491125B (en) * | 2020-04-20 | 2022-05-13 | 深圳市谦视智能科技有限责任公司 | AR intelligence glasses and record appearance |
CN111402651A (en) * | 2020-05-22 | 2020-07-10 | 丽水学院 | Intelligent teaching system based on VR technique |
CN111814757A (en) * | 2020-08-20 | 2020-10-23 | 许昌学院 | Garbage classification simulation system based on computer software |
CN111814757B (en) * | 2020-08-20 | 2024-01-09 | 许昌学院 | Garbage classification simulation system based on computer software |
CN112291593A (en) * | 2020-12-24 | 2021-01-29 | 湖北芯擎科技有限公司 | Data synchronization method and data synchronization device |
CN113099204A (en) * | 2021-04-13 | 2021-07-09 | 北京航空航天大学青岛研究院 | Remote live-action augmented reality method based on VR head-mounted display equipment |
CN114051130A (en) * | 2021-10-13 | 2022-02-15 | 北京天玛智控科技股份有限公司 | VR-based panoramic video monitoring system |
Similar Documents
Publication | Publication Date | Title |
---|---|---|
CN109062407A (en) | Remote mobile terminal three-dimensional display & control system and method based on VR technology | |
CN104699247B (en) | A kind of virtual reality interactive system and method based on machine vision | |
CN108139799B (en) | System and method for processing image data based on a region of interest (ROI) of a user | |
Krückel et al. | Intuitive visual teleoperation for UGVs using free-look augmented reality displays | |
WO2017211031A1 (en) | Unmanned aerial vehicle mechanical arm control method and device | |
CN108769531B (en) | Method for controlling shooting angle of shooting device, control device and remote controller | |
CN110531846A (en) | The two-way real-time 3D interactive operation of real-time 3D virtual objects in the range of real-time 3D virtual world representing real world | |
CN111716365B (en) | Immersive remote interaction system and method based on natural walking | |
CN110140099A (en) | System and method for tracking control unit | |
CN107515606A (en) | Robot implementation method, control method and robot, electronic equipment | |
WO2015180497A1 (en) | Motion collection and feedback method and system based on stereoscopic vision | |
CN106227231A (en) | The control method of unmanned plane, body feeling interaction device and unmanned plane | |
CN109076249A (en) | System and method for video processing and showing | |
CN109164829A (en) | A kind of flight mechanical arm system and control method based on device for force feedback and VR perception | |
CN106708074A (en) | Method and device for controlling unmanned aerial vehicle based on VR glasses | |
CN204741528U (en) | Intelligent control ware is felt to three -dimensional immersive body | |
CN107071389A (en) | Take photo by plane method, device and unmanned plane | |
CN109983468A (en) | Use the method and system of characteristic point detection and tracking object | |
CN105080134A (en) | Realistic remote-control experience game system | |
CN110969905A (en) | Remote teaching interaction and teaching aid interaction system for mixed reality and interaction method thereof | |
CN109358754B (en) | Mixed reality head-mounted display system | |
CN106095094A (en) | The method and apparatus that augmented reality projection is mutual with reality | |
CN109671141A (en) | The rendering method and device of image, storage medium, electronic device | |
US20160121232A1 (en) | Battle game relay system using flying robots | |
KR20200116459A (en) | Systems and methods for augmented reality |
Legal Events
Date | Code | Title | Description |
---|---|---|---|
PB01 | Publication ||
SE01 | Entry into force of request for substantive examination ||
WD01 | Invention patent application deemed withdrawn after publication | Application publication date: 20181221 |