CN106131483A - Virtual-reality-based inspection method, related device, and system - Google Patents
Virtual-reality-based inspection method, related device, and system
- Publication number
- CN106131483A (application number CN201610474969.1A)
- Authority
- CN
- China
- Prior art keywords
- terminal
- head
- user
- virtual reality
- azimuth information
- Prior art date
- Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
- Pending
Classifications
-
- H—ELECTRICITY
- H04—ELECTRIC COMMUNICATION TECHNIQUE
- H04N—PICTORIAL COMMUNICATION, e.g. TELEVISION
- H04N7/00—Television systems
- H04N7/18—Closed-circuit television [CCTV] systems, i.e. systems in which the video signal is not broadcast
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06T—IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
- G06T15/00—3D [Three Dimensional] image rendering
Landscapes
- Engineering & Computer Science (AREA)
- Computer Graphics (AREA)
- Physics & Mathematics (AREA)
- General Physics & Mathematics (AREA)
- Theoretical Computer Science (AREA)
- Multimedia (AREA)
- Signal Processing (AREA)
- Processing Or Creating Images (AREA)
Abstract
An embodiment of the invention discloses a virtual-reality-based inspection method and a related device and system, applicable to the field of mobile communication technology. The method comprises the following steps: a first terminal receives image data sent by a second terminal, performs image processing on the image data, and generates a three-dimensional scene image; the first terminal sends the three-dimensional scene image to a virtual reality device; the first terminal receives orientation information of the user's head sent by the virtual reality device, the orientation information being obtained by the virtual reality device tracking the user's head movements; the first terminal sends a control instruction to the second terminal so that the second terminal controls a camera in the second terminal to track the user's head movements, the control instruction including the orientation information of the user's head. Implementing this embodiment allows the inspected workshop to be photographed comprehensively and from multiple angles, thereby improving inspection quality.
Description
Technical field
The present invention relates to the field of mobile communication technology, and in particular to a virtual-reality-based inspection method and a related device and system.
Background
Traditional workshop safety inspection relies mainly on two approaches: periodic inspections by inspectors and irregular spot checks by the supervising department. In existing daily inspections, the inspector mainly photographs workshop equipment with a smartphone and then judges, from display parameters visible in the photos such as equipment type, size, position, and current operating status, whether the equipment in the photos is operating normally. However, because inspectors differ in personal experience, shooting angle, shooting conditions, and shooting habits, the photos they take do not allow an accurate judgment of whether the equipment is normal, so inspection quality cannot be guaranteed.
Summary of the invention
Embodiments of the present invention provide a virtual-reality-based inspection method and a related device and system, which allow the inspected workshop to be photographed comprehensively and from multiple angles, thereby improving inspection quality.
A first aspect of the embodiments of the present invention discloses a virtual-reality-based inspection method, comprising:
the first terminal receives image data sent by a second terminal;
the first terminal performs image processing on the image data to generate a three-dimensional scene image;
the first terminal sends the three-dimensional scene image to a virtual reality device;
the first terminal receives orientation information of the user's head sent by the virtual reality device, the orientation information being obtained by the virtual reality device tracking the user's head movements;
the first terminal sends a control instruction to the second terminal so that the second terminal controls a camera in the second terminal to track the user's head movements, the control instruction including the orientation information of the user's head.
As an optional embodiment, the first terminal performing image processing on the image data to generate the three-dimensional scene image comprises:
the first terminal renders the image data to generate an initial three-dimensional scene image;
the initial three-dimensional scene image is optimized to obtain a target three-dimensional scene image.
As an optional embodiment, the first terminal receiving the orientation information of the user's head sent by the virtual reality device comprises:
the first terminal receives the orientation information of the user's head that the virtual reality device sends to the first terminal through a data collector.
As an optional embodiment, the orientation information of the user's head includes: the amount of movement along the X axis, the amount of movement along the Y axis, the amount of movement along the Z axis, the amount of rotation about the X axis, the amount of rotation about the Y axis, and the amount of rotation about the Z axis.
As an optional embodiment, the method further comprises:
generating an inspection report from the received image data sent by the second terminal;
detecting whether an abnormality exists in the inspection report, and outputting alarm information if one does.
A second aspect of the embodiments of the present invention discloses a terminal, comprising:
a first receiving unit, configured to receive image data sent by a second terminal;
an image processing unit, configured to perform image processing on the image data and generate a three-dimensional scene image;
a first sending unit, configured to send the three-dimensional scene image to a virtual reality device;
a second receiving unit, configured to receive orientation information of the user's head sent by the virtual reality device, the orientation information being obtained by the virtual reality device tracking the user's head movements;
a second sending unit, configured to send a control instruction to the second terminal so that the second terminal controls a camera in the second terminal to track the user's head movements, the control instruction including the orientation information of the user's head.
As an optional embodiment, the image processing unit includes:
a rendering unit, configured to render the image data and generate an initial three-dimensional scene image;
an image optimization unit, configured to optimize the initial three-dimensional scene image and obtain a target three-dimensional scene image.
As an optional embodiment, the second receiving unit is specifically configured to receive the orientation information of the user's head that the virtual reality device sends to the first terminal through a data collector.
As an optional embodiment, the orientation information of the user's head includes: the amount of movement along the X axis, the amount of movement along the Y axis, the amount of movement along the Z axis, the amount of rotation about the X axis, the amount of rotation about the Y axis, and the amount of rotation about the Z axis.
As an optional embodiment, the terminal further includes:
a report generation unit, configured to generate an inspection report from the received image data sent by the second terminal;
a detection unit, configured to detect whether an abnormality exists in the inspection report;
an output unit, configured to output alarm information when an abnormality exists in the inspection report.
A third aspect of the embodiments of the present invention discloses a virtual-reality-based inspection system, comprising a first terminal, a second terminal, a virtual reality device, and a data collector:
the first terminal is any one of the terminals provided by the embodiments of the present invention;
the virtual reality device is configured to receive the three-dimensional scene image sent by the first terminal, track the user's head movements to obtain orientation information of the user's head, and send the orientation information of the user's head to the data collector;
the data collector is configured to receive the orientation information of the user's head sent by the virtual reality device and send it to the first terminal;
the second terminal is configured to send image data to the first terminal, receive the control instruction sent by the first terminal, and control a camera in the second terminal to track the user's head movements.
As an optional embodiment, the orientation information of the user's head includes: the amount of movement along the X axis, the amount of movement along the Y axis, the amount of movement along the Z axis, the amount of rotation about the X axis, the amount of rotation about the Y axis, and the amount of rotation about the Z axis.
As can be seen from the above technical solutions, the embodiments of the present invention have the following advantage: the inspection photos taken by the second terminal are image-processed to generate a three-dimensional scene image, which is sent to the virtual reality device; the orientation information obtained by the virtual reality device tracking the user's head movements is received, and a control instruction containing that orientation information is sent to the second terminal, so that the second terminal controls the camera in the second terminal to track the user's head movements. By controlling the camera in the second terminal to track the user's head movements, the embodiments of the present invention allow the inspected workshop to be photographed comprehensively and from multiple angles, thereby improving inspection quality.
Brief description of the drawings
To illustrate the technical solutions in the embodiments of the present invention more clearly, the accompanying drawings needed to describe the embodiments are introduced briefly below. Obviously, the drawings described below show only some embodiments of the present invention, and a person of ordinary skill in the art can obtain other drawings from them without creative effort.
Fig. 1 is a schematic flowchart of a virtual-reality-based inspection method disclosed in an embodiment of the present invention;
Fig. 2 is a schematic flowchart of another virtual-reality-based inspection method disclosed in an embodiment of the present invention;
Fig. 3 is a schematic structural diagram of a terminal disclosed in an embodiment of the present invention;
Fig. 4 is a schematic structural diagram of another terminal disclosed in an embodiment of the present invention;
Fig. 5 is a schematic structural diagram of yet another terminal disclosed in an embodiment of the present invention;
Fig. 6 is a schematic structural diagram of a virtual-reality-based inspection system disclosed in an embodiment of the present invention.
Detailed description of the embodiments
To make the objectives, technical solutions, and advantages of the present invention clearer, the present invention is described in further detail below with reference to the accompanying drawings. Obviously, the described embodiments are only some, not all, of the embodiments of the present invention. All other embodiments obtained by a person of ordinary skill in the art based on the embodiments of the present invention without creative effort fall within the protection scope of the present invention.
The terms "first" and "second" in the description, claims, and accompanying drawings are used to distinguish different objects, not to describe a particular order. In addition, the term "include" and any variants thereof are intended to cover a non-exclusive inclusion. For example, a process, method, system, product, or device that contains a series of steps or units is not limited to the listed steps or units, but optionally also includes steps or units that are not listed, or optionally also includes other steps or units inherent to that process, method, product, or device.
Embodiments of the present invention provide a virtual-reality-based inspection method and a related device and system, which allow the inspected workshop to be photographed comprehensively and from multiple angles, thereby improving inspection quality.
Please refer to Fig. 1, which is a schematic flowchart of a virtual-reality-based inspection method disclosed in an embodiment of the present invention. The method shown in Fig. 1 may include the following steps:
101. The first terminal receives image data sent by the second terminal.
In this embodiment, the first terminal may include terminals such as a smartphone, a tablet computer, a desktop computer, a personal digital assistant (Personal Digital Assistant, PDA), and a mobile internet device (Mobile Internet Device, MID).
The second terminal may include mobile operating devices such as an unmanned vehicle, an intelligent robot, or an unmanned aerial vehicle, with a camera mounted on the second terminal.
The first terminal and the second terminal may communicate over a wired network, over Wireless Fidelity (Wireless-Fidelity, WiFi), or over Bluetooth; the embodiment of the present invention places no unique restriction on which communication mode is used.
The second terminal photographs the inspected workshop through the camera in the second terminal and sends the photos to the first terminal through a communication module.
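The patent does not specify a wire format for the photo upload, so the following is only an illustrative sketch: a length-prefixed framing scheme that would let the first terminal split individual photo payloads out of a stream link such as TCP over WiFi. The 4-byte big-endian length prefix is an assumption for the example.

```python
import struct

def frame_image(jpeg_bytes: bytes) -> bytes:
    # Prefix the payload with its 4-byte big-endian length so the
    # receiving first terminal can delimit photos on a stream link.
    return struct.pack(">I", len(jpeg_bytes)) + jpeg_bytes

def parse_frames(stream: bytes) -> list:
    # Inverse of frame_image: split a concatenated byte stream back
    # into the individual photo payloads.
    frames, offset = [], 0
    while offset + 4 <= len(stream):
        (length,) = struct.unpack_from(">I", stream, offset)
        offset += 4
        frames.append(stream[offset:offset + length])
        offset += length
    return frames
```

With this framing, the second terminal can simply concatenate framed photos onto the socket and the first terminal recovers them one by one.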
102. The first terminal performs image processing on the image data to generate a three-dimensional scene image.
The first terminal may process the received image data as follows: render the image data to generate an initial three-dimensional scene image, then apply image optimization techniques such as image correction, distortion scaling, and segmented drawing to the initial three-dimensional scene image to obtain a target three-dimensional scene image.
103. The first terminal sends the three-dimensional scene image to the virtual reality device.
The first terminal sends the target three-dimensional scene image obtained by processing to the virtual reality (Virtual Reality, VR) device, so that the VR device can faithfully reproduce the workshop scene. The VR device may be a head-mounted VR helmet, VR glasses adapted from a mobile phone, or the like.
104. The first terminal receives the orientation information of the user's head sent by the virtual reality device, the orientation information being obtained by the virtual reality device tracking the user's head movements.
Based on the real scene of the inspected workshop presented in the VR device, the user can, through the rotation of his or her head, control the camera in the second terminal to track that rotation, enabling multi-angle shooting. For example, a three-axis gyroscope may be installed in the VR device to detect head motion and obtain the orientation information of the user's head, which includes: the amount of movement along the X axis, the amount of movement along the Y axis, the amount of movement along the Z axis, the amount of rotation about the X axis, the amount of rotation about the Y axis, and the amount of rotation about the Z axis.
The VR device transmits the acquired orientation information of the user's head to the first terminal; optionally, the VR device may send the orientation information of the user's head to the first terminal through a data collector.
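The six quantities named above suggest a simple pose record plus rate integration for the gyroscope. The field names, units (radians), and the naive Euler integration below are assumptions for illustration; a real head tracker would also fuse accelerometer data and handle drift.

```python
from dataclasses import dataclass, replace

@dataclass(frozen=True)
class HeadPose:
    # The six quantities the embodiment names: movement along X/Y/Z
    # and rotation about X/Y/Z.
    dx: float = 0.0
    dy: float = 0.0
    dz: float = 0.0
    rx: float = 0.0
    ry: float = 0.0
    rz: float = 0.0

def integrate_gyro(pose: HeadPose, rates: tuple, dt: float) -> HeadPose:
    # Accumulate three-axis angular rates (rad/s) over a timestep dt,
    # as a three-axis gyroscope in the VR device would report them.
    wx, wy, wz = rates
    return replace(pose,
                   rx=pose.rx + wx * dt,
                   ry=pose.ry + wy * dt,
                   rz=pose.rz + wz * dt)
```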
105. The first terminal sends a control instruction to the second terminal so that the second terminal controls the camera in the second terminal to track the user's head movements, the control instruction including the orientation information of the user's head.
After receiving the orientation information of the user's head, the first terminal sends a control instruction to the second terminal based on that information. After receiving the control instruction, the second terminal drives the camera so that its attitude stays consistent with the orientation information of the user's head. Optionally, a reference coordinate system may be preset for the camera and the VR device; subsequently acquired orientation information of the user's head is then expressed in that reference coordinate system, and the reference coordinate systems of the camera and the VR device should be kept consistent.
In the virtual-reality-based inspection method described in Fig. 1, the inspection photos taken by the second terminal are image-processed to generate a three-dimensional scene image, which is sent to the virtual reality device; the orientation information obtained by the virtual reality device tracking the user's head movements is received, and a control instruction containing that orientation information is sent to the second terminal, so that the second terminal controls the camera in the second terminal to track the user's head movements. By controlling the camera in the second terminal to track the user's head movements, the embodiment of the present invention allows the inspected workshop to be photographed comprehensively and from multiple angles, thereby improving inspection quality.
Further, please refer to Fig. 2, which is a schematic flowchart of another virtual-reality-based inspection method disclosed in an embodiment of the present invention. The method shown in Fig. 2 may include the following steps:
201. The first terminal receives image data sent by the second terminal.
202. The first terminal renders the image data to generate an initial three-dimensional scene image.
203. The initial three-dimensional scene image is optimized to obtain a target three-dimensional scene image.
In this embodiment, the image data is rendered to generate an initial three-dimensional scene image, and image optimization techniques such as image correction, distortion scaling, and segmented drawing are then applied to the initial three-dimensional scene image to obtain the target three-dimensional scene image.
204. The first terminal sends the target three-dimensional scene image to the virtual reality device.
205. The first terminal receives the orientation information of the user's head that the virtual reality device sends to the first terminal through the data collector.
Optionally, the orientation information of the user's head includes: the amount of movement along the X axis, the amount of movement along the Y axis, the amount of movement along the Z axis, the amount of rotation about the X axis, the amount of rotation about the Y axis, and the amount of rotation about the Z axis.
206. The first terminal sends a control instruction to the second terminal so that the second terminal controls the camera in the second terminal to track the user's head movements, the control instruction including the orientation information of the user's head.
207. An inspection report is generated from the received image data sent by the second terminal.
208. Whether an abnormality exists in the inspection report is detected.
209. If an abnormality exists in the inspection report, alarm information is output.
Optionally, reference images may be preset, and the image data sent by the second terminal is compared and analyzed against the reference images to generate the inspection report. Whether an abnormality exists in the inspection report is then detected, and if one does, alarm information is output; the alarm information may include the location of the abnormal condition, the kind of abnormality, the degree of abnormality, and so on.
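The reference-image comparison in steps 207-209 can be sketched as follows. The mean-absolute-difference metric, the threshold value, and the report/alarm field names are assumptions for the example; the patent only requires that the comparison yield a report and that an abnormality trigger alarm information carrying location and degree.

```python
def diff_degree(photo, reference):
    # Mean absolute pixel difference between an inspection photo and
    # its preset reference image (both toy grayscale rasters here).
    flat_p = [v for row in photo for v in row]
    flat_r = [v for row in reference for v in row]
    return sum(abs(a - b) for a, b in zip(flat_p, flat_r)) / len(flat_p)

def build_report(location, photo, reference, threshold=0.2):
    # One inspection-report entry: flagged abnormal when the difference
    # from the reference image exceeds the (assumed) threshold.
    degree = diff_degree(photo, reference)
    return {"location": location, "degree": degree,
            "abnormal": degree > threshold}

def alarm_info(report):
    # Alarm information carries the location and degree of the anomaly,
    # as the embodiment describes; None means no alarm.
    if report["abnormal"]:
        return "ALARM at %s: degree %.2f" % (report["location"], report["degree"])
    return None
```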
Please refer to Fig. 3, which is a schematic structural diagram of a terminal disclosed in an embodiment of the present invention. As shown in Fig. 3, the terminal may include:
a first receiving unit 301, configured to receive image data sent by the second terminal;
an image processing unit 302, configured to perform image processing on the image data received by the first receiving unit 301 and generate a three-dimensional scene image;
a first sending unit 303, configured to send the three-dimensional scene image obtained by the image processing unit 302 to the virtual reality device;
a second receiving unit 304, configured to receive the orientation information of the user's head sent by the virtual reality device, the orientation information being obtained by the virtual reality device tracking the user's head movements;
a second sending unit 305, configured to send a control instruction to the second terminal based on the orientation information of the user's head received by the second receiving unit 304, so that the second terminal controls the camera in the second terminal to track the user's head movements, the control instruction including the orientation information of the user's head.
Please also refer to Fig. 4, which is a schematic structural diagram of another terminal disclosed in an embodiment of the present invention. The terminal shown in Fig. 4 is obtained by optimizing the terminal shown in Fig. 3; compared with the terminal in Fig. 3, the image processing unit 302 includes:
a rendering unit 3021, configured to render the image data and generate an initial three-dimensional scene image;
an image optimization unit 3022, configured to optimize the initial three-dimensional scene image generated by the rendering unit 3021 and obtain a target three-dimensional scene image.
Optionally, in the terminal shown in Fig. 4, the second receiving unit 304 is specifically configured to receive the orientation information of the user's head that the virtual reality device sends to the first terminal through the data collector.
Optionally, the orientation information of the user's head includes: the amount of movement along the X axis, the amount of movement along the Y axis, the amount of movement along the Z axis, the amount of rotation about the X axis, the amount of rotation about the Y axis, and the amount of rotation about the Z axis.
Optionally, the terminal shown in Fig. 4 may also include:
a report generation unit 306, configured to generate an inspection report from the image data sent by the second terminal and received by the first receiving unit 301;
a detection unit 307, configured to detect whether an abnormality exists in the inspection report generated by the report generation unit 306;
an output unit 308, configured to output alarm information when the detection unit 307 detects that an abnormality exists in the inspection report.
Further, please refer to Fig. 5, which is a schematic diagram of the physical structure of a terminal disclosed in an embodiment of the present invention. As shown in Fig. 5, the terminal may include: at least one processor 501 such as a CPU, at least one communication unit 502, a user interface 503, a memory 504, and at least one communication bus 505. The communication bus 505 implements the connections and communication among these components, and the user interface 503 may include a display (Display), a keyboard (Keyboard), and the like. The memory 504 may be a high-speed RAM memory or a non-volatile memory (non-volatile memory), for example at least one disk memory. The communication unit 502 may include wireless communication modules such as a Wireless Local Area Network (Wireless Local Area Network, WLAN) module, a Bluetooth module, a WiFi module, a Near Field Communication (Near Field Communication, NFC) module, a baseband (Base Band) module, and Ethernet, as well as wired communication modules such as a Universal Serial Bus (Universal Serial Bus, USB) interface and a Lightning (Lightning, currently used by Apple for the iPhone 6/6s) interface. Optionally, the memory 504 may also be at least one storage device located remotely from the processor 501. As shown in Fig. 5, as a computer storage medium, the memory 504 may include an operating system, a network communication module, a user interface module, and a program for the virtual-reality-based inspection method.
In the terminal shown in Fig. 5, the processor 501 may be used to call the program for the virtual-reality-based inspection method stored in the memory 504 and perform the following operations:
receive image data sent by the second terminal;
perform image processing on the image data to generate a three-dimensional scene image;
send the three-dimensional scene image to the virtual reality device;
receive the orientation information of the user's head sent by the virtual reality device, the orientation information being obtained by the virtual reality device tracking the user's head movements;
send a control instruction to the second terminal so that the second terminal controls the camera in the second terminal to track the user's head movements, the control instruction including the orientation information of the user's head.
Optionally, the processor 501 may call the program for the virtual-reality-based inspection method stored in the memory 504 to perform the image processing on the image data and generate the three-dimensional scene image, which specifically includes:
rendering the image data to generate an initial three-dimensional scene image;
optimizing the initial three-dimensional scene image to obtain a target three-dimensional scene image.
Optionally, the processor 501 may call the program stored in the memory 504 to receive the orientation information of the user's head that the virtual reality device sends to the first terminal through the data collector.
Optionally, the orientation information of the user's head includes: the amount of movement along the X axis, the amount of movement along the Y axis, the amount of movement along the Z axis, the amount of rotation about the X axis, the amount of rotation about the Y axis, and the amount of rotation about the Z axis.
Optionally, the processor 501 may also call the program stored in the memory 504 to perform the following steps:
generating an inspection report from the received image data sent by the second terminal;
detecting whether an abnormality exists in the inspection report, and outputting alarm information if one does.
It should be noted that the terminal shown in Fig. 5 only shows the components needed to perform the virtual-reality-based inspection method disclosed in the embodiment of the present invention; other components that the terminal may possess are not shown, because this does not affect the implementation of the embodiment of the present invention.
The embodiment of the present invention additionally provides a kind of cruising inspection system based on virtual reality, and as shown in Figure 6, this system can be wrapped
Include first terminal the 601, second terminal 602, virtual reality device 603, data collector 604.
First terminal 601, for receiving the view data that the second terminal 602 sends;
First terminal 601, is additionally operable to carry out above-mentioned view data image procossing, generates three-dimensional scene images;
Alternatively, above-mentioned view data is carried out image procossing, generate three-dimensional scene images, may include that
Above-mentioned view data is carried out picture render, generate initial three-dimensional scene image;
Above-mentioned initial three-dimensional scene image is carried out image optimization, obtains target three-dimensional scene images.
First terminal 601, is additionally operable to send above-mentioned three-dimensional scene images to virtual reality device 603;
Virtual reality device 603, for following the tracks of the headwork of user, obtains the azimuth information of the head of this user;
Alternatively, the azimuth information of the head of above-mentioned user includes: the amount of movement of X-direction, the amount of movement of Y-direction, Z-direction
Amount of movement, the amount of spin around X-axis, the amount of spin around Y-axis and amount of spin about the z axis.
Virtual reality device 603, is additionally operable to send the azimuth information of the head of above-mentioned user to data collector 604;
Data collector 604, for sending the azimuth information of the head of above-mentioned user to first terminal 601;
First terminal 601, is additionally operable to receive the azimuth information of the head of above-mentioned user;
The first terminal 601 is further configured to send a control instruction to the second terminal 602.
The second terminal 602 is configured to receive the control instruction and control a camera in the second terminal 602 to track the head movement of the user.
The first terminal 601 is further configured to generate an inspection report according to the received image data sent by the second terminal 602.
The first terminal 601 is further configured to detect whether an anomaly exists in the inspection report and, if an anomaly exists, to output alarm information.
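The report-and-alarm flow might be sketched as follows; the per-frame `anomaly` flag stands in for whatever detector a real deployment would use (e.g. a temperature threshold or defect classifier), and all names are hypothetical:

```python
def generate_report(frames):
    """Build an inspection report from received frames. Each frame is a
    dict; an 'anomaly' key marks frames flagged by some upstream detector
    (placeholder for the patent's unspecified analysis)."""
    return {"entries": [{"frame_id": i, "anomaly": f.get("anomaly", False)}
                        for i, f in enumerate(frames)]}


def check_report(report):
    """Detect anomalies in the report; return alarm information listing
    the offending frame ids, with alarm=False when the report is clean."""
    bad = [e["frame_id"] for e in report["entries"] if e["anomaly"]]
    return {"alarm": bool(bad), "frames": bad}
```

Separating report generation from anomaly checking mirrors the patent's split between the report-generation and detection/output units of the first terminal.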
It should be noted that, in the foregoing embodiments of the virtual-reality-based inspection method and the related device and system, the included units are divided according to functional logic, but the division is not limited thereto, provided that the corresponding functions can be implemented. In addition, the specific names of the functional units are merely for ease of mutual distinction and do not limit the protection scope of the present invention.
In the foregoing embodiments, each embodiment is described with its own emphasis; for parts not described in detail in one embodiment, reference may be made to the relevant descriptions of the other embodiments.
In addition, those of ordinary skill in the art will appreciate that all or part of the steps in the foregoing method embodiments may be implemented by a program instructing relevant hardware. The program may be stored in a computer-readable storage medium, and the storage medium may be a flash drive, a read-only memory (ROM), a random access memory (RAM), a magnetic disk, an optical disc, or any other medium capable of storing program code.
The foregoing are merely preferred embodiments of the present invention, but the protection scope of the present invention is not limited thereto. Any change or replacement readily conceivable by those familiar with the art within the technical scope disclosed by the embodiments of the present invention shall fall within the protection scope of the present invention. Therefore, the protection scope of the present invention shall be subject to the protection scope of the claims.
Claims (12)
1. A virtual-reality-based inspection method, comprising:
receiving, by a first terminal, image data sent by a second terminal;
performing, by the first terminal, image processing on the image data to generate a three-dimensional scene image;
sending, by the first terminal, the three-dimensional scene image to a virtual reality device;
receiving, by the first terminal, orientation information of a user's head sent by the virtual reality device, wherein the orientation information of the user's head is obtained by the virtual reality device tracking the head movement of the user; and
sending, by the first terminal, a control instruction to the second terminal, so that the second terminal controls a camera in the second terminal to track the head movement of the user, wherein the control instruction includes the orientation information of the user's head.
2. The method according to claim 1, wherein performing, by the first terminal, image processing on the image data to generate the three-dimensional scene image comprises:
rendering, by the first terminal, the image data to generate an initial three-dimensional scene image; and
performing image optimization on the initial three-dimensional scene image to obtain a target three-dimensional scene image.
3. The method according to claim 1, wherein receiving, by the first terminal, the orientation information of the user's head sent by the virtual reality device comprises:
receiving, by the first terminal, the orientation information of the user's head sent by the virtual reality device to the first terminal via a data collector.
4. The method according to claim 3, wherein the orientation information of the user's head includes: the amount of movement along the X axis, the amount of movement along the Y axis, the amount of movement along the Z axis, the amount of rotation about the X axis, the amount of rotation about the Y axis, and the amount of rotation about the Z axis.
5. The method according to any one of claims 1 to 4, further comprising:
generating an inspection report according to the received image data sent by the second terminal; and
detecting whether an anomaly exists in the inspection report and, if an anomaly exists, outputting alarm information.
6. A terminal, comprising:
a first receiving unit, configured to receive image data sent by a second terminal;
an image processing unit, configured to perform image processing on the image data to generate a three-dimensional scene image;
a first sending unit, configured to send the three-dimensional scene image to a virtual reality device;
a second receiving unit, configured to receive orientation information of a user's head sent by the virtual reality device, wherein the orientation information of the user's head is obtained by the virtual reality device tracking the head movement of the user; and
a second sending unit, configured to send a control instruction to the second terminal, so that the second terminal controls a camera in the second terminal to track the head movement of the user, wherein the control instruction includes the orientation information of the user's head.
7. The terminal according to claim 6, wherein the image processing unit comprises:
a rendering unit, configured to render the image data to generate an initial three-dimensional scene image; and
an image optimization unit, configured to perform image optimization on the initial three-dimensional scene image to obtain a target three-dimensional scene image.
8. The terminal according to claim 6, wherein
the second receiving unit is specifically configured to receive the orientation information of the user's head sent by the virtual reality device to the terminal via a data collector.
9. The terminal according to claim 8, wherein the orientation information of the user's head includes: the amount of movement along the X axis, the amount of movement along the Y axis, the amount of movement along the Z axis, the amount of rotation about the X axis, the amount of rotation about the Y axis, and the amount of rotation about the Z axis.
10. The terminal according to any one of claims 6 to 9, further comprising:
a report generation unit, configured to generate an inspection report according to the received image data sent by the second terminal;
a detection unit, configured to detect whether an anomaly exists in the inspection report; and
an output unit, configured to output alarm information when an anomaly exists in the inspection report.
11. A virtual-reality-based inspection system, comprising a first terminal, a second terminal, a virtual reality device, and a data collector, wherein the first terminal is the terminal according to any one of claims 6 to 10;
the virtual reality device is configured to receive the three-dimensional scene image sent by the first terminal, track the head movement of a user to obtain orientation information of the user's head, and send the orientation information of the user's head to the data collector;
the data collector is configured to receive the orientation information of the user's head sent by the virtual reality device, and send the orientation information of the user's head to the first terminal; and
the second terminal is configured to send image data to the first terminal, receive the control instruction sent by the first terminal, and control a camera in the second terminal to track the head movement of the user.
12. The system according to claim 11, wherein the orientation information of the user's head includes: the amount of movement along the X axis, the amount of movement along the Y axis, the amount of movement along the Z axis, the amount of rotation about the X axis, the amount of rotation about the Y axis, and the amount of rotation about the Z axis.
Priority Applications (1)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
CN201610474969.1A CN106131483A (en) | 2016-06-24 | 2016-06-24 | A kind of method for inspecting based on virtual reality and relevant device, system |
Publications (1)
Publication Number | Publication Date |
---|---|
CN106131483A true CN106131483A (en) | 2016-11-16 |
Family
ID=57266010
Family Applications (1)
Application Number | Title | Priority Date | Filing Date |
---|---|---|---|
CN201610474969.1A Pending CN106131483A (en) | 2016-06-24 | 2016-06-24 | A kind of method for inspecting based on virtual reality and relevant device, system |
Country Status (1)
Country | Link |
---|---|
CN (1) | CN106131483A (en) |
Citations (9)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
CN102348068A (en) * | 2011-08-03 | 2012-02-08 | 东北大学 | Head gesture control-based following remote visual system |
CN103324203A (en) * | 2013-06-08 | 2013-09-25 | 西北工业大学 | Unmanned airplane avionics system based on intelligent mobile phone |
CN103359642A (en) * | 2013-07-29 | 2013-10-23 | 中联重科股份有限公司 | Tower crane work monitoring system, method thereof and tower crane |
CN103824340A (en) * | 2014-03-07 | 2014-05-28 | 山东鲁能智能技术有限公司 | Intelligent inspection system and inspection method for electric transmission line by unmanned aerial vehicle |
CN104111658A (en) * | 2014-07-17 | 2014-10-22 | 金陵科技学院 | Unmanned aerial vehicle capable of performing monitoring shooting and controlling through smart glasses |
CN105222761A (en) * | 2015-10-29 | 2016-01-06 | 哈尔滨工业大学 | The first person immersion unmanned plane control loop realized by virtual reality and binocular vision technology and drive manner |
CN105334864A (en) * | 2015-11-24 | 2016-02-17 | 杨珊珊 | Intelligent glasses and control method for controlling unmanned aerial vehicle |
CN205103661U (en) * | 2015-07-24 | 2016-03-23 | 刘思成 | Unmanned aerial vehicle control system based on control technique is felt to body |
EP2959352B1 (en) * | 2013-07-31 | 2017-08-16 | SZ DJI Technology Co., Ltd. | Remote control method and terminal |
Cited By (13)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
US11068048B2 (en) | 2016-11-25 | 2021-07-20 | Samsung Electronics Co., Ltd. | Method and device for providing an image |
CN106557170A (en) * | 2016-11-25 | 2017-04-05 | 三星电子(中国)研发中心 | The method and device zoomed in and out by image on virtual reality device |
WO2018196184A1 (en) * | 2017-04-28 | 2018-11-01 | 深圳前海弘稼科技有限公司 | Plant monitoring method and monitoring system |
CN108366232A (en) * | 2018-03-30 | 2018-08-03 | 东南大学 | A kind of intelligent video monitoring system based on mobile terminal virtual reality technology |
CN110825333A (en) * | 2018-08-14 | 2020-02-21 | 广东虚拟现实科技有限公司 | Display method, display device, terminal equipment and storage medium |
CN110825333B (en) * | 2018-08-14 | 2021-12-21 | 广东虚拟现实科技有限公司 | Display method, display device, terminal equipment and storage medium |
CN110347163A (en) * | 2019-08-07 | 2019-10-18 | 京东方科技集团股份有限公司 | A kind of control method of unmanned equipment, equipment and unmanned control system |
CN111402444A (en) * | 2020-03-24 | 2020-07-10 | 深圳市中盛瑞达科技有限公司 | Integrated machine room operation and maintenance management system |
CN111402444B (en) * | 2020-03-24 | 2021-01-01 | 深圳市中盛瑞达科技有限公司 | Integrated machine room operation and maintenance management system |
CN111522444A (en) * | 2020-04-15 | 2020-08-11 | 云南电网有限责任公司带电作业分公司 | Mechanism method and system for dynamic simulation generation of power transmission line inspection scene based on VR technology |
CN111508096A (en) * | 2020-04-16 | 2020-08-07 | 宁波易周科技有限公司 | Intelligent industrial online patrol system |
CN112669469A (en) * | 2021-01-08 | 2021-04-16 | 国网山东省电力公司枣庄供电公司 | Power plant virtual roaming system and method based on unmanned aerial vehicle and panoramic camera |
CN112669469B (en) * | 2021-01-08 | 2023-10-13 | 国网山东省电力公司枣庄供电公司 | Power plant virtual roaming system and method based on unmanned aerial vehicle and panoramic camera |
Similar Documents
Publication | Publication Date | Title |
---|---|---|
CN106131483A (en) | A kind of method for inspecting based on virtual reality and relevant device, system | |
US20190220650A1 (en) | Systems and methods for depth map sampling | |
CN211262493U (en) | Temperature measurement system and temperature measurement main system | |
CN103841313A (en) | Pan-tilt camera control method, system and device | |
CN105828068A (en) | Method and device for carrying out occlusion detection on camera and terminal device | |
US20220236056A1 (en) | Metering adjustment method, apparatus and device and storage medium | |
CN110673647B (en) | Omnidirectional obstacle avoidance method and unmanned aerial vehicle | |
CN105513155A (en) | Inspection picture classifying and naming method and terminal equipment | |
CN110223413A (en) | Intelligent polling method, device, computer storage medium and electronic equipment | |
CN102902943A (en) | Two-dimension code scanning method, processing device and terminal | |
EP3872702A2 (en) | Light color identifying method and apparatus of signal light, and roadside device | |
CN108180909A (en) | Relative position determines method, apparatus and electronic equipment | |
US20220100795A1 (en) | Systems and methods for image retrieval | |
CN110493521A (en) | Automatic Pilot camera control method, device, electronic equipment, storage medium | |
CN102915189B (en) | A kind of display processing method and electronic equipment | |
CN109903308B (en) | Method and device for acquiring information | |
CN109886100A (en) | A kind of pedestrian detecting system based on Area generation network | |
JPWO2014027500A1 (en) | Feature extraction method, program, and system | |
CN108536156A (en) | Target Tracking System and method for tracking target | |
CN112595728A (en) | Road problem determination method and related device | |
CN108600691A (en) | Image-pickup method, apparatus and system | |
CN105991903A (en) | Electronic apparatus and information processing method | |
CN114194056B (en) | Vehicle charging method and device and electronic equipment | |
CN109981973A (en) | Prevent the method, apparatus and storage medium of dangerous self-timer | |
CN104899548A (en) | Video detection method for number of operation hands on steering wheel |
Legal Events
Date | Code | Title | Description |
---|---|---|---|
C06 | Publication | ||
PB01 | Publication | ||
C10 | Entry into substantive examination | ||
SE01 | Entry into force of request for substantive examination | ||
RJ01 | Rejection of invention patent application after publication | Application publication date: 20161116 |