Detailed Description of the Embodiments
To make the technical problems to be solved, the technical solutions, and the advantages of the present invention clearer, the present invention is described in further detail below with reference to the accompanying drawings and embodiments. It should be understood that the specific embodiments described herein are intended only to explain the present invention, not to limit it.
Referring to Fig. 1, Fig. 1 is a flow diagram of a VR-based fire-fighting experience method provided by an embodiment of the present invention. The method comprises:
S101: obtaining a project identifier corresponding to a fire-fighting experience project specified by a user.
In this embodiment, a correspondence between fire-fighting experience projects and project identifiers may be established in advance, so that when the user selects a fire-fighting experience project, the corresponding project identifier can be obtained.
S102: obtaining a preset VR scene corresponding to the project identifier according to the project identifier.
In this embodiment, the user may select the fire-fighting experience project to be experienced according to individual needs, and the embodiment of the present invention determines the preset VR scene corresponding to the project identifier by obtaining the project identifier of the fire-fighting experience project specified by the user.
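By way of illustration only, the lookups in steps S101 and S102 may be organized as simple mappings. The following Python sketch is not part of the claimed method; the project names, identifiers, and scene paths are assumptions introduced for the example.

```python
# Minimal sketch of S101-S102: project -> project identifier -> preset VR scene.
# All names below (PROJECT_IDS, PRESET_SCENES, file paths) are illustrative assumptions.

PROJECT_IDS = {
    "kitchen fire": "P001",
    "high-rise evacuation": "P002",
    "extinguisher drill": "P003",
}

PRESET_SCENES = {
    "P001": "scenes/kitchen_fire.vrscene",
    "P002": "scenes/highrise_evacuation.vrscene",
    "P003": "scenes/extinguisher_drill.vrscene",
}


def get_preset_scene(user_selected_project: str) -> str:
    """Return the preset VR scene for the project the user specified."""
    project_id = PROJECT_IDS[user_selected_project]   # S101: project -> identifier
    return PRESET_SCENES[project_id]                  # S102: identifier -> preset scene


print(get_preset_scene("kitchen fire"))               # scenes/kitchen_fire.vrscene
```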
S103: detecting the user's line-of-sight direction, and playing the preset VR scene in a region corresponding to the user's line-of-sight direction.
In this embodiment, the preset VR scene may be played within the user's visible range, with the VR scene inside the user's visible range rendered in full detail and the scene outside the user's visible range blurred, thereby reducing data transmission pressure and saving bandwidth resources.
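As a purely illustrative, non-limiting sketch of this rendering strategy, the code below chooses a rendering quality for a scene region according to whether it lies within the user's visible range; the angular threshold and quality labels are assumptions made for the example.

```python
def angular_offset(region_azimuth_deg: float, gaze_azimuth_deg: float) -> float:
    """Smallest absolute angular difference between a region's direction and the gaze direction."""
    diff = abs(region_azimuth_deg - gaze_azimuth_deg) % 360.0
    return min(diff, 360.0 - diff)


def quality_for_region(region_azimuth_deg: float, gaze_azimuth_deg: float,
                       visible_half_angle_deg: float = 55.0) -> str:
    """Full-detail playback inside the user's visible range, blurred rendering outside it."""
    if angular_offset(region_azimuth_deg, gaze_azimuth_deg) <= visible_half_angle_deg:
        return "full"      # enhanced VR scene inside the visible range
    return "blurred"       # virtualized scene outside the visible range, saving bandwidth


# With the gaze at 90 degrees, a region at 100 degrees is rendered in full, one at 270 degrees is blurred.
print(quality_for_region(100.0, 90.0), quality_for_region(270.0, 90.0))
```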
S104: during playback of the preset VR scene, detecting whether a trigger display instruction is received, the trigger display instruction carrying a node identifier; if a trigger display instruction is received, loading corresponding data according to the node identifier and displaying the data.
In this embodiment, the trigger display instruction includes an automatic trigger instruction, a user-triggered display instruction, and a background-triggered display instruction. The automatic trigger instruction is triggered automatically according to the playback timeline, without manual triggering; the user-triggered display instruction is an instruction triggered when the user interacts with the VR scene (excluding control-type instructions, such as pausing or ending playback of the preset VR scene); and the background-triggered display instruction is an instruction triggered by background staff when a special situation occurs (for example, a playback device failure or a playback program error).
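For illustration only, the three trigger sources described above may be represented as a tagged instruction carrying a node identifier, handled uniformly once received. The class and function names in this sketch are assumptions, not part of the claimed embodiments.

```python
from dataclasses import dataclass


@dataclass
class TriggerDisplayInstruction:
    source: str    # "automatic" (playback timeline), "user" (VR interaction), or "background" (staff)
    node_id: str   # node identifier carried by the instruction


def display(data) -> None:
    print(f"displaying: {data}")   # placeholder for the actual VR display call


def handle_trigger(instr: TriggerDisplayInstruction, load_data_by_node) -> None:
    """Whatever the trigger source, the carried node identifier decides which data to load and show."""
    data = load_data_by_node(instr.node_id)   # see S601-S602 for one way to resolve the identifier
    display(data)


handle_trigger(TriggerDisplayInstruction("automatic", "node_07"),
               lambda node_id: f"prompt data for {node_id}")
```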
S105: during playback of the preset VR scene, detecting whether a user control instruction is received; if a user control instruction is received, adjusting the playback progress of the preset VR scene according to the user control instruction.
In this embodiment, the user control instruction includes, but is not limited to, a pause instruction, an end instruction, and the like issued during playback of the preset VR scene.
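A minimal sketch of step S105 follows, assuming a simple playback-state object and two user control instructions (pause and end); these names are illustrative only.

```python
class PresetScenePlayback:
    """Toy playback state used only to illustrate how user control instructions adjust progress."""

    def __init__(self) -> None:
        self.paused = False
        self.ended = False

    def apply_user_control(self, instruction: str) -> None:
        if instruction == "pause":
            self.paused = True    # freeze playback progress of the preset VR scene
        elif instruction == "end":
            self.ended = True     # terminate playback of the preset VR scene
        else:
            raise ValueError(f"unknown user control instruction: {instruction}")


playback = PresetScenePlayback()
playback.apply_user_control("pause")
```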
As can be seen from the above, on one hand, the embodiments of the present invention conduct fire safety education based on VR technology, which effectively reduces material costs; on another hand, the embodiments of the present invention play the preset VR scene in the region corresponding to the user's line-of-sight direction rather than in all regions, which effectively reduces data transmission pressure and saves bandwidth resources; on a further hand, the embodiments of the present invention act according to trigger display instructions without personnel intervention, which saves labor costs. In summary, the embodiments of the present invention effectively reduce the cost of fire safety education.
Referring to Fig. 1 and Fig. 2 together, Fig. 2 is a flow diagram of a VR-based fire-fighting experience method provided by another embodiment of the present application. On the basis of the above embodiments, the method may further include a creation process of the preset VR scene, and the creation process may include:
S201: obtaining a real-scene picture of each fire-fighting experience project.
In this embodiment, the real-scene picture may serve as the basis for creating the background scene of the preset VR scene.
S202: creating the preset VR scene based on the real-scene picture and preset three-dimensional component models.
In this embodiment, a three-dimensional real-scene environment may first be built based on the real-scene picture, and the preset three-dimensional component models may then be added to the three-dimensional real-scene environment to obtain the preset VR scene.
Referring to Fig. 2 and Fig. 3 together, Fig. 3 is a flow diagram of a VR-based fire-fighting experience method provided by yet another embodiment of the present application. On the basis of the above embodiments, step S202 may be detailed as follows:
S301: creating a real-scene environment based on the real-scene picture.
In this embodiment, the real-scene environment is a three-dimensional real-scene environment and may serve as the background scene of the preset VR scene.
S302: identifying individual components in the real-scene picture, and determining the preset three-dimensional component models corresponding to the individual components.
In this embodiment, a three-dimensional component model library may be searched to determine the preset three-dimensional component model corresponding to each individual component.
S303: adding the preset three-dimensional component models to the real-scene environment to obtain the preset VR scene.
In this embodiment, each preset three-dimensional component model may be added, within the real-scene environment, at the position of the individual component corresponding to that model, and the edges of the preset three-dimensional component models may then be processed by an image edge-blending technique to obtain the preset VR scene.
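The following sketch illustrates steps S301 to S303 under simplifying assumptions: component models are placed at the recorded positions of their corresponding individual components, and a hypothetical edge-blending routine smooths their borders. The data structures and function names are assumptions for the example.

```python
from dataclasses import dataclass, field


@dataclass
class ComponentModel:
    component_id: str
    position: tuple            # (x, y, z) position of the corresponding individual component


@dataclass
class PresetVRScene:
    background: str                                    # three-dimensional real-scene environment (S301)
    placed_models: list = field(default_factory=list)  # models added in S303


def blend_edges(model: ComponentModel) -> ComponentModel:
    """Placeholder for the image edge-blending step applied to the model's borders."""
    return model


def assemble_scene(background: str, models: list) -> PresetVRScene:
    scene = PresetVRScene(background=background)
    for model in models:
        scene.placed_models.append(blend_edges(model))  # S303: add the model, then blend its edges
    return scene


scene = assemble_scene("kitchen_realscene", [ComponentModel("extinguisher", (1.0, 0.0, 2.5))])
```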
Referring to Fig. 3 and Fig. 4 together, Fig. 4 is a flow diagram of a VR-based fire-fighting experience method provided by still another embodiment of the present application. On the basis of the above embodiments, step S302 may be detailed as follows:
S401: identifying the individual components in the real-scene picture, and obtaining individual component identifiers.
In this embodiment, the individual components in the real-scene picture may be identified to obtain individual component identifiers, and a position marker may be set at the position of each individual component, the position marker of each individual component corresponding to the identifier of that individual component.
S402: searching the three-dimensional component model library based on the individual component identifiers, and obtaining the preset three-dimensional component models corresponding to the individual components.
In this embodiment, the three-dimensional component model library may be searched based on the individual component identifiers to obtain the preset three-dimensional component models corresponding to the individual components. When a preset three-dimensional component model is added, the insertion position of the preset three-dimensional component model may be determined from the individual component identifier and the position marker of the individual component corresponding to that model.
Referring to Fig. 1 and Fig. 5 together, Fig. 5 is a flow diagram of a VR-based fire-fighting experience method provided by still another embodiment of the present application. On the basis of the above embodiments, step S103 may be detailed as follows:
S501: detecting the user's line-of-sight direction, and determining, according to the user's line-of-sight direction, a first region in which the preset VR scene is played.
In this embodiment, the sector-shaped region facing the user's line-of-sight direction may serve as the first region in which the preset VR scene is played, where the coverage of the first region is larger than the user's visible range.
S502: determining, according to the first region and a preset user head deflection angle, a second region in which the preset VR scene is played, and playing the preset VR scene in the second region.
In this embodiment, because the user may change the line-of-sight direction at any time, an untimely picture switch would cause playback of the preset VR scene to stutter and degrade the user experience. The playback range of the preset VR scene may therefore be enlarged on the basis of the first region, so that more picture-switching time is available when the user changes the line-of-sight direction, thereby avoiding playback stutter.
In this embodiment, the preset user head deflection angle may first be obtained; the area of the picture that needs to be switched when the user deflects the line of sight once (referred to as the line-of-sight deflection area) may be determined from the preset user head deflection angle and the user's maximum sight distance (the farthest distance visible to the user when wearing the VR device); and the line-of-sight deflection area scaled by a preset multiple (for example, 1.5 to 3 times) may be used as the playback range of the preset VR scene that is added on the basis of the first region. The second region is then determined based on the first region and the line-of-sight deflection area scaled by the preset multiple.
Determining the second region based on the first region and the line-of-sight deflection area scaled by the preset multiple may include:
expanding each edge of the first region by the line-of-sight deflection area scaled by the preset multiple, to obtain the second region.
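As a simplified, purely illustrative computation for steps S501 and S502, the sketch below treats the first region as a rectangle, approximates the line-of-sight deflection area as a circular sector swept by one head deflection, and grows each edge of the first region accordingly. The 2.0 multiple, the rectangular simplification, and the area-to-margin conversion are assumptions made for this sketch.

```python
import math
from dataclasses import dataclass


@dataclass
class Region:
    width: float    # horizontal extent of the playback region
    height: float   # vertical extent of the playback region


def sight_deflection_area(head_deflection_angle_deg: float, max_sight_distance: float) -> float:
    """Area of picture swept by one line-of-sight deflection (sector area = 1/2 * r^2 * theta)."""
    theta = math.radians(head_deflection_angle_deg)
    return 0.5 * theta * max_sight_distance ** 2


def second_region(first: Region, head_deflection_angle_deg: float,
                  max_sight_distance: float, multiple: float = 2.0) -> Region:
    """Enlarge the first region by the scaled deflection area at each edge to obtain the second region."""
    extra_area = multiple * sight_deflection_area(head_deflection_angle_deg, max_sight_distance)
    # Convert the extra area at each edge into a margin along that edge (an assumption of this sketch).
    return Region(width=first.width + 2 * extra_area / first.height,
                  height=first.height + 2 * extra_area / first.width)


print(second_region(Region(4.0, 3.0), head_deflection_angle_deg=30.0, max_sight_distance=2.0))
```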
Referring to Fig. 1 and Fig. 6 together, Fig. 6 is a flow diagram of a VR-based fire-fighting experience method provided by still another embodiment of the present application. On the basis of the above embodiments, loading the corresponding data according to the node identifier and displaying the data may be detailed as follows:
S601: searching a dynamic link library according to the node identifier, and obtaining a data file address corresponding to the node identifier.
S602: reading the data at the data file address and displaying the data.
In this embodiment, the data file includes, but is not limited to, a video data file, an audio data file, scene prompt text, and the like.
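A minimal sketch of steps S601 and S602 follows, replacing the dynamic link library with an in-memory mapping from node identifiers to data file addresses; the identifiers, file paths, and display call are hypothetical.

```python
# Stand-in for the dynamic link library: node identifier -> data file address.
NODE_ADDRESS_TABLE = {
    "node_fire_alarm": "data/fire_alarm_clip.mp4",
    "node_escape_tip": "data/escape_tip.txt",
}


def show(data: bytes) -> None:
    print(f"displaying {len(data)} bytes")     # placeholder for video/audio/text display


def load_and_display(node_id: str) -> None:
    address = NODE_ADDRESS_TABLE[node_id]      # S601: look up the data file address by node identifier
    with open(address, "rb") as f:             # S602: read the data at that address...
        data = f.read()
    show(data)                                 # ...and hand it to the display pipeline
```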
Corresponding to the VR-based fire-fighting experience method of the foregoing embodiments, Fig. 7 is a structural block diagram of a VR-based fire-fighting experience apparatus provided by an embodiment of the present invention. For ease of description, only the parts related to the embodiments of the present invention are shown. With reference to Fig. 7, the apparatus includes: an identifier acquisition module 10, a scene acquisition module 20, a scene playback module 30, a trigger feedback module 40, and a progress adjustment module 50.
The identifier acquisition module 10 is configured to obtain the project identifier corresponding to the fire-fighting experience project specified by the user.
The scene acquisition module 20 is configured to obtain the preset VR scene corresponding to the project identifier according to the project identifier.
The scene playback module 30 is configured to detect the user's line-of-sight direction and to play the preset VR scene in the region corresponding to the user's line-of-sight direction.
The trigger feedback module 40 is configured to detect, during playback of the preset VR scene, whether a trigger display instruction is received, the trigger display instruction carrying a node identifier, and, if a trigger display instruction is received, to load corresponding data according to the node identifier and display the data.
The progress adjustment module 50 is configured to detect, during playback of the preset VR scene, whether a user control instruction is received, and, if a user control instruction is received, to adjust the playback progress of the preset VR scene according to the user control instruction.
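For illustration only, the apparatus of Fig. 7 may be mirrored in code as an object composed of the five modules described above; the class name and callable modules are assumptions, and the module bodies are omitted.

```python
class FireFightingExperienceApparatus:
    """Toy composition mirroring modules 10 to 50 of Fig. 7; all names are illustrative."""

    def __init__(self, identifier_acquisition, scene_acquisition,
                 scene_playback, trigger_feedback, progress_adjustment):
        self.identifier_acquisition = identifier_acquisition   # module 10
        self.scene_acquisition = scene_acquisition              # module 20
        self.scene_playback = scene_playback                    # module 30
        self.trigger_feedback = trigger_feedback                # module 40
        self.progress_adjustment = progress_adjustment          # module 50

    def run(self, user_selected_project: str) -> None:
        project_id = self.identifier_acquisition(user_selected_project)
        scene = self.scene_acquisition(project_id)
        # trigger_feedback and progress_adjustment would react to events during playback.
        self.scene_playback(scene)
```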
With reference to Fig. 7, in another embodiment of the present invention, the VR-based fire-fighting experience apparatus further includes a scene creation module 60, and the scene creation module 60 includes:
a real-scene acquisition unit 61, configured to obtain the real-scene picture of each fire-fighting experience project; and
a scene creation unit 62, configured to create the preset VR scene based on the real-scene picture and the preset three-dimensional component models.
Optionally, in a specific implementation of the VR-based fire-fighting experience apparatus provided by the embodiments of the present invention, creating the preset VR scene based on the real-scene picture and the preset three-dimensional component models includes:
creating a real-scene environment based on the real-scene picture;
identifying individual components in the real-scene picture, and determining the preset three-dimensional component models corresponding to the individual components; and
adding the preset three-dimensional component models to the real-scene environment to obtain the preset VR scene.
Optionally, in a specific implementation of the VR-based fire-fighting experience apparatus provided by the embodiments of the present invention, identifying the individual components in the real-scene picture and determining the preset three-dimensional component models corresponding to the individual components includes:
identifying the individual components in the real-scene picture, and obtaining individual component identifiers; and
searching the three-dimensional component model library based on the individual component identifiers, and obtaining the preset three-dimensional component models corresponding to the individual components.
With reference to Fig. 7, in yet another embodiment of the present invention, the scene playback module 30 may include:
a direction detection unit 31, configured to detect the user's line-of-sight direction and to determine, according to the user's line-of-sight direction, the first region in which the preset VR scene is played; and
a scene playback unit 32, configured to determine, according to the first region and the preset user head deflection angle, the second region in which the preset VR scene is played, and to play the preset VR scene in the second region.
With reference to Fig. 7, in yet another embodiment of the present invention, the trigger feedback module 40 may include:
an address lookup unit 41, configured to search the dynamic link library according to the node identifier and to obtain the data file address corresponding to the node identifier; and
a data display unit 42, configured to read the data at the data file address and to display the data.
Referring to Fig. 8, Fig. 8 is a schematic block diagram of a terminal device provided by an embodiment of the present invention. As shown in Fig. 8, the terminal 800 in this embodiment may include: one or more processors 801, one or more input devices 802, one or more output devices 803, and one or more memories 804. The processor 801, the input device 802, the output device 803, and the memory 804 communicate with one another through a communication bus 805. The memory 804 is configured to store a computer program, and the computer program includes program instructions. The processor 801 is configured to execute the program instructions stored in the memory 804; specifically, the processor 801 is configured to invoke the program instructions to perform the functions of the modules/units in the foregoing apparatus embodiments, for example, the functions of modules 10 to 60 shown in Fig. 7.
It should be understood that, in the embodiments of the present invention, the processor 801 may be a central processing unit (Central Processing Unit, CPU), or may be another general-purpose processor, a digital signal processor (Digital Signal Processor, DSP), an application-specific integrated circuit (Application Specific Integrated Circuit, ASIC), a field-programmable gate array (Field-Programmable Gate Array, FPGA) or other programmable logic device, a discrete gate or transistor logic device, a discrete hardware component, or the like. The general-purpose processor may be a microprocessor, or the processor may be any conventional processor.
The input device 802 may include a trackpad, a fingerprint sensor (for acquiring fingerprint information and fingerprint direction information of the user), a microphone, and the like, and the output device 803 may include a display (such as an LCD), a loudspeaker, and the like.
The memory 804 may include a read-only memory and a random access memory, and provides instructions and data to the processor 801. A part of the memory 804 may also include a non-volatile random access memory. For example, the memory 804 may also store information about the device type.
In a specific implementation, the processor 801, the input device 802, and the output device 803 described in the embodiments of the present invention may perform the implementations described in the first and second embodiments of the VR-based fire-fighting experience method provided by the embodiments of the present invention, and may also perform the implementation of the terminal described in the embodiments of the present invention, which is not repeated here.
Another embodiment of the present invention provides a computer-readable storage medium. The computer-readable storage medium stores a computer program, and the computer program includes program instructions. When the program instructions are executed by a processor, all or part of the processes of the foregoing method embodiments are implemented; these processes may also be completed by a computer program instructing related hardware. The computer program may be stored in a computer-readable storage medium, and when the computer program is executed by a processor, the steps of each of the foregoing method embodiments can be implemented. The computer program includes computer program code, and the computer program code may be in the form of source code, object code, an executable file, some intermediate form, or the like. The computer-readable medium may include: any entity or device capable of carrying the computer program code, a recording medium, a USB flash drive, a removable hard disk, a magnetic disk, an optical disc, a computer memory, a read-only memory (ROM, Read-Only Memory), a random access memory (RAM, Random Access Memory), an electrical carrier signal, a telecommunications signal, a software distribution medium, and the like. It should be noted that the content included in the computer-readable medium may be appropriately added or removed according to the requirements of legislation and patent practice in a jurisdiction; for example, in some jurisdictions, according to legislation and patent practice, the computer-readable medium does not include electrical carrier signals and telecommunications signals.
The computer-readable storage medium may be an internal storage unit of the terminal of any of the foregoing embodiments, for example, a hard disk or a memory of the terminal. The computer-readable storage medium may also be an external storage device of the terminal, for example, a plug-in hard disk, a smart media card (Smart Media Card, SMC), a secure digital (Secure Digital, SD) card, or a flash card (Flash Card) provided on the terminal. Further, the computer-readable storage medium may include both the internal storage unit of the terminal and the external storage device. The computer-readable storage medium is configured to store the computer program and other programs and data required by the terminal, and may also be configured to temporarily store data that has been output or is to be output.
A person of ordinary skill in the art may be aware that the units and algorithm steps described in connection with the embodiments disclosed herein can be implemented by electronic hardware, computer software, or a combination of the two. To clearly illustrate the interchangeability of hardware and software, the composition and steps of each example have been described above generally in terms of their functions. Whether these functions are executed by hardware or software depends on the specific application and the design constraints of the technical solution. A person skilled in the art may use different methods to implement the described functions for each specific application, but such implementations should not be considered to be beyond the scope of the present invention.
It can be clearly understood by those skilled in the art that, for convenience and brevity of description, for the specific working processes of the terminal and units described above, reference may be made to the corresponding processes in the foregoing method embodiments, which are not repeated here.
In the several embodiments provided in the present application, it should be understood that the disclosed terminal and method may be implemented in other manners. For example, the apparatus embodiments described above are merely illustrative; for example, the division of the units is only a logical functional division, and there may be other division manners in actual implementation; for example, multiple units or components may be combined or integrated into another system, or some features may be omitted or not executed. In addition, the mutual coupling, direct coupling, or communication connection shown or discussed may be indirect coupling or communication connection through some interfaces, devices, or units, and may also be electrical, mechanical, or other forms of connection.
The units described as separate components may or may not be physically separated, and the components shown as units may or may not be physical units; that is, they may be located in one place or distributed over multiple network units. Some or all of the units may be selected according to actual needs to achieve the objectives of the embodiments of the present invention.
In addition, the functional units in the embodiments of the present invention may be integrated into one processing unit, or each unit may exist alone physically, or two or more units may be integrated into one unit. The integrated unit may be implemented in the form of hardware or in the form of a software functional unit.
The foregoing is merely specific embodiments of the present invention, but the protection scope of the present invention is not limited thereto. Any person skilled in the art can readily conceive of various equivalent modifications or substitutions within the technical scope disclosed by the present invention, and such modifications or substitutions shall be covered by the protection scope of the present invention. Therefore, the protection scope of the present invention shall be subject to the protection scope of the claims.