CN107105065A - VR remote control method, apparatus and system - Google Patents
VR remote control method, apparatus and system
- Publication number
- CN107105065A CN107105065A CN201710450766.3A CN201710450766A CN107105065A CN 107105065 A CN107105065 A CN 107105065A CN 201710450766 A CN201710450766 A CN 201710450766A CN 107105065 A CN107105065 A CN 107105065A
- Authority
- CN
- China
- Prior art keywords
- equipment
- scene
- information
- control
- real
- Prior art date
- Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
- Pending
Classifications
-
- H—ELECTRICITY
- H04—ELECTRIC COMMUNICATION TECHNIQUE
- H04L—TRANSMISSION OF DIGITAL INFORMATION, e.g. TELEGRAPHIC COMMUNICATION
- H04L67/00—Network arrangements or protocols for supporting network services or applications
- H04L67/01—Protocols
- H04L67/02—Protocols based on web technology, e.g. hypertext transfer protocol [HTTP]
- H04L67/025—Protocols based on web technology, e.g. hypertext transfer protocol [HTTP] for remote control or remote monitoring of applications
-
- H—ELECTRICITY
- H04—ELECTRIC COMMUNICATION TECHNIQUE
- H04L—TRANSMISSION OF DIGITAL INFORMATION, e.g. TELEGRAPHIC COMMUNICATION
- H04L67/00—Network arrangements or protocols for supporting network services or applications
- H04L67/01—Protocols
- H04L67/131—Protocols for games, networked simulations or virtual reality
Abstract
The invention provides a VR remote control method, apparatus, and system, relating to the field of virtual reality technology. The method includes: the VR device obtains and displays the real-time scene information captured and uploaded by an image acquisition device, the real-time scene information being used to characterize the real-time scene of the space in which the image acquisition device is located; the VR device receives control information sent by a user, the control information being used to control an object, controlled by a control device, in the scene of the space in which the image acquisition device is located; and the VR device sends a control instruction to the control device, the control instruction being used to cause the control device to control, according to the control information sent by the user, the object in the scene of the space in which the image acquisition device is located. The user sends control instructions to the VR device based on the real-time scene, and the objects in the real-time scene are controlled through the control device, which is more intelligent and convenient.
Description
Technical field
The present invention relates to the field of virtual reality technology, and in particular to a VR remote control method, apparatus, and system.
Background technology
With the continuous development of VR technology, VR is applied ever more widely in daily life and work, and users' expectations of the VR experience keep rising: it is no longer enough merely to watch; interaction is also required.
Summary of the invention
In view of this, embodiments of the present invention provide a VR remote control method, apparatus, and system to solve the above problems.
To achieve the above objects, the technical solution adopted by the present invention is as follows:
In a first aspect, an embodiment of the present invention provides a VR remote control method applied to a VR system, the VR system including an image acquisition device, a VR device, and a control device. The method includes: the VR device obtains and displays the real-time scene information captured and uploaded by the image acquisition device, the real-time scene information being used to characterize the real-time scene of the space in which the image acquisition device is located; the VR device receives control information sent by a user, the control information being used to control an object, controlled by the control device, in the scene of the space in which the image acquisition device is located; and the VR device sends a control instruction to the control device, the control instruction being used to cause the control device to control, according to the control information sent by the user, the object in the scene of the space in which the image acquisition device is located.
In a second aspect, an embodiment of the present invention provides a VR remote control method applied to a VR system, the VR system including an image acquisition device, a VR device, and a control device. The method includes: the image acquisition device captures in real time and uploads real-time scene information to the VR device, the real-time scene information being used to characterize the real-time scene of the space in which the image acquisition device is located; the VR device receives control information sent by a user, the control information being used to control an object, controlled by the control device, in the scene of the space in which the image acquisition device is located; the VR device sends a control instruction to the control device, the control instruction being used to cause the control device to control, according to the control information sent by the user, the object in the scene of the space in which the image acquisition device is located; and the control device controls the object in the scene of the space in which the image acquisition device is located according to the control instruction sent by the VR device.
In a third aspect, an embodiment of the present invention provides a VR remote control apparatus applied to a VR system, the VR system including an image acquisition device, a VR device, and a control device. The VR remote control apparatus includes: a first acquisition unit, configured to obtain and display the real-time scene information captured and uploaded by the image acquisition device, the real-time scene information being used to characterize the real-time scene of the space in which the image acquisition device is located; a first receiving unit, configured to receive control information sent by a user, the control information being used to control an object, controlled by the control device, in the scene of the space in which the image acquisition device is located; and a sending unit, configured to send a control instruction to the control device, the control instruction being used to cause the control device to control, according to the control information sent by the user, the object in the scene of the space in which the image acquisition device is located.
In a fourth aspect, an embodiment of the present invention provides a VR remote control system, including an image acquisition device, a VR device, and a control device. The image acquisition device captures in real time and uploads real-time scene information to the VR device, the real-time scene information being used to characterize the real-time scene of the space in which the image acquisition device is located. The VR device receives control information sent by a user, the control information being used to control an object, controlled by the control device, in the scene of the space in which the image acquisition device is located. The VR device sends a control instruction to the control device, the control instruction being used to cause the control device to control, according to the control information sent by the user, the object in the scene of the space in which the image acquisition device is located. The control device controls the object in the scene of the space in which the image acquisition device is located according to the control instruction sent by the VR device.
With the VR remote control method, apparatus, and system provided by embodiments of the present invention, the VR device receives and displays real-time scene information; the user sends control instructions to the VR device based on the real-time scene, and the objects in the real-time scene are controlled through the control device, which is more intelligent and convenient.
To make the above objects, features, and advantages of the present invention more apparent and easier to understand, preferred embodiments are described in detail below with reference to the accompanying drawings.
Brief description of the drawings
To make the purpose, technical solutions, and advantages of the embodiments of the present invention clearer, the technical solutions in the embodiments of the present invention are described clearly and completely below with reference to the accompanying drawings. Obviously, the described embodiments are only some, not all, of the embodiments of the present invention. All other embodiments obtained by those of ordinary skill in the art based on the embodiments of the present invention without creative effort fall within the protection scope of the present invention.
Fig. 1 shows a structural block diagram of the VR system provided by an embodiment of the present invention;
Fig. 2 is a structural schematic diagram of the electronic device provided by a preferred embodiment of the present invention;
Fig. 3 shows a flow chart of the steps of the VR remote control method provided by the second embodiment of the present invention;
Fig. 4 shows a flow chart of the steps of the VR remote control method provided by the third embodiment of the present invention;
Fig. 5 shows a flow chart of the steps of the VR remote control method provided by the fourth embodiment of the present invention;
Fig. 6 shows a structural block diagram of the VR remote control apparatus provided by the fifth embodiment of the present invention.
Detailed description of the embodiments
The technical solutions in the embodiments of the present invention are described clearly and completely below with reference to the accompanying drawings. Obviously, the described embodiments are only some, not all, of the embodiments of the present invention. The components of the embodiments of the present invention, as generally described and illustrated in the accompanying drawings herein, can be arranged and designed in a variety of different configurations. Therefore, the following detailed description of the embodiments of the invention provided in the accompanying drawings is not intended to limit the scope of the claimed invention, but merely represents selected embodiments of the present invention. All other embodiments obtained by those skilled in the art based on the embodiments of the present invention without creative effort fall within the protection scope of the present invention.
It should be noted that similar reference numerals and letters denote similar items in the following drawings; therefore, once an item is defined in one drawing, it need not be further defined or explained in subsequent drawings. Meanwhile, in the description of the present invention, the terms "first", "second", etc. are used only to distinguish the description and are not to be understood as indicating or implying relative importance.
Fig. 1 is a structural block diagram of the VR system 100 provided by an embodiment of the present invention. The VR system 100 includes an image acquisition device 110, a VR device 120, a server 130, and a control device 140. The image acquisition device 110 and the VR device 120 are connected wirelessly for data communication or interaction; the VR device 120 and the server 130 are connected wirelessly for data communication or interaction; and the server 130 and the control device 140 are connected wirelessly for data communication or interaction. Of course, the VR device 120 may also be directly connected to the control device 140 wirelessly for data interaction.
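The connection topology of Fig. 1 can be sketched as a small registry of components and wireless links. The following Python sketch is purely illustrative: the identifiers and the link model are assumptions introduced here, not part of the claimed system.

```python
# Minimal sketch of the VR system 100 topology described above.
# Component numbers mirror Fig. 1; the link model is an assumption.

COMPONENTS = {
    110: "image acquisition device",
    120: "VR device",
    130: "server",
    140: "control device",
}

# Wireless links for data communication or interaction, as undirected pairs.
LINKS = {
    frozenset({110, 120}),  # image acquisition device <-> VR device
    frozenset({120, 130}),  # VR device <-> server
    frozenset({130, 140}),  # server <-> control device
    frozenset({120, 140}),  # optional direct VR device <-> control device link
}

def connected(a: int, b: int) -> bool:
    """Return True if the two components share a direct wireless link."""
    return frozenset({a, b}) in LINKS
```

Under this model the VR device 120 can reach the control device 140 either through the server 130 or over the optional direct link, while the image acquisition device 110 has no direct link to the control device 140.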
Referring to Fig. 2, which shows a structural block diagram of an electronic device 200 that can be applied to the VR remote control method, apparatus, and system provided by embodiments of the present invention. As an embodiment, the electronic device 200 may be a terminal such as a personal computer (PC), a tablet computer, a smartphone, a personal digital assistant (PDA), or a wearable device. The electronic device 200 can serve as the image acquisition device 110, the VR device 120, or the control device 140. As shown in Fig. 2, the electronic device 200 may include a memory 202, a storage controller 203, a processor 204, and a network module 205.
The memory 202, the storage controller 203, the processor 204, and the network module 205 are electrically connected to one another, directly or indirectly, to realize the transmission or interaction of data. For example, these elements may be electrically connected through one or more communication buses or signal buses. The VR remote control method includes at least one software function module that can be stored in the memory 202 in the form of software or firmware, such as a software function module or computer program included in the VR remote control apparatus.
The memory 202 can store various software programs and modules, such as the program instructions/modules corresponding to the VR remote control method and apparatus provided by embodiments of the present application. The processor 204 executes the software programs and modules stored in the memory 202 to perform various function applications and data processing, that is, to realize the VR remote control method in the embodiments of the present application. The memory 202 may include, but is not limited to, random access memory (RAM), read-only memory (ROM), programmable read-only memory (PROM), erasable programmable read-only memory (EPROM), and electrically erasable programmable read-only memory (EEPROM).
The processor 204 may be an integrated circuit chip with signal processing capability. The above processor may be a general-purpose processor, including a central processing unit (CPU), a network processor (NP), and the like; it may also be a digital signal processor (DSP), an application-specific integrated circuit (ASIC), a field-programmable gate array (FPGA) or other programmable logic device, a discrete gate or transistor logic device, or discrete hardware components. It can implement or execute the methods, steps, and logic diagrams disclosed in the embodiments of the present application. The general-purpose processor may be a microprocessor, or the processor may be any conventional processor.
The network module 205 is used to receive and send network signals. The above network signals may include wireless signals or wired signals.
The electronic device 200 may also include a display module, which provides an interactive interface (such as a user interface) between the electronic device 200 and the user, or is used to display image data to the user.
First embodiment
Continuing to refer to Fig. 1, the VR remote control system provided by the present invention is applied to the VR system 100 and includes an image acquisition device, a VR device, and a control device.
The image acquisition device captures in real time and uploads real-time scene information to the VR device, the real-time scene information being used to characterize the real-time scene of the space in which the image acquisition device is located.
The VR device receives control information sent by a user, the control information being used to control an object, controlled by the control device, in the scene of the space in which the image acquisition device is located.
The VR device sends a control instruction to the control device, the control instruction being used to cause the control device to control, according to the control information sent by the user, the object in the scene of the space in which the image acquisition device is located.
The control device controls the object in the scene of the space in which the image acquisition device is located according to the control instruction sent by the VR device. The control device is arranged near the objects it needs to control; meanwhile, each object the control device needs to control is provided with a receiving module for receiving the control instruction sent by the control device.
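As an illustration of this first-embodiment control path (VR device → control device → receiving module on the object), the following Python sketch models the relay. All class names, the message strings, and the lamp example are assumptions introduced for illustration, not the patent's implementation.

```python
# Sketch of the first-embodiment control path: the VR device sends a control
# instruction to the control device, which relays it to the receiving module
# installed on the target object. All names here are illustrative assumptions.

class ReceivingModule:
    """Receiver installed on each controlled object (e.g. a lamp)."""
    def __init__(self, name: str):
        self.name = name
        self.state = "on"

    def receive(self, instruction: str) -> None:
        if instruction == "turn off":
            self.state = "off"
        elif instruction == "turn on":
            self.state = "on"

class ControlDevice:
    """Arranged near the objects it needs to control."""
    def __init__(self):
        self.receivers: dict[str, ReceivingModule] = {}

    def register(self, module: ReceivingModule) -> None:
        self.receivers[module.name] = module

    def control(self, target: str, instruction: str) -> None:
        # Relay the instruction to the target object's receiving module.
        self.receivers[target].receive(instruction)

class VRDevice:
    def __init__(self, control_device: ControlDevice):
        self.control_device = control_device

    def send_control_instruction(self, target: str, instruction: str) -> None:
        # Built from the control information entered by the user.
        self.control_device.control(target, instruction)

control = ControlDevice()
lamp = ReceivingModule("living-room lamp")
control.register(lamp)

vr = VRDevice(control)
vr.send_control_instruction("living-room lamp", "turn off")
```

After the call, the lamp's receiving module has applied the "turn off" instruction, mirroring the living-room-light scenario described in the second embodiment below.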
Second embodiment
Referring to Fig. 3, which is a flow chart of the steps of the VR remote control method provided by the second embodiment of the present invention. The VR remote control method provided by this embodiment is described in detail below with reference to Fig. 3.
Step S310: the VR device obtains and displays the real-time scene information captured and uploaded by the image acquisition device, the real-time scene information being used to characterize the real-time scene of the space in which the image acquisition device is located.
With the development of VR technology, VR devices are applied ever more widely. For example, real-estate companies use VR technology to improve the user's house-viewing experience: the user views a property through a VR device, changing the traditional pattern of viewing show-flat floor plans.
As an embodiment, the VR device includes 3D glasses and a VR scene display device, where the VR scene display device is a display terminal such as a computer or PC.
The image acquisition device is used to obtain the real-time scene information of the space in which it is located. The image acquisition device may be a camera, which captures in real time the picture of the scene of the space in which it is located and sends it to the VR device. The VR device displays the real-time scene information uploaded by the image acquisition device.
Step S320: receive the control information sent by the user, the control information being used to control an object, controlled by the control device, in the scene of the space in which the image acquisition device is located.
When the user needs to control some object in the real-time scene, the user first checks the state of that object in the real-time scene through the VR device. For example, the user forgets whether the light in the living room has been turned off but is now in the office; by checking the current light state of the living room through the VR device, the user finds that the living-room light is indeed not turned off and can send a control instruction to the VR device to turn off the light.
Step S330: send a control instruction to the control device, the control instruction being used to cause the control device to control, according to the control information sent by the user, the object in the scene of the space in which the image acquisition device is located.
After receiving the user's control instruction, the VR device needs to send the control instruction to the control device in the corresponding real-time scene, and the object that needs to be controlled is controlled by that control device.
As an embodiment, the VR system also includes a server, and the server stores in advance the scene information corresponding to the real-time scene information and the control device bound to that scene information. For example, one VR device can correspondingly display the rooms of three different users (user A, user B, and user C): the scene information X of user A's room, the scene information Y of user B's room, and the scene information Z of user C's room. The scene information X, Y, and Z is entered when the users register; it is bound to each user's account and also to the image acquisition device bound to that user account. For example, the real-time scene information later captured by the image acquisition device in user A's room is uploaded to the server; the server first finds the user account A bound to it in advance, retrieves the pre-stored scene information X from user account A, and associates the real-time scene information with the scene information X. When user A controls an object in the real-time scene, the VR device sends a control instruction to the server; the server looks up the scene information X corresponding to the real-time scene information, finds the control device corresponding to the scene information X, and sends the control instruction to the control device thus found; the corresponding object is then controlled by that control device.
For example, the user finds through the VR device that the living-room lamp is not turned off and clicks "turn off the light" in the VR device. The VR device sends the "turn off the light" instruction to the server; the server finds the corresponding control device and forwards the "turn off the light" instruction to it, and the control device controls the "lamp" to turn off.
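The account/scene/control-device bindings described above can be sketched as three lookup tables. The identifiers used here (camera_A, scene_X, control_A, and so on) are hypothetical names introduced for illustration; the patent does not prescribe this data layout.

```python
# Sketch of the server-side lookup described above: a real-time upload is
# traced from the image acquisition device to its bound user account, from
# the account to the pre-stored scene information, and from the scene
# information to the bound control device. All identifiers are assumptions.

# Pre-stored bindings, entered at user registration.
CAMERA_TO_ACCOUNT = {"camera_A": "user_A", "camera_B": "user_B", "camera_C": "user_C"}
ACCOUNT_TO_SCENE = {"user_A": "scene_X", "user_B": "scene_Y", "user_C": "scene_Z"}
SCENE_TO_CONTROL_DEVICE = {"scene_X": "control_A", "scene_Y": "control_B", "scene_Z": "control_C"}

def route_instruction(camera_id: str, instruction: str) -> tuple[str, str]:
    """Return (control_device, instruction) for a control instruction raised
    against the real-time scene uploaded by the given image acquisition device."""
    account = CAMERA_TO_ACCOUNT[camera_id]        # find the bound user account
    scene = ACCOUNT_TO_SCENE[account]             # retrieve pre-stored scene information
    device = SCENE_TO_CONTROL_DEVICE[scene]       # control device bound to the scene
    return device, instruction

# User A clicks "turn off the light" in the VR device; the server routes it.
device, instruction = route_instruction("camera_A", "turn off the light")
```

The routing is a pure chain of table lookups, which matches the description above: the server never interprets the instruction itself, it only forwards it to the control device bound to the matching scene information.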
With the VR remote control method provided by the embodiment of the present invention, the VR device receives and displays real-time scene information; the user sends control instructions to the VR device based on the real-time scene, and the objects in the real-time scene are controlled through the control device. By checking the real-time scene information, the user obtains the state of the objects in the real-time scene and controls them according to actual needs, which is more intelligent and convenient.
Third embodiment
Referring to Fig. 4, which is a flow chart of the steps of the VR remote control method provided by the third embodiment of the present invention, applied to the VR system shown in Fig. 1. The VR remote control method provided by the third embodiment includes:
Step S411: receive the original scene information obtained by the image acquisition device, the original scene information being used to characterize the scene of the space in which the image acquisition device is located.
As an embodiment, the VR device receives the original scene information obtained by the image acquisition device, the original scene information being the scene information of some space that the image acquisition device uploads to the VR device for the first time. The original scene information is used by the VR device for association when it later receives the real-time scene information of the same space.
As another embodiment, the VR system also includes a server, and the image acquisition device realizes the information transmission with the VR device through the server. The server receives the original scene information obtained by the image acquisition device; similarly, the original scene information is the scene information of a certain space that the image acquisition device uploads to the server for the first time, and is used by the server for association when it later receives the real-time scene information of the same space.
Step S412: partition the original scene information to obtain partitioned scene information, the partitioned scene information including multiple pieces of sub-scene information, each piece of sub-scene information being used to characterize the scene information of one block after the original scene information is partitioned.
When receiving the original scene information of a certain space, the server or the VR device partitions the scene information to obtain the partitioned scene information; the partitioned scene information is made up of multiple pieces of sub-scene information. As an embodiment, the original scene information is converted into a plane unfolded view, and the division is performed on the plane unfolded view; for example, the view is divided evenly along its transverse direction into five zones, in which case the partitioned scene information includes five pieces of sub-scene information.
The denser the block division, the higher the control accuracy, but the larger the amount of data to be processed and the higher the demand on the network; conversely, the sparser the block division, the lower the control accuracy, but the lower the network bandwidth requirement. The number of blocks can therefore be selected according to actual needs.
A piece of sub-scene information is selected from the partitioned scene information as the calibration sub-scene information.
Because the original scene information needs to be conveniently associated with the real-time scene information of the same space uploaded later, a piece of sub-scene information is selected from the partitioned scene information corresponding to the original scene information as the calibration sub-scene information. The calibration sub-scene information serves as a marker, facilitating the binding of the original scene information and the real-time scene information.
As an embodiment, the calibration sub-scene information may be selected so as to contain a relatively fixed object; for example, the position of a bed is relatively fixed, so the sub-scene information containing the bed is selected as the calibration sub-scene information.
Step S413: obtain the real-time scene information captured and uploaded by the image acquisition device.
After the original scene information of a certain space has been transmitted in advance to the server or the VR device, the server or the VR device stores it. In addition, the image acquisition device arranged in that space, whenever it is working, continuously uploads real-time scene information to the server or the VR device.
Step S414: match the scenes in the real-time scene information one by one with the scenes in the partitioned scene information.
When the user needs to check the real-time information of a certain space whose original scene information has been transmitted, it can be checked directly through the VR device. After receiving the user's check instruction, the VR device looks up the image acquisition device and the original scene information bound to that space, obtains the real-time scene image from the image acquisition device, associates the original scene information with the real-time scene information, and displays the real-time scene information on the VR device.
As an embodiment, the server or the VR device first looks up the calibration sub-scene information in the original scene, then looks in the real-time scene information for the object identical to that in the calibration sub-scene information, and matches the scenes in the real-time scene information one by one with the scenes in the partitioned scene information according to the calibration sub-scene information.
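The calibration-based matching of step S414 can be sketched by treating each sub-scene as a labelled block and aligning on the block that contains the fixed object (the bed, in the example above). The cyclic-shift model of the real-time view and all block labels are assumptions for illustration.

```python
# Sketch of step S414: locate the calibration sub-scene in the real-time
# blocks, then pair every original sub-scene with the real-time block that
# sits at the same offset from the calibration block (cyclic alignment).

def find_calibration_offset(realtime: list[str], calibration_block: str) -> int:
    """Index at which the calibration block appears among the real-time blocks."""
    return realtime.index(calibration_block)

def match_blocks(partitioned: list[str], realtime: list[str], calib_idx: int) -> dict[str, str]:
    """Pair each original sub-scene with a real-time block, aligned so the
    calibration sub-scene coincides in both sequences."""
    shift = find_calibration_offset(realtime, partitioned[calib_idx]) - calib_idx
    n = len(partitioned)
    return {partitioned[i]: realtime[(i + shift) % n] for i in range(n)}

# Original partitioned scene; the bed block (index 2) is the calibration block.
original = ["door", "window", "bed", "desk", "wardrobe"]
# The real-time unfolded view starts from a different heading (shifted by one).
realtime = ["wardrobe", "door", "window", "bed", "desk"]

matches = match_blocks(original, realtime, calib_idx=2)
```

Because the alignment is anchored on the calibration block, every original sub-scene ends up paired with the real-time block showing the same part of the room, even though the real-time view started at a different heading.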
Step S420: receive the control information sent by the user, the control information being used to control an object, controlled by the control device, in the scene of the space in which the image acquisition device is located.
Step S430: send a control instruction to the control device, the control instruction being used to cause the control device to control, according to the control information sent by the user, the object in the scene of the space in which the image acquisition device is located.
In the third embodiment, steps S420 and S430 are identical with steps S320 and S330 of the second embodiment; for their specific implementation, refer to the second embodiment, and they are not repeated here.
With the VR remote control method provided by this embodiment, the original scene information of a certain space is obtained, the calibration sub-scene information is selected within the original scene information, and the association with the real-time scene information is realized through the calibration sub-scene information. When the user needs to check the real-time scene information, it can be viewed directly, and the objects in the real-time scene information can be controlled, which is more convenient and intelligent.
Fourth embodiment
Referring to Fig. 5, which is a flow chart of the steps of the VR remote control method provided by the fourth embodiment of the present invention. The VR remote control method provided by the fourth embodiment is applied to the VR system shown in Fig. 1 and includes:
Step S510: the image acquisition device captures in real time and uploads real-time scene information to the VR device, the real-time scene information being used to characterize the real-time scene of the space in which the image acquisition device is located.
Step S520: the VR device receives the control information sent by the user, the control information being used to control an object, controlled by the control device, in the scene of the space in which the image acquisition device is located.
Step S530: the VR device sends a control instruction to the control device, the control instruction being used to cause the control device to control, according to the control information sent by the user, the object in the scene of the space in which the image acquisition device is located.
Step S540: the control device controls the object in the scene of the space in which the image acquisition device is located according to the control instruction sent by the VR device.
For the specific implementation process of the VR remote control method 500 provided by this embodiment, refer to the second and third embodiments; it is not repeated here.
Fifth embodiment
Fig. 6 is referred to, the structured flowchart of the VR remote controls 600 provided for fifth embodiment of the invention.The VR
Remote control 600 includes:
First acquisition unit 610, equipment captured in real-time, the real-time scene information uploaded are obtained simultaneously for obtaining described image
It has been shown that, the real-time scene information is used for the real-time scene for characterizing space where described image obtains equipment.
As a kind of embodiment, the first acquisition unit 610 includes:
Second receiving unit 611, the original scene information that equipment is obtained, the primary field are obtained for receiving described image
The scene in space of the scape information where for characterizing described image acquisition equipment.
Zoning unit 612, for carrying out subregion to the original scene information, obtains partition context information, the subregion
Scene information includes multiple sub-scene information, and the sub-scene information is used to characterize to be carried out after subregion to the original scene information
The scene information of each block.
As a kind of embodiment, the zoning unit 612 includes:
Unit is chosen, for selecting a sub- scene information from the partition context information as demarcation sub-scene letter
Breath.
Second acquisition unit 613, equipment captured in real-time and the real-time scene information uploaded are obtained for obtaining described image.
Matching unit 614, for by the scene in the scene in the real-time scene information and the partition context information
Match one by one.
As a kind of embodiment, the matching unit 614 includes:
Searching unit, for searching the demarcation sub-scene information.
Coupling subelement, for according to it is described demarcation sub-scene information by the scene in the real-time scene information with it is described
Scene in partition context information is matched one by one.
First receiving unit 620, the control information for receiving user's transmission, the control information is used to control the figure
The object that controlled device is controlled in the scene in the space as where obtaining equipment.
Transmitting element 630, for sending control instruction to the control device, the control instruction is used to make the control
Object in the scene in space of the device according to where the control information control described image that the user sends obtains equipment.
For the specific implementation of the VR remote control device 600 provided in this embodiment of the present invention, reference may be made to the method embodiments above; details are not repeated here.
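The partition-and-match pipeline performed by units 611 to 614 can be illustrated with a short sketch. This is a minimal illustration under stated assumptions, not the patented implementation: the grid partitioning, the choice of block (0, 0) as the calibration sub-scene, and the mean-absolute-difference similarity measure are all assumptions introduced for clarity.

```python
# Minimal sketch of the partition/calibrate/match pipeline described
# above (units 611-614). The grid partitioning and the mean-absolute-
# difference similarity are illustrative assumptions, not the patented
# implementation.

def partition(frame, rows, cols):
    """Split a 2-D frame (list of lists of pixel values) into a grid of
    sub-scene blocks, mirroring the role of zoning unit 612."""
    h, w = len(frame), len(frame[0])
    bh, bw = h // rows, w // cols
    blocks = {}
    for r in range(rows):
        for c in range(cols):
            blocks[(r, c)] = [row[c * bw:(c + 1) * bw]
                              for row in frame[r * bh:(r + 1) * bh]]
    return blocks

def mean_abs_diff(a, b):
    """Average absolute pixel difference between two equally sized blocks."""
    total = sum(abs(x - y) for ra, rb in zip(a, b) for x, y in zip(ra, rb))
    return total / (len(a) * len(a[0]))

def match(realtime_blocks, partitioned_blocks, calibration_key):
    """Match the real-time blocks one by one against the partitioned
    original-scene blocks, starting from the calibration sub-scene
    (the role of matching unit 614 and its subunits)."""
    keys = [calibration_key] + [k for k in partitioned_blocks
                                if k != calibration_key]
    return {k: mean_abs_diff(realtime_blocks[k], partitioned_blocks[k])
            for k in keys}

# Example: a static original scene and a real-time frame in which one
# block (bottom-right) has changed.
original = [[10] * 4 for _ in range(4)]
realtime = [row[:] for row in original]
realtime[3][3] = 50                      # an object changed state

orig_blocks = partition(original, 2, 2)
rt_blocks = partition(realtime, 2, 2)
scores = match(rt_blocks, orig_blocks, calibration_key=(0, 0))
changed = [k for k, s in scores.items() if s > 0]
print(changed)                           # → [(1, 1)]
```

In this sketch the per-block score localizes a state change to one sub-scene, which is the practical point of partitioning: only the blocks that differ from the original scene need the user's attention.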
In summary, in the VR remote control method, apparatus, and system provided in the embodiments of the present invention, the VR device receives and displays real-time scene information; the user sends a control instruction to the VR device according to the real-time scene, and the object in the real-time scene is controlled through the control device. By viewing the real-time scene information, the user learns the state of the objects in it and controls them according to actual needs, which is more intelligent and convenient.
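The end-to-end flow just summarized (capture → display → user control information → control instruction → controlled object) can be sketched as a simple message relay among the three devices of the system. All class and method names below are hypothetical illustrations of the roles described, not an actual API.

```python
# Hypothetical sketch of the capture -> display -> control relay among
# the image acquisition device, the VR device, and the control device.
# Names and message formats are illustrative assumptions only.

class ImageAcquisitionDevice:
    def capture(self):
        # Stand-in for a camera frame plus scene metadata.
        return {"scene": "living room", "objects": {"lamp": "off"}}

class ControlDevice:
    def __init__(self, scene_objects):
        self.scene_objects = scene_objects

    def execute(self, instruction):
        # Apply the control instruction to the named object in the scene.
        obj, state = instruction["object"], instruction["state"]
        self.scene_objects[obj] = state
        return self.scene_objects

class VRDevice:
    def __init__(self, camera, controller):
        self.camera = camera
        self.controller = controller

    def show_realtime_scene(self):
        # The VR device obtains and displays the real-time scene.
        self.scene = self.camera.capture()
        return self.scene

    def on_control_info(self, control_info):
        # Translate the user's control information into a control
        # instruction and forward it to the control device.
        instruction = {"object": control_info["target"],
                       "state": control_info["desired_state"]}
        return self.controller.execute(instruction)

camera = ImageAcquisitionDevice()
scene = camera.capture()
controller = ControlDevice(scene["objects"])
vr = VRDevice(camera, controller)

vr.show_realtime_scene()
result = vr.on_control_info({"target": "lamp", "desired_state": "on"})
print(result)   # → {'lamp': 'on'}
```

The point of the relay is that the VR device never touches the controlled object directly: it only displays the scene and forwards a control instruction, while the control device alone acts on the object.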
It should be noted that the embodiments in this specification are described in a progressive manner: each embodiment focuses on its differences from the others, and for identical or similar parts the embodiments may be cross-referenced. Since the device embodiments are substantially similar to the method embodiments, their description is relatively brief; for related details, refer to the corresponding description of the method embodiments.
It should be understood that the apparatus and methods disclosed in the embodiments provided herein may also be implemented in other ways. The device embodiments described above are merely illustrative. For example, the flowcharts and block diagrams in the accompanying drawings show the possible architectures, functions, and operations of the devices, methods, and computer program products according to multiple embodiments of the present invention. Each block in a flowchart or block diagram may represent a module, a program segment, or a portion of code, which contains one or more executable instructions for implementing the specified logical function. It should also be noted that, in some alternative implementations, the functions marked in the blocks may occur in an order different from that marked in the drawings. For example, two consecutive blocks may in fact be executed substantially in parallel, or sometimes in the reverse order, depending on the functions involved. It should further be noted that each block in the block diagrams and/or flowcharts, and combinations of blocks therein, may be implemented by a dedicated hardware-based system that performs the specified functions or actions, or by a combination of dedicated hardware and computer instructions.
In addition, the functional modules in the embodiments of the present invention may be integrated to form an independent part, each module may exist separately, or two or more modules may be integrated to form an independent part.
If the functions are implemented in the form of software functional modules and sold or used as independent products, they may be stored in a computer-readable storage medium. Based on this understanding, the technical solution of the present invention, in essence, or the part contributing to the prior art, or a part of the technical solution, may be embodied in the form of a software product. The computer software product is stored in a storage medium and includes instructions for causing a computer device (which may be a personal computer, a server, a network device, or the like) to perform all or some of the steps of the methods described in the embodiments of the present invention. The foregoing storage medium includes any medium that can store program code, such as a USB flash disk, a removable hard disk, a read-only memory (ROM), a random access memory (RAM), a magnetic disk, or an optical disc.
It should be noted that, in this document, relational terms such as "first" and "second" are used only to distinguish one entity or operation from another, and do not necessarily require or imply any such actual relationship or order between these entities or operations. Moreover, the terms "comprise", "include", or any other variant thereof are intended to cover a non-exclusive inclusion, so that a process, method, article, or device that includes a series of elements includes not only those elements but also other elements not expressly listed, or elements inherent to such a process, method, article, or device. In the absence of further limitations, an element defined by the phrase "including a ..." does not exclude the presence of other identical elements in the process, method, article, or device that includes the element.
The foregoing descriptions are merely preferred embodiments of the present invention and are not intended to limit it; for those skilled in the art, the present invention may have various modifications and variations. Any modification, equivalent replacement, or improvement made within the spirit and principles of the present invention shall fall within the scope of protection of the present invention. It should be noted that similar reference numerals and letters denote similar items in the accompanying drawings; therefore, once an item is defined in one drawing, it need not be further defined or explained in subsequent drawings.
The foregoing is merely a specific embodiment of the present invention, but the scope of protection of the present invention is not limited thereto. Any change or replacement readily conceivable by a person skilled in the art within the technical scope disclosed by the present invention shall fall within the scope of protection of the present invention. Therefore, the scope of protection of the present invention shall be defined by the claims.
Claims (10)
1. A VR remote control method, applied to a VR system, the VR system comprising an image acquisition device, a VR device, and a control device, the method comprising:
acquiring, by the VR device, real-time scene information captured and uploaded by the image acquisition device in real time, and displaying it, where the real-time scene information characterizes the real-time scene of the space in which the image acquisition device is located;
receiving control information sent by a user, where the control information is used to control an object controlled by a controlled device in the scene of the space in which the image acquisition device is located; and
sending a control instruction to the control device, where the control instruction causes the control device to control, according to the control information sent by the user, the object in the scene of the space in which the image acquisition device is located.
2. The method according to claim 1, wherein the step of acquiring, by the VR device, the real-time scene information captured and uploaded by the image acquisition device in real time comprises:
receiving original scene information acquired by the image acquisition device, where the original scene information characterizes the scene of the space in which the image acquisition device is located;
partitioning the original scene information to obtain partitioned scene information, where the partitioned scene information includes multiple pieces of sub-scene information, each characterizing the scene information of one block obtained by the partitioning;
acquiring the real-time scene information captured and uploaded by the image acquisition device in real time; and
matching the scenes in the real-time scene information one by one against the scenes in the partitioned scene information.
3. The method according to claim 2, wherein after the step of partitioning the original scene information to obtain the partitioned scene information including the multiple pieces of sub-scene information, the method further comprises:
selecting one piece of sub-scene information from the partitioned scene information as calibration sub-scene information.
4. The method according to claim 3, wherein the step of matching the scenes in the real-time scene information one by one against the scenes in the partitioned scene information comprises:
looking up the calibration sub-scene information; and
matching, according to the calibration sub-scene information, the scenes in the real-time scene information one by one against the scenes in the partitioned scene information.
5. A VR remote control method, applied to a VR system, the VR system comprising an image acquisition device, a VR device, and a control device, the method comprising:
capturing, by the image acquisition device, real-time scene information and uploading it to the VR device, where the real-time scene information characterizes the real-time scene of the space in which the image acquisition device is located;
receiving, by the VR device, control information sent by a user, where the control information is used to control an object controlled by a controlled device in the scene of the space in which the image acquisition device is located;
sending, by the VR device, a control instruction to the control device, where the control instruction causes the control device to control, according to the control information sent by the user, the object in the scene of the space in which the image acquisition device is located; and
controlling, by the control device according to the control instruction sent by the VR device, the object in the scene of the space in which the image acquisition device is located.
6. A VR remote control device, applied to a VR system, the VR system comprising an image acquisition device, a VR device, and a control device, the VR remote control device comprising:
a first acquisition unit configured to acquire and display the real-time scene information captured and uploaded by the image acquisition device in real time, where the real-time scene information characterizes the real-time scene of the space in which the image acquisition device is located;
a first receiving unit configured to receive control information sent by a user, where the control information is used to control an object controlled by a controlled device in the scene of the space in which the image acquisition device is located; and
a transmitting unit configured to send a control instruction to the control device, where the control instruction causes the control device to control, according to the control information sent by the user, the object in the scene of the space in which the image acquisition device is located.
7. The device according to claim 6, wherein the first acquisition unit comprises:
a second receiving unit configured to receive original scene information acquired by the image acquisition device, where the original scene information characterizes the scene of the space in which the image acquisition device is located;
a zoning unit configured to partition the original scene information to obtain partitioned scene information, where the partitioned scene information includes multiple pieces of sub-scene information, each characterizing the scene information of one block obtained by the partitioning;
a second acquisition unit configured to acquire the real-time scene information captured and uploaded by the image acquisition device in real time; and
a matching unit configured to match the scenes in the real-time scene information one by one against the scenes in the partitioned scene information.
8. The device according to claim 7, wherein the zoning unit comprises:
a selecting unit configured to select one piece of sub-scene information from the partitioned scene information as calibration sub-scene information.
9. The device according to claim 8, wherein the matching unit comprises:
a searching subunit configured to look up the calibration sub-scene information; and
a matching subunit configured to match, according to the calibration sub-scene information, the scenes in the real-time scene information one by one against the scenes in the partitioned scene information.
10. A VR remote control system, comprising an image acquisition device, a VR device, and a control device, wherein:
the image acquisition device captures real-time scene information and uploads it to the VR device, where the real-time scene information characterizes the real-time scene of the space in which the image acquisition device is located;
the VR device receives control information sent by a user, where the control information is used to control an object controlled by a controlled device in the scene of the space in which the image acquisition device is located;
the VR device sends a control instruction to the control device, where the control instruction causes the control device to control, according to the control information sent by the user, the object in the scene of the space in which the image acquisition device is located; and
the control device controls, according to the control instruction sent by the VR device, the object in the scene of the space in which the image acquisition device is located.
Priority Applications (1)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
CN201710450766.3A CN107105065A (en) | 2017-06-15 | 2017-06-15 | VR long-range control methods, apparatus and system |
Publications (1)
Publication Number | Publication Date |
---|---|
CN107105065A | 2017-08-29 |
Family
ID=59660988
Family Applications (1)
Application Number | Title | Priority Date | Filing Date |
---|---|---|---|
CN201710450766.3A Pending CN107105065A (en) | 2017-06-15 | 2017-06-15 | VR long-range control methods, apparatus and system |
Country Status (1)
Country | Link |
---|---|
CN (1) | CN107105065A (en) |
Cited By (1)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
CN112099529A (en) * | 2020-09-22 | 2020-12-18 | 苏州臻迪智能科技有限公司 | Virtual reality equipment control system and method |
Citations (5)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
US9460351B2 (en) * | 2012-11-28 | 2016-10-04 | Electronics And Telecommunications Research Institute | Image processing apparatus and method using smart glass |
CN106126022A (en) * | 2016-06-20 | 2016-11-16 | 美的集团股份有限公司 | Intelligent home furnishing control method based on virtual reality and device |
CN106445156A (en) * | 2016-09-29 | 2017-02-22 | 宇龙计算机通信科技(深圳)有限公司 | Method, device and terminal for intelligent home device control based on virtual reality |
CN106648297A (en) * | 2016-10-09 | 2017-05-10 | 广州艾想电子科技有限公司 | Intelligent device control method and device based on VR device |
CN106713082A (en) * | 2016-11-16 | 2017-05-24 | 惠州Tcl移动通信有限公司 | Virtual reality method for intelligent home management |
Legal Events
Code | Title | Description |
---|---|---|
PB01 | Publication | |
SE01 | Entry into force of request for substantive examination | |
RJ01 | Rejection of invention patent application after publication | Application publication date: 20170829 |