CN106200944A - Object control method, control device and control system - Google Patents

Object control method, control device and control system Download PDF

Info

Publication number
CN106200944A
Authority
CN
China
Prior art keywords
electronic equipment
control instruction
control
virtual
virtual objects
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Pending
Application number
CN201610513676.XA
Other languages
Chinese (zh)
Inventor
许奔
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Lenovo Beijing Ltd
Original Assignee
Lenovo Beijing Ltd
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Lenovo Beijing Ltd filed Critical Lenovo Beijing Ltd
Priority to CN201610513676.XA priority Critical patent/CN106200944A/en
Publication of CN106200944A publication Critical patent/CN106200944A/en
Pending legal-status Critical Current

Classifications

    • G - PHYSICS
    • G06 - COMPUTING; CALCULATING OR COUNTING
    • G06F - ELECTRIC DIGITAL DATA PROCESSING
    • G06F 3/00 - Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
    • G06F 3/01 - Input arrangements or combined input and output arrangements for interaction between user and computer
    • G06F 3/011 - Arrangements for interaction with the human body, e.g. for user immersion in virtual reality
    • G - PHYSICS
    • G06 - COMPUTING; CALCULATING OR COUNTING
    • G06F - ELECTRIC DIGITAL DATA PROCESSING
    • G06F 3/00 - Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
    • G06F 3/01 - Input arrangements or combined input and output arrangements for interaction between user and computer
    • G06F 3/017 - Gesture based interaction, e.g. based on a set of recognized hand gestures

Landscapes

  • Engineering & Computer Science (AREA)
  • General Engineering & Computer Science (AREA)
  • Theoretical Computer Science (AREA)
  • Human Computer Interaction (AREA)
  • Physics & Mathematics (AREA)
  • General Physics & Mathematics (AREA)
  • User Interface Of Digital Computer (AREA)

Abstract

The invention provides an object control method, a control device and a control system. The method includes: detecting a first operation on a first virtual object in a virtual scene, the virtual scene being created based on a first electronic device; generating a first control instruction based on the first operation; and responding to the first control instruction by controlling the first virtual object to perform a corresponding operation and sending the first control instruction to at least one second electronic device, so that the second electronic device controls, according to the first control instruction, a first display object of a display interface to perform the corresponding operation, wherein the display interface corresponds to the virtual scene and the first display object is associated with the first virtual object. On this basis, the first electronic device and the second electronic device can collaboratively control an object in the scene.

Description

Object control method, control device and control system
Technical field
The present invention relates to the field of control technology, and more particularly to an object control method, a control device and a control system.
Background art
AR (Augmented Reality) is a technology that combines real-world information with virtual-world information: entity information that is otherwise difficult to perceive within a certain time and space range of the real world, such as visual, auditory, taste and tactile information, is simulated by computer science and technology and then superimposed onto the real world to be perceived by human senses, thereby providing a sensory experience that goes beyond reality.
An existing AR device, for example a head-mounted AR device, enables a user to see virtual objects projected onto the helmet glasses by a projection unit while directly seeing real objects. Further, such an AR device can capture the user's gestures through an image acquisition unit and generate control instructions according to the gestures to control the virtual objects. However, existing AR devices cannot perform collaborative control of an object together with other electronic devices.
Summary of the invention
In view of this, the present invention provides an object control method, a control device and a control system, to solve the problem in the prior art that an AR device cannot perform collaborative control of an object together with other electronic devices.
To achieve the above object, the present invention provides the following technical solutions:
An object control method, including:
detecting a first operation on a first virtual object in a virtual scene, wherein the virtual scene is created based on a first electronic device;
generating a first control instruction based on the first operation;
responding to the first control instruction by controlling the first virtual object to perform a corresponding operation, and sending the first control instruction to at least one second electronic device, so that the second electronic device controls, according to the first control instruction, a first display object of a display interface to perform the corresponding operation, wherein the display interface corresponds to the virtual scene and the first display object is associated with the first virtual object.
Preferably, the method further includes:
obtaining a second control instruction from the second electronic device, wherein the second control instruction corresponds to a second operation, received by the second electronic device, on the first display object of the display interface;
responding to the second control instruction by controlling the first virtual object in the virtual scene to perform a corresponding operation.
Preferably, the method further includes:
obtaining a second control instruction from the second electronic device, wherein the second control instruction corresponds to a second operation, received by the second electronic device, on the first display object;
generating a third control instruction based on the first control instruction and the second control instruction;
responding to the third control instruction by controlling the first virtual object to perform a corresponding operation.
Preferably, after controlling the first virtual object to perform the corresponding operation based on the third control instruction, the method further includes:
sending the third control instruction to the second electronic device, so that the second electronic device controls, according to the third control instruction, the first display object to perform the corresponding operation.
Preferably, the display dimension of the first virtual object is N and the display dimension of the first display object is M, with M less than or equal to N.
Preferably, detecting the first operation on the first virtual object in the virtual scene includes:
detecting a gesture of a user of the first electronic device, and mapping the gesture to the first operation on the first virtual object in the virtual scene.
Preferably, detecting the first operation on the first virtual object in the virtual scene includes:
detecting a gesture of a user of the first electronic device, the gesture comprising first coordinate information and a motion track of the user's hand in physical space;
mapping the first coordinate information and the motion track in physical space to second coordinate information and an operation track in a virtual space corresponding to the virtual scene;
obtaining, based on the second coordinate information and the operation track in the virtual space, the first operation on the first virtual object in the virtual scene.
A control device, including a detection device, a processor and a communication device;
the detection device is configured to detect a first operation on a first virtual object in a virtual scene, wherein the virtual scene is created based on a first electronic device;
the processor is configured to generate a first control instruction based on the first operation, respond to the first control instruction by controlling the first virtual object to perform a corresponding operation, and control the communication device to send the first control instruction to a second electronic device, so as to control a first display object in a display interface of the second electronic device to perform the corresponding operation, wherein the display interface corresponds to the virtual scene and the first display object is associated with the first virtual object.
Preferably, the communication device is further configured to obtain a second control instruction from the second electronic device and send the second control instruction to the processor, wherein the second control instruction corresponds to a second operation, received by the second electronic device, on the first display object of the display interface;
the processor is further configured to respond to the second control instruction by controlling the first virtual object in the virtual scene to perform a corresponding operation.
Preferably, the communication device is further configured to obtain a second control instruction from the second electronic device and send the second control instruction to the processor, wherein the second control instruction corresponds to a second operation, received by the second electronic device, on the first display object of the display interface;
the processor is further configured to generate a third control instruction based on the first control instruction and the second control instruction, and respond to the third control instruction by controlling the first virtual object to perform a corresponding operation.
Preferably, the processor is further configured to control the communication device to send the third control instruction to the second electronic device, so as to control the first display object of the second electronic device to perform the corresponding operation.
Preferably, the display dimension of the first virtual object is N and the display dimension of the first display object is M, with M less than or equal to N.
Preferably, the detection device includes an induction device and a processing device;
the induction device is configured to detect a gesture of a user of the first electronic device;
the processing device is configured to map the gesture to the first operation on the first virtual object in the virtual scene.
Preferably, the detection device includes an induction device and a processing device;
the induction device is configured to detect a gesture of a user of the first electronic device, the gesture comprising first coordinate information and a motion track of the user's hand in physical space;
the processing device is configured to map the first coordinate information and the motion track in physical space to second coordinate information and an operation track in a virtual space corresponding to the virtual scene, and to obtain, based on the second coordinate information and the operation track in the virtual space, the first operation on the first virtual object in the virtual scene.
A control system, including:
a detection unit, configured to detect a first operation on a first virtual object in a virtual scene, the virtual scene being created based on a first electronic device;
a processing unit, configured to generate a first control instruction based on the first operation, and respond to the first control instruction by controlling the first virtual object to perform a corresponding operation;
a communication unit, configured to send the first control instruction to at least one second electronic device, so as to control a first display object in a display interface of the second electronic device to perform the corresponding operation, wherein the display interface corresponds to the virtual scene and the first display object is associated with the first virtual object.
Compared with the prior art, the technical solution provided by the present invention has the following advantage:
In the object control method, control device and control system provided by the present invention, a first control instruction is generated according to a first operation of a user of a first electronic device (for example an AR device) to control a first virtual object in a virtual scene to perform a corresponding operation, and at the same time the first control instruction is sent to a second electronic device (for example a smartphone) to control a first display object in a display interface of the second electronic device to perform the corresponding operation. Since the display interface corresponds to the virtual scene and the first display object is associated with the first virtual object, the first electronic device and the second electronic device can collaboratively control an object in the scene.
Brief description of the drawings
In order to explain the embodiments of the present invention or the technical solutions in the prior art more clearly, the drawings needed in the description of the embodiments or the prior art are briefly introduced below. Obviously, the drawings described below are only embodiments of the present invention, and those of ordinary skill in the art can obtain other drawings from the provided drawings without creative work.
Fig. 1 is a flowchart of an object control method provided by an embodiment of the present invention;
Fig. 2 is a flowchart of another object control method provided by an embodiment of the present invention;
Fig. 3 is a flowchart of yet another object control method provided by an embodiment of the present invention;
Fig. 4 is a flowchart of still another object control method provided by an embodiment of the present invention;
Fig. 5 is a schematic structural diagram of a control device provided by an embodiment of the present invention.
Detailed description of the invention
The technical solutions in the embodiments of the present invention are described below clearly and completely with reference to the drawings in the embodiments. Obviously, the described embodiments are only some rather than all of the embodiments of the present invention. Based on the embodiments of the present invention, all other embodiments obtained by those of ordinary skill in the art without creative work fall within the protection scope of the present invention.
An embodiment of the present invention provides an object control method. The control method in this embodiment is implemented by a control device that includes a detection device, a processor and a communication device, and the control device is preferably arranged in a first electronic device.
The flowchart of the control method provided by this embodiment is shown in Fig. 1, and the method includes:
S101: detecting a first operation on a first virtual object in a virtual scene;
S102: generating a first control instruction based on the first operation;
S103: responding to the first control instruction by controlling the first virtual object to perform a corresponding operation, and sending the first control instruction to at least one second electronic device, so that the second electronic device controls, according to the first control instruction, a first display object of a display interface to perform the corresponding operation.
The virtual scene is created based on the first electronic device. In this embodiment, the first electronic device may be an AR device, that is, an augmented reality device, or a VR (Virtual Reality) device; of course, the present invention is not limited thereto.
It should be noted that the difference between a VR device and an AR device is that a VR device user sees only the virtual scene through the VR device, whereas an AR device user can see not only the virtual scene through the AR device but also the real scene of the real environment; that is, the AR device user sees a scene in which the virtual scene and the real scene are combined.
In this embodiment, the second electronic device may be an electronic device such as a smartphone, a tablet computer or a notebook computer, where the smartphone refers to a smartphone with a touch screen. Of course, in other embodiments the second electronic device may also be an AR device or a VR device; that is, the first electronic device and the second electronic device in the present invention may both be AR devices, may both be VR devices, or one may be an AR device and the other a VR device, and the present invention is not limited thereto.
After the first electronic device and the second electronic device establish a communication connection, the first electronic device can send its virtual scene information to the second electronic device. After receiving the virtual scene information sent by the first electronic device, the second electronic device can show the display interface corresponding to the virtual scene, and the first display object in the display interface is associated with the first virtual object in the virtual scene.
It should be noted that the display interface shown by the second electronic device and the virtual scene shown by the first electronic device are the same scene, and the first display object in the display interface and the first virtual object in the virtual scene are the same object; their display dimensions may be the same or different. Optionally, the display dimension of the first virtual object is N and the display dimension of the first display object is M, with M less than or equal to N. In a specific implementation, the second electronic device shows a two-dimensional picture of the scene while the first electronic device shows a three-dimensional picture of the same scene; on this basis, the display dimension of the first display object is two and the display dimension of the first virtual object is three.
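As an illustration only and not part of the original disclosure, the following minimal Python sketch shows one way the N-dimensional first virtual object could be mapped to an M-dimensional first display object; the class name, function name and the orthographic projection are assumptions made for the example, since the embodiment only requires that M be less than or equal to N.

```python
from dataclasses import dataclass


@dataclass
class VirtualObjectState:
    """3-D (N = 3) state of the first virtual object in the virtual scene."""
    x: float
    y: float
    z: float


def project_to_display(state: VirtualObjectState) -> tuple:
    """Map the N-dimensional first virtual object to the M-dimensional
    (here M = 2) first display object shown by the second electronic device.
    A plain orthographic projection (dropping z) is used purely for
    illustration; no particular mapping is prescribed by the embodiment."""
    return (state.x, state.y)


# Example: the 3-D cup on the desk appears at (x, y) on the smartphone screen.
cup = VirtualObjectState(x=0.4, y=1.1, z=0.8)
print(project_to_display(cup))  # -> (0.4, 1.1)
```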
In the following, the first electronic device is an AR device and the second electronic device is a smartphone by way of example.
When a user watches the virtual scene through the first electronic device, the detection device can detect the user's gesture and, after detecting the gesture of the user of the first electronic device, map the gesture to the first operation on the first virtual object in the virtual scene.
Specifically, an induction device in the detection device detects the gesture of the user of the first electronic device, the gesture comprising first coordinate information and a motion track of the user's hand in physical space. A processing device in the detection device can map the first coordinate information and the motion track in physical space to second coordinate information and an operation track in the virtual space corresponding to the virtual scene, and obtain, based on the second coordinate information and the operation track in the virtual space, the first operation on the first virtual object in the virtual scene. If the operation track intersects the current coordinates of the first virtual object, the user's gesture can be determined to be the first operation on the first virtual object; if there is no intersection, the gesture is not an operation on the first virtual object.
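Purely as an illustrative sketch, the mapping from physical space to virtual space and the intersection test described above could look like the following; the scale-and-offset transform, the bounding-sphere test and the function names are assumptions, since the embodiment does not prescribe a particular algorithm.

```python
import numpy as np


def physical_to_virtual(points_phys: np.ndarray,
                        scale: float,
                        offset: np.ndarray) -> np.ndarray:
    """Map hand coordinates / motion track sampled in physical space to the
    virtual space of the scene. A simple scale-plus-offset transform is
    assumed here; a real device would use its own calibration."""
    return points_phys * scale + offset


def targets_virtual_object(operation_track: np.ndarray,
                           object_center: np.ndarray,
                           object_radius: float) -> bool:
    """Return True if the operation track intersects the first virtual
    object's current coordinates (approximated by a bounding sphere)."""
    distances = np.linalg.norm(operation_track - object_center, axis=1)
    return bool((distances <= object_radius).any())


# Example: hand track in metres -> virtual-space track, then intersection test.
hand_track = np.array([[0.10, 0.95, 0.30], [0.12, 1.00, 0.32]])
track_virtual = physical_to_virtual(hand_track, scale=1.0,
                                    offset=np.array([0.0, 0.0, 0.0]))
cup_center, cup_radius = np.array([0.12, 1.00, 0.31]), 0.05
print(targets_virtual_object(track_virtual, cup_center, cup_radius))  # True
```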
After the detection device detects the user's first operation on the first virtual object in the virtual scene, the processor can generate the first control instruction based on the first operation, respond to the first control instruction by controlling the first virtual object to perform the corresponding operation, and control the display device of the first electronic device to show the three-dimensional image of the first virtual object after the corresponding operation is performed. Further, after controlling the first virtual object to perform the corresponding operation, the processor can send the first control instruction to the second electronic device, so that the second electronic device controls, according to the first control instruction, the first display object of the display interface to perform the corresponding operation and shows the two-dimensional image of the first display object after the corresponding operation is performed.
For example, a user sees, through the first electronic device, a virtual scene in which a three-dimensional cup is placed on a desk, and another user sees, through the second electronic device, a display interface in which a two-dimensional cup is likewise placed on a desk; the three-dimensional cup is the first virtual object in the virtual scene, and the two-dimensional cup is the first display object in the display interface.
If the user of the first electronic device wants to pick up the three-dimensional cup on the desk, his hand may take the shape of gripping the cup and move upward. After the detection device captures this gesture, it can map the gesture to the first operation of moving the cup upward and generate the first control instruction to control the cup to move upward, so that the user can see the result of successfully picking the cup up from the desk. That is, the processor can generate the first control instruction according to the first operation, control the cup to perform the upward movement according to the first control instruction, and control the display device of the first electronic device to show the three-dimensional picture of the cup after the upward movement, so that the user sees the three-dimensional picture of the cup being picked up into the hand.
Afterwards, the processor can send the first control instruction to the second electronic device, so that the second electronic device controls, according to the first control instruction, the cup in the display interface to perform the upward movement and shows the two-dimensional picture of the cup after the movement.
It can be seen that, with the technical solution of this embodiment, while the user of the first electronic device watches the virtual scene through the first electronic device, the user of the second electronic device can learn the information of the virtual scene by watching the display interface of the second electronic device; and while the first electronic device controls the first virtual object in the virtual scene to perform the corresponding operation, it can control the first display object of the display interface of the second electronic device to perform the corresponding operation, thereby realizing collaborative control of the object by the first electronic device and the second electronic device.
On the basis of any of the above embodiments, as shown in Fig. 2, the object control method provided by another embodiment of the present invention further includes:
S201: obtaining a second control instruction from the second electronic device, wherein the second control instruction corresponds to a second operation, received by the second electronic device, on the first display object of the display interface;
S202: responding to the second control instruction by controlling the first virtual object in the virtual scene to perform a corresponding operation.
Still taking the first electronic device being an AR device and the second electronic device being a smartphone as an example, after the user of the second electronic device inputs, through the touch interface of the second electronic device, the second operation on the first display object in the display interface, the second electronic device can generate the second control instruction according to the second operation, respond to the second control instruction by controlling the first display object to perform the corresponding operation, show the two-dimensional image of the first display object after the corresponding operation, and send the second control instruction to the processor. After obtaining the second control instruction from the second electronic device through the communication device, the processor can respond to the second control instruction by controlling the first virtual object to perform the corresponding operation and controlling the display device of the first electronic device to show the three-dimensional image of the first virtual object after the corresponding operation.
For example, after the smartphone user moves the cup in the smartphone display interface upward through the touch screen, the smartphone can send the control instruction of moving the cup upward to the processor and show the two-dimensional picture of the cup after the upward movement. The processor can then control, according to this control instruction, the cup in the virtual scene to perform the upward movement and control the display device of the first electronic device to show the three-dimensional picture of the cup after the movement.
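Correspondingly, a minimal sketch of how the first electronic device might handle an incoming second control instruction, reusing the hypothetical Scene class and instruction format from the earlier sketch; render_3d is likewise an assumed callback, not something named by the disclosure.

```python
import json


def on_second_control_instruction(scene, render_3d, payload: str) -> None:
    """Apply a second control instruction received from the second electronic
    device to the first virtual object in the virtual scene, then refresh the
    first electronic device's 3-D display."""
    instruction = json.loads(payload)
    scene.apply(instruction)   # the virtual cup mirrors the touch-screen input
    render_3d(scene)           # show the updated three-dimensional picture
```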
On this basis, the user of the second electronic device can not only learn what operation the user of the first electronic device performs on the first virtual object in the virtual scene, but can also, by operating the first display object in the display interface, control the first virtual object in the virtual scene to perform the same operation, thereby realizing interaction between the user of the second electronic device and the user of the first electronic device and enhancing the experience of the user of the first electronic device.
On the basis of any of the above embodiments, as shown in Fig. 3, the object control method provided by a further embodiment of the present invention further includes:
S301: obtaining a second control instruction from the second electronic device, wherein the second control instruction corresponds to a second operation, received by the second electronic device, on the first display object;
S302: generating a third control instruction based on the first control instruction and the second control instruction;
S303: responding to the third control instruction by controlling the first virtual object to perform a corresponding operation.
Likewise, the second electronic device sends the second control instruction to the processor. The processor can generate the third control instruction according to the second control instruction and the first control instruction generated from the first operation on the first virtual object, and respond to the third control instruction by controlling the first virtual object to perform the corresponding operation. That is, the user of the first electronic device and the user of the second electronic device can control the first virtual object at the same time.
In another embodiment, as shown in Fig. 4, the method further includes, after step S303:
S304: sending the third control instruction to the second electronic device, so that the second electronic device controls, according to the third control instruction, the first display object to perform the corresponding operation.
After responding to the third control instruction and controlling the first virtual object to perform the corresponding operation, the processor can also send the third control instruction to the second electronic device, so that the second electronic device controls, according to the third control instruction, the first display object to perform the corresponding operation. On this basis, the user of the first electronic device and the user of the second electronic device can control the first virtual object and the first display object at the same time.
For example, the detection device detects the first operation of the AR device user pulling the cup in the virtual scene upward, while at or near the same time the smartphone obtains the second operation of the smartphone user stretching the cup in the display interface downward. The processor can generate the first control instruction based on the upward-pulling first operation, and the smartphone can generate the second control instruction based on the downward-stretching second operation and send it to the processor. The processor can then generate the third control instruction according to the first control instruction and the second control instruction, control the cup in the virtual scene to stretch in both the upward and downward directions at the same time according to the third control instruction, and control the display device of the AR device to show the three-dimensional image of the stretched cup. The processor can further send the third control instruction to the smartphone, and the smartphone can control, according to the third control instruction, the cup in the display interface to stretch in both directions at the same time and show the two-dimensional image of the stretched cup.
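As a sketch only, one way to combine the first and second control instructions into a third control instruction could look like this; the combination rule and the dict format are assumptions, since the embodiment only states that the third instruction is generated from the other two.

```python
def merge_instructions(first: dict, second: dict) -> dict:
    """Combine the first and second control instructions into a third one.
    Here the two deltas are simply gathered into a composite 'stretch'
    operation; the actual combination rule is not specified by the patent."""
    assert first["target"] == second["target"]
    return {
        "target": first["target"],
        "op": "stretch",
        "deltas": [first["delta"], second["delta"]],
    }


# Example: pull up (AR user) + stretch down (smartphone user) -> stretch both ways.
third = merge_instructions(
    {"target": "cup", "op": "move", "delta": (0.0, 0.1, 0.0)},
    {"target": "cup", "op": "move", "delta": (0.0, -0.1, 0.0)},
)
print(third)
```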
In the object control method provided by this embodiment, while the first control instruction is generated according to the first operation of the user of the first electronic device (for example an AR device) to control the first virtual object in the virtual scene to perform the corresponding operation, the first control instruction is sent to the second electronic device (for example a smartphone) to control the first display object in the display interface of the second electronic device to perform the corresponding operation. Since the display interface corresponds to the virtual scene and the first display object is associated with the first virtual object, the first electronic device and the second electronic device can collaboratively control the object in the scene.
An embodiment of the present invention further provides an object control device, which is applied to a first electronic device and implements the object control method provided by the above embodiments. As shown in Fig. 5, the control device includes a detection device 1, a processor 2 and a communication device 3.
The detection device 1 is configured to detect a first operation on a first virtual object in a virtual scene, wherein the virtual scene is created based on the first electronic device.
The processor 2 is configured to generate a first control instruction based on the first operation, respond to the first control instruction by controlling the first virtual object to perform a corresponding operation, and control the communication device to send the first control instruction to a second electronic device, so as to control a first display object in a display interface of the second electronic device to perform the corresponding operation, wherein the display interface corresponds to the virtual scene and the first display object is associated with the first virtual object.
On this basis, while the first electronic device controls the first virtual object in the virtual scene to perform the corresponding operation, it can control the first display object of the display interface of the second electronic device to perform the corresponding operation, thereby realizing collaborative control of the object by the first electronic device and the second electronic device.
In this embodiment, the communication device 3 is mainly used to realize communication between the processor and the second electronic device. Optionally, instructions can be transmitted wirelessly between the communication device 3 and the second electronic device, for example via Bluetooth, infrared or WiFi; of course, instructions can also be transmitted between the communication device 3 and the second electronic device in a wired manner.
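For illustration, here is a minimal sketch of one possible transport between the communication device and the second electronic device, assuming a plain TCP socket (for example over WiFi) carrying the JSON instruction format used in the earlier sketches; the disclosure itself only requires some Bluetooth, infrared, WiFi or wired link, and the address and port below are hypothetical.

```python
import json
import socket


def send_instruction(instruction: dict, host: str, port: int) -> None:
    """Send a control instruction to the second electronic device over a
    plain TCP connection. This stands in for whatever Bluetooth, infrared,
    WiFi or wired transport the communication device actually uses."""
    with socket.create_connection((host, port)) as conn:
        conn.sendall(json.dumps(instruction).encode("utf-8") + b"\n")


# Hypothetical address of the second electronic device on the local network:
# send_instruction({"target": "cup", "op": "move", "delta": [0.0, 0.1, 0.0]},
#                  host="192.168.1.20", port=5000)
```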
In a specific embodiment, the communication device 3 is further configured to obtain a second control instruction from the second electronic device and send the second control instruction to the processor 2, wherein the second control instruction corresponds to a second operation, received by the second electronic device, on the first display object of the display interface; the processor 2 is further configured to respond to the second control instruction by controlling the first virtual object in the virtual scene to perform the corresponding operation.
On this basis, the user of the second electronic device can not only learn what operation the user of the first electronic device performs on the first virtual object in the virtual scene, but can also, by operating the first display object in the display interface, control the first virtual object in the virtual scene to perform the same operation, thereby realizing interaction between the two users and enhancing the experience of the user of the first electronic device.
In another specific embodiment, the communication device 3 is further configured to obtain a second control instruction from the second electronic device and send the second control instruction to the processor 2, wherein the second control instruction corresponds to a second operation, received by the second electronic device, on the first display object in the display interface; the processor 2 is further configured to generate a third control instruction based on the first control instruction and the second control instruction, and respond to the third control instruction by controlling the first virtual object to perform the corresponding operation. Further, on the basis of this embodiment, the processor 2 is further configured to control the communication device 3 to send the third control instruction to the second electronic device, so as to control the first display object of the second electronic device to perform the corresponding operation. On this basis, the user of the first electronic device and the user of the second electronic device can control the first virtual object and the first display object at the same time.
In this embodiment, the detection device includes an induction device and a processing device; the induction device is configured to detect the gesture of the user of the first electronic device, and the processing device is configured to map the gesture to the first operation on the first virtual object in the virtual scene.
In a specific implementation, the gesture detected by the induction device comprises first coordinate information and a motion track of the user's hand in physical space. On this basis, the processing device can map the first coordinate information and the motion track in physical space to second coordinate information and an operation track in the virtual space corresponding to the virtual scene, and obtain, based on the second coordinate information and the operation track in the virtual space, the first operation on the first virtual object in the virtual scene.
Optionally, the induction device may be a camera, a sensor or the like. The user's gesture can be obtained by shooting the motion of the hand of the user of the first electronic device with a camera, or by sensing the motion of the user's hand with a sensor arranged on the hand of the user of the first electronic device.
In the object control device provided by this embodiment, while the processor generates the first control instruction according to the first operation of the user of the first electronic device (for example an AR device) to control the first virtual object in the virtual scene to perform the corresponding operation, it sends the first control instruction through the communication device to the second electronic device (for example a smartphone) to control the first display object in the display interface of the second electronic device to perform the corresponding operation. Since the display interface corresponds to the virtual scene and the first display object is associated with the first virtual object, the first electronic device and the second electronic device can collaboratively control the object in the scene.
An embodiment of the present invention further provides an object control system, which applies the object control method provided by the above embodiments. The control system includes a detection unit, a processing unit and a communication unit.
The detection unit is configured to detect a first operation on a first virtual object in a virtual scene, wherein the virtual scene is created based on a first electronic device; the processing unit is configured to generate a first control instruction based on the first operation and respond to the first control instruction by controlling the first virtual object to perform a corresponding operation; the communication unit is configured to send the first control instruction to at least one second electronic device, so that the second electronic device controls, according to the first control instruction, a first display object of a display interface to perform the corresponding operation, wherein the display interface corresponds to the virtual scene and the first display object is associated with the first virtual object.
In a specific embodiment, the communication unit is further configured to obtain a second control instruction from the second electronic device and send the second control instruction to the processing unit, wherein the second control instruction corresponds to a second operation, received by the second electronic device, on the first display object of the display interface; the processing unit is further configured to respond to the second control instruction by controlling the first virtual object in the virtual scene to perform the corresponding operation.
In another specific embodiment, the communication unit is further configured to obtain a second control instruction from the second electronic device and send the second control instruction to the processing unit, wherein the second control instruction corresponds to a second operation, received by the second electronic device, on the first display object of the display interface; the processing unit is further configured to generate a third control instruction based on the first control instruction and the second control instruction, and respond to the third control instruction by controlling the first virtual object to perform the corresponding operation.
Further, on the basis of this embodiment, the processing unit is further configured to control the communication unit to send the third control instruction to the second electronic device, so as to control the first display object of the second electronic device to perform the corresponding operation.
In this embodiment, the detection unit includes an induction module and a processing module. The induction module is configured to detect the gesture of the user of the first electronic device, the gesture comprising first coordinate information and a motion track of the user's hand in physical space; the processing module is configured to map the first coordinate information and the motion track in physical space to second coordinate information and an operation track in the virtual space corresponding to the virtual scene, and obtain, based on the second coordinate information and the operation track in the virtual space, the first operation on the first virtual object in the virtual scene.
In the object control system provided by this embodiment, while the processing unit generates the first control instruction according to the first operation of the user of the first electronic device (for example an AR device) to control the first virtual object in the virtual scene to perform the corresponding operation, it sends the first control instruction through the communication unit to the second electronic device (for example a smartphone) to control the first display object in the display interface of the second electronic device to perform the corresponding operation. Since the display interface corresponds to the virtual scene and the first display object is associated with the first virtual object, the first electronic device and the second electronic device can collaboratively control the object in the scene.
The embodiments in this specification are described in a progressive manner; each embodiment focuses on its differences from the other embodiments, and the same or similar parts of the embodiments can be referred to one another. For the device disclosed in the embodiments, since it corresponds to the method disclosed in the embodiments, its description is relatively simple, and the relevant parts can be found in the description of the method.
The above description of the disclosed embodiments enables those skilled in the art to implement or use the present invention. Various modifications to these embodiments will be obvious to those skilled in the art, and the general principles defined herein can be implemented in other embodiments without departing from the spirit or scope of the present invention. Therefore, the present invention is not intended to be limited to the embodiments shown herein, but is to be accorded the widest scope consistent with the principles and novel features disclosed herein.

Claims (15)

1. An object control method, characterised by comprising:
detecting a first operation on a first virtual object in a virtual scene, the virtual scene being created based on a first electronic device;
generating a first control instruction based on the first operation;
responding to the first control instruction by controlling the first virtual object to perform a corresponding operation, and sending the first control instruction to a second electronic device, so as to control a first display object in a display interface of the second electronic device to perform the corresponding operation, wherein the display interface corresponds to the virtual scene and the first display object is associated with the first virtual object.
2. The control method according to claim 1, characterised by further comprising:
obtaining a second control instruction from the second electronic device, wherein the second control instruction corresponds to a second operation, received by the second electronic device, on the first display object of the display interface;
responding to the second control instruction by controlling the first virtual object in the virtual scene to perform a corresponding operation.
3. The control method according to claim 1, characterised by further comprising:
obtaining a second control instruction from the second electronic device, wherein the second control instruction corresponds to a second operation, received by the second electronic device, on the first display object;
generating a third control instruction based on the first control instruction and the second control instruction;
responding to the third control instruction by controlling the first virtual object to perform a corresponding operation.
4. The control method according to claim 3, characterised in that, after controlling the first virtual object to perform the corresponding operation based on the third control instruction, the method further comprises:
sending the third control instruction to the second electronic device, so that the second electronic device controls, according to the third control instruction, the first display object to perform the corresponding operation.
5. The control method according to claim 1, characterised in that the display dimension of the first virtual object is N and the display dimension of the first display object is M, M being less than or equal to N.
6. The control method according to claim 1, characterised in that detecting the first operation on the first virtual object in the virtual scene comprises:
detecting a gesture of a user of the first electronic device, and mapping the gesture to the first operation on the first virtual object in the virtual scene.
7. The control method according to claim 1, characterised in that detecting the first operation on the first virtual object in the virtual scene comprises:
detecting a gesture of a user of the first electronic device, the gesture comprising first coordinate information and a motion track of the user's hand in physical space;
mapping the first coordinate information and the motion track in physical space to second coordinate information and an operation track in a virtual space corresponding to the virtual scene;
obtaining, based on the second coordinate information and the operation track in the virtual space, the first operation on the first virtual object in the virtual scene.
8. A control device, characterised by comprising a detection device, a processor and a communication device;
the detection device is configured to detect a first operation on a first virtual object in a virtual scene, wherein the virtual scene is created based on a first electronic device;
the processor is configured to generate a first control instruction based on the first operation, respond to the first control instruction by controlling the first virtual object to perform a corresponding operation, and control the communication device to send the first control instruction to a second electronic device, so as to control a first display object in a display interface of the second electronic device to perform the corresponding operation, wherein the display interface corresponds to the virtual scene and the first display object is associated with the first virtual object.
9. The control device according to claim 8, characterised in that the communication device is further configured to obtain a second control instruction from the second electronic device and send the second control instruction to the processor, wherein the second control instruction corresponds to a second operation, received by the second electronic device, on the first display object of the display interface;
the processor is further configured to respond to the second control instruction by controlling the first virtual object in the virtual scene to perform a corresponding operation.
10. The control device according to claim 8, characterised in that the communication device is further configured to obtain a second control instruction from the second electronic device and send the second control instruction to the processor, wherein the second control instruction corresponds to a second operation, received by the second electronic device, on the first display object of the display interface;
the processor is further configured to generate a third control instruction based on the first control instruction and the second control instruction, and respond to the third control instruction by controlling the first virtual object to perform a corresponding operation.
11. The control device according to claim 10, characterised in that the processor is further configured to control the communication device to send the third control instruction to the second electronic device, so as to control the first display object of the second electronic device to perform the corresponding operation.
12. The control device according to claim 8, characterised in that the display dimension of the first virtual object is N and the display dimension of the first display object is M, M being less than or equal to N.
13. The control device according to claim 8, characterised in that the detection device comprises an induction device and a processing device;
the induction device is configured to detect a gesture of a user of the first electronic device;
the processing device is configured to map the gesture to the first operation on the first virtual object in the virtual scene.
14. The control device according to claim 8, characterised in that the detection device comprises an induction device and a processing device;
the induction device is configured to detect a gesture of a user of the first electronic device, the gesture comprising first coordinate information and a motion track of the user's hand in physical space;
the processing device is configured to map the first coordinate information and the motion track in physical space to second coordinate information and an operation track in a virtual space corresponding to the virtual scene, and to obtain, based on the second coordinate information and the operation track in the virtual space, the first operation on the first virtual object in the virtual scene.
15. A control system, characterised by comprising:
a detection unit, configured to detect a first operation on a first virtual object in a virtual scene, the virtual scene being created based on a first electronic device;
a processing unit, configured to generate a first control instruction based on the first operation, and respond to the first control instruction by controlling the first virtual object to perform a corresponding operation;
a communication unit, configured to send the first control instruction to at least one second electronic device, so as to control a first display object in a display interface of the second electronic device to perform the corresponding operation, wherein the display interface corresponds to the virtual scene and the first display object is associated with the first virtual object.
CN201610513676.XA 2016-06-30 2016-06-30 Object control method, control device and control system Pending CN106200944A (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN201610513676.XA CN106200944A (en) 2016-06-30 2016-06-30 Object control method, control device and control system

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
CN201610513676.XA CN106200944A (en) 2016-06-30 2016-06-30 Object control method, control device and control system

Publications (1)

Publication Number Publication Date
CN106200944A true CN106200944A (en) 2016-12-07

Family

ID=57464411

Family Applications (1)

Application Number Title Priority Date Filing Date
CN201610513676.XA Pending CN106200944A (en) 2016-06-30 2016-06-30 The control method of a kind of object, control device and control system

Country Status (1)

Country Link
CN (1) CN106200944A (en)

Cited By (16)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN107885334A (en) * 2017-11-23 2018-04-06 联想(北京)有限公司 A kind of information processing method and virtual unit
WO2018107941A1 (en) * 2016-12-13 2018-06-21 腾讯科技(深圳)有限公司 Multi-screen linking method and system utilized in ar scenario
WO2018113740A1 (en) * 2016-12-21 2018-06-28 Zyetric Technologies Limited Combining virtual reality and augmented reality
CN108227520A (en) * 2016-12-12 2018-06-29 李涛 A kind of control system and control method of the smart machine based on panorama interface
CN108762704A (en) * 2018-05-17 2018-11-06 深圳创维-Rgb电子有限公司 Method, apparatus, display equipment and the storage medium of multihead display
CN109343703A (en) * 2018-09-10 2019-02-15 中国科学院计算机网络信息中心 Information processing method, device, system, storage medium and processor
CN110473293A (en) * 2019-07-30 2019-11-19 Oppo广东移动通信有限公司 Virtual objects processing method and processing device, storage medium and electronic equipment
CN110647239A (en) * 2018-06-27 2020-01-03 脸谱科技有限责任公司 Gesture-based projection and manipulation of virtual content in an artificial reality environment
CN110908509A (en) * 2019-11-05 2020-03-24 Oppo广东移动通信有限公司 Multi-augmented reality device cooperation method and device, electronic device and storage medium
CN111045558A (en) * 2018-10-12 2020-04-21 上海博泰悦臻电子设备制造有限公司 Interface control method based on three-dimensional scene, vehicle-mounted equipment and vehicle
CN111144202A (en) * 2019-11-14 2020-05-12 北京海益同展信息科技有限公司 Object control method, device and system, electronic equipment and storage medium
CN111381670A (en) * 2018-12-29 2020-07-07 广东虚拟现实科技有限公司 Virtual content interaction method, device, system, terminal equipment and storage medium
CN111383345A (en) * 2018-12-29 2020-07-07 广东虚拟现实科技有限公司 Virtual content display method and device, terminal equipment and storage medium
CN111399631A (en) * 2019-01-03 2020-07-10 广东虚拟现实科技有限公司 Virtual content display method and device, terminal equipment and storage medium
WO2020216018A1 (en) * 2019-04-26 2020-10-29 腾讯科技(深圳)有限公司 Operation control method and apparatus, storage medium and device
CN112506335A (en) * 2019-09-16 2021-03-16 Oppo广东移动通信有限公司 Head-mounted device, control method, device and system thereof, and storage medium


Patent Citations (6)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US20120302289A1 (en) * 2011-05-27 2012-11-29 Kang Heejoon Mobile terminal and method of controlling operation thereof
WO2013095393A1 (en) * 2011-12-20 2013-06-27 Intel Corporation Augmented reality representations across multiple devices
CN103259911A (en) * 2012-02-17 2013-08-21 联想(北京)有限公司 Electronic equipment relating method, electronic equipment and multiple-equipment synergy electronic system
US20140062854A1 (en) * 2012-08-31 2014-03-06 Lg Electronics Inc. Head mounted display and method of controlling digital device using the same
US20160093108A1 (en) * 2014-09-30 2016-03-31 Sony Computer Entertainment Inc. Synchronizing Multiple Head-Mounted Displays to a Unified Space and Correlating Movement of Objects in the Unified Space
CN105528066A (en) * 2014-10-15 2016-04-27 三星电子株式会社 Method and apparatus for processing screen using device

Cited By (26)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN108227520A (en) * 2016-12-12 2018-06-29 李涛 A kind of control system and control method of the smart machine based on panorama interface
CN108228120A (en) * 2016-12-13 2018-06-29 腾讯科技(深圳)有限公司 A kind of multi-screen ganged method and system under AR scenes
WO2018107941A1 (en) * 2016-12-13 2018-06-21 腾讯科技(深圳)有限公司 Multi-screen linking method and system utilized in ar scenario
US10768881B2 (en) 2016-12-13 2020-09-08 Tencent Technology (Shenzhen) Company Limited Multi-screen interaction method and system in augmented reality scene
US10488941B2 (en) 2016-12-21 2019-11-26 Zyetric Technologies Limited Combining virtual reality and augmented reality
WO2018113740A1 (en) * 2016-12-21 2018-06-28 Zyetric Technologies Limited Combining virtual reality and augmented reality
CN107885334A (en) * 2017-11-23 2018-04-06 联想(北京)有限公司 A kind of information processing method and virtual unit
CN108762704A (en) * 2018-05-17 2018-11-06 深圳创维-Rgb电子有限公司 Method, apparatus, display equipment and the storage medium of multihead display
CN110647239A (en) * 2018-06-27 2020-01-03 脸谱科技有限责任公司 Gesture-based projection and manipulation of virtual content in an artificial reality environment
CN109343703A (en) * 2018-09-10 2019-02-15 中国科学院计算机网络信息中心 Information processing method, device, system, storage medium and processor
CN111045558A (en) * 2018-10-12 2020-04-21 上海博泰悦臻电子设备制造有限公司 Interface control method based on three-dimensional scene, vehicle-mounted equipment and vehicle
CN111381670B (en) * 2018-12-29 2022-04-01 广东虚拟现实科技有限公司 Virtual content interaction method, device, system, terminal equipment and storage medium
CN111381670A (en) * 2018-12-29 2020-07-07 广东虚拟现实科技有限公司 Virtual content interaction method, device, system, terminal equipment and storage medium
CN111383345A (en) * 2018-12-29 2020-07-07 广东虚拟现实科技有限公司 Virtual content display method and device, terminal equipment and storage medium
CN111399631A (en) * 2019-01-03 2020-07-10 广东虚拟现实科技有限公司 Virtual content display method and device, terminal equipment and storage medium
US11839821B2 (en) 2019-04-26 2023-12-12 Tencent Technology (Shenzhen) Company Limited Racing game operation control method and apparatus, storage medium, and device
WO2020216018A1 (en) * 2019-04-26 2020-10-29 腾讯科技(深圳)有限公司 Operation control method and apparatus, storage medium and device
WO2021018214A1 (en) * 2019-07-30 2021-02-04 Oppo广东移动通信有限公司 Virtual object processing method and apparatus, and storage medium and electronic device
CN110473293B (en) * 2019-07-30 2023-03-24 Oppo广东移动通信有限公司 Virtual object processing method and device, storage medium and electronic equipment
CN110473293A (en) * 2019-07-30 2019-11-19 Oppo广东移动通信有限公司 Virtual objects processing method and processing device, storage medium and electronic equipment
US11893702B2 (en) 2019-07-30 2024-02-06 Guangdong Oppo Mobile Telecommunications Corp., Ltd. Virtual object processing method and apparatus, and storage medium and electronic device
CN112506335A (en) * 2019-09-16 2021-03-16 Oppo广东移动通信有限公司 Head-mounted device, control method, device and system thereof, and storage medium
CN112506335B (en) * 2019-09-16 2022-07-12 Oppo广东移动通信有限公司 Head-mounted device, control method, device and system thereof, and storage medium
CN110908509A (en) * 2019-11-05 2020-03-24 Oppo广东移动通信有限公司 Multi-augmented reality device cooperation method and device, electronic device and storage medium
CN111144202A (en) * 2019-11-14 2020-05-12 北京海益同展信息科技有限公司 Object control method, device and system, electronic equipment and storage medium
CN111144202B (en) * 2019-11-14 2023-11-07 京东科技信息技术有限公司 Object control method, device and system, electronic equipment and storage medium

Similar Documents

Publication Publication Date Title
CN106200944A (en) The control method of a kind of object, control device and control system
CN108525298B (en) Image processing method, image processing device, storage medium and electronic equipment
CN107132917B (en) For the hand-type display methods and device in virtual reality scenario
CN107707817B (en) video shooting method and mobile terminal
KR102147430B1 (en) virtual multi-touch interaction apparatus and method
CN107646098A (en) System for tracking portable equipment in virtual reality
KR102568708B1 (en) Apparatus and method for recognizing hand gestures in a virtual reality headset
US9412190B2 (en) Image display system, image display apparatus, image display method, and non-transitory storage medium encoded with computer readable program
CN110947181A (en) Game picture display method, game picture display device, storage medium and electronic equipment
US11188144B2 (en) Method and apparatus to navigate a virtual content displayed by a virtual reality (VR) device
CN205581784U (en) Can mix real platform alternately based on reality scene
CN102779000A (en) User interaction system and method
CN113426117B (en) Shooting parameter acquisition method and device for virtual camera, electronic equipment and storage medium
CN110362231A (en) The method and device that new line touch control device, image are shown
CN112817453A (en) Virtual reality equipment and sight following method of object in virtual reality scene
CN108564613A (en) A kind of depth data acquisition methods and mobile terminal
WO2017061890A1 (en) Wireless full body motion control sensor
US20210287330A1 (en) Information processing system, method of information processing, and program
CN102508561B (en) Operating rod
TW201439813A (en) Display device, system and method for controlling the display device
US20130271371A1 (en) Accurate extended pointing apparatus and method thereof
JP2013218423A (en) Directional video control device and method
WO2022176450A1 (en) Information processing device, information processing method, and program
CN110489026A (en) A kind of handheld input device and its blanking control method and device for indicating icon
CN211180839U (en) Motion teaching equipment and motion teaching system

Legal Events

Date Code Title Description
C06 Publication
PB01 Publication
C10 Entry into substantive examination
SE01 Entry into force of request for substantive examination
RJ01 Rejection of invention patent application after publication (application publication date: 2016-12-07)