CN105739879A - Virtual reality application method and terminal - Google Patents
Virtual reality application method and terminal
- Publication number
- CN105739879A (application CN201610064431.3A / CN201610064431A)
- Authority
- CN
- China
- Prior art keywords
- target object
- virtual target
- pressing force
- instruction
- face
- Prior art date
- Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
- Pending
Links
Classifications
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06F—ELECTRIC DIGITAL DATA PROCESSING
- G06F3/00—Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
- G06F3/01—Input arrangements or combined input and output arrangements for interaction between user and computer
- G06F3/048—Interaction techniques based on graphical user interfaces [GUI]
- G06F3/0487—Interaction techniques based on graphical user interfaces [GUI] using specific features provided by the input device, e.g. functions controlled by the rotation of a mouse with dual sensing arrangements, or of the nature of the input device, e.g. tap gestures based on pressure sensed by a digitiser
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06F—ELECTRIC DIGITAL DATA PROCESSING
- G06F3/00—Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
- G06F3/01—Input arrangements or combined input and output arrangements for interaction between user and computer
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06F—ELECTRIC DIGITAL DATA PROCESSING
- G06F3/00—Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
- G06F3/01—Input arrangements or combined input and output arrangements for interaction between user and computer
- G06F3/048—Interaction techniques based on graphical user interfaces [GUI]
- G06F3/0487—Interaction techniques based on graphical user interfaces [GUI] using specific features provided by the input device, e.g. functions controlled by the rotation of a mouse with dual sensing arrangements, or of the nature of the input device, e.g. tap gestures based on pressure sensed by a digitiser
- G06F3/0488—Interaction techniques based on graphical user interfaces [GUI] using specific features provided by the input device, e.g. functions controlled by the rotation of a mouse with dual sensing arrangements, or of the nature of the input device, e.g. tap gestures based on pressure sensed by a digitiser using a touch-screen or digitiser, e.g. input of commands through traced gestures
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06F—ELECTRIC DIGITAL DATA PROCESSING
- G06F2203/00—Indexing scheme relating to G06F3/00 - G06F3/048
- G06F2203/01—Indexing scheme relating to G06F3/01
- G06F2203/012—Walk-in-place systems for allowing a user to walk in a virtual environment while constraining him to a given position in the physical environment
Abstract
Embodiments of the invention provide a virtual reality application method. The method comprises the following steps: in a virtual reality (VR) mode, obtaining a virtual target object added by a user at a first position; detecting a touch parameter for the virtual target object; generating a corresponding control instruction according to the touch parameter; and executing a corresponding operation in response to the control instruction. Embodiments of the invention further provide a terminal. With this method, different control instructions can be generated according to different touch parameters for the virtual target object, and a series of operations can be executed through a series of control instructions, thereby providing users with a method of virtual reality interaction.
Description
Technical field
The present invention relates to the technical field of virtual reality, and in particular to a virtual reality application method and terminal.
Background technology
Virtual reality (VR) technology is a computer application and human-computer interaction technology that applies virtual information to the real world: through computer and visualization techniques, a real environment and virtual objects are superimposed in real time onto the same picture or into the same space so that they coexist. Put simply, it combines the real and the virtual.
At present, the development of VR is still at an early stage, and many fields have not yet made good use of it. In particular, VR has not yet been well applied on terminals such as mobile phones and tablet computers.
Summary of the invention
Embodiments of the present invention provide a virtual reality application method and a terminal, providing a method for implementing virtual reality on a terminal.
A first aspect of the embodiments of the present invention provides a virtual reality application method, comprising:
in a virtual reality (VR) mode, obtaining a virtual target object added by a user at a first position;
detecting a touch parameter for the virtual target object;
generating a corresponding control instruction according to the touch parameter; and
executing a corresponding operation in response to the control instruction.
With reference to the first aspect of the embodiments of the present invention, in a first possible implementation of the first aspect, the touch parameter is at least one of: a pressing force with which the virtual target object is pressed, a pressing duration for which the virtual target object is pressed, a number of touch points pressing the virtual target object, and a pressing area over which the virtual target object is pressed.
With reference to the first possible implementation of the first aspect of the embodiments of the present invention, in a second possible implementation of the first aspect, when the touch parameter is the pressing force and the number of touch points, generating the corresponding control instruction according to the touch parameter comprises:
judging whether the number of touch points is within a preset range;
if so, judging whether the pressing force is greater than a first preset threshold; and
if so, generating a drag instruction, the drag instruction instructing that the virtual target object be dragged from the first position to a second position.
With reference to the first possible implementation of the first aspect of the embodiments of the present invention, in a third possible implementation of the first aspect, when the touch parameter is the pressing duration and the number of touch points, generating the corresponding control instruction according to the touch parameter comprises:
judging whether the number of touch points is greater than a second preset threshold;
if so, judging whether the pressing duration is greater than a third preset threshold; and
if so, generating a fixing instruction, the fixing instruction instructing that the virtual target object be fixed in place.
With reference to the first aspect of the embodiments of the present invention or any one of the first to third possible implementations of the first aspect, in a fourth possible implementation of the first aspect, when the virtual target object contains a face, after executing the corresponding operation in response to the control instruction, the method further comprises:
performing face detection on the virtual target object to obtain the face of the virtual target object;
obtaining a pressing force applied to the touch screen; and
switching the face to any face in a preset face picture library according to the pressing force.
With reference to the first aspect of the embodiments of the present invention or any one of the first to third possible implementations of the first aspect, in a fifth possible implementation of the first aspect, after executing the corresponding operation in response to the control instruction, the method further comprises:
obtaining a pressing force applied to the touch screen; and
switching the background picture of the scene in which the virtual target object is located according to the pressing force.
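As a purely illustrative sketch (not part of the claims), switching a face or background picture according to a detected pressing force could be implemented by banding the force into light, medium and hard presses; the function name, band boundaries and library layout below are all assumptions:

```python
def switch_by_force(force, picture_library, bands=(3.0, 6.0)):
    """Pick an entry from a preset picture library according to pressing force.

    Hypothetical banding: a light press (<= 3 N) selects the first entry,
    a medium press the second, a hard press (> 6 N) the third, wrapping
    around if the library is smaller than the number of bands.
    """
    if not picture_library:
        return None
    band = sum(1 for threshold in bands if force > threshold)  # 0, 1 or 2
    return picture_library[band % len(picture_library)]
```

The same banding could drive either the preset face picture library of the fourth implementation or the background pictures of the fifth.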
A second aspect of the embodiments of the present invention provides a terminal, comprising:
a first obtaining unit, configured to obtain, in a virtual reality (VR) mode, a virtual target object added by a user at a first position;
a detection unit, configured to detect a touch parameter for the virtual target object obtained by the first obtaining unit;
a generation unit, configured to generate a corresponding control instruction according to the touch parameter detected by the detection unit; and
an execution unit, configured to execute a corresponding operation in response to the control instruction generated by the generation unit.
With reference to the second aspect of the embodiments of the present invention, in a first possible implementation of the second aspect, the touch parameter is at least one of: a pressing force with which the virtual target object is pressed, a pressing duration for which the virtual target object is pressed, a number of touch points pressing the virtual target object, and a pressing area over which the virtual target object is pressed.
With reference to the second aspect of the embodiments of the present invention, in a second possible implementation of the second aspect, when the touch parameter is the pressing force and the number of touch points, the generation unit comprises:
a first judging module, configured to judge whether the number of touch points is within a preset range;
a second judging module, configured to judge, if the judgment result of the first judging module is yes, whether the pressing force is greater than a first preset threshold; and
a first generation module, configured to generate a drag instruction if the judgment result of the second judging module is yes, the drag instruction instructing that the virtual target object be dragged from the first position to a second position.
With reference to the second aspect of the embodiments of the present invention, in a third possible implementation of the second aspect, when the touch parameter is the pressing duration and the number of touch points, the generation unit comprises:
a third judging module, configured to judge whether the number of touch points is greater than a second preset threshold;
a fourth judging module, configured to judge, if the judgment result of the third judging module is yes, whether the pressing duration is greater than a third preset threshold; and
a second generation module, configured to generate a fixing instruction if the judgment result of the fourth judging module is yes, the fixing instruction instructing that the virtual target object be fixed in place.
With reference to the second aspect of the embodiments of the present invention or any one of the first to third possible implementations of the second aspect, in a fourth possible implementation of the second aspect, when the virtual target object contains a face, the detection unit is further configured to perform face detection on the virtual target object after the execution unit executes the corresponding operation in response to the control instruction generated by the generation unit, so as to obtain the face of the virtual target object;
the terminal further comprises:
a second obtaining unit, configured to obtain a pressing force applied to the touch screen; and
a first switching unit, configured to switch the face to any face in a preset face picture library according to the pressing force obtained by the second obtaining unit.
With reference to the second aspect of the embodiments of the present invention or any one of the first to third possible implementations of the second aspect, in a fifth possible implementation of the second aspect, the detection unit is further configured to detect a pressing force applied to the touch screen after the execution unit executes the corresponding operation in response to the control instruction generated by the generation unit;
the terminal further comprises:
a second switching unit, configured to switch the background picture of the scene in which the virtual target object is located according to the pressing force detected by the detection unit.
A fifth aspect of the embodiments of the present invention provides a terminal, comprising:
a memory storing executable program code; and
a processor coupled to the memory;
wherein the processor calls the executable program code stored in the memory to perform some or all of the steps of any method of the first aspect of the embodiments of the present invention.
Implementing the embodiments of the present invention has the following beneficial effects:
According to the embodiments of the present invention, in a virtual reality (VR) mode, a virtual target object added by a user at a first position is obtained, a touch parameter for the virtual target object is detected, a corresponding control instruction is generated according to the touch parameter, and a corresponding operation is executed in response to the control instruction. In this way, different control instructions are generated according to the user's different touch parameters for the virtual target object, and a series of operations are executed through a series of control instructions, thereby providing the user with a method of virtual reality interaction.
Brief description of the drawings
To describe the technical solutions in the embodiments of the present invention more clearly, the accompanying drawings required for describing the embodiments are briefly introduced below. Obviously, the accompanying drawings in the following description show merely some embodiments of the present invention, and a person of ordinary skill in the art may derive other drawings from these accompanying drawings without creative effort.
Fig. 1 is a schematic flowchart of a first embodiment of a virtual reality application method provided by an embodiment of the present invention;
Fig. 2 is a schematic flowchart of a second embodiment of the virtual reality application method provided by an embodiment of the present invention;
Fig. 3 is a schematic flowchart of a third embodiment of the virtual reality application method provided by an embodiment of the present invention;
Fig. 4 is a schematic flowchart of a fourth embodiment of the virtual reality application method provided by an embodiment of the present invention;
Fig. 5 is a schematic flowchart of a fifth embodiment of the virtual reality application method provided by an embodiment of the present invention;
Fig. 6a is a schematic structural diagram of a first embodiment of a terminal provided by an embodiment of the present invention;
Fig. 6b is another schematic structural diagram of the first embodiment of the terminal provided by an embodiment of the present invention;
Fig. 6c is another schematic structural diagram of the first embodiment of the terminal provided by an embodiment of the present invention;
Fig. 6d is another schematic structural diagram of the first embodiment of the terminal provided by an embodiment of the present invention;
Fig. 6e is another schematic structural diagram of the first embodiment of the terminal provided by an embodiment of the present invention;
Fig. 7 is a schematic structural diagram of a second embodiment of the terminal provided by an embodiment of the present invention.
Detailed description of the invention
Embodiments of the present invention provide a virtual reality application method and a terminal, providing a method for implementing virtual reality on a terminal.
To enable a person skilled in the art to better understand the solutions of the present invention, the technical solutions in the embodiments of the present invention are described below clearly and completely with reference to the accompanying drawings in the embodiments of the present invention. Obviously, the described embodiments are merely some rather than all of the embodiments of the present invention. All other embodiments obtained by a person of ordinary skill in the art based on the embodiments of the present invention without creative effort shall fall within the protection scope of the present invention.
Detailed descriptions are provided below.
The terms "first", "second", "third" and "fourth" in the specification, claims and accompanying drawings are used to distinguish different objects rather than to describe a particular order. In addition, the terms "including" and "having" and any variations thereof are intended to cover non-exclusive inclusion. For example, a process, method, system, product or device comprising a series of steps or units is not limited to the listed steps or units, but optionally further includes steps or units that are not listed, or optionally further includes other steps or units inherent to the process, method, product or device.
Reference herein to "an embodiment" means that a particular feature, structure or characteristic described in connection with the embodiment may be included in at least one embodiment of the present invention. The appearances of this phrase in various places in the specification do not necessarily all refer to the same embodiment, nor are they separate or alternative embodiments mutually exclusive of other embodiments. A person skilled in the art will understand, both explicitly and implicitly, that the embodiments described herein may be combined with other embodiments.
It should be noted that the terminal described in the embodiments of the present invention may include a smartphone (such as an Android phone, an iOS phone or a Windows Phone), a tablet computer, a palmtop computer, a notebook computer, a mobile internet device (MID) or a wearable device. The above terminals are merely examples, not an exhaustive list; the terminal includes but is not limited to the above.
Referring to Fig. 1, which is a schematic flowchart of a first embodiment of a virtual reality application method provided by an embodiment of the present invention. The virtual reality application method described in this embodiment comprises the following steps:
101. In a virtual reality (VR) mode, obtain a virtual target object added by a user at a first position.
In this embodiment of the present invention, the terminal may receive a start instruction for the VR mode input by the user, and under this start instruction the terminal may enter the VR mode, i.e. the virtual reality mode. Upon entering the VR mode, the display screen of the terminal may show a preset display interface, which may be a blank interface or an interface displaying, as its background, a picture selected in advance by the user. The first position may be any position on this preset display interface. The user may add any virtual target object at this position; the virtual target object may be a pre-stored image, or the camera of the terminal may capture an image and some object in the captured image may serve as the virtual target object. Further, the user may add several virtual target objects, for example a first virtual target object at the first position, a second virtual target object at a second position, and a third virtual target object at a third position, where the first, second and third positions may be different positions and the first, second and third virtual target objects may be different objects; in a special case, the first, second and third virtual target objects are the same object. It should be noted that the first object may be any image of a person or a thing.
102. Detect a touch parameter for the virtual target object.
In this embodiment of the present invention, the user may perform a touch operation on the virtual target object through the touch screen of the terminal, and based on this touch operation the terminal can detect the touch parameter for the virtual target object. The touch parameter may include, but is not limited to, at least one of: the pressing force with which the virtual target object is pressed, the pressing duration for which the virtual target object is pressed, the number of touch points pressing the virtual target object, the number of times the virtual target object is touched, the pressing area over which the virtual target object is pressed, and the sliding track generated by a sliding operation on the virtual target object. The number of touch points may be understood as the number of fingers touching the screen: one finger produces one touch point, and multiple fingers produce a corresponding number of touch points. For example, when the user performs a touch operation on the virtual target object and presses it, the pressing force for the virtual target object can be detected. It should be noted that a touch parameter may also be a combination of several different touch parameters. For example, the pressing force and the pressing duration may be combined into a pressing parameter (pressing force, pressing duration); or the number of touch points and the pressing force may form a combined touch parameter, e.g. one touch point together with pressing force A forms touch parameter c1, while two touch points together with the same pressing force A form touch parameter c2.
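As a rough illustration (not part of the patent text), the touch parameters listed in step 102 could be represented as a simple record; all field names here are hypothetical:

```python
from dataclasses import dataclass, field
from typing import List, Optional, Tuple

@dataclass
class TouchParams:
    """Hypothetical record of the touch parameters listed in step 102."""
    pressing_force: Optional[float] = None     # e.g. newtons from a pressure sensor
    pressing_duration: Optional[float] = None  # seconds the press lasted
    touch_points: int = 0                      # number of fingers on the object
    touch_count: int = 0                       # number of separate touches
    pressing_area: Optional[float] = None      # contact area of the press
    sliding_track: List[Tuple[float, float]] = field(default_factory=list)

# The combined-parameter example from the text: the same force A with one
# touch point (c1) versus two touch points (c2) are distinct parameters.
A = 6.0
c1 = TouchParams(pressing_force=A, touch_points=1)
c2 = TouchParams(pressing_force=A, touch_points=2)
```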
103. Generate a corresponding control instruction according to the touch parameter.
In this embodiment of the present invention, the terminal may generate different control instructions according to different touch parameters. The control instruction may include, but is not limited to: a drag instruction, a fixing instruction, a rotation instruction, a zoom instruction, a synthesis instruction, a face switching instruction and a background switching instruction. The drag instruction may instruct that the virtual target object be dragged from the first position to a second position, where the second position may be any position other than the first position. The fixing instruction may instruct that the virtual target object be fixed in place, so that when the user tries to drag this virtual target object it cannot be dragged. The rotation instruction may instruct that the virtual target object be rotated by different rotation angles. The zoom instruction may instruct that the virtual target object be scaled according to different scale factors. The synthesis instruction may add an image of some other object at a target position and synthesize the image of the virtual target object with the image of the other object, for example merging the virtual target object with the other object. When the virtual target object contains a face, the face switching instruction may instruct that the face in the virtual target object be switched: when the virtual target object contains multiple faces, the faces may be switched among one another; when the virtual target object contains only one face, the face is switched to a preset image pre-stored in the terminal, which may be a face image, a cartoon image or another image. The background switching instruction may be used to switch the background picture of the scene in which the virtual target object is located; when the scene of the virtual target object has no background picture, the background of the virtual target object may be switched to a specified background.
Further, the control instruction may also be an exit instruction. For example, if the user double-taps the virtual target object, an exit instruction may be generated, and this exit instruction may be used to instruct the terminal to exit the virtual reality mode.
104. Execute a corresponding operation in response to the control instruction.
In this embodiment of the present invention, the terminal may respond to the control instruction and execute the corresponding operation according to the content indicated by the control instruction.
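Steps 101 to 104 can be sketched as a small mapping from touch parameters to control instructions. This is an illustrative reading of the flow under the example threshold values given in the embodiments, not the patent's implementation; all names are assumptions:

```python
from enum import Enum, auto

class ControlInstruction(Enum):
    """Hypothetical set of the control instructions named in step 103."""
    DRAG = auto()
    FIX = auto()
    ROTATE = auto()
    ZOOM = auto()
    SYNTHESIZE = auto()
    SWITCH_FACE = auto()
    SWITCH_BACKGROUND = auto()
    EXIT = auto()

def generate_instruction(touch_points=0, force=None, duration=None,
                         double_tap=False):
    """Step 103: map a touch-parameter combination to a control instruction.

    The thresholds (fewer than 3 touch points, force > 5 N, duration > 5 s)
    are the example values from the embodiments, not fixed by the claims.
    """
    if double_tap:                 # a double-tap exits the VR mode
        return ControlInstruction.EXIT
    if force is not None and 1 <= touch_points < 3 and force > 5.0:
        return ControlInstruction.DRAG   # second embodiment (Fig. 2)
    if duration is not None and touch_points > 1 and duration > 5.0:
        return ControlInstruction.FIX    # third embodiment (Fig. 3)
    return None                    # no instruction for this combination
```

Step 104 then simply executes whatever operation the returned instruction indicates.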
According to this embodiment of the present invention, in the VR mode, a virtual target object added by the user at a first position is obtained, a touch parameter for this virtual target object is detected, a corresponding control instruction is generated according to this touch parameter, and a corresponding operation is executed in response to this control instruction. In this way, different control instructions are generated according to the user's different touch parameters for the virtual target object, and a series of operations are executed through a series of control instructions, thereby providing the user with a method of virtual reality interaction.
Referring to Fig. 2, which is a schematic flowchart of a second embodiment of the virtual reality application method provided by an embodiment of the present invention. The virtual reality application method described in this embodiment comprises the following steps:
201. In a virtual reality (VR) mode, obtain a virtual target object added by a user at a first position.
In this embodiment of the present invention, the terminal may receive a start instruction for the VR mode input by the user, and under this start instruction the terminal may enter the VR mode. Upon entering the VR mode, the display screen of the terminal may show a preset display interface, which may be a blank interface or an interface displaying, as its background, a picture selected in advance by the user. The first position may be any position on this preset display interface; the user may add any virtual target object at this position, which may be a pre-stored image, or the camera of the terminal may capture an image and some object in the captured image may serve as the virtual target object.
202. Detect a touch parameter for the virtual target object.
In this embodiment of the present invention, the user may perform a touch operation on the virtual target object through the touch screen of the terminal, and based on this touch operation the terminal can detect the touch parameter for the virtual target object. The touch parameter may include, but is not limited to, at least one of: the pressing force with which the virtual target object is pressed, the pressing duration, the number of touch points, the pressing area, and the sliding track generated by a sliding operation on the virtual target object. The number of touch points may be understood as the number of fingers touching the screen: one finger produces one touch point, and multiple fingers produce multiple touch points.
203. When the touch parameter is the pressing force and the number of touch points, judge whether the number of touch points is within a preset range.
In this embodiment of the present invention, when the touch parameter is the pressing force and the number of touch points, it may be judged whether the number of touch points is within a preset range. For example, if the preset range is fewer than 3 touch points, then 1 or 2 touch points satisfy the condition.
204. If so, judge whether the pressing force is greater than a first preset threshold.
In this embodiment of the present invention, when the number of touch points satisfies the condition, it is further judged whether the pressing force is greater than the first preset threshold, where the first preset threshold may be, but is not limited to, 5 N, 6 N, 7 N, 6.5 N, 7.32 N, etc.
205. If so, generate a drag instruction.
In this embodiment of the present invention, when both the number of touch points and the pressing force satisfy their conditions, a drag instruction is generated.
206. In response to the drag instruction, drag the virtual target object from the first position to a second position.
In this embodiment of the present invention, when the drag instruction is generated, the position to which the user drags the virtual target object can be detected. When dragging the virtual target object, the user may hold down the virtual target object without releasing it and drag it; when the user stops pressing the virtual target object, the position of the virtual target object at that moment is taken as the second position. That is, the terminal may respond to the control instruction and drag the virtual target object from the first position to the second position.
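Steps 203 to 206 can be condensed into a minimal sketch (hypothetical names; the preset range and threshold are the example values from the text):

```python
def drag_object(first_position, touch_points, force, drag_path,
                preset_range=range(1, 3), force_threshold=5.0):
    """Steps 203-206 as one function (names and values are illustrative).

    203: the number of touch points must fall in the preset range;
    204: the pressing force must exceed the first preset threshold;
    205-206: a drag instruction moves the object along the drag, and the
    point where the finger is lifted becomes the second position.
    """
    if touch_points not in preset_range:   # 203 fails: no instruction
        return first_position
    if force <= force_threshold:           # 204 fails: no instruction
        return first_position
    return drag_path[-1] if drag_path else first_position
```

For example, a one-finger press of 6.5 N dragged through (1, 1) and (2, 3) leaves the object at the release point (2, 3), while a three-finger press or a light press leaves it at the first position.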
According to this embodiment of the present invention, in the VR mode, a virtual target object added by the user at a first position is obtained, a touch parameter for this virtual target object is detected, a drag instruction is generated according to this touch parameter, and the virtual target object is dragged from the first position to a second position in response to this drag instruction. In this way, a drag instruction can be generated according to the touch parameter and the virtual target object can be dragged, thereby providing the user with a method of virtual reality interaction.
Referring to Fig. 3, which is a schematic flowchart of a third embodiment of the virtual reality application method provided by an embodiment of the present invention. The virtual reality application method described in this embodiment comprises the following steps:
301. In a virtual reality (VR) mode, obtain a virtual target object added by a user at a first position.
In this embodiment of the present invention, the terminal may receive a start instruction for the VR mode input by the user, and under this start instruction the terminal may enter the VR mode. Upon entering the VR mode, the display screen of the terminal may show a preset display interface, which may be a blank interface or an interface displaying, as its background, a picture selected in advance by the user. The first position may be any position on this preset display interface; the user may add any virtual target object at this position, which may be a pre-stored image, or the camera of the terminal may capture an image and some object in the captured image may serve as the virtual target object.
302. Detect a touch parameter for the virtual target object.
In this embodiment of the present invention, the user may perform a touch operation on the virtual target object through the touch screen of the terminal, and based on this touch operation the terminal can detect the touch parameter for the virtual target object. The touch parameter may include, but is not limited to, at least one of: the pressing force with which the virtual target object is pressed, the pressing duration, the number of touch points, the pressing area, and the sliding track generated by a sliding operation on the virtual target object. The number of touch points may be understood as the number of fingers touching the screen: one finger produces one touch point, and multiple fingers produce multiple touch points.
303. When the touch parameter is the pressing duration and the number of touch points, judge whether the number of touch points is greater than a second preset threshold.
In this embodiment of the present invention, when the touch parameter is the pressing duration and the number of touch points, the terminal may first judge the number of touch points. The second preset threshold may be, but is not limited to, 1, 2, 3, 4 and so on. When the number of touch points is 1, the user is pressing the virtual target object with one finger; when the number of touch points is 2, the user is pressing the virtual target object with two fingers.
304. If so, judge whether the pressing duration is greater than a third preset threshold.
In this embodiment of the present invention, the third preset threshold may be, but is not limited to, 5 seconds, 6 seconds, 6.5 seconds, 7 seconds, 7.32 seconds and so on.
305. If so, generate a fixing instruction.
In this embodiment of the present invention, when both the pressing duration and the number of touch points satisfy their conditions, the terminal may generate a fixing instruction.
306. Fix the virtual target object in response to the fixing instruction.
In this embodiment of the present invention, the terminal may fix the virtual target object in response to the fixing instruction; the user then cannot move the virtual target object freely. Optionally, after the terminal fixes the virtual target object in response to the fixing instruction, the terminal may further obtain a touch parameter for the virtual target object and generate an unfixing instruction according to that touch parameter; in response to the unfixing instruction, the user can once again drag the virtual target object.
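Steps 303 to 306 can be sketched as a small decision function. The sketch is an illustration only; the threshold values and names (`SECOND_THRESHOLD`, `THIRD_THRESHOLD`) are assumptions, since the embodiment leaves the thresholds open.

```python
SECOND_THRESHOLD = 1     # minimum number of touch points (the embodiment allows e.g. 1, 2, 3, 4)
THIRD_THRESHOLD = 5.0    # minimum pressing duration in seconds (e.g. 5, 6, 6.5, 7, 7.32)

def maybe_fix_instruction(touch_points: int, pressing_duration: float):
    """Return a fixing instruction when the conditions of steps 303-304 both hold."""
    if touch_points > SECOND_THRESHOLD and pressing_duration > THIRD_THRESHOLD:
        return {"type": "fix"}   # step 305: generate the fixing instruction
    return None

class VirtualTargetObject:
    def __init__(self):
        self.fixed = False
    def respond(self, instruction):
        # Step 306: in response to the fixing instruction, the object is fixed
        # and can no longer be dragged until an unfixing instruction is generated.
        if instruction and instruction["type"] == "fix":
            self.fixed = True

obj = VirtualTargetObject()
obj.respond(maybe_fix_instruction(touch_points=2, pressing_duration=6.0))
```

With two fingers pressed for six seconds, both thresholds are exceeded and the object becomes fixed; a single-finger press of the same duration generates no instruction.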
Through this embodiment of the present invention, in the VR mode, the virtual target object added by the user at the first position is obtained, the touch parameter for the virtual target object is detected, the fixing instruction is generated according to the touch parameter, and the virtual target object is fixed in response to the fixing instruction. Once the virtual target object is fixed, its position can no longer be moved freely; a virtual reality method can therefore be provided to the user, and the interaction is made more interesting.
Referring to Fig. 4, Fig. 4 is a schematic flowchart of a fourth embodiment of a virtual reality application method provided by an embodiment of the present invention. The virtual reality application method described in this embodiment comprises the following steps:
401. In a virtual reality (VR) mode, obtain a virtual target object added by a user at a first position.
402. Detect a touch parameter for the virtual target object.
403. Generate a corresponding control instruction according to the touch parameter.
404. Perform a corresponding operation in response to the control instruction.
In this embodiment of the present invention, for the description of steps 401-404, reference may be made to steps 101-104 in the embodiment of the present invention described with respect to Fig. 1.
405. When the virtual target object contains a human face, perform face detection on the virtual target object to obtain the face of the virtual target object.
In this embodiment of the present invention, if the virtual target object contains a human face, the terminal may perform face detection on the virtual target object; when a face is detected, the face region of the virtual target object is determined, and the face in the virtual target object may further be extracted.
406. Obtain the pressing force applied to the touch screen.
In this embodiment of the present invention, after determining the face, the terminal may detect the pressing force of a touch operation performed by the user at any position on the touch screen. For example, if the user presses position A of the touch screen, the pressing force at position A can be detected.
407. Switch the face to any face in a preset face picture library according to the pressing force.
In this embodiment of the present invention, the preset face picture library may contain multiple faces, and the multiple faces may be cartoon faces; the terminal may switch the face to a certain face in the preset face picture library according to the pressing force.
Specifically, the terminal may switch the face in the virtual target object to a designated face according to the pressing force. For example, pressing force A corresponds to designated face a, pressing force B to designated face b, and pressing force C to designated face c; then, when the pressing force is A, the target face is switched to face a. In this way, the face can be switched to any face in the preset face picture library according to the pressing force.
Optionally, the terminal may determine a number of switches according to the pressing force and switch the face that number of times, so that the face is switched to any face in the preset face picture library according to the pressing force.
Further optionally, the terminal may determine a switching speed according to the pressing force; for example, the greater the pressing force, the faster the switching speed, and the smaller the pressing force, the slower the switching speed. The face can thus likewise be switched to any face in the preset face picture library according to the pressing force.
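The three variants above (a force-to-face mapping, a force-derived switch count, and a force-derived switch speed) can be sketched together. The force bands, library contents and function names below are hypothetical illustrations, not values fixed by the embodiment.

```python
# Hypothetical preset face picture library of cartoon faces.
FACE_LIBRARY = ["face_a", "face_b", "face_c"]

def face_for_force(force: float) -> str:
    # Variant 1: each pressing-force band maps to a designated face
    # (force A -> face a, force B -> face b, force C -> face c).
    if force < 1.0:
        return FACE_LIBRARY[0]
    if force < 2.0:
        return FACE_LIBRARY[1]
    return FACE_LIBRARY[2]

def switch_count_for_force(force: float) -> int:
    # Variant 2: the pressing force determines how many times the face is switched.
    return max(1, int(force))

def switch_speed_for_force(force: float) -> float:
    # Variant 3: the greater the pressing force, the faster the switching
    # (here: switches per second, proportional to the force).
    return 2.0 * force
```

A terminal implementing step 407 would pick one of the three policies and apply it to the face extracted in step 405.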
Through this embodiment of the present invention, in the VR mode, the virtual target object added by the user at the first position is obtained, the touch parameter for the virtual target object is detected, a corresponding control instruction is generated according to the touch parameter, and a corresponding operation is performed in response to the control instruction; furthermore, when the virtual target object contains a human face, face detection is performed on the virtual target object to obtain its face, the pressing force applied to the touch screen is obtained, and the face is switched to any face in the preset face picture library according to the pressing force. Different control instructions can therefore be generated according to the user's different touch parameters for the virtual target object, a series of operations can be performed through a series of control instructions, and the face in the virtual target object can also be switched; a virtual reality method can thus be provided to the user.
Referring to Fig. 5, Fig. 5 is a schematic flowchart of a fifth embodiment of a virtual reality application method provided by an embodiment of the present invention. The virtual reality application method described in this embodiment comprises the following steps:
501. In a virtual reality (VR) mode, obtain a virtual target object added by a user at a first position.
502. Detect a touch parameter for the virtual target object.
503. Generate a corresponding control instruction according to the touch parameter.
504. Perform a corresponding operation in response to the control instruction.
In this embodiment of the present invention, for the description of steps 501-504, reference may be made to steps 101-104 in the embodiment of the present invention described with respect to Fig. 1.
505. Obtain the pressing force applied to the touch screen.
In this embodiment of the present invention, the terminal may detect the pressing force of a touch operation performed by the user at any position on the touch screen. For example, if the user presses position A of the touch screen, the pressing force at position A can be detected.
506. Switch the background picture of the scene in which the virtual target object is located according to the pressing force.
In this embodiment of the present invention, the terminal may switch the background picture of the scene in which the virtual target object is located according to the pressing force. Specifically, the terminal may switch the background picture to a designated background according to the pressing force, where the background picture refers to the regions of the scene other than the region occupied by the virtual target object. For example, pressing force A corresponds to designated background a, pressing force B to designated background b, and pressing force C to designated background c; then, when the pressing force is A, the background picture of the scene is switched to background a.
Optionally, the terminal may determine a number of switches according to the pressing force and switch the background picture of the scene that number of times.
Further optionally, the terminal may determine a switching speed according to the pressing force; for example, the greater the pressing force, the faster the switching speed, and the smaller the pressing force, the slower the switching speed. The background picture of the scene can thus likewise be switched according to the pressing force.
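The switch-count variant of step 506 can be sketched as a scene that cycles through its background library, advancing a number of positions derived from the pressing force. The background library, class name and force-to-count rule below are hypothetical assumptions.

```python
# Hypothetical library of background pictures for the scene.
BACKGROUNDS = ["bg_a", "bg_b", "bg_c"]

class Scene:
    """Scene holding the virtual target object; everything outside the
    object's own region counts as the background picture."""
    def __init__(self):
        self.background_index = 0

    def switch_background(self, force: float) -> str:
        # The pressing force determines the number of switches; the scene
        # cycles through the preset backgrounds that many times.
        switches = max(1, int(force))
        self.background_index = (self.background_index + switches) % len(BACKGROUNDS)
        return BACKGROUNDS[self.background_index]

scene = Scene()
scene.switch_background(1.0)   # one switch: bg_a -> bg_b
```

A harder press advances further through the library in one operation, matching the "number of switches determined by the pressing force" variant.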
Through this embodiment of the present invention, in the VR mode, the virtual target object added by the user at the first position is obtained, the touch parameter for the virtual target object is detected, a corresponding control instruction is generated according to the touch parameter, and a corresponding operation is performed in response to the control instruction; furthermore, the pressing force applied to the touch screen is obtained, and the background picture of the scene in which the virtual target object is located is switched according to the pressing force. Different control instructions can therefore be generated according to the user's different touch parameters for the virtual target object, a series of operations can be performed through a series of control instructions, and the background picture of the scene can also be switched; a virtual reality method can thus be provided to the user.
Referring to Fig. 6a, Fig. 6a is a schematic structural diagram of a first embodiment of a terminal provided by an embodiment of the present invention. The terminal described in this embodiment includes:
a first obtaining unit 601, configured to obtain, in a virtual reality (VR) mode, a virtual target object added by a user at a first position;
a detection unit 602, configured to detect a touch parameter for the virtual target object obtained by the first obtaining unit 601;
a generation unit 603, configured to generate a corresponding control instruction according to the touch parameter detected by the detection unit 602; and
an execution unit 604, configured to perform a corresponding operation in response to the control instruction generated by the generation unit 603.
Optionally, the touch parameter is at least one of the pressing force with which the virtual target object is pressed, the pressing duration of the virtual target object, the number of touch points on the virtual target object, and the pressing area of the virtual target object.
Further optionally, as shown in Fig. 6b, when the touch parameter is the pressing force and the number of touch points, the generation unit 603 described in Fig. 6a may include:
a first judging module 6031, configured to judge whether the number of touch points is within a preset range;
a second judging module 6032, configured to judge, if the judgment result of the first judging module is yes, whether the pressing force is greater than a first preset threshold; and
a first generation module 6033, configured to generate a drag instruction if the judgment result of the second judging module is yes, where the drag instruction is used to instruct that the virtual target object be dragged from the first position to a second position.
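The drag branch handled by modules 6031 to 6033 can be sketched as follows. The preset range and the first threshold are illustrative assumptions; the embodiment does not fix their values.

```python
PRESET_RANGE = range(1, 3)   # assumed acceptable touch-point counts: 1 or 2
FIRST_THRESHOLD = 3.0        # assumed minimum pressing force for a drag

def maybe_drag_instruction(touch_points: int, pressing_force: float,
                           first_position, second_position):
    """Mirror of modules 6031-6033: judge the touch-point count, then the
    pressing force, then generate a drag instruction from the first
    position to the second position."""
    if touch_points in PRESET_RANGE and pressing_force > FIRST_THRESHOLD:
        return {"type": "drag", "from": first_position, "to": second_position}
    return None

instr = maybe_drag_instruction(1, 4.0, (0, 0), (100, 50))
```

One finger pressing with sufficient force yields a drag instruction carrying both positions; a touch-point count outside the preset range, or a weaker press, yields no instruction.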
Further optionally, as shown in Fig. 6c, when the touch parameter is the pressing duration and the number of touch points, the generation unit 603 described in Fig. 6a may include:
a third judging module 605, configured to judge whether the number of touch points is greater than a second preset threshold;
a fourth judging module 606, configured to judge, if the judgment result of the third judging module 605 is yes, whether the pressing duration is greater than a third preset threshold; and
a second generation module 607, configured to generate a fixing instruction if the judgment result of the fourth judging module is yes, where the fixing instruction is used to instruct that the virtual target object be fixed.
Further optionally, as shown in Fig. 6d, when the virtual target object contains a human face, the detection unit 602 is further configured to perform face detection on the virtual target object after the execution unit 604 performs the corresponding operation in response to the control instruction generated by the generation unit 603, so as to obtain the face of the virtual target object; the terminal described in Fig. 6a may further include:
a second obtaining unit 608, configured to obtain the pressing force applied to the touch screen; and
a first switching unit 609, configured to switch the face to any face in a preset face picture library according to the pressing force obtained by the second obtaining unit 608.
Further optionally, as shown in Fig. 6e, the detection unit 602 is further configured to detect the pressing force applied to the touch screen after the execution unit 604 performs the corresponding operation in response to the control instruction generated by the generation unit 603; the terminal described in Fig. 6a may further include:
a second switching unit 610, configured to switch the background picture of the scene in which the virtual target object is located according to the pressing force detected by the detection unit 602.
Through the terminal described in this embodiment of the present invention, in the VR mode, the virtual target object added by the user at the first position can be obtained, the touch parameter for the virtual target object can be detected, a corresponding control instruction can be generated according to the touch parameter, and a corresponding operation can be performed in response to the control instruction. Different control instructions are thus generated according to the user's different touch parameters for the virtual target object, and a series of operations are performed through a series of control instructions; a virtual reality method can therefore be provided to the user.
Referring to Fig. 7, Fig. 7 is a schematic structural diagram of a second embodiment of a terminal provided by an embodiment of the present invention. The terminal described in this embodiment includes: at least one input device 1000; at least one output device 2000; at least one processor 3000, for example a CPU; and a memory 4000. The input device 1000, the output device 2000, the processor 3000 and the memory 4000 are connected by a bus 5000.
The input device 1000 may specifically be a touch panel, a physical button or a mouse.
The output device 2000 may specifically be a display screen.
The memory 4000 may be a high-speed RAM memory, or may be a non-volatile memory, for example a disk memory. The memory 4000 is configured to store a set of program code, and the input device 1000, the output device 2000 and the processor 3000 are configured to call the program code stored in the memory 4000 to perform the following operations.
The processor 3000 is configured to:
in a virtual reality (VR) mode, obtain a virtual target object added by a user at a first position;
detect a touch parameter for the virtual target object;
generate a corresponding control instruction according to the touch parameter; and
perform a corresponding operation in response to the control instruction.
In some feasible embodiments, the touch parameter is at least one of the pressing force with which the virtual target object is pressed, the pressing duration of the virtual target object, the number of touch points on the virtual target object, and the pressing area of the virtual target object.
Optionally, when the touch parameter is the pressing force and the number of touch points, the processor 3000 generates the corresponding control instruction according to the touch parameter by:
judging whether the number of touch points is within a preset range;
if so, judging whether the pressing force is greater than a first preset threshold; and
if so, generating a drag instruction, where the drag instruction is used to instruct that the virtual target object be dragged from the first position to a second position.
Optionally, when the touch parameter is the pressing duration and the number of touch points, the processor 3000 generates the corresponding control instruction according to the touch parameter by:
judging whether the number of touch points is greater than a second preset threshold;
if so, judging whether the pressing duration is greater than a third preset threshold; and
if so, generating a fixing instruction, where the fixing instruction is used to instruct that the virtual target object be fixed.
Optionally, when the virtual target object contains a human face, after performing the corresponding operation in response to the control instruction, the processor 3000 is further specifically configured to:
perform face detection on the virtual target object, to obtain the face of the virtual target object;
obtain the pressing force applied to the touch screen; and
switch the face to any face in a preset face picture library according to the pressing force.
Optionally, after performing the corresponding operation in response to the control instruction, the processor 3000 is further specifically configured to:
obtain the pressing force applied to the touch screen; and
switch the background picture of the scene in which the virtual target object is located according to the pressing force.
In specific implementation, the input device 1000, the output device 2000 and the processor 3000 described in this embodiment of the present invention can carry out the implementations described in the first, second, third, fourth and fifth embodiments of the virtual reality application method provided by the embodiments of the present invention, and can also carry out the implementation of the terminal described in the first terminal embodiment provided by the embodiments of the present invention; details are not repeated here.
It should be noted that, for brevity of description, each of the foregoing method embodiments is expressed as a series of action combinations. However, a person skilled in the art should know that the present invention is not limited by the described order of actions, because according to the present invention, some steps may be performed in other orders or simultaneously. Secondly, a person skilled in the art should also know that the embodiments described in this specification are all preferred embodiments, and that the actions and modules involved are not necessarily required by the present invention.
In the above embodiments, the description of each embodiment has its own emphasis. For a part not described in detail in one embodiment, reference may be made to the related descriptions of the other embodiments.
In the several embodiments provided in this application, it should be understood that the disclosed apparatus may be implemented in other ways. For example, the apparatus embodiments described above are merely schematic: the division into units is only a division by logical function, and other divisions are possible in actual implementation; for instance, multiple units or components may be combined or integrated into another system, or some features may be ignored or not performed. In addition, the couplings, direct couplings or communication connections shown or discussed may be indirect couplings or communication connections through some interfaces, apparatuses or units, and may be electrical or in other forms.
The units described as separate components may or may not be physically separate, and the components shown as units may or may not be physical units; that is, they may be located in one place or distributed over multiple network elements. Some or all of the units may be selected according to actual needs to achieve the purpose of the solutions of the embodiments.
In addition, the functional units in the embodiments of the present invention may be integrated into one processing unit, or each unit may exist alone physically, or two or more units may be integrated into one unit. The integrated unit may be implemented in the form of hardware, or in the form of a software functional unit.
If the integrated unit is implemented in the form of a software functional unit and sold or used as an independent product, it may be stored in a computer-readable storage medium. Based on such an understanding, the technical solutions of the present invention, in essence, or the part contributing to the prior art, or all or part of the technical solutions, may be embodied in the form of a software product. The computer software product is stored in a storage medium and includes several instructions for causing a computer device (which may be a personal computer, a server, a network device or the like) to perform all or part of the steps of the methods described in the embodiments of the present invention. The aforementioned storage medium includes various media capable of storing program code, such as a USB flash disk, a read-only memory (ROM, Read-Only Memory), a random access memory (RAM, Random Access Memory), a portable hard drive, a magnetic disk or an optical disc.
The above embodiments are only intended to illustrate the technical solutions of the present invention, not to limit them. Although the present invention has been described in detail with reference to the foregoing embodiments, a person of ordinary skill in the art should understand that the technical solutions described in the foregoing embodiments may still be modified, or some of the technical features may be equivalently replaced, and that such modifications or replacements do not cause the essence of the corresponding technical solutions to depart from the scope of the technical solutions of the embodiments of the present invention.
Claims (13)
1. A virtual reality application method, characterized by comprising:
in a virtual reality (VR) mode, obtaining a virtual target object added by a user at a first position;
detecting a touch parameter for the virtual target object;
generating a corresponding control instruction according to the touch parameter; and
performing a corresponding operation in response to the control instruction.
2. The application method according to claim 1, characterized in that the touch parameter is at least one of a pressing force with which the virtual target object is pressed, a pressing duration of the virtual target object, a number of touch points on the virtual target object, and a pressing area of the virtual target object.
3. The application method according to claim 2, characterized in that, when the touch parameter is the pressing force and the number of touch points, the generating a corresponding control instruction according to the touch parameter comprises:
judging whether the number of touch points is within a preset range;
if so, judging whether the pressing force is greater than a first preset threshold; and
if so, generating a drag instruction, wherein the drag instruction is used to instruct that the virtual target object be dragged from the first position to a second position.
4. The application method according to claim 2, characterized in that, when the touch parameter is the pressing duration and the number of touch points, the generating a corresponding control instruction according to the touch parameter comprises:
judging whether the number of touch points is greater than a second preset threshold;
if so, judging whether the pressing duration is greater than a third preset threshold; and
if so, generating a fixing instruction, wherein the fixing instruction is used to instruct that the virtual target object be fixed.
5. The application method according to any one of claims 1 to 4, characterized in that, when the virtual target object contains a human face, after the performing a corresponding operation in response to the control instruction, the method further comprises:
performing face detection on the virtual target object, to obtain the face of the virtual target object;
obtaining a pressing force applied to a touch screen; and
switching the face to any face in a preset face picture library according to the pressing force.
6. The application method according to any one of claims 1 to 4, characterized in that, after the performing a corresponding operation in response to the control instruction, the method further comprises:
obtaining a pressing force applied to a touch screen; and
switching a background picture of a scene in which the virtual target object is located according to the pressing force.
7. A terminal, characterized by comprising:
a first obtaining unit, configured to obtain, in a virtual reality (VR) mode, a virtual target object added by a user at a first position;
a detection unit, configured to detect a touch parameter for the virtual target object obtained by the first obtaining unit;
a generation unit, configured to generate a corresponding control instruction according to the touch parameter detected by the detection unit; and
an execution unit, configured to perform a corresponding operation in response to the control instruction generated by the generation unit.
8. The terminal according to claim 7, characterized in that the touch parameter is at least one of a pressing force with which the virtual target object is pressed, a pressing duration of the virtual target object, a number of touch points on the virtual target object, and a pressing area of the virtual target object.
9. The terminal according to claim 8, characterized in that, when the touch parameter is the pressing force and the number of touch points, the generation unit includes:
a first judging module, configured to judge whether the number of touch points is within a preset range;
a second judging module, configured to judge, if the judgment result of the first judging module is yes, whether the pressing force is greater than a first preset threshold; and
a first generation module, configured to generate a drag instruction if the judgment result of the second judging module is yes, wherein the drag instruction is used to instruct that the virtual target object be dragged from the first position to a second position.
10. The terminal according to claim 8, characterized in that, when the touch parameter is the pressing duration and the number of touch points, the generation unit includes:
a third judging module, configured to judge whether the number of touch points is greater than a second preset threshold;
a fourth judging module, configured to judge, if the judgment result of the third judging module is yes, whether the pressing duration is greater than a third preset threshold; and
a second generation module, configured to generate a fixing instruction if the judgment result of the fourth judging module is yes, wherein the fixing instruction is used to instruct that the virtual target object be fixed.
11. The terminal according to any one of claims 7 to 10, characterized in that, when the virtual target object contains a human face, the detection unit is further configured to perform face detection on the virtual target object after the execution unit performs the corresponding operation in response to the control instruction generated by the generation unit, so as to obtain the face of the virtual target object;
the terminal further comprises:
a second obtaining unit, configured to obtain a pressing force applied to a touch screen; and
a first switching unit, configured to switch the face to any face in a preset face picture library according to the pressing force obtained by the second obtaining unit.
12. The terminal according to any one of claims 7 to 10, characterized in that the detection unit is further configured to detect a pressing force applied to a touch screen after the execution unit performs the corresponding operation in response to the control instruction generated by the generation unit;
the terminal further comprises:
a second switching unit, configured to switch a background picture of a scene in which the virtual target object is located according to the pressing force detected by the detection unit.
13. A terminal, characterized by comprising:
a memory storing executable program code; and
a processor coupled to the memory;
wherein the processor calls the executable program code stored in the memory to perform the method according to any one of claims 1 to 6.
Priority Applications (1)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
CN201610064431.3A CN105739879A (en) | 2016-01-29 | 2016-01-29 | Virtual reality application method and terminal |
Publications (1)
Publication Number | Publication Date |
---|---|
CN105739879A true CN105739879A (en) | 2016-07-06 |
Family
ID=56247940
Family Applications (1)
Application Number | Title | Priority Date | Filing Date |
---|---|---|---|
CN201610064431.3A Pending CN105739879A (en) | 2016-01-29 | 2016-01-29 | Virtual reality application method and terminal |
Country Status (1)
Country | Link |
---|---|
CN (1) | CN105739879A (en) |
- 2016-01-29: Application CN201610064431.3A filed in China (CN); published as CN105739879A; status Pending
Patent Citations (4)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
CN102105853A (en) * | 2008-07-25 | 2011-06-22 | Microsoft Corp. | Touch interaction with a curved display |
CN103999030A (en) * | 2011-12-26 | 2014-08-20 | Sony Corp. | Head-mounted display and information display device |
CN103970268A (en) * | 2013-02-01 | 2014-08-06 | Sony Corp. | Information processing device, client device, information processing method, and program |
CN104156993A (en) * | 2014-07-18 | 2014-11-19 | Xiaomi Inc. | Method and device for switching face image in picture |
Cited By (8)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
CN106774925A (en) * | 2016-12-30 | 2017-05-31 | Vivo Mobile Communication Co., Ltd. | Data processing method for a virtual reality terminal, and virtual reality terminal |
CN107908322A (en) * | 2017-10-27 | 2018-04-13 | 深圳市创凯智能股份有限公司 | Object deletion method and device based on three-dimensional space, and computer-readable storage medium |
WO2019109904A1 (en) * | 2017-12-04 | 2019-06-13 | JRD Communication (Shenzhen) Ltd. | Intelligent terminal control method, intelligent terminal, and device having storage function |
CN109508093A (en) * | 2018-11-13 | 2019-03-22 | 宁波视睿迪光电有限公司 | Virtual reality interaction method and device |
CN111047392A (en) * | 2019-11-11 | 2020-04-21 | Beijing Megvii Technology Co., Ltd. | System, method and storage medium for constructing digital store |
CN111047392B (en) * | 2019-11-11 | 2024-02-06 | Beijing Megvii Technology Co., Ltd. | System, method and storage medium for constructing digital store |
CN111399654A (en) * | 2020-03-25 | 2020-07-10 | Guangdong Oppo Mobile Telecommunications Corp., Ltd. | Information processing method and device, electronic equipment, and storage medium |
CN112698778A (en) * | 2021-03-23 | 2021-04-23 | 北京芯海视界三维科技有限公司 | Method and device for transmitting a target between devices, and electronic device |
Similar Documents
Publication | Publication Date | Title |
---|---|---|
CN105739879A (en) | Virtual reality application method and terminal | |
KR100687737B1 (en) | Apparatus and method for a virtual mouse based on two-hands gesture | |
EP2699986B1 (en) | Touch screen selection | |
CN104063128B (en) | Information processing method and electronic equipment | |
KR20150014083A (en) | Method For Sensing Inputs of Electrical Device And Electrical Device Thereof | |
CN108064368A (en) | Control method and device for a flexible display device | |
US20140218315A1 (en) | Gesture input distinguishing method and apparatus in touch input device | |
JP2017504877A (en) | Method and apparatus for click object enlargement based on floating touch | |
CN105744054A (en) | Mobile terminal control method and mobile terminal | |
US10042445B1 (en) | Adaptive display of user interface elements based on proximity sensing | |
CN108845752A (en) | Touch operation method and device, storage medium, and electronic equipment | |
CN105824553A (en) | Touch method and mobile terminal | |
CN103455262A (en) | Pen-based interaction method and system based on mobile computing platform | |
CN112416236A (en) | Gesture packaging and interaction method and device based on web page and storage medium | |
EP3008556B1 (en) | Disambiguation of indirect input | |
CN112506502A (en) | Visual programming method, device, equipment and storage medium based on human-computer interaction | |
CN107577404B (en) | Information processing method and device and electronic equipment | |
KR101060175B1 (en) | Method for controlling touch screen, recording medium for the same, and method for controlling cloud computing | |
CN108073347B (en) | Multi-point touch processing method, computing device and computer storage medium | |
CN104978135A (en) | Icon display method and device, and mobile terminal | |
CN104317492A (en) | Wallpaper setting method | |
CN115917488A (en) | Display interface processing method and device and storage medium | |
CN113342222B (en) | Application classification method and device and electronic equipment | |
US9799103B2 (en) | Image processing method, non-transitory computer-readable storage medium and electrical device | |
CN112667931B (en) | Webpage collecting method, electronic equipment and storage medium |
Legal Events
Date | Code | Title | Description |
---|---|---|---|
| C06 | Publication | |
| PB01 | Publication | |
| C10 | Entry into substantive examination | |
| SE01 | Entry into force of request for substantive examination | |
| RJ01 | Rejection of invention patent application after publication | Application publication date: 20160706 |