CN106371559A - Interactive method, interactive apparatus and user equipment - Google Patents
- Publication number
- CN106371559A (application number CN201510490318.7A)
- Authority
- CN
- China
- Prior art keywords
- information
- user
- portable equipment
- condition
- motion
- Prior art date
- Legal status
- Granted
Landscapes
- User Interface Of Digital Computer (AREA)
Abstract
Embodiments of the invention disclose an interactive method, an interactive apparatus, and user equipment. The method comprises: obtaining first information corresponding to a first motion of a user's head; obtaining second information corresponding to a second motion of a portable device carried by the user; and, in response to the first information and the second information meeting a first condition, executing at least one first operation corresponding to the first condition, wherein the first condition corresponds to at least part of the portable device moving from outside the user's field of view into the user's field of view. According to at least one implementation of the method, the apparatus, and the user equipment provided by the embodiments of the invention, it is determined, based on both the motion of the user's head and the motion of the portable device, that the portable device has moved from outside the user's field of view into it, and an operation is executed accordingly, so that the user's operation intention with respect to the portable device can be determined more accurately, which is convenient for the user.
Description
Technical field
The present application relates to the field of interaction technologies, and more particularly, to an interactive method, an interactive apparatus, and user equipment.
Background technology
With the development of human-computer interaction technology, the ease of use of user equipment draws increasing attention from users. If, before a user performs an operation, the user equipment can sense the user's operation intention in advance and execute a corresponding operation, the ease of use of the user equipment can be greatly improved. Therefore, how to enable user equipment to perceive a user's operation intention accurately and in time has become a problem of growing concern to designers.
Content of the invention
A possible objective of the embodiments of the present application is to provide an interaction technique solution.
In a first aspect, a possible embodiment of the present application provides an interactive method, comprising:
obtaining first information corresponding to a first motion of a user's head;
obtaining second information corresponding to a second motion of a portable device carried by the user;
in response to the first information and the second information meeting a first condition, executing at least one first operation corresponding to the first condition;
wherein the first condition corresponds to: at least part of the portable device moving from outside the field of view of the user into the field of view of the user.
With reference to the first aspect, in a second possible implementation, the first information and the second information meeting the first condition comprises:
the first information meeting a first information change pattern, wherein the first information change pattern corresponds to the user's head moving from a non-interaction attitude to an interaction attitude;
the second information meeting a second information change pattern, wherein the second information change pattern corresponds to the at least part of the portable device moving from a non-interaction position to an interaction position;
wherein the interaction position is located within the field of view of the user corresponding to the interaction attitude.
With reference to any one of the foregoing possible implementations of the first aspect, in a third possible implementation, the at least one first operation comprises: at least partly activating the portable device.
With reference to any one of the foregoing possible implementations of the first aspect, in a fourth possible implementation, the method further comprises:
in response to the first information and the second information meeting a second condition, executing at least one second operation corresponding to the second condition;
wherein the second condition corresponds to: at least part of the portable device moving from within the field of view of the user to outside the field of view of the user.
With reference to any one of the foregoing possible implementations of the first aspect, in a fifth possible implementation, the first information and the second information meeting the second condition comprises:
the first information meeting a third information change pattern, wherein the third information change pattern corresponds to the user's head moving from an interaction attitude to a non-interaction attitude; and/or
the second information meeting a fourth information change pattern, wherein the fourth information change pattern corresponds to the at least part of the portable device moving from an interaction position to a non-interaction position.
With reference to any one of the foregoing possible implementations of the first aspect, in a sixth possible implementation, the at least one second operation comprises: at least partly putting the portable device into a sleep state.
With reference to any one of the foregoing possible implementations of the first aspect, in a seventh possible implementation, the first information comprises at least one of the following:
first sensing information corresponding to the first motion, and first motion attitude information corresponding to the first motion.
With reference to any one of the foregoing possible implementations of the first aspect, in an eighth possible implementation, the first sensing information comprises at least one of the following:
at least one piece of first inertial sensing information of at least one head-mounted device of the user, at least one first image sequence captured by the at least one head-mounted device, at least one second image sequence of the user's head, at least one piece of electromyographic (EMG) sensing information corresponding to at least one region of the user's neck, and at least one piece of bending sensing information corresponding to at least one region of the user's neck.
With reference to any one of the foregoing possible implementations of the first aspect, in a ninth possible implementation, the second information comprises at least one of the following:
second sensing information corresponding to the second motion, and second motion attitude information corresponding to the second motion.
With reference to any one of the foregoing possible implementations of the first aspect, in a tenth possible implementation, the second sensing information comprises at least one of the following:
at least one piece of second inertial sensing information of the portable device, at least one third image sequence captured by the portable device, and at least one fourth image sequence of the portable device.
With reference to any one of the foregoing possible implementations of the first aspect, in an eleventh possible implementation, the at least part of the portable device comprises:
an interactable area of the portable device.
With reference to any one of the foregoing possible implementations of the first aspect, in a twelfth possible implementation, the interactable area comprises: a visual indication area.
In a second aspect, a possible embodiment of the present application provides an interactive apparatus, comprising:
a first information acquisition module, configured to obtain first information corresponding to a first motion of a user's head;
a second information acquisition module, configured to obtain second information corresponding to a second motion of a portable device carried by the user;
a first processing and execution module, configured to, in response to the first information and the second information meeting a first condition, execute at least one first operation corresponding to the first condition; wherein the first condition corresponds to: at least part of the portable device moving from outside the field of view of the user into the field of view of the user.
With reference to the second aspect, in a second possible implementation, the first information and the second information meeting the first condition comprises:
the first information meeting a first information change pattern, wherein the first information change pattern corresponds to the user's head moving from a non-interaction attitude to an interaction attitude;
the second information meeting a second information change pattern, wherein the second information change pattern corresponds to the at least part of the portable device moving from a non-interaction position to an interaction position;
wherein the interaction position is located within the field of view of the user corresponding to the interaction attitude.
With reference to any one of the foregoing possible implementations of the second aspect, in a third possible implementation, the at least one first operation comprises: at least partly activating the portable device.
With reference to any one of the foregoing possible implementations of the second aspect, in a fourth possible implementation, the apparatus further comprises:
a second processing and execution module, configured to, in response to the first information and the second information meeting a second condition, execute at least one second operation corresponding to the second condition;
wherein the second condition corresponds to: at least part of the portable device moving from within the field of view of the user to outside the field of view of the user.
With reference to any one of the foregoing possible implementations of the second aspect, in a fifth possible implementation, the first information and the second information meeting the second condition comprises:
the first information meeting a third information change pattern, wherein the third information change pattern corresponds to the user's head moving from an interaction attitude to a non-interaction attitude; and/or
the second information meeting a fourth information change pattern, wherein the fourth information change pattern corresponds to the at least part of the portable device moving from an interaction position to a non-interaction position.
With reference to any one of the foregoing possible implementations of the second aspect, in a sixth possible implementation, the at least one second operation comprises: at least partly putting the portable device into a sleep state.
With reference to any one of the foregoing possible implementations of the second aspect, in a seventh possible implementation, the first information comprises at least one of the following:
first sensing information corresponding to the first motion, and first motion attitude information corresponding to the first motion.
With reference to any one of the foregoing possible implementations of the second aspect, in an eighth possible implementation, the first sensing information comprises at least one of the following:
at least one piece of first inertial sensing information of at least one head-mounted device of the user, at least one first image sequence captured by the at least one head-mounted device, at least one second image sequence of the user's head, at least one piece of electromyographic (EMG) sensing information corresponding to at least one region of the user's neck, and at least one piece of bending sensing information corresponding to at least one region of the user's neck.
With reference to any one of the foregoing possible implementations of the second aspect, in a ninth possible implementation, the second information comprises at least one of the following:
second sensing information corresponding to the second motion, and second motion attitude information corresponding to the second motion.
With reference to any one of the foregoing possible implementations of the second aspect, in a tenth possible implementation, the second sensing information comprises at least one of the following:
at least one piece of second inertial sensing information of the portable device, at least one third image sequence captured by the portable device, and at least one fourth image sequence of the portable device.
With reference to any one of the foregoing possible implementations of the second aspect, in an eleventh possible implementation, the at least part of the portable device comprises:
an interactable area of the portable device.
With reference to any one of the foregoing possible implementations of the second aspect, in a twelfth possible implementation, the interactable area comprises: a visual indication area.
In a third aspect, a possible embodiment of the present application provides user equipment, the user equipment comprising:
a memory, configured to store a program;
a processor, configured to execute the program stored in the memory, the program causing the processor to perform the following operations:
obtaining first information corresponding to a first motion of a user's head;
obtaining second information corresponding to a second motion of a portable device carried by the user;
in response to the first information and the second information meeting a first condition, executing at least one first operation corresponding to the first condition;
wherein the first condition corresponds to: at least part of the portable device moving from outside the field of view of the user into the field of view of the user.
According to at least one embodiment of the present application, it is determined, based on both the motion of the user's head and the motion of the portable device, that the portable device has moved from outside the user's field of view into it, and an operation is then executed, so that the user's operation intention with respect to the portable device can be determined more accurately, which is convenient for the user.
Brief description of the drawings
Fig. 1 is a schematic flowchart of an interactive method according to an embodiment of the present application;
Fig. 2a and Fig. 2b are schematic diagrams of application scenarios of an embodiment of the present application;
Fig. 3 is a schematic structural block diagram of an interactive apparatus according to an embodiment of the present application;
Fig. 4a to 4c are schematic structural block diagrams of three interactive apparatuses according to embodiments of the present application;
Fig. 5 is a schematic structural block diagram of user equipment according to an embodiment of the present application.
Specific embodiments
The specific embodiments of the present application are described in further detail below with reference to the accompanying drawings (in some drawings, identical reference numerals denote identical elements) and to embodiments. The following embodiments are used to illustrate the present application, and are not intended to limit the scope of the present application.
Those skilled in the art will understand that terms such as "first" and "second" in the present application are used only to distinguish different steps, devices, modules, and the like, and neither denote any particular technical meaning nor indicate a necessary logical order between them.
As shown in Fig. 1, a possible embodiment of the present application provides an interactive method, comprising:
S110: obtaining first information corresponding to a first motion of a user's head;
S120: obtaining second information corresponding to a second motion of a portable device carried by the user;
S130: in response to the first information and the second information meeting a first condition, executing at least one first operation corresponding to the first condition; wherein the first condition corresponds to: at least part of the portable device moving from outside the field of view of the user into the field of view of the user.
For example, an interactive apparatus provided by the present application serves as the execution body of this embodiment and performs S110 to S130. Specifically, the interactive apparatus may be arranged in user equipment in the form of software, hardware, or a combination of software and hardware, or the interactive apparatus may itself be the user equipment; the user equipment includes, but is not limited to, a smartphone, a smart band, a smartwatch, a smart ring, and the like.
In the embodiments of the present application, whether the portable device enters the field of view of the user is determined based on both the first motion of the user's head and the second motion of the portable device carried by the user, so that the user's operation intention with respect to the portable device can be identified more accurately and a corresponding operation can then be executed, avoiding possible misoperations caused by an unintentional head movement of the user or an unintentional movement of the limb carrying the portable device.
The steps of the embodiments of the present application are further described through the following embodiments:
S110: obtaining first information corresponding to a first motion of a user's head.
In a possible implementation, optionally, the first information may include: first sensing information corresponding to the first motion.
In a possible implementation, optionally, the first sensing information may include:
at least one piece of first inertial sensing information of at least one head-mounted device of the user.
In a possible implementation, the at least one piece of first inertial sensing information is collected by at least one inertial sensor arranged on the at least one head-mounted device. In a possible implementation, the at least one inertial sensor may itself be the at least one head-mounted device.
The at least one head-mounted device may be, for example, one or more of glasses, a helmet, an earphone, and the like.
In this embodiment, the first motion of the user's head causes the at least one head-mounted device to move, so that the at least one inertial sensor collects at least one piece of first inertial sensing information corresponding to the first motion.
In a possible implementation, optionally, the first sensing information may include:
at least one first image sequence captured by the at least one head-mounted device.
In this embodiment, the at least one first image sequence may be captured by at least one image capture device on the at least one head-mounted device. The first image sequence may be, for example, a plurality of first images ordered in time. In a possible implementation, the first image sequence may be, for example, the frames comprised in a segment of video.
In this embodiment, the first motion of the user's head causes the at least one head-mounted device to move, so that the positions of one or more objects in the plurality of first images of the at least one first image sequence change correspondingly with the first motion.
In the foregoing implementations, obtaining the first information may be, for example, obtaining the first information from the at least one head-mounted device by means of communication.
Since the motion of the user's head usually corresponds to the motion of the user's neck, in some possible implementations the first motion may be determined according to sensing information corresponding to the user's neck.
In a possible implementation, optionally, the first sensing information may include:
at least one piece of electromyographic (EMG) sensing information corresponding to at least one region of the user's neck.
The head motion of the user corresponds to the muscular motion of the user's neck; therefore, in this implementation, the first motion can be determined according to the at least one piece of EMG sensing information corresponding to the at least one region of the neck. In some possible implementations, the correspondence between the at least one piece of EMG sensing information and the first motion, or the user's field of view, can be obtained through training. The at least one region may be, for example, the back of the user's neck.
In a possible implementation, the at least one piece of EMG sensing information may be obtained, for example, through a neck ring or neck band worn on the user's neck, or through a plurality of attached EMG sensing electrode patches.
In a possible implementation, optionally, the first sensing information may include:
at least one piece of bending sensing information corresponding to at least one region of the user's neck.
The head motion of the user corresponds to the bending angle of the user's neck; for example, when the user looks down, the neck bends forward, and when the user looks up, the neck bends backward. In this embodiment, the at least one piece of bending sensing information may be obtained, for example, through a bend sensor attached to the skin of at least one region of the user's neck. In some possible implementations, the correspondence between the at least one piece of bending sensing information and the first motion, or the user's field of view, can be obtained through training. The at least one region may be, for example, the section of the user's neck from the chin to the Adam's apple.
In a possible implementation, optionally, the first sensing information may include:
at least one second image sequence of the user's head.
In a possible implementation, the at least one second image sequence may be captured by an external device. In this embodiment, obtaining the first information may be, for example, obtaining the first information from the external device by means of communication.
Of course, those skilled in the art will recognize that the first sensing information may also include more than one of the foregoing kinds of first sensing information, to further improve the accuracy of determining the motion attitude of the user's head. Moreover, in addition to the kinds of first sensing information listed above, other possible sensing information for determining the motion attitude of the user's head may also be applied in the embodiments of the present application, which will not be enumerated here.
In a possible implementation, the first information may include: first motion attitude information corresponding to the first motion.
In a possible implementation, the first motion attitude information may be obtained according to one or more of the kinds of first sensing information described above. For example, the first motion attitude information is obtained according to the at least one piece of first inertial sensing information.
The first motion attitude information may include, for example, motion change information and attitude change information of the first motion.
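As an illustrative sketch of how first motion attitude information might be derived from first inertial sensing information, the head's pitch change can be estimated by integrating the pitch-axis angular rate reported by a head-mounted gyroscope. The function name, sampling rate, and rate values below are assumptions for illustration, not taken from the application:

```python
import math

def pitch_change_deg(gyro_pitch_rates, dt):
    # Integrate pitch-axis angular rates (rad/s), sampled every dt
    # seconds, into a total pitch change in degrees.
    return math.degrees(sum(r * dt for r in gyro_pitch_rates))

# A head tilting forward-downward at ~0.35 rad/s for 2 s, sampled at 100 Hz:
samples = [0.35] * 200
change = pitch_change_deg(samples, 0.01)  # about 40 degrees downward
```

In practice, an attitude filter fusing gyroscope and accelerometer data would be used instead of raw integration, which drifts over time.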
S120: obtaining second information corresponding to a second motion of a portable device carried by the user.
Similar to the first information, in the embodiments of the present application, the second information may include at least one of the following:
second sensing information corresponding to the second motion, and second motion attitude information corresponding to the second motion.
The second motion attitude information may be obtained according to the second sensing information.
In a possible implementation, similar to the first sensing information, the second sensing information may include:
at least one piece of second inertial sensing information of the portable device and/or at least one third image sequence captured by the portable device.
In this embodiment, when the execution body of the interactive method is not the portable device, in S120 the second information may be obtained from the portable device by means of communication; when the execution body is itself the portable device, the second information may be obtained directly locally.
In a possible implementation, the second sensing information may include: at least one fourth image sequence of the portable device.
S130: in response to the first information and the second information meeting a first condition, executing at least one first operation corresponding to the first condition; wherein the first condition corresponds to: at least part of the portable device moving from outside the field of view of the user into the field of view of the user.
In this embodiment, the field of view of the user may be set as required. For example, the typical effective field of view of the human eye, the absolute field of view, the range within which the user can comfortably see an object, or the like may be taken as the field of view of the user.
In a possible implementation, for example, S130 may: determine attitude change information of the user's head according to the first information; determine position change information of the portable device (which may be position change information relative to the user) according to the second information; and determine whether, during the attitude change of the user's head, the position of the portable device can enter the field of view from outside it. If so, the first information and the second information meet the first condition, and the at least one first operation is executed.
In another possible implementation, the correspondence among the first information, the second information, and the first condition may be obtained in advance; then, from the first information, the second information, and the correspondence, it can be determined whether the first information and the second information meet the first condition.
The correspondence may be, for example: when the first information meets a first information change pattern and the second information also meets at least one second information change pattern, the first information and the second information meet the first condition.
A possible implementation of the correspondence is shown in Table 1:
Table 1:
| Combination No. | First information change pattern | Second information change pattern |
| --- | --- | --- |
| 1 | v11 | v21 |
| 2 | v11 | v22 |
| 3 | v12 | v23 |
| 4 | v13 | v24 |
| 5 | v14 | v25 |
| … | … | … |
Each row of Table 1 represents a combination of first and second information that meets the first condition. For example, as long as the first information meets change pattern v11 and the second information simultaneously meets change pattern v21 or v22, the first information and the second information meet the first condition.
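One way such a pre-obtained correspondence could be checked is a simple set lookup over the qualifying pattern pairs of Table 1. The pattern identifiers come from the table; the function and constant names are illustrative:

```python
# Table 1 as a set of (first, second) change-pattern pairs that
# together meet the first condition.
FIRST_CONDITION_COMBOS = {
    ("v11", "v21"), ("v11", "v22"), ("v12", "v23"),
    ("v13", "v24"), ("v14", "v25"),
}

def pair_meets_first_condition(first_pattern, second_pattern):
    # The condition is met when the observed pair of change patterns
    # matches any row of the pre-obtained correspondence table.
    return (first_pattern, second_pattern) in FIRST_CONDITION_COMBOS

matched = pair_meets_first_condition("v11", "v22")  # True: row 2 of Table 1
```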
Take, as an example, the case in which the portable device is a smartwatch 210 worn on the user's wrist: the first information is first motion attitude information obtained from first inertial sensing information collected by an earphone 220 worn on the user's head, and the second information is second motion attitude information obtained from second inertial sensing information collected by the smartwatch 210.
As shown in Fig. 2a and Fig. 2b, in a possible implementation, a first information change pattern may be, for example: rotating forward and downward by 30 to 45 degrees, which corresponds to the user's head changing from looking roughly straight ahead to looking down by 30 to 45 degrees; a second information change pattern may be, for example: moving forward and upward by a set distance while flipping over in the process, which corresponds to the user's wrist being lifted and turned. It can be seen from Fig. 2b that only after the first information meets the above first information change pattern and the second information meets the above second information change pattern does the face of the smartwatch move from outside the field of view into the user's field of view FOV.
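A sketch of detecting these two change patterns from motion attitude information might look as follows. The distance and roll thresholds for the wrist motion are illustrative assumptions, since the application only specifies "a set distance" and an accompanying flip:

```python
def head_pattern_met(pitch_change_deg):
    # First information change pattern: the head rotates forward and
    # downward by 30 to 45 degrees (from roughly level to looking down).
    return 30.0 <= pitch_change_deg <= 45.0

def wrist_pattern_met(forward_up_cm, roll_change_deg,
                      min_distance_cm=15.0, min_roll_deg=120.0):
    # Second information change pattern: the wrist moves forward and
    # upward by at least a set distance and flips over along the way.
    # Both thresholds here are illustrative, not from the application.
    return forward_up_cm >= min_distance_cm and abs(roll_change_deg) >= min_roll_deg

# Head pitched down 38 degrees; watch moved 22 cm and rolled 150 degrees:
wake = head_pattern_met(38.0) and wrist_pattern_met(22.0, 150.0)  # True
```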
In a possible implementation, the first information and the second information meeting the first condition may include:
the first information meeting a first information change pattern, wherein the first information change pattern corresponds to the user's head moving from a non-interaction attitude to an interaction attitude;
the second information meeting a second information change pattern, wherein the second information change pattern corresponds to the at least part of the portable device moving from a non-interaction position to an interaction position;
wherein the interaction position is located within the field of view of the user corresponding to the interaction attitude.
In this embodiment, the non-interaction attitude, the interaction attitude, the non-interaction position, and the interaction position can be obtained by learning, for example, from the user's usage habits.
It can be seen that, in this embodiment, the at least one first operation is executed only when the user's head changes into an interaction attitude, the position of the portable device is an interaction position, and the interaction position is within the field of view of the user corresponding to the interaction attitude, further reducing the possibility of misoperation.
For example, when the user raises the head and lifts the wrist to take an object placed high up, the portable device also moves from outside the user's field of view into the field of view; however, the user's head is not in an interaction posture with respect to the portable device (for example, in one possible implementation, only a head-lowered posture is set as the interaction posture for interacting with the portable device), and therefore the at least one first operation is not triggered.
The at least one first operation may be an operation related to the head-mounted device and/or the portable device, for example an interactive operation between the head-mounted device and the portable device, such as a data-transfer operation.
In one possible implementation, optionally, the at least one first operation may include: at least partly activating the portable device.
For example, the portable device is a smart watch, and the at least one first operation includes: waking the smart watch from a power-saving mode, for example lighting up the screen of the smart watch.
In one possible implementation, optionally, the at least part of the portable device may include:
an interactable area of the portable device.
Those skilled in the art will appreciate that, when the user needs to operate the portable device, it is usually the interactable area of the portable device that lies within the user's field of view; the whole of the portable device does not necessarily need to lie within the field of view.
The interactable area is a region available for user interaction, for example: regions operated by the user's fingers, such as a touch pad, a touch screen or buttons; and visual indication areas watched by the user's eyes, such as a screen or indicator lights.
In one possible implementation, optionally, the method may further include:
in response to the first information and the second information satisfying a second condition, executing at least one second operation corresponding to the second condition;
wherein the second condition corresponds to: the at least part of the portable device moving from within the user's field of view to outside the user's field of view.
The at least one second operation may be an operation related to the head-mounted device and/or the portable device, for example an interactive operation between the head-mounted device and the portable device, such as stopping a data transfer.
In one possible implementation, optionally, the at least one second operation may include: at least partly putting the portable device to sleep. For example, corresponding to the activation of the portable device described above, when the portable device leaves the user's field of view, the portable device may enter a power-saving mode, such as switching off its screen.
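The paired activate/sleep behaviour can be sketched as a two-state machine driven by the two conditions. A minimal illustration under assumed event names:

```python
class PortableDeviceScreen:
    """Toggle between active and power-saving mode as the portable device
    enters or leaves the user's field of view."""

    def __init__(self) -> None:
        self.awake = False  # starts in power-saving mode

    def on_enter_fov(self) -> None:
        # First condition met -> at least partly activate the device.
        self.awake = True

    def on_leave_fov(self) -> None:
        # Second condition met -> at least partly put the device to sleep.
        self.awake = False
```

A real implementation would debounce these events; this sketch only captures the symmetry the text describes.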
In one possible implementation, the first information and the second information satisfying the second condition may include:
the first information satisfies a third information change rule, wherein the third information change rule corresponds to the user's head moving from an interaction posture to a non-interaction posture.
Or, in one possible implementation, the first information and the second information satisfying the second condition may include:
the second information satisfies a fourth information change rule, wherein the fourth information change rule corresponds to the at least part of the portable device moving from an interaction position to a non-interaction position.
Or, in one possible implementation, the first information and the second information satisfying the second condition may include: the first information satisfies the third information change rule, and the second information satisfies the fourth information change rule.
Generally, in the non-interaction posture or at the non-interaction position, the portable device does not appear in the user's field of view. Therefore, the user's head moving from an interaction posture to a non-interaction posture, or the portable device moving from an interaction position to a non-interaction position, indicates that the user intends to end the interaction. In this implementation, executing the at least one second operation in response to such an action brings a better experience to the user.
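The three variants of the second condition (third rule alone, fourth rule alone, or both) can be sketched with a mode selector. The mode names are illustrative assumptions:

```python
def satisfies_second_condition(head_rule_met: bool, device_rule_met: bool,
                               mode: str = "both") -> bool:
    """Evaluate the second condition under one of the three variants in the text:
    'head'   -> third information change rule alone (head leaves interaction posture),
    'device' -> fourth information change rule alone (device leaves interaction position),
    'both'   -> both rules must hold."""
    if mode == "head":
        return head_rule_met
    if mode == "device":
        return device_rule_met
    return head_rule_met and device_rule_met
```

Requiring both rules ("both") is the most conservative choice; either single rule reacts faster but risks ending the interaction prematurely.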
The embodiments of the present application are further illustrated by the application scenario shown in Figs. 2a and 2b. In this implementation, the executing subject of the interaction method is the portable device, namely the smart watch 210.
The smart watch 210 obtains from the earphone 220 the first inertial sensing information corresponding to the motion of the user's head, for example acceleration sensing information and gyroscope sensing information. At the same time, the smart watch 210 obtains the second inertial sensing information from its own inertial sensor. When the user lowers the head and raises the hand to look at the smart watch 210, the first inertial sensing information contains the first information corresponding to the first motion of the user lowering the head, the second inertial sensing information contains the second information corresponding to the second motion of the user raising the hand, and these two parts overlap in time.
According to step S130 of the method in the embodiment described above, it may be determined that the first information and the second information satisfy the first condition. Therefore, while the user raises the hand, the portable device wakes from the power-saving mode and enters an interactable mode that is convenient for the user to view and operate.
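The requirement that the two motions "overlap in time" can be checked with a simple interval test over the motions' start and end timestamps. A sketch, with illustrative (start, end) tuples in seconds:

```python
def overlap_in_time(head_motion: tuple, wrist_motion: tuple) -> bool:
    """Return True if the head-lowering interval and the wrist-raising
    interval coincide at least partially in time."""
    (a0, a1), (b0, b1) = head_motion, wrist_motion
    return max(a0, b0) < min(a1, b1)
```

If the head dipped seconds before the wrist rose, the intervals do not intersect and the first condition is not met.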
Of course, in other possible implementations, the executing subject of the interaction method may also be a device other than the portable device. For example, in the application scenario shown in Figs. 2a and 2b, the executing subject may be a mobile phone of the user. The mobile phone is communicatively connected to the earphone 220 and the smart watch 210 respectively, obtains the first information from the earphone 220 and the second information from the smart watch 210, determines after processing whether the first information and the second information satisfy the first condition and, when they do, sends a control instruction to the smart watch 210, for example an instruction to wake from the power-saving mode.
Of course, in one possible implementation, the executing subject may also be the earphone 220. For example, the earphone 220 obtains the second information from the smart watch 210, processes it together with the first information acquired by itself, determines whether the first condition is satisfied and, when it is, executes the interactive operation with the smart watch 210.
Those skilled in the art will understand that, in the above methods of the specific embodiments of the present application, the numbering of the steps does not imply an order of execution; the order of execution of the steps should be determined by their functions and internal logic, and does not constitute any limitation on the implementation process of the specific embodiments of the present application.
As shown in Fig. 3, one possible embodiment of the present application provides an interaction apparatus 300, comprising:
a first information acquisition module 310, configured to obtain first information corresponding to a first motion of a user's head;
a second information acquisition module 320, configured to obtain second information corresponding to a second motion of a portable device carried by the user; and
a first processing and execution module 330, configured to execute, in response to the first information and the second information satisfying a first condition, at least one first operation corresponding to the first condition; wherein the first condition corresponds to: at least part of the portable device moving from outside the user's field of view into the user's field of view.
The embodiments of the present application determine, simultaneously from the first motion of the user's head and the second motion of the portable device carried by the user, whether the portable device enters the user's field of view. The user's operation intention towards the portable device can thereby be identified more accurately before the corresponding operation is executed, avoiding accidental operations caused by unintentional movements of the head or of the limb carrying the portable device.
The modules of the embodiments of the present application are further illustrated by the following implementations.
In one possible implementation, optionally, the first information may include:
first sensing information corresponding to the first motion.
In one possible implementation, optionally, the first sensing information may include:
at least one piece of first inertial sensing information of at least one head-mounted device of the user.
In one possible implementation, the at least one piece of first inertial sensing information is collected by at least one inertial sensor configured on the at least one head-mounted device. The at least one head-mounted device may be, for example, one or more of glasses, a helmet, an earphone, and the like.
In this implementation, the first motion of the user's head causes the at least one head-mounted device to move, so that the at least one inertial sensor collects at least one piece of first inertial sensing information corresponding to the first motion.
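A head-pitch change can be recovered from such inertial sensing information by integrating the pitch-axis angular rate over time. A minimal sketch, assuming rate samples in degrees per second at a fixed sampling step:

```python
def integrate_pitch_deg(pitch_rates_dps, dt_s: float) -> float:
    """Accumulate pitch-axis gyroscope samples (deg/s) into a total pitch
    change over the motion, using a fixed sampling interval dt_s (seconds)."""
    return sum(rate * dt_s for rate in pitch_rates_dps)
```

For example, half a second of samples at 90 deg/s yields the 45 degree downward rotation at the upper end of the first information change rule. A production implementation would fuse accelerometer and gyroscope data to correct drift; this sketch shows only the principle.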
In one possible implementation, optionally, the first sensing information may include:
at least one first image sequence collected by the at least one head-mounted device.
In this implementation, the at least one first image sequence may be collected by at least one image collector on the at least one head-mounted device. The first image sequence may be, for example, a plurality of first images arranged in time order. In one possible implementation, the first image sequence may be, for example, a plurality of frames contained in a segment of video.
In this implementation, the first motion of the user's head causes the at least one head-mounted device to move, so that the positions of one or more objects in the at least one first image sequence change across the plurality of first images in correspondence with the first motion.
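The correspondence between head motion and object displacement in the image sequence can be sketched as an average shift of tracked points between frames: when the head pitches downward, scene objects shift upward (toward smaller row coordinates) in the image. The point format (x, y) and the tracking itself are assumptions; real systems would use an optical-flow or feature-tracking library.

```python
def mean_vertical_shift(points_before, points_after) -> float:
    """Average vertical (y) displacement of matched object points between
    two frames; negative means the objects moved up in the image."""
    shifts = [a[1] - b[1] for b, a in zip(points_before, points_after)]
    return sum(shifts) / len(shifts)
```

A sustained negative shift across the sequence would thus be evidence of the head-lowering first motion.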
In one possible implementation, optionally, the first sensing information may include:
at least one piece of myoelectric sensing information corresponding to at least one part of the user's neck.
The head movement of the user corresponds to the muscle movement of the user's neck; therefore, in this implementation, the first motion can be determined from the at least one piece of myoelectric sensing information corresponding to the at least one part of the neck. In some possible implementations, the correspondence between the at least one piece of myoelectric sensing information and the first motion or the user's field of view can be obtained by training. The at least one part may be, for example, the back of the user's neck.
In one possible implementation, the at least one piece of myoelectric sensing information may be obtained, for example, by a neck ring or collar worn on the user's neck, or by a plurality of attached myoelectric sensing electrode patches.
In one possible implementation, optionally, the first sensing information may include:
at least one piece of bend sensing information corresponding to at least one part of the user's neck.
The head movement of the user corresponds to the bending angle of the user's neck; for example, when the user looks down, the neck bends forward, and when the user looks up, the neck bends backward. In this implementation, the bend sensing information may be obtained, for example, by a bend sensor attached to the skin of at least one part of the user's neck. In some possible implementations, the correspondence between the at least one piece of bend sensing information and the first motion or the user's field of view can be obtained by training. The at least one part may be, for example, the section of the user's neck from the chin to the Adam's apple.
In one possible implementation, optionally, the first sensing information may include:
at least one second image sequence of the user's head.
In one possible implementation, the at least one second image sequence may be collected by an external device. In this implementation, obtaining the first information may be, for example: obtaining the first information from the external device by way of communication.
Of course, those skilled in the art will appreciate that the first sensing information may also include several of the kinds of first sensing information described above, to further increase the accuracy of determining the motion posture of the user's head. Moreover, besides the kinds of first sensing information described above, other sensing information capable of determining the motion posture of the user's head may also be applied in the implementations of the embodiments of the present application; they are not enumerated here.
In one possible implementation, the first information may include: first motion posture information corresponding to the first motion.
In one possible implementation, the first motion posture information can be obtained from one or more of the kinds of first sensing information described above; for example, the first motion posture information is obtained from the at least one piece of first inertial sensing information.
The first motion posture information may include, for example: motion change information and posture change information of the first motion.
As shown in Fig. 4a, in one possible implementation, optionally, the first information acquisition module 310 may include a first communication unit 311 for obtaining the first information from at least one external device (for example, the at least one head-mounted device).
As shown in Fig. 4b, in another possible implementation, optionally, the first information acquisition module 310 may include at least one first information sensing unit 312 for collecting the first information.
Similarly to the first information, in the embodiments of the present application the second information may include at least one of:
second sensing information corresponding to the second motion, and second motion posture information corresponding to the second motion.
The second motion posture information may be obtained from the second sensing information.
In one possible implementation, similarly to the first sensing information, the second sensing information may include:
at least one piece of second inertial sensing information of the portable device, and/or at least one third image sequence collected by the portable device.
In one possible implementation, as shown in Fig. 4b, in a scenario where the interaction apparatus does not belong to the portable device, the second information acquisition module 320 may include a second communication unit 321 for obtaining the second information from the portable device by way of communication.
In one possible implementation, in a scenario where the interaction apparatus belongs to the portable device, the second information acquisition module 320 may include at least one second information sensing unit 322 for collecting the second information.
In one possible implementation, the second sensing information may include: at least one fourth image sequence of the portable device. In this implementation, the second information acquisition module 320 may also include a communication unit for obtaining the at least one fourth image sequence from an external device.
In one possible implementation, the first processing and execution module 330 may determine the posture change information of the user's head from the first information; determine the position change information of the portable device from the second information (which may be position change information relative to the user); and determine whether, during the posture change of the user's head, the position of the portable device can enter the field of view from outside the field of view. If so, the first information and the second information satisfy the first condition, and the at least one first operation is executed.
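This decision can be sketched geometrically: the device is "in view" when its elevation relative to the user lies within half the vertical field of view of the current gaze pitch, and the first condition holds when it is out of view before the motion and in view after it. The angles and the 25 degree half-FOV default are illustrative assumptions.

```python
def in_fov(gaze_pitch_deg: float, device_elev_deg: float,
           half_fov_deg: float = 25.0) -> bool:
    """The device is visible when its elevation (relative to the user) lies
    within half the vertical field of view of the gaze direction."""
    return abs(device_elev_deg - gaze_pitch_deg) <= half_fov_deg


def entered_fov(gaze_before: float, gaze_after: float,
                dev_before: float, dev_after: float,
                half_fov_deg: float = 25.0) -> bool:
    """First condition: outside the field of view before the motions,
    inside it after both the head and the device have moved."""
    return (not in_fov(gaze_before, dev_before, half_fov_deg)
            and in_fov(gaze_after, dev_after, half_fov_deg))
```

For example, a gaze dropping from 0 to -35 degrees while the wrist rises from -70 to -40 degrees satisfies the condition; neither motion alone would.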
In one possible implementation, the first information and the second information satisfying the first condition may include:
the first information satisfies a first information change rule, wherein the first information change rule corresponds to the user's head moving from a non-interaction posture to an interaction posture; and
the second information satisfies a second information change rule, wherein the second information change rule corresponds to the at least part of the portable device moving from a non-interaction position to an interaction position;
wherein the interaction position is located within the user's field of view corresponding to the interaction posture.
In this implementation, the non-interaction posture, the interaction posture, the non-interaction position and the interaction position may be learned, for example, from the user's usage habits.
It can be seen that, in this implementation, the first processing and execution module 330 executes the at least one first operation only after determining that the user's head has changed into an interaction posture, that the position of the portable device has become an interaction position, and that the interaction position lies within the user's field of view corresponding to the interaction posture, which further reduces the possibility of accidental operation.
The at least one first operation may be an operation related to the head-mounted device and/or the portable device, for example an interactive operation between the head-mounted device and the portable device, such as a data-transfer operation.
In one possible implementation, optionally, the at least one first operation may include: at least partly activating the portable device.
In one possible implementation, optionally, the at least part of the portable device may include:
an interactable area of the portable device.
The interactable area is a region available for user interaction, for example: regions operated by the user's fingers, such as a touch pad, a touch screen or buttons; and visual indication areas watched by the user's eyes, such as a screen or indicator lights.
In one possible implementation, as shown in Fig. 4c, optionally, the apparatus 300 may further include:
a second processing and execution module 340, configured to execute, in response to the first information and the second information satisfying a second condition, at least one second operation corresponding to the second condition;
wherein the second condition corresponds to: the at least part of the portable device moving from within the user's field of view to outside the user's field of view.
The at least one second operation may be an operation related to the head-mounted device and/or the portable device, for example an interactive operation between the head-mounted device and the portable device, such as stopping a data transfer.
In one possible implementation, optionally, the at least one second operation may include: at least partly putting the portable device to sleep.
In one possible implementation, the first information and the second information satisfying the second condition may include:
the first information satisfies a third information change rule, wherein the third information change rule corresponds to the user's head moving from an interaction posture to a non-interaction posture.
Or, in one possible implementation, the first information and the second information satisfying the second condition may include:
the second information satisfies a fourth information change rule, wherein the fourth information change rule corresponds to the at least part of the portable device moving from an interaction position to a non-interaction position.
Or, in one possible implementation, the first information and the second information satisfying the second condition may include: the first information satisfies the third information change rule, and the second information satisfies the fourth information change rule.
Generally, in the non-interaction posture or at the non-interaction position, the portable device does not appear in the user's field of view. Therefore, the user's head moving from an interaction posture to a non-interaction posture, or the portable device moving from an interaction position to a non-interaction position, indicates that the user intends to end the interaction. In this implementation, executing the at least one second operation in response to such an action brings a better experience to the user.
Fig. 5 is a schematic structural diagram of another user equipment 500 provided by an embodiment of the present application; the specific embodiments of the present application do not limit the specific implementation of the user equipment 500. As shown in Fig. 5, the user equipment 500 may include:
a processor 510, a communications interface 520, a memory 530, and a communication bus 540, wherein:
the processor 510, the communications interface 520 and the memory 530 communicate with one another through the communication bus 540.
The communications interface 520 is used for communicating with network elements such as clients.
The processor 510 is used for executing a program 532 and may specifically perform the relevant steps in the method embodiments described above.
Specifically, the program 532 may include program code, the program code including computer operation instructions.
The processor 510 may be a central processing unit (CPU), or an application-specific integrated circuit (ASIC), or one or more integrated circuits configured to implement the embodiments of the present application.
The memory 530 is used for storing the program 532. The memory 530 may comprise high-speed RAM, and may also include non-volatile memory, for example at least one disk memory. The program 532 may specifically be used to cause the user equipment 500 to execute the following steps:
obtaining first information corresponding to a first motion of a user's head;
obtaining second information corresponding to a second motion of a portable device carried by the user; and
in response to the first information and the second information satisfying a first condition, executing at least one first operation corresponding to the first condition;
wherein the first condition corresponds to: at least part of the portable device moving from outside the user's field of view into the user's field of view.
In one possible implementation, the portable device may be the user equipment.
For the specific implementation of each step in the program 532, reference may be made to the corresponding description of the corresponding steps and units in the above embodiments, which is not repeated here. Those skilled in the art can clearly understand that, for convenience and brevity of description, the specific working processes of the devices and modules described above may refer to the corresponding process descriptions in the foregoing method embodiments, and are not repeated here.
Those of ordinary skill in the art will appreciate that the units and method steps of the examples described in connection with the embodiments disclosed herein can be implemented by electronic hardware, or by a combination of computer software and electronic hardware. Whether these functions are executed by hardware or software depends on the specific application and the design constraints of the technical solution. Skilled persons may use different methods to realize the described functions for each specific application, but such realization should not be considered beyond the scope of the present application.
If the functions are realized in the form of software functional units and sold or used as independent products, they can be stored in a computer-readable storage medium. Based on this understanding, the technical solution of the present application, or the part thereof that contributes to the prior art, or a part of the technical solution, can be embodied in the form of a software product. The computer software product is stored in a storage medium and includes several instructions for causing a computer device (which may be a personal computer, a server, a network device, or the like) to execute all or part of the steps of the methods of the embodiments of the present application. The aforementioned storage medium includes various media capable of storing program code, such as a USB flash drive, a removable hard disk, a read-only memory (ROM), a random access memory (RAM), a magnetic disk or an optical disc.
The above implementations are only for illustrating the present application and do not limit it. Those of ordinary skill in the relevant technical field can make various changes and modifications without departing from the spirit and scope of the present application; therefore, all equivalent technical solutions also fall within the scope of the present application, and the scope of patent protection of the present application should be defined by the claims.
Claims (10)
1. An interaction method, characterized by comprising:
obtaining first information corresponding to a first motion of a user's head;
obtaining second information corresponding to a second motion of a portable device carried by the user; and
in response to the first information and the second information satisfying a first condition, executing at least one first operation corresponding to the first condition;
wherein the first condition corresponds to: at least part of the portable device moving from outside the user's field of view into the user's field of view.
2. The method of claim 1, characterized in that the first information and the second information satisfying the first condition comprises:
the first information satisfying a first information change rule, wherein the first information change rule corresponds to the user's head moving from a non-interaction posture to an interaction posture; and
the second information satisfying a second information change rule, wherein the second information change rule corresponds to the at least part of the portable device moving from a non-interaction position to an interaction position;
wherein the interaction position is located within the user's field of view corresponding to the interaction posture.
3. The method of claim 1, characterized in that the at least one first operation comprises: at least partly activating the portable device.
4. The method of claim 1, characterized in that the first information comprises at least one of:
first sensing information corresponding to the first motion, and first motion posture information corresponding to the first motion.
5. The method of claim 1, characterized in that the second information comprises at least one of:
second sensing information corresponding to the second motion, and second motion posture information corresponding to the second motion.
6. An interaction apparatus, characterized by comprising:
a first information acquisition module, configured to obtain first information corresponding to a first motion of a user's head;
a second information acquisition module, configured to obtain second information corresponding to a second motion of a portable device carried by the user; and
a first processing and execution module, configured to execute, in response to the first information and the second information satisfying a first condition, at least one first operation corresponding to the first condition; wherein the first condition corresponds to: at least part of the portable device moving from outside the user's field of view into the user's field of view.
7. The apparatus of claim 6, characterized in that the first information and the second information satisfying the first condition comprises:
the first information conforming to a first information change pattern, wherein the first information change pattern corresponds to the user's head moving from a non-interaction posture to an interaction posture;
the second information conforming to a second information change pattern, wherein the second information change pattern corresponds to the at least part of the portable device moving from a non-interaction position to an interaction position;
wherein the interaction position lies within the user's field of view corresponding to the interaction posture.
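The two change patterns of claim 7 can be sketched as a pair of threshold crossings that must both occur: the head pitches down into an interaction posture while the device is raised into the field of view. All threshold values, angles, and field names below are illustrative assumptions; the patent itself does not specify them:

```python
# Hypothetical thresholds for a wrist-worn device scenario
HEAD_PITCH_INTERACTION = -20.0   # degrees: head tilted down toward the device
DEVICE_IN_VIEW_ELEVATION = 30.0  # degrees: forearm raised toward the eyes

def head_pattern_matched(pitch_before, pitch_after):
    """First information change pattern: head moves from a non-interaction
    posture (above the threshold) to an interaction posture (at or below it)."""
    return pitch_before > HEAD_PITCH_INTERACTION >= pitch_after

def device_pattern_matched(elev_before, elev_after):
    """Second information change pattern: device moves from a non-interaction
    position (below the threshold) to an interaction position (at or above it)."""
    return elev_before < DEVICE_IN_VIEW_ELEVATION <= elev_after

def first_condition_met(first_info, second_info):
    """Both patterns must hold: only then has the device entered the field of
    view as a deliberate act, rather than by incidental arm or head movement."""
    return (head_pattern_matched(first_info["pitch_before"], first_info["pitch_after"])
            and device_pattern_matched(second_info["elev_before"], second_info["elev_after"]))

if first_condition_met({"pitch_before": -5.0, "pitch_after": -25.0},
                       {"elev_before": 10.0, "elev_after": 45.0}):
    print("activate")  # at least partially activate the portable device
```

Requiring both patterns is what distinguishes this scheme from simple raise-to-wake: the device being lifted is not enough unless the head simultaneously turns toward it.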
8. The apparatus of claim 6, characterized in that the at least one first operation comprises: at least partially activating the portable device.
9. The apparatus of claim 6, characterized in that the at least part of the portable device comprises:
an interactable area of the portable device.
10. A user equipment, characterized in that the user equipment comprises:
a memory, configured to store a program;
a processor, configured to execute the program stored in the memory, the program causing the processor to perform the following operations:
acquiring first information corresponding to a first motion of a user's head;
acquiring second information corresponding to a second motion of a portable device carried by the user;
in response to the first information and the second information satisfying a first condition, executing at least one first operation corresponding to the first condition;
wherein the first condition corresponds to: at least part of the portable device moving from outside the user's field of view to within the user's field of view.
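The operations that claim 10's stored program puts on the processor can be sketched as a short acquire-test-execute flow. The sensor sources and the condition test are stubbed with illustrative assumptions (the patent does not prescribe concrete sensors or thresholds):

```python
def acquire_first_information():
    """Acquire information on the first motion of the user's head (stubbed)."""
    return {"pitch_delta": -22.0}       # head tilted down by 22 degrees

def acquire_second_information():
    """Acquire information on the second motion of the portable device (stubbed)."""
    return {"raised_into_view": True}   # device lifted into the field of view

def first_condition(first, second):
    """First condition: the device moved from outside to within the field of view,
    judged from head motion and device motion together (hypothetical test)."""
    return first["pitch_delta"] < -15.0 and second["raised_into_view"]

def first_operation():
    """At least partially activate the portable device, e.g. wake its screen."""
    return "screen_on"

first = acquire_first_information()
second = acquire_second_information()
if first_condition(first, second):
    print(first_operation())  # screen_on
```

In a real user equipment the two acquire steps would read from the device's own IMU and from a head-mounted or camera-based head tracker, but that split is an implementation choice the claim leaves open.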
Priority Applications (1)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
CN201510490318.7A CN106371559B (en) | 2015-08-11 | 2015-08-11 | Exchange method, interactive device and user equipment |
Applications Claiming Priority (1)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
CN201510490318.7A CN106371559B (en) | 2015-08-11 | 2015-08-11 | Exchange method, interactive device and user equipment |
Publications (2)
Publication Number | Publication Date |
---|---|
CN106371559A true CN106371559A (en) | 2017-02-01 |
CN106371559B CN106371559B (en) | 2019-09-10 |
Family
ID=57880929
Family Applications (1)
Application Number | Title | Priority Date | Filing Date |
---|---|---|---|
CN201510490318.7A Active CN106371559B (en) | 2015-08-11 | 2015-08-11 | Exchange method, interactive device and user equipment |
Country Status (1)
Country | Link |
---|---|
CN (1) | CN106371559B (en) |
Citations (12)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
CN102566749A (en) * | 2010-11-09 | 2012-07-11 | Research In Motion Limited | Method and apparatus for controlling an output device of a portable electronic device |
US20130017516A1 (en) * | 2011-07-11 | 2013-01-17 | Kelly Tyler | Active braille timepiece & related methods |
CN102915189A (en) * | 2011-08-02 | 2013-02-06 | Lenovo (Beijing) Co., Ltd. | Display processing method and electronic device |
CN103207666A (en) * | 2012-01-16 | 2013-07-17 | Lenovo (Beijing) Co., Ltd. | Processing method and electronic equipment for responding to operation of user |
CN103977559A (en) * | 2014-05-23 | 2014-08-13 | Beijing Zhigu Ruituo Technology Services Co., Ltd. | Interactive method and interactive device |
CN104156141A (en) * | 2014-09-09 | 2014-11-19 | Lenovo (Beijing) Co., Ltd. | Wearable electronic equipment control method and device |
CN104346297A (en) * | 2013-07-24 | 2015-02-11 | HTC Corporation | Method for operating mobile device, mobile device using the same, wearable device using the same, and computer readable medium |
CN104375648A (en) * | 2014-11-26 | 2015-02-25 | Samsung Electronics (China) R&D Center | Wrist type device and using method thereof |
CN104484047A (en) * | 2014-12-29 | 2015-04-01 | Beijing Zhigu Ruituo Technology Services Co., Ltd. | Interactive method and interactive device based on wearable device, and wearable device |
CN104503589A (en) * | 2015-01-05 | 2015-04-08 | BOE Technology Group Co., Ltd. | Somatosensory recognition system and recognition method |
CN104516660A (en) * | 2013-09-27 | 2015-04-15 | Lenovo (Beijing) Co., Ltd. | Information processing method and system and electronic device |
CN104793866A (en) * | 2015-05-05 | 2015-07-22 | Chen Wangsheng | Control unit and control method for backlight of intelligent wrist type wearable equipment |
Patent Citations (13)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
CN102566749A (en) * | 2010-11-09 | 2012-07-11 | Research In Motion Limited | Method and apparatus for controlling an output device of a portable electronic device |
US20130017516A1 (en) * | 2011-07-11 | 2013-01-17 | Kelly Tyler | Active braille timepiece & related methods |
CN102915189A (en) * | 2011-08-02 | 2013-02-06 | Lenovo (Beijing) Co., Ltd. | Display processing method and electronic device |
CN103207666A (en) * | 2012-01-16 | 2013-07-17 | Lenovo (Beijing) Co., Ltd. | Processing method and electronic equipment for responding to operation of user |
CN104346297A (en) * | 2013-07-24 | 2015-02-11 | HTC Corporation | Method for operating mobile device, mobile device using the same, wearable device using the same, and computer readable medium |
CN104345972A (en) * | 2013-07-24 | 2015-02-11 | HTC Corporation | Method for operating mobile device, mobile device, and computer readable medium |
CN104516660A (en) * | 2013-09-27 | 2015-04-15 | Lenovo (Beijing) Co., Ltd. | Information processing method and system and electronic device |
CN103977559A (en) * | 2014-05-23 | 2014-08-13 | Beijing Zhigu Ruituo Technology Services Co., Ltd. | Interactive method and interactive device |
CN104156141A (en) * | 2014-09-09 | 2014-11-19 | Lenovo (Beijing) Co., Ltd. | Wearable electronic equipment control method and device |
CN104375648A (en) * | 2014-11-26 | 2015-02-25 | Samsung Electronics (China) R&D Center | Wrist type device and using method thereof |
CN104484047A (en) * | 2014-12-29 | 2015-04-01 | Beijing Zhigu Ruituo Technology Services Co., Ltd. | Interactive method and interactive device based on wearable device, and wearable device |
CN104503589A (en) * | 2015-01-05 | 2015-04-08 | BOE Technology Group Co., Ltd. | Somatosensory recognition system and recognition method |
CN104793866A (en) * | 2015-05-05 | 2015-07-22 | Chen Wangsheng | Control unit and control method for backlight of intelligent wrist type wearable equipment |
Also Published As
Publication number | Publication date |
---|---|
CN106371559B (en) | 2019-09-10 |
Similar Documents
Publication | Publication Date | Title |
---|---|---|
US20220326781A1 (en) | Bimanual interactions between mapped hand regions for controlling virtual and graphical elements | |
US20220334649A1 (en) | Hand gestures for animating and controlling virtual and graphical elements | |
CN104503589B (en) | Somatosensory recognition system and its recognition methods | |
CN112926423B (en) | Pinch gesture detection and recognition method, device and system | |
CN109918975A (en) | A kind of processing method of augmented reality, the method for Object identifying and terminal | |
CN104023802B (en) | Use the control of the electronic installation of neural analysis | |
CN103984413B (en) | Information interacting method and information interactive device | |
CN105046215B (en) | The posture Activity recognition method not influenced by individual wearing position and wearing mode | |
CN105388820A (en) | Intelligent monitoring device and monitoring method thereof, and monitoring system | |
CN104573459B (en) | Exchange method, interactive device and user equipment | |
CN107450717A (en) | A kind of information processing method and Wearable | |
CN106296796B (en) | Information processing method, information processing unit and user equipment | |
CN106599811A (en) | Facial expression tracking method of VR heat-mounted display | |
CN108932058A (en) | Display methods, device and electronic equipment | |
CN107209936A (en) | Message processing device, information processing method and program | |
CN108829247B (en) | Interaction method and device based on sight tracking and computer equipment | |
CN106371559A (en) | Interactive method, interactive apparatus and user equipment | |
CN106445152A (en) | Method for managing menus in virtual reality environments and virtual reality equipment | |
CN110009446A (en) | A kind of display methods and terminal | |
CN109615462A (en) | Control the method and relevant apparatus of user data | |
US10075816B2 (en) | Mobile device position determining method and determining apparatus, and mobile device | |
CN206863694U (en) | A kind of virtual fitting system based on somatosensory recognition device | |
CN104731332B (en) | A kind of information processing method and electronic equipment | |
CN109658355A (en) | A kind of image processing method and device | |
CN109379531A (en) | A kind of image pickup method and mobile terminal |
Legal Events
Date | Code | Title | Description |
---|---|---|---|
C06 | Publication | ||
PB01 | Publication | ||
SE01 | Entry into force of request for substantive examination | ||
GR01 | Patent grant | ||