CN103180803A - Interface switching method and apparatus

Interface switching method and apparatus

Info

Publication number
CN103180803A
CN103180803A, CN2012800014677A, CN201280001467A
Authority
CN
China
Prior art keywords
posture
user
joint point
interface
parameter value
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Granted
Application number
CN2012800014677A
Other languages
Chinese (zh)
Other versions
CN103180803B (en)
Inventor
宣曼
黄晨
薛传颂
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Huawei Technologies Co Ltd
Original Assignee
Huawei Technologies Co Ltd
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Huawei Technologies Co Ltd
Publication of CN103180803A
Application granted
Publication of CN103180803B
Status: Active
Anticipated expiration

Classifications

    • A HUMAN NECESSITIES
    • A63 SPORTS; GAMES; AMUSEMENTS
    • A63F CARD, BOARD, OR ROULETTE GAMES; INDOOR GAMES USING SMALL MOVING PLAYING BODIES; VIDEO GAMES; GAMES NOT OTHERWISE PROVIDED FOR
    • A63F 13/00 Video games, i.e. games using an electronically generated display having two or more dimensions
    • A63F 13/40 Processing input control signals of video game devices, e.g. signals generated by the player or derived from the environment
    • A63F 13/42 Processing input control signals of video game devices, e.g. signals generated by the player or derived from the environment by mapping the input signals into game commands, e.g. mapping the displacement of a stylus on a touch screen to the steering angle of a virtual vehicle
    • A HUMAN NECESSITIES
    • A63 SPORTS; GAMES; AMUSEMENTS
    • A63F CARD, BOARD, OR ROULETTE GAMES; INDOOR GAMES USING SMALL MOVING PLAYING BODIES; VIDEO GAMES; GAMES NOT OTHERWISE PROVIDED FOR
    • A63F 2300/00 Features of games using an electronically generated display having two or more dimensions, e.g. on a television screen, showing representations related to the game
    • A63F 2300/80 Features of games using an electronically generated display having two or more dimensions, e.g. on a television screen, showing representations related to the game specially adapted for executing a specific type of game
    • A63F 2300/8011 Ball

Abstract

Embodiments of the present invention provide an interface switching method and apparatus. The method comprises: after user information is detected, identifying a first posture of the user from the user information; if the first posture of the user is an interface switching posture, displaying prompt information within a specified time, the prompt information being used to prompt the user to input a second posture; when user information is detected within the specified time, identifying the second posture of the user; and if the second posture of the user is a confirmation switching posture, executing the interface switching operation associated with the first posture. The interface switching method and apparatus provided by the invention overcome the defects of a high misjudgment rate or a long waiting time, improve the accuracy of posture control, and improve the user experience.

Description

Interface switching method and apparatus
Technical field
The present invention relates to the field of communication network technologies, and in particular to an interface switching method and apparatus in a motion-sensing interaction scenario.
Background
Vision-based motion-sensing interaction refers to a technology in which a computer captures images of the user through a camera and uses techniques such as pattern recognition and artificial intelligence to understand the meaning of the user's actions, thereby providing a more natural and intuitive motion-sensing interaction mode. It is currently widely used in scenarios such as augmented reality and motion-sensing game control.
During motion-sensing interaction, the motion-sensing interaction system captures video frames containing user information through a camera and then obtains the user's information in the video frames (for example, joint point information) through image analysis, so as to determine the user's pose and the action (gesture) formed by pose changes across successive video frames. The user's poses and actions together constitute postures, and the motion-sensing interaction system executes the corresponding feedback operation according to the instruction that corresponds to the user's posture. This constitutes a complete vision-based motion-sensing interaction process.
In the prior art, an interface switching posture is determined as follows: the posture input by the user is first identified, and when it matches a specified interface switching posture, the user is required to hold that posture for a period of time before the interface switching instruction is triggered. For example, while playing on Microsoft's Kinect motion-sensing game device, the user can exit the game with the posture "left arm stretched out, obliquely downward at 45° to the body", and must hold this posture for a period of time before the "exit game" operation is triggered; otherwise the operation is cancelled and the original interface is kept.
If the waiting time is set too short, some unconscious movements of the user are easily mistaken for the interface switching instruction; if the waiting time is set too long, the user has to hold a posture unchanged for a long time, which gives a poor user experience. The prior art therefore suffers from either a high misjudgment rate or a long waiting time when executing an interface switching instruction.
Summary of the invention
Embodiments of the present invention provide an interface switching method and apparatus, which are used to improve the recognition accuracy of interface switching posture instructions in a motion-sensing interaction scenario and to improve the user experience.
According to a first aspect, an embodiment of the present invention provides an interface switching method, comprising:
after user information is detected, identifying a first posture of the user from the user information;
if the first posture of the user is an interface switching posture, displaying prompt information within a specified time, the prompt information being used to prompt the user to input a second posture;
when user information is detected within the specified time, identifying a second posture of the user;
if the second posture of the user is a confirmation switching posture, executing the interface switching operation associated with the first posture.
In a first possible implementation of the first aspect, after identifying the second posture of the user, the method further comprises: if the second posture of the user is a cancellation switching posture, cancelling the interface switching operation associated with the first posture; or, if the second posture of the user is neither a confirmation switching posture nor a cancellation switching posture, continuing to detect user information and returning to the step of identifying a second posture of the user when user information is detected within the specified time.
With reference to the first aspect or the first possible implementation of the first aspect, in a second possible implementation, after the user is prompted to input a second posture, the method further comprises: when no user information is detected within the specified time, cancelling the interface switching operation associated with the first posture.
With reference to the first aspect, the first possible implementation of the first aspect, or the second possible implementation of the first aspect, in a third possible implementation, identifying the first posture or the second posture of the user comprises: obtaining the joint points involved in a set posture; reading data of the joint points from the detected user information, wherein the user information comprises skeleton frame information of the user, and the skeleton frame information comprises joint point information and timestamp information; calculating a matching parameter value of the set posture according to the data of the joint points; and identifying the first posture or the second posture of the user according to the matching parameter value.
With reference to the third possible implementation of the first aspect, in a fourth possible implementation, obtaining the joint points involved in a set posture comprises: determining the current interface type and the postures set under the current interface, and obtaining the joint points involved in the postures set under the current interface; and calculating a matching parameter value of the set posture according to the data of the joint points comprises: calculating the matching parameter values of the postures set under the current interface according to the data of the joint points.
With reference to the third possible implementation of the first aspect, in a fifth possible implementation, obtaining the joint points involved in a set posture comprises: determining the default postures of the motion-sensing interaction system, and obtaining the joint points involved in the default postures; and calculating a matching parameter value of the set posture according to the data of the joint points comprises: calculating the matching parameter values of the default postures according to the data of the joint points.
With reference to the third possible implementation of the first aspect, in a sixth possible implementation, when the set posture is an action, reading data of the joint points from the detected user information comprises: reading, from a plurality of consecutive frames of the user's skeleton frame information, the joint point data corresponding to the joint points involved in the set posture and the timestamp information of each skeleton frame; and calculating a matching parameter value of the set posture according to the data of the joint points comprises: calculating the displacement of the joint points according to the joint point data and the timestamp information.
With reference to the third possible implementation of the first aspect, in a seventh possible implementation, when the set posture is a pose, reading data of the joint points from the detected user information comprises: reading, from the user's skeleton frame information, the joint point data corresponding to the joint points involved in the set posture; and calculating a matching parameter value of the set posture according to the data of the joint points comprises: calculating the bone angles between joint points according to the joint point data.
With reference to the fourth or fifth possible implementation of the first aspect, in an eighth possible implementation, identifying the first posture or the second posture of the user according to the matching parameter value comprises: comparing the matching parameter value with the matching conditions of the postures set under the current interface, or comparing the matching parameter value with the matching conditions of the default postures of the motion-sensing interaction system; and determining the posture corresponding to the matching parameter value that satisfies a matching condition, the determined posture being the first posture or the second posture of the user.
According to a second aspect, an embodiment of the present invention provides an interface switching apparatus, comprising:
a detecting unit, configured to detect user information;
a first recognition unit, configured to identify a first posture of the user from the user information after the detecting unit detects the user information;
a display unit, configured to display prompt information within a specified time when the first posture of the user is an interface switching posture, the prompt information being used to prompt the user to input a second posture;
a second recognition unit, configured to identify a second posture of the user when the detecting unit detects user information within the specified time;
an interface switching processing unit, configured to execute the interface switching operation associated with the first posture when the second posture of the user is a confirmation switching posture.
In a first possible implementation of the second aspect, the interface switching processing unit is further configured to cancel the interface switching operation associated with the first posture when the second posture of the user is a cancellation switching posture.
With reference to the second aspect or the first possible implementation of the second aspect, in a second possible implementation, the interface switching processing unit is further configured to cancel the interface switching operation associated with the first posture when no user information is detected within the specified time.
With reference to the second aspect, the first possible implementation of the second aspect, or the second possible implementation of the second aspect, in a third possible implementation, the first recognition unit or the second recognition unit comprises:
an obtaining module, configured to obtain the joint points involved in a set posture; a reading module, configured to read data of the joint points from the detected user information, wherein the user information comprises skeleton frame information of the user, and the skeleton frame information comprises joint point information and timestamp information;
a calculation module, configured to calculate a matching parameter value of the set posture according to the data of the joint points;
an identification module, configured to identify the first posture or the second posture of the user according to the matching parameter value.
With reference to the third possible implementation of the second aspect, in a fourth possible implementation, the obtaining module is further configured to obtain the joint points involved in the postures set under the current interface, and the calculation module is further configured to calculate the matching parameter values of the postures set under the current interface according to the data of the joint points.
With reference to the third possible implementation of the second aspect, in a fifth possible implementation, the obtaining module is further configured to obtain the joint points involved in the default postures of the motion-sensing interaction system, and the calculation module is further configured to calculate the matching parameter values of the default postures according to the data of the joint points.
With reference to the third possible implementation of the second aspect, in a sixth possible implementation, the reading module is further configured to, when the set posture is an action, read, from a plurality of consecutive frames of the user's skeleton frame information, the joint point data corresponding to the joint points involved in the set posture and the timestamp information of each skeleton frame, and the calculation module is further configured to calculate the displacement of the joint points according to the joint point data and the timestamp information.
With reference to the third possible implementation of the second aspect, in a seventh possible implementation, the reading module is further configured to, when the set posture is a pose, read, from the user's skeleton frame information, the joint point data corresponding to the joint points involved in the set posture, and the calculation module is further configured to calculate the bone angles between joint points according to the joint point data.
With reference to the fourth or fifth possible implementation of the second aspect, in an eighth possible implementation, the identification module is further configured to compare the matching parameter value with the matching conditions of the postures set under the current interface, or to compare the matching parameter value with the matching conditions of the default postures of the motion-sensing interaction system, and to determine the posture corresponding to the matching parameter value that satisfies a matching condition, the determined posture being the first posture or the second posture of the user.
As can be seen from the above technical solutions, the embodiments of the present invention use a second posture to confirm the instruction given by the first posture, which effectively solves the problem that interface switching posture recognition in a motion-sensing interaction scenario either takes a long time or has a high misjudgment rate, improves the accuracy of posture control, and thus greatly improves the user experience.
Brief description of the drawings
To describe the technical solutions in the embodiments of the present invention or in the prior art more clearly, the accompanying drawings required for describing the embodiments or the prior art are briefly introduced below. Apparently, the accompanying drawings in the following description show merely some embodiments of the present invention, and persons of ordinary skill in the art may derive other drawings from these drawings without creative effort.
Fig. 1 is a flowchart of an interface switching method according to an embodiment of the present invention;
Fig. 2 is a flowchart of a method for identifying a first posture or a second posture of a user according to an embodiment of the present invention;
Figs. 3A-3D illustrate the graphical user interface displayed by a device at different points in time during interface switching according to an embodiment of the present invention;
Fig. 4 is a block diagram of an interface switching apparatus according to an embodiment of the present invention;
Fig. 5 is a block diagram of an interface switching apparatus according to another embodiment of the present invention;
Fig. 6 is a structural diagram of an interface switching apparatus implemented on a computer system according to another embodiment of the present invention.
Description of the embodiments
The technical solutions in the embodiments of the present invention are described below clearly and completely with reference to the accompanying drawings in the embodiments of the present invention. Apparently, the described embodiments are merely some rather than all of the embodiments of the present invention. All other embodiments obtained by persons of ordinary skill in the art based on the embodiments of the present invention without creative effort shall fall within the protection scope of the present invention.
An embodiment of the present invention provides an interface switching method, which can be used for interface switching in a motion-sensing interaction scenario. The specific interface operation includes one of the following operations that change the interface: exiting an application, returning to the upper-level interface, returning to the main interface, and calling up a menu.
Referring to Fig. 1, the method comprises:
Step 101: after user information is detected, identify a first posture of the user from the user information.
The method for detecting user information may be to obtain the user's skeleton frame information and to determine whether the obtained skeleton frame information contains valid joint point data. If it does, user information has been detected; otherwise, user information has not been detected and detection continues.
The method for identifying the first posture of the user may be: obtaining the joint points involved in a preset posture; reading, from the detected user information, the valid joint point data corresponding to the joint points involved in the preset posture; calculating, according to the valid joint point data, the matching parameter value used to match the preset posture; and then identifying the first posture of the user according to the matching parameter value. The first posture of the user may be an action or a pose.
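For illustration only, the four stages just described (obtaining the joint points, reading their data, calculating the matching parameter values, and comparing them with the matching conditions) can be sketched as follows. The type and function names (PostureSpec, identifyPosture and so on) are assumptions made for the sketch and are not part of the embodiment or of any SDK.

```cpp
// Illustrative outline of the four-stage identification pipeline described above.
// All type and function names here are hypothetical.
#include <functional>
#include <map>
#include <optional>
#include <string>
#include <vector>

struct Joint3D { double x, y, z; };                       // one joint point sample (x, y, z)
using JointData = std::map<int, std::vector<Joint3D>>;    // joint point id -> samples over frames

struct PostureSpec {
    std::string name;                                     // e.g. "interface_switch"
    std::vector<int> jointIds;                            // joint points the posture involves
    std::function<std::vector<double>(const JointData&)> computeParams;  // matching parameter values
    std::function<bool(const std::vector<double>&)> matches;             // matching condition
};

// Identify which of the set postures, if any, the detected user information matches.
std::optional<std::string> identifyPosture(const std::vector<PostureSpec>& setPostures,
                                           const JointData& detectedJointData) {
    for (const PostureSpec& posture : setPostures) {
        std::vector<double> params = posture.computeParams(detectedJointData);
        if (posture.matches(params)) {
            return posture.name;            // this set posture is the user's first/second posture
        }
    }
    return std::nullopt;                    // no set posture matched
}
```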
Specifically, in one implementation, the motion-sensing interaction system, as the execution body, obtains the user's skeleton frame information from a motion-sensing interaction device (in this embodiment, the Kinect motion-sensing game product). Specifically, the SDK (Software Development Kit) provided with the Kinect device includes the skeleton frame extraction function NuiSkeletonGetNextFrame; by calling this function, an application can extract the user's skeleton frame information of the current moment from the Kinect device. The device generates a frame of skeleton frame information regardless of whether a user is present in front of it.
The user's skeleton frame information is represented by the NUI_SKELETON_FRAME data structure, which contains joint point information (represented by the NUI_SKELETON_DATA data structure) and timestamp information (represented by the liTimeStamp parameter). The joint point information contains the flag eTrackingState, which indicates whether valid joint point data exists. If the value of eTrackingState indicates that the skeleton is tracked, user information has been detected; otherwise, user information has not been detected and detection continues.
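As an illustration, the detection step can be sketched using the Kinect for Windows SDK calls named above (NuiSkeletonGetNextFrame, NUI_SKELETON_FRAME, eTrackingState). Initialization and error handling are simplified, and the helper name detectUserInformation is an assumption, not an SDK function.

```cpp
// Minimal sketch of the detection step using the Kinect for Windows SDK calls named above.
#include <Windows.h>
#include <NuiApi.h>

// Returns true and fills 'frame' when a skeleton frame with valid joint point data is available.
bool detectUserInformation(NUI_SKELETON_FRAME& frame) {
    if (FAILED(NuiSkeletonGetNextFrame(0, &frame))) {
        return false;                                  // no new frame from the device
    }
    for (int i = 0; i < NUI_SKELETON_COUNT; ++i) {
        // eTrackingState tells whether this slot holds valid joint point data
        if (frame.SkeletonData[i].eTrackingState == NUI_SKELETON_TRACKED) {
            return true;                               // user information detected
        }
    }
    return false;                                      // keep detecting
}

int main() {
    // The skeleton stream must have been enabled beforehand.
    NuiInitialize(NUI_INITIALIZE_FLAG_USES_SKELETON);
    NUI_SKELETON_FRAME frame = {};
    while (!detectUserInformation(frame)) {
        Sleep(30);                                     // poll roughly at frame rate
    }
    // frame.liTimeStamp and frame.SkeletonData[...] now carry the timestamp and joint data.
    NuiShutdown();
    return 0;
}
```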
Step 102: if the first posture of the user is an interface switching posture, display prompt information within a specified time, the prompt information being used to prompt the user to input a second posture.
Optionally, step 102 may further comprise the following steps: when the first posture of the user is not an interface switching posture, continue to detect user information; or, when the first posture of the user is not an interface switching posture but an operation is associated with the first posture, the operation associated with the first posture may also be executed.
The prompt information may comprise second-posture options and second-posture operation instructions. Specifically, the prompt information may be presented as text, pictures or other forms, and display effects such as flashing or fading in and out may also be used. For example, the second-posture options may be the two options "confirmation switching posture" and "cancellation switching posture", displayed as a text box or as text; correspondingly, the second-posture operation instructions indicate how to perform the "confirmation switching posture" and the "cancellation switching posture", and text, symbols, pictures or animations may be used to show the user how to operate.
The second posture may be an action or a pose.
Step 103: when user information is detected within the specified time, identify a second posture of the user.
The method for detecting user information is the same as in step 101, and the method for identifying the second posture of the user is the same as the method for identifying the first posture of the user in step 101.
Optionally, step 103 may further comprise: when no user information is detected within the specified time, ignoring the interface switching operation associated with the first posture.
Step 104: if the second posture of the user is a confirmation switching posture, execute the interface switching operation associated with the first posture.
Further, if the second posture is a cancellation switching posture, the interface switching operation associated with the first posture is ignored. Alternatively, if the second posture is neither a confirmation switching posture nor a cancellation switching posture, the method returns to step 103 and continues to detect user information within the specified time.
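Taken together, steps 101 to 104 form a small confirm/cancel loop around the interface switching instruction. The sketch below shows one possible wiring of these steps; the enum, the stub functions and all other names are placeholders standing in for the detection, identification and display operations described above.

```cpp
// Illustrative control flow for steps 101-104 (two-stage posture confirmation).
#include <chrono>
#include <thread>

enum class Posture { None, InterfaceSwitch, ConfirmSwitch, CancelSwitch, Other };

// Placeholder recognizers and UI hooks standing in for the methods described above.
Posture identifyFirstPosture()  { return Posture::None; }   // step 101 stub
Posture identifySecondPosture() { return Posture::None; }   // step 103 stub
void showPrompt() {}                                         // step 102 stub
void hidePrompt() {}
void executeInterfaceSwitch() {}                             // step 104 stub

void interfaceSwitchLoop(std::chrono::milliseconds specifiedTime) {
    for (;;) {
        if (identifyFirstPosture() != Posture::InterfaceSwitch) {
            std::this_thread::sleep_for(std::chrono::milliseconds(30));
            continue;                           // first posture is not an interface switching posture
        }
        showPrompt();                           // prompt the user to input a second posture
        auto deadline = std::chrono::steady_clock::now() + specifiedTime;
        bool done = false;
        while (!done && std::chrono::steady_clock::now() < deadline) {
            switch (identifySecondPosture()) {
                case Posture::ConfirmSwitch:    // confirmation switching posture
                    executeInterfaceSwitch();   // execute the associated switching operation
                    done = true;
                    break;
                case Posture::CancelSwitch:     // cancellation switching posture: cancel
                    done = true;
                    break;
                default:                        // neither: keep detecting within the specified time
                    std::this_thread::sleep_for(std::chrono::milliseconds(30));
                    break;
            }
        }
        hidePrompt();                           // a timeout also cancels the operation
    }
}
```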
An embodiment of the present invention further provides, within the interface switching method, a method for identifying the first posture of the user after user information has been detected; the method for identifying the second posture is similar and is not repeated here. Referring to Fig. 2, the method comprises:
Step 201: obtain the joint points involved in a set posture.
Specifically, this step comprises: determining the current interface type and the postures set under the current interface, and obtaining the joint points involved in the postures set under the current interface.
Alternatively, this step may comprise: determining all the default postures of the motion-sensing interaction system, and obtaining the joint points involved in all the default postures of the system.
Specifically, in one embodiment, the state of the interface switching state machine indicates that the current interface is the application interface before switching, and the only posture set under the current interface is the interface switching posture. The interface switching posture is a "left arm at 45 degrees" pose, which involves 7 joint points: SHOULDER_CENTER (center shoulder joint point), SHOULDER_RIGHT (right shoulder joint point), ELBOW_RIGHT (right elbow joint point), WRIST_RIGHT (right wrist joint point), SHOULDER_LEFT (left shoulder joint point), ELBOW_LEFT (left elbow joint point) and WRIST_LEFT (left wrist joint point).
Alternatively, the state of the interface switching state machine indicates that the current interface is the switch prompt interface, and the postures set under the current interface are the confirmation switching posture and the cancellation switching posture. The confirmation switching posture is an action of waving the left hand to the right, which involves the joint point HAND_LEFT (left hand joint point); the cancellation switching posture is an action of waving the left hand to the left, which also involves the joint point HAND_LEFT (left hand joint point).
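For illustration, this mapping from interface state to set postures and joint points could be held in a lookup table such as the sketch below. The enum, the table layout and the posture names are assumptions; the joint indices follow the NUI_SKELETON_POSITION_* naming of the Kinect SDK used in this embodiment.

```cpp
// Sketch of the interface-state -> set-postures -> joint-points lookup described above.
#include <Windows.h>
#include <NuiApi.h>
#include <map>
#include <string>
#include <vector>

enum class InterfaceState { ApplicationInterface, SwitchPromptInterface };

// Postures set under each interface state, with the joint points they involve.
std::map<InterfaceState, std::map<std::string, std::vector<NUI_SKELETON_POSITION_INDEX>>>
makeSetPostureTable() {
    return {
        { InterfaceState::ApplicationInterface,
          { { "interface_switch",            // "left arm at 45 degrees" pose, 7 joint points
              { NUI_SKELETON_POSITION_SHOULDER_CENTER, NUI_SKELETON_POSITION_SHOULDER_RIGHT,
                NUI_SKELETON_POSITION_ELBOW_RIGHT,     NUI_SKELETON_POSITION_WRIST_RIGHT,
                NUI_SKELETON_POSITION_SHOULDER_LEFT,   NUI_SKELETON_POSITION_ELBOW_LEFT,
                NUI_SKELETON_POSITION_WRIST_LEFT } } } },
        { InterfaceState::SwitchPromptInterface,
          { { "confirm_switch", { NUI_SKELETON_POSITION_HAND_LEFT } },    // wave left hand to the right
            { "cancel_switch",  { NUI_SKELETON_POSITION_HAND_LEFT } } } } // wave left hand to the left
    };
}
```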
Step 202: read the data of these joint points from the detected user information, wherein the user information comprises the user's skeleton frame information, and the skeleton frame information comprises joint point information and timestamp information.
When the set posture is a pose, step 202 may specifically comprise: reading, from the user's skeleton frame information, the joint point data corresponding to the joint points involved in the set posture.
When the set posture is an action, step 202 may specifically comprise: reading, from a plurality of consecutive frames of the user's skeleton frame information, the joint point data corresponding to the joint points involved in the set posture and the timestamp information of each skeleton frame.
When several of the set postures involve the same joint point, the data of that joint point only needs to be read once per frame of skeleton information.
In a specific implementation, for example, the posture set under the current interface is the interface switching posture. Since this posture is a pose, it is only necessary to read, from the current skeleton frame information, the coordinate data of the 7 joint points that the posture involves, as shown in Table 1.
[Table 1: coordinate data (x, y, z) of the seven joint points involved in the interface switching posture, labelled a (center shoulder), b (right shoulder), c (right elbow), d (right wrist), e (left shoulder), f (left elbow) and g (left wrist).]
As another example, the postures set under the current interface are the confirmation switching posture and the cancellation switching posture. These two postures involve the same joint point (the left hand joint point), so its data only needs to be read once per frame of skeleton information. Since both postures are actions, the left hand joint point coordinate data and the timestamp of each of a plurality of consecutive skeleton frames of the user need to be read, with the timestamp at which detection starts taken as the initial timestamp t1, as shown in Table 2.
Position (joint point position)      liTimeStamp (timestamp)
x1, y1, z1                           t1
x2, y2, z2                           t2
......                               ......
x(i-1), y(i-1), z(i-1)               t(i-1)
xi, yi, zi                           ti
Step 203: calculate the matching parameter value of the set posture according to the data of the joint points.
Calculating the matching parameter value of the set posture comprises: calculating the matching parameter values of the postures set under the current interface, or of all the default postures of the motion-sensing interaction system.
When several set postures share the same matching parameter, the value of that matching parameter only needs to be calculated once.
For example, the posture set under the current interface is the interface switching posture, and its matching parameters are 4 bone angles. Taking ∠abc as an example, it is the angle between the bone formed by the center shoulder joint point a and the right shoulder joint point b in Table 1 and the bone formed by the right shoulder joint point b and the right elbow joint point c; this angle is calculated as:
∠abc = arccos((ab² + bc² - ac²) / (2 · ab · bc));
ac² = (x1 - x3)² + (y1 - y3)²;
ab² = (x1 - x2)² + (y1 - y2)²;
bc² = (x2 - x3)² + (y2 - y3)².
The values of the other three bone angles (∠bcd, ∠aef, ∠efg) can be calculated with similar formulas.
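As an illustration, the bone-angle calculation above is the law of cosines applied to the joint point coordinates of Table 1; a minimal sketch follows, in which the Joint3D type and the function names are assumptions rather than part of the embodiment.

```cpp
// Sketch of the bone-angle matching parameter of step 203: the law of cosines applied to
// the joint point coordinates of Table 1.
#include <cmath>

struct Joint3D { double x, y, z; };

// Squared distance between two joint points, using the x and y coordinates as in the
// formulas above (the z coordinate is not used there).
static double squaredDistance(const Joint3D& p, const Joint3D& q) {
    return (p.x - q.x) * (p.x - q.x) + (p.y - q.y) * (p.y - q.y);
}

// Bone angle at joint point b between the bones a-b and b-c, in degrees
// (used for ∠abc, ∠bcd, ∠aef and ∠efg).
double boneAngleDegrees(const Joint3D& a, const Joint3D& b, const Joint3D& c) {
    const double kPi = 3.14159265358979323846;
    double ab2 = squaredDistance(a, b);
    double bc2 = squaredDistance(b, c);
    double ac2 = squaredDistance(a, c);
    double cosAngle = (ab2 + bc2 - ac2) / (2.0 * std::sqrt(ab2) * std::sqrt(bc2));
    if (cosAngle > 1.0)  cosAngle = 1.0;     // guard against rounding just outside [-1, 1]
    if (cosAngle < -1.0) cosAngle = -1.0;
    return std::acos(cosAngle) * 180.0 / kPi;
}
```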
As another example, the postures set under the current interface are the confirmation switching posture and the cancellation switching posture. The matching parameter of the confirmation switching posture is the displacement of the left hand joint point, and the matching parameter of the cancellation switching posture is also the displacement of the left hand joint point; since the two postures share the same matching parameter, its value only needs to be calculated once. The displacement of the left hand joint point is the total displacement s between the initial detection time t1 and the end time ti in Table 2.
The displacement between two adjacent timestamps ti and t(i-1) is Δsi = xi - x(i-1). The sign of Δsi is compared with the sign of the total displacement s: if the signs are the same, the calculation continues; if the signs differ, the gesture has ended. If the end of the gesture is detected before the preset time is reached, timing and calculation stop; otherwise timing and calculation continue until the preset time point is reached. The total displacement s is the displacement value of the left hand joint point.
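A minimal sketch of this displacement calculation follows. It assumes, purely for illustration, that the displacement is measured along the x coordinate of the left hand joint point and that the total displacement is the running sum of the per-frame displacements; the type and function names are likewise assumptions.

```cpp
// Sketch of the displacement matching parameter of step 203: accumulate the left-hand x
// displacement over consecutive skeleton frames until the preset time elapses or the
// per-frame displacement changes sign (gesture end).
#include <vector>

struct HandSample { double x; long long timestampMs; };     // one row of Table 2

// Returns the total displacement s of the left hand joint point.
double leftHandDisplacement(const std::vector<HandSample>& samples, long long presetTimeMs) {
    if (samples.size() < 2) return 0.0;
    const long long t1 = samples.front().timestampMs;        // initial detection time
    double s = 0.0;                                           // total displacement so far
    for (size_t i = 1; i < samples.size(); ++i) {
        double deltaS = samples[i].x - samples[i - 1].x;      // Δsi = xi - x(i-1)
        if (s != 0.0 && deltaS != 0.0 && (deltaS > 0) != (s > 0)) {
            break;                                            // sign differs: the gesture has ended
        }
        s += deltaS;
        if (samples[i].timestampMs - t1 >= presetTimeMs) {
            break;                                            // preset time point reached
        }
    }
    return s;                                                 // displacement value of the left hand joint
}
```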
Step 204: identify the first posture of the user according to the matching parameter value.
Specifically, this step may be: comparing the matching parameter value with the matching conditions of the postures set under the current interface, or comparing the matching parameter value with the matching conditions of the default postures of the motion-sensing interaction system; and determining the posture corresponding to the matching parameter value that satisfies a matching condition, the determined posture being the first posture of the user.
For example, when the posture set under the current interface is the interface switching posture, it is determined from the 4 calculated bone angles whether the matching conditions ∠abc = 135° ± 10°, ∠bcd = 180° ± 10°, ∠aef = 90° ± 10° and ∠efg = 180° ± 10° are all satisfied at the same time. If they are, the user's posture is identified as the interface switching posture; otherwise, the user's posture is identified as not being the interface switching posture. To avoid the influence of unconscious movements of the user, the posture may be identified as the set posture only when the matching conditions are met in the user information detected in several consecutive frames.
As another example, when the postures set under the current interface are the confirmation switching posture and the cancellation switching posture, it is determined from the calculated displacement value of the left hand joint point whether s > 0.3 m or s < -0.3 m holds. If s > 0.3 m, the user's posture is identified as the confirmation switching posture; if s < -0.3 m, the user's posture is identified as the cancellation switching posture; for other values of s, the user's posture is identified as neither the confirmation switching posture nor the cancellation switching posture.
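As an illustration, these matching conditions reduce to simple threshold checks; the sketch below restates them in code, with assumed function names.

```cpp
// Sketch of the matching conditions of step 204: angle windows for the interface switching
// posture and displacement thresholds for the confirmation/cancellation switching postures.
#include <cmath>

// Interface switching posture: all four bone angles must fall inside their windows.
bool matchesInterfaceSwitchPosture(double abc, double bcd, double aef, double efg) {
    auto within = [](double angleDeg, double centerDeg, double tolDeg) {
        return std::fabs(angleDeg - centerDeg) <= tolDeg;
    };
    return within(abc, 135.0, 10.0) && within(bcd, 180.0, 10.0) &&
           within(aef,  90.0, 10.0) && within(efg, 180.0, 10.0);
}

enum class SecondPosture { ConfirmSwitch, CancelSwitch, Neither };

// Confirmation/cancellation switching posture: threshold on the left-hand displacement s (metres).
SecondPosture classifySecondPosture(double s) {
    if (s > 0.3)  return SecondPosture::ConfirmSwitch;   // wave to the right
    if (s < -0.3) return SecondPosture::CancelSwitch;    // wave to the left
    return SecondPosture::Neither;
}
```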
To aid understanding of the present invention, an embodiment is given to illustrate the GUI displayed by a device at different points in time during interface switching. Referring to Figs. 3A-3D, and taking exiting a game as an example, the interfaces displayed at different points in time are as follows:
Before the interface is switched, that is, before the exit-game posture is received, the interface is as shown in Fig. 3A: the device 300 comprises a display screen 301, and the display screen shows the current game picture 302.
After interface 3A, displayed before switching, the motion-sensing interaction system identifies that the user has input the exit-game posture and therefore displays, within the specified time, a prompt interface containing prompt information. One such prompt interface is shown in Fig. 3B: it comprises the game picture 302 displayed before exiting and the prompt information prompting the user to input a second posture. The prompt information is overlaid on the lower-left corner of the game picture 302 and comprises the second-posture prompt items 303, "confirm exit" and "cancel exit", and the two corresponding second-posture operation instructions 304. The second-posture prompt items are displayed as text, which flashes to attract the user's attention. The second-posture operation instructions comprise the words "wave right" and "wave left" together with right/left arrow symbols, indicating that the action of waving to the right is the confirm-exit posture, which triggers the confirm-exit game operation, and that the action of waving to the left is the cancel-exit posture, which triggers the cancel-exit game operation.
Another prompt interface is shown in Fig. 3C: in addition to the game picture 302 displayed before exiting and the prompt information prompting the user to input a second posture, it optionally also comprises a timing progress disc 305. As time passes, the black sector region gradually shrinks, and its shrinking indicates that the remaining time in which the user may input the second posture is decreasing. The prompt information comprises the second-posture prompt items 306 and the second-posture operation instructions 307. The second-posture prompt items 306 comprise "confirm exit" and "cancel exit" and are displayed as text boxes; the corresponding second-posture operation instructions 307 comprise the words "raise left hand" and "lower left hand" together with posture illustrations, indicating that the lowered-left-hand pose is the confirm-exit posture, which triggers the confirm-exit game operation, and that the raised-left-hand pose is the cancel-exit posture, which triggers the cancel-exit game operation. Optionally, when the prompt interface is entered, the prompt information appears by fading in, and when the prompt interface is left, the prompt information disappears by fading out.
After prompt interface 3B or 3C is displayed, if the motion-sensing interaction system receives the user's confirm-exit posture within the specified time, the game is exited, and the interface displayed after exiting is shown in Fig. 3D: it comprises the game menu picture 308.
After prompt interface 3B or 3C is displayed, if, within the specified time, the motion-sensing interaction system receives the user's cancel-exit posture, or receives neither the user's confirm-exit posture nor the user's cancel-exit posture, the interface 3A displayed before switching is shown again.
An embodiment of the present invention provides an interface switching apparatus. Referring to Fig. 4, the interface switching apparatus 400 comprises:
a detecting unit 401, configured to detect user information;
a first recognition unit 402, configured to identify a first posture of the user from the user information after the detecting unit detects the user information;
a display unit 403, configured to display prompt information within a specified time when the first posture of the user is an interface switching posture, the prompt information being used to prompt the user to input a second posture;
a second recognition unit 404, configured to identify a second posture of the user when the detecting unit detects user information within the specified time;
an interface switching processing unit 405, configured to execute the interface switching operation associated with the first posture when the second posture of the user is a confirmation switching posture.
Optionally, the interface switching processing unit 405 is further configured to cancel the interface switching operation associated with the first posture when the second posture of the user is a cancellation switching posture.
Optionally, the interface switching processing unit 405 is further configured to cancel the interface switching operation associated with the first posture when no user information is detected within the specified time.
As shown in Fig. 5, in the interface switching apparatus 400, the first recognition unit 402 may further comprise:
a first obtaining module 4021, configured to obtain the joint points involved in a set posture;
a first reading module 4022, configured to read data of the joint points from the detected user information, wherein the user information comprises skeleton frame information of the user, and the skeleton frame information comprises joint point information and timestamp information;
a first calculation module 4023, configured to calculate a matching parameter value of the set posture according to the data of the joint points;
a first identification module 4024, configured to identify the first posture or the second posture of the user according to the matching parameter value.
Optionally, the first obtaining module 4021 is further configured to obtain the joint points involved in the postures set under the current interface.
Optionally, the first calculation module 4023 is further configured to calculate the matching parameter values of the postures set under the current interface according to the data of the joint points.
Optionally, the first obtaining module 4021 is further configured to obtain the joint points involved in the default postures of the motion-sensing interaction system.
Optionally, the first calculation module 4023 is further configured to calculate the matching parameter values of the default postures according to the data of the joint points.
Optionally, the first identification module 4024 is further configured to compare the matching parameter value with the matching conditions of the postures set under the current interface, or to compare the matching parameter value with the matching conditions of the default postures of the motion-sensing interaction system, and to determine the posture corresponding to the matching parameter value that satisfies a matching condition, the determined posture being the first posture or the second posture of the user.
Optionally, the first reading module 4022 is further configured to, when the set posture is an action, read, from a plurality of consecutive frames of the user's skeleton frame information, the joint point data corresponding to the joint points involved in the set posture and the timestamp information of each skeleton frame.
Optionally, the first calculation module 4023 is further configured to calculate the displacement of the joint points according to the joint point data and the timestamp information.
Optionally, the first reading module 4022 is further configured to, when the set posture is a pose, read, from the user's skeleton frame information, the joint point data corresponding to the joint points involved in the set posture.
Optionally, the first calculation module 4023 is further configured to calculate the bone angles between joint points according to the joint point data.
Similarly to the first recognition unit 402, the second recognition unit 404 may also comprise four modules: a second obtaining module 4041, a second reading module 4042, a second calculation module 4043 and a second identification module 4044. The functions of the modules of the second recognition unit 404 are similar to those of the corresponding modules of the first recognition unit 402 and are not described in detail here.
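Purely as an illustration of how the units of Figs. 4 and 5 relate to one another, one possible C++ layout is sketched below. The class and method names are hypothetical; only the unit and module responsibilities come from the description above.

```cpp
// Illustrative C++ layout of the apparatus of Figs. 4 and 5 (units 401-405 and the four
// modules of a recognition unit). All names are hypothetical; bodies are stubs.
#include <optional>
#include <string>

class RecognitionUnit {                                    // first/second recognition unit (402/404)
public:
    std::optional<std::string> identify() {
        Joints joints = obtainingModule();                 // 4021: joint points of the set postures
        Data data = readingModule(joints);                 // 4022: joint data and timestamps
        Params params = calculationModule(data);           // 4023: matching parameter values
        return identificationModule(params);               // 4024: compare with matching conditions
    }
private:
    struct Joints {}; struct Data {}; struct Params {};    // empty placeholder types
    Joints obtainingModule() { return {}; }
    Data readingModule(const Joints&) { return {}; }
    Params calculationModule(const Data&) { return {}; }
    std::optional<std::string> identificationModule(const Params&) { return std::nullopt; }
};

class InterfaceSwitchingApparatus {                        // apparatus 400
    bool detectingUnit() { return false; }                 // 401: detect user information (stub)
    void displayUnit() {}                                  // 403: display prompt information (stub)
    void interfaceSwitchingProcessingUnit() {}             // 405: execute/cancel the operation (stub)
    RecognitionUnit firstRecognitionUnit;                  // 402
    RecognitionUnit secondRecognitionUnit;                 // 404
};
```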
The interface switching apparatus in the embodiments of the present invention may be implemented on a computer system, and the methods shown in Figs. 1 and 2 may both be implemented on an interface switching apparatus based on a computer system. Fig. 6 shows an embodiment of an interface switching apparatus implemented on a computer system. The interface switching apparatus in this embodiment may comprise a processor 601, a memory 602 and a communication interface 603, wherein:
the communication interface 603 is configured to communicate with a motion-sensing interaction device; all messages exchanged between the interface switching apparatus and the motion-sensing interaction device are sent and received through the communication interface 603. In particular, the communication interface 603 is configured to obtain the user's skeleton frame information from the motion-sensing interaction device; the memory 602 is configured to store program instructions; and the processor 601 is configured to call the program instructions stored in the memory 602 to perform the following operations: after user information is detected, identifying a first posture of the user from the user information; if the first posture of the user is an interface switching posture, displaying prompt information within a specified time, the prompt information being used to prompt the user to input a second posture; when user information is detected within the specified time, identifying a second posture of the user; and if the second posture of the user is a confirmation switching posture, executing the interface switching operation associated with the first posture.
The processor 601 may be a central processing unit (CPU), an application-specific integrated circuit (ASIC), or the like. The interface switching apparatus in this embodiment may comprise a bus 604, and the processor 601, the memory 602 and the communication interface 603 may be connected to and communicate with one another through the bus 604. The memory 602 may comprise entities having a storage function, such as a random access memory (RAM), a read-only memory (ROM) and a magnetic disk.
The processor 601 may also be configured to perform each of the steps described in the method embodiments of Figs. 1 and 2, which is not described in detail in this embodiment of the present invention.
The above describes the solution provided by the present invention in detail. Specific examples are used herein to explain the principles and implementation of the present invention, and the description of the above embodiments is only intended to help understand the method and core idea of the present invention. Meanwhile, persons of ordinary skill in the art may make modifications to the specific implementation and the application scope according to the idea of the present invention. In conclusion, the content of this specification shall not be construed as a limitation on the present invention.

Claims (18)

1. An interface switching method, characterized in that the method comprises:
after user information is detected, identifying a first posture of the user from the user information;
if the first posture of the user is an interface switching posture, displaying prompt information within a specified time, the prompt information being used to prompt the user to input a second posture;
when user information is detected within the specified time, identifying a second posture of the user;
if the second posture of the user is a confirmation switching posture, executing the interface switching operation associated with the first posture.
2. The method according to claim 1, characterized in that, after identifying the second posture of the user, the method further comprises:
if the second posture of the user is a cancellation switching posture, cancelling the interface switching operation associated with the first posture; or
if the second posture of the user is neither a confirmation switching posture nor a cancellation switching posture, continuing to detect user information, and returning to the step of identifying a second posture of the user when user information is detected within the specified time.
3. The method according to claim 1 or 2, characterized in that, after prompting the user to input a second posture, the method further comprises:
when no user information is detected within the specified time, cancelling the interface switching operation associated with the first posture.
4. The method according to any one of claims 1 to 3, characterized in that identifying the first posture or the second posture of the user comprises:
obtaining the joint points involved in a set posture;
reading data of the joint points from the detected user information, wherein the user information comprises skeleton frame information of the user, and the skeleton frame information comprises joint point information and timestamp information;
calculating a matching parameter value of the set posture according to the data of the joint points;
identifying the first posture or the second posture of the user according to the matching parameter value.
5. The method according to claim 4, characterized in that obtaining the joint points involved in a set posture comprises:
determining the current interface type and the postures set under the current interface, and obtaining the joint points involved in the postures set under the current interface;
and calculating a matching parameter value of the set posture according to the data of the joint points comprises:
calculating the matching parameter values of the postures set under the current interface according to the data of the joint points.
6. The method according to claim 4, characterized in that obtaining the joint points involved in a set posture comprises:
determining the default postures of the motion-sensing interaction system, and obtaining the joint points involved in the default postures;
and calculating a matching parameter value of the set posture according to the data of the joint points comprises:
calculating the matching parameter values of the default postures according to the data of the joint points.
7. The method according to claim 5 or 6, characterized in that identifying the first posture or the second posture of the user according to the matching parameter value comprises:
comparing the matching parameter value with the matching conditions of the postures set under the current interface, or comparing the matching parameter value with the matching conditions of the default postures of the motion-sensing interaction system;
determining the posture corresponding to the matching parameter value that satisfies a matching condition, the determined posture being the first posture or the second posture of the user.
8. The method according to claim 4, characterized in that, when the set posture is an action,
reading data of the joint points from the detected user information comprises:
reading, from a plurality of consecutive frames of the user's skeleton frame information, the joint point data corresponding to the joint points involved in the set posture and the timestamp information of each skeleton frame;
and calculating a matching parameter value of the set posture according to the data of the joint points comprises:
calculating the displacement of the joint points according to the joint point data and the timestamp information.
9. The method according to claim 4, characterized in that, when the set posture is a pose,
reading data of the joint points from the detected user information comprises:
reading, from the user's skeleton frame information, the joint point data corresponding to the joint points involved in the set posture;
and calculating a matching parameter value of the set posture according to the data of the joint points comprises:
calculating the bone angles between joint points according to the joint point data.
10. An interface switching apparatus, characterized in that the apparatus comprises:
a detecting unit, configured to detect user information;
a first recognition unit, configured to identify a first posture of the user from the user information after the detecting unit detects the user information;
a display unit, configured to display prompt information within a specified time when the first posture of the user is an interface switching posture, the prompt information being used to prompt the user to input a second posture;
a second recognition unit, configured to identify a second posture of the user when the detecting unit detects user information within the specified time;
an interface switching processing unit, configured to execute the interface switching operation associated with the first posture when the second posture of the user is a confirmation switching posture.
11. The apparatus according to claim 10, characterized in that:
the interface switching processing unit is further configured to cancel the interface switching operation associated with the first posture when the second posture of the user is a cancellation switching posture.
12. The apparatus according to claim 10 or 11, characterized in that:
the interface switching processing unit is further configured to cancel the interface switching operation associated with the first posture when no user information is detected within the specified time.
13. The apparatus according to any one of claims 10 to 12, characterized in that the first recognition unit or the second recognition unit comprises:
an obtaining module, configured to obtain the joint points involved in a set posture;
a reading module, configured to read data of the joint points from the detected user information, wherein the user information comprises skeleton frame information of the user, and the skeleton frame information comprises joint point information and timestamp information;
a calculation module, configured to calculate a matching parameter value of the set posture according to the data of the joint points;
an identification module, configured to identify the first posture or the second posture of the user according to the matching parameter value.
14. The apparatus according to claim 13, characterized in that:
the obtaining module is further configured to obtain the joint points involved in the postures set under the current interface;
the calculation module is further configured to calculate the matching parameter values of the postures set under the current interface according to the data of the joint points.
15. The apparatus according to claim 13, characterized in that:
the obtaining module is further configured to obtain the joint points involved in the default postures of the motion-sensing interaction system;
the calculation module is further configured to calculate the matching parameter values of the default postures according to the data of the joint points.
16. The apparatus according to claim 14 or 15, characterized in that:
the identification module is further configured to compare the matching parameter value with the matching conditions of the postures set under the current interface, or to compare the matching parameter value with the matching conditions of the default postures of the motion-sensing interaction system, and to determine the posture corresponding to the matching parameter value that satisfies a matching condition, the determined posture being the first posture or the second posture of the user.
17. The apparatus according to claim 13, characterized in that:
the reading module is further configured to, when the set posture is an action, read, from a plurality of consecutive frames of the user's skeleton frame information, the joint point data corresponding to the joint points involved in the set posture and the timestamp information of each skeleton frame;
the calculation module is further configured to calculate the displacement of the joint points according to the joint point data and the timestamp information.
18. The apparatus according to claim 13, characterized in that:
the reading module is further configured to, when the set posture is a pose, read, from the user's skeleton frame information, the joint point data corresponding to the joint points involved in the set posture;
the calculation module is further configured to calculate the bone angles between joint points according to the joint point data.
CN201280001467.7A Active CN103180803B (en) 2012-10-30 2012-10-30 Interface switching method and apparatus

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
PCT/CN2012/083721 WO2014067058A1 (en) 2012-10-30 2012-10-30 Interface switching method and apparatus

Publications (2)

Publication Number Publication Date
CN103180803A true CN103180803A (en) 2013-06-26
CN103180803B CN103180803B (en) 2016-01-13

Family

ID=48639389

Family Applications (1)

Application Number Title Priority Date Filing Date
CN201280001467.7A Active CN103180803B (en) 2012-10-30 2012-10-30 Interface switching method and apparatus

Country Status (2)

Country Link
CN (1) CN103180803B (en)
WO (1) WO2014067058A1 (en)

Families Citing this family (1)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
WO2015191323A1 (en) 2014-06-10 2015-12-17 3M Innovative Properties Company Nozzle assembly with external baffles

Patent Citations (4)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN101729808A (en) * 2008-10-14 2010-06-09 Tcl集团股份有限公司 Remote control method for television and system for remotely controlling television by same
CN102023798A (en) * 2009-09-17 2011-04-20 宏正自动科技股份有限公司 Method and apparatus for switching of kvm switch ports using gestures on a touch panel
WO2012005893A2 (en) * 2010-06-29 2012-01-12 Microsoft Corporation Skeletal joint recognition and tracking system
CN102749993A (en) * 2012-05-30 2012-10-24 无锡掌游天下科技有限公司 Motion recognition method based on skeleton node data

Cited By (10)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN103472920A (en) * 2013-09-13 2013-12-25 通号通信信息集团有限公司 Action-recognition-based medical image control method and system
CN104881421A (en) * 2014-12-15 2015-09-02 深圳市腾讯计算机系统有限公司 Three-dimensional image switching method and three-dimensional image switching device
CN104881421B (en) * 2014-12-15 2018-04-27 深圳市腾讯计算机系统有限公司 The switching method and device of a kind of 3-D graphic
CN104808788A (en) * 2015-03-18 2015-07-29 北京工业大学 Method for controlling user interfaces through non-contact gestures
CN104808788B (en) * 2015-03-18 2017-09-01 北京工业大学 A kind of method that non-contact gesture manipulates user interface
CN105929953A (en) * 2016-04-18 2016-09-07 北京小鸟看看科技有限公司 Operation guide method and apparatus in 3D immersive environment and virtual reality device
CN109062467A (en) * 2018-07-03 2018-12-21 Oppo广东移动通信有限公司 Split screen application switching method, device, storage medium and electronic equipment
CN109062467B (en) * 2018-07-03 2020-10-09 Oppo广东移动通信有限公司 Split screen application switching method and device, storage medium and electronic equipment
CN111435512A (en) * 2019-01-11 2020-07-21 北京嘀嘀无限科技发展有限公司 Service information acquisition method and device
CN112337087A (en) * 2020-09-28 2021-02-09 湖南泽途体育文化有限公司 Somatosensory interaction method and system applied to sports competition

Also Published As

Publication number Publication date
CN103180803B (en) 2016-01-13
WO2014067058A1 (en) 2014-05-08

Legal Events

Date Code Title Description
C06 Publication
PB01 Publication
C10 Entry into substantive examination
SE01 Entry into force of request for substantive examination
C14 Grant of patent or utility model
GR01 Patent grant