WO2014067058A1 - Interface switching method and apparatus - Google Patents

Interface switching method and apparatus Download PDF

Info

Publication number
WO2014067058A1
Authority
WO
WIPO (PCT)
Prior art keywords
posture
user
gesture
information
data
Prior art date
Application number
PCT/CN2012/083721
Other languages
French (fr)
Chinese (zh)
Inventor
宣曼 (Xuan Man)
黄晨 (Huang Chen)
薛传颂 (Xue Chuansong)
Original Assignee
华为技术有限公司 (Huawei Technologies Co., Ltd.)
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by 华为技术有限公司 (Huawei Technologies Co., Ltd.)
Priority to CN201280001467.7A (patent CN103180803B)
Priority to PCT/CN2012/083721 (patent WO2014067058A1)
Publication of WO2014067058A1

Classifications

    • A HUMAN NECESSITIES
    • A63 SPORTS; GAMES; AMUSEMENTS
    • A63F CARD, BOARD, OR ROULETTE GAMES; INDOOR GAMES USING SMALL MOVING PLAYING BODIES; VIDEO GAMES; GAMES NOT OTHERWISE PROVIDED FOR
    • A63F 13/00 Video games, i.e. games using an electronically generated display having two or more dimensions
    • A63F 13/40 Processing input control signals of video game devices, e.g. signals generated by the player or derived from the environment
    • A63F 13/42 Processing input control signals of video game devices, e.g. signals generated by the player or derived from the environment, by mapping the input signals into game commands, e.g. mapping the displacement of a stylus on a touch screen to the steering angle of a virtual vehicle
    • A HUMAN NECESSITIES
    • A63 SPORTS; GAMES; AMUSEMENTS
    • A63F CARD, BOARD, OR ROULETTE GAMES; INDOOR GAMES USING SMALL MOVING PLAYING BODIES; VIDEO GAMES; GAMES NOT OTHERWISE PROVIDED FOR
    • A63F 2300/00 Features of games using an electronically generated display having two or more dimensions, e.g. on a television screen, showing representations related to the game
    • A63F 2300/80 Features of games using an electronically generated display having two or more dimensions, e.g. on a television screen, showing representations related to the game, specially adapted for executing a specific type of game
    • A63F 2300/8011 Ball

Definitions

  • The present invention relates to the field of communication network technologies, and in particular to a method and apparatus for interface switching in a somatosensory interaction scenario. Background Art
  • Vision-based somatosensory interaction means that a computer captures the user's image through a camera and uses pattern recognition, artificial intelligence, and other technologies to understand the meaning of the user's actions, providing a more natural and intuitive way of interacting with the body.
  • The somatosensory interactive application system captures video frames containing user information through a camera, then obtains information about the user in the video frames (for example, joint point information) through image analysis, thereby determining the user's pose, and the motion formed by pose changes across consecutive video frames; the pose and motion of the user together constitute a posture, and the somatosensory interactive application system performs the corresponding feedback operation according to the instruction corresponding to the user's posture. This constitutes a complete vision-based somatosensory interaction process.
  • The manner of capturing such an instruction is: first, the posture input by the user is recognized, and when it matches the predetermined interface switching posture, the user is required to hold that posture for a period of time before the interface switching instruction is triggered.
  • For example, the user can exit a game by "stretching the left arm out and tilting it 45° from the body", and is asked to hold the gesture for a while to trigger the "exit game" operation; otherwise the operation is cancelled and the original game interface is retained. If the waiting time is set short, some unintended user movements are easily misjudged as an interface switching instruction.
  • the embodiment of the invention provides a method and a device for switching interfaces, which are used for improving the recognition accuracy of the interface switching posture instruction in the somatosensory interaction scenario and improving the user experience.
  • the method for interface switching provided by the embodiment of the present invention includes:
  • The first posture of the user is recognized from the user information; if the first posture of the user is an interface switching posture, prompt information is displayed within a specified time, the prompt information being used to prompt the user to input a second posture;
  • when user information is detected within the specified time, the second gesture of the user is recognized; if the second gesture of the user is the confirm-switching gesture, the interface switching operation associated with the first gesture is performed.
  • The method further includes: if the second posture of the user is a cancel-switching gesture, canceling the interface switching operation associated with the first gesture; or, if the second gesture of the user is neither the confirm-switching gesture nor the cancel-switching gesture, continuing to detect the user information, and returning to the step of recognizing the second posture of the user when user information is detected within the specified time.
  • The method further includes: when no user information is detected within the specified time, the interface switching operation associated with the first gesture is cancelled.
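The two-step flow described above (a switch posture, then a confirm or cancel posture within a time window) can be sketched as a small state machine. This is a minimal illustration of the claimed flow, not the patent's implementation; the class, method, and gesture label names are assumptions.

```python
import time


class InterfaceSwitchFSM:
    """Sketch of the two-step interface-switch flow: IDLE -> PROMPT after
    the switch posture, then back to IDLE on confirm, cancel, or timeout.
    All names here are illustrative."""

    def __init__(self, prompt_timeout=5.0, clock=time.monotonic):
        self.prompt_timeout = prompt_timeout  # the "specified time"
        self.clock = clock
        self.state = "IDLE"
        self.prompt_started = None

    def on_posture(self, posture):
        """Feed one recognized posture; return the action to take, if any."""
        now = self.clock()
        if self.state == "IDLE":
            if posture == "interface_switch":
                self.state = "PROMPT"
                self.prompt_started = now
                return "show_prompt"
            return None
        # PROMPT state: the time window is checked first
        if now - self.prompt_started > self.prompt_timeout:
            self.state = "IDLE"
            return "cancel_switch"  # no answer within the specified time
        if posture == "confirm":
            self.state = "IDLE"
            return "perform_switch"
        if posture == "cancel":
            self.state = "IDLE"
            return "cancel_switch"
        return None  # neither confirm nor cancel: keep detecting
```

A posture that is neither confirm nor cancel leaves the machine in the prompt state, mirroring the "continue to detect user information" branch above.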
  • Identifying the first posture or the second posture of the user includes: obtaining the joint points involved in the set posture; reading the data of those joint points from the detected user information, where the user information includes user skeleton frame information, and the skeleton frame information includes joint point information and timestamp information; calculating a matching parameter value of the set posture according to the joint point data; and identifying the first posture or the second posture of the user according to the matching parameter value.
  • Obtaining the joint points involved in the set posture includes: determining the current interface type and the postures set under the current interface, and obtaining the joint points involved in those postures; calculating the matching parameter value of the set posture according to the joint point data then includes: calculating, from the joint point data, the matching parameter value of the posture set under the current interface.
  • Alternatively, obtaining the joint points involved in the set posture includes: determining a default posture of the somatosensory interactive application system and obtaining the joint points it involves; calculating the matching parameter value of the set posture according to the joint point data then includes: calculating the matching parameter value of the default posture from the joint point data.
  • Reading the joint point data from the detected user information includes: reading, from a plurality of consecutive user skeleton frames, the joint point data corresponding to the joint points and the timestamp information of each frame; calculating the matching parameter value of the set posture then includes: calculating the displacement of the joint point based on the joint point data and the timestamp information.
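As a minimal sketch of the displacement computation just described, the function below takes timestamped joint samples read from consecutive skeleton frames and returns the net displacement along one axis together with the elapsed time. The function name, tuple layout, and the choice of the x axis are assumptions, not from the patent.

```python
def joint_displacement(samples):
    """Net displacement of one joint over consecutive skeleton frames.

    `samples` is a list of (timestamp, (x, y, z)) tuples, oldest first,
    as read from consecutive user skeleton frames. Returns the signed
    x-axis displacement and the elapsed time between first and last
    sample. Axis choice and layout are illustrative assumptions.
    """
    if len(samples) < 2:
        return 0.0, 0.0
    (t0, p0), (t1, p1) = samples[0], samples[-1]
    return p1[0] - p0[0], t1 - t0
```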
  • Alternatively, reading the joint point data from the detected user information includes: reading, from the user skeleton frame information, the joint point data corresponding to the joint points involved in the set posture; calculating the matching parameter value of the set posture then includes: calculating the angle of the bone between the joint points according to the joint point data.
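The bone angle between joint points can be computed with the standard vector-angle formula; the sketch below is an illustration of that computation, not code from the patent.

```python
import math


def bone_angle(a, b, c):
    """Angle (in degrees) at joint b between bones b->a and b->c.

    a, b, c are (x, y, z) joint coordinates. Uses the standard formula
    angle = arccos(((a-b) . (c-b)) / (|a-b| |c-b|)); illustrative only.
    """
    ba = [a[i] - b[i] for i in range(3)]
    bc = [c[i] - b[i] for i in range(3)]
    dot = sum(ba[i] * bc[i] for i in range(3))
    norm = math.dist(a, b) * math.dist(c, b)
    return math.degrees(math.acos(dot / norm))
```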
  • Identifying the first posture or the second posture of the user according to the matching parameter value includes: comparing the matching parameter value with the matching condition of the posture set under the current interface, or comparing the matching parameter value with the matching condition of a default posture of the somatosensory interaction application system; and determining the posture corresponding to a matching parameter value that satisfies the matching condition, the determined posture being the user's first posture or second posture.
  • the apparatus for interface switching includes: a detecting unit, configured to detect user information;
  • a first identifying unit configured to: after the detecting unit detects the user information, identify the first posture of the user from the user information;
  • a display unit configured to display prompt information within a specified time when the first gesture of the user is an interface switching gesture, where the prompt information is used to prompt the user to input a second gesture;
  • a second identifying unit configured to: when the detecting unit detects the user information within the specified time, identify the second posture of the user;
  • an interface switching processing unit, configured to perform the interface switching operation associated with the first gesture when the second gesture of the user is the confirm-switching gesture.
  • the interface switching processing unit is further configured to cancel the interface switching operation associated with the first gesture when the second posture of the user is a cancel switching gesture.
  • The interface switching processing unit is further configured to cancel the interface switching operation associated with the first gesture when no user information is detected within the specified time.
  • The first identification unit or the second identification unit contains:
  • an obtaining module, configured to obtain the joint points involved in the set posture; a reading module, configured to read the joint point data from the detected user information, where the user information includes user skeleton frame information, and the skeleton frame information includes joint point information and timestamp information; and a calculation module, configured to calculate the matching parameter value of the set posture according to the joint point data;
  • an identification module configured to identify the first posture or the second posture of the user according to the matching parameter value.
  • The obtaining module is further configured to obtain the joint points involved in the posture set under the current interface; the calculation module is further configured to calculate the matching parameter value of the posture set under the current interface according to the joint point data.
  • the obtaining module is further configured to obtain a joint point involved in a default posture of the somatosensory interactive application system; the calculating module is further used for A matching parameter value of the default posture is calculated based on the data of the joint point.
  • The reading module is further configured to: when the set posture is an action posture, read, from a plurality of consecutive user skeleton frames, the joint point data corresponding to the joint points and the timestamp information of each frame; the calculation module is further configured to calculate the displacement of the joint point according to the joint point data and the timestamp information.
  • The reading module is further configured to: when the set posture is a static pose, read, from the user skeleton frame information, the joint point data corresponding to the joint points involved in the set posture; the calculation module is further configured to calculate the bone angle between the joint points according to the joint point data.
  • The identification module is further configured to compare the matching parameter value with the matching condition of the posture set under the current interface, or with the matching condition of a default posture of the somatosensory interaction application system, and to determine the posture corresponding to a matching parameter value that satisfies the matching condition as the user's first posture or second posture.
  • FIG. 1 is a flowchart of a method for switching an interface according to an embodiment of the present invention
  • FIG. 2 is a flowchart of a method for identifying a first posture or a second posture of a user according to an embodiment of the present invention
  • FIGS. 3A-3D show graphical user interfaces of a device according to an embodiment of the present invention at different points in time during interface switching.
  • FIG. 4 is a block diagram of the interface switching device according to an embodiment of the present invention
  • FIG. 5 is a schematic diagram of the interface switching device according to another embodiment of the present invention
  • FIG. 6 is a structural diagram of a computer-system-based interface switching apparatus according to still another embodiment of the present invention. Detailed Description
  • An embodiment of the present invention provides a method for interface switching, which can be used for interface switching in a somatosensory interaction scenario.
  • The specific interface operation includes one of the following: exiting the application, returning to the upper-level interface, returning to the main interface, calling up a menu, or another operation that causes the interface to change.
  • the method includes: Step 101: After detecting the user information, identify the first posture of the user from the user information.
  • The method for detecting the user information may be: acquiring the user's skeleton frame information and determining whether the acquired skeleton frame information includes valid joint point data. If valid joint point data is included, user information is detected; otherwise, user information is not detected and detection continues.
  • the method for recognizing the first posture of the user may be: obtaining a joint point involved in the preset posture; and reading valid joint point data corresponding to the joint point involved in the preset posture from the detected user information; The effective joint point data calculates a matching parameter value that matches the preset posture; and then the user's first posture is identified based on the matching parameter value.
  • The first posture of the user may be an action posture or a static pose.
  • The somatosensory interactive application system is the execution subject, and the user's skeleton frame information is obtained from the somatosensory interaction device (in this embodiment, the Kinect somatosensory game product).
  • The skeleton frame information is obtained through the device's SDK (Software Development Kit).
  • The Kinect device provides a skeleton frame information extraction function, NuiSkeletonGetNextFrame; by calling this function, the application can extract the user skeleton frame information of the current time from the Kinect device. Regardless of whether a user is currently in front of the Kinect device, the device generates a frame of user skeleton frame information.
  • The user skeleton frame information is represented by the NUI_SKELETON_FRAME data structure, which includes joint point information (represented by the NUI_SKELETON_DATA data structure) and timestamp information (represented by the liTimeStamp parameter). The joint point information contains a flag, eTrackingState, indicating whether valid joint point data is present. If the eTrackingState parameter indicates a tracked user, user information is detected; otherwise, no user information is detected and detection continues.
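The detection predicate just described can be sketched as follows. The `frame` dictionary is a stand-in for the NUI_SKELETON_FRAME structure and its field names are assumptions for illustration; the real Kinect SDK exposes this via NUI_SKELETON_DATA.eTrackingState in C++.

```python
def detect_user(frame):
    """Return True if a skeleton frame contains valid joint data.

    `frame` is a dict standing in for NUI_SKELETON_FRAME:
    {"timestamp": ..., "skeletons": [{"tracked": bool, ...}, ...]}.
    The field names are illustrative, not the SDK's.
    """
    return any(s.get("tracked") for s in frame.get("skeletons", []))
```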
  • Step 102: If the first gesture of the user is an interface switching gesture, display prompt information within a specified time; the prompt information is used to prompt the user to input the second gesture.
  • Step 102 may further include the following: when the user's first posture is not an interface switching gesture, continue detecting user information; or, when the user's first gesture is not an interface switching gesture but an operation is associated with that first gesture, the associated operation may also be performed.
  • the prompt information may include a second gesture selection item and an operation indication of the second posture.
  • the display manner of the prompt information may be a text, a picture, or the like, and a display effect such as blinking, fading, and the like may also be used.
  • The second posture selection items may be the two options "confirm switching posture" and "cancel switching posture", displayed using a text box or text; correspondingly, the operation indications for "confirm switching posture" and "cancel switching posture" can show the user how to operate by using text, symbols, pictures, or animations.
  • The second posture may be an action posture or a static pose.
  • Step 103 Identify the user's second gesture when the user information is detected within the specified time.
  • the method of detecting the user information is the same as the method of detecting the user information in step 101, and the method of recognizing the second posture of the user is the same as the method of recognizing the first posture of the user in step 101.
  • the step 103 may further include ignoring the interface switching operation associated with the first gesture if the user information is not detected within the specified time.
  • Step 104: If the second posture of the user is the confirm-switching posture, perform the interface switching operation associated with the first posture. Further, if the second gesture is the cancel-switching gesture, the interface switching operation associated with the first gesture is ignored.
  • An embodiment of the present invention provides a method for recognizing the user's first posture after user information has been detected in the interface switching method; the method for recognizing the second posture is similar and is not described again here. Referring to Figure 2, the method includes:
  • Step 201: Obtain the joint points involved in the set posture.
  • This step specifically includes: determining the current interface type and the postures set under the current interface, and obtaining the joint points involved in those postures.
  • the step may specifically include: determining all the default gestures of the somatosensory interaction application system, and obtaining joint points involved in all the default gestures of the system.
  • Through the interface switching state machine, the current interface is determined to be the application interface before switching, and the only posture set under this interface is the interface switching posture. The interface switching posture is a "left arm extended at 45 degrees" pose involving 7 joint points: SHOULDER_CENTER (center shoulder joint point), SHOULDER_RIGHT (right shoulder joint point), ELBOW_RIGHT (right elbow joint point), WRIST_RIGHT (right wrist joint point), SHOULDER_LEFT (left shoulder joint point), ELBOW_LEFT (left elbow joint point), and WRIST_LEFT (left wrist joint point).
  • Through the interface switching state machine, the current interface is determined to be the switching prompt interface, and the postures set under this interface are the confirm-switching posture and the cancel-switching posture. The confirm-switching posture is a left-hand swing to the right, and the joint point involved is HAND_LEFT (left hand joint point); the cancel-switching posture is a left-hand swing to the left, and the joint point involved is likewise HAND_LEFT (left hand joint point).
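The per-interface posture sets and their joint points described above can be organized as a small registry; reading each joint only once per frame falls out of taking the union. The dictionary layout and function name are illustrative assumptions; the joint names follow the Kinect naming used in the description.

```python
# Illustrative registry: interface state -> postures set under it -> joints.
POSTURES_BY_INTERFACE = {
    "application": {
        "interface_switch": [  # "left arm extended at 45 degrees" pose
            "SHOULDER_CENTER", "SHOULDER_RIGHT", "ELBOW_RIGHT", "WRIST_RIGHT",
            "SHOULDER_LEFT", "ELBOW_LEFT", "WRIST_LEFT",
        ],
    },
    "switch_prompt": {
        "confirm_switch": ["HAND_LEFT"],  # left-hand swing to the right
        "cancel_switch": ["HAND_LEFT"],   # left-hand swing to the left
    },
}


def joints_for_interface(interface):
    """Union of joint points needed for all postures set under `interface`,
    so each joint's data is read only once per skeleton frame."""
    joints = set()
    for joint_list in POSTURES_BY_INTERFACE[interface].values():
        joints.update(joint_list)
    return joints
```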
  • Step 202 Read data of the joint point from the detected user information; wherein the user information includes user skeleton frame information, and the user skeleton frame information includes joint point information and time stamp information.
  • the step 202 may specifically include: specifically reading the joint point data corresponding to the joint point involved in the set posture from the user skeleton frame information.
  • the step 202 may include: reading, from the plurality of consecutive user skeleton frame information, the joint point data of the joint point and the time of the user skeleton frame related to the set posture. Stamp information.
  • the joint point data only needs to be read once for each frame of bone information.
  • The posture set under the current interface is the interface switching posture. Because this posture is a static pose, only the coordinate data of the seven joint points it involves is read from the current skeleton frame information, as shown in Table 1.
  • the two postures involve the same joint point (left hand joint point), so the joint point data only needs to be read once for each frame of bone information.
  • Because the two postures are action postures, the left hand joint point coordinate data of multiple consecutive user skeleton frames and the timestamp information of each frame must be read continuously, with the timestamp at which detection starts taken as the start timestamp, as shown in Table 2.
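Continuously reading joint data across consecutive frames, as just described, amounts to keeping a rolling buffer of timestamped samples. The sketch below is an illustrative data structure; the class name and the one-second window are assumptions, not values from the patent.

```python
from collections import deque


class JointTrack:
    """Rolling buffer of (timestamp, coords) samples for one joint,
    used to match action postures over consecutive frames.
    The window length is an illustrative assumption."""

    def __init__(self, window=1.0):
        self.window = window  # seconds of history to keep
        self.samples = deque()

    def push(self, timestamp, coords):
        """Append one frame's sample and drop samples older than the window."""
        self.samples.append((timestamp, coords))
        while self.samples and timestamp - self.samples[0][0] > self.window:
            self.samples.popleft()
```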
  • Step 203 Calculate a matching parameter value of the set posture according to the data of the joint point.
  • Calculating the matching parameter values of the set postures includes: calculating the matching parameter values of the postures set under the current interface, or of all default postures of the somatosensory interaction application system. When the set postures share the same matching parameter, the value of that parameter only needs to be calculated once.
  • The matching parameters are 4 bone angles. Take ∠abc as an example, formed by the center shoulder joint point a, the right shoulder joint point b, and a third joint point c from Table 1; the angle is calculated with the standard vector-angle formula ∠abc = arccos(((a − b) · (c − b)) / (|a − b| |c − b|)).
  • The matching parameter for the confirm-switching posture is the displacement of the left hand joint point, and the matching parameter for the cancel-switching posture is also the displacement of the left hand joint point. Since the two postures share the same matching parameter, its value is calculated only once.
  • Step 204 Identify the first posture of the user according to the matching parameter value.
  • the step may be: comparing the matching parameter value with a matching condition of the posture set under the current interface, or comparing the matching parameter value with a matching condition of a default posture of the somatosensory interaction application system. Determining a posture corresponding to the matching parameter value that matches the matching condition, and determining the posture is the first posture of the user.
  • The posture set under the current interface is the interface switching posture; if the matching conditions are satisfied, the user's posture is the interface switching posture, otherwise it is not. To avoid the influence of unintended user movements, the set posture may be recognized only when the user information detected in several consecutive frames all matches it.
  • When the postures set under the current interface are the confirm-switching posture and the cancel-switching posture, it is determined from the calculated displacement value s of the left hand joint point whether s > 0.3 m or s < -0.3 m is satisfied. If s > 0.3 m, the user gesture is identified as the confirm-switching gesture; if s < -0.3 m, it is identified as the cancel-switching gesture; if s falls in any other range, the user gesture is neither the confirm-switching posture nor the cancel-switching posture.
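The threshold test above can be written as a one-line classifier. This is a sketch of the ±0.3 m rule from the description; the function name and the sign convention (rightward displacement positive) are assumptions.

```python
def classify_swing(s, threshold=0.3):
    """Classify a left-hand displacement s (meters) per the rule above:
    s > +0.3 m -> confirm switch, s < -0.3 m -> cancel switch, else neither.
    Rightward-positive sign convention is an assumption."""
    if s > threshold:
        return "confirm_switch"
    if s < -threshold:
        return "cancel_switch"
    return None
```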
  • FIGS. 3A-3D take exiting a game as an example; the display interfaces at different points in time are as follows:
  • Before the interface is switched, that is, before the exit-game gesture is received, as shown in FIG. 3A, the device 300 includes a display screen 301, and the current game screen 302 is displayed on the display screen.
  • A prompt interface is shown in FIG. 3B: it includes the game screen 302 before exiting, and prompt information prompting the user to input the second gesture.
  • the prompt information appears in the lower left corner of the game screen 302 in a superimposed manner, and includes a "confirm exit", a "cancel exit” second gesture prompt item 303, and a second gesture operation instruction 304 corresponding to the two prompt items.
  • The second gesture prompt items are displayed as text, with the text blinking to attract the user's attention. The second gesture operation indications include the words "wave left" and "wave right" together with left/right arrow graphic symbols, indicating that waving the hand to the right is the confirm-exit gesture, which triggers confirming the exit-game operation, and waving the hand to the left is the cancel-exit gesture, which triggers cancelling the exit-game operation.
  • Another prompt interface is shown in FIG. 3C: it includes a timing progress disk 305; as time passes, the black sector area gradually shrinks, and the reduction of the black sector area indicates the decrease in the remaining time allowed for the user to input the second gesture.
  • The prompt information includes a second gesture prompt item 306 and a second gesture operation indication 307. The second gesture prompt item 306 contains "confirm exit" and "cancel exit", displayed in a text box manner; the corresponding second gesture operation indication 307 contains the words "left hand lift" together with posture diagrams, used to indicate the left-hand gestures.
  • the prompt information appears in a fade-in manner, and when exiting the prompt interface, the prompt information disappears in a fade-out manner.
  • The interface displayed after exiting is as shown in FIG. 3D:
  • the game menu screen 308 is included.
  • the interface switching apparatus 400 includes: a detecting unit 401, configured to detect user information;
  • the first identifying unit 402 is configured to: after the detecting unit detects the user information, identify the first posture of the user from the user information;
  • the display unit 403 is configured to display prompt information in a specified time when the first posture of the user is an interface switching posture, where the prompt information is used to prompt the user to input the second posture;
  • a second identifying unit 404 configured to: when the detecting unit detects user information within the specified time, identify a second posture of the user;
  • the interface switching processing unit 405 is configured to perform an interface switching operation associated with the first gesture when the second posture of the user is a confirmation switching posture.
  • the interface switching processing unit 405 is further configured to cancel the interface switching operation associated with the first gesture when the second gesture of the user is to cancel the switching gesture.
  • the interface switching processing unit 405 is further configured to cancel the interface switching operation associated with the first gesture when the user information is not detected within the specified time.
  • the first identifying unit 402 may further include:
  • a first obtaining module 4021 configured to obtain a joint point involved in the set posture
  • a first reading module 4022, configured to read the joint point data from the detected user information, where the user information includes user skeleton frame information, and the skeleton frame information includes joint point information and timestamp information;
  • a first calculating module 4023 configured to calculate a matching parameter value of the set posture according to the data of the joint point
  • the first identification module 4024 is configured to identify the first posture or the second posture of the user according to the matching parameter value
  • the first obtaining module 4021 is further configured to obtain a joint point involved in the posture set under the current interface
  • the first calculating module 4023 is further configured to calculate, according to the data of the joint point, a matching parameter value of the posture set in the current interface;
  • the first obtaining module 4021 is further configured to obtain a joint point involved in a default posture of the somatosensory interactive application system
  • the first calculating module 4023 is further configured to calculate a matching parameter value of the default posture according to the data of the joint point;
  • the first identification module 4024 is further configured to compare the matching parameter value with the matching condition of the posture set under the current interface, or with the matching condition of a default posture of the somatosensory interaction application system, and to determine the posture corresponding to a matching parameter value that satisfies the matching condition as the user's first posture or second posture;
  • the first reading module 4022 is further configured to: when the set posture is an action posture, read a joint point related to the set posture from a plurality of consecutive user bone frame information. Corresponding joint point data and time stamp information of the user skeleton frame;
  • the first calculating module 4023 is further configured to calculate the displacement of the joint point according to the joint point data and the timestamp information.
  • the first reading module 4022 is further configured to: when the set posture is a static pose, read, from the user skeleton frame information, the joint point data corresponding to the joint points involved in the set posture;
  • the first calculating module 4023 is further configured to calculate a bone angle between the joint points according to the joint point data; Similar to the first identifying unit 402, the second identifying unit 404 may further include four modules: a second obtaining module 4041, a second reading module 4042, a second calculating module 4043, and a second identifying module 4044.
  • the function of each module of the second identification unit 404 and the function of the corresponding module of the first identification unit 402 are also similar, and details are not described herein again.
  • the interface switching device in the embodiment of the present invention can be implemented based on a computer system, and the methods shown in FIG. 1 to FIG. 2 can be implemented in a computer system-based interface switching device.
  • Figure 6 illustrates an embodiment of an interface switching device implemented in accordance with a computer system.
  • the interface switching device in this embodiment may include: a processor 601, a memory 602, and a communication interface 603, where:
• the communication interface 603 is configured to communicate with the somatosensory interaction device; messages exchanged between the interface switching device and the somatosensory interaction device are transmitted and received through the communication interface 603. Specifically, the communication interface 603 is configured to acquire the user's skeleton frame information from the somatosensory interaction device. The memory 602 is configured to store program instructions. The processor 601 is configured to invoke the program instructions stored in the memory 602 and perform the following operations: after user information is detected, identify the user's first posture from the user information; if the user's first posture is an interface switching posture, display prompt information within a specified time, the prompt information being used to prompt the user to input a second posture; when user information is detected within the specified time, recognize the user's second posture; and if the user's second posture is a switching confirmation posture, perform the interface switching operation associated with the first posture.
  • the processor 601 can be a central processing unit (CPU), an application-specific integrated circuit (ASIC), or the like.
  • the interface switching device in this embodiment may include a bus 604.
  • the processor 601, the memory 602, and the communication interface 603 can be connected and communicated via the bus 604.
• the memory 602 may include: a random access memory (RAM), a read-only memory (ROM), a disk, or another medium having a storage function;
• the processor 601 can also be used to perform the steps described with reference to FIG. 1 to FIG. 2 in the method embodiments, which are not described in detail herein again.
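The operation sequence carried out by the processor 601 above (detect user information, identify a first posture, display a prompt, identify a second posture, then switch or cancel) can be sketched as a small state machine. This is an illustrative sketch only, not the patent's implementation: the gesture labels, the frame-count timeout, and the input stream of already-recognized gestures are all assumptions introduced for the example.

```python
# Hypothetical posture labels; the patent names the roles, not these constants.
SWITCH_GESTURE = "interface_switch"
CONFIRM_GESTURE = "confirm_switch"
CANCEL_GESTURE = "cancel_switch"

def run_switch_flow(gesture_stream, timeout_frames=30):
    """Consume recognized postures one per frame and return the outcome.

    gesture_stream yields either a recognized posture label or None
    (no user information in that frame).
    """
    state = "idle"
    frames_waited = 0
    for gesture in gesture_stream:
        if state == "idle":
            if gesture == SWITCH_GESTURE:
                state = "prompting"   # display the prompt, start the timer
                frames_waited = 0
        elif state == "prompting":
            frames_waited += 1
            if gesture == CONFIRM_GESTURE:
                return "switch"       # perform the associated interface switch
            if gesture == CANCEL_GESTURE:
                return "cancel"       # cancel the pending switch
            if frames_waited >= timeout_frames:
                return "timeout"      # no confirmation within the specified time
    return "no_action"
```

For example, a switch posture followed by a confirmation posture yields a switch, while a switch posture followed only by empty frames times out and leaves the current interface untouched.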

Landscapes

  • Engineering & Computer Science (AREA)
  • Multimedia (AREA)
  • Human Computer Interaction (AREA)
  • User Interface Of Digital Computer (AREA)

Abstract

Embodiments of the present invention provide an interface switching method and apparatus. The method comprises: after detecting user information, identifying a first posture of a user from the user information; if the first posture of the user is an interface switching posture, displaying prompt information in a specified time period, wherein the prompt information is used to prompt the user to input a second posture; when detecting user information in the specified time period, identifying the second posture of the user; and if the second posture of the user is a switching acknowledgement posture, executing an interface switching operation associated with the first posture. The present invention overcomes the defects of a high misjudgment rate or a long waiting time in recognizing interface switching posture instructions in somatosensory interaction scenarios, raises the accuracy of posture control, and improves the user experience.

Description

Method and device for interface switching
Technical field
The present invention relates to the field of communication network technologies, and in particular, to a method and an apparatus for interface switching in a somatosensory interaction scenario.

Background art
Vision-based somatosensory interaction means that a computer captures the user's image through a camera and uses pattern recognition, artificial intelligence, and other technologies to understand the meaning of the user's movements, providing a more natural and intuitive mode of interaction. It is currently widely used in scenarios such as augmented reality and somatosensory game control. During somatosensory interaction, the somatosensory interaction application system captures video frames containing user information through a camera and then obtains the user's information in those frames (for example, joint point information) through image analysis, thereby determining the user's pose as well as the gesture formed by pose changes across consecutive video frames. The user's poses and gestures together constitute postures, and the somatosensory interaction application system performs the corresponding feedback operation according to the instruction associated with the user's posture. This constitutes a complete vision-based somatosensory interaction process.
In the prior art, the interface switching posture is judged as follows: the posture input by the user is first recognized, and when it matches the prescribed interface switching posture, the user is required to hold that posture for a period of time before the interface switching instruction is triggered. For example, while playing a somatosensory game on Microsoft's Kinect device, the user can exit the game with a posture in which the left arm is held straight at 45° downward from the body, and the user must hold this posture for a period of time before the "exit game" operation is triggered; otherwise the operation is cancelled and the original game interface is retained. If the waiting time is set too short, some of the user's unintentional movements are easily misjudged as interface switching instructions; if it is set too long, the user has to hold a posture for a long time, and the user experience is poor. Therefore, the prior art suffers from a high misjudgment rate or a long waiting time when executing interface switching instructions.

Summary of the invention

Embodiments of the present invention provide a method and an apparatus for interface switching, which are used to improve the recognition accuracy of interface switching posture instructions in somatosensory interaction scenarios and to improve the user experience.
In a first aspect, an embodiment of the present invention provides a method for interface switching, including:
after user information is detected, identifying the user's first posture from the user information; if the user's first posture is an interface switching posture, displaying prompt information within a specified time, the prompt information being used to prompt the user to input a second posture;
when user information is detected within the specified time, identifying the user's second posture; and if the user's second posture is a switching confirmation posture, performing the interface switching operation associated with the first posture.
In a first possible implementation of the first aspect, after the user's second posture is identified, the method further includes: if the user's second posture is a switching cancellation posture, cancelling the interface switching operation associated with the first posture; or, if the user's second posture is neither the switching confirmation posture nor the switching cancellation posture, continuing to detect user information and returning to the step of identifying the user's second posture when user information is detected within the specified time.
With reference to the first aspect or the first possible implementation of the first aspect, in a second possible implementation, after the user is prompted to input the second posture, the method further includes: when no user information is detected within the specified time, cancelling the interface switching operation associated with the first posture.
With reference to the first aspect, the first possible implementation of the first aspect, or the second possible implementation of the first aspect, in a third possible implementation, identifying the user's first posture or second posture includes: obtaining the joint points involved in a set posture; reading the data of those joint points from the detected user information, where the user information includes user skeleton frame information, and the skeleton frame information includes joint point information and timestamp information; calculating the matching parameter values of the set posture according to the data of the joint points; and identifying the user's first posture or second posture according to the matching parameter values.
With reference to the third possible implementation of the first aspect, in a fourth possible implementation, obtaining the joint points involved in the set posture includes: determining the current interface type and the postures set under the current interface, and obtaining the joint points involved in the postures set under the current interface; and calculating the matching parameter values of the set posture according to the data of the joint points includes: calculating the matching parameter values of the postures set under the current interface according to the data of the joint points.
With reference to the third possible implementation of the first aspect, in a fifth possible implementation, obtaining the joint points involved in the set posture includes: determining the default postures of the somatosensory interaction game application system, and obtaining the joint points involved in the default postures; and calculating the matching parameter values of the set posture according to the data of the joint points includes: calculating the matching parameter values of the default postures according to the data of the joint points.
With reference to the third possible implementation of the first aspect, in a sixth possible implementation, when the set posture is an action posture, reading the data of the joint points from the detected user information includes: reading, from multiple consecutive frames of user skeleton frame information, the joint point data corresponding to the joint points involved in the set posture and the timestamp information of the user skeleton frames; and calculating the matching parameter values of the set posture according to the data of the joint points includes: calculating the displacement of the joint points according to the joint point data and the timestamp information.
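The displacement computation for an action posture described above can be sketched as follows. This is a minimal illustration under assumed data layouts: each frame is represented as a timestamp in milliseconds plus a mapping from joint name to (x, y, z) coordinates, which is not the patent's data structure.

```python
import math

def joint_displacement(frames, joint):
    """Displacement of one joint across consecutive skeleton frames.

    frames: list of (timestamp_ms, {joint_name: (x, y, z)}) tuples,
    ordered by time. Returns (total_displacement, elapsed_ms), from
    which an average velocity could also be derived.
    """
    (t0, joints0), (t1, joints1) = frames[0], frames[-1]
    # Euclidean distance between the joint's first and last positions
    disp = math.dist(joints0[joint], joints1[joint])
    return disp, t1 - t0
```

A recognizer could then compare the returned displacement (and the elapsed time) against the matching conditions of an action posture such as a hand wave.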
With reference to the third possible implementation of the first aspect, in a seventh possible implementation, when the set posture is an attitude posture, reading the data of the joint points from the detected user information includes: reading, from the user skeleton frame information, the joint point data corresponding to the joint points involved in the set posture; and calculating the matching parameter values of the set posture according to the data of the joint points includes: calculating the bone angles between the joint points according to the joint point data.
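The bone angle between joint points for an attitude posture can be computed from the joint coordinates with a standard dot-product formula. This is a sketch, not the patent's formula; the patent only states that bone angles are calculated from the joint point data.

```python
import math

def bone_angle(parent, joint, child):
    """Angle in degrees at `joint` between the bones joint->parent
    and joint->child. Each argument is an (x, y, z) joint position."""
    u = tuple(p - j for p, j in zip(parent, joint))
    v = tuple(c - j for c, j in zip(child, joint))
    dot = sum(a * b for a, b in zip(u, v))
    norm = math.hypot(*u) * math.hypot(*v)
    return math.degrees(math.acos(dot / norm))
```

For example, the elbow angle of a straight arm computed from shoulder, elbow, and wrist positions would be close to 180°.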
With reference to the fourth or fifth possible implementation of the first aspect, in an eighth possible implementation, identifying the user's first posture or second posture according to the matching parameter values includes: comparing the matching parameter values with the matching conditions of the postures set under the current interface, or comparing the matching parameter values with the matching conditions of the default postures of the somatosensory interaction application system; determining the posture corresponding to the matching parameter value that satisfies the matching conditions; and taking the determined posture as the user's first posture or second posture.
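The comparison of matching parameter values against a posture's matching conditions can be sketched as a table of acceptable ranges per posture. The postures, parameter names, and numeric ranges below are invented examples, not values from the patent.

```python
# Illustrative matching-condition table: each set posture defines the
# ranges its matching parameters must fall into. All values are invented.
MATCH_CONDITIONS = {
    # attitude posture: bone angle (degrees) the left arm must form
    "interface_switch": {"left_arm_angle": (40.0, 50.0)},
    # action posture: net rightward hand displacement (metres)
    "confirm_switch": {"hand_dx": (0.2, 1.0)},
}

def match_posture(params):
    """Return the first posture whose every matching condition is satisfied
    by the computed matching parameter values, or None if none matches."""
    for posture, conditions in MATCH_CONDITIONS.items():
        ok = all(
            name in params and low <= params[name] <= high
            for name, (low, high) in conditions.items()
        )
        if ok:
            return posture
    return None
```

The returned posture is then taken as the user's first or second posture, as the implementation above describes.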
In a second aspect, an embodiment of the present invention provides an apparatus for interface switching, including: a detection unit, configured to detect user information;
a first identification unit, configured to identify the user's first posture from the user information after the detection unit detects the user information;
a display unit, configured to display prompt information within a specified time when the user's first posture is an interface switching posture, the prompt information being used to prompt the user to input a second posture;
a second identification unit, configured to identify the user's second posture when the detection unit detects user information within the specified time;
and an interface switching processing unit, configured to perform the interface switching operation associated with the first posture when the user's second posture is a switching confirmation posture.
In a first possible implementation of the second aspect, the interface switching processing unit is further configured to cancel the interface switching operation associated with the first posture when the user's second posture is a switching cancellation posture.
With reference to the second aspect or the first possible implementation of the second aspect, in a second possible implementation, the interface switching processing unit is further configured to cancel the interface switching operation associated with the first posture when no user information is detected within the specified time.
With reference to the second aspect, the first possible implementation of the second aspect, or the second possible implementation of the second aspect, in a third possible implementation, the first identification unit or the second identification unit includes:
an obtaining module, configured to obtain the joint points involved in a set posture; a reading module, configured to read the data of the joint points from the detected user information, where the user information includes user skeleton frame information, and the skeleton frame information includes joint point information and timestamp information; a calculating module, configured to calculate the matching parameter values of the set posture according to the data of the joint points;
and an identification module, configured to identify the user's first posture or second posture according to the matching parameter values.
With reference to the third possible implementation of the second aspect, in a fourth possible implementation, the obtaining module is further configured to obtain the joint points involved in the postures set under the current interface, and the calculating module is further configured to calculate the matching parameter values of the postures set under the current interface according to the data of the joint points.
With reference to the third possible implementation of the second aspect, in a fifth possible implementation, the obtaining module is further configured to obtain the joint points involved in the default postures of the somatosensory interaction application system, and the calculating module is further configured to calculate the matching parameter values of the default postures according to the data of the joint points.
With reference to the third possible implementation of the second aspect, in a sixth possible implementation, the reading module is further configured to read, when the set posture is an action posture, the joint point data corresponding to the joint points involved in the set posture and the timestamp information of the user skeleton frames from multiple consecutive frames of user skeleton frame information, and the calculating module is further configured to calculate the displacement of the joint points according to the joint point data and the timestamp information.
With reference to the third possible implementation of the second aspect, in a seventh possible implementation, the reading module is further configured to read, when the set posture is an attitude posture, the joint point data corresponding to the joint points involved in the set posture from the user skeleton frame information, and the calculating module is further configured to calculate the bone angles between the joint points according to the joint point data.
With reference to the fourth or fifth possible implementation of the second aspect, in an eighth possible implementation, the identification module is further configured to compare the matching parameter values with the matching conditions of the postures set under the current interface, or to compare the matching parameter values with the matching conditions of the default postures of the somatosensory interaction application system; to determine the posture corresponding to the matching parameter value that satisfies the matching conditions; and to take the determined posture as the user's first posture or second posture.
It can be seen from the above technical solutions that, because the embodiments of the present invention use a second posture to confirm the first posture instruction, the problem of a long recognition time or a high misjudgment rate for interface switching postures in somatosensory interaction scenarios is effectively solved, and the accuracy of posture control is improved, which greatly enhances the user experience.

Brief description of the drawings

To describe the technical solutions in the embodiments of the present invention or in the prior art more clearly, the accompanying drawings required for describing the embodiments or the prior art are briefly introduced below. Apparently, the accompanying drawings in the following description show merely some embodiments of the present invention, and a person of ordinary skill in the art may still derive other drawings from these accompanying drawings without creative efforts.

FIG. 1 is a flowchart of an interface switching method according to an embodiment of the present invention;

FIG. 2 is a flowchart of a method for identifying a user's first posture or second posture according to an embodiment of the present invention;

FIGS. 3A-3D illustrate the graphical user interface displayed by a device at different points in time during interface switching, according to an embodiment of the present invention;

FIG. 4 is a block diagram of an interface switching apparatus according to an embodiment of the present invention;

FIG. 5 is a block diagram of an interface switching apparatus according to another embodiment of the present invention;

FIG. 6 is a structural diagram of a computer-system-based interface switching apparatus according to still another embodiment of the present invention.

Detailed description
The technical solutions in the embodiments of the present invention are described clearly and completely below with reference to the accompanying drawings in the embodiments of the present invention. Apparently, the described embodiments are merely some rather than all of the embodiments of the present invention. All other embodiments obtained by a person of ordinary skill in the art based on the embodiments of the present invention without creative efforts shall fall within the protection scope of the present invention.

An embodiment of the present invention provides a method for interface switching, which can be used for interface switching in a somatosensory interaction scenario. The specific interface operation includes one of the following operations that cause the interface to change: exiting an application, returning to the upper-level interface, returning to the main interface, and calling up a menu. Referring to FIG. 1, the method includes:

Step 101: After user information is detected, identify the user's first posture from the user information.

The method for detecting user information may be: acquiring the user's skeleton frame information and determining whether the acquired skeleton frame information contains valid joint point data. If valid joint point data is contained, user information has been detected; otherwise, no user information has been detected and detection continues.
The method for identifying the user's first posture may be: obtaining the joint points involved in a preset posture; reading, from the detected user information, the valid joint point data corresponding to the joint points involved in the preset posture; calculating, according to the valid joint point data, the matching parameter values to be matched against the preset posture; and then identifying the user's first posture according to the matching parameter values. The user's first posture may be an action posture or an attitude posture.
Specifically, in one implementation, the somatosensory interaction application system, as the executing body, acquires the user's skeleton frame information from the somatosensory interaction device (in this embodiment, the somatosensory gaming product Kinect). Specifically, the SDK (Software Development Kit) provided with the Kinect device includes the skeleton frame extraction function NuiSkeletonGetNextFrame; by calling this function, an application can extract the user skeleton frame information of the current moment from the Kinect device. Whether or not a user is in front of the Kinect device at the current moment, the device generates one frame of user skeleton frame information.
The user skeleton frame information is represented by the NUI_SKELETON_FRAME data structure, which contains joint point information (represented by the NUI_SKELETON_DATA data structure) and timestamp information (represented by the liTimestamp parameter). The joint point information contains the flag eTrackingState, which indicates whether valid joint point data is present. If the eTrackingState parameter value is true, user information has been detected; otherwise (the value is false), no user information has been detected and detection continues.
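The per-frame validity check described above can be mocked as follows. NuiSkeletonGetNextFrame and NUI_SKELETON_FRAME belong to a C/C++ API, so this Python stand-in only mirrors the control flow; the dict layout of `frame` and both function names are assumptions introduced for the example.

```python
# Minimal stand-in for the per-frame validity check: `frame` is a dict
# mimicking the fields the text cites (eTrackingState flag, joint data).

def user_detected(frame):
    """True when the skeleton frame carries valid joint point data."""
    return bool(frame.get("eTrackingState")) and bool(frame.get("joints"))

def poll_for_user(frames):
    """Scan successive skeleton frames until one contains user information."""
    for i, frame in enumerate(frames):
        if user_detected(frame):
            return i          # index of the first frame with a tracked user
    return -1                 # no user information detected
```

In the real system the loop would keep pulling frames from the device rather than iterating over a list; the point is only that empty frames are skipped and recognition starts at the first valid one.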
Step 102: If the user's first posture is the interface switching posture, display prompt information within a specified time, the prompt information being used to prompt the user to input a second posture.
Optionally, step 102 may further include the following: when the user's first posture is not the interface switching posture, continue to detect user information; or, when the user's first posture is not the interface switching posture and an operation associated with that first posture is configured, the operation associated with the first posture may also be performed.
The prompt information may include second-posture options and operation instructions for the second posture. Specifically, the prompt information may be displayed as text, pictures, or other forms, and display effects such as blinking or fading may be used. For example, the second-posture options may be the two options "confirm switching" and "cancel switching", displayed in a text box or as text; correspondingly, the operation instructions for the second posture indicate how to perform the "confirm switching" and "cancel switching" postures, and may use text, symbols, pictures, or animations to show the user how to operate. The second posture may be an action posture or an attitude posture.

Step 103: When user information is detected within the specified time, identify the user's second posture.

The method for detecting user information is the same as in step 101, and the method for identifying the user's second posture is the same as the method for identifying the user's first posture in step 101. Step 103 may further include: if no user information is detected within the specified time, ignoring the interface switching operation associated with the first posture.

Step 104: If the user's second posture is the switching confirmation posture, perform the interface switching operation associated with the first posture.

Further, if the second posture is the switching cancellation posture, the interface switching operation associated with the first posture is ignored. Alternatively, if the second posture is neither the switching confirmation posture nor the switching cancellation posture, the method returns to step 103 and continues to detect user information within the specified time.

An embodiment of the present invention provides, within the interface switching method, a method for identifying the user's first posture after user information is detected; the method for identifying the second posture is similar and is not described again. Referring to FIG. 2, the method includes:
Step 201: Obtain the joint points involved in the set posture.
This step may specifically include: determining the current interface type and the postures set under the current interface, and obtaining the joint points involved in the postures set under the current interface.
Alternatively, this step may specifically include: determining all the default postures of the somatosensory interaction application system, and obtaining the joint points involved in all the default postures of the system.
Specifically, in one embodiment, it is determined from the state of the interface switching state machine that the current interface is the pre-switching application interface and that the only posture set under the current interface is the interface switching posture. The interface switching posture is a "left arm at 45°" attitude posture involving seven joint points: SHOULDER_CENTER (center shoulder joint point), SHOULDER_RIGHT (right shoulder joint point), ELBOW_RIGHT (right elbow joint point), WRIST_RIGHT (right wrist joint point), SHOULDER_LEFT (left shoulder joint point), ELBOW_LEFT (left elbow joint point), and WRIST_LEFT (left wrist joint point).
Alternatively, it is determined from the state of the interface switching state machine that the current interface is the switching prompt interface and that the postures set under the current interface are the switching confirmation posture and the switching cancellation posture. The switching confirmation posture is an action posture in which the left hand waves to the right, involving the joint point HAND_LEFT (left hand joint point); the switching cancellation posture is an action posture in which the left hand waves to the left, also involving the joint point HAND_LEFT.
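The two action postures above differ only in the direction the left hand moves, so a sketch of the classifier can look at the net x-displacement of HAND_LEFT across consecutive frames. The displacement threshold and the assumption that x increases to the user's right are illustrative choices, not values from the patent.

```python
def classify_wave(x_positions, threshold=0.2):
    """Classify a left-hand wave from its x coordinates over consecutive frames.

    A net movement to the right (x increasing) beyond `threshold` is read as
    the switching confirmation posture; a net movement to the left as the
    switching cancellation posture. The threshold is an assumed tuning value.
    """
    if len(x_positions) < 2:
        return None
    net = x_positions[-1] - x_positions[0]
    if net > threshold:
        return "confirm"   # left hand waved to the right
    if net < -threshold:
        return "cancel"    # left hand waved to the left
    return None            # movement too small to match either posture
```

A production recognizer would typically also check the elapsed time between frames (from the skeleton frame timestamps) so that a very slow drift is not counted as a wave.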
Step 202: Read the data of the joint points from the detected user information, where the user information includes user skeleton frame information, and the user skeleton frame information includes joint point information and timestamp information.

When the set gesture is a static posture gesture, step 202 may specifically include: reading, from the user skeleton frame information, the joint point data corresponding to the joint points involved in the set gesture.

When the set gesture is a motion gesture, step 202 may specifically include: reading, from multiple consecutive frames of user skeleton frame information, the joint point data corresponding to the joint points involved in the set gesture, together with the timestamp information of the user skeleton frames.

When the same joint point appears among the multiple joint points involved in the set gestures, the data of that joint point only needs to be read once per frame of skeleton information.
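The reading step above can be sketched as follows, assuming skeleton frames arrive as plain dictionaries; the frame layout and the name read_joint_data are illustrative assumptions, not from the patent:

```python
def read_joint_data(frames, joint_names):
    """Read the required joint point data from each detected skeleton
    frame, together with the frame's timestamp (step 202). A joint point
    shared by several gestures is read only once per frame."""
    unique_joints = []                      # preserve order, drop duplicates
    for name in joint_names:
        if name not in unique_joints:
            unique_joints.append(name)
    return [(frame["timestamp"],
             {name: frame["joints"][name] for name in unique_joints})
            for frame in frames]

# Both the confirm-switch and cancel-switch gestures involve HAND_LEFT,
# so it appears once per frame in the result.
frames = [{"timestamp": 0.0, "joints": {"HAND_LEFT": (0.1, 0.5, 1.2)}},
          {"timestamp": 0.1, "joints": {"HAND_LEFT": (0.2, 0.5, 1.2)}}]
data = read_joint_data(frames, ["HAND_LEFT", "HAND_LEFT"])
print(len(data), data[0])  # 2 (0.0, {'HAND_LEFT': (0.1, 0.5, 1.2)})
```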
In a specific implementation, for example, the gesture set under the current interface is the interface switching gesture. Because this gesture is a static posture gesture, only the coordinate data of the 7 joint points involved in the gesture need to be read from the current skeleton frame information, as shown in Table 1.
Table 1

Joint Index   Joint ID                             Position
a             SHOULDER_CENTER (center shoulder)    x1, y1, z1
b             SHOULDER_RIGHT (right shoulder)      x2, y2, z2
c             ELBOW_RIGHT (right elbow)            x3, y3, z3
d             WRIST_RIGHT (right wrist)            x4, y4, z4
e             SHOULDER_LEFT (left shoulder)        x5, y5, z5
f             ELBOW_LEFT (left elbow)              x6, y6, z6
g             WRIST_LEFT (left wrist)              x7, y7, z7
As another example, the gestures set under the current interface are the confirm-switch gesture and the cancel-switch gesture. These two gestures involve the same joint point (the left hand joint), so the data of that joint point only needs to be read once per frame of skeleton information. Because both gestures are motion gestures, the left hand joint coordinate data of multiple consecutive user skeleton frames must be read, together with the timestamp information of each frame; the current detection timestamp is taken as the start timestamp t1, as shown in Table 2.
[Table 2, published as an image in the original document: the left hand joint point coordinate data and the timestamp of each consecutive user skeleton frame, starting from the start timestamp t1.]
Step 203: Calculate the matching parameter values of the set gestures according to the data of the joint points.

Calculating the matching parameter values of the set gestures includes: calculating the matching parameter values of the gestures set under the current interface, or of all default gestures of the somatosensory interaction application system. When the set gestures share the same matching parameter, the value of that matching parameter only needs to be calculated once.
For example, when the gesture set under the current interface is the interface switching gesture, its matching parameters are 4 bone angles. Taking ∠abc as an example, it is the angle between the bone formed by center shoulder joint a and right shoulder joint b in Table 1 and the bone formed by right shoulder joint b and right elbow joint c. The angle is calculated by the law of cosines:

∠abc = cos⁻¹((ab² + bc² - ac²) / (2 · ab · bc)),

where ab² = (x1 - x2)² + (y1 - y2)², bc² = (x2 - x3)² + (y2 - y3)², and ac² = (x1 - x3)² + (y1 - y3)².
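A minimal sketch of this angle computation, using 2-D joint coordinates as in the formula above (the function name is an assumption):

```python
import math

def bone_angle(a, b, c):
    """Angle ∠abc in degrees at joint b, between bone a-b and bone b-c,
    computed with the law of cosines over the x-y coordinates."""
    ab2 = (a[0] - b[0]) ** 2 + (a[1] - b[1]) ** 2
    bc2 = (b[0] - c[0]) ** 2 + (b[1] - c[1]) ** 2
    ac2 = (a[0] - c[0]) ** 2 + (a[1] - c[1]) ** 2
    return math.degrees(math.acos((ab2 + bc2 - ac2) /
                                  (2 * math.sqrt(ab2) * math.sqrt(bc2))))

# Right angle: joint a directly above b, joint c directly to the right of b.
print(round(bone_angle((0.0, 1.0), (0.0, 0.0), (1.0, 0.0))))  # 90
```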
Through similar formulas, the parameter values of the other 3 bone angles (∠bcd, ∠aef, ∠efg) can be calculated. As another example, the gestures set under the current interface are the confirm-switch gesture and the cancel-switch gesture; the matching parameter of the confirm-switch gesture is the displacement of the left hand joint point, and the matching parameter of the cancel-switch gesture is also the displacement of the left hand joint point. The two gestures share the same matching parameter, so its value is calculated only once. The displacement of the left hand joint point is the total displacement s = Σ(k = 2..n) Δs_k between the start detection time t1 and the end time tn in Table 2, where the displacement between two adjacent timestamps t_(i-1) and t_i is Δs_i = x_i - x_(i-1). The sign of Δs_i is compared with the sign of the total displacement s: if the signs are the same, the calculation continues; if the signs differ, the gesture has ended. If the end of the gesture is detected before the preset time is reached, timing and calculation stop; otherwise timing and calculation continue until the preset time point. The total displacement s is the displacement value of the left hand joint point.
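The displacement accumulation with early termination described above can be sketched as follows; the sample layout and the function name left_hand_displacement are assumptions, not the patent's implementation:

```python
def left_hand_displacement(samples, preset_time):
    """samples: list of (timestamp, x) readings for HAND_LEFT, oldest
    first. Accumulates per-frame displacements ds = x_i - x_(i-1) and
    stops early when the sign of ds flips against the running total s
    (gesture ended), or when the preset time is reached."""
    s = 0.0
    for (t_prev, x_prev), (t_i, x_i) in zip(samples, samples[1:]):
        ds = x_i - x_prev
        if s != 0.0 and ds * s < 0:          # sign flipped: gesture ended
            break
        s += ds
        if t_i - samples[0][0] >= preset_time:
            break                            # preset time point reached
    return s

# Hand moves right, then reverses: accumulation stops at the reversal.
samples = [(0.00, 0.0), (0.05, 0.1), (0.10, 0.25), (0.15, 0.4), (0.20, 0.3)]
print(round(left_hand_displacement(samples, preset_time=1.0), 2))  # 0.4
```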
Step 204: Identify the user's first gesture according to the matching parameter values.

Specifically, this step may be: comparing the matching parameter values with the matching conditions of the gestures set under the current interface, or comparing the matching parameter values with the matching conditions of the default gestures of the somatosensory interaction application system; and determining the gesture corresponding to the matching parameter value that matches the matching conditions, taking the determined gesture as the user's first gesture.
For example, when the gesture set under the current interface is the interface switching gesture, it is determined from the 4 calculated bone angles whether the matching conditions ∠abc = 135° ± 10°, ∠bcd = 180° ± 10°, ∠aef = 90° ± 10°, and ∠efg = 180° ± 10° are all satisfied simultaneously. If so, the user gesture is identified as the interface switching gesture; otherwise, the user gesture is identified as not being the interface switching gesture. To avoid the influence of unintentional user movements, the gesture may be identified as the set gesture only when the user information detected in several consecutive frames meets the matching conditions.
As another example, when the gestures set under the current interface are the confirm-switch gesture and the cancel-switch gesture, it is determined from the calculated displacement value of the left hand joint point whether s > 0.3 m or s < -0.3 m is satisfied. If s > 0.3 m, the user gesture is identified as the confirm-switch gesture; if s < -0.3 m, the user gesture is identified as the cancel-switch gesture; if s falls in any other range, the user gesture is identified as being neither the confirm-switch gesture nor the cancel-switch gesture.
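The two matching examples above can be sketched as plain threshold checks; the function names and the angle dictionary keys are illustrative assumptions:

```python
def identify_switch_gesture(angles):
    """angles: the four computed bone angles in degrees. Returns True
    when every angle meets its matching condition (target ± 10°)."""
    conditions = {"abc": 135, "bcd": 180, "aef": 90, "efg": 180}
    return all(abs(angles[name] - target) <= 10
               for name, target in conditions.items())

def identify_second_gesture(s):
    """s: total left-hand displacement in metres; positive is rightward."""
    if s > 0.3:
        return "confirm_switch"
    if s < -0.3:
        return "cancel_switch"
    return None   # neither confirm-switch nor cancel-switch

print(identify_switch_gesture({"abc": 140, "bcd": 175,
                               "aef": 95, "efg": 185}))  # True
print(identify_second_gesture(0.35))   # confirm_switch
print(identify_second_gesture(-0.1))   # None
```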
For a further understanding of the present invention, an embodiment is provided to illustrate the GUI display of the device at different time points of interface switching. Referring to Figures 3A-3D, taking exiting a game as an example, the display interfaces at different time points are as follows:

Before the interface is switched, that is, before the exit-game gesture is received, the interface is as shown in Figure 3A: the device 300 includes a display screen 301, and the current game screen 302 is displayed on the display screen.
After the pre-switch interface 3A is displayed, the somatosensory interaction application system recognizes that the user has input the exit-game gesture, and therefore displays, within a specified time, a prompt interface containing prompt information. One such prompt interface is shown in Figure 3B: it contains the pre-exit game screen 302 and prompt information prompting the user to input the second gesture. The prompt information appears superimposed in the lower left corner of the game screen 302, and contains the second gesture prompt items 303 "Confirm exit" and "Cancel exit", together with the second gesture operation indications 304 corresponding to the two prompt items. The second gesture prompt items are displayed as text, and the text blinks to attract the user's attention. The second gesture operation indications contain the words "wave left" and "wave right" together with left/right arrow graphic symbols, indicating that waving to the right is the confirm-exit gesture, which can trigger the operation of confirming exit from the game, and waving to the left is the cancel-exit gesture, which can trigger the operation of canceling exit from the game.
Another prompt interface is shown in Figure 3C: in addition to the pre-exit game screen 302 and the prompt information prompting the user to input the second gesture, it optionally contains a timing progress dial 305. As time advances, the black sector area gradually shrinks, and this shrinking indicates the decrease in the remaining time allowed for the user to input the second gesture. The prompt information contains a second gesture prompt item 306 and a second gesture operation indication 307, where the second gesture prompt item 306 contains "Confirm exit" and "Cancel exit" displayed as text boxes, and the corresponding second gesture operation indication 307 contains the words "raise left hand" and "lower left hand" together with gesture diagrams, indicating that lowering the left hand is the confirm-exit posture, which can trigger the operation of confirming exit from the game, and raising the left hand is the cancel-exit posture, which can trigger the operation of canceling exit from the game. Optionally, when the prompt interface is entered, the prompt information appears by fading in; when the prompt interface is exited, the prompt information disappears by fading out.
After the prompt interface 3B or 3C is displayed, if the somatosensory interaction application system receives the user's confirm-exit gesture within the specified time, the game is exited, and the post-exit interface is as shown in Figure 3D: it contains the game menu screen 308.
After the prompt interface 3B or 3C is displayed, if within the specified time the somatosensory interaction application system receives the user's cancel-exit gesture, or the system receives neither the user's confirm-exit gesture nor the user's cancel-exit gesture, the pre-switch interface 3A is displayed.

An embodiment of the present invention provides an interface switching apparatus. Referring to Figure 4, the interface switching apparatus 400 includes: a detection unit 401, configured to detect user information;
a first identification unit 402, configured to identify the user's first gesture from the user information after the detection unit detects the user information;

a display unit 403, configured to display prompt information within a specified time when the user's first gesture is the interface switching gesture, where the prompt information is used to prompt the user to input a second gesture;

a second identification unit 404, configured to identify the user's second gesture when the detection unit detects user information within the specified time; and

an interface switching processing unit 405, configured to perform the interface switching operation associated with the first gesture when the user's second gesture is the confirm-switch gesture.

Optionally, the interface switching processing unit 405 is further configured to cancel the interface switching operation associated with the first gesture when the user's second gesture is the cancel-switch gesture.

Optionally, the interface switching processing unit 405 is further configured to cancel the interface switching operation associated with the first gesture when the user information is not detected within the specified time.
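The cooperation of units 401-405 amounts to a small two-step confirmation state machine, which can be sketched as follows; the state names, action names, and function name are illustrative assumptions, not the patent's implementation:

```python
def handle_gesture(state, gesture):
    """state: 'app' (pre-switch interface) or 'prompt' (prompt interface).
    Returns (new_state, action) for a recognized gesture or timeout."""
    if state == "app" and gesture == "interface_switch":
        return "prompt", "show_prompt"       # display unit shows prompt info
    if state == "prompt":
        if gesture == "confirm_switch":
            return "app", "perform_switch"   # processing unit switches
        if gesture in ("cancel_switch", "timeout"):
            return "app", "cancel_switch"    # processing unit cancels
        return "prompt", None                # keep detecting user info
    return state, None

print(handle_gesture("app", "interface_switch"))   # ('prompt', 'show_prompt')
print(handle_gesture("prompt", "confirm_switch"))  # ('app', 'perform_switch')
```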
Referring to Figure 5, in the interface switching apparatus 400, the first identification unit 402 may further include:

a first obtaining module 4021, configured to obtain the joint points involved in the set gestures;

a first reading module 4022, configured to read the data of the joint points from the detected user information, where the user information includes user skeleton frame information, and the skeleton frame information includes joint point information and timestamp information;

a first calculation module 4023, configured to calculate the matching parameter values of the set gestures according to the data of the joint points; and

a first identification module 4024, configured to identify the user's first gesture or second gesture according to the matching parameter values.
Optionally, the first obtaining module 4021 is further configured to obtain the joint points involved in the gestures set under the current interface.

Optionally, the first calculation module 4023 is further configured to calculate, according to the data of the joint points, the matching parameter values of the gestures set under the current interface.

Optionally, the first obtaining module 4021 is further configured to obtain the joint points involved in the default gestures of the somatosensory interaction application system.

Optionally, the first calculation module 4023 is further configured to calculate the matching parameter values of the default gestures according to the data of the joint points.

Optionally, the first identification module 4024 is further configured to compare the matching parameter values with the matching conditions of the gestures set under the current interface, or to compare the matching parameter values with the matching conditions of the default gestures of the somatosensory interaction application system; and to determine the gesture corresponding to the matching parameter value that matches the matching conditions, taking the determined gesture as the user's first gesture or second gesture.

Optionally, the first reading module 4022 is further configured to, when the set gesture is a motion gesture, read from multiple consecutive frames of user skeleton frame information the joint point data corresponding to the joint points involved in the set gesture, together with the timestamp information of the user skeleton frames.

Optionally, the first calculation module 4023 is further configured to calculate the displacement of the joint points according to the joint point data and the timestamp information.

Optionally, the first reading module 4022 is further configured to, when the set gesture is a static posture gesture, read from the user skeleton frame information the joint point data corresponding to the joint points involved in the set gesture.

Optionally, the first calculation module 4023 is further configured to calculate the bone angles between joint points according to the joint point data.

Similarly to the first identification unit 402, the second identification unit 404 may also include four modules: a second obtaining module 4041, a second reading module 4042, a second calculation module 4043, and a second identification module 4044. The function of each module of the second identification unit 404 is similar to that of the corresponding module of the first identification unit 402, and is not described in detail here.
The interface switching apparatus in the embodiments of the present invention may be implemented based on a computer system, and the methods shown in Figures 1-2 may all be implemented by a computer-system-based interface switching apparatus. Figure 6 shows an embodiment of an interface switching apparatus implemented based on a computer system. In this embodiment, the interface switching apparatus may include a processor 601, a memory 602, and a communication interface 603, where:

the communication interface 603 is configured to communicate with a somatosensory interaction device. Messages exchanged between the interface switching apparatus and the somatosensory interaction device are all sent and received through the communication interface 603. Specifically, the communication interface 603 is configured to obtain the user's skeleton frame information from the somatosensory interaction device; the memory 602 is configured to store program instructions; and the processor 601 is configured to call the program instructions stored in the memory 602 to perform the following operations: after user information is detected, identifying the user's first gesture from the user information; if the user's first gesture is the interface switching gesture, displaying prompt information within a specified time, where the prompt information is used to prompt the user to input a second gesture; when user information is detected within the specified time, identifying the user's second gesture; and if the user's second gesture is the confirm-switch gesture, performing the interface switching operation associated with the first gesture.

The processor 601 may be a central processing unit (CPU), an application-specific integrated circuit (ASIC), or the like. The interface switching apparatus in this embodiment may include a bus 604. The processor 601, the memory 602, and the communication interface 603 may be connected to and communicate with one another through the bus 604. The memory 602 may include entities with a storage function, such as a random access memory (RAM), a read-only memory (ROM), and a magnetic disk.

The processor 601 may further be configured to perform the steps described with reference to Figures 1-2 in the method embodiments, which are not detailed again in the embodiments of the present invention.
The above provides a detailed introduction to what the present invention offers. Specific examples are used herein to explain the principles and implementations of the present invention, and the description of the above embodiments is only intended to help understand the method of the present invention and its core idea. Meanwhile, a person of ordinary skill in the art may, according to the idea of the present invention, make changes to the specific implementations and the scope of application. In summary, the content of this specification should not be construed as a limitation on the present invention.

Claims

1. A method for interface switching, characterized in that the method comprises:

after user information is detected, identifying a user's first gesture from the user information; if the user's first gesture is an interface switching gesture, displaying prompt information within a specified time, wherein the prompt information is used to prompt the user to input a second gesture;

when user information is detected within the specified time, identifying a user's second gesture; and if the user's second gesture is a confirm-switch gesture, performing an interface switching operation associated with the first gesture.
2. The method according to claim 1, characterized in that, after identifying the user's second gesture, the method further comprises:

if the user's second gesture is a cancel-switch gesture, canceling the interface switching operation associated with the first gesture; or

if the user's second gesture is neither the confirm-switch gesture nor the cancel-switch gesture, continuing to detect user information, and returning to the step of identifying the user's second gesture when user information is detected within the specified time.
3. The method according to claim 1 or 2, characterized in that, after prompting the user to input the second gesture, the method further comprises:

when the user information is not detected within the specified time, canceling the interface switching operation associated with the first gesture.
4. The method according to any one of claims 1 to 3, characterized in that identifying the user's first gesture or second gesture comprises:

obtaining joint points involved in set gestures;

reading data of the joint points from the detected user information, wherein the user information includes user skeleton frame information, and the skeleton frame information includes joint point information and timestamp information; calculating matching parameter values of the set gestures according to the data of the joint points; and

identifying the user's first gesture or second gesture according to the matching parameter values.
5. The method according to claim 4, characterized in that obtaining the joint points involved in the set gestures comprises:

determining a current interface type and the gestures set under the current interface, and obtaining the joint points involved in the gestures set under the current interface; and

calculating the matching parameter values of the set gestures according to the data of the joint points comprises: calculating, according to the data of the joint points, the matching parameter values of the gestures set under the current interface.
6. The method according to claim 4, characterized in that obtaining the joint points involved in the set gestures comprises:

determining default gestures of a somatosensory interaction application system, and obtaining the joint points involved in the default gestures; and calculating the matching parameter values of the set gestures according to the data of the joint points comprises: calculating the matching parameter values of the default gestures according to the data of the joint points.
7. The method according to claim 5 or 6, characterized in that identifying the user's first gesture or second gesture according to the matching parameter values comprises:

comparing the matching parameter values with matching conditions of the gestures set under the current interface, or comparing the matching parameter values with matching conditions of the default gestures of the somatosensory interaction application system; and

determining the gesture corresponding to the matching parameter value that matches the matching conditions, and taking the determined gesture as the user's first gesture or second gesture.
8. The method according to claim 4, characterized in that, when the set gesture is a motion gesture,

reading the data of the joint points from the detected user information comprises: reading, from multiple consecutive frames of user skeleton frame information, the joint point data corresponding to the joint points involved in the set gesture and the timestamp information of the user skeleton frames; and

calculating the matching parameter values of the set gesture according to the data of the joint points comprises: calculating displacements of the joint points according to the joint point data and the timestamp information.
9. The method according to claim 4, characterized in that, when the set gesture is a static posture gesture,

reading the data of the joint points from the detected user information comprises: reading, from the user skeleton frame information, the joint point data corresponding to the joint points involved in the set gesture; and

calculating the matching parameter values of the set gesture according to the data of the joint points comprises: calculating bone angles between joint points according to the joint point data.
10. An apparatus for interface switching, characterized in that the apparatus comprises:

a detection unit, configured to detect user information;

a first identification unit, configured to identify a user's first gesture from the user information after the detection unit detects the user information;

a display unit, configured to display prompt information within a specified time when the user's first gesture is an interface switching gesture, wherein the prompt information is used to prompt the user to input a second gesture;

a second identification unit, configured to identify a user's second gesture when the detection unit detects user information within the specified time; and

an interface switching processing unit, configured to perform an interface switching operation associated with the first gesture when the user's second gesture is a confirm-switch gesture.
11. The apparatus according to claim 10, characterized in that:

the interface switching processing unit is further configured to cancel the interface switching operation associated with the first gesture when the user's second gesture is a cancel-switch gesture.
12. The apparatus according to claim 10 or 11, characterized in that:

the interface switching processing unit is further configured to cancel the interface switching operation associated with the first gesture when the user information is not detected within the specified time.
13. The device according to any one of claims 10 to 12, characterized in that the first recognition unit or the second recognition unit comprises:
an obtaining module, configured to obtain the joint points involved in a set gesture;
a reading module, configured to read data of the joint points from the detected user information, where the user information includes user skeleton frame information, and the skeleton frame information includes joint point information and timestamp information;
a calculation module, configured to calculate a matching parameter value of the set gesture according to the data of the joint points;
a recognition module, configured to recognize the user's first gesture or second gesture according to the matching parameter value.
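The obtain → read → calculate → match pipeline of claim 13 can be sketched as below. The gesture-spec dictionary layout, the skeleton-frame layout, and the raised-hand example are illustrative assumptions; the patent does not define concrete data structures.

```python
# Each spec names the joints it needs (obtaining module), a metric over their
# data (calculation module), and a match condition (recognition module).
def recognize_gesture(gesture_specs, skeleton_frame):
    """Return the name of the first gesture whose match condition holds, else None."""
    for name, spec in gesture_specs.items():
        joints = spec["joints"]                                   # obtaining module
        data = {j: skeleton_frame["joints"][j] for j in joints}   # reading module
        value = spec["metric"](data)                              # calculation module
        if spec["condition"](value):                              # recognition module
            return name
    return None

# Hypothetical "raise hand" gesture: right hand at least 0.1 m above the head.
specs = {
    "raise_hand": {
        "joints": ["head", "right_hand"],
        "metric": lambda d: d["right_hand"][1] - d["head"][1],
        "condition": lambda v: v > 0.1,
    }
}
frame = {"joints": {"head": (0.0, 1.7), "right_hand": (0.3, 1.9)}, "timestamp": 0.0}
```

Claims 14 and 15 then differ only in which `gesture_specs` table is passed in: the gestures set under the current interface, or the system-wide defaults.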
14. The device according to claim 13, characterized in that:
the obtaining module is further configured to obtain the joint points involved in the gesture set under the current interface;
the calculation module is further configured to calculate the matching parameter value of the gesture set under the current interface according to the data of the joint points.
15. The device according to claim 13, characterized in that:
the obtaining module is further configured to obtain the joint points involved in the default gesture of the somatosensory interactive application system;
the calculation module is further configured to calculate the matching parameter value of the default gesture according to the data of the joint points.
16. The device according to claim 14 or 15, characterized in that:
the recognition module is further configured to compare the matching parameter value with the matching condition of the gesture set under the current interface, or compare the matching parameter value with the matching condition of the default gesture of the somatosensory interactive application system; and to determine the gesture corresponding to the matching parameter value that satisfies the matching condition, the determined gesture being the user's first gesture or second gesture.
17. The device according to claim 13, characterized in that:
the reading module is further configured to, when the set gesture is an action gesture, read the joint point data corresponding to the joint points involved in the set gesture, together with the timestamp information of the user skeleton frames, from multiple consecutive pieces of user skeleton frame information;
the calculation module is further configured to calculate the displacement of the joint points according to the joint point data and the timestamp information.
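For the action-gesture case of claim 17, displacement can be accumulated over consecutive skeleton frames while the frame timestamps give the elapsed time. A minimal sketch, assuming 2-D joint coordinates and a frame layout of the author's choosing (the patent leaves both unspecified):

```python
import math

def joint_displacement(frames, joint):
    """Total path length of `joint` across consecutive skeleton frames,
    plus the elapsed time taken from the frame timestamps."""
    pts = [f["joints"][joint] for f in frames]
    dist = sum(math.dist(a, b) for a, b in zip(pts, pts[1:]))
    elapsed = frames[-1]["timestamp"] - frames[0]["timestamp"]
    return dist, elapsed
```

The (distance, elapsed) pair is one plausible matching parameter for an action gesture: a wave, say, might require a hand displacement above some threshold within a bounded time.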
18. The device according to claim 13, characterized in that:
the reading module is further configured to, when the set gesture is a static gesture, read the joint point data corresponding to the joint points involved in the set gesture from the user skeleton frame information;
the calculation module is further configured to calculate the bone angles between the joint points according to the joint point data.
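For the static-gesture case of claim 18, the matching parameter is a bone angle at a joint. A sketch of that calculation, assuming 2-D coordinates and that the angle is measured at the middle joint between its two adjacent joints (the patent does not fix these conventions):

```python
import math

def bone_angle(a, b, c):
    """Angle in degrees at joint b, between bone b->a and bone b->c."""
    v1 = (a[0] - b[0], a[1] - b[1])
    v2 = (c[0] - b[0], c[1] - b[1])
    dot = v1[0] * v2[0] + v1[1] * v2[1]
    norm = math.hypot(*v1) * math.hypot(*v2)
    # Clamp to guard against floating-point drift outside acos's domain.
    return math.degrees(math.acos(max(-1.0, min(1.0, dot / norm))))
```

An elbow angle near 90 degrees, for example, could serve as the matching condition for a "bent arm" static gesture.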
PCT/CN2012/083721 2012-10-30 2012-10-30 Interface switching method and apparatus WO2014067058A1 (en)

Priority Applications (2)

Application Number Priority Date Filing Date Title
CN201280001467.7A CN103180803B (en) 2012-10-30 2012-10-30 The method and apparatus of changing interface
PCT/CN2012/083721 WO2014067058A1 (en) 2012-10-30 2012-10-30 Interface switching method and apparatus

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
PCT/CN2012/083721 WO2014067058A1 (en) 2012-10-30 2012-10-30 Interface switching method and apparatus

Publications (1)

Publication Number Publication Date
WO2014067058A1 true WO2014067058A1 (en) 2014-05-08

Family

ID=48639389

Family Applications (1)

Application Number Title Priority Date Filing Date
PCT/CN2012/083721 WO2014067058A1 (en) 2012-10-30 2012-10-30 Interface switching method and apparatus

Country Status (2)

Country Link
CN (1) CN103180803B (en)
WO (1) WO2014067058A1 (en)

Families Citing this family (6)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN104881421B (en) * 2014-12-15 2018-04-27 深圳市腾讯计算机系统有限公司 The switching method and device of a kind of 3-D graphic
CN104808788B (en) * 2015-03-18 2017-09-01 北京工业大学 A kind of method that non-contact gesture manipulates user interface
CN105929953A (en) * 2016-04-18 2016-09-07 北京小鸟看看科技有限公司 Operation guide method and apparatus in 3D immersive environment and virtual reality device
CN109062467B (en) * 2018-07-03 2020-10-09 Oppo广东移动通信有限公司 Split screen application switching method and device, storage medium and electronic equipment
CN111435512A (en) * 2019-01-11 2020-07-21 北京嘀嘀无限科技发展有限公司 Service information acquisition method and device
CN112337087A (en) * 2020-09-28 2021-02-09 湖南泽途体育文化有限公司 Somatosensory interaction method and system applied to sports competition

Citations (4)

Publication number Priority date Publication date Assignee Title
CN101729808A (en) * 2008-10-14 2010-06-09 Tcl集团股份有限公司 Remote control method for television and system for remotely controlling television by same
CN102023798A (en) * 2009-09-17 2011-04-20 宏正自动科技股份有限公司 Method and apparatus for switching of kvm switch ports using gestures on a touch panel
WO2012005893A2 (en) * 2010-06-29 2012-01-12 Microsoft Corporation Skeletal joint recognition and tracking system
CN102749993A (en) * 2012-05-30 2012-10-24 无锡掌游天下科技有限公司 Motion recognition method based on skeleton node data

Cited By (2)

Publication number Priority date Publication date Assignee Title
WO2015191323A1 (en) 2014-06-10 2015-12-17 3M Innovative Properties Company Nozzle assembly with external baffles
US10688508B2 (en) 2014-06-10 2020-06-23 3M Innovative Properties Company Nozzle assembly with external baffles

Also Published As

Publication number Publication date
CN103180803A (en) 2013-06-26
CN103180803B (en) 2016-01-13

Similar Documents

Publication Publication Date Title
WO2014067058A1 (en) Interface switching method and apparatus
US11323658B2 (en) Display apparatus and control methods thereof
EP2664985B1 (en) Tablet terminal and operation receiving program
US10990748B2 (en) Electronic device and operation method for providing cover of note in electronic device
US20150149925A1 (en) Emoticon generation using user images and gestures
WO2022237268A1 (en) Information input method and apparatus for head-mounted display device, and head-mounted display device
US9916043B2 (en) Information processing apparatus for recognizing user operation based on an image
TW201346640A (en) Image processing device, and computer program product
WO2014106219A1 (en) User centric interface for interaction with visual display that recognizes user intentions
US9846529B2 (en) Method for processing information and electronic device
TWI547788B (en) Electronic device and gravity sensing calibration method thereof
WO2017032078A1 (en) Interface control method and mobile terminal
JP5437726B2 (en) Information processing program, information processing apparatus, information processing system, and coordinate calculation method
JP2015172887A (en) Gesture recognition device and control method of gesture recognition device
CN112488914A (en) Image splicing method, device, terminal and computer readable storage medium
WO2015131590A1 (en) Method for controlling blank screen gesture processing and terminal
EP3582068A1 (en) Information processing device, information processing method, and program
US20180373392A1 (en) Information processing device and information processing method
US9148537B1 (en) Facial cues as commands
CN109542218B (en) Mobile terminal, human-computer interaction system and method
JP2014086018A (en) Input device, angle input device, and program
JP5946965B2 (en) Display system, display method, and program
EP4276591A1 (en) Interaction method, electronic device, and interaction system
CN109379533A (en) A kind of photographic method, camera arrangement and terminal device
CN110162251B (en) Image scaling method and device, storage medium and electronic equipment

Legal Events

Date Code Title Description
121 Ep: the epo has been informed by wipo that ep was designated in this application

Ref document number: 12887521

Country of ref document: EP

Kind code of ref document: A1

NENP Non-entry into the national phase

Ref country code: DE

122 Ep: pct application non-entry in european phase

Ref document number: 12887521

Country of ref document: EP

Kind code of ref document: A1