WO2014067058A1 - Method and apparatus for interface switching - Google Patents

Method and apparatus for interface switching

Info

Publication number
WO2014067058A1
WO2014067058A1 (PCT/CN2012/083721)
Authority
WO
WIPO (PCT)
Prior art keywords
posture
user
gesture
information
data
Prior art date
Application number
PCT/CN2012/083721
Other languages
English (en)
Chinese (zh)
Inventor
宣曼
黄晨
薛传颂
Original Assignee
华为技术有限公司 (Huawei Technologies Co., Ltd.)
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by 华为技术有限公司 (Huawei Technologies Co., Ltd.)
Priority to PCT/CN2012/083721 (WO2014067058A1)
Priority to CN201280001467.7A (CN103180803B)
Publication of WO2014067058A1

Classifications

    • A: HUMAN NECESSITIES
    • A63: SPORTS; GAMES; AMUSEMENTS
    • A63F: CARD, BOARD, OR ROULETTE GAMES; INDOOR GAMES USING SMALL MOVING PLAYING BODIES; VIDEO GAMES; GAMES NOT OTHERWISE PROVIDED FOR
    • A63F13/00: Video games, i.e. games using an electronically generated display having two or more dimensions
    • A63F13/40: Processing input control signals of video game devices, e.g. signals generated by the player or derived from the environment
    • A63F13/42: Processing input control signals by mapping the input signals into game commands, e.g. mapping the displacement of a stylus on a touch screen to the steering angle of a virtual vehicle
    • A63F2300/00: Features of games using an electronically generated display having two or more dimensions, e.g. on a television screen, showing representations related to the game
    • A63F2300/80: Features specially adapted for executing a specific type of game
    • A63F2300/8011: Ball

Definitions

  • The present invention relates to the field of communication network technologies, and in particular, to a method and apparatus for interface switching in a somatosensory interaction scenario.

Background
  • Vision-based somatosensory interaction means that the computer captures the user's image through the camera, and uses pattern recognition, artificial intelligence and other technologies to understand the meaning of the user's actions, providing a more natural and intuitive way of interacting with the body.
  • The somatosensory interactive application system captures video frames containing user information through a camera, then obtains the user's information in each video frame (for example, joint point information) through image analysis, thereby determining the user's pose and the action composed of pose changes across consecutive video frames. The pose and the action together constitute the user's posture, and the somatosensory interactive application system performs a corresponding feedback operation according to the instruction corresponding to the user's posture. This constitutes a complete vision-based somatosensory interaction process.
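  • The capture-recognize-respond loop described above can be sketched as follows. This is a minimal, hypothetical illustration: the function and table names (`run_interaction_step`, `command_table`) are not from the patent, and the recognizer is a stub standing in for the image-analysis stage.

```python
def run_interaction_step(frame, recognize_posture, command_table):
    """Map one captured video frame to a feedback command (or None).

    recognize_posture: stub for the image-analysis stage that turns a
    frame into a posture name, or None when nothing is recognized.
    command_table: maps recognized postures to feedback instructions.
    """
    posture = recognize_posture(frame)      # image analysis -> posture
    if posture is None:
        return None                         # no recognizable user posture
    return command_table.get(posture)       # posture -> instruction

# Toy usage with stub components (names are illustrative only).
table = {"left_arm_45": "open_exit_prompt"}
print(run_interaction_step("frame0", lambda f: "left_arm_45", table))
```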
  • In the prior art, the manner of capturing an interface switching instruction is: first, the posture input by the user is recognized; when it satisfies the predetermined interface switching posture, the user is required to hold the posture for a period of time before the interface switching instruction is triggered.
  • For example, the user can exit a game by "stretching the left arm at 45 degrees to the body", and the user is asked to hold this posture for a period of time before the "exit game" operation is triggered; otherwise the operation is cancelled and the original game interface is retained. If the waiting time is set too short, some unintended user movements are easily misjudged as interface switching instructions.
  • the embodiment of the invention provides a method and a device for switching interfaces, which are used for improving the recognition accuracy of the interface switching posture instruction in the somatosensory interaction scenario and improving the user experience.
  • the method for interface switching provided by the embodiment of the present invention includes:
  • After user information is detected, the first posture of the user is recognized from the user information; if the first posture of the user is an interface switching posture, prompt information is displayed within a specified time, where the prompt information is used to prompt the user to input a second posture;
  • the second gesture of the user is recognized; if the second gesture of the user is the confirmation switching gesture, the interface switching operation associated with the first gesture is performed.
  • The method further includes: if the second posture of the user is a cancel-switching posture, cancelling the interface switching operation associated with the first posture; or, if the second posture of the user is neither the confirm-switching posture nor the cancel-switching posture, continuing to detect user information and returning to the step of recognizing the second posture of the user when user information is detected within the specified time.
  • The method further includes: when no user information is detected within the specified time, cancelling the interface switching operation associated with the first posture.
  • Identifying the first posture or the second posture of the user includes: obtaining the joint points involved in the set posture; reading the data of those joint points from the detected user information, where the user information includes user skeleton frame information, and the skeleton frame information includes joint point information and timestamp information; calculating the matching parameter value of the set posture according to the data of the joint points; and identifying the first posture or the second posture of the user according to the matching parameter value.
  • Obtaining the joint points involved in the set posture includes: determining the current interface type and the postures set under the current interface, and obtaining the joint points involved in the postures set under the current interface; correspondingly, calculating the matching parameter value of the set posture according to the data of the joint points includes: calculating, according to the data of the joint points, the matching parameter values of the postures set under the current interface.
  • Alternatively, obtaining the joint points involved in the set posture includes: determining the default postures of the somatosensory interactive application system and obtaining the joint points involved in the default postures; calculating the matching parameter value of the set posture according to the data of the joint points then includes: calculating the matching parameter values of the default postures according to the data of the joint points.
  • When the set posture is an action posture, reading the joint point data from the detected user information includes: reading, from a plurality of consecutive user skeleton frames, the joint point data corresponding to the joint points and the timestamp information of each user skeleton frame; and calculating the matching parameter value of the set posture includes: calculating the displacement of the joint point based on the joint point data and the timestamp information.
  • When the set posture is an attitude posture, reading the joint point data from the detected user information includes: reading, from the user skeleton frame information, the joint point data corresponding to the joint points involved in the set posture; and calculating the matching parameter value of the set posture includes: calculating the angle of the bone between the joint points according to the joint point data.
  • Identifying the first posture or the second posture of the user according to the matching parameter value includes: comparing the matching parameter value with the matching condition of the posture set under the current interface, or comparing the matching parameter value with the matching condition of a default posture of the somatosensory interaction application system; and determining the posture corresponding to the matching parameter value that satisfies the matching condition to be the user's first posture or second posture.
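  • The recognize-by-matching scheme in the claims above (compute a matching parameter per set posture, then compare it against that posture's matching condition) can be sketched as follows. All names here (`identify_posture`, `posture_defs`) are hypothetical illustrations, not identifiers from the patent.

```python
def identify_posture(joint_data, posture_defs):
    """Return the name of the first set posture whose matching condition holds.

    posture_defs: {name: (param_fn, match_fn)} where param_fn computes the
    matching parameter value from joint data and match_fn is the condition.
    Returns None when no set posture matches.
    """
    for name, (param_fn, match_fn) in posture_defs.items():
        value = param_fn(joint_data)        # matching parameter value
        if match_fn(value):                 # compare with matching condition
            return name
    return None

# Toy usage: two action postures matched on left-hand displacement dx.
defs = {
    "confirm_switch": (lambda d: d["dx"], lambda v: v > 0.3),
    "cancel_switch": (lambda d: d["dx"], lambda v: v < -0.3),
}
print(identify_posture({"dx": 0.5}, defs))
```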
  • the apparatus for interface switching includes: a detecting unit, configured to detect user information;
  • a first identifying unit configured to: after the detecting unit detects the user information, identify the first posture of the user from the user information;
  • a display unit configured to display prompt information within a specified time when the first gesture of the user is an interface switching gesture, where the prompt information is used to prompt the user to input a second gesture;
  • a second identifying unit configured to: when the detecting unit detects the user information within the specified time, identify the second posture of the user;
  • The interface switching processing unit is configured to perform the interface switching operation associated with the first posture when the second posture of the user is the confirm-switching posture.
  • the interface switching processing unit is further configured to cancel the interface switching operation associated with the first gesture when the second posture of the user is a cancel switching gesture.
  • The interface switching processing unit is further configured to cancel the interface switching operation associated with the first posture when no user information is detected within the specified time.
  • The first identifying unit or the second identifying unit includes:
  • an obtaining module, configured to obtain the joint points involved in the set posture; a reading module, configured to read the data of the joint points from the detected user information, where the user information includes user skeleton frame information, and the skeleton frame information includes joint point information and timestamp information; and a calculating module, configured to calculate the matching parameter value of the set posture according to the data of the joint points;
  • an identification module configured to identify the first posture or the second posture of the user according to the matching parameter value.
  • The obtaining module is further configured to obtain the joint points involved in the posture set under the current interface; and the calculating module is further configured to calculate, according to the data of the joint points, the matching parameter value of the posture set under the current interface.
  • the obtaining module is further configured to obtain a joint point involved in a default posture of the somatosensory interactive application system; the calculating module is further used for A matching parameter value of the default posture is calculated based on the data of the joint point.
  • The reading module is further configured to, when the set posture is an action posture, read from a plurality of consecutive user skeleton frames the joint point data corresponding to the joint points and the timestamp information of the user skeleton frames; the calculating module is further configured to calculate the displacement of the joint point according to the joint point data and the timestamp information.
  • the reading module is further configured to: when the set posture is an attitude posture, information from the user skeleton frame The joint point data corresponding to the joint point involved in the set posture is read; the calculation module is further configured to calculate a bone angle between the joint points according to the joint point data.
  • The identifying module is further configured to compare the matching parameter value with the matching condition of the posture set under the current interface, or to compare the matching parameter value with the matching condition of a default posture of the somatosensory interaction application system, and to determine the posture corresponding to the matching parameter value that satisfies the matching condition to be the user's first posture or second posture.
  • FIG. 1 is a flowchart of a method for switching an interface according to an embodiment of the present invention
  • FIG. 2 is a flowchart of a method for identifying a first posture or a second posture of a user according to an embodiment of the present invention
  • FIGS. 3A-3D illustrate graphical user interfaces displayed by a device at different points in time during interface switching according to an embodiment of the present invention;
  • FIG. 4 is a block diagram of the interface switching device according to an embodiment of the present invention
  • FIG. 5 is a schematic diagram of the interface switching device according to another embodiment of the present invention
  • FIG. 6 is a structural diagram of a computer-system-based interface switching apparatus according to still another embodiment of the present invention.

Detailed description
  • An embodiment of the present invention provides a method for interface switching, which can be used for interface switching in a somatosensory interaction scenario.
  • The specific interface operation includes one of the following: exiting the application, returning to the upper-level interface, returning to the main interface, calling up a menu, or another operation that causes the interface to change.
  • the method includes: Step 101: After detecting the user information, identify the first posture of the user from the user information.
  • The method for detecting the user information may be: acquiring the user's skeleton frame information and determining whether the acquired skeleton frame information contains valid joint point data. If valid joint point data is contained, user information is detected; otherwise, user information is not detected and detection continues.
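  • The validity check above can be sketched as follows, using a simplified dictionary representation of a skeleton frame rather than the actual SDK data structure; the function name `user_detected` and the `tracked` field are illustrative assumptions.

```python
def user_detected(skeleton_frame):
    """A frame counts as 'user information detected' if it carries at
    least one tracked (valid) joint point."""
    return any(j.get("tracked", False) for j in skeleton_frame.get("joints", []))

# Toy frames: one with a valid joint, one without, one empty.
print(user_detected({"joints": [{"tracked": True}]}))   # user present
print(user_detected({"joints": [{"tracked": False}]}))  # no valid joints
print(user_detected({"joints": []}))                    # empty frame
```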
  • the method for recognizing the first posture of the user may be: obtaining a joint point involved in the preset posture; and reading valid joint point data corresponding to the joint point involved in the preset posture from the detected user information; The effective joint point data calculates a matching parameter value that matches the preset posture; and then the user's first posture is identified based on the matching parameter value.
  • the first posture of the user may be an action posture or a posture posture.
  • In this embodiment, the somatosensory interactive application system is the execution subject, and the user's skeleton frame information is obtained from the somatosensory interaction device (in this embodiment, the somatosensory game product Kinect) through the device's SDK (Software Development Kit).
  • The SDK of the Kinect somatosensory game device includes a skeleton frame extraction function, NuiSkeletonGetNextFrame; an application can extract the user skeleton frame information of the current moment from the Kinect device by calling this function. Even when no user is currently in front of the Kinect device, the device still generates a frame of user skeleton frame information.
  • The user skeleton frame information is represented by the NUI_SKELETON_FRAME data structure, which contains joint point information (represented by the NUI_SKELETON_DATA data structure) and timestamp information (represented by the liTimeStamp parameter). The joint point information contains a flag, eTrackingState, for judging whether valid joint point data is present: if the eTrackingState parameter is true, user information is detected; otherwise the parameter value is false, indicating that no user information is detected, and detection continues.
  • Step 102 If the first gesture of the user is an interface switching gesture, the prompt information is displayed within a specified time, and the prompt information is used to prompt the user to input the second gesture.
  • Step 102 may further include: when the first posture of the user is not an interface switching posture, continuing to detect user information; or, when the first posture of the user is not an interface switching posture but an operation associated with the first posture is set, performing the operation associated with the first posture.
  • the prompt information may include a second gesture selection item and an operation indication of the second posture.
  • the display manner of the prompt information may be a text, a picture, or the like, and a display effect such as blinking, fading, and the like may also be used.
  • The second-posture selection items may be the two options "confirm switching" and "cancel switching", displayed by means of a text box or text; correspondingly, the operation indications for the "confirm switching" and "cancel switching" postures may indicate how the user should operate by means of text, symbols, pictures, or animations.
  • the second posture may be an action posture or a posture posture.
  • Step 103 Identify the user's second gesture when the user information is detected within the specified time.
  • the method of detecting the user information is the same as the method of detecting the user information in step 101, and the method of recognizing the second posture of the user is the same as the method of recognizing the first posture of the user in step 101.
  • the step 103 may further include ignoring the interface switching operation associated with the first gesture if the user information is not detected within the specified time.
  • Step 104: If the second posture of the user is the confirm-switching posture, perform the interface switching operation associated with the first posture. Further, if the second posture is the cancel-switching posture, the interface switching operation associated with the first posture is cancelled.
  • An embodiment of the present invention provides a method for recognizing the first posture of the user after user information is detected in the interface switching method; the method for recognizing the second posture is similar and is not described again here. Referring to FIG. 2, the method includes:
  • Step 201: Obtain the joint points involved in the set posture.
  • This step specifically includes: determining the current interface type and the postures set under the current interface, and obtaining the joint points involved in the postures set under the current interface.
  • the step may specifically include: determining all the default gestures of the somatosensory interaction application system, and obtaining joint points involved in all the default gestures of the system.
  • Through the state of the interface-switching state machine, the current interface is determined to be the application interface before switching, and the only posture set under the current interface is the interface switching posture. The interface switching posture is the attitude posture "left arm stretched at 45 degrees", and involves 7 joint points: SHOULDER_CENTER (center shoulder joint point), SHOULDER_RIGHT (right shoulder joint point), ELBOW_RIGHT (right elbow joint point), WRIST_RIGHT (right wrist joint point), SHOULDER_LEFT (left shoulder joint point), ELBOW_LEFT (left elbow joint point), and WRIST_LEFT (left wrist joint point).
  • Through the state of the interface-switching state machine, the current interface is determined to be the switching prompt interface, and the postures set under the current interface are the confirm-switching posture and the cancel-switching posture. The confirm-switching posture is the action posture "left hand waves to the right", and the joint point involved is HAND_LEFT (left hand joint point); the cancel-switching posture is the action posture "left hand waves to the left", and the joint point involved is likewise HAND_LEFT (left hand joint point).
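  • The interface-switching state machine implied by the two passages above (each interface state has its own posture set, and recognized postures drive transitions) can be sketched as follows. The state names and transition table are illustrative assumptions, not taken from the patent.

```python
# Posture sets per interface state, as described in the embodiment.
POSE_SETS = {
    "app_interface": {"interface_switch"},
    "switch_prompt": {"confirm_switch", "cancel_switch"},
}

# Hypothetical transitions driven by recognized postures.
TRANSITIONS = {
    ("app_interface", "interface_switch"): "switch_prompt",
    ("switch_prompt", "confirm_switch"): "switched",
    ("switch_prompt", "cancel_switch"): "app_interface",
}

def step(state, posture):
    """Advance the state machine; postures not set under the current
    interface are ignored and leave the state unchanged."""
    if posture not in POSE_SETS.get(state, set()):
        return state
    return TRANSITIONS.get((state, posture), state)

print(step("app_interface", "interface_switch"))  # enter the prompt interface
```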
  • Step 202 Read data of the joint point from the detected user information; wherein the user information includes user skeleton frame information, and the user skeleton frame information includes joint point information and time stamp information.
  • the step 202 may specifically include: specifically reading the joint point data corresponding to the joint point involved in the set posture from the user skeleton frame information.
  • the step 202 may include: reading, from the plurality of consecutive user skeleton frame information, the joint point data of the joint point and the time of the user skeleton frame related to the set posture. Stamp information.
  • The posture set under the current interface is the interface switching posture. Because this posture is an attitude posture, only the coordinate data of the seven joint points involved in the posture is read from the current skeleton frame information, as shown in Table 1.
  • the two postures involve the same joint point (left hand joint point), so the joint point data only needs to be read once for each frame of bone information.
  • Because the two postures are action postures, it is necessary to continuously read the left hand joint point coordinate data of multiple user skeleton frames and the timestamp information of each frame, with the timestamp at the moment of detection taken as the start timestamp, as shown in Table 2.
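  • Collecting one joint's positions across consecutive skeleton frames, re-based to the start timestamp, can be sketched as follows; the frame layout and the function name `collect_joint_track` are simplified illustrations.

```python
def collect_joint_track(frames, joint):
    """From consecutive skeleton frames, gather (timestamp, position)
    pairs for one joint, with the clock started at the first frame's
    timestamp (the start timestamp of the detection)."""
    track = [(f["timestamp"], f["joints"][joint])
             for f in frames if joint in f["joints"]]
    if not track:
        return []
    t0 = track[0][0]
    return [(t - t0, pos) for t, pos in track]

# Toy usage: two frames roughly one Kinect frame interval apart.
frames = [
    {"timestamp": 100, "joints": {"HAND_LEFT": (0.0, 0.0, 0.0)}},
    {"timestamp": 133, "joints": {"HAND_LEFT": (0.1, 0.0, 0.0)}},
]
print(collect_joint_track(frames, "HAND_LEFT"))
```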
  • Step 203 Calculate a matching parameter value of the set posture according to the data of the joint point.
  • Calculating the matching parameter value of the set posture includes: calculating the matching parameter values of the postures set under the current interface, or of all the default postures of the somatosensory interaction application system. When the set postures share the same matching parameter, the value of that parameter only needs to be calculated once.
  • Under the current interface, the matching parameters are 4 bone angles. Taking ∠abc as an example, formed from the center shoulder joint point a, the right shoulder joint point b, and a third joint point c listed in Table 1, the angle is calculated as cos ∠abc = (ba · bc) / (|ba| |bc|), where ba and bc denote the vectors from joint point b to joint points a and c.
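  • A minimal sketch of this bone-angle calculation, using the standard vector dot-product formula for the angle at the middle joint; the function name `bone_angle` is an illustrative assumption.

```python
import math

def bone_angle(a, b, c):
    """Angle ∠abc in degrees at joint b, between vectors b->a and b->c,
    where a, b, c are (x, y, z) joint coordinates."""
    ba = [a[i] - b[i] for i in range(3)]
    bc = [c[i] - b[i] for i in range(3)]
    dot = sum(x * y for x, y in zip(ba, bc))
    na = math.sqrt(sum(x * x for x in ba))
    nc = math.sqrt(sum(x * x for x in bc))
    return math.degrees(math.acos(dot / (na * nc)))

# Sanity check: a directly above b, c directly to its side -> 90 degrees.
print(round(bone_angle((0, 1, 0), (0, 0, 0), (1, 0, 0))))
```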
  • The matching parameter for the confirm-switching posture is the displacement of the left hand joint point, and the matching parameter for the cancel-switching posture is likewise the displacement of the left hand joint point; because the two postures share the same matching parameter, the value of the matching parameter only needs to be calculated once.
  • Step 204 Identify the first posture of the user according to the matching parameter value.
  • the step may be: comparing the matching parameter value with a matching condition of the posture set under the current interface, or comparing the matching parameter value with a matching condition of a default posture of the somatosensory interaction application system. Determining a posture corresponding to the matching parameter value that matches the matching condition, and determining the posture is the first posture of the user.
  • When the posture set under the current interface is the interface switching posture, the user posture is recognized as the interface switching posture if the calculated matching parameter values satisfy its matching condition; otherwise the user posture is not the interface switching posture. To avoid the influence of the user's unintended movements, the set posture may be recognized only when the user information detected in several consecutive frames all matches it.
  • When the postures set under the current interface are the confirm-switching posture and the cancel-switching posture, it is determined, from the calculated displacement value s of the left hand joint point, whether s > 0.3 m or s < -0.3 m holds: if s > 0.3 m, the user posture is recognized as the confirm-switching posture; if s < -0.3 m, the user posture is recognized as the cancel-switching posture; and if s falls in any other range, the user posture is neither the confirm-switching posture nor the cancel-switching posture.
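  • The ±0.3 m displacement threshold decision above can be sketched directly; the function name `classify_second_posture` and the string labels are illustrative, while the threshold and sign convention follow the embodiment (positive displacement = left hand waved to the right).

```python
def classify_second_posture(s, threshold=0.3):
    """Map the left hand joint displacement s (in metres) to the second
    posture: confirm above +threshold, cancel below -threshold."""
    if s > threshold:
        return "confirm_switch"   # left hand waved to the right
    if s < -threshold:
        return "cancel_switch"    # left hand waved to the left
    return None                   # neither confirm nor cancel

print(classify_second_posture(0.5))   # clearly past the confirm threshold
print(classify_second_posture(0.1))   # within the dead zone -> no decision
```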
  • FIGS. 3A-3D take exiting a game as an example; the display interface at different points in time is as follows:
  • Before the interface is switched, that is, before the exit-game gesture is received, as shown in FIG. 3A, the device 300 includes a display screen 301 on which the current game screen 302 is displayed.
  • A prompt interface is shown in FIG. 3B: it includes the game screen 302 before exiting, together with prompt information prompting the user to input the second gesture.
  • The prompt information appears in the lower left corner of the game screen 302 in a superimposed manner, and includes the "confirm exit" and "cancel exit" second-gesture prompt items 303 and the second-gesture operation indications 304 corresponding to the two prompt items.
  • The second-gesture prompt items are displayed as text, blinking to attract the user's attention. The second-gesture operation indications include the words "wave left" and "wave right" along with left/right arrow symbols, indicating that waving the left hand to the right is the confirm-exit gesture, which triggers the exit-game operation, and waving the left hand to the left is the cancel-exit gesture, which triggers cancellation of the exit-game operation.
  • Another prompt interface is shown in FIG. 3C:
  • It includes a timing progress disk 305; as time passes, the black sector area gradually shrinks, and this reduction indicates the remaining time within which the user may input the second gesture.
  • The prompt information includes second-gesture prompt items 306 and second-gesture operation indications 307. The second-gesture prompt items 306 contain "confirm exit" and "cancel exit", displayed by means of text boxes; the corresponding second-gesture operation indications 307 contain wording such as "left hand lift" together with posture diagrams used to indicate the left-hand gestures.
  • the prompt information appears in a fade-in manner, and when exiting the prompt interface, the prompt information disappears in a fade-out manner.
  • The interface displayed after exiting is shown in FIG. 3D: it includes the game menu screen 308.
  • the interface switching apparatus 400 includes: a detecting unit 401, configured to detect user information;
  • the first identifying unit 402 is configured to: after the detecting unit detects the user information, identify the first posture of the user from the user information;
  • the display unit 403 is configured to display prompt information in a specified time when the first posture of the user is an interface switching posture, where the prompt information is used to prompt the user to input the second posture;
  • a second identifying unit 404 configured to: when the detecting unit detects user information within the specified time, identify a second posture of the user;
  • the interface switching processing unit 405 is configured to perform an interface switching operation associated with the first gesture when the second posture of the user is a confirmation switching posture.
  • the interface switching processing unit 405 is further configured to cancel the interface switching operation associated with the first gesture when the second gesture of the user is to cancel the switching gesture.
  • the interface switching processing unit 405 is further configured to cancel the interface switching operation associated with the first gesture when the user information is not detected within the specified time.
  • the first identifying unit 402 may further include:
  • a first obtaining module 4021 configured to obtain a joint point involved in the set posture
  • a first reading module 4022 configured to read the joint from the detected user information Point data, where the user information includes user skeleton frame information, and the skeleton frame information includes joint point information and time stamp information;
  • a first calculating module 4023 configured to calculate a matching parameter value of the set posture according to the data of the joint point
  • the first identification module 4024 is configured to identify the first posture or the second posture of the user according to the matching parameter value
  • the first obtaining module 4021 is further configured to obtain a joint point involved in the posture set under the current interface
  • the first calculating module 4023 is further configured to calculate, according to the data of the joint point, a matching parameter value of the posture set in the current interface;
  • the first obtaining module 4021 is further configured to obtain a joint point involved in a default posture of the somatosensory interactive application system
  • the first calculating module 4023 is further configured to calculate a matching parameter value of the default posture according to the data of the joint point;
  • The first identification module 4024 is further configured to compare the matching parameter value with the matching condition of the posture set under the current interface, or to compare the matching parameter value with the matching condition of a default posture of the somatosensory interaction application system, and to determine the posture corresponding to the matching parameter value that satisfies the matching condition to be the user's first posture or second posture;
  • the first reading module 4022 is further configured to: when the set posture is an action posture, read a joint point related to the set posture from a plurality of consecutive user bone frame information. Corresponding joint point data and time stamp information of the user skeleton frame;
  • the first calculating module 4023 is further configured to calculate the displacement of the joint point according to the joint point data and the timestamp information.
  • The first reading module 4022 is further configured to, when the set posture is an attitude posture, read from the user skeleton frame information the joint point data corresponding to the joint points involved in the set posture;
  • The first calculating module 4023 is further configured to calculate the bone angle between the joint points according to the joint point data. Similar to the first identifying unit 402, the second identifying unit 404 may further include four modules: a second obtaining module 4041, a second reading module 4042, a second calculating module 4043, and a second identifying module 4044.
  • the function of each module of the second identification unit 404 and the function of the corresponding module of the first identification unit 402 are also similar, and details are not described herein again.
  • the interface switching device in the embodiment of the present invention can be implemented based on a computer system, and the methods shown in FIG. 1 to FIG. 2 can be implemented in a computer system-based interface switching device.
  • Figure 6 illustrates an embodiment of an interface switching device implemented in accordance with a computer system.
  • the interface switching device in this embodiment may include: a processor 601, a memory 602, and a communication interface 603, where:
  • the communication interface 603 is configured to communicate with the somatosensory interaction device; messages exchanged between the interface switching device and the somatosensory interaction device are transmitted and received through the communication interface 603. Specifically, the communication interface 603 is configured to acquire the user's skeleton frame information from the somatosensory interaction device. The memory 602 is configured to store program instructions. The processor 601 is configured to invoke the program instructions stored in the memory 602 and perform the following operations: after user information is detected, identifying the user's first posture from the user information; if the user's first posture is an interface switching posture, displaying prompt information within a specified time, the prompt information being used to prompt the user to input a second posture; when user information is detected within the specified time, identifying the user's second posture; and if the user's second posture is a switching confirmation posture, performing the interface switching operation associated with the first posture.
  • the processor 601 can be a central processing unit (CPU), an application-specific integrated circuit (ASIC), or the like.
  • the interface switching device in this embodiment may include a bus 604.
  • the processor 601, the memory 602, and the communication interface 603 can be connected and communicated via the bus 604.
  • the memory 602 may include: a random access memory (RAM), a read-only memory (ROM), a magnetic disk, or another device having a storage function;
  • the processor 601 may also be configured to perform the steps described in FIG. 1 to FIG. 2 of the method embodiments; details are not repeated herein.
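The two-stage flow the processor 601 executes — an interface switching posture arms the device and displays a prompt, and a confirmation posture within the specified time triggers the switch — can be sketched as a small state machine. This is a hedged illustration only; the posture names, callback hooks, and the 3-second default window below are assumptions, not values from the patent.

```python
import time

IDLE, AWAITING_CONFIRMATION = "idle", "awaiting_confirmation"

class InterfaceSwitcher:
    """Two-stage posture confirmation: a switching posture arms the
    machine; a confirmation posture inside the time window performs
    the interface switching operation associated with the first posture."""

    def __init__(self, recognize, show_prompt, switch_interface, window=3.0):
        self.recognize = recognize        # user info -> posture name or None
        self.show_prompt = show_prompt    # display the prompt information
        self.switch = switch_interface    # perform the switch for a posture
        self.window = window              # the "specified time", in seconds
        self.state = IDLE
        self.armed_posture = None
        self.deadline = 0.0

    def on_user_info(self, user_info, now=None):
        """Called whenever user information is detected."""
        now = time.monotonic() if now is None else now
        posture = self.recognize(user_info)
        if self.state == IDLE:
            if posture == "interface_switch":
                self.state = AWAITING_CONFIRMATION
                self.armed_posture = posture
                self.deadline = now + self.window
                self.show_prompt()
        elif self.state == AWAITING_CONFIRMATION:
            if now > self.deadline:
                self.state = IDLE          # window expired: discard the request
            elif posture == "confirm_switch":
                self.switch(self.armed_posture)
                self.state = IDLE
```

Requiring the second, confirming posture is what reduces the misjudgment rate described in the abstract: a stray first posture simply times out without switching the interface.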

Landscapes

  • Engineering & Computer Science (AREA)
  • Multimedia (AREA)
  • Human Computer Interaction (AREA)
  • User Interface Of Digital Computer (AREA)

Abstract

Embodiments of the present invention relate to an interface switching method and apparatus. The method comprises: after user information is detected, identifying a first posture of a user from the user information; if the first posture of the user is an interface switching posture, displaying prompt information within a specified time, the prompt information being used to prompt the user to input a second posture; when user information is detected within the specified time, identifying the second posture of the user; and if the second posture of the user is a switching confirmation posture, performing an interface switching operation associated with the first posture. The present invention overcomes the drawbacks of a high misjudgment rate or a long waiting time in identifying an interface switching posture command in a somatosensory interaction scenario, increases the accuracy of posture control, and improves the user experience.
PCT/CN2012/083721 2012-10-30 2012-10-30 Method and apparatus for interface switching WO2014067058A1 (fr)

Priority Applications (2)

Application Number Priority Date Filing Date Title
PCT/CN2012/083721 WO2014067058A1 (fr) 2012-10-30 2012-10-30 Method and apparatus for interface switching
CN201280001467.7A CN103180803B (zh) 2012-10-30 2012-10-30 Method and apparatus for interface switching

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
PCT/CN2012/083721 WO2014067058A1 (fr) 2012-10-30 2012-10-30 Method and apparatus for interface switching

Publications (1)

Publication Number Publication Date
WO2014067058A1 (fr) 2014-05-08

Family

ID=48639389

Family Applications (1)

Application Number Title Priority Date Filing Date
PCT/CN2012/083721 WO2014067058A1 (fr) 2012-10-30 2012-10-30 Method and apparatus for interface switching

Country Status (2)

Country Link
CN (1) CN103180803B (fr)
WO (1) WO2014067058A1 (fr)


Families Citing this family (7)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN104881421B (zh) * 2014-12-15 2018-04-27 深圳市腾讯计算机系统有限公司 Three-dimensional graphic switching method and device
CN104808788B (zh) * 2015-03-18 2017-09-01 北京工业大学 Method for contactless gesture control of a user interface
CN105929953A (zh) * 2016-04-18 2016-09-07 北京小鸟看看科技有限公司 Operation guidance method and device in a 3D immersive environment, and virtual reality device
CN109062467B (zh) * 2018-07-03 2020-10-09 Oppo广东移动通信有限公司 Split-screen application switching method and device, storage medium, and electronic device
CN111435512A (zh) * 2019-01-11 2020-07-21 北京嘀嘀无限科技发展有限公司 Service information acquisition method and device
CN112748971A (zh) * 2019-10-31 2021-05-04 合肥海尔洗衣机有限公司 Power-on display control method for a household appliance
CN112337087A (zh) * 2020-09-28 2021-02-09 湖南泽途体育文化有限公司 Somatosensory interaction method and system applied to sports competition

Citations (4)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN101729808A (zh) * 2008-10-14 2010-06-09 Tcl集团股份有限公司 Television remote control method and system for remotely operating a television using the method
CN102023798A (zh) * 2009-09-17 2011-04-20 宏正自动科技股份有限公司 Method and apparatus for switching ports of a multi-computer switch using gestures on a touch panel
WO2012005893A2 (fr) * 2010-06-29 2012-01-12 Microsoft Corporation Skeletal joint recognition and tracking system
CN102749993A (zh) * 2012-05-30 2012-10-24 无锡掌游天下科技有限公司 Motion recognition method based on skeleton node data

Cited By (2)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
WO2015191323A1 (fr) 2014-06-10 2015-12-17 3M Innovative Properties Company Nozzle assembly with internal baffles
US10688508B2 (en) 2014-06-10 2020-06-23 3M Innovative Properties Company Nozzle assembly with external baffles

Also Published As

Publication number Publication date
CN103180803A (zh) 2013-06-26
CN103180803B (zh) 2016-01-13

Similar Documents

Publication Publication Date Title
WO2014067058A1 (fr) Method and apparatus for interface switching
US11323658B2 (en) Display apparatus and control methods thereof
US8933882B2 (en) User centric interface for interaction with visual display that recognizes user intentions
EP2664985B1 (fr) Tablet terminal and operation reception program
US9916043B2 (en) Information processing apparatus for recognizing user operation based on an image
US10990748B2 (en) Electronic device and operation method for providing cover of note in electronic device
WO2022237268A1 (fr) Information input method and apparatus for head-mounted display, and head-mounted display
JP6349800B2 (ja) Gesture recognition device and control method for gesture recognition device
JP5437726B2 (ja) Information processing program, information processing device, information processing system, and coordinate calculation method
US9846529B2 (en) Method for processing information and electronic device
TWI547788B (zh) Electronic device and gravity-sensing calibration method thereof
WO2017032078A1 (fr) Interface control method and mobile terminal
CN112488914A (zh) Image stitching method and apparatus, terminal, and computer-readable storage medium
WO2015131590A1 (fr) Method for controlling blank-screen gesture processing, and terminal
EP3582068A1 (fr) Information processing device, information processing method, and program
US20180373392A1 (en) Information processing device and information processing method
CN105183538A (zh) Information processing method and electronic device
CN104007928B (zh) Information processing method and electronic device
CN109542218B (zh) Mobile terminal, and human-computer interaction system and method
WO2017215211A1 (fr) Image display method based on a smart terminal having a touchscreen, and electronic apparatus
KR20170045101A (ko) Electronic device for sharing content with an external device, and content sharing method thereof
JP2014086018A (ja) Input device, angle input device, and program
JP5946965B2 (ja) Display system, display method, and program
EP4276591A1 (fr) Interaction method, electronic device, and interaction system
CN109379533A (zh) Photographing method, photographing apparatus, and terminal device

Legal Events

Date Code Title Description
121 Ep: the epo has been informed by wipo that ep was designated in this application

Ref document number: 12887521

Country of ref document: EP

Kind code of ref document: A1

NENP Non-entry into the national phase

Ref country code: DE

122 Ep: pct application non-entry in european phase

Ref document number: 12887521

Country of ref document: EP

Kind code of ref document: A1