CN112416115A - Method and equipment for man-machine interaction in control interaction interface - Google Patents


Info

Publication number
CN112416115A
CN112416115A (application CN201910785670.1A)
Authority
CN
China
Prior art keywords
head
information
angular velocity
control
instruction
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Granted
Application number
CN201910785670.1A
Other languages
Chinese (zh)
Other versions
CN112416115B (en)
Inventor
李文卿
刘霞
徐健钢
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Liangfengtai Shanghai Information Technology Co ltd
Original Assignee
Liangfengtai Shanghai Information Technology Co ltd
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Liangfengtai Shanghai Information Technology Co ltd filed Critical Liangfengtai Shanghai Information Technology Co ltd
Priority to CN201910785670.1A priority Critical patent/CN112416115B/en
Publication of CN112416115A publication Critical patent/CN112416115A/en
Application granted granted Critical
Publication of CN112416115B publication Critical patent/CN112416115B/en
Active legal-status Critical Current
Anticipated expiration legal-status Critical


Classifications

    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06FELECTRIC DIGITAL DATA PROCESSING
    • G06F3/00Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
    • G06F3/01Input arrangements or combined input and output arrangements for interaction between user and computer
    • G06F3/011Arrangements for interaction with the human body, e.g. for user immersion in virtual reality
    • G06F3/012Head tracking input arrangements
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06FELECTRIC DIGITAL DATA PROCESSING
    • G06F3/00Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
    • G06F3/01Input arrangements or combined input and output arrangements for interaction between user and computer
    • G06F3/03Arrangements for converting the position or the displacement of a member into a coded form
    • G06F3/033Pointing devices displaced or positioned by the user, e.g. mice, trackballs, pens or joysticks; Accessories therefor
    • G06F3/0346Pointing devices displaced or positioned by the user, e.g. mice, trackballs, pens or joysticks; Accessories therefor with detection of the device orientation or free movement in a 3D space, e.g. 3D mice, 6-DOF [six degrees of freedom] pointers using gyroscopes, accelerometers or tilt-sensors

Landscapes

  • Engineering & Computer Science (AREA)
  • General Engineering & Computer Science (AREA)
  • Theoretical Computer Science (AREA)
  • Human Computer Interaction (AREA)
  • Physics & Mathematics (AREA)
  • General Physics & Mathematics (AREA)
  • User Interface Of Digital Computer (AREA)

Abstract

The application aims to provide a method and a device for human-computer interaction in a control interaction interface. The method includes presenting a control interaction interface corresponding to a head-mounted device, where the control interaction interface includes a plurality of pieces of control identification information; collecting head motion information of a user and determining a corresponding interaction instruction based on the head motion information; and executing the corresponding interaction instruction based on the currently selected control identification information in the control interaction interface. By generating interaction instructions from the user's head motion on the control interaction interface, the method improves the accuracy of motion recognition, gives the user a good impression, and improves the user experience.

Description

Method and equipment for man-machine interaction in control interaction interface
Technical Field
The application relates to the field of intelligent interaction, in particular to a technology for man-machine interaction in a control interaction interface.
Background
Head-mounted devices such as augmented reality helmets, augmented reality glasses, and virtual reality headsets are becoming increasingly popular. Current interaction modes for head-mounted devices mainly include touch panels, voice recognition, external keyboards and mice, and gesture recognition. Voice recognition has poor resistance to interference: environmental noise greatly affects recognition accuracy, so the requirements on the usage environment are high. Touch pads and external keyboards and mice tie up the user's hands and cannot be used while wearing gloves (a particular problem for users in factory environments). Gesture recognition likewise occupies both of the user's hands and cannot free them. Head-mounted devices were originally designed to free the hands, a goal that obviously cannot be achieved by interacting through a keyboard, mouse, or touch pad.
Disclosure of Invention
The application aims to provide a method and equipment for man-machine interaction in a control interaction interface.
According to one aspect of the application, a method for human-computer interaction in a control interaction interface is provided, applied to a head-mounted device, the method comprising the following steps:
presenting a control interaction interface corresponding to the head-mounted device, wherein the control interaction interface comprises a plurality of control identification information;
acquiring head action information of a user, and determining a corresponding interaction instruction based on the head action information;
and executing a corresponding interaction instruction based on the currently selected current control identification information in the control interaction interface.
According to another aspect of the application, a method for human-computer interaction in a control interaction interface is provided, applied to a head-mounted device, the method comprising the following steps:
presenting a control interaction interface corresponding to the head-mounted device, wherein the control interaction interface includes selection confirmation information and selection cancellation information;
acquiring head motion information of a user, determining triaxial angular velocity template information matched with the triaxial angular velocity corresponding to the head motion information, and determining a corresponding interaction instruction according to the triaxial angular velocity template information, wherein the interaction instruction includes a selection confirmation instruction or a selection cancellation instruction;
and executing the corresponding interactive instruction.
According to one aspect of the application, a head-mounted device for human-computer interaction in a control interaction interface is provided, the device comprising:
a first module, configured to present a control interaction interface corresponding to the head-mounted device, wherein the control interaction interface includes a plurality of pieces of control identification information;
a second module, configured to collect head motion information of a user and determine a corresponding interaction instruction based on the head motion information;
and a third module, configured to execute the corresponding interaction instruction based on the currently selected control identification information in the control interaction interface.
According to one aspect of the application, a head-mounted device for human-computer interaction in a control interaction interface is provided, the device comprising:
a first module, configured to present a control interaction interface corresponding to the head-mounted device, wherein the control interaction interface includes selection confirmation information and selection cancellation information;
a second module, configured to acquire head motion information of a user, determine triaxial angular velocity template information matched with the triaxial angular velocity corresponding to the head motion information, and determine a corresponding interaction instruction according to the triaxial angular velocity template information, wherein the interaction instruction includes a selection confirmation instruction or a selection cancellation instruction;
and a third module, configured to execute the corresponding interaction instruction.
According to one aspect of the application, a device for human-computer interaction in a control interaction interface is provided, wherein the device comprises:
a processor; and
a memory arranged to store computer-executable instructions that, when executed, cause the processor to perform the operations of any of the methods described above.
According to one aspect of the application, there is provided a computer-readable medium storing instructions that, when executed, cause a system to perform the operations of any of the methods described above.
Compared with the prior art, the present application presents a control interaction interface corresponding to a head-mounted device, where the control interaction interface includes a plurality of pieces of control identification information, collects head motion information of a user, determines a corresponding interaction instruction based on the head motion information, and executes the corresponding interaction instruction based on the currently selected control identification information in the control interaction interface. By generating interaction instructions from the user's head motion on the control interaction interface, the method improves the accuracy of motion recognition, gives the user a good impression, and improves the user experience.
Drawings
Other features, objects and advantages of the present application will become more apparent upon reading of the following detailed description of non-limiting embodiments thereof, with reference to the accompanying drawings in which:
FIG. 1 is a flowchart illustrating a method for human-computer interaction in a control interaction interface according to an embodiment of the present application;
FIG. 2 illustrates an example of a control interaction interface arranged in a single row according to one embodiment of the present application;
FIG. 3 illustrates an example of a control interaction interface arranged in multiple rows and columns according to another embodiment of the present application;
FIG. 4 illustrates an example of determining a standard value of a rotation angle according to one embodiment of the present application;
FIG. 5 illustrates an example of a head swing action according to one embodiment of the present application;
FIG. 6 illustrates an example of an angular velocity profile according to one embodiment of the present application;
FIG. 7 illustrates an example of the curve of the squared modulus of the angular velocity over time according to one embodiment of the present application;
FIG. 8 illustrates a flowchart of a method for human-computer interaction in a control interaction interface, according to an embodiment of the present application;
FIG. 9 illustrates an example of a confirm/cancel control interaction interface according to one embodiment of the present application;
FIG. 10 illustrates functional modules of a head-mounted device according to one embodiment of the present application;
FIG. 11 illustrates functional modules of a head-mounted device according to another embodiment of the present application;
FIG. 12 illustrates an exemplary system that can be used to implement the various embodiments described in this application.
The same or similar reference numbers in the drawings identify the same or similar elements.
Detailed Description
The present application is described in further detail below with reference to the attached figures.
In a typical configuration of the present application, the terminal, the device serving the network, and the trusted party each include one or more processors (e.g., Central Processing Units (CPUs)), input/output interfaces, network interfaces, and memory.
The Memory may include forms of volatile Memory, Random Access Memory (RAM), and/or non-volatile Memory in a computer-readable medium, such as Read Only Memory (ROM) or Flash Memory. Memory is an example of a computer-readable medium.
Computer-readable media include permanent and non-permanent, removable and non-removable media, and may implement information storage by any method or technology. The information may be computer-readable instructions, data structures, program modules, or other data. Examples of computer storage media include, but are not limited to, Phase-Change Memory (PCM), Programmable Random Access Memory (PRAM), Static Random-Access Memory (SRAM), Dynamic Random Access Memory (DRAM), other types of Random Access Memory (RAM), Read-Only Memory (ROM), Electrically Erasable Programmable Read-Only Memory (EEPROM), flash memory or other memory technology, Compact Disc Read-Only Memory (CD-ROM), Digital Versatile Discs (DVD) or other optical storage, magnetic tape or magnetic disk storage or other magnetic storage devices, or any other non-transmission medium that can be used to store information accessible by a computing device.
The device referred to in this application includes, but is not limited to, a user device, a network device, or a device formed by integrating a user device and a network device through a network. The user device includes, but is not limited to, any mobile electronic product capable of human-computer interaction with a user (e.g., through a touch panel), such as a smartphone or a tablet computer, and the mobile electronic product may employ any operating system, such as the Android operating system or the iOS operating system. The network device includes an electronic device capable of automatically performing numerical calculation and information processing according to preset or stored instructions, and its hardware includes, but is not limited to, a microprocessor, an Application Specific Integrated Circuit (ASIC), a Programmable Logic Device (PLD), a Field Programmable Gate Array (FPGA), a Digital Signal Processor (DSP), an embedded device, and the like. The network device includes, but is not limited to, a computer, a network host, a single network server, a set of multiple network servers, or a cloud of multiple servers; here, the cloud is composed of a large number of computers or network servers based on Cloud Computing, a kind of distributed computing in which one virtual supercomputer consists of a collection of loosely coupled computers. The network includes, but is not limited to, the internet, a wide area network, a metropolitan area network, a local area network, a VPN network, a wireless ad hoc network, and the like. Preferably, the device may also be a program running on the user device, the network device, or a device formed by integrating the user device with the network device, the touch terminal, or the network device with the touch terminal through a network.
Of course, those skilled in the art will appreciate that the foregoing is by way of example only, and that other existing or future devices, which may be suitable for use in the present application, are also encompassed within the scope of the present application and are hereby incorporated by reference.
In the description of the present application, "a plurality" means two or more unless specifically defined otherwise.
Fig. 1 shows a method for performing human-computer interaction in a control interaction interface, which is applied to a head-mounted device and specifically includes step S101, step S102, and step S103. In step S101, a head-mounted device presents a control interaction interface corresponding to the head-mounted device, where the control interaction interface includes a plurality of control identification information; in step S102, the head-mounted device collects head motion information of a user, and determines a corresponding interaction instruction based on the head motion information; in step S103, the head-mounted device executes a corresponding interactive instruction based on the currently selected current control identification information in the control interactive interface. Here, the head-mounted device includes a display device, configured to present a human-computer interaction interface, such as a display screen, and display corresponding control information, control identification information, and the like; the head-mounted device further includes a collecting device for collecting posture information corresponding to the head motion information of the user, such as an inertial measurement unit, a gyroscope (a three-axis gyroscope or a plurality of single-axis gyroscopes), etc., it should be understood by those skilled in the art that the above collecting device is only an example, and other existing or future collecting devices may be applicable to the present application, and should be included in the scope of the present application, and are incorporated herein by reference. The head-mounted equipment further comprises a data processing device which is used for judging the head movement of the user according to the posture information corresponding to the head movement information and generating corresponding instruction information, or executing the corresponding instruction information and the like. The head-mounted equipment comprises any head-mounted mobile electronic equipment capable of performing man-machine interaction with a user, such as augmented reality glasses, augmented reality helmets, virtual reality glasses and the like. The control is used for encapsulating data and methods, such as various types of files (such as documents, tables and the like), folders or applications and the like; the control identification information is used for representing the control, and a user can conveniently access control data, such as a file, a folder or a name, an icon or an access link in an application in an interface.
Specifically, in step S101, the head-mounted device presents a control interaction interface corresponding to the head-mounted device, where the control interaction interface includes a plurality of pieces of control identification information. For example, the head-mounted device may present, through the display device, a corresponding control interaction interface that includes a plurality of pieces of control identification information, either on the display screen or superimposed in the user's line of sight, for user input. In some embodiments, the control interaction interface includes an application interaction interface, and the head-mounted device presents, through the display device, a plurality of applications installed on the head-mounted device for the user to select or start on the application interaction interface.
In some embodiments, the control interaction interface includes, but is not limited to: the control piece interaction interfaces are arranged in a single row; the control interactive interface is arranged in a single column; and the control interactive interface is arranged in multiple rows and multiple columns. For example, the corresponding control interaction interface presents different control interaction interfaces according to the number of controls, and if the number of controls is small, the corresponding control interaction interface may be formed by arranging a single line or a single column, as shown in fig. 2, a plurality of application icons are arranged by a single line, different application icons are selected by moving left and right, or an application corresponding to the currently selected application icon is started, and of course, only 4 application icons in the line are shown in the figure, and the line may also include other application icons and the like which are not displayed in the current display screen, and other application icons which are not displayed in the current screen are displayed by moving left and right. For another example, when the number of controls in the corresponding control interaction interface is large, the controls may be presented in an arrangement form of multiple rows and multiple columns in the interface, as shown in fig. 3, multiple application icons are arranged in the interface in the form of multiple rows and multiple columns, different application icons are selected by moving up and down, left and right, or a currently selected application is started, etc., only the application icons in the multiple rows and multiple columns in the current page are displayed in the current drawing, the application interaction interface further includes other application icons not in the current screen, and the current interface may be switched to an interface where other applications are located, or other application icons not in the current screen are displayed by moving up and down, left and right, etc.
In step S102, the head-mounted device collects head motion information of the user, and determines a corresponding interaction instruction based on the head motion information. For example, the head motion information includes posture change information corresponding to the head movement of the user; the posture change of the current movement can be obtained from the posture information of the initial head position and the posture information of the head position during the movement. In some embodiments, the data update frequency of the gyroscope of the head-mounted device is 100 Hz, and the head-mounted device can receive the gyroscope data in real time and perform calculations (such as integrating the angular velocity and computing the squared modulus of the angular velocity) to obtain the posture change information of the head movement. The user wears the head-mounted device and operates on the plurality of pieces of control identification information in its control interaction interface: the corresponding control interaction interface is presented through the display device, and the user selects and accesses control identification information through head movements. The head-mounted device can acquire the posture change information corresponding to the user's head motion through an inertial measurement unit, a gyroscope, or an acceleration sensor, and determine the corresponding interaction instruction from that posture change information, where the interaction instruction is used to determine the control the user intends to act on, such as selecting file identification information or application identification information of interest, accessing the corresponding file, or starting the selected application.
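The following is a minimal sketch (in Python; the function and constant names are illustrative assumptions, not part of the patent) of the kind of processing described above: reading gyroscope samples at a fixed rate, integrating the angular velocity into a rotation angle, and computing the squared modulus of the angular velocity per sample.

```python
SAMPLE_RATE_HZ = 100          # gyroscope update frequency mentioned above
DT = 1.0 / SAMPLE_RATE_HZ     # time between consecutive samples, in seconds

def accumulate_head_motion(gyro_samples):
    """gyro_samples: iterable of (wx, wy, wz) angular velocities in rad/s.

    Returns the integrated rotation angle per axis and the per-sample
    squared modulus of the angular velocity used by later checks.
    """
    angle = [0.0, 0.0, 0.0]
    squared_modulus = []
    for wx, wy, wz in gyro_samples:
        # rectangular integration of angular velocity into a rotation angle
        angle[0] += wx * DT
        angle[1] += wy * DT
        angle[2] += wz * DT
        squared_modulus.append(wx * wx + wy * wy + wz * wz)
    return angle, squared_modulus
```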
In some embodiments, the interaction instructions include, but are not limited to: moving a preset distance from the current control identification information along the motion direction of the head motion information; and starting the control corresponding to the current control identification information. For example, the interaction instruction is used to instruct a selection symbol such as a cursor, pointer, or focus to move a preset distance from the currently selected application identification information along the motion direction of the head motion; for instance, the interaction instruction may be to move one cell to the right, that is, to move the selection from the currently selected application identification information to the application identification information in the cell on its right; or to move N cells to the left, that is, to move the selection from the currently selected application identification information to the application identification information N cells to its left. The interaction instruction is also used to start the control corresponding to the current control identification information: if the currently selected application identification information corresponds to, for example, an "electronic book" application, the interaction instruction for starting the application is determined based on the user's head motion information, and the application is started based on that instruction. In some embodiments, when the control interaction interface is arranged in a single row or a single column, head motions in the two directions on one axis suffice to move the selection in the interface, and head motion on the other axis is used for the instruction that starts the control; for example, turning or swinging the head left or right on the horizontal axis moves the selection, while raising or lowering the head in the vertical-axis direction corresponds to the start instruction. Of course, the start instruction may also be triggered by other start operations of the user.
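A small sketch of how such a move instruction might be applied to the selection in a single-row interface; the clamping behaviour at the boundaries is an assumption, since the text does not specify it.

```python
def apply_move_instruction(selected_index, delta, num_controls):
    """Move the selection by `delta` cells in a single-row interface,
    clamping at the ends (an assumed boundary behaviour)."""
    return max(0, min(num_controls - 1, selected_index + delta))

# e.g. a "move one cell to the right" instruction on a row of four icons
new_index = apply_move_instruction(selected_index=1, delta=+1, num_controls=4)
```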
In some embodiments, the control interaction interface information includes control interaction interfaces arranged in a plurality of rows and a plurality of columns, and the interaction instruction includes a movement from the current control identification information along a movement direction of the head action information by a preset distance. For example, the control interaction interface includes a plurality of control identification information arranged in rows and columns, and the head motion information corresponding to the selection movement of the control identification information in the control interaction interface in four directions on two axes, such as the left and right movement for selecting the control by turning the head left or right on the horizontal axis, swinging the head, etc., and the upward and downward movement for selecting the control by turning the head up or down on the vertical axis, pitching, etc., are included. In some embodiments, the method further includes step S106 (not shown), if a determination operation of the user about the control is obtained, starting the control corresponding to the currently selected control identification information. For example, if a determination operation about the currently selected control is obtained, the head-mounted device starts the control corresponding to the currently selected control identification information, such as accessing a corresponding file, opening a corresponding folder, starting a corresponding application, opening a corresponding table, and the like. In some embodiments, the correspondence determination operation includes, but is not limited to: a movement motion of the user's head in a front-rear direction; the user head standing time is greater than or equal to a standing time threshold; voice instruction information related to the user; gesture instruction information related to the user; touch instruction information related to the user; eye movement instruction information related to the user. For example, the head-mounted device collects posture information about the head movement of the user, determines the displacement of the head of the user in the front-back direction based on the change of the posture information in the front-back direction of the head of the user, and determines a corresponding determination operation to generate the determination instruction if the displacement is greater than or equal to a certain distance threshold. 
Or after the head-mounted device selects the corresponding control identification information, continuing to acquire posture information about the head of the user, and if the standing time of the head of the user is greater than or equal to a standing time threshold (such as 500ms), determining corresponding determination operation by the head-mounted device, generating a determination instruction, and starting or accessing the corresponding control; or the head-mounted device is provided with a corresponding voice template and the like, the head-mounted device collects voice information and the like related to the user through a microphone, and if the voice information is matched with the voice template (if the similarity reaches a certain threshold), the head-mounted device determines a corresponding determination operation, generates a determination instruction, and starts or accesses a corresponding control; or the head-mounted device is provided with a corresponding gesture template and the like, the head-mounted device collects gesture information and the like related to the user through the camera, and if the gesture information is matched with the gesture template (if the similarity reaches a certain threshold), the head-mounted device determines corresponding determination operation, generates a determination instruction, and starts or accesses a corresponding control; or the head-mounted device is provided with a corresponding touch pad and the like, the head-mounted device collects touch information and the like related to the user through the touch pad, if the touch information is matched with a preset touch action (such as clicking by the user, double-click by the user and the like), the head-mounted device determines a corresponding determination operation, generates a determination instruction, and starts or accesses a corresponding control; or the head-mounted device is provided with a corresponding eye movement template and the like, the head-mounted device collects eye movement information and the like related to the user through the camera, and if the eye movement information is matched with the eye movement template (for example, the similarity reaches a certain threshold), the head-mounted device determines corresponding determination operation, generates a determination instruction, and starts or accesses a corresponding control.
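As one illustration of the dwell-time variant of the determination operation described above, the sketch below checks whether the head stays approximately still for at least 500 ms; the stillness threshold and the helper name are assumptions for illustration.

```python
DWELL_THRESHOLD_S = 0.5        # example dwell ("standing") time threshold: 500 ms
STILL_SPEED_THRESHOLD = 0.05   # rad/s; assumed bound for "the head is still"

def is_dwell_confirmation(gyro_samples, dt=0.01):
    """gyro_samples: list of (wx, wy, wz). Returns True if the head stays
    still long enough to count as a confirmation under the dwell-time variant."""
    still_time = 0.0
    for wx, wy, wz in gyro_samples:
        speed = (wx * wx + wy * wy + wz * wz) ** 0.5
        if speed < STILL_SPEED_THRESHOLD:
            still_time += dt
            if still_time >= DWELL_THRESHOLD_S:
                return True
        else:
            still_time = 0.0
    return False
```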
In step S103, the head-mounted device executes the corresponding interaction instruction based on the currently selected control identification information in the control interaction interface. For example, the head-mounted device executes the interaction instruction in combination with the currently selected control identification information: if the interaction instruction is to move N cells to the left, the selection moves from the currently selected application identification information to the application identification information N cells to its left, changing the selected application identifier; as another example, the interaction instruction may start the control corresponding to the currently selected application identification information, so that if the currently selected application identification information corresponds to an "electronic book" application, the interaction instruction for starting the application is determined based on the user's head motion information, and the application is started based on that instruction.
In some embodiments, the head action information includes, but is not limited to: head lateral rotation motion information, wherein the corresponding rotation direction comprises left or right with respect to the initial head position; head longitudinal rotation motion information, wherein the corresponding rotation direction comprises up or down with respect to the initial head position; head lateral swing motion information, wherein the corresponding swing direction comprises left or right with respect to the initial head position; head longitudinal swing motion information, wherein the corresponding swing direction comprises up or down with respect to the initial head position; for example, when the head-mounted device is in a normal wearing state, the center of a display screen of the head-mounted device is taken as an origin, the horizontal direction of a plane where the display screen of the head-mounted device is located is a horizontal axis, the vertical direction is a vertical axis, a perpendicular line of the center of the display screen is a third axis to establish a corresponding spatial coordinate system, and the head movement information is determined to move in the spatial coordinate system according to the posture change information corresponding to the head movement information of the user, where the head movement of the user includes, but is not limited to, head transverse rotation movement information, head longitudinal rotation movement information, head transverse swing movement information, head longitudinal swing movement information, and the like; wherein, the rotation motion represents a unidirectional movement of the head of the user, moving from a certain position to another position, such as moving from an initial position to the left/right/upper/lower side of the user, and the corresponding rotation direction includes a movement direction relative to the initial head position, such as left, right, upward or downward; the swing motion means a reciprocating motion of the head of the user within a certain range, such as moving from an initial position to a left/right/upper/lower side of the user and returning to the initial position, and the swing direction includes a moving direction relative to the initial head position, such as left, right, upward or downward. Here, the judgment of the head motion information in the lateral and longitudinal directions may be a single judgment, for example, one motion judges only a motion in one direction; alternatively, the head movement information is determined simultaneously in the lateral direction and the vertical direction, and for example, the current head movement information is divided into the movements of the axes on the lateral axis and the vertical axis, and the user movement is determined in the lateral direction and the vertical direction based on the movements of the axes.
In some embodiments, the determining the corresponding interaction instruction based on the head action information includes: and determining angular velocity template information matched with the angular velocity corresponding to the head movement information, and determining a corresponding interactive instruction according to the angular velocity template information. For example, the head-mounted device end stores posture data related to head movements corresponding to the interaction instructions, such as triaxial angular velocity template information preset by the user, or triaxial angular velocity template information obtained by clustering or other methods according to a large amount of statistical data, where the triaxial angular velocity template information is only an example, and may also be other uniaxial or multiaxial angular velocity template information. The head-mounted device collects three-axis gyroscope data corresponding to the head action information of the user, and matches the three-axis gyroscope data with stored three-axis angular velocity template information (for example, the angular velocity correlation coefficient of each axis reaches a threshold value, or the angular velocity correlation coefficient of any axis reaches a threshold value, etc.), and if the matched angular velocity template information exists, the interactive instruction corresponding to the angular velocity template information is used as the interactive instruction corresponding to the head action information.
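A hedged sketch of the per-axis correlation matching described above; the correlation threshold, the assumption that the captured sequence and the templates have equal length, and the function name are illustrative and not taken from the patent.

```python
import numpy as np

CORRELATION_THRESHOLD = 0.8    # assumed value; the text only says "reaches a threshold"

def match_template(gyro_seq, templates):
    """gyro_seq: (N, 3) array of angular velocities for one head motion.
    templates: dict mapping an instruction name to an (N, 3) template of the
    same length (resampling/alignment is omitted in this sketch).
    Returns the instruction whose template correlates above the threshold on
    every axis, or None if no template matches."""
    for instruction, tpl in templates.items():
        per_axis = [np.corrcoef(gyro_seq[:, a], tpl[:, a])[0, 1] for a in range(3)]
        if all(c >= CORRELATION_THRESHOLD for c in per_axis):
            return instruction
    return None
```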
In some embodiments, the determining the corresponding interaction instruction based on the head motion information includes: determining angular velocity template information matched with the angular velocity corresponding to the head motion information, determining acceleration template information matched with the acceleration corresponding to the head motion information, and determining a corresponding interaction instruction according to the angular velocity template information and the acceleration template information. For example, the head-mounted device stores posture data related to the head movements corresponding to the interaction instructions, such as triaxial acceleration template information and triaxial angular velocity template information preset by the user, or obtained by clustering or similar methods from a large amount of statistical data. The triaxial acceleration template information and triaxial angular velocity template information may form one template (containing both acceleration and angular velocity information, such as (grox, groy, groz, x, y, z)), or may be two independent templates (containing angular velocity and acceleration information respectively, such as (grox, groy, groz) and (x, y, z)). When the angular velocity template information and the acceleration template information are held in two independent templates, the two templates correspond to the same interaction instruction, which is the same instruction that the corresponding unified template (template information containing all six parameters of angular velocity and acceleration) maps to. The template information is a set of head motion data: for example, when the angular velocity and acceleration template information are contained in a unified template, the unified template contains assignments for the six parameters; after assignment, each of (grox, groy, groz, x, y, z) corresponds to a group of values, each unified template has a corresponding interaction instruction, and different unified templates yield different groups after assignment and correspond to different interaction instructions. The triaxial angular velocity template information and triaxial acceleration template information here are only examples; other single-axis or multi-axis angular velocity or acceleration template information may also be used. The head-mounted device collects the triaxial gyroscope data and triaxial acceleration data corresponding to the user's head motion and matches them against the stored triaxial angular velocity template information and triaxial acceleration template information (for example, the correlation coefficient of each axis reaches a threshold, or the correlation coefficient of any axis reaches a threshold); if matching unified template information exists, the interaction instruction corresponding to that template is used as the interaction instruction corresponding to the head motion information.
For another example, if the three-axis acceleration template information and the three-axis angular velocity template information are two independent templates, and there are matched three-axis angular velocity template information and three-axis acceleration template information, and the three-axis angular velocity template information and the three-axis acceleration template information correspond to the same interactive instruction, the interactive instruction is used as the interactive instruction corresponding to the head motion information.
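To make the two template layouts concrete, the following data-structure sketch shows a unified six-parameter template and the split form of two independent templates that map to the same interaction instruction; the class and field names are assumptions for illustration.

```python
from dataclasses import dataclass
from typing import List

@dataclass
class UnifiedTemplate:
    """One template holding all six parameter sequences (grox, groy, groz, x, y, z)
    over the motion window, plus the interaction instruction it maps to."""
    grox: List[float]
    groy: List[float]
    groz: List[float]
    x: List[float]
    y: List[float]
    z: List[float]
    instruction: str

@dataclass
class SplitTemplates:
    """Two independent templates (angular velocity and acceleration) that both
    map to the same interaction instruction as the unified form."""
    angular_velocity: List[List[float]]   # sequences for (grox, groy, groz)
    acceleration: List[List[float]]       # sequences for (x, y, z)
    instruction: str
```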
In some embodiments, the method further includes step S104 (not shown): the head-mounted device collects a plurality of motion samples related to the head motion information and determines corresponding angular velocity template information according to the plurality of motion samples. For example, a plurality of motion samples related to head motion information are collected, such as motion data from people of different body types, sexes, and ages, forming a learning set. Each motion is completed within a time period, so each motion sample in the learning set is a set of angular velocity data over a time period (the angular velocity may be collected by a gyroscope or an inertial measurement unit); the time period is preset, and one motion collected within a certain time threshold yields a set of gyroscope or inertial measurement unit data. A clustering method is then used to produce generalized matching templates, each corresponding to a different interaction instruction.
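A sketch of how such generalized templates might be produced from the learning set; the choice of k-means (via scikit-learn), the number of clusters, and the fixed window length are assumptions, since the text only says that a clustering method is used.

```python
import numpy as np
from sklearn.cluster import KMeans   # one possible clustering method; the text
                                     # only says "a clustering method" is used

def build_templates(samples_by_instruction, window_len=50):
    """samples_by_instruction: dict mapping an instruction name to a list of
    (window_len, 3) angular-velocity arrays recorded from many users.
    Returns one generalized template per instruction, taken as the centre of
    the dominant cluster to reduce the influence of outlier samples."""
    templates = {}
    for instruction, samples in samples_by_instruction.items():
        data = np.stack([s.reshape(-1) for s in samples])       # flatten each sample
        km = KMeans(n_clusters=2, n_init=10).fit(data)
        dominant = int(np.argmax(np.bincount(km.labels_)))
        templates[instruction] = km.cluster_centers_[dominant].reshape(window_len, 3)
    return templates
```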
In some embodiments, the method further includes step S105 (not shown), the head-mounted device acquires a plurality of motion samples related to the head motion information, and determines corresponding angular velocity template information and acceleration template information according to the plurality of motion samples. The angular velocity template information and the acceleration template information may be one template (the template includes acceleration and angular velocity information) or may be two independent templates (the template includes acceleration and angular velocity information, respectively). For example, the learning set and the template may include acceleration data in addition to only angular velocity data, and the current state of the head-mounted device may be more accurately determined based on the acceleration data, for example, the acceleration sensor is used to obtain the acceleration data to comprehensively determine static data of three axes so as to determine whether the head-mounted device is in a wearing state, and if the head-mounted device is in the wearing state, the corresponding interaction instruction is further determined.
In some embodiments, the head motion information includes the head lateral rotation motion information or the head longitudinal rotation motion information; and the determining the corresponding interaction instruction based on the head motion information includes: generating a corresponding interaction instruction if the rotation angle of the head movement is greater than or equal to the rotation angle standard value. For example, based on a rotation of the user's head about the horizontal or vertical axis, the gyroscope data in the head-mounted device is integrated to determine the rotation angle of the movement; if the rotation angle is greater than or equal to the rotation angle standard value, a corresponding interaction instruction is generated, instructing the currently selected control identification information to move by one cell or N cells in the direction matching the rotation. As another example, the gyroscope data is integrated in real time (for example, at the gyroscope acquisition interval, such as 100 Hz); whenever the current rotation angle is determined in real time to exceed the rotation angle standard value, a corresponding interaction instruction is generated and executed, again moving the currently selected control identification information by one cell or N cells in the direction of rotation. After one interaction instruction has been determined, the head-mounted device continues to acquire and evaluate the gyroscope data in real time, so that one or more interaction instructions are determined and the selected control identification information moves by one or more cells (or groups of N cells), with the moving direction consistent with the rotation direction. Obtaining interaction instructions by integrating the gyroscope data in real time helps the user control the head motion: if the motion deviates, it can be adjusted in time, which gives a good interaction experience. In addition, before the rotation angle is calculated from the gyroscope data acquired by the head-mounted device, the data may be filtered (for example, mean filtering) to remove sensor noise and then integrated to obtain the rotation angle. Fig. 4 shows an example of determining the rotation angle standard value, where θ is the standard value and ω is the user's rotation range; dividing the user's rotation range ω by the number of controls in the rotation direction in the control interaction interface gives the corresponding rotation angle standard value θ. The user's rotation range may be preset in the head-mounted device by the user, or may be template data obtained from statistics over a large number of users.
For example, when the user's head turns left and right, the boundary angle formed by rotating about the cervical vertebra is measured; by collecting the rotation angles of a large number of users, a widely applicable angle w is computed, and dividing w by the number of controls in the rotation direction in the control interaction interface gives the corresponding rotation angle standard value θ. In some embodiments, the determining the corresponding interaction instruction based on the head motion information includes: generating a corresponding interaction instruction if the rotation angle of the head movement is greater than or equal to the rotation angle standard value and the duration of the head movement satisfies a motion time threshold. For example, the head-mounted device is provided with threshold information for head motion, including but not limited to a rotation angle standard value and a motion time threshold or confidence interval (e.g., a time threshold of 0.5 s, or a confidence interval of 0.4 s to 0.6 s); if the rotation angle of the user's head motion within the motion time threshold is greater than or equal to the rotation angle standard value, corresponding interaction instruction information is generated. The motion time threshold may be a single value or an interval of values.
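The following sketch computes the rotation angle standard value θ and emits one-cell move instructions as the integrated rotation angle passes successive multiples of θ; the single-axis simplification and the omission of filtering and the motion-time check are assumptions made for brevity.

```python
def rotation_angle_standard(user_rotation_range_deg, controls_in_direction):
    """theta = w / (number of controls in the rotation direction), as in Fig. 4."""
    return user_rotation_range_deg / controls_in_direction

def moves_from_rotation(gyro_yaw_samples, dt, theta_deg):
    """Integrate single-axis angular velocity (deg/s) in real time and emit a
    one-cell move instruction each time another multiple of theta is passed;
    the sign of each move gives the direction."""
    angle_since_last_move = 0.0
    moves = []
    for w in gyro_yaw_samples:
        angle_since_last_move += w * dt
        while abs(angle_since_last_move) >= theta_deg:
            step = 1 if angle_since_last_move > 0 else -1
            moves.append(step)                       # one cell in the rotation direction
            angle_since_last_move -= step * theta_deg
    return moves

# e.g. theta for a 60-degree comfortable head rotation over a row of 4 controls
theta = rotation_angle_standard(user_rotation_range_deg=60.0, controls_in_direction=4)
```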
In some embodiments, in step S102, the head-mounted device collects head motion information of the user, and detects the head motion information; and if the detection is passed, determining a corresponding interactive instruction based on the head action information. For example, when generating an interactive command based on the head motion information, the head motion information may be pre-detected, and if the detection is passed, the corresponding interactive command may be determined according to the head motion information. For example, the head motion information may be determined by presetting an angle threshold or an angular velocity square threshold, or by presetting a motion time threshold/confidence interval, or by presetting an angular velocity threshold/an angular velocity square threshold in combination with motion time, etc., to detect whether the head motion information is a valid motion, and remove low-frequency drift and high-frequency noise signals, etc. For example, integrating gyroscope data of the head action to obtain an angle of the head action, and if the angle does not accord with a preset angle threshold, removing the motion data and not judging an interactive instruction corresponding to the head action; for example, the square of the module of the gyroscope angular velocity corresponding to the head motion is calculated, and if the square of the module of the angular velocity does not accord with the preset angular velocity square threshold, the motion data is removed, and the interactive command corresponding to the head motion is not judged; for example, calculating the movement time of the head movement, and if the time does not meet a preset movement time threshold/confidence interval (for example, is lower than the acquisition time interval of gyroscope data), removing the movement data and not judging the interactive instruction corresponding to the head movement; for another example, the head movement gyro data is integrated to obtain the angle of the head movement, and the movement time of the head movement is calculated. If the time does not meet the preset movement time threshold or if the angle does not meet the preset angle threshold, the movement data is removed, and the interactive instruction corresponding to the head action is not judged.
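A sketch of such a pre-detection step; the specific threshold values and the choice of axis are assumptions for illustration.

```python
def is_valid_head_motion(gyro_samples, dt,
                         min_angle_deg=10.0,         # assumed angle threshold
                         min_peak_sq=0.5,            # assumed angular-velocity-squared threshold
                         time_range_s=(0.4, 0.6)):   # example confidence interval from above
    """Pre-detection before instruction matching: reject motions whose duration,
    integrated angle, or peak squared angular velocity fall outside the preset
    thresholds, removing low-frequency drift and high-frequency noise."""
    if not gyro_samples:
        return False
    duration = len(gyro_samples) * dt
    angle = sum(w[0] * dt for w in gyro_samples)                    # yaw axis, as an example
    peak_sq = max(w[0] ** 2 + w[1] ** 2 + w[2] ** 2 for w in gyro_samples)
    if not (time_range_s[0] <= duration <= time_range_s[1]):
        return False
    if abs(angle) < min_angle_deg or peak_sq < min_peak_sq:
        return False
    return True
```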
In some embodiments, the head motion information includes the head lateral rotation motion information or the head longitudinal rotation motion information; and detecting the head motion information includes, but is not limited to: whether the squared modulus of the angular velocity of the head motion information is greater than or equal to a preset first angular-velocity-squared threshold; and whether the difference between the times at which the squared modulus of the angular velocity is greater than or equal to a second angular-velocity-squared threshold satisfies a preset first time-difference threshold, where the first and second angular-velocity-squared thresholds may be the same or different. For example, a swing motion is a reciprocating motion, and the corresponding gyroscope angular velocity forms a curve resembling one cycle of a sine wave, first increasing and then decreasing or first decreasing and then increasing. As shown in fig. 5, the head-mounted device first moves from rest at point A to point B about the axis at point O and stops, and then moves from point B back to point A about point O; this process forms a sine-wave-like pattern as shown in fig. 6, in which t1 to t2 correspond to the outward swing in the swing direction (for example, from point A to point B) and t2 to t3 correspond to the recovery motion (for example, from point B back to point A). Taking the squared modulus of the angular velocity as the vertical axis and time as the horizontal axis gives the curve of the squared modulus of the angular velocity over time shown in fig. 7, in which the level w1 intersects the curve at the time points t1 and t5. If the squared modulus of the swing angular velocity reaches a value greater than or equal to w1, and the difference between the corresponding times satisfying this condition (e.g., t1 and t5) satisfies a preset first time threshold or confidence interval, the motion is judged to be a valid motion. The first time threshold may be a single value or an interval of values. In some implementations, the head motion information includes the head lateral swing motion information or the head longitudinal swing motion information; and detecting the head motion information includes: detecting whether the squared modulus of the angular velocity of the head motion information is greater than or equal to a third angular-velocity-squared threshold and whether the difference between the corresponding times satisfying this condition satisfies a preset second time-difference threshold; and if so, detecting whether there are four different time nodes during the motion at which the squared modulus of the angular velocity equals a fourth angular-velocity-squared threshold.
For example, on the basis of the swing curve, the head-mounted device sets a fourth angular-velocity-squared threshold larger than the third angular-velocity-squared threshold, and generates a corresponding interaction instruction if the swing curve corresponding to the head movement intersects the fourth threshold at four points: as shown in fig. 7, if the squared modulus of the instantaneous swing angular velocity intersects the straight line corresponding to w2 at four points, the head-mounted device confirms that the current head motion information is a valid motion.
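The sketch below combines the two swing checks described above: the w1 level with an admissible time difference between the first and last times above it, and the requirement of four crossings of the higher level w2; the threshold and interval values are assumptions.

```python
def detect_swing(sq_modulus, dt, w1, w2, time_diff_range_s=(0.3, 0.8)):
    """sq_modulus: per-sample squared modulus of the angular velocity (the
    curve of Fig. 7). Checks (a) that the curve reaches w1 and that the first
    and last samples above w1 (roughly t1 and t5) are separated by an admissible
    interval, and (b) that the curve crosses the higher level w2 at four points,
    as expected for an out-and-back swing."""
    above_w1 = [i for i, v in enumerate(sq_modulus) if v >= w1]
    if not above_w1:
        return False
    span = (above_w1[-1] - above_w1[0]) * dt
    if not (time_diff_range_s[0] <= span <= time_diff_range_s[1]):
        return False
    crossings = sum(
        1 for a, b in zip(sq_modulus, sq_modulus[1:])
        if (a - w2) * (b - w2) < 0            # sign change means the curve crosses w2
    )
    return crossings >= 4
```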
Fig. 8 illustrates a method for human-computer interaction in a control interaction interface according to an aspect of the present application, applied to a head-mounted device, where the method includes step S201, step S202, and step S203. In step S201, the head-mounted device presents a control interaction interface corresponding to the head-mounted device, where the control interaction interface includes selection confirmation information and selection cancellation information. In step S202, the head-mounted device collects head motion information of a user, determines angular velocity template information matched with the angular velocity corresponding to the head motion information, and determines a corresponding interaction instruction according to the angular velocity template information, where the interaction instruction includes a selection confirmation instruction or a selection cancellation instruction. In step S203, the head-mounted device executes the corresponding interaction instruction. For example, the head-mounted device presents a control interaction interface to the user, such as the confirm/cancel interface shown in fig. 9, which includes the selection confirmation information and the selection cancellation information. The head-mounted device then collects head motion information and obtains the corresponding angular velocity data. The head-mounted device stores angular velocity data related to the head movements corresponding to the interaction instructions, such as triaxial angular velocity template information preset by the user, or generalized matching triaxial angular velocity template information obtained by clustering or similar methods from a large amount of statistical data; the triaxial angular velocity template information is only an example, and other single-axis or multi-axis angular velocity template information may also be used. The head-mounted device collects the triaxial gyroscope data corresponding to the user's head motion and matches it against the stored triaxial angular velocity template information (for example, the angular velocity correlation coefficient of each axis reaches a threshold, or the correlation coefficient of any axis reaches a threshold); if matching triaxial angular velocity template information exists, the interaction instruction corresponding to that template is used as the interaction instruction corresponding to the head motion information, for example a nodding action corresponds to the confirmation instruction and a head-shaking action corresponds to the cancellation instruction. The head-mounted device then executes the corresponding interaction instruction, such as the confirmation instruction or the cancellation instruction.
In some embodiments, in step S202, the head-mounted device collects head motion information of the user, determines angular velocity template information matching the angular velocity corresponding to the head motion information, determines acceleration template information matching the acceleration corresponding to the head motion information, and determines a corresponding interaction instruction according to the angular velocity template information and the acceleration template information. For example, a plurality of motion samples related to head motion information are collected, such as motion data from people of different body types, sexes and ages, to form a learning set. Each motion is completed within a preset time period, so each motion sample in the learning set is a set of angular velocity and acceleration data, such as inertial measurement unit data, collected for one motion under a certain time threshold. A clustering method is then used to produce generalized matching templates, each corresponding to a different interaction instruction.
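One possible sketch of building such generalized templates by clustering is shown below; the use of scikit-learn's KMeans, the fixed resampling length of 50, and the six IMU channels per sample are assumptions for illustration only, and any clustering method could be substituted.

```python
import numpy as np
from sklearn.cluster import KMeans  # assumed available; any clustering method could be used instead

def resample(seq, length):
    # Resample one motion sample (N, 6), i.e. 3-axis angular velocity plus 3-axis acceleration,
    # to a fixed length so that all samples in the learning set are comparable.
    seq = np.asarray(seq, dtype=float)
    old = np.linspace(0.0, 1.0, len(seq))
    new = np.linspace(0.0, 1.0, length)
    return np.stack([np.interp(new, old, seq[:, c]) for c in range(seq.shape[1])], axis=1)

def build_generalized_templates(samples, n_templates, length=50):
    # Flatten every resampled sample, cluster the learning set, and keep each
    # cluster centroid as one generalized matching template.
    resampled = np.stack([resample(s, length) for s in samples])   # (M, length, 6)
    flat = resampled.reshape(len(samples), -1)                     # (M, length * 6)
    km = KMeans(n_clusters=n_templates, n_init=10, random_state=0).fit(flat)
    return km.cluster_centers_.reshape(n_templates, length, 6)
```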
Fig. 9 shows a head-mounted device for human-computer interaction in a control interaction interface according to an aspect of the present application, which specifically includes a one-one module 101, a one-two module 102, and a one-three module 103. The one-one module 101 is configured to present a control interaction interface corresponding to the head-mounted device, where the control interaction interface includes a plurality of control identification information; the one-two module 102 is configured to collect head motion information of a user and determine a corresponding interaction instruction based on the head motion information; and the one-three module 103 is configured to execute a corresponding interactive instruction based on the currently selected current control identification information in the control interaction interface. Here, the specific embodiments of the one-one module 101, the one-two module 102, and the one-three module 103 are the same as or similar to the specific embodiments of step S101, step S102, and step S103, respectively, and are not repeated herein, but are included herein by reference.
In some embodiments, the control interaction interface includes, but is not limited to: a control interaction interface arranged in a single row; a control interaction interface arranged in a single column; and a control interaction interface arranged in multiple rows and multiple columns. Here, the specific embodiment of this control interaction interface is the same as or similar to the embodiment of the control interaction interface described above, and is not repeated here, but is included herein by reference.
In some embodiments, the interaction instruction includes, but is not limited to: moving a preset distance from the current control identification information along the motion direction of the head action information; and starting the control corresponding to the current control identification information. In some embodiments, the control interaction interface information includes a control interaction interface arranged in multiple rows and multiple columns, and the interaction instruction includes moving a preset distance from the current control identification information along the motion direction of the head action information. Here, the specific embodiment of the interaction instruction is the same as or similar to the embodiment of the interaction instruction described above, and is not repeated here, but is included herein by reference.
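As an illustration of moving the selection a preset distance along the motion direction in a layout of multiple rows and multiple columns, the following hypothetical helper treats the preset distance as one grid cell and clamps the new selection to the grid bounds; the direction labels and the one-cell step are assumptions made here for illustration.

```python
# Hypothetical sketch: move the currently selected control one cell (the "preset distance")
# in the direction derived from the head action, inside a grid of rows x cols controls.
DIRECTION_DELTAS = {
    "left": (0, -1), "right": (0, 1),
    "up": (-1, 0), "down": (1, 0),
}

def move_selection(current, direction, rows, cols):
    # current is the (row, col) index of the currently selected control identification information.
    d_row, d_col = DIRECTION_DELTAS[direction]
    new_row = min(max(current[0] + d_row, 0), rows - 1)   # clamp to the grid bounds
    new_col = min(max(current[1] + d_col, 0), cols - 1)
    return (new_row, new_col)

print(move_selection((1, 2), "right", rows=3, cols=4))  # -> (1, 3)
```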
In some embodiments, the apparatus further includes a one-six module 106 (not shown), configured to start the control corresponding to the currently selected control identification information if a determination operation of the user about the control is obtained. In some embodiments, the corresponding determination operation includes, but is not limited to: a movement of the user's head in the front-rear direction; the user's head remaining stationary for a time greater than or equal to a stationary time threshold; voice instruction information related to the user; gesture instruction information related to the user; touch instruction information related to the user; and eye movement instruction information related to the user. Here, the specific embodiments of the one-six module 106 and the determination operation are the same as or similar to the specific embodiments of step S106 and the determination operation described above, and are not repeated here, but are included herein by reference.
In some embodiments, the head action information includes, but is not limited to: head lateral rotation motion information, wherein the corresponding rotation direction comprises left or right with respect to the initial head position; head longitudinal rotation motion information, wherein the corresponding rotation direction comprises up or down with respect to the initial head position; head lateral swing motion information, wherein the corresponding swing direction comprises left or right with respect to the initial head position; and head longitudinal swing motion information, wherein the corresponding swing direction comprises up or down with respect to the initial head position. Here, the specific embodiment of the head action information is the same as or similar to the embodiment of the head action information described above, and is not repeated herein, but is included herein by reference.
In some embodiments, the determining the corresponding interaction instruction based on the head action information includes: determining angular velocity template information matched with the angular velocity corresponding to the head movement information, and determining a corresponding interactive instruction according to the angular velocity template information. In some embodiments, the determining the corresponding interactive instruction based on the head action information comprises: determining angular velocity template information matched with the angular velocity corresponding to the head movement information, determining acceleration template information matched with the acceleration corresponding to the head movement information, and determining a corresponding interactive instruction according to the angular velocity template information and the acceleration template information. Here, the specific embodiment of determining the interactive instruction is the same as or similar to the embodiment of determining the interactive instruction described above, and is not repeated here, but is included herein by reference.
In some embodiments, the apparatus further includes a one-four module 104 (not shown) for collecting a plurality of motion samples related to the head action information and determining corresponding angular velocity template information according to the plurality of motion samples. Here, the specific embodiment of the one-four module 104 is the same as or similar to the specific embodiment of step S104, and is not repeated here, but is included herein by reference.
In some embodiments, the apparatus further includes a one-five module 105 (not shown) for collecting a plurality of motion samples related to the head action information and determining corresponding angular velocity template information and acceleration template information according to the plurality of motion samples. Here, the specific embodiment of the one-five module 105 is the same as or similar to the specific embodiment of step S105, and is not repeated here, but is included herein by reference.
In some embodiments, the head movement information comprises the head lateral rotation movement information or the head longitudinal rotation movement information, and the determining of the corresponding interactive instruction based on the head action information comprises: generating a corresponding interactive instruction if the rotation angle of the head movement is greater than or equal to a rotation angle standard value. In some embodiments, the determining of the corresponding interaction instruction based on the head action information includes: generating a corresponding interactive instruction if the rotation angle of the head movement is greater than or equal to the rotation angle standard value and the movement time of the head movement meets a movement time threshold. Here, the specific embodiment of determining the interactive instruction from the head rotation is the same as or similar to the corresponding embodiment described above, and is not repeated herein, but is included herein by reference.
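A sketch of this rotation rule under stated assumptions: the gyroscope is sampled at a fixed interval, the rotation angle is obtained by integrating the angular velocity about the relevant axis, and "meets a movement time threshold" is read here as "completes within the threshold"; all names are illustrative.

```python
import numpy as np

def rotation_passes(gyro_axis_samples, dt, angle_standard_rad, time_threshold_s):
    # gyro_axis_samples: angular velocity (rad/s) about the rotation axis, one value every dt seconds.
    motion_time = len(gyro_axis_samples) * dt
    # Integrate the angular velocity over the motion to estimate the rotation angle.
    rotation_angle = abs(float(np.sum(gyro_axis_samples)) * dt)
    # Generate an instruction only if the angle reaches the standard value within the time threshold.
    return rotation_angle >= angle_standard_rad and motion_time <= time_threshold_s

# Example: a 0.5 s rotation at 2 rad/s reaches 1 rad, which passes a 0.8 rad standard value.
print(rotation_passes([2.0] * 50, dt=0.01, angle_standard_rad=0.8, time_threshold_s=1.0))  # True
```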
In some embodiments, in step S102, the head-mounted device collects head motion information of the user and detects the head motion information; if the detection passes, a corresponding interactive instruction is determined based on the head action information. For example, before an interactive instruction is generated based on the head motion information, the head motion information may be pre-detected, and the corresponding interactive instruction is determined according to the head motion information only if the detection passes. In some embodiments, the head movement information comprises the head lateral rotation movement information or the head longitudinal rotation movement information, and the detecting of the head motion information includes, but is not limited to: whether the square value of the modulus of the angular velocity of the head motion information is greater than or equal to a preset first angular velocity square threshold; and whether the difference between the times corresponding to angular velocities whose squared modulus is greater than or equal to a second angular velocity square threshold meets a preset first time difference threshold. In some embodiments, the head motion information comprises the head lateral swing motion information or the head longitudinal swing motion information, and the detecting of the head motion information includes: detecting whether the square value of the modulus of the angular velocity of the head action information is greater than or equal to a third angular velocity square threshold and whether the difference between the times corresponding to the angular velocities meeting this condition meets a preset second time difference threshold; and if so, detecting whether there exist four different time nodes during the movement at which the square value of the modulus of the angular velocity of the head movement information equals a fourth angular velocity square threshold. Here, the specific embodiment of the pre-detection of the head motion information is the same as or similar to the corresponding embodiment described above, and is not repeated herein, but is included herein by reference.
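For the rotation-type pre-detection, a hedged sketch might look as follows; reading "meets a preset first time difference threshold" as "does not exceed it" is an assumption, as are the function and parameter names.

```python
import numpy as np

def predetect_rotation(gyro_samples, timestamps, first_threshold_sq,
                       second_threshold_sq, first_time_diff_s):
    # gyro_samples: (N, 3) angular velocities; timestamps: (N,) sample times in seconds.
    gyro_samples = np.asarray(gyro_samples, dtype=float)
    timestamps = np.asarray(timestamps, dtype=float)
    sq_norm = np.sum(gyro_samples ** 2, axis=1)
    # Condition 1: the squared modulus of the angular velocity reaches the first threshold somewhere.
    if not np.any(sq_norm >= first_threshold_sq):
        return False
    # Condition 2: the samples exceeding the second threshold span no more than the
    # allowed time difference (interpretation assumed here).
    over = np.flatnonzero(sq_norm >= second_threshold_sq)
    if over.size == 0:
        return False
    time_diff = timestamps[over[-1]] - timestamps[over[0]]
    return bool(time_diff <= first_time_diff_s)
```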
Fig. 11 illustrates a head-mounted device for human-computer interaction in a control interaction interface according to an aspect of the present application, where the device includes a two-one module 201, a two-two module 202, and a two-three module 203. The two-one module 201 is configured to present a control interaction interface corresponding to the head-mounted device, where the control interaction interface includes determination selection information and deselection information. The two-two module 202 is configured to collect head motion information of a user, determine three-axis angular velocity template information matching the three-axis angular velocities corresponding to the head motion information, and determine a corresponding interactive instruction according to the three-axis angular velocity template information, where the interactive instruction information includes a selection confirmation instruction or a selection cancellation instruction. The two-three module 203 is configured to execute the corresponding interactive instruction. Here, the specific embodiments of the two-one module 201, the two-two module 202, and the two-three module 203 are the same as or similar to the specific embodiments of step S201, step S202, and step S203, respectively, and are not repeated herein, but are included herein by reference.
In some embodiments, the two-two module 202 collects head motion information of the user, determines angular velocity template information matching the angular velocity corresponding to the head motion information, determines acceleration template information matching the acceleration corresponding to the head motion information, and determines a corresponding interaction instruction according to the angular velocity template information and the acceleration template information. Here, the specific embodiment of the two-two module 202 performing matching based on the angular velocity template information and the acceleration template information is the same as or similar to the specific embodiment of step S202 performing matching based on the angular velocity template information and the acceleration template information, and is not described again, but is included herein by reference.
In addition to the methods and apparatus described in the above embodiments, the present application also provides a computer-readable storage medium storing computer code which, when executed, performs the method as described in any of the foregoing embodiments.
The present application also provides a computer program product which, when executed by a computer device, performs the method as described in any of the foregoing embodiments.
The present application further provides a computer device, comprising:
one or more processors;
a memory for storing one or more computer programs;
the one or more computer programs, when executed by the one or more processors, cause the one or more processors to implement the method as described in any of the foregoing embodiments.
FIG. 12 illustrates an exemplary system that can be used to implement the various embodiments described herein.
In some embodiments, as shown in FIG. 12, the system 300 can be implemented as any one of the devices in the above-described embodiments. In some embodiments, system 300 may include one or more computer-readable media (e.g., system memory or NVM/storage 320) having instructions and one or more processors (e.g., processor(s) 305) coupled with the one or more computer-readable media and configured to execute the instructions to implement modules to perform the actions described herein.
For one embodiment, system control module 310 may include any suitable interface controllers to provide any suitable interface to at least one of processor(s) 305 and/or any suitable device or component in communication with system control module 310.
The system control module 310 may include a memory controller module 330 to provide an interface to the system memory 315. Memory controller module 330 may be a hardware module, a software module, and/or a firmware module.
System memory 315 may be used, for example, to load and store data and/or instructions for system 300. For one embodiment, system memory 315 may include any suitable volatile memory, such as suitable DRAM. In some embodiments, the system memory 315 may include a double data rate type four synchronous dynamic random access memory (DDR4 SDRAM).
For one embodiment, system control module 310 may include one or more input/output (I/O) controllers to provide an interface to NVM/storage 320 and communication interface(s) 325.
For example, NVM/storage 320 may be used to store data and/or instructions. NVM/storage 320 may include any suitable non-volatile memory (e.g., flash memory) and/or may include any suitable non-volatile storage device(s) (e.g., one or more Hard Disk Drives (HDDs), one or more Compact Disc (CD) drives, and/or one or more Digital Versatile Disc (DVD) drives).
NVM/storage 320 may include storage resources that are physically part of a device on which system 300 is installed or may be accessed by the device and not necessarily part of the device. For example, NVM/storage 320 may be accessible over a network via communication interface(s) 325.
Communication interface(s) 325 may provide an interface for system 300 to communicate over one or more networks and/or with any other suitable device. System 300 may wirelessly communicate with one or more components of a wireless network according to any of one or more wireless network standards and/or protocols.
For one embodiment, at least one of the processor(s) 305 may be packaged together with logic for one or more controllers (e.g., memory controller module 330) of system control module 310. For one embodiment, at least one of the processor(s) 305 may be packaged together with logic for one or more controller(s) of the system control module 310 to form a System In Package (SiP). For one embodiment, at least one of the processor(s) 305 may be integrated on the same die with logic for one or more controller(s) of the system control module 310. For one embodiment, at least one of the processor(s) 305 may be integrated on the same die with logic for one or more controller(s) of the system control module 310 to form a system on a chip (SoC).
In various embodiments, system 300 may be, but is not limited to being: a server, a workstation, a desktop computing device, or a mobile computing device (e.g., a laptop computing device, a handheld computing device, a tablet, a netbook, etc.). In various embodiments, system 300 may have more or fewer components and/or different architectures. For example, in some embodiments, system 300 includes one or more cameras, a keyboard, a Liquid Crystal Display (LCD) screen (including a touch screen display), a non-volatile memory port, multiple antennas, a graphics chip, an Application Specific Integrated Circuit (ASIC), and speakers.
It should be noted that the present application may be implemented in software and/or a combination of software and hardware, for example, implemented using Application Specific Integrated Circuits (ASICs), general purpose computers or any other similar hardware devices. In one embodiment, the software programs of the present application may be executed by a processor to implement the steps or functions described above. Likewise, the software programs (including associated data structures) of the present application may be stored in a computer readable recording medium, such as RAM memory, magnetic or optical drive or diskette and the like. Additionally, some of the steps or functions of the present application may be implemented in hardware, for example, as circuitry that cooperates with the processor to perform various steps or functions.
Additionally, a portion of the present application may be applied as a computer program product, such as computer program instructions, which, when executed by a computer, may invoke or provide methods and/or aspects in accordance with the present application through the operation of the computer. Those skilled in the art will appreciate that the forms in which the computer program instructions reside on a computer-readable medium include, but are not limited to, source files, executable files, installation package files, and the like, and that, accordingly, the manner in which the computer program instructions are executed by a computer includes, but is not limited to: the computer directly executing the instructions, or the computer compiling the instructions and then executing the corresponding compiled program, or the computer reading and executing the instructions, or the computer reading and installing the instructions and then executing the corresponding installed program. Computer-readable media herein can be any available computer-readable storage media or communication media that can be accessed by a computer.
Communication media includes media by which communication signals, including, for example, computer readable instructions, data structures, program modules, or other data, are transmitted from one system to another. Communication media may include conductive transmission media such as cables and wires (e.g., fiber optics, coaxial, etc.) and wireless (non-conductive transmission) media capable of propagating energy waves such as acoustic, electromagnetic, RF, microwave, and infrared. Computer readable instructions, data structures, program modules, or other data may be embodied in modulated data signals, for example, in a wireless medium such as a carrier wave or similar mechanism such as is embodied as part of spread spectrum technology. The term "modulated data signal" means a signal that has one or more of its characteristics set or changed in such a manner as to encode information in the signal. The modulation may be analog, digital or a hybrid modulation technique.
By way of example, and not limitation, computer-readable storage media may include volatile and nonvolatile, removable and non-removable media implemented in any method or technology for storage of information such as computer-readable instructions, data structures, program modules or other data. For example, computer readable storage media include, but are not limited to, volatile memory such as random access memory (RAM, DRAM, SRAM); and non-volatile memory such as flash memory, various read-only memories (ROM, PROM, EPROM, EEPROM), magnetic and ferromagnetic/ferroelectric memories (MRAM, FeRAM); and magnetic and optical storage devices (hard disk, tape, CD, DVD); or other now known media or later developed that can store computer-readable information/data for use by a computer system.
An embodiment according to the present application comprises an apparatus comprising a memory for storing computer program instructions and a processor for executing the program instructions, wherein the computer program instructions, when executed by the processor, trigger the apparatus to perform a method and/or a solution according to the aforementioned embodiments of the present application.
It will be evident to those skilled in the art that the application is not limited to the details of the foregoing illustrative embodiments, and that the present application may be embodied in other specific forms without departing from the spirit or essential attributes thereof. The present embodiments are therefore to be considered in all respects as illustrative and not restrictive, the scope of the application being indicated by the appended claims rather than by the foregoing description, and all changes which come within the meaning and range of equivalency of the claims are therefore intended to be embraced therein. Any reference sign in a claim should not be construed as limiting the claim concerned. Furthermore, it is obvious that the word "comprising" does not exclude other elements or steps, and the singular does not exclude the plural. A plurality of units or means recited in the apparatus claims may also be implemented by one unit or means in software or hardware. The terms first, second, etc. are used to denote names, but not any particular order.

Claims (22)

1. A method for human-computer interaction in a control interaction interface is applied to a head-mounted device, wherein the method comprises the following steps:
presenting a control interaction interface corresponding to the head-mounted device, wherein the control interaction interface comprises a plurality of control identification information;
acquiring head action information of a user, and determining a corresponding interaction instruction based on the head action information;
and executing a corresponding interactive instruction based on the currently selected current control identification information in the control interactive interface.
2. The method of claim 1, wherein the control interaction interface comprises at least any one of:
the control interaction interfaces are arranged in a single row;
the control interactive interface is arranged in a single column;
and the control interactive interface is arranged in multiple rows and multiple columns.
3. The method of claim 2, wherein the interaction instruction comprises at least any one of:
moving a preset distance from the current control identification information along the movement direction of the head action information;
and starting the control corresponding to the current control identification information.
4. The method of claim 3, wherein the control interaction interface information comprises a control interaction interface arranged in a plurality of rows and columns, and the interaction instruction comprises moving a preset distance from the current control identification information along the motion direction of the head action information.
5. The method of any of claims 1 to 4, wherein the head action information comprises at least any of:
head lateral rotation motion information, wherein the corresponding rotation direction comprises left or right with respect to the initial head position;
head longitudinal rotation motion information, wherein the corresponding rotation direction comprises up or down with respect to the initial head position;
head lateral swing motion information, wherein the corresponding swing direction comprises left or right with respect to the initial head position;
head longitudinal swing motion information, wherein the corresponding swing direction comprises up or down with respect to the initial head position.
6. The method of claim 5, wherein the determining a corresponding interaction instruction based on the head action information comprises:
and determining angular velocity template information matched with the angular velocity corresponding to the head movement information, and determining a corresponding interactive instruction according to the angular velocity template information.
7. The method of claim 6, wherein the determining a corresponding interaction instruction based on the head action information comprises:
determining angular velocity template information matched with the angular velocity corresponding to the head movement information, determining acceleration template information matched with the acceleration corresponding to the head movement information, and determining a corresponding interactive instruction according to the angular velocity template information and the acceleration template information.
8. The method of claim 6 or 7, wherein the method further comprises:
and acquiring a plurality of motion samples related to the head action information, and determining corresponding angular velocity template information according to the plurality of motion samples.
9. The method of claim 7 or 8, wherein the method further comprises:
and acquiring a plurality of motion samples related to the head action information, and determining corresponding angular velocity template information and acceleration template information according to the plurality of motion samples.
10. The method of claim 5, wherein the head movement information includes the head lateral rotation movement information or the head longitudinal rotation movement information; wherein the determining of the corresponding interactive instruction based on the head action information comprises:
and if the rotation angle of the head movement is greater than or equal to a rotation angle standard value, generating a corresponding interactive instruction.
11. The method of claim 10, wherein the determining a corresponding interaction instruction based on the head action information comprises:
and if the rotation angle of the head movement is greater than or equal to the rotation angle standard value and the movement time of the head movement meets a movement time threshold, generating a corresponding interaction instruction.
12. The method of any one of claims 1 to 11, wherein the collecting head motion information of a user and determining a corresponding interaction instruction based on the head motion information comprises:
collecting head action information of a user, and detecting the head action information;
and if the detection is passed, determining a corresponding interactive instruction based on the head action information.
13. The method of claim 12, wherein the head movement information includes the head lateral rotation movement information or the head longitudinal rotation movement information; wherein the detecting of the head motion information comprises at least any one of:
whether a square value of a norm of the angular velocity of the head motion information is greater than or equal to a preset first angular velocity square threshold value;
whether the difference between the times corresponding to angular velocities for which the square value of the modulus of the angular velocity of the head motion information is greater than or equal to a second angular velocity square threshold meets a preset first time difference threshold.
14. The method of claim 12, wherein the head motion information comprises the head sway lateral motion information or the head sway longitudinal motion information; wherein the detecting of the head motion information comprises:
detecting whether the square value of the modulus of the angular velocity of the head action information is greater than or equal to a third angular velocity square threshold, and whether the difference between the times corresponding to the angular velocities meeting this condition meets a preset second time difference threshold;
and if so, detecting whether there exist four different time nodes during the movement at which the square value of the modulus of the angular velocity of the head movement information equals a fourth angular velocity square threshold.
15. The method of any of claims 1 to 14, wherein the method further comprises:
and if the determined operation of the user about the control is obtained, starting the control corresponding to the currently selected control identification information.
16. The method of claim 15, wherein the determining operation comprises at least any one of:
a movement of the user's head in the front-rear direction;
the time for which the user's head remains stationary is greater than or equal to a stationary time threshold;
voice instruction information related to the user;
gesture instruction information related to the user;
touch instruction information related to the user;
eye movement instruction information related to the user.
17. A method for human-computer interaction in a control interaction interface is applied to a head-mounted device, wherein the method comprises the following steps:
presenting a control interaction interface corresponding to the head-mounted device, wherein the control interaction interface comprises selection determining information and selection canceling information;
acquiring head action information of a user, determining angular velocity template information matched with an angular velocity corresponding to the head motion information, and determining a corresponding interactive instruction according to the angular velocity template information, wherein the interactive instruction information comprises a selection confirmation instruction or a selection cancellation instruction;
and executing the corresponding interactive instruction.
18. The method of claim 17, wherein the collecting head motion information of the user, determining angular velocity template information matched with an angular velocity corresponding to the head motion information, and determining a corresponding interactive instruction according to the angular velocity template information, wherein the interactive instruction information comprises a confirmation selection instruction or a cancellation selection instruction, comprises:
the method comprises the steps of collecting head action information of a user, determining angular velocity template information matched with angular velocity corresponding to the head motion information, determining acceleration template information matched with acceleration corresponding to the head motion information, and determining a corresponding interaction instruction according to the angular velocity template information and the acceleration template information.
19. A head-mounted device for human-computer interaction in a control interaction interface, wherein the device comprises:
a one-one module for presenting a control interaction interface corresponding to the head-mounted device, wherein the control interaction interface comprises a plurality of control identification information;
a one-two module for collecting head action information of a user and determining a corresponding interaction instruction based on the head action information;
and a one-three module for executing a corresponding interactive instruction based on the currently selected current control identification information in the control interaction interface.
20. A head-mounted device for human-computer interaction in a control interaction interface, wherein the device comprises:
a two-one module for presenting a control interaction interface corresponding to the head-mounted device, wherein the control interaction interface comprises selection determining information and selection canceling information;
a two-two module for collecting head action information of a user, determining angular velocity template information matched with the angular velocity corresponding to the head action information, and determining a corresponding interactive instruction according to the angular velocity template information, wherein the interactive instruction information comprises a selection confirmation instruction or a selection cancellation instruction;
and a two-three module for executing the corresponding interactive instruction.
21. An apparatus for human-computer interaction in a control interaction interface, wherein the apparatus comprises:
a processor; and
a memory arranged to store computer executable instructions that, when executed, cause the processor to operate in accordance with the method of any one of claims 1 to 18.
22. A computer-readable medium storing instructions that, when executed, cause a system to perform the operations of any of the methods of claims 1-18.
CN201910785670.1A 2019-08-23 2019-08-23 Method and equipment for performing man-machine interaction in control interaction interface Active CN112416115B (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN201910785670.1A CN112416115B (en) 2019-08-23 2019-08-23 Method and equipment for performing man-machine interaction in control interaction interface

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
CN201910785670.1A CN112416115B (en) 2019-08-23 2019-08-23 Method and equipment for performing man-machine interaction in control interaction interface

Publications (2)

Publication Number Publication Date
CN112416115A true CN112416115A (en) 2021-02-26
CN112416115B CN112416115B (en) 2023-12-15

Family

ID=74779451

Family Applications (1)

Application Number Title Priority Date Filing Date
CN201910785670.1A Active CN112416115B (en) 2019-08-23 2019-08-23 Method and equipment for performing man-machine interaction in control interaction interface

Country Status (1)

Country Link
CN (1) CN112416115B (en)

Cited By (2)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN114048726A (en) * 2022-01-13 2022-02-15 北京中科汇联科技股份有限公司 Computer graphic interface interaction method and system
WO2023024871A1 (en) * 2021-08-24 2023-03-02 亮风台(上海)信息科技有限公司 Interface interaction method and device

Citations (14)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
EP2808752A1 (en) * 2013-05-28 2014-12-03 BlackBerry Limited Performing an action associated with a motion based input
CN104536654A (en) * 2014-12-25 2015-04-22 小米科技有限责任公司 Menu selecting method and device on intelligent wearable device and intelligent wearable device
CN105824409A (en) * 2016-02-16 2016-08-03 乐视致新电子科技(天津)有限公司 Interactive control method and device for virtual reality
KR20160133328A (en) * 2015-05-12 2016-11-22 삼성전자주식회사 Remote control method and device using wearable device
CN106527722A (en) * 2016-11-08 2017-03-22 网易(杭州)网络有限公司 Interactive method and system in virtual reality and terminal device
CN106970697A (en) * 2016-01-13 2017-07-21 华为技术有限公司 Interface alternation device and method
CN108008873A (en) * 2017-11-10 2018-05-08 亮风台(上海)信息科技有限公司 A kind of operation method of user interface of head-mounted display apparatus
US9996149B1 (en) * 2016-02-22 2018-06-12 Immersacad Corporation Method for one-touch translational navigation of immersive, virtual reality environments
CN108170279A (en) * 2015-06-03 2018-06-15 塔普翊海(上海)智能科技有限公司 Eye movement and head movement interaction method for a head-mounted display device
CN108304075A (en) * 2018-02-11 2018-07-20 亮风台(上海)信息科技有限公司 A kind of method and apparatus carrying out human-computer interaction in augmented reality equipment
US20190130622A1 (en) * 2017-10-27 2019-05-02 Magic Leap, Inc. Virtual reticle for augmented reality systems
US20190171283A1 (en) * 2017-12-04 2019-06-06 International Business Machines Corporation Modifying a computer-based interaction based on eye gaze
JP2019136066A (en) * 2018-02-06 2019-08-22 グリー株式会社 Application processing system, application processing method, and application processing program
US20200081526A1 (en) * 2018-09-06 2020-03-12 Sony Interactive Entertainment Inc. Gaze Input System and Method

Also Published As

Publication number Publication date
CN112416115B (en) 2023-12-15

Similar Documents

Publication Publication Date Title
US9870057B1 (en) Gesture detection using an array of short-range communication devices
CN103985137B Moving body tracking method and system applied to human-computer interaction
US10503373B2 (en) Visual feedback for highlight-driven gesture user interfaces
US10289214B2 (en) Method and device of controlling virtual mouse and head-mounted displaying device
US20130010071A1 (en) Methods and systems for mapping pointing device on depth map
CN106605202A (en) Handedness detection from touch input
CN107168539A (en) A kind of equipment awakening method, device and electronic equipment
CN103513788B Gesture recognition method, system and mobile terminal based on a gyroscope sensor
CN103902061A (en) Air mouse cursor display method, device and system
KR101228336B1 (en) Personalization Service Providing Method by Using Mobile Terminal User's Activity Pattern and Mobile Terminal therefor
CN112416115B (en) Method and equipment for performing man-machine interaction in control interaction interface
CN102945088A (en) Method, device and mobile equipment for realizing terminal-simulated mouse to operate equipment
CN109828672B (en) Method and equipment for determining man-machine interaction information of intelligent equipment
US20130268900A1 (en) Touch sensor gesture recognition for operation of mobile devices
CN103885571A (en) Information processing method and electronic equipment
EP3767435A1 (en) 6-dof tracking using visual cues
CN109324741A (en) A kind of method of controlling operation thereof, device and system
Gouthaman et al. Gesture detection system using smart watch based motion sensors
CN107454970A System and method for motion trajectory collection and analysis based on ball games
CN103177245A (en) Gesture recognition method and device
CN103558913A (en) Virtual input glove keyboard with vibration feedback function
CN112416140B (en) Method and equipment for inputting characters
CN106933466A Page interaction method and system
CN110413177B (en) Method and equipment for turning pages of electronic book
CN103000161B Image display method, device and intelligent handheld terminal

Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination
CB02 Change of applicant information

Address after: 201210 7th Floor, No. 1, Lane 5005, Shenjiang Road, China (Shanghai) Pilot Free Trade Zone, Pudong New Area, Shanghai

Applicant after: HISCENE INFORMATION TECHNOLOGY Co.,Ltd.

Address before: Room 501 / 503-505, 570 shengxia Road, China (Shanghai) pilot Free Trade Zone, Pudong New Area, Shanghai, 201203

Applicant before: HISCENE INFORMATION TECHNOLOGY Co.,Ltd.

GR01 Patent grant