WO2017096958A1 - Human-computer interaction method, apparatus, and mobile device - Google Patents

Human-computer interaction method, apparatus, and mobile device

Info

Publication number
WO2017096958A1
Authority
WO
WIPO (PCT)
Prior art keywords
sensing information
usage scenario
user
human
gesture action
Prior art date
Application number
PCT/CN2016/096771
Other languages
French (fr)
Chinese (zh)
Inventor
甘书宇
于燕
Original Assignee
乐视控股(北京)有限公司
乐视移动智能信息技术(北京)有限公司
Priority date
Filing date
Publication date
Application filed by 乐视控股(北京)有限公司 and 乐视移动智能信息技术(北京)有限公司
Publication of WO2017096958A1 publication Critical patent/WO2017096958A1/en

Classifications

    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06FELECTRIC DIGITAL DATA PROCESSING
    • G06F3/00Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
    • G06F3/01Input arrangements or combined input and output arrangements for interaction between user and computer
    • G06F3/048Interaction techniques based on graphical user interfaces [GUI]
    • G06F3/0487Interaction techniques based on graphical user interfaces [GUI] using specific features provided by the input device, e.g. functions controlled by the rotation of a mouse with dual sensing arrangements, or of the nature of the input device, e.g. tap gestures based on pressure sensed by a digitiser
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06FELECTRIC DIGITAL DATA PROCESSING
    • G06F3/00Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
    • G06F3/01Input arrangements or combined input and output arrangements for interaction between user and computer
    • G06F3/048Interaction techniques based on graphical user interfaces [GUI]
    • G06F3/0487Interaction techniques based on graphical user interfaces [GUI] using specific features provided by the input device, e.g. functions controlled by the rotation of a mouse with dual sensing arrangements, or of the nature of the input device, e.g. tap gestures based on pressure sensed by a digitiser
    • G06F3/0488Interaction techniques based on graphical user interfaces [GUI] using specific features provided by the input device, e.g. functions controlled by the rotation of a mouse with dual sensing arrangements, or of the nature of the input device, e.g. tap gestures based on pressure sensed by a digitiser using a touch-screen or digitiser, e.g. input of commands through traced gestures
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06FELECTRIC DIGITAL DATA PROCESSING
    • G06F3/00Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
    • G06F3/01Input arrangements or combined input and output arrangements for interaction between user and computer
    • G06F3/048Interaction techniques based on graphical user interfaces [GUI]
    • G06F3/0487Interaction techniques based on graphical user interfaces [GUI] using specific features provided by the input device, e.g. functions controlled by the rotation of a mouse with dual sensing arrangements, or of the nature of the input device, e.g. tap gestures based on pressure sensed by a digitiser
    • G06F3/0488Interaction techniques based on graphical user interfaces [GUI] using specific features provided by the input device, e.g. functions controlled by the rotation of a mouse with dual sensing arrangements, or of the nature of the input device, e.g. tap gestures based on pressure sensed by a digitiser using a touch-screen or digitiser, e.g. input of commands through traced gestures
    • G06F3/04883Interaction techniques based on graphical user interfaces [GUI] using specific features provided by the input device, e.g. functions controlled by the rotation of a mouse with dual sensing arrangements, or of the nature of the input device, e.g. tap gestures based on pressure sensed by a digitiser using a touch-screen or digitiser, e.g. input of commands through traced gestures for inputting data by handwriting, e.g. gesture or text

Definitions

  • the present application relates to the field of communications, and in particular, to a human-computer interaction method and device, and a mobile device.
  • Mobile devices such as mobile phones and tablets play an important role in people's daily lives.
  • Mobile phones in particular, as the main means of long-distance communication, are almost indispensable personal items for everyone.
  • With the continuous development of mobile phone technology, the human-computer interaction of mobile phones has become more and more intelligent. From the initial buttons to the current touch screen, engineers have been working to make human-machine interaction more convenient and more user-friendly.
  • In one related technique, human-computer interaction is realized by the user operating a touch screen.
  • This approach provides operation entrances on the interface in advance through a software program (for example, by drawing a button or an icon).
  • By tapping the corresponding icon or position on the touch screen, the user lets the machine "understand" his or her intention.
  • Under this interaction mode, the steps the user must perform to open the camera for a snapshot are, in order: take out the phone; press the power button; swipe up or down to find the camera icon; tap the camera icon to start the camera; and aim the lens at the scene to be captured.
  • Although the touch screen is quite convenient, it also has the following defects: because different interfaces provide different operation entrances, the entrance can be hard to find; false touches occur easily; the finger blocks the screen during operation, obstructing the line of sight; and as touch screens grow ever larger, a finger must leave the phone's frame during one-handed operation and travel across the screen, which can easily cause the phone to be dropped.
  • In another related technique, voice recognition is used to acquire the user's instruction, and the machine then acts according to that instruction.
  • The drawbacks of this approach are that users' accents vary widely, so the correct recognition rate of speech recognition is low; the algorithms are complicated, software development costs are high, and processor overhead is large.
  • The purpose of the present application is to provide a human-computer interaction method and device, and a mobile device, which improve the intelligence of the mobile device and improve the user experience.
  • An embodiment of the present application provides a human-computer interaction method, including: receiving sensing information of a mobile device on a user action, where the sensing information includes touch sensing information generated by a touch sensor located at the border of the mobile device; analyzing the sensing information to learn the user's current gesture action; searching, from a saved correspondence between usage scenarios and gesture actions, for a usage scenario matching the current gesture action; and providing a service corresponding to the found usage scenario.
  • Further, the above method includes: when no matching usage scenario is found, recording the usage scenario actively enabled by the user under the current gesture action; and binding that usage scenario and the current gesture action into a set of correspondences, added to the correspondence between usage scenarios and gesture actions.
  • The sensing information includes any one or more of rotational angular velocity information, distance information, and fingerprint information.
  • the touch sensing information includes finger coordinate information and/or pressure information
  • the mobile device is a mobile phone.
  • The human-computer interaction method of the embodiments of the present application can actively "understand" the user's operation intention and respond quickly and accurately, making the mobile device smarter and more user-friendly and improving the user experience.
  • the embodiment of the present application further provides a human-machine interaction device, including:
  • a receiving module configured to receive, by the mobile device, sensing information of a user action, where the sensing information includes touch sensing information generated by a touch sensor located at a border of the mobile device;
  • an analysis module configured to analyze the sensing information received by the receiving module to learn the user's current gesture action;
  • a matching module configured to search for a usage scenario that matches a current gesture action learned by the analysis module from a correspondence between the saved usage scenario and the gesture action;
  • the service module is configured to provide a service corresponding to the usage scenario found by the matching module.
  • the above device comprises:
  • a recording module configured to record a usage scenario actively activated by the user under the current gesture action when a usage scenario matching the current gesture action is not found in the corresponding relationship between the saved usage scenario and the gesture action;
  • the adding module is configured to bind the actively enabled usage scenario and the current gesture action into a set of correspondences, and add to the corresponding relationship between the usage scenario and the gesture action.
  • the sensing information includes any one or more of rotational angular velocity information, distance information, and fingerprint information
  • the touch sensing information includes finger coordinate information and/or pressure information
  • the mobile device is a mobile phone.
  • The human-machine interaction device of the embodiments of the present application can actively "understand" the user's operation intention and respond quickly and accurately, making the mobile device smarter and more user-friendly and improving the user experience.
  • an embodiment of the present application further provides a mobile device, including a touch sensor located on a border of the mobile device, and the human-machine interaction device according to any one of the preceding items, wherein:
  • the touch sensor is configured to sense a touch of a user, generate touch sensing information, and send the touch sensing information to the human-machine interaction device;
  • the human-machine interaction device is configured to receive sensing information of the mobile device on the user action, and provide a corresponding service according to the sensing information, where the sensing information includes the touch sensing information.
  • the mobile device is a mobile phone.
  • an embodiment of the present application further provides an electronic device, including:
  • at least one processor; and a memory communicatively connected to the at least one processor; wherein
  • the memory stores instructions executable by the at least one processor, the instructions being executed by the at least one processor to enable the at least one processor to:
  • receive sensing information of the electronic device on a user action, where the sensing information includes touch sensing information generated by a touch sensor located at the border of the electronic device; analyze the sensing information to learn the user's current gesture action; search, from a saved correspondence between usage scenarios and gesture actions, for a usage scenario matching the current gesture action; and provide a service corresponding to the found usage scenario.
  • An embodiment of the present application further provides a non-volatile computer storage medium storing computer-executable instructions, where the computer-executable instructions are set to:
  • receive sensing information of the electronic device on a user action, where the sensing information includes touch sensing information generated by a touch sensor located at the border of the electronic device; analyze the sensing information to learn the user's current gesture action; search for a matching usage scenario; and provide the service corresponding to the found usage scenario.
  • An embodiment of the present application further provides a computer program product, comprising a computer program stored on a non-volatile computer-readable storage medium, the computer program comprising program instructions.
  • When the program instructions are executed by a computer, the computer is caused to execute the human-computer interaction method described in any of the above embodiments.
  • The mobile device of the embodiments of the present application can actively "understand" the user's operation intention and respond quickly and accurately, making the mobile device smarter and more user-friendly and improving the user experience.
  • FIG. 1 is a flowchart of a human-computer interaction method in Embodiment 1 of the present application;
  • FIG. 2 is a flowchart of a human-computer interaction method in Embodiment 2 of the present application;
  • FIG. 3 is a structural block diagram of a human-machine interaction device in Embodiment 3 of the present application.
  • FIG. 4 is a structural block diagram of a human-machine interaction device in Embodiment 4 of the present application.
  • FIG. 5 is a structural block diagram of a mobile device according to Embodiment 5 of the present application.
  • FIG. 6 is a structural block diagram of an electronic device according to Embodiment 6 of the present application.
  • FIG. 1 is a flowchart of a human-computer interaction method in Embodiment 1 of the present application. As shown in FIG. 1, in this embodiment the human-computer interaction method may include the following steps:
  • Step S101 Receive sensing information of a mobile device on a user action, where the sensing information includes touch sensing information generated by a touch sensor located on a border of the mobile device;
  • the user here refers to the user of the mobile device, that is, the mobile device user.
  • the mobile device may be a mobile phone or other smart device that can be operated by one hand.
  • the mobile device is provided with a gyroscope, a fingerprint sensor, a distance sensor, and a touch sensor located on the border of the mobile device.
  • the sensing information may include any one or more of rotational angular velocity information, distance information, fingerprint information, and the like, and the touch sensing information may include one or both of finger coordinate information and pressure information.
  • the rotational angular velocity information is generated by the gyroscope
  • the fingerprint information is generated by the fingerprint sensor
  • the distance information is generated by the distance sensor.
  • Touch-sensing information is generated by a touch sensor located on the border of the mobile device.
  • a touch sensor can be placed on all four borders of the mobile device.
  • the touch sensor can sense the touch of the user's finger, generate finger coordinate information, pressure information, and the like.
  • Step S102: analyze the sensing information received in step S101 to learn the user's current gesture action;
  • Each gesture action of the user usually corresponds to some relatively fixed sensing information generated by the mobile device.
  • For example, when the user walks while holding the phone in one hand, the phone's border generates finger coordinate information (produced by the touch sensor located at the border), and because the phone sways while the user walks, the gyroscope in the phone senses the swinging and generates rotational angular velocity information.
  • Therefore, if the received sensing information is "rotational angular velocity information + finger coordinate information", the corresponding user gesture action can be identified as walking while holding the phone in one hand.
  • Step S103 Searching for a usage scenario that matches the current gesture action from the corresponding relationship between the saved usage scenario and the gesture action;
  • The correspondence between usage scenarios and gesture actions is preset in the mobile device. Taking the above example of the user walking while holding the phone in one hand: in the correspondence preset in the phone, the usage scenario corresponding to this gesture action is walking.
  • Step S104 providing a service corresponding to the found usage scenario.
  • For example, suppose the service corresponding to the usage scenario "walking" is putting the phone into deep sleep. Then, when the found usage scenario is walking, the present application automatically puts the phone into deep sleep to reduce its power consumption, without requiring the user to do so through manual settings.
  • As another example, when the user holds the phone with both hands, the sensing information received by the phone includes the finger coordinate information and pressure information generated at the phone's border, the distance information generated by the distance sensor in the phone (the phone going from covered to uncovered), and the fingerprint information generated by the fingerprint sensor.
  • The usage scenario matching this gesture is snapshot, and the service corresponding to snapshot is turning on the camera. Therefore, after the usage scenario is found to be snapshot, the camera is automatically turned on, putting the phone in a state ready to capture.
  • With the human-computer interaction method of the present application, the user thus only needs to hold the phone with both hands to switch automatically to camera mode, without the intermediate steps of finding the camera icon and tapping it to start the photo function. This simplifies the user's operation and improves the user experience, which is owed to the method's ability to actively "understand" the user's intention.
  • Likewise, when the user's current gesture action is identified as a thumb sliding up and down along the phone's border, the matching usage scenario is browsing the content on the screen, and the content is scrolled following the thumb's movement.
  • The user can thus scroll the content on the phone's screen without moving a finger onto the screen, avoiding the inconvenience caused by the finger blocking the screen.
  • The human-computer interaction method of the embodiments of the present application can actively "understand" the user's operation intention, respond quickly and accurately, and rapidly switch the mobile device into the usage scenario mode the user wants, making the mobile device more user-friendly and more intelligent. Moreover, with this method many operations that currently can only be performed on the touch screen or on physical buttons can be completed on the border of the mobile device, without blocking the screen and without large finger movements, which can greatly improve the user experience.
  • FIG. 2 is a flowchart of a human-computer interaction method in Embodiment 2 of the present application. As shown in FIG. 2, in this embodiment, the human-computer interaction method may include the following steps:
  • Step S201 Receive sensing information of a mobile device on a user action, where the sensing information includes touch sensing information generated by a touch sensor located on a border of the mobile device;
  • Step S202 analyzing the sensing information to obtain the current gesture action of the user
  • Step S203: determine whether the saved correspondence between usage scenarios and gesture actions contains a usage scenario matching the current gesture action; if so, execute step S204, otherwise execute step S206;
  • Step S204 Searching for a usage scenario that matches the current gesture action from the corresponding relationship between the saved usage scenario and the gesture action;
  • Step S205 providing a service corresponding to the found usage scenario, and ending;
  • Step S206 recording a usage scenario that the user actively activates under the current gesture action
  • Step S207 Bind the actively activated usage scenario and the current gesture action into a set of correspondences, add to the corresponding relationship between the usage scenario and the gesture action, and end.
  • Through step S207, the user's habitual gesture and the usage scenario enabled under that gesture are recorded. The next time the user makes this habitual gesture, the phone can automatically start the corresponding usage scenario, without the user having to start it manually.
  • The human-computer interaction method of the embodiments of the present application can actively "understand" the user's operation intention, respond quickly and accurately, and rapidly switch the mobile device into the usage scenario mode the user wants, making the mobile device more user-friendly and more intelligent; many operations that currently can only be performed on the touch screen or on physical buttons can be completed on the border of the mobile device, without blocking the screen and without large finger movements, which can greatly improve the user experience.
  • In addition, the method can establish correspondences between the user's habitual gestures and usage scenarios, so that on the next use a habitual gesture switches the device to the corresponding usage scenario mode, realizing personalized operation that matches the user's individual habits and improving user satisfaction.
  • FIG. 3 is a structural block diagram of a human-machine interaction apparatus according to Embodiment 3 of the present application.
  • the human-machine interaction device of the embodiment shown in FIG. 3 can be used to implement the human-computer interaction method of the embodiment shown in FIG. 1.
  • the human-machine interaction apparatus 300 can include a receiving module 310, an analysis module 320, a matching module 330, and a service module 340.
  • the receiving module 310 is configured to receive sensing information of the mobile device on the user action, where the sensing information includes touch sensing information generated by a touch sensor located on the border of the mobile device.
  • the analysis module 320 is configured to analyze the sensing information received by the receiving module 310 to learn the current gesture action of the user.
  • the matching module 330 is configured to search for a usage scenario that matches the current gesture action learned by the analysis module 320 from the corresponding relationship between the saved usage scenario and the gesture action.
  • the service module 340 is configured to provide a service corresponding to the usage scenario found by the matching module 330.
  • the mobile device may be a mobile phone or other smart device that can be operated by one hand.
  • The sensing information may include any one or more of rotational angular velocity information, distance information, fingerprint information, and the like, and the touch sensing information may include one or both of finger coordinate information and pressure information.
  • The human-machine interaction device of the embodiments of the present application can actively "understand" the user's operation intention, respond quickly and accurately, and rapidly switch the mobile device into the usage scenario mode the user wants, making the mobile device more user-friendly and more intelligent.
  • Many operations that currently can only be performed on the touch screen or on physical buttons can be completed on the border of the mobile device, without blocking the screen and without large finger movements, which can greatly improve the user experience.
  • FIG. 4 is a structural block diagram of a human-machine interaction device in Embodiment 4 of the present application.
  • the human-machine interaction device of the embodiment shown in FIG. 4 can be used to implement the human-computer interaction method of the embodiment shown in FIG. 2.
  • the human-machine interaction apparatus 300 may include a receiving module 310, an analysis module 320, a matching module 330, a service module 340, a recording module 350, and an adding module 360.
  • the functions of the receiving module 310, the analyzing module 320, the matching module 330, and the service module 340 are the same as those of the corresponding modules in the foregoing embodiment shown in FIG. 3, and details are not described herein again.
  • the recording module 350 is configured to record a usage scenario that is actively activated by the user under the current gesture action when the usage scenario that matches the current gesture action learned by the analysis module 320 is not found in the corresponding relationship between the saved usage scenario and the gesture action.
  • the adding module 360 is configured to bind the user-activated usage scenario recorded by the recording module 350 to the current gesture action into a set of correspondences, and add to the corresponding relationship between the usage scenario and the gesture action.
  • the sensing information may include any one or more of rotational angular velocity information, distance information, fingerprint information, and the like, and the touch sensing information may include one or both of finger coordinate information and pressure information.
  • the mobile device may be a mobile phone or other smart device that can be operated by one hand.
  • The human-machine interaction device of the embodiments of the present application can actively "understand" the user's operation intention, respond quickly and accurately, and rapidly switch the mobile device into the usage scenario mode the user wants, making the mobile device more user-friendly and more intelligent; many operations that currently can only be performed on the touch screen or on physical buttons can be completed on the border of the mobile device, without blocking the screen and without large finger movements, which can greatly improve the user experience.
  • The device can also establish correspondences between the user's habitual gestures and usage scenarios, so that on the next use a habitual gesture switches the device to the corresponding usage scenario mode, realizing personalized operation that matches the user's individual habits and improving user satisfaction.
  • FIG. 5 is a structural block diagram of a mobile device according to Embodiment 5 of the present application.
  • The mobile device 500 includes a touch sensor 400 located at the border of the mobile device and a human-machine interaction device 300.
  • The human-machine interaction device 300 can be the human-machine interaction device of any of the foregoing embodiments of the present application.
  • The touch sensor 400 is configured to sense the user's touch, generate touch sensing information, and send the touch sensing information to the human-machine interaction device 300.
  • The human-machine interaction device 300 is configured to receive sensing information of the mobile device on the user action and provide a corresponding service according to the sensing information, where the sensing information includes the touch sensing information generated by the touch sensor 400.
  • Specifically, the human-machine interaction device 300 can be configured to: receive touch sensing information of the mobile device's border, generated by the touch sensor 400 located at the border; analyze the received touch sensing information to learn the user's current gesture action; search, from the saved correspondence between usage scenarios and gesture actions, for a usage scenario matching the current gesture action; and provide a service corresponding to the found usage scenario.
  • the mobile device can be a mobile phone or other smart device that can be operated by one hand.
  • The mobile device of the embodiments of the present application can actively "understand" the user's operation intention, respond quickly and accurately, and rapidly switch into the usage scenario mode the user wants, making the mobile device more user-friendly and smarter. Moreover, with this mobile device many operations that currently can only be performed on the touch screen or on physical buttons can be completed on the border of the device, without blocking the screen and without large finger movements, which can greatly improve the user experience.
  • FIG. 6 is a structural block diagram of an electronic device according to Embodiment 6 of the present application. As shown in FIG. 6, the electronic device 600 includes:
  • one or more processors 610 and a memory 620; one processor 610 is taken as an example in FIG. 6.
  • The electronic device 600 that executes the human-computer interaction method may further include a touch sensor 400.
  • The processor 610, the memory 620, and the touch sensor 400 may be connected by a bus or in other ways; connection by a bus is taken as an example in FIG. 6.
  • As a non-volatile computer-readable storage medium, the memory 620 can be used to store non-volatile software programs, non-volatile computer-executable programs, and modules, such as the program instructions/modules corresponding to the human-computer interaction method in the embodiments of the present application (for example, the receiving module 310, analysis module 320, matching module 330, and service module 340 of the human-machine interaction device 300 shown in FIG. 3; and the receiving module 310, analysis module 320, matching module 330, service module 340, recording module 350, and adding module 360 shown in FIG. 4).
  • The processor 610 runs the non-volatile software programs, instructions, and modules stored in the memory 620, thereby executing the various functional applications and data processing of the electronic device 600, that is, implementing the human-computer interaction method of the above method embodiments and the functions of the modules of the above device embodiments.
  • The memory 620 can include high-speed random access memory and can also include non-volatile memory, such as at least one magnetic disk storage device, flash memory device, or other non-volatile solid-state storage device. In some embodiments, the memory 620 can optionally include memory located remotely relative to the processor 610; such remote memory can be connected to the processor 610 via a network. Examples of such networks include, but are not limited to, the Internet, intranets, local area networks, mobile communication networks, and combinations thereof.
  • the touch sensor 400 is configured to sense a user's touch, generate touch sensing information, and send the touch sensing information to the processor 610.
  • The program instructions/modules are stored in the memory 620 and, when executed by the one or more processors 610, perform the human-computer interaction method in any of the above method embodiments, for example, method steps S101 to S104 in FIG. 1 and method steps S201 to S207 in FIG. 2; the functions of modules 310-340 in FIG. 3 and modules 310-360 in FIG. 4 can also be implemented.
  • the electronic device 600 of the embodiment of the present application exists in various forms, including but not limited to:
  • Mobile communication devices: these devices are characterized by mobile communication functions and are mainly aimed at providing voice and data communication. Such terminals include smart phones (such as the iPhone), multimedia phones, feature phones, and low-end phones.
  • Ultra-mobile personal computer devices: these devices belong to the category of personal computers and have computing and processing functions, generally also with mobile Internet access. Such terminals include PDAs, MIDs, and UMPC devices, such as the iPad.
  • Portable entertainment devices: these devices can display and play multimedia content. Such devices include audio and video players (such as the iPod), handheld game consoles, e-book readers, smart toys, and portable car navigation devices.
  • Embodiment 7 of the present application provides a non-volatile computer storage medium storing computer-executable instructions. The computer-executable instructions can be executed by one or more processors (for example, a processor 610) to cause those processors to perform the human-computer interaction method in any of the above method embodiments, for example, method steps S101 to S104 in FIG. 1 and method steps S201 to S207 in FIG. 2; the functions of modules 310-340 in FIG. 3 and modules 310-360 in FIG. 4 can also be implemented.
  • The device embodiments described above are merely illustrative. Units described as separate components may or may not be physically separate, and components shown as units may or may not be physical units; that is, they may be located in one place or distributed across multiple network units. Some or all of the modules may be selected according to actual needs to achieve the purpose of the solution of the embodiment.
  • the storage medium may be a magnetic disk, an optical disk, a read-only memory (ROM), or a random access memory (RAM).

Landscapes

  • Engineering & Computer Science (AREA)
  • General Engineering & Computer Science (AREA)
  • Theoretical Computer Science (AREA)
  • Human Computer Interaction (AREA)
  • Physics & Mathematics (AREA)
  • General Physics & Mathematics (AREA)
  • Telephone Function (AREA)
  • User Interface Of Digital Computer (AREA)

Abstract

The present application relates to a human-computer interaction method, apparatus, and mobile device. The human-computer interaction method comprises: receiving sense information of a mobile device with respect to a user movement, said sense information comprising touch sense information generated by a touch sensor located on the edge of said mobile device; analyzing said sense information to learn the current hand-gesture movement of the user; in the stored correlation between usage context and hand-gesture movement, searching for a usage context matching said current hand-gesture movement; providing a service corresponding to the found usage context. In the present application, it is possible to "understand" the intention of a user's operation and quickly and accurately provide a response, such that the mobile device is more intelligent and more human-like, improving user experience.

Description

Human-computer interaction method and device, and mobile device

Cross-reference to related applications

This application claims priority to Chinese Patent Application No. 201510920862.0, entitled "Human-computer interaction method and device, and mobile device", filed with the Chinese Patent Office on December 11, 2015, the entire contents of which are incorporated herein by reference.

Technical field

The present application relates to the field of communications, and in particular to a human-computer interaction method and device, and a mobile device.

Background

Mobile devices such as mobile phones and tablets play an important role in people's daily lives. Mobile phones in particular, as the main means of long-distance communication, are almost indispensable personal items for everyone. With the continuous development of mobile phone technology, the human-computer interaction of mobile phones has become more and more intelligent. From the initial buttons to the current touch screen, engineers have been working to make human-machine interaction more convenient and more user-friendly.

In one related technique, human-computer interaction is realized by the user operating a touch screen. This approach provides operation entrances on the interface in advance through a software program (for example, by drawing a button or an icon); by tapping the corresponding icon or position on the touch screen, the user lets the machine "understand" his or her intention, thereby achieving human-computer interaction. Under this interaction mode, the steps the user must perform to open the camera for a snapshot are, in order: take out the phone; press the power button; swipe up or down to find the camera icon; tap the camera icon to start the camera; and aim the lens at the scene to be captured. Although the touch screen is quite convenient, it also has the following defects: because different interfaces provide different operation entrances, the entrance can be hard to find; false touches occur easily; the finger blocks the screen during operation, obstructing the line of sight; and as touch screens grow ever larger, a finger must leave the phone's frame during one-handed operation and travel across the screen, which can easily cause the phone to be dropped.

In another related technique, voice recognition is used to acquire the user's instruction, and the machine then acts according to that instruction. The drawbacks of this approach are that users' accents vary widely, so the correct recognition rate of speech recognition is low; and the algorithms are complicated, software development costs are high, and processor overhead is large.

Summary of the invention

The purpose of the present application is to provide a human-computer interaction method and device, and a mobile device, which improve the intelligence of the mobile device and improve the user experience.

To achieve the above purpose, an embodiment of the present application provides a human-computer interaction method, including:

receiving sensing information of a mobile device on a user action, the sensing information including touch sensing information generated by a touch sensor located at the border of the mobile device;

analyzing the sensing information to learn the user's current gesture action;

searching, from a saved correspondence between usage scenarios and gesture actions, for a usage scenario matching the current gesture action;

providing a service corresponding to the found usage scenario.

Further, the above method includes:

when no usage scenario matching the current gesture action is found in the saved correspondence between usage scenarios and gesture actions, recording the usage scenario actively enabled by the user under the current gesture action;

binding the actively enabled usage scenario and the current gesture action into a set of correspondences, and adding it to the correspondence between usage scenarios and gesture actions.

Further, the sensing information includes any one or more of rotational angular velocity information, distance information, and fingerprint information, and the touch sensing information includes finger coordinate information and/or pressure information.

Further, the mobile device is a mobile phone.

The human-computer interaction method of the embodiments of the present application can actively "understand" the user's operation intention and respond quickly and accurately, making the mobile device smarter and more user-friendly and improving the user experience.

To achieve the above purpose, an embodiment of the present application further provides a human-machine interaction device, including:

a receiving module, configured to receive sensing information of a mobile device on a user action, the sensing information including touch sensing information generated by a touch sensor located at the border of the mobile device;

an analysis module, configured to analyze the sensing information received by the receiving module to learn the user's current gesture action;

a matching module, configured to search, from a saved correspondence between usage scenarios and gesture actions, for a usage scenario matching the current gesture action learned by the analysis module;

a service module, configured to provide a service corresponding to the usage scenario found by the matching module.

Further, the above device includes:

a recording module, configured to record the usage scenario actively enabled by the user under the current gesture action when no usage scenario matching the current gesture action is found in the saved correspondence between usage scenarios and gesture actions;

an adding module, configured to bind the actively enabled usage scenario and the current gesture action into a set of correspondences and add it to the correspondence between usage scenarios and gesture actions.

Further, the sensing information includes any one or more of rotational angular velocity information, distance information, and fingerprint information, and the touch sensing information includes finger coordinate information and/or pressure information.

Further, the mobile device is a mobile phone.

The human-machine interaction device of the embodiments of the present application can actively "understand" the user's operation intention and respond quickly and accurately, making the mobile device smarter and more user-friendly and improving the user experience.

To achieve the above purpose, an embodiment of the present application further provides a mobile device, including a touch sensor located at the border of the mobile device and the human-machine interaction device according to any one of the preceding items, wherein:

the touch sensor is configured to sense the user's touch, generate touch sensing information, and send the touch sensing information to the human-machine interaction device;

the human-machine interaction device is configured to receive sensing information of the mobile device on the user action and provide a corresponding service according to the sensing information, the sensing information including the touch sensing information.

Further, the mobile device is a mobile phone.

To achieve the above purpose, an embodiment of the present application further provides an electronic device, including:

at least one processor; and

a memory communicatively connected to the at least one processor; wherein

the memory stores instructions executable by the at least one processor, the instructions being executed by the at least one processor to enable the at least one processor to:

receive sensing information of the electronic device on a user action, the sensing information including touch sensing information generated by a touch sensor located at the border of the electronic device;

analyze the sensing information to learn the user's current gesture action;

search, from a saved correspondence between usage scenarios and gesture actions, for a usage scenario matching the current gesture action;

provide a service corresponding to the found usage scenario.

To achieve the above purpose, an embodiment of the present application further provides a non-volatile computer storage medium storing computer-executable instructions, the computer-executable instructions being set to:

receive sensing information of the electronic device on a user action, the sensing information including touch sensing information generated by a touch sensor located at the border of the electronic device;

analyze the sensing information to learn the user's current gesture action;

search, from a saved correspondence between usage scenarios and gesture actions, for a usage scenario matching the current gesture action;

provide a service corresponding to the found usage scenario.

To achieve the above purpose, an embodiment of the present application further provides a computer program product, the computer program product comprising a computer program stored on a non-volatile computer-readable storage medium, the computer program comprising program instructions which, when executed by a computer, cause the computer to execute the human-computer interaction method described in any of the above embodiments.

The mobile device of the embodiments of the present application can actively "understand" the user's operation intention and respond quickly and accurately, making the mobile device smarter and more user-friendly and improving the user experience.

Brief description of the drawings

One or more embodiments are illustrated by the figures in the corresponding drawings. These illustrations do not constitute a limitation of the embodiments; elements with the same reference numerals in the drawings denote similar elements, and unless otherwise stated the figures are not drawn to scale.

FIG. 1 is a flowchart of a human-computer interaction method in Embodiment 1 of the present application;

FIG. 2 is a flowchart of a human-computer interaction method in Embodiment 2 of the present application;

FIG. 3 is a structural block diagram of a human-machine interaction device in Embodiment 3 of the present application;

FIG. 4 is a structural block diagram of a human-machine interaction device in Embodiment 4 of the present application;

FIG. 5 is a structural block diagram of a mobile device in Embodiment 5 of the present application;

FIG. 6 is a structural block diagram of an electronic device in Embodiment 6 of the present application.

Detailed description

The principles and features of the present application are described below with reference to the accompanying drawings. The embodiments given are only used to explain the present application, not to limit its scope. All embodiments obtained according to the spirit of the present application by a person of ordinary skill in the art without creative effort fall within the protection scope of the present application.

FIG. 1 is a flowchart of a human-computer interaction method in Embodiment 1 of the present application. As shown in FIG. 1, in this embodiment the human-computer interaction method may include the following steps:

Step S101: receive sensing information of a mobile device on a user action, the sensing information including touch sensing information generated by a touch sensor located at the border of the mobile device.

It should be noted that the user here refers to the user of the mobile device.

The mobile device may be a mobile phone or another smart device that can be operated with one hand. The mobile device is provided with a gyroscope, a fingerprint sensor, a distance sensor, and a touch sensor located at the border of the mobile device.

The sensing information may include any one or more of rotational angular velocity information, distance information, fingerprint information, and the like, and the touch sensing information may include one or both of finger coordinate information and pressure information. The rotational angular velocity information is generated by the gyroscope, the fingerprint information by the fingerprint sensor, and the distance information by the distance sensor.

The touch sensing information is generated by the touch sensor located at the border of the mobile device. Touch sensors can be placed on all four borders of the mobile device. A touch sensor senses the touch of the user's fingers and generates finger coordinate information, pressure information, and the like.
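
The application does not prescribe a concrete data format for this sensing information. As a minimal sketch of how the combined readings might be modeled (all field names here are illustrative assumptions, not part of the patent):

```python
from dataclasses import dataclass, field
from typing import List, Optional, Tuple

@dataclass
class TouchSample:
    """One reading from a border touch sensor (hypothetical fields)."""
    border: str                          # which border: "left", "right", "top", "bottom"
    finger_coords: Tuple[float, float]   # finger coordinate information along the border
    pressure: Optional[float] = None     # pressure information, if the sensor reports it

@dataclass
class SensingInfo:
    """Aggregated sensing information the mobile device reports for one user action."""
    angular_velocity: Optional[Tuple[float, float, float]] = None  # from the gyroscope
    distance: Optional[float] = None      # from the distance sensor
    fingerprint_id: Optional[str] = None  # from the fingerprint sensor
    touches: List[TouchSample] = field(default_factory=list)  # from border touch sensors
```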

Based on the information generated by the touch sensor and the other sensing information of the mobile device on the user action, the user's current gesture action can be analyzed.

Step S102: analyze the sensing information received in step S101 to learn the user's current gesture action.

Each gesture action of the user usually corresponds to some relatively fixed sensing information generated by the mobile device. For example, when the user walks while holding the phone in one hand, the phone's border generates finger coordinate information (produced by the touch sensor located at the border), and because the phone sways while the user walks, the gyroscope in the phone senses the swinging and generates rotational angular velocity information. Therefore, if the received sensing information is "rotational angular velocity information + finger coordinate information", the corresponding user gesture action can be identified as walking while holding the phone in one hand.
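
Continuing the hypothetical SensingInfo sketch above, a rule of the form "combination of sensing information → gesture action" could be expressed roughly as follows (the gesture labels are invented for illustration):

```python
from typing import Optional

def infer_gesture(info: SensingInfo) -> Optional[str]:
    """Map a combination of sensing information to a gesture-action label (step S102)."""
    has_touch = bool(info.touches)

    # Finger coordinates + pressure + distance + fingerprint information
    # -> holding the phone with both hands (see Example 1 below).
    if has_touch and info.distance is not None and info.fingerprint_id is not None:
        return "two_hand_hold"

    # "Rotational angular velocity information + finger coordinate information"
    # -> walking while holding the phone in one hand.
    if has_touch and info.angular_velocity is not None:
        return "one_hand_walking"

    return None  # unknown gesture; Embodiment 2 describes learning new gestures
```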

Step S103: search, from the saved correspondence between usage scenarios and gesture actions, for a usage scenario matching the current gesture action.

The correspondence between usage scenarios and gesture actions is preset in the mobile device. Taking the above example of the user walking while holding the phone in one hand: in the correspondence preset in the phone, the usage scenario corresponding to this gesture action is walking.

Step S104: provide a service corresponding to the found usage scenario.

For example, suppose the service corresponding to the usage scenario "walking" is putting the phone into deep sleep. Then, when the found usage scenario is walking, the present application automatically puts the phone into deep sleep to reduce its power consumption, without requiring the user to do so through manual settings.
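
Steps S103 and S104 then amount to a table lookup followed by a dispatch. A minimal sketch under the same assumptions (the scenario names and the placeholder service functions are invented for illustration, not taken from the patent):

```python
def enter_deep_sleep() -> None:
    print("entering deep sleep")       # placeholder for the real power-saving service

def open_camera() -> None:
    print("camera opened")             # placeholder: put the phone in a capture-ready state

def enable_border_scrolling() -> None:
    print("border scrolling enabled")  # placeholder for the content-browsing service

# Preset correspondence between gesture actions and usage scenarios (step S103).
GESTURE_TO_SCENARIO = {
    "one_hand_walking": "walking",
    "two_hand_hold": "snapshot",
    "thumb_slide_border": "browse_screen_content",
}

# Correspondence between usage scenarios and the services they trigger (step S104).
SCENARIO_TO_SERVICE = {
    "walking": enter_deep_sleep,
    "snapshot": open_camera,
    "browse_screen_content": enable_border_scrolling,
}

def handle_sensing_info(info: SensingInfo) -> None:
    gesture = infer_gesture(info)                # step S102
    scenario = GESTURE_TO_SCENARIO.get(gesture)  # step S103
    if scenario is not None:
        SCENARIO_TO_SERVICE[scenario]()          # step S104
```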

The human-computer interaction method of the present application is further described in detail below through specific application examples.

Example 1

When the user holds the phone with both hands, the sensing information received by the phone includes the finger coordinate information and pressure information generated at the phone's border, the distance information generated by the distance sensor in the phone (the phone going from covered to uncovered), and the fingerprint information generated by the fingerprint sensor.

By analyzing the above sensing information, the user's current gesture action is identified as holding the phone with both hands.

From the saved correspondence between usage scenarios and gesture actions, the usage scenario found to match the gesture of holding the phone with both hands is snapshot.

The service corresponding to snapshot is turning on the camera. Therefore, after the usage scenario is found to be snapshot, the camera is automatically turned on, putting the phone in a state ready to capture.

It can be seen that with the human-computer interaction method of the present application, the user only needs to hold the phone with both hands to switch automatically to the camera's photo mode, without the intermediate steps of finding the camera icon and tapping it to start the photo function. This simplifies the user's operation and improves the user experience, which is owed to the method's ability to actively "understand" the user's intention.

Example 2

When the user slides a thumb up and down along the phone's border, pressure information from the border (generated by the touch sensor), finger acceleration information (generated by the gyroscope), and finger fingerprint information (generated by the fingerprint sensor) are received.

By analyzing this touch sensing information, the user's current gesture action is identified as a thumb sliding up and down along the phone's border.

From the saved correspondence between usage scenarios and gesture actions, the usage scenario matching this gesture is browsing the content on the phone's screen.

The service corresponding to this browsing function is provided: the content on the phone's screen is scrolled up and down following the thumb's sliding along the border.

In this example, the user can scroll and browse the content on the phone's screen without moving a finger onto the screen, avoiding the inconvenience caused by the finger blocking the screen.
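
The patent does not spell out how the scrolling service tracks the thumb; one plausible sketch, with invented names and a simple proportional mapping between slide distance and scroll offset, is:

```python
from typing import Optional

class BorderScroller:
    """Scroll on-screen content following the thumb's position on the border."""

    def __init__(self, scroll_view) -> None:
        self.scroll_view = scroll_view       # hypothetical scrollable view object
        self.last_y: Optional[float] = None  # last finger coordinate along the border

    def on_border_touch(self, finger_y: float) -> None:
        """Called for each finger coordinate reported by the border touch sensor."""
        if self.last_y is not None:
            delta = finger_y - self.last_y     # downward slide -> positive delta
            self.scroll_view.scroll_by(delta)  # assumed view API
        self.last_y = finger_y

    def on_release(self) -> None:
        self.last_y = None  # reset between slides
```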

The human-computer interaction method of the embodiments of the present application can actively "understand" the user's operation intention, respond quickly and accurately, and rapidly switch the mobile device into the usage scenario mode the user wants, making the mobile device more user-friendly and more intelligent. Moreover, with this method many operations that currently can only be performed on the touch screen or on physical buttons can be completed on the border of the mobile device, without blocking the screen and without large finger movements, which can greatly improve the user experience.

FIG. 2 is a flowchart of the human-computer interaction method in Embodiment 2 of the present application. As shown in FIG. 2, the method in this embodiment may include the following steps:

Step S201: receive the mobile device's sensing information about a user action, where the sensing information includes touch sensing information generated by a touch sensor located on the frame of the mobile device;

Step S202: analyze the sensing information to learn the user's current gesture;

Step S203: determine whether the saved correspondence between usage scenarios and gestures contains a usage scenario matching the current gesture; if so, go to step S204, otherwise go to step S206;

Step S204: from the saved correspondence between usage scenarios and gestures, find the usage scenario matching the current gesture;

Step S205: provide the service corresponding to the found usage scenario, then end;

Step S206: record the usage scenario the user actively starts under the current gesture;

Step S207: bind the actively started usage scenario to the current gesture as a correspondence pair, add it to the correspondence between usage scenarios and gestures, then end.

Through step S207, a user's habitual gesture and the usage scenario started under it can be recorded, so that the next time the user makes that gesture, the phone can start the corresponding usage scenario automatically, without the user having to start it manually. Steps S201 to S207 thus amount to a table lookup with a learning fallback; a minimal sketch follows, assuming string identifiers for gestures and scenarios and a hypothetical observeUserChoice hook standing in for step S206.
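```kotlin
// Sketch of steps S201-S207: look up the scenario for the current gesture;
// if none is saved yet, record what the user starts manually and bind it.
class InteractionController(
    private val table: MutableMap<String, String>,   // gesture -> usage scenario
    private val services: Map<String, () -> Unit>,   // usage scenario -> service
    private val observeUserChoice: () -> String,     // hypothetical hook for S206
) {
    fun onGesture(gesture: String) {
        val scenario = table[gesture]                // S203
        if (scenario != null) {
            services[scenario]?.invoke()             // S204-S205
        } else {
            table[gesture] = observeUserChoice()     // S206-S207: record, bind, save
        }
    }
}

fun main() {
    val controller = InteractionController(
        table = mutableMapOf(),
        services = mapOf("snapshot" to { println("camera on") }),
        observeUserChoice = { "snapshot" },
    )
    controller.onGesture("both-hands")  // unknown gesture: correspondence is learned
    controller.onGesture("both-hands")  // known gesture: camera opens automatically
}
```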

The human-computer interaction method of the embodiments of the present application can actively "understand" the user's operation intention, respond quickly and accurately, and rapidly switch the mobile device to the usage-scenario mode the user wants, making the device more humane and intelligent; many operations that currently can only be performed on a touchscreen or physical buttons can be completed on the device's frame, without blocking the screen or requiring large finger movements, which greatly improves the user experience. Moreover, the method can establish correspondences between a user's habitual gestures and usage scenarios, so that at the next use the device switches to the corresponding usage-scenario mode according to the habitual gesture, enabling personalized operation that matches the user's individual habits and improving user satisfaction.

FIG. 3 is a structural block diagram of the human-computer interaction apparatus in Embodiment 3 of the present application. The apparatus of the embodiment shown in FIG. 3 can be used to implement the human-computer interaction method of the embodiment shown in FIG. 1.

As shown in FIG. 3, in this embodiment, the human-computer interaction apparatus 300 may include a receiving module 310, an analysis module 320, a matching module 330, and a service module 340. The receiving module 310 is configured to receive the mobile device's sensing information about a user action, the sensing information including touch sensing information generated by a touch sensor located on the frame of the mobile device. The analysis module 320 is configured to analyze the sensing information received by the receiving module 310 to learn the user's current gesture. The matching module 330 is configured to find, from the saved correspondence between usage scenarios and gestures, the usage scenario matching the current gesture learned by the analysis module 320. The service module 340 is configured to provide the service corresponding to the usage scenario found by the matching module 330.

The mobile device may be a mobile phone, or another smart device that can be operated with one hand.

The sensing information may include any one or more of rotational angular velocity information, distance information, fingerprint information, and the like, and the touch sensing information may include either or both of finger coordinate information and pressure information. One way to picture the four modules and these sensing-information fields is as the following Kotlin interfaces and types; every name here is an assumption for illustration, not the apparatus's actual API.
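```kotlin
// Hypothetical shapes for the apparatus of FIG. 3; all names are illustrative.
data class TouchSensing(val fingerXY: Pair<Float, Float>?, val pressure: Float?)

class SensingInfo(
    val touch: TouchSensing,
    val angularVelocity: Float? = null,  // e.g., from a gyroscope
    val distance: Float? = null,         // e.g., from a proximity sensor
    val fingerprint: ByteArray? = null,  // e.g., from a fingerprint sensor
)

interface ReceivingModule { fun receive(): SensingInfo }
interface AnalysisModule { fun analyze(info: SensingInfo): String }  // -> gesture
interface MatchingModule { fun match(gesture: String): String? }     // -> scenario, if saved
interface ServiceModule { fun serve(scenario: String) }

// The pipeline apparatus 300 runs once per user action.
fun runOnce(r: ReceivingModule, a: AnalysisModule, m: MatchingModule, s: ServiceModule) {
    m.match(a.analyze(r.receive()))?.let(s::serve)
}

fun main() {
    val receiver = object : ReceivingModule {
        override fun receive() = SensingInfo(TouchSensing(0f to 120f, 0.4f))
    }
    val analyzer = object : AnalysisModule {
        override fun analyze(info: SensingInfo) = "both-hands"
    }
    val matcher = object : MatchingModule {
        override fun match(gesture: String) = if (gesture == "both-hands") "snapshot" else null
    }
    val server = object : ServiceModule {
        override fun serve(scenario: String) = println("providing service for $scenario")
    }
    runOnce(receiver, analyzer, matcher, server)
}
```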

The human-computer interaction apparatus of the embodiments of the present application can actively "understand" the user's operation intention, respond quickly and accurately, and rapidly switch the mobile device to the usage-scenario mode the user wants, making the device more humane and intelligent. Moreover, with this apparatus, many operations that currently can only be performed on a touchscreen or physical buttons can be completed on the device's frame, without blocking the screen or requiring large finger movements, which greatly improves the user experience.

FIG. 4 is a structural block diagram of the human-computer interaction apparatus in Embodiment 4 of the present application. The apparatus of the embodiment shown in FIG. 4 can be used to implement the human-computer interaction method of the embodiment shown in FIG. 2.

As shown in FIG. 4, in this embodiment, the human-computer interaction apparatus 300 may include a receiving module 310, an analysis module 320, a matching module 330, a service module 340, a recording module 350, and an adding module 360. The functions of the receiving module 310, the analysis module 320, the matching module 330, and the service module 340 are the same as those of the corresponding modules in the embodiment shown in FIG. 3, and are not repeated here. The recording module 350 is configured to record the usage scenario the user actively starts under the current gesture when no usage scenario matching the current gesture learned by the analysis module 320 is found in the saved correspondence between usage scenarios and gestures. The adding module 360 is configured to bind the user-started usage scenario recorded by the recording module 350 to the current gesture as a correspondence pair and add it to the correspondence between usage scenarios and gestures.

The sensing information may include any one or more of rotational angular velocity information, distance information, fingerprint information, and the like, and the touch sensing information may include either or both of finger coordinate information and pressure information.

The mobile device may be a mobile phone, or another smart device that can be operated with one hand.

The human-computer interaction apparatus of the embodiments of the present application can actively "understand" the user's operation intention, respond quickly and accurately, and rapidly switch the mobile device to the usage-scenario mode the user wants, making the device more humane and intelligent; many operations that currently can only be performed on a touchscreen or physical buttons can be completed on the device's frame, without blocking the screen or requiring large finger movements, which greatly improves the user experience. Moreover, the apparatus can establish correspondences between a user's habitual gestures and usage scenarios, so that at the next use the device switches to the corresponding usage-scenario mode according to the habitual gesture, enabling personalized operation that matches the user's individual habits and improving user satisfaction. Because a learned correspondence must survive until the next use, some form of persistence is implied; a minimal sketch follows, with the key=value file format and storage location assumed purely for illustration.
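```kotlin
import java.io.File

// Hypothetical persistence for the learned gesture -> scenario correspondences,
// so a habit recorded by modules 350/360 is still available at the next use.
fun saveTable(table: Map<String, String>, file: File) {
    file.writeText(table.entries.joinToString("\n") { "${it.key}=${it.value}" })
}

fun loadTable(file: File): MutableMap<String, String> =
    if (!file.exists()) mutableMapOf()
    else file.readLines()
        .filter { '=' in it }
        .associate { it.substringBefore('=') to it.substringAfter('=') }
        .toMutableMap()

fun main() {
    val store = File.createTempFile("gesture-table", ".txt")
    saveTable(mapOf("both-hands" to "snapshot"), store)
    println(loadTable(store))  // {both-hands=snapshot}
}
```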

FIG. 5 is a structural block diagram of the mobile device in Embodiment 5 of the present application. As shown in FIG. 5, in this embodiment, the mobile device 500 includes a touch sensor 400 located on the frame of the mobile device and a human-computer interaction apparatus 300. The human-computer interaction apparatus 300 may be any of the human-computer interaction apparatuses in the foregoing embodiments of the present application. The touch sensor 400 is configured to sense the user's touch, generate touch sensing information, and send the touch sensing information to the human-computer interaction apparatus 300. The human-computer interaction apparatus 300 is configured to receive the mobile device's sensing information about a user action and provide a corresponding service according to that sensing information, the sensing information including the touch sensing information generated by the touch sensor 400.

More specifically, the human-computer interaction apparatus 300 may be configured to: receive touch sensing information from the frame of the mobile device, the touch sensing information being generated by the touch sensor 400 located on the frame; analyze the received touch sensing information to learn the user's current gesture; find, from the saved correspondence between usage scenarios and gestures, the usage scenario matching the current gesture; and provide the service corresponding to the found usage scenario. A minimal sketch of this sensor-to-apparatus wiring follows, with the reading type and callback invented for illustration.
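```kotlin
// Hypothetical wiring of FIG. 5: the frame touch sensor pushes each reading
// to the interaction apparatus; the callback type stands in for real hardware.
data class FrameReading(val position: Float, val pressure: Float)

class FrameTouchSensor(private val sink: (FrameReading) -> Unit) {
    fun emit(reading: FrameReading) = sink(reading)  // stand-in for a hardware interrupt
}

fun main() {
    val apparatus = { r: FrameReading -> println("apparatus 300 received $r") }
    val sensor = FrameTouchSensor(apparatus)
    sensor.emit(FrameReading(position = 42f, pressure = 0.7f))
}
```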

The mobile device may be a mobile phone, or another smart device that can be operated with one hand.

The mobile device of the embodiments of the present application can actively "understand" the user's operation intention, respond quickly and accurately, and rapidly switch to the usage-scenario mode the user wants, making the device more humane and intelligent. Moreover, with this mobile device, many operations that currently can only be performed on a touchscreen or physical buttons can be completed on the device's frame, without blocking the screen or requiring large finger movements, which greatly improves the user experience.

FIG. 6 is a structural block diagram of the electronic device in Embodiment 6 of the present application. As shown in FIG. 6, the electronic device 600 includes:

one or more processors 610 and a memory 620; one processor 610 is taken as an example in FIG. 6.

The electronic device 600 that performs the human-computer interaction method may further include: a touch sensor 400.

The processor 610, the memory 620, and the touch sensor 400 may be connected by a bus or by other means; connection by a bus is taken as an example in FIG. 6.

As a non-volatile computer-readable storage medium, the memory 620 can be used to store non-volatile software programs, non-volatile computer-executable programs, and modules, such as the program instructions/modules corresponding to the human-computer interaction method in the embodiments of the present application (for example, the receiving module 310, analysis module 320, matching module 330, and service module 340 of the human-computer interaction apparatus 300 shown in FIG. 3; or the receiving module 310, analysis module 320, matching module 330, service module 340, recording module 350, and adding module 360 of the human-computer interaction apparatus 300 shown in FIG. 4). By running the non-volatile software programs, instructions, and modules stored in the memory 620, the processor 610 executes the various functional applications and data processing of the electronic device 600, that is, implements the human-computer interaction method of the above method embodiments and the functions of the modules of the above apparatus embodiments.

The memory 620 may include a high-speed random access memory, and may also include a non-volatile memory, such as at least one magnetic disk storage device, flash memory device, or other non-volatile solid-state storage device. In some embodiments, the memory 620 may optionally include memories located remotely from the processor 610, and these remote memories may be connected to the processor 610 through a network. Examples of such networks include, but are not limited to, the Internet, intranets, local area networks, mobile communication networks, and combinations thereof.

The touch sensor 400 is configured to sense the user's touch, generate touch sensing information, and send the touch sensing information to the processor 610.

The program instructions/modules are stored in the memory 620 and, when executed by the one or more processors 610, perform the human-computer interaction method in any of the above method embodiments, for example, performing method steps S101 to S104 in FIG. 1 and method steps S201 to S207 in FIG. 2 described above; they can also implement the functions of modules 310-340 in FIG. 3 and modules 310-360 in FIG. 4.

The electronic device 600 of the embodiments of the present application exists in various forms, including but not limited to:

(1) Mobile communication devices: these devices are characterized by mobile communication capability, with voice and data communication as their main purpose. Such terminals include smartphones (e.g., the iPhone), multimedia phones, feature phones, and low-end phones.

(2) Ultra-mobile personal computer devices: these devices belong to the category of personal computers, have computing and processing capability, and generally also have mobile Internet access. Such terminals include PDA, MID, and UMPC devices, e.g., the iPad.

(3) Portable entertainment devices: these devices can display and play multimedia content. Such devices include audio and video players (e.g., the iPod), handheld game consoles, e-book readers, as well as smart toys and portable vehicle navigation devices.

(4) Other electronic devices with data interaction capability.

Embodiment 7 of the present application provides a non-volatile computer storage medium storing computer-executable instructions which, when executed by one or more processors (for example, one processor 610), cause the one or more processors to perform the human-computer interaction method in any of the above method embodiments, for example, to perform method steps S101 to S104 in FIG. 1 and method steps S201 to S207 in FIG. 2 described above; the functions of modules 310-340 in FIG. 3 and modules 310-360 in FIG. 4 can also be implemented.

The apparatus embodiments described above are merely illustrative. The units described as separate components may or may not be physically separate, and the components shown as units may or may not be physical units; that is, they may be located in one place or distributed over multiple network units. Some or all of the modules may be selected according to actual needs to achieve the purpose of the solution of the embodiment.

From the description of the above embodiments, those of ordinary skill in the art can clearly understand that the embodiments can be implemented by software plus a general-purpose hardware platform, or of course by hardware. Those of ordinary skill in the art can understand that all or part of the processes in the methods of the above embodiments can be completed by a computer program instructing related hardware. The program may be stored in a computer-readable storage medium and, when executed, may include the processes of the embodiments of the methods described above. The storage medium may be a magnetic disk, an optical disc, a read-only memory (ROM), a random access memory (RAM), or the like.

Finally, it should be noted that the above embodiments are only intended to illustrate the technical solutions of the present application, not to limit them. Within the idea of the present application, the technical features of the above embodiments or of different embodiments may be combined, the steps may be carried out in any order, and many other variations of the different aspects of the present application as described above exist, which are not provided in detail for the sake of brevity. Although the present application has been described in detail with reference to the foregoing embodiments, those of ordinary skill in the art should understand that they may still modify the technical solutions described in the foregoing embodiments or make equivalent substitutions for some of the technical features, and such modifications or substitutions do not cause the essence of the corresponding technical solutions to depart from the scope of the technical solutions of the embodiments of the present application.

Claims (13)

1. A human-computer interaction method, applied to a mobile device, comprising:
   receiving the mobile device's sensing information about a user action, the sensing information comprising touch sensing information generated by a touch sensor located on the frame of the mobile device;
   analyzing the sensing information to learn the user's current gesture;
   finding, from a saved correspondence between usage scenarios and gestures, a usage scenario matching the current gesture; and
   providing the service corresponding to the found usage scenario.

2. The human-computer interaction method according to claim 1, further comprising:
   when no usage scenario matching the current gesture is found in the saved correspondence between usage scenarios and gestures, recording the usage scenario the user actively starts under the current gesture; and
   binding the actively started usage scenario to the current gesture as a correspondence pair, and adding it to the correspondence between usage scenarios and gestures.

3. The human-computer interaction method according to claim 1, wherein the sensing information comprises any one or more of rotational angular velocity information, distance information, and fingerprint information, and the touch sensing information comprises finger coordinate information and/or pressure information.

4. The human-computer interaction method according to claim 1, wherein the mobile device is a mobile phone.

5. A human-computer interaction apparatus, comprising:
   a receiving module, configured to receive a mobile device's sensing information about a user action, the sensing information comprising touch sensing information generated by a touch sensor located on the frame of the mobile device;
   an analysis module, configured to analyze the sensing information received by the receiving module to learn the user's current gesture;
   a matching module, configured to find, from a saved correspondence between usage scenarios and gestures, a usage scenario matching the current gesture learned by the analysis module; and
   a service module, configured to provide the service corresponding to the usage scenario found by the matching module.

6. The human-computer interaction apparatus according to claim 5, further comprising:
   a recording module, configured to record, when no usage scenario matching the current gesture is found in the saved correspondence between usage scenarios and gestures, the usage scenario the user actively starts under the current gesture; and
   an adding module, configured to bind the actively started usage scenario to the current gesture as a correspondence pair, and add it to the correspondence between usage scenarios and gestures.

7. The human-computer interaction apparatus according to claim 5, wherein the sensing information comprises any one or more of rotational angular velocity information, distance information, and fingerprint information, and the touch sensing information comprises finger coordinate information and/or pressure information.

8. The human-computer interaction apparatus according to claim 5, wherein the mobile device is a mobile phone.

9. A mobile device, comprising a touch sensor located on the frame of the mobile device and the human-computer interaction apparatus according to any one of claims 5 to 8, wherein:
   the touch sensor is configured to sense a user's touch, generate touch sensing information, and send the touch sensing information to the human-computer interaction apparatus; and
   the human-computer interaction apparatus is configured to receive the mobile device's sensing information about a user action and provide a corresponding service according to the sensing information, the sensing information comprising the touch sensing information.

10. The mobile device according to claim 9, wherein the mobile device is a mobile phone.

11. An electronic device, comprising:
    at least one processor; and
    a memory communicatively connected to the at least one processor; wherein
    the memory stores instructions executable by the at least one processor, the instructions being executed by the at least one processor to enable the at least one processor to:
    receive the electronic device's sensing information about a user action, the sensing information comprising touch sensing information generated by a touch sensor located on the frame of the electronic device;
    analyze the sensing information to learn the user's current gesture;
    find, from a saved correspondence between usage scenarios and gestures, a usage scenario matching the current gesture; and
    provide the service corresponding to the found usage scenario.

12. A non-volatile computer storage medium storing computer-executable instructions, the computer-executable instructions being configured to:
    receive an electronic device's sensing information about a user action, the sensing information comprising touch sensing information generated by a touch sensor located on the frame of the electronic device;
    analyze the sensing information to learn the user's current gesture;
    find, from a saved correspondence between usage scenarios and gestures, a usage scenario matching the current gesture; and
    provide the service corresponding to the found usage scenario.

13. A computer program product, comprising a computer program stored on a non-volatile computer-readable storage medium, the computer program comprising program instructions which, when executed by a computer, cause the computer to perform the method according to any one of claims 1 to 4.
PCT/CN2016/096771 2015-12-11 2016-08-25 Human-computer interaction method, apparatus, and mobile device WO2017096958A1 (en)

Applications Claiming Priority (2)

Application Number Priority Date Filing Date Title
CN201510920862.0A CN105892881A (en) 2015-12-11 2015-12-11 Human-computer interaction method and device, and mobile equipment
CN201510920862.0 2015-12-11

Publications (1)

Publication Number Publication Date
WO2017096958A1 true WO2017096958A1 (en) 2017-06-15

Family

ID=57002645

Family Applications (1)

Application Number Title Priority Date Filing Date
PCT/CN2016/096771 WO2017096958A1 (en) 2015-12-11 2016-08-25 Human-computer interaction method, apparatus, and mobile device

Country Status (2)

Country Link
CN (1) CN105892881A (en)
WO (1) WO2017096958A1 (en)

Cited By (1)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN112131173A (en) * 2019-06-24 2020-12-25 北京迪文科技有限公司 An intelligent device and method for providing multiple human-computer interaction development modes

Families Citing this family (5)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN105892881A (en) * 2015-12-11 2016-08-24 乐视移动智能信息技术(北京)有限公司 Human-computer interaction method and device, and mobile equipment
CN106354415B (en) * 2016-10-08 2020-05-26 瑞安市辉煌网络科技有限公司 Terminal and method for recognizing user gesture
CN106569714A (en) * 2016-10-31 2017-04-19 努比亚技术有限公司 Mobile terminal and touch operation processing method
CN106484192B (en) * 2016-11-30 2019-08-06 上海创功通讯技术有限公司 The control method and device of pressure sensitivity screen
CN107124509A (en) * 2017-04-25 2017-09-01 重庆市创锦程科技有限公司 A kind of smart mobile phone control system and its control method based on microrobot

Citations (3)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN102354271A (en) * 2011-09-16 2012-02-15 华为终端有限公司 Gesture input method, mobile terminal and host
CN104866352A (en) * 2015-05-27 2015-08-26 努比亚技术有限公司 Method for starting application and mobile terminal
CN105892881A (en) * 2015-12-11 2016-08-24 乐视移动智能信息技术(北京)有限公司 Human-computer interaction method and device, and mobile equipment

Family Cites Families (3)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US9082117B2 (en) * 2008-05-17 2015-07-14 David H. Chin Gesture based authentication for wireless payment by a mobile electronic device
CN102810049B (en) * 2012-07-17 2015-12-16 华为终端有限公司 A kind of application programe switch-over method, device and touch-screen electronic equipment
CN104238916A (en) * 2014-09-16 2014-12-24 广东欧珀移动通信有限公司 Method for starting application or application function by mobile terminal and mobile terminal


Also Published As

Publication number Publication date
CN105892881A (en) 2016-08-24

Similar Documents

Publication Publication Date Title
WO2017096958A1 (en) Human-computer interaction method, apparatus, and mobile device
US20130211843A1 (en) Engagement-dependent gesture recognition
CN107124508B (en) Position adjusting method and device of suspension control, terminal and readable storage medium
CN103955275B (en) Application control method and apparatus
JP7127202B2 (en) Dynamic motion detection method, dynamic motion control method and device
CN111680521A (en) A translation processing method, device and device for translation processing
CN104461348B (en) Information choosing method and device
WO2013177901A1 (en) Touch control unlocking method and apparatus, and electronic device
CN103885691A (en) Method and device for executing backspacing operation
WO2017161825A1 (en) Scrolling screenshot use method and terminal
WO2020034763A1 (en) Gesture recognition method, and gesture processing method and apparatus
CN103995666A (en) Method and device for setting work mode
WO2022111458A1 (en) Image capture method and apparatus, electronic device, and storage medium
WO2017161824A1 (en) Method and device for controlling terminal
CN107704190A (en) Gesture identification method, device, terminal and storage medium
CN103955274A (en) Application control method and device
WO2015131590A1 (en) Method for controlling blank screen gesture processing and terminal
EP2899623A2 (en) Information processing apparatus, information processing method, and program
CN108073267B (en) Three-dimensional control method and device based on motion trail
CN107979701B (en) Method and device for controlling terminal display
CN113642551A (en) Nail key point detection method, device, electronic device and storage medium
CN111722727B (en) Model training method applied to handwriting input, handwriting input method and device
CN117008730A (en) Control method, electronic device, intelligent finger ring, control system and storage medium
CN111382598A (en) Identification method and device and electronic equipment
CN105630204A (en) Mouse simulation system and method

Legal Events

Date Code Title Description
121 Ep: the epo has been informed by wipo that ep was designated in this application

Ref document number: 16872154

Country of ref document: EP

Kind code of ref document: A1

NENP Non-entry into the national phase

Ref country code: DE

122 Ep: pct application non-entry in european phase

Ref document number: 16872154

Country of ref document: EP

Kind code of ref document: A1