CN111596758A - Man-machine interaction method, system, storage medium and terminal


Info

Publication number
CN111596758A
CN111596758A
Authority
CN
China
Prior art keywords
user information
target
human
target user
information
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Withdrawn
Application number
CN202010264320.3A
Other languages
Chinese (zh)
Inventor
刘刚
潘晓蕾
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Yanfeng Visteon Electronic Technology Shanghai Co Ltd
Original Assignee
Yanfeng Visteon Electronic Technology Shanghai Co Ltd
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Yanfeng Visteon Electronic Technology Shanghai Co Ltd
Priority to CN202010264320.3A
Publication of CN111596758A
Legal status: Withdrawn


Classifications

    • G - PHYSICS
    • G06 - COMPUTING; CALCULATING OR COUNTING
    • G06F - ELECTRIC DIGITAL DATA PROCESSING
    • G06F3/00 - Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
    • G06F3/01 - Input arrangements or combined input and output arrangements for interaction between user and computer
    • G06F3/011 - Arrangements for interaction with the human body, e.g. for user immersion in virtual reality
    • G06F2203/00 - Indexing scheme relating to G06F3/00 - G06F3/048
    • G06F2203/01 - Indexing scheme relating to G06F3/01
    • G06F2203/011 - Emotion or mood input determined on the basis of sensed human body parameters such as pulse, heart rate or beat, temperature of skin, facial expressions, iris, voice pitch, brain activity patterns

Landscapes

  • Engineering & Computer Science (AREA)
  • General Engineering & Computer Science (AREA)
  • Theoretical Computer Science (AREA)
  • Human Computer Interaction (AREA)
  • Physics & Mathematics (AREA)
  • General Physics & Mathematics (AREA)
  • User Interface Of Digital Computer (AREA)

Abstract

The invention provides a man-machine interaction method, a man-machine interaction system, a storage medium and a terminal. The method comprises the following steps: acquiring target user information; analyzing the target user information and determining a target action corresponding to the target user information; and executing the target action. The invention essentially changes the original man-machine interaction mode from active to passive: the user no longer needs to pay attention to transmitting operation instructions to the machine and receiving the machine's feedback on those instructions. Instead, the machine takes the role of an observer/slave, watches the user's expressions and states, and performs the corresponding action according to the observation result. This avoids the mental and physical load that man-machine interaction places on the user, significantly shortens the interaction delay, and improves the user experience.

Description

Man-machine interaction method, system, storage medium and terminal
Technical Field
The invention relates to the technical field of human-computer interaction, in particular to a human-computer interaction method, a human-computer interaction system, a storage medium and a terminal.
Background
Current mainstream human-machine interaction (HMI) is active: a person must actively transmit information to the machine, by tapping a touch screen, clicking a mouse, pressing keys on a keypad/keyboard, speaking, gesturing and so on, to instruct the machine what to do.
This active style of human-machine interaction inherently demands extra attention and effort from the user, both to transmit instruction information to the machine and to receive the machine's feedback on those instructions; this is an additional management cost (overhead). Moreover, because biological processes such as human reaction and movement are slower than program execution, the active mode also limits the efficiency of human-computer interaction.
Disclosure of Invention
In view of the above drawbacks of the prior art, an object of the present invention is to provide a human-computer interaction method, system, storage medium and terminal, which are used to solve the problems of low human-computer interaction efficiency, long time delay and poor user experience in the prior art.
In order to achieve the above object, the present invention provides a human-computer interaction method, comprising the following steps: acquiring target user information; analyzing the target user information, and determining a target action corresponding to the target user information; and executing the target action.
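As a rough illustration of this three-step flow, the following Python sketch wires the steps together in an observe-and-react loop. All function names, field names and sample values here are invented for illustration; the patent does not prescribe any of them.

```python
import time

# Minimal sketch of the claimed flow: S1 acquire -> S2 analyze -> S3 execute.

def acquire_target_user_info():
    """S1: collect user state (camera) and somatosensory (wearable) readings."""
    return {"pupil_size": 4.2, "heart_rate": 72}  # stand-in sample values

def determine_target_action(user_info, action_set):
    """S2: match the observed user information against a preset action set."""
    for condition, action in action_set:
        if condition(user_info):
            return action
    return None

def interaction_loop(action_set, period=0.1):
    while True:
        info = acquire_target_user_info()
        action = determine_target_action(info, action_set)
        if action is not None:
            action()  # S3: execute the target action
        time.sleep(period)  # machine observes continuously; no user command
```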
In an embodiment of the present invention, the acquiring the target user information includes the following steps: acquiring user state information acquired by a camera; and receiving the user somatosensory information sent by the wearable device.
In an embodiment of the present invention, analyzing the target user information, and determining the target action corresponding to the target user information includes: comparing the target user information with user information in a preset action set, and judging the target action corresponding to the target user information; the preset action set comprises at least one action, and the action corresponds to user information.
In an embodiment of the present invention, the acquiring the user status information collected by the camera includes: the camera collects user state information and/or receives the user state information collected by an external camera.
In an embodiment of the present invention, the user status information includes any one or a combination of the following: pupil size, pupil position, facial expression, body posture, gaze direction, blink frequency, and tension.
In an embodiment of the present invention, the user somatosensory information includes any one or a combination of the following: heartbeat, blood pressure, blood oxygen, muscle tone, respiratory rate, and hormone levels.
The invention provides a human-computer interaction system, comprising: the device comprises an acquisition module, a processing module and an execution module; the acquisition module is used for acquiring target user information; the processing module is used for analyzing and processing the target user information and determining a target action corresponding to the target user information; the execution module is used for executing the target action.
The present invention provides a storage medium having stored thereon a computer program which, when executed by a processor, implements the above-described human-computer interaction method.
The present invention provides a terminal, including: a processor and a memory; the memory is used for storing a computer program; the processor is used for executing the computer program stored in the memory so as to enable the terminal to execute the human-computer interaction method.
The invention provides a human-computer interaction system, comprising: the terminal and the acquisition device; the acquisition device is connected with the terminal and used for acquiring target user information and sending the target user information to the terminal; the acquisition device comprises wearable equipment and an external camera.
As described above, the human-computer interaction method, system, storage medium and terminal of the present invention have the following advantages:
the original man-machine interaction mode is essentially changed from an active mode to a passive mode. The user no longer needs to pay attention to transmitting operation instructions to the machine and receiving the machine's feedback on those instructions; instead, the machine takes the role of an observer/slave, watches the user's expressions and states, and performs the corresponding action according to the observation result. This avoids the mental and physical load that man-machine interaction places on the user, significantly shortens the interaction delay, and improves the user experience.
Drawings
Fig. 1 is a flowchart illustrating a human-computer interaction method according to an embodiment of the present invention.
Fig. 2 is a flowchart illustrating a process of acquiring information of a target user according to an embodiment of the present invention.
FIG. 3 is a schematic structural diagram of a human-computer interaction system according to an embodiment of the invention.
Fig. 4 is a schematic structural diagram of a terminal according to an embodiment of the invention.
FIG. 5 is a schematic structural diagram of a human-computer interaction system according to another embodiment of the invention.
Description of the reference symbols
31: acquisition module;
32: processing module;
33: execution module;
41: processor;
42: memory;
51: terminal;
52: acquisition device;
S1 to S3: steps;
S11 to S12: steps.
Detailed Description
The following description of the embodiments of the present invention is provided by way of specific examples, and other advantages and effects of the present invention will be readily apparent to those skilled in the art from the disclosure herein. The invention is capable of other and different embodiments and of being practiced or of being carried out in various ways, and its several details are capable of modification in various respects, all without departing from the spirit and scope of the present invention. It is to be noted that the features in the following embodiments and examples may be combined with each other without conflict.
It should be noted that the drawings provided in the following embodiments only illustrate the basic idea of the present invention in a schematic way; they show only the components related to the invention rather than the actual number, shape and size of the components in an implementation. In practice, the type, quantity and proportion of each component may vary freely, and the component layout may be more complex.
The man-machine interaction method, system, storage medium and terminal of the present invention essentially change the original man-machine interaction mode from active to passive: the user does not need to pay attention to transmitting operation instructions to the machine or to receiving the machine's feedback on those instructions. Instead, the machine acts as an observer/slave, watches the user, and performs the corresponding action according to the observation result. This avoids the mental and physical load of man-machine interaction on the user, significantly shortens the interaction delay, and improves the user experience.
As shown in fig. 1, in an embodiment, the human-computer interaction method of the present invention is applied to a terminal for implementing human-computer interaction; specifically, the terminal includes, but is not limited to, an intelligent terminal, a vehicle-mounted central control system, or a passenger-car rear-seat entertainment system.
It should be noted that the intelligent terminal includes, but is not limited to, smartphones, tablet computers, PDAs and other terminal devices with data-processing capability. In general, an intelligent terminal is a terminal device that has an independent operating system, lets the user install programs provided by third-party service providers (software, games and the like) to continuously extend the device's functions, and can access the network wirelessly through a mobile communication network.
The man-machine interaction method comprises the following steps:
Step S1: acquire the target user information.
Specifically, the terminal acquires target user information.
As shown in fig. 2, in an embodiment, the acquiring the target user information includes the following steps:
Step S11: acquire the user state information collected by the camera.
Specifically, the camera collects user status information, so that the terminal acquires the user status information.
The camera may be a camera carried by the terminal, or may be an external camera connected to the terminal.
In an embodiment, the acquiring the user status information collected by the camera includes: the camera collects user state information and/or receives the user state information collected by an external camera.
Specifically, the terminal collects user state information directly through its own built-in camera, and/or an external camera connected to the terminal collects the user state information and sends it to the terminal; either way, the terminal finally obtains the user state information.
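As a concrete, purely illustrative sketch of this acquisition path, the snippet below grabs one frame from either the built-in camera or an external one. The patent does not name a library; OpenCV (cv2) and the device indices are assumptions.

```python
import cv2  # assumed library; the patent does not prescribe one

def capture_user_state_frame(use_external=False):
    # Device index 0 is conventionally the built-in camera; an external
    # camera typically appears at index 1 (both indices are assumptions).
    cap = cv2.VideoCapture(1 if use_external else 0)
    ok, frame = cap.read()
    cap.release()
    return frame if ok else None  # frame feeds the later state analysis
```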
In an embodiment, the user status information includes any one or a combination of the following: pupil size, pupil position, facial expression, body posture, gaze direction, blink frequency, and tension.
It should be noted that the user state information includes, but is not limited to, the user's pupil size, pupil position, facial expression, body posture, gaze direction, blink frequency and degree of tension.
Step S12: receive the user somatosensory information sent by the wearable device.
Specifically, the wearable device is connected to the terminal wirelessly (Bluetooth, WiFi, etc.) and transmits the collected user somatosensory information to the terminal.
It should be noted that, the wearable device collects the somatosensory information of the user at a preset frequency.
It should be noted that the wearable device may be worn directly on the body, or may be a portable device integrated into the user's clothing or an accessory. The present invention does not limit the specific type of wearable device; it can be chosen according to the actual application requirements.
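The sketch below illustrates the preset-frequency polling described here, with the Bluetooth/WiFi transport abstracted behind a caller-supplied read_wearable function; the 1 Hz rate and the sample fields are assumptions, not values from the patent.

```python
import time

def poll_wearable(read_wearable, hz=1.0, on_sample=print):
    """Collect somatosensory samples at a preset frequency (assumed 1 Hz)."""
    period = 1.0 / hz
    while True:
        sample = read_wearable()  # e.g. {"heart_rate": 72, "spo2": 98}
        on_sample(sample)         # hand the sample to the terminal side
        time.sleep(period)
```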
It should be noted that steps S11 and S12 have no fixed execution order: step S11 may be executed first and step S12 afterwards; step S12 may be executed first and step S11 afterwards; or steps S11 and S12 may be executed simultaneously.
In an embodiment, the user somatosensory information includes any one or a combination of the following: heartbeat, blood pressure, blood oxygen, muscle tone, respiratory rate, and hormone levels.
It should be noted that the user somatosensory information includes, but is not limited to: the user's heartbeat, blood pressure, blood oxygen, muscle tone, respiratory rate, and hormone level.
Step S2: analyze the target user information and determine a target action corresponding to the target user information.
Specifically, after the terminal acquires the target user information in step S1, a series of analysis processes are performed on the target user information to finally determine the target action corresponding to the target user information.
In an embodiment, analyzing the target user information, and determining the target action corresponding to the target user information includes: and comparing the target user information with user information in a preset action set, and judging the target action corresponding to the target user information.
Specifically, a preset action set is set on the terminal in advance, after the terminal receives the target user information, the target user information is compared with the user information in the preset action set, the user information consistent with the target user information is found out from the preset action set, and then the target action corresponding to the target user information is determined.
It should be noted that the preset action set includes at least one action, and each action corresponds to user information. For example, a preset action set may be written as: preset action set = {user information -> action} = {pupil size, micro-expression, heartbeat, blood pressure, hormones or respiration change -> increase volume; pupil size, micro-expression, heartbeat, blood pressure, hormones and respiration return to normal -> restore normal volume}.
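One possible encoding of such a {user information -> action} set, compatible with the matching sketch given earlier, pairs a predicate over the readings with the action to run. The 15% threshold, the field names and the print placeholders are all invented for illustration.

```python
# Illustrative encoding of the quoted set; thresholds and fields are assumed.
def readings_elevated(info):
    return info["heart_rate"] > info["baseline_heart_rate"] * 1.15

PRESET_ACTION_SET = [
    (readings_elevated,                  lambda: print("increase volume")),
    (lambda i: not readings_elevated(i), lambda: print("restore normal volume")),
]
```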
Step S3: execute the target action.
Specifically, after the terminal determines the target action corresponding to the target user information in step S2, the terminal executes the target action.
It should be noted that in conventional active human-computer interaction, taking a key press as an example, whether for a mechanical key or a virtual on-screen key operated by touch or mouse, the complete press-and-release action generally takes 20 to 30 milliseconds. With the man-machine interaction method of the present invention, the user no longer needs to actively transmit information to the machine through touch, mouse, keys/keyboard, voice, gestures and so on. Instead, the machine observes and collects user state information through a built-in or external camera, receives the somatosensory information returned by the wearable device over a wireless or wired link, and comprehensively analyzes the user's attention, psychological state and degree of interest in the current content, thereby switching its interface automatically. Since the delay for the machine to observe, collect and process the user information can be kept within 5 milliseconds, roughly 20 milliseconds are saved per interaction; overall, the user experience is greatly improved.
It should be noted that the protection scope of the human-computer interaction method of the present invention is not limited to the execution order of the steps listed in this embodiment; any scheme in which steps are added, removed or replaced with prior art according to the principle of the present invention falls within the protection scope of the present invention.
The man-machine interaction method of the present invention is further illustrated by the following specific examples.
Example one
The man-machine interaction method is applied to the field of music playing. Specifically, the terminal comprehensively judges information such as the user's pupils, micro-expressions, heartbeat, blood pressure, hormones and respiration (a rule-table sketch follows the list):
1. if the pupil size, micro-expression, heartbeat, blood pressure, hormones, respiration and the like change, the volume is increased appropriately;
2. if the pupil size, micro-expression, heartbeat, blood pressure, hormones, respiration and the like return to normal, the normal volume is restored;
3. if it is judged from the pupil size, micro-expression, heartbeat, blood pressure, hormones, respiration and the like that the user is gradually falling asleep, the volume is lowered step by step until finally muted;
4. if it is judged from the pupil size, micro-expression, heartbeat, blood pressure, hormones, respiration and the like that the user is gradually waking up, the volume is raised step by step until the normal volume is reached.
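A compact rule table capturing the four branches above might look as follows; the state labels, the 0-100 volume scale and the step size are all assumptions made for illustration.

```python
def adjust_volume(state, volume, normal=50, step=10):
    if state == "aroused":          # rule 1: readings changed
        return min(volume + step, 100)
    if state == "normal":           # rule 2: readings back to normal
        return normal
    if state == "falling_asleep":   # rule 3: step down toward mute
        return max(volume - step, 0)
    if state == "waking_up":        # rule 4: step up toward normal
        return min(volume + step, normal)
    return volume
```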
Example two
The man-machine interaction method is applied to the reading-aloud field, for example voice broadcasting of weather forecasts, mail, articles, news and the like. Specifically, the terminal comprehensively judges information such as the user's pupils, micro-expressions, heartbeat, blood pressure, hormones and respiration:
1. if the pupil size, micro-expression, heartbeat, blood pressure, hormones, respiration and the like change, the speaking speed is slowed down and the volume is increased appropriately;
2. if the pupil size, micro-expression, heartbeat, blood pressure, hormones, respiration and the like return to normal, the normal speaking speed and volume are restored.
Example three
The man-machine interaction method is applied to the field of electronic book reading. Specifically, the terminal comprehensively judges information such as the user's pupil position, pupil size, micro-expressions, heartbeat, blood pressure, hormones and respiration (a dwell-time sketch follows the list):
1. if the focus of sight moves out of the range of the machine screen, the current page is kept still;
2. if the pupil size changes markedly, or the micro-expression, heartbeat, blood pressure, hormones, respiration and the like change, the dwell time of the current page is increased;
3. if the focus of sight moves quickly to the bottom of the page, the dwell time of the page is reduced;
4. if the pupils, micro-expression, heartbeat, blood pressure, hormones, respiration and the like return to normal, the normal page dwell time is restored.
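The four rules above reduce to choosing a dwell time for the current page. A hedged sketch, with the 30-second baseline and the scaling factors invented for illustration:

```python
def page_dwell_seconds(gaze_on_screen, readings_changed, gaze_near_bottom,
                       normal_dwell=30.0):
    if not gaze_on_screen:
        return float("inf")         # rule 1: hold the page
    if readings_changed:
        return normal_dwell * 1.5   # rule 2: linger longer on this page
    if gaze_near_bottom:
        return normal_dwell * 0.5   # rule 3: user is ahead, turn sooner
    return normal_dwell             # rule 4: back to normal dwell
```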
Example four
The man-machine interaction method is applied to the field of video playing/television. Specifically, the terminal comprehensively judges information such as the user's pupil position, pupil size, micro-expressions, heartbeat, blood pressure, hormones and respiration (a playback-control sketch follows the list):
1. if the focus of sight moves out of the range of the machine screen, playback automatically pauses;
2. if the pupil size changes markedly, or the micro-expression, heartbeat, blood pressure, hormones, respiration and the like change, the player enters a replay mode and replays the content of the previous few seconds;
3. if the pupils, micro-expression, heartbeat, blood pressure, hormones, respiration and the like return to normal, normal playing resumes.
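These three rules map onto the player-control sketch below. The player interface (pause/seek/play methods and a position attribute) and the 5-second replay window are invented stand-ins, not an API from the patent.

```python
def control_playback(player, gaze_on_screen, readings_changed,
                     replay_window=5.0):
    if not gaze_on_screen:
        player.pause()                                 # rule 1: auto-pause
    elif readings_changed:
        player.seek(player.position - replay_window)   # rule 2: replay
        player.play()
    else:
        player.play()                                  # rule 3: resume
```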
As shown in fig. 3, in an embodiment, the human-computer interaction system of the present invention includes an obtaining module 31, a processing module 32, and an executing module 33.
The obtaining module 31 is configured to obtain target user information.
The processing module 32 is configured to analyze the target user information and determine a target action corresponding to the target user information.
The execution module 33 is configured to execute the target action.
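A minimal object-level sketch of this three-module split follows; the class and method names, and the placeholder readings, are illustrative assumptions rather than names from the patent.

```python
class AcquisitionModule:
    def acquire(self):
        # A real system would merge camera and wearable readings here.
        return {"pupil_size": 4.2, "heart_rate": 72}

class ProcessingModule:
    def __init__(self, action_set):
        self.action_set = action_set  # preset {user info -> action} set
    def determine(self, info):
        for condition, action in self.action_set:
            if condition(info):
                return action
        return None

class ExecutionModule:
    def execute(self, action):
        if action is not None:
            action()
```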
It should be noted that the working principle of the human-computer interaction system is the same as that of the human-computer interaction method; the structural functions of the obtaining module 31, the processing module 32 and the executing module 33 correspond to the steps of the human-computer interaction method one to one, and are not described herein again.
It should be noted that the division of the above system into modules is only a logical division; in an actual implementation the modules may be wholly or partially integrated into one physical entity, or kept physically separate. These modules may all be implemented as software invoked by a processing element, all as hardware, or partly as software invoked by a processing element and partly as hardware. For example, the x module may be a separately established processing element, may be integrated into a chip of the system, or may be stored in the memory of the system as program code whose function is invoked and executed by a processing element of the system; the other modules are implemented similarly. In addition, all or some of the modules may be integrated together or implemented independently. The processing element described here may be an integrated circuit with signal-processing capability. During implementation, each step of the above method, or each of the above modules, can be completed by an integrated logic circuit of hardware in a processor element or by instructions in the form of software.
For example, the above modules may be one or more integrated circuits configured to implement the above method, such as one or more Application-Specific Integrated Circuits (ASICs), one or more Digital Signal Processors (DSPs), or one or more Field-Programmable Gate Arrays (FPGAs). For another example, when one of the above modules is implemented in the form of program code scheduled by a processing element, the processing element may be a general-purpose processor, such as a Central Processing Unit (CPU) or another processor capable of invoking program code. For another example, these modules may be integrated together and implemented in the form of a System-on-a-Chip (SoC).
The storage medium of the invention stores a computer program which realizes the human-computer interaction method when being executed by a processor; the storage medium includes: various media that can store program codes, such as ROM, RAM, magnetic disk, U-disk, memory card, or optical disk.
As shown in fig. 4, the terminal of the present invention includes a processor 41 and a memory 42.
The memory 42 is used for storing computer programs. Preferably, the memory 42 comprises: various media that can store program codes, such as ROM, RAM, magnetic disk, U-disk, memory card, or optical disk.
The processor 41 is connected to the memory 42, and is configured to execute the computer program stored in the memory 42, so as to enable the terminal to execute the above-mentioned human-computer interaction method.
Preferably, the processor 41 may be a general-purpose processor, including a Central Processing Unit (CPU), a Network Processor (NP) and the like; it may also be a Digital Signal Processor (DSP), an Application-Specific Integrated Circuit (ASIC), a Field-Programmable Gate Array (FPGA) or other programmable logic device, a discrete gate or transistor logic device, or discrete hardware components.
It should be noted that the human-computer interaction system of the present invention can implement the human-computer interaction method of the present invention, but the implementation apparatus of the human-computer interaction method of the present invention includes, but is not limited to, the structure of the human-computer interaction system illustrated in this embodiment, and all structural modifications and substitutions in the prior art made according to the principle of the present invention are included in the protection scope of the present invention.
As shown in fig. 5, in an embodiment, the human-computer interaction system of the present invention includes the terminal 51 and the collecting device 52; the acquisition device 52 is connected to the terminal 51, and is configured to acquire target user information and send the target user information to the terminal 51; the acquisition device 52 includes a wearable device and an external camera.
It should be noted that the working principle of the human-computer interaction system is the same as that of the human-computer interaction method, and is not described herein again.
In summary, the man-machine interaction method, system, storage medium and terminal of the present invention essentially change the original man-machine interaction mode from active to passive: the user does not need to pay attention to transmitting operation instructions to the machine or to receiving the machine's feedback on those instructions. Instead, the machine acts as an observer/slave, watches the user, and performs the corresponding action according to the observation result. This avoids the mental and physical load of man-machine interaction on the user, significantly shortens the interaction delay, and improves the user experience.
The foregoing embodiments merely illustrate the principles and effects of the present invention and are not intended to limit it. Anyone skilled in the art may modify or change the above embodiments without departing from the spirit and scope of the present invention. Accordingly, all equivalent modifications or changes made by those of ordinary skill in the art without departing from the spirit and technical ideas disclosed by the present invention shall be covered by the claims of the present invention.

Claims (10)

1. A human-computer interaction method is characterized by comprising the following steps:
acquiring target user information;
analyzing the target user information, and determining a target action corresponding to the target user information;
and executing the target action.
2. The human-computer interaction method according to claim 1, wherein the obtaining of the target user information comprises the following steps:
acquiring user state information acquired by a camera;
and receiving the user somatosensory information sent by the wearable device.
3. The human-computer interaction method according to claim 1, wherein the analyzing the target user information and the determining the target action corresponding to the target user information comprises: comparing the target user information with user information in a preset action set, and judging the target action corresponding to the target user information; the preset action set comprises at least one action, and the action corresponds to user information.
4. The human-computer interaction method according to claim 2, wherein the acquiring the user state information collected by the camera comprises: the camera collects user state information and/or receives the user state information collected by an external camera.
5. The human-computer interaction method according to claim 2, wherein the user status information comprises any one or a combination of the following: pupil size, pupil position, facial expression, body posture, gaze direction, blink frequency, and tension.
6. The human-computer interaction method according to claim 2, wherein the user somatosensory information comprises any one or a combination of the following: heartbeat, blood pressure, blood oxygen, muscle tone, respiratory rate, and hormone levels.
7. A human-computer interaction system, comprising: the device comprises an acquisition module, a processing module and an execution module;
the acquisition module is used for acquiring target user information;
the processing module is used for analyzing and processing the target user information and determining a target action corresponding to the target user information;
the execution module is used for executing the target action.
8. A storage medium having stored thereon a computer program, characterized in that the computer program, when being executed by a processor, implements the human-computer interaction method of any one of claims 1 to 6.
9. A terminal, comprising: a processor and a memory;
the memory is used for storing a computer program;
the processor is configured to execute the computer program stored in the memory to cause the terminal to execute the human-computer interaction method according to any one of claims 1 to 6.
10. A human-computer interaction system, comprising: the terminal and acquisition device recited in claim 9;
the acquisition device is connected with the terminal and used for acquiring target user information and sending the target user information to the terminal; the acquisition device comprises wearable equipment and an external camera.
CN202010264320.3A 2020-04-07 2020-04-07 Man-machine interaction method, system, storage medium and terminal Withdrawn CN111596758A (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN202010264320.3A CN111596758A (en) 2020-04-07 2020-04-07 Man-machine interaction method, system, storage medium and terminal


Publications (1)

Publication Number Publication Date
CN111596758A true CN111596758A (en) 2020-08-28

Family

ID=72188625

Family Applications (1)

Application Number Title Priority Date Filing Date
CN202010264320.3A Withdrawn CN111596758A (en) 2020-04-07 2020-04-07 Man-machine interaction method, system, storage medium and terminal

Country Status (1)

Country Link
CN (1) CN111596758A (en)

Citations (5)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN108307037A (en) * 2017-12-15 2018-07-20 努比亚技术有限公司 Terminal control method, terminal and computer readable storage medium
CN109101663A (en) * 2018-09-18 2018-12-28 宁波众鑫网络科技股份有限公司 A kind of robot conversational system Internet-based
CN109116991A (en) * 2018-08-30 2019-01-01 Oppo广东移动通信有限公司 Control method, device, storage medium and the wearable device of wearable device
CN110109539A (en) * 2019-04-02 2019-08-09 努比亚技术有限公司 A kind of gestural control method, wearable device and computer readable storage medium
CN110446996A (en) * 2017-03-21 2019-11-12 华为技术有限公司 A kind of control method, terminal and system



Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination
WW01 Invention patent application withdrawn after publication

Application publication date: 20200828