WO2021128561A1 - Human-computer interaction system and method, and terminal device - Google Patents

Human-computer interaction system and method, and terminal device

Info

Publication number
WO2021128561A1
WO2021128561A1 · PCT/CN2020/076422 · CN2020076422W
Authority
WO
WIPO (PCT)
Prior art keywords
terminal device
display
display device
display screen
human
Prior art date
Application number
PCT/CN2020/076422
Other languages
English (en)
French (fr)
Inventor
洪旭杰
Original Assignee
惠州Tcl移动通信有限公司
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by 惠州Tcl移动通信有限公司 filed Critical 惠州Tcl移动通信有限公司
Publication of WO2021128561A1 publication Critical patent/WO2021128561A1/zh

Links

Classifications

    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06FELECTRIC DIGITAL DATA PROCESSING
    • G06F3/00Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
    • G06F3/01Input arrangements or combined input and output arrangements for interaction between user and computer
    • G06F3/048Interaction techniques based on graphical user interfaces [GUI]
    • G06F3/0487Interaction techniques based on graphical user interfaces [GUI] using specific features provided by the input device, e.g. functions controlled by the rotation of a mouse with dual sensing arrangements, or of the nature of the input device, e.g. tap gestures based on pressure sensed by a digitiser
    • G06F3/0488Interaction techniques based on graphical user interfaces [GUI] using specific features provided by the input device, e.g. functions controlled by the rotation of a mouse with dual sensing arrangements, or of the nature of the input device, e.g. tap gestures based on pressure sensed by a digitiser using a touch-screen or digitiser, e.g. input of commands through traced gestures
    • G06F3/04883Interaction techniques based on graphical user interfaces [GUI] using specific features provided by the input device, e.g. functions controlled by the rotation of a mouse with dual sensing arrangements, or of the nature of the input device, e.g. tap gestures based on pressure sensed by a digitiser using a touch-screen or digitiser, e.g. input of commands through traced gestures for inputting data by handwriting, e.g. gesture or text
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06FELECTRIC DIGITAL DATA PROCESSING
    • G06F13/00Interconnection of, or transfer of information or other signals between, memories, input/output devices or central processing units
    • G06F13/38Information transfer, e.g. on bus
    • G06F13/40Bus structure
    • G06F13/4063Device-to-bus coupling
    • G06F13/4068Electrical coupling
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06FELECTRIC DIGITAL DATA PROCESSING
    • G06F3/00Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
    • G06F3/01Input arrangements or combined input and output arrangements for interaction between user and computer
    • G06F3/011Arrangements for interaction with the human body, e.g. for user immersion in virtual reality
    • G06F3/012Head tracking input arrangements
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06FELECTRIC DIGITAL DATA PROCESSING
    • G06F3/00Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
    • G06F3/14Digital output to display device ; Cooperation and interconnection of the display device with other functional units
    • G06F3/147Digital output to display device ; Cooperation and interconnection of the display device with other functional units using display panels
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06FELECTRIC DIGITAL DATA PROCESSING
    • G06F2213/00Indexing scheme relating to interconnection of, or transfer of information or other signals between, memories, input/output devices or central processing units
    • G06F2213/0042Universal serial bus [USB]

Definitions

  • This application relates to the field of computer technology, and in particular to a human-computer interaction system, a method thereof, and a terminal device.
  • Augmented Reality (AR) / Virtual Reality (VR) devices free users from the physical size limits of a terminal device's display screen, letting them browse the information they want to view on a larger display anytime, anywhere.
  • AR technology can not only present real-world information but also display virtual information at the same time; the two kinds of information complement and superimpose each other, projecting virtual information into the real world where it is perceived by the human senses.
  • VR technology can render an entire scene, giving humans a sensory experience beyond reality.
  • However, AR/VR devices on the market have the following problems. First, their application ecosystem is scarce and their content is monotonous, so users cannot obtain a diversified experience.
  • Second, ordinary AR/VR devices on the market are restricted by third-party mobile phone permissions, which limits the use of third-party applications and affects the user experience.
  • The embodiments of the present application provide a human-computer interaction system, a method thereof, and a terminal device, which can effectively solve the problem that a user wearing an AR/VR device currently cannot operate the terminal device synchronously.
  • An embodiment of the present application provides a human-computer interaction system, including: a display device; and a terminal device connected to the display device through a connecting cable; wherein the display device communicates with the terminal device through the USB-TP connection protocol to project, on the display device, the display content output by the display screen of the terminal device.
  • the display device includes a head tracking module for acquiring the movement path of the display device.
  • the display device includes a simulated display screen touch module, which is connected to the head tracking module and is used to acquire the movement path of the sensing cursor in the display device.
  • the terminal device includes a display screen touch module, which is used to obtain the touch path of the user touching the display screen of the terminal device and transmit it to the display device.
  • the terminal device further includes a voice recognition module connected to the display screen touch module, and the voice recognition module is used to convert the user's voice into text.
  • the connecting cable is a USB Type-C cable or a DP cable.
  • an embodiment of the present application provides a human-computer interaction method, which includes the following steps: detecting whether a display device is connected; when a connection to the display device is detected, acquiring the touch path of the user touching the display screen of the terminal device; when the touch path is a straight line in the middle of the display screen of the terminal device, displaying the main interface of the display screen of the terminal device; when the touch path is a straight line on either side of the display screen of the terminal device, displaying the display interface previous to the current display interface of the display screen of the terminal device; and when the touch path is a polyline, activating the sensing cursor in the display device.
  • the method further includes the steps of: detecting whether the pressure value of the user touching the display screen of the terminal device is greater than a preset value; when it is detected that the pressure value of the user touching the display screen of the terminal device is greater than the preset value, turning on the voice recognition service; and when it is detected that the pressure value of the user touching the display screen of the terminal device is not greater than the preset value, turning off the voice recognition service.
  • the method further includes the steps of: the display device acquiring the movement path of the display device; and when the movement path is a polyline, the display device activating the sensing cursor in the display device.
  • an embodiment of the present application provides a terminal device, including a processor and a memory, the processor being electrically connected to the memory; the memory is used to store instructions and data, and the processor is used to execute the steps of a human-computer interaction method.
  • the human-computer interaction method includes the following steps: detecting whether a display device is connected; when a connection to the display device is detected, acquiring the touch path of the user touching the display screen of the terminal device; when the touch path is a straight line in the middle of the display screen of the terminal device, displaying the main interface of the display screen of the terminal device; when the touch path is a straight line on either side of the display screen of the terminal device, displaying the display interface previous to the current display interface of the display screen of the terminal device; and when the touch path is a polyline, activating the sensing cursor in the display device.
  • the method further includes the steps of: detecting whether the pressure value of the user touching the display screen of the terminal device is greater than a preset value; when it is detected that the pressure value is greater than the preset value, turning on the voice recognition service; and when it is detected that the pressure value is not greater than the preset value, turning off the voice recognition service.
  • the method further includes the steps of: the display device acquiring the movement path of the display device; and when the movement path is a polyline, the display device activating the sensing cursor in the display device.
  • the advantage of this application is that the display device is connected to the terminal device through a USB Type-C cable or a DP cable and communicates with the terminal device through the USB-TP connection protocol, and the display content output by the display screen of the terminal device is projected on the display device through the USB-TP connection protocol.
  • a touch panel device is simulated on the display screen of the terminal device, and operations on the display screen of the terminal device are combined with operations on the display device to complete the switching and operation of third-party applications in the terminal device. According to the user's different touch paths, the terminal device performs the corresponding operations and projects the final display content to the display device through the connecting cable, creating a good immersive experience for the user.
  • Fig. 1 is a schematic structural diagram of a human-computer interaction system provided by an embodiment of the application.
  • Fig. 2 is a flow chart of the steps of the human-computer interaction method provided by an embodiment of the application.
  • Fig. 3 is a schematic structural diagram of a terminal device provided by an embodiment of the application.
  • Fig. 4 is a schematic diagram of another structure of a terminal device provided by an embodiment of the application.
  • the terms “first” and “second” are only used for descriptive purposes and cannot be understood as indicating or implying relative importance or implicitly indicating the number of the indicated technical features. Therefore, features defined with “first” and “second” may explicitly or implicitly include one or more of those features. In the description of the present application, “multiple” means two or more than two, unless otherwise specifically defined.
  • the terms “installed”, “connected”, and “connection” should be understood in a broad sense, unless otherwise clearly specified and limited.
  • for example, a connection can be a fixed connection, a detachable connection, or an integral connection; it can be a mechanical connection, an electrical connection, or mutual communication; it can be a direct connection or an indirect connection through an intermediate medium; and it can be internal communication between two components or an interaction relationship between two components.
  • the simulated display screen touch module is connected to the head tracking module and is used to acquire the movement path of the sensing cursor in the display device.
  • a first feature being “above” or “below” a second feature may include the first and second features being in direct contact, or may include the first and second features being in contact not directly but through another feature between them.
  • a first feature being “on”, “above”, or “over” a second feature includes the first feature being directly above or obliquely above the second feature, or merely indicates that the first feature is at a higher level than the second feature.
  • a first feature being “under”, “below”, or “beneath” a second feature includes the first feature being directly below or obliquely below the second feature, or merely indicates that the first feature is at a lower level than the second feature.
  • the structure of the human-computer interaction system includes: a display device 10 and a terminal device 20.
  • the terminal device 20 is connected to the display device 10 through a connecting cable 5, and the connecting cable 5 is a USB Type-C cable (i.e., a USB Type-C interface) or a DP (DisplayPort) cable.
  • the purpose of this arrangement is that the display device is connected to the terminal device through a USB Type-C cable or a DP cable, communicates with the terminal device through the USB-TP connection protocol, and projects the display content output by the display screen of the terminal device on the display device through that protocol.
  • the display device 10 is an Augmented Reality (AR) or Virtual Reality (VR) device, i.e., a wearable display device.
  • the display device 10 includes: a head tracking module 1 and a simulated display screen touch module 2.
  • the head tracking module 1 is used to acquire the movement path of the display device 10. It should be noted that the head tracking module 1 is mainly composed of gyroscope sensors: when the user is wearing the AR/VR device, it captures head movement to obtain the movement trajectory of the AR/VR device, and this trajectory is the movement path of the display device 10.
  • the simulated display screen touch module 2 is connected to the head tracking module 1 and is used to acquire the movement path of the sensing cursor in the display device 10.
  • the function of the simulated display screen touch module 2 is realized by the display device 10 communicating with the terminal device 20 through the USB-TP connection protocol; combined with the head tracking module 1, it simulates the movement path of the sensing cursor on the AR/VR device.
  • the terminal device 20 is a mobile phone, and the terminal device includes: a display screen touch module 3 and a voice recognition module 4.
  • the display screen touch module 3 is used to obtain the touch path of the user touching the display screen of the terminal device and transmit it to the display device.
  • the mobile phone's display screen shows a blank application layout (that is, the screen is blank). The module acquires the touch path of the user's gesture input, recognizes upward swipe gestures starting from different positions at the bottom of the phone as well as single-finger tap gestures, and uploads the recognition results to the AR/VR device.
  • the voice recognition module 4 is connected to the display screen touch module 3, and the voice recognition module 4 is used to convert the user's voice into text.
  • the voice recognition module 4 helps users complete input functions in different applications. The user starts and ends the voice recognition function by pressing the display screen of the terminal device, and the voice recognition module converts the acquired voice into text output.
  • the advantage of this application is that the display device is connected to the terminal device through a USB Type-C cable or a DP cable and communicates with the terminal device through the USB-TP connection protocol, and the display content output by the display screen of the terminal device is projected on the display device through the USB-TP connection protocol.
  • the flowchart of the human-computer interaction method provided by this embodiment of the application includes the following steps:
  • Step S210: Detect whether a display device is connected.
  • the terminal device is connected to the display device through a connecting cable.
  • the connecting cable is a USB Type-C cable (i.e., a USB Type-C interface) or a DP (DisplayPort) cable.
  • the display device is an augmented reality (AR) or virtual reality (VR) device, and the display device includes a head tracking module and a simulated display screen touch module.
  • the display device communicates with the terminal device through the USB-TP connection protocol, and projects the display content output by the display screen of the terminal device on the display device.
  • Step S220: When a connection to the display device is detected, acquire the touch path of the user touching the display screen of the terminal device.
  • the display screen of the terminal device displays a blank application layout (that is, the screen is blank).
  • Acquire the touch path of the user's gesture input, recognize upward swipe gestures starting from different positions at the bottom of the terminal device and single-finger tap gestures, and upload the recognition results to the AR/VR device.
  • Upward swipe gestures from different positions at the bottom of the terminal device can be divided into the scenarios of steps S230-S233:
  • Step S230: When the touch path is a straight line in the middle of the display screen of the terminal device, display the main interface of the display screen of the terminal device, and project the content of the main interface to the user's AR/VR device.
  • Step S231: When the touch path is a straight line on either side of the display screen of the terminal device, display the display interface previous to the current display interface, and project the content of that interface to the user's AR/VR device.
  • Step S232: When the touch path is a polyline, activate the sensing cursor in the display device.
  • Through steps S230, S231, and S232, a touch panel device is simulated on the display screen of the terminal device in specific operation scenarios, and operations on the display screen of the terminal device are combined with operations on the display device to complete the switching and operation of third-party applications in the terminal device.
  • According to the user's different touch paths, the terminal device performs the corresponding operations and projects the final display content to the display device through the connecting cable, creating a good immersive experience for the user.
  • Step S232 activates the sensing cursor through the terminal device.
  • In other embodiments, the sensing cursor may also be activated through the display device, for example by steps S221 and S233.
  • Step S221: The display device acquires the movement path of the display device.
  • the movement path is obtained by the gyroscope sensor in the display device.
  • Step S233: When the movement path is a polyline, the display device activates the sensing cursor in the display device.
  • Step S240: Detect whether the pressure value of the user touching the display screen of the terminal device is greater than a preset value.
  • the user's voice is converted into text to help the user complete input in different applications; the user starts and ends the voice recognition function by pressing the display screen of the terminal device, and the voice recognition module converts the acquired voice into text output.
  • Step S250: When it is detected that the pressure value of the user touching the display screen of the terminal device is greater than the preset value, turn on the voice recognition service.
  • Step S251: When it is detected that the pressure value of the user touching the display screen of the terminal device is not greater than the preset value, turn off the voice recognition service.
  • In steps S250 and S251, the user controls the duration of voice input through the pressure applied to the display screen of the terminal device.
  • the advantage of this application is that the display device is connected to the terminal device through a USB Type-C cable or a DP cable and communicates with the terminal device through the USB-TP connection protocol, and the display content output by the display screen of the terminal device is projected on the display device through the USB-TP connection protocol.
  • a touch panel device is simulated on the display screen of the terminal device, and operations on the display screen of the terminal device are combined with operations on the display device to complete the switching and operation of third-party applications in the terminal device. According to the user's different touch paths, the terminal device performs the corresponding operations and projects the final display content to the display device through the connecting cable, creating a good immersive experience for the user.
  • the embodiment of the present application also provides a terminal device, which may be a device such as a smart phone or a tablet computer.
  • the terminal device 200 includes a processor 201 and a memory 202, where the processor 201 and the memory 202 are electrically connected.
  • the processor 201 is the control center of the terminal device 200. It connects the various parts of the entire terminal device through various interfaces and lines, and executes the various functions of the terminal device and processes data by running or loading application programs stored in the memory 202 and calling data stored in the memory 202, thereby monitoring the terminal device as a whole.
  • the terminal device 200 is provided with multiple storage partitions, and the multiple storage partitions include a system partition and a target partition.
  • the processor 201 in the terminal device 200 loads the instructions corresponding to the processes of one or more application programs into the memory 202 according to the following steps, and runs the application programs stored in the memory 202, thereby realizing various functions:
  • detect whether a display device is connected; when a connection to the display device is detected, acquire the touch path of the user touching the display screen of the terminal device; when the touch path is a straight line in the middle of the display screen, display the main interface of the display screen of the terminal device; when the touch path is a straight line on either side of the display screen, display the display interface previous to the current display interface; and when the touch path is a polyline, activate the sensing cursor in the display device.
  • Fig. 4 shows a specific structural block diagram of a terminal device provided in an embodiment of the present application, and the terminal device may be used to implement the human-computer interaction method provided in the foregoing embodiments.
  • the terminal device 300 may be a smart phone or a tablet computer.
  • the RF circuit 310 is used to receive and send electromagnetic waves, realize the mutual conversion between electromagnetic waves and electrical signals, so as to communicate with a communication network or other devices.
  • the RF circuit 310 may include various existing circuit elements for performing these functions, for example, an antenna, a radio frequency transceiver, a digital signal processor, an encryption/decryption chip, a subscriber identity module (SIM) card, a memory, and so on.
  • the RF circuit 310 can communicate with various networks such as the Internet, an intranet, and a wireless network, or communicate with other devices through a wireless network.
  • the aforementioned wireless network may include a cellular telephone network, a wireless local area network, or a metropolitan area network.
  • the above-mentioned wireless network can use various communication standards, protocols, and technologies, including but not limited to Global System for Mobile Communication (GSM), Enhanced Data GSM Environment (EDGE), Wideband Code Division Multiple Access (WCDMA), Code Division Multiple Access (CDMA), Time Division Multiple Access (TDMA), Wireless Fidelity (Wi-Fi) (such as the Institute of Electrical and Electronics Engineers standards IEEE 802.11a, IEEE 802.11b, IEEE 802.11g, and/or IEEE 802.11n), Voice over Internet Protocol (VoIP), Worldwide Interoperability for Microwave Access (Wi-Max), other protocols used for mail, instant messaging, and short messages, and any other suitable communication protocols, even those that have not yet been developed.
  • the memory 320 can be used to store software programs and modules, such as the program instructions/modules corresponding to the human-computer interaction method in the above embodiments.
  • the processor 380 executes various functional applications and data processing by running the software programs and modules stored in the memory 320, thereby realizing the function of human-computer interaction.
  • the memory 320 may include a high-speed random access memory, and may also include a non-volatile memory, such as one or more magnetic storage devices, flash memory, or other non-volatile solid-state memory.
  • the memory 320 may further include memories remotely located relative to the processor 380, and these remote memories may be connected to the terminal device 300 through a network. Examples of the aforementioned networks include, but are not limited to, the Internet, corporate intranets, local area networks, mobile communication networks, and combinations thereof.
  • the input unit 330 may be used to receive inputted digital or character information, and generate keyboard, mouse, joystick, optical or trackball signal input related to user settings and function control.
  • the input unit 330 may include a touch-sensitive surface 331 and other input devices 332.
  • the touch-sensitive surface 331, also called a touch screen or a touchpad, can collect the user's touch operations on or near it (for example, operations performed by the user on or near the touch-sensitive surface 331 with a finger, a stylus, or any other suitable object or accessory) and drive the corresponding connection device according to a preset program.
  • the touch-sensitive surface 331 may include two parts: a touch detection device and a touch controller.
  • the touch detection device detects the user's touch position and the signal brought by the touch operation, and transmits the signal to the touch controller; the touch controller receives the touch information from the touch detection device, converts it into contact coordinates, sends them to the processor 380, and can receive and execute the commands sent by the processor 380.
  • the touch-sensitive surface 331 can be implemented in multiple types, such as resistive, capacitive, infrared, and surface acoustic wave.
  • the input unit 330 may also include other input devices 332.
  • the other input device 332 may include, but is not limited to, one or more of a physical keyboard, function keys (such as a volume control button, a switch button, etc.), a trackball, a mouse, and a joystick.
  • the display unit 340 may be used to display information input by the user or information provided to the user and various graphical user interfaces of the terminal device 300. These graphical user interfaces may be composed of graphics, text, icons, videos, and any combination thereof.
  • the display unit 340 may include a display panel 341.
  • the display panel 341 may be configured in the form of an LCD (Liquid Crystal Display), an OLED (Organic Light-Emitting Diode), or the like.
  • the touch-sensitive surface 331 may cover the display panel 341.
  • when the touch-sensitive surface 331 detects a touch operation on or near it, it transmits the operation to the processor 380 to determine the type of the touch event, and the processor 380 then provides a corresponding visual output on the display panel 341 according to the type of the touch event.
  • although the touch-sensitive surface 331 and the display panel 341 are used as two independent components to implement the input and output functions, in some embodiments the touch-sensitive surface 331 and the display panel 341 can be integrated to implement the input and output functions.
  • the terminal device 300 may also include at least one sensor 350, such as a light sensor, a motion sensor, and other sensors.
  • the light sensor may include an ambient light sensor and a proximity sensor.
  • the ambient light sensor can adjust the brightness of the display panel 341 according to the brightness of the ambient light, and the proximity sensor can turn off the display panel 341 and/or the backlight when the terminal device 300 is moved close to the ear.
  • as a kind of motion sensor, the gravity acceleration sensor can detect the magnitude of acceleration in various directions (usually three axes) and can detect the magnitude and direction of gravity when stationary; it can be used for applications that recognize the phone's posture (such as portrait/landscape switching, related games, and magnetometer posture calibration) and for vibration-recognition functions (such as pedometers and tapping).
  • the terminal device 300 can also be configured with other sensors, such as a gyroscope, a barometer, a hygrometer, a thermometer, and an infrared sensor, which will not be described here.
  • the audio circuit 360, the speaker 361, and the microphone 362 can provide an audio interface between the user and the terminal device 300.
  • the audio circuit 360 can transmit the electrical signal converted from the received audio data to the speaker 361, which converts it into a sound signal for output; on the other hand, the microphone 362 converts the collected sound signal into an electrical signal, which is received by the audio circuit 360 and converted into audio data. The audio data is then output to the processor 380 for processing and sent, via the RF circuit 310, to another terminal for example, or output to the memory 320 for further processing.
  • the audio circuit 360 may also include an earphone jack to provide communication between a peripheral earphone and the terminal device 300.
  • the terminal device 300 can help users send and receive emails, browse webpages, and access streaming media through the transmission module 370 (for example, a Wi-Fi module), and it provides users with wireless broadband Internet access.
  • although Fig. 4 shows the transmission module 370, it is understandable that it is not an essential component of the terminal device 300 and can be omitted as needed without changing the essence of the invention.
  • the processor 380 is the control center of the terminal device 300. It connects the various parts of the entire mobile phone through various interfaces and lines, and executes the various functions of the terminal device 300 and processes data by running or executing software programs and/or modules stored in the memory 320 and calling data stored in the memory 320, thereby monitoring the mobile phone as a whole.
  • the processor 380 may include one or more processing cores; in some embodiments, the processor 380 may integrate an application processor and a modem processor, where the application processor mainly handles the operating system, user interface, and application programs, and the modem processor mainly handles wireless communication. It can be understood that the modem processor may also not be integrated into the processor 380.
  • the terminal device 300 also includes a power source 390 (such as a battery) for supplying power to various components.
  • the power source may be logically connected to the processor 380 through a power management system, so as to manage charging, discharging, power consumption, and other functions through the power management system.
  • the power supply 390 may also include any components such as one or more DC or AC power supplies, a recharging system, a power failure detection circuit, a power converter or inverter, and a power status indicator.
  • the terminal device 300 may also include a camera (such as a front camera, a rear camera), a Bluetooth module, etc., which will not be repeated here.
  • the display unit of the terminal device is a touch screen display, and the terminal device also includes a memory and one or more programs.
  • One or more programs are stored in the memory and configured to be executed by one or more processors.
  • the above processor executes the one or more programs, which include instructions for performing the following operations:
  • detect whether a display device is connected; when a connection to the display device is detected, acquire the touch path of the user touching the display screen of the terminal device; when the touch path is a straight line in the middle of the display screen, display the main interface of the display screen of the terminal device; when the touch path is a straight line on either side of the display screen, display the display interface previous to the current display interface; and when the touch path is a polyline, activate the sensing cursor in the display device.
  • each of the above modules can be implemented as an independent entity, or can be combined arbitrarily, and implemented as the same or several entities.
  • for the specific implementation of each of the above modules, please refer to the previous method embodiments, which will not be repeated here.
  • an embodiment of the present application provides a storage medium in which multiple instructions are stored, and the instructions can be loaded by a processor to execute the steps in any human-computer interaction method provided in the embodiments of the present application.
  • the storage medium may include: a read-only memory (ROM), a random access memory (RAM), a magnetic disk, an optical disc, and the like.

Landscapes

  • Engineering & Computer Science (AREA)
  • Theoretical Computer Science (AREA)
  • General Engineering & Computer Science (AREA)
  • Physics & Mathematics (AREA)
  • General Physics & Mathematics (AREA)
  • Human Computer Interaction (AREA)
  • Computer Hardware Design (AREA)
  • User Interface Of Digital Computer (AREA)

Abstract

This application discloses a human-computer interaction system, a method thereof, and a terminal device. In this application, a display device communicates with a terminal device through the USB-TP connection protocol, and the display content output by the display screen of the terminal device is projected on the display device.

Description

Human-computer interaction system and method, and terminal device
This application claims priority to Chinese patent application No. 201911382440.7, entitled "Human-computer interaction system and method, and terminal device", filed with the Chinese Patent Office on December 27, 2019, the entire contents of which are incorporated herein by reference.
Technical Field
This application relates to the field of computer technology, and in particular to a human-computer interaction system, a method thereof, and a terminal device.
Background
Augmented Reality (AR) / Virtual Reality (VR) devices free users from the physical size limits of a terminal device's display screen, letting them browse the information they want to view on a larger display anytime, anywhere. AR technology can not only present real-world information but also display virtual information at the same time; the two kinds of information complement and superimpose each other, projecting virtual information into the real world where it is perceived by the human senses. VR technology, in turn, can render an entire scene, giving humans a sensory experience beyond reality.
However, AR/VR devices on the market have the following problems. First, their application ecosystem is scarce and their content is monotonous, so users cannot obtain a diversified experience. Second, ordinary AR/VR devices on the market are restricted by third-party mobile phone permissions, which limits the use of third-party applications and affects the user experience.
Technical Problem
The embodiments of this application provide a human-computer interaction system, a method thereof, and a terminal device, which can effectively solve the problem that a user wearing an AR/VR device currently cannot operate the terminal device synchronously.
Technical Solution
According to one aspect of this application, an embodiment of this application provides a human-computer interaction system, including: a display device; and a terminal device, the terminal device being connected to the display device through a connecting cable; wherein the display device communicates with the terminal device through the USB-TP connection protocol to project, on the display device, the display content output by the display screen of the terminal device.
Further, the display device includes a head tracking module for acquiring the movement path of the display device.
Further, the display device includes a simulated display screen touch module, connected to the head tracking module, for acquiring the movement path of the sensing cursor in the display device.
Further, the terminal device includes a display screen touch module for acquiring the touch path of the user touching the display screen of the terminal device and transmitting it to the display device.
Further, the terminal device also includes a voice recognition module connected to the display screen touch module, and the voice recognition module is used to convert the user's voice into text.
Further, the connecting cable is a USB Type-C cable or a DP cable.
According to another aspect of this application, an embodiment of this application provides a human-computer interaction method, including the following steps: detecting whether a display device is connected; when a connection to the display device is detected, acquiring the touch path of the user touching the display screen of the terminal device; when the touch path is a straight line in the middle of the display screen of the terminal device, displaying the main interface of the display screen of the terminal device; when the touch path is a straight line on either side of the display screen of the terminal device, displaying the display interface previous to the current display interface of the display screen of the terminal device; and when the touch path is a polyline, activating the sensing cursor in the display device.
Further, after the step of activating the sensing cursor in the display device when the touch path is a polyline, the method further includes the steps of: detecting whether the pressure value of the user touching the display screen of the terminal device is greater than a preset value; when it is detected that the pressure value of the user touching the display screen of the terminal device is greater than the preset value, turning on the voice recognition service; and when it is detected that the pressure value of the user touching the display screen of the terminal device is not greater than the preset value, turning off the voice recognition service.
Further, when a connection to the display device is detected, the method further includes the steps of: the display device acquiring the movement path of the display device; and when the movement path is a polyline, the display device activating the sensing cursor in the display device.
According to yet another aspect of this application, an embodiment of this application provides a terminal device, including a processor and a memory, the processor being electrically connected to the memory; the memory is used to store instructions and data, and the processor is used to execute the steps of a human-computer interaction method, the human-computer interaction method including the following steps: detecting whether a display device is connected; when a connection to the display device is detected, acquiring the touch path of the user touching the display screen of the terminal device; when the touch path is a straight line in the middle of the display screen of the terminal device, displaying the main interface of the display screen of the terminal device; when the touch path is a straight line on either side of the display screen of the terminal device, displaying the display interface previous to the current display interface of the display screen of the terminal device; and when the touch path is a polyline, activating the sensing cursor in the display device.
Further, after the step of activating the sensing cursor in the display device when the touch path is a polyline, the method further includes the steps of: detecting whether the pressure value of the user touching the display screen of the terminal device is greater than a preset value; when it is detected that the pressure value of the user touching the display screen of the terminal device is greater than the preset value, turning on the voice recognition service; and when it is detected that the pressure value of the user touching the display screen of the terminal device is not greater than the preset value, turning off the voice recognition service.
Further, when a connection to the display device is detected, the method further includes the steps of: the display device acquiring the movement path of the display device; and when the movement path is a polyline, the display device activating the sensing cursor in the display device.
Beneficial Effects
The advantage of this application is that the display device is connected to the terminal device through a USB Type-C cable or a DP cable and communicates with the terminal device through the USB-TP connection protocol, and the display content output by the display screen of the terminal device is projected on the display device through the USB-TP connection protocol. In addition, in specific operation scenarios, a touch panel device is simulated on the display screen of the terminal device, and operations on the display screen of the terminal device are combined with operations on the display device to complete the switching and operation of third-party applications in the terminal device. According to the user's different touch paths, the terminal device performs the corresponding operations and projects the final display content to the display device through the connecting cable, creating a good immersive experience for the user.
Brief Description of the Drawings
In order to explain the technical solutions in the embodiments of this application more clearly, the drawings needed in the description of the embodiments are briefly introduced below. Obviously, the drawings in the following description are only some embodiments of this application, and those skilled in the art can obtain other drawings from these drawings without creative work.
Fig. 1 is a schematic structural diagram of the human-computer interaction system provided by an embodiment of this application.
Fig. 2 is a flowchart of the steps of the human-computer interaction method provided by an embodiment of this application.
Fig. 3 is a schematic structural diagram of the terminal device provided by an embodiment of this application.
Fig. 4 is another schematic structural diagram of the terminal device provided by an embodiment of this application.
Embodiments of the Application
The technical solutions in the embodiments of this application will be described clearly and completely below with reference to the drawings in the embodiments of this application. Obviously, the described embodiments are only a part of the embodiments of this application, not all of them. Based on the embodiments in this application, all other embodiments obtained by those skilled in the art without creative work fall within the scope of protection of this application.
In the description of this application, it should be understood that orientation or position relationships indicated by terms such as “center”, “longitudinal”, “transverse”, “length”, “width”, “thickness”, “upper”, “lower”, “front”, “rear”, “left”, “right”, “vertical”, “horizontal”, “top”, “bottom”, “inner”, “outer”, “clockwise”, and “counterclockwise” are based on the orientation or position relationships shown in the drawings; they are only for the convenience of describing this application and simplifying the description, and do not indicate or imply that the referred device or element must have a specific orientation or be constructed and operated in a specific orientation, and therefore cannot be understood as limiting this application. In addition, the terms “first” and “second” are only used for descriptive purposes and cannot be understood as indicating or implying relative importance or implicitly indicating the number of the indicated technical features. Therefore, features defined with “first” and “second” may explicitly or implicitly include one or more of those features. In the description of this application, “multiple” means two or more than two, unless otherwise specifically defined.
In the description of this application, it should be noted that, unless otherwise clearly specified and limited, the terms “installed”, “connected”, and “connection” should be understood in a broad sense: for example, a connection can be a fixed connection, a detachable connection, or an integral connection; it can be a mechanical connection, an electrical connection, or mutual communication; it can be a direct connection or an indirect connection through an intermediate medium; and it can be internal communication between two components or an interaction relationship between two components. For those of ordinary skill in the art, the specific meanings of the above terms in this application can be understood according to specific circumstances. In this embodiment, the simulated display screen touch module is connected to the head tracking module and is used to acquire the movement path of the sensing cursor in the display device.
In this application, unless otherwise clearly specified and limited, a first feature being “above” or “below” a second feature may include the first and second features being in direct contact, or may include the first and second features being in contact not directly but through another feature between them. Moreover, a first feature being “on”, “above”, or “over” a second feature includes the first feature being directly above or obliquely above the second feature, or merely indicates that the first feature is at a higher level than the second feature. A first feature being “under”, “below”, or “beneath” a second feature includes the first feature being directly below or obliquely below the second feature, or merely indicates that the first feature is at a lower level than the second feature.
The following disclosure provides many different embodiments or examples for realizing different structures of this application. To simplify the disclosure of this application, the components and arrangements of specific examples are described below. Of course, they are only examples and are not intended to limit this application. In addition, this application may repeat reference numerals and/or reference letters in different examples; such repetition is for the purpose of simplification and clarity and does not in itself indicate the relationship between the various embodiments and/or arrangements discussed. Furthermore, this application provides examples of various specific processes and materials, but those of ordinary skill in the art will be aware of the application of other processes and/or the use of other materials.
As shown in Fig. 1, the human-computer interaction system provided by an embodiment of this application includes: a display device 10 and a terminal device 20.
The terminal device 20 is connected to the display device 10 through a connecting cable 5, and the connecting cable 5 is a USB Type-C cable (i.e., a USB Type-C interface) or a DP (DisplayPort) cable. The purpose of this arrangement is that the display device is connected to the terminal device through a USB Type-C cable or a DP cable, communicates with the terminal device through the USB-TP connection protocol, and projects the display content output by the display screen of the terminal device on the display device through the USB-TP connection protocol.
In this embodiment, the display device 10 is an Augmented Reality (AR) or Virtual Reality (VR) device, i.e., a wearable display device. The display device 10 includes: a head tracking module 1 and a simulated display screen touch module 2.
In this embodiment, the head tracking module 1 is used to acquire the movement path of the display device 10. It should be noted that the head tracking module 1 is mainly composed of gyroscope sensors: when the user is wearing the AR/VR device, it captures head movement events to obtain the movement trajectory of the AR/VR device, and this trajectory is the movement path of the display device 10.
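As an illustration of this head-tracking step, the following is a minimal Kotlin sketch (not part of the patent) of how gyroscope samples might be integrated into such a movement path. The GyroSample type, the yaw/pitch axes, and the sampling format are assumptions made for the example; the embodiment only states that the module is mainly composed of gyroscope sensors.

```kotlin
// Hypothetical sketch: integrate gyroscope angular-velocity samples
// into a head movement path. All types and axes are illustrative.
data class GyroSample(val yawRate: Float, val pitchRate: Float, val dtSec: Float)
data class PathPoint(val yaw: Float, val pitch: Float)

// Accumulate angular velocity over time so the resulting path of head
// orientations can later be tested for the "polyline" condition (S233).
fun movementPath(samples: List<GyroSample>): List<PathPoint> {
    var yaw = 0f
    var pitch = 0f
    return samples.map { s ->
        yaw += s.yawRate * s.dtSec
        pitch += s.pitchRate * s.dtSec
        PathPoint(yaw, pitch)
    }
}
```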
In this embodiment, the simulated display screen touch module 2 is connected to the head tracking module 1 and is used to acquire the movement path of the sensing cursor in the display device 10. The function of the simulated display screen touch module 2 is realized by the display device 10 communicating with the terminal device 20 through the USB-TP connection protocol; combined with the head tracking module 1, it simulates the movement path of the sensing cursor on the AR/VR device.
In this embodiment, the terminal device 20 is a mobile phone, and the terminal device includes: a display screen touch module 3 and a voice recognition module 4.
In this embodiment, the display screen touch module 3 is used to acquire the touch path of the user touching the display screen of the terminal device and transmit it to the display device. After the user turns on the AR/VR device, the mobile phone's display screen shows a blank application layout (that is, the screen is blank). The module acquires the touch path of the user's gesture input, recognizes upward swipe gestures starting from different positions at the bottom of the phone as well as single-finger tap gestures, and uploads the recognition results to the AR/VR device.
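A minimal Kotlin sketch of this gesture recognition follows; it is an illustration, not the patent's implementation. The split of the screen into thirds and the 45-degree bend threshold are assumptions: the embodiments only distinguish a straight swipe in the middle (S230), a straight swipe on either side (S231), and a polyline (S232).

```kotlin
import kotlin.math.PI
import kotlin.math.cos
import kotlin.math.hypot

data class Point(val x: Float, val y: Float)

enum class Gesture { SHOW_HOME, SHOW_PREVIOUS, ACTIVATE_CURSOR, UNKNOWN }

// Classify a touch path recorded from the bottom of the screen.
// Assumed rule: a polyline activates the cursor (S232); a straight
// swipe starting in the middle third shows the home interface (S230);
// a straight swipe starting in either side third goes back (S231).
fun classify(path: List<Point>, screenWidth: Float): Gesture {
    if (path.size < 2) return Gesture.UNKNOWN
    if (isPolyline(path)) return Gesture.ACTIVATE_CURSOR
    val third = screenWidth / 3f
    return if (path.first().x in third..2 * third) Gesture.SHOW_HOME
    else Gesture.SHOW_PREVIOUS
}

// Assumed polyline test: some interior vertex bends by more than ~45 degrees.
fun isPolyline(path: List<Point>): Boolean {
    val cos45 = cos(PI / 4).toFloat()
    for (i in 1 until path.size - 1) {
        val (ax, ay) = path[i - 1]
        val (bx, by) = path[i]
        val (cx, cy) = path[i + 1]
        val (ux, uy) = bx - ax to by - ay
        val (vx, vy) = cx - bx to cy - by
        val nu = hypot(ux, uy)
        val nv = hypot(vx, vy)
        if (nu == 0f || nv == 0f) continue
        if ((ux * vx + uy * vy) / (nu * nv) < cos45) return true
    }
    return false
}
```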
In this embodiment, the voice recognition module 4 is connected to the display screen touch module 3, and the voice recognition module 4 is used to convert the user's voice into text. The voice recognition module 4 helps the user complete input in different applications: the user starts and ends the voice recognition function by pressing the display screen of the terminal device, and the voice recognition module converts the acquired voice into text output.
The advantage of this application is that the display device is connected to the terminal device through a USB Type-C cable or a DP cable and communicates with the terminal device through the USB-TP connection protocol, and the display content output by the display screen of the terminal device is projected on the display device through the USB-TP connection protocol.
As shown in Fig. 2, the human-computer interaction method provided by an embodiment of this application includes the following steps:
Step S210: Detect whether a display device is connected.
In this embodiment, the terminal device is connected to the display device through a connecting cable, which is a USB Type-C cable (i.e., a USB Type-C interface) or a DP (DisplayPort) cable. The display device is an Augmented Reality (AR) or Virtual Reality (VR) device and includes: a head tracking module and a simulated display screen touch module. The display device communicates with the terminal device through the USB-TP connection protocol and projects, on the display device, the display content output by the display screen of the terminal device.
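A hedged Kotlin sketch of step S210 follows, for illustration only. The DisplayProbe interface stands in for whatever platform call reports an attached USB Type-C or DP display; the patent names the USB-TP connection protocol but does not specify a detection API.

```kotlin
// Hypothetical probe for an attached external display; a real
// implementation would query the platform's display manager.
fun interface DisplayProbe { fun isConnected(): Boolean }

// Step S210: poll until a display device is detected, then hand off
// to the gesture-acquisition step (S220 onward).
fun waitForDisplay(probe: DisplayProbe, pollMillis: Long = 500, onConnected: () -> Unit) {
    while (!probe.isConnected()) {
        Thread.sleep(pollMillis)
    }
    onConnected() // start acquiring touch paths and projecting output
}
```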
Step S220: When a connection to the display device is detected, acquire the touch path of the user touching the display screen of the terminal device.
In this embodiment, after the user turns on the AR/VR device, the display screen of the terminal device shows a blank application layout (that is, the screen is blank). The touch path of the user's gesture input is acquired; upward swipe gestures starting from different positions at the bottom of the terminal device and single-finger tap gestures are recognized, and the recognition results are uploaded to the AR/VR device.
Upward swipe gestures from different positions at the bottom of the terminal device can be divided into the scenarios of steps S230 to S233:
Step S230: When the touch path is a straight line in the middle of the display screen of the terminal device, display the main interface of the display screen of the terminal device, and project the content of the main interface to the user's AR/VR device.
Step S231: When the touch path is a straight line on either side of the display screen of the terminal device, display the display interface previous to the current display interface of the display screen of the terminal device, and project the content of that interface to the user's AR/VR device.
Step S232: When the touch path is a polyline, activate the sensing cursor in the display device.
Through steps S230, S231, and S232, in specific operation scenarios, a touch panel device is simulated on the display screen of the terminal device, and operations on the display screen of the terminal device are combined with operations on the display device to complete the switching and operation of third-party applications in the terminal device. According to the user's different touch paths, the terminal device performs the corresponding operations and projects the final display content to the display device through the connecting cable, creating a good immersive experience for the user.
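To tie steps S230 to S232 together, the following Kotlin sketch dispatches a classified touch path (such as the output of the classifier sketch above) to the matching operation. The goHome(), goBack(), activateCursor(), and project() callbacks are hypothetical hooks into the terminal's UI and the projection link, not APIs named by the patent.

```kotlin
enum class SwipeResult { HOME, BACK, CURSOR }

// Hypothetical dispatch: perform the operation matching the classified
// touch path, then push the resulting frame to the AR/VR display.
fun dispatch(
    result: SwipeResult,
    goHome: () -> Unit,          // S230: show the main interface
    goBack: () -> Unit,          // S231: show the previous interface
    activateCursor: () -> Unit,  // S232: activate the sensing cursor
    project: () -> Unit,         // mirror output over the connecting cable
) {
    when (result) {
        SwipeResult.HOME -> goHome()
        SwipeResult.BACK -> goBack()
        SwipeResult.CURSOR -> activateCursor()
    }
    project()
}
```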
Step S232 activates the sensing cursor through the terminal device. In other embodiments, the sensing cursor can also be activated through the display device, for example by steps S221 and S233.
Step S221: The display device acquires the movement path of the display device.
In this embodiment, the movement path is obtained by the gyroscope sensor in the display device.
Step S233: When the movement path is a polyline, the display device activates the sensing cursor in the display device.
Step S240: Detect whether the pressure value of the user touching the display screen of the terminal device is greater than a preset value.
In this embodiment, the user's voice is converted into text, helping the user complete input in different applications. The user starts and ends the voice recognition function by pressing the display screen of the terminal device, and the voice recognition module converts the acquired voice into text output.
Step S250: When it is detected that the pressure value of the user touching the display screen of the terminal device is greater than the preset value, turn on the voice recognition service.
Step S251: When it is detected that the pressure value of the user touching the display screen of the terminal device is not greater than the preset value, turn off the voice recognition service.
In steps S250 and S251, the user controls the duration of voice input through the pressure applied to the display screen of the terminal device.
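A minimal Kotlin sketch of this press-to-talk behaviour follows; it is an illustration rather than the patent's implementation, and the pressure threshold and recognizer start/stop callbacks are assumptions.

```kotlin
// Hypothetical press-to-talk controller: speech recognition runs only
// while the touch pressure stays above the preset value.
class PressToTalk(
    private val threshold: Float,
    private val startRecognition: () -> Unit, // S250: open the service
    private val stopRecognition: () -> Unit,  // S251: close the service
) {
    private var active = false

    // Feed each pressure reading from the touch screen into this method.
    fun onPressure(value: Float) {
        if (value > threshold && !active) {
            active = true
            startRecognition()
        } else if (value <= threshold && active) {
            active = false
            stopRecognition()
        }
    }
}
```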
The advantage of this application is that the display device is connected to the terminal device through a USB Type-C cable or a DP cable and communicates with the terminal device through the USB-TP connection protocol, and the display content output by the display screen of the terminal device is projected on the display device through the USB-TP connection protocol. In addition, in specific operation scenarios, a touch panel device is simulated on the display screen of the terminal device, and operations on the display screen of the terminal device are combined with operations on the display device to complete the switching and operation of third-party applications in the terminal device. According to the user's different touch paths, the terminal device performs the corresponding operations and projects the final display content to the display device through the connecting cable, creating a good immersive experience for the user.
In addition, an embodiment of this application also provides a terminal device, which may be a smart phone, a tablet computer, or a similar device. Specifically, as shown in Fig. 3, the terminal device 200 includes a processor 201 and a memory 202, where the processor 201 and the memory 202 are electrically connected.
The processor 201 is the control center of the terminal device 200. It connects the various parts of the entire terminal device through various interfaces and lines, and executes the various functions of the terminal device and processes data by running or loading application programs stored in the memory 202 and calling data stored in the memory 202, thereby monitoring the terminal device as a whole.
In this embodiment, the terminal device 200 is provided with multiple storage partitions, including a system partition and a target partition. The processor 201 in the terminal device 200 loads the instructions corresponding to the processes of one or more application programs into the memory 202 according to the following steps, and runs the application programs stored in the memory 202, thereby realizing various functions:
Detect whether a display device is connected;
When a connection to the display device is detected, acquire the touch path of the user touching the display screen of the terminal device;
When the touch path is a straight line in the middle of the display screen of the terminal device, display the main interface of the display screen of the terminal device;
When the touch path is a straight line on either side of the display screen of the terminal device, display the display interface previous to the current display interface of the display screen of the terminal device; and
When the touch path is a polyline, activate the sensing cursor in the display device.
Fig. 4 shows a specific structural block diagram of the terminal device provided by an embodiment of this application, and the terminal device can be used to implement the human-computer interaction method provided in the above embodiments. The terminal device 300 may be a smart phone or a tablet computer.
The RF circuit 310 is used to receive and send electromagnetic waves and to realize mutual conversion between electromagnetic waves and electrical signals, so as to communicate with a communication network or other devices. The RF circuit 310 may include various existing circuit elements for performing these functions, for example, an antenna, a radio frequency transceiver, a digital signal processor, an encryption/decryption chip, a subscriber identity module (SIM) card, a memory, and so on. The RF circuit 310 can communicate with various networks, such as the Internet, an intranet, or a wireless network, or communicate with other devices through a wireless network. The aforementioned wireless network may include a cellular telephone network, a wireless local area network, or a metropolitan area network. The above wireless network can use various communication standards, protocols, and technologies, including but not limited to Global System for Mobile Communication (GSM), Enhanced Data GSM Environment (EDGE), Wideband Code Division Multiple Access (WCDMA), Code Division Multiple Access (CDMA), Time Division Multiple Access (TDMA), Wireless Fidelity (Wi-Fi) (such as the Institute of Electrical and Electronics Engineers standards IEEE 802.11a, IEEE 802.11b, IEEE 802.11g, and/or IEEE 802.11n), Voice over Internet Protocol (VoIP), Worldwide Interoperability for Microwave Access (Wi-Max), other protocols used for mail, instant messaging, and short messages, and any other suitable communication protocols, even those that have not yet been developed.
The memory 320 can be used to store software programs and modules, such as the program instructions/modules corresponding to the human-computer interaction method in the above embodiments. The processor 380 executes various functional applications and data processing by running the software programs and modules stored in the memory 320, thereby realizing the function of human-computer interaction. The memory 320 may include a high-speed random access memory and may also include a non-volatile memory, such as one or more magnetic storage devices, flash memories, or other non-volatile solid-state memories. In some examples, the memory 320 may further include memories remotely located relative to the processor 380, and these remote memories may be connected to the terminal device 300 through a network. Examples of the aforementioned networks include, but are not limited to, the Internet, corporate intranets, local area networks, mobile communication networks, and combinations thereof.
The input unit 330 can be used to receive input digital or character information and to generate keyboard, mouse, joystick, optical, or trackball signal input related to user settings and function control. Specifically, the input unit 330 may include a touch-sensitive surface 331 and other input devices 332. The touch-sensitive surface 331, also called a touch screen or a touchpad, can collect the user's touch operations on or near it (for example, operations performed by the user on or near the touch-sensitive surface 331 with a finger, a stylus, or any other suitable object or accessory) and drive the corresponding connection device according to a preset program. Optionally, the touch-sensitive surface 331 may include two parts: a touch detection device and a touch controller. The touch detection device detects the user's touch position and the signal brought by the touch operation, and transmits the signal to the touch controller; the touch controller receives the touch information from the touch detection device, converts it into contact coordinates, sends them to the processor 380, and can receive and execute the commands sent by the processor 380. In addition, the touch-sensitive surface 331 can be implemented in multiple types, such as resistive, capacitive, infrared, and surface acoustic wave. Besides the touch-sensitive surface 331, the input unit 330 may also include other input devices 332. Specifically, the other input devices 332 may include, but are not limited to, one or more of a physical keyboard, function keys (such as volume control buttons and switch buttons), a trackball, a mouse, and a joystick.
The display unit 340 can be used to display information input by the user or information provided to the user, as well as the various graphical user interfaces of the terminal device 300; these graphical user interfaces may be composed of graphics, text, icons, videos, and any combination thereof. The display unit 340 may include a display panel 341; optionally, the display panel 341 may be configured in the form of an LCD (Liquid Crystal Display), an OLED (Organic Light-Emitting Diode), or the like. Further, the touch-sensitive surface 331 may cover the display panel 341; when the touch-sensitive surface 331 detects a touch operation on or near it, it transmits the operation to the processor 380 to determine the type of the touch event, and the processor 380 then provides a corresponding visual output on the display panel 341 according to the type of the touch event. Although in Fig. 4 the touch-sensitive surface 331 and the display panel 341 are used as two independent components to implement the input and output functions, in some embodiments the touch-sensitive surface 331 and the display panel 341 can be integrated to implement the input and output functions.
The terminal device 300 may also include at least one sensor 350, such as a light sensor, a motion sensor, and other sensors. Specifically, the light sensor may include an ambient light sensor and a proximity sensor; the ambient light sensor can adjust the brightness of the display panel 341 according to the brightness of the ambient light, and the proximity sensor can turn off the display panel 341 and/or the backlight when the terminal device 300 is moved close to the ear. As a kind of motion sensor, the gravity acceleration sensor can detect the magnitude of acceleration in various directions (usually three axes) and can detect the magnitude and direction of gravity when stationary; it can be used for applications that recognize the phone's posture (such as portrait/landscape switching, related games, and magnetometer posture calibration) and for vibration-recognition functions (such as pedometers and tapping). As for the gyroscope, barometer, hygrometer, thermometer, infrared sensor, and other sensors that the terminal device 300 can also be configured with, they will not be described here.
The audio circuit 360, the speaker 361, and the microphone 362 can provide an audio interface between the user and the terminal device 300. The audio circuit 360 can transmit the electrical signal converted from the received audio data to the speaker 361, which converts it into a sound signal for output; on the other hand, the microphone 362 converts the collected sound signal into an electrical signal, which is received by the audio circuit 360 and converted into audio data. The audio data is then output to the processor 380 for processing and sent, via the RF circuit 310, to another terminal for example, or output to the memory 320 for further processing. The audio circuit 360 may also include an earphone jack to provide communication between a peripheral earphone and the terminal device 300.
The terminal device 300 can help users send and receive emails, browse webpages, and access streaming media through the transmission module 370 (for example, a Wi-Fi module), which provides users with wireless broadband Internet access. Although Fig. 4 shows the transmission module 370, it is understandable that it is not an essential component of the terminal device 300 and can be omitted as needed without changing the essence of the invention.
The processor 380 is the control center of the terminal device 300. It connects the various parts of the entire mobile phone through various interfaces and lines, and executes the various functions of the terminal device 300 and processes data by running or executing software programs and/or modules stored in the memory 320 and calling data stored in the memory 320, thereby monitoring the mobile phone as a whole. Optionally, the processor 380 may include one or more processing cores; in some embodiments, the processor 380 may integrate an application processor and a modem processor, where the application processor mainly handles the operating system, user interface, and application programs, and the modem processor mainly handles wireless communication. It can be understood that the modem processor may also not be integrated into the processor 380.
The terminal device 300 also includes a power source 390 (such as a battery) for supplying power to the various components. In some embodiments, the power source may be logically connected to the processor 380 through a power management system, so as to manage charging, discharging, power consumption, and other functions through the power management system. The power source 390 may also include any components such as one or more DC or AC power supplies, a recharging system, a power failure detection circuit, a power converter or inverter, and a power status indicator.
Although not shown, the terminal device 300 may also include a camera (such as a front camera and a rear camera), a Bluetooth module, and the like, which will not be repeated here. Specifically, in this embodiment, the display unit of the terminal device is a touch screen display, and the terminal device also includes a memory and one or more programs, where the one or more programs are stored in the memory and configured to be executed by one or more processors; the one or more programs include instructions for performing the following operations:
Detect whether a display device is connected;
When a connection to the display device is detected, acquire the touch path of the user touching the display screen of the terminal device;
When the touch path is a straight line in the middle of the display screen of the terminal device, display the main interface of the display screen of the terminal device;
When the touch path is a straight line on either side of the display screen of the terminal device, display the display interface previous to the current display interface of the display screen of the terminal device; and
When the touch path is a polyline, activate the sensing cursor in the display device.
In specific implementation, each of the above modules can be implemented as an independent entity, or combined arbitrarily and implemented as the same entity or several entities. For the specific implementation of each of the above modules, please refer to the previous method embodiments, which will not be repeated here.
Those of ordinary skill in the art can understand that all or part of the steps in the various methods of the above embodiments can be completed by instructions, or by instructions controlling related hardware; the instructions can be stored in a computer-readable storage medium and loaded and executed by a processor.
To this end, an embodiment of this application provides a storage medium in which multiple instructions are stored; the instructions can be loaded by a processor to execute the steps of any human-computer interaction method provided in the embodiments of this application.
The storage medium may include: a read-only memory (ROM), a random access memory (RAM), a magnetic disk, an optical disc, and the like.
Since the instructions stored in the storage medium can execute the steps of any human-computer interaction method provided in the embodiments of this application, they can achieve the beneficial effects achievable by any such method; see the previous embodiments for details, which will not be repeated here.
For the specific implementation of each of the above operations, please refer to the previous embodiments, which will not be repeated here.
In the above embodiments, the description of each embodiment has its own emphasis; for parts not described in detail in a certain embodiment, please refer to the relevant descriptions of other embodiments.
Specific examples are used herein to explain the principles and implementations of this application. The descriptions of the above embodiments are only used to help understand the technical solutions of this application and their core ideas. Those of ordinary skill in the art should understand that they can still modify the technical solutions recorded in the foregoing embodiments or equivalently replace some of the technical features therein; these modifications or replacements do not cause the essence of the corresponding technical solutions to depart from the scope of the technical solutions of the embodiments of this application.

Claims (12)

  1. A human-computer interaction system, comprising:
    a display device; and
    a terminal device, the terminal device being connected to the display device through a connecting cable;
    wherein the display device communicates with the terminal device through the USB-TP connection protocol to project, on the display device, the display content output by the display screen of the terminal device.
  2. The human-computer interaction system according to claim 1, wherein the display device comprises a head tracking module for acquiring the movement path of the display device.
  3. The human-computer interaction system according to claim 2, wherein the display device comprises a simulated display screen touch module, connected to the head tracking module, for acquiring the movement path of the sensing cursor in the display device.
  4. The human-computer interaction system according to claim 1, wherein the terminal device comprises a display screen touch module for acquiring the touch path of the user touching the display screen of the terminal device and transmitting it to the display device.
  5. The human-computer interaction system according to claim 4, wherein the terminal device further comprises a voice recognition module connected to the display screen touch module, the voice recognition module being used to convert the user's voice into text.
  6. The human-computer interaction system according to claim 1, wherein the connecting cable is a USB Type-C cable or a DP cable.
  7. A human-computer interaction method, the human-computer interaction method being based on the human-computer interaction system according to claim 1, comprising the following steps:
    detecting whether a display device is connected;
    when a connection to the display device is detected, acquiring the touch path of the user touching the display screen of the terminal device;
    when the touch path is a straight line in the middle of the display screen of the terminal device, displaying the main interface of the display screen of the terminal device;
    when the touch path is a straight line on either side of the display screen of the terminal device, displaying the display interface previous to the current display interface of the display screen of the terminal device; and
    when the touch path is a polyline, activating the sensing cursor in the display device.
  8. The human-computer interaction method according to claim 7, wherein after the step of activating the sensing cursor in the display device when the touch path is a polyline, the method further comprises the steps of:
    detecting whether the pressure value of the user touching the display screen of the terminal device is greater than a preset value; when it is detected that the pressure value of the user touching the display screen of the terminal device is greater than the preset value, turning on the voice recognition service; and
    when it is detected that the pressure value of the user touching the display screen of the terminal device is not greater than the preset value, turning off the voice recognition service.
  9. The human-computer interaction method according to claim 7, wherein when a connection to the display device is detected, the method further comprises the steps of:
    the display device acquiring the movement path of the display device; and
    when the movement path is a polyline, the display device activating the sensing cursor in the display device.
  10. A terminal device, comprising a processor and a memory, the processor being electrically connected to the memory, the memory being used to store instructions and data, and the processor being used to execute the steps of a human-computer interaction method, the human-computer interaction method comprising the following steps:
    detecting whether a display device is connected;
    when a connection to the display device is detected, acquiring the touch path of the user touching the display screen of the terminal device;
    when the touch path is a straight line in the middle of the display screen of the terminal device, displaying the main interface of the display screen of the terminal device;
    when the touch path is a straight line on either side of the display screen of the terminal device, displaying the display interface previous to the current display interface of the display screen of the terminal device; and
    when the touch path is a polyline, activating the sensing cursor in the display device.
  11. The terminal device according to claim 10, wherein after the step of activating the sensing cursor in the display device when the touch path is a polyline, the method further comprises the steps of:
    detecting whether the pressure value of the user touching the display screen of the terminal device is greater than a preset value; when it is detected that the pressure value of the user touching the display screen of the terminal device is greater than the preset value, turning on the voice recognition service; and
    when it is detected that the pressure value of the user touching the display screen of the terminal device is not greater than the preset value, turning off the voice recognition service.
  12. The terminal device according to claim 10, wherein when a connection to the display device is detected, the method further comprises the steps of:
    the display device acquiring the movement path of the display device; and
    when the movement path is a polyline, the display device activating the sensing cursor in the display device.
PCT/CN2020/076422 2019-12-27 2020-02-24 Human-computer interaction system and method, and terminal device WO2021128561A1 (zh)

Applications Claiming Priority (2)

Application Number Priority Date Filing Date Title
CN201911382440.7A CN111104042A (zh) 2019-12-27 2019-12-27 Human-computer interaction system and method, and terminal device
CN201911382440.7 2019-12-27

Publications (1)

Publication Number Publication Date
WO2021128561A1 true WO2021128561A1 (zh) 2021-07-01

Family

ID=70423503

Family Applications (1)

Application Number Title Priority Date Filing Date
PCT/CN2020/076422 WO2021128561A1 (zh) 2019-12-27 2020-02-24 人机交互系统及其方法、终端设备

Country Status (2)

Country Link
CN (1) CN111104042A (zh)
WO (1) WO2021128561A1 (zh)

Citations (6)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN102156513A (zh) * 2011-04-20 2011-08-17 上海交通大学 穿戴式多媒体服装系统
CN102595244A (zh) * 2012-02-24 2012-07-18 康佳集团股份有限公司 基于多点触摸手势切换电视界面的人机交互方法及系统
CN104836886A (zh) * 2014-12-19 2015-08-12 北汽福田汽车股份有限公司 手机操控车载游戏的实现方法
CN105652442A (zh) * 2015-12-31 2016-06-08 北京小鸟看看科技有限公司 头戴显示设备及该头戴显示设备与智能终端的交互方法
CN106843511A (zh) * 2017-04-14 2017-06-13 上海翊视皓瞳信息科技有限公司 一种全场景覆盖的智能显示设备系统及应用
CN108513016A (zh) * 2018-05-25 2018-09-07 中瑞福宁机器人(沈阳)有限公司 一种可搭配ar眼镜使用的智能手机装置

Family Cites Families (9)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
KR20170067058A (ko) * 2015-12-07 2017-06-15 엘지전자 주식회사 이동 단말기 및 그 제어방법
CN105867609A (zh) * 2015-12-28 2016-08-17 乐视移动智能信息技术(北京)有限公司 基于虚拟现实头盔观看视频的方法和装置
CN105786245B (zh) * 2016-02-04 2019-01-22 网易(杭州)网络有限公司 一种触摸屏操作控制方法和装置
CN106200927A (zh) * 2016-06-30 2016-12-07 乐视控股(北京)有限公司 一种信息处理方法及头戴式设备
CN106383635A (zh) * 2016-10-14 2017-02-08 福建中金在线信息科技有限公司 一种当前显示界面的上一级界面的显示方法及装置
CN107122096A (zh) * 2017-04-06 2017-09-01 青岛海信移动通信技术股份有限公司 基于vr显示的终端触摸控制方法及终端
CN107423098A (zh) * 2017-07-28 2017-12-01 珠海市魅族科技有限公司 一种语音助手启动方法、装置、计算机装置及计算机可读存储介质
CN108710615B (zh) * 2018-05-03 2020-03-03 Oppo广东移动通信有限公司 翻译方法及相关设备
CN108845739A (zh) * 2018-05-29 2018-11-20 维沃移动通信有限公司 一种导航键的控制方法及移动终端

Patent Citations (6)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN102156513A (zh) * 2011-04-20 2011-08-17 上海交通大学 穿戴式多媒体服装系统
CN102595244A (zh) * 2012-02-24 2012-07-18 康佳集团股份有限公司 基于多点触摸手势切换电视界面的人机交互方法及系统
CN104836886A (zh) * 2014-12-19 2015-08-12 北汽福田汽车股份有限公司 手机操控车载游戏的实现方法
CN105652442A (zh) * 2015-12-31 2016-06-08 北京小鸟看看科技有限公司 头戴显示设备及该头戴显示设备与智能终端的交互方法
CN106843511A (zh) * 2017-04-14 2017-06-13 上海翊视皓瞳信息科技有限公司 一种全场景覆盖的智能显示设备系统及应用
CN108513016A (zh) * 2018-05-25 2018-09-07 中瑞福宁机器人(沈阳)有限公司 一种可搭配ar眼镜使用的智能手机装置

Also Published As

Publication number Publication date
CN111104042A (zh) 2020-05-05

Similar Documents

Publication Publication Date Title
US11204684B2 (en) Sticker presentation method and apparatus and computer-readable storage medium
WO2018103525A1 (zh) 人脸关键点跟踪方法和装置、存储介质
WO2019228163A1 (zh) 扬声器控制方法及移动终端
WO2019196929A1 (zh) 一种视频数据处理方法及移动终端
WO2021098705A1 (zh) 显示方法及电子设备
WO2021017742A1 (zh) 应用程序控制方法及电子设备
WO2021068885A1 (zh) 控制方法及电子设备
CN109739465B (zh) 音频输出方法及移动终端
WO2019196691A1 (zh) 一种键盘界面显示方法和移动终端
WO2021063249A1 (zh) 电子设备的控制方法及电子设备
US20210352040A1 (en) Message sending method and terminal device
US10675541B2 (en) Control method of scene sound effect and related products
CN111083684A (zh) 控制电子设备的方法及电子设备
WO2020238445A1 (zh) 屏幕录制方法及终端
WO2021083256A1 (zh) 触控响应方法及电子设备
CN106406924B (zh) 应用程序启动和退出画面的控制方法、装置及移动终端
WO2021037074A1 (zh) 音频输出方法及电子设备
CN109407948B (zh) 一种界面显示方法及移动终端
WO2021208890A1 (zh) 截屏方法及电子设备
WO2021197311A1 (zh) 音量调节显示方法及电子设备
WO2019154360A1 (zh) 界面切换方法及移动终端
US20220132250A1 (en) Mobile Terminal and Control Method
WO2021238806A1 (zh) 接口电路与电子设备
WO2021083090A1 (zh) 消息发送方法及移动终端
WO2019042478A1 (zh) 移动终端输入法软键盘的控制方法、存储介质及移动终端

Legal Events

Date Code Title Description
121 Ep: the epo has been informed by wipo that ep was designated in this application

Ref document number: 20907418

Country of ref document: EP

Kind code of ref document: A1

NENP Non-entry into the national phase

Ref country code: DE

122 Ep: pct application non-entry in european phase

Ref document number: 20907418

Country of ref document: EP

Kind code of ref document: A1