CN108829319B - Interaction method and device for touch screen, electronic equipment and storage medium

Info

Publication number
CN108829319B
Authority
CN
China
Prior art keywords
touch screen
interactive control
control interface
information
operation body
Prior art date
Legal status
Active
Application number
CN201810622477.1A
Other languages
Chinese (zh)
Other versions
CN108829319A (en)
Inventor
张印帅
周峰
史元春
Current Assignee
Uisee Technologies Beijing Co Ltd
Original Assignee
Uisee Technologies Beijing Co Ltd
Application filed by Uisee Technologies Beijing Co Ltd
Priority to CN201810622477.1A
Publication of CN108829319A
Application granted
Publication of CN108829319B

Classifications

    • G: PHYSICS
    • G06: COMPUTING; CALCULATING OR COUNTING
    • G06F: ELECTRIC DIGITAL DATA PROCESSING
    • G06F3/00: Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
    • G06F3/01: Input arrangements or combined input and output arrangements for interaction between user and computer
    • G06F3/048: Interaction techniques based on graphical user interfaces [GUI]
    • G06F3/0484: Interaction techniques based on graphical user interfaces [GUI] for the control of specific functions or operations, e.g. selecting or manipulating an object, an image or a displayed text element, setting a parameter value or selecting a range
    • G06F3/04842: Selection of displayed objects or displayed text elements
    • G06F3/0481: Interaction techniques based on graphical user interfaces [GUI] based on specific properties of the displayed interaction object or a metaphor-based environment, e.g. interaction with desktop elements like windows or icons, or assisted by a cursor's changing behaviour or appearance
    • G06F3/0482: Interaction with lists of selectable items, e.g. menus

Landscapes

  • Engineering & Computer Science (AREA)
  • General Engineering & Computer Science (AREA)
  • Theoretical Computer Science (AREA)
  • Human Computer Interaction (AREA)
  • Physics & Mathematics (AREA)
  • General Physics & Mathematics (AREA)
  • User Interface Of Digital Computer (AREA)

Abstract

The embodiments of the invention relate to an interaction method and apparatus for a touch screen, an electronic device, and a storage medium. The interaction method includes: detecting first operation information of an operation body in front of a touch screen; predicting a selected area of the operation body on the touch screen according to the first operation information; acquiring an information display area corresponding to the selected area; and displaying, on the touch screen, an interactive control interface corresponding to the information display area. By detecting the first operation information, the embodiments can predict the selected area of the operation body on the touch screen and thereby determine the information display area corresponding to the selected area, pre-judging the user's selection intention. The pre-judgment result can be fed back to the user before the user performs a touch operation on the touch screen, so the information display area selected by the user is obtained without waiting for the user to touch the screen. This shortens the time the user needs to select an information display area and improves the user experience.

Description

Interaction method and device for touch screen, electronic equipment and storage medium
Technical Field
The embodiments of the present invention relate to the technical field of the Internet of Vehicles, and in particular to an interaction method and apparatus for a touch screen, an electronic device, and a storage medium.
Background
A touch screen is a human-computer interaction input device applied in many areas of daily work and life. Touch screens support various interaction methods, such as 3D Touch and infrared touch.
At present, touch screen interaction is designed mainly around the on-screen interface (Interface). However, users' touch activities differ with screen size: a user of a 2-inch watch screen needs only fingertip movement, while a user of a hundred-inch interactive desktop screen needs whole-body movement. User needs also differ, for example browsing screen content versus entering text. Usage scenarios differ as well: a user may operate a mobile phone or vehicle-mounted screen while driving, or a watch while running. Touch screen interaction design should therefore also pay attention to factors beyond the screen itself.
Disclosure of Invention
In order to solve the problems in the prior art, at least one embodiment of the present invention provides an interaction method and apparatus for a touch screen, an electronic device, and a storage medium.
In a first aspect, an embodiment of the present invention provides an interaction method for a touch screen, including:
detecting first operation information of an operation body in front of a touch screen;
predicting a selected area of an operation body on the touch screen according to the first operation information;
acquiring an information display area corresponding to the selected area;
and displaying an interactive control interface corresponding to the information display area on the touch screen.
In some embodiments, the method further comprises:
detecting whether the interactive control interface is touched;
if so, determining the touch position;
determining an interactive control corresponding to the touch position;
and executing touch event operation preset by the interactive control.
In some embodiments, after the interactive control interface corresponding to the information display area is displayed on the touch screen, the method further includes:
detecting second operation information of an operation body in front of the touch screen;
determining an interactive control corresponding to the second operation information;
and executing touch event operation preset by the interactive control in the interactive control interface.
In some embodiments, the first operational information comprises a gesture;
correspondingly, the predicting the selected area of the operation body on the touch screen according to the first operation information comprises:
and predicting the selected area corresponding to the first operation information according to the preset corresponding relation between the gesture and the selected area.
In some embodiments, the first operation information includes a finger pointing direction;
correspondingly, the predicting the selected area of the operation body on the touch screen according to the first operation information comprises:
determining a pointing position where an extension line pointed by the finger intersects with the touch screen;
and predicting the selected area corresponding to the first operation information according to the pointing position.
In some embodiments, the first operational information comprises a hover position;
correspondingly, the predicting the selected area of the operation body on the touch screen according to the first operation information comprises:
determining the projection position of the operation body on the touch screen according to the first operation information;
and predicting the selected area corresponding to the first operation information according to the projection position.
In some embodiments, the displaying, on the touch screen, an interactive control interface corresponding to the information display area includes:
and in the selected area, switching the content displayed in the information display area into the interactive control interface.
In some embodiments, the displaying, on the touch screen, an interactive control interface corresponding to the information display area includes:
and displaying the interactive control interface on the touch screen in a full screen mode.
In some embodiments, the displaying, on the touch screen, an interactive control interface corresponding to the information display area includes:
determining a projection position of the operation body on the touch screen;
determining the display position of the interactive control interface according to the projection position;
and displaying the interactive control interface at the display position.
In some embodiments, the method further comprises:
detecting, according to an interaction space preset in front of the touch screen, whether an operation body approaches the touch screen;
and if so, triggering and executing the step of detecting the first operation information of the operation body in front of the touch screen.
In some embodiments, the method further comprises:
and if detecting that no operation body exists in the interaction space preset in front of the touch screen, displaying a preset main interface on the touch screen.
In a second aspect, an embodiment of the present invention further provides an interaction apparatus for a touch screen, where the apparatus includes:
the first detection unit is used for detecting first operation information of an operation body in front of the touch screen;
the prediction unit is used for predicting a selected area of the operation body on the touch screen according to the first operation information;
the acquisition unit is used for acquiring the information display area corresponding to the selected area;
and the display unit is used for displaying the interactive control interface corresponding to the information display area on the touch screen.
In some embodiments, the apparatus further comprises:
the second detection unit is used for detecting whether the interactive control interface is touched;
a first determining unit configured to determine the touch position after the second detection unit detects a touch;
the second determining unit is used for determining the interactive control corresponding to the touch position;
and the first execution unit is used for executing the touch event operation preset by the interactive control.
In some embodiments, the apparatus further comprises:
the third detection unit is used for detecting second operation information of an operation body in front of the touch screen after the display unit displays the interactive control interface;
a third determining unit, configured to determine an interactive control corresponding to the second operation information;
and the second execution unit is used for executing the touch event operation preset by the interactive control in the interactive control interface.
In some embodiments, the first operational information comprises a gesture;
correspondingly, the prediction unit is configured to predict the selected area corresponding to the first operation information according to a preset correspondence between the gesture and the selected area.
In some embodiments, the first operation information includes a finger pointing direction;
accordingly, the prediction unit is configured to:
determining a pointing position where an extension line pointed by the finger intersects with the touch screen;
and predicting the selected area corresponding to the first operation information according to the pointing position.
In some embodiments, the first operational information comprises a hover position;
accordingly, the prediction unit is configured to:
determining the projection position of the operation body on the touch screen according to the first operation information;
and predicting the selected area corresponding to the first operation information according to the projection position.
In some embodiments, the display unit is configured to switch, in the selected area, the content displayed in the information display area to the interactive control interface.
In some embodiments, the display unit is configured to display the interactive control interface on the touch screen in a full screen manner.
In some embodiments, the display unit is configured to:
determining a projection position of the operation body on the touch screen;
determining the display position of the interactive control interface according to the projection position;
and displaying the interactive control interface at the display position.
In some embodiments, the apparatus further comprises:
the third detection unit is used for detecting, according to an interaction space preset in front of the touch screen, whether an operation body approaches the touch screen;
correspondingly, the first detection unit is used for detecting the first operation information of the operation body in front of the touch screen after the third detection unit detects that the operation body approaches the touch screen.
In some embodiments, the display unit is further configured to display a preset main interface on the touch screen after the third detection unit detects that no operation body exists in a preset interaction space in front of the touch screen.
In a third aspect, an embodiment of the present invention further provides an electronic device, including:
a touch screen, a processor, a memory, a network interface, and a user interface;
the touch screen, the processor, the memory, the network interface and the user interface are coupled together through a bus system;
the processor is adapted to perform the steps of the method according to the first aspect by calling a program or instructions stored by the memory.
In a fourth aspect, an embodiment of the present invention also provides a non-transitory computer-readable storage medium storing computer instructions for causing a computer to perform the steps of the method according to the first aspect.
In at least one of the embodiments of the present invention, by detecting the first operation information, the selected area of the operation body on the touch screen can be predicted, and the information display area corresponding to the selected area can then be determined, pre-judging the user's selection intention. The pre-judgment result can be fed back to the user before the user performs a touch operation on the touch screen, so the information display area selected by the user is obtained without waiting for the user to touch the screen. This shortens the time the user needs to select an information display area and improves the user experience.
Drawings
To explain the technical solutions of the embodiments of the present invention more clearly, the drawings needed in the description of the embodiments or the prior art are briefly introduced below. Obviously, the drawings described below show only some embodiments of the present invention, and those skilled in the art can obtain other drawings from them without inventive effort.
Fig. 1 is a block diagram of an electronic device according to an embodiment of the present invention;
fig. 2 is a flowchart of an interaction method of a touch screen according to an embodiment of the present invention;
fig. 3 is a schematic diagram of information display areas on a touch screen of a vehicle-mounted device according to an embodiment of the present invention;
fig. 4 is a schematic view of an interactive control interface displayed on a touch screen of a vehicle-mounted device according to an embodiment of the present invention;
FIG. 5 is a flowchart of another interaction method of a touch screen according to an embodiment of the present invention;
FIG. 6 is a flowchart of an interaction method of a touch screen according to another embodiment of the present invention;
FIG. 7 is a schematic diagram illustrating a non-touch operation according to an embodiment of the present invention;
FIG. 8 is a schematic diagram of a primary interface and a secondary interface of a touch screen display according to an embodiment of the present invention;
fig. 9 is a block diagram of an interaction device of a touch screen according to an embodiment of the present invention;
fig. 10 is a top view of an interaction scene of a touch screen according to an embodiment of the present invention;
FIG. 11 is a top view of an interaction scene of another touch screen according to an embodiment of the present invention;
fig. 12 is a top view of an interaction scene of another touch screen according to an embodiment of the present invention;
fig. 13 is a top view of an interaction scene of another touch screen according to an embodiment of the present invention.
Detailed Description
To make the objects, technical solutions, and advantages of the embodiments of the present invention clearer, the technical solutions in the embodiments are described below clearly and completely with reference to the accompanying drawings. Obviously, the described embodiments are some, rather than all, of the embodiments of the present invention. All other embodiments obtained by a person skilled in the art based on the embodiments of the present invention without inventive effort fall within the scope of the present invention.
It is noted that, in this document, relational terms such as "first" and "second," and the like, may be used solely to distinguish one entity or action from another entity or action without necessarily requiring or implying any actual such relationship or order between such entities or actions.
Fig. 1 is a schematic structural diagram of an electronic device according to an embodiment of the present invention. In some embodiments, the electronic device may be a mobile terminal such as a mobile phone, a smart phone, a notebook computer, a digital broadcast receiver, a PDA (personal digital assistant), a PAD (tablet computer), a PMP (portable multimedia player), a navigation apparatus, or a vehicle-mounted device, or a fixed terminal such as a digital TV, a desktop computer, or a printer.
The electronic device shown in fig. 1 includes: a touch screen 900, at least one processor 901, at least one memory 902, at least one network interface 904, and other user interfaces 903. The components of the electronic device are coupled together by a bus system 905, which enables communication among them. In addition to a data bus, the bus system 905 includes a power bus, a control bus, and a status signal bus. For clarity of illustration, however, the various buses are all labeled in fig. 1 as the bus system 905.
The user interface 903 may include, among other things, a display, a keyboard, or a pointing device (e.g., a mouse, a trackball, or a touch pad).
It will be appreciated that the memory 902 in this embodiment can be volatile memory, non-volatile memory, or both. The non-volatile memory may be a read-only memory (ROM), a programmable ROM (PROM), an erasable programmable ROM (EPROM), an electrically erasable programmable ROM (EEPROM), or a flash memory. The volatile memory may be a random access memory (RAM), which serves as an external cache. By way of example and not limitation, many forms of RAM are available, such as static RAM (SRAM), dynamic RAM (DRAM), synchronous DRAM (SDRAM), double data rate SDRAM (DDR SDRAM), enhanced SDRAM (ESDRAM), synchronous link DRAM (SLDRAM), and direct Rambus RAM (DRRAM). The memory 902 described herein is intended to comprise, without being limited to, these and any other suitable types of memory.
In some embodiments, memory 902 stores the following elements, executable units or data structures, or a subset thereof, or an expanded set thereof: an operating system 9021 and application programs 9022.
The operating system 9021 includes various system programs, such as a framework layer, a core library layer, and a driver layer, for implementing basic services and processing hardware-based tasks. The application programs 9022 include various applications, such as a media player and a browser, for implementing various application services. A program implementing the method of an embodiment of the present invention may be included in the application programs 9022.
In this embodiment of the present invention, the processor 901 is configured to execute the method steps provided by the method embodiments by calling a program or an instruction stored in the memory 902, specifically, a program or an instruction stored in the application 9022, where the method steps include:
detecting first operation information of an operation body in front of a touch screen;
predicting a selected area of an operation body on the touch screen according to the first operation information;
acquiring an information display area corresponding to the selected area;
and displaying an interactive control interface corresponding to the information display area on the touch screen.
The method disclosed in the above embodiments of the present invention may be applied to, or implemented by, the processor 901. The processor 901 may be an integrated circuit chip having signal processing capabilities. In implementation, the steps of the above method may be completed by integrated logic circuits of hardware in the processor 901 or by instructions in the form of software. The processor 901 may be a general-purpose processor, a digital signal processor (DSP), an application-specific integrated circuit (ASIC), a field-programmable gate array (FPGA) or other programmable logic device, a discrete gate or transistor logic device, or discrete hardware components, and may implement or perform the methods, steps, and logic blocks disclosed in the embodiments of the present invention. A general-purpose processor may be a microprocessor or any conventional processor. The steps of the method disclosed in connection with the embodiments of the present invention may be embodied directly in a hardware decoding processor, or in a combination of hardware and software elements in a decoding processor. The software elements may be located in RAM, flash memory, ROM, PROM, or EPROM, or in registers or other storage media well known in the art. The storage medium is located in the memory 902; the processor 901 reads the information in the memory 902 and completes the steps of the above method in combination with its hardware.
It is to be understood that the embodiments described herein may be implemented in hardware, software, firmware, middleware, microcode, or any combination thereof. For a hardware implementation, the processing units may be implemented within one or more Application Specific Integrated Circuits (ASICs), Digital Signal Processors (DSPs), Digital Signal Processing Devices (DSPDs), Programmable Logic Devices (PLDs), Field Programmable Gate Arrays (FPGAs), general purpose processors, controllers, micro-controllers, microprocessors, other electronic units designed to perform the functions described herein, or a combination thereof.
For a software implementation, the techniques described herein may be implemented by means of units performing the functions described herein. The software codes may be stored in a memory and executed by a processor. The memory may be implemented within the processor or external to the processor.
Those of ordinary skill in the art will appreciate that the various illustrative elements and algorithm steps described in connection with the embodiments disclosed herein may be implemented as electronic hardware or combinations of computer software and electronic hardware. Whether such functionality is implemented as hardware or software depends upon the particular application and design constraints imposed on the implementation. Skilled artisans may implement the described functionality in varying ways for each particular application, but such implementation decisions should not be interpreted as causing a departure from the scope of the present invention.
It is clear to those skilled in the art that, for convenience and brevity of description, the specific working processes of the above-described systems, apparatuses and units may refer to the corresponding processes in the foregoing method embodiments, and are not described herein again.
In the embodiments provided in the present application, it should be understood that the disclosed apparatus and method may be implemented in other ways. For example, the above-described apparatus embodiments are merely illustrative, and for example, the division of the units is only one logical division, and other divisions may be realized in practice, for example, a plurality of units or components may be combined or integrated into another system, or some features may be omitted, or not executed. In addition, the shown or discussed mutual coupling or direct coupling or communication connection may be an indirect coupling or communication connection through some interfaces, devices or units, and may be in an electrical, mechanical or other form.
The units described as separate parts may or may not be physically separate, and parts displayed as units may or may not be physical units, may be located in one place, or may be distributed on a plurality of network units. Some or all of the units can be selected according to actual needs to achieve the purpose of the solution of the embodiment.
In addition, functional units in the embodiments of the present invention may be integrated into one processing unit, or each unit may exist alone physically, or two or more units are integrated into one unit.
The functions, if implemented in the form of software functional units and sold or used as a stand-alone product, may be stored in a computer-readable storage medium. Based on this understanding, the technical solutions of the embodiments of the present invention, or the part contributing to the prior art, may be embodied in the form of a software product stored in a storage medium and including instructions for causing a computer device (which may be a personal computer, a server, or a network device) to execute all or some of the steps of the methods described in the embodiments of the present invention. The aforementioned storage medium includes various media capable of storing program code, such as a USB flash drive, a removable hard disk, a ROM, a RAM, a magnetic disk, or an optical disk.
As shown in fig. 2, the present embodiment discloses an interaction method for a touch screen, which may include the following steps 101 to 104.
101. First operation information of an operation body in front of the touch screen is detected.
In this embodiment, the operation body is, for example, a user's hand, including the fingers and palm. The first operation information is information about a non-touch operation performed by the operation body in front of the touch screen.
In this embodiment, the execution body of the method presets non-touch operations to realize specific functions, and there may be a plurality of such operations.
In this embodiment, the non-touch operations include, for example, one or more of a gesture operation, a finger-pointing operation, and a hover operation. The finger pointing is, for example, index-finger pointing.
In this embodiment, the first operation information, that is, the information of the non-touch operation, includes, for example, one or more of a gesture, a finger pointing direction, and a hover position.
102. And predicting a selected area of the operation body on the touch screen according to the first operation information.
In the existing touch screen operation mode, the user directly touches an interactive control displayed on the touch screen of the electronic device, causing the electronic device to execute the touch event operation preset by the touched interactive control. A touch event operation can be understood as a touch event driving the electronic device to execute the preset function of the interactive control. In this embodiment, because the user performs non-contact operations, the interface displayed on the touch screen may contain at least one information display area instead of interactive controls. Because the information display areas contain no interactive controls, screen space is saved and more information can be displayed.
In this embodiment, the screen space of the touch screen may be divided in advance into a plurality of information display areas, each displaying different information content. As shown in fig. 3, taking a vehicle-mounted device as an example, the screen space of its touch screen is divided into four information display areas, which respectively display: a map navigation system, an audio-visual entertainment system, a vehicle information system, and a personal information system.
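To make the later examples concrete, the following sketch shows one possible way to represent such a division in code. The Region class, the quadrant coordinates, and the area names are illustrative assumptions, not structures defined by this embodiment.

```python
from dataclasses import dataclass
from typing import Optional

@dataclass(frozen=True)
class Region:
    """One information display area, modeled as a screen-aligned rectangle."""
    name: str
    x: int  # left edge, pixels
    y: int  # top edge, pixels
    w: int  # width, pixels
    h: int  # height, pixels

    def contains(self, px: float, py: float) -> bool:
        return self.x <= px < self.x + self.w and self.y <= py < self.y + self.h

# Assumed 1920x1080 screen split into the four quadrants of fig. 3.
REGIONS = [
    Region("map_navigation", 0, 0, 960, 540),
    Region("av_entertainment", 960, 0, 960, 540),  # upper right area
    Region("vehicle_info", 0, 540, 960, 540),
    Region("personal_info", 960, 540, 960, 540),
]

def region_at(px: float, py: float) -> Optional[Region]:
    """Return the information display area containing screen point (px, py)."""
    for r in REGIONS:
        if r.contains(px, py):
            return r
    return None
```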
In this embodiment, an information display area can be understood as a primary interface. When the user selects an information display area to browse, the interface further displayed on the touch screen can be understood as the next level below the primary interface, called the secondary interface of the primary interface. It is understood that each level of interface may have a next level; for example, a tertiary interface may exist under a secondary interface. Referring to fig. 3 and 4, the audio-visual entertainment system in fig. 3 serves as a primary interface, and the interface shown in the corresponding area of fig. 4 is its secondary interface.
In this embodiment, considering that a user usually selects one information display area in order to browse more of its information, the selected area of the operation body on the touch screen can be predicted from the first operation information, and the information display area corresponding to the selected area can then be determined, pre-judging the user's selection intention. The pre-judgment result can be fed back to the user before the user performs a touch operation on the touch screen, improving the user experience.
103. And acquiring an information display area corresponding to the selected area.
In this embodiment, once the selected area of the operation body on the touch screen has been predicted, the information display area corresponding to the selected area can be acquired without waiting for the user to touch the touch screen. This shortens the time the user needs to select an information display area and improves the user experience.
104. And displaying an interactive control interface corresponding to the information display area on the touch screen.
In this embodiment, the correspondence between information display areas and interactive control interfaces may be preset, so that the interactive control interface corresponding to the acquired information display area can be determined from this correspondence and displayed on the touch screen.
In this embodiment, the interactive control interface serves as the secondary interface of the information display area and is an interface containing interactive controls. Referring to fig. 3 and 4, the audio-visual entertainment system in fig. 3 serves as a primary interface, and the interface displayed in the corresponding area of fig. 4 is the interactive control interface, i.e., the secondary interface of the audio-visual entertainment system.
In summary, this embodiment realizes interaction between the first operation performed by the user and the display of the interactive control interface on the touch screen.
With reference to fig. 3 and 4, the interaction flow for the touch screen is described as follows:
Taking a gesture as an example of the first operation information: the user's gesture is detected, and when the area selected by the user is predicted from the gesture to be the upper right area of the touch screen, it is predicted that the user selects the audio-visual entertainment system; the interactive control interface corresponding to the audio-visual entertainment system, shown in the upper right area of fig. 4, is therefore displayed on the touch screen.
In this embodiment, by detecting the first operation information in front of the touch screen, the selected area on the touch screen can be predicted, the information display area corresponding to the selected area can be obtained, and the interactive control interface corresponding to that information display area can then be displayed on the touch screen. The user does not need to click the touch screen to bring up the interactive controls: simply placing a hand in the detectable area in front of the touch screen triggers their display, after which the user can directly click a control to complete the selection.
The method provided by this application makes use of a first, pre-touch stage: it detects the user's first operation information, such as a gesture, in this stage and then triggers display of the interactive controls according to the predicted selected area. The touch screen therefore does not need to display interactive controls permanently, no extra touch operation is needed to wake up the controls, and a single touch by the user smoothly completes both the triggering and the selection of a control.
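The two-stage flow just described can be sketched as follows. This is a minimal illustration assuming a gesture-to-area lookup table and a display object with a show_control_interface method, none of which are specified by the application.

```python
from typing import Optional

# Assumed mapping from recognized pre-touch gestures to information
# display areas; the gesture names are hypothetical.
GESTURE_TO_AREA = {
    "point_upper_left": "map_navigation",
    "point_upper_right": "av_entertainment",
    "point_lower_left": "vehicle_info",
    "point_lower_right": "personal_info",
}

def handle_pre_touch(gesture: Optional[str], display) -> None:
    """Steps 101-104: predict the selected area from the first operation
    information and switch it to its interactive control interface."""
    if gesture is None:                       # step 101 found no operation body
        return
    area = GESTURE_TO_AREA.get(gesture)       # steps 102-103: predict and look up
    if area is None:
        return
    display.show_control_interface(area)      # step 104: show the controls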
As shown in fig. 5, the present embodiment discloses an interaction method for a touch screen, which may include the following steps 401 to 408:
401. First operation information of an operation body in front of the touch screen is detected.
402. And predicting a selected area of the operation body on the touch screen according to the first operation information.
403. And acquiring an information display area corresponding to the selected area.
404. And displaying an interactive control interface corresponding to the information display area on the touch screen.
In this embodiment, steps 401 to 404 are the same as steps 101 to 104 shown in fig. 2, and are not described again here.
405. Detect whether the interactive control interface is touched. If it is touched, execute steps 406 to 408; if it is not touched, stop displaying the interactive control interface and redisplay the information display area in the selected area.
In this embodiment, the existing touch screen detection technology can be used to detect whether the interactive control interface is touched, and this embodiment is not described in detail.
In this embodiment, a monitoring duration may be set, with timing started once the interactive control interface corresponding to the information display area is displayed on the touch screen. If the interactive control interface is touched before the elapsed time reaches the monitoring duration, steps 406 to 408 are executed; if the monitoring duration elapses without the interactive control interface being touched, the interactive control interface is no longer displayed and the information display area is redisplayed in the selected area.
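A sketch of this monitoring logic is shown below; the 3-second duration, the polling interval, and the sensor/display methods are assumptions for illustration, since the embodiment does not fix them.

```python
import time

MONITOR_SECONDS = 3.0  # assumed monitoring duration; not fixed by the embodiment

def await_touch(sensor, display, area):
    """Step 405: wait up to the monitoring duration for a touch on the
    interactive control interface; revert to the information display area
    on timeout. sensor.detect_touch() returning (x, y) or None and
    display.restore_information_area() are assumed interfaces."""
    deadline = time.monotonic() + MONITOR_SECONDS
    while time.monotonic() < deadline:
        touch = sensor.detect_touch()
        if touch is not None:
            return touch                       # proceed to steps 406-408
        time.sleep(0.02)                       # poll at roughly 50 Hz
    display.restore_information_area(area)     # timeout: redisplay original content
    return None
```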
406. The touch location is determined.
In this embodiment, the method for determining the touch position may use the existing touch screen detection technology, and this embodiment is not described in detail.
407. And determining an interactive control corresponding to the touch position.
408. And executing touch event operation preset by the interactive control.
In this embodiment, a touch event operation can be understood as a touch event driving the electronic device to execute the preset function of the interactive control. For example, in the interactive control interface displayed in the upper right area of fig. 4, if the play/pause control is touched, the electronic device opens the audio player to play or pause music.
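Steps 406 to 408 can be sketched as a lookup from the touch position to a control and its preset action; this sketch reuses the Region rectangle from the earlier example, with the control layout and actions as illustrative assumptions.

```python
def handle_touch(touch_xy, controls):
    """Steps 406-408: find the interactive control at the touch position
    and execute its preset touch event operation. `controls` maps Region
    rectangles (from the earlier sketch) to callables; both the layout
    and the actions are hypothetical."""
    for region, action in controls.items():
        if region.contains(*touch_xy):
            action()             # e.g. toggle play/pause in the audio player
            return True
    return False
```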
In this embodiment, steps 401 to 404 sense the user's hand, predict the user's selected area on the touch screen, and cause the touch screen to switch the information content displayed in the information display area to the interactive control interface; steps 405 to 408 detect the user's touch and drive the electronic device to execute the touch event operation preset by the touched interactive control. This embodiment thus treats the interaction described in steps 401 to 404 as a supplement that helps a human-computer interaction interface understand the user's intention, rather than as a separate interaction interface.
In this embodiment, the interaction described in steps 401 to 404 and the screen touch involved in steps 405 to 408 are combined at different moments of the same interaction task in the human-computer interaction process, forming a coherent, associated interaction mode that replaces the discrete, unassociated mode in which a touch acts as the trigger of an interaction task.
As shown in fig. 6, the present embodiment discloses an interaction method for a touch screen, which may include the following steps 501 to 508:
501. First operation information of an operation body in front of the touch screen is detected.
502. And predicting a selected area of the operation body on the touch screen according to the first operation information.
503. And acquiring an information display area corresponding to the selected area.
504. And displaying an interactive control interface corresponding to the information display area on the touch screen.
In this embodiment, steps 501 to 504 are the same as steps 101 to 104 shown in fig. 2, and are not described again here.
505. And detecting second operation information of an operation body in front of the touch screen.
In this embodiment, the second operation information is information about a non-touch operation performed by the operation body in front of the touch screen.
In this embodiment, the execution body of the method presets non-touch operations to realize specific functions, and there may be a plurality of them. As shown in fig. 7, the non-touch operations include, for example: a finger click, a left-right finger swipe, relative sliding of the thumb and forefinger (moving closer together or farther apart), an up-down finger swipe, and/or a finger rotation. The finger click is, for example, an index-finger click, and the left-right and up-down swipes are, for example, performed with the index finger. In the finger rotation, the thumb and forefinger are spread apart at an angle while the other fingers are curled against the palm; the thumb and forefinger then rotate in the same direction, through an angle of 180 degrees or less.
In this embodiment, the second operation information, that is, the information of the non-touch operation, includes, for example: a finger click, a left-right finger swipe, relative sliding of the thumb and forefinger (moving closer together or farther apart), an up-down finger swipe, and/or a finger rotation.
506. And determining an interactive control corresponding to the second operation information.
In this embodiment, an information display area can be understood as a primary interface, which may display no interactive controls at all. As shown in fig. 8, the primary interfaces include a route interface, an air conditioner interface, a music interface, and a driving interface, each displayed in one information display area. When the user selects an information display area to browse, the interface further displayed by the touch screen can be understood as the next level below the primary interface, called its secondary interface; the secondary interface may contain interactive controls and can also be understood as an interactive control interface. As shown in fig. 8, the secondary interface of the route interface includes site-addition and site-deletion controls; the secondary interface of the music interface includes a play/pause control, and its tertiary interface includes four interactive controls: track switching, volume adjustment, audio-source switching, and fast forward/rewind. To reduce interface switching, an interactive control of the secondary interface may be displayed in the primary interface. For example, the play/pause control of the music interface's secondary interface is displayed in the music interface itself, triggered when the music interface is selected or predicted to be selected; the switch to the secondary interface is thereby skipped, and the tertiary interface of the music interface effectively becomes its secondary interface.
In this embodiment, the interactive control corresponding to the second operation information in each interactive control interface may be preset; that is, for different interactive control interfaces, the same second operation information may correspond to different interactive controls.
With reference to fig. 7 and fig. 8, in this embodiment a finger click corresponds to the fast forward/rewind control on the secondary interface of the music interface. A left-right finger swipe corresponds to the track-switching control on the secondary interface of the music interface, the temperature-adjustment control on the secondary interface of the air conditioner interface, and the site-addition and site-deletion controls on the secondary interface of the route interface. Relative sliding of the thumb and forefinger corresponds to the volume-adjustment control on the secondary interface of the music interface and the fan-speed control on the secondary interface of the air conditioner interface. An up-down finger swipe corresponds to the audio-source switching control on the secondary interface of the music interface and the air-direction control on the secondary interface of the air conditioner interface. It should be noted that these correspondences are only examples; this embodiment does not limit the specific correspondence, which a person skilled in the art may set according to actual needs.
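These correspondences can be captured in a per-interface table, sketched below; the gesture and control identifiers are illustrative names for the relationships just listed, not identifiers from the application.

```python
# Hypothetical per-interface mapping from second operation information
# to interactive controls, following the correspondences above.
SECOND_OP_CONTROLS = {
    "music": {
        "finger_click": "fast_forward_rewind",
        "swipe_left_right": "switch_track",
        "thumb_index_slide": "adjust_volume",
        "swipe_up_down": "switch_audio_source",
    },
    "air_conditioner": {
        "swipe_left_right": "adjust_temperature",
        "thumb_index_slide": "adjust_fan_speed",
        "swipe_up_down": "adjust_air_direction",
    },
    "route": {
        "swipe_left_right": "add_or_delete_site",
    },
}

def control_for(interface: str, gesture: str):
    """Step 506: resolve the interactive control for the detected second
    operation information, given the current interactive control interface."""
    return SECOND_OP_CONTROLS.get(interface, {}).get(gesture)
```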
507. And executing touch event operation preset by the interactive control in the interactive control interface.
In this embodiment, the difference from the interaction method of the touch screen shown in fig. 5 is that there is no need to detect whether the touch screen is touched: because the interactive control corresponding to the second operation information in each interactive control interface is preset, the interactive control corresponding to the detected second operation information can be determined directly, realizing an independent interaction interface.
In some embodiments, the first operational information comprises a gesture.
Correspondingly, predicting the selected area of the operation body on the touch screen according to the first operation information includes: predicting the selected area corresponding to the first operation information according to a preset correspondence between gestures and selected areas.
With reference to fig. 3 and 4, the interaction flow for the touch screen is described as follows:
and detecting the gesture of the user, and predicting that the user selects the audio-video entertainment system when the region selected by the user is predicted to be the upper right region of the touch screen according to the gesture of the user, so that an interactive control interface corresponding to the audio-video entertainment system and shown in the upper right region of the figure 4 is displayed on the touch screen.
In some embodiments, the first operation information includes a finger pointing direction.
Correspondingly, predicting the selected area of the operation body on the touch screen according to the first operation information includes: determining the pointing position where the extension line of the finger pointing direction intersects the touch screen; and predicting the selected area corresponding to the first operation information according to the pointing position.
With reference to fig. 3, 4 and 13, the interaction flow for the touch screen is described as follows:
In fig. 13, the finger pointing direction of the user's hand 3 is detected, and the pointing position where its extension line intersects the touch screen 1 is determined. When the pointing position is the upper right area of the touch screen 1, it can be predicted, as shown in fig. 3, that the user selects the audio-visual entertainment system, so the interactive control interface corresponding to the audio-visual entertainment system, shown in the upper right area of fig. 4, is displayed on the touch screen 1. In fig. 13, reference numeral 2 denotes the boundary of the interaction space preset in front of the touch screen 1; the user's hand 3 can be detected when it is between this boundary and the touch screen 1.
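One way to compute the pointing position is a ray-plane intersection; the sketch below assumes the screen lies in the z = 0 plane with x and y in screen coordinates, and leaves sensor-to-screen calibration out of scope.

```python
import numpy as np

def pointing_position(finger_base, finger_tip):
    """Intersect the extension line of the finger with the screen plane.

    finger_base and finger_tip are 3D points in a frame where the screen
    is the z = 0 plane and the user is at z > 0; this frame is an
    assumption made for the sketch. Returns (x, y) on the screen, or
    None when the finger does not point toward the screen.
    """
    base = np.asarray(finger_base, dtype=float)
    tip = np.asarray(finger_tip, dtype=float)
    direction = tip - base
    if direction[2] >= 0:            # pointing away from, or parallel to, the screen
        return None
    t = -tip[2] / direction[2]       # ray parameter where z reaches 0
    hit = tip + t * direction
    return float(hit[0]), float(hit[1])
```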
In some embodiments, the first operational information comprises a hover position.
Correspondingly, predicting the selected area of the operation body on the touch screen according to the first operation information includes: determining the projection position of the operation body on the touch screen according to the first operation information; and predicting the selected area corresponding to the first operation information according to the projection position.
With reference to fig. 3, 4, 10, 11 and 12, the interaction flow of the touch screen is described as follows:
In fig. 10, 11 and 12, reference numeral 2 denotes the boundary of the interaction space preset in front of the touch screen 1; the user's hand 3 can be detected when it is between this boundary and the touch screen 1. Figs. 10, 11 and 12 can be understood as schematic diagrams of the user's hand 3 gradually approaching and finally touching the touch screen 1. In fig. 10, the user's hand 3 is not between the boundary and the touch screen 1, so the interaction method of the touch screen is not executed. In fig. 11, the user's hand 3 is between the boundary and the touch screen 1; its hover position is detected and the projection position of the hover position on the touch screen 1 is determined. When the projection position is the upper right area of the touch screen 1, whose information display area is the audio-visual entertainment system as shown in fig. 3, it can be predicted that the user selects the audio-visual entertainment system, so the interactive control interface corresponding to the audio-visual entertainment system, shown in the upper right area of fig. 4, is displayed on the touch screen 1. In fig. 12, the user's hand 3 touches the touch screen 1, triggering the interactive control at the touch position.
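With the hover-position scheme, the projection is simply the hover point dropped perpendicularly onto the screen plane; a minimal sketch under the same z = 0 screen-plane assumption, reusing region_at() from the earlier Region sketch:

```python
def hover_selected_region(hover_xyz):
    """Orthogonally project the hovering operation body onto the screen
    plane (drop the z coordinate) and return the information display
    area under the projection, using region_at() from the earlier sketch."""
    px, py, _z = hover_xyz
    return region_at(px, py)
```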
It should be noted that the above three schemes (gesture, finger pointing, and hover position) are parallel alternatives; in a specific application, one of them is used. In practice, a scheme can be selected according to the vertical distance between the hover position and the touch screen: if the vertical distance is less than or equal to a preset distance threshold, the second scheme is adopted, in which the first operation information includes the finger pointing direction; if the vertical distance is greater than the preset distance threshold, the third scheme is adopted, in which the first operation information includes the hover position. The preset distance threshold is, for example, 10 cm; a person skilled in the art can set it according to actual needs, and this embodiment does not limit its specific value.
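The distance-based choice between the two schemes can be sketched in a few lines; the 10 cm value is the example threshold given above.

```python
DISTANCE_THRESHOLD = 0.10  # meters; the 10 cm example threshold from the text

def choose_scheme(hover_xyz) -> str:
    """Select a prediction scheme from the vertical distance between the
    hover position and the screen (z = 0 plane assumed, as above)."""
    vertical_distance = hover_xyz[2]
    if vertical_distance <= DISTANCE_THRESHOLD:
        return "finger_pointing"   # second scheme: close to the screen
    return "hover_position"        # third scheme: farther from the screen
```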
In some embodiments, displaying, on the touch screen, an interactive control interface corresponding to the information display area includes: and in the selected area, switching the content displayed in the information display area into the interactive control interface.
In this embodiment, this scheme is applied to the embodiment shown in fig. 5.
In some embodiments, displaying, on the touch screen, an interactive control interface corresponding to the information display area includes: and displaying the interactive control interface on the touch screen in a full screen mode.
In this embodiment, this scheme is applied to the embodiments shown in fig. 5 and 6.
In some embodiments, displaying an interactive control interface corresponding to the information display area on the touch screen includes the following steps 1041 to 1043, which are not shown in fig. 2:
1041. determining a projection position of the operation body on the touch screen;
1042. determining the display position of the interactive control interface according to the projection position;
1043. and displaying the interactive control interface at the display position.
In this embodiment, displaying the interactive control interface at the display position in step 1043 means, for example, displaying it around the display position. In a specific implementation, the interactive control interface may be displayed in the form of a disk divided into a plurality of cells, each cell corresponding to one interactive control in the interface. The user can then move a finger up, down, left, or right, and the interactive control corresponding to the cell the finger points at is triggered.
In this embodiment, this scheme is applied to the embodiment shown in fig. 6.
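The disk layout can be sketched by mapping the finger's position relative to the disk centre (the projection position) to one of several equal angular cells; the cell count and the geometry below are assumptions for illustration.

```python
import math

def disk_cell(center_xy, finger_xy, n_cells: int) -> int:
    """Map the finger position to a cell of a disk-shaped interactive
    control interface centred on the projection position. The disk is
    divided into n_cells equal sectors, each holding one control."""
    dx = finger_xy[0] - center_xy[0]
    dy = finger_xy[1] - center_xy[1]
    angle = math.atan2(dy, dx) % (2 * math.pi)     # 0 .. 2*pi
    return int(angle // (2 * math.pi / n_cells))   # sector index 0 .. n_cells - 1
```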
In some embodiments, the method shown in fig. 2 further comprises the following step 100, not shown in fig. 2:
100. Detect, according to an interaction space preset in front of the touch screen, whether an operation body approaches the touch screen;
if so, trigger execution of step 101, detecting the first operation information of the operation body in front of the touch screen.
In some embodiments, the method shown in fig. 2 further comprises the following step 100', not shown in fig. 2:
100'. If step 100 detects that no operation body exists in the interaction space preset in front of the touch screen, a preset main interface is displayed on the touch screen.
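Steps 100 and 100' together act as a gate on the whole method; a minimal sketch, assuming a sensor method that reports the distance of the nearest operation body (or None) and a display method for the main interface:

```python
INTERACTION_DEPTH = 0.30  # meters; assumed depth of the preset interaction space

def gate_pre_touch(sensor, display) -> bool:
    """Step 100 / 100': run first-operation detection only while an
    operation body is inside the preset interaction space; otherwise
    display the preset main interface. sensor.nearest_body_distance()
    and display.show_main_interface() are assumed interfaces."""
    distance = sensor.nearest_body_distance()
    if distance is not None and distance <= INTERACTION_DEPTH:
        return True                    # trigger step 101
    display.show_main_interface()      # step 100': no operation body detected
    return False
```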
It should be noted that, unless a specific combination manner is described, the interaction methods of the touch screen disclosed in the above embodiments may be combined into new embodiments, and, provided no logical contradiction arises, the execution order of the steps in each embodiment may be adjusted.
As shown in fig. 9, the present embodiment discloses an interaction device of a touch screen, including:
a first detection unit 81 for detecting first operation information of an operation body in front of the touch screen;
the prediction unit 82 is used for predicting a selected area of the operation body on the touch screen according to the first operation information;
an obtaining unit 83, configured to obtain an information display area corresponding to the selected area;
and the display unit 84 is configured to display an interactive control interface corresponding to the information display area on the touch screen.
In some embodiments, the apparatus further comprises:
the second detection unit is used for detecting whether the interactive control interface is touched;
a first determining unit configured to determine the touch position after the second detection unit detects a touch;
the second determining unit is used for determining the interactive control corresponding to the touch position;
and the first execution unit is used for executing the touch event operation preset by the interactive control.
In some embodiments, the apparatus further comprises:
a third detecting unit, configured to detect second operation information of an operation body in front of the touch screen after the display unit 84 displays the interactive control interface;
a third determining unit, configured to determine an interactive control corresponding to the second operation information;
and the second execution unit is used for executing the touch event operation preset by the interactive control in the interactive control interface.
In some embodiments, the first operational information comprises a gesture;
correspondingly, the prediction unit 82 is configured to predict the selected area corresponding to the first operation information according to a preset correspondence between the gesture and the selected area.
In some embodiments, the first operation information includes a finger pointing direction;
accordingly, the prediction unit 82 is configured to:
determining a pointing position where an extension line pointed by the finger intersects with the touch screen;
and predicting the selected area corresponding to the first operation information according to the pointing position.
In some embodiments, the first operational information comprises a hover position;
accordingly, the prediction unit 82 is configured to:
determining the projection position of the operation body on the touch screen according to the first operation information;
and predicting the selected area corresponding to the first operation information according to the projection position.
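Under the same coordinate assumptions, the hover branch reduces to dropping the depth coordinate and choosing the nearest preset region; the region layout below is invented for the example.

    import math

    REGION_CENTERS = {"navigation_map": (0.15, 0.20), "media_panel": (0.45, 0.20)}

    def predict_region_from_hover(hover_pos):
        """hover_pos: (x, y, z) of the hovering operation body; its projection
        onto the screen plane is simply (x, y)."""
        projection = (hover_pos[0], hover_pos[1])
        return min(REGION_CENTERS,
                   key=lambda r: math.dist(REGION_CENTERS[r], projection))

    print(predict_region_from_hover((0.40, 0.25, 0.10)))  # media_panel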
In some embodiments, the display unit 84 is configured to switch, in the selected area, the content displayed in the information display area to the interactive control interface.
In some embodiments, the display unit 84 is configured to display the interactive control interface on the touch screen in a full-screen manner.
In some embodiments, the display unit 84 is configured to determine a projection position of the operation body on the touch screen; determining the display position of the interactive control interface according to the projection position; and displaying the interactive control interface at the display position.
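A sketch of this positioning rule, assuming pixel coordinates and the requirement that the interface rectangle stay fully on screen; the sizes are illustrative.

    def interface_origin(projection, iface_size, screen_size):
        """Top-left (x, y) for an interface of size (w, h) centred on the
        projection position and clamped inside a (W, H) screen."""
        (px, py), (w, h), (W, H) = projection, iface_size, screen_size
        x = min(max(px - w // 2, 0), W - w)
        y = min(max(py - h // 2, 0), H - h)
        return x, y

    # A projection near the right edge slides the interface left to fit.
    print(interface_origin((1900, 500), (300, 200), (1920, 1080)))  # (1620, 400)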
In some embodiments, the apparatus further comprises:
a third detection unit, configured to detect whether an operation body approaches the touch screen according to an interaction space preset in front of the touch screen;
correspondingly, the first detection unit 81 is configured to detect first operation information of an operation body in front of the touch screen after the third detection unit detects that the operation body approaches the touch screen.
In some embodiments, the display unit 84 is further configured to display a preset main interface on the touch screen after the third detection unit detects that no operation body exists in the preset interaction space in front of the touch screen.
The touch screen interaction device disclosed in the above embodiments can implement the flows of the touch screen interaction methods disclosed in the above method embodiments; to avoid repetition, the details are not described here again.
Embodiments of the present invention also provide a non-transitory computer-readable storage medium storing computer instructions that cause a computer to perform the method steps provided by the above method embodiments, for example:
detecting first operation information of an operation body in front of a touch screen;
predicting a selected area of an operation body on the touch screen according to the first operation information;
acquiring an information display area corresponding to the selected area;
and displaying an interactive control interface corresponding to the information display area on the touch screen.
The embodiment of the invention also provides:
a1, an interaction method of a touch screen, the method comprising:
detecting first operation information of an operation body in front of a touch screen;
predicting a selected area of an operation body on the touch screen according to the first operation information;
acquiring an information display area corresponding to the selected area;
and displaying an interactive control interface corresponding to the information display area on the touch screen.
A2, the method of A1, the method further comprising:
detecting whether the interactive control interface is touched;
if so, determining the touch position;
determining an interactive control corresponding to the touch position;
and executing touch event operation preset by the interactive control.
A3, according to the method in A1, after an interactive control interface corresponding to the information display area is displayed on the touch screen, the method further includes:
detecting second operation information of an operation body in front of the touch screen;
determining an interactive control corresponding to the second operation information;
and executing touch event operation preset by the interactive control in the interactive control interface.
A4, the method of any one of A1 to A3, the first operation information comprising a gesture;
correspondingly, the predicting the selected area of the operation body on the touch screen according to the first operation information comprises:
and predicting the selected area corresponding to the first operation information according to the preset corresponding relation between the gesture and the selected area.
A5, the method of any one of A1 to A3, the first operation information comprising a finger pointing direction;
correspondingly, the predicting the selected area of the operation body on the touch screen according to the first operation information comprises:
determining a pointing position at which the extension line of the finger pointing direction intersects the touch screen;
and predicting the selected area corresponding to the first operation information according to the pointing position.
A6, the method of any one of A1 to A3, the first operation information comprising a hover position;
correspondingly, the predicting the selected area of the operation body on the touch screen according to the first operation information comprises:
determining the projection position of the operation body on the touch screen according to the first operation information;
and predicting the selected area corresponding to the first operation information according to the projection position.
A7, according to the method of any one of A1 to A3, the displaying an interactive control interface corresponding to the information display area on the touch screen includes:
and in the selected area, switching the content displayed in the information display area into the interactive control interface.
A8, according to the method of any one of A1 to A3, the displaying an interactive control interface corresponding to the information display area on the touch screen includes:
and displaying the interactive control interface on the touch screen in a full screen mode.
A9, according to the method of any one of A1 to A3, the displaying an interactive control interface corresponding to the information display area on the touch screen includes:
determining a projection position of the operation body on the touch screen;
determining the display position of the interactive control interface according to the projection position;
and displaying the interactive control interface at the display position.
A10, the method of any one of A1 to A3, the method further comprising:
detecting whether an operation body approaches the touch screen or not according to an interaction space preset in front of the touch screen;
and if so, triggering and executing the step of detecting the first operation information of the operation body in front of the touch screen.
A11, the method of A10, the method further comprising:
and if detecting that no operation body exists in the interaction space preset in front of the touch screen, displaying a preset main interface on the touch screen.
A12, an interaction device of a touch screen, the device comprising:
the first detection unit is used for detecting first operation information of an operation body in front of the touch screen;
the prediction unit is used for predicting a selected area of the operation body on the touch screen according to the first operation information;
the acquisition unit is used for acquiring the information display area corresponding to the selected area;
and the display unit is used for displaying the interactive control interface corresponding to the information display area on the touch screen.
A13, the apparatus of A12, further comprising:
the second detection unit is used for detecting whether the interactive control interface is touched or not;
a first determination unit configured to determine the touch position after the second detection unit detects that the interactive control interface is touched;
the second determining unit is used for determining the interactive control corresponding to the touch position;
and the first execution unit is used for executing the touch event operation preset by the interactive control.
A14, the apparatus of A12, further comprising:
the third detection unit is used for detecting second operation information of an operation body in front of the touch screen after the display unit displays the interactive control interface;
a third determining unit, configured to determine an interactive control corresponding to the second operation information;
and the second execution unit is used for executing the touch event operation preset by the interactive control in the interactive control interface.
A15, the device of any one of A12 to A14, the first operation information comprising a gesture;
correspondingly, the prediction unit is configured to predict the selected area corresponding to the first operation information according to a preset correspondence between the gesture and the selected area.
A16, the device of any one of A12 to A14, the first operation information comprising a finger pointing direction;
accordingly, the prediction unit is configured to:
determining a pointing position at which the extension line of the finger pointing direction intersects the touch screen;
and predicting the selected area corresponding to the first operation information according to the pointing position.
A17, the apparatus according to any one of A12 to A14, the first operation information comprising a hover position;
accordingly, the prediction unit is configured to:
determining the projection position of the operation body on the touch screen according to the first operation information;
and predicting the selected area corresponding to the first operation information according to the projection position.
A18, the device according to any one of A12 to A14, the display unit is configured to switch the content displayed in the information display area to the interactive control interface in the selected area.
A19, the device according to any one of A12 to A14, the display unit is used for displaying the interactive control interface on the touch screen in a full screen mode.
A20, the device of any one of A12 to A14, the display unit being configured to:
determining a projection position of the operation body on the touch screen;
determining the display position of the interactive control interface according to the projection position;
and displaying the interactive control interface at the display position.
A21, the device of any one of A12 to A14, further comprising:
the third detection unit is used for detecting whether an operation body approaches the touch screen or not according to an interaction space preset in front of the touch screen;
correspondingly, the first detection unit is used for detecting the first operation information of the operation body in front of the touch screen after the third detection unit detects that the operation body approaches the touch screen.
A22, the device of A21, wherein the display unit is further configured to display a preset main interface on the touch screen after the third detection unit detects that no operation body exists in the preset interaction space in front of the touch screen.
A23, an electronic device, comprising:
a touch screen, a processor, a memory, a network interface, and a user interface;
the touch screen, the processor, the memory, the network interface and the user interface are coupled together through a bus system;
the processor is operable to perform the steps of the method of any one of A1 to A11 by calling a program or instructions stored in the memory.
A24, a non-transitory computer readable storage medium storing computer instructions for causing a computer to perform the steps of the method of any one of A1 to A11.
It should be noted that, in this document, the terms "comprises," "comprising," or any other variation thereof are intended to cover a non-exclusive inclusion, such that a process, method, article, or apparatus that comprises a list of elements includes not only those elements but may also include other elements not expressly listed or inherent to such a process, method, article, or apparatus. Without further limitation, an element introduced by the phrase "comprising a …" does not exclude the presence of other identical elements in the process, method, article, or apparatus that comprises the element.
Those skilled in the art will appreciate that, although some embodiments described herein include certain features found in other embodiments and not others, combinations of features from different embodiments are meant to fall within the scope of the invention and to form further embodiments.
Although the embodiments of the present invention have been described in conjunction with the accompanying drawings, those skilled in the art may make various modifications and variations without departing from the spirit and scope of the invention, and such modifications and variations fall within the scope defined by the appended claims.

Claims (16)

1. An interaction method of a touch screen, the method comprising:
detecting first operation information of an operation body in front of a touch screen;
predicting a selected area of an operation body on the touch screen according to the first operation information;
acquiring an information display area corresponding to the selected area;
displaying an interactive control interface corresponding to the information display area on the touch screen;
wherein the predicting a selected area of the operation body on the touch screen according to the first operation information comprises:
if the vertical distance between the hovering position of the operation body and the touch screen is less than or equal to a preset distance threshold, determining a pointing position at which the extension line of the finger pointing direction intersects the touch screen, and predicting the selected area corresponding to the first operation information according to the pointing position;
if the vertical distance is greater than the preset distance threshold, determining the projection position of the operation body on the touch screen according to the first operation information, and predicting the selected area corresponding to the first operation information according to the projection position;
wherein the displaying an interactive control interface corresponding to the information display area on the touch screen comprises:
displaying the interactive control interface on the content displayed in the information display area.
2. The method of claim 1, further comprising:
detecting whether the interactive control interface is touched;
if so, determining the touch position;
determining an interactive control corresponding to the touch position;
and executing touch event operation preset by the interactive control.
3. The method according to claim 1, wherein after the interactive control interface corresponding to the information display area is displayed on the touch screen, the method further comprises:
detecting second operation information of an operation body in front of the touch screen;
determining an interactive control corresponding to the second operation information;
and executing touch event operation preset by the interactive control in the interactive control interface.
4. The method according to any one of claims 1 to 3, wherein the displaying the interactive control interface corresponding to the information display area on the touch screen includes:
and displaying the interactive control interface on the touch screen in a full screen mode.
5. The method according to any one of claims 1 to 3, wherein the displaying the interactive control interface corresponding to the information display area on the touch screen includes:
determining a projection position of the operation body on the touch screen;
determining the display position of the interactive control interface according to the projection position;
and displaying the interactive control interface at the display position.
6. The method according to any one of claims 1 to 3, further comprising:
detecting whether an operation body approaches the touch screen or not according to an interaction space preset in front of the touch screen;
and if so, triggering and executing the step of detecting the first operation information of the operation body in front of the touch screen.
7. The method of claim 6, further comprising:
and if detecting that no operation body exists in the interaction space preset in front of the touch screen, displaying a preset main interface on the touch screen.
8. An interactive device for a touch screen, the device comprising:
the first detection unit is used for detecting first operation information of an operation body in front of the touch screen;
the prediction unit is used for predicting a selected area of the operation body on the touch screen according to the first operation information;
the acquisition unit is used for acquiring the information display area corresponding to the selected area;
the display unit is used for displaying an interactive control interface corresponding to the information display area on the touch screen;
the prediction unit is used for: if the vertical distance between the hovering position of the operation body and the touch screen is less than or equal to a preset distance threshold, determining a pointing position at which the extension line of the finger pointing direction intersects the touch screen, and predicting the selected area corresponding to the first operation information according to the pointing position; and if the vertical distance is greater than the preset distance threshold, determining the projection position of the operation body on the touch screen according to the first operation information, and predicting the selected area corresponding to the first operation information according to the projection position;
and the display unit is used for displaying the interactive control interface on the content displayed in the information display area.
9. The apparatus of claim 8, further comprising:
the second detection unit is used for detecting whether the interactive control interface is touched or not;
a first determination unit configured to determine the touch position after the second detection unit detects that the interactive control interface is touched;
the second determining unit is used for determining the interactive control corresponding to the touch position;
and the first execution unit is used for executing the touch event operation preset by the interactive control.
10. The apparatus of claim 8, further comprising:
the third detection unit is used for detecting second operation information of an operation body in front of the touch screen after the display unit displays the interactive control interface;
a third determining unit, configured to determine an interactive control corresponding to the second operation information;
and the second execution unit is used for executing the touch event operation preset by the interactive control in the interactive control interface.
11. The apparatus according to any one of claims 8 to 10, wherein the display unit is configured to display the interactive control interface on the touch screen in a full screen manner.
12. The apparatus according to any one of claims 8 to 10, wherein the display unit is configured to:
determining a projection position of the operation body on the touch screen;
determining the display position of the interactive control interface according to the projection position;
and displaying the interactive control interface at the display position.
13. The apparatus of any one of claims 8 to 10, further comprising:
the third detection unit is used for detecting whether an operation body approaches the touch screen or not according to an interaction space preset in front of the touch screen;
correspondingly, the first detection unit is used for detecting the first operation information of the operation body in front of the touch screen after the third detection unit detects that the operation body approaches the touch screen.
14. The device according to claim 13, wherein the display unit is further configured to display a preset main interface on the touch screen after the third detection unit detects that no operation body exists in a preset interaction space in front of the touch screen.
15. An electronic device, comprising:
a touch screen, a processor, a memory, a network interface, and a user interface;
the touch screen, the processor, the memory, the network interface and the user interface are coupled together through a bus system;
the processor is adapted to perform the steps of the method of any one of claims 1 to 7 by calling a program or instructions stored in the memory.
16. A non-transitory computer readable storage medium storing computer instructions for causing a computer to perform the steps of the method according to any one of claims 1 to 7.
CN201810622477.1A 2018-06-15 2018-06-15 Interaction method and device for touch screen, electronic equipment and storage medium Active CN108829319B (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN201810622477.1A CN108829319B (en) 2018-06-15 2018-06-15 Interaction method and device for touch screen, electronic equipment and storage medium

Publications (2)

Publication Number Publication Date
CN108829319A CN108829319A (en) 2018-11-16
CN108829319B (en) 2020-09-01

Family

ID=64142464

Family Applications (1)

Application Number Title Priority Date Filing Date
CN201810622477.1A Active CN108829319B (en) 2018-06-15 2018-06-15 Interaction method and device for touch screen, electronic equipment and storage medium

Country Status (1)

Country Link
CN (1) CN108829319B (en)

Families Citing this family (11)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN109683711B (en) * 2018-12-21 2022-12-20 广东美的白色家电技术创新中心有限公司 Product display method and device
CN110083395B (en) * 2019-04-15 2023-07-18 上海墨案智能科技有限公司 Touch control method, electronic equipment, detachable part and storage medium
CN110315974A (en) * 2019-07-11 2019-10-11 思特沃克软件技术(北京)有限公司 Method and apparatus of the motor vehicle based on touch screen switching project status
CN111142655A (en) * 2019-12-10 2020-05-12 上海博泰悦臻电子设备制造有限公司 Interaction method, terminal and computer readable storage medium
CN111309153B (en) * 2020-03-25 2024-04-09 北京百度网讯科技有限公司 Man-machine interaction control method and device, electronic equipment and storage medium
CN112289339A (en) * 2020-06-04 2021-01-29 郭亚力 System for converting voice into picture
CN111813285B (en) * 2020-06-23 2022-02-22 维沃移动通信有限公司 Floating window management method and device, electronic equipment and readable storage medium
CN112181249A (en) * 2020-09-25 2021-01-05 北京字节跳动网络技术有限公司 Play control method and device, electronic equipment and storage medium
CN112612400B (en) * 2020-12-28 2022-05-24 维沃移动通信有限公司 Text content processing method and electronic equipment
CN113721819A (en) * 2021-09-02 2021-11-30 网易(杭州)网络有限公司 Man-machine interaction method and device and electronic equipment
CN113946265A (en) * 2021-09-29 2022-01-18 北京五八信息技术有限公司 Data processing method and device, electronic equipment and storage medium

Citations (5)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN103631513A (en) * 2012-08-23 2014-03-12 三星电子株式会社 Portable device and guide information provision method thereof
CN104598109A (en) * 2015-01-08 2015-05-06 天津三星通信技术研究有限公司 Method and equipment for previewing application in portable terminal
CN105190520A (en) * 2013-03-13 2015-12-23 微软技术许可有限责任公司 Hover gestures for touch-enabled devices
CN106055098A (en) * 2016-05-24 2016-10-26 北京小米移动软件有限公司 Air gesture operation method and apparatus
CN106484104A (en) * 2016-09-19 2017-03-08 深圳市金立通信设备有限公司 A kind of operation method of application program and terminal

Family Cites Families (1)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
TWI546715B * 2014-12-26 2016-08-21 Shenzhen China Star Optoelectronics Technology Co., Ltd. Floating touch method

Similar Documents

Publication Publication Date Title
CN108829319B (en) Interaction method and device for touch screen, electronic equipment and storage medium
CN105824559B (en) False touch recognition and processing method and electronic equipment
US8890818B2 (en) Apparatus and method for proximity based input
US9261995B2 (en) Apparatus, method, and computer readable recording medium for selecting object by using multi-touch with related reference point
US8621379B2 (en) Device, method, and graphical user interface for creating and using duplicate virtual keys
US9146672B2 (en) Multidirectional swipe key for virtual keyboard
US8291350B1 (en) Gesture-based metadata display
US20090243998A1 (en) Apparatus, method and computer program product for providing an input gesture indicator
WO2020000969A1 (en) Method and device for controlling information flow display panel, terminal apparatus, and storage medium
US8212785B2 (en) Object search method and terminal having object search function
US9052819B2 (en) Intelligent gesture-based user's instantaneous interaction and task requirements recognition system and method
CN106201632B (en) Application program access method and mobile terminal
US20090178011A1 (en) Gesture movies
US20110302532A1 (en) Device, Method, and Graphical User Interface for Navigating Through a User Interface Using a Dynamic Object Selection Indicator
US8456433B2 (en) Signal processing apparatus, signal processing method and selection method of user interface icon for multi-touch panel
US20130232451A1 (en) Electronic device and method for switching between applications
JP7233109B2 (en) Touch-sensitive surface-display input method, electronic device, input control method and system with tactile-visual technology
WO2011084860A2 (en) Device, method, and graphical user interface for navigating through multiple viewing areas
KR101929316B1 (en) Method and apparatus for displaying keypad in terminal having touchscreen
US20120026077A1 (en) Mapping trackpad operations to touchscreen events
CN109388468B (en) Multi-program window management method and device
CN106354520B (en) Interface background switching method and mobile terminal
WO2019015581A1 (en) Text deletion method and mobile terminal
US20140351749A1 (en) Methods, apparatuses and computer program products for merging areas in views of user interfaces
CN107632761B (en) Display content viewing method, mobile terminal and computer readable storage medium

Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination
GR01 Patent grant