CN113138670B - Touch screen interaction gesture control method and device, touch screen and storage medium


Info

Publication number
CN113138670B
Authority
CN
China
Prior art keywords
control panel
user
touch screen
finger
touch
Prior art date
Legal status
Active
Application number
CN202110504580.8A
Other languages
Chinese (zh)
Other versions
CN113138670A (en)
Inventor
高志生
张二阳
崔志斌
葛耀旭
焦月锋
敖亚磊
侯晓龙
Current Assignee
Zhengzhou J&T Hi Tech Co Ltd
Original Assignee
Zhengzhou J&T Hi Tech Co Ltd
Priority date
Filing date
Publication date
Application filed by Zhengzhou J&T Hi Tech Co Ltd
Priority to CN202110504580.8A
Publication of CN113138670A (application)
Application granted
Publication of CN113138670B (grant)
Legal status: Active

Classifications

    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06F ELECTRIC DIGITAL DATA PROCESSING
    • G06F 3/00 Input arrangements for transferring data to be processed into a form capable of being handled by the computer; output arrangements for transferring data from the processing unit to the output unit, e.g. interface arrangements
    • G06F 3/01 Input arrangements or combined input and output arrangements for interaction between user and computer
    • G06F 3/017 Gesture based interaction, e.g. based on a set of recognized hand gestures
    • G06F 3/048 Interaction techniques based on graphical user interfaces [GUI]
    • G06F 3/0481 Interaction techniques based on specific properties of the displayed interaction object or a metaphor-based environment, e.g. interaction with desktop elements like windows or icons, or assisted by a cursor's changing behaviour or appearance
    • G06F 3/04812 Interaction techniques based on cursor appearance or behaviour, e.g. being affected by the presence of displayed objects
    • G06F 3/0487 Interaction techniques using specific features provided by the input device, e.g. functions controlled by the rotation of a mouse with dual sensing arrangements, or tap gestures based on pressure sensed by a digitiser
    • G06F 3/0488 Interaction techniques using a touch-screen or digitiser, e.g. input of commands through traced gestures
    • G06F 3/04883 Interaction techniques using a touch-screen or digitiser for inputting data by handwriting, e.g. gesture or text

Abstract

The application provides a touch screen interaction gesture control method and device, a touch screen, and a storage medium, and relates to the technical field of simulation. The method comprises the following steps: responding to the touch operation of a user on the touch screen, which displays a picture of a simulated scene, and acquiring the number of fingers with which the user touches the screen; determining the interaction gesture type of the user according to the number of fingers and a preset mapping between finger counts and interaction gesture types; displaying the control panel corresponding to the interaction gesture type in the area of the touch screen touched by the user's fingers; and responding to the user's operation on the control panel, determining the operated picture, and dynamically displaying the operated simulated picture on the touch screen. This scheme realizes gesture control of the simulated scene displayed on the touch screen, improves the user's interaction experience with the simulated scene, and makes the user's means of controlling the touch screen more diverse and flexible.

Description

Touch screen interaction gesture control method and device, touch screen and storage medium
Technical Field
The application relates to the technical field of simulation, in particular to a touch screen interaction gesture control method and device, a touch screen and a storage medium.
Background
An intelligent large touch screen is an interactive device that integrates touch sensing, display, and processing in a single machine. It adopts a large liquid-crystal display, offers good human-computer interaction, and is convenient to install and operate, so it is now widely used in fields such as simulation teaching. The content of simulation teaching and the required virtual environment are displayed on the intelligent large touch screen, and the user interacts with the virtual scene displayed on it, giving the user an immersive, on-the-scene experience.
At present, human-computer interaction lacks a control method for interaction between a user and a simulated scene on an intelligent large touch screen. It is therefore highly desirable to provide a touch screen interaction gesture control method, so as to make the user's means of controlling the touch screen more diverse and flexible and to improve interaction between the user and the simulated scene displayed on the touch screen.
Disclosure of Invention
The present invention aims to provide a touch screen interaction gesture control method and device, a touch screen, and a storage medium, so as to make the user's means of controlling the touch screen more diverse and flexible, visually present operations on the simulated scene to the user, and improve interaction between the user and the simulated scene displayed on the touch screen.
In order to achieve the above purpose, the embodiments of the present application adopt the following technical solutions:
in a first aspect, an embodiment of the present application provides a touch screen interaction gesture control method, including:
responding to the touch operation of a user on the touch screen, and acquiring the number of fingers of the user touching the touch screen, wherein the touch screen displays the picture of a simulation scene;
determining the interaction gesture type of the user according to the number of the fingers and the mapping relation between the preset number of fingers and the interaction gesture type, wherein the interaction gesture types include: a single-finger interaction type, a two-finger interaction type, a three-finger interaction type, and a four-finger interaction type;
according to the type of the interaction gesture of the user, displaying a control panel corresponding to the type of the interaction gesture in an area touched by the finger of the user on the touch screen;
and responding to the operation of the user on the control panel, determining an operated picture and dynamically displaying the operated simulated picture on the touch screen.
Optionally, if the interaction gesture type is a single-finger interaction type, the control panel corresponding to the interaction gesture type is a walking control panel;
the responding user determines the operated picture aiming at the operation on the control panel and dynamically displays the operated simulated picture on the touch screen, and the method comprises the following steps:
acquiring the touch point offset and the touch area of the user aiming at the walking control panel;
determining the moving speed and the moving target direction in the simulated scene according to the touch point offset and the touch area;
and determining and displaying a simulation picture after walking operation in the simulation scene according to the moving speed and the target direction.
Optionally, the method further comprises:
responding to an operation in which the user newly adds a first finger touching the touch screen, and displaying a visual angle control panel in the area of the touch screen touched by the newly added first finger, wherein the distance between the touch position of the newly added first finger and the touch position of the finger operating the walking control panel is greater than a preset distance;
and responding to the operation of the user on the visual angle control panel, and determining and displaying the simulated picture after the visual angle change operation in the simulated scene.
Optionally, if the interaction gesture type is a two-finger interaction type, the control panel corresponding to the interaction gesture type includes a walking control panel and a viewing angle control panel;
the responding user determines the operated picture aiming at the operation on the control panel and dynamically displays the operated simulated picture on the touch screen, and the method comprises the following steps:
respectively acquiring the touch point offset and the touch area of the user for the walking control panel, and the touch point offset and the touch area of the user for the visual angle control panel;
determining the moving speed and the moving target direction in the simulated scene according to the touch point offset and the touch area of the walking control panel, and determining the visual angle adjusting speed and direction in the simulated scene according to the touch point offset and the touch area of the visual angle control panel;
and determining and displaying walking operation in the simulated scene and a simulated picture after visual angle adjustment operation according to the moving speed and the moving target direction in the simulated scene and the visual angle adjustment speed and direction in the simulated scene.
Optionally, the method further comprises:
and responding to an operation in which a second finger of the user leaves the touch screen, and deleting the control panel corresponding to the second finger from the touch screen.
Optionally, if the interaction gesture type is a three-finger interaction type, the control panel corresponding to the interaction gesture type is a height control panel;
the responding to the operation of the user on the control panel, determining the operated picture and dynamically displaying the operated simulated picture on the touch screen, and the method comprises the following steps:
acquiring the sliding distance of the user for the height control panel;
determining the visual angle height in the simulated scene according to the sliding distance of the height control panel;
and determining and displaying a simulation picture after the visual angle operation in the simulation scene according to the visual angle height in the simulation scene.
Optionally, if the interaction gesture type is a four-finger interaction type, the control panel corresponding to the interaction gesture type is a jump control panel;
the responding to the operation of the user on the control panel, determining the operated picture, and dynamically displaying the operated simulated picture on the touch screen comprises:
acquiring the number of fingers of the user aiming at the jump control panel, and judging whether the number of fingers of the user touching the jump control panel changes or not;
if not, determining and displaying the simulated picture after the jumping operation in the simulated scene.
In a second aspect, an embodiment of the present application further provides a touch screen interaction gesture control apparatus, the apparatus including: a response module, a determining module, and a display module;
the response module is used for, in response to the touch operation of the user on the touch screen, acquiring the number of fingers with which the user touches the touch screen, the touch screen displaying a picture of a simulated scene;
the determining module is configured to determine the interaction gesture type of the user according to the number of fingers and a mapping relationship between a preset number of fingers and the interaction gesture type, where the interaction gesture type includes: a single-finger interaction type, a two-finger interaction type, a three-finger interaction type, and a four-finger interaction type;
the display module is used for displaying, according to the interaction gesture type of the user, a control panel corresponding to the interaction gesture type in the area of the touch screen touched by the user's finger;
the response module is further used for responding to the operation of the user on the control panel, determining the operated picture and dynamically displaying the operated simulated picture on the touch screen.
Optionally, if the interaction gesture type is a single-finger interaction type, the control panel corresponding to the interaction gesture type is a walking control panel;
the response module is further configured to:
acquiring the offset of a touch point and a touch area of the user for the walking control panel;
determining the moving speed and the moving target direction in the simulated scene according to the touch point offset and the touch area;
and determining and displaying a simulation picture after walking operation in the simulation scene according to the moving speed and the target direction.
Optionally, the response module is further configured to:
responding to an operation in which the user newly adds a first finger touching the touch screen, and displaying a visual angle control panel in the area of the touch screen touched by the newly added first finger, wherein the distance between the touch position of the newly added first finger and the touch position of the finger operating the walking control panel is greater than a preset distance;
and responding to the operation of the user on the visual angle control panel, and determining and displaying the simulated picture after the visual angle change operation in the simulated scene.
Optionally, if the interaction gesture type is a two-finger interaction type, the control panel corresponding to the interaction gesture type includes a walking control panel and a viewing angle control panel;
the response module is further configured to:
respectively acquiring the touch point offset and the touch area of the user for the walking control panel, and the touch point offset and the touch area of the user for the visual angle control panel;
determining the moving speed and the moving target direction in the simulated scene according to the touch point offset and the touch area of the walking control panel, and determining the visual angle adjusting speed and direction in the simulated scene according to the touch point offset and the touch area of the visual angle control panel;
and determining and displaying the walking operation in the simulated scene and the simulated picture after the visual angle adjustment operation according to the moving speed and the moving target direction in the simulated scene and the visual angle adjustment speed and direction in the simulated scene.
Optionally, the response module is further configured to:
and responding to an operation in which a second finger of the user leaves the touch screen, and deleting the control panel corresponding to the second finger from the touch screen.
Optionally, if the interaction gesture type is a three-finger interaction type, the control panel corresponding to the interaction gesture type is a height control panel;
the response module is further configured to:
acquiring the sliding distance of the user for the height control panel;
determining the visual angle height in the simulated scene according to the sliding distance of the height control panel;
and determining and displaying a simulation picture after the visual angle operation in the simulation scene according to the visual angle height in the simulation scene.
Optionally, if the interaction gesture type is a four-finger interaction type, the control panel corresponding to the interaction gesture type is a jump control panel;
the response module is further configured to:
acquiring the number of fingers of the user aiming at the jump control panel, and judging whether the number of fingers of the user touching the jump control panel changes or not;
if not, determining and displaying the simulated picture after the jumping operation in the simulated scene.
In a third aspect, an embodiment of the present application further provides a touch screen, including: a processor, a storage medium and a bus, the storage medium storing machine-readable instructions executable by the processor, the processor and the storage medium communicating via the bus when the touch screen is operated, the processor executing the machine-readable instructions to perform the steps of the method as provided by the first aspect.
In a fourth aspect, this application further provides a computer-readable storage medium, on which a computer program is stored, where the computer program is executed by a processor to perform the steps of the method as provided in the first aspect.
The beneficial effect of this application is:
the embodiment of the application provides a method and a device for controlling interaction gestures of a touch screen, the touch screen and a storage medium, wherein the method comprises the following steps: responding to the touch operation of a user on the touch screen, acquiring the number of fingers of the user touching the touch screen, and displaying the picture of the simulation scene by the touch screen; determining the interactive gesture type of the user according to the number of the fingers and the mapping relation between the preset number of the fingers and the interactive gesture type, wherein the interactive gesture type comprises the following steps: a single-finger interaction type, a two-finger interaction type, a three-finger interaction type, and a four-finger interaction type; displaying a control panel corresponding to the interactive gesture type in an area touched by the finger of the user on the touch screen according to the interactive gesture type of the user; and responding to the operation of the user on the control panel, determining the operated picture and dynamically displaying the operated simulated picture on the touch screen. According to the scheme, after the touch operation of the user on the touch screen is responded, the interactive gesture type of the user is determined according to the acquired number of fingers touching the touch screen by the user and the mapping relation between the number of fingers and the interactive gesture type, the control panel corresponding to the interactive gesture type is displayed in the area touched by the fingers of the user, so that the touch screen responds to the operation of the user on the control panel, the picture of the simulated scene after the operation is further determined, and the picture of the simulated scene after the operation is dynamically displayed on the touch screen, the control of the user gesture on the simulated scene displayed in the touch screen is realized, the interactive experience of the user and the simulated scene is improved, and meanwhile, the diversification and flexibility of the operation mode of the touch screen by the user are also improved.
In addition, by adding (or removing) a finger touching the touch screen, the number of the user's fingers on the screen changes, which displays or switches among the multiple control panels shown on the touch screen. This improves the flexibility and convenience with which the user switches among the control panels displayed on the touch screen.
Drawings
In order to illustrate the technical solutions of the embodiments of the present invention more clearly, the drawings needed in the embodiments are briefly described below. It should be understood that the following drawings illustrate only some embodiments of the present invention and should therefore not be regarded as limiting its scope; those skilled in the art can obtain other related drawings from them without inventive effort.
Fig. 1 is a schematic structural diagram of a touch screen according to an embodiment of the present disclosure;
fig. 2 is a schematic flowchart of a touch screen interaction gesture control method according to an embodiment of the present disclosure;
FIG. 3 is a schematic flowchart illustrating another method for controlling an interactive gesture on a touch screen according to an embodiment of the present disclosure;
fig. 4 is a schematic diagram of a walking control panel in a touch screen interaction gesture control method according to an embodiment of the present disclosure;
FIG. 5 is a schematic flowchart of another method for controlling an interaction gesture of a touch screen according to an embodiment of the present disclosure;
fig. 6 is a schematic view of a view angle control panel in a touch screen interaction gesture control method according to an embodiment of the present disclosure;
FIG. 7 is a schematic diagram of a view control panel in another method for controlling interaction gestures of a touch screen according to an embodiment of the present application;
FIG. 8 is a schematic flowchart illustrating another method for controlling an interactive gesture on a touch screen according to an embodiment of the present disclosure;
FIG. 9 is a schematic flowchart of another method for controlling an interaction gesture of a touch screen according to an embodiment of the present application;
FIG. 10 is a flowchart illustrating another method for controlling an interaction gesture on a touch screen according to an embodiment of the present disclosure;
fig. 11 is a schematic structural diagram of a touch screen interaction gesture control apparatus according to an embodiment of the present application.
Detailed Description
In order to make the purpose, technical solutions, and advantages of the embodiments of the present application clearer, the technical solutions in the embodiments are described below clearly and completely with reference to the drawings. It should be understood that the drawings in the present application are for illustration and description only and are not used to limit its scope of protection; additionally, the schematic drawings are not necessarily drawn to scale. The flowcharts used in this application illustrate operations implemented according to some embodiments. It should be understood that the operations of a flowchart may be performed out of order, and steps without a logical dependency may be performed in reverse order or simultaneously. Under the guidance of this application, one skilled in the art may add one or more operations to a flowchart or remove one or more operations from it.
In addition, the described embodiments are only a part of the embodiments of the present application, and not all of the embodiments. The components of the embodiments of the present application, generally described and illustrated in the figures herein, can be arranged and designed in a wide variety of different configurations. Thus, the following detailed description of the embodiments of the present application, as presented in the figures, is not intended to limit the scope of the claimed application, but is merely representative of selected embodiments of the application. All other embodiments, which can be derived by a person skilled in the art from the embodiments of the present application without making any creative effort, shall fall within the protection scope of the present application.
It should be noted that the term "comprising" will be used in the embodiments of the present application to indicate the presence of the features stated hereinafter, but does not exclude the addition of further features.
Fig. 1 is a schematic structural diagram of a touch screen according to an embodiment of the present disclosure. The touch screen 100 may be an electronic device having display and touch functions, such as a desktop computer, a laptop computer, a tablet computer, a handheld computer, or a mobile phone, and its shape is not limited: it may be flat or curved.
The touch screen 100 is provided with a plurality of sensing contacts, which collect the gesture actions of a user touching the touch screen 100. A control instruction is determined from the gesture action; the touch screen 100 responds to the user's gesture, determines the operated picture, and dynamically displays the operated simulated picture. This improves the flexibility and convenience of the user's interactive control of the touch screen and thereby the human-computer interaction experience.
As shown in fig. 1, the touch screen 100 may include: a processor 101 and a memory 102. The processor 101 and the memory 102 are electrically connected directly or indirectly to realize data transmission or interaction. For example, electrical connections may be made through one or more communication buses or signal lines.
The processor 101 may be an integrated circuit chip having signal processing capability. The Processor 101 may be a general-purpose Processor including a Central Processing Unit (CPU), a Network Processor (NP), and the like. The various methods, steps, and logic blocks disclosed in the embodiments of the present invention may be implemented or performed. A general purpose processor may be a microprocessor or the processor may be any conventional processor or the like.
The Memory 102 may be, but is not limited to, a Random Access Memory (RAM), a Read-Only Memory (ROM), a Programmable Read-Only Memory (PROM), an Erasable Programmable Read-Only Memory (EPROM), an Electrically Erasable Programmable Read-Only Memory (EEPROM), and the like.
The memory 102 is used for storing a program, and the processor 101 calls the program stored in the memory 102 to execute the touch screen interaction gesture control method provided in the following embodiments.
It will be appreciated that the configuration depicted in FIG. 1 is merely illustrative and that a touch screen may include more or fewer components than shown in FIG. 1 or have a different configuration than shown in FIG. 1. The components shown in fig. 1 may be implemented in hardware, software, or a combination thereof.
The following describes an implementation principle and corresponding beneficial effects of the touch screen interaction gesture control method provided by the present application through a plurality of specific embodiments.
Fig. 2 is a schematic flowchart of a touch screen interaction gesture control method according to an embodiment of the present disclosure. The method may be executed by the touch screen in Fig. 1. As shown in Fig. 2, the method includes:
s201, responding to the touch operation of the user on the touch screen, acquiring the number of fingers of the user touching the touch screen, and displaying the picture of the simulation scene by the touch screen.
Illustratively, the simulated scene picture displayed on the touch screen is a three-dimensional scene drawn in a virtual environment built by digital modeling. The virtual environment model may be, for example, a three-dimensional model of the virtual environment required for simulation teaching, such as a three-dimensional scene of a virtual high-speed rail in operation. The user can touch the screen with different fingers and trigger the virtual scene to respond, so as to control the operation of the virtual high-speed rail in the virtual scene space displayed by the touch screen; a simulated picture of the virtual high-speed rail in operation is displayed to the user, improving the user's interaction experience with the virtual scene.
S202, determining the interactive gesture type of the user according to the number of the fingers and the mapping relation between the preset number of the fingers and the interactive gesture type.
The interaction gesture types include a single-finger interaction type, a two-finger interaction type, a three-finger interaction type, and a four-finger interaction type; correspondingly, different interaction gesture types correspond to different operation instructions.
In this embodiment, the user may use different numbers of fingers to implement the touch operation on the touch screen, and the different numbers of fingers correspond to different interaction gesture types.
Illustratively, the interaction gesture type for one finger is the single-finger interaction type, for two fingers the two-finger interaction type, for three fingers the three-finger interaction type, and for four fingers the four-finger interaction type.
Similarly, a mapping relationship between the number of fingers and the type of the interactive gesture can be defined, and the mapping relationship is not limited to the above four cases.
For example, after responding to a touch operation of a user on a touch screen, if it is acquired that the number of fingers touching the touch screen by the user is one finger, it may be determined that the interaction gesture type of the user is a single-finger interaction type according to a mapping relationship between the one finger and the interaction gesture type.
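As a concrete illustration of this mapping step, the following is a minimal sketch in Python. All names (GestureType, GESTURE_BY_FINGER_COUNT, classify_touch) are illustrative assumptions, not identifiers from the patent's implementation.

```python
# Minimal sketch of the finger-count -> gesture-type mapping described
# above; each gesture type is annotated with the control panel it calls up.
from enum import Enum
from typing import Optional

class GestureType(Enum):
    SINGLE_FINGER = 1   # walking control panel
    TWO_FINGER = 2      # walking + visual angle control panels
    THREE_FINGER = 3    # height control panel
    FOUR_FINGER = 4     # jump control panel

# Preset mapping between finger count and interaction gesture type.
GESTURE_BY_FINGER_COUNT = {
    1: GestureType.SINGLE_FINGER,
    2: GestureType.TWO_FINGER,
    3: GestureType.THREE_FINGER,
    4: GestureType.FOUR_FINGER,
}

def classify_touch(finger_count: int) -> Optional[GestureType]:
    """Return the interaction gesture type for a touch, or None if unmapped."""
    return GESTURE_BY_FINGER_COUNT.get(finger_count)
```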
S203, according to the type of the interactive gesture of the user, displaying a control panel corresponding to the type of the interactive gesture in an area, touched by the finger of the user, of the touch screen.
In this embodiment, the control panel corresponding to the interactive gesture type may be called out according to the interactive gesture type of the user, so that the user may perform gesture operations on each control panel, and the interactive experience between the user and the simulated scene is realized. The control panel is a two-dimensional window and can be suspended on a picture of a simulated scene displayed on the touch screen.
For example, suppose the user's interaction gesture type is the single-finger interaction type and the control panel corresponding to it is the walking control panel; the walking control panel is then displayed in the area touched by the user's single finger. The user needs only simple gesture operations on the walking control panel, for example moving the single finger left, right, forward, or backward, to make a target character walk in the picture of the simulated scene displayed by the touch screen.
And S204, responding to the operation of the user on the control panel, determining the operated picture and dynamically displaying the operated simulated picture on the touch screen.
On the basis of the above embodiment, for example, when the user needs to make a target character walk to the left in the simulated scene, the leftward walking operation can be performed on the walking control panel with a single finger. The touch screen responds to the user's leftward walking operation on the walking control panel, determines the picture of the target character in the simulated scene after walking left, and dynamically displays the resulting walking picture on the touch screen. Gesture control of the simulated scene displayed on the touch screen is thus realized, and the user's interaction experience with the simulated scene is improved.
To sum up, the embodiment of the present application provides a touch screen interaction gesture control method, including: responding to the touch operation of a user on the touch screen, which displays a picture of a simulated scene, and acquiring the number of fingers with which the user touches the screen; determining the interaction gesture type of the user according to the number of fingers and a preset mapping between finger counts and interaction gesture types, the types including a single-finger interaction type, a two-finger interaction type, a three-finger interaction type, and a four-finger interaction type; displaying the control panel corresponding to the interaction gesture type in the area of the touch screen touched by the user's fingers; and responding to the user's operation on the control panel, determining the operated picture, and dynamically displaying the operated simulated picture on the touch screen. In this scheme, after the user's touch operation is responded to, the interaction gesture type is determined from the acquired finger count and the mapping between finger counts and gesture types, and the corresponding control panel is displayed in the area touched by the user's fingers; the touch screen then responds to the user's operation on the panel, determines the picture of the simulated scene after the operation, and dynamically displays it. Gesture control of the simulated scene displayed on the touch screen is thereby realized, the user's interaction experience with the simulated scene is improved, and the user's means of operating the touch screen become more diverse and flexible.
The following embodiments explain how step S204 of the touch screen interaction gesture control method provided by the present application, namely responding to the user's operation on the control panel, determining the operated picture, and dynamically displaying the operated simulated picture on the touch screen, is implemented.
Fig. 3 is a schematic flowchart of another touch screen interaction gesture control method according to an embodiment of the present disclosure. If the interaction gesture type is the single-finger interaction type, the control panel corresponding to it is the walking control panel, and step S204, responding to the user's operation on the control panel, determining the operated picture, and dynamically displaying the operated simulated picture on the touch screen, includes:
s301, acquiring the touch point offset and the touch area of the user for the walking control panel.
The touch point offset is the offset offset(Δx, Δy) between the position T(x1, y1) of the user's finger touch point in the walking control panel and the position O(x0, y0) of the rotation center point, where Δx = x1 - x0 and Δy = y1 - y0.
The touch area is the touchable region R(VisualSizeX, VisualSizeY) of the user's finger in the walking control panel.
S302, determining the moving speed and the moving target direction in the simulated scene according to the touch point offset and the touch area.
Wherein, the moving speed is the speed of the finger of the user moving in the walking control panel.
The target direction of movement is the direction in which the user's finger moves in the walking control panel. The movement can be in any direction within the touch area, for example right, left, forward, backward, or any other direction, so that the simulated scene displayed on the touch screen can be moved in any direction.
Fig. 4 is a schematic diagram of a walking control panel in a touch screen interaction gesture control method according to an embodiment of the present disclosure. As shown in Fig. 4, the touch point offset angle θ can be calculated from the touch point offset offset(Δx, Δy): θ = atan(Δy/Δx).
The distance between the touch point position T(x1, y1) and the rotation center position O(x0, y0) is denoted OT, with OT² = Δx² + Δy².
From the offset angle θ and the touchable region R(VisualSizeX, VisualSizeY), XTerm = VisualSizeX · cos θ and YTerm = VisualSizeY · sin θ can be calculated. The distance from the center point O to the edge of the touchable region R along the direction θ is denoted OE, with OE² = XTerm² + YTerm².
Whether the touch point in the walking control panel exceeds the touchable region R is determined by comparing OT and OE.
In one case, if OT ≥ OE, the user's finger is considered to be touching the edge of the touchable region R of the walking control panel. The effective offset of the touch point is then set to offset(XTerm, YTerm), i.e., the moving speed in the simulated scene is Vx = XTerm/(VisualSizeX × 0.5) and Vy = YTerm/(VisualSizeY × 0.5), and the target direction of movement is θ.
In the other case, if OT < OE, the user's finger is touching the interior of the touchable region R. The effective offset of the touch point is then offset(Δx, Δy), i.e., the moving speed in the simulated scene is Vx = Δx/(VisualSizeX × 0.5) and Vy = Δy/(VisualSizeY × 0.5), and the target direction of movement is θ.
And S303, determining and displaying a simulation picture after walking operation in the simulation scene according to the moving speed and the target direction.
Optionally, from the moving speed and target direction θ of the user's finger touching the touch screen, obtained by the above calculation, the moving speeds Vx and Vy and the target direction θ of the simulated picture in the simulated scene can be further determined, so that the simulated scene displayed on the touch screen moves in the target direction θ at speed (Vx, Vy) and the simulated picture after the moving operation is dynamically displayed on the touch screen. The operation on the simulated scene is thus shown to the user intuitively, improving interaction between the user and the simulated scene displayed on the touch screen.
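The computation in S301 to S303 can be summarized in a short sketch, written here as a hedged reading of the formulas above: atan is taken as the two-argument arctangent, speeds are normalized by half the region size as in the text, and the function and variable names are illustrative.

```python
# Minimal sketch of the walking-panel computation in steps S301-S303:
# clamp the touch offset to the panel's touchable region R and derive the
# normalized moving speed (Vx, Vy) and direction theta. Variable names
# follow the text (XTerm, YTerm, VisualSizeX, ...); everything else is an
# illustrative assumption, not the patent's actual implementation.
import math

def walking_speed(x0: float, y0: float, x1: float, y1: float,
                  visual_size_x: float, visual_size_y: float):
    dx, dy = x1 - x0, y1 - y0                  # touch point offset (dx, dy)
    theta = math.atan2(dy, dx)                 # offset angle theta
    ot = math.hypot(dx, dy)                    # OT: centre -> touch point
    x_term = visual_size_x * math.cos(theta)   # edge intercept along theta
    y_term = visual_size_y * math.sin(theta)
    oe = math.hypot(x_term, y_term)            # OE: centre -> region edge
    if ot >= oe:
        # Touch at or beyond the edge: clamp the offset to (XTerm, YTerm).
        dx, dy = x_term, y_term
    vx = dx / (visual_size_x * 0.5)            # normalized moving speed
    vy = dy / (visual_size_y * 0.5)
    return vx, vy, theta
```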
The following describes the specific process by which the present application implements switching among multiple control panels displayed on the touch screen according to the number of the user's fingers touching the screen.
Fig. 5 is a schematic flowchart of another touch screen interaction gesture control method according to an embodiment of the present application; the method further comprises the following steps:
s501, responding to the operation that the user newly adds a first finger to touch the touch screen, and displaying a visual angle control panel in an area newly added with the first finger on the touch screen.
The visual angle control panel is an operation panel for adjusting the visual angle of a simulated scene picture displayed in the touch screen, and if a user only needs to use simple gesture operation on the visual angle control panel, the gesture operation includes: the single-finger visual angle moves leftwards, the single-finger visual angle moves rightwards, the single-finger visual angle tilts and the like, so that the visual angle change operation in the picture of the simulation scene displayed by the touch screen can be realized.
And the distance between the touch position of the newly added first finger and the touch position of the finger for operating the walking control panel is greater than the preset distance.
For example, suppose the user touches the touch screen with any finger F1, the walking control panel is displayed in the area touched by F1, and F1 remains touching. If a first finger F2 is then added but the distance between the touch position of F1 and that of F2 is less than or equal to the preset distance, the visual angle control panel is not called up in the area touched by F2. This avoids the walking control panel and the visual angle control panel overlapping each other in the display, which would degrade the user experience.
In order to improve the flexibility of operating the multiple control panels displayed on the touch screen, another finger can be added to (or removed from) the touch screen to switch among the panels. This mainly involves the following three cases; a code sketch of the corresponding panel lifecycle follows the third case:
in the first case, if the user touches the touch screen with any finger F1, the area touched by the finger F1 is called up on the touch screen to display the walking control panel, and the finger F1 still remains touched. Meanwhile, a newly added first finger F2 touch screen is added, and the distance between the touch position of the newly added first finger F2 and the touch position of the finger F1 for operating the walking control panel is larger than the preset distance, the touch screen responds to the operation of a user for touching the touch screen by using the first finger F2, and further, a visual angle control panel is displayed in an area touched by the first finger F2 on the touch screen, so that the user can simultaneously use the finger F1 to perform gesture operation on the walking control panel and use the first finger F2 to perform gesture operation on the visual angle control panel, or only perform gesture operation on any other control panel, for example, use the finger F1 to perform gesture operation on the walking control panel, and the touch point of the first finger F2 on the visual angle control panel is kept still.
Fig. 6 is a schematic view of a visual angle control panel in a touch screen interaction gesture control method according to an embodiment of the present disclosure. As shown in Fig. 6, in the first case the user touches the screen with finger F1, the walking control panel is displayed in the area touched by F1, and F1 remains touching; at the same time, the first finger F2 is added, and in response to this operation the visual angle control panel is displayed in the area touched by F2.
Optionally, in response to an operation in which a second finger of the user leaves the touch screen, the control panel corresponding to that finger is deleted from the touch screen.
In the second case, Fig. 7 is a schematic view of a visual angle control panel in another touch screen interaction gesture control method provided in the embodiment of the present application. As shown in Fig. 7, if the user removes finger F1 from the touch screen while the first finger F2 remains touching, the walking control panel displayed in the area touched by F1 is deleted from the touch screen, and only the visual angle control panel in the area touched by F2 remains displayed. Switching between the walking control panel and the visual angle control panel is thus achieved by adding or removing fingers touching the screen.
In the third case, after the user removes finger F1 from the touch screen, if the user also removes the first finger F2, the visual angle control panel displayed in the area touched by F2 is deleted from the touch screen.
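The three cases amount to a small piece of panel-lifecycle bookkeeping, sketched below. The preset distance value, the panel names, and the PanelManager structure are all assumptions for illustration, not taken from the patent.

```python
# Sketch of the panel add/remove logic in the three cases above: a new
# panel is shown only if the new touch is farther than MIN_DISTANCE from
# every existing panel's touch position, and lifting a finger deletes the
# panel it was operating. Names and the distance value are assumptions.
import math

MIN_DISTANCE = 150.0  # the "preset distance", in pixels (assumed value)

class PanelManager:
    def __init__(self):
        self.panels = {}  # finger id -> (panel name, touch position)

    def on_finger_down(self, finger_id, pos):
        # Case 1: suppress the new panel if it would overlap an existing one.
        for _name, other_pos in self.panels.values():
            if math.dist(pos, other_pos) <= MIN_DISTANCE:
                return
        name = "walking" if not self.panels else "visual angle"
        self.panels[finger_id] = (name, pos)

    def on_finger_up(self, finger_id):
        # Cases 2 and 3: lifting a finger deletes its control panel.
        self.panels.pop(finger_id, None)
```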
And S502, responding to the operation of a user on the visual angle control panel, and determining and displaying a simulation picture after the visual angle change operation in the simulation scene.
For the first and second cases, the user can perform gesture operations on the visual angle control panel with the first finger F2. The touch screen responds to the user's various visual angle change operations on the panel, and determines and displays the simulated picture after the visual angle change in the simulated scene. Multiple visual angles of the simulated scene can thus be shown to the user intuitively, improving interaction between the user and the simulated scene displayed on the touch screen.
The first case described above is explained in detail below through a specific embodiment.
FIG. 8 is a flowchart illustrating another method for controlling an interaction gesture on a touch screen according to an embodiment of the present disclosure; optionally, as shown in fig. 8, if the interaction gesture type is a two-finger interaction type, the control panel corresponding to the interaction gesture type includes: a walking control panel and a visual angle control panel.
Step S204, responding to the user's operation on the control panel, determining the operated picture, and dynamically displaying the operated simulated picture on the touch screen, includes:
s801, respectively acquiring the touch point offset and the touch area of the user for the walking control panel, and the touch point offset and the touch area of the user for the visual angle control panel.
Optionally, if the user operates the walking control panel with finger F1 and the visual angle control panel with the first finger F2 at the same time, the touch point offset offset1(Δx1, Δy1) and touch area R1(VisualSizeX1, VisualSizeY1) of the user for the walking control panel, and the touch point offset offset2(Δx2, Δy2) and touch area R2(VisualSizeX2, VisualSizeY2) of the user for the visual angle control panel, are acquired respectively.
S802, respectively determining the moving speed and the moving target direction in the simulated scene according to the touch point offset and the touch area of the walking control panel, and determining the visual angle adjusting speed and direction in the simulated scene according to the touch point offset and the touch area of the visual angle control panel.
Step S302 above has already described in detail how to calculate the moving speed and target direction of movement in the simulated scene from the touch point offset and touch area of the walking control panel, so this is not repeated here.
Similarly, with the calculation process described in step S302, the moving speed (Vx1, Vy1) and target direction of movement θ1 in the simulated scene, and the visual angle adjustment speed (Vx2, Vy2) and direction θ2 in the simulated scene, can be obtained respectively.
And S803, determining and displaying the walking operation in the simulated scene and the simulated picture after the visual angle adjusting operation according to the moving speed and the moving target direction in the simulated scene and the visual angle adjusting speed and direction in the simulated scene.
Optionally, from the moving speed (Vx1, Vy1) and target direction θ1 of the user's finger F1 touching the touch screen, calculated as above, the moving speed and target direction of the simulated picture in the simulated scene are further determined, so that the simulated scene displayed on the touch screen moves in direction θ1 at speed (Vx1, Vy1). Similarly, from the visual angle adjustment speed (Vx2, Vy2) and direction θ2 of the first finger F2, the visual angle of the simulated scene is changed in direction θ2 at speed (Vx2, Vy2). The simulated pictures after the moving operation and the visual angle operation are then dynamically displayed on the touch screen at the same time; the operations on the simulated scene can thus be shown to the user intuitively, improving interaction between the user and the simulated scene displayed on the touch screen.
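Assuming the per-panel computation is the walking_speed sketch given earlier, the two-finger update of S801 to S803 might look like the following; the function name and the returned structure are illustrative.

```python
# Sketch of S801-S803 for the two-finger case: the same per-panel
# computation (walking_speed above) is applied independently to each
# panel's touch; the result is one movement update and one visual angle
# update composited into the same frame. All names are illustrative.
def two_finger_update(walk_touch, view_touch):
    # Each touch is (x0, y0, x1, y1, visual_size_x, visual_size_y).
    vx1, vy1, theta1 = walking_speed(*walk_touch)  # walking panel: movement
    vx2, vy2, theta2 = walking_speed(*view_touch)  # visual angle panel
    return {
        "move": {"speed": (vx1, vy1), "direction": theta1},
        "view": {"speed": (vx2, vy2), "direction": theta2},
    }
```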
Fig. 9 is a schematic flowchart of another touch screen interaction gesture control method according to an embodiment of the present application; optionally, if the interaction gesture type is a three-finger interaction type, the control panel corresponding to the interaction gesture type is a height control panel.
The step S204: responding to the operation of a user on the control panel, determining an operated picture and dynamically displaying the operated simulated picture on the touch screen, wherein the method comprises the following steps:
and S901, acquiring the sliding distance of the user for the height control panel.
The height control panel is an operation panel for adjusting the visual angle height of the simulated scene picture displayed on the touch screen. The user uses simple gesture operations on this panel, such as moving the three-finger visual angle down or up, to adjust the visual angle in the picture of the simulated scene up and down.
It should be noted that when the user touches the touch screen with three fingers together, the interaction gesture type is the three-finger interaction type; accordingly, the height control panel is called up and displayed in the area touched by the three fingers.
When the user touches the screen with three fingers F3, F4, and F5, the sliding distance is the distance from the touch point position T of the last finger to press the screen (F5) to the position M where it is removed.
Therefore, the touch screen responds to the user's three-finger operation on the height control panel and acquires the sliding distance on the screen of F5, the last of the user's three fingers to press the screen.
And S902, determining the visual angle height in the simulated scene according to the sliding distance of the height control panel.
Optionally, different sliding distances correspond to different viewing angle heights, that is, there is a one-to-one mapping relationship between the sliding distance and the viewing angle adjustment height.
Therefore, on the basis of the above embodiment, after the sliding distance of the user with respect to the height control panel is obtained, the viewing angle adjustment height of the simulated scene may be further determined according to the mapping relationship between the sliding distance and the viewing angle adjustment height.
And S903, determining and displaying the simulation picture after the visual angle operation in the simulation scene according to the visual angle height in the simulation scene.
Optionally, according to the visual angle adjustment height in the simulated scene obtained above, the visual angle height of the simulated scene displayed on the touch screen is changed accordingly, so that the simulated picture after the visual angle height operation is dynamically displayed on the touch screen. The visual angle height change of the simulated scene is thus shown to the user intuitively, improving interaction between the user and the simulated scene displayed on the touch screen.
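A minimal sketch of this distance-to-height mapping follows. The patent only states that the mapping is one-to-one, so the linear scale, base height, and clamping limits below are assumed values for illustration.

```python
# Sketch of S901-S903: map the slide distance on the height control panel
# to a visual angle height through a one-to-one (here linear) mapping,
# clamped to the simulated scene's height range. Scale and limits assumed.
def view_height(slide_distance: float, base_height: float = 1.7,
                scale: float = 0.01, min_h: float = 0.5,
                max_h: float = 50.0) -> float:
    h = base_height + slide_distance * scale  # linear distance-to-height map
    return max(min_h, min(max_h, h))          # clamp to the scene's range
```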
Fig. 10 is a schematic flowchart illustrating another method for controlling an interactive gesture on a touch screen according to an embodiment of the present disclosure. Optionally, if the interaction gesture type is a four-finger interaction type, the control panel corresponding to the interaction gesture type is a jump control panel.
The step S204: responding to the operation of a user on the control panel, determining an operated picture and dynamically displaying the operated simulated picture on the touch screen, wherein the method comprises the following steps:
s1001, acquiring the number of fingers of a user for the jump control panel, and judging whether the number of fingers of the user touching the jump control panel changes.
The jump control panel is an operation panel for making a target character in the simulated scene displayed on the touch screen jump. The user can make the target character jump with simple gesture operations on this panel.
When the user touches the touch screen with four fingers, the interaction gesture type is the four-finger interaction type; accordingly, the jump control panel is called up and displayed in the area touched by the four fingers.
Therefore, the touch screen responds to the user's four-finger operation on the jump control panel, acquires the number N of fingers touching the panel, and judges whether N has become smaller or larger than four. If it has, the number of fingers touching the jump control panel has changed, and the user's jump operation on the panel is not responded to.
And S1002, if not, determining and displaying the simulated picture after the jumping operation in the simulated scene.
Optionally, if the number of fingers of the user touching the jump control panel is four, it may be determined that the user always keeps the four fingers touching the "jump control panel," that is, a jump operation performed by the user with respect to the jump control panel is responded, a simulation picture after the jump operation is performed on the target character in the simulation scene is determined and displayed, an operation of jumping the target character in the simulation scene may be visually displayed to the user, and interaction between the user and the simulation scene displayed in the touch screen is improved.
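A minimal sketch of the check in S1001-S1002, under the assumption of a touch_points list holding the current contacts on the jump control panel and a hypothetical scene interface (target_character.jump, redraw):

```python
# Hypothetical sketch of S1001-S1002: respond to the jump only while the
# number of fingers touching the jump control panel stays at four.

REQUIRED_FINGERS = 4

def on_jump_panel_touch(touch_points: list, scene) -> None:
    n = len(touch_points)          # S1001: count the touching fingers
    if n != REQUIRED_FINGERS:
        return                     # finger count changed: do not respond
    scene.target_character.jump()  # S1002: perform the jump operation
    scene.redraw()                 # display the picture after the jump
```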
The touch screen interaction gesture control apparatus, storage medium, and the like provided in the present application are described below; for their specific implementation processes and technical effects, refer to the foregoing description, which is not repeated below.
Fig. 11 is a schematic structural diagram of a touch screen interaction gesture control apparatus according to an embodiment of the present application. Optionally, as shown in Fig. 11, the apparatus includes: a response module 1101, a determining module 1102 and a display module 1103.
a response module 1101, configured to, in response to a touch operation of a user on the touch screen, acquire the number of fingers with which the user touches the touch screen, the touch screen displaying a picture of a simulated scene;
the determining module 1102 is configured to determine an interaction gesture type of the user according to the number of fingers and a mapping relationship between a preset number of fingers and the interaction gesture type, where the interaction gesture type includes: a single-finger interaction type, a two-finger interaction type, a three-finger interaction type, and a four-finger interaction type;
the display module 1103 is configured to display, according to the type of the interaction gesture of the user, a control panel corresponding to the type of the interaction gesture on an area of the touch screen touched by a finger of the user;
the response module 1101 is further configured to determine an operated screen and dynamically display the operated simulated screen on the touch screen in response to an operation on the control panel by the user.
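A minimal sketch of how these three modules could cooperate, assuming a hypothetical screen.show_panel interface; the patent specifies the finger-count-to-gesture mapping but none of these names:

```python
# Hypothetical wiring of the modules of Fig. 11: response module 1101
# counts fingers, determining module 1102 maps the count to a gesture
# type, and display module 1103 shows the matching control panel.

GESTURE_BY_FINGERS = {
    1: "single-finger",   # walking control panel
    2: "two-finger",      # walking + viewing angle control panels
    3: "three-finger",    # height control panel
    4: "four-finger",     # jump control panel
}

def on_touch(touch_points, screen):
    n = len(touch_points)                # response module 1101
    gesture = GESTURE_BY_FINGERS.get(n)  # determining module 1102
    if gesture is None:
        return  # no control panel defined for this finger count
    screen.show_panel(gesture, region=touch_points)  # display module 1103
```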
Optionally, if the interaction gesture type is a single-finger interaction type, the control panel corresponding to the interaction gesture type is a walking control panel;
a response module 1101, further configured to:
acquire the touch point offset and the touch area of the user on the walking control panel;
determine the moving speed and the moving target direction in the simulated scene according to the touch point offset and the touch area;
and determine and display the simulated picture after the walking operation in the simulated scene according to the moving speed and the target direction (a sketch follows).
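Where the patent leaves the exact formulas open, one plausible sketch derives the target direction from the normalized touch point offset and the speed from the touch area; the proportional gain is an assumption:

```python
# Hypothetical sketch of the walking control: offset vector -> direction,
# touch area -> speed. SPEED_PER_AREA is an assumed gain.

import math

SPEED_PER_AREA = 0.05

def walking_update(offset_x: float, offset_y: float, touch_area: float):
    """Return (moving speed, unit target direction) for the simulated scene."""
    length = math.hypot(offset_x, offset_y)
    if length == 0.0:
        return 0.0, (0.0, 0.0)   # finger has not moved: stand still
    direction = (offset_x / length, offset_y / length)
    return touch_area * SPEED_PER_AREA, direction
```

For example, walking_update(3.0, 4.0, 120.0) returns a speed of 6.0 in the unit direction (0.6, 0.8).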
Optionally, the response module 1101 is further configured to:
respond to the operation of the user newly adding a first finger to touch the touch screen, and display a viewing angle control panel in the area where the newly added first finger touches the touch screen, wherein the distance between the touch position of the newly added first finger and the touch position of the finger operating the walking control panel is greater than a preset distance;
and, in response to the operation of the user on the viewing angle control panel, determine and display the simulated picture after the viewing angle change operation in the simulated scene (a sketch of the distance check follows).
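A minimal sketch of the distance condition, assuming pixel coordinates, an assumed threshold value, and the same hypothetical screen.show_panel interface as above:

```python
# Hypothetical sketch: show the viewing angle control panel only if the
# newly added finger lands far enough from the walking-panel finger.

import math

PRESET_DISTANCE = 150.0  # assumed threshold, in pixels

def maybe_show_view_panel(new_touch, walking_touch, screen) -> bool:
    dx = new_touch[0] - walking_touch[0]
    dy = new_touch[1] - walking_touch[1]
    if math.hypot(dx, dy) > PRESET_DISTANCE:
        screen.show_panel("viewing-angle", region=new_touch)
        return True
    return False
```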
Optionally, if the interaction gesture type is a two-finger interaction type, the control panel corresponding to the interaction gesture type includes a walking control panel and a viewing angle control panel;
a response module 1101, further configured to:
respectively acquire the touch point offset and the touch area of the user on the walking control panel, and the touch point offset and the touch area of the user on the viewing angle control panel;
determine the moving speed and the moving target direction in the simulated scene according to the touch point offset and the touch area of the walking control panel, and determine the viewing angle adjustment speed and direction in the simulated scene according to the touch point offset and the touch area of the viewing angle control panel;
and determine and display the simulated picture after the walking operation and the viewing angle adjustment operation in the simulated scene according to the moving speed and the moving target direction in the simulated scene and the viewing angle adjustment speed and direction in the simulated scene (a combined sketch follows).
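A sketch of the two-panel case, reusing one helper for both panels and rendering one frame with both results; the gains and the scene interface (move, rotate_view, redraw) are assumptions:

```python
# Hypothetical sketch: walking and viewing angle panels are evaluated
# independently, then one simulated picture is rendered per frame.

import math

def panel_update(offset, area, gain):
    """(speed, unit direction) from an offset vector and a touch area."""
    length = math.hypot(offset[0], offset[1])
    if length == 0.0:
        return 0.0, (0.0, 0.0)
    return area * gain, (offset[0] / length, offset[1] / length)

def two_finger_update(walk_offset, walk_area, view_offset, view_area, scene):
    move_speed, move_dir = panel_update(walk_offset, walk_area, gain=0.05)
    view_speed, view_dir = panel_update(view_offset, view_area, gain=0.02)
    scene.move(move_dir, move_speed)         # walking operation
    scene.rotate_view(view_dir, view_speed)  # viewing angle adjustment
    scene.redraw()                           # display the resulting picture
```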
Optionally, the response module 1101 is further configured to:
and, in response to the operation of the user's second finger leaving the touch screen, delete the control panel targeted by the second finger from the touch screen (a minimal sketch follows).
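A minimal sketch of this cleanup, assuming a hypothetical finger-to-panel registry and a screen.remove_panel interface:

```python
# Hypothetical sketch: when a finger lifts, delete the control panel
# that was displayed for that finger.

panels_by_finger = {}  # finger id -> control panel shown for that finger

def on_finger_up(finger_id: int, screen) -> None:
    panel = panels_by_finger.pop(finger_id, None)
    if panel is not None:
        screen.remove_panel(panel)
```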
Optionally, if the interaction gesture type is a three-finger interaction type, the control panel corresponding to the interaction gesture type is a height control panel;
a response module 1101, further configured to:
acquire the sliding distance of the user on the height control panel;
determine the viewing angle height in the simulated scene according to the sliding distance on the height control panel;
and determine and display the simulated picture after the viewing angle operation in the simulated scene according to the viewing angle height in the simulated scene.
Optionally, if the interaction gesture type is the four-finger interaction type, the control panel corresponding to the interaction gesture type is a jump control panel;
a response module 1101, further configured to:
acquire the number of fingers of the user on the jump control panel, and judge whether the number of fingers touching the jump control panel changes;
and if not, determine and display the simulated picture after the jump operation in the simulated scene.
The above-mentioned apparatus is used for executing the method provided by the foregoing embodiment, and the implementation principle and technical effect are similar, which are not described herein again.
These modules may be one or more integrated circuits configured to implement the above methods, for example: one or more application-specific integrated circuits (ASICs), one or more digital signal processors (DSPs), or one or more field-programmable gate arrays (FPGAs), among others. For another example, when one of the above modules is implemented in the form of program code scheduled by a processing element, the processing element may be a general-purpose processor, such as a central processing unit (CPU) or another processor capable of calling program code. For another example, these modules may be integrated together and implemented in the form of a system-on-chip (SoC).
Optionally, the present invention further provides a program product, for example a computer-readable storage medium, comprising a program which, when executed by a processor, performs the above method embodiments.
In the several embodiments provided by the present invention, it should be understood that the disclosed apparatus and method may be implemented in other manners. For example, the apparatus embodiments described above are merely illustrative: the division into units is only one logical division, and other divisions are possible in practice; for instance, a plurality of units or components may be combined or integrated into another system, or some features may be omitted or not executed. In addition, the mutual coupling, direct coupling, or communication connection shown or discussed may be an indirect coupling or communication connection through some interfaces, devices, or units, and may be electrical, mechanical, or in other forms.
The units described as separate parts may or may not be physically separate, and parts displayed as units may or may not be physical units; they may be located in one place or distributed over a plurality of network units. Some or all of the units may be selected according to actual needs to achieve the purpose of the solution of the embodiment.
In addition, the functional units in the embodiments of the present invention may be integrated into one processing unit, each unit may exist alone physically, or two or more units may be integrated into one unit. The integrated unit may be implemented in the form of hardware, or in the form of hardware plus a software functional unit.
An integrated unit implemented in the form of a software functional unit may be stored in a computer-readable storage medium. The software functional unit is stored in a storage medium and includes several instructions for causing a computer device (which may be a personal computer, a server, or a network device) or a processor to execute some steps of the methods according to the embodiments of the present invention. The aforementioned storage medium includes: a USB flash drive, a removable hard disk, a read-only memory (ROM), a random access memory (RAM), a magnetic disk, an optical disk, or other media capable of storing program code.

Claims (9)

1. A touch screen interaction gesture control method is characterized by comprising the following steps:
responding to the touch operation of a user on the touch screen, and acquiring the number of fingers of the user touching the touch screen, wherein the touch screen displays the picture of a simulation scene;
determining the interactive gesture type of the user according to the number of the fingers and the mapping relation between the preset number of the fingers and the interactive gesture type, wherein the interactive gesture type comprises the following steps: a single-finger interaction type, a two-finger interaction type, a three-finger interaction type, and a four-finger interaction type;
according to the type of the interaction gesture of the user, displaying a control panel corresponding to the type of the interaction gesture in an area touched by the finger of the user on the touch screen;
responding to the operation of a user on the control panel, determining an operated picture and dynamically displaying the operated simulated picture on the touch screen;
if the interaction gesture type is the two-finger interaction type, the control panel corresponding to the interaction gesture type comprises a walking control panel and a viewing angle control panel;
wherein the responding to the operation of the user on the control panel, determining the operated picture and dynamically displaying the operated simulated picture on the touch screen comprises:
respectively acquiring the touch point offset and the touch area of the user on the walking control panel, and the touch point offset and the touch area of the user on the viewing angle control panel;
determining the moving speed and the moving target direction in the simulated scene according to the touch point offset and the touch area of the walking control panel, and determining the viewing angle adjustment speed and direction in the simulated scene according to the touch point offset and the touch area of the viewing angle control panel;
and determining and displaying the simulated picture after the walking operation and the viewing angle adjustment operation in the simulated scene according to the moving speed and the moving target direction in the simulated scene and the viewing angle adjustment speed and direction in the simulated scene.
2. The method according to claim 1, wherein if the interaction gesture type is the single-finger interaction type, the control panel corresponding to the interaction gesture type is a walking control panel;
wherein the responding to the operation of the user on the control panel, determining the operated picture and dynamically displaying the operated simulated picture on the touch screen comprises:
acquiring the touch point offset and the touch area of the user on the walking control panel;
determining the moving speed and the moving target direction in the simulated scene according to the touch point offset and the touch area;
and determining and displaying the simulated picture after the walking operation in the simulated scene according to the moving speed and the target direction.
3. The method of claim 2, further comprising:
in response to the operation of the user newly adding a first finger to touch the touch screen, displaying a viewing angle control panel in the area of the touch screen newly touched by the first finger, wherein the distance between the touch position of the newly added first finger and the touch position of the finger operating the walking control panel is greater than a preset distance;
and in response to the operation of the user on the viewing angle control panel, determining and displaying the simulated picture after the viewing angle change operation in the simulated scene.
4. The method of claim 1, further comprising:
and in response to the operation of a second finger of the user leaving the touch screen, deleting the control panel targeted by the second finger from the touch screen.
5. The method according to claim 1, wherein if the interaction gesture type is the three-finger interaction type, the control panel corresponding to the interaction gesture type is a height control panel;
wherein the responding to the operation of the user on the control panel, determining the operated picture and dynamically displaying the operated simulated picture on the touch screen comprises:
acquiring the sliding distance of the user on the height control panel;
determining the viewing angle height in the simulated scene according to the sliding distance on the height control panel;
and determining and displaying the simulated picture after the viewing angle operation in the simulated scene according to the viewing angle height in the simulated scene.
6. The method of claim 1, wherein if the interaction gesture type is the four-finger interaction type, the control panel corresponding to the interaction gesture type is a jump control panel;
wherein the responding to the operation of the user on the control panel, determining the operated picture and dynamically displaying the operated simulated picture on the touch screen comprises:
acquiring the number of fingers of the user on the jump control panel, and judging whether the number of fingers touching the jump control panel changes;
and if not, determining and displaying the simulated picture after the jump operation in the simulated scene.
7. A touch screen interactive gesture control apparatus, the apparatus comprising: the device comprises a response module, a determination module and a display module;
the response module is configured to, in response to a touch operation of a user on the touch screen, acquire the number of fingers with which the user touches the touch screen, the touch screen displaying a picture of a simulated scene;
the determining module is configured to determine an interaction gesture type of the user according to the number of fingers and a mapping relationship between a preset number of fingers and the interaction gesture type, where the interaction gesture type includes: a single-finger interaction type, a two-finger interaction type, a three-finger interaction type, and a four-finger interaction type;
the display module is used for displaying a control panel corresponding to the interactive gesture type in an area, touched by the finger of the user, of the touch screen according to the interactive gesture type of the user;
the response module is further used for responding to the operation of the user on the control panel, determining an operated picture and dynamically displaying the operated simulated picture on the touch screen;
if the interaction gesture type is the two-finger interaction type, the control panel corresponding to the interaction gesture type comprises a walking control panel and a viewing angle control panel;
the response module is further configured to:
respectively acquire the touch point offset and the touch area of the user on the walking control panel, and the touch point offset and the touch area of the user on the viewing angle control panel;
determine the moving speed and the moving target direction in the simulated scene according to the touch point offset and the touch area of the walking control panel, and determine the viewing angle adjustment speed and direction in the simulated scene according to the touch point offset and the touch area of the viewing angle control panel;
and determine and display the simulated picture after the walking operation and the viewing angle adjustment operation in the simulated scene according to the moving speed and the moving target direction in the simulated scene and the viewing angle adjustment speed and direction in the simulated scene.
8. A touch screen, comprising: a processor, a storage medium and a bus, the storage medium storing machine-readable instructions executable by the processor, the processor and the storage medium communicating via the bus when the touch screen is operated, the processor executing the machine-readable instructions to perform the steps of the method according to any one of claims 1 to 6.
9. A computer-readable storage medium, characterized in that the storage medium has stored thereon a computer program which, when being executed by a processor, carries out the steps of the method according to any one of claims 1-6.
CN202110504580.8A 2021-05-07 2021-05-07 Touch screen interaction gesture control method and device, touch screen and storage medium Active CN113138670B (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN202110504580.8A CN113138670B (en) 2021-05-07 2021-05-07 Touch screen interaction gesture control method and device, touch screen and storage medium

Publications (2)

Publication Number Publication Date
CN113138670A CN113138670A (en) 2021-07-20
CN113138670B (en) 2022-11-18

Family

ID=76818002

Family Applications (1)

Application Number Title Priority Date Filing Date
CN202110504580.8A Active CN113138670B (en) 2021-05-07 2021-05-07 Touch screen interaction gesture control method and device, touch screen and storage medium

Country Status (1)

Country Link
CN (1) CN113138670B (en)

Families Citing this family (2)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
TWI809585B (en) * 2021-12-03 2023-07-21 禾瑞亞科技股份有限公司 Suspended touch panel and suspended touch device
CN114780009A (en) * 2022-05-24 2022-07-22 Oppo广东移动通信有限公司 Three-dimensional object rotation method, device, equipment, storage medium and program product

Citations (4)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN103081496A (en) * 2010-09-08 2013-05-01 瑞典爱立信有限公司 Gesture-based control of IPTV system
CN103577029A (en) * 2012-07-27 2014-02-12 鸿富锦精密工业(武汉)有限公司 Application program control system and application program control method
CN105630374A (en) * 2015-12-17 2016-06-01 网易(杭州)网络有限公司 Virtual character control mode switching method and device
CN110633044A (en) * 2019-08-27 2019-12-31 联想(北京)有限公司 Control method, control device, electronic equipment and storage medium

Family Cites Families (2)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN106975219B (en) * 2017-03-27 2019-02-12 网易(杭州)网络有限公司 Display control method and device, storage medium, the electronic equipment of game picture
CN112121417B (en) * 2020-09-30 2022-04-15 腾讯科技(深圳)有限公司 Event processing method, device, equipment and storage medium in virtual scene

Non-Patent Citations (2)

* Cited by examiner, † Cited by third party
Title
Lin, CY et al.; "An interactive finger-gaming robot with real-time emotion feedback"; Proceedings of the 2015 6th International Conference on Automation, Robotics and Applications (ICARA); 2016-09-22; pp. 513-518 *
Zhang Qiushi et al.; "Immersive operation training platform for power transmission line operation and maintenance based on gesture interaction"; Journal of Electric Power Science and Technology; 2017-12-28 (No. 04) *

Also Published As

Publication number Publication date
CN113138670A (en) 2021-07-20

Similar Documents

Publication Publication Date Title
US11833426B2 (en) Virtual object control method and related apparatus
US10318149B2 (en) Method and apparatus for performing touch operation in a mobile device
US9436369B2 (en) Touch interface for precise rotation of an object
US9836146B2 (en) Method of controlling virtual object or view point on two dimensional interactive display
US11443453B2 (en) Method and device for detecting planes and/or quadtrees for use as a virtual substrate
RU2623198C2 (en) Diagonal sliding gesture for selection and permutation
JP6274682B2 (en) Method and user equipment for displaying interface content
CN113138670B (en) Touch screen interaction gesture control method and device, touch screen and storage medium
CN104156171A (en) Method and device for preventing touch key misoperations in landscape screen state of mobile terminal
WO2021203724A1 (en) Handwriting selection method and apparatus, and computer device and storage medium
JP2004078693A (en) Visual field movement operating method
WO2015025345A1 (en) Information display device, information display method, and information display program
CN114895838A (en) Application program display method and terminal
CN111420395B (en) Interaction method and device in game, readable storage medium and electronic equipment
CN111383345B (en) Virtual content display method and device, terminal equipment and storage medium
JP2014099155A (en) Method and electronic device for providing virtual keyboard
WO2017139141A1 (en) Scroll mode for touch/pointing control
CN108245889B (en) Free visual angle orientation switching method and device, storage medium and electronic equipment
CN103513914A (en) Touch control method and device of application object
JP2016095716A (en) Information processing apparatus, information processing method, and program
CN111228790A (en) Game role display control method and device, electronic equipment and computer medium
WO2018000606A1 (en) Virtual-reality interaction interface switching method and electronic device
KR20130054990A (en) Single touch process to achieve dual touch experience field
CN113457144B (en) Virtual unit selection method and device in game, storage medium and electronic equipment
CN112068699A (en) Interaction method, interaction device, electronic equipment and storage medium

Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination
CB03 Change of inventor or designer information
Inventor after: Gao Zhisheng; Zhang Eryang; Cui Zhibin; Ge Yaoxu; Jiao Yuefeng; Ao Yalei; Hou Xiaolong
Inventor before: Zhang Eryang; Gao Zhisheng; Cui Zhibin; Ge Yaoxu; Jiao Yuefeng; Ao Yalei; Hou Xiaolong
GR01 Patent grant