CN112711359A - Man-machine interaction method and electronic equipment - Google Patents

Man-machine interaction method and electronic equipment

Info

Publication number
CN112711359A
Authority
CN
China
Prior art keywords
touch
gesture operation
area
gesture
screen
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Pending
Application number
CN201911016772.3A
Other languages
Chinese (zh)
Inventor
陈浩
陈晓晓
王卿
郑爱华
胡凯
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Huawei Device Co Ltd
Original Assignee
Huawei Device Co Ltd
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Huawei Device Co Ltd filed Critical Huawei Device Co Ltd
Priority to CN201911016772.3A priority Critical patent/CN112711359A/en
Publication of CN112711359A publication Critical patent/CN112711359A/en
Pending legal-status Critical Current


Classifications

    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06FELECTRIC DIGITAL DATA PROCESSING
    • G06F3/00Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
    • G06F3/01Input arrangements or combined input and output arrangements for interaction between user and computer
    • G06F3/048Interaction techniques based on graphical user interfaces [GUI]
    • G06F3/0481Interaction techniques based on graphical user interfaces [GUI] based on specific properties of the displayed interaction object or a metaphor-based environment, e.g. interaction with desktop elements like windows or icons, or assisted by a cursor's changing behaviour or appearance
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06FELECTRIC DIGITAL DATA PROCESSING
    • G06F1/00Details not covered by groups G06F3/00 - G06F13/00 and G06F21/00
    • G06F1/16Constructional details or arrangements
    • G06F1/1613Constructional details or arrangements for portable computers
    • G06F1/1633Constructional details or arrangements of portable computers not specific to the type of enclosures covered by groups G06F1/1615 - G06F1/1626
    • G06F1/1637Details related to the display arrangement, including those related to the mounting of the display in the housing
    • G06F1/1643Details related to the display arrangement, including those related to the mounting of the display in the housing the display being associated to a digitizer, e.g. laptops that can be used as penpads
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06FELECTRIC DIGITAL DATA PROCESSING
    • G06F1/00Details not covered by groups G06F3/00 - G06F13/00 and G06F21/00
    • G06F1/16Constructional details or arrangements
    • G06F1/1613Constructional details or arrangements for portable computers
    • G06F1/1633Constructional details or arrangements of portable computers not specific to the type of enclosures covered by groups G06F1/1615 - G06F1/1626
    • G06F1/1637Details related to the display arrangement, including those related to the mounting of the display in the housing
    • G06F1/1652Details related to the display arrangement, including those related to the mounting of the display in the housing the display being flexible, e.g. mimicking a sheet of paper, or rollable
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06FELECTRIC DIGITAL DATA PROCESSING
    • G06F3/00Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
    • G06F3/01Input arrangements or combined input and output arrangements for interaction between user and computer
    • G06F3/048Interaction techniques based on graphical user interfaces [GUI]
    • G06F3/0484Interaction techniques based on graphical user interfaces [GUI] for the control of specific functions or operations, e.g. selecting or manipulating an object, an image or a displayed text element, setting a parameter value or selecting a range
    • G06F3/0485Scrolling or panning
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06FELECTRIC DIGITAL DATA PROCESSING
    • G06F3/00Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
    • G06F3/01Input arrangements or combined input and output arrangements for interaction between user and computer
    • G06F3/048Interaction techniques based on graphical user interfaces [GUI]
    • G06F3/0487Interaction techniques based on graphical user interfaces [GUI] using specific features provided by the input device, e.g. functions controlled by the rotation of a mouse with dual sensing arrangements, or of the nature of the input device, e.g. tap gestures based on pressure sensed by a digitiser
    • G06F3/0488Interaction techniques based on graphical user interfaces [GUI] using specific features provided by the input device, e.g. functions controlled by the rotation of a mouse with dual sensing arrangements, or of the nature of the input device, e.g. tap gestures based on pressure sensed by a digitiser using a touch-screen or digitiser, e.g. input of commands through traced gestures
    • G06F3/04883Interaction techniques based on graphical user interfaces [GUI] using specific features provided by the input device, e.g. functions controlled by the rotation of a mouse with dual sensing arrangements, or of the nature of the input device, e.g. tap gestures based on pressure sensed by a digitiser using a touch-screen or digitiser, e.g. input of commands through traced gestures for inputting data by handwriting, e.g. gesture or text
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06FELECTRIC DIGITAL DATA PROCESSING
    • G06F9/00Arrangements for program control, e.g. control units
    • G06F9/06Arrangements for program control, e.g. control units using stored programs, i.e. using an internal store of processing equipment to receive or retain programs
    • G06F9/44Arrangements for executing specific programs
    • G06F9/451Execution arrangements for user interfaces

Landscapes

  • Engineering & Computer Science (AREA)
  • Theoretical Computer Science (AREA)
  • General Engineering & Computer Science (AREA)
  • Human Computer Interaction (AREA)
  • Physics & Mathematics (AREA)
  • General Physics & Mathematics (AREA)
  • Computer Hardware Design (AREA)
  • Software Systems (AREA)
  • User Interface Of Digital Computer (AREA)

Abstract

The application relates to the technical field of electronic devices, and in particular provides a human-computer interaction method and an electronic device. The method is applied to an electronic device configured with at least one touch screen, where the at least one touch screen includes at least one main display area and at least one touch expansion area, the at least one touch expansion area being an elongated area located at an edge or in the middle of the at least one touch screen. The method includes: receiving a touch operation acting on the at least one touch expansion area; determining that the gesture operation to which the touch operation belongs is a long-stroke gesture operation, where the sliding distance of the long-stroke gesture operation meets a preset condition; and when the touch operation belongs to the long-stroke gesture operation, executing a processing task for the display interface of the at least one main display area.

Description

Man-machine interaction method and electronic equipment
Technical Field
The application relates to the technical field of electronic devices, and in particular to a human-computer interaction method and an electronic device.
Background
For electronic devices equipped with touch screens (touch panels, TPs), such as mobile phones and tablet computers, user operations can be received through the touch screen to implement related control. As an input device of the electronic device, the touch screen provides a simple, convenient and natural mode of human-computer interaction. In general, a user may touch, with a finger or the like, a touch screen on which user interface objects are displayed, and the electronic device performs an action related to the user interface according to the touched position, the touch type, and so on.
With the development of electronic device technology, richer user operations, such as screen capture, can be supported.
Currently, several screen capture schemes are provided.
1) Screen capture through a preset combination of physical keys, such as power plus volume-up. This scheme requires pressing the physical keys simultaneously, which is slow and not easy to operate.
2) Screen capture through the screenshot button of the shortcut panel. This scheme requires first calling up the shortcut panel and then tapping the screenshot button, after which the shortcut panel must be hidden, otherwise it is captured together with the screen. The scheme is therefore slow and cumbersome, and cannot meet the demand for speed.
3) Screen capture through a gesture such as a three-finger downward slide. Although this scheme is fast, it easily triggers a response from the user interface of the currently running application, causing conflicting responses and false touches.
Disclosure of Invention
An embodiment of this application provides a human-computer interaction method, in which a long-stroke gesture operation performed in a preset area of a touch screen causes the electronic device to execute a processing task for the interface displayed in a display area of the touch screen, improving the user's operation experience.
In a first aspect, a human-computer interaction method is provided, applied to an electronic device configured with at least one touch screen, where the at least one touch screen includes at least one main display area and at least one touch expansion area, the at least one touch expansion area being an elongated area located at an edge or in the middle of the at least one touch screen. The method includes: receiving a touch operation acting on the at least one touch expansion area; determining that the gesture operation to which the touch operation belongs is a long-stroke gesture operation, where the sliding distance of the long-stroke gesture operation meets a preset condition; and when the touch operation belongs to the long-stroke gesture operation, executing a processing task for the display interface of the at least one main display area.
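The first-aspect flow above can be sketched as follows. This is a hypothetical illustration, not code from the patent: `TouchPoint`, `classify`, `handle_touch` and the 200-pixel threshold are all assumed names and values.

```python
# Illustrative sketch of the first-aspect flow: a touch operation arriving
# in the touch expansion area is classified, and when it is a long-stroke
# gesture operation, a processing task is run against the main display area.
from dataclasses import dataclass

@dataclass
class TouchPoint:
    x: float          # position on the screen
    y: float
    timestamp: float  # seconds

def classify(points, min_distance=200.0):
    """Return 'long_stroke' when the sliding distance meets the preset condition."""
    start, end = points[0], points[-1]
    distance = ((end.x - start.x) ** 2 + (end.y - start.y) ** 2) ** 0.5
    return "long_stroke" if distance >= min_distance else "other"

def handle_touch(points, tasks):
    # tasks maps gesture names to processing tasks for the main display area,
    # e.g. capturing a screenshot of its display interface.
    gesture = classify(points)
    if gesture in tasks:
        tasks[gesture]()
        return True
    return False
```

The later implementation manners refine `classify` with duration, direction, finger-count and contact-area checks.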
With reference to the first aspect, in a first possible implementation manner of the first aspect, when the long-stroke gesture operation is a first gesture operation, the processing task includes: capturing a screenshot image of the display interface of the at least one main display area.
In this implementation, screen capture of the display interface of the main display area can be achieved by performing a long-stroke operation in the touch expansion area.
With reference to the first aspect and the first possible implementation manner of the first aspect, in a second possible implementation manner of the first aspect, when the long-stroke gesture operation is a second gesture operation, the processing task includes: capturing a screenshot image of a first page, where the first page is the page to which the display interface of the at least one main display area belongs, the display interface is a partial page of the first page, and the second gesture operation is different from the first gesture operation.
In this implementation, a long screenshot of the display interface of the main display area can be captured by performing a long-stroke operation in the touch expansion area.
With reference to the first aspect and the first to second possible implementation manners of the first aspect, in a third possible implementation manner of the first aspect, when the long-stroke gesture operation is a third gesture operation, the processing task includes: starting to capture a screen-recording video of preset duration, where the screen-recording video records the display interface of the at least one main display area, and the third gesture operation, the second gesture operation and the first gesture operation are different from one another.
In this implementation, screen recording of the display interface of the main display area can be achieved by performing a long-stroke operation in the touch expansion area.
With reference to the first aspect and the first to third possible implementation manners of the first aspect, in a fourth possible implementation manner of the first aspect, when the long-stroke gesture operation is a fourth gesture operation, the processing task includes: displaying a running interface of a first application in the at least one main display area, where the first application is an application associated with the fourth gesture operation, and the fourth gesture operation, the third gesture operation, the second gesture operation and the first gesture operation are different from one another.
With reference to the first aspect and the first to fourth possible implementation manners of the first aspect, in a fifth possible implementation manner of the first aspect, the long-stroke gesture operation has a sliding direction, and the processing task includes: switching the display interface of the at least one main display area according to the sliding direction.
In this implementation, page turning of the display interface of the main display area can be achieved by performing a long-stroke operation in the touch expansion area.
With reference to the first aspect and the first to fifth possible implementation manners of the first aspect, in a sixth possible implementation manner of the first aspect, the at least one touch expansion area and the main display area are located on different planes.
With reference to the sixth possible implementation manner of the first aspect, in a seventh possible implementation manner of the first aspect, the at least one touch screen includes a curved screen, and the at least one touch expansion area includes the curved area on one side of the curved screen or the curved areas on both sides of the curved screen.
In this implementation, the curved area on one or both sides of the curved screen can serve as the touch expansion area, which is convenient for the user to operate and does not occupy space in the front area of the curved screen.
With reference to the sixth possible implementation manner of the first aspect, in an eighth possible implementation manner of the first aspect, the at least one touch screen includes a foldable screen; the at least one touch extension area comprises a bending area of the foldable screen in a folded state.
In this implementation, the bending area of the foldable screen in the folded state can be used as a touch expansion area, which is convenient for the user to operate and does not occupy the space of the non-bending area.
With reference to the first aspect and the first to fifth possible implementation manners of the first aspect, in a ninth possible implementation manner of the first aspect, the at least one touch screen includes a foldable screen, and the foldable screen in the unfolded state includes a first region, a second region, and a bendable region located between the first region and the second region; when the foldable screen is in the unfolded state, the at least one touch expansion area comprises the bendable area.
In this implementation, the bendable region of the foldable screen in the unfolded state can serve as the touch expansion area, which is convenient for the user to operate and does not occupy the space of the non-bending regions on either side of the bendable region.
With reference to the first aspect and the first to ninth possible implementation manners of the first aspect, in a tenth possible implementation manner of the first aspect, the long-stroke gesture operation includes at least one of:
a single-finger long-stroke operation, a two-finger long-stroke operation, a three-finger long-stroke operation, a four-finger long-stroke operation and a palm long-stroke operation.
In this implementation, multiple long-stroke operations can be defined, adapting to the operation habits of different users.
With reference to the first aspect and the first to tenth possible implementation manners of the first aspect, in an eleventh possible implementation manner of the first aspect, the determining that the gesture operation to which the touch operation belongs is a long-stroke gesture operation includes: determining the sliding distance of the touch operation according to the position of the initial touch point and the position of the terminal touch point of the touch operation; determining the operation duration of the touch operation according to the timestamp of the initial touch point and the timestamp of the terminal touch point; and when the sliding distance of the touch operation meets the preset condition and the operation duration is less than a preset duration, determining that the touch operation is a long-stroke gesture operation.
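The distance-plus-duration check in this implementation manner could look like the following minimal sketch, assuming each touch point carries a position and a timestamp; the thresholds are illustrative, not values from the patent.

```python
# Hypothetical check for the eleventh implementation manner: the sliding
# distance must meet the preset condition and the operation duration must
# be under the preset duration.  start/end are (x, y, timestamp) tuples.
def is_long_stroke(start, end, min_distance=200.0, max_duration=0.5):
    (sx, sy, st), (ex, ey, et) = start, end
    # Sliding distance from the initial and terminal touch-point positions.
    distance = ((ex - sx) ** 2 + (ey - sy) ** 2) ** 0.5
    # Operation duration from the two timestamps.
    duration = et - st
    return distance >= min_distance and duration < max_duration
```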
With reference to the first aspect and the first to tenth possible implementation manners of the first aspect, in a twelfth possible implementation manner of the first aspect, the determining that the gesture operation to which the touch operation belongs is a long-stroke gesture operation includes: determining the sliding distance and the sliding direction of the touch operation according to the position of the initial touch point and the position of the terminal touch point of the touch operation; and when the sliding distance of the touch operation meets the preset condition and the sliding direction meets a preset direction, determining that the touch operation is a long-stroke gesture operation.
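The distance-plus-direction variant can be sketched as follows. The preset direction and the angular tolerance are assumed values for illustration only.

```python
import math

# Hypothetical check for the twelfth implementation manner: the sliding
# distance must meet the preset condition and the sliding direction must
# fall within a tolerance of a preset direction.  start/end are (x, y).
def is_directional_long_stroke(start, end, min_distance=200.0,
                               preset_dir=(1.0, 0.0), max_angle_deg=30.0):
    dx, dy = end[0] - start[0], end[1] - start[1]
    distance = math.hypot(dx, dy)
    if distance < min_distance:
        return False
    # Angle between the sliding direction and the preset direction.
    dot = dx * preset_dir[0] + dy * preset_dir[1]
    angle = math.degrees(math.acos(dot / (distance * math.hypot(*preset_dir))))
    return angle <= max_angle_deg
```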
With reference to the first aspect and the first to tenth possible implementation manners of the first aspect, in a thirteenth possible implementation manner of the first aspect, the touch operation is a touch operation of at least two fingers, and the determining that the gesture operation to which the touch operation belongs is a long-stroke gesture operation includes: determining the sliding distance of the touch operation according to the position of the initial touch point and the position of the terminal touch point of the touch operation; determining, for each same moment during the touch operation, the distance between the touch point corresponding to a first finger and the touch point corresponding to a second finger; and when the sliding distance of the touch operation meets the preset condition and the distances between the touch points corresponding to the first finger and the second finger at the same moments are each smaller than a first threshold, determining that the touch operation is a long-stroke gesture operation of at least two fingers.
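A sketch of this multi-finger check: two finger tracks sampled at the same moments must stay within the first threshold of each other while the overall sliding distance meets the preset condition. Names and thresholds are illustrative assumptions.

```python
# Hypothetical check for the thirteenth implementation manner.
# track1/track2 are lists of (x, y) samples for each finger, taken at the
# same moments during the touch operation.
def is_two_finger_long_stroke(track1, track2,
                              min_distance=200.0, first_threshold=80.0):
    # The two fingers must stay close together at every sampled moment.
    close = all(
        ((x1 - x2) ** 2 + (y1 - y2) ** 2) ** 0.5 < first_threshold
        for (x1, y1), (x2, y2) in zip(track1, track2)
    )
    # Sliding distance from the initial and terminal touch points.
    (sx, sy), (ex, ey) = track1[0], track1[-1]
    distance = ((ex - sx) ** 2 + (ey - sy) ** 2) ** 0.5
    return close and distance >= min_distance
```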
With reference to the first aspect and the first to thirteenth possible implementation manners of the first aspect, in a fourteenth possible implementation manner of the first aspect, the determining that the gesture operation to which the touch operation belongs is a long-stroke gesture operation includes: determining the sliding distance of the touch operation according to the position of the initial touch point and the position of the terminal touch point of the touch operation; and when the sliding distance meets the preset condition and a first feature of the touch points of the touch operation meets a preset condition, determining that the touch operation is a long-stroke gesture operation, where the first feature is an area or a shape. When the first feature is the area, the first feature meeting the preset condition means that the area of the touch points is greater than or equal to a preset area; when the first feature is the shape, the first feature meeting the preset condition means that the shape of the touch points conforms to a preset shape.
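For the case where the first feature is the contact area (e.g. distinguishing a palm stroke from a finger stroke), the check could be sketched as below; the sample format and thresholds are assumptions for illustration.

```python
# Hypothetical check for the fourteenth implementation manner when the
# first feature is the contact area.  samples are (x, y, contact_area)
# tuples for one contact over the course of the touch operation.
def is_palm_long_stroke(samples, min_distance=200.0, preset_area=400.0):
    (sx, sy, _), (ex, ey, _) = samples[0], samples[-1]
    distance = ((ex - sx) ** 2 + (ey - sy) ** 2) ** 0.5
    # Every sampled touch point must be at least the preset area.
    area_ok = all(area >= preset_area for _, _, area in samples)
    return distance >= min_distance and area_ok
```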
With reference to the first aspect and the first to fourteenth possible implementation manners of the first aspect, in a fifteenth possible implementation manner of the first aspect, that the sliding distance of the long-stroke gesture operation satisfies the preset condition includes: the sliding distance is greater than or equal to a second threshold.
With reference to the first aspect and the first to fourteenth possible implementation manners of the first aspect, in a sixteenth possible implementation manner of the first aspect, that the sliding distance of the long-stroke gesture operation satisfies the preset condition includes: the distance between the initial touch point of the long-stroke gesture operation and a first side is smaller than a third threshold, and the distance between the terminal touch point and a second side is smaller than a fourth threshold, where the first side is, of two opposite sides, the side close to the initial touch point, the second side is the side close to the terminal touch point, and the two opposite sides are the opposite sides in the length direction of the at least one touch expansion area.
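The edge-proximity form of the preset condition can be sketched along the length axis of the expansion area: the initial touch point must lie near one side and the terminal point near the opposite side. The third and fourth thresholds below are illustrative values.

```python
# Hypothetical check for the sixteenth implementation manner.  Positions
# are coordinates along the expansion area's length axis, which runs from
# 0 to area_length.
def spans_expansion_area(start_pos, end_pos, area_length,
                         third_threshold=30.0, fourth_threshold=30.0):
    # Whichever direction the stroke runs, its near end must be close to
    # one side and its far end close to the opposite side.
    lo, hi = min(start_pos, end_pos), max(start_pos, end_pos)
    return lo < third_threshold and (area_length - hi) < fourth_threshold
```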
In a second aspect, an electronic device is provided, including:
at least one touch screen, where the at least one touch screen includes at least one main display area and at least one touch expansion area, the at least one touch expansion area being an elongated area located at an edge or in the middle of the at least one touch screen;
a memory for storing computer-executable instructions; and
a processor for executing the computer-executable instructions to cause the electronic device to perform:
receiving a touch operation acting on the at least one touch expansion area;
determining that the gesture operation to which the touch operation belongs is a long-stroke gesture operation, where the sliding distance of the long-stroke gesture operation meets a preset condition; and
when the touch operation belongs to the long-stroke gesture operation, executing a processing task for the display interface of the at least one main display area.
With reference to the second aspect, in a first possible implementation manner of the second aspect, when the long-stroke gesture operation is a first gesture operation, the processor executes the computer-executable instructions to cause the electronic device to perform: capturing a screenshot image of the display interface of the at least one main display area.
In this implementation, screen capture of the display interface of the main display area can be achieved by performing a long-stroke operation in the touch expansion area.
With reference to the second aspect and the first possible implementation manner of the second aspect, in a second possible implementation manner of the second aspect, when the long-stroke gesture operation is a second gesture operation, the processor executes the computer-executable instructions to cause the electronic device to perform: capturing a screenshot image of a first page, where the first page is the page to which the display interface of the at least one main display area belongs, the display interface is a partial page of the first page, and the second gesture operation is different from the first gesture operation.
In this implementation, a long screenshot of the display interface of the main display area can be captured by performing a long-stroke operation in the touch expansion area.
With reference to the second aspect and the first to second possible implementation manners of the second aspect, in a third possible implementation manner of the second aspect, when the long-stroke gesture operation is a third gesture operation, the processor executes the computer-executable instructions to cause the electronic device to perform: starting to capture a screen-recording video of preset duration, where the screen-recording video records the display interface of the at least one main display area, and the third gesture operation, the second gesture operation and the first gesture operation are different from one another.
In this implementation, screen recording of the display interface of the main display area can be achieved by performing a long-stroke operation in the touch expansion area.
With reference to the second aspect and the first to third possible implementation manners of the second aspect, in a fourth possible implementation manner of the second aspect, when the long-stroke gesture operation is a fourth gesture operation, the processor executes the computer-executable instructions to cause the electronic device to perform: displaying a running interface of a first application in the at least one main display area, where the first application is an application associated with the fourth gesture operation, and the fourth gesture operation, the third gesture operation, the second gesture operation and the first gesture operation are different from one another.
With reference to the second aspect and the first to fourth possible implementation manners of the second aspect, in a fifth possible implementation manner of the second aspect, the long-stroke gesture operation has a sliding direction, and the processor executes the computer-executable instructions to cause the electronic device to perform: switching the display interface of the at least one main display area according to the sliding direction.
In this implementation, page turning of the display interface of the main display area can be achieved by performing a long-stroke operation in the touch expansion area.
With reference to the second aspect and the first to fifth possible implementation manners of the second aspect, in a sixth possible implementation manner of the second aspect, the at least one touch expansion area and the main display area are located on different planes.
With reference to the sixth possible implementation manner of the second aspect, in a seventh possible implementation manner of the second aspect, the at least one touch screen includes a curved screen, and the at least one touch expansion area includes the curved area on one side of the curved screen or the curved areas on both sides of the curved screen.
In this implementation, the curved area on one or both sides of the curved screen can serve as the touch expansion area, which is convenient for the user to operate and does not occupy space in the front area of the curved screen.
With reference to the sixth possible implementation manner of the second aspect, in an eighth possible implementation manner of the second aspect, the at least one touch screen includes a foldable screen; the at least one touch extension area comprises a bending area of the foldable screen in a folded state.
In this implementation, the bending area of the foldable screen in the folded state can be used as a touch expansion area, which is convenient for the user to operate and does not occupy the space of the non-bending area.
With reference to the second aspect and the first to fifth possible implementation manners of the second aspect, in a ninth possible implementation manner of the second aspect, the at least one touch screen includes a foldable screen, and the foldable screen in the unfolded state includes a first region, a second region, and a bendable region located between the first region and the second region; when the foldable screen is in the unfolded state, the at least one touch expansion area comprises the bendable area.
In this implementation, the bendable region of the foldable screen in the unfolded state can serve as the touch expansion area, which is convenient for the user to operate and does not occupy the space of the non-bending regions on either side of the bendable region.
With reference to the second aspect and the first to ninth possible implementation manners of the second aspect, in a tenth possible implementation manner of the second aspect, the long-stroke gesture operation includes at least one of:
a single-finger long-stroke operation, a two-finger long-stroke operation, a three-finger long-stroke operation, a four-finger long-stroke operation and a palm long-stroke operation.
In this implementation, multiple long-stroke operations can be defined, adapting to the operation habits of different users.
With reference to the second aspect and the first to tenth possible implementation manners of the second aspect, in an eleventh possible implementation manner of the second aspect, the processor executes the computer-executable instructions to cause the electronic device to perform: determining the sliding distance of the touch operation according to the position of the initial touch point and the position of the terminal touch point of the touch operation; determining the operation duration of the touch operation according to the timestamp of the initial touch point and the timestamp of the terminal touch point; and when the sliding distance of the touch operation meets the preset condition and the operation duration is less than a preset duration, determining that the touch operation is a long-stroke gesture operation.
With reference to the second aspect and the first to tenth possible implementation manners of the second aspect, in a twelfth possible implementation manner of the second aspect, the processor executes the computer-executable instructions to cause the electronic device to perform: determining the sliding distance and the sliding direction of the touch operation according to the position of the initial touch point and the position of the terminal touch point of the touch operation; and when the sliding distance of the touch operation meets the preset condition and the sliding direction meets a preset direction, determining that the touch operation is a long-stroke gesture operation.
With reference to the second aspect and the first to tenth possible implementation manners of the second aspect, in a thirteenth possible implementation manner of the second aspect, the touch operation is a touch operation of at least two fingers, and the processor executes the computer-executable instructions to cause the electronic device to perform: determining the sliding distance of the touch operation according to the position of the initial touch point and the position of the terminal touch point of the touch operation; determining, for each same moment during the touch operation, the distance between the touch point corresponding to a first finger and the touch point corresponding to a second finger; and when the sliding distance of the touch operation meets the preset condition and the distances between the touch points corresponding to the first finger and the second finger at the same moments are each smaller than a first threshold, determining that the touch operation is a long-stroke gesture operation of at least two fingers.
With reference to the second aspect and the first to thirteenth possible implementation manners of the second aspect, in a fourteenth possible implementation manner of the second aspect, the processor executes the computer-executable instructions to cause the electronic device to perform: determining the sliding distance of the touch operation according to the position of the initial touch point and the position of the terminal touch point of the touch operation; and when the sliding distance meets the preset condition and a first feature of the touch points of the touch operation meets a preset condition, determining that the touch operation is a long-stroke gesture operation, where the first feature is an area or a shape. When the first feature is the area, the first feature meeting the preset condition means that the area of the touch points is greater than or equal to a preset area; when the first feature is the shape, the first feature meeting the preset condition means that the shape of the touch points conforms to a preset shape.
With reference to the second aspect and the first to fourteenth possible implementation manners of the second aspect, in a fifteenth possible implementation manner of the second aspect, the condition that the sliding distance of the long-stroke gesture operation satisfies the preset condition includes: the sliding distance is greater than or equal to a second threshold.
With reference to the second aspect and the first to fourteenth possible implementation manners of the second aspect, in a sixteenth possible implementation manner of the second aspect, the condition that the sliding distance of the long-stroke gesture operation satisfies the preset condition includes: the distance between the initial touch point of the long-stroke gesture operation and a first side is smaller than a third threshold, and the distance between the termination touch point of the long-stroke gesture operation and a second side is smaller than a fourth threshold; of two opposite sides, the first side is the side closer to the initial touch point and the second side is the side closer to the termination touch point, the two opposite sides being the opposite sides in the length direction of the at least one touch extension area.
In a third aspect, an embodiment of the present application provides a computer storage medium, which includes computer instructions that, when executed on an electronic device, cause the electronic device to perform the method of the first aspect.
In a fourth aspect, the present application provides a computer program product, where the computer program product includes program code, which, when executed by a processor in an electronic device, implements the method of the first aspect.
With the human-computer interaction method and the electronic device provided herein, a user can perform a long-stroke gesture operation in a specific area to cause the electronic device to execute a processing task for the interface displayed in the display area. This makes it convenient for the user to operate on the displayed interface, helps avoid accidental touches of user interface objects on that interface, and improves the user's operation experience.
Drawings
Fig. 1A is a bottom view of an electronic device according to an embodiment of the present application;
Fig. 1B is an exploded view of an electronic device according to an embodiment of the present application;
Fig. 2A is a schematic structural diagram of an electronic device according to an embodiment of the present application;
Fig. 2B is a schematic structural diagram of an electronic device according to an embodiment of the present application;
Fig. 3A is a schematic structural diagram of an electronic device according to an embodiment of the present application;
Fig. 3B is a schematic structural diagram of an electronic device according to an embodiment of the present application;
Fig. 4 is a schematic structural diagram of an electronic device according to an embodiment of the present application;
Fig. 5A is a schematic software block diagram of an electronic device according to an embodiment of the present application;
Fig. 5B is a schematic software block diagram of an electronic device according to an embodiment of the present application;
Fig. 5C is a schematic software block diagram of an electronic device according to an embodiment of the present application;
Fig. 5D is a schematic software block diagram of an electronic device according to an embodiment of the present application;
Fig. 6A is a schematic diagram of a single-finger long-stroke operation according to an embodiment of the present application;
Fig. 6B is a schematic diagram of a single-finger long-stroke operation according to an embodiment of the present application;
Fig. 6C is a schematic diagram of a single-finger long-stroke operation according to an embodiment of the present application;
Fig. 7 is a schematic diagram of a single-finger long-stroke operation according to an embodiment of the present application;
Fig. 8A is a schematic diagram of a single-finger long-stroke operation with a sliding direction according to an embodiment of the present application;
Fig. 8B is a schematic diagram of a single-finger long-stroke operation with a sliding direction according to an embodiment of the present application;
Fig. 9 is a schematic diagram illustrating single-finger long-stroke operations performed in different touch extension areas according to an embodiment of the present application;
Fig. 10A is a schematic diagram of a multi-finger long-stroke operation according to an embodiment of the present application;
Fig. 10B is a schematic diagram of a multi-finger long-stroke operation according to an embodiment of the present application;
Fig. 11A is a schematic diagram of a palm long-stroke operation according to an embodiment of the present application;
Fig. 11B is a schematic diagram of a palm long-stroke operation according to an embodiment of the present application;
Fig. 12 is a flowchart of a human-computer interaction method according to an embodiment of the present application;
Fig. 13A is a schematic effect diagram of a human-computer interaction method according to an embodiment of the present application;
Fig. 13B is a schematic effect diagram of a human-computer interaction method according to an embodiment of the present application;
Fig. 14 is a schematic structural diagram of an electronic device according to an embodiment of the present application.
Detailed Description
The technical solutions in the embodiments of the present application are described below with reference to the accompanying drawings. It should be apparent that the described embodiments are only some, not all, of the embodiments of the present application.
Reference throughout this specification to "one embodiment," "some embodiments," or the like means that a particular feature, structure, or characteristic described in connection with the embodiment is included in one or more embodiments of the present application. Thus, appearances of the phrases "in one embodiment," "in some embodiments," "in other embodiments," or the like in various places throughout this specification do not necessarily all refer to the same embodiment, but rather mean "one or more but not all embodiments," unless specifically stated otherwise.
In the description of this specification, "/" indicates an "or" relationship; for example, A/B may indicate A or B. "And/or" herein merely describes an association between associated objects and indicates that three relationships may exist; for example, A and/or B may mean: A exists alone, both A and B exist, or B exists alone. In addition, in the description of the embodiments of the present application, "a plurality" means two or more.
In the description of this specification, the terms "first" and "second" are used for descriptive purposes only and are not to be construed as indicating or implying relative importance or implicitly indicating the number of technical features indicated. Thus, a feature defined as "first" or "second" may explicitly or implicitly include one or more of that feature. The terms "comprising," "including," "having," and variations thereof mean "including, but not limited to," unless expressly specified otherwise.
The embodiments of the present application provide a human-computer interaction method that can be applied to an electronic device provided with at least one touch screen. The at least one touch screen may include a main display area and at least one touch extension area. The main display area can display the output of the electronic device and accept touch input. The at least one touch extension area can accept touch input. The at least one touch extension area may be an elongated area at the edge or in the middle of the at least one touch screen. When receiving a touch operation acting on the at least one touch extension area, the electronic device may determine the gesture operation to which the touch operation belongs; and when that gesture operation is a predefined long-stroke gesture operation, execute a processing task for the interface displayed in the main display area. Specifically, when a touch acting on the at least one touch extension area is detected, a touch event may be generated, and the touch event may include one or more types of touch information, such as the position of a touch point, the duration of the touch, the touch trajectory, and the moving direction of the touch. The electronic device may determine, according to the touch information included in the touch event, the gesture operation corresponding to the touch event. The long-stroke gesture operation may be a predefined gesture operation, such as a single-finger long-stroke operation, a two-finger long-stroke operation, a three-finger long-stroke operation, a four-finger long-stroke operation, or a palm long-stroke operation. Each long-stroke gesture operation may be associated with a preset task. The task may be a processing task for the interface displayed in the main display area.
For example, the task may be a screen capture task, a long screen capture task, a screen recording task, or a page-turning task for the interface displayed in the main display area; it may also be a task of starting a preset application, a preset task of a preset application, or the like; these are not listed exhaustively here. If the long-stroke gesture operation corresponding to the touch operation is determined, the preset task associated with the determined long-stroke gesture operation can be executed.
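The association between recognized long-stroke gestures and preset tasks can be sketched as a simple lookup table. This is a hypothetical illustration only; the gesture names and the tasks they map to are example placeholders, not definitions from this application:

```python
# Hypothetical sketch of a gesture-to-task dispatcher. The gesture names and
# task strings are illustrative assumptions, not part of the application text.

def make_dispatcher():
    executed = []  # records which preset tasks ran, for demonstration
    gesture_tasks = {
        "single_finger_long_stroke": lambda: executed.append("screen capture"),
        "two_finger_long_stroke": lambda: executed.append("long screen capture"),
        "palm_long_stroke": lambda: executed.append("screen recording"),
    }

    def dispatch(gesture_name):
        # Execute the preset task associated with the recognized gesture, if any.
        task = gesture_tasks.get(gesture_name)
        if task is None:
            return False  # unrecognized gesture: no task executed
        task()
        return True

    return dispatch, executed
```

A touch operation that is not recognized as a predefined long-stroke gesture simply triggers no task.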
In the embodiment of the present application, the touch point refers to an area where a touch object touches the touch screen to generate a touch operation. The touch object can be a finger or a palm, and the touch object can also be a tool (such as a touch pen) which can touch the touch screen and generate touch operation.
The human-computer interaction method of the embodiments of the present application can be applied to various electronic devices, including but not limited to mobile phones, tablet computers, personal digital assistants (PDAs), wearable devices, laptop computers, and other portable electronic devices. Exemplary embodiments of portable electronic devices include, but are not limited to, portable electronic devices running iOS, Android, Microsoft, or other operating systems. The electronic device may also be another type of electronic device, such as a home appliance (for example, a refrigerator or a washing machine) or an automotive or industrial electronic device. The embodiments of the present application do not specifically limit the type of the electronic device.
Next, the touch extension area will be specifically described.
One or more areas can be divided in advance on a touch screen of the electronic device, and each such area can serve as a touch extension area. If the electronic device is configured with a plurality of touch screens, one or more of the touch screens may also serve as one or more touch extension areas.
If the electronic device has a plurality of touch extension areas, at least one gesture operation can be defined for each of them. The gesture operations corresponding to different touch extension areas can be the same, partially the same, or different. If different touch extension areas correspond to the same gesture operation, that gesture operation can be associated with different preset tasks in each area.
In some embodiments, referring to fig. 1A and 1B, an electronic device may be configured with a curved screen.
In one illustrative example, as shown in figs. 1A and 1B, the two side surface areas of the curved screen may serve as touch extension area A1 and touch extension area A2, respectively. The screen area between touch extension area A1 and touch extension area A2 may serve as the main display area.
In one illustrative example, any curved surface area on either side of the curved screen may be used as a touch extension area. The screen area outside the touch extension area or the screen area between the two side curved surface areas can be used as the main display area.
In some embodiments, referring to fig. 2A and 2B, the electronic device may be configured with a foldable touch screen 200. The foldable touch screen 200 may include a region 201, a region 202, and a region 203.
In an illustrative example, referring to fig. 2A, in the unfolded configuration of foldable touch screen 200, region 203 may be used as a touch expansion region and regions 201 and 202 may be used as main display regions. Region 203 may also be referred to as a bendable region.
In one example of this example, a swipe gesture operation acting on region 203 may trigger the electronic device to perform a processing task for the display interface of region 201 or region 202.
In one example of this example, a swipe gesture operation acting on region 203 may trigger the electronic device to perform a processing task for the display interfaces of regions 201 and 202.
In an example of this example, an elongated strip region (elongated region) may be further divided on a side of the region 201 (or the region 202) away from the region 203, and the elongated strip region may serve as another touch extension region. A swiping gesture operation acting on the elongated strip region may trigger the electronic device to perform a processing task for the display interface of region 201 (or region 202).
In one example, when the long stroke gesture operation applied to the elongated strip region and the long stroke gesture operation applied to the region 203 belong to the same gesture operation (for example, both are single-finger long stroke operations), the tasks respectively triggered by the two may be the same type of task, for example, the single-finger long stroke operation applied to the elongated strip region may trigger the electronic device to perform a screen capture task for the display interface of the region 201 (or the region 202); a single finger swipe operation applied to region 203 may trigger the electronic device to perform a screen capture task for the display interface of region 202 (or region 201). The tasks respectively triggered by the two can be different types of tasks, for example, a single-finger long stroke operation acting on the elongated area can trigger the electronic device to execute a screen capture task of a display interface aiming at the area 201 (or the area 202); a single finger swipe operation applied to region 203 may trigger the electronic device to perform a screen recording task for the display interface of region 202 (or region 201).
In one example, when the long-stroke gesture operation applied to the elongated strip region and the long-stroke gesture operation applied to the region 203 are different gesture operations (for example, one is a single-finger long-stroke operation and the other is a two-finger long-stroke operation), the tasks respectively triggered by the two may be the same type of task; for example, a single-finger long-stroke operation applied to the elongated strip region may trigger the electronic device to perform a screen capture task for the display interface of the region 201 (or the region 202), while a two-finger long-stroke operation applied to region 203 may trigger the electronic device to perform a screen capture task for the display interface of region 202 (or region 201). The tasks respectively triggered by the two can also be different types of tasks; for example, a single-finger long-stroke operation acting on the elongated area can trigger the electronic device to execute a screen capture task for the display interface of the area 201 (or the area 202), while a two-finger long-stroke operation applied to region 203 may trigger the electronic device to perform a screen recording task for the display interface of region 202 (or region 201).
In an illustrative example, referring to fig. 2A, in the unfolded state of the foldable touch screen 200, the area 201 (or the area 202) may be used as a touch expansion area, and the area 202 (or the area 201) may be used as a main display area.
In an illustrative example, referring to fig. 2B, in the folded configuration of foldable touch screen 200, region 203 may be used as a touch extension region, and region 201 and/or region 202 may be used as a main display region. Region 203 may also be referred to as a bendable region.
In some embodiments, referring to fig. 3A and 3B, the electronic device may be configured with a normal touch screen. The normal touch screen may be a conventional flat touch screen.
In one illustrative example, as shown in fig. 3A, elongated strip-shaped areas (elongated areas) may be divided on the two sides of the normal touch screen to serve as touch extension area A1 and touch extension area A2, respectively. The screen area between touch extension area A1 and touch extension area A2 serves as the main display area.
In one illustrative example, as shown in fig. 3B, an elongated bar-shaped area (elongated area) may be divided on the right side (or left side) of the normal touch screen to serve as a touch extension area, and the other screen area may serve as a main display area.
In some embodiments, referring to fig. 4, the electronic device may be configured with two touch screens, where one touch screen may serve as a touch extension area and the other touch screen serves as a main display area.
In the embodiment of the present application, the detection of the touch operation applied to the touch extension area may be implemented by a touch sensor (which may also be referred to as a "touch device"). The touch sensor may be integrated with a display screen to form a touch screen, also referred to as a "touchscreen." The touch sensor may be used to detect a touch operation acting on the touch screen. Specifically, it may detect whether a touch has occurred (e.g., a finger press), the position (coordinate information) of a touch point, the area of the touch point, and the shape of the touch point; it may determine whether the touch has moved and track that movement across the entire touch screen; and it may detect whether the touch has terminated (e.g., a finger lift). Tracking the touch movement may include determining the touch trajectory generated by the movement, the position (coordinate information) of each touch point in the trajectory, the direction of the movement, and the like. The touch sensor may detect a single touch (e.g., one finger touching the touch screen) or multiple simultaneous touches (e.g., multiple fingers touching the touch screen simultaneously).
The touch sensor may transmit the detected touch operation to the touch screen driver module of the kernel layer. The touch screen driver module may process the touch operation into a touch event, and the touch event may include touch information of the touch operation. The touch information may specifically include one or more items of touch information such as the position of a touch point (including the position of the touch point at which the touch starts and the position of the touch point at which the touch ends), the touch trajectory, the moving direction of the touch, the start timestamp of the touch, the end timestamp of the touch, the area of a touch point, and the shape of a touch point. The touch screen driver module can transmit the touch event to the gesture recognition module so that the gesture recognition module recognizes the gesture operation corresponding to the touch event. The preset task associated with the gesture operation can then be executed. In the embodiments of the present application, the moving direction of the touch may also be referred to as the sliding direction.
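The touch information a touch event might carry can be sketched as a small data structure. This is an illustrative reconstruction; the class and field names are assumptions, not the driver's actual structures:

```python
from dataclasses import dataclass, field
from typing import List, Tuple

@dataclass
class TouchEvent:
    trajectory: List[Tuple[float, float]]  # touch-point positions, in order
    start_timestamp: float                 # time the touch began
    end_timestamp: float                   # time the touch ended
    areas: List[float] = field(default_factory=list)  # per-point contact areas

    @property
    def start_point(self) -> Tuple[float, float]:
        # Position of the touch point at which the touch starts.
        return self.trajectory[0]

    @property
    def end_point(self) -> Tuple[float, float]:
        # Position of the touch point at which the touch ends.
        return self.trajectory[-1]

    @property
    def duration(self) -> float:
        # Touch duration derived from the start and end timestamps.
        return self.end_timestamp - self.start_timestamp
```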
In various embodiments, software modules that recognize gesture operations and perform tasks associated with the gesture operations are described by way of example.
In some embodiments, referring to fig. 5A, taking the Android system as an example, the original touch screen driver in the kernel layer may be extended so that the touch screen driver can recognize the gesture operation corresponding to a touch event according to the touch information in the touch event. Specifically, when the touch sensor detects a touch operation applied to the touch extension area, the touch operation may be transmitted to the kernel layer. The touch screen driver in the kernel layer may process the touch operation applied to the touch extension area into a touch event, which may include touch information of the touch operation. The touch screen driver can recognize the gesture operation corresponding to the touch event according to the touch information and transmit the recognized gesture operation to the gesture execution module in the application framework layer. The gesture execution module may execute the task associated with the gesture operation in response to the gesture operation. In one example, the gesture execution module may be an extension of the input subsystem in the window manager. In one example, the gesture execution module may be an independently configured function module that runs continuously, that is, a resident service module.
In some embodiments, referring to fig. 5B, taking the Android system as an example, the original touch screen driver in the kernel layer may be extended so that the touch screen driver can recognize the gesture operation corresponding to a touch event according to the touch information in the touch event. Specifically, when the touch sensor detects a touch operation applied to the touch extension area, the touch operation may be transmitted to the kernel layer. The touch screen driver in the kernel layer may process the touch operation applied to the touch extension area into a touch event, which may include touch information of the touch operation. The touch screen driver can recognize the gesture operation corresponding to the touch event according to the touch information and transmit the recognized gesture operation to the gesture execution application in the application layer. The application may execute the task associated with the gesture operation in response to the gesture operation. The application may be an application running in the background.
In some embodiments, referring to fig. 5C, taking the Android system as an example, when the touch sensor detects a touch operation applied to the touch extension area, the touch operation may be transmitted to the kernel layer. The touch screen driver in the kernel layer may process the touch operation applied to the touch extension area into a touch event, which may include touch information of the touch operation. The touch screen driver may pass the touch event to the gesture recognition and execution module in the application framework layer. The gesture recognition and execution module can recognize the gesture operation corresponding to the touch event according to the touch information in the touch event and can execute the task associated with the gesture operation in response to the gesture operation. In one example, the gesture recognition and execution module may be an extension of the input subsystem in the window manager. In one example, the gesture recognition and execution module may be an independently configured function module that runs continuously, that is, a resident service module.
In some embodiments, referring to fig. 5D, taking the Android system as an example, when the touch sensor detects a touch operation applied to the touch extension area, the touch operation may be transmitted to the kernel layer. The touch screen driver in the kernel layer may process the touch operation applied to the touch extension area into a touch event, which may include touch information of the touch operation. The touch screen driver may pass the touch event to the gesture recognition and execution application in the application layer. The gesture recognition and execution application can recognize the gesture operation corresponding to the touch event according to the touch information in the touch event and can execute the task associated with the gesture operation in response to the gesture operation. The application may be an application running in the background.
In some embodiments, referring to fig. 6A, an identification scheme for identifying single-finger long-stroke operations is presented. As shown in fig. 6A, a touch extension area with a width d0 may be defined on one side of the touch screen. In one example, the touch screen is a curved screen, and the touch extension area is a curved area on one side of the curved screen. The touch event may include a touch trajectory and the positions of the start touch point P1 (or P2) and the termination touch point P2 (or P1) of the touch trajectory. The length L1 of the touch trajectory corresponding to the touch event may be determined according to the position of the start touch point P1 (or P2) and the position of the termination touch point P2 (or P1). It may be determined whether the length L1 of the touch trajectory is greater than or equal to a length threshold L0. If the length L1 is greater than or equal to the length threshold L0 and the touch event includes a touch trajectory, it may be determined that the gesture operation corresponding to the touch event is a single-finger long-stroke operation.
In one illustrative example, the length threshold L0 may be positively correlated with the length of the touch screen, i.e., the longer the touch screen, the longer the length threshold L0. The length direction and the width direction of the touch screen may be as shown in fig. 6A. In one example, the length threshold L0 may be two-thirds of the length of the touch screen. In one example, the length threshold L0 may be one-half of the length of the touch screen. Other values are possible and are not listed here.
In one illustrative example, the length threshold L0 may be a preset value, such as 8cm, 10cm, 12cm, etc., which are not listed here.
In one illustrative example, the touch event may further include the timestamp of the start touch point P1 (or P2) of the touch trajectory and the timestamp of the termination touch point P2 (or P1). The touch duration T1 corresponding to the touch trajectory may be determined according to the timestamp of the start touch point P1 (or P2) and the timestamp of the termination touch point P2 (or P1). It may be determined whether the touch duration T1 is less than a time threshold T0. If the touch duration T1 is less than the time threshold T0, the length L1 is greater than or equal to the length threshold L0, and the touch event includes a touch trajectory, it may be determined that the gesture operation corresponding to the touch event is a single-finger long-stroke operation. The time threshold T0 may be a preset duration, for example, 1 second or 1.5 seconds; other values are not listed here.
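The length-and-duration test from figs. 6A (L1 >= L0 combined with T1 < T0) can be sketched as follows. This is an illustrative reconstruction; the function and parameter names are assumptions:

```python
import math

def is_single_finger_long_stroke(start, end, start_ts, end_ts,
                                 length_threshold, time_threshold):
    """Return True when the stroke is long enough (L1 >= L0) and completed
    quickly enough (T1 < T0); thresholds are supplied by the caller."""
    l1 = math.dist(start, end)   # sliding distance L1, start to termination point
    t1 = end_ts - start_ts       # touch duration T1, from the two timestamps
    return l1 >= length_threshold and t1 < time_threshold
```

For example, a 120-unit stroke completed in 0.6 s passes with L0 = 100 and T0 = 1.0 s, while a shorter or slower stroke does not.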
In some embodiments, referring to fig. 6B, another identification scheme for identifying single-finger long-stroke operations is described. The touch event may include a touch trajectory and the positions of the start touch point P1 (or P2) and the termination touch point P2 (or P1) of the touch trajectory. According to the position of the initial touch point P1 (or P2), the first side of the touch extension area, i.e., the side closer to the initial touch point, can be determined, along with the distance d1 between the first side and the initial touch point P1 (or P2). The first side is a side in the length direction of the touch extension area; the length direction of the touch extension area can be as shown in fig. 6B. According to the position of the termination touch point P2 (or P1), the second side of the touch extension area, i.e., the side closer to the termination touch point, can be determined, along with the distance d2 between the second side and the termination touch point P2 (or P1). The second side is the side opposite to the first side. It may then be determined whether the distance d1 and the distance d2 are both less than a distance threshold d3. If the distance d1 and the distance d2 are both smaller than the distance threshold d3, it may be determined that the gesture operation corresponding to the touch event is a single-finger long-stroke gesture.
In one illustrative example, the distance threshold d3 may be positively correlated with the length of the touch screen, i.e., the longer the touch screen, the longer the distance threshold d3. The length direction and the width direction of the touch screen may be as shown in fig. 6A. In one example, the distance threshold d3 may be one-sixth of the length of the touch screen. In one example, the distance threshold d3 may be one-quarter of the length of the touch screen. Other values are possible and are not listed here.
In one illustrative example, the distance threshold d3 may be a preset value, such as 1cm, 1.5cm, etc., which are not listed here.
In one illustrative example, it may also be determined whether the touch duration corresponding to the touch event is less than the time threshold T0. If the touch duration corresponding to the touch event is less than the time threshold T0, and the distance d1 and the distance d2 are both less than the distance threshold d3, it may be determined that the gesture operation corresponding to the touch event is a single-finger long-stroke gesture. For the time threshold T0, reference may be made to the above description of the embodiment shown in fig. 6A, which is not repeated here.
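The edge-distance criterion of fig. 6B can be sketched in one dimension, measuring coordinates along the length direction of the touch extension area. This is a hypothetical illustration; the function name and the representation of the two opposite sides as y-coordinates are assumptions:

```python
def satisfies_edge_distance_condition(start_y, end_y, side_a_y, side_b_y, d3):
    """d1 = distance from the initial touch point to the nearer side (first side);
    d2 = distance from the termination touch point to the opposite (second) side.
    Both must be below the distance threshold d3."""
    if abs(start_y - side_a_y) <= abs(start_y - side_b_y):
        d1 = abs(start_y - side_a_y)   # first side is side A
        d2 = abs(end_y - side_b_y)     # second side is the opposite side B
    else:
        d1 = abs(start_y - side_b_y)   # first side is side B
        d2 = abs(end_y - side_a_y)     # second side is the opposite side A
    return d1 < d3 and d2 < d3
```

Because the first side is chosen relative to the initial touch point, the test works for strokes in either direction along the area.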
In some embodiments, FIG. 6C illustrates a touch extension area located on a side of the touch screen. The touch expansion area is a long and narrow area, and specifically may be a curved area of a curved display area, or may be a bending area of a foldable screen in a folded state. Specific reference may be made to the above description of the embodiments shown in fig. 1A, 1B and 2B.
As shown in fig. 6C, the user can perform a touch operation in the touch extension area in a lateral touch manner. The lateral touch manner refers to a manner in which an elongated touch object w1 (e.g., a user's finger) performs a touch operation in the touch extension area while lying perpendicular, or approximately perpendicular, to the length direction of the touch extension area. Continuing with fig. 6C, suppose the touch object w1 starts to touch the touch extension area at time t1, i.e., the touch point with timestamp t1 is the initial touch point, and that at time tN the touch object w1 ends its touch on the touch extension area, i.e., the touch point at time tN is the termination touch point. It can then be determined whether the gesture operation to which the touch operation of the touch object w1 in the touch extension area belongs is a single-finger long-stroke operation, based on information such as the position of the initial touch point, the position of the termination touch point, and the timestamps of both. For details, reference may be made to the above description of the embodiments shown in fig. 6A and fig. 6B, which is not repeated here.
In some embodiments, referring to fig. 7, when determining the gesture operation corresponding to a touch event, it may be further determined whether every touch point in the touch trajectory corresponding to the touch event falls within the touch extension area. As shown in fig. 7, if one or more touch points (for example, the touch point PX) in the touch trajectory fall outside the touch extension area, the touch event is invalid, and the determination process for the gesture operation corresponding to the touch event may be terminated.
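The validity check above can be sketched as follows, modelling the touch extension area as an axis-aligned rectangle. This is an illustrative assumption; a curved or bent area would use a different containment test:

```python
def trajectory_in_extension_area(trajectory, x_min, x_max, y_min, y_max):
    """A touch event is treated as valid for long-stroke recognition only when
    every touch point of its trajectory lies inside the touch extension area."""
    return all(x_min <= x <= x_max and y_min <= y <= y_max
               for x, y in trajectory)
```

A single out-of-area point (like PX in fig. 7) invalidates the whole event.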
In some embodiments, referring to fig. 8A and 8B, a swipe direction of the gesture operation may be defined, wherein gesture operations of different swipe directions are different gesture operations. As shown in fig. 8A and 8B, the length direction of the touch extension area may be defined as the up-down direction. In the embodiment of the application, when the user normally uses the electronic device, the upper side of the interface displayed in the main display area is the upward direction, and the lower side of the interface is the downward direction. Specifically, as shown in fig. 8A and 8B.
In an illustrative example, as shown in fig. 8A, the direction from the initial touch point P1 of the touch trajectory of the touch event to the end touch point P2 of the touch trajectory is upward (i.e., the sliding direction is upward); in the case that the touch event satisfies the other determination conditions of the single-finger long-stroke gesture, it may be determined that the gesture operation corresponding to the touch event is the single-finger long-stroke gesture whose sliding direction is upward. As shown in fig. 8B, the direction from the initial touch point P1 of the touch trajectory of the touch event to the end touch point P2 of the touch trajectory is downward (i.e., the sliding direction is downward); in the case that the touch event satisfies the other determination conditions of the single-finger long-stroke gesture, it may be determined that the gesture operation corresponding to the touch event is the single-finger long-stroke gesture whose sliding direction is downward. For the other determination conditions of the single-finger long-stroke gesture, reference may be made to the above description of the embodiment shown in fig. 6A or fig. 6B, and details are not repeated here. The single-finger long-stroke gesture with the upward sliding direction and the single-finger long-stroke gesture with the downward sliding direction are different gesture operations and may be associated with different tasks; for example, the single-finger long-stroke gesture with the upward sliding direction may be associated with a screen recording task, and the single-finger long-stroke gesture with the downward sliding direction may be associated with a screen capture task.
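The direction-dependent dispatch above can be sketched as follows. The coordinate convention (y grows toward the bottom of the interface, as in typical touch-screen coordinates) and all names are assumptions for illustration only.

```python
def sliding_direction(start_point, end_point):
    """Classify a stroke along the lengthwise (vertical) axis.

    Assumes screen coordinates where y increases toward the bottom of the
    interface, so a smaller end y means the finger moved upward.
    """
    return "up" if end_point[1] < start_point[1] else "down"

# Per the example in the text: upward stroke -> screen recording,
# downward stroke -> screen capture.
TASK_BY_DIRECTION = {"up": "screen_record", "down": "screen_capture"}

p1, p2 = (1.0, 18.0), (1.0, 3.0)          # P1 is below P2 on the screen
print(TASK_BY_DIRECTION[sliding_direction(p1, p2)])   # screen_record
```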
In one illustrative example, a back-and-forth gesture operation may be defined. The user can perform a continuous back-and-forth gesture operation in the touch extension area, from top (or bottom) to bottom (or top) and then from bottom (or top) to top (or bottom). The touch event corresponding to the back-and-forth gesture operation includes a plurality of touch trajectories, and any two adjacent trajectories among the plurality of touch trajectories are continuous.
One definable back-and-forth gesture operation is a gesture operation whose touch order is from top to bottom and then from bottom to top. Another definable gesture operation is one whose touch order is from bottom to top and then from top to bottom. Yet another definable gesture operation is one whose touch order is from bottom to top, then from top to bottom, then from bottom to top; and so on, which are not listed here.
Different back-and-forth gesture operations may be associated with different tasks. For example, the gesture operation whose touch order is from top to bottom and then from bottom to top may be associated with a screen capture task, and the gesture operation whose touch order is from bottom to top and then from top to bottom may be associated with a screen recording task.
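One way to distinguish the back-and-forth variants above is by the ordered sequence of per-trajectory directions. A sketch under the same assumed coordinate convention (y grows downward), with hypothetical names:

```python
def roundtrip_signature(trajectories):
    """trajectories: list of (start_y, end_y) pairs, one per touch
    trajectory, in the order they occurred; returns the direction of
    each trajectory as a tuple such as ('down', 'up')."""
    return tuple("down" if end_y > start_y else "up"
                 for start_y, end_y in trajectories)

# Example association from the text: top-to-bottom then bottom-to-top
# -> screen capture; bottom-to-top then top-to-bottom -> screen recording.
TASK_BY_SIGNATURE = {
    ("down", "up"): "screen_capture",
    ("up", "down"): "screen_record",
}

tracks = [(2.0, 18.0), (18.0, 2.0)]       # down the area, then back up
print(TASK_BY_SIGNATURE[roundtrip_signature(tracks)])   # screen_capture
```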
In one example, in the case that at least one of the plurality of touch trajectories of the touch event satisfies the determination conditions of the single-finger long-stroke gesture described in the embodiment shown in fig. 6A or fig. 6B, the gesture operation corresponding to the touch event may be determined to be a defined back-and-forth gesture operation.
In another example, in the case that each of the plurality of touch trajectories of the touch event satisfies the determination conditions of the single-finger long-stroke gesture described in the embodiment shown in fig. 6A or fig. 6B, the gesture operation corresponding to the touch event may be determined to be a defined back-and-forth gesture operation.
In some embodiments, when there are multiple touch extension areas, different gesture operations may be defined according to the touch extension areas, that is, the touch extension area corresponding to the touch event may be used to determine the gesture operation corresponding to the touch event in combination with other determination conditions. The other determination conditions may refer to the descriptions of fig. 6A, fig. 6B, fig. 8A, and fig. 8B, which are not described herein again.
In one illustrative example, referring to fig. 9, the touch screen may include a touch extension area A1 and a touch extension area A2. The initial touch point of the touch trajectory of a touch operation acting on the touch extension area A1 may be set as the touch point P3, and the end touch point may be set as the touch point P4. When it is determined, according to the position of the touch point P3 and the position of the touch point P4, that the touch operation acts on the touch extension area A1 and that the touch operation is a single-finger long-stroke operation, it may be determined that the touch operation is a single-finger long-stroke operation C3.
The initial touch point of the touch trajectory of a touch operation acting on the touch extension area A2 may be set as the touch point P5, and the end touch point may be set as the touch point P6. When it is determined, according to the position of the touch point P5 and the position of the touch point P6, that the touch operation acts on the touch extension area A2 and that the touch operation is a single-finger long-stroke operation, it may be determined that the touch operation is a single-finger long-stroke operation C4.
The single-finger long-stroke operation C3 and the single-finger long-stroke operation C4 are different gesture operations and may be associated with different tasks. For example, the single-finger long-stroke operation C3 may be associated with a screen capture task, and the single-finger long-stroke operation C4 may be associated with a screen recording task.
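The per-area dispatch of fig. 9 can be sketched by first locating which extension area a stroke's endpoints fall in, then looking up the associated task. The rectangular area model and all names here are illustrative assumptions.

```python
def area_of(point, areas):
    """Return the name of the extension area containing point, else None.

    areas: dict mapping an area name to an assumed rectangular bounding
    box (x_min, y_min, x_max, y_max)."""
    for name, (x0, y0, x1, y1) in areas.items():
        if x0 <= point[0] <= x1 and y0 <= point[1] <= y1:
            return name
    return None

# Two elongated strips, e.g. on the left and right screen edges.
AREAS = {"A1": (0, 0, 2, 20), "A2": (38, 0, 40, 20)}
# C3 (long stroke in A1) -> screen capture; C4 (in A2) -> screen recording.
TASK_BY_AREA = {"A1": "screen_capture", "A2": "screen_record"}

p3, p4 = (1, 2), (1, 18)                  # start/end of a stroke in A1
if area_of(p3, AREAS) == area_of(p4, AREAS) == "A1":
    print(TASK_BY_AREA["A1"])             # screen_capture
```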
In some embodiments, the touch event may include timestamps, positions, and the like of a plurality of touch points, wherein each of a plurality of the timestamps corresponds to at least two touch points. Specifically, any one of the plurality of timestamps corresponds to at least two touch points. That is, during the touch operation corresponding to the touch event, at least two touch objects touch the touch extension area simultaneously.
Referring to fig. 10A, in one illustrative example, a touch event may include a plurality of touch trajectories that correspond to the same touch period, i.e., the plurality of touch trajectories occur in parallel or substantially in parallel. The touch event may include the position and timestamp of each touch point in each touch trajectory. Adjacent touch trajectories may be determined according to the positions of the touch points, and, in combination with the timestamps of the touch points, the distances between touch points with the same timestamp in adjacent touch trajectories may be determined. If the distances between the touch points in adjacent touch trajectories corresponding to the plurality of timestamps during the touch operation are all smaller than a distance threshold d4, it may be determined that the gesture operation corresponding to the touch event is a multi-finger long-stroke operation. The distance threshold d4 may be a preset value, and may specifically be determined according to the distance between the contact points of two adjacent fingers on the touch screen when the user puts the fingers together to touch the touch screen. In one example, the distance threshold d4 may be 1 cm, 0.8 cm, and so on, which are not listed here.
In one example, if the distances between the touch points in adjacent touch trajectories corresponding to the plurality of timestamps during the touch operation are all smaller than the distance threshold d4, and the relative direction between the touch points in adjacent touch trajectories corresponding to any timestamp A during the touch operation is similar to the relative direction between the touch points in adjacent touch trajectories corresponding to any other timestamp B, it may be determined that the gesture operation corresponding to the touch event is a multi-finger long-stroke operation.
Specifically, as shown in fig. 10A, it may be assumed that the touch event includes two touch trajectories, and the timestamp and position of each touch point in the two touch trajectories. The touch point Pa1 and the touch point Pa2 may be set as two touch points on one touch trajectory, and the touch point Pb1 and the touch point Pb2 may be set as two touch points on the other touch trajectory. The timestamps of the touch point Pa1 and the touch point Pb1 are the same, and the timestamps of the touch point Pa2 and the touch point Pb2 are the same. It may be determined whether the distance between the touch point Pa1 and the touch point Pb1 is smaller than the distance threshold d4, whether the distance between the touch point Pa2 and the touch point Pb2 is smaller than the distance threshold d4, and whether the direction of the touch point Pa1 relative to the touch point Pb1 is similar to the direction of the touch point Pa2 relative to the touch point Pb2. If the distance between the touch point Pa1 and the touch point Pb1 is smaller than the distance threshold d4, the distance between the touch point Pa2 and the touch point Pb2 is smaller than the distance threshold d4, and the direction of the touch point Pa1 relative to the touch point Pb1 is similar to the direction of the touch point Pa2 relative to the touch point Pb2 (for example, as shown in fig. 10A, the touch point Pa1 is located at the upper right of the touch point Pb1, and the touch point Pa2 is also located at the upper right of the touch point Pb2), it may be determined that the gesture operation corresponding to the touch event is a double-finger long-stroke operation.
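The pairwise checks on Pa1/Pb1 and Pa2/Pb2 generalize to the following sketch, which compares same-timestamp points of two trajectories. The value of d4, the coarse quadrant test for "similar direction", and all names are assumptions made for illustration.

```python
import math

def is_double_finger_stroke(track_a, track_b, d4=1.0):
    """track_a, track_b: lists of (t, x, y) samples of two touch
    trajectories. Returns True if same-timestamp points stay closer than
    d4 and keep a similar relative direction throughout."""
    b_by_time = {t: (x, y) for t, x, y in track_b}
    directions = set()
    for t, xa, ya in track_a:
        if t not in b_by_time:
            return False                    # trajectories are not parallel
        xb, yb = b_by_time[t]
        if math.hypot(xa - xb, ya - yb) >= d4:
            return False                    # fingers drifted too far apart
        directions.add((xa > xb, ya > yb))  # coarse quadrant of a vs b
    return len(directions) == 1             # relative direction stays similar

# Pa1/Pb1 share timestamp 1, Pa2/Pb2 share timestamp 2; Pa stays at the
# upper right of Pb, so this qualifies as a double-finger long stroke.
track_a = [(1, 1.4, 2.0), (2, 1.4, 9.0)]
track_b = [(1, 1.0, 2.5), (2, 1.0, 9.5)]
print(is_double_finger_stroke(track_a, track_b))   # True
```

As the text notes, a full classifier would also apply the fig. 6A/6B long-stroke conditions to each trajectory.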
It should be noted that the two touch trajectories shown in fig. 10A also need to satisfy the determination conditions described in the embodiment shown in fig. 6A or fig. 6B.
In one illustrative example, fig. 10B shows a touch extension area located on a side of a touch screen. The touch extension area is an elongated area, and may specifically be a curved area of a curved display area, or a bending area of a foldable screen in a folded configuration. Specific reference may be made to the above description of the embodiments shown in fig. 1A, 1B and 2B. As shown in fig. 10B, the user may use at least two touch objects (including the touch object w1 and the touch object w2) to perform a touch operation on the touch extension area simultaneously in a lateral touch manner. For the touch objects and the lateral touch manner, reference may be made to the above description of the embodiment shown in fig. 6C. Taking any two time instants (time tm and time tk) in the touch period as an example, the multi-finger long-stroke gesture operation may be determined according to the distance and relative position between adjacent touch points among the at least two touch points corresponding to time tm, and the distance and relative position between adjacent touch points among the at least two touch points corresponding to time tk. Reference may be made specifically to the above description of the embodiment shown in fig. 10A. It should be noted that, when determining the multi-finger long-stroke gesture operation, the determination conditions described in the embodiment shown in fig. 6A or fig. 6B also need to be satisfied. Reference may be made specifically to the embodiment illustrated above in fig. 6C.
In some embodiments, different multi-finger long-stroke gesture operations may be defined by combining the number of fingers touching the screen, the sliding direction of the touch operation, the back-and-forth sliding order, the touch extension area acted upon, and the like; specific reference may be made to the above description of the embodiments shown in fig. 8A, 8B and fig. 9, which is not repeated herein.
In some embodiments, referring to fig. 11A and 11B, the touch event may include a touch trajectory, the position of each touch point in the touch trajectory, and the area or shape of each touch point. Whether the gesture operation corresponding to the touch event is a palm long-stroke operation may be determined according to the area or shape of each touch point.
In an illustrative example, if the touch trajectory satisfies the determination conditions described in the embodiment shown in fig. 6A or fig. 6B, and the area of each touch point is greater than or equal to an area threshold S, it may be determined that the gesture operation corresponding to the touch event is a palm long-stroke operation. The area threshold S may be a preset value. Generally, the area threshold S is larger than the contact area when the user's finger touches the touch screen. In one example, the area threshold S may be determined according to the difference between the contact area when a user's finger touches the touch screen and the contact area when a palm touches the touch screen.
In an illustrative example, if the touch trajectory satisfies the determination conditions described in the embodiment shown in fig. 6A or fig. 6B, and the shape of each touch point is a palm shape, it may be determined that the gesture operation corresponding to the touch event is a palm long-stroke operation. Taking the Android system as an example, the touch screen driver of the kernel layer can determine the shape of the contact area (i.e., the touch point) between the touch object and the touch screen, so as to identify the shape of the touch point.
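The area-or-shape test above can be sketched as follows. The threshold value, the field names, and the "palm" shape label are illustrative assumptions; the text only requires each point's area to reach a preset S, or its shape to match a palm shape.

```python
def is_palm_stroke(touch_points, area_threshold_s=4.0):
    """touch_points: list of dicts, each with an 'area' value (and
    optionally a 'shape' label reported by the touch screen driver).
    Returns True if every touch point looks palm-sized or palm-shaped."""
    return all(p.get("shape") == "palm" or p["area"] >= area_threshold_s
               for p in touch_points)

# Every contact area exceeds the assumed threshold S, so the trajectory
# (assuming it also meets the fig. 6A/6B conditions) is a palm stroke.
points = [{"area": 6.5}, {"area": 5.8}, {"area": 7.1}]
print(is_palm_stroke(points))    # True
```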
In one illustrative example, as shown in fig. 11A, the touch expansion area may be a wide area of the touch screen that can accommodate a palm stroke operation.
In an illustrative example, as shown in fig. 11B, the touch extension area may be an elongated area on the side of the touch screen, such as a curved area of a curved display area, or may be a bending area of the foldable screen in the folded configuration. Specific reference may be made to the above description of the embodiments shown in fig. 1A, 1B and 2B.
Referring to fig. 12, an embodiment of the present application provides a human-computer interaction method. The execution subject of the method may be an electronic device configured with at least one touch screen comprising at least one main display area and at least one touch extension area, the at least one touch extension area being an elongated area located at an edge or in the middle of the at least one touch screen. Specifically, reference may be made to the above description of the embodiments shown in fig. 1A, fig. 1B, fig. 2A, fig. 2B, fig. 3A, fig. 3B, and fig. 4, which is not repeated herein.
As shown in fig. 12, the method includes the following steps.
Step 1201, receiving a touch operation acting on the at least one touch extension area.
Step 1203, determining that the gesture operation to which the touch operation belongs is a long stroke gesture.
Step 1203 may be implemented by referring to the embodiments shown in fig. 6A, fig. 6B, fig. 6C, fig. 7, fig. 8A, fig. 8B, fig. 9, fig. 10A, fig. 10B, fig. 11A, and fig. 11B, which are not described again here.
Step 1205, when the gesture operation to which the touch operation belongs is a long-stroke gesture operation, executing a processing task for the display interface of the at least one main display area, wherein the sliding distance of the long-stroke gesture operation satisfies a preset condition.
In some embodiments, the stroking gesture operation may be any one of or a combination of any number of single-finger stroking operations, double-finger stroking operations, three-finger stroking operations, four-finger stroking operations, and palm stroking operations.
For the touch control extension area, an operating system developer or a user of the electronic device may define at least one kind of long stroke gesture operation, for example, one or more kinds of single-finger long stroke operation, double-finger long stroke operation, three-finger long stroke operation, four-finger long stroke operation, palm long stroke operation, and the like may be defined.
An operating system developer or user of the electronic device may define the swipe gesture operation and may also associate a preset task for the swipe gesture operation. The preset task may be a processing task for an interface of the main display area, or may be another task.
Next, the preset task will be exemplified.
In some embodiments, when the stroking gesture operation is a first gesture operation, the processing task comprises: a screenshot image of a display interface of the at least one primary display area is captured. Namely, the screen capture task can be associated with the first gesture operation, so that when the user performs the first gesture operation in the touch expansion area, the screen capture of the display interface of the main display area can be realized.
In one illustrative example, reference may be made to fig. 13A and 13B. The first gesture operation may be set to a single-finger long-stroke operation. As shown in fig. 13A, the user may perform a single-finger long-stroke operation in the touch extension area. The electronic device generates a touch event in response to the single-finger long-stroke operation, and may determine from the touch event that the user has performed the single-finger long-stroke operation. As shown in fig. 13B, the electronic device may execute the screen capture task in response to the touch event, i.e., capture a screenshot image of the display interface of the main display area.
The first gesture operation may also be a two-finger long stroke operation, a three-finger long stroke operation, a four-finger long stroke operation, a palm long stroke operation, and the like, which are not listed here.
In some embodiments, when the stroking gesture operation is a second gesture operation, the processing task comprises: capturing a screen shot image of a first page, wherein the first page is a page to which a display interface of the at least one main display area belongs, and the display interface is a partial page of the first page.
The second gesture operation may be any one of a single-finger long stroke operation, a double-finger long stroke operation, a three-finger long stroke operation, a four-finger long stroke operation, and a palm long stroke operation.
An operating system developer or user of the electronic device may associate a long screen capture task with the second gesture operation. It is readily understood that, in general, the main display area displays only a portion of a page at any one time; that is, the display interface of the main display area is a part of the page to which it belongs. The long screen capture task captures the entire page to which the display interface of the main display area belongs, i.e., captures a screenshot image of that page. Taking a web page as an example, the main display area displays a part of the web page; when it is determined that the touch operation performed in the touch extension area is the second gesture operation, an image of the entire web page may be captured.
When the developer or the user of the operating system of the electronic device defines the first gesture operation and the second gesture operation at the same time, the first gesture operation and the second gesture operation are different gesture operations. For specific implementation of the first gesture operation and the second gesture operation, reference may be made to the above description of each embodiment shown in fig. 6A, fig. 6B, fig. 6C, fig. 8A, fig. 8B, fig. 9, fig. 10A, fig. 10B, fig. 11A, and fig. 11B, and details are not repeated herein.
In some embodiments, when the stroking gesture operation is a third gesture operation, the processing task comprises: and starting to capture a screen recording video with preset time duration, wherein the screen recording video is the screen recording video of the display interface of the at least one main display area.
The third gesture operation may be any one of a single-finger long stroke operation, a double-finger long stroke operation, a three-finger long stroke operation, a four-finger long stroke operation, and a palm long stroke operation.
Users often have a need to record the interface displayed by the electronic device; for example, in a video-watching scenario, the user may want to record one of the videos, and in an electronic game scenario, the user may want to record the game screen. An operating system developer of the electronic device may associate a screen recording task with the third gesture operation. When it is determined that the touch operation performed in the touch extension area is the third gesture operation, the display interface of the main display area may be recorded, that is, a screen recording video of the display interface of the main display area is captured.
When a developer or a user of an operating system of the electronic device defines the first gesture operation, the second gesture operation, and the third gesture operation at the same time, the first gesture operation, the second gesture operation, and the third gesture operation are different from each other. For specific implementation of each gesture operation, reference may be made to the above description of each embodiment shown in fig. 6A, fig. 6B, fig. 6C, fig. 8A, fig. 8B, fig. 9, fig. 10A, fig. 10B, fig. 11A, and fig. 11B, and details thereof are not repeated here.
In some embodiments, when the swipe gesture operation is a fourth gesture operation, the processing task comprises: displaying an operation interface of a first application in the main display area; the first application is an application associated with the fourth gesture operation.
The user may preset an association relationship between a certain application and the fourth gesture operation; for example, the user may select an application and the fourth gesture operation in a settings menu provided by the operating system, and set the association relationship between the two. The electronic device may respond to the user's setting by establishing and recording the association relationship between the identifier of the application (e.g., the package name of its installation package) and the fourth gesture operation. The application may be a shopping application, a music application, etc., which are not listed here.
When the user performs the fourth gesture operation in the touch expansion area, the application associated with the fourth gesture operation can be quickly started, and the user operation experience is improved.
When a developer or a user of an operating system of the electronic device defines the first gesture operation, the second gesture operation, the third gesture operation, and the fourth gesture operation at the same time, the first gesture operation, the second gesture operation, the third gesture operation, and the fourth gesture operation are different from each other. For specific implementation of each gesture operation, reference may be made to the above description of each embodiment shown in fig. 6A, fig. 6B, fig. 6C, fig. 8A, fig. 8B, fig. 9, fig. 10A, fig. 10B, fig. 11A, and fig. 11B, and details thereof are not repeated here.
In some embodiments, the swipe gesture operation has a swipe direction; the processing task comprises the following steps: and switching the display interface of the main display area according to the sliding direction.
For reading-type applications, such as e-books, users frequently need to turn pages, either forwards or backwards. An association relationship between the sliding direction of the long-stroke gesture operation and the page turning direction may be defined. For example, when the sliding direction of the long-stroke gesture operation is upward, the page may be turned forward; when the sliding direction is downward, the page may be turned backward. In this embodiment, the upward direction may be toward the upper side of the interface displayed in the main display area when the user uses the electronic device normally, and the downward direction is the opposite direction. Specifically, reference may be made to the above description of fig. 8A and 8B, which is not repeated herein.
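The direction-to-page-turn association can be sketched as follows. The page indexing and clamping behaviour are illustrative assumptions; only the up-means-forward, down-means-backward mapping comes from the text.

```python
# Assumed mapping per the example above: upward stroke turns the page
# forward (next page), downward stroke turns it backward (previous page).
PAGE_TURN_BY_DIRECTION = {"up": "forward", "down": "backward"}

def next_page(current, direction, last_page):
    """Return the page index after a long-stroke gesture in the given
    sliding direction, clamped to the document's page range."""
    turn = PAGE_TURN_BY_DIRECTION[direction]
    if turn == "forward":
        return min(current + 1, last_page)
    return max(current - 1, 0)

print(next_page(3, "up", 10))     # 4: upward stroke turns the page forward
print(next_page(0, "down", 10))   # 0: already at the first page
```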
In some embodiments, when the long-stroke gesture operation is a fifth gesture operation, a first task preset by the user is performed. The first task may be a task that, under conventional operation, requires multiple steps to complete.
In one example, the first task may be the task of dialing a certain telephone number. For example, the telephone number may be that of a frequent contact. As another example, the telephone number may be a police number (e.g., 110) or an emergency number (e.g., 120); and so on, which are not listed here.
In one example, the first task may be a task of changing a system setting. For example, for an electronic device with dual Subscriber Identity Module (SIM) cards installed, the first task may be the task of switching the default SIM card; and so on, which are not listed here.
The user may preset an association relationship between the first task and the fifth gesture operation, for example, may select the first task and the fifth gesture operation from a setting function menu provided by the operating system, and set an association relationship between the first task and the fifth gesture operation. The electronic device may respond to the setting of the user, and may establish and record an association relationship between the first task and the fifth gesture operation. The user performs the fifth gesture operation in the touch expansion area, so that the electronic device can execute the first task, user operation is saved, and user operation experience is improved.
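Taken together, the first through fifth gesture operations form a small gesture-to-task dispatch table. A sketch with hypothetical task names (the patent does not prescribe these identifiers):

```python
def run_task(gesture, tasks):
    """Look up the task associated with a recognized long-stroke gesture
    operation; returns None for unrecognized gestures."""
    return tasks.get(gesture)

GESTURE_TASKS = {
    "first":  "capture_screenshot",        # screen capture
    "second": "capture_long_screenshot",   # long screen capture of the page
    "third":  "start_screen_recording",    # screen recording
    "fourth": "launch_associated_app",     # quick-start an application
    "fifth":  "run_user_preset_task",      # e.g. dial a preset number
}

print(run_task("third", GESTURE_TASKS))   # start_screen_recording
```

Since the defined gesture operations must be mutually distinct, each key maps to exactly one task.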
When a developer or a user of an operating system of the electronic device defines at least one of the first gesture operation, the second gesture operation, the third gesture operation, the fourth gesture operation, and the fifth gesture operation at the same time, the defined gesture operations are different from each other. Specifically, reference may be made to the above description of the embodiments shown in fig. 6A, fig. 6B, fig. 6C, fig. 8A, fig. 8B, fig. 9, fig. 10A, fig. 10B, fig. 11A, and fig. 11B, which are not repeated herein.
In some examples of these embodiments, the at least one touch extension area and the primary display area are located on different planes.
In one example of these embodiments, the at least one touch screen comprises a curved screen, and the at least one touch extension area comprises curved areas on either side or both sides of the curved touch screen.
Reference may be made in particular to the above description of the embodiments shown in fig. 1A and 1B.
In one example of these embodiments, the at least one touch screen comprises a foldable screen, and the at least one touch extension area comprises a bending area of the foldable screen in a folded configuration.
Reference may be made in particular to the above description of the embodiments shown in fig. 2B.
In other examples of these embodiments, the at least one touch screen comprises a foldable screen comprising a first region, a second region, and a bendable region between the first region and the second region in an unfolded configuration; the at least one touch extension area comprises the bendable area.
Reference may be made in particular to the above description of the embodiments shown in fig. 2A.
In some embodiments, the determining that the gesture operation to which the touch operation belongs is a long stroke gesture includes: determining the sliding distance of the touch operation according to the position of the initial touch point and the position of the final touch point of the touch operation; determining the operation duration of the touch operation according to the timestamp of the initial touch point and the timestamp of the termination touch point; and when the sliding distance of the touch operation meets the preset condition and the operation duration is less than the preset duration, determining that the touch operation is a long-stroke gesture operation.
Reference may be made in particular to the above description of the embodiments shown in fig. 6A.
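The distance-plus-duration determination above can be sketched as follows. The threshold values are illustrative placeholders (the patent only requires "a preset condition" and "a preset duration"), as are the names.

```python
import math

def is_long_stroke(start, end, t_start, t_end,
                   min_distance=15.0, max_duration=0.6):
    """start/end: (x, y) of the initial and end touch points;
    t_start/t_end: their timestamps in seconds. A stroke qualifies when
    it covers at least min_distance within less than max_duration."""
    distance = math.hypot(end[0] - start[0], end[1] - start[1])
    duration = t_end - t_start
    return distance >= min_distance and duration < max_duration

# A fast 16-unit stroke qualifies; the same stroke made slowly does not.
print(is_long_stroke((1.0, 2.0), (1.0, 18.0), 0.00, 0.35))   # True
print(is_long_stroke((1.0, 2.0), (1.0, 18.0), 0.00, 1.20))   # False
```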
In some embodiments, the determining that the gesture operation to which the touch operation belongs is a long stroke gesture includes: determining the sliding distance and the sliding direction of the touch operation according to the position of the initial touch point and the position of the final touch point of the touch operation; and when the sliding distance of the touch operation meets the preset condition and the sliding direction meets the preset direction, determining that the touch operation is a long-stroke gesture operation.
In particular, reference may be made to the above description of the embodiments shown in fig. 6A, 6B, 8A, and 8B.
In some embodiments, the touch operation is a touch operation of at least two fingers, and the determining that the gesture operation to which the touch operation belongs is a long stroke gesture includes: determining the sliding distance of the touch operation according to the position of the initial touch point and the position of the final touch point of the touch operation; determining the distance between a touch point corresponding to a first finger and a touch point corresponding to a second finger at the same moment in the touch operation process; and when the distances between the touch points corresponding to the first fingers and the touch points corresponding to the second fingers at the same moment are smaller than a first threshold value in the touch operation process, determining that the touch operation is the long stroke gesture operation of the at least two fingers. In particular, reference may be made to the above description of the embodiments shown in fig. 6A, 6B, 10A, and 10B.
In some embodiments, the determining the gesture operation to which the touch operation belongs includes: determining the sliding distance of the touch operation according to the position of the initial touch point and the position of the final touch point of the touch operation; when the sliding distance meets the preset condition and first characteristics of a plurality of touch points of the touch operation meet the preset condition, determining that the touch operation is a long stroke gesture operation, wherein the first characteristics are areas or shapes; when the first characteristic is the area, the first characteristic of the touch points meets a preset condition that the area of the touch points is larger than or equal to a preset area; or, when the first feature is a shape, the first feature of the touch points meets a preset condition that the shapes of the touch points conform to a preset shape.
Reference may be made in particular to the above description of the embodiments shown in fig. 6A, 6B, 11A and 11B.
In some embodiments, the sliding distance of the long stroke gesture operation satisfying the preset condition includes: the sliding distance is greater than or equal to a second threshold.
Reference may be made specifically to the above description of the embodiment shown in fig. 6A.
In some embodiments, the sliding distance of the long stroke gesture operation satisfying the preset condition includes: the distance between the initial touch point and the first side edge of the long-stroke gesture operation is smaller than a third threshold value, and the distance between the termination touch point and the second side edge of the long-stroke gesture operation is smaller than a fourth threshold value; the first side is a side close to the initial touch point in the two opposite sides, the second side is a side close to the final touch point in the two opposite sides, and the two opposite sides are opposite sides in the length direction of the at least one touch extended area.
Reference may be made specifically to the above description of the embodiment shown in fig. 6B.
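The edge-proximity condition of the fig. 6B embodiment can be sketched similarly. Again a hedged sketch: the one-dimensional geometry, the strip length, and the third/fourth threshold values are illustrative assumptions, not values from this application.

```python
# Model the touch expansion area as an elongated strip whose length runs
# along one axis, from coordinate 0 (one lengthwise side) to STRIP_LENGTH
# (the opposite lengthwise side). All values are illustrative.
STRIP_LENGTH = 600.0
THIRD_THRESHOLD = 50.0   # max distance: initial touch point -> first side
FOURTH_THRESHOLD = 50.0  # max distance: final touch point -> second side

def spans_expansion_area(start, end, length=STRIP_LENGTH,
                         t3=THIRD_THRESHOLD, t4=FOURTH_THRESHOLD):
    """True when the stroke starts near the lengthwise side closer to its
    initial touch point (the "first side") and ends near the opposite
    side (the "second side")."""
    # The first side is whichever of the two opposite sides is closer to
    # the initial touch point; the second side is the other one.
    if abs(start - 0.0) <= abs(start - length):
        first_edge, second_edge = 0.0, length
    else:
        first_edge, second_edge = length, 0.0
    return abs(start - first_edge) < t3 and abs(end - second_edge) < t4
```

Under this condition a stroke qualifies only when it traverses essentially the full length of the strip, in either direction.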
With the human-computer interaction method above, the user can perform a long-stroke gesture operation in a specific area and the electronic device executes a processing task for the interface displayed in the display area. This makes it convenient for the user to operate on the displayed interface, helps avoid accidental touches of user interface objects on that interface, and improves the user's operating experience.
The embodiment of the application provides an electronic device. Referring to fig. 14, the electronic device includes a processor 1410, a memory 1420, and at least one touch screen 1430. The at least one touch screen 1430 includes at least one main display area and at least one touch expansion area. The memory 1420 is used to store computer-executable instructions; when the electronic device runs, the processor 1410 executes the computer-executable instructions stored in the memory 1420 to cause the electronic device to perform the methods shown in the method embodiments described above. The processor 1410 is configured to receive a touch operation acting on the at least one touch expansion area; the processor 1410 is further configured to determine whether the gesture operation to which the touch operation belongs is a long-stroke gesture operation; and the processor 1410 is further configured to, when the touch operation belongs to a long-stroke gesture operation whose sliding distance satisfies a preset condition, execute a processing task for the display interface of the at least one main display area.
In some embodiments, the electronic device further comprises a communication bus 1440, and the processor 1410 is coupled to the memory 1420 and the at least one touch screen 1430 via the communication bus 1440, so that the processor 1410 can control the at least one touch screen 1430 in accordance with the computer-executable instructions stored in the memory 1420.
For the specific implementation of each component/device of the electronic device in the embodiments of the present application, reference may be made to the above method embodiments; details are not repeated here.
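The device-side flow of fig. 14 — receive a touch on the expansion area, classify it, and run a processing task against the main display area only for a qualifying long-stroke gesture — can be sketched as below. The classifier, the task names, and the distance threshold are placeholder assumptions, not part of this application.

```python
# Hypothetical distance threshold for a qualifying long-stroke gesture.
MIN_SLIDE_DISTANCE = 300.0

def classify(touch):
    """Stand-in classifier: labels a touch (a dict with a 'distance' key)
    as a long-stroke gesture or not."""
    if touch.get("distance", 0.0) >= MIN_SLIDE_DISTANCE:
        return "long_stroke"
    return "other"

# Illustrative mapping from gesture label to processing task; a real
# device would map the first/second/third/fourth gesture operations to a
# screenshot, a scrolling screenshot, a screen recording, and launching
# an associated application, respectively.
TASKS = {
    "long_stroke": lambda: "capture screenshot of main display area",
}

def handle_touch(touch):
    """Dispatch: run the processing task only for a long-stroke gesture."""
    task = TASKS.get(classify(touch))
    return task() if task is not None else None
```

Touches that do not qualify as long-stroke gestures fall through without triggering any processing task, which is how accidental touches in the expansion area are ignored.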
In this way, the user can perform a long-stroke gesture operation in a specific area so that the electronic device executes a processing task for the interface displayed in the display area. This makes it convenient for the user to operate on the displayed interface, helps avoid accidental touches of user interface objects on that interface, and improves the user's operating experience.
It is understood that the processor in the embodiments of the present application may be a central processing unit (CPU), another general-purpose processor, a digital signal processor (DSP), an application-specific integrated circuit (ASIC), a field-programmable gate array (FPGA) or other programmable logic device, a transistor logic device, a hardware component, or any combination thereof. The general-purpose processor may be a microprocessor or any conventional processor.
The method steps in the embodiments of the present application may be implemented by hardware, or by software instructions executed by a processor. The software instructions may consist of corresponding software modules, which may be stored in random access memory (RAM), flash memory, read-only memory (ROM), programmable ROM (PROM), erasable programmable ROM (EPROM), electrically erasable programmable ROM (EEPROM), registers, a hard disk, a removable hard disk, a CD-ROM, or any other form of storage medium known in the art. An exemplary storage medium is coupled to the processor such that the processor can read information from, and write information to, the storage medium. Of course, the storage medium may also be integral to the processor. The processor and the storage medium may reside in an ASIC.
In the above embodiments, the implementation may be realized wholly or partially by software, hardware, firmware, or any combination thereof. When implemented in software, the implementation may take the form, in whole or in part, of a computer program product. The computer program product includes one or more computer instructions. When the computer instructions are loaded and executed on a computer, the processes or functions described in the embodiments of this application are produced in whole or in part. The computer may be a general-purpose computer, a special-purpose computer, a computer network, or another programmable device. The computer instructions may be stored in, or transmitted via, a computer-readable storage medium. The computer instructions may be transmitted from one website, computer, server, or data center to another website, computer, server, or data center by wired means (e.g., coaxial cable, optical fiber, digital subscriber line (DSL)) or wireless means (e.g., infrared, radio, microwave). The computer-readable storage medium can be any available medium that can be accessed by a computer, or a data storage device such as a server or data center that integrates one or more available media. The available medium may be a magnetic medium (e.g., floppy disk, hard disk, magnetic tape), an optical medium (e.g., DVD), or a semiconductor medium (e.g., solid-state drive (SSD)), among others.
It is to be understood that the various numerals referred to in the embodiments of the present application are merely for convenience of description and are not intended to limit the scope of the embodiments of the present application.

Claims (35)

1. A human-computer interaction method is applied to an electronic device configured with at least one touch screen, wherein the at least one touch screen comprises at least one main display area and at least one touch expansion area, and the at least one touch expansion area is an elongated area located at the edge or the middle of the at least one touch screen; the method comprises the following steps:
receiving a touch operation acting on the at least one touch expansion area;
determining that the gesture operation to which the touch operation belongs is a long stroke gesture;
and when the touch operation belongs to a long-stroke gesture operation, executing a processing task aiming at the display interface of the at least one main display area, wherein the sliding distance of the long-stroke gesture operation meets a preset condition.
2. The method of claim 1, wherein when the long-stroke gesture operation is a first gesture operation, the processing task comprises: capturing a screenshot image of a display interface of the at least one main display area.
3. The method of claim 1 or 2, wherein when the long-stroke gesture operation is a second gesture operation, the processing task comprises: capturing a screenshot image of a first page, wherein the first page is a page to which the display interface of the at least one main display area belongs, the display interface is a partial page of the first page, and the second gesture operation is different from the first gesture operation.
4. The method according to any one of claims 1-3, wherein when the long-stroke gesture operation is a third gesture operation, the processing task comprises: starting to capture a screen recording video of a preset duration, wherein the screen recording video is a recording of the display interface of the at least one main display area, and the third gesture operation, the second gesture operation and the first gesture operation are different from one another.
5. The method according to any one of claims 1-4, wherein when the long-stroke gesture operation is a fourth gesture operation, the processing task comprises: displaying a running interface of a first application in the at least one main display area; the first application is an application associated with the fourth gesture operation, and the fourth gesture operation, the third gesture operation, the second gesture operation and the first gesture operation are different from one another.
6. The method of claim 1, wherein the long-stroke gesture operation has a sliding direction, and the processing task comprises: switching the display interface of the at least one main display area according to the sliding direction of the long-stroke gesture operation.
7. The method of any one of claims 1-6, wherein the at least one touch expansion area and the main display area are located in different planes.
8. The method of claim 7, wherein the at least one touch screen comprises a curved screen, and wherein the at least one touch expansion area comprises a curved area on one side or both sides of the curved screen.
9. The method of claim 7, wherein the at least one touch screen comprises a foldable screen, and the at least one touch expansion area comprises a bending area of the foldable screen in a folded state.
10. The method of any of claims 1-6, wherein the at least one touch screen comprises a foldable screen, the foldable screen in an unfolded configuration comprising a first region, a second region, and a bendable region between the first region and the second region;
when the foldable screen is in the unfolded state, the at least one touch expansion area comprises the bendable area.
11. The method of any of claims 1-10, wherein the swipe gesture operation comprises at least one of:
single-finger long stroke operation, double-finger long stroke operation, three-finger long stroke operation, four-finger long stroke operation and palm long stroke operation.
12. The method according to any one of claims 1-11, wherein the determining that the gesture operation to which the touch operation belongs is a long stroke gesture comprises:
determining the sliding distance of the touch operation according to the position of the initial touch point and the position of the final touch point of the touch operation;
determining the operation duration of the touch operation according to the timestamp of the initial touch point and the timestamp of the termination touch point;
and when the sliding distance of the touch operation meets the preset condition and the operation duration is less than the preset duration, determining that the touch operation is a long-stroke gesture operation.
13. The method according to any one of claims 1-11, wherein the determining that the gesture operation to which the touch operation belongs is a long stroke gesture comprises:
determining the sliding distance and the sliding direction of the touch operation according to the position of the initial touch point and the position of the final touch point of the touch operation;
and when the sliding distance of the touch operation meets the preset condition and the sliding direction of the touch operation meets the preset direction, determining that the touch operation is a long stroke gesture operation.
14. The method according to any one of claims 1-11, wherein the touch operation is a touch operation of at least two fingers, and the determining that the gesture operation to which the touch operation belongs is a long stroke gesture comprises:
determining the sliding distance of the touch operation according to the position of the initial touch point and the position of the final touch point of the touch operation;
determining the distance between a touch point corresponding to a first finger and a touch point corresponding to a second finger at the same moment in the touch operation process;
and when the distances between the touch points corresponding to the first fingers and the touch points corresponding to the second fingers at the same moment are smaller than a first threshold value in the touch operation process, determining that the touch operation is the long stroke gesture operation of the at least two fingers.
15. The method according to any one of claims 1-14, wherein the determining that the gesture operation to which the touch operation belongs is a long stroke gesture comprises:
determining the sliding distance of the touch operation according to the position of the initial touch point and the position of the final touch point of the touch operation;
when the sliding distance meets the preset condition and a first characteristic of a plurality of touch points of the touch operation meets a preset condition, determining that the touch operation is a long-stroke gesture operation, wherein the first characteristic is an area or a shape; wherein,
when the first characteristic is the area, the preset condition met by the first characteristic of the touch points is that the area of the touch points is greater than or equal to a preset area; or,
when the first characteristic is the shape, the preset condition met by the first characteristic of the touch points is that the shapes of the touch points conform to a preset shape.
16. The method according to any one of claims 1-15, wherein the sliding distance of the long stroke gesture operation meeting a preset condition comprises: the sliding distance is greater than or equal to a second threshold.
17. The method according to any one of claims 1-15, wherein the sliding distance of the long-stroke gesture operation meeting a preset condition comprises: the distance between the initial touch point of the long-stroke gesture operation and a first side edge is smaller than a third threshold, and the distance between the final touch point and a second side edge is smaller than a fourth threshold; wherein,
the first side is the one of two opposite sides that is closer to the initial touch point, the second side is the one of the two opposite sides that is closer to the final touch point, and the two opposite sides are the opposite sides in the length direction of the at least one touch expansion area.
18. An electronic device, comprising:
the touch screen comprises at least one main display area and at least one touch expansion area, wherein the at least one touch expansion area is an elongated area located at the edge or the middle of the at least one touch screen;
a memory for storing computer-executable instructions;
a processor for executing the computer executable instructions to cause the electronic device to perform:
receiving a touch operation acting on the at least one touch expansion area;
determining that the gesture operation to which the touch operation belongs is a long stroke gesture;
and when the touch operation belongs to a long-stroke gesture operation, executing a processing task aiming at the display interface of the at least one main display area, wherein the sliding distance of the long-stroke gesture operation meets a preset condition.
19. The electronic device of claim 18, wherein when the long-stroke gesture operation is a first gesture operation, execution of the computer-executable instructions by the processor causes the electronic device to perform: capturing a screenshot image of a display interface of the at least one main display area.
20. The electronic device of claim 18 or 19, wherein when the long-stroke gesture operation is a second gesture operation, execution of the computer-executable instructions by the processor causes the electronic device to perform: capturing a screenshot image of a first page, wherein the first page is a page to which the display interface of the at least one main display area belongs, the display interface is a partial page of the first page, and the second gesture operation is different from the first gesture operation.
21. The electronic device of any of claims 18-20, wherein when the long-stroke gesture operation is a third gesture operation, execution of the computer-executable instructions by the processor causes the electronic device to perform: starting to capture a screen recording video of a preset duration, wherein the screen recording video is a recording of the display interface of the at least one main display area, and the third gesture operation, the second gesture operation and the first gesture operation are different from one another.
22. The electronic device of any of claims 18-21, wherein when the long-stroke gesture operation is a fourth gesture operation, execution of the computer-executable instructions by the processor causes the electronic device to perform: displaying a running interface of a first application in the at least one main display area; the first application is an application associated with the fourth gesture operation, and the fourth gesture operation, the third gesture operation, the second gesture operation and the first gesture operation are different from one another.
23. The electronic device of claim 18, wherein the long-stroke gesture operation has a sliding direction, and the processor executing the computer-executable instructions causes the electronic device to perform: switching the display interface of the at least one main display area according to the sliding direction of the long-stroke gesture operation.
24. The electronic device of any of claims 18-23, wherein the at least one touch expansion area and the main display area are located in different planes.
25. The electronic device of claim 24, wherein the at least one touch screen comprises a curved screen, and wherein the at least one touch expansion area comprises a curved area on one side or both sides of the curved screen.
26. The electronic device of claim 24, wherein the at least one touch screen comprises a foldable screen, and the at least one touch expansion area comprises a bending area of the foldable screen in a folded state.
27. The electronic device of any of claims 18-23, wherein the at least one touch screen comprises a foldable screen, the foldable screen in an unfolded configuration comprising a first region, a second region, and a bendable region between the first region and the second region;
when the foldable screen is in the unfolded state, the at least one touch expansion area comprises the bendable area.
28. The electronic device of any of claims 18-27, wherein the long-stroke gesture operation comprises at least one of:
single-finger long stroke operation, double-finger long stroke operation, three-finger long stroke operation, four-finger long stroke operation and palm long stroke operation.
29. The electronic device of any of claims 18-28, wherein execution of the computer-executable instructions by the processor causes the electronic device to perform:
determining the sliding distance of the touch operation according to the position of the initial touch point and the position of the final touch point of the touch operation;
determining the operation duration of the touch operation according to the timestamp of the initial touch point and the timestamp of the termination touch point;
and when the sliding distance of the touch operation meets the preset condition and the operation duration is less than the preset duration, determining that the touch operation is a long-stroke gesture operation.
30. The electronic device of any of claims 18-28, wherein execution of the computer-executable instructions by the processor causes the electronic device to perform:
determining the sliding distance and the sliding direction of the touch operation according to the position of the initial touch point and the position of the final touch point of the touch operation;
and when the sliding distance of the touch operation meets the preset condition and the sliding direction meets the preset direction, determining that the touch operation is a long-stroke gesture operation.
31. The electronic device of any one of claims 18-28, wherein the touch operation is a touch operation of at least two fingers, and wherein execution of the computer-executable instructions by the processor causes the electronic device to perform:
determining the sliding distance of the touch operation according to the position of the initial touch point and the position of the final touch point of the touch operation;
determining the distance between a touch point corresponding to a first finger and a touch point corresponding to a second finger at the same moment in the touch operation process;
and when the distances between the touch points corresponding to the first fingers and the touch points corresponding to the second fingers at the same moment are smaller than a first threshold value in the touch operation process, determining that the touch operation is the long stroke gesture operation of the at least two fingers.
32. The electronic device of any of claims 18-31, wherein execution of the computer-executable instructions by the processor causes the electronic device to perform:
determining the sliding distance of the touch operation according to the position of the initial touch point and the position of the final touch point of the touch operation;
when the sliding distance meets the preset condition and a first characteristic of a plurality of touch points of the touch operation meets a preset condition, determining that the touch operation is a long-stroke gesture operation, wherein the first characteristic is an area or a shape; wherein,
when the first characteristic is the area, the preset condition met by the first characteristic of the touch points is that the area of the touch points is greater than or equal to a preset area; or,
when the first characteristic is the shape, the preset condition met by the first characteristic of the touch points is that the shapes of the touch points conform to a preset shape.
33. The electronic device according to any of the claims 18-32, wherein the sliding distance of the long-stroke gesture operation satisfying a preset condition comprises: the sliding distance is greater than or equal to a second threshold.
34. The electronic device according to any one of claims 18-32, wherein the sliding distance of the long-stroke gesture operation satisfying a preset condition comprises: the distance between the initial touch point of the long-stroke gesture operation and a first side edge is smaller than a third threshold, and the distance between the final touch point and a second side edge is smaller than a fourth threshold; wherein,
the first side is the one of two opposite sides that is closer to the initial touch point, the second side is the one of the two opposite sides that is closer to the final touch point, and the two opposite sides are the opposite sides in the length direction of the at least one touch expansion area.
35. A computer storage medium comprising computer instructions that, when executed on an electronic device, cause the electronic device to perform the method of any of claims 1-17.
CN201911016772.3A 2019-10-24 2019-10-24 Man-machine interaction method and electronic equipment Pending CN112711359A (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN201911016772.3A CN112711359A (en) 2019-10-24 2019-10-24 Man-machine interaction method and electronic equipment

Publications (1)

Publication Number Publication Date
CN112711359A true CN112711359A (en) 2021-04-27

Family

ID=75541465

Family Applications (1)

Application Number Title Priority Date Filing Date
CN201911016772.3A Pending CN112711359A (en) 2019-10-24 2019-10-24 Man-machine interaction method and electronic equipment

Country Status (1)

Country Link
CN (1) CN112711359A (en)

Citations (7)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN104932815A (en) * 2015-05-06 2015-09-23 努比亚技术有限公司 Mobile terminal and operation method thereof
CN106603829A (en) * 2016-11-30 2017-04-26 努比亚技术有限公司 Screen capture method and mobile terminal
CN106681610A (en) * 2016-12-23 2017-05-17 珠海市魅族科技有限公司 Screen snapping method and device
CN107273009A (en) * 2017-05-27 2017-10-20 上海斐讯数据通信技术有限公司 A kind of method and system of the quick screenshotss of mobile terminal
CN107526525A (en) * 2017-09-06 2017-12-29 广东欧珀移动通信有限公司 A kind of screenshotss method, apparatus, mobile terminal and computer-readable recording medium
CN108984091A (en) * 2018-06-27 2018-12-11 Oppo广东移动通信有限公司 Screenshotss method, apparatus, storage medium and electronic equipment
CN109857306A (en) * 2018-12-27 2019-06-07 维沃移动通信有限公司 Screenshotss method and terminal device


Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination
RJ01 Rejection of invention patent application after publication

Application publication date: 20210427