WO2020258950A1 - Control method for a mobile terminal, and mobile terminal - Google Patents

Control method for a mobile terminal, and mobile terminal

Info

Publication number
WO2020258950A1
WO2020258950A1 (PCT/CN2020/081241, CN 2020081241 W)
Authority
WO
WIPO (PCT)
Prior art keywords
screen
touch
area
touch area
mobile terminal
Prior art date
Application number
PCT/CN2020/081241
Other languages
English (en)
French (fr)
Inventor
汶晨光
Original Assignee
中兴通讯股份有限公司
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by 中兴通讯股份有限公司 (ZTE Corporation)
Priority to EP20832374.1A (published as EP3936993A4)
Priority to US17/604,751 (published as US20220263929A1)
Publication of WO2020258950A1

Classifications

    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06FELECTRIC DIGITAL DATA PROCESSING
    • G06F3/00Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
    • G06F3/01Input arrangements or combined input and output arrangements for interaction between user and computer
    • G06F3/048Interaction techniques based on graphical user interfaces [GUI]
    • G06F3/0487Interaction techniques based on graphical user interfaces [GUI] using specific features provided by the input device, e.g. functions controlled by the rotation of a mouse with dual sensing arrangements, or of the nature of the input device, e.g. tap gestures based on pressure sensed by a digitiser
    • G06F3/0488Interaction techniques based on graphical user interfaces [GUI] using specific features provided by the input device, e.g. functions controlled by the rotation of a mouse with dual sensing arrangements, or of the nature of the input device, e.g. tap gestures based on pressure sensed by a digitiser using a touch-screen or digitiser, e.g. input of commands through traced gestures
    • G06F3/04883Interaction techniques based on graphical user interfaces [GUI] using specific features provided by the input device, e.g. functions controlled by the rotation of a mouse with dual sensing arrangements, or of the nature of the input device, e.g. tap gestures based on pressure sensed by a digitiser using a touch-screen or digitiser, e.g. input of commands through traced gestures for inputting data by handwriting, e.g. gesture or text
    • HELECTRICITY
    • H04ELECTRIC COMMUNICATION TECHNIQUE
    • H04MTELEPHONIC COMMUNICATION
    • H04M1/00Substation equipment, e.g. for use by subscribers
    • H04M1/02Constructional features of telephone sets
    • H04M1/0202Portable telephone sets, e.g. cordless phones, mobile phones or bar type handsets
    • H04M1/0206Portable telephones comprising a plurality of mechanically joined movable body parts, e.g. hinged housings
    • H04M1/0208Portable telephones comprising a plurality of mechanically joined movable body parts, e.g. hinged housings characterized by the relative motions of the body parts
    • H04M1/0214Foldable telephones, i.e. with body parts pivoting to an open position around an axis parallel to the plane they define in closed position
    • H04M1/0216Foldable in one direction, i.e. using a one degree of freedom hinge
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06FELECTRIC DIGITAL DATA PROCESSING
    • G06F1/00Details not covered by groups G06F3/00 - G06F13/00 and G06F21/00
    • G06F1/16Constructional details or arrangements
    • G06F1/1613Constructional details or arrangements for portable computers
    • G06F1/1633Constructional details or arrangements of portable computers not specific to the type of enclosures covered by groups G06F1/1615 - G06F1/1626
    • G06F1/1637Details related to the display arrangement, including those related to the mounting of the display in the housing
    • G06F1/1641Details related to the display arrangement, including those related to the mounting of the display in the housing the display being formed by a plurality of foldable display components
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06FELECTRIC DIGITAL DATA PROCESSING
    • G06F1/00Details not covered by groups G06F3/00 - G06F13/00 and G06F21/00
    • G06F1/16Constructional details or arrangements
    • G06F1/1613Constructional details or arrangements for portable computers
    • G06F1/1633Constructional details or arrangements of portable computers not specific to the type of enclosures covered by groups G06F1/1615 - G06F1/1626
    • G06F1/1637Details related to the display arrangement, including those related to the mounting of the display in the housing
    • G06F1/1643Details related to the display arrangement, including those related to the mounting of the display in the housing the display being associated to a digitizer, e.g. laptops that can be used as penpads
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06FELECTRIC DIGITAL DATA PROCESSING
    • G06F1/00Details not covered by groups G06F3/00 - G06F13/00 and G06F21/00
    • G06F1/16Constructional details or arrangements
    • G06F1/1613Constructional details or arrangements for portable computers
    • G06F1/1633Constructional details or arrangements of portable computers not specific to the type of enclosures covered by groups G06F1/1615 - G06F1/1626
    • G06F1/1637Details related to the display arrangement, including those related to the mounting of the display in the housing
    • G06F1/1652Details related to the display arrangement, including those related to the mounting of the display in the housing the display being flexible, e.g. mimicking a sheet of paper, or rollable
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06FELECTRIC DIGITAL DATA PROCESSING
    • G06F1/00Details not covered by groups G06F3/00 - G06F13/00 and G06F21/00
    • G06F1/16Constructional details or arrangements
    • G06F1/1613Constructional details or arrangements for portable computers
    • G06F1/1633Constructional details or arrangements of portable computers not specific to the type of enclosures covered by groups G06F1/1615 - G06F1/1626
    • G06F1/1675Miscellaneous details related to the relative movement between the different enclosures or enclosure parts
    • G06F1/1677Miscellaneous details related to the relative movement between the different enclosures or enclosure parts for detecting open or closed state or particular intermediate positions assumed by movable parts of the enclosure, e.g. detection of display lid position with respect to main body in a laptop, detection of opening of the cover of battery compartment
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06FELECTRIC DIGITAL DATA PROCESSING
    • G06F3/00Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
    • G06F3/01Input arrangements or combined input and output arrangements for interaction between user and computer
    • G06F3/03Arrangements for converting the position or the displacement of a member into a coded form
    • G06F3/041Digitisers, e.g. for touch screens or touch pads, characterised by the transducing means
    • G06F3/0412Digitisers structurally integrated in a display
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06FELECTRIC DIGITAL DATA PROCESSING
    • G06F3/00Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
    • G06F3/01Input arrangements or combined input and output arrangements for interaction between user and computer
    • G06F3/03Arrangements for converting the position or the displacement of a member into a coded form
    • G06F3/041Digitisers, e.g. for touch screens or touch pads, characterised by the transducing means
    • G06F3/0416Control or interface arrangements specially adapted for digitisers
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06FELECTRIC DIGITAL DATA PROCESSING
    • G06F3/00Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
    • G06F3/01Input arrangements or combined input and output arrangements for interaction between user and computer
    • G06F3/03Arrangements for converting the position or the displacement of a member into a coded form
    • G06F3/041Digitisers, e.g. for touch screens or touch pads, characterised by the transducing means
    • G06F3/0416Control or interface arrangements specially adapted for digitisers
    • G06F3/0418Control or interface arrangements specially adapted for digitisers for error correction or compensation, e.g. based on parallax, calibration or alignment
    • G06F3/04186Touch location disambiguation
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06FELECTRIC DIGITAL DATA PROCESSING
    • G06F3/00Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
    • G06F3/01Input arrangements or combined input and output arrangements for interaction between user and computer
    • G06F3/048Interaction techniques based on graphical user interfaces [GUI]
    • G06F3/0487Interaction techniques based on graphical user interfaces [GUI] using specific features provided by the input device, e.g. functions controlled by the rotation of a mouse with dual sensing arrangements, or of the nature of the input device, e.g. tap gestures based on pressure sensed by a digitiser
    • G06F3/0488Interaction techniques based on graphical user interfaces [GUI] using specific features provided by the input device, e.g. functions controlled by the rotation of a mouse with dual sensing arrangements, or of the nature of the input device, e.g. tap gestures based on pressure sensed by a digitiser using a touch-screen or digitiser, e.g. input of commands through traced gestures
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06FELECTRIC DIGITAL DATA PROCESSING
    • G06F3/00Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
    • G06F3/01Input arrangements or combined input and output arrangements for interaction between user and computer
    • G06F3/048Interaction techniques based on graphical user interfaces [GUI]
    • G06F3/0487Interaction techniques based on graphical user interfaces [GUI] using specific features provided by the input device, e.g. functions controlled by the rotation of a mouse with dual sensing arrangements, or of the nature of the input device, e.g. tap gestures based on pressure sensed by a digitiser
    • G06F3/0488Interaction techniques based on graphical user interfaces [GUI] using specific features provided by the input device, e.g. functions controlled by the rotation of a mouse with dual sensing arrangements, or of the nature of the input device, e.g. tap gestures based on pressure sensed by a digitiser using a touch-screen or digitiser, e.g. input of commands through traced gestures
    • G06F3/04886Interaction techniques based on graphical user interfaces [GUI] using specific features provided by the input device, e.g. functions controlled by the rotation of a mouse with dual sensing arrangements, or of the nature of the input device, e.g. tap gestures based on pressure sensed by a digitiser using a touch-screen or digitiser, e.g. input of commands through traced gestures by partitioning the display area of the touch-screen or the surface of the digitising tablet into independently controllable areas, e.g. virtual keyboards or menus
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06FELECTRIC DIGITAL DATA PROCESSING
    • G06F2203/00Indexing scheme relating to G06F3/00 - G06F3/048
    • G06F2203/041Indexing scheme relating to G06F3/041 - G06F3/045
    • G06F2203/04102Flexible digitiser, i.e. constructional details for allowing the whole digitising part of a device to be flexed or rolled like a sheet of paper
    • HELECTRICITY
    • H04ELECTRIC COMMUNICATION TECHNIQUE
    • H04MTELEPHONIC COMMUNICATION
    • H04M2201/00Electronic components, circuits, software, systems or apparatus used in telephone systems
    • H04M2201/38Displays

Definitions

  • The embodiments of the present application relate to, but are not limited to, the field of communication technology, and in particular to a control method for a mobile terminal and a mobile terminal.
  • With the rapid development of science and technology, the appearance of mobile electronic devices has changed dramatically; among them, folding screens and curved screens have attracted attention for their unique characteristics and huge potential.
  • Compared with conventional screens, folding screens and curved screens can provide users with new interaction modes based on their bendable characteristics, and can therefore satisfy more user needs for electronic devices.
  • This application provides a method for controlling a mobile terminal and a mobile terminal.
  • In one aspect, the present application provides a control method for a mobile terminal, applied to a mobile terminal that includes at least one screen, where the screen has at least one bent portion, or the screen is adapted to be folded to form at least one bent portion.
  • The control method includes: detecting a holding gesture of the mobile terminal; and setting, on the bent portion, a first effective touch area and a first invalid touch area corresponding to the holding gesture.
  • In another aspect, the present application provides a mobile terminal, including at least one screen, a memory, and a processor, where the screen has at least one bent portion, or the screen is adapted to be folded to form at least one bent portion;
  • the memory is adapted to store a computer program which, when executed by the processor, implements the steps of the above control method.
  • In another aspect, the present application provides a computer-readable storage medium storing a computer program which, when executed by a processor, implements the steps of the above control method.
  • FIG. 1 is a flowchart of a method for controlling a mobile terminal according to an embodiment of the application
  • Figure 2 is a front view of a non-folding mobile terminal
  • FIG. 3 is a top view of the non-folding mobile terminal shown in FIG. 2;
  • Figure 4 is a schematic diagram of an unfolded folding screen of a mobile terminal
  • Figure 5 is a front view of the folding screen shown in Figure 4 in a fully folded state
  • Figure 6 is a side view of the folding screen shown in Figure 4 in a fully folded state
  • FIG. 7 is a schematic diagram of a holding gesture of the mobile terminal shown in FIG. 5 in a fully folded state
  • FIG. 8 is a schematic diagram of an exemplary implementation of a control method provided by an embodiment of this application.
  • FIG. 9 is a schematic diagram of the plane arrangement of the effective touch area and the invalid touch area in an exemplary embodiment of this application.
  • FIG. 10 is a three-dimensional perspective diagram corresponding to the schematic diagram of the plane setting shown in FIG. 9;
  • FIG. 11 is a schematic diagram of a recognition process of a holding gesture in an exemplary embodiment of this application.
  • FIG. 12 is a schematic diagram of a holding gesture in an exemplary embodiment of this application.
  • FIG. 13 is a schematic diagram of another holding gesture in an exemplary embodiment of this application.
  • FIG. 14 is a schematic diagram of another exemplary implementation of a control method provided by an embodiment of this application.
  • FIG. 15 is a schematic diagram of a control device of a mobile terminal provided by an embodiment of the application;
  • FIG. 16 is a schematic diagram of a mobile terminal provided by an embodiment of the application.
  • An embodiment of the present application provides a control method for a mobile terminal, which can be applied to a mobile terminal including at least one screen, where the screen may have at least one bent portion, or the screen may be adapted to be folded to form at least one bent portion.
  • For example, the mobile terminal may include one screen having one or more bent portions; or the mobile terminal may include one screen that can be folded to form one or more bent portions; or the mobile terminal may include two screens, one of which is adapted to be folded to form one or more bent portions.
  • The bent portion may be a region that is not a complete plane, or an edge of one or more planar portions, or a connection or transition region between two or more planar portions.
  • However, this application is not limited to this.
  • The mobile terminal in this embodiment may be a single-screen, dual-screen, or multi-screen terminal.
  • The mobile terminal in this embodiment may include, but is not limited to, a smartphone, a wearable device, a tablet computer, and the like. However, this application is not limited to this.
  • In one example, the mobile terminal of this embodiment may include a non-foldable screen, and the screen may include a flat portion and a bent portion that bends and extends downward from the width or length direction of the flat portion; the bent portion may cover a side wall of the mobile terminal.
  • Taking a smartphone as an example, the bent portion may cover a side wall in the length or width direction of the phone, and the bent portion covering the side wall may be curved. Since the bent portion is also part of the entire screen, it can be used for display and operation. In this example, the bent portion of the screen is fixed and does not change.
  • Fig. 2 is a front view of a non-folding mobile terminal
  • Fig. 3 is a top view of the non-folding mobile terminal shown in Fig. 2.
  • the mobile terminal may include a terminal back 30 and a screen.
  • the screen includes a flat portion 20 and two bending portions 11 extending from the length direction of the flat portion 20 to both sides.
  • this application is not limited to this. In other embodiments, the number of bending parts may be one or three or four.
  • the mobile terminal of this embodiment may include a folding screen or a flexible screen, and the folding screen or the flexible screen has a feature that it can be bent at one or more places.
  • Taking a folding screen as an example, after the folding screen is bent outward at a certain angle, it can be divided into two screen units facing opposite directions for display.
  • the curved surface formed at the folding position of the folding screen can be called a bending part.
  • One bending (or folding) of the folding screen or the flexible screen can correspond to a bending part. Since the bending part is also a part of the folding screen, it can be used for display and operation.
  • the bent portion of the screen is not fixed, but is formed after the screen is bent (or folded).
  • Fig. 1 is a flowchart of a method for controlling a mobile terminal according to an embodiment of the application.
  • As shown in FIG. 1, the control method provided in this embodiment, applied to the mobile terminal described above, includes: S101, detecting a holding gesture of the mobile terminal; and S102, setting, on the bent portion of the screen of the mobile terminal, a first effective touch area and a first invalid touch area corresponding to the holding gesture.
  • In an exemplary implementation, after S102, the control method of this embodiment may further include: when a touch operation acting on the first invalid touch area is received, not responding to the touch operation; and when a touch operation acting on the first effective touch area is received, responding to the touch operation and performing the processing corresponding to the touch operation.
  • In one example, the touch operations in the first effective touch area may include an up/down slide, a single tap, and a double tap.
  • For example, when the mobile terminal is displaying a page of text or mixed text and graphics, an up/down slide in the first effective touch area can scroll the page up or down; when the mobile terminal is in a picture viewing mode, an up/down slide in the first effective touch area can switch pictures.
  • For example, sliding up in the first effective touch area can switch to the previous picture, and sliding down can switch to the next picture; when the mobile terminal is in a picture browsing mode, a double tap in the first effective touch area can zoom the picture, for example, shrinking or enlarging the displayed picture to a certain ratio.
  • However, this application is not limited to this. In practical applications, the touch operations in the first effective touch area and the corresponding response strategies can be set according to actual needs.
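  • Purely as an illustration (not part of this application), the response strategy described above could be wired up as in the following minimal Kotlin sketch; the TouchOp and TerminalMode types and the handler functions are assumptions introduced here for clarity.

    // Hypothetical sketch of the response strategy for the first effective touch area.
    enum class TouchOp { SLIDE_UP, SLIDE_DOWN, SINGLE_TAP, DOUBLE_TAP }
    enum class TerminalMode { TEXT_PAGE, PICTURE_VIEW }

    class EffectiveAreaDispatcher(private val mode: TerminalMode) {

        /** Returns true when the event was consumed by the first effective touch area. */
        fun onEffectiveAreaTouch(op: TouchOp): Boolean = when (mode) {
            TerminalMode.TEXT_PAGE -> when (op) {
                TouchOp.SLIDE_UP -> { scrollPage(up = true); true }
                TouchOp.SLIDE_DOWN -> { scrollPage(up = false); true }
                else -> false
            }
            TerminalMode.PICTURE_VIEW -> when (op) {
                TouchOp.SLIDE_UP -> { showPreviousPicture(); true }
                TouchOp.SLIDE_DOWN -> { showNextPicture(); true }
                TouchOp.DOUBLE_TAP -> { zoomPicture(factor = 0.5f); true }
                else -> false
            }
        }

        /** Touch operations landing in the first invalid touch area are simply dropped. */
        fun onInvalidAreaTouch(op: TouchOp): Boolean = false

        // Placeholder actions; a real terminal would drive its display interface here.
        private fun scrollPage(up: Boolean) = println(if (up) "scroll up" else "scroll down")
        private fun showPreviousPicture() = println("previous picture")
        private fun showNextPicture() = println("next picture")
        private fun zoomPicture(factor: Float) = println("zoom by $factor")
    }

  • In such a sketch, the mapping from gesture to action is a per-mode lookup, matching the idea that the same effective touch area can drive page scrolling, picture switching, or zooming depending on the terminal's current mode.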
  • the control method of this embodiment may further include: detecting the folding angle of the screen of the mobile terminal; when the folding angle satisfies the angle setting condition, determining that the screen is folded to form a bent portion.
  • For example, the screen of the mobile terminal may include at least a first screen unit and a second screen unit; accordingly, detecting the folding angle of the screen of the mobile terminal may include detecting the included angle between the first screen unit and the second screen unit.
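  • For illustration only, a minimal Kotlin sketch of this check is shown below; the angle source (for example, a hinge sensor) and the threshold value are assumptions, not values specified by this application.

    // Hypothetical sketch: deciding from the fold angle whether a bent portion has formed.
    data class FoldState(val includedAngleDegrees: Float)

    const val FOLD_ANGLE_THRESHOLD_DEG = 150f  // assumed "angle setting condition"

    /** True when the included angle between the two screen units satisfies the setting condition. */
    fun hasBentPortion(state: FoldState): Boolean =
        state.includedAngleDegrees >= FOLD_ANGLE_THRESHOLD_DEG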
  • In an exemplary implementation, S101 may include:
  • detecting a first touch parameter between the hand and the bent portion of the screen, and recognizing the holding gesture of the mobile terminal according to the first touch parameter; or
  • detecting a second touch parameter between the hand and the screen, and recognizing the holding gesture of the mobile terminal according to the second touch parameter.
  • In an exemplary implementation, the first touch parameter may include the number of touch points on the bent portion and the area of each touch point.
  • Recognizing the holding gesture of the mobile terminal according to the first touch parameter may include: when the number of touch points on the bent portion is greater than zero and less than or equal to two, and the area of one of the touch points is greater than an area threshold, determining that the bent portion is held against the inner side of the palm; and when the number of touch points on the bent portion is greater than two and less than or equal to four, and the area of every touch point is less than or equal to the area threshold, determining that the bent portion is held on the fingertip side of the fingers.
  • control method provided in this embodiment may further include: establishing a hand shape model according to the second touch parameter; and estimating hand shape characteristics according to the hand shape model;
  • S102 may include: setting a first effective touch area and a first ineffective touch area corresponding to the holding gesture and matching the hand shape feature on the bending portion.
  • The second touch parameter may include the contact region, the contact area, the contact force, and so on.
  • this application is not limited to this.
  • In this exemplary implementation, the first effective touch area and the first invalid touch area are set based on both the holding gesture and the hand shape model, which allows personalized matching to different users' hand shapes so as to meet the needs and usage characteristics of different users.
  • the control method of this embodiment further includes: setting a second effective touch area and a second invalid touch area corresponding to the holding gesture in the screen area facing the palm of the hand.
  • In this exemplary implementation, after the screen is folded, the interface can be displayed on the screen area facing the user, while the screen area facing away from the user (that is, the screen area facing the palm of the hand holding the mobile terminal) can be provided with a second effective touch area that supports operating and controlling the displayed interface, thereby giving the user a new interaction mode, making the terminal more convenient to use, and improving the user experience.
  • the embodiment of the present application provides a control method for a mobile terminal. While preventing accidental touches, it can support the provision of a less intense and more refined operation method on the mobile terminal, thereby improving user experience.
  • a smart phone including a folding screen or a flexible screen (hereinafter referred to as a folding screen phone) as shown in FIG. 4 is taken as an example to illustrate the control method provided by the embodiment of the present application.
  • the folding screen of the folding screen mobile phone is in an unfolded state, and the folding screen faces upward.
  • In terms of how the folding screen folds, it can be divided into two types, inward folding and outward folding; in terms of the number of screen units after folding, it can be divided into dual screen and triple screen.
  • the folding screen can be folded along the rotating shaft (the dotted line shown in FIG. 4) to form a double screen.
  • Figure 5 shows the front view of the folding screen mobile phone in a fully folded state.
  • the folding screen mobile phone is folded outward, and after folding, a whole screen is divided into two screen units for display, that is, the two screen units display information content outward after folding.
  • the control method provided in the embodiments of the present application is also applicable to folding screens that form three or more screens, and also applicable to flexible screens that form a bending portion after bending or folding part of the screen.
  • the screen unit on the right side can be rotated 180 degrees clockwise along the axis of rotation (the dotted line in FIG. 4) to reach the fully folded state shown in FIGS. 5 and 6.
  • Figure 6 shows a side view of the folding screen mobile phone in a fully folded state.
  • the curved surface connected between the two parallel upper and lower screen units 10 and 12 is the bent portion 11.
  • Fig. 7 is a schematic diagram of a holding gesture of the folding screen mobile phone shown in Figs. 5 and 6 in a fully folded state.
  • the user can usually hold the mobile phone with one hand.
  • As shown in FIG. 7(a), the bent portion is held against the inner side of the palm; in this case, the palm and the thumb touch the bent portion.
  • As shown in FIG. 7(b), the bent portion is held on the fingertip side of the fingers; in this case, the four fingers other than the thumb touch the bent portion.
  • Users usually operate the phone screen with the thumb and index finger, and in the holding scenarios shown in FIG. 7 the thumb or index finger can easily touch the bent portion, which matches how the phone is normally used; in terms of operation mode and usage scenario, this supports defining new interaction modes on the bent portion.
  • FIG. 8 is a schematic diagram of an exemplary implementation of a method for controlling a mobile terminal according to an embodiment of the application.
  • This exemplary embodiment describes a method for controlling the folding screen mobile phone shown in FIGS. 4 to 7. As shown in Fig. 8, this exemplary embodiment includes the following processes:
  • the folding screen may include a first screen unit and a second screen unit, and the bending angle of the folding screen refers to the angle between the first screen unit and the second screen unit;
  • the first screen unit and the second screen unit can be relatively folded along the rotation axis, so that the angle between the first screen unit and the second screen unit can be changed.
  • S203 Determine the holding gesture of the folding screen mobile phone.
  • the gesture of holding a folding screen mobile phone can be determined by the number of touch points and the contact area of the hand and the bending part.
  • the holding gestures can be divided into the following three categories: the bending part is held on the inner side of the palm, the bending part is held on the fingertip side of the finger, and other gestures.
  • FIG. 11 is a schematic diagram of the holding gesture recognition process in this exemplary embodiment. As shown in FIG. 11, the recognition process includes:
  • S301: detect the number n (n is an integer) of touch points between the bent portion and the hand, and the area Si (Si > 0) of each touch point, where the areas of the n touch points may be denoted S1, S2, ..., Sn; the maximum area with which a finger normally contacts the phone screen may be defined as S0.
  • S302: perform judgment and recognition according to the number of touch points and the area of each touch point.
  • If 0 < n ≤ 2 and one touch point has an area Si > S0, that touch point with the larger contact area can be regarded as the touch point between the palm and the bent portion, and the current gesture can be determined as: the bent portion is held against the inner side of the palm (as shown in FIG. 12, where FIG. 12(a) shows the bent portion 11a held against the inner side of the palm with the thumb touching the bent portion 11a, FIG. 12(b) shows the bent portion 11a held against the inner side of the palm with the thumb not touching the bent portion 11a, and the touch points 111 are also shown).
  • If 2 < n ≤ 4 and the area of every touch point is less than or equal to S0, the current gesture can be determined as: the bent portion is held on the fingertip side of the fingers (as shown in FIG. 13, where FIG. 13(a) shows the bent portion 11b held on the fingertip side with the index finger touching the bent portion 11b, FIG. 13(b) shows the bent portion 11b held on the fingertip side with the index finger not touching the bent portion 11b, and the touch points 111 are also shown).
  • If n = 0, n > 4, or any case other than the above occurs, the current gesture is determined as: other gestures.
  • S303: output the determination result. After the determination result is obtained through the recognition process shown in FIG. 11, corresponding processing is performed: if the result is an other gesture, the processing ends; if the holding gesture shown in FIG. 12 is determined, S204 is executed; if the holding gesture shown in FIG. 13 is recognized, S205 is executed.
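  • The rule in S301/S302 can be summarized in a short sketch. The following Kotlin fragment is a hypothetical illustration only; the type names and the way S0 is obtained are assumptions, not part of this application.

    // Hypothetical sketch of the S301/S302 recognition rule.
    enum class HoldingGesture { PALM_SIDE, FINGERTIP_SIDE, OTHER }

    data class TouchPoint(val areaMm2: Float)

    /**
     * Classifies the holding gesture from the touch points detected on the bent portion.
     * @param s0 maximum area with which a single finger normally contacts the screen.
     */
    fun recognizeHoldingGesture(points: List<TouchPoint>, s0: Float): HoldingGesture {
        val n = points.size
        return when {
            // 0 < n <= 2 and one contact is larger than a finger: that contact is the palm.
            n in 1..2 && points.any { it.areaMm2 > s0 } -> HoldingGesture.PALM_SIDE
            // 2 < n <= 4 and every contact is finger-sized: held on the fingertip side.
            n in 3..4 && points.all { it.areaMm2 <= s0 } -> HoldingGesture.FINGERTIP_SIDE
            // n == 0, n > 4, or any other combination.
            else -> HoldingGesture.OTHER
        }
    }

  • In this reading, the result of the classification simply selects which of the predefined area layouts (S204 or S205 below) is applied, and "other gestures" leave the screen unchanged.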
  • S204: when the holding gesture of the folding-screen phone is determined to be that the bent portion is held against the inner side of the palm, the effective touch areas and invalid touch areas shown in FIG. 9(a) and FIG. 10(a) can be set; FIG. 9(a) shows a plan view of the effective touch areas, and FIG. 10(a) is the corresponding three-dimensional perspective view.
  • In this example, as shown in FIG. 10(a), the folding screen is divided after folding into the following three parts: the first screen unit 10a, the bent portion 11a, and the second screen unit 12a; when the bent portion 11a is held against the inner side of the palm, the first screen unit 10a faces the user, and the second screen unit 12a faces away from the user and toward the user's palm.
  • Referring to FIG. 9(a) and FIG. 10(a), the effective touch areas may include a first effective touch area 101 and a second effective touch area 102; the first effective touch area 101 is the middle region of the bent portion 11a, that is, the touch range of the thumb on the bent portion, and the second effective touch area 102 is the region the index finger can reach on the back of the folding-screen phone, that is, the touch range of the index finger on the second screen unit 12a.
  • the first effective touch area 101 can be touched by a thumb
  • the second effective touch area 102 can be touched by an index finger.
  • In this example, the invalid touch areas may include a first invalid touch area and a second invalid touch area; the first invalid touch area is the region of the bent portion 11a other than the first effective touch area 101, and the second invalid touch area is the region of the second screen unit 12a other than the second effective touch area 102. Setting invalid touch areas can effectively prevent accidental touches while the user holds and operates the phone.
  • the user interface may be displayed on the first screen unit 10a facing the user, and the user's touch operation on the first effective touch area 101 and the second effective touch area 102 may perform various manipulations on the user display interface. For example, when the first screen unit 10a displays a page with text or a combination of graphics and text, according to the up and down sliding operations received on the first effective touch area 101 and the second effective touch area 102, the effect of sliding the page up and down can be achieved .
  • S205: when the holding gesture of the folding-screen phone is determined to be that the bent portion is held on the fingertip side of the fingers, the effective touch areas and invalid touch areas shown in FIG. 9(b) and FIG. 10(b) can be set; FIG. 9(b) shows a plan view of the effective touch areas, and FIG. 10(b) is the corresponding three-dimensional perspective view.
  • In this example, as shown in FIG. 10(b), the folding screen is divided after folding into the following three parts: the first screen unit 10b, the bent portion 11b, and the second screen unit 12b; when the bent portion 11b is held on the fingertip side of the fingers, the first screen unit 10b faces the user, and the second screen unit 12b faces away from the user and toward the user's palm.
  • Referring to FIG. 9(b) and FIG. 10(b), the effective touch areas may include a first effective touch area 103 and a second effective touch area 104; the first effective touch area 103 is the middle-to-upper region of the bent portion 11b, that is, the touch range of the index finger on the bent portion, and the second effective touch area 104 is the region the index finger can reach on the back of the folding-screen phone, that is, the touch range of the index finger on the second screen unit 12b.
  • both the first effective touch area and the second effective touch area can be touched by the index finger. Therefore, as shown in FIG. 10(b), the first effective touch area 103 and the second effective touch area 104 may be connected to facilitate manipulation by the index finger.
  • this application is not limited to this.
  • In this example, the invalid touch areas may include a first invalid touch area and a second invalid touch area; the first invalid touch area is the region of the bent portion 11b other than the first effective touch area 103, and the second invalid touch area is the region of the second screen unit 12b other than the second effective touch area 104.
  • the user interface may be displayed on the first screen unit 10b facing the user, and the user's touch operation on the first effective touch area 103 and the second effective touch area 104 may perform various manipulations on the user display interface.
  • the setting positions of the effective touch area and the invalid touch area corresponding to the holding gesture can be defined in advance. After the holding gesture is recognized, the effective touch area can be set on the screen according to the pre-defined setting position. Area and invalid touch area.
  • this application is not limited to this. In other implementations, the position of the effective touch area can be determined according to the actual touch point of the thumb or index finger on the bending portion.
  • For S206, in which touch operations received in the invalid touch areas are not responded to while corresponding responses are executed for different touch operations in the effective touch areas, various touch operations and their quick responses can be defined. For example, for an up/down slide in an effective touch area: if the phone is displaying a page of text or mixed text and graphics, the page can be scrolled up and down; if the phone is in picture viewing mode, the previous/next picture can be switched to. A single tap in an effective touch area can open an item shown in the phone's display interface, and a double tap can zoom a picture in the phone's picture viewing mode or pause/resume playback in a music playing interface.
  • this application is not limited to this.
  • the control method for a folding screen smart phone provided by this exemplary embodiment can define different response areas (effective touch area and invalid touch area) on the folding screen phone according to the folding state and holding gesture of the phone, thereby While preventing misoperation, it can support convenient and flexible interaction with the mobile phone user interface.
  • In this exemplary embodiment, through the effective touch areas defined according to the holding gesture, the phone's user interface can be controlled quickly with only small movements of the thumb or index finger.
  • Compared with conventional phone operation, the required operations are lighter and more refined, which brings a more convenient user experience; for example, the user can quickly turn pages and browse the content displayed on the screen.
  • FIG. 14 is a schematic diagram of another exemplary implementation of the control method provided by an embodiment of the application.
  • This exemplary embodiment describes a method for controlling the folding screen mobile phone shown in FIGS. 4 to 7.
  • the exemplary embodiment includes the following processes:
  • the folding screen may include a first screen unit and a second screen unit, and the bending angle of the folding screen refers to the angle between the first screen unit and the second screen unit;
  • the first screen unit and the second screen unit can be relatively folded along the rotation axis, so that the angle between the first screen unit and the second screen unit can be changed.
  • S403: detect the parameters of the contact regions between the hand and the back screen and bent portion of the phone (corresponding to the second touch parameter described above, which may include the contact region, the contact area, the contact force, and other parameters); the second touch parameter can be obtained by comprehensive evaluation of multiple sensors arranged under the phone screen.
  • S404: according to the second touch parameter, recognize the positions of the palm and fingers, determine the gesture with which the terminal is held, and automatically build a simulated hand shape model.
  • The holding gesture of the phone can be determined with reference to the flow shown in FIG. 11. However, this application is not limited to this; in other implementations, according to the second touch parameter, in addition to the holding gestures shown in FIG. 12 or FIG. 13, it is also possible to determine a state in which the phone is held with one hand or both hands with the bent portion facing upward, and a state in which the phone is held with one hand or both hands with the bent portion facing downward. Then, based on the determined holding gesture, the effective touch areas and invalid touch areas matching the holding gesture are determined.
  • S405: according to the hand shape model, estimate the hand shape features and determine the effective touch areas and invalid touch areas.
  • In this step, the hand shape features can be estimated from the hand shape model; then the regions of the back screen (that is, the screen area facing the palm) and the bent portion that are easy for the thumb and index finger to operate are calculated; and then, combined with the recognized holding gesture, the effective touch areas and invalid touch areas are determined.
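  • Purely as a hypothetical illustration of S405 (the application does not specify a particular geometric model), one simple way to turn a hand shape feature into an easy-to-operate region is sketched below in Kotlin; the rectangular screen model, the reach fraction, and the field names are all assumptions.

    // Hypothetical sketch: estimating an easy-to-reach back-screen region from a hand shape feature.
    data class Rect(val left: Float, val top: Float, val right: Float, val bottom: Float)
    data class HandShape(val indexFingerLengthMm: Float, val indexFingerTip: Pair<Float, Float>)

    /** Approximates the back-screen region the index finger can comfortably operate. */
    fun easyReachRegion(hand: HandShape, backScreen: Rect): Rect {
        val reach = hand.indexFingerLengthMm * 0.6f   // assumed comfortable-reach fraction
        val (cx, cy) = hand.indexFingerTip            // current fingertip position on the back screen
        // Clamp a square of side 2 * reach, centred on the fingertip, to the back screen bounds.
        return Rect(
            left = maxOf(backScreen.left, cx - reach),
            top = maxOf(backScreen.top, cy - reach),
            right = minOf(backScreen.right, cx + reach),
            bottom = minOf(backScreen.bottom, cy + reach)
        )
    }

  • In such a reading, the region returned here would play the role of the second effective touch area, with the rest of the back screen treated as an invalid touch area.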
  • the configuration of the effective touch area and the ineffective touch area can be referred to as shown in FIG. 9 and FIG. 10, so it will not be repeated here.
  • The difference between this example and the example shown in FIG. 8 is that, in this example, according to the holding gesture and the hand shape features estimated from the hand shape model, effective touch areas and invalid touch areas that fit the user's hand shape characteristics can be defined intelligently, so as to better meet user needs, achieve personalized adaptation, and improve the user experience.
  • This step can refer to the description of S206, so it will not be repeated here.
  • FIG. 15 is a schematic diagram of a control device of a mobile terminal provided by an embodiment of the application.
  • the control device provided in this embodiment is applied to a mobile terminal including at least one screen, the screen has at least one bending part, or the screen is suitable for folding to form at least one bending part.
  • As shown in FIG. 15, the control device provided in this embodiment may include: a detection module 201, adapted to detect a holding gesture of the mobile terminal; and a control module 202, adapted to set, on the bent portion of the screen, a first effective touch area and a first invalid touch area corresponding to the holding gesture.
  • control module 202 may also be adapted to not respond to the touch operation when receiving a touch operation in the first invalid touch area; when receiving a touch operation in the first effective touch area, Respond to the touch operation and execute processing corresponding to the touch operation.
  • control device For the relevant description of the control device provided in this embodiment, reference may be made to the description of the foregoing method embodiment, so it will not be repeated here.
  • an embodiment of the present application further provides a mobile terminal, including: at least one screen, a memory, and a processor; the screen has at least one bending part, or the screen is suitable for folding to form at least one bending part; and the memory is suitable for A computer program is stored, and when the computer program is executed by a processor, the steps of the control method provided in the foregoing embodiments are implemented.
  • FIG. 16 is a schematic diagram of an example of a mobile terminal provided by an embodiment of the application.
  • As shown in FIG. 16, in one example, the mobile terminal 300 may include: a processor 310, a memory 320, a bus system 330, and at least one screen 340; the processor 310, the memory 320, and the screen 340 are connected through the bus system 330, the memory 320 is used to store instructions, and the processor 310 is used to execute the instructions stored in the memory 320 to control the screen 340 to display information input by the user or information provided to the user.
  • the operations of the detection module and the control module in the aforementioned control device can be executed by the processor.
  • It should be understood that the processor 310 may be a central processing unit ("CPU"); the processor 310 may also be another general-purpose processor, a digital signal processor (DSP), an application-specific integrated circuit (ASIC), a field-programmable gate array (FPGA) or other programmable logic device, a discrete gate or transistor logic device, a discrete hardware component, or the like.
  • the general-purpose processor may be a microprocessor or the processor may also be any conventional processor or the like.
  • the memory 320 may include a read-only memory and a random access memory, and provides instructions and data to the processor 310. A part of the memory 320 may also include a non-volatile random access memory. For example, the memory 320 may also store device type information.
  • the bus system 330 may also include a power bus, a control bus, and a status signal bus. However, for clarity of description, various buses are marked as the bus system 330 in FIG. 16.
  • the screen 340 may include a display panel and a touch panel.
  • The display panel may be configured in the form of a liquid crystal display, an organic light-emitting diode display, or the like.
  • the touch panel can collect the user's touch operations on or near it, and can adopt multiple types such as resistive, capacitive, infrared, etc. to realize the touch panel.
  • the touch panel may be covered on the display panel, and when the touch panel detects a touch operation on or near it, it is transmitted to the processor 310 to determine the type of touch event, and then the processor 310 according to The type of touch event provides corresponding visual output on the display panel.
  • the display panel and the touch panel can be integrated to implement input and output functions. This application does not limit this.
  • the processing performed by the mobile terminal 300 may be completed by an integrated logic circuit of hardware in the processor 310 or instructions in the form of software. That is, the steps of the method disclosed in the embodiments of the present application may be embodied as being executed by a hardware processor, or executed by a combination of hardware and software modules in the processor.
  • the software module can be located in storage media such as random access memory, flash memory, read-only memory, programmable read-only memory, or electrically erasable programmable memory, registers.
  • the storage medium is located in the memory 320, and the processor 310 reads the information in the memory 320, and completes the steps of the foregoing method in combination with its hardware. To avoid repetition, it will not be described in detail here.
  • an embodiment of the present application also provides a computer-readable storage medium that stores a computer program that, when executed by a processor, implements the steps of the control method provided in any of the foregoing embodiments.
  • Such software may be distributed on a computer-readable medium, and the computer-readable medium may include a computer storage medium (or a non-transitory medium) and a communication medium (or a transitory medium).
  • As is well known to those of ordinary skill in the art, the term computer storage medium includes volatile and non-volatile, removable and non-removable media implemented in any method or technology for storing information (such as computer-readable instructions, data structures, program modules, or other data).
  • Computer storage media include but are not limited to RAM, ROM, EEPROM, flash memory or other memory technology, CD-ROM, digital versatile disk (DVD) or other optical disk storage, magnetic cassette, tape, magnetic disk storage or other magnetic storage device, or Any other medium used to store desired information and that can be accessed by a computer.
  • communication media usually contain computer-readable instructions, data structures, program modules, or other data in a modulated data signal such as carrier waves or other transmission mechanisms, and may include any information delivery media .

Abstract

A control method for a mobile terminal, applied to a mobile terminal including at least one screen, where the screen has at least one bent portion, or the screen is adapted to be folded to form at least one bent portion; the control method includes: detecting a holding gesture of the mobile terminal; and setting, on the bent portion, a first effective touch area and a first invalid touch area corresponding to the holding gesture.

Description

Control method for a mobile terminal, and mobile terminal
Cross-reference
This application cites Chinese patent application No. 201910550192.6, entitled "Control method for a mobile terminal, and mobile terminal" and filed on June 24, 2019, which is incorporated herein by reference in its entirety.
Technical field
The embodiments of the present application relate to, but are not limited to, the field of communication technology, and in particular to a control method for a mobile terminal and a mobile terminal.
Background
With the rapid development of science and technology, the appearance of mobile electronic devices has changed dramatically; among them, folding screens and curved screens have attracted attention for their unique characteristics and huge potential. Compared with conventional screens, folding screens and curved screens can provide users with new interaction modes based on their bendable characteristics, and can satisfy more user needs for electronic devices.
Summary
This application provides a control method for a mobile terminal and a mobile terminal.
In one aspect, this application provides a control method for a mobile terminal, applied to a mobile terminal including at least one screen, where the screen has at least one bent portion, or the screen is adapted to be folded to form at least one bent portion; the control method includes: detecting a holding gesture of the mobile terminal; and setting, on the bent portion, a first effective touch area and a first invalid touch area corresponding to the holding gesture.
In another aspect, this application provides a mobile terminal, including: at least one screen, a memory, and a processor; the screen has at least one bent portion, or the screen is adapted to be folded to form at least one bent portion; the memory is adapted to store a computer program which, when executed by the processor, implements the steps of the above control method.
In another aspect, this application provides a computer-readable storage medium storing a computer program which, when executed by a processor, implements the steps of the above control method.
Other features and advantages of this application will be set forth in the following description and will in part become apparent from the description, or be understood by implementing this application. The objects and other advantages of this application can be realized and obtained through the structures particularly pointed out in the description, the claims, and the drawings.
Brief description of the drawings
The drawings are provided for a further understanding of the technical solution of this application and constitute part of the description; together with the embodiments of this application they serve to explain the technical solution of this application and do not limit it.
FIG. 1 is a flowchart of a control method for a mobile terminal according to an embodiment of this application;
FIG. 2 is a front view of a non-foldable mobile terminal;
FIG. 3 is a top view of the non-foldable mobile terminal shown in FIG. 2;
FIG. 4 is a schematic diagram of a folding screen of a mobile terminal in the unfolded state;
FIG. 5 is a front view of the folding screen shown in FIG. 4 in the fully folded state;
FIG. 6 is a side view of the folding screen shown in FIG. 4 in the fully folded state;
FIG. 7 is a schematic diagram of holding gestures for the mobile terminal shown in FIG. 5 in the fully folded state;
FIG. 8 is a schematic diagram of an exemplary implementation of the control method provided by an embodiment of this application;
FIG. 9 is a schematic plan view of effective touch areas and invalid touch areas in an exemplary embodiment of this application;
FIG. 10 is a three-dimensional perspective view corresponding to the plan view of FIG. 9;
FIG. 11 is a schematic diagram of a holding gesture recognition process in an exemplary embodiment of this application;
FIG. 12 is a schematic diagram of one holding gesture in an exemplary embodiment of this application;
FIG. 13 is a schematic diagram of another holding gesture in an exemplary embodiment of this application;
FIG. 14 is a schematic diagram of another exemplary implementation of the control method provided by an embodiment of this application;
FIG. 15 is a schematic diagram of a control device of a mobile terminal provided by an embodiment of this application;
FIG. 16 is a schematic diagram of a mobile terminal provided by an embodiment of this application.
Detailed description
To make the objects, technical solutions, and advantages of this application clearer, the embodiments of this application are described in detail below with reference to the drawings. It should be noted that, where there is no conflict, the embodiments of this application and the features in the embodiments may be combined with each other arbitrarily.
The steps shown in the flowcharts of the drawings may be executed in a computer system such as a set of computer-executable instructions. Moreover, although a logical order is shown in the flowcharts, in some cases the steps shown or described may be executed in an order different from that described here.
An embodiment of this application provides a control method for a mobile terminal, which can be applied to a mobile terminal including at least one screen, where the screen may have at least one bent portion, or the screen may be adapted to be folded to form at least one bent portion. For example, the mobile terminal may include one screen having one or more bent portions; or the mobile terminal may include one screen that can be folded to form one or more bent portions; or the mobile terminal may include two screens, one of which is adapted to be folded to form one or more bent portions. The bent portion may be a region that is not a complete plane, or an edge of one or more planar portions, or a connection or transition region between two or more planar portions. However, this application is not limited to this.
The mobile terminal in this embodiment may be a single-screen, dual-screen, or multi-screen terminal. The mobile terminal in this embodiment may include, but is not limited to, a smartphone, a wearable device, a tablet computer, and the like. However, this application is not limited to this.
In one example, the mobile terminal of this embodiment may include a non-foldable screen, and the screen may include a flat portion and a bent portion that bends and extends downward from the width or length direction of the flat portion; the bent portion may cover a side wall of the mobile terminal. Taking a smartphone as an example, the bent portion may cover a side wall in the length or width direction of the phone, and the bent portion covering the side wall may be curved. Since the bent portion is also part of the entire screen, it can be used for display and operation. In this example, the bent portion of the screen is fixed and does not change. FIG. 2 is a front view of a non-foldable mobile terminal; FIG. 3 is a top view of the non-foldable mobile terminal shown in FIG. 2. As shown in FIG. 2 and FIG. 3, the mobile terminal may include a terminal back 30 and a screen that includes a flat portion 20 and two bent portions 11 bending and extending toward both sides from the length direction of the flat portion 20. However, this application is not limited to this; in other embodiments, the number of bent portions may be one, three, or four.
In another example, the mobile terminal of this embodiment may include a folding screen or a flexible screen, which can be bent at one or more places. Taking a folding screen as an example, after the folding screen is bent outward at a certain angle, it can be divided into two screen units facing opposite directions for display, and the curved surface formed at the folding position of the folding screen may be called a bent portion. Each bend (or fold) of the folding screen or flexible screen can correspond to one bent portion. Since the bent portion is also part of the folding screen, it can be used for display and operation. In this example, the bent portion of the screen is not fixed, but is formed only after the screen is bent (or folded).
FIG. 1 is a flowchart of a control method for a mobile terminal according to an embodiment of this application. As shown in FIG. 1, the control method provided in this embodiment, applied to the mobile terminal described above, includes:
S101, detecting a holding gesture of the mobile terminal;
S102, setting, on the bent portion of the screen of the mobile terminal, a first effective touch area and a first invalid touch area corresponding to the holding gesture.
In an exemplary implementation, after S102, the control method of this embodiment may further include: when a touch operation acting on the first invalid touch area is received, not responding to the touch operation; and when a touch operation acting on the first effective touch area is received, responding to the touch operation and performing the processing corresponding to the touch operation.
In one example, the touch operations in the first effective touch area may include an up/down slide, a single tap, and a double tap. For example, when the mobile terminal is displaying a page of text or mixed text and graphics, an up/down slide in the first effective touch area can scroll the page up or down; when the mobile terminal is in a picture viewing mode, an up/down slide in the first effective touch area can switch pictures, for example, sliding up switches to the previous picture and sliding down switches to the next picture; when the mobile terminal is in a picture browsing mode, a double tap in the first effective touch area can zoom the picture, for example, shrinking or enlarging the displayed picture to a certain ratio. However, this application is not limited to this; in practical applications, the touch operations in the first effective touch area and the corresponding response strategies can be set according to actual needs.
In an exemplary implementation, before S101, the control method of this embodiment may further include: detecting the folding angle of the screen of the mobile terminal; and when the folding angle satisfies an angle setting condition, determining that the screen has been folded to form a bent portion. In this exemplary implementation, for the case where the bent portion of the screen is not fixed (the screen is a folding screen or a flexible screen), it is necessary, before S101, to first judge whether the screen has formed a bent portion, and the bent portion is processed only after it has been formed. For example, the screen of the mobile terminal may include at least a first screen unit and a second screen unit; accordingly, detecting the folding angle of the screen of the mobile terminal may include detecting the included angle between the first screen unit and the second screen unit.
In an exemplary implementation, S101 may include:
detecting a first touch parameter between the hand and the bent portion of the screen, and recognizing the holding gesture of the mobile terminal according to the first touch parameter; or
detecting a second touch parameter between the hand and the screen, and recognizing the holding gesture of the mobile terminal according to the second touch parameter.
In an exemplary implementation, the first touch parameter may include the number of touch points on the bent portion and the area of each touch point;
recognizing the holding gesture of the mobile terminal according to the first touch parameter may include:
when the number of touch points on the bent portion is greater than zero and less than or equal to two, and the area of one of the touch points is greater than an area threshold, determining that the bent portion is held against the inner side of the palm;
when the number of touch points on the bent portion is greater than two and less than or equal to four, and the area of every touch point is less than or equal to the area threshold, determining that the bent portion is held on the fingertip side of the fingers.
In an exemplary implementation, after S101, the control method provided in this embodiment may further include: establishing a hand shape model according to the second touch parameter; and estimating hand shape features according to the hand shape model;
accordingly, S102 may include: setting, on the bent portion, a first effective touch area and a first invalid touch area that correspond to the holding gesture and match the hand shape features. The second touch parameter may include the contact region, the contact area, the contact force, and so on. However, this application is not limited to this.
In this exemplary implementation, the first effective touch area and the first invalid touch area are set based on the holding gesture and the hand shape model, which allows personalized matching to different users' hand shapes so as to meet the needs and usage characteristics of different users.
In an exemplary implementation, after S101, the control method of this embodiment further includes: setting, in the screen area facing the palm, a second effective touch area and a second invalid touch area corresponding to the holding gesture. In this exemplary implementation, after the screen is folded, the interface can be displayed on the screen area facing the user, while the screen area facing away from the user (that is, the screen area facing the palm of the hand holding the mobile terminal) can be provided with a second effective touch area that supports operating and controlling the displayed interface, thereby giving the user a new interaction mode, making the terminal more convenient to use, and improving the user experience.
The embodiments of this application provide a control approach for a mobile terminal which, while preventing accidental touches, supports lighter and more refined operations on the mobile terminal, thereby improving the user experience.
Below, a smartphone including a folding screen or a flexible screen (hereinafter referred to as a folding-screen phone), as shown in FIG. 4, is taken as an example to illustrate the control method provided by the embodiments of this application. In FIG. 4, the folding screen of the folding-screen phone is in the unfolded state, with the folding screen facing upward. In terms of how the screen folds, folding screens can be divided into inward-folding and outward-folding types; in terms of the number of screen units after folding, they can be divided into dual-screen and triple-screen types. In this example, the folding screen can be folded along the rotating shaft (the dotted line shown in FIG. 4) to form a dual screen. FIG. 5 shows a front view of the folding-screen phone in the fully folded state. In this example, the folding-screen phone folds outward, and after folding, the whole screen is divided into two screen units for display; that is, after folding, the two screen units display information content facing outward. It should be noted that the control method provided in the embodiments of this application is also applicable to folding screens that fold into three or more screens, and to flexible screens that form a bent portion after part of the screen is bent or folded.
Starting from the unfolded state shown in FIG. 4, the screen unit on the right side can be rotated 180 degrees clockwise along the rotation axis (the dotted line in FIG. 4) to reach the fully folded state shown in FIG. 5 and FIG. 6; the bent portion 11 is then on the right side. FIG. 6 shows a side view of the folding-screen phone in the fully folded state, in which the curved surface connecting the two parallel upper and lower screen units 10 and 12 is the bent portion 11.
FIG. 7 is a schematic diagram of holding gestures for the folding-screen phone shown in FIG. 5 and FIG. 6 in the fully folded state. For the fully folded folding-screen phone shown in FIG. 5 and FIG. 6, the user can usually hold the phone with one hand. As shown in FIG. 7(a), the bent portion is held against the inner side of the palm, in which case the palm and the thumb touch the bent portion; as shown in FIG. 7(b), the bent portion is held on the fingertip side of the fingers, in which case the four fingers other than the thumb touch the bent portion.
Users usually operate the phone screen with the thumb and index finger, and in the holding scenarios shown in FIG. 7 the thumb or index finger can easily touch the bent portion, which matches how the phone is normally used; in terms of operation mode and usage scenario, this supports the feasibility of defining new interaction modes on the bent portion.
FIG. 8 is a schematic diagram of an exemplary implementation of the control method for a mobile terminal provided by an embodiment of this application. This exemplary embodiment describes a control method for the folding-screen phone shown in FIG. 4 to FIG. 7. As shown in FIG. 8, this exemplary embodiment includes the following steps:
S201, detecting the bending angle of the folding screen;
in this example, as shown in FIG. 4 and FIG. 5, the folding screen may include a first screen unit and a second screen unit, and the bending angle of the folding screen refers to the included angle between the first screen unit and the second screen unit; the first screen unit and the second screen unit can be folded relative to each other along the rotation axis, so that the included angle between them can change.
S202, judging whether the bending angle satisfies the angle setting condition.
For example, it can be judged whether the included angle between the first screen unit and the second screen unit is greater than or equal to a set angle threshold; if the included angle is greater than or equal to the angle threshold, S203 can be executed; otherwise, the processing ends.
S203, determining the holding gesture of the folding-screen phone.
In this example, the gesture with which the folding-screen phone is held can be determined from the number of touch points and the contact areas between the hand and the bent portion. In this example, holding gestures can be divided into the following three categories: the bent portion is held against the inner side of the palm, the bent portion is held on the fingertip side of the fingers, and other gestures.
FIG. 11 is a schematic diagram of the holding gesture recognition process in this exemplary embodiment. As shown in FIG. 11, this exemplary recognition process includes:
S301, detecting the number n (n is an integer) of touch points between the bent portion and the hand and the area Si (Si > 0) of each touch point, where the areas of the n touch points may be denoted S1, S2, ..., Sn, and the maximum area with which a finger normally contacts the phone screen may be defined as S0;
S302, performing judgment and recognition according to the number of touch points and the area of each touch point.
If 0 < n ≤ 2 and one touch point has an area Si > S0, that touch point with the larger contact area can be regarded as the touch point between the palm and the bent portion, and the current gesture can be determined as: the bent portion is held against the inner side of the palm (as shown in FIG. 12, where FIG. 12(a) shows the bent portion 11a held against the inner side of the palm with the thumb touching the bent portion 11a, FIG. 12(b) shows the bent portion 11a held against the inner side of the palm with the thumb not touching the bent portion 11a, and the touch points 111 are also shown);
if 2 < n ≤ 4 and the area of every touch point is less than or equal to S0, the current gesture can be determined as: the bent portion is held on the fingertip side of the fingers (as shown in FIG. 13, where FIG. 13(a) shows the bent portion 11b held on the fingertip side with the index finger touching the bent portion 11b, FIG. 13(b) shows the bent portion 11b held on the fingertip side with the index finger not touching the bent portion 11b, and the touch points 111 are also shown);
if n = 0, n > 4, or any case other than the above occurs, the current gesture is determined as: other gestures.
S303, outputting the determination result.
In this example, after the determination result is obtained through the holding gesture recognition process shown in FIG. 11, corresponding processing is performed according to the result: if the result is an other gesture, the processing ends; if the holding gesture shown in FIG. 12 is determined, S204 is executed; if the holding gesture shown in FIG. 13 is recognized, S205 is executed.
S204, when the holding gesture of the folding-screen phone is determined to be that the bent portion is held against the inner side of the palm, the effective touch areas and invalid touch areas shown in FIG. 9(a) and FIG. 10(a) can be set; FIG. 9(a) is a plan view of the effective touch areas, and FIG. 10(a) is the corresponding three-dimensional perspective view.
In this example, as shown in FIG. 10(a), the folding screen is divided after folding into the following three parts: the first screen unit 10a, the bent portion 11a, and the second screen unit 12a; when the bent portion 11a is held against the inner side of the palm, the first screen unit 10a faces the user, and the second screen unit 12a faces away from the user and toward the user's palm.
Referring to FIG. 9(a) and FIG. 10(a), the effective touch areas may include a first effective touch area 101 and a second effective touch area 102; the first effective touch area 101 is the middle region of the bent portion 11a, that is, the touch range of the thumb on the bent portion, and the second effective touch area 102 is the region the index finger can reach on the back of the folding-screen phone, that is, the touch range of the index finger on the second screen unit 12a. In this example, the first effective touch area 101 can be operated by the thumb and the second effective touch area 102 by the index finger.
In this example, the invalid touch areas may include a first invalid touch area and a second invalid touch area; the first invalid touch area is the region of the bent portion 11a other than the first effective touch area 101, and the second invalid touch area is the region of the second screen unit 12a other than the second effective touch area 102. Setting invalid touch areas can effectively prevent accidental touches while the user holds and operates the phone.
A user interface may be displayed on the first screen unit 10a facing the user, and the user's touch operations on the first effective touch area 101 and the second effective touch area 102 can perform various manipulations of the displayed interface. For example, when the first screen unit 10a displays a page of text or mixed text and graphics, the page can be scrolled up and down according to up/down slide operations received on the first effective touch area 101 and the second effective touch area 102.
S205, when the holding gesture of the folding-screen phone is determined to be that the bent portion is held on the fingertip side of the fingers, the effective touch areas and invalid touch areas shown in FIG. 9(b) and FIG. 10(b) can be set; FIG. 9(b) is a plan view of the effective touch areas, and FIG. 10(b) is the corresponding three-dimensional perspective view.
In this example, as shown in FIG. 10(b), the folding screen is divided after folding into the following three parts: the first screen unit 10b, the bent portion 11b, and the second screen unit 12b; when the bent portion 11b is held on the fingertip side of the fingers, the first screen unit 10b faces the user, and the second screen unit 12b faces away from the user and toward the user's palm.
Referring to FIG. 9(b) and FIG. 10(b), the effective touch areas may include a first effective touch area 103 and a second effective touch area 104; the first effective touch area 103 is the middle-to-upper region of the bent portion 11b, that is, the touch range of the index finger on the bent portion, and the second effective touch area 104 is the region the index finger can reach on the back of the folding-screen phone, that is, the touch range of the index finger on the second screen unit 12b. In this example, both the first effective touch area and the second effective touch area can be operated by the index finger; therefore, as shown in FIG. 10(b), the first effective touch area 103 and the second effective touch area 104 may be connected to facilitate operation by the index finger. However, this application is not limited to this.
In this example, the invalid touch areas may include a first invalid touch area and a second invalid touch area; the first invalid touch area is the region of the bent portion 11b other than the first effective touch area 103, and the second invalid touch area is the region of the second screen unit 12b other than the second effective touch area 104.
A user interface may be displayed on the first screen unit 10b facing the user, and the user's touch operations on the first effective touch area 103 and the second effective touch area 104 can perform various manipulations of the displayed interface.
In this example, the positions of the effective touch areas and invalid touch areas corresponding to each holding gesture can be defined in advance; after the holding gesture is recognized, the effective touch areas and invalid touch areas can be set on the screen according to the predefined positions. However, this application is not limited to this; in other implementations, the position of the effective touch area can be determined according to the actual touch points of the thumb or index finger on the bent portion.
S206, in the invalid touch areas, received user touch operations are not responded to; in the effective touch areas, corresponding response processing is performed for the user's different touch operations.
For the effective touch areas, various touch operations and their corresponding quick responses can be defined. For example, for an up/down slide in an effective touch area: if the phone is displaying a page of text or mixed text and graphics, the page can be scrolled up and down; if the phone is in picture viewing mode, the previous/next picture can be switched to. A single tap in an effective touch area can open an item in the phone's display interface, and a double tap can zoom a picture in the phone's picture viewing mode or pause/resume playback in a music playing interface. However, this application is not limited to this.
The control method for a folding-screen smartphone provided by this exemplary embodiment can define different response areas (effective touch areas and invalid touch areas) on the folding-screen phone according to the phone's folding state and holding gesture, thereby preventing misoperation while supporting convenient and flexible interaction with the phone's user interface. In this exemplary embodiment, through the effective touch areas defined according to the holding gesture, the phone's user interface can be controlled quickly with only small movements of the thumb or index finger; compared with conventional phone operation, the required operations are lighter and more refined, bringing a more convenient user experience, for example making it easy for the user to turn pages quickly and browse the content displayed on the screen.
FIG. 14 is a schematic diagram of another exemplary implementation of the control method provided by an embodiment of this application. This exemplary embodiment describes a control method for the folding-screen phone shown in FIG. 4 to FIG. 7. As shown in FIG. 14, this exemplary embodiment includes the following steps:
S401, detecting the bending angle of the folding screen;
in this example, as shown in FIG. 4 and FIG. 5, the folding screen may include a first screen unit and a second screen unit, and the bending angle of the folding screen refers to the included angle between the first screen unit and the second screen unit; the first screen unit and the second screen unit can be folded relative to each other along the rotation axis, so that the included angle between them can change.
S402, judging whether the bending angle satisfies the angle setting condition.
For example, it can be judged whether the included angle between the first screen unit and the second screen unit is greater than or equal to a set angle threshold; if so, S403 can be executed; otherwise, the processing ends.
S403, detecting the parameters of the contact regions between the hand and the back screen and bent portion of the phone (corresponding to the second touch parameter described above, which may include the contact region, the contact area, the contact force, and other parameters); the second touch parameter can be obtained by comprehensive evaluation of multiple sensors arranged under the phone screen.
S404, according to the second touch parameter, recognizing the positions of the palm and fingers, determining the gesture with which the terminal is held, and automatically building a simulated hand shape model.
The holding gesture of the phone can be determined with reference to the flow shown in FIG. 11. However, this application is not limited to this. In other implementations, according to the second touch parameter, in addition to the holding gestures shown in FIG. 12 or FIG. 13, it is also possible to determine a state in which the phone is held with one hand or both hands with the bent portion facing upward, and a state in which the phone is held with one hand or both hands with the bent portion facing downward. Then, based on the determined holding gesture, the effective touch areas and invalid touch areas matching the holding gesture are determined.
S405, according to the hand shape model, estimating the hand shape features and determining the effective touch areas and invalid touch areas.
In this step, the hand shape features can be estimated from the hand shape model; then the regions of the back screen (that is, the screen area facing the palm) and the bent portion that are easy for the thumb and index finger to operate are calculated; and then, combined with the recognized holding gesture, the effective touch areas and invalid touch areas are determined.
In this example, the arrangement of the effective touch areas and invalid touch areas can be as shown in FIG. 9 and FIG. 10, so it is not repeated here.
The difference between this example and the example shown in FIG. 8 is that, in this example, according to the holding gesture and the hand shape features estimated from the hand shape model, effective touch areas and invalid touch areas that fit the user's hand shape characteristics can be defined intelligently, so as to better meet user needs, achieve personalized adaptation, and improve the user experience.
S406, in the invalid touch areas, received user touch operations are not responded to; in the effective touch areas, corresponding response processing is performed for the user's different touch operations.
This step can refer to the description of S206, so it is not repeated here.
FIG. 15 is a schematic diagram of a control device of a mobile terminal provided by an embodiment of this application. The control device provided in this embodiment is applied to a mobile terminal including at least one screen, where the screen has at least one bent portion, or the screen is adapted to be folded to form at least one bent portion.
As shown in FIG. 15, the control device provided in this embodiment may include: a detection module 201, adapted to detect a holding gesture of the mobile terminal; and a control module 202, adapted to set, on the bent portion of the screen, a first effective touch area and a first invalid touch area corresponding to the holding gesture.
The control module 202 may also be adapted to: when a touch operation acting on the first invalid touch area is received, not respond to the touch operation; and when a touch operation acting on the first effective touch area is received, respond to the touch operation and perform the processing corresponding to the touch operation.
For further description of the control device provided in this embodiment, reference may be made to the description of the foregoing method embodiments, so it is not repeated here.
In addition, an embodiment of this application further provides a mobile terminal, including: at least one screen, a memory, and a processor; the screen has at least one bent portion, or the screen is adapted to be folded to form at least one bent portion; the memory is adapted to store a computer program which, when executed by the processor, implements the steps of the control method provided in the foregoing embodiments.
FIG. 16 is a schematic diagram of an example of a mobile terminal provided by an embodiment of this application. As shown in FIG. 16, in one example, the mobile terminal 300 may include: a processor 310, a memory 320, a bus system 330, and at least one screen 340; the processor 310, the memory 320, and the screen 340 are connected through the bus system 330, the memory 320 is used to store instructions, and the processor 310 is used to execute the instructions stored in the memory 320 to control the screen 340 to display information input by the user or information provided to the user. In this example, the operations of the detection module and the control module in the control device described above can be executed by the processor.
It should be understood that the processor 310 may be a central processing unit ("CPU"); the processor 310 may also be another general-purpose processor, a digital signal processor (DSP), an application-specific integrated circuit (ASIC), a field-programmable gate array (FPGA) or other programmable logic device, a discrete gate or transistor logic device, a discrete hardware component, or the like. The general-purpose processor may be a microprocessor, or the processor may be any conventional processor or the like.
The memory 320 may include a read-only memory and a random access memory, and provides instructions and data to the processor 310. A part of the memory 320 may also include a non-volatile random access memory; for example, the memory 320 may also store device type information.
In addition to a data bus, the bus system 330 may also include a power bus, a control bus, and a status signal bus; however, for clarity of description, the various buses are all labeled as the bus system 330 in FIG. 16.
The screen 340 may include a display panel and a touch panel. The display panel may be configured in the form of a liquid crystal display, an organic light-emitting diode display, or the like. The touch panel can collect the user's touch operations on or near it, and may be implemented using resistive, capacitive, infrared, or other types of technology. In one example, the touch panel may cover the display panel; when the touch panel detects a touch operation on or near it, it reports the operation to the processor 310 to determine the type of touch event, and the processor 310 then provides a corresponding visual output on the display panel according to the type of touch event. In some examples, the display panel and the touch panel may be integrated to implement the input and output functions. This application does not limit this.
During implementation, the processing performed by the mobile terminal 300 may be completed by an integrated logic circuit of hardware in the processor 310 or by instructions in the form of software; that is, the steps of the method disclosed in the embodiments of this application may be executed by a hardware processor, or by a combination of hardware in the processor and software modules. The software module may be located in a storage medium such as a random access memory, a flash memory, a read-only memory, a programmable read-only memory, an electrically erasable programmable memory, or a register. The storage medium is located in the memory 320, and the processor 310 reads the information in the memory 320 and completes the steps of the foregoing method in combination with its hardware. To avoid repetition, this is not described in detail here.
In addition, an embodiment of this application further provides a computer-readable storage medium storing a computer program which, when executed by a processor, implements the steps of the control method provided in any of the foregoing embodiments.
Those of ordinary skill in the art can understand that all or some of the steps in the methods disclosed above, and the functional modules/units in the systems and devices, can be implemented as software, firmware, hardware, and appropriate combinations thereof. In a hardware implementation, the division between the functional modules/units mentioned in the above description does not necessarily correspond to the division of physical components; for example, one physical component may have multiple functions, or one function or step may be performed by several physical components cooperating. Some or all of the components may be implemented as software executed by a processor, such as a digital signal processor or a microprocessor, or as hardware, or as an integrated circuit, such as an application-specific integrated circuit. Such software may be distributed on a computer-readable medium, and the computer-readable medium may include a computer storage medium (or non-transitory medium) and a communication medium (or transitory medium). As is well known to those of ordinary skill in the art, the term computer storage medium includes volatile and non-volatile, removable and non-removable media implemented in any method or technology for storing information (such as computer-readable instructions, data structures, program modules, or other data). Computer storage media include, but are not limited to, RAM, ROM, EEPROM, flash memory or other memory technology, CD-ROM, digital versatile disks (DVD) or other optical disk storage, magnetic cassettes, magnetic tape, magnetic disk storage or other magnetic storage devices, or any other medium that can be used to store the desired information and that can be accessed by a computer. In addition, as is well known to those of ordinary skill in the art, communication media usually contain computer-readable instructions, data structures, program modules, or other data in a modulated data signal such as a carrier wave or other transmission mechanism, and may include any information delivery medium.

Claims (13)

  1. A control method for a mobile terminal, applied to a mobile terminal comprising at least one screen, wherein the screen has at least one bent portion, or the screen is adapted to be folded to form at least one bent portion, the control method comprising:
    detecting a holding gesture of the mobile terminal; and
    setting, on the bent portion, a first effective touch area and a first invalid touch area corresponding to the holding gesture.
  2. The method according to claim 1, wherein after the setting, on the bent portion, of the first effective touch area and the first invalid touch area corresponding to the holding gesture, the method further comprises:
    when a touch operation acting on the first invalid touch area is received, not responding to the touch operation; and
    when a touch operation acting on the first effective touch area is received, responding to the touch operation and performing processing corresponding to the touch operation.
  3. The method according to claim 1, wherein before the detecting of the holding gesture of the mobile terminal, the method further comprises:
    detecting a folding angle of the screen of the mobile terminal; and
    when the folding angle satisfies an angle setting condition, determining that the screen has been folded to form a bent portion.
  4. The method according to claim 3, wherein the screen comprises at least a first screen unit and a second screen unit; and
    the detecting of the folding angle of the screen of the mobile terminal comprises: detecting an included angle between the first screen unit and the second screen unit.
  5. The method according to claim 1, wherein the detecting of the holding gesture of the mobile terminal comprises:
    detecting a first touch parameter between a hand and the bent portion of the screen, and recognizing the holding gesture of the mobile terminal according to the first touch parameter; or
    detecting a second touch parameter between a hand and the screen, and recognizing the holding gesture of the mobile terminal according to the second touch parameter.
  6. The method according to claim 5, wherein the first touch parameter comprises: the number of touch points on the bent portion and the area of each touch point; and
    the recognizing of the holding gesture of the mobile terminal according to the first touch parameter comprises:
    when the number of touch points on the bent portion is greater than zero and less than or equal to two, and the area of one of the touch points is greater than an area threshold, determining that the bent portion is held against an inner side of a palm; and
    when the number of touch points on the bent portion is greater than two and less than or equal to four, and the area of each touch point is less than or equal to the area threshold, determining that the bent portion is held on a fingertip side of fingers.
  7. The method according to claim 5, wherein after the detecting of the holding gesture of the mobile terminal, the method further comprises: establishing a hand shape model according to the second touch parameter; and estimating hand shape features according to the hand shape model; and
    accordingly, the setting, on the bent portion, of the first effective touch area and the first invalid touch area corresponding to the holding gesture comprises:
    setting, on the bent portion, a first effective touch area and a first invalid touch area which correspond to the holding gesture and match the hand shape features.
  8. The method according to claim 1, wherein the setting, on the bent portion, of the first effective touch area and the first invalid touch area corresponding to the holding gesture comprises:
    when the bent portion is held against an inner side of a palm, setting the touch range of a thumb on the bent portion as the first effective touch area, and setting the area of the bent portion other than the first effective touch area as the first invalid touch area; and
    when the bent portion is held on a fingertip side of fingers, setting the touch range of an index finger on the bent portion as the first effective touch area, and setting the area of the bent portion other than the first effective touch area as the first invalid touch area.
  9. The method according to claim 1, wherein after the detecting of the holding gesture of the mobile terminal, the method further comprises:
    setting, in a screen area facing a palm, a second effective touch area and a second invalid touch area corresponding to the holding gesture.
  10. The method according to claim 9, wherein the setting, in the screen area facing the palm, of the second effective touch area and the second invalid touch area corresponding to the holding gesture comprises:
    setting the touch range of an index finger in the screen area facing the palm as the second effective touch area, and setting the area of the screen area facing the palm other than the second effective touch area as the second invalid touch area.
  11. The method according to claim 9 or 10, wherein when the bent portion is held on a fingertip side of fingers, the second effective touch area is connected to the first effective touch area.
  12. A mobile terminal, comprising: at least one screen, a memory, and a processor; wherein the screen has at least one bent portion, or the screen is adapted to be folded to form at least one bent portion; and the memory is adapted to store a computer program which, when executed by the processor, implements the steps of the control method according to any one of claims 1 to 11.
  13. A computer-readable storage medium storing a computer program which, when executed by a processor, implements the steps of the control method according to any one of claims 1 to 11.
PCT/CN2020/081241 2019-06-24 2020-03-25 Control method for a mobile terminal, and mobile terminal WO2020258950A1 (zh)

Priority Applications (2)

Application Number Priority Date Filing Date Title
EP20832374.1A EP3936993A4 (en) 2019-06-24 2020-03-25 MOBILE TERMINAL ORDERING METHOD AND MOBILE TERMINAL
US17/604,751 US20220263929A1 (en) 2019-06-24 2020-03-25 Mobile terminal and control method therefor

Applications Claiming Priority (2)

Application Number Priority Date Filing Date Title
CN201910550192.6A CN112130741A (zh) 2019-06-24 2019-06-24 一种移动终端的控制方法及移动终端
CN201910550192.6 2019-06-24

Publications (1)

Publication Number Publication Date
WO2020258950A1 true WO2020258950A1 (zh) 2020-12-30

Family

ID=73849629

Family Applications (1)

Application Number Title Priority Date Filing Date
PCT/CN2020/081241 WO2020258950A1 (zh) 2019-06-24 2020-03-25 一种移动终端的控制方法及移动终端

Country Status (4)

Country Link
US (1) US20220263929A1 (zh)
EP (1) EP3936993A4 (zh)
CN (1) CN112130741A (zh)
WO (1) WO2020258950A1 (zh)

Cited By (1)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN113296636A (zh) * 2021-06-09 2021-08-24 维沃移动通信有限公司 Method and apparatus for preventing accidental touch, and electronic device

Families Citing this family (1)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN113485632A (zh) * 2021-07-26 2021-10-08 深圳市柔宇科技股份有限公司 Folding screen touch control method, terminal device, and computer-readable storage medium

Citations (3)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN104571919A (zh) * 2015-01-26 2015-04-29 深圳市中兴移动通信有限公司 Terminal screen display method and device
CN105824535A (zh) * 2016-03-25 2016-08-03 乐视控股(北京)有限公司 Setting adjustment method and terminal
CN106648427A (zh) * 2016-11-29 2017-05-10 努比亚技术有限公司 Device and method for setting a one-handed mode of a terminal

Family Cites Families (14)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US8878794B2 (en) * 2011-09-27 2014-11-04 Z124 State of screen info: easel
KR102131825B1 (ko) * 2013-03-20 2020-07-09 엘지전자 주식회사 적응적 터치 센서티브 영역을 제공하는 폴더블 디스플레이 디바이스 및 그 제어 방법
US9927840B2 (en) * 2013-06-21 2018-03-27 Semiconductor Energy Laboratory Co., Ltd. Information processor for processing and displaying image data on a bendable display unit
KR101801554B1 (ko) * 2013-07-11 2017-11-27 삼성전자주식회사 컨텐츠를 표시하는 사용자 단말 장치 및 그 방법
TWI647607B (zh) * 2013-10-11 2019-01-11 半導體能源研究所股份有限公司 可攜式資料處理裝置及驅動該可攜式資料處理裝置的方法
KR101588294B1 (ko) * 2013-12-30 2016-01-28 삼성전자주식회사 사용자 인터렉션을 제공하는 사용자 단말 장치 및 그 방법
KR101632008B1 (ko) * 2014-04-30 2016-07-01 엘지전자 주식회사 이동단말기 및 그 제어방법
KR102255143B1 (ko) * 2014-09-02 2021-05-25 삼성전자주식회사 벤디드 디스플레이를 구비한 휴대 단말기의 제어 방법 및 장치
US9864410B2 (en) * 2014-12-29 2018-01-09 Samsung Electronics Co., Ltd. Foldable device and method of controlling the same
KR102308645B1 (ko) * 2014-12-29 2021-10-05 삼성전자주식회사 사용자 단말 장치 및 그의 제어 방법
KR20170084586A (ko) * 2016-01-12 2017-07-20 삼성전자주식회사 전자 장치의 플렉서블 디스플레이 및 그의 운용 방법
EP3564807A4 (en) * 2016-12-30 2020-09-02 Shenzhen Royole Technologies Co., Ltd. CONTROL METHOD AND DEVICE FOR FLEXIBLE DISPLAY DEVICE
US11157047B2 (en) * 2018-11-15 2021-10-26 Dell Products, L.P. Multi-form factor information handling system (IHS) with touch continuity across displays
WO2021251525A1 (ko) * 2020-06-11 2021-12-16 엘지전자 주식회사 이동단말기 및 그 제어방법

Patent Citations (3)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN104571919A (zh) * 2015-01-26 2015-04-29 深圳市中兴移动通信有限公司 Terminal screen display method and device
CN105824535A (zh) * 2016-03-25 2016-08-03 乐视控股(北京)有限公司 Setting adjustment method and terminal
CN106648427A (zh) * 2016-11-29 2017-05-10 努比亚技术有限公司 Device and method for setting a one-handed mode of a terminal

Non-Patent Citations (1)

* Cited by examiner, † Cited by third party
Title
See also references of EP3936993A4 *

Cited By (1)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN113296636A (zh) * 2021-06-09 2021-08-24 维沃移动通信有限公司 Method and apparatus for preventing accidental touch, and electronic device

Also Published As

Publication number Publication date
US20220263929A1 (en) 2022-08-18
EP3936993A1 (en) 2022-01-12
EP3936993A4 (en) 2022-05-04
CN112130741A (zh) 2020-12-25

Similar Documents

Publication Publication Date Title
US20210096885A1 (en) User interface adaptations based on inferred content occlusion and user intent
US9111076B2 (en) Mobile terminal and control method thereof
JP5759660B2 (ja) タッチ・スクリーンを備える携帯式情報端末および入力方法
US9594489B2 (en) Hover-based interaction with rendered content
US20210096675A1 (en) User interface transitions and optimizations for foldable computing devices
KR20160032611A (ko) 터치 입력을 이용하여 전자 장치를 제어하는 방법 및 장치
BR102014002492A2 (pt) método e aparelho para multitarefa
JPWO2013094371A1 (ja) 表示制御装置、表示制御方法およびコンピュータプログラム
WO2022007934A1 (zh) 应用图标控制方法、装置及电子设备
TW201445418A (zh) 電子裝置及螢幕內容分享方法
WO2019119799A1 (zh) 一种显示应用图标的方法及终端设备
WO2020258950A1 (zh) 一种移动终端的控制方法及移动终端
WO2015024375A1 (zh) 微件面积调节的方法及装置
US11221759B2 (en) Transitions and optimizations for a foldable computing device operating in a productivity mode
WO2016183912A1 (zh) 菜单布局方法及装置
WO2017096622A1 (zh) 一种防误触方法、装置及电子设备
US10599326B2 (en) Eye motion and touchscreen gestures
JP6183820B2 (ja) 端末、及び端末制御方法
CN104951221A (zh) 响应触摸操作的方法和电子设备
WO2016206438A1 (zh) 一种触屏控制方法和装置、移动终端
KR101692848B1 (ko) 호버링을 이용하는 가상 터치패드 조작방법 및 이를 수행하는 단말기
WO2022062806A1 (zh) 防误触方法、终端设备及存储介质
TWI511031B (zh) 電子裝置運作方法以及電子裝置
WO2021077832A1 (zh) 一种单手模式控制方法、终端及计算机存储介质
WO2018205069A1 (zh) 一种用户界面调节方法和终端

Legal Events

Date Code Title Description
121 Ep: the epo has been informed by wipo that ep was designated in this application

Ref document number: 20832374

Country of ref document: EP

Kind code of ref document: A1

ENP Entry into the national phase

Ref document number: 2020832374

Country of ref document: EP

Effective date: 20211003

NENP Non-entry into the national phase

Ref country code: DE