CN113467652A - Mistaken touch reminding method and device for under-screen camera terminal equipment - Google Patents


Info

Publication number
CN113467652A
CN113467652A
Authority
CN
China
Prior art keywords
screen
area
terminal device
camera
touchable
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Pending
Application number
CN202010246250.9A
Other languages
Chinese (zh)
Inventor
郜文美
卢曰万
姜永涛
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Huawei Technologies Co Ltd
Original Assignee
Huawei Technologies Co Ltd
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Huawei Technologies Co Ltd filed Critical Huawei Technologies Co Ltd
Priority to CN202010246250.9A
Priority to PCT/CN2021/081569 (published as WO2021197085A1)
Publication of CN113467652A
Legal status: Pending

Classifications

    • G: PHYSICS
    • G06: COMPUTING; CALCULATING OR COUNTING
    • G06F: ELECTRIC DIGITAL DATA PROCESSING
    • G06F 3/00: Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
    • G06F 3/01: Input arrangements or combined input and output arrangements for interaction between user and computer
    • G06F 3/048: Interaction techniques based on graphical user interfaces [GUI]
    • G06F 3/0481: Interaction techniques based on specific properties of the displayed interaction object or a metaphor-based environment, e.g. interaction with desktop elements like windows or icons, or assisted by a cursor's changing behaviour or appearance
    • G06F 3/0487: Interaction techniques using specific features provided by the input device, e.g. functions controlled by the rotation of a mouse with dual sensing arrangements, or of the nature of the input device, e.g. tap gestures based on pressure sensed by a digitiser
    • G06F 3/0488: Interaction techniques using a touch-screen or digitiser, e.g. input of commands through traced gestures
    • G06F 9/00: Arrangements for program control, e.g. control units
    • G06F 9/06: Arrangements for program control using stored programs, i.e. using an internal store of processing equipment to receive or retain programs
    • G06F 9/44: Arrangements for executing specific programs
    • G06F 9/451: Execution arrangements for user interfaces

Abstract

The application provides a false touch reminder method and device for a terminal device with an under-screen camera. The false touch reminder method is applied to a terminal device that includes an under-screen camera, where the screen of the terminal device includes a first area screen covering the under-screen camera, and the method includes: when it is detected that there is a possibility that the first area screen will be touched, displaying a false touch prompt on the screen. The method and the device can reduce the possibility and frequency of contamination of the part of the screen covering the camera and improve shooting clarity.

Description

Mistaken touch reminding method and device for under-screen camera terminal equipment
Technical Field
The application relates to under-screen camera technology for terminal devices, and in particular to a false touch reminder method and device for a terminal device with an under-screen camera.
Background
In recent years, the screen design of mobile terminals (smartphones, tablets, and the like) has been trending toward a high screen-to-body ratio and even a full screen. To achieve a full-screen effect, the industry has produced various designs that hide the front camera, such as mechanically raised cameras, motorized pop-up cameras, and sliding-screen cameras. However, the complicated mechanical structures in these designs lead to high cost and a short service life and let water and dust enter easily, so they are not ideal full-screen solutions.
The related art proposes a design scheme of an under-screen camera (under-display camera, UDC, or under screen camera, USC), in which a front-facing camera is hidden under the screen of the terminal device and the screen is divided into two parts. The first part of the screen covers the front camera and uses an organic light-emitting diode (OLED) panel: when a picture is taken, the first part of the screen becomes transparent and does not obstruct the camera's view; when no picture is being taken, the first part of the screen works as a normal display and shows images together with the second part of the screen. The second part of the screen is the area outside the first part and uses an ordinary OLED panel for normal display. In this way, a true full screen can be achieved.
However, in a terminal device using the UDC design, the first part of the screen is easily contaminated by fingerprints, sweat, dirt, and the like, which reduces shooting clarity.
Disclosure of Invention
The application provides a false touch reminder method and device for a terminal device with an under-screen camera, so as to reduce the possibility and frequency of contamination of the part of the screen covering the camera and improve photographing clarity.
In a first aspect, the application provides a false touch reminder method for a terminal device with an under-screen camera. The method is applied to a terminal device that includes an under-screen camera, where the screen of the terminal device includes a first area screen covering the under-screen camera. The method includes: when it is detected that there is a possibility that the first area screen will be touched, displaying a false touch prompt on the screen.
In this embodiment, when the terminal device detects that the part of the screen covering the camera may be touched by the user's finger, it can give a reminder informing the user of the camera's position, so that the user operates with care and avoids touching the part of the screen covering the camera as much as possible. This reduces the possibility and frequency of contamination of that part of the screen and improves photographing clarity.
In one possible implementation, detecting that there is a possibility that the first area screen will be touched includes: acquiring a touchable area according to a user operation, where the touchable area is an area on the screen of the terminal device that may be touched by the user; when the touchable area and the first area screen overlap, determining that there is a possibility that the first area screen will be touched; and when the touchable area and the first area screen do not overlap, determining that there is no possibility that the first area screen will be touched.
Because the area that may be touched is obtained from the user's actual operation, the accuracy of the false touch reminder can be improved.
In one possible implementation, acquiring the touchable area according to the user operation includes: acquiring a sliding track according to a touch operation of the user on the screen of the terminal device, and determining the touchable area according to the sliding track. It should be noted that a sliding track may be generated by each user operation. For example, if the user slides from left to right on the screen, the sliding track is a line from left to right; if the user slides along a continuous curve on the screen, the sliding track is that curve.
In one possible implementation, the touch operation includes a sliding operation and/or a pressing operation; the sliding operation comprises a linear sliding operation in any direction and/or a curve sliding operation in any direction; the pressing operation includes a clicking operation and/or a long pressing operation.
In one possible implementation, determining the touchable area according to the sliding track includes: taking the sliding track corresponding to a sliding operation as a center line and extending outward on both sides by a set length to obtain the touchable area; or taking the touch point corresponding to a pressing operation as the center of a circle and the set length as the radius to obtain the touchable area.
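As a rough illustration of the two constructions above, the following sketch builds a touchable area from a straight sliding track or from a pressed point. It is a minimal sketch under assumed names and margin values, not the patent's reference implementation; for simplicity, the sliding case uses the track's bounding box expanded by the margin, a slight superset of the center-line band described above.

```python
from dataclasses import dataclass

@dataclass
class Rect:
    left: float
    top: float
    right: float
    bottom: float

@dataclass
class Circle:
    cx: float
    cy: float
    r: float

# Assumed "set length": roughly the half-width of a fingertip, in pixels.
FINGER_MARGIN = 40.0

def touchable_area_from_slide(x0, y0, x1, y1, margin=FINGER_MARGIN):
    """Bounding box of the sliding track, expanded by `margin` on every side."""
    return Rect(min(x0, x1) - margin, min(y0, y1) - margin,
                max(x0, x1) + margin, max(y0, y1) + margin)

def touchable_area_from_press(x, y, radius=FINGER_MARGIN):
    """Circle centered on the pressed point, with the set length as radius."""
    return Circle(x, y, radius)

print(touchable_area_from_slide(540, 0, 540, 300))  # vertical pull-down swipe
print(touchable_area_from_press(560, 100))          # press on an icon
```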
In one possible implementation, acquiring the touchable area according to the user operation includes: acquiring a hover point according to a floating (hover) touch operation of the user near the screen of the terminal device, where the hover touch operation is sensed by a capacitive sensor or a distance sensor; and determining the touchable area according to the hover point.
In one possible implementation, determining the touchable area according to the hover point includes: taking the position on the screen of the terminal device corresponding to the hover point as the center of a circle and the set length as the radius to obtain the touchable area.
Considering the possibility that the user touches the part of the screen covering the camera under various kinds of operations improves the efficiency of false touch detection.
In one possible implementation, giving the reminder includes: displaying the edge of the first area screen in a manner that distinguishes it from other areas; or displaying the first area screen in a manner that distinguishes it from other areas; or displaying the edge of the position on the screen of the terminal device corresponding to the under-screen camera in a manner that distinguishes it from other areas; or displaying the position on the screen of the terminal device corresponding to the under-screen camera in a manner that distinguishes it from other areas; where the manners of distinguishing from other areas include highlighting or filling with a pattern.
In one possible implementation, giving the reminder further includes: indicating, with text, the first area screen or the position on the screen of the terminal device corresponding to the under-screen camera.
Reminding the user of the position of the under-screen camera in multiple ways can reduce the possibility and frequency of contamination of the part of the screen covering the camera and improve photographing clarity, while also providing a variety of reminder styles.
In a second aspect, the application provides a false touch reminder device for a terminal device with an under-screen camera. The reminder device is applied to the terminal device, the terminal device includes an under-screen camera, and the screen of the terminal device includes a first area screen covering the under-screen camera. The reminder device includes: a processing module configured to display a false touch prompt on the screen when it is detected that there is a possibility that the first area screen will be touched.
In a possible implementation, the processing module is specifically configured to acquire a touchable area according to a user operation, where the touchable area is an area on the screen of the terminal device that may be touched by the user; when the touchable area and the first area screen overlap, determine that there is a possibility that the first area screen will be touched; and when the touchable area and the first area screen do not overlap, determine that there is no possibility that the first area screen will be touched.
In a possible implementation, the processing module is specifically configured to acquire a sliding track according to a touch operation of the user on the screen of the terminal device, and determine the touchable area according to the sliding track.
In one possible implementation, the touch operation includes a sliding operation and/or a pressing operation; the sliding operation comprises a linear sliding operation in any direction and/or a curve sliding operation in any direction; the pressing operation includes a clicking operation and/or a long pressing operation.
In a possible implementation, the processing module is specifically configured to take the sliding track corresponding to a sliding operation as a center line and extend outward on both sides by a set length to obtain the touchable area; or take the touch point corresponding to a pressing operation as the center of a circle and the set length as the radius to obtain the touchable area.
In a possible implementation, the processing module is specifically configured to acquire a hover point according to a floating (hover) touch operation of the user near the screen of the terminal device, where the hover touch operation is sensed by a capacitive sensor or a distance sensor, and to determine the touchable area according to the hover point.
In a possible implementation, the processing module is specifically configured to take the position on the screen of the terminal device corresponding to the hover point as the center of a circle and the set length as the radius to obtain the touchable area.
In a possible implementation, the processing module is specifically configured to display the edge of the first area screen in a manner that distinguishes it from other areas; or display the first area screen in a manner that distinguishes it from other areas; or display the edge of the position on the screen of the terminal device corresponding to the under-screen camera in a manner that distinguishes it from other areas; or display the position on the screen of the terminal device corresponding to the under-screen camera in a manner that distinguishes it from other areas; where the manners of distinguishing from other areas include highlighting or filling with a pattern.
In a possible implementation, the processing module is further configured to indicate, with text, the first area screen or the position on the screen of the terminal device corresponding to the under-screen camera.
In a third aspect, the present application provides a terminal device, including: one or more processors; and a memory for storing one or more programs; where, when the one or more programs are executed by the one or more processors, the one or more processors are caused to implement the method according to any one of the implementations of the first aspect.
In a fourth aspect, the present application provides a computer readable storage medium comprising a computer program which, when executed on a computer, causes the computer to perform the method of any of the first aspects above.
In a fifth aspect, the present application provides a computer program for performing the method of any one of the above first aspects when the computer program is executed by a computer.
Drawings
Fig. 1 shows an exemplary schematic structure of a terminal device 100;
fig. 2 shows an exemplary front view of a screen of a terminal device;
FIG. 3 illustrates an exemplary structural schematic of a first area screen;
fig. 4 is a flowchart of an embodiment of the false touch reminder method for a terminal device with an under-screen camera according to the present application;
FIGS. 5a-10 respectively show an exemplary schematic of a detection method;
FIG. 11 shows an exemplary diagram of a user's finger hovering over the screen;
fig. 12-15 each show an exemplary schematic diagram of a method of alerting.
Detailed Description
To make the purpose, technical solutions and advantages of the present application clearer, the technical solutions in the present application will be clearly and completely described below with reference to the drawings in the present application, and it is obvious that the described embodiments are some, but not all embodiments of the present application. All other embodiments, which can be derived by a person skilled in the art from the embodiments given herein without making any creative effort, shall fall within the protection scope of the present application.
The terms "first," "second," and the like in the description examples and claims of this application and in the drawings are used for descriptive purposes only and are not to be construed as indicating or implying relative importance, nor order. Furthermore, the terms "comprises" and "comprising," as well as any variations thereof, are intended to cover a non-exclusive inclusion, such as a list of steps or elements. A method, system, article, or apparatus is not necessarily limited to those steps or elements explicitly listed, but may include other steps or elements not explicitly listed or inherent to such process, system, article, or apparatus.
It should be understood that in the present application, "at least one" means one or more, "a plurality" means two or more. "and/or" for describing an association relationship of associated objects, indicating that there may be three relationships, e.g., "a and/or B" may indicate: only A, only B and both A and B are present, wherein A and B may be singular or plural. The character "/" generally indicates that the former and latter associated objects are in an "or" relationship. "at least one of the following" or similar expressions refer to any combination of these items, including any combination of single item(s) or plural items. For example, at least one (one) of a, b, or c, may represent: a, b, c, "a and b", "a and c", "b and c", or "a and b and c", wherein a, b, c may be single or plural.
The terminal device of the present application may also be referred to as a User Equipment (UE), and may be deployed on land, including indoors or outdoors, handheld or vehicle-mounted; can also be deployed on the water surface (such as a ship and the like); and may also be deployed in the air (e.g., airplanes, balloons, satellites, etc.). The terminal device may be a mobile phone (mobile phone), a tablet computer (pad), a Virtual Reality (VR) device, an Augmented Reality (AR) device, a wireless device in a smart home (smart home), and the like, which is not limited in this application. The terminal device and the chip that can be installed in the terminal device are collectively referred to as a terminal device in this application.
Fig. 1 shows a schematic configuration diagram of a terminal device 100.
The terminal device 100 may include a processor 110, an external memory interface 120, an internal memory 121, a Universal Serial Bus (USB) interface 130, a charging management module 140, a power management module 141, a battery 142, an antenna 1, an antenna 2, a mobile communication module 150, a wireless communication module 160, an audio module 170, a speaker 170A, a receiver 170B, a microphone 170C, an earphone interface 170D, a sensor module 180, a key 190, a motor 191, an indicator 192, a camera 193, a display screen 194, a Subscriber Identification Module (SIM) card interface 195, and the like. The sensor module 180 may include a pressure sensor 180A, a gyroscope sensor 180B, an air pressure sensor 180C, a magnetic sensor 180D, an acceleration sensor 180E, a distance sensor 180F, a proximity light sensor 180G, a fingerprint sensor 180H, a temperature sensor 180J, a touch sensor 180K, an ambient light sensor 180L, a bone conduction sensor 180M, and the like.
It is to be understood that the illustrated structure of the embodiment of the present invention does not specifically limit the terminal device 100. In other embodiments of the present application, terminal device 100 may include more or fewer components than shown, or some components may be combined, some components may be split, or a different arrangement of components. The illustrated components may be implemented in hardware, software, or a combination of software and hardware.
Processor 110 may include one or more processing units, such as: the processor 110 may include an Application Processor (AP), a modem processor, a Graphics Processing Unit (GPU), an Image Signal Processor (ISP), a controller, a video codec, a Digital Signal Processor (DSP), a baseband processor, and/or a neural-Network Processing Unit (NPU), etc. The different processing units may be separate devices or may be integrated into one or more processors.
The controller can generate an operation control signal according to the instruction operation code and the timing signal to complete the control of instruction fetching and instruction execution.
A memory may also be provided in processor 110 for storing instructions and data. In some embodiments, the memory in the processor 110 is a cache memory. The memory may hold instructions or data that have just been used or recycled by the processor 110. If the processor 110 needs to reuse the instruction or data, it can be called directly from the memory. Avoiding repeated accesses reduces the latency of the processor 110, thereby increasing the efficiency of the system.
In some embodiments, processor 110 may include one or more interfaces. The interface may include an integrated circuit (I2C) interface, an integrated circuit built-in audio (I2S) interface, a Pulse Code Modulation (PCM) interface, a universal asynchronous receiver/transmitter (UART) interface, a Mobile Industry Processor Interface (MIPI), a general-purpose input/output (GPIO) interface, a Subscriber Identity Module (SIM) interface, and/or a Universal Serial Bus (USB) interface, etc.
The terminal device 100 may implement a shooting function through the ISP, the camera 193, the video codec, the GPU, the display screen 194, the application processor, and the like.
The ISP is used to process the data fed back by the camera 193. For example, when a photo is taken, the shutter is opened, light is transmitted to the camera photosensitive element through the lens, the optical signal is converted into an electrical signal, and the camera photosensitive element transmits the electrical signal to the ISP for processing and converting into an image visible to naked eyes. The ISP can also carry out algorithm optimization on the noise, brightness and skin color of the image. The ISP can also optimize parameters such as exposure, color temperature and the like of a shooting scene. In some embodiments, the ISP may be provided in camera 193.
The camera 193 is used to capture still images or video. The object generates an optical image through the lens and projects the optical image to the photosensitive element. The photosensitive element may be a Charge Coupled Device (CCD) or a complementary metal-oxide-semiconductor (CMOS) phototransistor. The light sensing element converts the optical signal into an electrical signal, which is then passed to the ISP where it is converted into a digital image signal. And the ISP outputs the digital image signal to the DSP for processing. The DSP converts the digital image signal into image signal in standard RGB, YUV and other formats. In some embodiments, the terminal device 100 may include 1 or N cameras 193, N being a positive integer greater than 1.
The digital signal processor is used for processing digital signals, and can process digital image signals and other digital signals. For example, when the terminal device 100 selects a frequency point, the digital signal processor is used to perform fourier transform or the like on the frequency point energy.
Video codecs are used to compress or decompress digital video. The terminal device 100 may support one or more video codecs. In this way, the terminal device 100 can play or record video in a plurality of encoding formats, such as: moving Picture Experts Group (MPEG) 1, MPEG2, MPEG3, MPEG4, and the like.
The NPU is a neural-network (NN) computing processor that processes input information quickly by using a biological neural network structure, for example, by using a transfer mode between neurons of a human brain, and can also learn by itself continuously. The NPU can implement applications such as intelligent recognition of the terminal device 100, for example: image recognition, face recognition, speech recognition, text understanding, and the like.
The external memory interface 120 may be used to connect an external memory card, such as a Micro SD card, to extend the storage capability of the terminal device 100. The external memory card communicates with the processor 110 through the external memory interface 120 to implement a data storage function. For example, files such as music, video, etc. are saved in an external memory card.
The internal memory 121 may be used to store computer-executable program code, which includes instructions. The internal memory 121 may include a program storage area and a data storage area. The storage program area may store an operating system, an application program (such as a sound playing function, an image playing function, etc.) required by at least one function, and the like. The storage data area may store data (such as audio data, a phonebook, etc.) created during use of the terminal device 100, and the like. In addition, the internal memory 121 may include a high-speed random access memory, and may further include a nonvolatile memory, such as at least one magnetic disk storage device, a flash memory device, a universal flash memory (UFS), and the like. The processor 110 executes various functional applications of the terminal device 100 and data processing by executing instructions stored in the internal memory 121 and/or instructions stored in a memory provided in the processor.
The terminal device 100 may implement an audio function through the audio module 170, the speaker 170A, the receiver 170B, the microphone 170C, the earphone interface 170D, and the application processor. Such as music playing, recording, etc.
The audio module 170 is used to convert digital audio information into an analog audio signal output and also to convert an analog audio input into a digital audio signal. The audio module 170 may also be used to encode and decode audio signals. In some embodiments, the audio module 170 may be disposed in the processor 110, or some functional modules of the audio module 170 may be disposed in the processor 110.
The speaker 170A, also called a "horn", is used to convert the audio electrical signal into an acoustic signal. The terminal device 100 can listen to music through the speaker 170A, or listen to a handsfree call.
The pressure sensor 180A is used for sensing a pressure signal, and converting the pressure signal into an electrical signal. In some embodiments, the pressure sensor 180A may be disposed on the display screen 194. The pressure sensor 180A can be of a wide variety, such as a resistive pressure sensor, an inductive pressure sensor, a capacitive pressure sensor, and the like. The capacitive pressure sensor may be a sensor comprising at least two parallel plates having an electrically conductive material. When a force acts on the pressure sensor 180A, the capacitance between the electrodes changes. The terminal device 100 determines the intensity of the pressure from the change in the capacitance. When a touch operation is applied to the display screen 194, the terminal device 100 detects the intensity of the touch operation based on the pressure sensor 180A. The terminal device 100 may also calculate the touched position from the detection signal of the pressure sensor 180A. In some embodiments, the touch operations that are applied to the same touch position but different touch operation intensities may correspond to different operation instructions. For example: and when the touch operation with the touch operation intensity smaller than the first pressure threshold value acts on the short message application icon, executing an instruction for viewing the short message. And when the touch operation with the touch operation intensity larger than or equal to the first pressure threshold value acts on the short message application icon, executing an instruction of newly building the short message.
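As a simple illustration of the threshold behavior just described, the sketch below dispatches on touch intensity; the threshold value and action names are assumptions chosen only for illustration and are not taken from the patent.

```python
# Assumed value on a normalized 0..1 pressure scale (the "first pressure threshold").
FIRST_PRESSURE_THRESHOLD = 0.5

def handle_press_on_message_icon(pressure: float) -> str:
    """Return the action to execute for a press on the messaging icon."""
    if pressure < FIRST_PRESSURE_THRESHOLD:
        return "view_messages"       # lighter press: execute the instruction for viewing messages
    return "create_new_message"      # press at or above the threshold: create a new message

print(handle_press_on_message_icon(0.3))  # view_messages
print(handle_press_on_message_icon(0.8))  # create_new_message
```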
The gyro sensor 180B may be used to determine the motion attitude of the terminal device 100. In some embodiments, the angular velocity of terminal device 100 about three axes (i.e., x, y, and z axes) may be determined by gyroscope sensor 180B. The gyro sensor 180B may be used for photographing anti-shake. Illustratively, when the shutter is pressed, the gyro sensor 180B detects the shake angle of the terminal device 100, calculates the distance to be compensated for by the lens module according to the shake angle, and allows the lens to counteract the shake of the terminal device 100 through a reverse movement, thereby achieving anti-shake. The gyroscope sensor 180B may also be used for navigation, somatosensory gaming scenes.
The air pressure sensor 180C is used to measure air pressure. In some embodiments, the terminal device 100 calculates an altitude from the barometric pressure measured by the barometric pressure sensor 180C, and assists in positioning and navigation.
The magnetic sensor 180D includes a Hall sensor. The terminal device 100 may use the magnetic sensor 180D to detect the opening and closing of a flip leather case. In some embodiments, when the terminal device 100 is a clamshell device, the terminal device 100 may detect the opening and closing of the clamshell according to the magnetic sensor 180D, and then set features such as automatic unlocking on opening according to the detected open or closed state of the case or cover.
The acceleration sensor 180E can detect the magnitude of acceleration of the terminal device 100 in various directions (generally, three axes). The magnitude and direction of gravity can be detected when the terminal device 100 is stationary. The method can also be used for recognizing the posture of the terminal equipment, and is applied to horizontal and vertical screen switching, pedometers and other applications.
A distance sensor 180F for measuring a distance. The terminal device 100 may measure the distance by infrared or laser. In some embodiments, shooting a scene, the terminal device 100 may range using the distance sensor 180F to achieve fast focus.
The proximity light sensor 180G may include, for example, a light emitting diode (LED) and a light detector, such as a photodiode. The light emitting diode may be an infrared light emitting diode. The terminal device 100 emits infrared light to the outside through the light emitting diode. The terminal device 100 detects infrared light reflected from a nearby object using the photodiode. When sufficient reflected light is detected, it can be determined that there is an object near the terminal device 100; when insufficient reflected light is detected, the terminal device 100 can determine that there is no object nearby. The terminal device 100 can use the proximity light sensor 180G to detect that the user is holding the terminal device 100 close to the ear for a call, so as to automatically turn off the screen and save power. The proximity light sensor 180G may also be used in holster mode and pocket mode to automatically lock and unlock the screen.
The ambient light sensor 180L is used to sense the ambient light level. The terminal device 100 may adaptively adjust the brightness of the display screen 194 according to the perceived ambient light level. The ambient light sensor 180L may also be used to automatically adjust the white balance when taking a picture. The ambient light sensor 180L may also cooperate with the proximity light sensor 180G to detect whether the terminal device 100 is in a pocket, in order to prevent accidental touches.
The fingerprint sensor 180H is used to collect a fingerprint. The terminal device 100 can utilize the collected fingerprint characteristics to realize fingerprint unlocking, access to an application lock, fingerprint photographing, fingerprint incoming call answering and the like.
The temperature sensor 180J is used to detect temperature. In some embodiments, the terminal device 100 executes a temperature processing policy using the temperature detected by the temperature sensor 180J. For example, when the temperature reported by the temperature sensor 180J exceeds the threshold, the terminal device 100 performs a reduction in performance of a processor located near the temperature sensor 180J, so as to reduce power consumption and implement thermal protection. In other embodiments, the terminal device 100 heats the battery 142 when the temperature is below another threshold to avoid the terminal device 100 being abnormally shut down due to low temperature. In other embodiments, when the temperature is lower than a further threshold, the terminal device 100 performs boosting on the output voltage of the battery 142 to avoid abnormal shutdown due to low temperature.
The touch sensor 180K is also called a "touch device". The touch sensor 180K may be disposed on the display screen 194, and the touch sensor 180K and the display screen 194 form a touch screen, which is also called a "touch screen". The touch sensor 180K is used to detect a touch operation applied thereto or nearby. The touch sensor can communicate the detected touch operation to the application processor to determine the touch event type. Visual output associated with the touch operation may be provided through the display screen 194. In other embodiments, the touch sensor 180K may be disposed on the surface of the terminal device 100, different from the position of the display screen 194.
The bone conduction sensor 180M may acquire a vibration signal. In some embodiments, the bone conduction sensor 180M may acquire a vibration signal of the human vocal part vibrating the bone mass. The bone conduction sensor 180M may also contact the human pulse to receive the blood pressure pulsation signal. In some embodiments, the bone conduction sensor 180M may also be disposed in a headset, integrated into a bone conduction headset. The audio module 170 may analyze a voice signal based on the vibration signal of the bone mass vibrated by the sound part acquired by the bone conduction sensor 180M, so as to implement a voice function. The application processor can analyze heart rate information based on the blood pressure beating signal acquired by the bone conduction sensor 180M, so as to realize the heart rate detection function.
Those skilled in the art will appreciate that terminal device 100 may include fewer or more components than shown in fig. 1, and that the terminal device shown in fig. 1 includes only those components more pertinent to the various implementations disclosed herein.
On the touch screen of the terminal device shown in fig. 1 (formed by the display screen 194 and the touch sensor 180K), the front-facing camera adopts the UDC design to realize a full screen. Fig. 2 is a schematic diagram illustrating an exemplary front structure of the screen of the terminal device. As shown in fig. 2, the front-facing camera is hidden under the screen of the terminal device, and the screen is divided into two regions based on the front-facing camera. The first area screen covers the front-facing camera and uses a transparent OLED panel: when a picture is taken, the first area screen is in a transparent state and does not obstruct the camera's view; when no picture is being taken, the first area screen works as a normal display and shows images together with the second area screen. The second area screen is the area of the screen of the terminal device other than the first area and uses an ordinary OLED panel for normal display.
Fig. 2 is a schematic front view of a screen using UDC; fig. 3 shows an exemplary structure of the first area screen in side view. The first area screen comprises three layers: a transparent anode, a transparent organic light-emitting material, and a transparent cathode. When the front-facing camera is not taking a picture, the transparent organic light-emitting material in the first area screen emits light normally, and the light it emits (OLED light) passes through the transparent cathode of the first area screen to reach the user's eyes, so the user can see the content displayed by the first area screen. When the front-facing camera is taking a picture, the transparent organic light-emitting material in the first area screen does not emit light, and external light can pass through the transparent cathode, the transparent organic light-emitting material, and the transparent anode of the first area screen to reach the front-facing camera, so that the front-facing camera can shoot. It should be noted that fig. 2 and fig. 3 are only examples of a terminal device. If the terminal device has multiple front-facing cameras, a transparent OLED screen as shown in fig. 3 may be disposed above each camera separately, or a single transparent OLED screen as shown in fig. 3 may be disposed over a larger area covering all the front-facing cameras. The transparent OLED screen shown in fig. 3 may also have other structures. This is not specifically limited in the present application.
Fig. 4 is a flowchart of an embodiment of the false touch reminder method for a terminal device with an under-screen camera according to the present application. As shown in fig. 4, the method of this embodiment may be applied to the terminal device shown in fig. 1, where the screen of the terminal device may have the structure shown in fig. 2 and fig. 3 and adopts the UDC design, so that a full screen can be realized. The false touch reminder method may include the following steps:
step 401, detecting whether the first area screen on the screen has the possibility of being touched.
The first area screen may employ, for example, a transparent OLED screen on the screen of the terminal device shown in fig. 2 and 3. Due to the adoption of the full-screen, a user needs to input an instruction by means of the touch capability of the screen, so that any area on the screen can be touched, for example, clicking, pulling down, sliding and other actions can occur at any position on the screen in the process of using the terminal device, and the touched position on the screen can be polluted by sweat, pollutants and the like on the hand of the user. The polluted screen can be unclear, particularly, after the screen in the first area is polluted, the polluted screen still covers the camera when in photographing, even if the screen is in a transparent state, the polluted screen is polluted compared with the lens of the camera, and the phenomenon that the photographed image is dazzled and blurred, and even sundries are shielded on the image can be caused.
Fig. 5a and 5b show an exemplary schematic diagram of the detection method. As shown in fig. 5a and 5b, the upper left corner of the screen is taken as the origin of coordinates (0,0), the positive X axis points right from the origin, the abscissa ranges from 0 to X, the positive Y axis points down from the origin, and the ordinate ranges from 0 to Y. To open the pull-down menu at the top of the screen, the user may slide down a distance L1 from a position at the top of the screen with abscissa x1 and ordinate y1, where x1 ∈ (0, X) and y1 is 0 or a value close to 0. Based on this operation, the terminal device detects that the touch operation starts at coordinates (x1, y1), ends at coordinates (x1, y1+L1), and that the sliding track is vertically downward. Taking this sliding track as a reference, the terminal device determines a range: with the line segment between (x1, y1) and (x1, y1+L1) as the center line, it expands left and right by a set distance x (x may be set with reference to the width of a human finger or the area of a fingertip pad) to obtain a rectangular region whose four vertices are (x1-x, y1), (x1+x, y1), (x1-x, y1+L1), and (x1+x, y1+L1). The terminal device then judges whether this rectangular region intersects the first area screen, that is, whether the range can cover the first area screen. If they intersect, the user's finger is likely to pass over the first area screen during the sliding operation, so the first area screen is considered to be at risk of being touched and contaminated in this case.
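The rectangle-versus-rectangle check described above can be sketched in a few lines. This is a minimal illustration with assumed coordinate values (the first-area rectangle below is invented for the example), not the patent's implementation.

```python
FINGER_HALF_WIDTH = 40.0  # the set distance x from the text (assumed value, in pixels)

def rects_intersect(a, b):
    """a, b: (left, top, right, bottom) rectangles. True if they overlap."""
    return a[0] < b[2] and b[0] < a[2] and a[1] < b[3] and b[1] < a[3]

def pulldown_swipe_touch_risk(x1, y1, L1, first_area_rect, half_width=FINGER_HALF_WIDTH):
    # Rectangle with vertices (x1-x, y1), (x1+x, y1), (x1-x, y1+L1), (x1+x, y1+L1).
    swipe_rect = (x1 - half_width, y1, x1 + half_width, y1 + L1)
    return rects_intersect(swipe_rect, first_area_rect)

# Assumed first area screen near the top center of the display.
first_area = (500.0, 0.0, 580.0, 80.0)
print(pulldown_swipe_touch_risk(x1=540.0, y1=0.0, L1=300.0, first_area_rect=first_area))  # True
```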
Fig. 6a and 6b show an exemplary schematic diagram of the detection method. As shown in fig. 6a and 6b, the upper left corner of the screen is taken as the origin of coordinates (0,0), the positive X axis points right from the origin, the abscissa ranges from 0 to X, the positive Y axis points down from the origin, and the ordinate ranges from 0 to Y. To exit the current application, the user typically slides a distance L2 to the right from a position on the left border of the screen with abscissa x2 and ordinate y2, where x2 is 0 or a value close to 0 and y2 ∈ (0, Y). Based on this operation, the terminal device detects that the touch operation starts at coordinates (x2, y2), ends at coordinates (x2+L2, y2), and that the sliding track is horizontal, to the right. Taking this sliding track as a reference, the terminal device determines a range: with the line segment between (x2, y2) and (x2+L2, y2) as the center line, it extends up and down by a set distance y (y may be set with reference to the width of a human finger or the area of a fingertip pad) to obtain a rectangular region whose four vertices are (x2, y2-y), (x2, y2+y), (x2+L2, y2-y), and (x2+L2, y2+y). The terminal device then judges whether this rectangular region intersects the first area screen, that is, whether the range can cover the first area screen. If they intersect, the user's finger is likely to pass over the first area screen during the sliding operation, so the first area screen is considered to be at risk of being touched and contaminated in this case.
Fig. 7a and 7b show an exemplary schematic diagram of the detection method. As shown in fig. 7a and 7b, the upper left corner of the screen is taken as the origin of coordinates (0,0), the positive X axis points right from the origin, the abscissa ranges from 0 to X, the positive Y axis points down from the origin, and the ordinate ranges from 0 to Y. When the user wants to switch the current photo or interface while browsing photos, selecting an application, and so on, the user may slide a distance L3 to the left from a position on the right border of the screen with abscissa x3 and ordinate y3, where x3 is X or a value close to X and y3 ∈ (0, Y). Based on this operation, the terminal device detects that the touch operation starts at coordinates (x3, y3), ends at coordinates (x3-L3, y3), and that the sliding track is horizontal, to the left. Taking this sliding track as a reference, the terminal device determines a range: with the line segment between (x3, y3) and (x3-L3, y3) as the center line, it extends up and down by a set distance y (y may be set with reference to the width of a human finger or the area of a fingertip pad) to obtain a rectangular region whose four vertices are (x3, y3-y), (x3, y3+y), (x3-L3, y3-y), and (x3-L3, y3+y). The terminal device then judges whether this rectangular region intersects the first area screen, that is, whether the range can cover the first area screen. If they intersect, the user's finger is likely to pass over the first area screen during the sliding operation, so the first area screen is considered to be at risk of being touched and contaminated in this case.
Fig. 8a and 8b show an exemplary schematic diagram of the detection method. As shown in fig. 8a and 8b, the upper left corner of the screen is taken as the origin of coordinates (0,0), the positive X axis points right from the origin, the abscissa ranges from 0 to X, the positive Y axis points down from the origin, and the ordinate ranges from 0 to Y. When playing a game, the user may slide in any direction on the game screen, for example while dragging a character: starting from a position with abscissa x4 and ordinate y4, where x4 ∈ (0, X) and y4 ∈ (0, Y), the user may slide a distance L4 in one direction (for example, toward the lower right) and then a distance L5 in another direction (for example, to the right). Based on this operation, the terminal device detects that the touch operation starts at coordinates (x4, y4), passes through coordinates (x41, y41), and ends at coordinates (x42, y42), with a sliding track that first goes to the lower right and then to the right. Taking this sliding track as a reference, the terminal device determines a range: with the polyline connecting (x4, y4), (x41, y41), and (x42, y42) as the center line, it expands a set distance z on both sides (z may be set with reference to the width of a human finger or the area of a fingertip pad) to obtain a band-shaped region. The terminal device then judges whether this band-shaped region intersects the first area screen, that is, whether the range can cover the first area screen. If they intersect, the user's finger is likely to pass over the first area screen during the sliding operation, so the first area screen is considered to be at risk of being touched and contaminated in this case.
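For the band-shaped region around an arbitrary track, an exact buffered-polyline intersection test is more involved; the sketch below uses a simple sampling approximation under assumed values and is offered only as an illustration, not the patent's method.

```python
def band_overlaps_rect(track, z, rect, samples_per_segment=32):
    """track: list of (x, y) points of the sliding track; rect: (left, top, right, bottom).

    Approximates "band of half-width z around the track intersects rect" by sampling
    points along each segment and testing them against the rectangle inflated by z.
    """
    left, top, right, bottom = rect[0] - z, rect[1] - z, rect[2] + z, rect[3] + z
    for (x0, y0), (x1, y1) in zip(track, track[1:]):
        for i in range(samples_per_segment + 1):
            t = i / samples_per_segment
            px, py = x0 + t * (x1 - x0), y0 + t * (y1 - y0)
            if left <= px <= right and top <= py <= bottom:
                return True
    return False

# Example track (x4, y4) -> (x41, y41) -> (x42, y42) passing near an assumed first area screen.
track = [(100.0, 900.0), (520.0, 60.0), (900.0, 60.0)]
print(band_overlaps_rect(track, z=40.0, rect=(500.0, 0.0, 580.0, 80.0)))  # True
```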
Fig. 9a and 9b show an exemplary schematic diagram of the detection method. As shown in fig. 9a and 9b, the upper left corner of the screen is taken as the origin of coordinates (0,0), the positive X axis points right from the origin, the abscissa ranges from 0 to X, the positive Y axis points down from the origin, and the ordinate ranges from 0 to Y. To open an application or trigger the deletion of an application, the user typically presses its icon briefly (a click) or presses and holds it (pressing the control for more than a set length of time); the icon's position corresponds to coordinates (x5, y5). Based on this operation, the terminal device detects that the user's touch operation is a short press or a long press on a circular, square, or irregular area centered on the coordinates (x5, y5). Taking the coordinates (x5, y5) as a reference, the terminal device determines a range: a circular area with (x5, y5) as the center and a set distance r as the radius (r may be set with reference to the size of a fingertip or finger pad). The terminal device then judges whether this circular area intersects the first area screen, that is, whether the range can cover the first area screen. If they intersect, the user's finger is likely to touch the first area screen during the pressing operation, so the first area screen is considered to be at risk of being touched and contaminated in this case.
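For the pressing case, the circle-versus-rectangle test can be done exactly by clamping the circle center to the rectangle. A minimal sketch with illustrative coordinates that are not taken from the patent:

```python
def circle_intersects_rect(cx, cy, r, rect):
    """rect: (left, top, right, bottom). True if the circle overlaps the rectangle."""
    nearest_x = min(max(cx, rect[0]), rect[2])  # closest point of the rectangle to the circle center
    nearest_y = min(max(cy, rect[1]), rect[3])
    dx, dy = cx - nearest_x, cy - nearest_y
    return dx * dx + dy * dy <= r * r

# Icon pressed at (x5, y5) = (560, 100) with an assumed fingertip radius of 45 px;
# the first area screen is assumed to be the rectangle (500, 0, 580, 80).
print(circle_intersects_rect(560.0, 100.0, 45.0, (500.0, 0.0, 580.0, 80.0)))  # True
```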
It should be noted that the foregoing describes some exemplary methods for detecting whether the first area screen may be touched; the present application does not limit the specific implementation, including how the above ranges are determined. In addition, the intersection between the first area screen and the rectangular, band-shaped, or circular regions obtained by the terminal device means that their areas overlap; the extent and size of the overlap are not limited. That is, as long as any overlap exists between the first area screen and such a region, the user's finger is considered likely to touch the first area screen during the corresponding operation.
At present, ordinary touch screens use only mutual-capacitance sensors to implement multi-touch detection. A touch screen that supports floating (hover) touch is provided with two kinds of capacitive sensors: a mutual-capacitance sensor and a self-capacitance sensor. The electric field of the mutual-capacitance sensor is very small and its signal strength is very low, so it cannot sense very weak signals. The self-capacitance sensor generates a stronger signal than mutual capacitance and can sense a finger that is farther away; its detection distance can reach 20 mm, so it can detect a finger located up to 20 mm above the screen. In a touch screen supporting floating touch, the mutual-capacitance sensor performs normal touch sensing, including multi-touch, while the self-capacitance sensor detects a finger hovering above the screen. By setting a touch input threshold, the terminal device can distinguish a floating touch from a contact touch.
A distance sensor, also called a displacement sensor, is a sensor that senses the distance between itself and an object. A touch screen provided with a distance sensor can detect the distance between a finger hovering over the screen and the screen. If the distance is less than a set threshold, the user is likely about to operate on the touch screen. By setting a distance threshold, the terminal device can distinguish a floating touch from a contact touch.
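The following sketch illustrates, under assumed readings and thresholds, how a hover report might be classified against the thresholds described above; real devices expose hover and touch through their own event APIs, so the reading format here is purely an assumption.

```python
HOVER_DISTANCE_MAX_MM = 20.0  # hover sensing range mentioned in the text
CONTACT_DISTANCE_MM = 0.5     # below this, treat the event as a contact touch (assumed threshold)

def classify_touch(distance_mm, x, y):
    """Return ('contact' | 'hover' | 'none', point or None) for a finger reading."""
    if distance_mm <= CONTACT_DISTANCE_MM:
        return "contact", (x, y)
    if distance_mm <= HOVER_DISTANCE_MAX_MM:
        return "hover", (x, y)  # hover point: candidate center of a touchable circle
    return "none", None

print(classify_touch(8.0, 540, 60))  # ('hover', (540, 60)) -> then test the circle against the first area screen
```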
Fig. 10 shows an exemplary schematic diagram of the detection method. As shown in fig. 10, the upper left corner of the screen is taken as the origin of coordinates (0,0), the positive X axis points right from the origin, the abscissa ranges from 0 to X, the positive Y axis points down from the origin, and the ordinate ranges from 0 to Y. The terminal device provides a sensing function: when the user's finger comes close to the screen of the terminal device, the terminal device can detect, based on the floating touch technology or the distance sensor described above, the position on the screen corresponding to the finger hovering over the screen (a hover point). Fig. 11 shows an exemplary schematic diagram of a user's finger hovering above the screen; as shown in fig. 11, the user's right index finger hovers above a location on the screen of the terminal device, which may be referred to as the hover point, with coordinates (x6, y6). The terminal device then obtains a circular area with (x6, y6) as the center and a set distance r as the radius (r may be set with reference to the size of a fingertip or finger pad). The terminal device then judges whether this circular area intersects the first area screen, that is, whether the range can cover the first area screen. If they intersect, the user is likely to start operating at the position of the hover point and the operation is likely to stay near the hover point, so the user's finger is likely to touch the first area screen; the first area screen is therefore considered to be at risk of being touched and contaminated in this case.
It should be noted that the foregoing exemplarily describes a method for detecting whether the first area screen may be touched; the present application does not specifically limit how the above range is determined. In addition, the intersection between the circular area obtained by the terminal device and the first area screen means that their areas overlap; the extent and size of the overlap are not limited. That is, as long as any overlap exists between the circular area and the first area screen, the user's finger is considered likely to touch the first area screen during the operation that follows the hover.
A distance sensor, also called a displacement sensor, is a type of sensor for sensing a distance between the sensor and an object. Through the distance sensor, the terminal device may detect a distance between a finger hovering on the screen and the screen, and if the distance is smaller than a set threshold, it indicates that the user is likely to start operating on the touch screen, and the terminal device may determine a range, for example, a circular area, with a set distance r (the setting of r may refer to an area of a finger tip of a person or an area of an abdomen of the person) as a radius, with a position (a suspension point) on the screen corresponding to the finger, and the suspension point as a center. And then the terminal equipment judges whether the circular area and the first area screen have intersection, namely whether the range can cover the first area screen. If the two intersection points exist, the terminal device is operated at the position where the user is likely to start from the suspension point, and the operation range is likely to be in the vicinity of the suspension point, so that the finger of the user is likely to touch the first area screen, and therefore the first area screen is considered to have the possibility of being touched and polluted in the situation.
It should be noted that the above description is an example of a method for detecting whether the first area screen is likely to be touched; the present application does not specifically limit how the above range is determined.
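As an illustration of how the distance-sensor reading may gate the above determination, the following sketch combines an assumed hover-distance threshold with the same overlap test; the threshold, the radius and the rectangle coordinates are illustrative assumptions only and do not limit the embodiment.

```python
# Illustrative sketch only; threshold, radius and rectangle are assumed values.

HOVER_DISTANCE_THRESHOLD_MM = 15.0  # assumed threshold for "about to operate"
FINGER_RADIUS_PX = 60.0             # assumed set distance r (finger pad size)

def circle_overlaps_rect(cx, cy, r, left, top, right, bottom):
    # Nearest point of the rectangle to the circle center, then distance test.
    nx = min(max(cx, left), right)
    ny = min(max(cy, top), bottom)
    return (cx - nx) ** 2 + (cy - ny) ** 2 <= r * r

def first_area_may_be_touched(hover_distance_mm, hover_x, hover_y,
                              first_area=(500, 0, 580, 80)):  # assumed rect
    """Combine the distance-sensor reading with the overlap test: only when
    the finger is close enough is the circular range built and compared."""
    if hover_distance_mm >= HOVER_DISTANCE_THRESHOLD_MM:
        return False  # finger too far away; the user is not about to operate
    return circle_overlaps_rect(hover_x, hover_y, FINGER_RADIUS_PX, *first_area)

print(first_area_may_be_touched(8.0, 540.0, 60.0))   # True: close and overlapping
print(first_area_may_be_touched(30.0, 540.0, 60.0))  # False: finger still far away
```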
Step 402, when it is determined that the first area screen is likely to be touched, a reminder is given.
As described in step 401, when the terminal device detects that the first area screen is likely to be touched, it may give a reminder to inform the user which area on the screen (i.e., the first area screen) covers the camera, so that the user can operate cautiously and avoid touching the first area screen as much as possible. The following are several examples of the reminder:
Fig. 12 shows an exemplary schematic diagram of the reminding method. As shown in Fig. 12, the terminal device may explicitly frame the first area screen on the screen, for example with a rectangle filled with oblique lines. Fig. 13 shows an exemplary schematic diagram of the reminding method. As shown in Fig. 13, the terminal device may display the first area screen in a color different from other areas, for example by framing the first area screen with a red rectangle. Fig. 14 shows an exemplary schematic diagram of the reminding method. As shown in Fig. 14, the terminal device may explicitly frame the position on the screen corresponding to the camera, for example with a solid circle. By seeing the pattern displayed on the screen, the user can know that the framed area is the first area screen. Fig. 15 shows an exemplary schematic diagram of the reminding method. As shown in Fig. 15, while framing the first area screen with a rectangle filled with oblique lines, the terminal device may also remind the user with text, for example "This is the camera area, please do not touch". By seeing the text displayed on the screen, the user can know that the framed area is the first area screen.
It should be noted that the present application may also adopt other methods to remind the user of the position of the first area screen, which is not specifically limited herein.
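By way of illustration only, the reminder styles of Figs. 12-15 can be summarized as in the following sketch. The style names and the drawing description returned are assumptions for this example and do not correspond to any specific UI framework or to a required implementation.

```python
# Illustrative sketch only; ReminderStyle and the returned drawing description
# are assumed names used for this example.
from enum import Enum, auto

class ReminderStyle(Enum):
    HATCHED_RECTANGLE = auto()   # Fig. 12: first area framed and filled with oblique lines
    COLORED_FRAME = auto()       # Fig. 13: first area framed in a distinct color, e.g. red
    CAMERA_OUTLINE = auto()      # Fig. 14: solid circle drawn around the camera position
    FRAME_WITH_TEXT = auto()     # Fig. 15: hatched frame plus a text prompt

def build_reminder(style: ReminderStyle, first_area, camera_center):
    """Return a simple description of what to draw; a real implementation
    would hand this to the platform's drawing or overlay layer."""
    if style is ReminderStyle.HATCHED_RECTANGLE:
        return {"shape": "rect", "area": first_area, "fill": "oblique_lines"}
    if style is ReminderStyle.COLORED_FRAME:
        return {"shape": "rect", "area": first_area, "stroke": "red"}
    if style is ReminderStyle.CAMERA_OUTLINE:
        return {"shape": "circle", "center": camera_center, "stroke": "solid"}
    return {"shape": "rect", "area": first_area, "fill": "oblique_lines",
            "text": "This is the camera area, please do not touch"}

print(build_reminder(ReminderStyle.FRAME_WITH_TEXT, (500, 0, 580, 80), (540, 40)))
```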
In this embodiment, when it is detected that the part of the screen covering the camera is likely to be touched by the user's finger, the terminal device can give a reminder to inform the user of the position of the camera, so that the user can operate cautiously and avoid touching the part of the screen covering the camera as much as possible. This reduces the possibility and frequency of that part of the screen being contaminated and improves photographing clarity.
Those of ordinary skill in the art will appreciate that the method steps and units described in connection with the embodiments disclosed herein can be implemented as electronic hardware, computer software, or a combination of the two. To clearly illustrate the interchangeability of hardware and software, the steps and units of each embodiment have been described above generally in terms of their functions. Whether such functions are implemented as hardware or software depends on the particular application and the design constraints of the technical solution. Skilled artisans may implement the described functions in different ways for each particular application, but such implementations should not be considered as going beyond the scope of the present application.
It can be clearly understood by those skilled in the art that, for convenience and brevity of description, the specific working processes of the above-described systems, apparatuses and units may refer to the corresponding processes in the foregoing method embodiments, and are not described herein again.
In the several embodiments provided in the present application, it should be understood that the disclosed system, apparatus and method may be implemented in other ways. For example, the above-described apparatus embodiments are merely illustrative, and for example, the division of the units is only one logical division, and other divisions may be realized in practice, for example, a plurality of units or components may be combined or integrated into another system, or some features may be omitted, or not executed. In addition, the shown or discussed mutual coupling or direct coupling or communication connection may be an indirect coupling or communication connection through some interfaces, devices or units, and may also be an electric, mechanical or other form of connection.
The units described as separate parts may or may not be physically separate, and parts displayed as units may or may not be physical units, may be located in one place, or may be distributed on a plurality of network units. Some or all of the units can be selected according to actual needs to achieve the purpose of the solution of the embodiments of the present application.
In addition, functional units in the embodiments of the present application may be integrated into one processing unit, or each unit may exist alone physically, or two or more units are integrated into one unit. The integrated unit can be realized in a form of hardware, and can also be realized in a form of a software functional unit.
The integrated unit, if implemented in the form of a software functional unit and sold or used as a standalone product, may be stored in a computer-readable storage medium. Based on such understanding, the technical solution of the present application, in essence, or the part that contributes to the prior art, or all or part of the technical solution, may be embodied in the form of a software product. The computer software product is stored in a storage medium and includes several instructions for causing a computer device (which may be a personal computer, a server, or a network device) to execute all or part of the steps of the methods according to the embodiments of the present application. The aforementioned storage medium includes various media capable of storing program code, such as a USB flash drive, a removable hard disk, a read-only memory (ROM), a random access memory (RAM), a magnetic disk, or an optical disk.
While the present application has been described with reference to specific embodiments, the protection scope of the present application is not limited thereto, and those skilled in the art can easily conceive of various equivalent modifications or substitutions within the technical scope disclosed in the present application. Therefore, the protection scope of the present application shall be subject to the protection scope of the claims.

Claims (20)

1. A mistaken touch reminding method for a terminal device with an under-screen camera, applied to a terminal device, wherein the terminal device comprises an under-screen camera, and a screen of the terminal device comprises a first area screen covering the under-screen camera, characterized in that the method comprises:
when it is detected that the first area screen is likely to be touched, displaying a false touch prompt on the screen.
2. The method of claim 1, wherein detecting that the first area screen is likely to be touched comprises:
acquiring a touchable area according to a user operation, wherein the touchable area is an area on the screen of the terminal device that can be touched by a user;
determining that the first area screen is likely to be touched when the touchable area and the first area screen overlap;
determining that the first area screen is not likely to be touched when the touchable area and the first area screen do not overlap.
3. The method according to claim 2, wherein the acquiring the touchable area according to a user operation comprises:
acquiring a sliding track according to a touch operation of the user on the screen of the terminal device;
and determining the touchable area according to the sliding track.
4. The method according to claim 3, wherein the touch operation comprises a slide operation and/or a press operation; wherein:
the sliding operation comprises a linear sliding operation in any direction and/or a curve sliding operation in any direction;
the pressing operation comprises a clicking operation and/or a long pressing operation.
5. The method of claim 4, wherein said determining the touchable area from the sliding trajectory comprises:
taking the sliding track corresponding to the sliding operation as a center line and extending outwards on both sides by a set length, to obtain the touchable area; or,
taking the sliding track corresponding to the pressing operation as a circle center and the set length as a radius, to obtain the touchable area.
6. The method according to claim 2, wherein the acquiring the touchable area according to a user operation comprises:
acquiring a hover point according to a hover touch operation of the user near the screen of the terminal device, and determining the touchable area according to the hover point.
7. The method of claim 6, wherein said determining the touchable area from the hover point comprises:
taking the position on the screen of the terminal device corresponding to the hover point as a circle center and the set length as a radius, to obtain the touchable area.
8. The method of any one of claims 1-7, wherein displaying the false touch prompt on the screen comprises:
displaying the edge of the first area screen in a manner distinguished from other areas; or,
displaying the first area screen in a manner distinguished from other areas; or,
displaying the edge of the position, on the screen of the terminal device, corresponding to the under-screen camera in a manner distinguished from other areas; or,
displaying the position, on the screen of the terminal device, corresponding to the under-screen camera in a manner distinguished from other areas;
wherein the manner of distinguishing from other areas comprises highlighting or pattern filling.
9. The method of claim 8, wherein displaying the false touch prompt on the screen further comprises:
indicating, with text on the screen of the terminal device, the position of the first area screen or the position on the screen of the terminal device corresponding to the under-screen camera.
10. A terminal device, characterized by comprising an under-screen camera, wherein a screen of the terminal device comprises a first area screen covering the under-screen camera; the terminal device further comprises: a processor and a memory;
the memory is used for storing programs;
when the program is executed by the processor, the processor is configured to display a false touch prompt on the screen when it is detected that the first area screen is likely to be touched.
11. The terminal device according to claim 10, wherein the processor is specifically configured to: obtain a touchable area according to a user operation, where the touchable area is an area on the screen that can be touched by a user; determine that the first area screen is likely to be touched when the touchable area and the first area screen overlap; and determine that the first area screen is not likely to be touched when the touchable area and the first area screen do not overlap.
12. The terminal device according to claim 11, wherein the processor is specifically configured to obtain a sliding track according to a touch operation of a user on the screen; and determining the touchable area according to the sliding track.
13. The terminal device according to claim 12, wherein the touch operation comprises a slide operation and/or a press operation; wherein:
the sliding operation comprises a linear sliding operation in any direction and/or a curve sliding operation in any direction;
the pressing operation comprises a clicking operation and/or a long pressing operation.
14. The terminal device according to claim 13, wherein the processor is specifically configured to take the sliding track corresponding to the sliding operation as a center line and extend outwards on both sides by a set length to obtain the touchable area; or to take the sliding track corresponding to the pressing operation as a circle center and a set length as a radius to obtain the touchable area.
15. The terminal device according to claim 11, wherein the processor is specifically configured to obtain a hover point according to a hover touch operation of a user near the screen, where the hover touch operation is sensed by a capacitive sensor or a distance sensor, and to determine the touchable area according to the hover point.
16. The terminal device according to claim 15, wherein the processor is specifically configured to obtain the touchable area by taking a position on the screen corresponding to the hover point as a center and a set length as a radius.
17. The terminal device according to any one of claims 10 to 16, wherein the screen is specifically configured to: display the edge of the first area screen in a manner distinguished from other areas; or display the first area screen in a manner distinguished from other areas; or display the edge of the position corresponding to the under-screen camera in a manner distinguished from other areas; or display the position corresponding to the under-screen camera in a manner distinguished from other areas; wherein the manner of distinguishing from other areas comprises highlighting or pattern filling.
18. The terminal device of claim 17, wherein the screen is further configured to indicate, with text, the first area screen or the position corresponding to the under-screen camera.
19. A computer-readable storage medium, comprising a computer program which, when executed on a computer, causes the computer to perform the method of any one of claims 1-9.
20. A computer program for performing the method of any one of claims 1-9 when the computer program is executed by a computer.
CN202010246250.9A 2020-03-31 2020-03-31 Mistaken touch reminding method and device for under-screen camera terminal equipment Pending CN113467652A (en)

Priority Applications (2)

Application Number Priority Date Filing Date Title
CN202010246250.9A CN113467652A (en) 2020-03-31 2020-03-31 Mistaken touch reminding method and device for under-screen camera terminal equipment
PCT/CN2021/081569 WO2021197085A1 (en) 2020-03-31 2021-03-18 Mistouch prompting method and apparatus for terminal device having under-display camera

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
CN202010246250.9A CN113467652A (en) 2020-03-31 2020-03-31 Mistaken touch reminding method and device for under-screen camera terminal equipment

Publications (1)

Publication Number Publication Date
CN113467652A true CN113467652A (en) 2021-10-01

Family

ID=77865664

Family Applications (1)

Application Number Title Priority Date Filing Date
CN202010246250.9A Pending CN113467652A (en) 2020-03-31 2020-03-31 Mistaken touch reminding method and device for under-screen camera terminal equipment

Country Status (2)

Country Link
CN (1) CN113467652A (en)
WO (1) WO2021197085A1 (en)

Families Citing this family (1)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN113986047B (en) * 2021-12-23 2023-10-27 荣耀终端有限公司 Method and device for identifying false touch signal

Patent Citations (6)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN103853476A (en) * 2012-12-04 2014-06-11 联想(北京)有限公司 Information processing method and electronic equipment
CN104731498A (en) * 2015-01-30 2015-06-24 深圳市中兴移动通信有限公司 Mobile terminal mistaken-touch prevention method and device
CN104932788A (en) * 2015-06-24 2015-09-23 青岛海信移动通信技术股份有限公司 Adaptive touch screen control method and equipment
US20170277336A1 (en) * 2016-03-24 2017-09-28 Boe Technology Group Co., Ltd. Touch Method and Device, Touch Display Apparatus
CN106201304A (en) * 2016-06-23 2016-12-07 乐视控股(北京)有限公司 A kind of method and device of false-touch prevention operation
CN108196722A (en) * 2018-01-29 2018-06-22 广东欧珀移动通信有限公司 A kind of electronic equipment and its touch control method, computer readable storage medium

Also Published As

Publication number Publication date
WO2021197085A1 (en) 2021-10-07

Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination