WO2021197085A1 - False-touch reminder method and device for an under-screen camera terminal device - Google Patents

False-touch reminder method and device for an under-screen camera terminal device

Info

Publication number
WO2021197085A1
WO2021197085A1 (application PCT/CN2021/081569, CN2021081569W)
Authority
WO
WIPO (PCT)
Prior art keywords
screen
area
terminal device
camera
touchable
Prior art date
Application number
PCT/CN2021/081569
Other languages
English (en)
French (fr)
Inventor
郜文美
卢曰万
姜永涛
Original Assignee
Huawei Technologies Co., Ltd. (华为技术有限公司)
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Huawei Technologies Co., Ltd.
Publication of WO2021197085A1 publication Critical patent/WO2021197085A1/zh

Links

Images

Classifications

    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06F ELECTRIC DIGITAL DATA PROCESSING
    • G06F 3/00 Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
    • G06F 3/01 Input arrangements or combined input and output arrangements for interaction between user and computer
    • G06F 3/048 Interaction techniques based on graphical user interfaces [GUI]
    • G06F 3/0481 Interaction techniques based on graphical user interfaces [GUI] based on specific properties of the displayed interaction object or a metaphor-based environment, e.g. interaction with desktop elements like windows or icons, or assisted by a cursor's changing behaviour or appearance
    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06F ELECTRIC DIGITAL DATA PROCESSING
    • G06F 3/00 Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
    • G06F 3/01 Input arrangements or combined input and output arrangements for interaction between user and computer
    • G06F 3/048 Interaction techniques based on graphical user interfaces [GUI]
    • G06F 3/0487 Interaction techniques based on graphical user interfaces [GUI] using specific features provided by the input device, e.g. functions controlled by the rotation of a mouse with dual sensing arrangements, or of the nature of the input device, e.g. tap gestures based on pressure sensed by a digitiser
    • G06F 3/0488 Interaction techniques based on graphical user interfaces [GUI] using specific features provided by the input device, e.g. functions controlled by the rotation of a mouse with dual sensing arrangements, or of the nature of the input device, e.g. tap gestures based on pressure sensed by a digitiser using a touch-screen or digitiser, e.g. input of commands through traced gestures
    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06F ELECTRIC DIGITAL DATA PROCESSING
    • G06F 9/00 Arrangements for program control, e.g. control units
    • G06F 9/06 Arrangements for program control, e.g. control units using stored programs, i.e. using an internal store of processing equipment to receive or retain programs
    • G06F 9/44 Arrangements for executing specific programs
    • G06F 9/451 Execution arrangements for user interfaces

Definitions

  • This application relates to under-display camera terminal technology, and in particular to a false-touch reminder method and apparatus for a terminal device with an under-screen camera.
  • In an under-display camera (UDC, also called under-screen camera, USC) design, the front camera is hidden under the screen of the terminal device, and the screen of the terminal device is divided into two parts.
  • The first part of the screen sits directly over the front camera and adopts a transparent organic light-emitting diode (OLED) panel. While a picture is being taken, the first part of the screen is transparent and does not obstruct the camera's viewfinder; when no picture is being taken, the first part of the screen acts as an ordinary display and, together with the second part of the screen, displays the picture normally.
  • The second part of the screen is the screen in the areas other than the first part; it uses an ordinary OLED panel and can display normally. In this way, a true full screen can be realized.
  • However, the first part of the screen of a terminal device adopting the above UDC design scheme is easily contaminated by, for example, fingerprints, sweat, or stains, which reduces the sharpness of photographs.
  • the present application provides a false-touch reminder method and apparatus for an under-screen camera terminal device, so as to reduce the possibility and frequency of contamination of the part of the screen covering the camera and to improve photographic clarity.
  • the present application provides a false touch reminder method for an under-screen camera terminal device, which is applied to a terminal device.
  • the terminal device includes an under-screen camera.
  • The screen of the terminal device includes a first-area screen that covers the under-screen camera.
  • The false-touch reminder method for the under-screen camera terminal device includes: when it is detected that the first-area screen is likely to be touched, displaying a false-touch reminder on the screen.
  • When it is detected that the part of the screen covering the camera is likely to be touched by the user's finger, the terminal device can present a prompt informing the user of the camera's location, so that the user can operate carefully and avoid touching the part of the screen covering the camera as much as possible. This reduces the possibility and frequency of contamination of that part of the screen and improves photographic clarity.
  • detecting whether the first-area screen is likely to be touched includes: obtaining a touchable area according to a user operation, where the touchable area is the area on the screen of the terminal device that may be touched by the user; when the touchable area overlaps the first-area screen, determining that the first-area screen is likely to be touched; and when the touchable area does not overlap the first-area screen, determining that the first-area screen is not likely to be touched.
  • Obtaining the area that may be touched according to the actual operation of the user can improve the accuracy of the false touch reminder.
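As a minimal sketch of the overlap test described above (the class and function names are illustrative, not taken from the application), the touchable area and the first-area screen can each be modeled as an axis-aligned rectangle in screen coordinates:

```python
from dataclasses import dataclass

@dataclass
class Rect:
    # Axis-aligned rectangle in screen coordinates (pixels).
    left: float
    top: float
    right: float
    bottom: float

    def overlaps(self, other: "Rect") -> bool:
        # Two rectangles overlap unless one lies entirely to the
        # side of, above, or below the other.
        return not (self.right <= other.left or
                    other.right <= self.left or
                    self.bottom <= other.top or
                    other.bottom <= self.top)

def should_show_reminder(touchable: Rect, first_area: Rect) -> bool:
    # Per the method above: remind only when the touchable area
    # overlaps the screen area covering the under-display camera.
    return touchable.overlaps(first_area)
```

For example, with the camera area near the top center of the screen, a finger region that reaches into it triggers the reminder, while a touch near the bottom of the screen does not.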
  • acquiring the touchable area according to a user operation includes: acquiring a sliding track according to a touch operation of the user on the screen of the terminal device; and determining the touchable area according to the sliding track.
  • The sliding trajectory may be the trajectory generated by each operation of the user. For example, if the user's operation is a slide from left to right on the screen, the sliding trajectory is a line from left to right; if the user's operation is a continuous curved slide on the screen, the sliding trajectory is a curve.
  • the touch operation includes a sliding operation and/or a pressing operation; the sliding operation includes a linear sliding operation in any direction and/or a curved sliding operation in any direction; the pressing operation includes a clicking operation and/or a long-press operation.
  • determining the touchable area according to the sliding trajectory includes: taking the sliding trajectory corresponding to a sliding operation as the midline and expanding both sides outward by a set length to obtain the touchable area; or taking the touch point corresponding to a pressing operation as the center of a circle and the set length as the radius to obtain the touchable area.
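Both expansions can be sketched as membership tests (identifiers and the geometric model are illustrative, not from the application): a slide trajectory expanded by a set length on both sides is the set of points within that length of the polyline, and a press expands to a disk around the touch point.

```python
import math

def _dist_point_segment(px, py, ax, ay, bx, by):
    # Distance from point (px, py) to the segment (a, b).
    dx, dy = bx - ax, by - ay
    if dx == 0 and dy == 0:
        return math.hypot(px - ax, py - ay)
    # Project the point onto the segment, clamped to its endpoints.
    t = ((px - ax) * dx + (py - ay) * dy) / (dx * dx + dy * dy)
    t = max(0.0, min(1.0, t))
    return math.hypot(px - (ax + t * dx), py - (ay + t * dy))

def in_swipe_area(point, trajectory, set_length):
    # Touchable area of a slide: the trajectory taken as midline,
    # expanded outward by `set_length` on both sides.
    px, py = point
    return any(
        _dist_point_segment(px, py, *a, *b) <= set_length
        for a, b in zip(trajectory, trajectory[1:])
    )

def in_press_area(point, press_point, set_length):
    # Touchable area of a press: a disk of radius `set_length`
    # centered on the touch point.
    return math.hypot(point[0] - press_point[0],
                      point[1] - press_point[1]) <= set_length
```

Testing every pixel this way would be wasteful in practice; the sketch only illustrates the geometry of "midline expanded by a set length" versus "circle with the set length as radius".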
  • obtaining the touchable area according to a user operation includes: obtaining a hovering point according to the user's hover-touch operation near the screen of the terminal device, the hover-touch operation being sensed by a capacitive sensor or a distance sensor; and determining the touchable area according to the hovering point.
  • determining the touchable area according to the hovering point includes: taking a position on the screen of the terminal device corresponding to the hovering point as the center of the circle, and taking the set length as the radius to obtain the touchable area.
  • giving the reminder includes: displaying the edge of the first-area screen in a manner different from other areas; or displaying the first-area screen in a manner different from other areas; or displaying the position on the screen of the terminal device corresponding to the edge of the under-screen camera in a manner different from other areas; or displaying the position on the screen of the terminal device corresponding to the under-screen camera in a manner different from other areas. The manner different from other areas includes highlighting or a fill pattern.
  • giving the reminder further includes: using text on the screen of the terminal device to indicate the position of the first-area screen or of the position on the screen corresponding to the under-screen camera.
  • Reminding the user of the location of the camera under the screen in a variety of ways can reduce the possibility and frequency of contamination of part of the screen covering the camera, improve the clarity of the photo, and realize the diversity of reminding methods.
  • the present application provides a false-touch reminder apparatus for an under-screen camera terminal device, which is applied to a terminal device.
  • the terminal device includes an under-screen camera, and the screen of the terminal device includes a first-area screen that covers the under-screen camera.
  • the device includes: a processing module, used to display a false touch reminder on the screen when it is detected that the screen in the first area has the possibility of being touched.
  • the processing module is specifically used to obtain the touchable area according to the user operation.
  • The touchable area is the area on the screen of the terminal device that may be touched by the user. When the touchable area overlaps the first-area screen, it is determined that the first-area screen is likely to be touched; when the touchable area does not overlap the first-area screen, it is determined that the first-area screen is not likely to be touched.
  • the processing module is specifically configured to obtain a sliding track according to the user's touch operation on the screen of the terminal device; and to determine the touchable area according to the sliding track.
  • the touch operation includes a sliding operation and/or a pressing operation; the sliding operation includes a linear sliding operation in any direction and/or a curved sliding operation in any direction; the pressing operation includes a clicking operation and/or a long-press operation.
  • the processing module is specifically configured to take the sliding trajectory corresponding to the sliding operation as the midline and expand both sides outward by a set length to obtain the touchable area; or to take the touch point corresponding to the pressing operation as the center of a circle and the set length as the radius to obtain the touchable area.
  • the processing module is specifically configured to obtain a hovering point according to the user's hover-touch operation near the screen of the terminal device, the hover-touch operation being sensed by a capacitive sensor or a distance sensor, and to determine the touchable area according to the hovering point.
  • the processing module is specifically configured to take the position corresponding to the hovering point on the screen of the terminal device as the center of the circle and the set length as the radius to obtain the touchable area.
  • the processing module is specifically configured to display the edge of the first-area screen in a manner different from other areas; or to display the first-area screen in a manner different from other areas; or to display the position on the screen of the terminal device corresponding to the edge of the under-screen camera in a manner different from other areas; or to display the position on the screen of the terminal device corresponding to the under-screen camera in a manner different from other areas. The manner different from other areas includes highlighting or a fill pattern.
  • the processing module is further configured to indicate with text, on the screen of the terminal device, the position of the first-area screen or of the position on the screen corresponding to the under-screen camera.
  • this application provides a terminal device, including: one or more processors; and a memory configured to store one or more programs. When the one or more programs are executed by the one or more processors, the one or more processors implement the method of any one of the above first aspects.
  • the present application provides a computer-readable storage medium, including a computer program, which, when executed on a computer, causes the computer to execute the method in any one of the above-mentioned first aspects.
  • the present application provides a computer program, when the computer program is executed by a computer, it is used to execute the method in any one of the above-mentioned first aspects.
  • FIG. 1 shows an exemplary structural diagram of a terminal device 100
  • Fig. 2 shows an exemplary front structural diagram of the screen of the terminal device
  • Fig. 3 shows an exemplary structure diagram of the first area screen
  • FIG. 4 is a flowchart of an embodiment of the false-touch reminder method for an under-screen camera terminal device of this application;
  • Figures 5a-10 show exemplary schematic diagrams of the detection method;
  • FIG. 11 shows an exemplary schematic diagram of a user's finger hovering above the screen;
  • Figures 12-15 show exemplary schematic diagrams of the reminding method.
  • "At least one (item)" refers to one or more, and "multiple" refers to two or more.
  • "And/or" describes an association relationship between associated objects and indicates that three relationships may exist; for example, "A and/or B" can mean: only A, only B, or both A and B, where A and B can be singular or plural.
  • The character "/" generally indicates that the associated objects before and after it are in an "or" relationship.
  • "At least one of the following items" or similar expressions refer to any combination of these items, including any combination of a single item or a plurality of items.
  • For example, "at least one of a, b, or c" can mean: a, b, c, "a and b", "a and c", "b and c", or "a and b and c", where a, b, and c can be single or multiple.
  • The terminal equipment of this application can also be called user equipment (UE). It can be deployed on land, including indoors or outdoors, handheld or vehicle-mounted; on water (such as on ships); or in the air (for example, on airplanes, balloons, or satellites).
  • Terminal devices can be mobile phones, tablets, virtual reality (VR) devices, augmented reality (AR) devices, wireless devices in smart homes, and so on; this application places no restriction on this.
  • The aforementioned terminal equipment and the chips that can be installed in the aforementioned terminal equipment are collectively referred to as terminal equipment.
  • FIG. 1 shows a schematic diagram of the structure of a terminal device 100.
  • the terminal device 100 may include a processor 110, an external memory interface 120, an internal memory 121, a universal serial bus (USB) interface 130, a charging management module 140, a power management module 141, a battery 142, an antenna 1, an antenna 2, a mobile communication module 150, a wireless communication module 160, an audio module 170, a speaker 170A, a receiver 170B, a microphone 170C, an earphone jack 170D, a sensor module 180, buttons 190, a motor 191, an indicator 192, a camera 193, a display screen 194, a subscriber identification module (SIM) card interface 195, and so on.
  • the sensor module 180 may include a pressure sensor 180A, a gyroscope sensor 180B, an air pressure sensor 180C, a magnetic sensor 180D, an acceleration sensor 180E, a distance sensor 180F, a proximity sensor 180G, a fingerprint sensor 180H, a temperature sensor 180J, a touch sensor 180K, and ambient light Sensor 180L, bone conduction sensor 180M, etc.
  • the structure illustrated in the embodiment of the present invention does not constitute a specific limitation on the terminal device 100.
  • the terminal device 100 may include more or fewer components than those shown in the figure, or combine certain components, or split certain components, or arrange different components.
  • the illustrated components can be implemented in hardware, software, or a combination of software and hardware.
  • the processor 110 may include one or more processing units.
  • the processor 110 may include an application processor (AP), a modem processor, a graphics processing unit (GPU), an image signal processor (ISP), a controller, a video codec, a digital signal processor (DSP), a baseband processor, and/or a neural-network processing unit (NPU), etc.
  • the different processing units may be independent devices or integrated in one or more processors.
  • the controller can generate operation control signals according to the instruction operation code and timing signals to complete the control of fetching instructions and executing instructions.
  • a memory may also be provided in the processor 110 to store instructions and data.
  • the memory in the processor 110 is a cache memory.
  • The memory can store instructions or data that the processor 110 has just used or uses cyclically. If the processor 110 needs to use the instructions or data again, it can call them directly from the memory, which avoids repeated accesses, reduces the waiting time of the processor 110, and improves system efficiency.
  • the processor 110 may include one or more interfaces.
  • the interfaces may include an inter-integrated circuit (I2C) interface, an inter-integrated circuit sound (I2S) interface, a pulse code modulation (PCM) interface, a universal asynchronous receiver/transmitter (UART) interface, a mobile industry processor interface (MIPI), a general-purpose input/output (GPIO) interface, a subscriber identity module (SIM) interface, and/or a universal serial bus (USB) interface, etc.
  • the terminal device 100 can implement a shooting function through an ISP, a camera 193, a video codec, a GPU, a display screen 194, and an application processor.
  • the ISP is used to process the data fed back by the camera 193. For example, when taking a picture, the shutter is opened, light is transmitted to the photosensitive element of the camera through the lens, the optical signal is converted into an electrical signal, and the photosensitive element of the camera transmits the electrical signal to the ISP for processing, to be converted into an image visible to the naked eye.
  • The ISP can also perform algorithm optimization on the noise, brightness, and skin color of the image, and can also optimize parameters such as the exposure and color temperature of the shooting scene.
  • the ISP may be provided in the camera 193.
  • the camera 193 is used to capture still images or videos.
  • the object generates an optical image through the lens and is projected to the photosensitive element.
  • the photosensitive element may be a charge coupled device (CCD) or a complementary metal-oxide-semiconductor (CMOS) phototransistor.
  • the photosensitive element converts the optical signal into an electrical signal, and then transfers the electrical signal to the ISP to convert it into a digital image signal.
  • ISP outputs digital image signals to DSP for processing.
  • DSP converts digital image signals into standard RGB, YUV and other formats of image signals.
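The application does not specify which conversion the DSP uses, but a common example of such a format conversion is the BT.601 full-range YCbCr-to-RGB transform, sketched here for a single 8-bit pixel (an assumption for illustration, not the claimed method):

```python
def ycbcr_to_rgb(y, cb, cr):
    # BT.601 full-range YCbCr -> RGB for 8-bit samples.
    # Cb and Cr are chroma samples centered on 128.
    r = y + 1.402 * (cr - 128)
    g = y - 0.344136 * (cb - 128) - 0.714136 * (cr - 128)
    b = y + 1.772 * (cb - 128)
    clamp = lambda v: max(0, min(255, round(v)))
    return clamp(r), clamp(g), clamp(b)
```

A neutral gray (Y = 128, Cb = Cr = 128) maps to (128, 128, 128), since both chroma offsets vanish.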
  • the terminal device 100 may include one or N cameras 193, and N is a positive integer greater than one.
  • Digital signal processors are used to process digital signals. In addition to digital image signals, they can also process other digital signals. For example, when the terminal device 100 selects a frequency point, the digital signal processor is used to perform Fourier transform on the energy of the frequency point.
  • Video codecs are used to compress or decompress digital video.
  • The terminal device 100 may support one or more video codecs. In this way, the terminal device 100 can play or record videos in multiple encoding formats, such as moving picture experts group (MPEG)-1, MPEG-2, MPEG-3, and MPEG-4.
  • The NPU is a neural-network (NN) computing processor.
  • Through the NPU, applications such as intelligent cognition of the terminal device 100 can be implemented, for example image recognition, face recognition, speech recognition, and text understanding.
  • the external memory interface 120 may be used to connect an external memory card, such as a Micro SD card, so as to expand the storage capacity of the terminal device 100.
  • the external memory card communicates with the processor 110 through the external memory interface 120 to realize the data storage function. For example, save music, video and other files in an external memory card.
  • the internal memory 121 may be used to store computer executable program code, where the executable program code includes instructions.
  • the internal memory 121 may include a storage program area and a storage data area.
  • the storage program area can store an operating system, an application program (such as a sound playback function, an image playback function, etc.) required by at least one function, and the like.
  • the data storage area can store data (such as audio data, phone book, etc.) created during the use of the terminal device 100.
  • the internal memory 121 may include a high-speed random access memory, and may also include a non-volatile memory, such as at least one magnetic disk storage device, a flash memory device, a universal flash storage (UFS), and the like.
  • the processor 110 executes various functional applications and data processing of the terminal device 100 by running instructions stored in the internal memory 121 and/or instructions stored in a memory provided in the processor.
  • the terminal device 100 can implement audio functions through the audio module 170, the speaker 170A, the receiver 170B, the microphone 170C, the earphone interface 170D, and the application processor. For example, music playback, recording, etc.
  • the audio module 170 is used to convert digital audio information into an analog audio signal for output, and is also used to convert an analog audio input into a digital audio signal.
  • the audio module 170 can also be used to encode and decode audio signals.
  • the audio module 170 may be provided in the processor 110, or part of the functional modules of the audio module 170 may be provided in the processor 110.
  • The speaker 170A is used to convert audio electrical signals into sound signals.
  • the terminal device 100 can listen to music through the speaker 170A, or listen to a hands-free call.
  • the pressure sensor 180A is used to sense the pressure signal and can convert the pressure signal into an electrical signal.
  • the pressure sensor 180A may be provided on the display screen 194.
  • the capacitive pressure sensor may include at least two parallel plates with conductive materials.
  • the terminal device 100 determines the intensity of the pressure according to the change in capacitance.
  • the terminal device 100 detects the intensity of the touch operation according to the pressure sensor 180A.
  • the terminal device 100 may also calculate the touched position based on the detection signal of the pressure sensor 180A.
  • Touch operations that act on the same touch position but with different touch operation intensities may correspond to different operation instructions. For example, when a touch operation whose intensity is less than a first pressure threshold acts on the short message application icon, an instruction to view the short message is executed; when a touch operation whose intensity is greater than or equal to the first pressure threshold acts on the short message application icon, an instruction to create a new short message is executed.
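The two-threshold behavior in this example can be sketched as a simple dispatch; the threshold value and action names below are illustrative assumptions, not taken from the application:

```python
FIRST_PRESSURE_THRESHOLD = 0.5  # normalized pressure; illustrative value

def dispatch_touch_on_sms_icon(pressure: float) -> str:
    # Same touch position, different pressure -> different instruction,
    # mirroring the short-message-icon example above.
    if pressure < FIRST_PRESSURE_THRESHOLD:
        return "view_short_message"
    return "create_new_short_message"
```

Note the boundary: an intensity exactly equal to the first pressure threshold selects the "create new short message" branch, matching the "greater than or equal to" wording.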
  • The gyro sensor 180B may be used to determine the movement posture of the terminal device 100, for example the angular velocities of the terminal device 100 around three axes (i.e., the x, y, and z axes).
  • the gyro sensor 180B can be used for image stabilization.
  • the gyro sensor 180B detects the shake angle of the terminal device 100, calculates the distance that the lens module needs to compensate according to the angle, and allows the lens to counteract the shake of the terminal device 100 through reverse movement to achieve anti-shake.
  • the gyro sensor 180B can also be used for navigation and somatosensory game scenes.
  • the air pressure sensor 180C is used to measure air pressure.
  • the terminal device 100 calculates the altitude based on the air pressure value measured by the air pressure sensor 180C to assist positioning and navigation.
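The application does not give the calculation, but a common way to estimate altitude from measured air pressure is the international barometric formula, with sea-level pressure assumed to be 1013.25 hPa (an illustrative sketch, not the claimed implementation):

```python
def pressure_to_altitude_m(pressure_hpa, sea_level_hpa=1013.25):
    # International barometric formula: altitude in meters from
    # measured air pressure in hectopascals, assuming a standard
    # atmosphere and the given sea-level reference pressure.
    return 44330.0 * (1.0 - (pressure_hpa / sea_level_hpa) ** (1.0 / 5.255))
```

At the reference pressure the estimate is 0 m; a reading of 900 hPa corresponds to roughly 1 km of altitude.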
  • the magnetic sensor 180D includes a Hall sensor.
  • The terminal device 100 may use the magnetic sensor 180D to detect the opening and closing of a flip holster. The terminal device 100 can thus detect the opening and closing of the flip cover according to the magnetic sensor 180D and set features such as automatic unlocking upon flip-open.
  • the acceleration sensor 180E can detect the magnitude of the acceleration of the terminal device 100 in various directions (generally three axes). When the terminal device 100 is stationary, the magnitude and direction of gravity can be detected. It can also be used to identify the posture of the terminal device, and it can be used in applications such as horizontal and vertical screen switching, pedometer and so on.
  • The distance sensor 180F is used to measure distance; the terminal device 100 can measure distance by infrared or laser. In some embodiments, in a shooting scene, the terminal device 100 may use the distance sensor 180F to measure distance to achieve fast focusing.
  • the proximity light sensor 180G may include, for example, a light emitting diode (LED) and a light detector such as a photodiode.
  • the light emitting diode may be an infrared light emitting diode.
  • the terminal device 100 emits infrared light to the outside through the light emitting diode.
  • the terminal device 100 uses a photodiode to detect infrared reflected light from nearby objects. When sufficient reflected light is detected, it can be determined that there is an object near the terminal device 100. When insufficient reflected light is detected, the terminal device 100 can determine that there is no object near the terminal device 100.
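The sufficient/insufficient reflected-light decision described above amounts to a threshold test; the threshold value below is an illustrative assumption:

```python
REFLECTION_THRESHOLD = 0.3  # normalized photodiode reading; illustrative

def object_nearby(reflected_light: float) -> bool:
    # Sufficient infrared reflected light -> an object is near the
    # terminal device; insufficient reflected light -> no object nearby.
    return reflected_light >= REFLECTION_THRESHOLD
```

A strong reflection (e.g. the user's ear during a call) reads as "object nearby" and can drive the screen-off behavior described next.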
  • the terminal device 100 can use the proximity light sensor 180G to detect that the user holds the terminal device 100 close to the ear to talk, so as to automatically turn off the screen to save power.
  • The proximity light sensor 180G can also be used in leather-case mode and pocket mode to automatically unlock and lock the screen.
  • the ambient light sensor 180L is used to sense the brightness of the ambient light.
  • the terminal device 100 can adaptively adjust the brightness of the display screen 194 according to the perceived brightness of the ambient light.
  • the ambient light sensor 180L can also be used to automatically adjust the white balance when taking pictures.
  • the ambient light sensor 180L can also cooperate with the proximity light sensor 180G to detect whether the terminal device 100 is in a pocket to prevent accidental touch.
  • the fingerprint sensor 180H is used to collect fingerprints.
  • the terminal device 100 can use the collected fingerprint characteristics to implement fingerprint unlocking, access application locks, fingerprint photographs, fingerprint answering calls, and so on.
  • the temperature sensor 180J is used to detect temperature.
  • the terminal device 100 uses the temperature detected by the temperature sensor 180J to execute a temperature processing strategy. For example, when the temperature reported by the temperature sensor 180J exceeds a threshold value, the terminal device 100 reduces the performance of the processor located near the temperature sensor 180J, so as to reduce power consumption and implement thermal protection.
  • the terminal device 100 when the temperature is lower than another threshold, the terminal device 100 heats the battery 142 to avoid abnormal shutdown of the terminal device 100 due to low temperature.
  • the terminal device 100 boosts the output voltage of the battery 142 to avoid abnormal shutdown caused by low temperature.
  • The touch sensor 180K is also called a "touch device". The touch sensor 180K may be provided on the display screen 194; the touch sensor 180K and the display screen 194 together compose a touchscreen, also called a "touch screen".
  • the touch sensor 180K is used to detect touch operations acting on or near it.
  • the touch sensor can pass the detected touch operation to the application processor to determine the type of touch event.
  • the visual output related to the touch operation can be provided through the display screen 194.
  • the touch sensor 180K may also be disposed on the surface of the terminal device 100, which is different from the position of the display screen 194.
  • the bone conduction sensor 180M can acquire vibration signals.
  • The bone conduction sensor 180M can obtain the vibration signal of the vibrating bone block of the human vocal part.
  • the bone conduction sensor 180M can also contact the human pulse and receive the blood pressure pulse signal.
  • the bone conduction sensor 180M may also be provided in the earphone, combined with the bone conduction earphone.
  • The audio module 170 can parse out a voice signal based on the vibration signal of the vibrating bone block of the vocal part obtained by the bone conduction sensor 180M, to realize a voice function.
  • the application processor can analyze the heart rate information based on the blood pressure beating signal obtained by the bone conduction sensor 180M, and realize the heart rate detection function.
  • the terminal device 100 may include fewer or more components than those shown in FIG. 1.
  • The terminal device shown in FIG. 1 includes only the components that are most relevant to the multiple implementations disclosed in this application.
  • Fig. 2 shows an exemplary front structural diagram of the screen of the terminal device. As shown in Fig. 2, the front camera is hidden under the screen of the terminal device. Based on this, the screen of the terminal device is divided into two areas. The first area screen covers the front camera and adopts a transparent OLED screen: when taking pictures, the first area screen is transparent and does not hinder the camera's viewfinder; when not taking pictures, the first area screen is an ordinary display and, together with the second area screen, displays the picture normally. The second area screen is the area of the terminal device's screen other than the first area, where a normal OLED screen can be used for normal display.
  • Fig. 2 shows a front schematic diagram of a screen adopting UDC; a side view of the first area screen in that screen is shown in Fig. 3.
  • Fig. 3 shows an exemplary structural schematic diagram of the first area screen.
  • the first area screen includes three layers, in order from the outside to the inside, the outermost layer is a transparent anode, the middle layer is made of transparent organic light-emitting material, and the innermost layer is a transparent cathode.
  • when the front camera is not taking pictures, the transparent organic light-emitting material in the first area screen emits light normally, and the emitted light (OLED light) reaches the user's eyes through the transparent cathode in the first area screen, so that the user can see the content displayed on the first area screen; when the front camera takes pictures, the transparent organic light-emitting material does not emit light, and external light can pass through the transparent cathode, the transparent organic light-emitting material, and the transparent anode to reach the front camera.
  • FIG. 2 and FIG. 3 are only examples of a terminal device. If the terminal device has multiple front cameras, a transparent OLED screen as shown in Figure 3 can be set above each camera, or a single transparent OLED screen as shown in Figure 3 can be set over a larger area (which can cover all the front cameras) for all the front cameras. The transparent OLED screen shown in Figure 3 can also adopt other structures. This application does not make specific restrictions on this.
  • FIG. 4 is a flowchart of an embodiment of the mistaken-touch reminder method for a terminal device with an under-display camera of this application.
  • the method of this embodiment can be applied to the terminal device shown in FIG. 1.
  • the structure shown in Fig. 2 and Fig. 3 adopts the UDC design scheme, so that the screen can realize a full screen.
  • the mistaken-touch reminder method for the under-display camera terminal device may include:
  • Step 401 Detect whether the screen in the first area on the screen is likely to be touched.
  • the first area screen may be, for example, the transparent OLED screen on the screen of the terminal device shown in FIG. 2 and FIG. 3. Since a full screen is adopted, the user needs to use the touch capability of the screen to input instructions. Therefore, when the user uses the terminal device, any area on the screen may be touched; actions such as tapping, pulling down, and sliding may occur anywhere on the screen, which will cause the touched places on the screen to be contaminated by sweat and contaminants on the user's hand.
  • on a contaminated screen, the interface displayed may become hard to see. In particular, after the first area screen is contaminated, because it still covers the camera when taking pictures, even if the screen itself becomes transparent, the contaminated screen is like a contaminated camera lens, causing glare, blur, and even obstruction by debris in the captured image.
  • Figures 5a and 5b show an exemplary schematic diagram of the detection method.
  • the upper left corner of the screen is taken as the coordinate origin (0,0)
  • the x-axis is positive from the coordinate origin to the right.
  • the value range of the abscissa is 0~X
  • the positive direction of the y-axis is downward from the origin of the coordinate
  • the value range of the ordinate of the screen is 0~Y.
  • to open the drop-down menu at the top of the screen, the user usually starts from a certain position at the top of the screen (with abscissa x1 and ordinate y1, where x1∈(0,X) and y1 may be 0 or a value close to 0) and slides down a certain distance L1.
  • the terminal device can detect that the user's touch operation starts at the coordinates (x1, y1) and ends at the coordinates (x1, y1+L1), and the sliding track is vertically downward.
  • the terminal device uses the sliding track as a reference to determine a range: taking the line between the coordinates (x1, y1) and the coordinates (x1, y1+L1) as the center line and expanding left and right by a set distance x (the setting of x can refer to the thickness of a person's finger or the area of a person's fingertip), a rectangular area is obtained.
  • the four vertices of the rectangular area are (x1-x, y1), (x1+x, y1), (x1-x, y1+L1), and (x1+x, y1+L1).
  • the terminal device judges whether there is an intersection between the rectangular area and the screen of the first area, that is, whether the above-mentioned range will cover the screen of the first area. If the foregoing two overlap, it means that during the aforementioned sliding operation of the user, the user's finger is likely to swipe the screen in the first area. Therefore, it can be considered that the screen in the first area may be touched and contaminated in this case.
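The rectangle-versus-camera-area overlap test described above can be sketched as follows. This is a minimal illustration, not code from the embodiment: the coordinate values, the `half_width` standing in for the set distance x, and the `FIRST_AREA` bounds are all hypothetical.

```python
def swipe_rect(x1, y1, length, half_width):
    """Rectangle around a vertical swipe from (x1, y1) to (x1, y1 + length),
    expanded left and right by half_width (roughly half a fingertip)."""
    return (x1 - half_width, y1, x1 + half_width, y1 + length)

def rects_overlap(a, b):
    """Axis-aligned rectangles as (left, top, right, bottom); any overlap counts."""
    return a[0] < b[2] and b[0] < a[2] and a[1] < b[3] and b[1] < a[3]

# Hypothetical first-area screen (the region covering the under-display camera).
FIRST_AREA = (500, 0, 580, 80)

touch_region = swipe_rect(x1=520, y1=0, length=200, half_width=30)
print(rects_overlap(touch_region, FIRST_AREA))  # True: the swipe may touch the camera area
```

Per the embodiment, any non-empty intersection is enough to trigger the reminder; the size of the overlap does not matter.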
  • Figures 6a and 6b show an exemplary schematic diagram of the detection method.
  • the upper left corner of the screen is taken as the coordinate origin (0,0)
  • the x-axis is positive from the coordinate origin to the right.
  • the value range of the abscissa is 0~X
  • the positive direction of the y-axis is downward from the origin of the coordinate
  • the value range of the ordinate of the screen is 0~Y.
  • to exit the current application, the user can usually start from a certain position on the left edge of the screen (with abscissa x2 and ordinate y2, where x2 may be 0 or a value close to 0 and y2∈(0, Y)) and slide a certain distance L2 to the right.
  • the terminal device can detect that the user's touch operation starts at the coordinates (x2, y2) and ends at the coordinates (x2+L2, y2), and the sliding track is horizontal to the right.
  • the terminal device uses the sliding track as a reference to determine a range: taking the line between the coordinates (x2, y2) and the coordinates (x2+L2, y2) as the center line and expanding up and down by a set distance y (the setting of y can refer to the thickness of a person's finger or the area of a person's fingertip), a rectangular area is obtained.
  • the four vertices of the rectangular area are (x2, y2-y), (x2, y2+y), (x2+L2, y2- y) and (x2+L2, y2+y).
  • the terminal device judges whether there is an intersection between the rectangular area and the screen of the first area, that is, whether the above-mentioned range will cover the screen of the first area. If the foregoing two overlap, it means that during the aforementioned sliding operation of the user, the user's finger is likely to swipe the screen in the first area. Therefore, it can be considered that the screen in the first area may be touched and contaminated in this case.
  • Figures 7a and 7b show an exemplary schematic diagram of the detection method.
  • the upper left corner of the screen is taken as the coordinate origin (0,0)
  • the x-axis is positive from the coordinate origin to the right.
  • the value range of the abscissa is 0~X
  • the positive direction of the y-axis is downward from the origin of the coordinate
  • the value range of the ordinate of the screen is 0~Y.
  • the terminal device can detect that the user's touch operation starts at the coordinates (x3, y3) and ends at the coordinates (x3-L3, y3), and the sliding track is horizontal to the left.
  • the terminal device uses the sliding track as a reference to determine a range: taking the line between the coordinates (x3, y3) and the coordinates (x3-L3, y3) as the center line and expanding up and down by a set distance y (the setting of y can refer to the thickness of a person's finger or the area of a person's fingertip), a rectangular area is obtained.
  • the four vertices of the rectangular area are (x3, y3-y), (x3, y3+y), (x3-L3, y3- y) and (x3-L3,y3+y).
  • the terminal device judges whether there is an intersection between the rectangular area and the screen of the first area, that is, whether the above-mentioned range will cover the screen of the first area. If the foregoing two overlap, it means that during the aforementioned sliding operation of the user, the user's finger is likely to swipe the screen in the first area. Therefore, it can be considered that the screen in the first area may be touched and contaminated in this case.
  • Figures 8a and 8b show an exemplary schematic diagram of the detection method.
  • the upper left corner of the screen is taken as the coordinate origin (0,0)
  • the x-axis is positive from the coordinate origin to the right.
  • the value range of the abscissa is 0~X
  • the positive direction of the y-axis is downward from the origin of the coordinate
  • the value range of the ordinate of the screen is 0~Y.
  • the terminal device can detect that the user's touch operation starts at the coordinates (x4, y4), passes the coordinates (x41, y41), and ends at the coordinates (x42, y42), and the sliding track first moves to the lower right and then to the right.
  • the terminal device uses the sliding track as the reference to determine a range: taking the polyline through the coordinates (x4, y4), (x41, y41), and (x42, y42) as the center line and expanding both sides by a set distance z (the setting of z can refer to the thickness of a person's finger or the area of a person's fingertip), a band-shaped area is obtained.
  • the terminal device judges whether there is an intersection between the band-shaped area and the screen of the first area, that is, whether the above-mentioned range will cover the screen of the first area. If the foregoing two overlap, it means that during the aforementioned sliding operation of the user, the user's finger is likely to swipe the screen in the first area. Therefore, it can be considered that the screen in the first area may be touched and contaminated in this case.
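For a curved or multi-segment track, the band-shaped area above is a buffer of half-width z around the polyline, and the overlap check amounts to asking whether any point of interest lies within z of the track. A minimal sketch, with a hypothetical track and camera position:

```python
import math

def point_seg_dist(px, py, ax, ay, bx, by):
    """Distance from point (px, py) to the segment (ax, ay)-(bx, by)."""
    dx, dy = bx - ax, by - ay
    if dx == 0 and dy == 0:
        return math.hypot(px - ax, py - ay)
    # Project the point onto the segment, clamped to its endpoints.
    t = max(0.0, min(1.0, ((px - ax) * dx + (py - ay) * dy) / (dx * dx + dy * dy)))
    return math.hypot(px - (ax + t * dx), py - (ay + t * dy))

def band_covers_point(track, z, px, py):
    """True if (px, py) lies inside the band of half-width z around the polyline."""
    return any(point_seg_dist(px, py, *a, *b) <= z
               for a, b in zip(track, track[1:]))

# Hypothetical track moving down-right then right; camera center at (540, 40).
track = [(100, 20), (300, 60), (560, 60)]
print(band_covers_point(track, z=30, px=540, py=40))  # True: the band reaches the camera
```

In practice the first area screen would be sampled at several points (e.g. its corners and center) rather than a single point.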
  • Figures 9a and 9b show an exemplary schematic diagram of the detection method.
  • the upper left corner of the screen is taken as the coordinate origin (0,0)
  • the x-axis is positive from the coordinate origin to the right.
  • the value range of the abscissa is 0~X
  • the positive direction of the y-axis is downward from the origin of the coordinate
  • the value range of the ordinate of the screen is 0~Y.
  • the user can usually short-press or long-press (touch the control for longer than a set time) the icon of an application, and the location of the application corresponds to the coordinates (x5, y5).
  • the terminal device can detect that the user's touch operation is a short press or a long press on a circular, square, or irregular area centered on the coordinates (x5, y5).
  • the terminal device determines a range with the coordinates (x5, y5) as the center: taking the set distance r (the setting of r can refer to the area of a person's finger or the area of a person's fingertip) as the radius, a circular area is obtained.
  • the terminal device determines whether there is an intersection between the circular area and the screen of the first area, that is, whether the above-mentioned range will cover the screen of the first area. If the foregoing two overlap, it means that the user's finger is likely to touch the screen of the first area during the above-mentioned click operation. Therefore, it can be considered that the screen of the first area may be touched and thus polluted in this case.
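The circle-versus-rectangle intersection used for press operations can be sketched as below; the radius and the `FIRST_AREA` bounds are illustrative assumptions, not values from the embodiment.

```python
def circle_rect_intersect(cx, cy, r, rect):
    """True if the circle centered at (cx, cy) with radius r overlaps the
    rectangle (left, top, right, bottom)."""
    left, top, right, bottom = rect
    # Clamp the circle center to the rectangle, then compare the gap with r.
    nx = min(max(cx, left), right)
    ny = min(max(cy, top), bottom)
    return (cx - nx) ** 2 + (cy - ny) ** 2 <= r * r

FIRST_AREA = (500, 0, 580, 80)  # hypothetical first-area screen bounds

print(circle_rect_intersect(cx=470, cy=60, r=40, rect=FIRST_AREA))   # True: press near the camera
print(circle_rect_intersect(cx=200, cy=300, r=40, rect=FIRST_AREA))  # False: press far away
```

The same test applies to the hover-point check later in the document, since the hover point also defines a circular touchable area.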
  • the above content exemplarily describes several methods for detecting whether the first area screen is likely to be touched, but this application does not specifically limit the implementation, including the method of determining the above range.
  • the intersection between the rectangular, band-shaped, or circular area obtained by the terminal device and the first area screen refers to an overlap between the areas where the two are located; the range and area of the overlap are not limited. That is, as long as the two overlap, it is considered that the user's finger is likely to touch the first area screen during the corresponding operation.
  • in floating-touch technology, a mutual-capacitance sensor performs normal touch sensing, including multi-touch, while a self-capacitance sensor detects a finger hovering above the screen.
  • the terminal device can distinguish between floating touch and touch touch.
  • the distance sensor is also called a displacement sensor, which is a kind of sensor used to sense the distance between it and an object.
  • a touch screen equipped with a distance sensor can detect, through the distance sensor, the distance between a finger hovering over the screen and the screen. If the distance is less than a set threshold, the user is likely to start operating the touch screen. By setting the distance threshold, the terminal device can distinguish between floating touch and contact touch.
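The threshold comparison on the distance-sensor reading can be sketched as follows. The millimeter thresholds are illustrative assumptions; a real device would calibrate them per sensor.

```python
def classify(distance_mm, hover_threshold_mm=15.0, touch_threshold_mm=0.5):
    """Classify a distance-sensor reading as touch, hover, or none.
    Thresholds are hypothetical values for illustration only."""
    if distance_mm <= touch_threshold_mm:
        return "touch"
    if distance_mm <= hover_threshold_mm:
        return "hover"  # the user is likely about to operate the touch screen
    return "none"

print(classify(0.0))   # touch
print(classify(8.0))   # hover
print(classify(40.0))  # none
```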
  • Figure 10 shows an exemplary schematic diagram of the detection method.
  • the upper left corner of the screen is taken as the coordinate origin (0,0)
  • the x-axis is positive from the coordinate origin to the right.
  • the value range of the abscissa is 0~X
  • the positive direction of the y-axis is downward from the origin of the coordinate
  • the value range of the ordinate of the screen is 0~Y.
  • the terminal device provides a sensing function. When the user's finger is close to the screen of the terminal device, based on the above floating-touch technology or a distance sensor, the terminal device can detect the position on the screen corresponding to the finger hovering above it (the hover point).
  • FIG. 11 shows an exemplary schematic diagram of a user's finger hovering above the screen.
  • the user's right index finger is hovering above a certain position on the screen of the terminal device, and this position may be called a hovering point.
  • Its coordinates are (x6, y6).
  • the terminal device takes the coordinates (x6, y6) as the center of a circle and takes the set distance r (the setting of r can refer to the area of a person's finger or the area of a person's fingertip) as the radius to obtain a circular area.
  • the terminal device judges whether there is an intersection between the circular area and the screen of the first area, that is, whether the above-mentioned range will cover the screen of the first area.
  • if the foregoing two overlap, it means that the user is likely to operate the terminal device from the position of the hover point, and the operating range may be near the hover point, so that the finger is likely to touch the first area screen. Therefore, it can be considered that the first area screen may be touched and contaminated in this case.
  • the above content exemplarily describes a method for detecting whether the screen in the first area has the possibility of being touched, but this application does not specifically limit the method for determining the above range.
  • the intersection between the circular area obtained by the above-mentioned terminal device and the screen of the first area refers to whether there is overlap between the areas where the two are located, and the range and area of the overlap are not limited, that is, as long as the two have an intersection (overlap) It is believed that the user's finger is likely to touch the screen in the first area during the operation after hovering.
  • the distance sensor is also called a displacement sensor, which is a kind of sensor used to sense the distance between it and an object.
  • the terminal device can detect the distance between the finger hovering on the screen and the screen. If the distance is less than the set threshold, the user is likely to start operations on the touch screen.
  • the terminal device can determine the position on the screen corresponding to the hovering finger (the hover point) and determine a range with the hover point as the center; for example, taking the set distance r (the setting of r can refer to the area of a person's finger or the area of a person's fingertip) as the radius, a circular area is obtained.
  • the terminal device determines whether there is an intersection between the circular area and the first area screen, that is, whether the above range covers the first area screen. If the two overlap, it means that the user is likely to operate the terminal device from the position of the hover point, and the operating range may be near the hover point, so that the finger is likely to touch the first area screen. Therefore, it can be considered that the first area screen may be touched and contaminated in this case.
  • Step 402 When it is determined that the screen in the first area is likely to be touched, a reminder is given.
  • in step 401, when the terminal device detects that the first area screen is likely to be touched, it can give a reminder to inform the user which area on the screen (that is, the first area screen) covers the camera, so that the user can be cautious and avoid touching the first area screen as much as possible.
  • reminders include:
  • FIG. 12 shows an exemplary schematic diagram of the reminding method.
  • the terminal device can clearly frame the first area screen on the screen, for example, a rectangle filled with diagonal lines.
  • FIG. 13 shows an exemplary schematic diagram of the reminding method. As shown in FIG. 13, the terminal device can show the first area screen in a color that is different from other areas on the screen, for example, the first area screen is framed by a red rectangle .
  • FIG. 14 shows an exemplary schematic diagram of the reminding method. As shown in FIG. 14, the terminal device can clearly frame the position corresponding to the camera on the screen, for example, using a circular solid line. The user can know that the area framed by the image is the first area screen by seeing the pattern displayed on the screen.
  • FIG. 15 shows an exemplary schematic diagram of the reminding method. As shown in FIG. 15, the terminal device can display a text reminder while the first area screen is framed on the screen by a rectangle filled with diagonal lines. The text may be, for example, "This is the camera area, please don't touch it". The user can know that the framed area is the first area screen after seeing the text displayed on the screen.
  • this application may also use other methods to remind the user of the location of the screen in the first area, which is not specifically limited.
  • when it is detected that the part of the screen covering the camera is likely to be touched by the user's finger, the terminal device can give a prompt to inform the user of the location of the camera, so that the user can operate cautiously and avoid touching the part of the screen covering the camera as much as possible. This reduces the possibility and frequency of contamination of the part of the screen covering the camera and improves photographing clarity.
  • the disclosed system, device, and method can be implemented in other ways.
  • the device embodiments described above are only illustrative.
  • the division of the units is only a logical function division; in actual implementation, there may be other division manners. For example, multiple units or components may be combined or integrated into another system, or some features may be ignored or not implemented.
  • the displayed or discussed mutual coupling or direct coupling or communication connection may be indirect coupling or communication connection through some interfaces, devices or units, and may also be electrical, mechanical or other forms of connection.
  • the units described as separate components may or may not be physically separated, and the components displayed as units may or may not be physical units, that is, they may be located in one place, or they may be distributed on multiple network units. Some or all of the units may be selected according to actual needs to achieve the objectives of the solutions of the embodiments of the present application.
  • the functional units in the various embodiments of the present application may be integrated into one processing unit, or each unit may exist alone physically, or two or more units may be integrated into one unit.
  • the above-mentioned integrated unit can be implemented in the form of hardware or software functional unit.
  • the integrated unit is implemented in the form of a software functional unit and sold or used as an independent product, it can be stored in a computer readable storage medium.
  • the technical solutions of this application essentially, or the part contributing to the existing technology, or all or part of the technical solutions, may be embodied in the form of a software product. The computer software product is stored in a storage medium and includes several instructions to enable a computer device (which may be a personal computer, a server, or a network device, etc.) to execute all or part of the steps of the method described in each embodiment of this application.
  • the aforementioned storage media include: a USB flash drive, a removable hard disk, a read-only memory (ROM), a random access memory (RAM), a magnetic disk, an optical disc, or other media that can store program code.


Abstract

A mistaken-touch reminder method and apparatus for a terminal device with an under-display camera, applied to a terminal device, the terminal device comprising an under-display camera, and the screen of the terminal device comprising a first area screen covering the under-display camera. The method comprises: detecting whether the first area screen on the screen is likely to be touched (401); and, when it is determined that the first area screen is likely to be touched, giving a reminder (402). The method can reduce the possibility and frequency of contamination of the part of the screen covering the camera, and improve photographing clarity.

Description

Mistaken-touch reminder method and apparatus for a terminal device with an under-display camera
This application claims priority to Chinese Patent Application No. 202010246250.9, filed with the China National Intellectual Property Administration on March 31, 2020 and entitled "Mistaken-touch reminder method and apparatus for a terminal device with an under-display camera", which is incorporated herein by reference in its entirety.
Technical Field
This application relates to under-display camera terminal technologies, and in particular, to a mistaken-touch reminder method and apparatus for a terminal device with an under-display camera.
Background
In recent years, the screen design of mobile terminals (smartphones, tablets, and the like) has been trending toward high screen-to-body ratios and even full screens. To achieve a full-screen effect, the industry has produced several hidden front camera designs, for example, mechanical pop-up cameras, motorized pop-up cameras, and slide-screen cameras. However, the complex mechanical structures in these designs lead to high design costs and short lifespans, and are prone to water ingress and dust accumulation, so they are not the ideal full-screen solution.
The related art proposes an under-display camera (UDC, or under-screen camera, USC) design, in which the front camera is hidden below the screen of the terminal device. The screen of the terminal device is divided into two parts: the first part of the screen faces the front camera and adopts a transparent organic light-emitting diode (OLED) screen. When a photograph is taken, the first part of the screen is transparent and does not obstruct the camera's view; when no photograph is being taken, the first part of the screen is an ordinary display and displays images normally together with the second part of the screen. The second part of the screen is the screen area other than the first part and adopts an ordinary OLED screen for normal display. In this way, a true full screen can be achieved.
However, in a terminal device adopting the above UDC design, the first part of the screen is easily contaminated by, for example, fingerprints, sweat, and stains, which reduces photographing clarity.
Summary
This application provides a mistaken-touch reminder method and apparatus for a terminal device with an under-display camera, to reduce the possibility and frequency of contamination of the part of the screen covering the camera and improve photographing clarity.
According to a first aspect, this application provides a mistaken-touch reminder method for a terminal device with an under-display camera, applied to a terminal device, where the terminal device includes an under-display camera, and the screen of the terminal device includes a first area screen covering the under-display camera. The method includes: when it is detected that the first area screen is likely to be touched, displaying a mistaken-touch reminder on the screen.
In this embodiment, when it is detected that the part of the screen covering the camera is likely to be touched by the user's finger, the terminal device can give a prompt to inform the user of the location of the camera, so that the user can operate cautiously and avoid touching the part of the screen covering the camera as much as possible, thereby reducing the possibility and frequency of contamination of that part of the screen and improving photographing clarity.
In a possible implementation, detecting that the first area screen is likely to be touched includes: obtaining a touchable area according to a user operation, where the touchable area is an area on the screen of the terminal device that may be touched by the user; when the touchable area overlaps the first area screen, determining that the first area screen is likely to be touched; and when the touchable area does not overlap the first area screen, determining that the first area screen is not likely to be touched.
Obtaining the area that may be touched according to the user's actual operation can improve the accuracy of the mistaken-touch reminder.
In a possible implementation, obtaining the touchable area according to the user operation includes: obtaining a sliding track according to the user's touch operation on the screen of the terminal device; and determining the touchable area according to the sliding track. It should be noted that the sliding track may be the track produced by each operation of the user. For example, if the user's operation is sliding from left to right on the screen, the sliding track is a line from left to right; if the user's operation is a continuous curved slide on the screen, the sliding track is a curve.
In a possible implementation, the touch operation includes a sliding operation and/or a pressing operation, where the sliding operation includes a straight-line sliding operation in any direction and/or a curved sliding operation in any direction, and the pressing operation includes a tap operation and/or a long-press operation.
In a possible implementation, determining the touchable area according to the sliding track includes: taking the sliding track corresponding to the sliding operation as a center line and expanding outward on both sides by a set length to obtain the touchable area; or taking the sliding track corresponding to the pressing operation as a circle center and taking a set length as a radius to obtain the touchable area.
In a possible implementation, obtaining the touchable area according to the user operation includes: obtaining a hover point according to the user's floating touch operation near the screen of the terminal device, where the floating touch operation is sensed by a capacitive sensor or a distance sensor; and determining the touchable area according to the hover point.
In a possible implementation, determining the touchable area according to the hover point includes: taking the position on the screen of the terminal device corresponding to the hover point as a circle center and taking a set length as a radius to obtain the touchable area.
Considering the possibility that the part of the screen covering the under-display camera may be touched under various user operations improves the efficiency of mistaken-touch detection.
In a possible implementation, giving the reminder includes: displaying the edge of the first area screen in a manner distinct from other areas; or displaying the first area screen in a manner distinct from other areas; or displaying the position on the screen of the terminal device corresponding to the edge of the under-display camera in a manner distinct from other areas; or displaying the position on the screen of the terminal device corresponding to the under-display camera in a manner distinct from other areas. The manner distinct from other areas includes highlighting or a fill pattern.
In a possible implementation, giving the reminder further includes: using text on the screen of the terminal device to indicate the first area screen or the position on the screen of the terminal device corresponding to the under-display camera.
Reminding the user of the location of the under-display camera in multiple ways can reduce the possibility and frequency of contamination of the part of the screen covering the camera, improve photographing clarity, and provide diverse reminder methods.
According to a second aspect, this application provides a mistaken-touch reminder apparatus for a terminal device with an under-display camera, applied to a terminal device, where the terminal device includes an under-display camera, and the screen of the terminal device includes a first area screen covering the under-display camera. The apparatus includes: a processing module, configured to display a mistaken-touch reminder on the screen when it is detected that the first area screen is likely to be touched.
In a possible implementation, the processing module is specifically configured to: obtain a touchable area according to a user operation, where the touchable area is an area on the screen of the terminal device that may be touched by the user; when the touchable area overlaps the first area screen, determine that the first area screen is likely to be touched; and when the touchable area does not overlap the first area screen, determine that the first area screen is not likely to be touched.
In a possible implementation, the processing module is specifically configured to obtain a sliding track according to the user's touch operation on the screen of the terminal device, and determine the touchable area according to the sliding track.
In a possible implementation, the touch operation includes a sliding operation and/or a pressing operation, where the sliding operation includes a straight-line sliding operation in any direction and/or a curved sliding operation in any direction, and the pressing operation includes a tap operation and/or a long-press operation.
In a possible implementation, the processing module is specifically configured to take the sliding track corresponding to the sliding operation as a center line and expand outward on both sides by a set length to obtain the touchable area; or take the sliding track corresponding to the pressing operation as a circle center and take a set length as a radius to obtain the touchable area.
In a possible implementation, the processing module is specifically configured to obtain a hover point according to the user's floating touch operation near the screen of the terminal device, where the floating touch operation is sensed by a capacitive sensor or a distance sensor, and determine the touchable area according to the hover point.
In a possible implementation, the processing module is specifically configured to take the position on the screen of the terminal device corresponding to the hover point as a circle center and take a set length as a radius to obtain the touchable area.
In a possible implementation, the processing module is specifically configured to display the edge of the first area screen in a manner distinct from other areas; or display the first area screen in a manner distinct from other areas; or display the position on the screen of the terminal device corresponding to the edge of the under-display camera in a manner distinct from other areas; or display the position on the screen of the terminal device corresponding to the under-display camera in a manner distinct from other areas. The manner distinct from other areas includes highlighting or a fill pattern.
In a possible implementation, the processing module is further configured to use text on the screen of the terminal device to indicate the first area screen or the position on the screen of the terminal device corresponding to the under-display camera.
According to a third aspect, this application provides a terminal device, including: one or more processors; and a memory, configured to store one or more programs, where when the one or more programs are executed by the one or more processors, the one or more processors are enabled to implement the method according to any one of the first aspect.
According to a fourth aspect, this application provides a computer-readable storage medium, including a computer program, where when the computer program is executed on a computer, the computer is enabled to perform the method according to any one of the first aspect.
According to a fifth aspect, this application provides a computer program, where when the computer program is executed by a computer, the computer program is used to perform the method according to any one of the first aspect.
Brief Description of the Drawings
Fig. 1 is an exemplary schematic structural diagram of a terminal device 100;
Fig. 2 is an exemplary schematic front view of the screen of a terminal device;
Fig. 3 is an exemplary schematic structural diagram of the first area screen;
Fig. 4 is a flowchart of an embodiment of the mistaken-touch reminder method for a terminal device with an under-display camera of this application;
Figs. 5a-10 are exemplary schematic diagrams of detection methods;
Fig. 11 is an exemplary schematic diagram of a user's finger hovering above the screen;
Figs. 12-15 are exemplary schematic diagrams of reminder methods.
Detailed Description
To make the objectives, technical solutions, and advantages of this application clearer, the following clearly and completely describes the technical solutions in this application with reference to the accompanying drawings in this application. Apparently, the described embodiments are some but not all of the embodiments of this application. All other embodiments obtained by a person of ordinary skill in the art based on the embodiments of this application without creative effort shall fall within the protection scope of this application.
In the specification, embodiments, claims, and accompanying drawings of this application, the terms "first", "second", and the like are merely intended for distinguishing purposes and shall not be understood as indicating or implying relative importance or order. In addition, the terms "include" and "have" and any variants thereof are intended to cover non-exclusive inclusion, for example, inclusion of a series of steps or units. A method, system, product, or device is not necessarily limited to those steps or units expressly listed, but may include other steps or units not expressly listed or inherent to such a process, method, product, or device.
It should be understood that in this application, "at least one (item)" means one or more, and "a plurality of" means two or more. "And/or" describes an association relationship between associated objects and indicates that three relationships may exist. For example, "A and/or B" may indicate the following three cases: only A exists, only B exists, and both A and B exist, where A and B may be singular or plural. The character "/" generally indicates an "or" relationship between the associated objects. "At least one of the following items" or a similar expression means any combination of these items, including a single item or any combination of a plurality of items. For example, at least one of a, b, or c may indicate a, b, c, "a and b", "a and c", "b and c", or "a and b and c", where a, b, and c may be singular or plural.
The terminal device in this application may also be referred to as user equipment (UE) and may be deployed on land (indoor or outdoor, handheld or vehicle-mounted), on water (for example, on a ship), or in the air (for example, on an airplane, a balloon, or a satellite). The terminal device may be a mobile phone, a tablet (pad), a virtual reality (VR) device, an augmented reality (AR) device, a wireless device in a smart home, or the like; this is not limited in this application. In this application, the foregoing terminal devices and chips that can be disposed in the foregoing terminal devices are collectively referred to as terminal devices.
Fig. 1 is a schematic structural diagram of a terminal device 100.
The terminal device 100 may include a processor 110, an external memory interface 120, an internal memory 121, a universal serial bus (USB) interface 130, a charging management module 140, a power management module 141, a battery 142, an antenna 1, an antenna 2, a mobile communication module 150, a wireless communication module 160, an audio module 170, a speaker 170A, a receiver 170B, a microphone 170C, a headset jack 170D, a sensor module 180, buttons 190, a motor 191, an indicator 192, a camera 193, a display screen 194, a subscriber identification module (SIM) card interface 195, and the like. The sensor module 180 may include a pressure sensor 180A, a gyroscope sensor 180B, a barometric pressure sensor 180C, a magnetic sensor 180D, an acceleration sensor 180E, a distance sensor 180F, a proximity light sensor 180G, a fingerprint sensor 180H, a temperature sensor 180J, a touch sensor 180K, an ambient light sensor 180L, a bone conduction sensor 180M, and the like.
It can be understood that the structure illustrated in this embodiment of the present invention does not constitute a specific limitation on the terminal device 100. In some other embodiments of this application, the terminal device 100 may include more or fewer components than those shown in the figure, or combine some components, or split some components, or have a different component arrangement. The illustrated components may be implemented by hardware, software, or a combination of software and hardware.
The processor 110 may include one or more processing units. For example, the processor 110 may include an application processor (AP), a modem processor, a graphics processing unit (GPU), an image signal processor (ISP), a controller, a video codec, a digital signal processor (DSP), a baseband processor, and/or a neural-network processing unit (NPU). Different processing units may be independent devices or may be integrated into one or more processors.
The controller may generate operation control signals based on instruction operation codes and timing signals to control instruction fetching and execution.
A memory may further be disposed in the processor 110 to store instructions and data. In some embodiments, the memory in the processor 110 is a cache, which may store instructions or data that the processor 110 has just used or uses cyclically. If the processor 110 needs to use the instructions or data again, it can call them directly from the memory. This avoids repeated access, reduces the waiting time of the processor 110, and thus improves system efficiency.
In some embodiments, the processor 110 may include one or more interfaces. The interfaces may include an inter-integrated circuit (I2C) interface, an inter-integrated circuit sound (I2S) interface, a pulse code modulation (PCM) interface, a universal asynchronous receiver/transmitter (UART) interface, a mobile industry processor interface (MIPI), a general-purpose input/output (GPIO) interface, a subscriber identity module (SIM) interface, and/or a universal serial bus (USB) interface.
The terminal device 100 may implement a photographing function by using the ISP, the camera 193, the video codec, the GPU, the display screen 194, the application processor, and the like.
The ISP is configured to process data fed back by the camera 193. For example, during photographing, the shutter is opened, light is transmitted to the camera's photosensitive element through the lens, an optical signal is converted into an electrical signal, and the photosensitive element transmits the electrical signal to the ISP for processing, converting it into an image visible to the naked eye. The ISP can also perform algorithm optimization on image noise, brightness, and skin tone, and can optimize parameters such as exposure and color temperature of the photographing scene. In some embodiments, the ISP may be disposed in the camera 193.
The camera 193 is configured to capture static images or videos. An optical image of an object is generated through the lens and projected onto the photosensitive element. The photosensitive element may be a charge coupled device (CCD) or a complementary metal-oxide-semiconductor (CMOS) phototransistor. The photosensitive element converts the optical signal into an electrical signal and then transmits the electrical signal to the ISP to be converted into a digital image signal. The ISP outputs the digital image signal to the DSP for processing, and the DSP converts the digital image signal into an image signal in a standard format such as RGB or YUV. In some embodiments, the terminal device 100 may include 1 or N cameras 193, where N is a positive integer greater than 1.
The digital signal processor is configured to process digital signals, including digital image signals and other digital signals. For example, when the terminal device 100 performs frequency selection, the digital signal processor is configured to perform a Fourier transform or the like on frequency energy.
The video codec is configured to compress or decompress digital video. The terminal device 100 may support one or more video codecs, so that the terminal device 100 can play or record videos in multiple encoding formats, for example, moving picture experts group (MPEG) 1, MPEG2, MPEG3, and MPEG4.
The NPU is a neural-network (NN) computing processor that processes input information quickly by drawing on biological neural network structures, for example, the transfer pattern between human brain neurons, and can also continuously self-learn. Applications such as intelligent cognition of the terminal device 100, for example, image recognition, facial recognition, speech recognition, and text understanding, can be implemented through the NPU.
The external memory interface 120 may be configured to connect an external memory card, for example, a Micro SD card, to expand the storage capability of the terminal device 100. The external memory card communicates with the processor 110 through the external memory interface 120 to implement a data storage function, for example, saving files such as music and videos on the external memory card.
The internal memory 121 may be configured to store computer-executable program code, where the executable program code includes instructions. The internal memory 121 may include a program storage area and a data storage area. The program storage area may store an operating system, an application required by at least one function (for example, a sound playing function or an image playing function), and the like. The data storage area may store data (for example, audio data and a phone book) created during use of the terminal device 100. In addition, the internal memory 121 may include a high-speed random access memory, and may further include a nonvolatile memory, for example, at least one magnetic disk storage device, a flash memory device, or a universal flash storage (UFS). The processor 110 executes various functional applications and data processing of the terminal device 100 by running instructions stored in the internal memory 121 and/or instructions stored in a memory disposed in the processor.
The terminal device 100 may implement audio functions, for example, music playing and recording, by using the audio module 170, the speaker 170A, the receiver 170B, the microphone 170C, the headset jack 170D, the application processor, and the like.
The audio module 170 is configured to convert digital audio information into an analog audio signal output, and is also configured to convert an analog audio input into a digital audio signal. The audio module 170 may further be configured to encode and decode audio signals. In some embodiments, the audio module 170 may be disposed in the processor 110, or some functional modules of the audio module 170 may be disposed in the processor 110.
The speaker 170A, also referred to as a "loudspeaker", is configured to convert an audio electrical signal into a sound signal. The terminal device 100 can listen to music or a hands-free call through the speaker 170A.
The pressure sensor 180A is configured to sense a pressure signal and can convert the pressure signal into an electrical signal. In some embodiments, the pressure sensor 180A may be disposed on the display screen 194. There are many types of pressure sensors 180A, such as resistive, inductive, and capacitive pressure sensors. A capacitive pressure sensor may include at least two parallel plates with a conductive material. When a force acts on the pressure sensor 180A, the capacitance between the electrodes changes, and the terminal device 100 determines the intensity of the pressure based on the change in capacitance. When a touch operation acts on the display screen 194, the terminal device 100 detects the intensity of the touch operation based on the pressure sensor 180A, and may also calculate the touch position based on the detection signal of the pressure sensor 180A. In some embodiments, touch operations that act on the same touch position but with different intensities may correspond to different operation instructions. For example, when a touch operation whose intensity is less than a first pressure threshold acts on the SMS application icon, an instruction to view an SMS message is executed; when a touch operation whose intensity is greater than or equal to the first pressure threshold acts on the SMS application icon, an instruction to create a new SMS message is executed.
The gyroscope sensor 180B may be configured to determine the motion posture of the terminal device 100. In some embodiments, the angular velocities of the terminal device 100 around three axes (that is, the x, y, and z axes) may be determined through the gyroscope sensor 180B. The gyroscope sensor 180B may be used for image stabilization during photographing. For example, when the shutter is pressed, the gyroscope sensor 180B detects the shake angle of the terminal device 100, calculates the distance the lens module needs to compensate based on the angle, and allows the lens to counteract the shake of the terminal device 100 through reverse motion, thereby achieving image stabilization. The gyroscope sensor 180B may also be used in navigation and motion-sensing game scenarios.
The barometric pressure sensor 180C is configured to measure barometric pressure. In some embodiments, the terminal device 100 calculates altitude based on the barometric pressure value measured by the barometric pressure sensor 180C, to assist in positioning and navigation.
The magnetic sensor 180D includes a Hall sensor. The terminal device 100 may use the magnetic sensor 180D to detect the opening and closing of a flip cover. In some embodiments, when the terminal device 100 is a flip phone, the terminal device 100 may detect the opening and closing of the flip based on the magnetic sensor 180D, and set features such as automatic unlocking upon flipping open based on the detected open or closed state of the cover or flip.
The acceleration sensor 180E can detect the magnitude of the acceleration of the terminal device 100 in various directions (generally along three axes). When the terminal device 100 is stationary, the magnitude and direction of gravity can be detected. It can also be used to recognize the posture of the terminal device and is applied to applications such as landscape/portrait switching and pedometers.
The distance sensor 180F is configured to measure distance. The terminal device 100 may measure distance by infrared or laser. In some embodiments, in a photographing scenario, the terminal device 100 may use the distance sensor 180F to measure distance to achieve fast focusing.
The proximity light sensor 180G may include, for example, a light-emitting diode (LED) and a photodetector such as a photodiode. The light-emitting diode may be an infrared light-emitting diode. The terminal device 100 emits infrared light outward through the light-emitting diode and uses the photodiode to detect infrared light reflected from nearby objects. When sufficient reflected light is detected, it can be determined that there is an object near the terminal device 100; when insufficient reflected light is detected, the terminal device 100 can determine that there is no object nearby. The terminal device 100 may use the proximity light sensor 180G to detect that the user is holding the terminal device 100 close to the ear during a call, so as to automatically turn off the screen to save power. The proximity light sensor 180G may also be used in leather case mode and pocket mode for automatic unlocking and screen locking.
The ambient light sensor 180L is configured to sense ambient light brightness. The terminal device 100 may adaptively adjust the brightness of the display screen 194 based on the sensed ambient light brightness. The ambient light sensor 180L may also be used to automatically adjust the white balance during photographing, and may cooperate with the proximity light sensor 180G to detect whether the terminal device 100 is in a pocket, to prevent accidental touches.
The fingerprint sensor 180H is configured to collect fingerprints. The terminal device 100 may use the collected fingerprint characteristics to implement fingerprint unlocking, application lock access, fingerprint photographing, fingerprint-based call answering, and the like.
温度传感器180J用于检测温度。在一些实施例中,终端设备100利用温度传感器180J检测的温度,执行温度处理策略。例如,当温度传感器180J上报的温度超过阈值,终端设备100执行降低位于温度传感器180J附近的处理器的性能,以便降低功耗实施热保护。在另一些实施例中,当温度低于另一阈值时,终端设备100对电池142加热,以避免低温导致终端设备100异常关机。在其他一些实施例中,当温度低于又一阈值时,终端设备100对电池142的输出电压执行升压,以避免低温导致的异常关机。
触摸传感器180K,也称“触控器件”。触摸传感器180K可以设置于显示屏194,由触摸传感器180K与显示屏194组成触摸屏,也称“触控屏”。触摸传感器180K用于检测作用于其上或附近的触摸操作。触摸传感器可以将检测到的触摸操作传递给应用处理器,以确定触摸事件类型。可以通过显示屏194提供与触摸操作相关的视觉输出。在另一些实施例中,触摸传感器180K也可以设置于终端设备100的表面,与显示屏194所处的位置不同。
骨传导传感器180M可以获取振动信号。在一些实施例中,骨传导传感器180M可以获取人体声部振动骨块的振动信号。骨传导传感器180M也可以接触人体脉搏,接收血压跳动信号。在一些实施例中,骨传导传感器180M也可以设置于耳机中,结合成骨传导耳机。音频模块170可以基于所述骨传导传感器180M获取的声部振动骨块的振动信号,解析出语音信号,实现语音功能。应用处理器可以基于所述骨传导传感器180M获取的血压跳动信号解析心率信息,实现心率检测功能。
Those skilled in the art will understand that the terminal device 100 may include fewer or more components than those shown in FIG. 1; the terminal device shown in FIG. 1 includes only the components most relevant to the implementations disclosed in this application.
On the touchscreen 1092 of the terminal device shown in FIG. 1, the front camera adopts an under-display camera (UDC) design to achieve a full screen. FIG. 2 shows an exemplary schematic front view of the screen of the terminal device. As shown in FIG. 2, the front camera is hidden below the screen of the terminal device, and on this basis the screen is divided into two areas. The first screen area covers the front camera and uses a transparent OLED panel: when a photo is being taken, the first screen area becomes transparent and does not obstruct the camera's view; when no photo is being taken, the first screen area behaves like an ordinary display and shows content normally together with the second screen area. The second screen area is the rest of the screen outside the first area; it uses an ordinary OLED panel and simply displays content normally.
FIG. 2 shows a front view of a screen using a UDC; a side view of the first screen area of that screen is shown in FIG. 3, which is an exemplary structural diagram of the first screen area. The first screen area includes three layers: from the outside in, the outermost layer is a transparent cathode, the middle layer is made of a transparent organic light-emitting material, and the innermost layer is a transparent anode. When the front camera is not taking a photo, the transparent organic light-emitting material in the first screen area emits light normally, and the emitted light (OLED light) passes through the transparent cathode of the first screen area to reach the user's eyes, so that the user can see the content displayed in the first screen area. When the front camera is taking a photo, the transparent organic light-emitting material in the first screen area does not emit light, and external light can pass through the transparent cathode, the transparent organic light-emitting material, and the transparent anode of the first screen area to reach the front camera, so that a photo can be taken. It should be noted that FIG. 2 and FIG. 3 are merely one example of a terminal device. If the terminal device has multiple front cameras, a transparent OLED panel as shown in FIG. 3 may be provided above each camera, or a single transparent OLED panel as shown in FIG. 3 may be provided over one larger area that covers all the front cameras. The transparent OLED panel shown in FIG. 3 may also adopt other structures. This application does not specifically limit this.
FIG. 4 is a flowchart of an embodiment of the mistouch reminder method for a terminal device with an under-display camera according to this application. As shown in FIG. 4, the method of this embodiment may be applied to the terminal device shown in FIG. 1, whose screen may have the structure shown in FIG. 2 and FIG. 3, adopting a UDC design so that a full screen can be achieved. The mistouch reminder method for a terminal device with an under-display camera may include the following steps.
Step 401: Detect whether the first screen area of the screen is likely to be touched.
The first screen area may be, for example, the transparent OLED panel on the screen of the terminal device shown in FIG. 2 and FIG. 3. Because a full screen is used, the user must input commands through the touch capability of the screen, so any area of the screen may be touched while the user is using the terminal device. For example, taps, pull-downs, and swipes may occur anywhere on the screen, which causes touched areas to be contaminated by sweat or residue on the user's hands. A contaminated screen makes the displayed interface hard to see clearly. This is especially harmful when the first screen area is contaminated: during photographing it still covers the camera, and even though the panel itself becomes transparent, a contaminated panel is just like a contaminated camera lens, causing glare, blur, or even visible obstructions in the captured image.
FIGS. 5a and 5b show an exemplary schematic diagram of the detection method. As shown in FIGS. 5a and 5b, the upper-left corner of the screen is taken as the coordinate origin (0,0); the positive x-axis points right from the origin, with the screen's horizontal coordinate ranging over 0~X, and the positive y-axis points down, with the vertical coordinate ranging over 0~Y. To open the pull-down menu at the top of the screen, the user typically swipes down a distance L1 starting from some position at the top of the screen (with horizontal coordinate x1 and vertical coordinate y1, where x1∈(0,X) and y1 is 0 or a value close to 0). Based on this operation, the terminal device can detect that the user's touch starts at (x1, y1), ends at (x1, y1+L1), and that the sliding trajectory is vertically downward. Using this trajectory as a reference, the terminal device determines a region: taking the line segment between (x1, y1) and (x1, y1+L1) as the centerline and extending a set distance x to each side (x may be set with reference to the width of a finger or the area of a fingertip pad), it obtains a rectangular region whose four vertices are (x1-x, y1), (x1+x, y1), (x1-x, y1+L1), and (x1+x, y1+L1). The terminal device then determines whether this rectangular region intersects the first screen area, that is, whether the region covers the first screen area. If the two intersect, the user's finger is likely to sweep across the first screen area during the swipe, so in this case the first screen area is considered likely to be touched and thus contaminated.
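The rectangle-and-intersection check above can be sketched as follows; the screen coordinates, the camera-area rectangle, and the half-width value are illustrative assumptions, not values from this application.

```python
from dataclasses import dataclass

@dataclass
class Rect:
    left: float
    top: float
    right: float
    bottom: float

    def intersects(self, other: "Rect") -> bool:
        # Axis-aligned rectangles overlap unless one lies entirely to one
        # side of the other.
        return not (self.right < other.left or other.right < self.left
                    or self.bottom < other.top or other.bottom < self.top)

def downward_swipe_region(x1: float, y1: float, length: float, half_width: float) -> Rect:
    """Rectangle obtained by widening the vertical trajectory (x1, y1) -> (x1, y1+length)."""
    return Rect(x1 - half_width, y1, x1 + half_width, y1 + length)

# Illustrative values: camera area near the top center of a 1080-px-wide screen.
camera_area = Rect(490, 20, 590, 120)
swipe = downward_swipe_region(x1=540, y1=0, length=300, half_width=40)
likely_touched = swipe.intersects(camera_area)  # True: the swipe crosses the camera area
```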
FIGS. 6a and 6b show an exemplary schematic diagram of the detection method, using the same coordinate system: the upper-left corner of the screen is the origin (0,0), the positive x-axis points right with horizontal coordinates ranging over 0~X, and the positive y-axis points down with vertical coordinates ranging over 0~Y. To exit the current application, the user typically swipes right a distance L2 starting from some position on the left edge of the screen (with horizontal coordinate x2 and vertical coordinate y2, where x2 is 0 or a value close to 0 and y2∈(0,Y)). Based on this operation, the terminal device can detect that the user's touch starts at (x2, y2), ends at (x2+L2, y2), and that the sliding trajectory is horizontal toward the right. Using this trajectory as a reference, the terminal device determines a region: taking the line segment between (x2, y2) and (x2+L2, y2) as the centerline and extending a set distance y upward and downward (y may be set with reference to finger width or fingertip-pad area), it obtains a rectangular region with vertices (x2, y2-y), (x2, y2+y), (x2+L2, y2-y), and (x2+L2, y2+y). The terminal device then determines whether this rectangular region intersects the first screen area, that is, whether the region covers the first screen area. If the two intersect, the user's finger is likely to sweep across the first screen area during the swipe, so in this case the first screen area is considered likely to be touched and thus contaminated.
FIGS. 7a and 7b show an exemplary schematic diagram of the detection method, using the same coordinate system as above. While browsing photos, selecting applications, and the like, to switch the current photo or interface, the user typically swipes left a distance L3 starting from some position near the right edge of the screen (with horizontal coordinate x3 and vertical coordinate y3, where x3∈(0,X) and y3∈(0,Y)). Based on this operation, the terminal device can detect that the touch starts at (x3, y3), ends at (x3-L3, y3), and that the sliding trajectory is horizontal toward the left. Using this trajectory as a reference, the terminal device determines a region: taking the line segment between (x3, y3) and (x3-L3, y3) as the centerline and extending a set distance y upward and downward (y may be set with reference to finger width or fingertip-pad area), it obtains a rectangular region with vertices (x3, y3-y), (x3, y3+y), (x3-L3, y3-y), and (x3-L3, y3+y). The terminal device then determines whether this rectangular region intersects the first screen area, that is, whether the region covers the first screen area. If the two intersect, the user's finger is likely to sweep across the first screen area during the swipe, so in this case the first screen area is considered likely to be touched and thus contaminated.
FIGS. 8a and 8b show an exemplary schematic diagram of the detection method, using the same coordinate system as above. When playing a game, the user drags a game character around the game scene and may swipe in arbitrary directions on the screen: starting from some position (with horizontal coordinate x4 and vertical coordinate y4, where x4∈(0,X) and y4∈(0,Y)), swiping in one direction (for example, toward the lower right) a distance L4, and then in another direction (for example, to the right) a distance L5. Based on this operation, the terminal device can detect that the touch starts at (x4, y4), passes through (x41, y41), and ends at (x42, y42), with a trajectory first toward the lower right and then to the right. Using this trajectory as a reference, the terminal device determines a region: taking the polyline connecting (x4, y4), (x41, y41), and (x42, y42) as the centerline and extending a set distance z to each side (z may be set with reference to finger width or fingertip-pad area), it obtains a band-shaped region. The terminal device then determines whether this band-shaped region intersects the first screen area, that is, whether the region covers the first screen area. If the two intersect, the user's finger is likely to sweep across the first screen area during the swipe, so in this case the first screen area is considered likely to be touched and thus contaminated.
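For the band-shaped region around a multi-segment trajectory, an equivalent test (one possible implementation, not a method prescribed by this application) is to check whether a given point lies within the set distance z of any trajectory segment; the trajectory and test points below are illustrative assumptions.

```python
import math

def point_segment_dist(px, py, ax, ay, bx, by):
    """Distance from point (px, py) to the segment (ax, ay)-(bx, by)."""
    dx, dy = bx - ax, by - ay
    seg_len_sq = dx * dx + dy * dy
    if seg_len_sq == 0.0:
        return math.hypot(px - ax, py - ay)
    # Project the point onto the segment, clamping to the segment's ends.
    t = max(0.0, min(1.0, ((px - ax) * dx + (py - ay) * dy) / seg_len_sq))
    return math.hypot(px - (ax + t * dx), py - (ay + t * dy))

def band_covers_point(trajectory, z, px, py):
    """True if (px, py) lies in the band of half-width z around the polyline."""
    return any(point_segment_dist(px, py, ax, ay, bx, by) <= z
               for (ax, ay), (bx, by) in zip(trajectory, trajectory[1:]))

# Illustrative trajectory: first toward the lower right, then to the right.
path = [(100, 50), (300, 250), (600, 250)]
near_path = band_covers_point(path, 40, 400, 260)    # 10 px from the second segment
far_from_path = band_covers_point(path, 40, 540, 70)  # well outside the band
```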
FIGS. 9a and 9b show an exemplary schematic diagram of the detection method, using the same coordinate system as above. To open an application or trigger deletion of an application, the user typically presses briefly or long-presses (taps the control for longer than a set duration) the application's icon, whose position corresponds to coordinates (x5, y5). Based on this operation, the terminal device can detect the user's touch as a brief press or long press on a circular, square, or irregular region centered at (x5, y5). The terminal device then determines a region: taking (x5, y5) as the center and a set distance r (which may be set with reference to fingertip or fingertip-pad area) as the radius, it obtains a circular region. The terminal device then determines whether this circular region intersects the first screen area, that is, whether the region covers the first screen area. If the two intersect, the user's finger is likely to touch the first screen area during the press, so in this case the first screen area is considered likely to be touched and thus contaminated.
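The press check above reduces to a circle-rectangle overlap test; a standard way to implement it (the coordinates and radius below are illustrative assumptions) is to clamp the circle center to the rectangle and compare the remaining distance with the radius.

```python
def circle_intersects_rect(cx, cy, r, left, top, right, bottom):
    """True if the circle at (cx, cy) with radius r overlaps the rectangle."""
    # The closest point of the rectangle to the circle center is obtained by
    # clamping the center's coordinates to the rectangle's extent.
    nearest_x = min(max(cx, left), right)
    nearest_y = min(max(cy, top), bottom)
    return (cx - nearest_x) ** 2 + (cy - nearest_y) ** 2 <= r * r

# Illustrative: long-press on an icon at (520, 150) with fingertip radius 45,
# camera area occupying the rectangle (490, 20)-(590, 120).
likely_touched = circle_intersects_rect(520, 150, 45, 490, 20, 590, 120)  # True
```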
It should be noted that the above exemplifies several methods for detecting whether the first screen area is likely to be touched; this application does not specifically limit the implementation, including how the above regions are determined. In addition, the intersection between the rectangular, band-shaped, or circular regions obtained by the terminal device and the first screen area refers to whether the two regions overlap; the extent and area of the overlap are not limited. That is, as long as the two intersect (overlap), it is considered that the user's finger is likely to touch the first screen area during the corresponding operation.
At present, a common touchscreen has only mutual-capacitance sensors, which implement multi-point touch detection. A touchscreen that supports hover touch has two kinds of capacitive sensors: mutual-capacitance sensors and self-capacitance sensors. The electric field of a mutual-capacitance sensor is small and its signal strength is low, so it cannot sense very weak signals, whereas a self-capacitance sensor produces a stronger signal than mutual capacitance and can detect a finger farther away, with a detection range of up to 20 mm; that is, a self-capacitance sensor can detect a finger hovering 20 mm above the screen. In a touchscreen that supports hover touch, the mutual-capacitance sensors handle normal contact sensing, including multi-touch, while the self-capacitance sensors detect a finger hovering above the screen. By setting a touch-registration threshold, the terminal device can distinguish hover touches from contact touches.
A distance sensor, also called a displacement sensor, is a type of sensor used to sense the distance between itself and an object. A touchscreen equipped with a distance sensor can detect, via the distance sensor, the distance between a finger hovering above the screen and the screen. If that distance is smaller than a set threshold, the user is probably about to operate on the touchscreen. By setting a distance threshold, the terminal device can distinguish hover touches from contact touches.
FIG. 10 shows an exemplary schematic diagram of the detection method, using the same coordinate system as above: the upper-left corner of the screen is the origin (0,0), horizontal coordinates range over 0~X, and vertical coordinates range over 0~Y. The terminal device provides a sensing function: when the user's finger comes close to the screen of the terminal device, the terminal device can, based on the hover-touch technology or the distance sensor described above, detect the position on the screen corresponding to the hovering finger (the hover point). FIG. 11 shows an exemplary schematic diagram of a finger hovering above the screen. As shown in FIG. 11, the user's right index finger hovers above a certain position on the screen of the terminal device; this position may be called the hover point, with coordinates (x6, y6). The terminal device then takes (x6, y6) as the center and a set distance r (which may be set with reference to fingertip or fingertip-pad area) as the radius to obtain a circular region, and determines whether this circular region intersects the first screen area, that is, whether the region covers the first screen area. If the two intersect, the user is very likely to start operating the terminal device from the hover point, and the operation will probably take place near the hover point, so the finger may well touch the first screen area; in this case the first screen area is considered likely to be touched and thus contaminated.
It should be noted that the above exemplifies one method of detecting whether the first screen area is likely to be touched; this application does not specifically limit how the above region is determined. In addition, the intersection between the circular region obtained by the terminal device and the first screen area refers to whether the two regions overlap; the extent and area of the overlap are not limited. That is, as long as the two intersect (overlap), it is considered that the user's finger is likely to touch the first screen area during the operation following the hover.
Alternatively, the hover point may be obtained through the distance sensor alone. Via the distance sensor, the terminal device can detect the distance between a finger hovering above the screen and the screen; if that distance is smaller than a set threshold, the user is probably about to operate on the touchscreen. The terminal device can then determine a region centered on the position on the screen corresponding to the finger (the hover point), for example, a circular region with the hover point as the center and a set distance r (set with reference to fingertip or fingertip-pad area) as the radius. The terminal device then determines whether this circular region intersects the first screen area, that is, whether the region covers the first screen area. If the two intersect, the user is very likely to start operating the terminal device from the hover point, and the operation will probably take place near the hover point, so the finger may well touch the first screen area; in this case the first screen area is considered likely to be touched and thus contaminated.
It should be noted that the above exemplifies one method of detecting whether the first screen area is likely to be touched; this application does not specifically limit how the above region is determined.
Step 402: When it is determined that the first screen area is likely to be touched, issue a reminder.
As described in step 401, when the terminal device detects that the first screen area is likely to be touched, it may issue a reminder to inform the user which area of the screen (namely, the first screen area) covers the camera, so that the user can operate carefully and avoid touching the first screen area as much as possible. Several reminder examples follow.
FIG. 12 shows an exemplary schematic diagram of a reminder method: as shown in FIG. 12, the terminal device may explicitly frame the first screen area on the screen, for example, with a rectangle filled with diagonal hatching. FIG. 13 shows another exemplary reminder method: the terminal device may display the first screen area in a color distinct from other areas, for example, framing it with a red rectangle. FIG. 14 shows another exemplary reminder method: the terminal device may explicitly mark the position corresponding to the camera on the screen, for example, with a solid circle outline; on seeing the displayed mark, the user knows that the framed region is the first screen area. FIG. 15 shows another exemplary reminder method: while framing the first screen area with a hatched rectangle, the terminal device may also remind the user with text, for example, "This is the camera area; please do not touch." On seeing the displayed text, the user knows that the framed region is the first screen area.
It should be noted that this application may also use other methods to remind the user of the position of the first screen area; this is not specifically limited.
In this embodiment, when it is detected that the portion of the screen covering the camera is likely to be touched by the user's finger, the terminal device can issue a prompt to inform the user of the camera's location. The user can then operate carefully and avoid touching that portion of the screen as much as possible, which reduces the likelihood and frequency of contamination of the screen portion covering the camera and improves photo clarity.
A person of ordinary skill in the art will appreciate that the method steps and units described in the embodiments disclosed herein can be implemented in electronic hardware, computer software, or a combination of the two. To clearly illustrate the interchangeability of hardware and software, the steps and composition of each embodiment have been described above generally in terms of function. Whether these functions are performed in hardware or software depends on the specific application and design constraints of the technical solution. A person of ordinary skill in the art may use different methods to implement the described functions for each particular application, but such implementations should not be considered beyond the scope of this application.
It can be clearly understood by those skilled in the art that, for convenience and brevity of description, for the specific working processes of the systems, apparatuses, and units described above, reference may be made to the corresponding processes in the foregoing method embodiments; details are not repeated here.
In the several embodiments provided in this application, it should be understood that the disclosed systems, apparatuses, and methods may be implemented in other ways. For example, the apparatus embodiments described above are merely illustrative: the division into units is only a logical functional division, and there may be other divisions in actual implementation; for example, multiple units or components may be combined or integrated into another system, or some features may be ignored or not executed. In addition, the mutual couplings, direct couplings, or communication connections shown or discussed may be indirect couplings or communication connections through some interfaces, apparatuses, or units, and may be electrical, mechanical, or in other forms.
The units described as separate components may or may not be physically separate, and components shown as units may or may not be physical units; that is, they may be located in one place or distributed over multiple network units. Some or all of the units may be selected according to actual needs to achieve the objectives of the solutions of the embodiments of this application.
In addition, the functional units in the embodiments of this application may be integrated into one processing unit, or each unit may exist physically alone, or two or more units may be integrated into one unit. The integrated unit may be implemented in the form of hardware or in the form of a software functional unit.
If the integrated unit is implemented in the form of a software functional unit and sold or used as an independent product, it may be stored in a computer-readable storage medium. Based on this understanding, the technical solution of this application in essence, or the part contributing to the prior art, or all or part of the technical solution, may be embodied in the form of a software product. The computer software product is stored in a storage medium and includes several instructions for causing a computer device (which may be a personal computer, a server, a network device, or the like) to perform all or some of the steps of the methods described in the embodiments of this application. The aforementioned storage medium includes various media capable of storing program code, such as a USB flash drive, a removable hard disk, a read-only memory (ROM), a random access memory (RAM), a magnetic disk, or an optical disc.
The foregoing is merely specific implementations of this application, but the protection scope of this application is not limited thereto. Any person skilled in the art can readily conceive of various equivalent modifications or replacements within the technical scope disclosed in this application, and such modifications or replacements shall fall within the protection scope of this application. Therefore, the protection scope of this application shall be subject to the protection scope of the claims.

Claims (20)

  1. A mistouch reminder method for a terminal device with an under-display camera, applied to a terminal device, wherein the terminal device comprises an under-display camera, and a screen of the terminal device comprises a first screen area covering the under-display camera, the method comprising:
    when it is detected that the first screen area is likely to be touched, displaying a mistouch reminder on the screen.
  2. The method according to claim 1, wherein the detecting that the first screen area is likely to be touched comprises:
    obtaining a touchable area according to a user operation, wherein the touchable area is an area on the screen of the terminal device that will be touched by the user;
    when the touchable area overlaps the first screen area, determining that the first screen area is likely to be touched; and
    when the touchable area does not overlap the first screen area, determining that the first screen area is not likely to be touched.
  3. The method according to claim 2, wherein the obtaining a touchable area according to a user operation comprises:
    obtaining a sliding trajectory according to a touch operation performed by the user on the screen of the terminal device; and
    determining the touchable area according to the sliding trajectory.
  4. The method according to claim 3, wherein the touch operation comprises a sliding operation and/or a pressing operation, wherein
    the sliding operation comprises a straight-line sliding operation in any direction and/or a curved sliding operation in any direction; and
    the pressing operation comprises a tap operation and/or a long-press operation.
  5. The method according to claim 4, wherein the determining the touchable area according to the sliding trajectory comprises:
    taking the sliding trajectory corresponding to the sliding operation as a centerline and extending outward on both sides by a set length to obtain the touchable area; or
    taking the sliding trajectory corresponding to the pressing operation as a center and a set length as a radius to obtain the touchable area.
  6. The method according to claim 2, wherein the obtaining a touchable area according to a user operation comprises:
    obtaining a hover point according to a hover touch operation performed by the user near the screen of the terminal device, and determining the touchable area according to the hover point.
  7. The method according to claim 6, wherein the determining the touchable area according to the hover point comprises:
    taking a position on the screen of the terminal device corresponding to the hover point as a center and a set length as a radius to obtain the touchable area.
  8. The method according to any one of claims 1-7, wherein the displaying a mistouch reminder on the screen comprises:
    displaying an edge of the first screen area in a manner distinct from other areas; or
    displaying the first screen area in a manner distinct from other areas; or
    displaying a position on the screen of the terminal device corresponding to an edge of the under-display camera in a manner distinct from other areas; or
    displaying a position on the screen of the terminal device corresponding to the under-display camera in a manner distinct from other areas;
    wherein the manner distinct from other areas comprises highlighting or a fill pattern.
  9. The method according to claim 8, wherein the displaying a mistouch reminder on the screen further comprises:
    indicating, with text on the screen of the terminal device, the first screen area or the position on the screen of the terminal device corresponding to the under-display camera.
  10. A terminal device, wherein the terminal device comprises an under-display camera, and a screen of the terminal device comprises a first screen area covering the under-display camera; the terminal device further comprises a processor and a memory;
    the memory is configured to store a program; and
    when the program is executed by the processor, the processor is configured to display a mistouch reminder on the screen when it is detected that the first screen area is likely to be touched.
  11. The terminal device according to claim 10, wherein the processor is specifically configured to: obtain a touchable area according to a user operation, wherein the touchable area is an area on the screen that will be touched by the user; when the touchable area overlaps the first screen area, determine that the first screen area is likely to be touched; and when the touchable area does not overlap the first screen area, determine that the first screen area is not likely to be touched.
  12. The terminal device according to claim 11, wherein the processor is specifically configured to: obtain a sliding trajectory according to a touch operation performed by the user on the screen; and determine the touchable area according to the sliding trajectory.
  13. The terminal device according to claim 12, wherein the touch operation comprises a sliding operation and/or a pressing operation, wherein
    the sliding operation comprises a straight-line sliding operation in any direction and/or a curved sliding operation in any direction; and
    the pressing operation comprises a tap operation and/or a long-press operation.
  14. The terminal device according to claim 13, wherein the processor is specifically configured to: take the sliding trajectory corresponding to the sliding operation as a centerline and extend outward on both sides by a set length to obtain the touchable area; or take the sliding trajectory corresponding to the pressing operation as a center and a set length as a radius to obtain the touchable area.
  15. The terminal device according to claim 11, wherein the processor is specifically configured to: obtain a hover point according to a hover touch operation performed by the user near the screen, the hover touch operation being sensed by a capacitive sensor or a distance sensor; and determine the touchable area according to the hover point.
  16. The terminal device according to claim 15, wherein the processor is specifically configured to take a position on the screen corresponding to the hover point as a center and a set length as a radius to obtain the touchable area.
  17. The terminal device according to any one of claims 10-16, wherein the screen is specifically configured to: display an edge of the first screen area in a manner distinct from other areas; or display the first screen area in a manner distinct from other areas; or display a position corresponding to an edge of the under-display camera in a manner distinct from other areas; or display a position corresponding to the under-display camera in a manner distinct from other areas; wherein the manner distinct from other areas comprises highlighting or a fill pattern.
  18. The terminal device according to claim 17, wherein the screen is further configured to indicate, with text, the first screen area or the position corresponding to the under-display camera.
  19. A computer-readable storage medium, comprising a computer program, wherein when the computer program is executed on a computer, the computer is caused to perform the method according to any one of claims 1-9.
  20. A computer program, wherein when the computer program is executed by a computer, the computer program is configured to perform the method according to any one of claims 1-9.
PCT/CN2021/081569 2020-03-31 2021-03-18 Mistouch reminder method and apparatus for a terminal device with an under-display camera WO2021197085A1 (zh)

Applications Claiming Priority (2)

Application Number Priority Date Filing Date Title
CN202010246250.9A CN113467652A (zh) 2020-03-31 2020-03-31 Mistouch reminder method and apparatus for a terminal device with an under-display camera
CN202010246250.9 2020-03-31

Publications (1)

Publication Number Publication Date
WO2021197085A1 true WO2021197085A1 (zh) 2021-10-07

Family

ID=77865664

Family Applications (1)

Application Number Title Priority Date Filing Date
PCT/CN2021/081569 WO2021197085A1 (zh) 2020-03-31 2021-03-18 Mistouch reminder method and apparatus for a terminal device with an under-display camera

Country Status (2)

Country Link
CN (1) CN113467652A (zh)
WO (1) WO2021197085A1 (zh)

Citations (5)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN103853476A (zh) * 2012-12-04 2014-06-11 联想(北京)有限公司 Information processing method and electronic device
CN104731498A (zh) * 2015-01-30 2015-06-24 深圳市中兴移动通信有限公司 Mobile terminal mistouch prevention method and apparatus
CN104932788A (zh) * 2015-06-24 2015-09-23 青岛海信移动通信技术股份有限公司 Adaptive touchscreen control method and device
CN106201304A (zh) * 2016-06-23 2016-12-07 乐视控股(北京)有限公司 Method and apparatus for preventing accidental touch operations
US20170277336A1 (en) * 2016-03-24 2017-09-28 Boe Technology Group Co., Ltd. Touch Method and Device, Touch Display Apparatus

Family Cites Families (1)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN108196722A (zh) * 2018-01-29 2018-06-22 广东欧珀移动通信有限公司 Electronic device, touch control method therefor, and computer-readable storage medium


Cited By (2)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN113986047A (zh) * 2021-12-23 2022-01-28 荣耀终端有限公司 Method and apparatus for recognizing a mistouch signal
CN113986047B (zh) * 2021-12-23 2023-10-27 荣耀终端有限公司 Method and apparatus for recognizing a mistouch signal

Also Published As

Publication number Publication date
CN113467652A (zh) 2021-10-01

Similar Documents

Publication Publication Date Title
JP7391102B2 (ja) Gesture processing method and device
CN113645351B (zh) Application interface interaction method, electronic device, and computer-readable storage medium
US20220206682A1 (en) Gesture Interaction Method and Apparatus, and Terminal Device
CN113407053B (zh) Touchscreen, electronic device, and display control method
US11907526B2 (en) Touch region adjustment method and apparatus for determining a grasping gesture of a user on an electronic device
WO2020259674A1 (zh) Mistouch prevention method for a curved screen and electronic device
WO2021063098A1 (zh) Touchscreen response method and electronic device
CN112751954B (zh) Operation prompt method and electronic device
JP2022546453A (ja) Fitness assistance method and electronic apparatus
US20220244846A1 (en) User Interface Display Method and Electronic Device
WO2021008589A1 (zh) Application running method and electronic device
CN112650405B (zh) Interaction method for an electronic device and electronic device
WO2024016564A1 (zh) QR code recognition method, electronic device, and storage medium
CN113934330A (zh) Screenshot method and electronic device
US20230224574A1 (en) Photographing method and apparatus
EP4283450A1 (en) Display method, electronic device, storage medium, and program product
CN113391775A (zh) Human-computer interaction method and device
WO2021197085A1 (zh) Mistouch reminder method and apparatus for a terminal device with an under-display camera
US20220317841A1 (en) Screenshot Method and Related Device
WO2022062985A1 (zh) Method, apparatus, and terminal device for adding video special effects
CN109359460A (zh) Facial recognition method and terminal device
CN114690985B (zh) Display method and electronic device
CN115390738A (zh) Rollable-screen opening and closing method and related products
CN111475363A (zh) Freeze detection method and electronic device
CN116521018B (zh) Mistouch prompt method, terminal device, and storage medium

Legal Events

Date Code Title Description
121 Ep: the epo has been informed by wipo that ep was designated in this application

Ref document number: 21779187

Country of ref document: EP

Kind code of ref document: A1

NENP Non-entry into the national phase

Ref country code: DE

122 Ep: pct application non-entry in european phase

Ref document number: 21779187

Country of ref document: EP

Kind code of ref document: A1