CN113672123A - Detection method and device and electronic equipment - Google Patents

Detection method and device and electronic equipment

Info

Publication number
CN113672123A
CN113672123A (application CN202110971626.7A)
Authority
CN
China
Prior art keywords
texture image
input
screen
defect
processing
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Pending
Application number
CN202110971626.7A
Other languages
Chinese (zh)
Inventor
Zhai Kang (翟康)
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Vivo Mobile Communication Hangzhou Co Ltd
Original Assignee
Vivo Mobile Communication Hangzhou Co Ltd
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Vivo Mobile Communication Hangzhou Co Ltd filed Critical Vivo Mobile Communication Hangzhou Co Ltd
Priority to CN202110971626.7A
Publication of CN113672123A
Legal status: Pending

Classifications

    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06F ELECTRIC DIGITAL DATA PROCESSING
    • G06F 3/00 Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
    • G06F 3/01 Input arrangements or combined input and output arrangements for interaction between user and computer
    • G06F 3/03 Arrangements for converting the position or the displacement of a member into a coded form
    • G06F 3/041 Digitisers, e.g. for touch screens or touch pads, characterised by the transducing means
    • G06F 3/0416 Control or interface arrangements specially adapted for digitisers
    • G06F 3/0418 Control or interface arrangements specially adapted for digitisers for error correction or compensation, e.g. based on parallax, calibration or alignment
    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06F ELECTRIC DIGITAL DATA PROCESSING
    • G06F 3/00 Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
    • G06F 3/01 Input arrangements or combined input and output arrangements for interaction between user and computer
    • G06F 3/03 Arrangements for converting the position or the displacement of a member into a coded form
    • G06F 3/041 Digitisers, e.g. for touch screens or touch pads, characterised by the transducing means
    • G06F 3/0412 Digitisers structurally integrated in a display
    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06F ELECTRIC DIGITAL DATA PROCESSING
    • G06F 3/00 Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
    • G06F 3/01 Input arrangements or combined input and output arrangements for interaction between user and computer
    • G06F 3/03 Arrangements for converting the position or the displacement of a member into a coded form
    • G06F 3/041 Digitisers, e.g. for touch screens or touch pads, characterised by the transducing means
    • G06F 3/044 Digitisers, e.g. for touch screens or touch pads, characterised by the transducing means by capacitive means

Abstract

The application discloses a detection method, a detection device and electronic equipment, and belongs to the technical field of detection. The method comprises the following steps: receiving a first input of a user, and acquiring a texture image corresponding to the first input; acquiring defect area information from the texture image; performing first processing on the first input under the condition that the texture image is determined to be a first-class texture image according to the defect area information; wherein the first type of texture image is acquired in the presence of liquid on the screen.

Description

Detection method and device and electronic equipment
Technical Field
The application belongs to the technical field of detection, and particularly relates to a detection method, a detection device and electronic equipment.
Background
Unlocking a smart terminal with a fingerprint on the touch screen has become a standard feature. The touch screen is usually a capacitive screen, whose performance directly affects the human-computer interaction experience; it can be said to have a very large impact on the overall user experience of the smart terminal.
Although texture detection algorithms for capacitive screens have been under development for many years, current algorithms still handle complex scenes poorly. For example, when there is liquid on the screen, such as sweat or grime from the surface of a finger (referred to here as the sweaty-hand mode), the capacitance change caused by touching the capacitive screen differs greatly from the change caused by a finger touching the screen when no liquid is present.
Existing capacitive screens mainly identify whether liquid is present on the screen from capacitance-change characteristics. This approach is prone to misjudgment and cannot accurately identify whether liquid is present, which can cause touch-interruption phenomena such as broken or skipped strokes and insensitive taps on the touch screen.
Disclosure of Invention
The embodiments of the present application aim to provide a detection method that can solve the touch-interruption problems, such as broken or skipped strokes and insensitive taps on the touch screen, caused by the inability to accurately identify whether liquid is present on the screen.
In a first aspect, an embodiment of the present application provides a detection method, where the method includes:
receiving a first input of a user, and acquiring a texture image corresponding to the first input;
acquiring defect area information from the texture image;
performing first processing on the first input under the condition that the texture image is determined to be a first-class texture image according to the defect area information;
wherein the first type of texture image is acquired in the presence of liquid on the screen.
In a second aspect, an embodiment of the present application provides a detection apparatus, where the apparatus includes:
the receiving module is used for receiving a first input of a user and acquiring a texture image corresponding to the first input;
the acquisition module is used for acquiring defect area information from the texture image;
the processing module is used for performing first processing on the first input under the condition that the texture image is determined to be a first-class texture image according to the defect area information;
wherein the first type of texture image is acquired in the presence of liquid on the screen.
In a third aspect, an embodiment of the present application provides an electronic device, which includes a processor, a memory, and a program or instructions stored on the memory and executable on the processor, and when executed by the processor, the program or instructions implement the steps of the method according to the first aspect.
In a fourth aspect, embodiments of the present application provide a readable storage medium, on which a program or instructions are stored, which when executed by a processor implement the steps of the method according to the first aspect.
In a fifth aspect, an embodiment of the present application provides a chip, where the chip includes a processor and a communication interface, where the communication interface is coupled to the processor, and the processor is configured to execute a program or instructions to implement the method according to the first aspect.
In the embodiments of the application, a first input of a user is received and a texture image corresponding to the first input is acquired; defect area information is acquired from the texture image; and first processing is performed on the first input when the texture image is determined, according to the defect area information, to be a first-class texture image, where the first class of texture image is acquired in the presence of liquid on the screen. By acquiring the texture image and determining from the extracted defect area information whether it is a first-class texture image, the presence of liquid on the screen is identified accurately and the identification accuracy is improved; the first input can then be processed differently according to the identification result, solving the touch-interruption problems of the prior art, such as broken strokes and insensitive taps on the touch screen.
Drawings
FIG. 1 is a schematic flow chart of a detection method according to an embodiment of the present application;
FIG. 2 is a schematic diagram illustrating a principle of screen optical fingerprint acquisition in the detection method according to the embodiment of the present application;
FIG. 3 is a schematic representation of the reflection and refraction of light in different media;
FIG. 4 is a schematic diagram of a fingerprint sampling optical path system in the detection method according to the embodiment of the present application;
FIG. 5 is a schematic diagram of a dry image and a wet image in the detection method of the embodiment of the present application;
FIG. 6 is a schematic diagram of a first type of texture image after preprocessing the wet image of FIG. 5 by the detection method according to the embodiment of the present application;
FIG. 7 is a schematic structural diagram of a detecting device according to an embodiment of the present application;
fig. 8 is a schematic structural diagram of an electronic device according to an embodiment of the present application;
fig. 9 is a schematic diagram of a hardware structure of an electronic device implementing an embodiment of the present application.
Detailed Description
The technical solutions in the embodiments of the present application will be described clearly below with reference to the drawings in the embodiments of the present application, and it is obvious that the described embodiments are some, but not all, embodiments of the present application. All other embodiments that can be derived by one of ordinary skill in the art from the embodiments given herein are intended to be within the scope of the present disclosure.
The terms "first", "second", and the like in the description and claims of the present application are used to distinguish between similar objects and do not necessarily describe a particular sequence or chronological order. It should be understood that the data so used may be interchanged under appropriate circumstances, so that the embodiments of the application can be practiced in sequences other than those illustrated or described herein. Moreover, "first", "second", and the like are used in a generic sense and do not limit quantity; for example, a first object may be one object or more than one. In addition, "and/or" in the description and claims denotes at least one of the connected objects, and the character "/" generally indicates an "or" relationship between the preceding and following objects.
The following describes in detail the detection method provided by the embodiment of the present application with reference to the accompanying drawings and its application scenarios.
Fig. 1 is a schematic flow chart of a detection method according to an embodiment of the present application. The detection method of the embodiment of the application can be executed by a detection device, and the detection device can be arranged in an intelligent terminal, such as a smart phone, a tablet computer and a smart watch.
As shown in fig. 1, the detection method of the present embodiment may include the following steps 1100 to 1300:
step 1100, receiving a first input of a user, and acquiring a texture image corresponding to the first input.
The first input may be, for example, a tap input or a slide input performed on the screen with a finger or a stylus, which is not specifically limited here.
When the detection device acquires the texture image, it can exploit the reflection and refraction of light. Fig. 3 shows a schematic of light incident from glass into air. Different media have different refractive indices: for example, the refractive index of air is 1, that of water is 1.33, that of glass is 1.5, and that of skin is approximately 1.45. Light incident from one medium into another generally undergoes reflection and refraction simultaneously; the smaller the refraction angle, the larger the refracted light quantity, and conversely, the larger the angle, the smaller the refracted quantity. By conservation of energy, the reflected light quantity plus the refracted light quantity equals the incident light quantity, so for the same incident light, the more light is refracted, the less is reflected.
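The refraction behavior described above follows Snell's law. A small sketch using the refractive indices quoted in the text (the 30-degree and 45-degree incidence angles are illustrative choices, not values from the source):

```python
import math

def refraction_angle_deg(n1, n2, incidence_deg):
    """Snell's law: n1 * sin(theta1) = n2 * sin(theta2).
    Returns the refraction angle in degrees, or None when the ray is
    totally internally reflected (no refracted ray exists)."""
    s = n1 * math.sin(math.radians(incidence_deg)) / n2
    if s > 1.0:
        return None  # total internal reflection
    return math.degrees(math.asin(s))

# Refractive indices quoted in the description:
N_AIR, N_WATER, N_GLASS, N_SKIN = 1.0, 1.33, 1.5, 1.45

# Light leaving the glass cover plate at 30 degrees bends away from the
# normal more strongly into air (large index gap) than into water or skin,
# so less light is refracted and more is reflected back toward the sensor.
print(refraction_angle_deg(N_GLASS, N_AIR, 30))    # ~48.6 degrees
print(refraction_angle_deg(N_GLASS, N_WATER, 30))  # ~34.3 degrees
print(refraction_angle_deg(N_GLASS, N_SKIN, 30))   # ~31.2 degrees
```

Beyond the glass-to-air critical angle (about 41.8 degrees), all light is reflected, which is why dry valleys appear bright to the sensor.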
In this embodiment, as shown in fig. 2, when the surface of the screen is dry and a finger touches the OLED screen, the CMOS sensor in the optical lens can obtain a clear texture image through the reflection and refraction of light. When there is liquid on the screen, however, the reflection of light is disturbed and the acquired texture image is relatively blurred. By acquiring the texture image and determining whether it is a first-class texture image, it can therefore be judged more reliably whether liquid is present on the current screen, the first input can be processed accordingly with greater accuracy, and touch interruption on the screen is reduced.
Referring to fig. 4, when light strikes a texture valley, air lies between the valley and the glass cover plate of the screen, so the light passes from glass into air. Because the refractive index of air is relatively small, the refraction angle is large, relatively little light energy is refracted, most of the light is reflected, and the CMOS receives a relatively large light quantity. At a texture ridge, the skin directly contacts the glass cover plate; since the refractive index of skin is large, the refraction angle is small, more light energy is refracted, less is reflected, and the CMOS receives a small light quantity. A relatively clear texture image can therefore be obtained from the received light quantities.
When liquid is present on the screen, the texture valleys are filled with liquid instead of air. If liquid is also present at the texture ridges, light passes from glass into liquid everywhere, the light quantity received by the CMOS is uniform across the whole area, and the image is blurred with almost no texture information. If the ridges are free of liquid and only the valleys contain liquid, part of the light passes from glass into liquid and part from glass into skin; since the refractive indices of skin and liquid differ little, the light energy received by the CMOS shows no obvious boundary between valleys and ridges, and the acquired texture image is still blurred. For example, FIG. 5 shows texture images acquired by the CMOS without and with liquid on the screen.
Step 1200, obtaining defect region information from the texture image.
In this step, before the defect area information is acquired from the texture image, the method further includes preprocessing the texture image. Specifically, the preprocessing may include sequentially performing binarization, filtering, and thinning on the texture image.
Specifically, when the defect area information is acquired from the preprocessed texture image, feature extraction may be performed on the defect regions in the texture image to obtain the area, form, and number of the defect regions, which are then taken as the defect area information.
For example, after binarization, filtering, and thinning are performed on the right-hand image in fig. 5, the resulting image contains a defect region as shown in fig. 6, and the area, form, and number of the defect regions can be obtained by feature extraction.
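The binarize-filter-label pipeline can be sketched as follows. This is a minimal pure-Python illustration under stated assumptions: the binarization threshold is arbitrary, a 3x3 majority vote stands in for the median filter, the thinning step is omitted for brevity, and defect regions are taken to be 4-connected components:

```python
from collections import deque

def extract_defect_info(img, thresh=0.5):
    """Binarize the texture image, apply a 3x3 majority (median) filter,
    then label 4-connected defect regions and report their count and
    areas in pixels. Thinning (mentioned in the text) is omitted here."""
    h, w = len(img), len(img[0])
    binary = [[1 if img[r][c] > thresh else 0 for c in range(w)] for r in range(h)]
    # 3x3 median filter on a binary image == majority vote of the neighborhood
    filt = [[0] * w for _ in range(h)]
    for r in range(h):
        for c in range(w):
            ones = sum(binary[rr][cc]
                       for rr in range(max(0, r - 1), min(h, r + 2))
                       for cc in range(max(0, c - 1), min(w, c + 2)))
            filt[r][c] = 1 if ones >= 5 else 0
    # label 4-connected components (the defect regions) by flood fill
    seen = [[False] * w for _ in range(h)]
    areas = []
    for r in range(h):
        for c in range(w):
            if filt[r][c] and not seen[r][c]:
                area, q = 0, deque([(r, c)])
                seen[r][c] = True
                while q:
                    cr, cc = q.popleft()
                    area += 1
                    for nr, nc in ((cr-1, cc), (cr+1, cc), (cr, cc-1), (cr, cc+1)):
                        if 0 <= nr < h and 0 <= nc < w and filt[nr][nc] and not seen[nr][nc]:
                            seen[nr][nc] = True
                            q.append((nr, nc))
                areas.append(area)
    return len(areas), areas

# Toy texture with two bright 3x3 "defect" blobs on a dark background;
# the median filter erodes each blob's corners, leaving a 5-pixel plus.
img = [[0.0] * 10 for _ in range(10)]
for r in range(1, 4):
    for c in range(1, 4):
        img[r][c] = 1.0
for r in range(6, 9):
    for c in range(6, 9):
        img[r][c] = 1.0
print(extract_defect_info(img))  # (2, [5, 5])
```

A production implementation would use an image library's thresholding, median filtering, thinning, and connected-component routines instead of these loops; the point here is only the order of the steps and the features extracted.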
Step 1300, performing a first process on the first input when the texture image is determined to be a first type texture image according to the defect area information.
In step 1200, the area, shape, and number of the defective regions are acquired, and in this step, it may be determined whether the texture image is the first type texture image based on the acquired information of the defective regions. Wherein the first type of texture image is acquired in the presence of liquid on the screen.
Specifically, the detection device may first judge whether the number of defect regions is greater than a preset number of defect regions. If so, the texture image is determined to be the second type of texture image; if not, the similarity between the form of the defect regions and a preset first-class texture image is calculated, and whether the texture image is the first type of texture image is determined according to the similarity.
When determining from the similarity whether the texture image is the first type of texture image: if the similarity is greater than a first preset similarity, the texture image is determined to be the first type of texture image. Otherwise, it is judged whether the similarity is greater than a second preset similarity; if not, the texture image is determined to be the second type of texture image; if so, the number of pixels occupied by the area of the defect region is further acquired, and whether the texture image is the first type of texture image is determined according to this number of pixels.
When determining from the number of pixels whether the texture image is the first type of texture image: if the number of pixels is greater than a preset number of pixels, the texture image is determined to be the first type of texture image; otherwise it is determined to be the second type of texture image.
For example, assume that the number of the predetermined defective regions is 3, the first predetermined similarity is 70%, the second predetermined similarity is 30%, and the number of the predetermined pixels is 100.
If the number of acquired defect regions is 4, which is greater than the preset number, the texture image is determined to be the second type of texture image. If the number of defect regions is 1, which is not greater than the preset number, the similarity between the form of the defect region and the preset first-class texture image is calculated: with a calculated similarity of 72%, the texture image is determined to be the first type; with 25%, the second type; with 40%, the number of pixels occupied by the area of the defect region is further acquired. If that number is 120, the texture image is determined to be the first type of texture image; if it is 90, the second type.
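The decision cascade above can be written compactly as follows. The default thresholds mirror the worked example (3 defect regions, 70%, 30%, 100 pixels); actual values are implementation-specific:

```python
def classify_texture(num_defects, similarity, area_pixels,
                     max_defects=3, sim_hi=0.70, sim_lo=0.30, min_pixels=100):
    """Decision cascade from the description. Returns 'first' when the
    texture image is judged a first-class image (liquid on the screen)
    and 'second' otherwise."""
    if num_defects > max_defects:
        return 'second'                 # too many defect regions
    if similarity > sim_hi:
        return 'first'                  # form closely matches a wet image
    if similarity <= sim_lo:
        return 'second'                 # form clearly does not match
    # similarity between the two presets: let the defect area decide
    return 'first' if area_pixels > min_pixels else 'second'

# The worked example from the description:
assert classify_texture(4, 0.72, 120) == 'second'  # 4 > 3 defect regions
assert classify_texture(1, 0.72, 0) == 'first'     # similarity 72% > 70%
assert classify_texture(1, 0.25, 0) == 'second'    # similarity 25% <= 30%
assert classify_texture(1, 0.40, 120) == 'first'   # 120 > 100 pixels
assert classify_texture(1, 0.40, 90) == 'second'   # 90 <= 100 pixels
```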
Further, after determining whether the texture image is a first type texture image according to the defect area information, corresponding processing may be performed on the first input according to a determination result.
Specifically, if the determination result is that the texture image is the first type of texture image, first processing is performed on the first input; if the determination result is that the texture image is the second type of texture image, second processing is performed on the first input.
For example, the first processing may process the first input with a first algorithm, such as a sweaty-hand-mode recognition algorithm, and the second processing may process the first input with a second algorithm, such as a normal-mode recognition algorithm.
It should be noted that there are two detection modes, one is full-screen detection, and the other is partial-screen detection.
For full-screen detection, a texture image can be acquired at every position of the screen to judge whether liquid is present. Texture detection, however, consumes power, and running it continuously is costly; moreover, whether liquid is present on the screen changes dynamically, so detection can be set to run once every fixed interval.
When the first input is received, a texture image is acquired and it is judged whether liquid is present on the current screen; if so, the touch screen performs first processing on the first input, otherwise second processing. Meanwhile, it is detected whether the first input has ended: if it has, the default mode, i.e., the normal mode, is restored and the whole detection process ends. If the first input has not ended, it is checked whether the detection result has expired; if so, the detection process is re-entered, otherwise the check of whether the first input has ended continues.
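The full-screen flow above amounts to a small loop with an expiring detection result. A sketch follows; every callback parameter is a hypothetical placeholder (not a name from the source), and `result_ttl` models the rule that a stale detection result triggers re-detection, since liquid on the screen changes dynamically:

```python
import time

def full_screen_detection_loop(receive_input, capture_texture, is_liquid,
                               process_wet, process_dry, input_finished,
                               result_ttl=0.5):
    """Run texture detection once for an incoming input, apply first or
    second processing while the input lasts, and re-detect when the
    cached result expires. Restores the normal mode when the input ends."""
    event = receive_input()
    wet = is_liquid(capture_texture(event))   # liquid on screen?
    detected_at = time.monotonic()
    while not input_finished(event):
        if time.monotonic() - detected_at > result_ttl:
            # detection result timed out: re-enter the detection process
            wet = is_liquid(capture_texture(event))
            detected_at = time.monotonic()
        (process_wet if wet else process_dry)(event)
    return 'normal mode'                      # default mode restored at the end
```

In a real touch pipeline these callbacks would be driver and firmware hooks; the sketch only shows the control flow described in the text.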
In this embodiment, it is considered that when liquid is present on the screen, the capacitance values are large overall, the touch center is not distinct, and computing accurate coordinates becomes harder. To optimize touch behavior and reduce touch interruption in this case, this embodiment mainly provides software algorithm optimization and hardware-assisted optimization.
Specifically, in the software algorithm optimization, when liquid is present on the screen, the touch algorithm first adjusts the filtering algorithm to minimize the influence of the liquid on coordinate calculation. In addition, because of the smearing effect of liquid on the screen, id switching is likely to occur during the first input, so that one first input is recognized as two inputs and the slide is broken; the trigger condition for id switching therefore needs to be adjusted. For example, raising the trigger threshold of the id switch lowers its detection sensitivity, thereby reducing id switches caused by liquid smearing on the screen.
The hardware-assisted optimization mainly uses the under-screen pressure sensor to help judge screen presses. When liquid is present on the screen, the pressure signal from the under-screen pressure sensor and the capacitance signal of the touch screen can be judged jointly, and their weights can be assigned automatically according to the size of the defect regions of the texture image, which reflects the amount of liquid: the more liquid there is, the larger the weight of the pressure signal; conversely, the larger the weight of the touch capacitance signal.
Specifically, the detection device acquires a pressure signal for pressing the screen and a capacitance signal of the touch screen; determining a first weight value corresponding to the pressure signal and a second weight value corresponding to the capacitance signal according to the defect area information; and performing first processing on the first input according to the pressure signal and the first weight value, and according to the capacitance signal and the second weight value.
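The weighted judgment can be sketched as a simple convex combination of the two signals. `max_area` and `press_thresh` are illustrative parameters introduced here, not values from the source, and the linear weighting is one plausible choice among many:

```python
def fused_press_decision(pressure, capacitance, defect_area,
                         max_area=500.0, press_thresh=1.0):
    """Hardware-assisted judgment sketch: the weight of the under-screen
    pressure signal grows with the defect area (more liquid on screen),
    and the touch-capacitance weight shrinks by the same amount."""
    w_pressure = min(defect_area / max_area, 1.0)   # more liquid -> trust pressure more
    w_capacitance = 1.0 - w_pressure                # less liquid -> trust capacitance more
    score = w_pressure * pressure + w_capacitance * capacitance
    return score > press_thresh                     # True: treat as a genuine press

# Lots of liquid: the pressure signal dominates the decision.
print(fused_press_decision(pressure=2.0, capacitance=0.2, defect_area=500))  # True
# Dry screen: the capacitance signal dominates.
print(fused_press_decision(pressure=2.0, capacitance=0.2, defect_area=0))    # False
```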
In one example, the weight of the texture detection result may default to 1. Alternatively, the texture detection result and the touch-screen feature determination result may each carry their own weight, and the two are combined into a single result. These weights may change dynamically; for example, the longer the time since the texture detection, the smaller the weight of that detection result. The weights can be set as required in a practical application, which is not specifically limited in this embodiment.
For partial-screen detection, only part of the screen area can detect texture, so it is continuously determined whether the currently received first input falls within the detection area. When a first input is received, it is first judged whether the input is currently within the detection area; if not, detection of whether the current first input is in the detection area continues; if so, texture detection is performed: a texture image corresponding to the first input is acquired and it is judged whether liquid is present on the current screen. If liquid is present, first processing is performed on the first input, otherwise second processing. Meanwhile, it is detected whether the first input has ended; if it has, the default mode, i.e., the normal mode, is restored and the whole detection process ends. If it has not ended, it is checked whether the detection result has expired; if so, the detection process is re-entered, otherwise the check of whether the first input has ended continues.
According to the technical solution of the embodiments of the application, a first input of a user is received and a texture image corresponding to the first input is acquired; defect area information is acquired from the texture image; and first processing is performed on the first input when the texture image is determined, according to the defect area information, to be a first-class texture image, where the first class of texture image is acquired in the presence of liquid on the screen. By determining from the defect area information extracted from the texture image whether it is a first-class texture image, the presence of liquid on the screen is identified accurately and the identification accuracy is improved; different processing can then be applied to the first input according to the identification result, solving the touch-interruption problems of the prior art, such as broken strokes and insensitive taps on the touch screen.
It should be noted that, in the detection method provided in the embodiment of the present application, the execution subject may be a detection device, or a control module in the detection device for executing the detection method. In the embodiment of the present application, a detection device executing a detection method is taken as an example, and the detection device provided in the embodiment of the present application is described.
Fig. 7 is a schematic structural diagram of a detection apparatus according to an embodiment of the present application. As shown in fig. 7, the detection apparatus 2000 of the embodiment of the present application may include: a receiving module 2100, an obtaining module 2200 and a processing module 2300.
The receiving module 2100 is configured to receive a first input of a user, and obtain a texture image corresponding to the first input.
An obtaining module 2200 is configured to obtain defect region information from the texture image.
A processing module 2300, configured to perform a first processing on the first input if it is determined that the texture image is a first type of texture image according to the defect area information.
Wherein the first type of texture image is acquired in the presence of liquid on the screen.
In one embodiment, the defect region information includes an area, a shape, and a number of defect regions; the processing module 2300 is specifically configured to: and under the condition that the area, the form and the number of the defect regions meet preset conditions, determining that the texture image is the first-class texture image.
In one embodiment, the processing module 2300 is further configured to: determining the texture image to be the second type of texture image under the condition that the area, the form and the number of the defect regions do not meet preset conditions; wherein the second type of texture image is acquired in the absence of liquid on the screen.
In one embodiment, the processing module 2300 is specifically configured to: acquiring a pressure signal for pressing a screen and a capacitance signal of a touch screen; determining a first weight value corresponding to the pressure signal and a second weight value corresponding to the capacitance signal according to the defect area information; and performing first processing on the first input according to the pressure signal and the first weight value, and according to the capacitance signal and the second weight value.
In one embodiment, the processing module 2300 is further configured to: and performing second processing on the first input.
The detection apparatus of this embodiment is provided with a receiving module for receiving a first input of a user and acquiring a texture image corresponding to the first input; an acquisition module for acquiring defect area information from the texture image; and a processing module for performing first processing on the first input when the texture image is determined, according to the defect area information, to be a first-class texture image. By determining from the defect area information extracted from the texture image whether it is a first-class texture image, the presence of liquid on the screen is identified accurately, the identification accuracy is improved, and the first input can be processed differently according to the identification result, solving the touch-interruption problems of the prior art, such as broken strokes and insensitive taps on the touch screen.
The detection device in the embodiment of the present application may be a device, or may be a component, an integrated circuit, or a chip in a terminal. The apparatus may be a mobile electronic device. The mobile electronic device may be, for example, a mobile phone, a tablet computer, a notebook computer, a palm top computer, a vehicle-mounted electronic device, a wearable device, an ultra-mobile personal computer (UMPC), a netbook, or a Personal Digital Assistant (PDA), and the embodiments of the present application are not limited in particular.
The detection device in the embodiment of the present application may be a device having an operating system. The operating system may be the Android operating system, the iOS operating system, or another possible operating system; the embodiments of the present application are not specifically limited in this respect.
The detection device provided in the embodiment of the present application can implement each process implemented by the method embodiments of fig. 1 to 6; details are not repeated here to avoid repetition.
Optionally, as shown in fig. 8, an embodiment of the present application further provides an electronic device 3000, which includes a processor 3100, a memory 3200, and a program or instruction stored in the memory 3200 and executable on the processor 3100. When executed by the processor 3100, the program or instruction implements each process of the foregoing detection method embodiment and achieves the same technical effect; details are not repeated here to avoid repetition.
It should be noted that the electronic device in the embodiment of the present application includes the mobile electronic device described above.
Fig. 9 is a schematic diagram of a hardware structure of an electronic device implementing an embodiment of the present application.
The electronic device 600 includes, but is not limited to: a radio frequency unit 601, a network module 602, an audio output unit 603, an input unit 604, a sensor 605, a display unit 606, a user input unit 607, an interface unit 608, a memory 609, a processor 610, and the like.
Those skilled in the art will appreciate that the electronic device 600 may further include a power source (e.g., a battery) for supplying power to the various components; the power source may be logically connected to the processor 610 through a power management system, so that charging, discharging, and power consumption are managed through the power management system. The electronic device structure shown in fig. 9 does not constitute a limitation on the electronic device: the electronic device may include more or fewer components than those shown, combine some components, or arrange the components differently; details are not repeated here.
The electronic device of the embodiment of the application can be used for executing the technical scheme of the embodiment of the method, and the implementation principle and the technical effect are similar, which are not described herein again.
It is to be understood that, in the embodiment of the present application, the input unit 604 may include a graphics processing unit (GPU) 6041 and a microphone 6042. The graphics processing unit 6041 processes image data of still pictures or video obtained by an image capture apparatus (such as a camera) in video capture mode or image capture mode. The display unit 606 may include a display panel 6061, which may be configured in the form of a liquid crystal display, an organic light-emitting diode display, or the like. The user input unit 607 includes a touch panel 6071, also referred to as a touch screen, and other input devices 6072. The touch panel 6071 may include a touch detection device and a touch controller. The other input devices 6072 may include, but are not limited to, a physical keyboard, function keys (such as volume control keys and switch keys), a trackball, a mouse, and a joystick; these are not described in detail here. The memory 609 may be used to store software programs and various data, including but not limited to application programs and an operating system. The processor 610 may integrate an application processor, which mainly handles the operating system, user interface, and applications, and a modem processor, which mainly handles wireless communication. It will be appreciated that the modem processor may not be integrated into the processor 610.
The embodiment of the present application further provides a readable storage medium storing a program or instruction which, when executed by a processor, implements the processes of the foregoing detection method embodiment and achieves the same technical effect; details are not repeated here to avoid repetition.
The processor is the processor in the electronic device described in the above embodiment. The readable storage medium includes a computer-readable storage medium, such as a read-only memory (ROM), a random access memory (RAM), a magnetic disk, or an optical disk.
The embodiment of the present application further provides a chip, which includes a processor and a communication interface coupled to the processor, the processor being configured to run a program or instruction to implement each process of the foregoing detection method embodiment and achieve the same technical effect; details are not repeated here to avoid repetition.
It should be understood that the chip mentioned in the embodiments of the present application may also be referred to as a system-level chip, a system chip, a chip system, or a system-on-chip.
It should be noted that, in this document, the terms "comprises," "comprising," or any other variation thereof are intended to cover a non-exclusive inclusion, such that a process, method, article, or apparatus that comprises a list of elements includes not only those elements but may also include other elements not expressly listed or inherent to such process, method, article, or apparatus. Without further limitation, an element preceded by the phrase "comprising a …" does not exclude the presence of other like elements in the process, method, article, or apparatus that comprises the element. It should further be noted that the scope of the methods and apparatus of the embodiments of the present application is not limited to performing the functions in the order illustrated or discussed; the functions may be performed substantially simultaneously or in reverse order depending on the functions involved. For example, the described methods may be performed in an order different from that described, and various steps may be added, omitted, or combined. In addition, features described with reference to certain examples may be combined in other examples.
Through the above description of the embodiments, those skilled in the art will clearly understand that the methods of the above embodiments can be implemented by software plus a necessary general-purpose hardware platform, or by hardware, although in many cases the former is the better implementation. Based on this understanding, the technical solutions of the present application may be embodied in the form of a computer software product, which is stored in a storage medium (such as a ROM/RAM, a magnetic disk, or an optical disk) and includes instructions for causing a terminal (such as a mobile phone, a computer, a server, or a network device) to execute the methods of the embodiments of the present application.
While the embodiments have been described with reference to the accompanying drawings, the invention is not limited to the precise embodiments described above, which are illustrative rather than restrictive. Those skilled in the art may make various changes without departing from the spirit and scope of the invention as defined by the appended claims.

Claims (12)

1. A method of detection, the method comprising:
receiving a first input of a user, and acquiring a texture image corresponding to the first input;
acquiring defect region information from the texture image;
performing first processing on the first input under the condition that the texture image is determined to be a first-type texture image according to the defect region information;
wherein the first-type texture image is a texture image acquired in the presence of liquid on the screen.
2. The method of claim 1, wherein the defect region information includes an area, a shape, and a number of defect regions;
determining whether the texture image is a first-type texture image according to the defect region information comprises:
determining that the texture image is the first-type texture image under the condition that the area, the shape, and the number of the defect regions meet preset conditions.
3. The method of claim 2, further comprising:
determining that the texture image is a second-type texture image under the condition that the area, the shape, and the number of the defect regions do not meet the preset conditions;
wherein the second-type texture image is a texture image acquired in the absence of liquid on the screen.
4. The method of claim 1, wherein the first processing the first input comprises:
acquiring a pressure signal generated by pressing the screen and a capacitance signal of the touch screen;
determining, according to the defect region information, a first weight value corresponding to the pressure signal and a second weight value corresponding to the capacitance signal;
and performing the first processing on the first input according to the pressure signal and the first weight value, and according to the capacitance signal and the second weight value.
5. The method of claim 3, wherein after determining that the texture image is the second type of texture image, the method further comprises:
performing second processing on the first input.
6. A detection device, the device comprising:
the receiving module is used for receiving a first input of a user and acquiring a texture image corresponding to the first input;
the acquisition module is used for acquiring defect region information from the texture image;
the processing module is used for performing first processing on the first input under the condition that the texture image is determined to be a first-type texture image according to the defect region information;
wherein the first-type texture image is a texture image acquired in the presence of liquid on the screen.
7. The apparatus of claim 6, wherein the defect region information includes an area, a shape, and a number of defect regions;
the processing module is specifically configured to: determine that the texture image is the first-type texture image under the condition that the area, the shape, and the number of the defect regions meet preset conditions.
8. The apparatus of claim 7, wherein the processing module is further configured to:
determine that the texture image is a second-type texture image under the condition that the area, the shape, and the number of the defect regions do not meet the preset conditions;
wherein the second-type texture image is a texture image acquired in the absence of liquid on the screen.
9. The apparatus of claim 6, wherein the processing module is specifically configured to:
acquire a pressure signal generated by pressing the screen and a capacitance signal of the touch screen;
determine, according to the defect region information, a first weight value corresponding to the pressure signal and a second weight value corresponding to the capacitance signal;
and perform the first processing on the first input according to the pressure signal and the first weight value, and according to the capacitance signal and the second weight value.
10. The apparatus of claim 8, wherein the processing module is further configured to:
perform second processing on the first input.
11. An electronic device comprising a processor, a memory, and a program or instructions stored on the memory and executable on the processor, the program or instructions when executed by the processor implementing the steps of the detection method according to any one of claims 1 to 5.
12. A readable storage medium, on which a program or instructions are stored, which when executed by a processor, carry out the steps of the detection method according to any one of claims 1 to 5.
CN202110971626.7A 2021-08-23 2021-08-23 Detection method and device and electronic equipment Pending CN113672123A (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN202110971626.7A CN113672123A (en) 2021-08-23 2021-08-23 Detection method and device and electronic equipment

Publications (1)

Publication Number Publication Date
CN113672123A 2021-11-19

Family

ID=78545218

Family Applications (1)

Application Number Title Priority Date Filing Date
CN202110971626.7A Pending CN113672123A (en) 2021-08-23 2021-08-23 Detection method and device and electronic equipment

Country Status (1)

Country Link
CN (1) CN113672123A (en)

Citations (9)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN103065134A (en) * 2013-01-22 2013-04-24 江苏超创信息软件发展股份有限公司 Fingerprint identification device and method with prompt information
CN105844262A (en) * 2016-04-25 2016-08-10 广东欧珀移动通信有限公司 Method and device for determination of touch position through combination of fingerprint in wet hand operation mode
CN105975044A (en) * 2016-04-25 2016-09-28 广东欧珀移动通信有限公司 Method and device for automatically controlling wet hand mode of touch screen through fingerprint detection
CN106468973A (en) * 2016-08-31 2017-03-01 珠海市魅族科技有限公司 The processing method of touch event and device
CN109478113A (en) * 2016-05-18 2019-03-15 森赛尔股份有限公司 Method for touch input to be detected and confirmed
CN110865728A (en) * 2018-08-27 2020-03-06 苹果公司 Force or touch sensing on mobile devices using capacitance or pressure sensing
CN112099666A (en) * 2020-09-10 2020-12-18 深圳市科航科技发展有限公司 Touch control method, system, terminal and storage medium applied to capacitive screen
CN112416172A (en) * 2020-11-20 2021-02-26 维沃移动通信有限公司 Electronic equipment control method and device and electronic equipment
CN112860105A (en) * 2021-01-28 2021-05-28 维沃移动通信有限公司 Touch position determination method and device, electronic equipment and readable storage medium

Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination