CN111857369A - Method, device, terminal and storage medium for calibrating proximity sensor of mobile terminal - Google Patents

Method, device, terminal and storage medium for calibrating proximity sensor of mobile terminal

Info

Publication number
CN111857369A
CN111857369A
Authority
CN
China
Prior art keywords
proximity sensor
mobile terminal
calibrating
sensor
terminal
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Pending
Application number
CN202010727791.3A
Other languages
Chinese (zh)
Inventor
丛国华
韩冰天
李腾飞
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Beijing ByteDance Network Technology Co Ltd
Original Assignee
Beijing ByteDance Network Technology Co Ltd
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Beijing ByteDance Network Technology Co Ltd filed Critical Beijing ByteDance Network Technology Co Ltd
Priority to CN202010727791.3A priority Critical patent/CN111857369A/en
Publication of CN111857369A publication Critical patent/CN111857369A/en
Pending legal-status Critical Current

Classifications

    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06FELECTRIC DIGITAL DATA PROCESSING
    • G06F3/00Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
    • G06F3/01Input arrangements or combined input and output arrangements for interaction between user and computer
    • G06F3/03Arrangements for converting the position or the displacement of a member into a coded form
    • G06F3/033Pointing devices displaced or positioned by the user, e.g. mice, trackballs, pens or joysticks; Accessories therefor
    • G06F3/0346Pointing devices displaced or positioned by the user, e.g. mice, trackballs, pens or joysticks; Accessories therefor with detection of the device orientation or free movement in a 3D space, e.g. 3D mice, 6-DOF [six degrees of freedom] pointers using gyroscopes, accelerometers or tilt-sensors
    • GPHYSICS
    • G01MEASURING; TESTING
    • G01SRADIO DIRECTION-FINDING; RADIO NAVIGATION; DETERMINING DISTANCE OR VELOCITY BY USE OF RADIO WAVES; LOCATING OR PRESENCE-DETECTING BY USE OF THE REFLECTION OR RERADIATION OF RADIO WAVES; ANALOGOUS ARRANGEMENTS USING OTHER WAVES
    • G01S17/00Systems using the reflection or reradiation of electromagnetic waves other than radio waves, e.g. lidar systems
    • G01S17/02Systems using the reflection of electromagnetic waves other than radio waves
    • G01S17/06Systems determining position data of a target
    • G01S17/08Systems determining position data of a target for measuring distance only
    • GPHYSICS
    • G01MEASURING; TESTING
    • G01SRADIO DIRECTION-FINDING; RADIO NAVIGATION; DETERMINING DISTANCE OR VELOCITY BY USE OF RADIO WAVES; LOCATING OR PRESENCE-DETECTING BY USE OF THE REFLECTION OR RERADIATION OF RADIO WAVES; ANALOGOUS ARRANGEMENTS USING OTHER WAVES
    • G01S7/00Details of systems according to groups G01S13/00, G01S15/00, G01S17/00
    • G01S7/48Details of systems according to groups G01S13/00, G01S15/00, G01S17/00 of systems according to group G01S17/00
    • G01S7/497Means for monitoring or calibrating
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06NCOMPUTING ARRANGEMENTS BASED ON SPECIFIC COMPUTATIONAL MODELS
    • G06N20/00Machine learning
    • GPHYSICS
    • G01MEASURING; TESTING
    • G01SRADIO DIRECTION-FINDING; RADIO NAVIGATION; DETERMINING DISTANCE OR VELOCITY BY USE OF RADIO WAVES; LOCATING OR PRESENCE-DETECTING BY USE OF THE REFLECTION OR RERADIATION OF RADIO WAVES; ANALOGOUS ARRANGEMENTS USING OTHER WAVES
    • G01S7/00Details of systems according to groups G01S13/00, G01S15/00, G01S17/00
    • G01S7/48Details of systems according to groups G01S13/00, G01S15/00, G01S17/00 of systems according to group G01S17/00
    • G01S7/497Means for monitoring or calibrating
    • G01S2007/4975Means for monitoring or calibrating of sensor obstruction by, e.g. dirt- or ice-coating, e.g. by reflection measurement on front-screen

Abstract

The present disclosure relates to the field of computer technologies, and in particular, to a method, an apparatus, a terminal, and a storage medium for calibrating a proximity sensor of a mobile terminal. The method for calibrating the proximity sensor of the mobile terminal provided by the disclosure comprises the following steps: acquiring sensor information detected by a motion sensor of the mobile terminal; determining the posture of the mobile terminal according to the sensor information; when the posture is determined to meet a preset posture condition, acquiring detection information of the proximity sensor; and calibrating the background noise value of the proximity sensor according to the detection information. According to the method provided by the embodiment of the disclosure, when the mobile terminal is determined to be in a preset posture with a low probability of occlusion, the background noise value of the proximity sensor is calibrated according to the current detection information of the proximity sensor, so that the background noise value can be calibrated accurately while the mobile terminal is in use, improving the detection accuracy of the proximity sensor.

Description

Method, device, terminal and storage medium for calibrating proximity sensor of mobile terminal
Technical Field
The present disclosure relates to the field of computer technologies, and in particular, to a method and an apparatus for calibrating a proximity sensor of a mobile terminal, a terminal, and a storage medium.
Background
A proximity sensor (Proximity Sensor) is used for short-range distance measurement. It continuously emits infrared light, and when a nearby object blocks the emitted light, the sensor derives distance information from the energy of the infrared light reflected by the object. A proximity sensor is usually arranged at the top of a mobile phone screen: during a call, when the user's face approaches the screen, the proximity sensor senses the distance between the phone and the face, and the screen can be turned off to prevent accidental touches during the call.
Calibration of the proximity sensor on a mobile phone is generally performed at the factory, and once completed, is rarely repeated during use. However, the background noise value of the proximity sensor is inevitably disturbed while the phone is in use. The background noise value is the detection signal value produced by the proximity sensor when the terminal is not occluded; it arises from diffraction of infrared light inside the terminal. The actual background noise value changes when, for example, a screen protector is applied, the screen is stained, or a collision shifts the position of the proximity sensor. When the background noise value changes, the distance measured by the proximity sensor becomes inaccurate, so the background noise value needs to be calibrated while the phone is in use.
Disclosure of Invention
This summary is provided to introduce a selection of concepts in a simplified form that are further described below in the detailed description. This summary is not intended to identify key features or essential features of the claimed subject matter, nor is it intended to be used to limit the scope of the claimed subject matter.
In a first aspect, the present disclosure provides a method of calibrating a proximity sensor of a mobile terminal, comprising:
acquiring sensor information detected by a motion sensor of the mobile terminal;
determining the posture of the mobile terminal according to the sensor information;
when the posture is determined to meet a preset posture condition, acquiring detection information of the proximity sensor;
and calibrating the background noise value of the proximity sensor according to the detection information.
In a second aspect, the present disclosure provides an apparatus for calibrating a proximity sensor of a mobile terminal, comprising:
a sensor information acquisition unit for acquiring sensor information detected by a motion sensor of the mobile terminal;
the terminal posture determining unit is used for determining the posture of the mobile terminal according to the sensor information;
the detection information determining unit is used for acquiring the detection information of the proximity sensor when the posture is determined to meet the preset posture condition;
and the calibration unit is used for calibrating the background noise value of the proximity sensor according to the detection information.
In a third aspect, the present disclosure provides a terminal, including:
at least one memory and at least one processor;
wherein the memory is configured to store program code, and the processor is configured to call the program code stored in the memory to execute the method for calibrating the proximity sensor of the mobile terminal provided by the embodiments of the present disclosure.
In a fourth aspect, a non-transitory computer storage medium stores program code for performing a method of calibrating a proximity sensor of a mobile terminal provided in accordance with an embodiment of the present disclosure.
According to the method for calibrating the proximity sensor of the mobile terminal provided by the embodiment of the disclosure, when the mobile terminal is determined to be in a preset posture with a low probability of occlusion, the background noise value of the proximity sensor is calibrated according to the detection information of the proximity sensor, so that the background noise value can be calibrated accurately while the mobile terminal is in use, improving the detection accuracy of the proximity sensor.
Drawings
The above and other features, advantages and aspects of various embodiments of the present disclosure will become more apparent by referring to the following detailed description when taken in conjunction with the accompanying drawings. Throughout the drawings, the same or similar reference numbers refer to the same or similar elements. It should be understood that the drawings are schematic and that elements and features are not necessarily drawn to scale.
Fig. 1 is a flowchart of a method of calibrating a proximity sensor of a mobile terminal according to an embodiment of the present disclosure;
fig. 2 is a flowchart of a method of calibrating a proximity sensor of a mobile terminal according to another embodiment of the present disclosure;
FIG. 3 is a flow chart of a manner of training a machine learning model provided in accordance with an embodiment of the present disclosure;
fig. 4 is a schematic structural diagram of an apparatus for calibrating a proximity sensor of a mobile terminal according to an embodiment of the present disclosure;
fig. 5 is a schematic structural diagram of a terminal device for implementing an embodiment of the present disclosure.
Detailed Description
Embodiments of the present disclosure will be described in more detail below with reference to the accompanying drawings. While certain embodiments of the present disclosure are shown in the drawings, it is to be understood that the present disclosure may be embodied in various forms and should not be construed as limited to the embodiments set forth herein, but rather are provided for a more thorough and complete understanding of the present disclosure. It should be understood that the drawings and embodiments of the disclosure are for illustration purposes only and are not intended to limit the scope of the disclosure.
It should be understood that the steps recited in the method embodiments of the present disclosure may be performed in a different order and/or in parallel. Moreover, the method embodiments may include additional steps and/or omit performing some of the illustrated steps. The scope of the present disclosure is not limited in this respect.
The term "include" and variations thereof as used herein are open-ended, i.e., "including but not limited to". The term "based on" is "based, at least in part, on". The term "one embodiment" means "at least one embodiment"; the term "another embodiment" means "at least one additional embodiment"; the term "some embodiments" means "at least some embodiments". Relevant definitions for other terms will be given in the following description.
It should be noted that the terms "first", "second", and the like in the present disclosure are only used for distinguishing different devices, modules or units, and are not used for limiting the order or interdependence relationship of the functions performed by the devices, modules or units.
It is noted that references to "a", "an", and "the" modifications in this disclosure are intended to be illustrative rather than limiting, and that those skilled in the art will recognize that "one or more" may be used unless the context clearly dictates otherwise.
It is noted that reference to "responsive" in this disclosure is intended to mean a condition or state on which an operation performed depends, and that one or more operations performed may be in real time or may have a set delay when the dependent condition or state is satisfied.
For the purposes of this disclosure, the phrase "a and/or B" means (a), (B), or (a and B).
The names of messages or information exchanged between devices in the embodiments of the present disclosure are for illustrative purposes only, and are not intended to limit the scope of the messages or information.
Referring to fig. 1, fig. 1 illustrates a flowchart of a method 100 for calibrating a proximity sensor of a mobile terminal according to an embodiment of the present disclosure. The mobile terminal in the embodiments of the present disclosure may include, but is not limited to, devices such as a mobile phone, a notebook computer, a digital broadcast receiver, a PDA (personal digital assistant), a PAD (tablet), a PMP (portable multimedia player), a smart wearable device, and the like. The method 100 comprises steps S101-S104:
step S101: sensor information detected by a motion sensor of the mobile terminal is acquired.
In the present embodiment, the motion sensor may include, but is not limited to, an acceleration sensor, a gravity sensor, a gyro sensor, a magnetic force sensor, a direction sensor, and the like, which may be used to detect the motion state and the posture of the terminal.
Step S102: and determining the posture of the mobile terminal according to the sensor information.
The current posture of the mobile terminal may be determined from the sensor information detected by the motion sensor using techniques known in the art; this embodiment is not limited in this respect. In some embodiments, corresponding sensor thresholds or conditions may be preset for one or more specific target postures, and when the sensor information detected by the motion sensor satisfies the corresponding threshold or condition, the current target posture of the mobile terminal is determined.
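As a minimal sketch of such a threshold-based check, the following hypothetical function classifies the posture from a single accelerometer reading. The axis convention (y running along the long side of the screen) and the 45-degree threshold are illustrative assumptions, not part of the disclosure:

```python
import math

def estimate_posture(ax, ay, az, threshold_deg=45.0):
    """Classify the terminal posture from one accelerometer reading.

    Assumes the y axis runs along the long side of the screen, so the
    tilt of the long side against the horizontal plane can be read off
    the share of gravity measured on y. Axis convention and threshold
    are illustrative, not taken from the disclosure.
    """
    g = math.sqrt(ax * ax + ay * ay + az * az)
    if g == 0.0:
        return "unknown"  # free fall or invalid reading
    # Angle between the screen's long side and the horizontal plane.
    tilt_deg = math.degrees(math.asin(min(1.0, abs(ay) / g)))
    return "portrait" if tilt_deg > threshold_deg else "other"
```

In practice the threshold and axis mapping would be tuned per device model; the patent leaves the concrete condition open.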
Step S103: when the posture is determined to meet the preset posture condition, acquiring the detection information of the proximity sensor.
The preset posture condition is set such that, when the mobile terminal is in a posture that satisfies it, the probability that the proximity sensor is occluded at close range is low, for example postures in which the screen is upright or the terminal lies flat with the screen facing upwards.
In some implementations, the preset posture condition is the portrait (vertical screen) posture; that is, when the mobile terminal is determined to be in the portrait posture, the detection information of the proximity sensor is acquired. The inventors' experiments show that the proximity sensor is rarely occluded at close range when a user holds the mobile terminal in portrait orientation.
In this embodiment, when the mobile terminal is determined to be in a posture meeting the preset condition, the proximity sensor may perform a single detection to acquire one piece of detection information, or may detect continuously or repeatedly within a preset time period to acquire multiple pieces of detection information.
Step S104: and calibrating the background noise value of the proximity sensor according to the detection information.
In the present embodiment, one or more pieces of detection information may be used to calibrate the background noise value of the proximity sensor, and multiple pieces of detection information may be acquired by repeating steps S101 to S103.
In some embodiments, the detection information obtained in step S103 may be used directly as the calibrated background noise value of the proximity sensor. When two or more pieces of detection information are used for calibration, the calibrated background noise value may be obtained with a data processing method such as averaging or taking the median.
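A sketch of this reduction step, assuming the detection information is available as a list of raw readings (the function name and error handling are illustrative):

```python
import statistics

def calibrate_noise_floor(readings):
    """Reduce one or more proximity readings, taken while the terminal
    was in a low-occlusion posture, to a calibrated background noise
    value. A single reading is used directly; several readings are
    combined with the median to suppress outliers.
    """
    if not readings:
        raise ValueError("no detection information available")
    if len(readings) == 1:
        return readings[0]
    return statistics.median(readings)
```

The median is one of the data processing methods the text names; averaging or fitting could be substituted without changing the surrounding flow.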
Calibrating the background noise value of the proximity sensor requires that the mobile terminal, and in particular the proximity sensor on it, is not occluded. Therefore, according to the method for calibrating the proximity sensor of the mobile terminal provided by the embodiment of the disclosure, when the mobile terminal is determined to be in a preset posture with a low probability of occlusion, the background noise value of the proximity sensor is calibrated according to the detection information of the proximity sensor, so that the background noise value can be calibrated accurately while the mobile terminal is in use, improving the detection accuracy of the proximity sensor.
Referring to fig. 2, fig. 2 shows a flowchart of a method 200 for calibrating a proximity sensor of a mobile terminal provided according to an embodiment of the present disclosure, the method 200 including steps S201 to S205:
step S201: and receiving a user operation event of the mobile terminal.
Step S202: sensor information detected by a motion sensor of the mobile terminal is acquired in response to a user operation event.
Step S203: and determining the posture of the mobile terminal according to the sensor information.
Step S204: when the posture is determined to be the portrait (vertical screen) posture, acquiring current detection information of the proximity sensor.
Step S205: and calibrating the background noise value of the proximity sensor according to the detection information.
A user operation event is an event in which the user operates the mobile terminal, for example triggering a physical key such as the home key or the power key, a screen touch operation, a screen-on operation, or a screen unlocking operation by fingerprint, touch, face recognition, and the like.
According to the method for calibrating the proximity sensor of the mobile terminal provided by the embodiment of the disclosure, a received user operation event is used as the trigger condition for calibrating the background noise value of the proximity sensor. This further ensures that a terminal in the portrait posture is being held and operated by a user, rather than being in that posture by accident, so that the proximity sensor is currently not occluded by other objects and the background noise value can be calibrated accurately.
In some embodiments, the user operation event comprises a screen-on event and/or a screen unlocking event. The screen-on event includes lifting the terminal to wake the screen, touching the screen to wake it, or triggering the power key to light the screen; the screen unlocking event includes unlocking by touch, face recognition, or the power key. The inventors found that when a screen-on event or a screen unlocking event is received, the probability that the proximity sensor is occluded is extremely low. Therefore, further restricting the user operation event to a screen-on event and/or a screen unlocking event ensures that the proximity sensor is not currently occluded by other objects, so that the background noise value of the proximity sensor can be calibrated accurately.
In some cases the terminal screen may be lit without user intervention; for example, a notification or a raise-to-wake feature may light the screen while the terminal is in the user's pocket or bag, in which case the proximity sensor is occluded at close range. For such cases, in some embodiments, step S104 further comprises: if the detection information meets a preset condition, determining that the detection information cannot be used to calibrate the background noise value of the proximity sensor. The preset condition is set such that, when the detection information satisfies it, the probability that the proximity sensor is occluded at close range is high. For example, if the detection information indicates that the proximity sensor is currently occluded at close range, the detection information cannot be used to calibrate the background noise value. Illustratively, the preset condition may be "more than 0.5 cm": if the detection information is 1 cm, it indicates an obstruction 1 cm from the proximity sensor, and the detection information cannot be used to calibrate the background noise value of the proximity sensor.
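The occlusion check above can be sketched as follows, treating the detection information as a distance reading; the 0.5 cm cut-off mirrors the example in the text, and the function name is hypothetical:

```python
def is_usable_for_calibration(detected_distance_cm, cutoff_cm=0.5):
    """Reject a reading that indicates a close-range obstruction.

    A value above the cut-off is taken to mean an object is being
    detected near the sensor, so the reading must not be used as a
    background noise value. The cut-off is illustrative; a real
    device would use a device-specific threshold.
    """
    return detected_distance_cm <= cutoff_cm
```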
In some embodiments, the mobile terminal stores the current background noise value of the proximity sensor in advance, and step S104 further includes: comparing the detection information with the current background noise value; and calibrating the background noise value of the proximity sensor according to the comparison result. Optionally, when the detection information differs from the current background noise value, or each of multiple acquired pieces of detection information differs from it, or the difference between the detection information and the current background noise value exceeds a preset range, the detection information is used as the calibrated background noise value, so that the background noise value can be calibrated more accurately.
Further, in some embodiments, the calibrated background noise value may be stored in the mobile terminal as the new current background noise value for use in the next calibration.
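Combining the comparison and storage steps, a minimal sketch (the tolerance parameter is an assumption standing in for the "preset range" in the text):

```python
def update_noise_floor(stored_value, detected_value, tolerance=0.0):
    """Return the value to store as the current background noise value.

    The stored value is replaced only when the new reading differs
    from it by more than `tolerance`; otherwise the existing value
    is kept unchanged for the next calibration round.
    """
    if abs(detected_value - stored_value) > tolerance:
        return detected_value
    return stored_value
```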
In some embodiments, step S102 includes:
sensor information is input to a pre-trained machine learning model to determine the pose of the mobile terminal.
The sensor information measured by the motion sensor differs somewhat each time the same person holds the mobile terminal in the same posture, and differs between different users in the same posture. Given human anatomy and common usage habits, the difference stays within a certain range, but that range is hard to express with fixed values or conditions. Therefore, according to the method for calibrating the proximity sensor of the mobile terminal provided by the embodiment of the disclosure, a machine learning approach is used to identify the boundaries between different terminal postures, giving the posture detection higher accuracy.
Referring to fig. 3, fig. 3 is a flowchart illustrating a training method 300 of a machine learning model provided by an embodiment of the present disclosure, including steps S301 to S302:
step S301: acquiring multiple groups of training data, wherein each group of training data comprises two or more pieces of test information detected by a motion sensor of a test terminal while the test terminal is in a preset terminal posture;
step S302: and training the machine learning model by taking the test information included in the multiple groups of training data as input and taking the terminal posture corresponding to the test information as expected output.
In this embodiment, each group of training data used to train the machine learning model includes two or more pieces of test information, so that interference from abnormal test information can be eliminated and the recognition accuracy of the machine learning model improved.
In some embodiments, the portrait posture may be defined as the angle between the horizontal plane and the line along the long side of the mobile terminal's screen exceeding 45 degrees. A tester can repeatedly hold the test terminal in the portrait or landscape posture, record the terminal posture, and store one or more pieces of sensing data detected by the motion sensor of the test terminal in that posture, thereby obtaining multiple groups of training data. To average out individual differences between testers as much as possible, training data generated by multiple testers may be used.
In some embodiments, the portrait posture may instead be defined as the angle between the line along the long side of the screen and the line connecting the user's eyes exceeding 45 degrees. A tester can repeatedly hold the test terminal vertically or horizontally while standing, lying flat, lying on one side, and so on, record the terminal posture, and store one or more pieces of sensing data detected by the motion sensor of the test terminal in that posture, thereby obtaining multiple groups of training data.
In some embodiments, the machine learning model may be trained with training data corresponding to the portrait posture as positive examples and training data corresponding to the landscape posture as negative examples; the trained machine learning model can then be used to determine whether the mobile terminal is in the portrait posture.
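As an illustration of training a binary classifier on such positive and negative examples, the sketch below uses a plain perceptron on hand-made accelerometer-style features. The disclosure does not specify the model, so this is a stand-in for whatever classifier (for example, a small neural network) is actually used:

```python
def train_perceptron(examples, labels, epochs=20, lr=0.1):
    """Train a minimal perceptron on motion-sensor feature vectors.

    Portrait groups (label 1) serve as positive examples, landscape
    groups (label 0) as negative ones. Returns a predict() closure.
    Model choice, features, and hyperparameters are illustrative.
    """
    n = len(examples[0])
    w, b = [0.0] * n, 0.0

    def predict_with(weights, bias, x):
        return 1 if sum(wi * xi for wi, xi in zip(weights, x)) + bias > 0 else 0

    for _ in range(epochs):
        for x, y in zip(examples, labels):
            err = y - predict_with(w, b, x)
            if err:  # classic perceptron update on misclassification
                w = [wi + lr * err * xi for wi, xi in zip(w, x)]
                b += lr * err

    def predict(x):
        return predict_with(w, b, x)
    return predict
```

With accelerometer readings as features, portrait examples carry most of gravity on the screen's long axis, which is exactly the kind of boundary such a classifier can learn.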
In some embodiments, the test information includes values in at least two directions, and the training method 300 further includes: merging the test information in each group of training data to obtain processed training data, in which the values of the test information corresponding to the same direction form an array in chronological order. The merging may be applied to all of the test information in each group, or to only part of it; for example, missing and abnormal values may first be deleted from the training data and the remaining test information merged. In this embodiment, the processed training data are used as the input for training the machine learning model.
In the embodiments of the present disclosure, the coordinate data include, but are not limited to, plane rectangular coordinate data and spatial rectangular coordinate data, consisting of the coordinate values on the corresponding coordinate axes. Illustratively, a group of training data may include test information in the form of spatial rectangular coordinates, such as (x1, y1, z1), (x2, y2, z2), …, (xn, yn, zn) with n ≥ 2. Merging these n coordinate samples yields the processed training data (x1, x2, …, xn, y1, y2, …, yn, z1, z2, …, zn), in which the values corresponding to the same direction (e.g., x1, x2, …, xn) appear in chronological order; that is, the training data are rearranged from [xyz, xyz, xyz, …] to [x, x, x, …, y, y, y, …, z, z, z, …]. Arranging the values of each direction chronologically makes it easier for the machine learning model to learn which positions (coordinate axes) of the data drive changes in the classification result, improving training efficiency and reducing the sample size required for training.
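The rearrangement from [xyz, xyz, xyz, …] to [x, x, x, …, y, y, y, …, z, z, z, …] can be sketched as follows (the function name is illustrative):

```python
def merge_training_group(samples):
    """Flatten a time-ordered group of (x, y, z) readings into per-axis
    runs: [x1..xn, y1..yn, z1..zn]. Works for any number of axes,
    matching the merging step described in the text.
    """
    if not samples:
        return []
    axes = zip(*samples)  # one tuple per coordinate axis, time-ordered
    return [value for axis in axes for value in axis]
```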
In some embodiments, the machine learning model is a neural network model. Because neural network models fit classification functions well, using a neural network model as the machine learning model gives the trained model higher classification accuracy.
In some embodiments, step S104 further comprises: calibrating the background noise value of the proximity sensor according to two or more pieces of detection information. In this embodiment, the acquired pieces of detection information may be processed with a data processing method such as averaging, taking the median, or data fitting, so that the background noise value can be calibrated more accurately.
As shown in fig. 4, an embodiment of the present disclosure provides an apparatus 400 for calibrating a proximity sensor of a mobile terminal, including: a sensor information acquisition unit 401, a terminal posture determination unit 402, a detection information determination unit 403, and a calibration unit 404, in which:
a sensor information acquiring unit 401 for acquiring sensor information detected by a motion sensor of the mobile terminal;
a terminal posture determining unit 402, configured to determine a posture of the mobile terminal according to the sensor information;
a detection information determination unit 403 for acquiring detection information of the proximity sensor when it is determined that the posture conforms to a preset posture condition;
and a calibration unit 404 for calibrating the background noise value of the proximity sensor according to the detection information.
Since the apparatus embodiments substantially correspond to the method embodiments, reference may be made to the description of the method embodiments for relevant details. The apparatus embodiments described above are merely illustrative; modules described as separate may or may not be physically separate. Some or all of the modules may be selected according to actual needs to achieve the purpose of the solution of this embodiment, which those of ordinary skill in the art can understand and implement without inventive effort.
In some embodiments, the detection apparatus 400 further includes a receiving unit for receiving a user operation event of the mobile terminal; the sensor information acquiring unit 401 is further configured to acquire sensor information detected by a motion sensor of the mobile terminal in response to the user operation event.
In some embodiments, the user operation event comprises a screen-lighting event and/or a screen-unlocking event. The screen-lighting event includes lifting the terminal to light the screen, touching the screen to light it, or pressing the power key to light the screen; the screen-unlocking event includes unlocking by touch, by face recognition, or by the power key. The inventors found that when a screen-lighting or screen-unlocking event is received, the probability that the proximity sensor is occluded is extremely low. Therefore, restricting the user operation event to a screen-lighting event and/or a screen-unlocking event helps ensure that the proximity sensor is not currently blocked by another object, so that the noise floor value of the proximity sensor can be calibrated accurately.
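The event-gated triggering can be sketched as a simple dispatcher; the event identifiers below are hypothetical names, not platform constants:

```python
# Hypothetical event identifiers for the two trigger types the
# disclosure singles out as "sensor very unlikely to be covered".
TRIGGER_EVENTS = {"screen_on", "screen_unlock"}

def on_user_event(event, start_calibration):
    """Run the calibration flow only for screen-on / screen-unlock
    events; ignore all other user operation events."""
    if event in TRIGGER_EVENTS:
        start_calibration()
        return True
    return False
```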
In some cases, the terminal screen may be lit by a non-human factor; for example, an information push or a raise-to-wake trigger may light the screen while the terminal is in the user's pocket or bag, in which case the proximity sensor is occluded at close range. For such cases, in some embodiments, the calibration unit 404 is further configured to determine that the detection information cannot be used to calibrate the noise floor value of the proximity sensor if the detection information meets a preset condition. The preset condition is set such that, when the detection information meets it, the probability that the proximity sensor is occluded at close range is high. For example, if it is determined from the detection information that the proximity sensor is currently occluded at close range, the detection information cannot be used to calibrate the noise floor of the proximity sensor. Illustratively, the preset condition may be "more than 0.5 cm": if the detection information is 1 cm, it indicates that an object is blocking the proximity sensor at a distance of 1 cm, and the detection information cannot be used to calibrate the noise floor of the proximity sensor.
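The rejection rule in the 0.5 cm example can be sketched as a one-line filter; the threshold value and the assumption that the reading is a centimeter distance are taken from the illustrative example, not from any normative part of the disclosure:

```python
# Hypothetical threshold, taken from the "more than 0.5 cm" example.
OCCLUSION_THRESHOLD_CM = 0.5

def is_usable_for_calibration(detection_cm):
    """Per the example, a reading above the threshold indicates an object
    blocking the sensor at close range (e.g. the screen lit up inside a
    pocket), so the reading is rejected for noise-floor calibration."""
    return detection_cm <= OCCLUSION_THRESHOLD_CM
```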
In some embodiments, the mobile terminal stores a current background noise value of the proximity sensor in advance; the calibration unit 404 is further configured to compare the detection information with the current noise floor value; and calibrating the background noise value of the proximity sensor according to the comparison result. Optionally, when the detection information is different from the current background noise value, or the obtained multiple pieces of detection information are different from the current background noise value, or a difference between the detection information and the current background noise value is greater than a preset range, the detection information is used as the background noise value after the proximity sensor is calibrated, so that the background noise value can be calibrated more accurately.
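The comparison-based update described above can be sketched as follows; the tolerance value is a hypothetical stand-in for the "preset range", and the update rule (replace only when the difference is large enough) follows the optional variant in the paragraph:

```python
def update_noise_floor(stored, detection, tolerance=0.2):
    """Compare a new detection reading with the noise-floor value stored
    on the terminal and recalibrate only when the difference exceeds a
    preset range (tolerance is an assumed value)."""
    if stored is None or abs(detection - stored) > tolerance:
        return detection   # adopt the reading as the new noise floor
    return stored          # keep the previously stored value
```

The returned value would then be written back as the new current noise-floor value, matching the storage-unit embodiment described later.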
In some embodiments, the calibration unit 404 is further configured to calibrate the noise floor value of the proximity sensor according to two or more pieces of detection information. In this embodiment, the multiple pieces of acquired detection information may be processed with data processing methods such as averaging, taking the median, or data fitting, so that the noise floor value can be calibrated more accurately.
Further, in some embodiments, the apparatus 400 further includes a storage unit configured to store the calibrated noise floor value as a new current noise floor value in the mobile terminal for next calibration of the noise floor value.
In some embodiments, the terminal pose determination unit 402 is configured to input sensor information to a pre-trained machine learning model to determine a pose at which the mobile terminal is located.
Although the sensor information measured by the motion sensor differs somewhat each time the same person holds the mobile terminal in the same posture, or when different persons use the mobile terminal in the same posture, human anatomy and common usage habits keep the difference within a certain range, and that range is not easily expressed as a fixed numerical value or condition. Therefore, the method for calibrating the proximity sensor of the mobile terminal provided by the embodiments of the present disclosure uses machine learning to identify the boundaries between different terminal postures, so that terminal posture detection can achieve higher accuracy.
According to one or more embodiments of the present disclosure, there is provided a training apparatus of a machine learning model, including:
the training data acquisition unit is used for acquiring a plurality of groups of training data, wherein each group of training data comprises more than two pieces of test information detected by a motion sensor of the test terminal when the test terminal is in a preset terminal posture;
and the training unit is used for training the machine learning model by taking the test information included in the plurality of groups of training data as input and taking the terminal posture corresponding to the test information as expected output.
In some embodiments, the portrait (vertical screen) posture may be defined as the angle between the horizontal plane and the straight line on which the long side of the mobile terminal's screen lies exceeding 45 degrees. A tester can repeatedly hold the test terminal in the portrait or landscape posture, record the terminal posture, and store one or more pieces of sensing data detected by the motion sensor of the test terminal in that posture, thereby obtaining multiple sets of training data. To eliminate individual differences among testers as much as possible, training data generated by multiple testers may be used.
In some embodiments, the portrait posture may instead be defined as the angle between the straight line on which the long side of the screen lies and the line connecting the user's eyes exceeding 45 degrees. A tester can repeatedly hold the test terminal vertically or horizontally while standing, lying flat, lying on one side, and so on, record the terminal posture, and store one or more pieces of sensing data detected by the motion sensor of the test terminal in that posture, thereby obtaining multiple sets of training data.
In some embodiments, the machine learning model may be trained with training data corresponding to the vertical screen gesture as a positive example and training data corresponding to the horizontal screen gesture as a negative example, and the trained machine learning model may be used to determine whether the mobile terminal is in the vertical screen gesture.
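The positive/negative training scheme above can be sketched with a tiny perceptron standing in for the neural network the disclosure actually contemplates; the feature vectors (accelerometer-like readings), label encoding, learning rate, and epoch count are all assumptions made for illustration:

```python
def train_posture_classifier(samples, labels, epochs=50, lr=0.1):
    """Train a minimal perceptron: portrait samples labelled 1 (positive
    examples), landscape samples labelled 0 (negative examples). Each
    sample is a flat feature vector of motion-sensor values."""
    dim = len(samples[0])
    w, b = [0.0] * dim, 0.0
    for _ in range(epochs):
        for x, y in zip(samples, labels):
            pred = 1 if sum(wi * xi for wi, xi in zip(w, x)) + b > 0 else 0
            err = y - pred
            w = [wi + lr * err * xi for wi, xi in zip(w, x)]
            b += lr * err
    return w, b

def predict(w, b, x):
    """Return 1 for portrait, 0 for landscape."""
    return 1 if sum(wi * xi for wi, xi in zip(w, x)) + b > 0 else 0
```

With gravity-dominated accelerometer readings such as (0, 9.8, 0) for portrait and (9.8, 0, 0) for landscape, even this linear model separates the two classes; the disclosure's neural network would additionally handle the non-linear boundaries discussed above.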
In some embodiments, the test information includes values in at least two directions; the training device further comprises: and the training data processing unit is used for carrying out merging processing on the test information included in each group of training data to obtain processed training data, wherein numerical values corresponding to the same direction of the test information included in the training data generate arrays in the processed training data according to a time sequence. In this embodiment, the merging process may be performed on all the test information included in each set of training data, or the merging process may be performed on only part of the test information included in each set of training data, for example, after deleting missing values and abnormal values in the training data, the merging process may be performed on the remaining test information. In this embodiment, the processed training data is used to train the machine learning model as an input to the machine learning model.
In the embodiments of the present disclosure, the coordinate data includes, but is not limited to, plane rectangular coordinate data, spatial rectangular coordinate data, and the like, and consists of coordinate values corresponding to coordinate axes. Illustratively, a set of training data may include test information in the form of spatial rectangular coordinate data such as (x1, y1, z1), (x2, y2, z2), …, (xn, yn, zn), where n ≥ 2. According to an embodiment of the present disclosure, the n spatial rectangular coordinate data are merged to obtain processed training data (x1, x2, …, xn, y1, y2, …, yn, z1, z2, …, zn), in which the values corresponding to the same direction (e.g., x1, x2, …, xn) are arranged chronologically; that is, the training data are transformed from [xyz, xyz, xyz, …] into [x, x, x, …, y, y, y, …, z, z, z, …]. Arranging the values of each direction chronologically in an array makes it easier for the machine learning model to learn which data changes at which positions (coordinate axes) produce changes in the classification result, thereby improving the training efficiency of the machine learning model and reducing the sample size required for training.
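The [xyz, xyz, ...] to [x..., y..., z...] merging step can be sketched as a simple per-axis regrouping (the function name is hypothetical):

```python
def merge_training_data(samples):
    """Reorder n (x, y, z) readings from interleaved form
    [xyz, xyz, ...] into axis-grouped form [x1..xn, y1..yn, z1..zn],
    keeping each axis's values in chronological order."""
    merged = []
    for axis in range(len(samples[0])):
        merged.extend(s[axis] for s in samples)
    return merged

merge_training_data([(1, 4, 7), (2, 5, 8), (3, 6, 9)])
# → [1, 2, 3, 4, 5, 6, 7, 8, 9]
```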
In some embodiments, the machine learning model is a neural network model. Because the neural network model has better fitting performance to the classification function, the neural network model is adopted as the machine learning model in the embodiment of the disclosure, and the trained machine learning model has higher classification accuracy.
Referring now to fig. 5, a block diagram of a terminal device 600 suitable for use in implementing embodiments of the present disclosure is shown. The terminal device in the embodiments of the present disclosure may include, but is not limited to, a mobile terminal such as a mobile phone, a notebook computer, a digital broadcast receiver, a PDA (personal digital assistant), a PAD (tablet computer), a PMP (portable multimedia player), a vehicle terminal (e.g., a car navigation terminal), and the like, and a stationary terminal such as a digital TV, a desktop computer, and the like. The terminal device shown in fig. 5 is only an example, and should not bring any limitation to the functions and the scope of use of the embodiments of the present disclosure.
As shown in fig. 5, the terminal device 600 may include a processing means (e.g., a central processing unit, a graphic processor, etc.) 601 that may perform various appropriate actions and processes according to a program stored in a Read Only Memory (ROM)602 or a program loaded from a storage means 608 into a Random Access Memory (RAM) 603. In the RAM 603, various programs and data necessary for the operation of the terminal apparatus 600 are also stored. The processing device 601, the ROM 602, and the RAM 603 are connected to each other via a bus 604. An input/output (I/O) interface 605 is also connected to bus 604.
Generally, the following devices may be connected to the I/O interface 605: input devices 606 including, for example, a touch screen, touch pad, keyboard, mouse, camera, microphone, accelerometer, gyroscope, etc.; output devices 607 including, for example, a Liquid Crystal Display (LCD), a speaker, a vibrator, and the like; storage 608 including, for example, tape, hard disk, etc.; and a communication device 609. The communication means 609 may allow the terminal device 600 to perform wireless or wired communication with other devices to exchange data. While fig. 5 illustrates a terminal apparatus 600 having various means, it is to be understood that not all illustrated means are required to be implemented or provided. More or fewer devices may alternatively be implemented or provided.
In particular, according to an embodiment of the present disclosure, the processes described above with reference to the flowcharts may be implemented as computer software programs. For example, embodiments of the present disclosure include a computer program product comprising a computer program carried on a non-transitory computer readable medium, the computer program containing program code for performing the method illustrated by the flow chart. In such an embodiment, the computer program may be downloaded and installed from a network via the communication means 609, or may be installed from the storage means 608, or may be installed from the ROM 602. The computer program, when executed by the processing device 601, performs the above-described functions defined in the methods of the embodiments of the present disclosure.
It should be noted that the computer readable medium in the present disclosure can be a computer readable signal medium or a computer readable storage medium or any combination of the two. A computer readable storage medium may be, for example, but not limited to, an electronic, magnetic, optical, electromagnetic, infrared, or semiconductor system, apparatus, or device, or any combination of the foregoing. More specific examples of the computer readable storage medium may include, but are not limited to: an electrical connection having one or more wires, a portable computer diskette, a hard disk, a Random Access Memory (RAM), a read-only memory (ROM), an erasable programmable read-only memory (EPROM or flash memory), an optical fiber, a portable compact disc read-only memory (CD-ROM), an optical storage device, a magnetic storage device, or any suitable combination of the foregoing. In the present disclosure, a computer readable storage medium may be any tangible medium that can contain, or store a program for use by or in connection with an instruction execution system, apparatus, or device. In contrast, in the present disclosure, a computer readable signal medium may comprise a propagated data signal with computer readable program code embodied therein, either in baseband or as part of a carrier wave. Such a propagated data signal may take many forms, including, but not limited to, electro-magnetic, optical, or any suitable combination thereof. A computer readable signal medium may also be any computer readable medium that is not a computer readable storage medium and that can communicate, propagate, or transport a program for use by or in connection with an instruction execution system, apparatus, or device. Program code embodied on a computer readable medium may be transmitted using any appropriate medium, including but not limited to: electrical wires, optical cables, RF (radio frequency), etc., or any suitable combination of the foregoing.
In some embodiments, the clients and servers may communicate using any currently known or future developed network protocol, such as HTTP (HyperText Transfer Protocol), and may be interconnected by any form or medium of digital data communication (e.g., a communication network). Examples of communication networks include a local area network ("LAN"), a wide area network ("WAN"), an internetwork (e.g., the Internet), and peer-to-peer networks (e.g., ad hoc peer-to-peer networks), as well as any currently known or future developed network.
The computer readable medium may be included in the terminal device; or may exist separately without being assembled into the terminal device.
The computer readable medium carries one or more programs which, when executed by the terminal device, cause the terminal device to: acquiring sensor information detected by a motion sensor of the mobile terminal; determining the posture of the mobile terminal according to the sensor information; when the gesture is determined to accord with a preset gesture condition, acquiring detection information of the proximity sensor; and calibrating the background noise value of the proximity sensor according to the detection information.
Alternatively, the computer readable medium carries one or more programs which, when executed by the terminal device, cause the terminal device to: acquiring a plurality of groups of training data, wherein each group of training data comprises more than two pieces of test information detected by a motion sensor of a test terminal when the test terminal is in a preset terminal posture; and taking the test information included in the multiple groups of training data as input and the terminal posture corresponding to the test information as expected output, and training a machine learning model.
Computer program code for carrying out operations for the present disclosure may be written in any combination of one or more programming languages, including but not limited to an object oriented programming language such as Java, Smalltalk, C++, and conventional procedural programming languages, such as the "C" programming language or similar programming languages. The program code may execute entirely on the user's computer, partly on the user's computer, as a stand-alone software package, partly on the user's computer and partly on a remote computer or entirely on the remote computer or server. In the case of a remote computer, the remote computer may be connected to the user's computer through any type of network, including a Local Area Network (LAN) or a Wide Area Network (WAN), or the connection may be made to an external computer (for example, through the Internet using an Internet service provider).
The flowchart and block diagrams in the figures illustrate the architecture, functionality, and operation of possible implementations of systems, methods and computer program products according to various embodiments of the present disclosure. In this regard, each block in the flowchart or block diagrams may represent a module, segment, or portion of code, which comprises one or more executable instructions for implementing the specified logical function(s). It should also be noted that, in some alternative implementations, the functions noted in the block may occur out of the order noted in the figures. For example, two blocks shown in succession may, in fact, be executed substantially concurrently, or the blocks may sometimes be executed in the reverse order, depending upon the functionality involved. It will also be noted that each block of the block diagrams and/or flowchart illustration, and combinations of blocks in the block diagrams and/or flowchart illustration, can be implemented by special purpose hardware-based systems which perform the specified functions or acts, or combinations of special purpose hardware and computer instructions.
The units described in the embodiments of the present disclosure may be implemented by software or hardware. Here, the name of a unit does not constitute a limitation of the unit itself in some cases, and for example, the sensor information acquisition unit may also be described as a "unit for acquiring sensor information detected by a motion sensor of the mobile terminal".
The functions described herein above may be performed, at least in part, by one or more hardware logic components. For example, without limitation, exemplary types of hardware logic components that may be used include: field Programmable Gate Arrays (FPGAs), Application Specific Integrated Circuits (ASICs), Application Specific Standard Products (ASSPs), systems on a chip (SOCs), Complex Programmable Logic Devices (CPLDs), and the like.
In the context of this disclosure, a machine-readable medium may be a tangible medium that can contain, or store a program for use by or in connection with an instruction execution system, apparatus, or device. The machine-readable medium may be a machine-readable signal medium or a machine-readable storage medium. A machine-readable medium may include, but is not limited to, an electronic, magnetic, optical, electromagnetic, infrared, or semiconductor system, apparatus, or device, or any suitable combination of the foregoing. More specific examples of a machine-readable storage medium would include an electrical connection based on one or more wires, a portable computer diskette, a hard disk, a Random Access Memory (RAM), a read-only memory (ROM), an erasable programmable read-only memory (EPROM or flash memory), an optical fiber, a portable compact disc read-only memory (CD-ROM), an optical storage device, a magnetic storage device, or any suitable combination of the foregoing.
In accordance with one or more embodiments of the present disclosure, there is provided a method of calibrating a proximity sensor of a mobile terminal, including: acquiring sensor information detected by a motion sensor of the mobile terminal; determining the posture of the mobile terminal according to the sensor information; when the gesture is determined to accord with a preset gesture condition, acquiring detection information of the proximity sensor; and calibrating the background noise value of the proximity sensor according to the detection information.
According to one or more embodiments of the present disclosure, further comprising: receiving a user operation event of the mobile terminal; the acquiring of the sensor information detected by the motion sensor of the mobile terminal includes: and responding to the user operation event, and acquiring sensor information detected by a motion sensor of the mobile terminal.
According to one or more embodiments of the present disclosure, the user operation event includes a screen-lighting event and/or a screen-unlocking event.
According to one or more embodiments of the present disclosure, the calibrating the noise floor value of the proximity sensor according to the detection information includes: and if the detection information meets a preset condition, determining that the detection information cannot be used for calibrating the background noise value of the proximity sensor.
According to one or more embodiments of the present disclosure, the mobile terminal stores a current background noise value of the proximity sensor in advance; the calibrating the background noise value of the proximity sensor according to the detection information includes: comparing the detection information with the current background noise value; and calibrating the background noise value of the proximity sensor according to the comparison result.
According to one or more embodiments of the present disclosure, the determining the gesture of the mobile terminal according to the sensor information includes: inputting the sensor information to a pre-trained machine learning model to determine a pose in which the mobile terminal is located.
According to one or more embodiments of the present disclosure, the training mode of the machine learning model includes: acquiring a plurality of groups of training data, wherein each group of training data comprises more than two pieces of test information detected by a motion sensor of a test terminal when the test terminal is in a preset terminal posture; and taking the test information included in the multiple groups of training data as input and the terminal posture corresponding to the test information as expected output, and training a machine learning model.
According to one or more embodiments of the present disclosure, the test information includes values in at least two directions; the training mode of the machine learning model further comprises the following steps: and combining the test information included in each group of the training data to obtain processed training data, wherein numerical values corresponding to the same direction of the test information included in the training data generate arrays in the processed training data according to a time sequence.
According to one or more embodiments of the present disclosure, the machine learning model is a neural network model.
According to one or more embodiments of the present disclosure, the calibrating the noise floor value of the proximity sensor according to the detection information includes: calibrating the background noise value of the proximity sensor according to two or more pieces of detection information.
According to one or more embodiments of the present disclosure, there is provided an apparatus for calibrating a proximity sensor of a mobile terminal, including: a sensor information acquisition unit for acquiring sensor information detected by a motion sensor of the mobile terminal; the terminal posture determining unit is used for determining the posture of the mobile terminal according to the sensor information; the detection information determining unit is used for acquiring the detection information of the proximity sensor when the gesture is determined to accord with the preset gesture condition; and the calibration unit is used for calibrating the background noise value of the proximity sensor according to the detection information.
According to one or more embodiments of the present disclosure, there is provided a terminal including: at least one memory and at least one processor; wherein the memory is configured to store program code and the processor is configured to invoke the program code stored by the memory to perform a method of calibrating a proximity sensor of a mobile terminal provided in accordance with one or more embodiments of the present disclosure.
According to one or more embodiments of the present disclosure, there is provided a non-transitory computer storage medium storing program code for executing a method of calibrating a proximity sensor of a mobile terminal provided according to one or more embodiments of the present disclosure.
The foregoing description is merely a description of preferred embodiments of the present disclosure and of the technical principles employed. It will be appreciated by those skilled in the art that the scope of the disclosure is not limited to technical solutions formed by the particular combination of features described above, and also covers other technical solutions formed by any combination of the above features or their equivalents without departing from the concept of the disclosure, for example, technical solutions formed by replacing the above features with (but not limited to) features with similar functions disclosed in the present disclosure.
Further, while operations are depicted in a particular order, this should not be understood as requiring that such operations be performed in the particular order shown or in sequential order. Under certain circumstances, multitasking and parallel processing may be advantageous. Likewise, while several specific implementation details are included in the above discussion, these should not be construed as limitations on the scope of the disclosure. Certain features that are described in the context of separate embodiments can also be implemented in combination in a single embodiment. Conversely, various features that are described in the context of a single embodiment can also be implemented in multiple embodiments separately or in any suitable subcombination.
Although the subject matter has been described in language specific to structural features and/or methodological acts, it is to be understood that the subject matter defined in the appended claims is not necessarily limited to the specific features or acts described above. Rather, the specific features and acts described above are disclosed as example forms of implementing the claims.

Claims (13)

1. A method of calibrating a proximity sensor of a mobile terminal, comprising:
acquiring sensor information detected by a motion sensor of the mobile terminal;
determining the posture of the mobile terminal according to the sensor information;
when the gesture is determined to accord with a preset gesture condition, acquiring detection information of the proximity sensor;
and calibrating the background noise value of the proximity sensor according to the detection information.
2. The method of calibrating a proximity sensor of a mobile terminal of claim 1, further comprising:
receiving a user operation event of the mobile terminal;
the acquiring of the sensor information detected by the motion sensor of the mobile terminal includes: and responding to the user operation event, and acquiring sensor information detected by a motion sensor of the mobile terminal.
3. The method of calibrating a proximity sensor of a mobile terminal of claim 2,
the user operation event comprises a screen-lighting event and/or a screen-unlocking event.
4. The method of calibrating a proximity sensor of a mobile terminal according to claim 1, wherein said calibrating a noise floor value of the proximity sensor based on the detection information comprises:
and if the detection information meets a preset condition, determining that the detection information cannot be used for calibrating the background noise value of the proximity sensor.
5. The method of calibrating a proximity sensor of a mobile terminal of claim 1,
the mobile terminal stores a current background noise value of the proximity sensor in advance;
the calibrating the background noise value of the proximity sensor according to the detection information includes: comparing the detection information with the current background noise value; and calibrating the background noise value of the proximity sensor according to the comparison result.
6. The method of calibrating a proximity sensor of a mobile terminal of claim 1, wherein said determining an attitude at which the mobile terminal is located based on the sensor information comprises:
inputting the sensor information to a pre-trained machine learning model to determine a pose in which the mobile terminal is located.
7. The method of calibrating a proximity sensor of a mobile terminal of claim 6, wherein the machine learning model is trained in a manner comprising:
acquiring a plurality of groups of training data, wherein each group of training data comprises more than two pieces of test information detected by a motion sensor of a test terminal when the test terminal is in a preset terminal posture;
and taking the test information included in the multiple groups of training data as input and the terminal posture corresponding to the test information as expected output, and training a machine learning model.
8. The method of calibrating a proximity sensor of a mobile terminal of claim 7,
the test information comprises values in at least two directions;
the training mode of the machine learning model further comprises the following steps:
and combining the test information included in each group of the training data to obtain processed training data, wherein numerical values corresponding to the same direction of the test information included in the training data generate arrays in the processed training data according to a time sequence.
9. The method of calibrating a proximity sensor of a mobile terminal of claim 6,
the machine learning model is a neural network model.
10. The method of calibrating a proximity sensor of a mobile terminal according to claim 1, wherein said calibrating a noise floor value of the proximity sensor based on the detection information comprises:
and calibrating the background noise value of the proximity sensor according to two or more pieces of detection information.
11. An apparatus for calibrating a proximity sensor of a mobile terminal, comprising:
a sensor information acquisition unit configured to acquire sensor information detected by a motion sensor of the mobile terminal;
a terminal posture determining unit configured to determine the posture of the mobile terminal according to the sensor information;
a detection information determining unit configured to acquire detection information of the proximity sensor when the posture is determined to meet a preset posture condition; and
a calibration unit configured to calibrate the noise floor value of the proximity sensor according to the detection information.
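The four units of claim 11 can be wired together as in the hypothetical sketch below; every collaborator interface here (`read()`, `sample()`, the pose labels) is an assumption made for illustration, not something the patent specifies.

```python
class ProximityCalibrator:
    """Hypothetical composition of claim 11's four units."""

    def __init__(self, motion_sensor, pose_model, proximity_sensor, preset_poses):
        self.motion_sensor = motion_sensor        # sensor information acquisition unit
        self.pose_model = pose_model              # terminal posture determining unit
        self.proximity_sensor = proximity_sensor  # detection information determining unit
        self.preset_poses = preset_poses          # preset posture condition
        self.noise_floor = None                   # value maintained by the calibration unit

    def try_calibrate(self):
        """Run one calibration attempt; return True if calibration happened."""
        info = self.motion_sensor.read()
        pose = self.pose_model(info)
        if pose not in self.preset_poses:
            return False  # posture fails the preset condition; skip calibration
        readings = self.proximity_sensor.sample(2)
        self.noise_floor = sum(readings) / len(readings)
        return True
```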
12. A terminal, characterized in that the terminal comprises:
at least one memory and at least one processor;
wherein the memory is configured to store program code and the processor is configured to invoke the program code stored by the memory to perform the method of any of claims 1 to 10.
13. A non-transitory computer storage medium, characterized in that it stores program code for executing the method of any one of claims 1 to 10.
CN202010727791.3A 2020-07-23 2020-07-23 Method, device, terminal and storage medium for calibrating proximity sensor of mobile terminal Pending CN111857369A (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN202010727791.3A CN111857369A (en) 2020-07-23 2020-07-23 Method, device, terminal and storage medium for calibrating proximity sensor of mobile terminal


Publications (1)

Publication Number Publication Date
CN111857369A true CN111857369A (en) 2020-10-30

Family

ID=72950227

Family Applications (1)

Application Number Title Priority Date Filing Date
CN202010727791.3A Pending CN111857369A (en) 2020-07-23 2020-07-23 Method, device, terminal and storage medium for calibrating proximity sensor of mobile terminal

Country Status (1)

Country Link
CN (1) CN111857369A (en)

Cited By (2)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN116734903A * 2022-10-20 2023-09-12 Honor Device Co., Ltd. Test method and device
CN117665958A * 2024-01-31 2024-03-08 Honor Device Co., Ltd. Calibration method of proximity light sensor and electronic equipment

Citations (6)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
JP2002211312A (en) * 2001-01-15 2002-07-31 Mazda Motor Corp Lighting system for vehicle
JP2003064921A (en) * 2001-06-18 2003-03-05 Roorando Products:Kk Vehicle locking/unlocking state determining method and control method using the same, as well as vehicle locking /unlocking state determining device and control device using the same
CN106094055A * 2016-06-21 2016-11-09 Guangdong OPPO Mobile Telecommunications Corp., Ltd. Calibration method of a proximity sensor and terminal
CN106500751A * 2016-10-20 2017-03-15 Guangdong OPPO Mobile Telecommunications Corp., Ltd. Proximity sensor calibration method and mobile terminal
CN108415024A * 2018-01-24 2018-08-17 Guangdong OPPO Mobile Telecommunications Corp., Ltd. Proximity sensor calibration method, device, mobile terminal and computer-readable medium
CN111262986A * 2020-01-16 2020-06-09 OPPO (Chongqing) Intelligent Technology Co., Ltd. Calibration method and calibration device for proximity sensor and mobile terminal



Similar Documents

Publication Publication Date Title
US11048983B2 (en) Method, terminal, and computer storage medium for image classification
CN107818288B (en) Sign board information acquisition method and device
CN104956236B (en) Technique for investigation for generating location fingerprint data
RU2597524C2 (en) Method and apparatus for classifying number of conditions of device
US9301097B2 (en) Correlating wireless signals to a location on an image using mobile sensor technologies
CN106941561A (en) Mobile terminal falls detection method, fall protection method and mobile terminal
CN111835916B (en) Training method and device of attitude detection model and detection method and device of terminal attitude
US10019219B2 (en) Display device for displaying multiple screens and method for controlling the same
US20120314936A1 (en) Information processing device, information processing method, and program
CN109190648B (en) Simulation environment generation method and device, mobile terminal and computer readable storage medium
CN107645702B (en) Position calibration method, device and system
CN111857369A (en) Method, device, terminal and storage medium for calibrating proximity sensor of mobile terminal
CN111986250A (en) Object volume measuring method, device, measuring equipment and storage medium
CN106667450B (en) Temperature measuring method and device
CN109813300B (en) Positioning method and terminal equipment
CN110148167B (en) Distance measuring method and terminal equipment
CN116449396A (en) GNSS deception signal detection method, device, equipment and product
CN114187509B (en) Object positioning method and device, electronic equipment and storage medium
CN112882094B (en) First-arrival wave acquisition method and device, computer equipment and storage medium
CN115097379A (en) Positioning tracking method, device, equipment and storage medium
KR20220168170A (en) Methods and systems for maximum consistency based outlier handling
CN115082202A (en) Information processing method, device, equipment and storage medium for house mortgage
CN114264365A (en) Wind noise detection method and device, terminal equipment and storage medium
CN114063964A (en) Volume compensation optimization method and device, electronic equipment and readable storage medium
CN111741165B (en) Mobile terminal control method and device, mobile terminal and storage medium

Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination