CN109274829A - Call control method and apparatus, device, and storage medium - Google Patents
Call control method and apparatus, device, and storage medium
- Publication number
- CN109274829A (application number CN201811163293.XA)
- Authority
- CN
- China
- Prior art keywords
- sensor
- electronic equipment
- target object
- sensing parameter
- determined
- Prior art date
- Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
- Pending
Classifications
-
- H—ELECTRICITY
- H04—ELECTRIC COMMUNICATION TECHNIQUE
- H04M—TELEPHONIC COMMUNICATION
- H04M1/00—Substation equipment, e.g. for use by subscribers
- H04M1/72—Mobile telephones; Cordless telephones, i.e. devices for establishing wireless links to base stations without route selection
- H04M1/724—User interfaces specially adapted for cordless or mobile telephones
- H04M1/72448—User interfaces specially adapted for cordless or mobile telephones with means for adapting the functionality of the device according to specific conditions
- H04M1/72454—User interfaces specially adapted for cordless or mobile telephones with means for adapting the functionality of the device according to specific conditions according to context-related or environment-related conditions
-
- H—ELECTRICITY
- H04—ELECTRIC COMMUNICATION TECHNIQUE
- H04M—TELEPHONIC COMMUNICATION
- H04M1/00—Substation equipment, e.g. for use by subscribers
- H04M1/72—Mobile telephones; Cordless telephones, i.e. devices for establishing wireless links to base stations without route selection
- H04M1/724—User interfaces specially adapted for cordless or mobile telephones
- H04M1/72448—User interfaces specially adapted for cordless or mobile telephones with means for adapting the functionality of the device according to specific conditions
-
- H—ELECTRICITY
- H04—ELECTRIC COMMUNICATION TECHNIQUE
- H04M—TELEPHONIC COMMUNICATION
- H04M2250/00—Details of telephonic subscriber devices
- H04M2250/12—Details of telephonic subscriber devices including a sensor for measuring a physical value, e.g. temperature or motion
Abstract
Embodiments of the present application disclose a call control method and apparatus, a device, and a storage medium. The method includes: obtaining a calling instruction; calling an earpiece based on the calling instruction; if the earpiece is in a working state, determining, based on a first sensing parameter obtained by a first sensor, whether the electronic device provided with the first sensor approaches a target object; and obtaining a second sensing parameter based on a second sensor, where the second sensing parameter is used to determine the relative motion between the electronic device and the target object.
Description
Technical Field
The present application relates to computer technologies, and in particular, to a method and an apparatus for controlling a call, a device, and a storage medium.
Background
In the related art, a terminal controls its screen state during a call with a single proximity sensor (psensor): the psensor determines whether the terminal is far from or close to the user's body, and the terminal sets its screen state accordingly.
The proximity sensor detects the distance between the user and the terminal to determine whether the user has brought the terminal close to or away from the ear, and thus whether the screen should be lit or dark. However, proximity sensors suffer from the black-hair problem: when a user with dark hair or dark skin brings the terminal to the ear, the low reflectivity of the dark surface can cause the proximity sensor to misjudge and, treating the terminal as far from the ear, keep the screen lit, which leads to accidental touches on the touch screen.
Disclosure of Invention
The embodiment of the application provides a call control method, a call control device, equipment and a storage medium, which can correctly judge the state between electronic equipment and a target object.
A call control method applied to an electronic device, provided by an embodiment of the present application, includes the following steps:
obtaining a calling instruction;
calling an earpiece based on the calling instruction;
if the earpiece is in a working state, determining whether the electronic device provided with the first sensor approaches a target object based on a first sensing parameter obtained by the first sensor; and obtaining a second sensing parameter based on the second sensor; where the second sensing parameter is used to determine the relative motion between the electronic device and the target object.
A call control apparatus applied to an electronic device, provided by an embodiment of the present application, includes: a receiving module, a calling module, and a detection module; where
the receiving module is configured to obtain a calling instruction;
the calling module is configured to call the earpiece based on the calling instruction;
the detection module is configured to, if the earpiece is in a working state, determine whether the electronic device provided with the first sensor approaches a target object based on a first sensing parameter obtained by the first sensor, and to obtain a second sensing parameter based on the second sensor; where the second sensing parameter is used to determine the relative motion between the electronic device and the target object.
An electronic device provided by an embodiment of the present application includes: a first sensor, a second sensor, a processor, and a memory storing a computer program executable on the processor, where the processor is configured to execute the steps of the call control method described above when running the computer program.
The computer-readable storage medium provided in the embodiments of the present application stores a call control program, and the call control program, when executed by a processor, implements the steps of the call control method described above.
In the embodiments of the present application, if the earpiece is in the working state, whether the electronic device is close to the target object is determined based on the first sensing parameter of the first sensor, and the relative motion between the electronic device and the target object is determined based on the second sensing parameter of the second sensor.
Drawings
Fig. 1 is a schematic flow chart illustrating an implementation of a call control method according to an embodiment of the present application;
FIG. 2 is a schematic structural diagram of an electronic device according to an embodiment of the present disclosure;
fig. 3 is a schematic flow chart illustrating an implementation of a call control method according to embodiment two of the present application;
fig. 4 is a schematic flow chart illustrating an implementation of a call control method according to embodiment three of the present application;
FIG. 5 is a schematic front view of an electronic device according to an embodiment of the present application;
FIG. 6 is a schematic rear view of an electronic device according to an embodiment of the present application;
fig. 7 is a schematic flow chart illustrating an implementation of a call control method according to embodiment four of the present application;
FIG. 8 is a first schematic structural diagram of a psensor according to embodiment five of the present application;
FIG. 9 is a second schematic structural diagram of a psensor according to embodiment five of the present application;
FIG. 10 is a schematic diagram of the position of a psensor according to embodiment six of the present application;
fig. 11 is a first schematic structural diagram of a call control apparatus according to embodiment seven of the present application;
fig. 12 is a second schematic structural diagram of a call control apparatus according to embodiment seven of the present application;
fig. 13 is a schematic structural diagram of an electronic device according to embodiment eight of the present application.
Detailed Description
The present application will be described in further detail below with reference to the accompanying drawings and examples. It should be understood that the examples provided herein are merely illustrative of the present application and are not intended to limit the present application. In addition, the following examples are provided as partial examples for implementing the present application, not all examples for implementing the present application, and the technical solutions described in the examples of the present application may be implemented in any combination without conflict.
In various embodiments of the present application, an electronic device is provided with a first sensor and a second sensor. The electronic device obtains a calling instruction; calls an earpiece based on the calling instruction; and, if the earpiece is in a working state, determines whether the electronic device provided with the first sensor approaches a target object based on a first sensing parameter obtained by the first sensor, and obtains a second sensing parameter based on the second sensor; where the second sensing parameter is used to determine the relative motion between the electronic device and the target object.
Example one
The embodiment of the application provides a call control method, which is applied to an electronic device, where the electronic device may be any electronic device with information processing capability, and in an embodiment, the electronic device may be an intelligent terminal, for example, a mobile terminal with wireless communication capability, such as a notebook computer. The functional modules in the electronic device may be cooperatively implemented by hardware resources of the terminal, such as computing resources like a processor, and communication resources (e.g., for supporting various modes of communication, such as optical cable and cellular).
Of course, the embodiments of the present application are not limited to being provided as methods and hardware, and may be provided as a storage medium (storing instructions for executing the call control method provided in the embodiments of the present application) in various implementations.
The electronic device is provided with a first sensor and a second sensor, and the first sensor and the second sensor may be components capable of detecting whether the electronic device approaches a target object, such as a proximity sensor, a motion sensor, a touch screen controller, or a capacitance sensor. The first sensor and the second sensor may be the same or different.
The proximity sensor can sense the distance between a target object and the electronic equipment, and can be divided into an optical proximity sensor, an infrared proximity sensor, an ultrasonic proximity sensor and the like according to different working principles. The working principle of the proximity sensor is not limited in any way in the embodiments of the present application.
Motion sensors are sensors capable of monitoring the motion of an electronic device, such as: accelerometers, gyroscopes, gravity sensors, linear accelerometers, rotational vector sensors, and the like.
The touch screen controller may have a built-in ear screen-off algorithm that recognizes contact of the face or ear with the touch screen; the ear screen-off algorithm may be, for example, a QEEXO algorithm. The embodiments of the present application do not limit the specific ear screen-off algorithm built into the touch screen controller.
The capacitance sensor implements touch sensing based on changes in capacitance. For example, when a finger or another conductor approaches a capacitance sensor, a new capacitance is formed and the measured capacitance changes; the capacitance sensor determines whether a finger or other conductor is present by measuring this change in capacitance.
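The capacitance-based presence check described above can be sketched as follows. This is an illustrative sketch only, assuming a simple baseline-and-threshold scheme; the class name, baseline value, and threshold are not values taken from this application.

```kotlin
// Illustrative sketch: decide whether a conductor (e.g., a finger or the ear) is near,
// from a change in measured capacitance. Baseline and threshold values are assumed.
class CapacitiveProximityDetector(
    private val baselineCapacitance: Double,  // capacitance measured with nothing nearby (assumed)
    private val deltaThreshold: Double        // minimum change treated as "object near" (assumed)
) {
    fun isObjectNear(measuredCapacitance: Double): Boolean {
        val delta = measuredCapacitance - baselineCapacitance
        return delta >= deltaThreshold
    }
}

fun main() {
    val detector = CapacitiveProximityDetector(baselineCapacitance = 10.0, deltaThreshold = 2.5)
    println(detector.isObjectNear(13.1)) // true: capacitance rose well past the threshold
    println(detector.isObjectNear(10.4)) // false: the change is too small
}
```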
It should be noted that the first sensor and the second sensor may be one or more of a proximity sensor, a motion sensor, a touch screen controller, and a capacitance sensor.
As shown in fig. 1, a call control method provided in an embodiment of the present application includes:
S101, obtaining a calling instruction;
The electronic device may be provided with a call application, and the call application includes applications that support voice or video calls, such as Telephone, WeChat, and FaceTime.
The electronic device triggers a call instruction when it receives a call operation from the user through the call application, or when the network side sends a call instruction. Based on the call instruction, the electronic device triggers a calling instruction that instructs it to call the earpiece; at this point, the electronic device obtains the calling instruction.
S102, calling an earpiece based on the calling instruction;
When the electronic device receives the calling instruction, it calls the earpiece so that the earpiece is in a working state. Here, when the electronic device calls the earpiece, the electronic device may be in a call state in which the call link has already been established, or in a call state in which the call link has not yet been established.
S103, if the earpiece is in a working state, determining whether the electronic device provided with the first sensor approaches a target object based on a first sensing parameter obtained by the first sensor;
When the electronic device detects that the earpiece is in a working state, it triggers a first starting instruction and turns on the first sensor based on the first starting instruction, so that the first sensor is in a working state. Here, when the electronic device turns on the first sensor, the first sensor may initially be in a dormant state or already in a working state; when the first sensor is in a dormant state, it is started based on the first starting instruction and controlled to be in a working state.
After the electronic device turns on the first sensor, the first sensor in the working state collects data to obtain a first sensing parameter that characterizes the proximity state between the electronic device and the target object. The electronic device determines whether it is approaching the target object based on the first sensing parameter.
The electronic device matches the first sensing parameter against a set approach condition; when the first sensing parameter is determined to meet the approach condition, it is determined that the electronic device approaches the target object, and when the first sensing parameter is determined not to meet the approach condition, it is determined that the electronic device does not approach the target object.
Based on the type of the first sensor, the approach condition includes the following:
first, first distance threshold
If the first sensor is a proximity sensor or a touch screen controller, the electronic device can set a first distance threshold. If the first induction parameter is larger than the first distance threshold value, determining that the electronic equipment is not close to the target object; and if the first sensing parameter is smaller than the first distance threshold value, determining that the electronic equipment approaches to the target object.
Second, a motion condition
If the first sensor is a motion sensor, the electronic device may set a motion condition. If the first sensing parameter does not meet the motion condition, it is determined that the electronic device does not approach the target object; if the first sensing parameter meets the motion condition, it is determined that the electronic device approaches the target object. The motion condition is a condition that characterizes the degree of motion of the electronic device. Here, different motion conditions may be set according to the type of the motion sensor; the embodiments of the present application do not set any limit on the motion condition.
Third, a capacitance condition
If the first sensor is a capacitance sensor, the electronic device may set a capacitance condition, for example a capacitance threshold. If the first sensing parameter does not meet the capacitance condition, this indicates that no target object is close to the electronic device, and it is determined that the electronic device does not approach the target object; if the first sensing parameter meets the capacitance condition, this indicates that a target object is close to the electronic device, and it is determined that the electronic device approaches the target object.
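Only as an illustration of how the three approach conditions above might be evaluated, the following sketch dispatches on the type of the first sensor; the type names and the concrete threshold values (2 cm, 1.5, 2.5) are assumptions and are not specified by this application.

```kotlin
// Illustrative sketch: evaluating the "approach condition" for the three sensor
// categories described above (distance, motion, capacitance). All names and
// thresholds are hypothetical.
sealed class FirstSensorReading {
    data class Distance(val meters: Double) : FirstSensorReading()        // proximity sensor / touch screen controller
    data class Motion(val accelMagnitude: Double) : FirstSensorReading()  // motion sensor
    data class Capacitance(val delta: Double) : FirstSensorReading()      // capacitance sensor
}

fun meetsApproachCondition(reading: FirstSensorReading): Boolean = when (reading) {
    is FirstSensorReading.Distance -> reading.meters < 0.02       // assumed first distance threshold (2 cm)
    is FirstSensorReading.Motion -> reading.accelMagnitude > 1.5  // assumed "raised to the ear" motion condition
    is FirstSensorReading.Capacitance -> reading.delta > 2.5      // assumed capacitance condition
}

fun main() {
    println(meetsApproachCondition(FirstSensorReading.Distance(0.01))) // true: closer than the threshold
    println(meetsApproachCondition(FirstSensorReading.Motion(0.2)))    // false: motion condition not met
}
```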
Here, depending on whether the first sensor is the same as the second sensor, the instruction for starting the second sensor includes a first starting instruction and a second starting instruction.
And S104, acquiring a second sensing parameter based on the second sensor.
Wherein the second sensing parameter is used for determining the relative motion of the electronic equipment and the target object.
If the first sensor is the same as the second sensor, the second sensor is started based on the first starting instruction.
If the first sensor is different from the second sensor, the second sensor is started based on a second starting instruction. The electronic device triggers the second starting instruction when it determines, based on the first sensing parameter of the first sensor, that the electronic device approaches the target object, or when it determines, based on the first sensing parameter, that the electronic device first approached the target object and then moved away from it again. After the first sensing parameter has met the approach condition, when the first sensing parameter no longer meets the approach condition, it is determined that the electronic device is away from the target object.
In practical applications, when the second starting instruction is triggered, a first closing instruction may also be triggered, and the first sensor is closed based on the first closing instruction.
After the second sensor is started, the second sensor collects data to obtain a second sensing parameter, and the second sensing parameter is used to determine the relative motion between the electronic device and the target object.
If the second sensing parameter meets the set approach condition, this indicates that there is no relative motion between the electronic device and the target object and the electronic device remains close to the target object; if the second sensing parameter does not meet the set approach condition, this indicates that there is relative motion between the electronic device and the target object and the electronic device is away from the target object.
According to the call processing method provided by this embodiment of the present application, when the electronic device is in a call, the earpiece is called into a working state. The sensing parameter of the first sensor (for example, a distance sensor) is then obtained as the first sensing parameter and used to detect whether the electronic device is close to the ear. When it is determined that the electronic device is close to the ear, the sensing parameter of the second sensor is obtained as the second sensing parameter and used to determine whether there is relative motion between the electronic device and the ear, and thus whether the electronic device is away from or close to the ear.
According to the call control method provided by this embodiment of the present application, a calling instruction is obtained; an earpiece is called based on the calling instruction; if the earpiece is in a working state, whether the electronic device provided with the first sensor approaches a target object is determined based on the first sensing parameter obtained by the first sensor; and a second sensing parameter is obtained based on the second sensor, where the second sensing parameter is used to determine the relative motion between the electronic device and the target object. In this way, whether the electronic device is away from or close to the target object is judged based on the combination of the first sensing parameter of the first sensor and the second sensing parameter of the second sensor, so that the relative motion between the electronic device and the target object is judged correctly, the screen is effectively controlled to be lit or dark, and accidental touches on the screen by the user are avoided.
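Purely as a sketch of the S101–S104 flow of this embodiment, and using hypothetical interfaces for the earpiece and the two sensors (the application does not prescribe any particular API), the control flow could look like this:

```kotlin
// Hypothetical interfaces standing in for the earpiece and the two sensors.
interface Earpiece {
    fun call()
    val isWorking: Boolean
}
interface FirstSensor { fun firstSensingParameter(): Double }
interface SecondSensor { fun secondSensingParameter(): Double }

class CallController(
    private val earpiece: Earpiece,
    private val first: FirstSensor,
    private val second: SecondSensor,
    private val approachThreshold: Double  // assumed approach condition for the first sensing parameter
) {
    // S101 corresponds to this method being invoked with a calling instruction.
    fun onCallingInstruction() {
        earpiece.call()                           // S102: call the earpiece
        if (!earpiece.isWorking) return
        val p1 = first.firstSensingParameter()    // S103: obtain the first sensing parameter
        val isNear = p1 < approachThreshold       //        and decide whether the device approaches the target object
        val p2 = second.secondSensingParameter()  // S104: obtain the second sensing parameter,
        // which is then used to determine the relative motion between the device and the target object
        println("approachesTarget=$isNear, secondSensingParameter=$p2")
    }
}
```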
Example two
The embodiment of the application provides a call control method, which is applied to an electronic device, where the electronic device may be any electronic device with information processing capability, and in an embodiment, the electronic device may be an intelligent terminal, for example, a mobile terminal with wireless communication capability, such as a notebook computer. The functional modules in the electronic device may be cooperatively implemented by hardware resources of the terminal, such as computing resources like a processor, and communication resources (e.g., for supporting various modes of communication, such as optical cable and cellular).
Of course, the embodiments of the present application are not limited to being provided as methods and hardware, and may be provided as a storage medium (storing instructions for executing the call control method provided in the embodiments of the present application) in various implementations.
The electronic device is provided with a first sensor and a second sensor. The first sensor and the second sensor may be elements capable of detecting whether the electronic device is approaching the target object, such as a proximity sensor, a motion sensor, a touch screen controller, and a capacitance sensor. Wherein the first sensor and the second sensor are different.
The proximity sensor can sense the distance between a target object and the electronic equipment, and can be divided into an optical proximity sensor, an infrared proximity sensor, an ultrasonic proximity sensor and the like according to different working principles. The working principle of the proximity sensor is not limited in any way in the embodiments of the present application.
Motion sensors are sensors capable of monitoring the motion of an electronic device, such as: accelerometers, gyroscopes, gravity sensors, linear accelerometers, rotational vector sensors, and the like.
The touch screen controller may have a built-in ear screen-off algorithm that recognizes contact of the face or ear with the touch screen; the ear screen-off algorithm may be, for example, a QEEXO algorithm. The embodiments of the present application do not limit the specific ear screen-off algorithm built into the touch screen controller.
The capacitive sensor realizes touch sensing according to the change of capacitance. Such as: when a finger or other conductor approaches a capacitive sensor, the capacitive sensor creates a new capacitance that changes. Capacitive sensors determine the presence of a finger or other conductor by measuring changes in capacitance.
It should be noted that the first sensor may be one or more of a proximity sensor, a motion sensor, a touch screen controller, and a capacitance sensor, and the second sensor may also be one or more of a proximity sensor, a motion sensor, a touch screen controller, and a capacitance sensor.
The first sensor and the second sensor may be arranged as shown in fig. 2: the first sensor is arranged on a first face 21, which has the display screen, of the electronic device 20, and the second sensor is arranged on a second face 22, which does not have the display screen, of the electronic device 20. Here, the second face 22 may be the top end of the electronic device.
In one embodiment, the first sensor may be built into the electronic device, and the second sensor is disposed on the second face 22, without the display screen, of the electronic device 20. In this case, the display screen of the electronic device may be a full screen.
Here, the display screen may be a touch screen.
As shown in fig. 3, a call control method provided in the second embodiment of the present application includes:
S301, obtaining a calling instruction;
The electronic device may be provided with a call application, and the call application includes applications that support voice or video calls, such as Telephone, WeChat, and FaceTime.
The electronic device triggers a call instruction when it receives a call operation from the user through the call application, or when the network side sends a call instruction. Based on the call instruction, the electronic device triggers a calling instruction that instructs it to call the earpiece; at this point, the electronic device obtains the calling instruction.
S302, calling an earpiece based on the calling instruction;
When the electronic device receives the calling instruction, it calls the earpiece so that the earpiece is in a working state. Here, when the electronic device calls the earpiece, the electronic device may be in a call state in which the call link has already been established, or in a call state in which the call link has not yet been established, and the display screen of the electronic device is in a lighting state.
S303, if the earpiece is in a working state, determining whether the electronic device provided with the first sensor approaches a target object based on a first sensing parameter obtained by the first sensor;
When the electronic device detects that the earpiece is in a working state, it triggers a first starting instruction and turns on the first sensor based on the first starting instruction, so that the first sensor is in a working state. Here, when the electronic device turns on the first sensor, the first sensor may initially be in a dormant state or already in a working state; when the first sensor is in a dormant state, it is started based on the first starting instruction and controlled to be in a working state.
After the electronic device turns on the first sensor, the first sensor in the working state collects data to obtain a first sensing parameter that characterizes the proximity state between the electronic device and the target object. The electronic device determines whether it is approaching the target object based on the first sensing parameter.
The electronic device matches the first sensing parameter against a set approach condition; when the first sensing parameter is determined to meet the approach condition, it is determined that the electronic device approaches the target object, and when the first sensing parameter is determined not to meet the approach condition, it is determined that the electronic device does not approach the target object.
S304, if the electronic equipment is determined to be close to the target object based on the first sensing parameter, starting the second sensor and controlling the display screen to be in a black screen state;
the second sensor is different from the first sensor. Such as: the first sensor is a close-range sensor, and the second sensor is a capacitance sensor; for another example: the first sensor is a proximity sensor and the second sensor is a motion sensor and a touch screen controller.
And if the electronic equipment is determined to be close to the target object according to the first sensing parameters acquired by the first sensor, triggering a black screen instruction for indicating the black screen of the display screen, and controlling the display screen of the electronic equipment to be in a black screen state based on the black screen instruction.
And when the electronic equipment determines that the electronic equipment approaches to the target object, triggering a second starting instruction, and starting the second sensor based on the second starting instruction so that the second sensor is in a working state.
In an embodiment, the method further comprises: and controlling the first sensor to be closed if the electronic equipment is determined to be close to the target object based on the first sensing parameter.
When the electronic device determines that the target object is approached, and triggers the second starting instruction, the electronic device may trigger the first closing instruction, and close the first sensor based on the first closing instruction.
S305, obtaining a second sensing parameter based on a second sensor, and determining whether the electronic equipment is far away from the target object based on the second sensing parameter obtained by the second sensor;
wherein the second sensing parameter is used for determining the relative motion of the electronic equipment and the target object.
After the second sensor is turned on in S304, the second sensor acquires data to obtain a second sensing parameter, and the second sensing parameter is used to determine the relative motion between the electronic device and the target object.
If the second sensing parameter meets the set approaching condition, it is indicated that the electronic device does not move relative to the target object, the electronic device approaches the target object, and if the second sensing parameter does not meet the set approaching condition, it is indicated that the electronic device moves relative to the target object, and the electronic device is far away from the target object.
S306, if the electronic equipment is determined to be far away from the target object based on the second sensing parameter, controlling the display screen to be in a lighting state.
And if the relative motion of the electronic equipment and the target object is determined based on the second sensing parameters, determining that the electronic equipment is far away from the target object, and triggering a lighting instruction. And controlling the display screen of the electronic equipment to be in a lighting state based on the lighting instruction, and at the moment, lighting the display screen.
When the display screen is in the lighting state, the electronic equipment unlocks the display screen, and the display screen can receive the operation of a user.
According to the call control method provided by this embodiment of the present application, during a call in which the earpiece is in the working state, whether the electronic device is close to a target object is detected through the first sensor. When it is determined that the electronic device is close to the target object, the display screen is controlled to go dark and a second sensor different from the first sensor is started; whether the electronic device is away from the target object is then detected through the second sensor, and when the second sensor determines that the electronic device is away from the target object, the display screen is controlled to light up. In this way, by using a capacitance sensor as the second sensor, the misjudgment that occurs when the psensor detects that the electronic device is away from the target object is effectively avoided, the relative motion between the electronic device and the target object is judged correctly, the screen is effectively controlled to be lit or dark, and accidental touches on the screen by the user are avoided. Moreover, the power consumption of the capacitance sensor is very low, which can effectively reduce the power consumption of the device.
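A minimal sketch of the S303–S306 sequence of this embodiment is given below; the Screen and SwitchableSensor interfaces and the two thresholds are hypothetical stand-ins rather than any actual platform API.

```kotlin
// Hypothetical interfaces for the display screen and for sensors that can be turned on and off.
interface Screen {
    fun turnOff()  // black screen state
    fun turnOn()   // lighting state
}
interface SwitchableSensor {
    fun enable()
    fun disable()
    fun read(): Double
}

class TwoSensorCallController(
    private val screen: Screen,
    private val firstSensor: SwitchableSensor,   // e.g., a proximity sensor on the display face
    private val secondSensor: SwitchableSensor,  // e.g., a capacitance sensor on the top face
    private val nearThreshold: Double,           // assumed approach condition for the first sensor
    private val awayThreshold: Double            // assumed "relative motion / away" condition for the second sensor
) {
    fun onFirstSensorSample() {
        if (firstSensor.read() < nearThreshold) { // S304: the device approaches the target object
            screen.turnOff()                      // control the display screen to be in a black screen state
            firstSensor.disable()                 // optional first closing instruction
            secondSensor.enable()                 // second starting instruction
        }
    }

    fun onSecondSensorSample() {
        if (secondSensor.read() > awayThreshold) { // S305: relative motion detected
            screen.turnOn()                        // S306: control the display screen to be in a lighting state
        }
    }
}
```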
Example three
The embodiment of the application provides a call control method, which is applied to an electronic device, where the electronic device may be any electronic device with information processing capability, and in an embodiment, the electronic device may be an intelligent terminal, for example, a mobile terminal with wireless communication capability, such as a notebook computer. The functional modules in the electronic device may be cooperatively implemented by hardware resources of the terminal, such as computing resources like a processor, and communication resources (e.g., for supporting various modes of communication, such as optical cable and cellular).
Of course, the embodiments of the present application are not limited to being provided as methods and hardware, and may be provided as a storage medium (storing instructions for executing the call control method provided in the embodiments of the present application) in various implementations.
The electronic device is provided with a first sensor and a second sensor. The first sensor and the second sensor may be elements capable of detecting whether the electronic device is approaching the target object, such as a proximity sensor, a motion sensor, a touch screen controller, and a capacitance sensor. Wherein the first sensor and the second sensor are different.
The proximity sensor can sense the distance between a target object and the electronic equipment, and can be divided into an optical proximity sensor, an infrared proximity sensor, an ultrasonic proximity sensor and the like according to different working principles. The working principle of the proximity sensor is not limited in any way in the embodiments of the present application.
Motion sensors are sensors capable of monitoring the motion of an electronic device, such as: accelerometers, gyroscopes, gravity sensors, linear accelerometers, rotational vector sensors, and the like.
The touch screen controller may have a built-in ear screen-off algorithm that recognizes contact of the face or ear with the touch screen; the ear screen-off algorithm may be, for example, a QEEXO algorithm. The embodiments of the present application do not limit the specific ear screen-off algorithm built into the touch screen controller.
The capacitive sensor realizes touch sensing according to the change of capacitance. Such as: when a finger or other conductor approaches a capacitive sensor, the capacitive sensor creates a new capacitance that changes. Capacitive sensors determine the presence of a finger or other conductor by measuring changes in capacitance.
It is noted that the first sensor may be one or more of a proximity sensor, a motion sensor, a touch screen controller, and a capacitive sensor. The second sensor may also be one or more of a proximity sensor, a motion sensor, a touch screen controller, and a capacitive sensor.
The first sensor and the second sensor may be arranged as shown in fig. 2, the first sensor is arranged on a first side 21 of the electronic device having the display screen 20, and the second sensor is arranged on a second side 22 of the electronic device not having the display screen. Here, the second face 22 may be a top end of the electronic device.
In one embodiment, the first sensor may be built into the electronic device and the second sensor is disposed on the second side 22 of the electronic device without the display screen. At this time, the display screen of the electronic device may be a full screen.
Here, the display screen may be a touch screen.
As shown in fig. 4, a call control method provided in the third embodiment of the present application includes:
S401, obtaining a calling instruction;
The electronic device may be provided with a call application, and the call application includes applications that support voice or video calls, such as Telephone, WeChat, and FaceTime.
The electronic device triggers a call instruction when it receives a call operation from the user through the call application, or when the network side sends a call instruction. Based on the call instruction, the electronic device triggers a calling instruction that instructs it to call the earpiece; at this point, the electronic device obtains the calling instruction.
S402, calling an earpiece based on the calling instruction;
When the electronic device receives the calling instruction, it calls the earpiece so that the earpiece is in a working state. Here, when the electronic device calls the earpiece, the electronic device may be in a call state in which the call link has already been established, or in a call state in which the call link has not yet been established, and the display screen of the electronic device is in a lighting state.
S403, if the earpiece is in a working state, determining whether the electronic device provided with the first sensor approaches a target object based on a first sensing parameter obtained by the first sensor;
When the electronic device detects that the earpiece is in a working state, it triggers a first starting instruction and turns on the first sensor based on the first starting instruction, so that the first sensor is in a working state. Here, when the electronic device turns on the first sensor, the first sensor may initially be in a dormant state or already in a working state; when the first sensor is in a dormant state, it is started based on the first starting instruction and controlled to be in a working state.
After the electronic device turns on the first sensor, the first sensor in the working state collects data to obtain a first sensing parameter that characterizes the proximity state between the electronic device and the target object. The electronic device determines whether it is approaching the target object based on the first sensing parameter.
The electronic device matches the first sensing parameter against a set approach condition; when the first sensing parameter is determined to meet the approach condition, it is determined that the electronic device approaches the target object, and when the first sensing parameter is determined not to meet the approach condition, it is determined that the electronic device does not approach the target object.
S404, obtaining a second sensing parameter based on a second sensor, and determining whether the electronic equipment approaches to the target object or not based on the second sensing parameter.
Wherein the second sensing parameter is used for determining the relative motion of the electronic equipment and the target object.
The second sensor is different from the first sensor. For example, the first sensor is a proximity sensor and the second sensor is a capacitance sensor; as another example, the first sensor is a proximity sensor and the second sensor is a motion sensor together with a touch screen controller.
After it is determined, based on the first sensing parameter, that the electronic device approaches the target object, a black screen instruction instructing the display screen to go black is triggered, and the display screen of the electronic device is controlled to be in a black screen state based on the black screen instruction. After the display screen goes black, whether the electronic device is away from the target object is determined based on the first sensing parameter, and at the same time whether the electronic device is close to the target object is determined based on the second sensing parameter.
If the second sensing parameter meets the set approaching condition, it is indicated that the electronic device does not move relative to the target object, the electronic device approaches the target object, and if the second sensing parameter does not meet the set approaching condition, it is indicated that the electronic device moves relative to the target object, and the electronic device is far away from the target object.
In an embodiment, the method further comprises: if the electronic equipment is determined to be close to the target object based on the first sensing parameter and the electronic equipment is determined to be close to the target object based on the second sensing parameter, controlling the display screen to be in a black screen state; if the electronic equipment is determined to be far away from the target object based on the first sensing parameter and the electronic equipment is determined to be close to the target object based on the second sensing parameter, controlling the display screen to be in a black screen state; and controlling the display screen to be in a lighting state if the electronic equipment is determined to be far away from the target object based on the first sensing parameter and the electronic equipment is determined to be far away from the target object based on the second sensing parameter.
That is, if the first sensor and the second sensor both detect that the electronic device is close to the target object, it is determined that the electronic device is close to the target object, and the display screen is kept in a black screen state; if the first sensor detects that the electronic equipment is far away from the target object and the second sensor detects that the electronic equipment is close to the target object, determining that the electronic equipment is close to the target object and keeping the display screen in a black screen state; and if the first sensor and the second sensor detect that the electronic equipment is far away from the target object, determining that the electronic equipment is far away from the target object, triggering a lighting instruction, and controlling the display screen to be in a lighting state.
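The combined decision just described amounts to lighting the screen only when both sensors report that the device has moved away from the target object; the following sketch illustrates that logic and is an illustration only, not reference code from this application.

```kotlin
// Combined decision: any "near" report keeps the screen black; the screen lights
// up only when both sensors agree that the device has moved away.
enum class ScreenState { BLACK, LIT }

fun decideScreenState(firstSaysNear: Boolean, secondSaysNear: Boolean): ScreenState =
    if (!firstSaysNear && !secondSaysNear) ScreenState.LIT else ScreenState.BLACK

fun main() {
    println(decideScreenState(firstSaysNear = true,  secondSaysNear = true))   // BLACK
    println(decideScreenState(firstSaysNear = false, secondSaysNear = true))   // BLACK (the first sensor misjudged)
    println(decideScreenState(firstSaysNear = false, secondSaysNear = false))  // LIT
}
```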
In one embodiment, the second sensor is activated if the electronic device is determined to be away from the target object based on the first sensing parameter.
The electronic device triggers the second starting instruction after it determines, based on the first sensing parameter, that the electronic device first approached the target object and then moved away from it, and starts the second sensor based on the second starting instruction.
In practical application, when the second starting instruction is triggered, a first closing instruction can be triggered, and the first sensor is closed based on the first closing instruction.
In the call control method provided by this embodiment of the present application, during a call in which the earpiece is in the working state, whether the electronic device is close to a target object can be detected through the first sensor. When it is determined that the electronic device is close to the target object, the display screen is controlled to go black, and the first sensor continues to detect whether the electronic device moves away from the target object. When the first sensor determines that the electronic device is away from the target object, whether the electronic device is close to the target object is detected through the second sensor, which is different from the first sensor, so as to determine whether the electronic device really is away from the target object.
If the fact that the electronic equipment is close to the target object is determined through the second sensor, the fact that the electronic equipment is close to the target object is determined, the first sensing parameter of the first sensor is judged by mistake, and the display screen is controlled to be black; and if the electronic equipment is determined to be far away from the target object through the second sensor, the electronic equipment is determined to be far away from the target object, and the display screen is controlled to be lightened.
Therefore, by using the first sensor and the second sensor, the problem of misjudgment when the psensor detects that the electronic equipment is far away from the target object is effectively solved, the relative motion of the electronic equipment and the target object is correctly judged, the screen is effectively controlled to be bright or black, and the screen is prevented from being touched by a user by mistake.
Example four
The embodiment of the application provides a call control method, which is applied to an electronic device, where the electronic device may be any electronic device with information processing capability, and in an embodiment, the electronic device may be an intelligent terminal, for example, a mobile terminal with wireless communication capability, such as a notebook computer. The functional modules in the electronic device may be cooperatively implemented by hardware resources of the terminal, such as computing resources like a processor, and communication resources (e.g., for supporting various modes of communication, such as optical cable and cellular).
Of course, the embodiments of the present application are not limited to being provided as methods and hardware, and may be provided as a storage medium (storing instructions for executing the call control method provided in the embodiments of the present application) in various implementations.
The electronic device is provided with a first sensor and a second sensor. The first sensor and the second sensor are the same and are both capacitance sensors; here, the first sensor and the second sensor may be one and the same capacitance sensor.
The capacitive sensor may be arranged as shown in fig. 5, and the capacitive sensor is arranged on the side 51 of the electronic device without the display screen, i.e. the top of the mobile phone. At this time, the display screen of the electronic device may be as shown in fig. 6, and the display screen 61 covers the whole area of the surface on which the display screen is located. At this time, the electronic device is a full-screen electronic device. The display screen may be a touch screen.
As shown in fig. 7, a call control method provided in the fourth embodiment of the present application includes:
S701, obtaining a calling instruction;
The electronic device may be provided with a call application, and the call application includes applications that support voice or video calls, such as Telephone, WeChat, and FaceTime.
The electronic device triggers a call instruction when it receives a call operation from the user through the call application, or when the network side sends a call instruction. Based on the call instruction, the electronic device triggers a calling instruction that instructs it to call the earpiece; at this point, the electronic device obtains the calling instruction.
S702, calling an earpiece based on the calling instruction;
When the electronic device receives the calling instruction, it calls the earpiece so that the earpiece is in a working state. Here, when the electronic device calls the earpiece, the electronic device may be in a call state in which the call link has already been established, or in a call state in which the call link has not yet been established, and the display screen of the electronic device is in a lighting state.
S703, if the earpiece is in a working state, determining whether the electronic device provided with the first sensor approaches a target object based on a first sensing parameter obtained by the first sensor;
When the electronic device detects that the earpiece is in a working state, it triggers a first starting instruction and turns on the first sensor based on the first starting instruction, so that the first sensor is in a working state. Here, when the electronic device turns on the first sensor, the first sensor may initially be in a dormant state or already in a working state; when the first sensor is in a dormant state, it is started based on the first starting instruction and controlled to be in a working state. The first sensor is a capacitance sensor.
After the electronic device turns on the first sensor, the first sensor in the working state collects data to obtain a first sensing parameter that characterizes the proximity state between the electronic device and the target object. The electronic device determines whether it is approaching the target object based on the first sensing parameter.
The electronic device matches the first sensing parameter against a set capacitance condition; when the first sensing parameter is determined to meet the capacitance condition, it is determined that the electronic device approaches the target object, and when the first sensing parameter is determined not to meet the capacitance condition, it is determined that the electronic device does not approach the target object.
S704, acquiring a second sensing parameter based on a second sensor;
wherein the second sensing parameter is used for determining the relative motion of the electronic equipment and the target object. The second sensor is a capacitive sensor.
When the electronic device determines that it approaches the target object, the relative motion between the electronic device and the target object is determined according to the second sensing parameter. If the second sensing parameter meets the set capacitance condition, this indicates that there is no relative motion between the electronic device and the target object and the electronic device remains close to the target object; if the second sensing parameter does not meet the set capacitance condition, this indicates that there is relative motion between the electronic device and the target object and the electronic device is away from the target object.
S705, determining whether the electronic equipment with the first sensor approaches to a target object or not based on the first sensing parameter obtained by the first sensor, and controlling the display screen to be in a black screen state;
When the electronic device determines, based on the first sensing parameter, that it approaches the target object, a black screen instruction is triggered and the display screen is controlled to be in a black screen state.
When the electronic device determines, based on the first sensing parameter, that it does not approach the target object, the display screen is kept in a lighting state.
S706, determining whether the electronic equipment with the second sensor is far away from the target object or not based on the second sensing parameter obtained by the second sensor, and controlling the display screen to be in a lighting state.
When the electronic device determines, based on the second sensing parameter, that it remains close to the target object, the display screen is maintained in a black screen state.
When the electronic device determines, based on the second sensing parameter, that it is away from the target object, a lighting instruction is triggered and the display screen is controlled to be in a lighting state based on the lighting instruction.
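As a sketch of this embodiment, a single capacitance sensor can drive the screen with a small two-state controller; the threshold value and the callback names below are assumptions made only for illustration.

```kotlin
// Single capacitance sensor used as both the first and the second sensor:
// crossing the threshold upward blacks the screen, crossing it downward lights it.
class SingleCapSensorController(
    private val capThreshold: Double,        // assumed capacitance condition
    private val onScreenBlack: () -> Unit,   // callback that puts the display in the black screen state
    private val onScreenLit: () -> Unit      // callback that puts the display in the lighting state
) {
    private var near = false

    fun onCapacitanceSample(delta: Double) {
        if (!near && delta >= capThreshold) {       // S703/S705: approach detected
            near = true
            onScreenBlack()
        } else if (near && delta < capThreshold) {  // S704/S706: relative motion away detected
            near = false
            onScreenLit()
        }
    }
}

fun main() {
    val controller = SingleCapSensorController(
        capThreshold = 2.5,
        onScreenBlack = { println("screen: black") },
        onScreenLit = { println("screen: lit") }
    )
    controller.onCapacitanceSample(3.0) // ear approaches -> screen: black
    controller.onCapacitanceSample(0.5) // ear moves away -> screen: lit
}
```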
According to the call processing method provided by this embodiment of the present application, when the electronic device is in a call, the earpiece is called into a working state. The sensing parameter of the capacitance sensor is then obtained as the first sensing parameter and used to detect whether the electronic device is close to the ear. When it is determined that the electronic device is close to the ear, the sensing parameter of the capacitance sensor is obtained as the second sensing parameter and used to determine whether there is relative motion between the electronic device and the ear and whether the electronic device is away from or close to the ear, and the display screen is controlled to be in a black screen state or a lighting state accordingly.
According to the call control method of this embodiment, the state between the electronic device and the target object while the earpiece is being called is detected through the capacitance sensor, and no other sensor for detecting whether the electronic device is close to or away from the target object needs to be arranged on the face where the display screen is located; a full-screen electronic device can therefore be achieved while the state between the electronic device and the target object is still detected accurately.
Example five
In this embodiment, the call control method provided by the embodiments of the present application is described in a specific application scenario in which the electronic device is a mobile phone and the target object is an ear. The first sensor is a psensor (proximity sensor), and the second sensor is a capacitance sensor (capsensor).
In the related art, the psensor is used to turn the mobile phone screen black automatically while a call is being answered. Psensor design has always involved a trade-off between the black-hair problem and the grease problem.
Black-hair problem: when black hair comes close to the position of the psensor above the screen of the mobile phone, this must not cause the psensor's proximity detection to be misjudged.
Grease problem: when grease is present on the screen glass over the opening corresponding to the psensor, this must not cause the psensor to misjudge.
In order to solve the black hair problem, the psensor is designed as shown in fig. 8, and the intersection point of the viewing angle of the Light emitted from the Light Emitting Diode (LED) and the viewing angle of the receiving sensor (DET) is below the glass surface, so that the background noise value becomes large, which is beneficial to avoiding the black hair problem.
To solve the grease problem, the psensor is designed as shown in fig. 9: the intersection point of the viewing angle of the light emitted from the LED and the viewing angle of the receiving sensor is placed above the glass surface, so that the noise floor value is reduced and the grease problem is avoided. When designing the psensor, the black hair problem and the grease problem need to be considered at the same time to reach a balanced effect, but in practice a perfect balance point cannot be found.
The capacitive sensor (capsensor) is a capacitive proximity sensor. In the related art, a capsensor is disposed on the top of the handset for the Specific Absorption Rate (SAR) test of the handset. When the mobile phone is close to the human body, the capsensor detects that the human body is close, the Radio Frequency (RF) power of the mobile phone is reduced, and the radiation of the mobile phone to the human body is prevented from exceeding the standard.
In the call control method provided by the application, when it is detected through the value sensed by the psensor (corresponding to the first sensing parameter) that the mobile phone is moving away from the ear, whether the mobile phone is actually far away from the ear is determined based on the value output by the capsensor (namely, the second sensing parameter). When it is determined through the value output by the capsensor that the mobile phone is not far away from the ear, the mobile phone is determined to be still close to the ear; when it is determined through the value output by the capsensor that the mobile phone is far away from the ear, the mobile phone is determined to be far away from the ear.
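A minimal sketch of this arbitration step is given below; the capsensor read-out and the threshold are assumptions made for illustration, since the patent does not specify concrete values or interfaces.

```kotlin
// Hypothetical sketch: the psensor's "far" event is only accepted if the capsensor
// agrees. Names and the threshold are illustrative assumptions.
class FarEventArbiter(
    private val readCapSensor: () -> Float,        // second sensing parameter
    private val capNearThreshold: Float            // assumed: at or above -> a body is still present
) {
    /**
     * Called when the psensor value (first sensing parameter) crosses its far threshold.
     * Returns true only if the capsensor also indicates the phone has left the ear;
     * otherwise the psensor "far" event is treated as a misjudgment (for example one
     * caused by black hair) and ignored.
     */
    fun confirmFar(): Boolean = readCapSensor() < capNearThreshold
}
```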
The results of testing the states of the handset and the ear using mode one (the psensor alone) and mode two (the psensor together with the capsensor) are shown in Table 1.
Table 1 Comparison of the results of detecting the states of the mobile phone and the ear in mode one and mode two
As can be seen from table 1:
when the state between the mobile phone and the ear is detected in mode one, if the mobile phone is close to the ear while answering a call and the black hair problem exists, the psensor misjudges: the mobile phone, which is actually close to the ear, is misjudged as being far away from the ear.
When the mobile phone is close to the ear while answering a call, for example when the psensor comes within about 2 cm of the ear, the sensing data (rawdata) of the psensor exceeds the proximity threshold, it is determined that the mobile phone is close to the ear, and the mobile phone enters a black screen state. When the mobile phone then remains close to the ear, if black hair covers the ear or the skin of the user is dark, the rawdata value of the psensor drops because the reflectivity of black is low; if the drop crosses the far threshold, the psensor misjudges, it is determined that the mobile phone is far away from the ear, the screen is lit, and false triggering of the touch screen is caused.
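The near/far threshold behaviour described here can be sketched as a small hysteresis state machine; the numeric values below are invented purely for demonstration and do not come from the patent.

```kotlin
// Hypothetical hysteresis on the psensor rawdata; the thresholds are invented examples.
enum class ProxState { NEAR, FAR }

class PsensorHysteresis(
    private val nearThreshold: Int,   // rawdata at or above this -> NEAR
    private val farThreshold: Int     // rawdata at or below this -> FAR (farThreshold < nearThreshold)
) {
    var state = ProxState.FAR
        private set

    fun onRawData(rawdata: Int): ProxState {
        state = when {
            rawdata >= nearThreshold -> ProxState.NEAR
            rawdata <= farThreshold  -> ProxState.FAR
            else -> state             // inside the hysteresis band: keep the previous state
        }
        return state
    }
}

fun main() {
    val p = PsensorHysteresis(nearThreshold = 800, farThreshold = 500)
    p.onRawData(900)   // phone brought to ~2 cm from the ear -> NEAR, screen is blacked out
    p.onRawData(450)   // black hair lowers the reflected signal; rawdata drops below the
                       // far threshold -> the misjudged FAR state that would light the screen
    println(p.state)   // FAR
}
```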
When the state of the mobile phone is detected using mode two of the call control method provided by the embodiment of the application, if the mobile phone is close to the ear while answering a call and the black hair problem occurs, it can still be correctly judged, based on the value output by the capsensor, that the mobile phone remains close to the ear, so the misjudgment of the psensor is corrected. When the mobile phone normally moves away from the ear, it is determined based on the output value of the capsensor that no human body is close to the capsensor, and the judgment that the mobile phone is far away from the ear is achieved.
In practical application, adding the capsensor as an extra sensing dimension avoids the black hair problem, brings greater redundancy to the design of the mobile phone, and reduces false touches caused by misjudgment.
EXAMPLE six
The embodiment of the present application further describes a call control method provided by the embodiment of the present application in a specific application scenario in which an electronic device is a mobile phone and a target object is an ear.
In the present embodiment, three schemes for detecting whether the mobile phone is close to or far away from the ear are provided.
Scheme 1: the Touch Panel (TP) + motion sensor detects approach, and the psensor detects departure;
scheme 2: the TP + motion sensor detects both approach and departure;
scheme 3: the TP + motion sensor detects approach, and the capsensor detects departure.
The results of the three schemes are shown in Table 2.
Table 2 Three schemes for detecting the state between the mobile phone and the ear on a full-screen mobile phone
Three schemes in the examples of the present application will be described below with reference to table 2.
Scheme 1:
When a user answers a call, the mobile phone is in a bright screen state and the psensor does not work while the mobile phone approaches the ear. The motion sensor is used to detect the motion track of the mobile phone approaching the ear, and at the same time the TP controller is used to judge whether the mobile phone is close to the ear; when it is detected that the mobile phone is close to the ear, the screen is blacked out. At this point, the psensor begins to operate.
In the black screen state, while the mobile phone stays close to the ear, the psensor is used to detect that the mobile phone remains close to the ear.
When the mobile phone moves away from the ear, the psensor detects that its sensing value crosses the far threshold, the screen is lit, and at this point the psensor stops working.
In scheme 1, as shown in fig. 10, a sensor (1001 in fig. 10) may be disposed below the Organic Light-Emitting Diode (OLED) panel of the touch screen to detect departure, so that the electronic device can achieve a full screen.
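A sketch of how scheme 1 might be wired together is shown below; the interfaces, the fusion of the TP and motion-sensor decisions, and the thresholds are all assumptions made for illustration.

```kotlin
// Hypothetical sketch of scheme 1: TP controller + motion sensor decide the approach,
// the psensor decides the departure. All names are illustrative assumptions.
interface ScreenController {
    fun blackOut()
    fun lightUp()
}
interface PSensor {
    fun enable()
    fun disable()
}

class Scheme1Controller(
    private val screen: ScreenController,
    private val psensor: PSensor,
    private val farThreshold: Int                  // assumed psensor far threshold
) {
    private var nearEar = false

    // Bright screen phase: TP controller + motion sensor decide the approach.
    fun onApproachDecision(tpSaysNear: Boolean, motionTrackTowardsEar: Boolean) {
        if (!nearEar && tpSaysNear && motionTrackTowardsEar) {
            nearEar = true
            screen.blackOut()
            psensor.enable()       // the psensor only starts working once the screen is off
        }
    }

    // Black screen phase: the psensor alone decides the departure.
    fun onPsensorRawData(rawdata: Int) {
        if (nearEar && rawdata <= farThreshold) {
            nearEar = false
            screen.lightUp()
            psensor.disable()      // the psensor stops working once the screen is lit again
        }
    }
}
```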
Scheme 2
When a user answers a call, in the bright screen state of the mobile phone and while the mobile phone approaches the ear, the motion sensor is used to detect the motion track of the mobile phone approaching the user's ear, and at the same time the TP controller is used to judge whether the mobile phone is close to the user's ear; when it is detected that the mobile phone is close to the ear, the screen is blacked out.
In the black screen state, the motion sensor is used to detect the motion track of the mobile phone, and the TP controller is used to detect that the mobile phone remains close to the ear.
When the mobile phone moves away from the ear, the motion sensor is used to detect the motion track of the mobile phone moving away from the ear, and at the same time the TP controller detects that its sensing value crosses the far threshold; it is thus judged that the mobile phone is far away from the ear, and the screen is lit.
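For comparison with scheme 1, a sketch of scheme 2 follows; again all names are assumptions, and the point to note is that both decisions run on the Application Processor, which is the power drawback discussed after scheme 3.

```kotlin
// Hypothetical sketch of scheme 2: TP controller + motion sensor decide both the
// approach and the departure, so this logic keeps running on the AP during the call.
interface ScreenController {
    fun blackOut()
    fun lightUp()
}

class Scheme2Controller(private val screen: ScreenController) {
    private var nearEar = false

    // Called for every fused TP + motion-sensor sample, in both screen states.
    fun onFusedSample(
        tpSaysNear: Boolean,            // TP controller proximity judgment
        tpCrossedFarThreshold: Boolean, // TP controller value crossed the far threshold
        motionTowardsEar: Boolean,      // motion track towards the ear
        motionAwayFromEar: Boolean      // motion track away from the ear
    ) {
        if (!nearEar && tpSaysNear && motionTowardsEar) {
            nearEar = true
            screen.blackOut()
        } else if (nearEar && tpCrossedFarThreshold && motionAwayFromEar) {
            nearEar = false
            screen.lightUp()
        }
    }
}
```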
Scheme 3
When a user answers a call, in the bright screen state of the mobile phone and while the mobile phone approaches the ear, the motion sensor is used to detect the motion track of the mobile phone approaching the user's ear, and at the same time the TP controller is used to judge whether the mobile phone is close to the user's ear; when it is detected that the mobile phone is close to the ear, the screen is blacked out. At this point, the capsensor begins to operate.
In the black screen state, the capsensor is used to detect that the mobile phone remains close to the ear.
When the mobile phone moves away from the ear, the capsensor detects that the mobile phone is far away from the ear, the screen is lit, and at this point the capsensor stops working.
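A corresponding sketch of scheme 3 is given below; the capsensor interface and the threshold are assumptions, the point being that the departure decision moves off the touch path onto the low-power capsensor.

```kotlin
// Hypothetical sketch of scheme 3: TP controller + motion sensor decide the approach,
// the SAR capsensor (reused) decides the departure. All names are illustrative.
interface ScreenController {
    fun blackOut()
    fun lightUp()
}
interface CapSensor {
    fun enable()
    fun disable()
}

class Scheme3Controller(
    private val screen: ScreenController,
    private val capSensor: CapSensor,
    private val capFarThreshold: Float             // assumed: below this -> no body nearby
) {
    private var nearEar = false

    // Bright screen phase: TP controller + motion sensor decide the approach.
    fun onApproachDecision(tpSaysNear: Boolean, motionTowardsEar: Boolean) {
        if (!nearEar && tpSaysNear && motionTowardsEar) {
            nearEar = true
            screen.blackOut()
            capSensor.enable()     // the capsensor only starts working in the black screen phase
        }
    }

    // Black screen phase: the capsensor alone decides the departure.
    fun onCapSensorValue(value: Float) {
        if (nearEar && value < capFarThreshold) {
            nearEar = false
            screen.lightUp()
            capSensor.disable()    // the capsensor stops working once the screen is lit
        }
    }
}
```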
It should be noted that in scheme 2 the combination of TP + motion sensor is used to determine that the mobile phone is far away from the ear, which causes a power consumption problem: after the screen is blacked out during the call, the Application Processor (AP) could be in a sleep state with only the modem part in operation, but the TP + motion sensor processing runs on the AP, so the AP cannot sleep and the power consumption of the mobile phone increases.
In scheme 3, the capsensor already used for SAR in the mobile phone is reused to judge the distance of the mobile phone relative to the ear. The capsensor can achieve a detection range of about 2 cm, which meets the requirement of reliable departure detection; its power consumption is very low, in the hundreds of microamperes (uA), which is much smaller than the tens of milliamperes (mA) of a TP controller, so it is superior to scheme 2 in power consumption. Compared with scheme 1, the cost is not increased.
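As a rough, assumption-laden illustration of this power gap, the snippet below compares the charge drawn over a ten-minute call using 300 uA and 30 mA as stand-ins for "hundreds of uA" and "tens of mA"; these figures are invented examples, not measurements from the patent.

```kotlin
// Back-of-the-envelope comparison only; the currents are assumed example values.
fun main() {
    val capSensorCurrentMa = 0.3       // assumed capsensor current: 300 uA
    val tpControllerCurrentMa = 30.0   // assumed TP controller current: 30 mA
    val callMinutes = 10.0
    val capChargeMah = capSensorCurrentMa * callMinutes / 60.0
    val tpChargeMah = tpControllerCurrentMa * callMinutes / 60.0
    println(
        "capsensor: %.3f mAh, TP controller: %.1f mAh (about %.0f times more)"
            .format(capChargeMah, tpChargeMah, tpChargeMah / capChargeMah)
    )
}
```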
EXAMPLE seven
In order to implement the method of the embodiment of the present application, an embodiment of the present application provides a call control device, which is applied to an electronic device, and the call control device includes modules that can be implemented by a processor in the electronic device; of course, the implementation can also be realized through a specific logic circuit; in the implementation process, the Processor may be a Central Processing Unit (CPU), a microprocessor Unit (MPU), a Digital Signal Processor (DSP), a Field Programmable Gate Array (FPGA), or the like.
As shown in fig. 11, the apparatus 1100 includes: a receiving module 1101, a calling module 1102 and a detecting module 1103; wherein,
a receiving module 1101, configured to obtain a call instruction;
a calling module 1102, configured to call a handset based on the calling instruction;
a detecting module 1103, configured to determine, if the earpiece is in an operating state, whether the electronic device with a first sensor approaches a target object based on a first sensing parameter obtained by the first sensor; and obtain a second sensing parameter based on a second sensor; wherein the second sensing parameter is used for determining the relative motion of the electronic equipment and the target object.
In one embodiment, as shown in fig. 12, the apparatus 1100 further comprises: a first control module 1104 for:
if the electronic equipment is determined to be close to the target object based on the first sensing parameter, starting the second sensor and controlling the display screen to be in a black screen state; the second sensor is different from the first sensor;
determining whether the electronic equipment is far away from a target object or not based on second sensing parameters obtained by the second sensor;
and if the electronic equipment is determined to be far away from the target object based on the second sensing parameter, controlling the display screen to be in a lighting state.
In one embodiment, as shown in fig. 12, the apparatus 1100 further comprises: a second control module 1105 to:
and controlling the first sensor to be closed if the electronic equipment is determined to be close to the target object based on the first sensing parameter.
In one embodiment, as shown in fig. 12, the apparatus 1100 further comprises: a determining module 1106 configured to:
determining whether the electronic equipment approaches to a target object or not based on the second sensing parameter; the second sensor is different from the first sensor.
In one embodiment, as shown in fig. 12, the apparatus 1100 further comprises: a third control module 1107 for:
if the electronic equipment is determined to be close to the target object based on the first sensing parameter and the electronic equipment is determined to be close to the target object based on the second sensing parameter, controlling the display screen to be in a black screen state;
if the electronic equipment is determined to be far away from the target object based on the first sensing parameter and the electronic equipment is determined to be close to the target object based on the second sensing parameter, controlling the display screen to be in a black screen state;
and controlling the display screen to be in a lighting state if the electronic equipment is determined to be far away from the target object based on the first sensing parameter and the electronic equipment is determined to be far away from the target object based on the second sensing parameter.
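The three conditions handled by the third control module can be summarized as a small decision table; the sketch below is illustrative only, and the remaining combination (first parameter far is false while second parameter indicates moving away) is not spelled out in the description, so the value chosen for it here is an assumption.

```kotlin
// Hypothetical decision table for the third control module; names are illustrative.
enum class Judgement { APPROACHING, MOVING_AWAY }

fun screenShouldBeBlack(first: Judgement, second: Judgement): Boolean = when {
    first == Judgement.APPROACHING && second == Judgement.APPROACHING -> true   // black screen
    first == Judgement.MOVING_AWAY && second == Judgement.APPROACHING -> true   // black screen
    first == Judgement.MOVING_AWAY && second == Judgement.MOVING_AWAY -> false  // light the screen
    else -> true   // combination not covered by the description; assumed to keep the black screen
}
```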
In one embodiment, as shown in fig. 12, the apparatus 1100 further comprises: an initiating module 1108 to:
and if the electronic equipment is determined to be far away from the target object based on the first sensing parameter, starting the second sensor.
In one embodiment, as shown in fig. 12, the apparatus 1100 further comprises: a fourth control module 1109 to:
determining whether the electronic equipment with the first sensor approaches to a target object or not based on first sensing parameters obtained by the first sensor, and controlling the display screen to be in a black screen state; and
determining whether the electronic equipment with the second sensor is far away from the target object or not based on second sensing parameters obtained by the second sensor, and controlling the display screen to be in a lighting state;
wherein the first sensor is the same as the second sensor and is a capacitive sensor.
It is noted that the description of the apparatus embodiment, similar to the description of the method embodiment above, has similar advantageous effects as the method embodiment. For technical details not disclosed in the embodiments of the apparatus of the present application, reference is made to the description of the embodiments of the method of the present application for understanding.
Example eight
An embodiment of the present application provides an electronic device, fig. 13 is a schematic diagram of a composition structure of the electronic device in the embodiment of the present application, and as shown in fig. 13, the device 1300 includes: a processor 1301, at least one communication bus 1302, a user interface 1303, at least one external communication interface 1304, a memory 1305, and sensors 1306. The communication bus 1302 is configured to enable connection and communication between these components. The user interface 1303 may include a display screen and an earpiece; the external communication interface 1304 may include a standard wired interface and a wireless interface; and the sensors 1306 may include a first sensor and a second sensor, among others.
The processor 1301 is configured to execute a call control program stored in the memory, so as to implement the following steps:
obtaining a calling instruction;
calling a receiver based on the calling instruction;
if the earphone is in a working state, determining whether the electronic equipment with the first sensor approaches to a target object or not based on first sensing parameters obtained by the first sensor; and obtaining a second sensing parameter based on the second sensor; wherein the second sensing parameter is used for determining the relative motion of the electronic equipment and the target object.
Accordingly, an embodiment of the present application further provides a storage medium, namely a computer-readable storage medium, where a call control program is stored on the storage medium, and when being executed by a processor, the call control program implements the steps of the call control method described above.
The above description of the embodiments of the call control device, the electronic device, and the computer-readable storage medium is similar to the description of the above embodiments of the method, and has similar advantageous effects to the embodiments of the method. For technical details not disclosed in the embodiments of the call control device, the electronic device and the computer-readable storage medium of the present application, please refer to the description of the embodiments of the method of the present application for understanding.
In the embodiment of the present application, if the above-mentioned call control method is implemented in the form of a software functional module and is sold or used as an independent product, it may also be stored in a computer-readable storage medium. Based on such understanding, the technical solutions of the embodiments of the present application may be essentially implemented or portions thereof contributing to the prior art may be embodied in the form of a software product stored in a storage medium, and including several instructions for causing a computer device (which may be a personal computer, a server, or a network device) to execute all or part of the methods described in the embodiments of the present application. And the aforementioned storage medium includes: various media capable of storing program codes, such as a USB flash disk, a removable hard disk, a Read Only Memory (ROM), a magnetic disk, or an optical disk. Thus, embodiments of the present application are not limited to any specific combination of hardware and software.
It should be appreciated that reference throughout this specification to "one embodiment" or "an embodiment" means that a particular feature, structure or characteristic described in connection with the embodiment is included in at least one embodiment of the present application. Thus, the appearances of the phrases "in one embodiment" or "in an embodiment" in various places throughout this specification are not necessarily all referring to the same embodiment. Furthermore, the particular features, structures, or characteristics may be combined in any suitable manner in one or more embodiments. It should be understood that, in the various embodiments of the present application, the sequence numbers of the above-mentioned processes do not mean the execution sequence, and the execution sequence of each process should be determined by its function and inherent logic, and should not constitute any limitation to the implementation process of the embodiments of the present application. The above-mentioned serial numbers of the embodiments of the present application are merely for description and do not represent the merits of the embodiments.
It should be noted that, in this document, the terms "comprises," "comprising," or any other variation thereof, are intended to cover a non-exclusive inclusion, such that a process, method, article, or apparatus that comprises a list of elements does not include only those elements but may include other elements not expressly listed or inherent to such process, method, article, or apparatus. Without further limitation, an element defined by the phrase "comprising an … …" does not exclude the presence of other like elements in a process, method, article, or apparatus that comprises the element.
In the several embodiments provided in the present application, it should be understood that the disclosed apparatus and method may be implemented in other ways. The above-described device embodiments are merely illustrative, for example, the division of the unit is only a logical functional division, and there may be other division ways in actual implementation, such as: multiple units or components may be combined, or may be integrated into another system, or some features may be omitted, or not implemented. In addition, the coupling, direct coupling or communication connection between the components shown or discussed may be through some interfaces, and the indirect coupling or communication connection between the devices or units may be electrical, mechanical or other forms.
The units described as separate parts may or may not be physically separate, and parts displayed as units may or may not be physical units; can be located in one place or distributed on a plurality of network units; some or all of the units can be selected according to actual needs to achieve the purpose of the solution of the embodiment.
In addition, all functional units in the embodiments of the present application may be integrated into one processing unit, or each unit may be separately regarded as one unit, or two or more units may be integrated into one unit; the integrated unit can be realized in a form of hardware, or in a form of hardware plus a software functional unit.
Those of ordinary skill in the art will understand that: all or part of the steps for realizing the method embodiments can be completed by hardware related to program instructions, the program can be stored in a computer readable storage medium, and the program executes the steps comprising the method embodiments when executed; and the aforementioned storage medium includes: various media that can store program codes, such as a removable Memory device, a Read Only Memory (ROM), a magnetic disk, or an optical disk.
Alternatively, the integrated units described above in the present application may be stored in a computer-readable storage medium if they are implemented in the form of software functional modules and sold or used as independent products. Based on such understanding, the technical solutions of the embodiments of the present application may be essentially implemented or portions thereof contributing to the prior art may be embodied in the form of a software product stored in a storage medium, and including several instructions for causing a computer device (which may be a personal computer, a server, or a network device) to execute all or part of the methods described in the embodiments of the present application. And the aforementioned storage medium includes: a removable storage device, a ROM, a magnetic or optical disk, or other various media that can store program code.
The above description is only for the specific embodiments of the present application, but the scope of the present application is not limited thereto, and any person skilled in the art can easily conceive of the changes or substitutions within the technical scope of the present application, and shall be covered by the scope of the present application. Therefore, the protection scope of the present application shall be subject to the protection scope of the claims.
Claims (10)
1. A call control method is applied to electronic equipment and comprises the following steps:
obtaining a calling instruction;
calling a receiver based on the calling instruction;
if the earphone is in a working state, determining whether the electronic equipment with the first sensor approaches to a target object or not based on first sensing parameters obtained by the first sensor; and obtaining a second sensing parameter based on the second sensor; wherein the second sensing parameter is used for determining the relative motion of the electronic equipment and the target object.
2. The method of claim 1, wherein the method further comprises:
if the electronic equipment is determined to be close to the target object based on the first sensing parameter, starting the second sensor and controlling the display screen to be in a black screen state; the second sensor is different from the first sensor;
determining whether the electronic equipment is far away from a target object or not based on second sensing parameters obtained by the second sensor;
and if the electronic equipment is determined to be far away from the target object based on the second sensing parameter, controlling the display screen to be in a lighting state.
3. The method of claim 2, wherein the method further comprises:
and controlling the first sensor to be closed if the electronic equipment is determined to be close to the target object based on the first sensing parameter.
4. The method of claim 1, wherein the method further comprises:
determining whether the electronic equipment approaches to a target object or not based on the second sensing parameter; the second sensor is different from the first sensor.
5. The method of claim 4, wherein the method further comprises:
if the electronic equipment is determined to be close to the target object based on the first sensing parameter and the electronic equipment is determined to be close to the target object based on the second sensing parameter, controlling the display screen to be in a black screen state;
if the electronic equipment is determined to be far away from the target object based on the first sensing parameter and the electronic equipment is determined to be close to the target object based on the second sensing parameter, controlling the display screen to be in a black screen state;
and controlling the display screen to be in a lighting state if the electronic equipment is determined to be far away from the target object based on the first sensing parameter and the electronic equipment is determined to be far away from the target object based on the second sensing parameter.
6. The method of claim 5, wherein the method further comprises:
and if the electronic equipment is determined to be far away from the target object based on the first sensing parameter, starting the second sensor.
7. The method of claim 1, wherein the first sensor is the same as the second sensor and is a capacitive sensor;
determining whether the electronic equipment with the first sensor approaches to a target object or not based on first sensing parameters obtained by the first sensor, and controlling the display screen to be in a black screen state; and
and determining whether the electronic equipment with the second sensor is far away from the target object or not based on a second sensing parameter obtained by the second sensor, and controlling the display screen to be in a lighting state.
8. A call control device, the device comprising: the device comprises a receiving module, a calling module and a detecting module; wherein,
the receiving module is used for obtaining a calling instruction;
the calling module is used for calling the earphone based on the calling instruction;
the detection module is used for determining whether the electronic equipment with the first sensor approaches to a target object or not based on first sensing parameters obtained by the first sensor if the earphone is in a working state; and obtaining a second sensing parameter based on the second sensor; wherein the second sensing parameter is used for determining the relative motion of the electronic equipment and the target object.
9. An electronic device, the electronic device comprising: a first sensor, a second sensor, a processor and a memory for storing a computer program operable on the processor, wherein the processor is configured to perform the steps of the call control method according to any one of claims 1 to 7 when running the computer program.
10. A computer-readable storage medium having stored thereon a call control program which, when executed by a processor, implements the steps of the call control method of any one of claims 1 to 7.
Priority Applications (1)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
CN201811163293.XA CN109274829A (en) | 2018-09-30 | 2018-09-30 | A kind of call control method and device, equipment, storage medium |
Applications Claiming Priority (1)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
CN201811163293.XA CN109274829A (en) | 2018-09-30 | 2018-09-30 | A kind of call control method and device, equipment, storage medium |
Publications (1)
Publication Number | Publication Date |
---|---|
CN109274829A true CN109274829A (en) | 2019-01-25 |
Family
ID=65195577
Family Applications (1)
Application Number | Title | Priority Date | Filing Date |
---|---|---|---|
CN201811163293.XA Pending CN109274829A (en) | 2018-09-30 | 2018-09-30 | A kind of call control method and device, equipment, storage medium |
Country Status (1)
Country | Link |
---|---|
CN (1) | CN109274829A (en) |
Cited By (3)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
CN111356221A (en) * | 2020-03-13 | 2020-06-30 | 维沃移动通信有限公司 | Method for adjusting transmitting power and electronic equipment |
CN111629104A (en) * | 2020-02-28 | 2020-09-04 | 北京小米移动软件有限公司 | Distance determination method, distance determination device, and computer storage medium |
CN111982161A (en) * | 2020-07-30 | 2020-11-24 | 拉扎斯网络科技(上海)有限公司 | Conductor object position determining method and device |
Citations (4)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
CN106101453A (en) * | 2016-08-19 | 2016-11-09 | 青岛海信移动通信技术股份有限公司 | A kind of method controlling screen and terminal |
CN106797416A (en) * | 2016-10-31 | 2017-05-31 | 北京小米移动软件有限公司 | Screen control method and device |
US20170223166A1 (en) * | 2012-07-17 | 2017-08-03 | Samsung Electronics Co., Ltd. | Method and apparatus for preventing screen off during automatic response system service in electronic device |
CN107896272A (en) * | 2017-08-11 | 2018-04-10 | 广东欧珀移动通信有限公司 | A kind of call control method and device |
- 2018-09-30 CN CN201811163293.XA patent/CN109274829A/en active Pending
Patent Citations (4)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
US20170223166A1 (en) * | 2012-07-17 | 2017-08-03 | Samsung Electronics Co., Ltd. | Method and apparatus for preventing screen off during automatic response system service in electronic device |
CN106101453A (en) * | 2016-08-19 | 2016-11-09 | 青岛海信移动通信技术股份有限公司 | A kind of method controlling screen and terminal |
CN106797416A (en) * | 2016-10-31 | 2017-05-31 | 北京小米移动软件有限公司 | Screen control method and device |
CN107896272A (en) * | 2017-08-11 | 2018-04-10 | 广东欧珀移动通信有限公司 | A kind of call control method and device |
Cited By (4)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
CN111629104A (en) * | 2020-02-28 | 2020-09-04 | 北京小米移动软件有限公司 | Distance determination method, distance determination device, and computer storage medium |
CN111629104B (en) * | 2020-02-28 | 2022-04-01 | 北京小米移动软件有限公司 | Distance determination method, distance determination device, and computer storage medium |
CN111356221A (en) * | 2020-03-13 | 2020-06-30 | 维沃移动通信有限公司 | Method for adjusting transmitting power and electronic equipment |
CN111982161A (en) * | 2020-07-30 | 2020-11-24 | 拉扎斯网络科技(上海)有限公司 | Conductor object position determining method and device |
Similar Documents
Publication | Publication Date | Title |
---|---|---|
CN109739669B (en) | Unread message prompting method and mobile terminal | |
US9760165B2 (en) | Mobile terminal device and input operation receiving method for switching input methods | |
US10862595B2 (en) | Method for processing radio frequency interference, and electronic device | |
CN108900231B (en) | Dynamic antenna adjustment method and related product | |
CN108227898B (en) | Flexible screen terminal, power consumption control method thereof and computer readable storage medium | |
CN108881645B (en) | Screen control method and device for mobile phone call | |
CN109274829A (en) | A kind of call control method and device, equipment, storage medium | |
CN107329719B (en) | Multi-screen display control method and user terminal | |
CN107291280B (en) | Method and device for adjusting sensitivity of touch screen and terminal equipment | |
CN109542279B (en) | Terminal device control method and terminal device | |
CN108174016B (en) | Terminal anti-falling control method, terminal and computer readable storage medium | |
CN107329572B (en) | Control method, mobile terminal and computer-readable storage medium | |
CN106303023A (en) | Screen state control method and device | |
CN109104521B (en) | Method and device for correcting approaching state, mobile terminal and storage medium | |
CN109314727A (en) | A kind of method and device controlling mobile terminal screen | |
CN107948398A (en) | A kind of message prompt method and mobile terminal | |
CN109710150A (en) | Key control method and terminal | |
CN109799924A (en) | False-touch prevention control method and device, mobile terminal, computer readable storage medium | |
CN108388400B (en) | Operation processing method and mobile terminal | |
CN110691168B (en) | Screen control method and device of mobile terminal and storage medium | |
CN113518150A (en) | Display method of terminal equipment and terminal equipment | |
CN109710119B (en) | Control method, control device, electronic device, and storage medium | |
CN109933196B (en) | Screen control method and device and terminal equipment | |
CN111064842B (en) | Method, terminal and storage medium for recognizing special-shaped touch | |
CN111064847B (en) | False touch prevention method and device, storage medium and electronic equipment |
Legal Events
Date | Code | Title | Description |
---|---|---|---|
PB01 | Publication | ||
SE01 | Entry into force of request for substantive examination | ||
RJ01 | Rejection of invention patent application after publication | ||
Application publication date: 20190125 |